RedCrawler API
Scheduling Posts V2

Schedule Text Post V2

POST /api/v2/schedule/text

Schedule a text post for future publishing. Requires bearer token stored in Token Vault. The background scheduler publishes at the specified time using your stored bearer and proxy. Max 20 pending per user.

Body parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| title | string | Yes | Post title |
| subreddit | string | Yes | Target subreddit (name, r/name, or URL) |
| publish_at | string | Yes | When to publish (ISO-8601 UTC, e.g. 2026-04-11T15:00:00Z) |
| flair_id | string | No | Flair template ID |
| spoiler | string | No | Mark as spoiler (true/false) |
| flair_text | string | No | Flair text |
| delete_after | string | No | Auto-delete the post after N seconds |
| bearer | string | No | Bearer token; optional if stored in Token Vault |
| nsfw | string | No | Mark as NSFW (true/false) |
| brand_affiliate | string | No | Mark as brand affiliate / commercial (true/false) |
| body | string | No | Post body; plain text or Reddit markdown both work |
| proxy | string | No | Proxy; optional if stored in Token Vault |
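The `publish_at` value must be an ISO-8601 UTC timestamp. A minimal Python sketch for building one a fixed interval in the future (the one-hour offset is arbitrary, not a requirement of the API):

```python
from datetime import datetime, timedelta, timezone

# Build an ISO-8601 UTC timestamp one hour from now for publish_at.
# The one-hour offset is just an example; any future time works.
publish_at = (datetime.now(timezone.utc) + timedelta(hours=1)).strftime(
    "%Y-%m-%dT%H:%M:%SZ"
)
print(publish_at)
```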

Authentication

| Header | Value |
| --- | --- |
| X-RapidAPI-Key | Your RapidAPI key |
| X-RapidAPI-Host | red-crawler1.p.rapidapi.com |

Example request

cURL

curl -X POST "https://red-crawler1.p.rapidapi.com/api/v2/schedule/text" \
  -H "X-RapidAPI-Key: YOUR_RAPIDAPI_KEY" \
  -H "X-RapidAPI-Host: red-crawler1.p.rapidapi.com" \
  -H "Content-Type: application/json" \
  --data '{"title": "<title>", "subreddit": "test", "publish_at": "<publish_at>", "flair_id": "<flair_id>", "spoiler": "<spoiler>", "flair_text": "<flair_text>", "delete_after": "<delete_after>", "bearer": "<bearer>", "nsfw": "<nsfw>", "brand_affiliate": "<brand_affiliate>", "body": "<body>", "proxy": "<proxy>"}'

JavaScript

const res = await fetch("https://red-crawler1.p.rapidapi.com/api/v2/schedule/text", {
  method: "POST",
  headers: {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    "title": "<title>",
    "subreddit": "test",
    "publish_at": "<publish_at>",
    "flair_id": "<flair_id>",
    "spoiler": "<spoiler>",
    "flair_text": "<flair_text>",
    "delete_after": "<delete_after>",
    "bearer": "<bearer>",
    "nsfw": "<nsfw>",
    "brand_affiliate": "<brand_affiliate>",
    "body": "<body>",
    "proxy": "<proxy>"
  })
});
const data = await res.json();
console.log(data);

Python

import requests

url = "https://red-crawler1.p.rapidapi.com/api/v2/schedule/text"
headers = {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
}
payload = {
    "title": "<title>",
    "subreddit": "test",
    "publish_at": "<publish_at>",
    "flair_id": "<flair_id>",
    "spoiler": "<spoiler>",
    "flair_text": "<flair_text>",
    "delete_after": "<delete_after>",
    "bearer": "<bearer>",
    "nsfw": "<nsfw>",
    "brand_affiliate": "<brand_affiliate>",
    "body": "<body>",
    "proxy": "<proxy>"
}
resp = requests.post(url, headers=headers, json=payload)
print(resp.json())

Example response

HTTP 200 — captured from a live call to the gateway.

{
  "success": true,
  "data": {
    "scheduled": true,
    "id": 113,
    "subreddit": "RedCrawlerAPI",
    "title": "Schedule Text Post V2",
    "type": "text",
    "publish_at": "2026-05-11T15:00:00+00:00",
    "delete_after": null,
    "delete_at": null,
    "status": "pending"
  },
  "error": null,
  "timestamp": "2026-04-29T17:11:50.507537+00:00"
}
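Every response wraps the result in a `success` / `data` / `error` envelope, as shown above. A minimal Python sketch for unpacking it (the JSON literal below is an abridged copy of the example response, not a live call):

```python
import json

# Abridged copy of the example response envelope; not a live call.
resp_text = """{
  "success": true,
  "data": {"scheduled": true, "id": 113, "status": "pending"},
  "error": null,
  "timestamp": "2026-04-29T17:11:50.507537+00:00"
}"""

resp = json.loads(resp_text)
if resp["success"]:
    job = resp["data"]
    print(f"Scheduled job {job['id']} with status {job['status']}")
else:
    print(f"Scheduling failed: {resp['error']}")
```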
