RedCrawler API
Scheduling Posts V2

Schedule Image Post V2

Schedule an image post to be published at a specified time.

POST /api/v2/schedule/image

Schedules an image post. The image is downloaded at publish time. Requires a bearer token, either stored in the Token Vault or passed in the request body. The background scheduler publishes at the specified time using your stored bearer token and proxy. Maximum of 20 pending posts per user.

Body parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| subreddit | string | Yes | Target subreddit (name, r/name, or URL) |
| title | string | Yes | Post title |
| publish_at | string | Yes | When to publish (ISO-8601 UTC, e.g. 2026-04-11T15:00:00Z) |
| image_url | string | Yes | Image URL (downloaded at publish time) |
| flair_id | string | No | Flair template ID |
| nsfw | string | No | Mark as NSFW (true/false) |
| bearer | string | No | Bearer token; optional if stored in the Token Vault |
| brand_affiliate | string | No | Mark as brand affiliate / commercial (true/false) |
| spoiler | string | No | Mark as spoiler (true/false) |
| proxy | string | No | Proxy; optional if stored in the Token Vault |
| body | string | No | Post body (plain text or Reddit markdown) |
| delete_after | string | No | Auto-delete the post after N seconds |
| flair_text | string | No | Flair text |
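Since subreddit accepts a bare name, r/name, or a full URL, and publish_at must be ISO-8601 UTC, it can help to prepare both values before building the request. A minimal Python sketch (the helper names here are hypothetical, not part of the API):

```python
import re
from datetime import datetime, timedelta, timezone

def normalize_subreddit(value: str) -> str:
    """Reduce 'name', 'r/name', or a full URL to the bare subreddit name.
    Hypothetical convenience helper; the API accepts all three forms directly."""
    m = re.search(r"(?:reddit\.com)?/?r/([A-Za-z0-9_]+)", value)
    return m.group(1) if m else value.strip()

def publish_at_in(minutes: int) -> str:
    """ISO-8601 UTC timestamp N minutes from now, e.g. 2026-04-11T15:00:00Z."""
    ts = datetime.now(timezone.utc) + timedelta(minutes=minutes)
    return ts.replace(microsecond=0).isoformat().replace("+00:00", "Z")

print(normalize_subreddit("https://www.reddit.com/r/test/"))  # test
print(normalize_subreddit("r/test"))                          # test
print(publish_at_in(60))  # e.g. 2026-04-11T16:00:00Z (depends on current time)
```

The Z-suffixed form matches the publish_at example in the parameter table above.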

Authentication

| Header | Value |
| --- | --- |
| X-RapidAPI-Key | Your RapidAPI key |
| X-RapidAPI-Host | red-crawler1.p.rapidapi.com |

Example request

cURL

curl -X POST "https://red-crawler1.p.rapidapi.com/api/v2/schedule/image" \
  -H "X-RapidAPI-Key: YOUR_RAPIDAPI_KEY" \
  -H "X-RapidAPI-Host: red-crawler1.p.rapidapi.com" \
  -H "Content-Type: application/json" \
  --data '{"subreddit": "test", "title": "<title>", "publish_at": "<publish_at>", "image_url": "<image_url>", "flair_id": "<flair_id>", "nsfw": "<nsfw>", "bearer": "<bearer>", "brand_affiliate": "<brand_affiliate>", "spoiler": "<spoiler>", "proxy": "<proxy>", "body": "<body>", "delete_after": "<delete_after>", "flair_text": "<flair_text>"}'

JavaScript

const res = await fetch("https://red-crawler1.p.rapidapi.com/api/v2/schedule/image", {
  method: "POST",
  headers: {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    "subreddit": "test",
    "title": "<title>",
    "publish_at": "<publish_at>",
    "image_url": "<image_url>",
    "flair_id": "<flair_id>",
    "nsfw": "<nsfw>",
    "bearer": "<bearer>",
    "brand_affiliate": "<brand_affiliate>",
    "spoiler": "<spoiler>",
    "proxy": "<proxy>",
    "body": "<body>",
    "delete_after": "<delete_after>",
    "flair_text": "<flair_text>"
  })
});
const data = await res.json();
console.log(data);
const data = await res.json();
console.log(data);

Python

import requests

url = "https://red-crawler1.p.rapidapi.com/api/v2/schedule/image"
headers = {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
}
payload = {
    "subreddit": "test",
    "title": "<title>",
    "publish_at": "<publish_at>",
    "image_url": "<image_url>",
    "flair_id": "<flair_id>",
    "nsfw": "<nsfw>",
    "bearer": "<bearer>",
    "brand_affiliate": "<brand_affiliate>",
    "spoiler": "<spoiler>",
    "proxy": "<proxy>",
    "body": "<body>",
    "delete_after": "<delete_after>",
    "flair_text": "<flair_text>"
}
resp = requests.post(url, headers=headers, json=payload)
print(resp.json())

Example response

HTTP 200 — captured from a live call to the gateway.

{
  "success": true,
  "data": {
    "scheduled": true,
    "id": 119,
    "subreddit": "RedCrawlerAPI",
    "title": "Schedule Image Post V2",
    "type": "image",
    "publish_at": "2026-05-11T15:00:00+00:00",
    "delete_after": null,
    "delete_at": null,
    "status": "pending"
  },
  "error": null,
  "timestamp": "2026-04-29T17:35:39.453352+00:00"
}
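Every response uses the same envelope (success, data, error), so a client can validate the envelope before trusting the scheduled record. A sketch in Python, assuming only the field names shown in the sample above:

```python
def parse_schedule_response(resp: dict) -> dict:
    """Return the scheduled-post record from a response envelope,
    raising if the API reported a failure. Field names are taken
    from the sample response; hypothetical helper, not part of the API."""
    if not resp.get("success") or resp.get("error"):
        raise RuntimeError(f"scheduling failed: {resp.get('error')}")
    data = resp["data"]
    if data.get("status") != "pending":
        raise RuntimeError(f"unexpected status: {data.get('status')!r}")
    return data

# Sample envelope shaped like the documented 200 response.
sample = {
    "success": True,
    "data": {"scheduled": True, "id": 119, "status": "pending"},
    "error": None,
}
print(parse_schedule_response(sample)["id"])  # 119
```

Keeping the returned id around is useful for any later management of the pending post.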
