RedCrawler API
Scheduling Posts V2

Schedule Link Post V2

Schedule a link post. Requires a bearer token stored in Token Vault. The background scheduler publishes at the specified time using your stored bearer and proxy. Max 20 pending posts per user.

POST /api/v2/schedule/link

Body parameters

| Name | Type | Required | Description |
|---|---|---|---|
| link_url | string | Yes | Link URL |
| publish_at | string | Yes | When to publish (ISO-8601 UTC, e.g. 2026-04-11T15:00:00Z) |
| subreddit | string | Yes | Target subreddit (name, r/name, or URL) |
| title | string | Yes | Post title |
| body | string | No | Post body (plain text or Reddit markdown) |
| flair_id | string | No | Flair template ID |
| flair_text | string | No | Flair text |
| nsfw | string | No | Mark as NSFW (true/false) |
| spoiler | string | No | Mark as spoiler (true/false) |
| brand_affiliate | string | No | Mark as brand affiliate / commercial (true/false) |
| delete_after | string | No | Auto-delete the post N seconds after publishing |
| proxy | string | No | Proxy. Optional if stored in Token Vault. |
| bearer | string | No | Bearer token. Optional if stored in Token Vault. |
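Since publish_at must be an ISO-8601 UTC timestamp with a Z suffix, a short Python sketch can help build it. The helper name publish_at_in is illustrative only, not part of this API:

```python
from datetime import datetime, timedelta, timezone

def publish_at_in(hours: float) -> str:
    """Return an ISO-8601 UTC timestamp N hours from now, e.g. 2026-04-11T15:00:00Z."""
    ts = datetime.now(timezone.utc) + timedelta(hours=hours)
    # Drop sub-second precision and use the "Z" suffix for UTC.
    return ts.replace(microsecond=0).isoformat().replace("+00:00", "Z")

payload = {
    "subreddit": "test",
    "title": "Example title",
    "link_url": "https://example.com",
    "publish_at": publish_at_in(2),   # publish two hours from now
    "delete_after": "1440",           # auto-delete 1440 s (24 min) after publishing
}
print(payload["publish_at"])
```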

Authentication

| Header | Value |
|---|---|
| X-RapidAPI-Key | Your RapidAPI key |
| X-RapidAPI-Host | red-crawler1.p.rapidapi.com |

Example request

cURL

```bash
curl -X POST "https://red-crawler1.p.rapidapi.com/api/v2/schedule/link" \
  -H "X-RapidAPI-Key: YOUR_RAPIDAPI_KEY" \
  -H "X-RapidAPI-Host: red-crawler1.p.rapidapi.com" \
  -H "Content-Type: application/json" \
  --data '{
    "link_url": "<link_url>",
    "publish_at": "<publish_at>",
    "subreddit": "test",
    "title": "<title>",
    "flair_text": "<flair_text>",
    "nsfw": "<nsfw>",
    "proxy": "<proxy>",
    "brand_affiliate": "<brand_affiliate>",
    "flair_id": "<flair_id>",
    "spoiler": "<spoiler>",
    "body": "<body>",
    "delete_after": "<delete_after>",
    "bearer": "<bearer>"
  }'
```

JavaScript

```javascript
const res = await fetch("https://red-crawler1.p.rapidapi.com/api/v2/schedule/link", {
  method: "POST",
  headers: {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    "link_url": "<link_url>",
    "publish_at": "<publish_at>",
    "subreddit": "test",
    "title": "<title>",
    "flair_text": "<flair_text>",
    "nsfw": "<nsfw>",
    "proxy": "<proxy>",
    "brand_affiliate": "<brand_affiliate>",
    "flair_id": "<flair_id>",
    "spoiler": "<spoiler>",
    "body": "<body>",
    "delete_after": "<delete_after>",
    "bearer": "<bearer>"
  })
});
const data = await res.json();
console.log(data);
```

Python

```python
import requests

url = "https://red-crawler1.p.rapidapi.com/api/v2/schedule/link"
headers = {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
}
payload = {
    "link_url": "<link_url>",
    "publish_at": "<publish_at>",
    "subreddit": "test",
    "title": "<title>",
    "flair_text": "<flair_text>",
    "nsfw": "<nsfw>",
    "proxy": "<proxy>",
    "brand_affiliate": "<brand_affiliate>",
    "flair_id": "<flair_id>",
    "spoiler": "<spoiler>",
    "body": "<body>",
    "delete_after": "<delete_after>",
    "bearer": "<bearer>"
}
resp = requests.post(url, headers=headers, json=payload)
print(resp.json())
```

Example response

HTTP 200 — captured from a live call to the gateway.

```json
{
  "success": true,
  "data": {
    "scheduled": true,
    "id": 118,
    "subreddit": "RedCrawlerAPI",
    "title": "Schedule Link Post V2",
    "type": "link",
    "publish_at": "2026-05-08T15:00:00+00:00",
    "delete_after": 1440,
    "delete_at": "2026-05-08T15:24:00+00:00",
    "status": "pending"
  },
  "error": null,
  "timestamp": "2026-04-29T17:33:22.090801+00:00"
}
```
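In the response, delete_at is publish_at plus delete_after seconds (1440 s = 24 minutes here). A quick Python sketch with the example values confirms the arithmetic:

```python
from datetime import datetime, timedelta

# Field names and values taken from the example response above;
# this is an illustration of the timing relationship, not client code.
resp = {
    "publish_at": "2026-05-08T15:00:00+00:00",
    "delete_after": 1440,  # seconds
    "delete_at": "2026-05-08T15:24:00+00:00",
}

publish = datetime.fromisoformat(resp["publish_at"])
expected_delete = publish + timedelta(seconds=resp["delete_after"])
# The scheduler's delete_at matches publish_at + delete_after.
assert expected_delete == datetime.fromisoformat(resp["delete_at"])
print(expected_delete.isoformat())  # → 2026-05-08T15:24:00+00:00
```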
