RedCrawler API
Scrape V2

Bulk Communities by ID V2

Fetch multiple subreddits by ID in a single request. Accepts up to 1500 t5_ IDs per call.

POST /api/v2/batch/communities

Body parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| ids | string | Yes | Comma-separated subreddit IDs (raw or t5_ prefixed) |
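Because the endpoint accepts raw or t5_ prefixed IDs but caps each request at 1500, callers with larger ID lists need to normalize and chunk them. A minimal sketch (the helper name and its shape are our own, not part of any official client):

```python
def build_ids_params(ids, limit=1500):
    """Yield comma-separated `ids` strings, each holding at most `limit` IDs.

    Raw IDs (e.g. "6") are normalized to the t5_ prefixed form ("t5_6")
    the endpoint documents; already-prefixed IDs pass through unchanged.
    """
    normalized = [i if i.startswith("t5_") else f"t5_{i}" for i in ids]
    for start in range(0, len(normalized), limit):
        yield ",".join(normalized[start:start + limit])
```

Each yielded string can be sent as the `ids` body parameter of a separate POST, so an arbitrarily long ID list maps onto the fewest possible requests.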

Authentication

| Header | Value |
| --- | --- |
| X-RapidAPI-Key | Your RapidAPI key |
| X-RapidAPI-Host | red-crawler1.p.rapidapi.com |

Example request

cURL

curl -X POST "https://red-crawler1.p.rapidapi.com/api/v2/batch/communities" \
  -H "X-RapidAPI-Key: YOUR_RAPIDAPI_KEY" \
  -H "X-RapidAPI-Host: red-crawler1.p.rapidapi.com" \
  -H "Content-Type: application/json" \
  --data '{"ids": "t5_1sys4r2,t5_1sys51u,t5_1sys5kz"}'

JavaScript

const res = await fetch("https://red-crawler1.p.rapidapi.com/api/v2/batch/communities", {
  method: "POST",
  headers: {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    ids: "t5_1sys4r2,t5_1sys51u,t5_1sys5kz"
  })
});
const data = await res.json();
console.log(data);

Python

import requests

url = "https://red-crawler1.p.rapidapi.com/api/v2/batch/communities"
headers = {
    "X-RapidAPI-Key": "YOUR_RAPIDAPI_KEY",
    "X-RapidAPI-Host": "red-crawler1.p.rapidapi.com",
    "Content-Type": "application/json"
}
payload = {
    "ids": "t5_1sys4r2,t5_1sys51u,t5_1sys5kz"
}
resp = requests.post(url, headers=headers, json=payload)
print(resp.json())

Example response

HTTP 200 — captured from a live call to the gateway.

{
  "success": true,
  "data": {
    "success": true,
    "subreddits": [
      {
        "id": "t5_6",
        "name": "reddit.com",
        "title": "reddit.com",
        "description": {
          "preview": "To report a site-wide rule violation to the Reddit Admins, please use our report forms or message /r/reddit.com modmail. This subreddit is archived and no longer accepting submissions."
        },
        "isSubscribed": false,
        "subscribersCount": 1053766,
        "communityStats": {
          "weeklyActiveUsersCount": 204683
        },
        "styles": {
          "primaryColor": "#0079D3",
          "legacyPrimaryColor": null,
          "icon": "https://styles.redditmedia.com/t5_6/styles/communityIcon_a8uzjit9bwr21.png?width=64&height=64&frame=1&auto=webp&crop=64:64,smart&s=cae6377349d1a8e29811ed18df135cad4950e659",
          "legacyIcon": null
        }
      },
      {
        "id": "t5_hmeujo",
        "name": "RedCrawlerAPI",
        "title": "RedCrawlerAPI",
        "description": {
          "preview": "RedCrawlerAPI"
        },
        "isSubscribed": false,
        "subscribersCount": 2,
        "communityStats": {
          "weeklyActiveUsersCount": 0
        },
        "styles": {
          "primaryColor": "#4C8CFF",
          "legacyPrimaryColor": null,
          "icon": "https://styles.redditmedia.com/t5_hmeujo/styles/communityIcon_ivw182y793yg1.png?width=64&height=64&frame=1&auto=webp&crop=64:64,smart&s=186174278f92e8219768a31d85af2562491f2df8",
          "legacyIcon": null
        }
      }
    ],
    "count": 2,
    "requested": 2
  },
  "error": null,
  "timestamp": "2026-04-29T16:29:22.410281+00:00"
}
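The response wraps results in an envelope: a top-level success flag and error field around a data object holding the subreddits array, plus count (returned) and requested (asked for) totals. A small sketch of unpacking it, assuming the shape shown above (the function name is ours):

```python
def summarize_communities(response):
    """Return (name, subscribersCount) pairs from a batch-communities response.

    Raises if the envelope reports failure, so callers never iterate
    over a missing `data` object.
    """
    if not response.get("success") or response.get("error"):
        raise RuntimeError(f"API error: {response.get('error')}")
    return [
        (sub["name"], sub["subscribersCount"])
        for sub in response["data"]["subreddits"]
    ]
```

Comparing `data["count"]` against `data["requested"]` is also a cheap way to detect IDs that were silently dropped (e.g. deleted or banned subreddits).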
