API Reference
The Lociator API is powered by a Cloudflare Worker that handles all client requests. The worker reads and writes data in Supabase and publishes crawl jobs to Upstash QStash for asynchronous processing. API access is available on the Pro, Advanced, and Premium plans.
API Overview
| Method | Endpoint | Description |
|---|---|---|
| POST | /api/crawl | Create a new crawl job |
| GET | /api/status/:id | Check crawl job status |
| GET | /api/result/:id | Get crawl results (graph data) |
| GET | /api/jobs | List user's crawl jobs |
| DELETE | /api/crawl/:id | Delete a crawl job |
| POST | /api/analyze-topics | Trigger topic analysis |
| GET | /health | Health check |
Authentication
Requests are authenticated via the Authorization header using a Supabase JWT, or via the X-User-Id header for internal service calls. CORS is enabled for all origins.
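As a sketch, a client could build these headers with a small helper. The `AuthMode` type and `authHeaders` function are illustrative names, not part of the Lociator API:

```typescript
// Illustrative helper covering the two auth modes described above.
// The type and function names are assumptions, not part of the API.
type AuthMode =
  | { kind: "jwt"; token: string }       // Supabase JWT for end-user requests
  | { kind: "service"; userId: string }; // X-User-Id for internal service calls

function authHeaders(auth: AuthMode): Record<string, string> {
  return auth.kind === "jwt"
    ? { Authorization: `Bearer ${auth.token}` }
    : { "X-User-Id": auth.userId };
}
```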
Authorization: Bearer <supabase-jwt-token>
# or
X-User-Id: <user-uuid>
POST /api/crawl
Create a new crawl job. The job is published to QStash for async processing by the crawler.
POST /api/crawl
Content-Type: application/json
{
"url": "https://example.com",
"maxPages": 50
}
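A request like the one above could be assembled client-side as follows. The helper name and option shape are a sketch, not part of the API:

```typescript
// Sketch: assembling the POST /api/crawl request options.
// The helper name and CrawlRequestOptions shape are illustrative assumptions.
interface CrawlRequestOptions {
  method: string;
  headers: Record<string, string>;
  body: string;
}

function buildCrawlRequest(url: string, maxPages: number): CrawlRequestOptions {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url, maxPages }),
  };
}

// Usage (not executed here): pass the result to fetch() along with auth headers,
// e.g. fetch("/api/crawl", buildCrawlRequest("https://example.com", 50)).
```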
Response 200:
{
"success": true,
"jobId": "uuid",
"message": "Crawl job created"
}
GET /api/status/:id
Check the current status and progress of a crawl job.
GET /api/status/<jobId>
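Clients typically poll this endpoint until the job leaves the queued/processing states. A minimal sketch of the pure pieces, using the field names from the response below (the helper names are illustrative):

```typescript
// Sketch: deciding whether to keep polling, and reporting progress.
// Field names match the /api/status/:id response documented below.
type CrawlStatus = "queued" | "processing" | "completed" | "failed";

interface StatusResponse {
  id: string;
  status: CrawlStatus;
  pages_crawled: number;
  max_pages: number;
}

// True while the job is still in flight (keep polling).
function isInFlight(s: StatusResponse): boolean {
  return s.status === "queued" || s.status === "processing";
}

// Progress as a fraction of the page budget, clamped to [0, 1].
function progress(s: StatusResponse): number {
  return s.max_pages > 0 ? Math.min(s.pages_crawled / s.max_pages, 1) : 0;
}
```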
Response 200:
{
"id": "uuid",
"status": "processing", // queued | processing | completed | failed
"pages_crawled": 45,
"max_pages": 200,
"root_url": "https://example.com",
"started_at": "2025-01-01T00:00:05Z",
"error_message": null
}
GET /api/result/:id
Retrieve complete graph data for a completed crawl (pages, links, and metrics).
GET /api/result/<jobId>
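The graph arrays in the response can be post-processed client-side. A sketch that extracts orphan-page URLs, assuming only the fields documented in the response below:

```typescript
// Sketch: pulling orphan pages out of the /api/result payload.
// Only fields documented in the response below are assumed; the
// interface is trimmed to what this helper needs.
interface PageNode {
  id: string;
  url: string;
  is_orphan: boolean;
}

function orphanUrls(pages: PageNode[]): string[] {
  return pages.filter((p) => p.is_orphan).map((p) => p.url);
}
```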
Response 200:
{
"pages": [
{ "id": "uuid", "url": "...", "normalized_url": "...",
"depth": 0, "in_degree": 0, "out_degree": 12,
"title": "...", "is_orphan": false }
],
"links": [
{ "source_page_id": "uuid", "target_page_id": "uuid",
"anchor_text": "..." }
],
"metrics": {
"total_pages": 150, "total_links": 890,
"max_depth": 4, "avg_depth": 1.8,
"architecture_score": 72.5,
"depth_score": 85, "linking_score": 65,
"silo_score": 70, "pillar_score": 60,
"cross_silo_score": 80
}
}
GET /api/jobs
List all crawl jobs for the authenticated user.
GET /api/jobs
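The returned array can be filtered and sorted client-side, for example newest completed jobs first. The helper name is illustrative:

```typescript
// Sketch: newest completed jobs first, from the /api/jobs response array.
interface JobSummary {
  id: string;
  root_url: string;
  status: string;
  pages_crawled: number;
  created_at: string; // ISO 8601 UTC, so string comparison orders correctly
}

function completedJobs(jobs: JobSummary[]): JobSummary[] {
  return jobs
    .filter((j) => j.status === "completed")
    .sort((a, b) => b.created_at.localeCompare(a.created_at));
}
```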
Response 200:
[
{
"id": "uuid",
"root_url": "https://example.com",
"status": "completed",
"pages_crawled": 50,
"created_at": "2025-01-01T00:00:00Z"
}
]
DELETE /api/crawl/:id
Delete a crawl job and all associated data (pages, links, metrics, topics).
DELETE /api/crawl/<jobId>
Response 200:
{ "success": true }
POST /api/analyze-topics
Trigger topic analysis for a completed crawl. This endpoint is called automatically after crawl completion, but it can also be triggered manually; the crawl must have completed successfully.
POST /api/analyze-topics
Content-Type: application/json
{ "crawlId": "uuid" }
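Since the endpoint requires a successfully completed crawl, a client can guard the manual trigger on job status. Both helpers here are an illustrative sketch:

```typescript
// Sketch: guard a manual topic-analysis trigger on crawl status
// (the endpoint requires a successfully completed crawl).
function canAnalyzeTopics(crawlStatus: string): boolean {
  return crawlStatus === "completed";
}

// Builds the JSON body for POST /api/analyze-topics.
function topicsRequestBody(crawlId: string): string {
  return JSON.stringify({ crawlId });
}
```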
Response 200:
{ "success": true, "topics": [...] }
GET /health
GET /health
Response 200:
{ "status": "ok" }
Error Codes
| Code | Description |
|---|---|
| 400 | Invalid request body or missing required fields |
| 401 | Missing or invalid authentication |
| 403 | API access not available on current plan |
| 404 | Resource not found |
| 429 | Rate limit or monthly crawl limit exceeded |
| 500 | Internal server error |
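A client might fold this table into simple retry logic. Treating 429 and 5xx as retriable is a sketch of one reasonable policy, not guidance from the API itself:

```typescript
// Sketch: client-side handling of the error codes tabled above.
// Treat 429 (rate/quota limits) and 5xx (transient server errors) as
// retriable; 4xx client errors are fatal without a changed request.
function shouldRetry(status: number): boolean {
  return status === 429 || status >= 500;
}

function describeError(status: number): string {
  const table: Record<number, string> = {
    400: "Invalid request body or missing required fields",
    401: "Missing or invalid authentication",
    403: "API access not available on current plan",
    404: "Resource not found",
    429: "Rate limit or monthly crawl limit exceeded",
    500: "Internal server error",
  };
  return table[status] ?? `Unexpected status ${status}`;
}
```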