Frequently Asked Questions
General
What is Lociator?
Lociator derives from locus (Latin: "place", "point", "space") and the suffix -ator (one who operates, one who executes).
Lociator = The one who operates loci. The orchestrator of knowledge points in space.
In the age of AI and GEO (Generative Engine Optimization), content is no longer a collection of isolated pages. Every article is a knowledge node. Every topic is a semantic cluster. Every website is a topology map.
Lociator was born to orchestrate that map — visualizing your internal linking structure, identifying SEO issues like orphan pages and deep content, scoring your architecture, and discovering topical clusters using AI.
Do I need technical knowledge to use it?
No! Simply enter a URL and click "Crawl". The tool handles everything automatically and presents results in easy-to-understand visual formats with multiple layout modes.
Crawling
How long does a crawl take?
It depends on your website's size and response speed. The crawler processes 5 pages concurrently with a 200ms delay between batches. A 50-page site typically completes in under a minute. Larger sites (500+ pages) may take several minutes.
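For a rough back-of-the-envelope estimate, you can model the crawl as sequential batches of 5 concurrent requests, where each batch costs roughly one page-response time plus the 200ms delay. This is a minimal sketch (the function name and the average response time are illustrative assumptions, not part of Lociator):

```python
import math

def estimate_crawl_seconds(pages: int, avg_response_s: float = 0.5,
                           batch_size: int = 5, delay_s: float = 0.2) -> float:
    """Rough crawl-time estimate: pages are fetched in concurrent batches,
    so each batch costs about one response time plus the inter-batch delay."""
    batches = math.ceil(pages / batch_size)
    return batches * (avg_response_s + delay_s)
```

With a 0.5s average response, a 50-page site works out to 10 batches at about 0.7s each, i.e. around 7 seconds, consistent with "under a minute" above.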
Will crawling affect my website performance?
The crawler is designed to be respectful — at most 5 concurrent requests, a 200ms inter-batch delay, and 10-second timeouts. The impact on your server should be negligible. The User-Agent identifies as LociatorBot/1.0.
Does the crawler follow nofollow or external links?
No. Links with rel="nofollow" are skipped. Only internal links (same origin as the root URL) are followed. External links are completely ignored.
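The follow/skip decision described above can be sketched as a small predicate. This is an illustrative Python version (the function name is hypothetical; Lociator's actual implementation may differ):

```python
from urllib.parse import urlparse

def should_follow(link_url: str, rel: str, root_url: str) -> bool:
    """Sketch of the link filter: skip rel="nofollow" links and any
    link whose origin differs from the crawl root."""
    if "nofollow" in rel.split():
        return False
    link, root = urlparse(link_url), urlparse(root_url)
    # Same origin = same scheme and same host (including port).
    return (link.scheme, link.netloc) == (root.scheme, root.netloc)
```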
What URLs are skipped?
Asset files (images, videos, fonts, PDFs, CSS, JS — 40+ extensions), non-HTTP protocols (mailto:, tel:, javascript:), fragments (#), and any URLs matching your configured exclude patterns.
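Conceptually, the skip rules reduce to a URL predicate like the following sketch (the extension list here is only an illustrative subset of the 40+ extensions, and the helper name is hypothetical):

```python
from urllib.parse import urlparse

# Illustrative subset of the asset extensions the crawler skips.
ASSET_EXTENSIONS = {".jpg", ".png", ".gif", ".webp", ".mp4", ".woff2",
                    ".pdf", ".css", ".js", ".svg", ".ico"}
SKIPPED_SCHEMES = {"mailto", "tel", "javascript"}

def is_skipped(url: str) -> bool:
    """Sketch of the skip rules: bare fragments, non-HTTP schemes,
    and asset file extensions."""
    if url.startswith("#"):
        return True
    parsed = urlparse(url)
    if parsed.scheme in SKIPPED_SCHEMES:
        return True
    return any(parsed.path.lower().endswith(ext) for ext in ASSET_EXTENSIONS)
```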
Can I exclude specific URLs from crawling?
Yes! Go to Settings and add comma-separated exclude patterns with wildcard support (e.g., /admin/*, /wp-json/*, *.pdf).
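The wildcard matching behaves like shell-style glob patterns applied to the URL path. A minimal sketch of that semantics, using Python's fnmatch (the function name is an assumption for illustration):

```python
from fnmatch import fnmatch
from urllib.parse import urlparse

def is_excluded(url: str, patterns_csv: str) -> bool:
    """Match the URL path against comma-separated wildcard patterns,
    e.g. "/admin/*, /wp-json/*, *.pdf"."""
    path = urlparse(url).path
    patterns = [p.strip() for p in patterns_csv.split(",") if p.strip()]
    return any(fnmatch(path, p) for p in patterns)
```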
Metrics & Scoring
What is the Architecture Score?
A weighted composite score (0–100) combining 6 sub-scores: Depth (25%), Linking (20%), Silo (20%), Pillar (15%), Orphan (10%), and Cross-Silo (10%). See the Metrics & Scoring page for full details.
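The weighting works as a straightforward weighted sum. A minimal sketch (names are illustrative; the sub-score formulas themselves are documented on the Metrics & Scoring page):

```python
# Weights from the FAQ: the six sub-scores, each on a 0-100 scale.
WEIGHTS = {"depth": 0.25, "linking": 0.20, "silo": 0.20,
           "pillar": 0.15, "orphan": 0.10, "cross_silo": 0.10}

def architecture_score(sub_scores: dict) -> float:
    """Weighted composite of the six sub-scores (result is 0-100)."""
    return sum(WEIGHTS[k] * sub_scores[k] for k in WEIGHTS)
```

A site scoring 100 on every sub-score gets a composite of 100; the weights sum to 1, so the result always stays on the same 0–100 scale.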
What is a good depth for my pages?
Most SEO experts recommend keeping important pages within 3 clicks of the homepage. The Depth Score gives maximum points when max depth ≤ 3, and decreases linearly as depth increases toward 10.
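Based on that description, the depth scoring can be sketched as a simple piecewise-linear function. Note the exact falloff formula is an assumption inferred from the text above, not Lociator's published formula:

```python
def depth_score(max_depth: int) -> float:
    """Assumed sketch: full marks at max depth <= 3, falling linearly
    to zero as max depth approaches 10."""
    if max_depth <= 3:
        return 100.0
    if max_depth >= 10:
        return 0.0
    return 100.0 * (10 - max_depth) / 7
```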
What is a pillar page?
A hub page with high out-degree (≥10 outgoing links) and moderate in-degree (≥5 incoming links). Sites should have roughly 1 pillar per 20 pages for optimal Pillar Score.
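The two criteria translate directly into a small classifier; a minimal sketch (helper names are hypothetical, and the floor of one pillar for small sites is my assumption):

```python
def is_pillar(out_degree: int, in_degree: int) -> bool:
    """A page qualifies as a pillar with >= 10 outgoing
    and >= 5 incoming internal links."""
    return out_degree >= 10 and in_degree >= 5

def ideal_pillar_count(total_pages: int) -> int:
    """Rule of thumb from the FAQ: roughly one pillar per 20 pages
    (floored at one pillar -- an assumption for very small sites)."""
    return max(1, total_pages // 20)
```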
Topic Analysis
How does topic analysis work?
After a crawl completes, the system extracts clean text content from each page (using smart noise removal), generates vector embeddings, and clusters pages by topical similarity. Results are organized into a hierarchical topic tree.
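The clustering step can be illustrated with a toy version: given per-page embedding vectors, group pages whose vectors point in similar directions. This greedy single-pass sketch is only a stand-in for Lociator's actual clustering, and the tiny 2-D vectors below take the place of real model embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cluster_pages(embeddings: dict, threshold: float = 0.8) -> list:
    """Greedy single-pass clustering: each page joins the first cluster
    whose seed embedding is similar enough, else starts a new cluster."""
    clusters = []  # list of (seed_vector, [urls])
    for url, vec in embeddings.items():
        for seed, members in clusters:
            if cosine(vec, seed) >= threshold:
                members.append(url)
                break
        else:
            clusters.append((vec, [url]))
    return [members for _, members in clusters]
```

With toy vectors, two pages pointing the same way land in one cluster and an orthogonal page starts its own — the hierarchical topic tree is then built on top of groupings like these.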
Is topic analysis automatic?
Yes, topic analysis is triggered automatically when a crawl completes successfully. The crawler sends a request to the /api/analyze-topics endpoint after saving all graph data.
Account & Billing
Is there a free plan?
Yes! The free plan includes 50 pages per crawl, 5 crawls per month, and 3 saved crawl histories. No credit card required.
Can I cancel my subscription?
Yes, you can cancel anytime from your account settings. Your plan will remain active until the end of the current billing period.
What happens when I hit my crawl limit?
Crawl limits reset monthly based on your billing period start date. Your current usage is tracked in the crawls_this_month counter.
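A monthly reset anchored to a billing start date amounts to finding the most recent monthly anniversary of that date; usage counted since then is what the counter reflects. This is a hypothetical sketch of that date logic, not Lociator's actual billing code:

```python
import calendar
from datetime import date

def current_period_start(billing_start: date, today: date) -> date:
    """Hypothetical helper: most recent monthly anniversary of the
    billing start date that falls on or before today."""
    year, month = today.year, today.month
    if today.day < billing_start.day:
        # This month's anniversary hasn't arrived yet; step back a month.
        month -= 1
        if month == 0:
            month, year = 12, year - 1
    # Clamp for short months (e.g. a Jan 31 start date in February).
    day = min(billing_start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```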