High-performance web crawler for discovering and mapping website structure. Use when users ask to crawl a website, map site structure, discover pages, find all URLs on a site, analyze link relationships, or generate site reports. Supports sitemap discovery, checkpoint/resume, rate limiting, and HTML report generation.
```
/plugin marketplace add leobrival/topographic-studio-plugins
/plugin install crawler@topographic-studio-plugins
```

This skill inherits all available tools. When active, it can use any tool Claude has access to.
High-performance web crawler with a TypeScript/Bun frontend and a Go backend for discovering and mapping website structure.
Run the crawler from the scripts directory:

```shell
cd ~/.claude/scripts/crawler
bun src/index.ts <URL> [options]
```
| Option | Short | Default | Description |
|---|---|---|---|
| --depth | -D | 2 | Maximum crawl depth |
| --workers | -w | 20 | Concurrent workers |
| --rate | -r | 2 | Rate limit (requests/second) |
| --profile | -p | - | Use a preset profile (fast/deep/gentle) |
| --output | -o | auto | Output directory |
| --sitemap | -s | true | Use sitemap.xml for discovery |
| --domain | -d | auto | Allowed domain (extracted from URL) |
| --debug | - | false | Enable debug logging |
Three preset profiles for common use cases:
| Profile | Workers | Depth | Rate | Use Case |
|---|---|---|---|---|
| fast | 50 | 3 | 10 | Quick site mapping |
| deep | 20 | 10 | 3 | Thorough crawling |
| gentle | 5 | 5 | 1 | Respect server limits |
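The table above fully determines each preset; a minimal sketch of how a profile name could resolve to crawl options (a hypothetical illustration — the real crawler loads its presets from its profiles config directory, and the `CrawlProfile` and `resolveProfile` names are assumptions, not its actual API):

```typescript
// Hypothetical mapping of the preset profiles to crawl options.
// Values mirror the profiles table in the documentation above.
interface CrawlProfile {
  workers: number; // concurrent workers
  depth: number;   // maximum crawl depth
  rate: number;    // requests per second
}

const PROFILES: Record<string, CrawlProfile> = {
  fast:   { workers: 50, depth: 3,  rate: 10 },
  deep:   { workers: 20, depth: 10, rate: 3 },
  gentle: { workers: 5,  depth: 5,  rate: 1 },
};

function resolveProfile(name: string): CrawlProfile {
  const profile = PROFILES[name];
  if (!profile) {
    throw new Error(`Unknown profile: ${name} (expected fast, deep, or gentle)`);
  }
  return profile;
}
```

Explicit flags such as `--rate` can then simply override whichever fields the chosen profile supplies.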
```shell
bun src/index.ts https://example.com
bun src/index.ts https://example.com --depth 5 --workers 30 --rate 5
bun src/index.ts https://example.com --profile fast
bun src/index.ts https://example.com --profile gentle
```
The crawler generates two files in the output directory: a JSON results file and an HTML report. The JSON file has the following structure:
```json
{
  "stats": {
    "pages_found": 150,
    "pages_crawled": 147,
    "external_links": 23,
    "errors": 3,
    "duration": 45.2
  },
  "results": [
    {
      "url": "https://example.com/page",
      "title": "Page Title",
      "status_code": 200,
      "depth": 1,
      "links": ["..."],
      "content_type": "text/html"
    }
  ]
}
```
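Because the output is plain JSON, it is easy to post-process. A small sketch that derives summary figures from the `stats` block shown above (the interface and function names are illustrative, not part of the crawler):

```typescript
// Minimal sketch: summarize a crawl results object whose schema
// matches the JSON structure documented above.
interface CrawlStats {
  pages_found: number;
  pages_crawled: number;
  external_links: number;
  errors: number;
  duration: number; // seconds
}

interface CrawlResult {
  url: string;
  title: string;
  status_code: number;
  depth: number;
  links: string[];
  content_type: string;
}

interface CrawlOutput {
  stats: CrawlStats;
  results: CrawlResult[];
}

function summarize(output: CrawlOutput): { successRate: number; pagesPerSecond: number } {
  const { stats } = output;
  return {
    // Fraction of discovered pages that were actually crawled.
    successRate: stats.pages_crawled / stats.pages_found,
    // Effective crawl throughput over the whole run.
    pagesPerSecond: stats.pages_crawled / stats.duration,
  };
}
```

With the sample numbers above (147 of 150 pages crawled in 45.2 s), this yields a success rate of 0.98.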
If the target server is rejecting or throttling requests, reduce the rate limit or use the gentle profile:

```shell
bun src/index.ts <url> --rate 1
# or
bun src/index.ts <url> --profile gentle
```
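A requests-per-second cap like `--rate` is commonly implemented as a token bucket. The following is a hypothetical sketch of that technique, not the crawler's actual Go implementation:

```typescript
// Token-bucket sketch of a requests-per-second limiter.
// Hypothetical illustration; the crawler's engine may work differently.
class RateLimiter {
  private tokens: number;
  private lastRefill: number;

  constructor(private ratePerSecond: number, now: number = Date.now()) {
    // Start with a full bucket so an idle crawler can burst briefly.
    this.tokens = ratePerSecond;
    this.lastRefill = now;
  }

  // Returns true if a request may be sent now, consuming one token.
  // Tokens refill continuously at ratePerSecond, capped at the bucket size.
  tryAcquire(now: number = Date.now()): boolean {
    const elapsedSeconds = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.ratePerSecond, this.tokens + elapsedSeconds * this.ratePerSecond);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

Workers that fail to acquire a token simply wait and retry, which smooths traffic to the cap regardless of how many workers run concurrently.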
The TypeScript frontend auto-compiles the Go binary. If compilation fails, build it manually:

```shell
cd ~/.claude/scripts/crawler/engine
go build -o crawler main.go
```
If a crawl is running slowly, reduce the depth or increase the number of workers:

```shell
bun src/index.ts <url> --depth 1 --workers 50
```
For detailed architecture, Go engine specifications, and code conventions, see reference.md.
- ~/.claude/scripts/crawler/
- ~/.claude/scripts/raycast/crawl-website.sh
- ~/.claude/scripts/crawler/config/profiles/