# Surreal Crawler
Crawls sites, saving all discovered links to a SurrealDB database. It then takes batches of 100 uncrawled links until the crawl budget is reached. The content of each crawled site is saved to MinIO.
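
Below is a minimal, hypothetical sketch of how one batch of uncrawled links could be pulled with the SurrealDB Rust client. The `links` table, the `crawled` flag, and the connection/auth details are assumptions for illustration, not the project's actual schema.

```rust
use serde::Deserialize;
use surrealdb::engine::remote::ws::Ws;
use surrealdb::opt::auth::Root;
use surrealdb::Surreal;

// Hypothetical record shape; the real schema may differ.
#[derive(Deserialize)]
struct Link {
    url: String,
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Connect to a local SurrealDB instance (address and credentials assumed).
    let db = Surreal::new::<Ws>("127.0.0.1:8000").await?;
    db.signin(Root { username: "root", password: "root" }).await?;
    db.use_ns("crawler").use_db("crawler").await?;

    // Grab one batch of up to 100 links that have not been crawled yet.
    let mut response = db
        .query("SELECT url FROM links WHERE crawled = false LIMIT 100")
        .await?;
    let batch: Vec<Link> = response.take(0)?;

    println!("next batch: {} links", batch.len());
    Ok(())
}
```

Each batch would then be fetched, the page content pushed to MinIO, and the links marked as crawled, repeating until the budget runs out.
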
### TODO
- [ ] Domain filtering - prevent the crawler from wandering onto alternate versions of Wikipedia.
- [ ] Conditionally save content - based on filename or file contents
- [x] GUI / TUI ? - Grafana
- [x] Better asynchronous fetching of sites (previously everything happened serially).
- [ ] Allow for storing asynchronously
### Benchmarks

3/19/25: Took 20 min to crawl 100 pages. This meant we stored 100 pages, 142,997 URLs, and 1,425,798 links between the two.