# Surreal Crawler
Crawls sites, saving every discovered link to a SurrealDB database. It then repeatedly takes batches of 100 uncrawled links until the crawl budget is reached, and stores the content of each fetched site in a MinIO object store.
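
The batch loop maps naturally onto a SurrealDB query. Below is a minimal Rust sketch of it, assuming a `link` table with a `crawled` flag, a local SurrealDB instance, and a budget of 1,000 pages; the table name, field name, and budget are illustrative assumptions, not fixed by this project.

```rust
use surrealdb::engine::remote::ws::Ws;
use surrealdb::Surreal;

#[derive(serde::Deserialize)]
struct Link {
    url: String,
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Authentication is omitted here; whether it is needed depends on
    // how the SurrealDB server is configured.
    let db = Surreal::new::<Ws>("127.0.0.1:8000").await?;
    db.use_ns("crawler").use_db("crawler").await?;

    let budget = 1_000; // total pages to crawl (hypothetical value)
    let mut crawled = 0;

    while crawled < budget {
        // Take the next batch of up to 100 uncrawled links.
        let mut res = db
            .query("SELECT url FROM link WHERE crawled = false LIMIT 100")
            .await?;
        let batch: Vec<Link> = res.take(0)?;
        if batch.is_empty() {
            break; // frontier exhausted before the budget was spent
        }
        for link in batch {
            // Here the crawler would fetch link.url, extract new links
            // into the `link` table, upload the body to MinIO, and mark
            // this link as crawled.
            let _ = link;
            crawled += 1;
        }
    }
    Ok(())
}
```
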
### TODO
- [ ] Domain filtering - prevent the crawler from wandering onto alternate versions of Wikipedia.
- [ ] Conditionally save content - decide based on filename or file contents whether a page is worth storing.
- [ ] GUI / TUI ?
- [ ] Better asynchronous fetching of sites. Currently all requests happen serially; see the sketch below for one way to parallelize them.
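
As a starting point for that last item, here is a hedged sketch of bounded-concurrency fetching with `reqwest` and `futures`; the `fetch_batch` helper and the limit of 16 in-flight requests are assumptions, not part of the current code.

```rust
use futures::stream::{self, StreamExt};

/// Fetch a batch of URLs concurrently, returning each URL paired with
/// its body (or None if the request failed).
async fn fetch_batch(urls: Vec<String>) -> Vec<(String, Option<String>)> {
    let client = reqwest::Client::new();
    stream::iter(urls)
        .map(|url| {
            let client = client.clone();
            async move {
                // Errors are flattened to None so that one bad URL
                // does not abort the whole batch.
                let body = match client.get(&url).send().await {
                    Ok(resp) => resp.text().await.ok(),
                    Err(_) => None,
                };
                (url, body)
            }
        })
        .buffer_unordered(16) // at most 16 requests in flight at once
        .collect()
        .await
}
```

`buffer_unordered` keeps the request pipeline full without flooding the target servers, which fits the existing batches-of-100 loop: each batch from SurrealDB could simply be passed to `fetch_batch` instead of being iterated serially.
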