# Surreal Crawler
Mapping [my website](https://oliveratkinson.net) on 8/26/2024 with a budget of 1000 (it crawls 1000 sites, so many more links than that are actually discovered) took 1m9s.

This includes the crawl, loading the results into the database, and linking the sites together, all against a locally hosted SurrealDB instance.

This run created 4299 site records with 23286 links between them. (It found my Git site, which really bolsters those numbers.)
## Install / Build
* You will need Rust to compile the crawler: [rustup.rs](https://rustup.rs).
* You need Python 3 (preinstalled on most Linux distros) and Poetry for dependency management.
* Install `pipx` and `python3`.
* Then: `pipx install poetry`
* Then: `poetry install` to install the project dependencies.
* You need to install [SurrealDB](https://surrealdb.com). (The sketch after this list shows one way to run through all of these steps.)
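Putting those steps together, a setup session might look roughly like the following. This is a hedged sketch, not the project's official instructions: the rustup and SurrealDB one-liners are the installers those projects publish, and the `apt` line assumes a Debian/Ubuntu system, so substitute your own package manager where needed.

```sh
# Install Rust via rustup (installer published at rustup.rs)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Install Python 3 and pipx (Debian/Ubuntu shown; use your distro's package manager)
sudo apt install python3 pipx

# Install Poetry with pipx, then the project's Python dependencies
pipx install poetry
poetry install

# Install SurrealDB (installer published at surrealdb.com)
curl -sSf https://install.surrealdb.com | sh
```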
## Use
Just run `./crawl.sh {url}` and it will start crawling. You can tweak the budget inside [crawl.sh](https://git.oliveratkinson.net/Oliver/internet_mapper/src/branch/main/crawl.sh) if you want.
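For example, a full session might look like the sketch below. The `surreal start` flags and credentials are assumptions (check how crawl.sh expects to reach the database, and note that SurrealDB's CLI flags have changed between versions), and the budget variable name is hypothetical, since the script defines its own.

```sh
# Start a local in-memory SurrealDB instance (flags and credentials are
# assumptions; they have changed across SurrealDB versions)
surreal start --user root --pass root memory &

# Optionally edit the crawl budget inside crawl.sh first
# (hypothetical variable name; check the script for the real one):
#   BUDGET=1000

# Start crawling from a seed URL
./crawl.sh https://oliveratkinson.net
```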
You can also prefix the command with `time` to benchmark a run, for example: `time ./crawl.sh https://discord.com`.