recover from low sites to crawl #17

Closed
opened 2025-08-05 12:49:41 +00:00 by Oliver · 1 comment
Owner

src/main.rs Lines 136 to 147 in f3a51065b5
let uncrawled = get_next(&db.clone(), &config).await;
match uncrawled {
    Some(site) => {
        process(site, db.clone(), reqwest.clone()).await;
        SITES_CRAWLED.add(1, &[]);
        // Somehow this write doesn't hang on the while's read?
        let mut c = crawled.write().await;
        *c += 1;
    },
    None => {
        warn!("fn::get_next() returned None");
        return;
    }
}

If the number of available sites drops too low, the crawler won't recover; it just stops.

https://git.oliveratkinson.net/Oliver/internet_mapper/src/commit/f3a51065b555a81ea3d784cc8c75a203abd7fce6/src/main.rs#L136-L147
Author
Owner

should be fixed in 95b8af0356


Reference: Oliver/internet_mapper#17