Reduce concurrent requests
Countdown's API rate-limits requests, so reducing this value from 6 to 2
reduces the chance of bad API responses.

In the future, we should implement retries, and also not crash the first
time we fail to collect a price. Our scraper should be resilient: it
should continue to scrape and report the failures at the end.
OverHash committed Feb 26, 2024
1 parent 4b3deef commit 13c41c1
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/lib.rs
@@ -12,4 +12,4 @@ pub const CACHE_PATH: &str = "cache.json";
/// The amount of milliseconds to wait between performing iterations on the pages.
pub const PAGE_ITERATION_INTERVAL: Duration = Duration::from_millis(500);
/// The amount of requests to perform in parallel.
-pub const CONCURRENT_REQUESTS: i64 = 6;
+pub const CONCURRENT_REQUESTS: i64 = 2;
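The resilience goal described in the commit message (retry failed requests instead of crashing on the first bad response) could be sketched as a small generic helper. This is a minimal illustration, not code from the repository; the `retry` function name, its parameters, and the simulated flaky request are all hypothetical.

```rust
use std::thread;
use std::time::Duration;

/// Retry a fallible operation up to `max_attempts` times, sleeping
/// between attempts, instead of failing on the first error.
/// (Hypothetical helper, not part of the actual scraper.)
fn retry<T, E>(
    max_attempts: u32,
    delay: Duration,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 1;
    loop {
        match op() {
            Ok(value) => return Ok(value),
            // Out of attempts: surface the last error to the caller,
            // who can record it and keep scraping other pages.
            Err(err) if attempt >= max_attempts => return Err(err),
            Err(_) => {
                attempt += 1;
                thread::sleep(delay);
            }
        }
    }
}

fn main() {
    // Simulated flaky API call: fails twice, then succeeds.
    let mut calls = 0;
    let result: Result<&str, &str> = retry(5, Duration::from_millis(1), || {
        calls += 1;
        if calls < 3 {
            Err("bad API response")
        } else {
            Ok("price")
        }
    });
    assert_eq!(result, Ok("price"));
    assert_eq!(calls, 3);
}
```

Collecting the `Err` values from each page into a `Vec` rather than propagating them immediately would then let the scraper report all failures at the end of a run, as the commit message suggests.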