The async example is useful, but being new to Rust and Tokio, I am struggling to work out how to make N requests at once, using URLs from a vector, and create an iterator of the response HTML for each URL as a string.
How could this be done?
As of reqwest 0.9:
use futures::{stream, Future, Stream}; // 0.1.26
use reqwest::r#async::Client; // 0.9.14
use tokio; // 0.1.18

type Result<T> = std::result::Result<T, Box<dyn std::error::Error>>;

const PARALLEL_REQUESTS: usize = 2;

fn main() -> Result<()> {
    let client = Client::new();

    let urls = vec!["https://api.ipify.org", "https://api.ipify.org"];

    let bodies = stream::iter_ok(urls)
        .map(move |url| {
            client
                .get(url)
                .send()
                .and_then(|res| res.into_body().concat2().from_err())
        })
        .buffer_unordered(PARALLEL_REQUESTS);

    let work = bodies
        .for_each(|b| {
            println!("Got {} bytes", b.len());
            Ok(())
        })
        .map_err(|e| panic!("Error while processing: {}", e));

    tokio::run(work);
    Ok(())
}
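For reference, a Cargo.toml dependency section matching the versions pinned in the comments above might look like this (a sketch; newer versions of these crates have substantially different APIs):

[dependencies]
futures = "0.1.26"
reqwest = "0.9.14"
tokio = "0.1.18"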
stream::iter_ok(urls)

stream::iter_ok: Take a collection of strings and convert it into a Stream.
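As a standalone illustration (my own sketch, not part of the answer's code, with placeholder URLs), iter_ok wraps any IntoIterator in a Stream that never fails; Future::wait drives it to completion without a runtime:

use futures::{stream, Future, Stream};

fn main() {
    // The error type parameter of iter_ok is normally inferred from
    // later combinators; here it must be spelled out.
    let urls = vec!["https://a.example", "https://b.example"];
    let collected = stream::iter_ok::<_, ()>(urls).collect().wait().unwrap();
    assert_eq!(collected, vec!["https://a.example", "https://b.example"]);
}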
.and_then(|res| res.into_body().concat2().from_err())

Stream::concat2, Stream::from_err: Take each response's body stream and collect it all into one big chunk.
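If it helps to see concat2 in isolation, here is a minimal sketch (mine, with plain byte vectors standing in for body chunks) that folds every item a stream yields into a single value:

use futures::{stream, Future, Stream};

fn main() {
    // concat2 concatenates the stream's items; for a response body
    // that means one contiguous buffer of all the bytes.
    let chunks = stream::iter_ok::<_, ()>(vec![vec![1u8, 2], vec![3], vec![4, 5]]);
    let body = chunks.concat2().wait().unwrap();
    assert_eq!(body, vec![1, 2, 3, 4, 5]);
}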
.buffer_unordered(PARALLEL_REQUESTS)

Stream::buffer_unordered: Convert a stream of futures into a stream of those futures' values, executing the futures in parallel.
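A self-contained sketch of the semantics (my own, using already-completed futures so no runtime is needed):

use futures::{future, stream, Future, Stream};

fn main() {
    // Each stream item is itself a future; buffer_unordered keeps up
    // to two of them in flight and yields values as they complete.
    let results = stream::iter_ok::<_, ()>(vec![1, 2, 3])
        .map(|n| future::ok::<_, ()>(n * 10))
        .buffer_unordered(2)
        .collect()
        .wait()
        .unwrap();
    // With ready futures everything resolves immediately; with real
    // I/O the output order follows completion, not input.
    // Stream::buffered is the order-preserving alternative.
    assert_eq!(results.len(), 3);
}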
let work = bodies.for_each(|b| {
    println!("Got {} bytes", b.len());
    Ok(())
});

Stream::for_each: Convert the stream back into a future.
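Since the question asks for the bodies as strings you can iterate over, here is a variant sketch (mine, not part of the original answer) that decodes each concatenated Chunk with String::from_utf8_lossy, collects everything into a Vec<String>, and uses tokio 0.1's Runtime::block_on to get the value back out of the event loop:

use futures::{stream, Future, Stream}; // 0.1.26
use reqwest::r#async::Client; // 0.9.14
use tokio; // 0.1.18

const PARALLEL_REQUESTS: usize = 2;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new();
    let urls = vec!["https://api.ipify.org", "https://api.ipify.org"];

    let work = stream::iter_ok(urls)
        .map(move |url| {
            client
                .get(url)
                .send()
                .and_then(|res| res.into_body().concat2().from_err())
                // Chunk dereferences to [u8]; lossy decoding avoids a
                // panic on non-UTF-8 bytes.
                .map(|chunk| String::from_utf8_lossy(&chunk).into_owned())
        })
        .buffer_unordered(PARALLEL_REQUESTS)
        // Future<Item = Vec<String>, Error = reqwest::Error>
        .collect();

    let mut runtime = tokio::runtime::Runtime::new()?;
    let bodies = runtime.block_on(work)?;

    // `bodies` is an ordinary Vec, so `bodies.iter()` gives the
    // iterator of HTML strings the question asks for.
    for body in &bodies {
        println!("{}", body);
    }
    Ok(())
}

Note that buffer_unordered yields bodies in completion order; swap in Stream::buffered if the output must line up with the input URLs.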