require 'net/http'

urls = [
  {'link' => 'http://www.google.com/'},
  {'link' => 'http://www.yandex.ru/'},
  {'link' => 'http://www.baidu.com/'}
]

urls.each do |u|
  u['content'] = Net::HTTP.get(URI.parse(u['link']))
end

print urls
This code works in a synchronous style: first request, then the second, then the third. I would like to send all requests asynchronously and print urls after all of them are done.
What is the best way to do it? Is Fiber suited for that?
It depends on what you want to do afterwards. You can do it with simple threads:
see: http://snipplr.com/view/3966/simple-example-of-threading-in-ruby/
Here's an example using threads.
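A minimal sketch of that approach, assuming the urls array from the question: one thread per request, with the main thread sleeping until every URL has been fetched.

require 'net/http'

urls.each do |u|
  Thread.new do
    u['content'] = Net::HTTP.get(URI.parse(u['link']))
    puts "Fetched #{u['link']}"
    # exit ends the whole process once every hash has its content
    exit if urls.all? { |entry| entry.key?('content') }
  end
end
sleep  # keep the main thread alive until a worker calls exit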
This can be done with the C library cURL. A Ruby binding for that library exists (the curb gem), but it doesn't seem to support this functionality out of the box. However, it looks like there is a patch adding/fixing it (example code is available on the page). I know this doesn't sound great, but it might be worth a try if there aren't any better suggestions.
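If the binding's multi interface is available (an assumption; as noted, it may require the patch), a sketch with curb's Curl::Multi could look like this:

require 'curb'

multi = Curl::Multi.new
urls.each do |u|
  easy = Curl::Easy.new(u['link'])
  easy.on_complete { |curl| u['content'] = curl.body_str }
  multi.add(easy)
end
multi.perform  # drives all transfers in parallel; returns when every one is done
print urls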
The work_queue gem is the easiest way to perform tasks asynchronously and concurrently in your application.
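A sketch, assuming the gem's WorkQueue API (a thread pool with enqueue_b and join):

require 'net/http'
require 'work_queue'

wq = WorkQueue.new 2  # cap the pool at two worker threads
urls.each do |u|
  wq.enqueue_b do
    u['content'] = Net::HTTP.get(URI.parse(u['link']))
  end
end
wq.join  # block until every queued task has finished
print urls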
I have written an in-depth blog post about this topic which includes an answer that is somewhat similar to the one August posted, but with a few key differences: 1) it keeps track of all thread references in a "threads" array, and 2) it uses the "join" method to wait for the threads at the end of the program.
The full tutorial (and some performance information) is available here: https://zachalam.com/performing-multiple-http-requests-asynchronously-in-ruby/
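In outline, that pattern looks like this (a sketch using the urls array from the question, not the blog's verbatim code):

require 'net/http'

threads = []
urls.each do |u|
  threads << Thread.new do
    u['content'] = Net::HTTP.get(URI.parse(u['link']))
  end
end
threads.each(&:join)  # wait for every request to finish
print urls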
You could have a different thread execute each of the Net::HTTP.get calls and then wait for all of the threads to finish, as in the sketch below. BTW, printing urls will print both the link and the content.
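For example, a compact variant using Thread#value, which joins each thread and returns its block's result:

require 'net/http'

threads = urls.map do |u|
  Thread.new { Net::HTTP.get(URI.parse(u['link'])) }
end
urls.zip(threads).each do |u, t|
  u['content'] = t.value  # waits for the thread, then returns the fetched body
end
print urls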