What's a good way to download HTTP URLs (e.g. http://0.0.0.0/foo.htm) in C++ on Linux? I strongly prefer something asynchronous. My program will have an event loop that repeatedly initiates multiple (very small) downloads and acts on them when they finish (either by polling or being notified somehow). I would rather not have to spawn multiple threads/processes to accomplish this; that shouldn't be necessary.
Should I look into libraries like libcurl? I suppose I could implement it manually with non-blocking TCP sockets and select() calls, but that would likely be less convenient.
Libcurl is the way to go. See http://curlpp.org for C++ bindings and an excellent set of tutorials.
You can use boost::asio to perform asynchronous I/O operations; the Boost documentation includes an example of an asynchronous HTTP client.
Have you considered Qt's network module? It provides classes for asynchronous downloads, for example QNetworkAccessManager.
A QThread, when run, can have its own event loop. Inside the QThread you can create an instance of QHttp, and since QHttp relies on the Qt event loop, you get asynchronous HTTP calls without blocking the main thread. Note also that inter-thread communication in Qt is very easy.
Head straight to http://doc.qt.nokia.com and look at the classes' documentation to understand this better.