I'm trying to write my first Perl program. If you think that Perl is a bad language for the task at hand, tell me which language would solve it better.
The program tests connectivity between a given machine and a remote Apache server. First the program requests the directory listing from the Apache server, then it parses the list and downloads all files one by one. Should there be a problem with a file (the connection resets before the specified Content-Length is reached), this should be logged and the next file retrieved. There is no need to save the files or even check their integrity; I only need to log the time it takes to complete and every case where the connection resets.
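To make the logging part concrete, here is the rough timing/logging skeleton I have in mind (Time::HiRes for sub-second timing; the log file name and format are placeholders I made up, and the actual HTTP fetch is what the questions below are about):

    use strict;
    use warnings;
    use Time::HiRes qw(gettimeofday tv_interval);

    # Timing/logging skeleton only; the real fetch goes where the comment is.
    open my $log, '>>', 'connectivity.log' or die "cannot open log: $!";

    my $start = [gettimeofday];
    # ... fetch one file here ...
    my $elapsed = tv_interval($start);

    printf {$log} "%s\t%.3f s\t%s\n", scalar(localtime), $elapsed, 'somefile.bin';
    close $log;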
To retrieve the list of links from the Apache-generated directory index, I plan to use a regexp similar to
/href=\"([^\"]+)\"/
Admittedly, the regexp is not debugged yet.
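For illustration, this is how I imagine using it (tested only in my head). The sample line is my guess at what default Apache autoindex markup looks like, and I assume I will also have to filter out Apache's column-sorting links (href="?C=N;O=D" and the like) and directory entries:

    my $sample = '<tr><td><a href="backup-2024.tar.gz">backup-2024.tar.gz</a></td></tr>';
    my @links;
    while ($sample =~ /href="([^"]+)"/g) {
        push @links, $1;    # collect every captured href
    }
    # Drop the sort links ("?C=N;O=D", ...) and anything ending in "/" (directories).
    @links = grep { !/^\?/ && !m{/$} } @links;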
What is the "reference" way to make an HTTP request from Perl? I googled and found examples using many different libraries, some of them commercial. I need something that can detect disconnections (timeouts or TCP resets) and handle them.
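The closest thing I found so far is LWP::UserAgent (free, part of libwww-perl); whether that is the recommended choice, and whether the check below is the right way to catch a mid-transfer reset, is exactly what I am unsure about:

    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua  = LWP::UserAgent->new(timeout => 30);    # timeout value picked arbitrarily
    my $url = 'http://server.example.com/files/somefile.bin';    # placeholder URL

    my $response = $ua->get($url);

    if (!$response->is_success) {
        # Covers timeouts and connections that fail before any body arrives.
        warn "$url failed: ", $response->status_line, "\n";
    }
    else {
        # If the connection dropped mid-transfer, the body I received should be
        # shorter than the advertised Content-Length.
        my $expected = $response->header('Content-Length');
        my $got      = length $response->content;
        if (defined $expected && $got != $expected) {
            warn "$url truncated: got $got of $expected bytes\n";
        }
    }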
Another question: how do I store everything captured by my regexp during a global search as a list of strings, with minimal coding effort?
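Ideally something as short as the list-context match below, if that is indeed the idiomatic way (the $html value here is just a stand-in for the fetched index page):

    my $html  = '<a href="a.bin">a.bin</a> <a href="b.bin">b.bin</a>';
    my @files = $html =~ /href="([^"]+)"/g;    # list context: every capture from every match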