Often I need to download a webpage and then edit it offline. I have tried a few tools and the main feature they lack is downloading images referenced in the CSS files.
Is there a tool (for Linux) that will download everything so that the webpage will render the same offline (excluding AJAX)?
It's possible to do this through Firefox; see this forum thread:
Reference: http://www.webdeveloper.com/forum/showthread.php?t=212610
EDIT: meder is right: stock wget does not parse and download CSS images. There is, however, a patch that adds this feature: [1, 2].

UPDATE: The patch mentioned above has been merged into wget 1.12, released 22-Sep-2009.
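With wget 1.12 or later, a command along these lines should pull down the page together with its stylesheets, scripts, and the images those stylesheets reference, rewriting links so the copy renders offline (the URL here is just a placeholder):

    wget --page-requisites --convert-links --adjust-extension http://example.com/page.html

If some assets live on another host (a CDN, say), adding --span-hosts lets wget follow them there.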
I ran into the same problem the other day while working for a client. Another tool that works really well is HTTrack. It is available in a command-line version for both Windows and Linux; for Linux, prebuilt packages for most of the common distributions can be found here.
For my purposes it worked better than wget, thanks to some added features/switches that fix up the links inside the saved HTML files.
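For reference, a minimal invocation looks something like this (the URL and output directory are placeholders):

    httrack http://example.com/page.html -O ./mirror

-O sets the output directory; by default HTTrack rewrites links in the saved pages so the mirror can be browsed locally.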
wget is a great choice for you. For a bit more background: on Windows there is currently no official GNU release of wget 1.12; the official version is still 1.11, and wget 1.11 cannot download images/fonts referenced in CSS files. Fortunately, you can find a 1.14 build for Windows on this page, which fixes these problems:
http://opensourcepack.blogspot.com/2010/05/wget-112-for-windows.html
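If you are not sure which build you are running, the version string is easy to check:

    wget --version

Anything at 1.12 or newer should handle images referenced from CSS.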
The current version of Opera (12) can save a page as 'HTML with images'.
In doing so, Opera also downloads images referenced in the CSS files and rewrites the image URLs in the CSS accordingly.
In Firefox:
File->Save Page As->Web Page, Complete
This saves all the JavaScript, images, and CSS. Nothing else required :)