I want to automatically visit / crawl all the pages on my site in order to generate a cache file. Is there any way or tool to do this?
surfen's method is correct, but if you want a PHP solution you can check out Symfony 2's BrowserKit component, which can be used as a standalone component:
https://github.com/symfony/BrowserKit
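A minimal sketch of crawling a page with the standalone component is shown below. Note that it uses the HttpBrowser class from newer releases of BrowserKit together with the HttpClient and DomCrawler components (the Symfony 2-era API required subclassing the abstract Client instead), and example.com is a placeholder, so treat it purely as an illustration of the idea:

    <?php
    require __DIR__ . '/vendor/autoload.php';

    use Symfony\Component\BrowserKit\HttpBrowser;
    use Symfony\Component\HttpClient\HttpClient;

    // Fetch the start page, then follow every link found on it so each
    // linked page gets rendered (and therefore cached) by the server.
    $browser = new HttpBrowser(HttpClient::create());
    $crawler = $browser->request('GET', 'http://example.com/');

    foreach ($crawler->filter('a')->links() as $link) {
        $browser->click($link);
    }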
You can use wget's recursive option to do this. Change example.com to your domain:
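A minimal sketch of that command, with example.com standing in for your own domain (--delete-after discards the downloaded copies, since the point is only to make the server render, and therefore cache, every page):

    wget -r --delete-after http://example.com/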
Do you use a CMS? Do you have a list of your pages? You could write a simple PHP loop to load all pages using cURL or PHP's fopen(). Basic, but I hope you get the idea...
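As a minimal sketch of that loop (the URL list here is a placeholder; in practice you would pull it from your CMS, sitemap, or database):

    <?php
    // Warm the cache by requesting every page once.
    $urls = [
        'http://example.com/',
        'http://example.com/about',
        'http://example.com/contact',
    ];

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of echoing it
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
        curl_exec($ch);                                 // the request itself triggers cache generation
        $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);
        echo "$url -> $status\n";
    }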
Just use any robot that downloads your entire site:
https://superuser.com/questions/14403/how-can-i-download-an-entire-website
For example wget:
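One possible invocation, again with example.com standing in for your own domain (--mirror turns on recursion and timestamping, --page-requisites also fetches assets such as CSS and images):

    wget --mirror --page-requisites http://example.com/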