Is there a way or a tool to automatically visit all pages on my site?

Published 2019-06-23 22:54

Question:

I want to automatically visit / crawl all the pages on my site in order to generate a cache file. Is there any way or tool to do this?

Answer 1:

Just use any robot that downloads your entire site:

https://superuser.com/questions/14403/how-can-i-download-an-entire-website

For example wget:

wget -r --no-parent http://site.com/songs/


Answer 2:

You can use wget's recursive option to do this. Change example.com to your domain. The --delete-after flag removes each file once it has been fetched, so nothing is kept on disk while every page is still requested (which is all a cache warm-up needs):

wget --recursive --no-parent --domains=example.com --level=inf --delete-after http://example.com/


Answer 3:

Do you use a CMS? Do you have a list of your pages? If so, you could write a simple PHP loop that requests every page with cURL or file_get_contents():

$pages_ar = array(
    "http://mydomain.com/page1.htm",
    "http://mydomain.com/page2.htm",
    "http://mydomain.com/page3.htm",
);

foreach ($pages_ar as $page) {
    // Requesting the page is enough to make the server build its cache.
    file_get_contents($page);
}

Basic, but I hope you get the idea...



Answer 4:

surfen's method is correct, but if you want a PHP solution you can check out Symfony 2's BrowserKit component, which can be used as a stand-alone component:

https://github.com/symfony/BrowserKit
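If you don't have a ready-made list of pages, a small crawler can discover same-domain links on its own. Here is a minimal stand-alone sketch using only Python's standard library (the start URL is a placeholder, and this ignores robots.txt, rate limiting, and non-HTML content, so treat it as an illustration, not production code):

```python
# Minimal same-domain crawler sketch for warming a page cache.
# "http://example.com/" below is a placeholder start URL.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collect <a href> targets, resolved against a base URL."""

    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base, value))


def extract_links(html, base):
    """Return all absolute link targets found in an HTML string."""
    parser = LinkParser(base)
    parser.feed(html)
    return parser.links


def crawl(start_url):
    """Breadth-first visit of every same-domain page reachable from start_url."""
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue:
        url = queue.pop(0)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable pages
        for link in extract_links(html, url):
            link = link.split("#", 1)[0]  # drop in-page fragments
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen


if __name__ == "__main__":
    # Simply visiting each page triggers the server-side cache generation.
    crawl("http://example.com/")
```

Because every URL passes through the `seen` set before being queued, each page is requested exactly once even when pages link to each other.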