I use wget -p $url to get all the files on a webpage so that I can build a list of them. But for some URLs it turns out that only index.html can be fetched by wget. Is there a way to get a list of the files at a specific URL with wget or cURL? Do I need to check the request and response headers?
Some servers do not allow directory listings at all, and if there is a default document (such as index.html) in the directory, it is served instead, so you can't browse the listing that way either.
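You can check which case you're in by requesting the directory URL directly and looking at what comes back: an auto-generated listing usually carries a title like "Index of /files", while a default document will have its own title. A quick hedged check, using a placeholder URL:

    # Fetch the directory URL; an auto-generated listing typically
    # has a title like "Index of /files" (URL is a placeholder).
    curl -s http://example.com/files/ | grep -i '<title>'

    # Inspect the response headers too (status code, content type).
    curl -sI http://example.com/files/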
In that case you need a spider that parses the HTML, extracts every path, file, and link it references, and rebuilds the directory structure of the resources declared in the pages. Then you can download those files; see the sketch below.
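wget itself can act as that spider: recursive spider mode follows every link it finds without saving anything, so you can harvest the visited URLs from its log. A minimal sketch, assuming a placeholder URL and plain HTML pages:

    # Crawl recursively (-r) without ascending past the start
    # directory (-np), in spider mode (nothing is downloaded).
    # Each visited URL appears on a log line starting with "--",
    # as the third whitespace-separated field.
    wget --spider -r -np http://example.com/files/ 2>&1 \
      | grep '^--' \
      | awk '{ print $3 }' \
      | sort -u > url-list.txt

Once you have url-list.txt, you can feed it back to wget with -i url-list.txt to download the files themselves.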