Question:
Does anyone know a way to get all the URLs in a website using JavaScript? I only need the links starting with the same domain name; there's no need to consider other links.
Answer 1:
Well this will get all the same-host links on the page:
var urls = [];
for (var i = document.links.length; i-- > 0;)
    if (document.links[i].hostname === location.hostname)
        urls.push(document.links[i].href);
If by "site" you mean you want to recursively get the links inside linked pages as well, that's a bit trickier. You'd have to download each link into a new document (for example into a hidden <iframe>), and then in its onload handler check the iframe's own document for more links to add to the list to fetch. You'd need to keep a lookup of which URLs you'd already spidered to avoid fetching the same document twice. It probably wouldn't be very fast.
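A minimal sketch of that iframe approach, assuming the script runs on a page of the same site (so the iframe documents are same-origin and readable); the crawl, visited, and found names are just illustrative, not part of the original answer:

// Recursively spider same-host pages through hidden iframes.
var visited = {};   // URLs already fetched, to avoid loops
var found = [];     // every same-host URL discovered

function crawl(url, done) {
    if (visited[url]) { done(); return; }
    visited[url] = true;

    var frame = document.createElement('iframe');
    frame.style.display = 'none';
    frame.onload = function () {
        var links = frame.contentDocument.links;
        var queue = [];
        for (var i = 0; i < links.length; i++) {
            if (links[i].hostname === location.hostname) {
                found.push(links[i].href);
                queue.push(links[i].href);
            }
        }
        document.body.removeChild(frame);
        // Spider the newly found pages one after another.
        (function next() {
            var nextUrl = queue.shift();
            if (nextUrl) crawl(nextUrl, next);
            else done();
        })();
    };
    frame.src = url;
    document.body.appendChild(frame);
}

crawl(location.href, function () {
    console.log(found);
});

Note that pages served with X-Frame-Options or framed from a different origin won't be readable this way, and loading every page in sequence is slow; this is only meant to show the shape of the technique.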
Answer 2:
Using jQuery, you can find all the links on the page that match a specific criterion:
$("a[href=^domain.com]").each(function(){
alert($(this).attr("href"));
});
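If you don't want to hard-code the domain, a variant that keys off the current page's hostname and collects the matches into an array instead of alerting them (this version is an assumption, not part of the original answer):

// Collect every same-host link on the page into an array.
var urls = [];
$("a").each(function () {
    if (this.hostname === location.hostname) {
        urls.push(this.href);
    }
});
console.log(urls);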