I would like to write a program that extracts the URLs of websites visited by a system (an IP address) through packet capture. I think the URL will be in the data section (i.e. not in any of the headers: Ethernet / IP / TCP-UDP). Such programs are sometimes referred to as HTTP sniffers; I'm not supposed to use any existing tool. As a beginner, I've just gone through this basic sniffer program: sniffex.c. Can anyone please tell me in which direction I should proceed?
Note: In the info below, assume that GET also includes POST and the other HTTP methods too.
It's definitely going to be a lot more work than looking at one packet, but if you capture the entire stream you should be able to get it from the HTTP headers sent out.
Try looking at the Host header if that's provided, and also what is actually requested by the GET. The GET can be either a full URL or just a file name on the server.
Also note that this has nothing to do with getting a domain name from an IP address. If you want the domain name, you have to dig into the data.
Quick example on my machine, from Wireshark:
Another example, not from a browser, and with only a path in the GET:
In the second example, the actual URL is http://example.com/ccnet/XmlStatusReport.aspx
I was researching something similar and came across this. It could be a good start if you are using Linux: justniffer.
http://justniffer.sourceforge.net/
There is also a nice http traffic grab python script that would help if you are looking to get information from HTTP requests.
If you are using Linux, you can add an iptables rule that matches packets containing HTTP GET requests, so the URL can be pulled out.
The rule's logic: for each packet going out on port 80, check whether the payload contains a GET request; if it does, retrieve the URL and save it.
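A sketch of such a rule, using the iptables string match and the NFLOG target so a userspace listener can pick the matched packets up (the netlink group number is arbitrary, and the rule needs root to install):

```shell
# Match outbound TCP port-80 packets whose payload contains "GET "
# (Boyer-Moore string search) and copy them to netlink group 1,
# where a userspace program can read them and parse out the URL.
iptables -A OUTPUT -p tcp --dport 80 \
         -m string --string "GET " --algo bm \
         -j NFLOG --nflog-group 1
```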
This approach works for plain HTTP on port 80. It will not work for HTTPS: the payload is encrypted, so the GET line and headers cannot be read from the packet.
No, there is not enough information. A single IP can correspond to any number of domain names, and each of those domains could have literally an infinite number of URLs.
However, look at gethostbyaddr(3) to see how to do a reverse dns lookup on the ip to at least get the canonical name for that ip.
Update: as you've edited the question, @aehiilrs has a much better answer.
What you might want is a reverse DNS lookup. Call gethostbyaddr for that.
Have a look at PasTmon. http://pastmon.sourceforge.net