This question has been asked many times before. However, some of the APIs have changed over time, and I want to know a good way to implement this today.
The best way to do this would be to use the Google Custom Search API. However, https://developers.google.com/custom-search/json-api/v1/overview states that only 100 search queries per day are free. I will need more than that, and I don't want to pay for it.
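For reference, this is roughly what a Custom Search JSON API call looks like from Java; a minimal sketch only, and the API key and search engine ID below are placeholders you would have to obtain from Google:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class CustomSearchExample {
    public static void main(String[] args) throws Exception {
        String apiKey = "YOUR_API_KEY";        // placeholder
        String cx = "YOUR_SEARCH_ENGINE_ID";   // placeholder
        String query = URLEncoder.encode("Natural Language Processing", "UTF-8");

        // Custom Search JSON API endpoint; the response is JSON
        URL url = new URL("https://www.googleapis.com/customsearch/v1?key=" + apiKey
                + "&cx=" + cx + "&q=" + query);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // Read the raw JSON response; the "items" array holds title, link and snippet
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder json = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            json.append(line);
        }
        in.close();

        System.out.println(json);
    }
}

This works fine, but it runs into the 100-queries-per-day limit mentioned above, which is exactly what I am trying to avoid.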
I tried using plain REST calls, but the response is mostly JavaScript and I can't find the result data I need in it.
I also tried libraries like http://jsoup.org/, but even that response didn't seem to contain the information I need.
See this Jsoup Crawler example:
http://www.mkyong.com/java/jsoup-send-search-query-to-google/
In Java I use crawler4j:
https://code.google.com/p/crawler4j/
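A minimal sketch of how a crawler4j crawler is wired up, based on the project's own examples. The seed URL and storage folder are placeholders, and note that the shouldVisit signature differs slightly between crawler4j versions (older releases use shouldVisit(WebURL url)):

import edu.uci.ics.crawler4j.crawler.CrawlConfig;
import edu.uci.ics.crawler4j.crawler.CrawlController;
import edu.uci.ics.crawler4j.crawler.Page;
import edu.uci.ics.crawler4j.crawler.WebCrawler;
import edu.uci.ics.crawler4j.fetcher.PageFetcher;
import edu.uci.ics.crawler4j.parser.HtmlParseData;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtConfig;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtServer;
import edu.uci.ics.crawler4j.url.WebURL;

public class MyCrawler extends WebCrawler {

    // Decide which links to follow; here, any http(s) URL
    @Override
    public boolean shouldVisit(Page referringPage, WebURL url) {
        return url.getURL().toLowerCase().startsWith("http");
    }

    // Called for every fetched page; extract whatever you need here
    @Override
    public void visit(Page page) {
        if (page.getParseData() instanceof HtmlParseData) {
            HtmlParseData htmlParseData = (HtmlParseData) page.getParseData();
            System.out.println(page.getWebURL().getURL() + " : "
                    + htmlParseData.getText().length() + " characters of text");
        }
    }

    public static void main(String[] args) throws Exception {
        CrawlConfig config = new CrawlConfig();
        config.setCrawlStorageFolder("/tmp/crawl");   // placeholder folder

        PageFetcher pageFetcher = new PageFetcher(config);
        RobotstxtServer robotstxtServer = new RobotstxtServer(new RobotstxtConfig(), pageFetcher);
        CrawlController controller = new CrawlController(config, pageFetcher, robotstxtServer);

        controller.addSeed("http://www.example.com/");  // placeholder seed
        controller.start(MyCrawler.class, 1);           // 1 crawler thread
    }
}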
I tried using Jsoup and it worked, although the first few results include some undesired characters. Below is my code:
package crawl_google;

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;

public class googleResults {

    public static void main(String[] args) throws Exception {
        // Pass the search query and the number of results as parameters
        google_results("Natural Language Processing", 10);
    }

    public static void google_results(String keyword, int no_of_results) throws Exception {
        // Replace spaces with + as in the Google search URL
        keyword = keyword.replace(" ", "+");
        String url = "https://www.google.com/search?q=" + keyword + "&num=" + String.valueOf(no_of_results);

        // Connect to the URL and obtain the HTML response
        Document doc = Jsoup
                .connect(url)
                .userAgent("Mozilla")
                .timeout(5000)
                .get();

        // Parse the HTML after examining the DOM: each result sits in an li.g element
        Elements els = doc.select("li.g");
        for (Element el : els) {
            // Print title, site and abstract of each result
            System.out.println("Title : " + el.getElementsByTag("h3").text());
            System.out.println("Site : " + el.getElementsByTag("cite").text());
            System.out.println("Abstract : " + el.getElementsByTag("span").text() + "\n");
        }
    }
}
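One small robustness note: instead of replacing spaces by hand, the query can be URL-encoded with the standard java.net.URLEncoder, which also handles characters like &, ? and quotes. A minimal sketch (class name is just for illustration):

import java.net.URLEncoder;

public class QueryUrlExample {
    public static void main(String[] args) throws Exception {
        // Encode the whole query so spaces and special characters are escaped correctly
        String keyword = URLEncoder.encode("Natural Language Processing", "UTF-8");
        String url = "https://www.google.com/search?q=" + keyword + "&num=10";
        System.out.println(url);
    }
}

Also keep in mind that Google changes its result markup from time to time, so the li.g / h3 / cite selectors may need to be re-checked against the current DOM if the output starts to look wrong.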