First of all, make sure you have created your API key and CX. I assume you have already done this; if not, you can create them here:
- API key (create a new browser key)
- CX (create a custom search engine)
Once you have both, here is a simple console application that searches and prints all the titles and links:
using System;
using System.Collections;
using System.Collections.Generic;
using System.Net;
using System.Web.Script.Serialization; // requires a reference to System.Web.Extensions

static void Main(string[] args)
{
    WebClient webClient = new WebClient();
    string apiKey = "YOUR KEY HERE";
    string cx = "YOUR CX HERE";
    string query = "YOUR SEARCH HERE";

    // URL-encode the query so spaces and special characters survive the request.
    string result = webClient.DownloadString(String.Format(
        "https://www.googleapis.com/customsearch/v1?key={0}&cx={1}&q={2}&alt=json",
        apiKey, cx, Uri.EscapeDataString(query)));

    JavaScriptSerializer serializer = new JavaScriptSerializer();
    Dictionary<string, object> collection =
        serializer.Deserialize<Dictionary<string, object>>(result);

    foreach (Dictionary<string, object> item in (IEnumerable)collection["items"])
    {
        Console.WriteLine("Title: {0}", item["title"]);
        Console.WriteLine("Link: {0}", item["link"]);
        Console.WriteLine();
    }
}
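Note that a single request returns at most the first page of results (up to 10 items). The JSON API also accepts `start` and `num` query parameters for paging; the loop below is a rough sketch of fetching several pages, reusing the same placeholder key, CX, and query as above.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Net;
using System.Web.Script.Serialization;

class Pager
{
    static void Main()
    {
        WebClient webClient = new WebClient();
        JavaScriptSerializer serializer = new JavaScriptSerializer();
        string apiKey = "YOUR KEY HERE";
        string cx = "YOUR CX HERE";
        string query = "YOUR SEARCH HERE";

        // `start` is 1-based; each page holds up to 10 items.
        for (int start = 1; start <= 21; start += 10)
        {
            string url = String.Format(
                "https://www.googleapis.com/customsearch/v1?key={0}&cx={1}&q={2}&alt=json&start={3}",
                apiKey, cx, Uri.EscapeDataString(query), start);
            Dictionary<string, object> page =
                serializer.Deserialize<Dictionary<string, object>>(
                    webClient.DownloadString(url));

            if (!page.ContainsKey("items"))
                break; // no more results

            foreach (Dictionary<string, object> item in (IEnumerable)page["items"])
                Console.WriteLine("{0} -> {1}", item["title"], item["link"]);
        }
    }
}
```

Keep in mind the API has a daily free quota, so don't page further than you actually need.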
As you can see, I deserialize the JSON generically into a dictionary instead of using strongly-typed deserialization. This is purely for convenience: I didn't want to write a class that models the search result schema. With this approach, the payload is simply a nested set of key-value pairs. The part you care about most is the "items" collection returned by the search (the first page of results, I believe). I only access the "title" and "link" properties here, but there are many more, as you can see in the documentation or by inspecting the response in a debugger.
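If you do prefer strongly-typed results, you only need classes for the fields you actually read; JavaScriptSerializer silently ignores the rest of the schema. A minimal sketch (the class names are my own invention, chosen so the property names match the JSON fields; this is not an official client):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Script.Serialization;

// Only the fields we read; extra JSON properties are ignored on deserialization.
class SearchResult
{
    public List<SearchItem> items { get; set; }
}

class SearchItem
{
    public string title { get; set; }
    public string link { get; set; }
}

class TypedDemo
{
    static void Main()
    {
        // Hard-coded sample payload standing in for the API response.
        string json = "{\"items\":[{\"title\":\"Example\",\"link\":\"http://example.com\"}]}";
        SearchResult result = new JavaScriptSerializer().Deserialize<SearchResult>(json);
        foreach (SearchItem item in result.items)
            Console.WriteLine("{0} -> {1}", item.title, item.link);
    }
}
```

The trade-off is the extra boilerplate, in exchange for compile-time checking and no casts through `object`.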