Problem with web browser hosting in an ASP.NET MVC application

In my ASP.NET MVC web application, I have to scrape data from another site after logging in to it. To reach the data, I need to log in to the site, navigate by clicking a menu tab, and then scrape the data from the resulting page. That page contains 2 frames that need to be scraped. I previously implemented this kind of scraping in one of my Windows projects using the Windows Forms WebBrowser control.

I followed a link showing how to integrate a web browser control into my web application. Everything worked perfectly in the Visual Studio environment, and I could scrape my data from the site through the web browser control. But when I hosted the application, it did not work properly: the web browser control does not load. I searched for a solution but could not find an acceptable one. I tried to implement the same thing through an iframe, and also using the Silverlight WebBrowser control with JavaScript to trigger the login events, but that fails with access denied because of cross-domain restrictions.

So, is there a way to scrape data that sits behind a login from an ASP.NET web application? Could I build a Windows application as an ActiveX control and use that? Would ActiveX have browser compatibility problems?

Any help would be greatly appreciated.

Thank you in advance

1 answer

As stated in the comments, the web browser approach seems difficult and will be subject to other environmental restrictions. Your best approach is to build a separate process for the data scraping - either on demand, or ahead of time with a spidering approach if you really need to (and the target data does not change).

Yes, different browsers will have problems with it if you try to make it an ActiveX control; security settings may simply not allow it. There are a lot of factors, and if you do not control the client environment, it is not a great option.

Assuming you go the on-demand route, I would strongly suggest creating a web service or class that you can reference. Then you can use an open source parser, for example:

  • CsQuery, if the document is poorly formed, or
  • Fizzler, if you can trust the integrity of the document (sketched briefly just below).
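
Purely for illustration, here is a minimal sketch of the Fizzler route, assuming the Fizzler.Systems.HtmlAgilityPack package; the selector and the input element are made up, and the CsQuery equivalent shows up in the last snippet below:

using HtmlAgilityPack;
using Fizzler.Systems.HtmlAgilityPack;

public static class FizzlerSample
{
    public static string ReadInputValue(string html)
    {
        var doc = new HtmlDocument();
        doc.LoadHtml(html);

        // QuerySelector is an extension method Fizzler adds to HtmlNode
        var input = doc.DocumentNode.QuerySelector("input[name='name']");
        return input == null ? null : input.GetAttributeValue("value", string.Empty);
    }
}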

Basically, you need to authenticate, save the authentication cookie, and then download the target document in a second request that carries your authentication cookie. Then feed that page to your parser (CsQuery or Fizzler).

An example login request might look like this:

private HttpWebRequest PerformLoginRequest(CookieContainer container)
{
    var request = (HttpWebRequest) WebRequest.Create(YOUR_POST_URL);
    request.Method = "POST";
    request.CookieContainer = container;

    _logger.DebugFormat("Attempting login for '{0}'", _username);

    var encoding = new ASCIIEncoding();
    // assumes the un/pw is stored in a field
    var credentials = string.Format("username={0}&password={1}", _username, _password);
    byte[] data = encoding.GetBytes(credentials);

    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = data.Length;

    using (var requestStream = request.GetRequestStream())
    {
        try
        {
            requestStream.Write(data, 0, data.Length);
        }
        catch (Exception e)
        {
            _logger.Error("Error in login attempt.", e);
        }
        finally
        {
            requestStream.Close();
        }
    }

    return request;
}

A cookie will be set in the response, which you will need to parse so that subsequent requests carry the right authentication bits. I had to do this once and adapted code I found somewhere here on SO, but I can't find the link now. It might look something like this (the relevant response header is Set-Cookie):

private static CookieContainer ProcessCookieContainer(HttpWebRequest request, CookieContainer container)
{
    var response = (HttpWebResponse) request.GetResponse();
    var cookieReader = new StreamReader(response.GetResponseStream());
    string htmldoc = cookieReader.ReadToEnd();
    var cookieHeader = response.GetResponseHeader("Set-Cookie");
    response.Close();

    container = new CookieContainer();
    foreach (var cookie in cookieHeader.Split(','))
    {
        // these are ;-separated name/value pairs
        var split = cookie.Split(';');
        string name = split[0].Split('=')[0];
        string value = split[0].Split('=')[1];

        // create the cookie with the domain
        var c = new Cookie(name, value) { Domain = "YourCookieDomain.com" };
        container.Add(c);
    }
    return container;
}
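
For reference, here is a tiny standalone sketch of what that splitting does to a sample header value; the cookie names and values are made up purely for illustration (and note that a cookie whose expires attribute contains a comma would need smarter handling):

using System;

public static class SetCookieSplitDemo
{
    public static void Main()
    {
        // made-up header value, just to show how the loop above carves it up
        const string cookieHeader = ".ASPXAUTH=0123ABCD; path=/; HttpOnly,SessionId=xyz789; path=/";

        foreach (var cookie in cookieHeader.Split(','))
        {
            var split = cookie.Split(';');           // drops the path/HttpOnly attributes
            string name = split[0].Split('=')[0];    // ".ASPXAUTH", then "SessionId"
            string value = split[0].Split('=')[1];   // "0123ABCD", then "xyz789"
            Console.WriteLine("{0} = {1}", name, value);
        }
    }
}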

And to download a document for analysis, you can do something like:

public string GetValueFromSomePage(int first, string second)
{
    var container = new CookieContainer();

    // do login
    var loginRequest = PerformLoginRequest(container);

    // chew on cookies
    container = ProcessCookieContainer(loginRequest, container);

    var result = string.Empty;
    var requestUrl = string.Format("http://YourUrlWithParams.com/?first={0}&second={1}", first, second);

    var request = (HttpWebRequest) WebRequest.Create(requestUrl);
    request.CookieContainer = container;

    using (var serverResponse = (HttpWebResponse) request.GetResponse())
    {
        try
        {
            var reader = new StreamReader(serverResponse.GetResponseStream());
            var responseDoc = new CQ(reader);

            // do something with CSS selectors...
            result = responseDoc["input[name=name]"].FirstElement().Value;
        }
        catch (Exception e)
        {
            _logger.Error("Error fetching data.", e);
        }
        finally
        {
            serverResponse.Close();
        }
    }
    return result;
}
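
To tie this back to the MVC side of things, you would then just call that method from a controller action; a minimal sketch, assuming the methods above live in a hypothetical SiteScraper class (the controller, action and view names are placeholders too):

using System.Web.Mvc;

public class ScrapedDataController : Controller
{
    // hypothetical class holding PerformLoginRequest / ProcessCookieContainer / GetValueFromSomePage
    private readonly SiteScraper _scraper = new SiteScraper();

    public ActionResult Show(int first, string second)
    {
        // the login + fetch + parse all happens server-side here,
        // so there is no browser control and no cross-domain issue
        string value = _scraper.GetValueFromSomePage(first, second);
        return View(model: value);
    }
}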

Hope this helps. There are several moving parts here, but given the nature of your task you have probably expected that.

Cheers.
