Multithreading web requests in ASP.NET

I am creating a page in my ASP.NET solution that retrieves most of its data from a third-party API service.

The page will have to make about five separate API calls to populate all of its controls, since each web API request returns multiple datasets.

I would like to issue every web request on its own thread, concurrently, so that loading time is reduced.

Each request I make looks like this:

```csharp
WebRequest request = WebRequest.Create(requestUrl);
string response = new StreamReader(request.GetResponse().GetResponseStream()).ReadToEnd();
return new JsonSerializer().Deserialize<OccupationSearch>(
    new JsonTextReader(new StringReader(response)));
```

First: should I do this? Is it safe? Will multithreading actually improve performance here?

Second, what is the best way to do it? There are many approaches to multithreading; which is best suited for this task?

3 answers

Multithreading is the right direction, but I would call what you need asynchrony.

In any case, you should be aware of how threading works in IIS.

An IIS worker thread completes the request only after all of its work is done, and that is a big problem: you do not want to hold a worker thread for a long time, because it comes from a limited thread pool and should be returned quickly so it can serve other requests.

This is why ASP.NET offers its own approach to asynchrony. If you use it correctly, IIS can handle more requests at the same time, because the asynchronous work happens off the request threads while they are returned to the pool.
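As a minimal sketch of what that "right approach" can look like in a .NET 4.5 Web Forms page: the page class name, control, and URL below are placeholders I made up for illustration, and the page's `@Page` directive must have `Async="true"` set.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.UI;

// Hypothetical code-behind for a page declared with <%@ Page Async="true" ... %>
public partial class Occupations : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Hand the awaitable work to ASP.NET: the request thread is released
        // while the HTTP call is in flight and resumed when it completes.
        RegisterAsyncTask(new PageAsyncTask(LoadDataAsync));
    }

    private async Task LoadDataAsync()
    {
        using (var client = new HttpClient())
        {
            // The URL is a placeholder; in practice you would call your third-party API
            string json = await client.GetStringAsync("https://api.example.com/occupations");
            // ... deserialize json here and bind the result to the page's controls
        }
    }
}
```

With `RegisterAsyncTask`, IIS does not block a pool thread during the download, which is exactly the resource saving described above.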

I would advise you to read up on async support in ASP.NET.

Conclusion: use asynchronous operations; they let the server use its resources more efficiently, but first learn how to do it correctly!


Multithreading is not recommended on ASP.NET, because ASP.NET/IIS does its own thread management, and any threads you spin up yourself will interfere with its heuristics.

What you really want is concurrency, or rather asynchronous concurrency, since your operations are I/O-bound.

A better approach is to use HttpClient with the Task-based Asynchronous Pattern:

```csharp
public async Task<OccupationSearch> GetOccupationAsync(string requestUrl)
{
    // You can also reuse a single HttpClient instance instead of creating a new one each time
    using (var client = new HttpClient())
    {
        string response = await client.GetStringAsync(requestUrl);
        return new JsonSerializer().Deserialize<OccupationSearch>(
            new JsonTextReader(new StringReader(response)));
    }
}
```

This is asynchronous, and you can easily run the calls concurrently with Task.WhenAll:

```csharp
List<string> urls = ...;
OccupationSearch[] results = await Task.WhenAll(urls.Select(GetOccupationAsync));
```

I would say that multithreading (within reason) will provide a benefit, because as your code is written the GetResponse() and GetResponseStream() calls block.

One of the easiest ways to improve performance here is Parallel.ForEach:

```csharp
var urls = new List<string>();
var results = new ConcurrentBag<OccupationSearch>();
Parallel.ForEach(urls, url =>
{
    WebRequest request = WebRequest.Create(url);
    string response = new StreamReader(request.GetResponse().GetResponseStream()).ReadToEnd();
    var result = new JsonSerializer().Deserialize<OccupationSearch>(
        new JsonTextReader(new StringReader(response)));
    results.Add(result);
});
```

If you are using .NET 4.5, the newer approach is async/await. MSDN has fairly extensive articles on the topic: http://msdn.microsoft.com/en-us/library/hh300224.aspx and http://msdn.microsoft.com/en-us/library/hh696703.aspx

Scott Hanselman also has a nice blog post on this topic: http://www.hanselman.com/blog/TheMagicOfUsingAsynchronousMethodsInASPNET45PlusAnImportantGotcha.aspx


Source: https://habr.com/ru/post/969967/