Caching search results in session versus the large object heap

OK, so I've been working on an ASP.NET project for a while, and it seems I made some bad design decisions that are coming back to haunt me as the project keeps growing in terms of data content.

After studying .NET memory management, I think I have identified a number of potential causes. Since what I'm doing isn't particularly special, I wonder whether there is a standard pattern for achieving what I want that I'm simply missing.

So, I have a somewhat expensive query that returns anywhere from 1 to 20,000 results. On subsequent requests the user simply pages through the result set, so I store the results in the session, with session state in InProc mode. I am wondering:

  • Does it make sense to a) store the result b) per user c) in an in-process session? I want speed (a). I don't know of a more efficient way than storing it per user (b), and if I switch to a more sophisticated state server, won't it actually get slower (c)? Or would it be better to get rid of these large objects sooner, instead of keeping the last result set in RAM until the session expires?

  • If any result set of more than ~20,000 rows ends up on the dreaded Large Object Heap (LOH), is there a general way around this?

I know this question is a little open-ended. I've just realized that my overall design might be wrong (with regard to scalability), and I'm trying to gauge how badly. I hope some tips on standard patterns can be collected that will still turn this into a useful question.

+6
3 answers

Why return all the records every time? I think the best way to speed up your query is to return only the data the user needs: just the rows that fit on the current page.

Try googling ROW_NUMBER() (SQL Server) or LIMIT (MySQL).
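The paging idea is language-neutral, so here is a minimal runnable sketch using Python with SQLite (which, like MySQL, supports LIMIT/OFFSET; on SQL Server you would use ROW_NUMBER() or OFFSET/FETCH instead). The table name and sample data are invented for illustration:

```python
import sqlite3

# In-memory SQLite database standing in for the real product table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO products (name) VALUES (?)",
    [(f"product-{i}",) for i in range(1, 101)],
)

def fetch_page(conn, page, page_size=10):
    """Return one page of results; only page_size rows ever leave the DB."""
    offset = (page - 1) * page_size
    cur = conn.execute(
        "SELECT id, name FROM products ORDER BY id LIMIT ? OFFSET ?",
        (page_size, offset),
    )
    return cur.fetchall()

page3 = fetch_page(conn, page=3)
```

Note the ORDER BY: without a stable sort order, LIMIT/OFFSET paging can return overlapping or skipped rows between requests.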

Here are two paging tutorials:

1) ScottGu Blog

2) 15 Second tutorial

+1

I don't know what your query looks like, but why pull more rows from the database than you need to show the user at one time? With good indexes, pulling subsequent pages should be fairly quick, and you only have to do it if those pages are actually requested.

An alternative is to store just the identifiers of the 20,000 items in the result set. Then, when the user pages through, you can quickly look up the individual rows by primary key.
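A rough sketch of this id-list approach, again in Python with SQLite for the sake of a runnable example (table and data are made up; the cached id list is small even for 20,000 rows, unlike the full result set):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO products (name) VALUES (?)",
    [(f"product-{i}",) for i in range(1, 101)],
)

# Run the expensive query ONCE, but keep only the matching primary keys.
cached_ids = [row[0] for row in conn.execute(
    "SELECT id FROM products WHERE name LIKE 'product-%' ORDER BY id"
)]

def fetch_page_by_ids(conn, ids, page, page_size=10):
    """Slice the cached id list, then fetch just those rows by primary key."""
    page_ids = ids[(page - 1) * page_size : page * page_size]
    placeholders = ",".join("?" * len(page_ids))
    cur = conn.execute(
        f"SELECT id, name FROM products WHERE id IN ({placeholders}) ORDER BY id",
        page_ids,
    )
    return cur.fetchall()

page2 = fetch_page_by_ids(conn, cached_ids, page=2)
```

The expensive query runs once per search; each page afterwards is a cheap primary-key lookup.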

Finally, consider using the Cache object to store the results instead of the session. That way you let .NET decide when to evict the objects, and they don't bloat the session.
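To illustrate the difference in lifetime policy, here is a toy expiring cache in Python. It only models absolute expiration; the real ASP.NET Cache also supports sliding expiration, dependencies, and eviction under memory pressure, which is exactly why the answer recommends it over pinning results to a session. The key name and TTL below are invented:

```python
import time

class ExpiringCache:
    """Toy cache with absolute expiration, standing in for ASP.NET's Cache.
    Unlike session state, entries can disappear before the user 'session'
    ends, so callers must be prepared to re-run the query on a miss."""

    def __init__(self):
        self._items = {}  # key -> (value, expires_at)

    def insert(self, key, value, ttl_seconds):
        self._items[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._items.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._items[key]  # evict lazily on access
            return None
        return value

cache = ExpiringCache()
# Hypothetical key combining user and query identity.
cache.insert("results:user42:query7", list(range(20000)), ttl_seconds=0.05)
```

A caller would check the cache first and fall back to re-running the query on a miss, rather than assuming the data is still there.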

+1

You should try to avoid storing results in the session at all. Your application will probably misbehave if a user opens several browser tabs within one session (it happens).

If you do use session state, definitely avoid InProc mode: as the number of users grows, the worker process eats up memory and eventually recycles, and users' sessions are lost even though their timeouts haven't expired.

Try the database paging that Keltex mentions, pulling only the data you actually display.

+1

Source: https://habr.com/ru/post/888545/

