Django Pagination is too slow with a large dataset

I have a problem with Django pagination. When I paginate more than 200,000 entries, the page load becomes very slow (> 10 seconds), and I expect the dataset to grow to about 2 million entries.

I cannot find a good, concrete fix for this problem on Stack Overflow or anywhere else. Every time the code serves a page, it evaluates a QuerySet over a very large dataset, which makes it slow.

Does anyone know what can be done? I have searched everywhere and could not solve this problem. Below is my pagination code. paper_list is Model.objects.all().filter(category=x), and there are currently about 200,000 objects, all belonging to this category.

    def paginate_context(paper_list, request, context, source, author, title, term, date):
        num_papers = len(paper_list)
        paginator = Paginator(paper_list, 4)  # Show X papers per page
        page = request.GET.get('page')
        try:
            papers = paginator.page(page)
        except PageNotAnInteger:
            # If page is not an integer, deliver first page.
            papers = paginator.page(1)
        except EmptyPage:
            # If page is out of range (e.g. 9999), deliver last page of results.
            papers = paginator.page(paginator.num_pages)
        context['papers'] = papers
        context['num_papers'] = num_papers
        context['query_cat'] = create_request_str_and(context['cat'], source, author, title, term, date)
1 answer

As I see in the code, num_papers = len(paper_list) evaluates the queryset, pulling all of the rows into memory just to count them, which is likely the bottleneck. You can change it to:

 num_papers = paper_list.count() 

Here you can check when a queryset is evaluated: https://docs.djangoproject.com/en/1.9/ref/models/querysets/#when-querysets-are-evaluated
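To see why this matters, here is a toy sketch (not Django itself, and the FakeDatabase/LazyQuerySet classes are purely illustrative) that mimics the difference: len() forces the lazy queryset to materialize every row, while count() asks the database for a single aggregate number and transfers no row data.

```python
class FakeDatabase:
    """Stand-in for the database; tracks how many rows cross the 'wire'."""
    def __init__(self, n_rows):
        self.n_rows = n_rows
        self.rows_fetched = 0

    def fetch_all(self):
        # Equivalent of SELECT * ... : every row is transferred.
        self.rows_fetched += self.n_rows
        return [{"id": i} for i in range(self.n_rows)]

    def count(self):
        # Equivalent of SELECT COUNT(*) ... : only a number comes back.
        return self.n_rows


class LazyQuerySet:
    """Mimics Django's lazy queryset evaluation."""
    def __init__(self, db):
        self.db = db

    def __len__(self):
        # Evaluates the whole queryset, like len(paper_list) in the question.
        return len(self.db.fetch_all())

    def count(self):
        # Delegates the counting to the database, like paper_list.count().
        return self.db.count()


db = FakeDatabase(200_000)
qs = LazyQuerySet(db)

print(qs.count(), db.rows_fetched)  # counting moved no row data
print(len(qs), db.rows_fetched)     # len() pulled every row into memory
```

With 200,000 (and eventually 2 million) papers per category, that per-request difference is exactly the kind of thing that turns a page load into a 10-second wait.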


Source: https://habr.com/ru/post/1240421/
