HTTP 504 when processing an uploaded ZIP of images

I am new to web development and I am working on a basic image gallery application (a training exercise) using Django. I set it up so that I can upload a zip full of images to create a new album in one go. Everything seems to work fine, but I get an HTTP 504 error when the uploaded file is especially large.

My guess (please correct me if I am wrong) is that this error means my application is too slow to return an HTTP response. I assume that is because it takes a long time to unpack and process all the images (creating a Pic object in the database and generating thumbnails for each one).

Is there a way to return a response (say, some intermediate page) while the processing continues in the background, perhaps using threads? What is the right way to handle this? Is it time to start learning JavaScript / AJAX?

Thanks!


Model:

    from django.db import models
    from blog.models import Post


    class Album(models.Model):
        title = models.CharField(max_length=128)
        slug = models.SlugField()
        description = models.TextField()
        parent = models.ForeignKey('self', null=True, blank=True)
        pub = models.BooleanField()
        date_created = models.DateTimeField(auto_now_add=True)
        date_published = models.DateTimeField(null=True, blank=True)
        date_modified = models.DateTimeField(auto_now=True)

        def __unicode__(self):
            return self.title


    class Pic(models.Model):
        image = models.ImageField(upload_to='pics/%Y/%m')
        title = models.CharField(max_length=128)
        caption = models.TextField(blank=True, null=True)
        albums = models.ManyToManyField('Album', null=True, blank=True)
        posts = models.ManyToManyField(Post, blank=True, null=True)
        date_taken = models.DateTimeField(null=True, blank=True)
        date_uploaded = models.DateTimeField(auto_now_add=True)
        date_modified = models.DateTimeField(auto_now=True)

        def __unicode__(self):
            return self.title

View:

I do all of this by hand because I was not using the Django admin when I started; it would probably be better to handle it through the admin.

    from django.http import HttpResponseRedirect
    from django.shortcuts import render_to_response
    from django.template import RequestContext
    from django.template.defaultfilters import slugify

    from gallery.forms import AlbumForm  # import path assumed; adjust to your layout


    def new_album(request):
        if request.method == "POST":
            form = AlbumForm(request.POST, request.FILES)
            if form.is_valid():
                from gallery.pic_handlers import handle_uploaded_album
                pics = handle_uploaded_album(request.FILES['pic_archive'])
                a = form.save()
                a.slug = slugify(a.title)
                a.save()
                for pic in pics:
                    pic.albums.add(a)
                return HttpResponseRedirect('/gallery/album/%s/' % a.slug)
        else:
            form = AlbumForm()
        return render_to_response('new_album.html', {
            'form': form,
        }, context_instance=RequestContext(request))

Additional processing:

    import datetime
    import zipfile
    from os import remove
    from os.path import join, splitext

    from django.conf import settings
    from django.core.files import File
    from PIL import Image

    from gallery.models import Pic  # import path assumed

    # IMG_EXT (the allowed image extensions) is defined elsewhere in the module.


    def handle_uploaded_album(pic_archive):
        # save the uploaded archive to disk
        destination = open(join(settings.MEDIA_ROOT, pic_archive.name), 'wb+')
        for chunk in pic_archive.chunks():
            destination.write(chunk)
        destination.close()

        today = datetime.date.today()
        save_path = 'pics/{0}/{1:02}/'.format(today.year, today.month)
        tmp_path = 'tmp/'

        z = zipfile.ZipFile(join(settings.MEDIA_ROOT, pic_archive.name), 'r')
        pics = []
        for member in z.namelist():
            if '/' in member or '\\' in member:
                # don't deal with any directories inside the zip
                # this also solves the '__MACOSX' issue
                continue
            if splitext(member)[1] in IMG_EXT:
                z.extract(member, join(settings.MEDIA_ROOT, tmp_path))
                im = File(open(join(settings.MEDIA_ROOT, tmp_path, member), 'rb'))
                # create a Pic from this file
                pic = Pic()
                pic.title = member
                pic.image.save(join(save_path, member), im, True)
                create_thumbnails(pic)
                im.close()
                # remove extracted images
                remove(join(settings.MEDIA_ROOT, tmp_path, member))
                # TODO: save date taken if available
                pics.append(pic)
        z.close()
        remove(join(settings.MEDIA_ROOT, pic_archive.name))
        return pics


    def create_thumbnails(pic):
        fname, ext = splitext(pic.image.path)
        img = Image.open(pic.image.path)
        img.thumbnail((512, 512), Image.ANTIALIAS)
        img.save(fname + '_m' + ext)
        img.thumbnail((128, 128), Image.ANTIALIAS)
        img.save(fname + '_s' + ext)
2 answers

Long-running tasks like this processing simply take too long; your client and/or your proxy times out, and that timeout is the 504 error you see.

You should not run long tasks this way!

As you rightly guess at the end, you need a way to offload long-running work, for example to an asynchronous task queue such as Celery. That way you can return a response to your client immediately while the backend performs the task asynchronously.

You should look at Celery itself and, for the Django integration, django-celery.

Since django-celery is probably the best option here, the next step is to read up on it; there are plenty of SO questions about it too. Roughly, the split looks like the sketch below.
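
Here is a minimal sketch, assuming Celery is installed and configured for your project; the module name, task name, and import paths below are illustrative, not taken from your code:

    # gallery/tasks.py -- illustrative only; assumes Celery is configured
    # and that the models live in gallery.models.
    from celery import shared_task

    from gallery.models import Album


    @shared_task
    def process_album_archive(archive_path, album_id):
        # Essentially the body of handle_uploaded_album, but reading the zip
        # from archive_path (already saved to disk by the view) and attaching
        # each created Pic to the Album identified by album_id.
        album = Album.objects.get(pk=album_id)
        ...  # unzip, create Pic objects, create_thumbnails(), pic.albums.add(album)

From the view you save the archive to disk, save the Album, call process_album_archive.delay(archive_path, a.pk), and return the redirect straight away. Note that you pass a file path and a primary key rather than the UploadedFile or the model instance, because task arguments have to be serializable.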

If you want to confirm that it is the processing that fails, and not the upload itself, just try switching off all the processing and returning a response to your client immediately. If it still fails, you will also need to configure your web server so that it does not time out on the upload!


OK, so you are getting a 504 error; to make sense of it you need to know a bit about HTTP status codes (see here for more information).

You are getting a 5xx error, which is generated on the server side; 504 in particular means:

    504 Gateway Timeout
    The server was acting as a gateway or proxy and did not receive a timely
    response from the upstream server.

Well, I have never programmed in Django, but from the details you gave, the cause of this error is almost certainly the large file. What is probably happening is that a timeout is set for the request, and because the file is big, uploading and processing it takes longer than that timeout, so the error is generated.

Try googling how to increase that timeout, or set a maximum file size so that uploads cannot exceed it. Hope this helps :-)
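
For example, if nginx happens to be the proxy in front of your Django app server (just an assumption, since you have not said what your stack is), the relevant settings look roughly like this:

    # Illustrative nginx values only -- assumes nginx is the gateway/proxy
    # in front of the app server; tune the numbers to your own setup.
    client_max_body_size 100m;    # let larger zip uploads through the proxy
    proxy_read_timeout   300s;    # wait longer for the upstream response

The app server usually has its own timeout as well (Gunicorn's --timeout flag, for instance). Keep in mind, though, that raising timeouts only buys you time; moving the processing into a background queue, as the other answer suggests, is the proper fix.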


Source: https://habr.com/ru/post/1389995/

