Is there a way to limit the size of uploaded files in Tornado?

I use Tornado to build a RESTful image-processing service that accepts images uploaded over plain HTTP, e.g. as multipart/form-data. I then access them in handlers via self.request.files .

An adversary might try to upload a huge file to break the service. Is there a way to tell Tornado a file size limit, so that a request exceeding it is dropped and an appropriate HTTP error status is returned?
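For context, a minimal handler of the kind the question describes might look like this. The handler name and the 10 MiB limit are illustrative, not from the question; note that for a non-streaming handler Tornado buffers the whole body before prepare() runs, so the Content-Length check below returns a clean 413 but does not by itself protect memory (the server-level limits in the answers do that):

```python
import tornado.ioloop
import tornado.web

MAX_UPLOAD_BYTES = 10 * 1024 * 1024  # illustrative 10 MiB limit

class UploadFileHandler(tornado.web.RequestHandler):
    def prepare(self):
        # Reject requests whose declared body size exceeds the limit.
        length = int(self.request.headers.get("Content-Length") or 0)
        if length > MAX_UPLOAD_BYTES:
            raise tornado.web.HTTPError(413)  # Request Entity Too Large

    def post(self):
        # Uploaded multipart/form-data parts land in self.request.files.
        for field, files in self.request.files.items():
            for f in files:
                self.write("%s: %s (%d bytes)\n"
                           % (field, f["filename"], len(f["body"])))

app = tornado.web.Application([(r"/upload/", UploadFileHandler)])
```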

4 answers

I tried this and it works. The default value of max_buffer_size is 100 MB.

    import tornado.httpserver
    import tornado.ioloop
    import tornado.web

    app = tornado.web.Application([
        (r'/upload/', UploadFileHandler),
    ])
    server = tornado.httpserver.HTTPServer(app, max_buffer_size=10485760000)  # 10 GB
    server.listen(8888)
    tornado.ioloop.IOLoop.instance().start()

You can also configure this on the front-end web server. For example, using nginx:

 client_max_body_size 50M; 

Edit: the stream used by HTTPServer has a max_buffer_size attribute, and HTTPServer will not accept requests larger than this. Its default value is 100 MB. As far as I can tell, HTTPServer simply closes the connection instead of sending an HTTP error response when this limit is reached.
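If silently closing the connection is a problem, newer Tornado releases (4.0 and later) also accept a separate max_body_size argument, which caps the total request body size independently of the in-memory buffer. A sketch under that assumption (the trivial handler is only there to make the example self-contained):

```python
import tornado.httpserver
import tornado.web

class UploadFileHandler(tornado.web.RequestHandler):
    def post(self):
        self.write("ok")

app = tornado.web.Application([(r"/upload/", UploadFileHandler)])

# max_buffer_size caps how much Tornado buffers in memory at once;
# max_body_size (Tornado 4.0+) caps the total size of a request body.
server = tornado.httpserver.HTTPServer(
    app,
    max_buffer_size=100 * 1024 * 1024,  # 100 MiB
    max_body_size=50 * 1024 * 1024,     # 50 MiB
)
```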


When creating the server, you can pass the max_buffer_size parameter. For example, to allow uploads of up to 160 MB:

    HTTPServer(app, max_buffer_size=167772160)  # 160 * 1024 * 1024 bytes

Source: https://habr.com/ru/post/1437998/

