How to improve SQLAlchemy performance?

I made a client application that communicates over HTTP with a Python 2 server through a simple API. The server uses SQLAlchemy ORM quite heavily to serve the data for these HTTP requests. The problem is that CPU usage is quite high even with only a few active clients. The server should be able to serve several hundred clients simultaneously at about 1 request per second each, so this should still be manageable (or so I hope).

How can I improve performance? I know the problem is in the ORM, since cProfile shows this pretty clearly. Apparently a single request executes about 10,000 Python calls, which seems rather excessive. I tried plugging in different database engines / servers and switched the interpreter to PyPy just for fun, but that obviously did not address the underlying problem and did not improve performance either.
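For reference, this is roughly how I profile a single request (handle_request here is just a placeholder for my real handler, not the actual code):

    import cProfile
    import pstats

    def handle_request():
        # placeholder for my real request handler
        pass

    profiler = cProfile.Profile()
    profiler.enable()
    handle_request()
    profiler.disable()

    # show the 20 most expensive calls by cumulative time
    pstats.Stats(profiler).sort_stats('cumulative').print_stats(20)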

What am I doing wrong here? I really hope this turns out to be a "duh!" kind of problem.

Should my relationships use a different loading strategy? Eager, lazy, dynamic, etc.? Right now I'm not specifying anything in particular.
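My models look roughly like this sketch (Parent/Child are made-up names, not my real schema), with the lazy parameter left at its default:

    from sqlalchemy import Column, Integer, ForeignKey, create_engine
    from sqlalchemy.orm import relationship
    from sqlalchemy.ext.declarative import declarative_base

    Base = declarative_base()

    class Parent(Base):
        __tablename__ = 'parent'
        id = Column(Integer, primary_key=True)
        # lazy='select' is the default; alternatives include 'joined'
        # (eager, one JOINed query), 'subquery', and 'dynamic'
        # (returns a Query object instead of a list).
        children = relationship('Child', lazy='select')

    class Child(Base):
        __tablename__ = 'child'
        id = Column(Integer, primary_key=True)
        parent_id = Column(Integer, ForeignKey('parent.id'))

    Base.metadata.create_all(create_engine('sqlite://'))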

Thanks for any help.

1 answer

How dynamic are the queries? Is it a single model that is always returned, or are there different models? How many rows are you returning? Can you limit the number of columns or the number of rows? With large amounts of data, and assuming you have already covered the easy fixes, even converting columns to the correct Python data types can add a decent amount of overhead. A sketch of what I mean is below.
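As a sketch (User and its columns are made-up names for illustration), fetching only the columns you need returns lightweight tuples instead of fully tracked ORM instances, and LIMIT caps how many rows have to be converted in Python:

    from sqlalchemy import create_engine, Column, Integer, String, Boolean
    from sqlalchemy.orm import sessionmaker
    from sqlalchemy.ext.declarative import declarative_base

    Base = declarative_base()

    class User(Base):  # hypothetical model for illustration
        __tablename__ = 'users'
        id = Column(Integer, primary_key=True)
        name = Column(String(50))
        active = Column(Boolean, default=True)

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()

    # Only two columns are selected, so no full ORM objects are built per row.
    rows = (session.query(User.id, User.name)
                   .filter(User.active == True)
                   .limit(100)
                   .all())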

I have only used SQLAlchemy for quick projects, but is the CPU actually busy, or is it mostly waiting for results? If that is your problem, you can dig into profiling the queries that are actually being run, make sure they are indexed correctly, and check that the ORM generates them optimally.
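As a starting point, here is a sketch of two ways to see what the ORM actually sends to the database: SQL echo, and a simple per-query timer using engine events (the sqlite:// URL is just a placeholder for your real connection string):

    import time
    from sqlalchemy import create_engine, event

    # echo=True logs every statement the ORM emits, so you can run EXPLAIN
    # on the slow ones and verify they hit the indexes you expect.
    engine = create_engine('sqlite://', echo=True)

    @event.listens_for(engine, 'before_cursor_execute')
    def before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
        conn.info.setdefault('query_start_time', []).append(time.time())

    @event.listens_for(engine, 'after_cursor_execute')
    def after_cursor_execute(conn, cursor, statement, parameters, context, executemany):
        total = time.time() - conn.info['query_start_time'].pop(-1)
        print('%.4fs  %s' % (total, statement))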


Source: https://habr.com/ru/post/912331/

