In my organization, PostgreSQL databases are created with a limit of 20 connections as a matter of policy. This tends to interact poorly when multiple applications that use connection pools talk to the same database, since many of them open their full complement of connections and hold them idle.
As soon as more than a couple of applications are in contact with the database, we run out of connections, as you'd expect.
Pooling behaviour is new here; until now we have managed pooled connections by serializing access to them through a database web gateway (?!) or by not pooling at all. As a consequence, I have to explain over and over (literally 5 trouble tickets from one person over the course of the project) how the pooling works.
What I want is one of the following:
A solid, defensible rationale for increasing the number of database connections available for the pools to work with. If that's the answer: what is a safe limit, and is there any real reason to cap it at 20?
Or: an explanation of why I'm wrong, and why we need to shrink the pools or eliminate them entirely.
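For context, the arithmetic behind the problem: with SQLAlchemy's default pool settings, a handful of applications hits the 20-connection cap in steady state alone. A rough sketch (the application count is an illustrative assumption, not our actual deployment; the pool defaults are SQLAlchemy's documented ones):

```python
# Illustrative numbers, not our actual deployment.
apps = 4                 # processes/applications attached to the DB
pool_size = 5            # SQLAlchemy's default QueuePool size
max_overflow = 10        # SQLAlchemy's default overflow allowance

steady_state = apps * pool_size               # idle connections held open
worst_case = apps * (pool_size + max_overflow)  # burst-load ceiling
print(steady_state, worst_case)               # prints: 20 60
```

So even if every application sits idle, four of them together saturate the limit, and any burst blows well past it.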
For what it's worth, here are the components in play. If how any of them is configured matters, please weigh in:
DB: PostgreSQL 8.2.
- Python 2.7, Pylons 1.0, SQLAlchemy 0.6.5, psycopg2
- SQLAlchemy ORM, a factory (still SQLAlchemy), and a PHP API
- Python 2.7, 2.1.4, SQLAlchemy 0.6.5, psycopg2
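To make the "held idle" behaviour concrete, here is a minimal sketch of the pooling mechanism SQLAlchemy uses under the hood (`QueuePool`). The sqlite3 connection factory is a stand-in so the sketch runs without a Postgres server; with psycopg2 the pattern is identical:

```python
import sqlite3
from sqlalchemy.pool import QueuePool

# Stand-in connection factory (sqlite3 so this runs anywhere; in our
# setup this would be a psycopg2 connect call).
def connect():
    return sqlite3.connect(":memory:")

# pool_size is how many connections the pool keeps open once they have
# been used; max_overflow allows short bursts beyond that.
pool = QueuePool(connect, pool_size=5, max_overflow=2)

conns = [pool.connect() for _ in range(5)]
for c in conns:
    c.close()              # returned to the pool, not actually closed

print(pool.checkedin())    # connections now held idle by the pool: 5
```

Closing a pooled connection hands it back to the pool rather than releasing it to the server, which is exactly why several applications each holding a full pool exhaust the 20-connection cap.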