Py-postgresql multithreaded problems

I found that under heavy load my Pyramid web application raises py-postgresql exceptions such as postgresql.exceptions.ProtocolError . Some searching showed that py-postgresql is not thread-safe, and a single connection cannot be used by multiple threads at the same time.

I tried to create a simple pooling mechanism, but I still get ProtocolErrors :(

What am I doing wrong?

First, I create a number of connection objects:

  for x in range(num_db_connections):
      self.pool.append(Connection(conn_string, x))

Each object in the pool contains a lock, db_lock = threading.Lock() , and a database connection, self.conn = postgresql.open(conn_string) .
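For reference, here is a minimal sketch of what such a wrapper class might look like, reconstructed from the description above. The connect_fn parameter and the body of work() are assumptions standing in for postgresql.open and the real query logic:

```python
import threading

class Connection:
    """One database connection paired with its own lock (a sketch of the
    wrapper described above; connect_fn stands in for postgresql.open)."""

    def __init__(self, connect_fn, conn_string, index):
        self.index = index
        self.db_lock = threading.Lock()      # guards self.conn
        self.conn = connect_fn(conn_string)  # e.g. postgresql.open(conn_string)

    def work(self):
        # Placeholder for the real query logic; must only run while
        # db_lock is held, since the connection itself is not thread-safe.
        return self.conn.query("SELECT 1")
```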

Then I try to acquire a connection's lock and do some work with it. This code can be executed by many threads at the same time, but I believe no two threads can run work() on the same connection simultaneously, because of the lock.

  time_start = time.time()
  while time.time() - time_start < self.max_db_lock_wait_time:
      for conn in self.pool:
          acquired = conn.db_lock.acquire(False)
          if acquired:
              try:
                  lst = conn.work()
              finally:
                  conn.db_lock.release()
              return lst
      time.sleep(0.05)
  raise Exception('Could not get connection lock in time')

Perhaps there are flaws in my code, or have I misunderstood py-postgresql's thread-unsafe nature? Please help!

1 answer

Are you sure you are not using cursor objects outside your lock?

Just a suggestion: instead of using time.sleep() and polling for locks, use a Queue to get/put connection objects from/to the pool. It is already thread-safe, and its get() method has a timeout parameter. This is much more efficient, especially if you have many threads and only a few connections. (Those tiny sleep calls add up when you need to serve 100,000 requests; all of them increase response time.)
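A minimal sketch of the Queue-based pool this suggests. The ConnectionPool class and the make_conn factory are hypothetical names, not part of py-postgresql; make_conn stands in for something like lambda: postgresql.open(conn_string):

```python
import queue

class ConnectionPool:
    """Hand out connections via a thread-safe queue instead of polling
    per-connection locks with time.sleep()."""

    def __init__(self, make_conn, size, timeout=5.0):
        self.timeout = timeout
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(make_conn())

    def run(self, fn):
        # Block up to `timeout` seconds waiting for a free connection;
        # Queue.get raises queue.Empty if none becomes available in time.
        conn = self._pool.get(timeout=self.timeout)
        try:
            return fn(conn)       # only this thread touches conn now
        finally:
            self._pool.put(conn)  # always return it to the pool
```

Usage would look like pool.run(lambda conn: conn.query(...)) ; the get/put pair replaces both the lock polling and the manual timeout loop in the question.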


Source: https://habr.com/ru/post/1436498/

