Managing memory when fetching big data from an Oracle database

I am fetching a large amount of data from an Oracle database with cx_Oracle, using the sample script below:

from cx_Oracle import connect

TABLEDATA = []

con = connect("user/password@host")
curs = con.cursor()
curs.execute("select * from TABLE where rownum < 100000")

# Convert every column of every row to a string and keep the whole result in memory.
for row in curs:
    TABLEDATA.append([str(col) for col in row])

curs.close()
con.close()

The problem with storing everything in this list is that it uses roughly 800-900 MB of RAM. I know I could write the data to a file instead of keeping it in a list, but I need the list to display the table with QTableView and QAbstractTableModel.

Is there an alternative or more memory-efficient way to store this data that still lets me display it in the table?

+4
2 answers

One possibility, since you appear to be selecting the whole table anyway, is to use QSqlTableModel. It queries the database itself and only fetches rows as they are needed for display, so the full result set never has to be held in a Python list.
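
For reference, here is a minimal sketch of that approach with PyQt5 and Qt's QOCI driver. The connection details, the table name, and the availability of the OCI plugin in your Qt build are all assumptions, so adjust them to your setup:

import sys
from PyQt5.QtWidgets import QApplication, QTableView
from PyQt5.QtSql import QSqlDatabase, QSqlTableModel

app = QApplication(sys.argv)

# Assumes your Qt build ships the OCI plugin; connection details are placeholders.
db = QSqlDatabase.addDatabase("QOCI")
db.setHostName("host")
db.setDatabaseName("service_name")
db.setUserName("user")
db.setPassword("password")
if not db.open():
    raise RuntimeError(db.lastError().text())

model = QSqlTableModel()
model.setTable("TABLE")
model.select()  # typically loads only an initial batch; the view fetches more rows as you scroll

view = QTableView()
view.setModel(model)
view.resize(800, 600)
view.show()

sys.exit(app.exec_())

Since the model talks to the database directly, the rows are never copied into a Python list at all, which keeps the memory footprint low.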

+1
