Memory usage with Django + SQLite3

I have a very large SQLite table with more than 500,000 rows and approximately 15 columns (mostly floating point). I want to transfer data from the SQLite DB into a Django application (which could be backed by many RDBMSs; Postgres in my case). Everything works, but as the iteration continues, memory usage for the Python process grows by 2-3 megabytes per second. I tried using 'del' to remove the EVEMapDenormalize and row objects at the end of each iteration, but the growth continues. Here is an excerpt, any ideas?

class Importer_mapDenormalize(SQLImporter):
    def run_importer(self, conn):
        c = conn.cursor()

        for row in c.execute('select * from mapDenormalize'):
            mapdenorm, created = EVEMapDenormalize.objects.get_or_create(id=row['itemID'])
            mapdenorm.x = row['x']
            mapdenorm.y = row['y']
            mapdenorm.z = row['z']

            if row['typeID']:
                mapdenorm.type = EVEInventoryType.objects.get(id=row['typeID'])

            if row['groupID']:
                mapdenorm.group = EVEInventoryGroup.objects.get(id=row['groupID'])

            if row['solarSystemID']:
                mapdenorm.solar_system = EVESolarSystem.objects.get(id=row['solarSystemID'])

            if row['constellationID']:
                mapdenorm.constellation = EVEConstellation.objects.get(id=row['constellationID'])

            if row['regionID']:
                mapdenorm.region = EVERegion.objects.get(id=row['regionID'])

            mapdenorm.save()
        c.close()

I'm not at all interested in wrapping this SQLite DB with Django ORM. I just really would like to figure out how to transfer data without sucking all my RAM.
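As an aside, the name-based access in the excerpt (row['itemID'], row['x'], ...) only works if the SQLite connection was opened with sqlite3.Row as its row factory; by default the stdlib sqlite3 module returns plain tuples. A minimal, self-contained illustration (the table and values here are made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.row_factory = sqlite3.Row  # enables row['column'] access instead of tuple indexing

conn.execute('create table mapDenormalize (itemID integer, x real, y real, z real)')
conn.execute('insert into mapDenormalize values (1, 10.0, 20.0, 30.0)')

c = conn.cursor()
row = c.execute('select * from mapDenormalize').fetchone()
print(row['itemID'], row['x'])  # 1 10.0
c.close()
```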


This is covered in the Django FAQ: when DEBUG = True, Django keeps a record of every SQL query it has executed in memory, which produces exactly this kind of steady growth. Either run with DEBUG = False, or clear the query log periodically inside the loop:

from django import db
db.reset_queries()

Also, select * from mapDenormalize pulls in the whole table at once. Let the script page through it with LIMIT instead, so only a chunk is held in memory at a time.
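A minimal sketch of that batching idea, using the stdlib sqlite3 module. The helper name, batch size, and demo table are my own illustrations, not part of the answer:

```python
import sqlite3

def iter_in_batches(conn, query, batch_size=1000):
    """Yield rows from `query` in fixed-size batches via LIMIT/OFFSET,
    so only one batch is held in Python memory at a time."""
    offset = 0
    while True:
        c = conn.cursor()
        c.execute(f'{query} LIMIT ? OFFSET ?', (batch_size, offset))
        rows = c.fetchall()
        c.close()
        if not rows:
            break
        yield from rows
        offset += batch_size

# Demo on an in-memory database with 2500 rows.
conn = sqlite3.connect(':memory:')
conn.execute('create table mapDenormalize (itemID integer, x real)')
conn.executemany('insert into mapDenormalize values (?, ?)',
                 [(i, i * 0.5) for i in range(2500)])
rows = list(iter_in_batches(conn, 'select * from mapDenormalize', batch_size=1000))
print(len(rows))  # 2500
```

Note that LIMIT/OFFSET paging without an ORDER BY relies on SQLite's default rowid ordering; for a one-off import that is usually fine, but adding ORDER BY itemID makes the batches deterministic.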


Source: https://habr.com/ru/post/1736318/
