Django - testing using large static data tables

I use "manage.py test" along with the created JSON device, which I created using "dumpdata"

My problem is that several tables in the fixture are very large (one of them, for example, contains the name of every city in the USA), which makes test runs incredibly slow to start.

Since some of these tables are never modified by the application (the city names, for example, never need to change), there is no point in creating and tearing down those tables on every test run.

Is there a better way to test this code using this kind of data?
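For reference, the setup being described looks roughly like this; the class, test, and fixture names below are placeholders, not taken from the actual project:

from django.test import TestCase

class CityTest(TestCase):
    # Django loads this dumpdata-generated JSON fixture into the test
    # database for every test run and rolls it back afterwards, which
    # is what makes large static tables so slow.
    fixtures = ['cities.json']

    def test_city_exists(self):
        # ... assertions against the loaded data ...
        pass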

2 answers

This was my solution:

from django.test import TestCase

class xxxx(TestCase):
    def setUp(self):
        # Load the static tables by running the raw SQL dump directly
        # against the test database.
        import _mysql
        db = _mysql.connect('xxxx', 'xxxx', 'xxxx', "test_xxxxxxx")
        db.query(open('sql/xxxxxx.sql').read())

The SQL file was a sequence of INSERT statements that I exported using phpMyAdmin. Running raw SQL statements is much faster than importing JSON or YAML fixtures. It is certainly not the most elegant solution, but it worked.

According to the third answer to "Loading SQL dump before running Django tests", you just need to drop the SQL file into the sql directory inside the application directory. That worked for my production database when running "manage.py syncdb", but for some reason the data was never actually imported into the test database when running "manage.py test", even though the line "Installing custom SQL for xxxx.xxxx" appeared in the output. So I wrote my own code inside setUp().
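For completeness, here is a sketch (not from the original answer, and untested) of the same idea using Django's own database connection, so the test-database credentials are not hard-coded; the SQL file path is reused from above as a placeholder:

from django.db import connection
from django.test import TestCase

class xxxx(TestCase):
    def setUp(self):
        # Reuse the connection Django has already opened to the test database.
        cursor = connection.cursor()
        with open('sql/xxxxxx.sql') as f:
            raw_sql = f.read()
        # Naive split on ';' -- good enough for a plain sequence of INSERTs,
        # but it will break if any inserted values contain semicolons.
        for statement in raw_sql.split(';'):
            if statement.strip():
                cursor.execute(statement)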


You should check out the nose framework. It looks like it gives you more control over when test fixtures are loaded and torn down:

"nose supports fixtures at the package, module, class, and test case level, so expensive initialization can be done as infrequently as possible. See Fixtures for more."

There also seem to be Django plugins for nose; a quick Google search turns them up.

Hope this helps.


Source: https://habr.com/ru/post/1309983/

