Fast relational database for easy use with Python

For my link-cleaning program (written in Python 3.3) I want to use a database to store about 100,000 websites:

  • URL only
  • time stamp
  • and for each website a list of several properties

I have no knowledge of databases, but found the following may fit my purpose:

  • PostgreSQL
  • SQLite
  • Firebird

I am interested in speed (accessing the database and getting the required information). For example: check whether property y exists for website x, and if so, read it. Of course, writing speed is also important.

My question is: are there big differences in speed between them, or does it not matter for my small program? Perhaps someone can tell which database meets my requirements (and which is easy to use from Python).

2 answers

The size and scale of your database are not very large, and they are well within the reach of almost any turnkey database solution.

Basically, what you will do is install the database server on your computer, and it will listen on a port. You can then install a Python library to access it.

For example, if you want to use PostgreSQL, you install it on your computer and it will listen on a port, by default 5432.
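
As a concrete illustration (not part of the original answer), here is a minimal sketch of connecting from Python with the psycopg2 driver, assuming the server runs locally and a database named "linkdb" owned by user "postgres" already exists; both names are placeholders:

    # Sketch only: assumes psycopg2 is installed (pip install psycopg2) and
    # that the database "linkdb" and user "postgres" exist (hypothetical).
    import psycopg2

    conn = psycopg2.connect(host="localhost", port=5432,
                            dbname="linkdb", user="postgres")
    cur = conn.cursor()
    cur.execute("SELECT version()")   # round-trip to verify the connection
    print(cur.fetchone()[0])
    conn.close()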

But if the information you are going to store and retrieve is simple, you may want to go with a NoSQL solution instead, because it is very simple to use.

For example, you can install MongoDB on your server and then install pymongo. The pymongo tutorial will teach you almost everything you need for your application.
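
For instance, here is a minimal sketch of how the asker's data might be stored with pymongo, assuming a MongoDB server is running on localhost; the database name, collection name, and fields are illustrative assumptions, not part of the original answer:

    # Sketch only: MongoDB assumed running on localhost:27017; the database
    # name "linkcleaner", collection "websites", and fields are hypothetical.
    from datetime import datetime, timezone
    from pymongo import MongoClient

    client = MongoClient("localhost", 27017)
    websites = client.linkcleaner.websites

    # Store one website as a document: URL, timestamp, arbitrary properties.
    websites.insert_one({
        "url": "http://example.com",
        "timestamp": datetime.now(timezone.utc),
        "properties": {"dead": False, "redirects": 2},
    })

    # Check whether property "dead" exists for website x, and read it if so.
    doc = websites.find_one({"url": "http://example.com",
                             "properties.dead": {"$exists": True}})
    if doc is not None:
        print(doc["properties"]["dead"])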


If speed is the main criterion, I would suggest going with an in-memory database. Take a look at http://docs.python.org/2/library/sqlite3.html

It can also be used as a regular on-disk database. For in-memory mode, use the following; the database is then created in RAM itself and is therefore much faster to access at run time:

    import sqlite3
    conn = sqlite3.connect(':memory:')  # the database lives entirely in RAM
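
To connect this back to the question, here is a hedged sketch of how the asker's schema might look in sqlite3: one table of websites plus a key/value table for per-website properties. The table and column names are illustrative assumptions, not from the original answer:

    # Sketch only: table and column names are hypothetical. One row per
    # website, plus a key/value table for the per-website properties.
    import sqlite3

    conn = sqlite3.connect(':memory:')   # pass a filename for a disk database
    conn.executescript("""
        CREATE TABLE websites (
            id        INTEGER PRIMARY KEY,
            url       TEXT UNIQUE NOT NULL,
            timestamp TEXT NOT NULL
        );
        CREATE TABLE properties (
            website_id INTEGER REFERENCES websites(id),
            name       TEXT NOT NULL,
            value      TEXT,
            UNIQUE (website_id, name)
        );
    """)

    conn.execute("INSERT INTO websites (url, timestamp) "
                 "VALUES (?, datetime('now'))", ("http://example.com",))
    conn.execute("INSERT INTO properties (website_id, name, value) "
                 "SELECT id, 'dead', 'false' FROM websites WHERE url = ?",
                 ("http://example.com",))

    # Check whether property y ('dead') exists for website x, read it if so.
    row = conn.execute(
        "SELECT p.value FROM properties p "
        "JOIN websites w ON w.id = p.website_id "
        "WHERE w.url = ? AND p.name = ?",
        ("http://example.com", "dead")).fetchone()
    if row is not None:
        print(row[0])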
