Efficient way to look up a value by key in PHP

With a list of about 100,000 key/value pairs (both strings, mostly 5-20 characters), I am looking for a way to efficiently find the value for a given key.

This needs to be done in PHP on a website. I am familiar with hash tables in Java (which is most likely what I would use if I were working in Java), but I am new to PHP.

I am looking for tips on how I should store this list (in a text file or in a database?) and how to search it.

The list will be updated periodically, but lookup time is what matters most to me.

+4
3 answers

You can do this with a plain PHP array, but SQLite will be your best bet for speed and convenience, if it is available.

PHP array

Just store everything in a PHP file like this:

 <?php
 return array(
     'key1' => 'value1',
     'key2' => 'value2',
     // snip
     'key100000' => 'value100000',
 );

Then you can access it as follows:

 <?php
 $s = microtime(true); // gets the start time, for benchmarking
 $data = require('data.php');
 echo $data['key2'];
 var_dump(microtime(true) - $s); // dumps the execution time

Not the most efficient thing in the world, but it will work. It takes about 0.1 seconds on my machine.
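One detail worth adding to the array approach: accessing a key that might be missing with a bare `$data['key']` raises a notice. A minimal sketch of a safe lookup (the keys and values here are made-up examples):

```php
<?php
$data = array('key1' => 'value1', 'key2' => 'value2');

// isset() is the conventional fast check for a key,
// but note that it returns false when the stored value is null.
$value = isset($data['key2']) ? $data['key2'] : 'default';
echo $value; // value2

// array_key_exists() distinguishes a missing key from a null value.
var_dump(array_key_exists('key3', $data)); // bool(false)
```

Either check is O(1) on a PHP array, since arrays are hash tables internally.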

Sqlite

PHP usually ships with SQLite support enabled, which is great for this kind of thing.
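"Usually" is doing some work there, so a quick sanity-check sketch: ask PDO which drivers this particular PHP build actually ships with before relying on SQLite.

```php
<?php
// List the PDO drivers compiled into this PHP build and check
// whether the SQLite driver is among them.
$drivers = PDO::getAvailableDrivers();
if (in_array('sqlite', $drivers, true)) {
    echo "pdo_sqlite is available\n";
} else {
    echo "pdo_sqlite is missing; fall back to the plain array or MySQL\n";
}
```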

This script will create a database for you from start to finish, with characteristics similar to the dataset you describe in the question:

 <?php
 // this will *create* data.sqlite if it does not exist. Make sure "/data"
 // is writable and *not* publicly accessible.
 // the ATTR_ERRMODE bit at the end is useful as it forces PDO to throw an
 // exception when you make a mistake, rather than internally storing an
 // error code and waiting for you to retrieve it.
 $pdo = new PDO('sqlite:'.dirname(__FILE__).'/data/data.sqlite', null, null,
     array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

 // create the table if you need to
 $pdo->exec("CREATE TABLE stuff(id TEXT PRIMARY KEY, value TEXT)");

 // insert the data
 $stmt = $pdo->prepare('INSERT INTO stuff(id, value) VALUES(:id, :value)');
 $id = null;
 $value = null;

 // this binds the variables by reference so you can re-use the prepared statement
 $stmt->bindParam(':id', $id);
 $stmt->bindParam(':value', $value);

 // insert some data (in this case it is just dummy data)
 for ($i = 0; $i < 100000; $i++) {
     $id = $i;
     $value = 'value'.$i;
     $stmt->execute();
 }

And then use the values:

 <?php
 $s = microtime(true);

 $pdo = new PDO('sqlite:'.dirname(__FILE__).'/data/data.sqlite', null, null,
     array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
 $stmt = $pdo->prepare("SELECT * FROM stuff WHERE id = :id");
 $stmt->bindValue(':id', 5);
 $stmt->execute();

 $value = $stmt->fetchColumn(1);
 var_dump($value);

 // the number of seconds it took to do the lookup
 var_dump(microtime(true) - $s);

This way is much faster: 0.0009 seconds on my machine.

MySQL

You can also use MySQL for this instead of SQLite, but for a single table with the characteristics you describe, it is probably overkill. The SQLite example above will work fine with MySQL if you have a MySQL server available. Just change the line that creates the PDO to this:

 $pdo = new PDO('mysql:host=your.host;dbname=your_db', 'user', 'password',
     array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

The queries in the SQLite example should work fine with MySQL too, but note that I have not tested this.
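One likely caveat, sketched here as an assumption rather than a tested fact: MySQL cannot index an unbounded TEXT column, so the CREATE TABLE statement from the SQLite example would probably need a bounded VARCHAR for the key column (191 characters stays within utf8mb4 index limits on older MySQL versions). SQLite accepts the same DDL, so the statement is demonstrated below against an in-memory SQLite database; against MySQL only the PDO DSN would differ.

```php
<?php
// The adjusted schema: a bounded VARCHAR key instead of TEXT,
// so that MySQL can build the primary-key index on it.
$pdo = new PDO('sqlite::memory:', null, null,
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
$pdo->exec("CREATE TABLE stuff(id VARCHAR(191) PRIMARY KEY, value TEXT)");
$pdo->exec("INSERT INTO stuff(id, value) VALUES('key5', 'value5')");

// the lookup query is unchanged from the SQLite example
$stmt = $pdo->prepare("SELECT value FROM stuff WHERE id = :id");
$stmt->bindValue(':id', 'key5');
$stmt->execute();
echo $stmt->fetchColumn(); // value5
```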

Going a little crazy: filesystem madness

Not that the SQLite solution was slow (0.0009 seconds!), but this approach is about four times faster on my machine. Besides, SQLite may not be available, a MySQL setup may be out of the question, etc.

In this case, you can also use the file system:

 <?php
 $s = microtime(true); // more quick-and-dirty benchmarking

 class FileCache
 {
     protected $basePath;

     public function __construct($basePath)
     {
         $this->basePath = $basePath;
     }

     public function add($key, $value)
     {
         $path = $this->getPath($key);
         file_put_contents($path, $value);
     }

     public function get($key)
     {
         $path = $this->getPath($key);
         return file_get_contents($path);
     }

     public function getPath($key)
     {
         $split = 3;
         $key = md5($key);
         if (!is_writable($this->basePath)) {
             throw new Exception("Base path '{$this->basePath}' was not writable");
         }
         $path = array();
         for ($i = 0; $i < $split; $i++) {
             $path[] = $key[$i];
         }
         $dir = $this->basePath.'/'.implode('/', $path);
         if (!file_exists($dir)) {
             mkdir($dir, 0777, true);
         }
         return $dir.'/'.substr($key, $split);
     }
 }

 $fc = new FileCache('/tmp/foo');

 /*
 // use this to generate a test dataset; it is slow to create, though.
 for ($i = 0; $i < 100000; $i++) {
     $fc->add('key'.$i, 'value'.$i);
 }
 //*/

 echo $fc->get('key1');
 var_dump(microtime(true) - $s);

A lookup takes 0.0002 seconds on my machine. It should also stay reasonably fast regardless of the cache size.

+13

Think about how often you will access your array and how many users may access it at the same time. Storing it in a database has many advantages, and you have two options here: MySQL and SQLite.

SQLite is more like a text file with SQL support. It can save you a few milliseconds per query because it lives inside your application; its main disadvantage is that it can only handle one write at a time (the same as a text file). I would recommend SQLite for arrays with mostly static content, such as GeoIP data, translations, etc.

MySQL is a more powerful solution, but it requires authentication and lives on a separate server process, possibly on a separate machine.

+1

A PHP array will do everything you need. But shouldn't that much data be stored in a database anyway?

http://php.net/array

0

Source: https://habr.com/ru/post/1333130/

