Fetching 1 million records from MySQL into a PHP array

My ultimate goal is to send 3 million entries to the Google Maps API and show them as markers, but I haven't even gotten that far yet.

I could not even load 1 million rows into a PHP array. The table has two columns and 1 million rows, and each value is an 18-digit number.

The query is just a straight SELECT *, but I run out of memory while looping over the result set and storing the relevant records in an array. I tried SplFixedArray, but had no luck with it either.

I need a good way to batch this up. After running some tests, I can load about 500 thousand rows into an array before hitting the memory limit (which is already raised to 512 M!), so I could split it into 2 or 3 queries. But I still need the entire data set held in arrays on the server side until the page loads, so I can hand it to the Maps API. So I assume splitting the queries won't actually fix anything, because all the data ends up in memory anyway?
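The suspicion above can be checked without a database at all: if every chunk is appended to one big array, three small queries cost the same peak memory as one big one. A minimal sketch (the row shape here is hypothetical, matching the two 18-digit columns described):

```php
<?php
// Sketch: splitting the fetch into chunks does not reduce peak memory
// if every chunk is kept in a PHP array until the page is rendered.
// Row shape is hypothetical: two 18-digit values per row, as strings.

function makeChunk(int $rows): array {
    $chunk = [];
    for ($i = 0; $i < $rows; $i++) {
        $chunk[] = ['123456789012345678', '876543210987654321'];
    }
    return $chunk;
}

$before = memory_get_usage();
$all = [];
// Three simulated "queries" of 50k rows each --
// the footprint is the same as one 150k-row query.
for ($q = 0; $q < 3; $q++) {
    foreach (makeChunk(50000) as $row) {
        $all[] = $row;
    }
}
$after = memory_get_usage();
printf("rows: %d, extra memory: ~%.1f MB\n",
       count($all), ($after - $before) / 1048576);
```

Running this with larger row counts makes it clear the memory cost scales with the total kept in the array, not with how many queries produced it.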

edit: a large chain of comments has grown below, but basically everyone agrees that this is a bad idea one way or another. So my solution is to cut it down to about 300 thousand points, which should be achievable with far less head-banging.


Don't send all of them to Google Maps.

Do the math: 3 million points at roughly 20 bytes each, serialized as JSON, is on the order of 60 MB that the browser has to download before a single marker renders. Google Maps itself will struggle too; it is not built to render millions of individual markers, and the page will become unresponsive long before that.

Reduce what you send instead: cluster the markers server-side, or only send the points inside the current viewport at the current zoom level, and refetch as the user pans and zooms.

Look at how large deployments handle this, e.g. the Xfinity wifi map. It never ships every point to the client; it serves a reduced set appropriate to the current view.
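The server-side clustering idea can be sketched in a few lines of PHP (this is an illustration of the general technique, not code from the answer; the cell size and point generation are made up): bucket points into lat/lng grid cells and send one centroid marker per cell instead of every raw point.

```php
<?php
// Grid clustering sketch: collapse many points into a few cluster
// markers, one per grid cell, each carrying a count for its label.

function clusterPoints(array $points, float $cellSize): array {
    $cells = [];
    foreach ($points as [$lat, $lng]) {
        // Points sharing a grid cell get the same key.
        $key = floor($lat / $cellSize) . ':' . floor($lng / $cellSize);
        if (!isset($cells[$key])) {
            $cells[$key] = ['lat' => 0.0, 'lng' => 0.0, 'count' => 0];
        }
        $cells[$key]['lat'] += $lat;
        $cells[$key]['lng'] += $lng;
        $cells[$key]['count']++;
    }
    // Turn per-cell sums into centroids.
    foreach ($cells as &$c) {
        $c['lat'] /= $c['count'];
        $c['lng'] /= $c['count'];
    }
    unset($c);
    return array_values($cells);
}

// 10,000 synthetic points collapse into a handful of cluster markers.
$points = [];
for ($i = 0; $i < 10000; $i++) {
    $points[] = [40.0 + ($i % 100) / 1000.0, -74.0 + ($i % 100) / 1000.0];
}
$clusters = clusterPoints($points, 0.05);
```

The browser then receives `count($clusters)` markers instead of 10,000; re-clustering per zoom level gives progressively finer detail as the user zooms in.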


As others have said, holding 1 M rows in a PHP array at once is not going to work. But part of the memory problem is how MySQL results are fetched. By default the result set is buffered (STORE_RESULT), meaning the entire result is copied into PHP memory before you ever loop over it; with an unbuffered query (USE_RESULT) rows are streamed from the server one at a time, so you can process each row and discard it. If you really have to scan all the rows, use USE_RESULT.

See:

http://php.net/manual/en/mysqlinfo.concepts.buffering.php
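A minimal sketch of the unbuffered mode described in that page, using mysqli (the host, credentials, and table name here are hypothetical, and the code is guarded so it simply does nothing when no server is reachable):

```php
<?php
// Unbuffered query sketch: MYSQLI_USE_RESULT streams rows from the
// server one at a time; the default (MYSQLI_STORE_RESULT) first copies
// the whole result set into PHP memory, which is what blows the limit.
$processed = 0;
if (extension_loaded('mysqli')) {
    mysqli_report(MYSQLI_REPORT_OFF);          // don't throw in this sketch
    $db = @new mysqli('localhost', 'user', 'pass', 'geo');
    if (!$db->connect_errno) {
        $result = $db->query('SELECT lat, lng FROM points',
                             MYSQLI_USE_RESULT);
        while ($row = $result->fetch_row()) {
            $processed++;   // handle one row, then let it go out of scope
        }
        // An unbuffered result must be fully read and closed before the
        // connection can issue another query.
        $result->close();
        $db->close();
    }
}
```

Note the trade-off: while an unbuffered result is open, the connection is tied up and the rows must be consumed promptly, so this suits streaming each row out (e.g. writing JSON incrementally) rather than accumulating them in an array.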



Source: https://habr.com/ru/post/1617754/

