In PHP, I'm having trouble using serialize / unserialize for a large array of objects (100,000+ objects). These objects can be of many different types, but they are all descendants of the same base class.
Somehow, when I unserialize the array of objects, about 0.001% of them come back wrong: an entirely different object is created instead. This does not happen randomly, but with the same objects every time. However, if I change the order of the array, it happens to different objects, so it looks like a bug to me.
I switched to json_encode / json_decode, but found that json_decode always creates stdClass objects. I worked around this by storing the class name as a property of each object and then using that property to instantiate the right class, but this solution is not very elegant.
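Roughly, the workaround looks like this (a simplified sketch, not my exact code; the `__class` property and function names are just placeholders, and it only handles public properties of classes with a no-argument constructor):

```php
<?php
// Simplified sketch of the class-name-as-property workaround.
// "__class" and the function names are placeholders; this only copies
// public properties and assumes a no-argument constructor.

function encodeObjects(array $objects): string
{
    $rows = [];
    foreach ($objects as $obj) {
        $row = get_object_vars($obj);        // public properties only
        $row['__class'] = get_class($obj);   // remember the concrete class
        $rows[] = $row;
    }
    return json_encode($rows);
}

function decodeObjects(string $json): array
{
    $objects = [];
    foreach (json_decode($json, true) as $row) {
        $class = $row['__class'];
        unset($row['__class']);
        $obj = new $class();
        foreach ($row as $name => $value) {
            $obj->$name = $value;
        }
        $objects[] = $obj;
    }
    return $objects;
}
```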
Using var_export together with eval works fine, but it is about 3 times slower than the other methods and uses a lot more memory.
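For comparison, the var_export route is roughly this (a sketch only; the file name is made up, and the classes need to implement __set_state() for the exported code to rebuild them):

```php
<?php
// Write: dump the array as valid PHP source code.
file_put_contents('objects.dump', var_export($objects, true));

// Read: evaluate the dumped code to rebuild the array.
// (Requires __set_state() on the object classes.)
$objects = eval('return ' . file_get_contents('objects.dump') . ';');
```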
Now my questions are:
- What can cause errors / wrong objects to be created by unserialize?
- Is there a better way to use json_decode with an array of objects, so that the classes are somehow stored in the JSON automatically?
- Is there perhaps another method for reading / writing a large array of objects in PHP?
UPDATE
I'm starting to believe there must be something strange about my data arrays, because with msgpack_serialize (a PHP extension, an alternative to serialize) I get the same kind of errors (but, oddly enough, it is not the same objects that come out wrong!).
UPDATE 2
Found a solution: instead of serializing the whole array at once, I now serialize every object individually, first serialize and then base64_encode, and store each serialized object as a separate line in a text file. That way I can rebuild the entire array by reading the file with file() and calling base64_decode and unserialize on each line: no more errors!
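In code, that per-object scheme looks roughly like this (a minimal sketch; the file name is just an example):

```php
<?php
// Write: one serialized, base64-encoded object per line.
$fh = fopen('objects.dat', 'w');
foreach ($objects as $obj) {
    fwrite($fh, base64_encode(serialize($obj)) . "\n");
}
fclose($fh);

// Read: rebuild the array one object at a time.
$objects = [];
foreach (file('objects.dat', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $objects[] = unserialize(base64_decode($line));
}
```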