Last week I studied and experimented with face recognition. The planned application should let a person look up information about someone in a SQL database simply by taking a picture of their face. My initial idea was to compress the face into a key or hash and use it as a database lookup. This does not have to be extremely accurate, since the person viewing the results can, and most likely will, do the final comparison between the stored image and the person standing in front of them.
OpenCV / JavaCV seems like an obvious starting point, and the face detection it provides works well. However, its Eigenfaces implementation for face recognition is not a good fit: retraining on hundreds of thousands of existing users every time a new face needs to be added to the training set is not going to work.
I am experimenting with SURF descriptors computed on faces extracted using OpenCV's Haar cascade functions, and this seems to get me closer to the intended result. However, I cannot think of a way to efficiently search and compare the roughly 30 descriptors per face (each a 64- or 128-dimensional vector) against the database. I have read about LSH and Spectral Hashing algorithms, but there are no implementations for Java, and my math is not strong enough to implement them myself.
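For what it's worth, the simplest LSH variant (random-hyperplane hashing for cosine similarity) needs very little math and can be written in plain Java with no libraries. The sketch below is a hypothetical minimal illustration, not a production implementation: it hashes a descriptor to a short binary signature by taking the sign of its dot product with a set of random hyperplanes, so similar descriptors tend to land in the same bucket. The class name `SurfLsh` and all parameters (16-bit signatures, 64 dimensions) are my own choices for the example.

```java
import java.util.Random;

// Hypothetical sketch of random-hyperplane LSH for SURF-like vectors.
// Similar vectors (small angle between them) tend to get the same
// integer signature, which can then be used as a database lookup key.
public class SurfLsh {
    private final float[][] hyperplanes; // one random hyperplane per signature bit

    public SurfLsh(int dims, int bits, long seed) {
        Random rng = new Random(seed); // fixed seed: same hyperplanes every run
        hyperplanes = new float[bits][dims];
        for (float[] h : hyperplanes)
            for (int d = 0; d < dims; d++)
                h[d] = (float) rng.nextGaussian();
    }

    // Signature bit b = sign of the dot product with hyperplane b.
    public int signature(float[] descriptor) {
        int sig = 0;
        for (int b = 0; b < hyperplanes.length; b++) {
            double dot = 0;
            for (int d = 0; d < descriptor.length; d++)
                dot += hyperplanes[b][d] * descriptor[d];
            if (dot >= 0) sig |= (1 << b);
        }
        return sig;
    }

    public static void main(String[] args) {
        SurfLsh lsh = new SurfLsh(64, 16, 42L); // 64-dim SURF, 16-bit key
        Random r = new Random(1);
        float[] a = new float[64];
        float[] scaled = new float[64];
        for (int d = 0; d < 64; d++) {
            a[d] = (float) r.nextGaussian();
            scaled[d] = 2f * a[d]; // same direction, different magnitude
        }
        // The signature depends only on direction, so scaling cannot change it.
        System.out.println(lsh.signature(a) == lsh.signature(scaled)); // prints "true"
    }
}
```

The signature would serve as the coarse SQL lookup key (one row per descriptor, indexed on the signature column); exact ranking of the handful of candidates that collide can then be done in memory with plain Euclidean distance.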
Does anyone have any thoughts or ideas on how this could be achieved, or whether it is possible at all?