ALAssetsLibrary Get Photos Online

The following code checks the EXIF data of every photo in the user's library and passes to the completion block an array of photos taken in close geographic proximity to a given location, along with the NSDate of the most recent photo taken at that location.

    - (void)getPhotosAtLocation:(CLLocation *)location withCompletionBlock:(void (^)(NSError *, NSDate *, NSMutableArray *))completionBlock {
        NSMutableArray *photos = [NSMutableArray array];
        ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
        __block NSDate *latestDate = [[NSDate alloc] initWithTimeIntervalSince1970:0];

        [assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
            [group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
                if (result) {
                    ALAssetRepresentation *representation = [result defaultRepresentation];
                    NSDictionary *imageMetadata = [representation metadata];
                    NSLog(@"%@", imageMetadata);

                    // Analyze location
                    if ([imageMetadata objectForKey:@"{GPS}"]) {
                        double latitude = [[[imageMetadata objectForKey:@"{GPS}"] objectForKey:@"Latitude"] doubleValue];
                        if ([[[imageMetadata objectForKey:@"{GPS}"] objectForKey:@"LatitudeRef"] isEqualToString:@"S"])
                            latitude *= -1;

                        double longitude = [[[imageMetadata objectForKey:@"{GPS}"] objectForKey:@"Longitude"] doubleValue];
                        if ([[[imageMetadata objectForKey:@"{GPS}"] objectForKey:@"LongitudeRef"] isEqualToString:@"W"])
                            longitude *= -1;

                        if (fabs(location.coordinate.latitude - latitude) <= 0.0005 &&
                            fabs(location.coordinate.longitude - longitude) <= 0.0005)
                            [photos addObject:[UIImage imageWithCGImage:[result thumbnail]]];

                        // Analyze last time at location
                        if ([imageMetadata objectForKey:@"{TIFF}"]) {
                            NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
                            [formatter setDateFormat:@"yyyy:MM:dd HH:mm:ss"];
                            NSDate *tempDate = [formatter dateFromString:[[imageMetadata objectForKey:@"{TIFF}"] objectForKey:@"DateTime"]];
                            if ([tempDate compare:latestDate] == NSOrderedDescending) {
                                latestDate = tempDate;
                            }
                        }
                    }
                }
            }];

            if ([latestDate isEqualToDate:[NSDate dateWithTimeIntervalSince1970:0]])
                completionBlock(nil, [NSDate date], photos);
            else
                completionBlock(nil, latestDate, photos);
        } failureBlock:^(NSError *error) {
            completionBlock(error, nil, nil);
        }];
    }
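For reference, a call site might look like this (the coordinates and surrounding view-controller context are just illustrative):

    CLLocation *targetLocation = [[CLLocation alloc] initWithLatitude:37.3317 longitude:-122.0307];
    [self getPhotosAtLocation:targetLocation withCompletionBlock:^(NSError *error, NSDate *latestDate, NSMutableArray *photos) {
        if (error) {
            NSLog(@"Library enumeration failed: %@", error);
            return;
        }
        NSLog(@"Found %lu photos, most recent taken %@", (unsigned long)photos.count, latestDate);
    }];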

The problem I am facing is that in order to find all the photos taken at a specific place, I have to iterate through every photo in the user's camera roll sequentially, which takes O(n) time. Could this be sped up if the photos were sorted or indexed by latitude and longitude before iteration? What algorithm would minimize the time needed to return the matching images?

2 answers

It sounds like you are looking for some kind of spatial index. See the Spatial Index section here for a list of options: http://en.wikipedia.org/wiki/Spatial_database
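As an illustration (not part of the original answer), one of the simplest spatial indexes is a grid of buckets keyed by rounded coordinates: a query then only scans the handful of cells around the target point instead of every photo. The class name, cell size, and bucket layout below are assumptions for the sketch.

    #import <Foundation/Foundation.h>

    // Hypothetical grid-based index: objects are bucketed by their coordinates
    // rounded down to a fixed cell size, so a query only scans nearby cells.
    @interface GeoGridIndex : NSObject
    - (void)addObject:(id)object latitude:(double)latitude longitude:(double)longitude;
    - (NSArray *)objectsNearLatitude:(double)latitude longitude:(double)longitude;
    @end

    @implementation GeoGridIndex {
        NSMutableDictionary *_buckets;   // cell key -> NSMutableArray of objects
        double _cellSize;                // in degrees; 0.001 is roughly 100 m
    }

    - (instancetype)init {
        if ((self = [super init])) {
            _buckets = [NSMutableDictionary dictionary];
            _cellSize = 0.001;
        }
        return self;
    }

    - (NSString *)keyForLatitude:(double)latitude longitude:(double)longitude {
        return [NSString stringWithFormat:@"%ld:%ld",
                lround(floor(latitude / _cellSize)),
                lround(floor(longitude / _cellSize))];
    }

    - (void)addObject:(id)object latitude:(double)latitude longitude:(double)longitude {
        NSString *key = [self keyForLatitude:latitude longitude:longitude];
        NSMutableArray *bucket = _buckets[key];
        if (!bucket) {
            bucket = [NSMutableArray array];
            _buckets[key] = bucket;
        }
        [bucket addObject:object];
    }

    - (NSArray *)objectsNearLatitude:(double)latitude longitude:(double)longitude {
        // Check the cell containing the point plus its 8 neighbours.
        NSMutableArray *results = [NSMutableArray array];
        for (int dLat = -1; dLat <= 1; dLat++) {
            for (int dLon = -1; dLon <= 1; dLon++) {
                NSString *key = [self keyForLatitude:latitude + dLat * _cellSize
                                           longitude:longitude + dLon * _cellSize];
                NSArray *bucket = _buckets[key];
                if (bucket) [results addObjectsFromArray:bucket];
            }
        }
        return results;
    }
    @end

Building the index still takes one O(n) pass over the camera roll, but every subsequent location query only touches a few buckets rather than the whole library.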


I think you need to iterate over all the photos once and save the properties of the photos you have already processed. That way, on the next pass you only have to process new photos.
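A minimal sketch of that idea, keyed on each asset's URL and plugged into the question's existing group enumeration (the cache file name and property keys are illustrative assumptions):

    // Hypothetical cache: asset URL -> dictionary of the properties we care about
    // (latitude, longitude). Persisted to disk between launches.
    NSString *cachePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"photoMetadataCache.plist"];
    NSMutableDictionary *cache = [NSMutableDictionary dictionaryWithContentsOfFile:cachePath];
    if (!cache) cache = [NSMutableDictionary dictionary];

    [group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
        if (!result) return;
        NSString *assetURL = [[[result defaultRepresentation] url] absoluteString];
        if (cache[assetURL]) {
            // Already processed on a previous pass; reuse the cached values.
            return;
        }
        NSDictionary *metadata = [[result defaultRepresentation] metadata];
        NSDictionary *gps = metadata[@"{GPS}"];
        if (gps[@"Latitude"] && gps[@"Longitude"]) {
            cache[assetURL] = @{ @"lat" : gps[@"Latitude"],
                                 @"lon" : gps[@"Longitude"] };
        }
    }];

    [cache writeToFile:cachePath atomically:YES];

Reading the cached coordinates is much cheaper than pulling the full metadata dictionary for every asset, so repeat queries only pay the metadata cost for photos added since the last run.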


Source: https://habr.com/ru/post/1495226/

