Core Data with a huge number of objects

I tried to add 200k messages to Core Data as a proof of concept for a Twitter application. This works well and my entities are added; I have a UISearchBar and a UITableView to display them. However, the time taken for every new 1000 objects seems to grow exponentially. Is this normal? I expected Core Data to handle huge datasets well. What is the best way to deal with a dataset this large? I wonder how dictionary apps manage it.

Here is my console output:

2012-03-26 22:19:28.126 TweetReader[3668:707] Done 1000
2012-03-26 22:19:40.335 TweetReader[3668:707] Done 2000
2012-03-26 22:19:55.136 TweetReader[3668:707] Done 3000
2012-03-26 22:20:18.569 TweetReader[3668:707] Done 4000
2012-03-26 22:20:50.166 TweetReader[3668:707] Done 5000
2012-03-26 22:21:30.284 TweetReader[3668:707] Done 6000
2012-03-26 22:22:19.096 TweetReader[3668:707] Done 7000
2012-03-26 22:23:16.091 TweetReader[3668:707] Done 8000
2012-03-26 22:24:21.321 TweetReader[3668:707] Done 9000
2012-03-26 22:25:35.017 TweetReader[3668:707] Done 10000
2012-03-26 22:26:57.250 TweetReader[3668:707] Done 11000
2012-03-26 22:28:27.563 TweetReader[3668:707] Done 12000
2012-03-26 22:30:06.202 TweetReader[3668:707] Done 13000
2012-03-26 22:31:52.645 TweetReader[3668:707] Done 14000

Here is my code that saves to Core Data:

    for (NSInteger i = 1; i <= 200000; i++) {
        NSAutoreleasePool *myPool = [[NSAutoreleasePool alloc] init];
        Tweet *tweetie = [NSEntityDescription insertNewObjectForEntityForName:@"Tweet"
                                               inManagedObjectContext:self.managedObjectContext];
        tweetie.name = [NSString stringWithFormat:@"%10d", i];
        tweetie.message = [NSString stringWithFormat:@"%10d", i];

        // Save the context after every 1000 objects.
        if (!(i % 1000)) {
            NSError *error;
            NSLog(@"Done %d", i);
            if (![managedObjectContext save:&error]) {
                NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
                abort();
            }
            [myPool release];
        }
    }
2 answers

What columns have you specified in your database? Have you tried locking the context during the adds, to see whether that lets it optimize them (i.e. perform the inserts transactionally)?
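
For reference, a minimal sketch of what "locking the context during the adds" could look like, assuming the Tweet entity and self.managedObjectContext from the question (NSManagedObjectContext adopts NSLocking, so -lock/-unlock are available; whether this actually speeds anything up is exactly what would need measuring):

    // Sketch only: hold the context's lock for the whole batch, save once at the end.
    [self.managedObjectContext lock];
    for (NSInteger i = 1; i <= 200000; i++) {
        Tweet *tweetie = [NSEntityDescription insertNewObjectForEntityForName:@"Tweet"
                                               inManagedObjectContext:self.managedObjectContext];
        tweetie.name = [NSString stringWithFormat:@"%10d", i];
        tweetie.message = [NSString stringWithFormat:@"%10d", i];
    }
    NSError *error = nil;
    if (![self.managedObjectContext save:&error]) {
        NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
    }
    [self.managedObjectContext unlock];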

Core Data is known to be bad at bulk inserts. The solution could simply be to use SQLite directly; see this article for a real-life account from someone who had to do exactly that: http://inessential.com/2010/02/26/on_switching_away_from_core_data
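
If you do try raw SQLite, the usual shape is one explicit transaction plus a reused prepared statement. This is only a sketch (not the article's code); the tweet table, its name/message columns, and the dbPath argument are made up for illustration:

    #import <sqlite3.h>

    // Sketch: bulk insert inside a single transaction with a reused prepared statement.
    - (void)bulkInsertTweetsIntoDatabaseAtPath:(NSString *)dbPath
    {
        sqlite3 *db = NULL;
        if (sqlite3_open([dbPath UTF8String], &db) != SQLITE_OK) return;

        sqlite3_exec(db, "BEGIN TRANSACTION", NULL, NULL, NULL);

        sqlite3_stmt *stmt = NULL;
        sqlite3_prepare_v2(db, "INSERT INTO tweet (name, message) VALUES (?, ?)", -1, &stmt, NULL);

        for (int i = 1; i <= 200000; i++) {
            const char *text = [[NSString stringWithFormat:@"%10d", i] UTF8String];
            sqlite3_bind_text(stmt, 1, text, -1, SQLITE_TRANSIENT);
            sqlite3_bind_text(stmt, 2, text, -1, SQLITE_TRANSIENT);
            sqlite3_step(stmt);
            sqlite3_reset(stmt);   // reuse the same statement for the next row
        }

        sqlite3_finalize(stmt);
        sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);
        sqlite3_close(db);
    }

Wrapping all the inserts in one transaction is what makes the difference: without it, SQLite commits (and syncs to disk) after every single INSERT.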


If you have 200k messages, first find the optimal batch size for saving. It takes trial and error; I found 3000 to be optimal. Call [context save:] after every 3000 records, for example:

    for (i = 0; i < numberOfObjects; i++) {
        // form the object to be saved in the context

        if (i % 3000 == 0) {
            // save this batch of records; 3000 was the optimum value I found
            NSError *error;
            NSLog(@"saved rec nu %d", i);
            if (![context save:&error]) {
                NSLog(@"Whoops, couldn't save: %@", [error localizedDescription]);
                return NO;
            }
            [context processPendingChanges]; // most important thing: without this, objects
                                             // keep piling up in the context and insertion
                                             // time increases drastically.
        }
    }

    // for the last set of objects, which is smaller than 3000
    NSError *error;
    NSLog(@"saved rec nu %d", i);
    if (![context save:&error]) {
        NSLog(@"Whoops, couldn't save: %@", [error localizedDescription]);
        return NO;
    }
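
A variant of the same idea applied directly to the question's loop (again only a sketch, assuming the Tweet entity and self.managedObjectContext from the question; it uses one autorelease pool per batch and [context reset] to drop the objects that were just saved):

    NSUInteger total = 200000;
    NSUInteger batchSize = 3000;   // tune by trial and error, as described above
    for (NSUInteger start = 1; start <= total; start += batchSize) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        NSUInteger end = MIN(start + batchSize - 1, total);
        for (NSUInteger i = start; i <= end; i++) {
            Tweet *tweetie = [NSEntityDescription insertNewObjectForEntityForName:@"Tweet"
                                                   inManagedObjectContext:self.managedObjectContext];
            tweetie.name = [NSString stringWithFormat:@"%10d", (int)i];
            tweetie.message = [NSString stringWithFormat:@"%10d", (int)i];
        }
        NSError *error = nil;
        NSLog(@"Saved through %lu", (unsigned long)end);
        if (![self.managedObjectContext save:&error]) {
            NSLog(@"Whoops, couldn't save: %@", [error localizedDescription]);
            [pool release];
            break;
        }
        [self.managedObjectContext reset];   // empty the context so it doesn't keep growing
        [pool release];
    }

The point in both versions is that every batch starts from an (almost) empty context, so the cost of each save stays roughly flat instead of growing with the number of objects already inserted.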

Let me know if anything is unclear.

