I store arrays of doubles in an NSData * object, which is persisted as a binary attribute in my Core Data (SQLite) data model. I do this to store selected data for graphical display in an iPhone application. Sometimes, when the binary contains more than about 300 doubles, not all of the doubles get stored to disk. When I quit and relaunch my application, anywhere from as few as 25 up to all 300 of the data points may have been saved.
I have tried using NSSQLitePragmasOption with synchronous = FULL, and that may make a difference, but it's hard to say because the bug is intermittent.
Given the warnings about performance issues resulting from using synchronous = FULL, I'm looking for tips and pointers.
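For reference, this is roughly how the pragma option can be passed when adding the persistent store; this is a sketch, and persistentStoreCoordinator and storeURL are placeholders for my own stack setup:

    NSDictionary *pragmaOptions = [NSDictionary dictionaryWithObject:@"FULL"
                                                               forKey:@"synchronous"];
    NSDictionary *storeOptions = [NSDictionary dictionaryWithObject:pragmaOptions
                                                              forKey:NSSQLitePragmasOption];

    NSError *error = nil;
    if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                   configuration:nil
                                                             URL:storeURL
                                                         options:storeOptions
                                                           error:&error])
    {
        NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
    }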
Thanks.
[Edit: here is the code.]
The goal (still unrealized) of -addToCache: is to append each new datum to the cache, but only periodically flush the cache back to the stored dataSet attribute (a throttled version is sketched after the listing).
From Data.m
@dynamic dataSet; // NSData * attribute of Data entity
- (void)addDatum:(double_t)datum
{
    DLog(@"-[Data addDatum:%f]", datum);
    [self addToCache:datum];
}

- (void)addToCache:(double_t)datum
{
    // Lazily copy the stored data into a mutable cache the first time a datum arrives.
    if (cache == nil)
    {
        cache = [NSMutableData dataWithData:[self dataSet]];
        [cache retain];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    DLog(@"-[Data addToCache:%f] ... [cache length] = %d; cache = %p", datum, [cache length], cache);
    [self flushCache];
}
- (void)wrapup
{
    DLog(@"-[Data wrapup]");
    [self flushCache];
    [cache release];
    cache = nil;
    DLog(@"[self isFault] = %@", [self isFault] ? @"YES" : @"NO"); // [self isFault] is always NO.
}
- (void)flushCache
{
    DLog(@"flushing cache to store");
    [self setDataSet:cache];
    DLog(@"-[Data flushCache] [[self dataSet] length] = %d", [[self dataSet] length]);
}
- (double *)bytes
{
    return (double *)[[self dataSet] bytes];
}

- (NSInteger)count
{
    return [[self dataSet] length] / sizeof(double);
}
- (void)dump
{
    ALog(@"Dump Data");
    NSInteger numDataPoints = [self count];
    double *data = (double *)[self bytes];
    ALog(@"numDataPoints = %d", numDataPoints);
    for (int i = 0; i < numDataPoints; i++)
        ALog(@"data[%d] = %f", i, data[i]); // log each stored value
}
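What I would eventually like -addToCache: to do is sketched below: push the cache into dataSet only every so many points, and save the managed object context so the bytes actually reach the SQLite store. This is only a sketch of the intent, not code I currently run; FLUSH_INTERVAL is a hypothetical constant.

    #define FLUSH_INTERVAL 32  // hypothetical: flush every 32 data points

    - (void)addToCache:(double_t)datum
    {
        if (cache == nil)
        {
            cache = [NSMutableData dataWithData:[self dataSet]];
            [cache retain];
        }
        [cache appendBytes:&datum length:sizeof(double_t)];

        // Only copy the cache back into the modeled attribute every FLUSH_INTERVAL points.
        if (([cache length] / sizeof(double_t)) % FLUSH_INTERVAL == 0)
        {
            [self flushCache];

            // Without a save, the new bytes exist only in the context, not in the SQLite file.
            NSError *error = nil;
            if (![[self managedObjectContext] save:&error])
            {
                DLog(@"save failed: %@, %@", error, [error userInfo]);
            }
        }
    }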