I am creating a Windows Forms application in C# that reads hundreds of files and builds a hierarchy of objects. In particular:
DEBUG[14]: Imported 129 system/s, 6450 query/s, 6284293 document/s.
The numbers are the total counts of objects created. The objects themselves are really simple, by the way: only a few int/string properties and strongly typed lists inside.
Question: Is it normal that my application consumes about 700 MB of memory (in debug mode)? What can I do to reduce memory usage?
EDIT: here is why I have 6284293 objects, in case you are curious. Imagine a search engine, called a "system". Each system contains many queries.
public class System
{
    public List<Query> Queries;
}
Each query object refers to a "topic", which is the subject of the search (for example, the query "Italian weekend"), and holds a list of retrieved documents:
public class Query
{
    public Topic Topic;
    public List<RetrievedDocument> RetrievedDocuments;
    public System System;
}
Each retrieved document has a score and a rank, and holds a reference to a topic document:
public class RetrievedDocument
{
    public string Id;
    public int Rank;
    public double Score;
    public Document Document;
}
Each topic has a set of documents inside that may or may not be relevant, and each document has a link back to its parent topic:
public class Topic
{
    public int Id;
    public List<Document> Documents;
    public List<Document> RelevantDocuments
    {
        // IsRelevant is a field, not a method; ToList() is needed because Where() returns IEnumerable<Document>.
        get { return Documents.Where(d => d.IsRelevant).ToList(); }
    }
}
public class Document
{
    public string Id;
    public bool IsRelevant;
    public Topic Topic;
}
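As a point of reference, here is a rough back-of-envelope sketch of the per-instance cost of RetrievedDocument alone. The sizes are my assumption (a 64-bit CLR with roughly 16 bytes of object header, 8 bytes per reference, fields padded to 8-byte boundaries), not something I measured:

static void EstimateRetrievedDocumentMemory()
{
    // Assumed layout (not measured):
    // header (16) + Id reference (8) + Rank (4) + Score (8) + Document reference (8)
    // ≈ 48 bytes after padding, excluding the Id string and the Document instance.
    const long retrievedDocumentCount = 6284293;
    const long approxBytesPerRetrievedDocument = 48;
    double megabytes = retrievedDocumentCount * approxBytesPerRetrievedDocument / (1024.0 * 1024.0);
    Console.WriteLine(megabytes); // ≈ 288 MB for the RetrievedDocument objects alone
}

The Id strings and the Document/Topic objects come on top of that.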
There are 129 systems and 50 main topics (129 * 50 = 6450 query objects), and each query has a different number of retrieved documents, 6284293 in total. I need this hierarchy to perform some calculations (average precision, topic ease, mean average precision per system, relevance). Here's how TREC works...
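For example, this is a minimal sketch (not my real code; AveragePrecision is just an illustrative name, and it needs using System.Linq;) of how average precision could be computed for a single query from this hierarchy, assuming the standard TREC definition:

public static double AveragePrecision(Query query)
{
    // Number of documents judged relevant for this query's topic.
    int totalRelevant = query.Topic.RelevantDocuments.Count;
    if (totalRelevant == 0) return 0.0;

    double sum = 0.0;
    int relevantSoFar = 0;
    int position = 0;

    // Walk the ranking in order and accumulate precision at each relevant hit.
    foreach (var retrieved in query.RetrievedDocuments.OrderBy(r => r.Rank))
    {
        position++;
        if (retrieved.Document != null && retrieved.Document.IsRelevant)
        {
            relevantSoFar++;
            sum += (double)relevantSoFar / position;
        }
    }
    return sum / totalRelevant;
}

Mean average precision for a system would then just be the mean of this value over the system's 50 queries.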