Best practice for inserting and retrieving data from memory

We have an application that receives data in real time and inserts it into a database. It runs for 4.5 hours a day. Every second we insert data into 17 tables. A user can at any time request the last second of data from any table, as well as some records from the history...

Feed processing and insertion are done by a C# console application...

User requests are served through a WCF service...

We found that insertion is our bottleneck; most of the time is spent there. We spent a lot of time tuning the tables and indexes, but the results were unsatisfactory.

Assuming we have enough memory, it seems best to insert the data into memory instead of the database. We are currently using DataTables that are updated and inserted into every second. One of our colleagues suggested a different approach: another WCF service, instead of the database, sitting between the feed handler and the user-request WCF service. This middle-tier WCF service would communicate over TCP and store the data in its own memory. Admittedly, the feed handler itself could serve user requests instead of having a middle tier between the two processes, but we want to keep things separate: if the feed handler fails, we still want to be able to serve the user the current records.

So, is it good practice to have a WCF service between the 2 processes? In other words, should the system have 3 tiers: the feed handler, the middle tier (WCF) holding the data in memory, and the user-request service (WCF)?
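To make the idea concrete, here is a rough sketch of what such a middle tier might look like. Everything in it (the names ITickCache, TickRow, Push, GetLatest, the queue-per-table layout) is invented for illustration, not taken from our actual code:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class TickRow
{
    [DataMember] public DateTime Timestamp { get; set; }
    [DataMember] public string Payload { get; set; }
}

[ServiceContract]
public interface ITickCache
{
    [OperationContract]
    void Push(string table, TickRow row);            // called by the feed handler every second

    [OperationContract]
    TickRow GetLatest(string table);                 // called by the user-request service

    [OperationContract]
    List<TickRow> GetHistory(string table, int lastN);
}

// Singleton service; ConcurrencyMode.Multiple because pushes and reads arrive concurrently.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class TickCache : ITickCache
{
    // One bounded queue of recent rows per table.
    private readonly ConcurrentDictionary<string, ConcurrentQueue<TickRow>> _tables =
        new ConcurrentDictionary<string, ConcurrentQueue<TickRow>>();

    private const int MaxHistory = 16200;            // 4.5 h of one-second rows

    public void Push(string table, TickRow row)
    {
        var q = _tables.GetOrAdd(table, _ => new ConcurrentQueue<TickRow>());
        q.Enqueue(row);
        while (q.Count > MaxHistory) q.TryDequeue(out _);   // trim the oldest rows
    }

    public TickRow GetLatest(string table) =>
        _tables.TryGetValue(table, out var q) ? q.LastOrDefault() : null;

    public List<TickRow> GetHistory(string table, int lastN) =>
        _tables.TryGetValue(table, out var q)
            ? q.Reverse().Take(lastN).Reverse().ToList()
            : new List<TickRow>();
}
```

The service would run in its own process, hosted with a ServiceHost and a NetTcpBinding endpoint, so both the feed handler and the user-request service talk to it over net.tcp and each survives the other's failure.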

Thanks in advance!

+3

[This answer's text did not survive the translation; the surviving fragments show it discussed the WCF middle-tier approach.]

+2

What about MySQL's MEMORY storage engine? It keeps a table's data entirely in RAM, so inserts and reads avoid disk I/O altogether.
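For illustration, here is roughly what that could look like. This is a sketch that assumes the MySql.Data connector; the table and column names are made up:

```csharp
using System;
using MySql.Data.MySqlClient;

class MemoryTableDemo
{
    static void Main()
    {
        using (var conn = new MySqlConnection(
            "Server=localhost;Database=feed;Uid=app;Pwd=secret;"))
        {
            conn.Open();

            // MEMORY tables live entirely in RAM: fast, but their contents
            // vanish when the server restarts.
            const string ddl = @"CREATE TABLE IF NOT EXISTS ticks_mem (
                                     ts    DATETIME NOT NULL,
                                     price DOUBLE   NOT NULL,
                                     INDEX ix_ts USING BTREE (ts)
                                 ) ENGINE=MEMORY;";
            using (var cmd = new MySqlCommand(ddl, conn))
                cmd.ExecuteNonQuery();

            // Inserts then hit memory only, no disk I/O on the hot path.
            using (var cmd = new MySqlCommand(
                "INSERT INTO ticks_mem (ts, price) VALUES (@ts, @price);", conn))
            {
                cmd.Parameters.AddWithValue("@ts", DateTime.UtcNow);
                cmd.Parameters.AddWithValue("@price", 101.25);
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```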

0

Are you using a DataTable with a DataAdapter? If so, I would recommend abandoning them entirely. Insert your records directly with a DbCommand. When users request reports, read the data with a DataReader, or populate DataTable objects using DataTable.Load(IDataReader).
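A sketch of that pattern, assuming SQL Server via System.Data.SqlClient (the table and column names are illustrative):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class DirectDataAccess
{
    // Insert one record with a parameterized command, no DataAdapter involved.
    public static void InsertTick(SqlConnection conn, DateTime ts, double price)
    {
        using (var cmd = new SqlCommand(
            "INSERT INTO Ticks (Ts, Price) VALUES (@ts, @price);", conn))
        {
            cmd.Parameters.AddWithValue("@ts", ts);
            cmd.Parameters.AddWithValue("@price", price);
            cmd.ExecuteNonQuery();
        }
    }

    // Read with a DataReader and hand the rows to a DataTable only at the end.
    public static DataTable ReadRecent(SqlConnection conn, int lastN)
    {
        using (var cmd = new SqlCommand(
            "SELECT TOP (@n) Ts, Price FROM Ticks ORDER BY Ts DESC;", conn))
        {
            cmd.Parameters.AddWithValue("@n", lastN);
            using (IDataReader reader = cmd.ExecuteReader())
            {
                var table = new DataTable();
                table.Load(reader);      // DataTable.Load(IDataReader)
                return table;
            }
        }
    }
}
```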

Keep in mind that holding the data only in memory risks losing it in the event of a power failure or a crash.


Source: https://habr.com/ru/post/1753144/

