I need a partitioning function for my Windows service instances

I have a table with a Guid primary key and many rows. I also have a Windows service that performs a series of actions on each row (which may involve reading and writing data in other databases), so processing a single row takes a long time (about 100 seconds on average).

My Windows service works as follows:

public class MyDto
{
    public Guid Id { get; set; }
}

while (true)
{
    if (time to start)
    {
        List<MyDto> rows = LoadData();
        foreach (MyDto obj in rows)
        {
            Process(obj); // takes about 100 sec on average
        }
    }
}

I need to reduce the total time it takes to process all the rows. For this reason, I decided to run more instances of my Windows service. Therefore, each service instance needs to process its own subset of rows.

I parameterized my LoadData() function:

public List<MyDto> LoadData(int winServInstanceNumber)
{
}

So, I need a partitioning function that depends on the total number of service instances and on the number of the particular instance.

Can you offer something better than

// on the .NET side
obj.Id.GetHashCode() % totalWinServiceInstancesCount

or

-- on the SQL side
ABS(CAST(HASHBYTES('MD5', CAST(id AS varbinary(16))) AS int)) % @totalWinServiceInstancesCount
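For context, here is a minimal sketch of how the SQL-side variant above could plug into my parameterized LoadData. The connection string, table and column names, and the extra totalWinServiceInstancesCount parameter are placeholders for illustration only:

// Sketch only; names not shown in the question (connection string, table name,
// the extra count parameter) are placeholders.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

public List<MyDto> LoadData(int winServInstanceNumber, int totalWinServiceInstancesCount)
{
    var rows = new List<MyDto>();

    // Each instance selects only the rows whose hashed Id maps to its own number.
    const string sql = @"
        SELECT Id
        FROM dbo.MyTable
        WHERE ABS(CAST(HASHBYTES('MD5', CAST(Id AS varbinary(16))) AS int))
              % @total = @instance;";

    using (var conn = new SqlConnection("<my connection string>"))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@total", totalWinServiceInstancesCount);
        cmd.Parameters.AddWithValue("@instance", winServInstanceNumber);
        conn.Open();

        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                rows.Add(new MyDto { Id = reader.GetGuid(0) });
            }
        }
    }

    return rows;
}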
+5

2 answers

It seems that all you really need to do is spin up more threads to process your data. But to do that you need to keep track of what is being processed so that you don't process the same row twice. For that control you can use MSMQ, for example, or a System.Collections.Queue. Your service should be responsible for querying the database and loading the raw rows into your queue.

Then you can call some static ProcessBatch method. It will pull from the queue, spin up the thread(s), and pass the row id(s) to the processor(s)/workers. A worker processes only one row at a time. A worker may or may not be a separate EXE. Your ProcessBatch should keep track of what is and is not being processed, and it must control how many threads are currently running; you don't want to spin up too many.

So,

Service                 ProcessControl           Worker
   |                          |                     |
   |---Load Queue             |                     |
   |      |                   |                     |
   |<-----|                   |                     |
   |                          |                     |
   |-----Call When Q -------->|---Queue             |
   |                          |      |              |
   |                          |<-----|              |
   |                          |                     |
   |---Load Queue             |----Start----------->|
   |      |                   |<---Success----------|
   |<-----|                   |                     |
   |                          |---Permanent         |
   |-----Call When Q -------->|   Dequeue           |
   |                          |      |              |
   |                          |<-----|              |

This is a fairly typical way of splitting up a workload to speed up otherwise slow processing.
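To make the idea concrete, here is a minimal sketch of the ProcessControl part, assuming an in-memory ConcurrentQueue instead of MSMQ. MaxWorkers and the worker delegate are placeholders you would replace with your own configuration and row processing:

// Sketch only; MaxWorkers and the worker callback are illustrative placeholders.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class ProcessControl
{
    private static readonly ConcurrentQueue<Guid> queue = new ConcurrentQueue<Guid>();
    private const int MaxWorkers = 4; // don't spin up too many threads

    // The service calls this after querying the database.
    public static void LoadQueue(IEnumerable<Guid> rowIds)
    {
        foreach (var id in rowIds)
            queue.Enqueue(id); // each id is queued exactly once
    }

    // The service calls this once the queue has been loaded.
    public static void ProcessBatch(Action<Guid> worker)
    {
        var tasks = new List<Task>();
        for (int i = 0; i < MaxWorkers; i++)
        {
            tasks.Add(Task.Run(() =>
            {
                // Each worker dequeues its own ids, so no row is processed twice.
                while (queue.TryDequeue(out Guid id))
                    worker(id); // a worker processes one row at a time
            }));
        }
        Task.WaitAll(tasks.ToArray());
    }
}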

+1

Instead of trying to run multiple instances of the same service, you should adopt an asynchronous producer/consumer pattern. Use a Task to launch the producer, and then create many consumers. If your data must be processed in a certain order, then you will have to constrain each consumer to work only on its assigned block of data. Otherwise, the consumers can simply pick up work as it becomes available and start processing it.

The following example assumes that work can be processed in any order. You can fine-tune the number of consumers based on system resources. Use an AppSetting to configure MaxConsumer and find the number that optimizes processing.

Hook the Start/Stop methods up to the service's start/stop events, and add whatever logging and exception handling you need. The example here is simplified and shows the basics of the pattern.

public class MyService
{
    BlockingCollection<MyDto> sharedResource = new BlockingCollection<MyDto>();
    CancellationTokenSource cancellation;
    private Task producer;
    private List<Task> consumers;

    // Load/Set this from configuration
    private static readonly int MaxConsumer = 3;

    public void Start()
    {
        this.cancellation = new CancellationTokenSource();

        // Start the producer & consumers as long-running tasks
        this.producer = Task.Factory.StartNew(() => this.Produce(),
                                              TaskCreationOptions.LongRunning);

        this.consumers = new List<Task>();
        for (int i = 0; i < MaxConsumer; i++)
        {
            this.consumers.Add(Task.Factory.StartNew(() => this.Consume(),
                                                     TaskCreationOptions.LongRunning));
        }

        // If you need a primary service loop you can do
        // something like the following:
        // while (!this.cancellation.IsCancellationRequested)
        // {
        //     this.cancellation.Token.WaitHandle.WaitOne(1000);
        // }
    }

    public void Stop()
    {
        this.cancellation.Cancel();

        WaitOnTask(producer);
        foreach (var t in this.consumers)
        {
            WaitOnTask(t);
        }

        this.cancellation.Dispose();
    }

    private void WaitOnTask(Task task)
    {
        try
        {
            if (!task.IsCompleted)
            {
                // May want to use a timeout
                // instead of blindly waiting
                task.Wait();
            }
        }
        catch (ObjectDisposedException oex)
        {
            // Task might have been disposed/closed already
        }
    }

    public void Produce()
    {
        var token = this.cancellation.Token;
        while (!token.IsCancellationRequested)
        {
            // Code for your data loading
            if (time to start)
            {
                List<MyDto> rows = LoadData();
                foreach (var data in rows)
                {
                    this.sharedResource.Add(data, token);
                }
            }

            // Wait and repeat
            token.WaitHandle.WaitOne(1000);
        }
    }

    public void Consume()
    {
        var token = this.cancellation.Token;
        try
        {
            foreach (var data in this.sharedResource.GetConsumingEnumerable(token))
            {
                // Code for your data processing
                Process(data);
            }
        }
        catch (OperationCanceledException ex)
        {
            // Service stop requested; can log here
            // or take action to save state as needed
        }
    }
}
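For completeness, here is one way the Start/Stop wiring could look in a standard ServiceBase host. MyWindowsService is just an illustrative name; MyService is the class from the example above:

// Illustration only; MyWindowsService is a hypothetical host class.
using System.ServiceProcess;

public class MyWindowsService : ServiceBase
{
    private readonly MyService worker = new MyService();

    protected override void OnStart(string[] args)
    {
        // Start the producer and consumer tasks
        worker.Start();
    }

    protected override void OnStop()
    {
        // Signal cancellation and wait for the tasks to finish
        worker.Stop();
    }
}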
+1
