How much time spent in thread synchronization is too much?

Today I profiled one of my C# applications using the Visual Studio 2010 performance analyzer. In particular, I profiled "Concurrency" because it seemed like my application should be performing better than it was. The analysis report showed that the threads spent ~70-80% of their time in a synchronization state.

To be honest, I'm not sure what that means. Does it mean the application is suffering from a livelock?

For context... there are ~30+ long-running threads associated with one AppDomain (if that matters), and some of the threads are very busy (for example, while(true) { _waitEvent.WaitOne(0); /* do stuff */ }).

I understand this is a rather vague question... I guess I'm looking for some clarification on what the thread synchronization status means. How much is too much, and why? Is ~75% really bad? Do I have too many threads? Or should I just start looking in other areas?

+6
3 answers

I'm not sure what that means.

It means that, on average, your threads spent 75% of their time waiting for another thread to finish some work.

Does it mean the application is suffering from a livelock?

Maybe!

To clarify for readers unfamiliar with the terms: a "deadlock" is when two threads each wait for the other to finish, and therefore they wait forever. A "livelock" is when two threads try to avoid a deadlock, but because of their poor choices they still spend most of their time waiting. Imagine, for example, a table with two people, one fork and one knife. Both want to pick up both utensils, use them, and then put them down. Suppose I pick up the knife and you pick up the fork. If we both decide to wait for the other to put his utensil down, we are deadlocked. If we both realize we are deadlocked, so I put down the knife and you put down the fork, and then I pick up the fork and you pick up the knife, we are livelocked. We can repeat this process indefinitely; we are both working to resolve the situation, but we are not communicating effectively enough to resolve it quickly.
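
To make the deadlock half of that concrete, here is a minimal C# sketch (the lock objects, timing, and thread bodies are invented purely for illustration): two threads take the same two locks in opposite order, so each ends up waiting forever for the lock the other one holds.

    using System;
    using System.Threading;

    class DeadlockSketch
    {
        // Two shared locks, analogous to the fork and the knife.
        static readonly object Fork = new object();
        static readonly object Knife = new object();

        static void Main()
        {
            var t1 = new Thread(() =>
            {
                lock (Knife)               // I pick up the knife...
                {
                    Thread.Sleep(100);     // ...pause long enough for the other thread to act...
                    lock (Fork) { }        // ...and wait forever for the fork.
                }
            });

            var t2 = new Thread(() =>
            {
                lock (Fork)                // You pick up the fork...
                {
                    Thread.Sleep(100);
                    lock (Knife) { }       // ...and wait forever for the knife.
                }
            });

            t1.Start();
            t2.Start();
            Console.WriteLine("Both threads are now (very likely) deadlocked.");
        }
    }

A livelock is the variant where both threads keep noticing the problem, releasing their lock, and re-acquiring the wrong one again, so they stay busy without ever making progress.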

However, I assume you are not in a livelock situation. My guess is rather that you simply have enormous contention for a small number of critical resources that can only be accessed by one thread at a time. Occam's razor suggests you should prefer the simple hypothesis (lots of threads taking turns using a scarce resource) over the complex hypothesis (a whole bunch of threads trying to tell each other "no, you go first").

There are ~30+ long-running threads associated with one AppDomain (if that matters), and some of the threads are very busy (for example, while(true) { _waitEvent.WaitOne(0); /* do stuff */ }).

Sounds awful.
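
To spell out why that pattern is worrying, here is a hedged sketch (this is not the poster's actual code; _waitEvent and DoStuff are placeholders): WaitOne(0) only checks the event and returns immediately, so the loop spins flat out, whereas WaitOne() with no timeout blocks until the event is actually signaled and costs no CPU while waiting.

    using System.Threading;

    class WaitSketch
    {
        // Busy polling: WaitOne(0) returns at once whether or not the event
        // is set, so this loop burns a core even when there is nothing to do.
        static void PollingLoop(AutoResetEvent waitEvent)
        {
            while (true)
            {
                waitEvent.WaitOne(0);   // returns immediately; result ignored
                DoStuff();              // runs continuously, signaled or not
            }
        }

        // Blocking wait: the thread sleeps inside WaitOne() until another
        // thread calls waitEvent.Set(), consuming no CPU in the meantime.
        static void BlockingLoop(AutoResetEvent waitEvent)
        {
            while (true)
            {
                waitEvent.WaitOne();    // blocks until signaled
                DoStuff();              // runs only when there is work to do
            }
        }

        static void DoStuff() { /* placeholder for the real work */ }
    }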

I understand that this is a rather vague question.

Yes it is.

How much is too much and why?

Well, suppose you were trying to drive around the city, and you and every other driver in the city spent 75% of your time stopped at traffic lights, waiting on other drivers. You tell me: is that too much, and why? Spending an hour in traffic to make a 15-minute drive can be quite acceptable to some people and completely unacceptable to others. Every time I took SR 520 at rush hour, I spent an hour in traffic to cover a distance that should take 15 minutes; that was unacceptable to me, so now I take the bus.

Whether this lousy performance is acceptable to you and your customers is your call. Fixing performance problems is expensive. The question you should ask is what benefit you will get in return for taking on the cost of diagnosing and fixing the problem.

Is ~75% really bad?

Your threads are taking four times longer to do their work than they need to. That doesn't seem too good to me.

Do I have too many threads?

Almost certainly, yes. 30 is a lot.

But that is entirely the wrong technical question to ask in your situation. Asking "Do I have too many threads?" is like trying to fix a traffic jam by asking "Are there too many cars in this city?" The right question is "Why are there so many traffic lights in this city where there could be freeways?" The problem isn't the threads; the problem is that they are waiting on each other instead of driving to their destinations without stopping.

Should I just start looking in other areas?

How should we know?

+14

Without knowing the structure of your program, I can only tell you what synchronization means with respect to threads; I can't tell you what the problem in your program is.

Synchronization basically means that when your threads run in parallel, you coordinate their access to the resources they share so that they cannot seriously damage your data.

If, for example, you have a string that two of your threads write to, then without thread synchronization (using AutoResetEvents, semaphores, locks, etc.) one thread might be in the middle of changing the string when the OS interrupts it (perhaps its time slice was up), and the second thread then starts reading from that string while it is in an undefined state. That wreaks havoc in your program. To avoid this, and many other possible errors that threads can cause, you synchronize access to the shared resource by locking it, so that only one thread at a time can write to or read from it; any other thread that wants to do so while the first one holds the lock has to wait until that thread releases it.
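
A minimal sketch of that idea (the class and field names are made up for illustration): a private lock object guards a shared StringBuilder so that only one thread at a time can modify or read it.

    using System.Text;

    class SharedText
    {
        private readonly object _gate = new object();            // guards _buffer
        private readonly StringBuilder _buffer = new StringBuilder();

        // Only one thread at a time can be inside either method,
        // so no thread ever observes the buffer in a half-written state.
        public void Append(string text)
        {
            lock (_gate)
            {
                _buffer.Append(text);
            }
        }

        public string Read()
        {
            lock (_gate)
            {
                return _buffer.ToString();
            }
        }
    }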

This is a very simplified explanation of what thread synchronization is and what it is for.
There is much more that comes with threads, but that is the subject of a whole book.

As for your "Synchronization" status, I would guess it means that many of your threads are spending their time waiting on other threads that hold some shared resource.

Essentially, this means your program isn't really running concurrently; it does everything sequentially, because the threads spend their time waiting on each other. It suggests the program isn't written in a way that truly runs concurrently, which, by the way, is not necessarily easy to achieve, depending on the situation, obviously.

Hope this helps.

+2

There are a couple of ways in which thread synchronization can kill performance:

  • The instructions that implement synchronization take time to execute, especially synchronization routines that require a switch to kernel mode, such as Thread.Sleep(). In the worst case, even a single thread that calls synchronization routines frequently pays significant overhead for no real benefit.
  • Whenever multiple threads need exclusive access to a resource at the same time, at least one thread ends up stuck waiting. The worst case is some central resource that everyone has to access frequently; then the multi-threaded code is in serious danger of becoming an expensive, slow, and complicated way of using one thread at a time (see the sketch after this list).
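
A hedged sketch of that worst case (the lock, the counter, and the per-item "work" are invented for illustration): every thread holds the same lock for almost all of its work, so adding threads mostly adds waiting rather than throughput.

    using System;
    using System.Threading;

    class ContentionSketch
    {
        static readonly object CentralResource = new object();
        static long _total;

        static void Main()
        {
            var threads = new Thread[8];
            for (int i = 0; i < threads.Length; i++)
            {
                threads[i] = new Thread(() =>
                {
                    for (int n = 0; n < 1000; n++)
                    {
                        // The lock is held for the whole unit of work, so at any
                        // moment only one of the 8 threads is making progress;
                        // the other 7 show up as "synchronization" time.
                        lock (CentralResource)
                        {
                            Thread.SpinWait(10000);   // stand-in for real work
                            _total++;
                        }
                    }
                });
                threads[i].Start();
            }

            foreach (var t in threads) t.Join();
            Console.WriteLine(_total);   // 8000, produced essentially serially
        }
    }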

As for how much is too much: synchronization is something that takes time but does no useful work itself, so from a pure performance standpoint the ideal amount of synchronization is zero. Hence the high value of shared-nothing architectures and immutable data structures; both are techniques that help organize code in a way that eliminates or reduces the need for synchronization.
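
As an illustration of the shared-nothing idea (the partitioning and the work are hypothetical): each task computes its own private partial result with no locks at all, and the results are only combined once, after the tasks finish.

    using System;
    using System.Linq;
    using System.Threading.Tasks;

    class SharedNothingSketch
    {
        static void Main()
        {
            int[] data = Enumerable.Range(1, 1000000).ToArray();
            int workers = 4;
            int chunk = data.Length / workers;

            // Each task sums its own slice into a local variable:
            // no shared mutable state, therefore no locks and no waiting.
            var tasks = new Task<long>[workers];
            for (int w = 0; w < workers; w++)
            {
                int start = w * chunk;
                int end = (w == workers - 1) ? data.Length : start + chunk;
                tasks[w] = Task.Run(() =>
                {
                    long local = 0;
                    for (int i = start; i < end; i++) local += data[i];
                    return local;
                });
            }

            // The only coordination is waiting for the results at the end.
            long total = tasks.Sum(t => t.Result);
            Console.WriteLine(total);   // 500000500000
        }
    }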

Of course, the world isn't ideal, so some synchronization is usually unavoidable. But even then it should be done with the lightest-weight constructs available. For example, don't use a lock statement where an Interlocked method will do. Or reduce how often synchronization has to happen by having threads deliver their work to the central data structure in batches instead of making lots of high-frequency updates.
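
A sketch of both ideas (the class, field names, and batch size are made up): Interlocked.Increment replaces a lock for a simple counter, and a thread-local batch cuts how often the shared queue's lock has to be taken.

    using System.Collections.Generic;
    using System.Threading;

    class LightweightSyncSketch
    {
        // 1) A simple counter does not need a lock at all:
        private static long _processedCount;

        public static void RecordProcessed()
        {
            Interlocked.Increment(ref _processedCount);   // single atomic operation
        }

        // 2) Batch updates to a shared collection instead of locking per item:
        private static readonly object _queueGate = new object();
        private static readonly Queue<string> _sharedQueue = new Queue<string>();

        public static void Producer(IEnumerable<string> items)
        {
            var batch = new List<string>(100);            // thread-local, no locking
            foreach (var item in items)
            {
                batch.Add(item);
                if (batch.Count == 100) Flush(batch);
            }
            if (batch.Count > 0) Flush(batch);
        }

        private static void Flush(List<string> batch)
        {
            lock (_queueGate)                             // one lock per 100 items,
            {                                             // not one lock per item
                foreach (var item in batch) _sharedQueue.Enqueue(item);
            }
            batch.Clear();
        }
    }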

+1

Source: https://habr.com/ru/post/902691/

