DataGridView performance - does it depend on the video card?

I have a C#/.NET application which seems to spend most of its processor time doing updates to a DataGridView. I manually update the data every 1.5 seconds, touching only the data that has changed. That comes to approximately 250 updates every 1.5 seconds. I would like to reduce the 1.5-second interval to something much smaller (perhaps 0.5 seconds). I have profiled and optimized as much as possible, and while performance is acceptable, I would like it to be faster.

My question is: will upgrading from an Nvidia FX1800 to an Nvidia FX3800 speed this up significantly?

+3
4 answers

GDI+ does not benefit much from the GPU, since it uses only the basic accelerated operations that every graphics card supports (lines, rectangles, etc.).

I would suggest that the problem is that you are not hiding the control during updates. Check for BeginUpdate/EndUpdate methods, if available. If not, setting Visible to false, performing the updates, and then setting Visible back to true may solve the problem.
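DataGridView itself has no BeginUpdate/EndUpdate pair, so here is a minimal sketch of one common substitute: suppressing repaints with the Win32 WM_SETREDRAW message while a batch of cell updates is applied. The names `grid`, `BatchUpdate`, and `applyUpdates` are placeholders, not anything from the original application.

```csharp
// Sketch: suppress repaints while applying a batch of cell updates,
// then repaint once at the end. Assumes a WinForms DataGridView.
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

static class GridUpdateHelper
{
    private const int WM_SETREDRAW = 0x000B;

    [DllImport("user32.dll")]
    private static extern IntPtr SendMessage(IntPtr hWnd, int msg,
                                             IntPtr wParam, IntPtr lParam);

    public static void BatchUpdate(DataGridView grid, Action applyUpdates)
    {
        // Stop the control from repainting after every individual change.
        SendMessage(grid.Handle, WM_SETREDRAW, IntPtr.Zero, IntPtr.Zero);
        try
        {
            applyUpdates();
        }
        finally
        {
            // Re-enable painting and force one repaint for the whole batch.
            SendMessage(grid.Handle, WM_SETREDRAW, (IntPtr)1, IntPtr.Zero);
            grid.Invalidate();
        }
    }
}
```

Compared with toggling Visible, this avoids the flicker and focus changes that hiding and re-showing the control can cause.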

+1

Your time is almost certainly not spent drawing to the screen, but in updating the internal representation of the data, so no, upgrading the video card won't help.

EDIT:

To find out where the time is spent, use a profiler. I personally prefer the Red Gate one.
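Short of a full profiler, a crude first split can tell you whether each refresh cycle is dominated by data updates or by painting. A minimal sketch: `updateChangedCells` is a placeholder delegate standing in for the poster's own update routine.

```csharp
// Sketch: time the data-update phase and the paint phase separately.
// Not a substitute for a real profiler, but enough for a rough split.
using System;
using System.Diagnostics;
using System.Windows.Forms;

static class RefreshTiming
{
    public static void Measure(DataGridView grid, Action updateChangedCells)
    {
        var sw = Stopwatch.StartNew();
        updateChangedCells();              // mutate the ~250 changed cells
        long updateMs = sw.ElapsedMilliseconds;

        sw.Restart();
        grid.Refresh();                    // force a synchronous repaint
        long paintMs = sw.ElapsedMilliseconds;

        Console.WriteLine($"data update: {updateMs} ms, paint: {paintMs} ms");
    }
}
```

If the paint time is small relative to the update time, that confirms a faster GPU would make no difference.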

+1


Jacob

+1

I used a DataGridView presenting a list of 1000 elements in an application where the data (and by that I mean the entire list) is updated every 50 ms, with virtually no performance penalty. Therefore, I do not believe that what you want to do is too much for a DataGridView.
As Eric J. suggests, you should profile your code; I'm sure you will find that your performance problem comes from a different place in the code.
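A minimal sketch of the kind of setup described above: a DataGridView bound once to a BindingList whose items raise PropertyChanged, with a WinForms timer mutating a subset of rows every 50 ms. All type and member names here (Quote, Symbol, Price, QuoteForm) are invented for illustration, not from the answerer's application.

```csharp
// Sketch: frequent partial updates through data binding. Only rows
// whose Price actually changes raise a change notification.
using System;
using System.ComponentModel;
using System.Windows.Forms;

class Quote : INotifyPropertyChanged
{
    private decimal price;
    public string Symbol { get; set; }
    public decimal Price
    {
        get => price;
        set
        {
            if (price == value) return;   // notify only on real changes
            price = value;
            PropertyChanged?.Invoke(this,
                new PropertyChangedEventArgs(nameof(Price)));
        }
    }
    public event PropertyChangedEventHandler PropertyChanged;
}

class QuoteForm : Form
{
    public QuoteForm()
    {
        var grid = new DataGridView { Dock = DockStyle.Fill, ReadOnly = true };
        var list = new BindingList<Quote>();
        for (int i = 0; i < 1000; i++)
            list.Add(new Quote { Symbol = "SYM" + i, Price = 100m });

        grid.DataSource = list;           // bind once; updates flow through
        Controls.Add(grid);

        var timer = new Timer { Interval = 50 };
        var rng = new Random();
        timer.Tick += (s, e) =>
        {
            // Touch ~250 rows per tick; the binding repaints those rows.
            for (int n = 0; n < 250; n++)
                list[rng.Next(list.Count)].Price += 0.01m;
        };
        timer.Start();
    }
}
```

Binding once and letting change notifications drive the repaint avoids rebuilding rows on every tick.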

0

Source: https://habr.com/ru/post/1714899/

