Recently I've been reading a lot about software (mainly scientific/mathematical and cryptographic) that offloads part of its computation to the GPU, yielding speedups of 100-1000x (!) for supported operations.
Is there a library, API, or other way to run something on the GPU from C#? I'm thinking of a simple Pi calculation. I have a GeForce 8800 GTX, if that's relevant at all (though I'd prefer a solution that isn't tied to a specific card).
Alex Aug 08 '09 at 21:02