I'm getting some confusing Stopwatch results in my C# project. Consider the following code:
static void Main(string[] args)
{
    byte[] myEventArray = GetEventByteArrayFromDatabase();
    byte[] myEventItemsArray = GetEventItemByteArrayFromDatabase();
    uint numEvents = 1000;
    uint numEventItems = 1000;

    Stopwatch sw1 = Stopwatch.StartNew();
    TestFunction(ref myEventArray, numEvents, ref myEventItemsArray, numEventItems);
    sw1.Stop();

    float timeTakenInSeconds = (float)sw1.ElapsedTicks / Stopwatch.Frequency;
    Console.WriteLine("Total time: " + timeTakenInSeconds + " seconds.");
}

static void TestFunction(ref byte[] EventArray, uint numEvents, ref byte[] EventItemArray, uint numEventItems)
{
    Calculator calc = new Calculator();
    calc.Test(EventArray, numEvents, EventItemArray, numEventItems);
}
I run this and get a time of about 0.2 seconds. Now consider the following:
static void Main(string[] args)
{
    byte[] myEventArray = GetEventByteArrayFromDatabase();
    byte[] myEventItemsArray = GetEventItemByteArrayFromDatabase();
    uint numEvents = 1000;
    uint numEventItems = 1000;

    Stopwatch sw1 = Stopwatch.StartNew();
    Calculator calc = new Calculator();
    calc.Test(myEventArray, numEvents, myEventItemsArray, numEventItems);
    sw1.Stop();

    float timeTakenInSeconds = (float)sw1.ElapsedTicks / Stopwatch.Frequency;
    Console.WriteLine("Total time: " + timeTakenInSeconds + " seconds.");
}
I run this and get a similar result, as expected. Finally, check this out:
static void Main(string[] args)
{
    byte[] myEventArray = GetEventByteArrayFromDatabase();
    byte[] myEventItemsArray = GetEventItemByteArrayFromDatabase();
    uint numEvents = 1000;
    uint numEventItems = 1000;

    TestFunction(ref myEventArray, numEvents, ref myEventItemsArray, numEventItems);
}

static void TestFunction(ref byte[] EventArray, uint numEvents, ref byte[] EventItemArray, uint numEventItems)
{
    Stopwatch sw1 = Stopwatch.StartNew();
    Calculator calc = new Calculator();
    calc.Test(EventArray, numEvents, EventItemArray, numEventItems);
    sw1.Stop();

    float timeTakenInSeconds = (float)sw1.ElapsedTicks / Stopwatch.Frequency;
    Console.WriteLine("Total time: " + timeTakenInSeconds + " seconds.");
}
When I run this, the timing result is about ten times faster for some reason. Any ideas why this could be?
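In case my manual tick-to-seconds conversion is part of the problem, I can also print the elapsed time via the Stopwatch's own properties as a sanity check, e.g.:

    // Sanity check on the measurement itself: let Stopwatch do the conversion
    // instead of dividing ElapsedTicks by Frequency by hand.
    Console.WriteLine("Elapsed: " + sw1.Elapsed.TotalSeconds + " s (" + sw1.ElapsedMilliseconds + " ms)");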
A bit more information: the Calculator class is defined in C++/CLI. I use it as a wrapper for my own C++ code, which ultimately works on the byte arrays. I also compile with the "unsafe" flag; not sure if this could have any effect. All code is compiled in Release mode.
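For what it's worth, one experiment I could still run is timing both variants back to back in the same process, so both calls see the same process state. A rough sketch (using the names from the snippets above, and the TestFunction from the first snippet, i.e. the one without a Stopwatch inside it):

    // Time the call through TestFunction and the direct call in one run,
    // using the same arrays and counts as above.
    Stopwatch swWrapped = Stopwatch.StartNew();
    TestFunction(ref myEventArray, numEvents, ref myEventItemsArray, numEventItems);
    swWrapped.Stop();

    Stopwatch swDirect = Stopwatch.StartNew();
    Calculator calc = new Calculator();
    calc.Test(myEventArray, numEvents, myEventItemsArray, numEventItems);
    swDirect.Stop();

    Console.WriteLine("Via TestFunction: " + swWrapped.Elapsed.TotalSeconds + " s");
    Console.WriteLine("Direct call:      " + swDirect.Elapsed.TotalSeconds + " s");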