Dec 16, 2007 at 2:24 AM
Does anyone have experience with this .NET class? If so, can you explain why these two statements give me different results, usually by a factor of two?

float fps = (float)frameCount / ((float)timer.ElapsedMilliseconds / 1000.0f);  // Where timer is a Stopwatch
float fps = (float)frameCount / ((float)timer.ElapsedTicks / (float)TimeSpan.TicksPerSecond);  // Again, timer is a Stopwatch

This is driving me nuts. I need better than millisecond accuracy, but even when the elapsed time should be in the tens of milliseconds, these two lines give different results. Am I just missing something completely obvious here? I'm so close to just P/Invoking QueryPerformanceCounter and QueryPerformanceFrequency, but I need Xbox compatibility.
Dec 16, 2007 at 2:38 AM
You could try using 10,000,000 instead of TicksPerSecond
Dec 16, 2007 at 2:45 AM
Edited Dec 16, 2007 at 2:45 AM
Also, ElapsedMilliseconds sometimes gives me trouble: I've seen frames that take less than a single millisecond, so zero ends up being reported.

This isn't much of a problem later on, once the engine is larger; getting under a millisecond per frame becomes pretty much impossible.
Dec 16, 2007 at 10:25 AM
Ticks is an arbitrary unit and not really usable for real performance measurement. I tend to just use QueryPerformanceCounter, as it gives the most accurate measurements. Stopwatch is meant to use that, if available.
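For what it's worth, you can check whether Stopwatch is actually backed by the high-resolution counter and what its tick rate is. A small sketch (not from the thread, just the standard Stopwatch static members):

```csharp
using System;
using System.Diagnostics;

class StopwatchInfo
{
    static void Main()
    {
        // True when Stopwatch wraps the high-resolution performance counter
        // (QueryPerformanceCounter on Windows); false when it falls back to
        // DateTime ticks.
        Console.WriteLine("High resolution: " + Stopwatch.IsHighResolution);

        // Ticks per second of the underlying timer. This equals
        // TimeSpan.TicksPerSecond (10,000,000) only in the fallback case.
        Console.WriteLine("Frequency: " + Stopwatch.Frequency);
    }
}
```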
Dec 16, 2007 at 5:29 PM
It's arbitrary, but using TicksPerSecond as a conversion factor should at least be consistent, and it's not.
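For anyone hitting the same wall: the likely cause is that Stopwatch.ElapsedTicks is measured in units of Stopwatch.Frequency, not TimeSpan ticks, so TimeSpan.TicksPerSecond is the wrong divisor whenever a high-resolution timer is in use. A sketch of the consistent conversions (the Stopwatch and frameCount here are placeholders, not the poster's actual code):

```csharp
using System;
using System.Diagnostics;

class FpsExample
{
    static void Main()
    {
        Stopwatch timer = Stopwatch.StartNew();
        int frameCount = 100; // hypothetical frame count for illustration

        // ... render frames here ...

        // Tick-based, using the frequency that actually matches ElapsedTicks:
        float fps1 = (float)frameCount /
            ((float)timer.ElapsedTicks / (float)Stopwatch.Frequency);

        // Alternatively, timer.Elapsed.Ticks IS in TimeSpan ticks, so
        // TimeSpan.TicksPerSecond is the correct divisor here:
        float fps2 = (float)frameCount /
            ((float)timer.Elapsed.Ticks / (float)TimeSpan.TicksPerSecond);

        // Both of the above agree; dividing ElapsedTicks by
        // TimeSpan.TicksPerSecond mixes the two units and does not.
        Console.WriteLine(fps1 + " " + fps2);
    }
}
```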