lua-users home
lua-l archive




On 14-Jun-05, at 7:19 PM, Chris Marrin wrote:

I recently gained some experience with QueryPerformanceCounter(). I naively tried to read its value and divide it by the value from QueryPerformanceFrequency(), first converting both to doubles. I found that this conversion reduced the precision to about 100 ms. The reason is that on my shiny new 3 GHz P4 machine, the frequency is 3 GHz! That means the counter increments by 3 billion every second. Since it's a 64-bit counter, it still won't roll over for about 200 years. But if your machine has been running for a month or so, as mine had, the number gets too big to be represented very precisely by a double!

I find this a bit puzzling. A double has 53 bits of precision. A day has 86400 seconds; log2 of 86400*3e9 is 47.88; in other words, a double can accurately represent a count of 3 GHz ticks up to a bit over 34 days. After that, it should lose one bit of precision, which should be barely noticeable. In a year, you would lose fewer than four bits of precision. If you are reduced to a precision of 100 ms, you are losing something like 28 bits. This suggests that somewhere you were doing arithmetic with floats rather than doubles; floats have 24 bits of precision, 29 fewer than doubles, which works out to about the loss you are reporting.