How to Compute Millisecond-Level Time Differences

From: 博客园

FromLink: https://www.cnblogs.com/snser/p/4083812.html


Computing a time difference at millisecond precision is a fairly common requirement...

The project at hand is a Windows one, so the first thing that comes to mind is GetTickCount(). But MSDN says this about it:

[MSDN screenshot; the relevant passage: "The resolution of the GetTickCount function is limited to the resolution of the system timer, which is typically in the range of 10 milliseconds to 16 milliseconds."]

Let's write a program and try it:

#include <stdio.h>
#include <windows.h>

int main(void)
{
	DWORD dwLastTime = GetTickCount();
	for (int i = 0; i != 10; ++i)
	{
		DWORD dwCurrentTime = GetTickCount();
		// DWORD is unsigned long on Windows, so print with %lu
		printf("GetTickCount = %lums TimeDiff = %lums\n", dwCurrentTime, dwCurrentTime - dwLastTime);
		dwLastTime = dwCurrentTime;
		Sleep(500);
	}
	return 0;
}


Running it 10 times shows that each measured interval is typically off by about 1 ms, and in the worst case by 15 ms, matching the actual resolution MSDN describes.



So GetTickCount() is not reliable for computing millisecond-level time differences!

So how do we meet our requirements?

Requirement 1: compute time differences at millisecond precision.

Requirement 2: the return value should ideally be an unsigned long, to stay compatible with existing code.

Solution 1:

clock_t clock(void);

This function returns the number of clock ticks elapsed from program start to the current moment, where CLOCKS_PER_SEC ticks make up one second (with MSVC, CLOCKS_PER_SEC is 1000, so one tick is one millisecond). Wrapping it up is all we need:

#include <ctime>
#include <windows.h> // for ULONG / LONGLONG

ULONG GetTickCountClock()
{
	// clock() ticks -> milliseconds (widen to LONGLONG first to avoid overflow)
	return (ULONG)((LONGLONG)clock() * 1000 / CLOCKS_PER_SEC);
}
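
For illustration, here is a minimal usage sketch of this wrapper (my addition, assuming an MSVC/Windows build): it times a Sleep(100) call, which should come out at roughly 100 ms:

#include <stdio.h>
#include <ctime>
#include <windows.h>

ULONG GetTickCountClock()
{
	return (ULONG)((LONGLONG)clock() * 1000 / CLOCKS_PER_SEC);
}

int main(void)
{
	ULONG ulStart = GetTickCountClock();
	Sleep(100); // stand-in for the work being timed
	printf("elapsed = %lums\n", GetTickCountClock() - ulStart); // roughly 100
	return 0;
}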


Test results: [output screenshot in the original post]



Solution 2:

SYSTEMTIME: http://msdn.microsoft.com/en-us/library/windows/desktop/ms724950(v=vs.85).aspx

FILETIME: http://msdn.microsoft.com/en-us/library/windows/desktop/ms724284(v=vs.85).aspx

Through SYSTEMTIME and FILETIME we can obtain the time elapsed since midnight on January 1, 1601, in units of 100 nanoseconds.

That is certainly precise enough, but the value we get is a LONGLONG. No matter: we can use this time to calibrate the native GetTickCount() while keeping a ULONG return value.

#include <windows.h>

ULONG GetTickCountCalibrate()
{
	static ULONG s_ulFirstCallTick = 0;        // GetTickCount() at first call
	static LONGLONG s_ullFirstCallTickMS = 0;  // wall-clock milliseconds at first call

	// Current wall-clock time as a FILETIME (100ns units since 1601-01-01)
	SYSTEMTIME systemtime;
	FILETIME filetime;
	GetLocalTime(&systemtime);
	SystemTimeToFileTime(&systemtime, &filetime);
	LARGE_INTEGER liCurrentTime;
	liCurrentTime.HighPart = filetime.dwHighDateTime;
	liCurrentTime.LowPart = filetime.dwLowDateTime;
	LONGLONG llCurrentTimeMS = liCurrentTime.QuadPart / 10000; // 100ns -> ms

	// First call: record both clocks as the calibration baseline
	if (s_ulFirstCallTick == 0)
	{
		s_ulFirstCallTick = GetTickCount();
	}
	if (s_ullFirstCallTickMS == 0)
	{
		s_ullFirstCallTickMS = llCurrentTimeMS;
	}

	// Baseline tick count plus the precisely measured elapsed milliseconds
	return s_ulFirstCallTick + (ULONG)(llCurrentTimeMS - s_ullFirstCallTickMS);
}
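
As an aside, a variant worth sketching (my assumption, not the original author's code): GetSystemTimeAsFileTime() returns the FILETIME directly, skipping the SYSTEMTIME round-trip, and because it reports UTC it is not affected by local-time jumps (DST, time-zone changes) that GetLocalTime() can see. The helper name GetCurrentTimeMS below is made up for illustration:

#include <windows.h>

// Variant sketch: current UTC time in milliseconds since 1601-01-01,
// fetched without the SYSTEMTIME round-trip
LONGLONG GetCurrentTimeMS()
{
	FILETIME filetime;
	GetSystemTimeAsFileTime(&filetime); // 100ns units, UTC
	ULARGE_INTEGER uliTime;
	uliTime.HighPart = filetime.dwHighDateTime;
	uliTime.LowPart = filetime.dwLowDateTime;
	return (LONGLONG)(uliTime.QuadPart / 10000); // 100ns -> ms
}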


Test results: [output screenshot in the original post]



Precision comparison

Sample the current time every 50 ms, compare each TimeDiff against 50, and aggregate the deviations over 1000 iterations:

#include <stdio.h>
#include <math.h>
#include <windows.h>

// Uses the GetTickCountCalibrate() defined above; swap in GetTickCount()
// or GetTickCountClock() to benchmark the other timers
int main(void)
{
	int nMaxDeviation = 0;
	int nMinDeviation = 99;
	int nSumDeviation = 0;

	DWORD dwLastTime = GetTickCountCalibrate();
	Sleep(50);

	for (int i = 0; i != 1000; ++i)
	{
		DWORD dwCurrentTime = GetTickCountCalibrate();
		// How far the measured interval strays from the requested 50ms
		int nDeviation = abs((int)(dwCurrentTime - dwLastTime) - 50);
		nMaxDeviation = nDeviation > nMaxDeviation ? nDeviation : nMaxDeviation;
		nMinDeviation = nDeviation < nMinDeviation ? nDeviation : nMinDeviation;
		nSumDeviation += nDeviation;
		dwLastTime = dwCurrentTime;
		Sleep(50);
	}
	printf("nMaxDeviation = %2dms, nMinDeviation = %dms, nSumDeviation = %4dms, AverDeviation = %.3fms\n",
		nMaxDeviation, nMinDeviation, nSumDeviation, nSumDeviation / 1000.0);

	return 0;
}


Comparing the precision of GetTickCount, GetTickCountClock, and GetTickCountCalibrate:

GetTickCount           nMaxDeviation = 13ms, nMinDeviation = 3ms, nSumDeviation = 5079ms, AverDeviation = 5.079ms
GetTickCountClock      nMaxDeviation =  2ms, nMinDeviation = 0ms, nSumDeviation =    4ms, AverDeviation = 0.004ms
GetTickCountCalibrate  nMaxDeviation =  1ms, nMinDeviation = 0ms, nSumDeviation =    3ms, AverDeviation = 0.003ms


As the numbers show, the native GetTickCount has far too much error: a maximum deviation of 13 ms and an average of 5 ms, which clearly cannot satisfy millisecond-level timing.

GetTickCountClock and GetTickCountCalibrate are nearly identical in precision, and both are adequate for millisecond-level timing.

The difference is that GetTickCountClock counts from the start of the current program, while GetTickCountCalibrate counts from system boot.

About overflow

A 4-byte ULONG wraps around every 2^32 = 4,294,967,296 ms, which is about 49.7 days; past that point any of these counters will overflow.
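
One mitigating detail, sketched below under the assumption that the two samples are taken less than 49.7 days apart: because unsigned subtraction is performed modulo 2^32, the difference of two ULONG tick counts remains correct even across a single wraparound:

#include <stdio.h>
#include <windows.h>

int main(void)
{
	ULONG ulBefore = 0xFFFFFFF0UL; // 16ms before the counter wraps
	ULONG ulAfter  = 0x00000010UL; // 16ms after the wrap
	// Unsigned subtraction is modulo 2^32, so the result is the
	// true elapsed time: 0x20 = 32ms
	printf("diff = %lums\n", ulAfter - ulBefore);
	return 0;
}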

[When reposting, please keep the original article URL: http://www.cnblogs.com/goagent/p/4083812.html]

Category: C/C++

Tags: c++, milliseconds, time difference
