I created an application where time is very important. Because of this, every time data is transferred I also set the time on the Windows CE device. I send DateTime.Now.Ticks through a socket and set the time with
[DllImport("coredll.dll")] private extern static uint
SetSystemTime(ref SYSTEMTIME lpSystemTime);
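where SYSTEMTIME is the standard Win32 structure:

[StructLayout(LayoutKind.Sequential)]
public struct SYSTEMTIME
{
    // All fields are 16-bit, per the Win32 SYSTEMTIME layout.
    public ushort wYear;
    public ushort wMonth;
    public ushort wDayOfWeek;
    public ushort wDay;
    public ushort wHour;
    public ushort wMinute;
    public ushort wSecond;
    public ushort wMilliseconds;
}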
The time zone is correctly set to GMT+1 on the PC where the server application is running, and also on the device. HomeDST is 0 on the device.
My problem is that there is always a one-hour difference between the time in the OS and the time in my software. I retrieve the time using
[DllImport("coredll.dll")] private extern static void
GetSystemTime(ref SYSTEMTIME lpSystemTime);
So, for example, in the upper right corner of the device I see 9:12, but in the application 8:12.
Does anybody have an explanation or solution for this? It would be very helpful, because the application is unfortunately already used in a live system and this is causing huge problems...
SetSystemTime takes the time in UTC (see http://msdn.microsoft.com/en-gb/library/windows/desktop/ms724942(v=vs.85).aspx)
So if you're getting the time from your server, which is at +1, you should call .ToUniversalTime() on it before passing it to SetSystemTime().
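A minimal sketch of that fix on the device side, assuming the ticks received over the socket were produced by DateTime.Now on the server (the receivedTicks variable is just illustrative):

// Ticks came from DateTime.Now on the server, i.e. local time in GMT+1.
DateTime serverLocal = new DateTime(receivedTicks);

// ToUniversalTime treats the value as local time and converts it to UTC
// using the device's time zone, which matches the server's (+1).
DateTime utc = serverLocal.ToUniversalTime();

SYSTEMTIME st = new SYSTEMTIME();
st.wYear = (ushort)utc.Year;
st.wMonth = (ushort)utc.Month;
st.wDay = (ushort)utc.Day;
st.wDayOfWeek = (ushort)utc.DayOfWeek;
st.wHour = (ushort)utc.Hour;
st.wMinute = (ushort)utc.Minute;
st.wSecond = (ushort)utc.Second;
st.wMilliseconds = (ushort)utc.Millisecond;

SetSystemTime(ref st);   // SetSystemTime expects UTC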
As a general rule, to make your life simpler, I'd advise keeping all your times as UTC, everywhere, except when displaying in the UI - that's the time to translate them to local time.
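A sketch of that approach, assuming you can change both ends of the socket protocol (variable names are illustrative):

// Server side: always transmit UTC ticks.
long ticksToSend = DateTime.UtcNow.Ticks;

// Device side: the received value is already UTC, so it can be passed
// to SetSystemTime without any conversion, and GetSystemTime returns UTC back.
DateTime utc = new DateTime(receivedTicks);

// Only convert when showing the time to the user.
string display = utc.ToLocalTime().ToString("HH:mm");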
Do you use DST? If you do, try disabling it and see whether the problem persists. I had a problem where the clock could end up several hours wrong after setting the system time with both SetSystemTime() and SetLocalTime().
It turned out the device manufacturer hadn't properly implemented DST in a driver, because the real-time clock could only handle local time. At least that was their explanation. Turning off DST fixed my problem, except that you have to adjust the clock manually twice a year for daylight saving time.
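For reference, the SetLocalTime call mentioned above is also exported by coredll.dll and takes the time in the device's local time zone rather than UTC; this is a sketch of the declaration, not a recommendation for your particular hardware:

// Counterpart of SetSystemTime that interprets the SYSTEMTIME as local time.
[DllImport("coredll.dll")]
private extern static uint SetLocalTime(ref SYSTEMTIME lpSystemTime);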