Clock drift and accuracy

I have an application involving multiple impees spread across the country, all of which must be accurately synchronised. Most of the time they will be asleep, but they wake on a local event and report data back. The data will include a timestamp so that the event can be correlated across multiple sites (i.e. the event may occur at different sites at slightly different times; think of a thunderstorm: someone closer hears it sooner than someone further away).
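For context, the device-side flow would be roughly this (a minimal sketch in Squirrel; the "event" message name and site ID are illustrative placeholders, not real code from the project):

```
// Minimal sketch of the wake-and-report path (device-side Squirrel).
// "event" and "site-42" are illustrative placeholders.
function reportEvent() {
    local stamp = time();  // Unix seconds, as last set by the server
    agent.send("event", { ts = stamp, site = "site-42" });
}
```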

I assume that when the imp powers up, it gets a time update from the server, but is the time ever pushed out after this, or is it just relying on its internal RTC? How much does this drift over, say, a 6-12 month period? If I power up an imp and leave it for 12 months, then check the time, will it still be accurate, or will it have drifted a few seconds? Is there any temperature compensation on the RTC? Does the server send out regular time updates?
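To put rough numbers on "a few seconds": if the RTC runs off an uncompensated 32.768 kHz crystal in the ±20 ppm class (an assumption about the part, not a published imp spec), the worst case works out like this:

```
// Back-of-envelope drift for an assumed ±20 ppm uncompensated crystal.
local ppm = 20.0;
local secondsPerYear = 365 * 86400;           // 31,536,000 s
local drift = ppm * 1.0e-6 * secondsPerYear;  // ~630 s, i.e. ~10.5 minutes/year
server.log(format("worst-case drift: ~%.0f s/year", drift));
```

So over 6-12 months a free-running RTC would be out by minutes, not seconds, unless it is re-disciplined along the way.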

BTW, the current implementation is time-synchronised using a GPS pulse-per-second (PPS) signal, but for units inside buildings, getting a good GPS fix is often a problem. In those cases we use NTP over Ethernet. Replacing the GPS and Ethernet interfaces with an imp could be a neat solution.

The imp’s time is updated every time it connects to the server - not just at boot. The RTC is not super-accurate - it’s not temperature compensated.
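Since the clock is refreshed on every connect, a device that sleeps most of the time can bound its drift just by connecting on a schedule; something along these lines (the period is an arbitrary illustration, not a recommendation):

```
// Sketch: re-discipline the RTC by reconnecting periodically.
const RESYNC_PERIOD = 3600; // seconds; illustrative value only

function resync() {
    if (!server.isconnected()) server.connect(); // the clock is updated on connect
    imp.wakeup(RESYNC_PERIOD, resync);
}
resync();
```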

Note that we only sync to an integer second count, too. It's not NTP or anything fancy.
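Given the integer-second sync, absolute accuracy is around ±1 s at best. If sub-second resolution only matters relative to each site's own clock, one workaround is to latch the Unix second against the millisecond counter at sync time (a sketch; it assumes hardware.millis() stays monotonic for the powered session):

```
// Sketch: fractional timestamps relative to the last sync point.
// Only meaningful within one powered session on one device.
syncSec <- time();             // whole seconds at the sync instant
syncMs  <- hardware.millis();  // ms since boot, latched at the same instant

function fineStamp() {
    local elapsed = hardware.millis() - syncMs;  // ms elapsed since the latch
    return syncSec + elapsed / 1000.0;           // fractional Unix time
}
```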

Interesting. Thanks Hugo.