The Linux kernel always stores and calculates time as the number of
seconds since midnight on the 1st of January 1970 UTC, regardless of
whether your hardware clock is stored as UTC or not. Conversions to
your local time are done at run time. One neat consequence is that
if someone is using your computer from a different timezone, they
can set the TZ environment variable and all dates and times will
appear correct for their timezone.

If the number of seconds since the 1st of January 1970 UTC is
stored as a signed 32-bit integer (as it is on your Linux/Intel
system), your clock will stop working sometime in the year 2038.
Linux has no inherent Y2K problem, but it does have a year 2038
problem. Hopefully we'll all be running Linux on 64-bit systems by
then. 64-bit integers will keep our clocks running quite well until
approximately the year 292 billion.
Remember Y2K, almost 10 years ago? It didn't make too big of a splash because the media hype - at least as I see it - encouraged companies to make sure things just worked. I like to think of that as the "odd man out" syndrome: no company wanted to be cited as the example of the big Y2K failure.
Well, we have another computer time death march coming up in 28 years, in 2038. I really do hope that we're all running 64-bit operating systems by then, but considering big business is still running 1970s mainframes... I really do wonder. So let's start today by encouraging - or, in my case, demanding - 64-bit support in software and operating systems.
A few people have commented to me via Facebook and email.
HP only discontinued its production of 16-bit processors (the HP 2100 series) in 1996 (13 years ago), despite starting manufacture of the series in 1966. Hardware may not be the problem in 2038, but we are already seeing effects of the Y2K38 bug.
Many 32-bit programs calculate time averages using (t1 + t2)/2. It should be quite obvious that this calculation fails once the time values pass 30 bits: the intermediate sum t1 + t2 then exceeds what a signed 32-bit integer can hold, even though the final average would fit.
AOL had server software problems in May 2006 due to this bug: a programmer had used one billion seconds in the future as an arbitrary timeout date.
Also, any software that performs computations 20 years into the future (which is common) may break if not fixed by 2018, which is only nine years from now.
A quote I once read on the subject: "Dates in calendar are closer than they appear."