Series: Y2K and Your Mac
The Mac OS is in good shape, but Y2K problems can lurk in macros, databases, and more
Article 1 of 1 in series
by Geoff Duncan
This is a bit embarrassing, but I've saved nearly every TidBITS-related email message I've received since joining the TidBITS staff in late 1994. Sure, I delete unsubscribe requests, vacation notices, junk mail, and the like, but I've kept almost everything else, particularly messages from readers and internal email amongst the staff.
According to that email archive, I've been avoiding writing about the year 2000 and the Macintosh since we first talked about such an article in February of 1995. Why? In part, I don't find Y2K issues - known variously as "the Year 2000 Problem" or "the Millennium Bug" - particularly interesting. Although their ramifications are wide-ranging, Y2K issues are straightforward as computing problems go, and Macintosh hardware and system software have never had trouble dealing with the year 2000. Writing about Y2K and the Macintosh seemed about as relevant as writing about the dangers of highway driving and cars. The topic might be pertinent to many TidBITS readers, but it's not why people read TidBITS.
Things have changed since 1995. Y2K topics have moved from a fringe technology issue to a mainstream cultural thread covered continuously by newspapers, television programs, and Web sites. Opinions and analyses diverge widely. Some experts predict doom and global chaos, and some people are literally heading for the hills. Other experts claim Y2K issues will be minor or nearly non-existent (especially in the United States), and some people think the entire Y2K brouhaha is a conspiracy to sucker users and companies into paying for expensive upgrades and consulting. Further, a great deal of Y2K discussion emphasizes that no one really knows how profound - or how trivial - the problems may be. Mass media messages about Y2K issues are decidedly mixed, creating a sense of trepidation among many people that seems to be increasing as the end of the century draws closer.
Apple hasn't ignored society's growing millennial anxiety. In fact, Apple has been trumpeting the Macintosh's "Y2K compliance" with irreverent quotes, Web sites, and even a television commercial broadcast during the 1999 Super Bowl.
Although Apple's smugness may not be endearing, for the most part it's justified. The Macintosh truly has been ready for the end of the century since it first rolled off production lines in 1984, something mouse-thumping Macintosh advocates espouse as an indication of the Mac's superiority. However, the integrity of the Macintosh's hardware and software design doesn't necessarily mean Macintosh users can blindly assume their computers will be unaffected by Y2K issues.
Defining Y2K -- Fundamentally, Y2K problems concern a system's inability to process century information in dates correctly. This definition is different from the widely held belief that Y2K problems involve a computer interpreting a two-digit year as if it were in the 1900s - how a system handles the omission of century information is a subset of the larger issue. Although opinions vary, in my mind a program is "Y2K compliant" so long as it correctly handles dates with century information. In other words, if I enter "01-Jan-00" into an application and it interprets the year as 1900, I might be unhappy or seriously inconvenienced, but the program isn't necessarily wrong: a two-digit year like "00" can just as easily be interpreted as any year divisible by 100, including 1200, 1600, or 2300. I wouldn't consider this behavior a "Y2K problem" unless the program rejected or otherwise misinterpreted "01-Jan-2000." The former case stems from a conflict between the program's assumptions and my expectations, while the latter stems from a genuine problem with the program's treatment of dates.
Humans often interpret century information by context. If you have an airline ticket dated 05-Apr-99, common sense tells you the ticket doesn't refer to 1899, since the Wright brothers didn't make their famous flight at Kitty Hawk until 1903. The context isn't as clear if you have a train ticket with the same date, although, if nothing else, changes in pricing, typographic style, and ticket materials would probably clue you in.
Computers don't pick up on contextual clues: they simply do whatever programmers tell them to do. In many cases, programmers effectively tell computers "all dates are in the 20th century," or "if you see a date without century information, always assume it's in the 20th century" - which is a problem if the program doesn't store any century information. The implications are widespread - some systems may crash or do the wrong thing based on unanticipated results from date-based math, some may refuse to start up, some may corrupt data, and others may assess a century's worth of interest penalties. Further, since microcontrollers using date information are present in everything from mainframes to coffee makers, determining what systems have century-related date problems (and what the impact of those problems might be) is an enormously complicated task.
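The hard-coded century assumption described above can be sketched in a few lines. This is an illustrative example, not code from any real system; the function names are invented:

```python
# A classic Y2K bug: the programmer hard-codes the century when
# expanding a stored two-digit year, and date math built on that
# assumption goes haywire at the rollover.

def expand_year_naive(two_digit_year: int) -> int:
    # Assumes every date is in the 20th century.
    return 1900 + two_digit_year

def elapsed_years(start_yy: int, end_yy: int) -> int:
    # Subtraction across the century boundary produces nonsense.
    return expand_year_naive(end_yy) - expand_year_naive(start_yy)

print(expand_year_naive(99))    # 1999 -- fine
print(expand_year_naive(0))     # 1900 -- an account opened in 2000
                                # is suddenly a century old
print(elapsed_years(99, 0))     # -99 "elapsed" years
```

The bug isn't in the arithmetic, which is perfectly correct; it's in the assumption that the century can be omitted because it never changes.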
Why did programmers make these seemingly brain-dead errors? In some cases, they weren't errors. Sometimes programmers omitted redundant century information to save memory and storage space: after all, in 1970 a megabyte of memory could cost more than $3 million and may have been larger than a breadbox. In other cases, programmers had little thought for the future because it was inconceivable to them that their software would be in use fifteen, twenty, or thirty years in the future. And sometimes programmers, being human, simply screwed up.
Y2K & Your Mac -- Macintosh hardware and system software from Apple is Y2K compliant - there's no fundamental "Y2K time bomb" ticking away inside your Macintosh. You can check out Apple's Y2K readiness disclosure, as well as a list of products Apple has tested for Y2K problems.
Although it's slightly obscured by a self-satisfied attitude, Apple's statement basically says that Macs won't have problems changing over to the year 2000, but that they don't make any promises regarding third-party products, including macros and custom programming. Obviously, Apple can't guarantee other companies' products, but so long as those products use the date routines built into the Macintosh system software - and the vast majority of Mac programs do - they'll be fine. Software on the original Macintoshes can handle dates from January 1, 1904 to February 6, 2040; most Macintosh software released in the last decade uses a more expansive date system that can handle dates from about 30,081 B.C.E. (Before Common Era) to 29,940 C.E. (Common Era), along with non-Gregorian calendar systems.
The two most common cases where a Macintosh application would not use the date routines provided by the Mac OS are when it needs to use dates in a wider range, or when it needs to use date data or procedures originally developed for another operating system. Examples could include programs that model processes that take place over very long periods of time (like geology or stellar evolution), or Macintosh ports of programs for genealogy, statistics, or specialized vertical markets that must read and write date information used by other platforms - these programs may inherit Y2K issues that don't originate on the Macintosh.
Your expectation of Y2K compliance might be another matter. Once the calendar ticks over to the year 2000, you may find some Macintosh programs interpret two-digit years as if they were in the 1900s. Again, unless the program rejects or misinterprets a four-digit year, I wouldn't consider the program broken, although the behavior may be annoying - like an unwanted toolbar or a frequently used command without a keystroke equivalent. Some programs have "date windows" which define how they interpret two-digit years. For instance, in order to be compatible with System 6, Apple's Date & Time control panel still limits user input to years between 1920 and 2019. Current versions of Microsoft Excel handle four-digit years but assume any two-digit year less than or equal to 29 is in the 2000s, while two-digit years 30 and over are in the 1900s. Similarly, current versions of FileMaker Pro handle four-digit years, but use a convoluted window for two-digit years, revolving around the first and last decades of the current and preceding century. (Although there are still cases where FileMaker interprets two-digit years provided by formulas or scripting as being the 1900s.)
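A date window like the ones just described boils down to a pivot value. The sketch below uses 29 as the pivot, matching the Excel behavior mentioned above; the function itself is illustrative, not any vendor's actual code:

```python
# A "date window": two-digit years are expanded around a pivot,
# while years with explicit century information pass through
# untouched -- the compliance test this article suggests.

def interpret_year(year: int, pivot: int = 29) -> int:
    if year >= 100:       # four-digit years are unambiguous
        return year
    if year <= pivot:     # 00-29 -> 2000-2029
        return 2000 + year
    return 1900 + year    # 30-99 -> 1930-1999

print(interpret_year(5))     # 2005
print(interpret_year(65))    # 1965
print(interpret_year(1900))  # 1900 -- explicit centuries preserved
```

Note that a window only relocates the ambiguity: with a pivot of 29, a record legitimately dated 1925 comes out as 2025.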
I haven't been able to find a comprehensive clearinghouse for Macintosh Y2K issues, but Rich Barron is maintaining a list at his Macnologist site; it's a little apocryphal in places, but serves as a reasonable starting point. The best place to look for information about a specific application is with the program's developer (assuming they're still in business).
But Macs Are Immune! -- The greatest potential for Y2K issues on the Macintosh stems from custom utilities and applications, rather than from the Mac OS or major commercial products. Developers usually know about the Mac OS's internal date capabilities; however, consultants, hobbyists, interns, and everyday Macintosh users may not know about them, or have the tools to access them reliably. Further, because these people aren't necessarily experienced developers, they're more likely to make math errors or incorrect assumptions about dates. Even if the Mac OS and the tools used are Y2K compliant, it's entirely possible to create macros and custom solutions that exhibit classic Y2K problems.
For example, a few months ago a local non-profit organization asked me to identify and fix a "printing problem" with their donations system developed by a former volunteer a few years before in FileMaker Pro. The system is designed to project revenue forward into the next year based on pledges from their supporters, many of whom commit to regular, periodic contributions. The system wasn't printing projected donations beyond 1999. "This isn't a millennium bug, is it?" they asked. "It's only January 1999! Aren't Macs supposed to be immune?"
A few minutes in their databases revealed a typical Y2K problem. The system created donation numbers based on a donor's identification number and level of support, prefixed with (you guessed it) the month and year of the anticipated donation. A typical donation ID might be 9904-4-1234, where "9904" indicated the year and month of the expected donation. These prefixes were used for sorting - it turned out the system was creating the appropriate projections, but they were sorting incorrectly and the database operator didn't know how to find them. Further questioning revealed the number format they'd chosen was deliberate: it was designed to be easy to read over the telephone and to match donation numbers used in a paper-based accounting system dating back to the 1950s. Fixing the problem was simple, but the organization took weeks to decide on the changes that would work best for them, since the numbers are used widely throughout their operations.
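The sorting failure is easy to reproduce outside FileMaker. The donation IDs below follow the article's "9904-4-1234" format but are otherwise invented, as is the windowing fix at the end:

```python
# YYMM-prefixed IDs sort as text, so year 2000 records land
# *before* 1999 ones and appear to vanish from projections.

ids = ["9904-4-1234", "9912-4-1234", "0001-4-1234", "0004-4-1234"]

print(sorted(ids))
# ['0001-4-1234', '0004-4-1234', '9904-4-1234', '9912-4-1234']
# -- the January and April 2000 donations sort first, so anyone
# scanning past December 1999 never sees them.

# One possible fix: sort on a key that expands the two-digit year.
def chronological_key(donation_id: str) -> tuple:
    yy, mm = int(donation_id[:2]), int(donation_id[2:4])
    year = 2000 + yy if yy < 50 else 1900 + yy  # pivot is an assumption
    return (year, mm)

print(sorted(ids, key=chronological_key))
# ['9904-4-1234', '9912-4-1234', '0001-4-1234', '0004-4-1234']
```

Note that nothing here is broken in the Mac OS or in FileMaker; the Y2K problem lives entirely in the home-grown numbering scheme.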
Leap of Faith -- For folks who want to look beyond January 1, 2000, the year 2000 is a leap year, and therefore date-dependent systems need to account for February 29, 2000. Again, the Macintosh handles this date correctly, but a few computers stumble over it - in fact, Connectix had to update Virtual PC to 2.1.1 (now at 2.1.2) because its emulated clock chip failed to recognize this leap day.
The Gregorian calendar calls for a leap year whenever a year is divisible by 4, but not in years divisible by 100 unless they in turn are divisible by 400. Ironically, the year 2000 being a leap year sometimes isn't a problem for home-grown utilities, which (if they account for leap years at all) usually assume any year evenly divisible by four is a leap year. Thus, they would incorrectly consider 1900 and 2100 as leap years, but would behave correctly with the year 2000.
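The full Gregorian rule and the common shortcut can be set side by side; as noted above, the shortcut happens to get 2000 right but fails on 1900 and 2100:

```python
# The complete Gregorian leap-year rule versus the naive
# "divisible by four" shortcut found in many home-grown utilities.

def is_leap_gregorian(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_naive(year: int) -> bool:
    return year % 4 == 0

for year in (1900, 2000, 2100):
    print(year, is_leap_gregorian(year), is_leap_naive(year))
# 1900 False True
# 2000 True True   <- the two rules agree, by luck
# 2100 False True
```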
If you're curious, the Mac OS does not account for leap seconds (nor do other mainstream devices or operating systems). Leap seconds are periodic adjustments made to atomic time scales to keep them in sync with the rotation of the Earth; the mean solar day is currently about two milliseconds longer than the standard 86,400 seconds, and that difference accumulates.
Best Advice -- The Macintosh is remarkably well prepared for the year 2000. For the most part, normal Macintosh users don't have a thing to worry about.
If you use specialized commercial software - particularly if it's ported from another platform - you should contact the program's vendor to see if they're aware of any Y2K issues. If you rely on home-grown macros or custom software, you should check to see if it's ready for the year 2000 or test it yourself, even if it's developed using tools that are Y2K compliant. A basic three-in-one test for Y2K problems would be:
1. Make a complete backup of your Macintosh. Consider disabling any automatic backup or scheduling utilities so they aren't confused by the following steps.

2. Set your Macintosh's clock to 11:59 PM on February 28th, 2000.

3. Wait a minute, then check your computer's clock to verify it handled the leap year correctly.

4. Use your custom tools as you would normally, taking care to exercise each feature in some depth. It's hard to offer specific advice, but you may need to create and delete records, input new data, sort, or perform comparisons. Obviously, focus on functions that are in some way date-dependent.
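If your home-grown date logic can be called outside its application - from a script, say - the same boundary checks can be automated. A minimal sketch, where `days_until` is a hypothetical stand-in for whatever date-dependent routine your own macro or solution exposes:

```python
import datetime

# Check a few boundary dates: the century rollover, the year 2000
# leap day (which exists), and February 29, 1900 (which doesn't).

def days_until(start: datetime.date, end: datetime.date) -> int:
    # Stand-in for your own date arithmetic.
    return (end - start).days

checks = [
    (datetime.date(1999, 12, 31), datetime.date(2000, 1, 1), 1),
    (datetime.date(2000, 2, 28), datetime.date(2000, 3, 1), 2),
    (datetime.date(1900, 2, 28), datetime.date(1900, 3, 1), 1),
]

for start, end, expected in checks:
    result = days_until(start, end)
    status = "ok" if result == expected else "FAILED"
    print(f"{start} -> {end}: {result} day(s) [{status}]")
```

The point isn't the harness itself but the dates it exercises; whatever form your own tools take, those are the boundaries worth probing.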
Y10K -- Before anyone asks, yes, many Macintosh programs need to be revised to accommodate five-digit years, although the Mac OS can handle them just fine. We promise that TidBITS-384730 will cover the topic in detail.