Editorial missed the mark on Year 2000 challenges

Programmer points out some real challenges with new millennium

Re: End of days put on hold (Our View, Dec. 21)

As one of the many programmers who worked to avert the Y2K problem, I am disappointed to read yet another inaccurate description of the problem and skepticism that it even existed.

It was not, as stated, because of “computers’ supposed inability to read a year with two zeros at the end,” but because in the year 2000 the two-digit year format would produce invalid dates or cause date calculations to fail.

And because “nary a blip was seen on the landscape,” the writer implies that the whole problem was a myth, comparable to a 5,100-year-old prediction based on Mayan mythology.

Yes, some people overreacted 12 years ago, but the potential for information anomalies and system outages was real and, had problems not been addressed, the impact would have been considerable.

Early computer programs used two digits instead of four to represent the year because of space constraints: punched cards used for input were limited to 80 characters, computer memory was scarce and storage was expensive. And because programs were only expected to be in use for a few years, they did not take into account the change of millennia.

Obviously, programs would fail or produce inaccurate results when the year changed from (19)99 to (20)00 if they had been coded to prefix the two-digit year with 19, to expect it to be part of an ascending sequence, or to divide it by four to test for a leap year.
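The first two failure modes can be sketched in a few lines. This is a hypothetical illustration only; the affected systems of the era were typically written in COBOL or assembler, and the function names here are invented for the example:

```python
def full_year(yy):
    # Bug: hard-codes the "19" century prefix,
    # so the two-digit year 00 becomes 1900, not 2000.
    return 1900 + yy

def years_between(start_yy, end_yy):
    # Bug: assumes two-digit years always ascend,
    # so the span from (19)99 to (20)00 comes out negative.
    return end_yy - start_yy

print(full_year(0))          # 1900, not 2000
print(years_between(99, 0))  # -99, not 1
```

A negative elapsed-time result like the second one is exactly the kind of “mathematical calculation” failure that could ripple into interest computations, expiry checks, and sorted reports.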

These are just a few simple examples of the kinds of problems that existed. A major challenge was to identify which programs contained any kind of problematic coding.

The article was concerned that beliefs should not be confused with reality. We also need to make sure that facts and events are not distorted by inaccurate statements and erroneous implications.

Andrea Gagnon