A decade after the non-event that was the Millennium Bug, we look back at whether the problem was blown out of proportion, or whether it failed to materialise only because of the millions spent ironing it out in advance.
As awareness grew of the Millennium Bug, it was unclear how pervasive it was.
And that meant programmers had to review vast amounts of code to find where the problem might lurk.
The effort to prevent a Y2K disaster began in earnest about two years before December 31, 1999, Aaron notes.
Most of the time, Aaron says, the software needed no changes to accommodate Y2K: systems either were already set for the 2000 switch or needed only a simple work-around.
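One common "simple work-around" of the era was date windowing: rather than widening every two-digit year field to four digits, programs interpreted two-digit years relative to a pivot. A minimal sketch of the idea (the function name and pivot value here are illustrative, not from any specific Y2K remediation):

```python
def expand_two_digit_year(yy, pivot=70):
    """Date windowing: interpret a two-digit year relative to a pivot.

    Years at or above the pivot map to 19xx; years below it map to 20xx.
    This was a cheaper Y2K fix than widening stored date fields,
    at the cost of only deferring the ambiguity (here, until 2069).
    """
    return 1900 + yy if yy >= pivot else 2000 + yy

# A naive program that simply prefixed "19" would turn year 00 into 1900;
# windowing keeps dates near the rollover sensible.
assert expand_two_digit_year(99) == 1999
assert expand_two_digit_year(0) == 2000
assert expand_two_digit_year(69) == 2069
```

The trade-off is visible in the last assertion: windowed dates work only within a 100-year span, which is one reason two-digit date handling still surfaces as a maintenance issue.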
Many systems, such as embedded systems and chips, did not fail from Y2K because they did not track calendar dates at all, says J. Greg Hanson, executive vice president at technology services company Criterion Systems.
"The clock on the computer chip is not based on calendar time." Y2K was mostly a problem associated with business software, says Hanson, who at the time was chief software engineer for the US Air Force and led its $345 million Y2K program.
Y2K's legacy: Better disaster planning and documentation
Despite the debate over how serious the Y2K problem would have been had companies not invested so much time examining code for the Millennium Bug, it's clear that the IT industry did learn some lasting lessons, Aaron says.
These include doing continuity and disaster planning and documenting systems, he says.
Another lesson, says LogLogic's Roth, was that systems last longer than we think and need to be ready for the future.
"You always have to be future-proofing the software you write, the hardware you build," he says.
But IT organisations still have not learned as much from Y2K as they should have, says Gartner analyst Dale Vecchio, who covered Y2K at the time of the switch.
"I'd like to tell you that there were a lot of lessons learned, but I'm not sure that I've seen a lot," he says.
For example, the Y2K experience might have caused organisations to keep current with knowledge of their IT portfolio, he says.
Instead, "once they passed the risk of Y2K, they went back to the same lack of knowledge and now, when faced with an aging portfolio and aging workforce, they don't know any more now than they knew then", Vecchio says.
Many older applications are still running, but the people with the skills necessary to run them are inching toward retirement age, says Vecchio.
"Baby Boomer retirements [are] bringing back a recognition that [companies] don't understand these application portfolios to move forward," he says.
One reason IT doesn't understand its application portfolio stems from a common response to the Y2K issue: at the time, many businesses replaced homegrown software with vendors' Y2K-certified packaged software.
"Y2K drove a lot of packaged software sales," Vecchio says. But much of that software is a black box for IT and is harder to maintain knowledge of.
See also: Millennium bug lives on