There was no way to know if it was being overhyped until after it happened, and, for the reasons Dominus and many others have laid out, nobody wanted to take that chance. And, in the media, I do think there was an element of pushing it on purpose in order to combat most people's innate inertia and motivate them, and their companies, to upgrade or at least check.
And it really was silly, in 1998 and '99, for places to have computer software unable to distinguish between 1908 and 2008.
You demonstrated the problem and didn't even realize it, I bet. Was it silly in 1998 and 1899? 1699? 2099? You know that you meant 1999, because you are a human and didn't even have to think about it. The computer doesn't know that. The computer only knows that 99 is less than 1998.
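In code terms, the ambiguity looks something like this (a toy Python sketch, purely illustrative, nothing like the actual old systems; the variable names are made up):

```python
# Toy illustration: a two-digit year carries no century, so "99" is
# ambiguous on its own. The names here are invented for the example.
two_digit_year = "99"

# A human reading "in 1998 and '99" silently fills in the century;
# the program has nothing to fill it in with.
possible_years = [century + int(two_digit_year) for century in (1600, 1800, 1900, 2000)]
print(possible_years)              # [1699, 1899, 1999, 2099]

# And compared against a real 4-digit year, the raw stored value gives
# nonsense: 99 "comes before" 1998, even though '99 meant 1999.
print(int(two_digit_year) < 1998)  # True
```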
Y2K was 24 and a half years ago. How much memory and disk were in your Windows '95 computer back then? In the '70s, '80s, and '90s, storage and memory were at a premium, and some of the software in big businesses was already 20 years old. Data everywhere in business and even home systems held, in total, zillions of dates for transactions and records. Due to physical system constraints, saving storage and memory was critical. In 1985, the date for today, Aug 4, would have been stored as '850804' instead of '19850804'. Not storing the '19', which was considered redundant, saved a lot of much-needed space when added up across every record. Every byte counted. If they printed something, '19' was prepended at print time so the year read 1985. That's basically why it happened.
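To make that concrete, here's a rough modern sketch of the convention (Python, purely illustrative; the function names are mine, not from any real system):

```python
from datetime import date

# Toy sketch of the old convention: store only 6 characters (YYMMDD)
# to save two bytes per date, and glue the "19" back on when printing.

def store_date(d: date) -> str:
    """Pack a date into the 6-byte YYMMDD form used in the record."""
    return d.strftime("%y%m%d")            # 1985-08-04 -> "850804"

def print_date(stored: str) -> str:
    """Re-attach the century at output time; '19' was simply assumed."""
    return "19" + stored                   # "850804" -> "19850804"

today_1985 = date(1985, 8, 4)
packed = store_date(today_1985)
print(packed)              # 850804  -- two bytes cheaper than 19850804
print(print_date(packed))  # 19850804 -- fine, as long as it's really a 19xx year
```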
It all worked fine until the date was '991231' and suddenly tomorrow is '000101'... oops. Computers look at that value numerically, and numerically 101 < 991231. Big problem! All the date calculations stop working accurately. Today was before yesterday. Critical systems like air traffic, communications, energy, banking, transportation, and medical records suddenly do not have the right date. Every freaking new timestamp in every freaking database and data file is f'ed up. It was a humongous problem and risk.
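Here's what that breakage looks like if you spell it out (again, just a toy Python sketch):

```python
# Toy sketch of why the rollover breaks comparisons: as YYMMDD values,
# Jan 1, 2000 sorts *before* Dec 31, 1999.
yesterday = int("991231")   # Dec 31, 1999 stored as YYMMDD
tomorrow  = int("000101")   # Jan  1, 2000 stored as YYMMDD -> just 101

print(tomorrow < yesterday)   # True -- "today was before yesterday"
print(tomorrow - yesterday)   # -991130 -- date arithmetic is garbage now

# Anything that sorts records, ages accounts, or computes intervals
# from these values quietly gets the wrong answer after the flip.
```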
And you can't just fix the programs to use 4-digit instead of 2-digit years and move on. You have to fix years and years of data files as well. You have to convert all that stuff so that it is all compatible, positioned and aligned correctly, and in the right places. Adding 2 digits to the year shifts the position of all the data after it, and the programs and data files must account for that. Because, depending on the application, you are also reading last month's, last year's, or 10 years' worth of data while adding the current data.
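A crude illustration of the position problem (Python again; the record layout and field names are invented for the example):

```python
# Toy sketch of why you can't just fix the programs: data files were
# fixed-width records, and every field was found by byte position.

old_record = "850804SMITH     0001250"    # YYMMDD + name(10) + amount(7)
new_record = "19850804SMITH     0001250"  # YYYYMMDD + name(10) + amount(7)

# Old program: name starts at offset 6, amount at offset 16.
print(old_record[6:16], old_record[16:23])   # "SMITH     " "0001250"

# Feed it the widened record without updating the offsets and every
# field after the date is off by two characters:
print(new_record[6:16], new_record[16:23])   # "04SMITH   " "  00012"

# So the programs, the file layouts, AND all the historical data have
# to be converted together, or nothing lines up.
```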
Then you have to test multiple iterations of all that stuff so that you know, going forward, you have not dropped something. If you test right, you also run an entire cycle that simulates the flip from 1999 to 2000. Then at some point, you have to cut over all of your new data files, databases, and programs (hopefully well before New Year's Eve). You have to pull out all the old data and programs, and put in the new ones. In a big corporate setting, that can involve thousands of data files and programs. Miss one, and you're screwed. It was a massive undertaking.
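The kind of sanity check you run against the converted data looks roughly like this (toy Python sketch, purely illustrative):

```python
from datetime import date

# Toy sketch of a pre-cutover test: take the converted 8-digit dates
# and make sure ordering and arithmetic still behave across the
# 1999 -> 2000 boundary.

def parse_yyyymmdd(s: str) -> date:
    return date(int(s[0:4]), int(s[4:6]), int(s[6:8]))

before_flip = parse_yyyymmdd("19991231")
after_flip  = parse_yyyymmdd("20000101")

assert after_flip > before_flip              # tomorrow is after today again
assert (after_flip - before_flip).days == 1  # and exactly one day later
print("simulated century rollover: OK")
```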
No one, NO ONE, who was a programmer took it for granted. We all knew it was doom if this date problem was not fixed. It was not a matter of just taking a chance. We KNEW. I don't think Y2K was over-hyped so much as it was exploited. Some took advantage of a very serious situation, and the media did a poor job of accurately explaining it. That's unfortunate. I don't think anyone, except for those of us who were down in the trenches fixing this thing, really understands the magnitude and scope of the problem, or the potential disaster that could have happened if it wasn't fixed. The fact that Y2K was fixed, with minimal impact, is an incredible success story.
Okay... I could go on, but this is already too long. I hope you all understand.