I’ve been going through my old Computer Language magazines, and man, have things changed since 1984. It’s a fascinating trip through the early days of personal computing.

I was reading an article about using dates in an application and was struck by the prescience of one bit.

The author writes:

While it is not required by the [ANSI] standard, you should use the full year and not the year-in-century. There is no longer a great need to save computer storage space by cutting off characters, and the year 2000 is not that far away. If you use the year-in-century, you set in motion systems that will collapse in the future. Imagine the fun of having a program that computes interest based on negative time intervals.

Exactly and indeed!

And then Y2K happened. Reading that paragraph in 2021 really cracked me up.

Some of us, even back then, were thinking a few years ahead. I started programming back in the 1970s, and it never would have occurred to me to use two-digit year values.

For one thing, it’s not uncommon to sort dates in string form (filenames, for instance). If one uses the form YYYY-MM-DD, names sort naturally by year, month, and day. If one uses the two-digit short form, names from 99 sort after names from 00 through 21, which is usually not what one wants.
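A minimal sketch of the difference, using made-up log filenames and a plain lexicographic sort (the kind a file manager or ls would give you):

```python
# Hypothetical log filenames using four-digit and two-digit years.
iso_names = ["2021-03-15.log", "1999-12-31.log", "2000-01-01.log"]
short_names = ["21-03-15.log", "99-12-31.log", "00-01-01.log"]

# Plain string sort, no date parsing involved.
print(sorted(iso_names))
# ['1999-12-31.log', '2000-01-01.log', '2021-03-15.log']  -- chronological

print(sorted(short_names))
# ['00-01-01.log', '21-03-15.log', '99-12-31.log']  -- 1999 lands last
```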

Worse are formats such as MM-DD-YY(YY) or DD-MM-YY(YY), due both to their ambiguity and to their poor sorting characteristics. (Let alone actually using month names!)
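Both problems in one quick sketch, again with hypothetical dates and filenames:

```python
# Ambiguity: is this January 2nd or February 1st of 2003?
ambiguous = "01-02-03"

# Poor sorting: MM-DD-YYYY groups by month first, scrambling the years.
mdy_names = ["12-31-1999.log", "01-01-2000.log", "03-15-2021.log"]
print(sorted(mdy_names))
# ['01-01-2000.log', '03-15-2021.log', '12-31-1999.log']  -- 1999 sorts last
```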

The lesson is very simple: Always be as general and forward-thinking as possible.

Ø