What follows is the text of a piece I wrote back in May of 1990 in a Star Trek USENET group. There is some minor editing for clarity. Given that 64 bits seems common now, the projections may need some adjustment!

Those of you [in this USENET group] who don’t speak computer may not find this interesting, but the first season repeat of [Star Trek: The Next Generation episode] 11001001 made me start thinking about what kind of bits and bytes they might have in the 24th century. What follows is just one person’s imaginings……

In 11001001, Commander Data refers to bit-groups of 8 and 16, which is common today. These groups are known as a byte (8 bits) and a word (16 bits). However, you should know that the terms and numbers are arbitrary. There is no reason for 8 or 16 — not really (it makes hex a nice convention, but…).

Today, we typically send textual information in bytes. All your modems are using 8-bit “chunks” of data (bytes) to send your messages back and forth. For this reason, one might suppose bytes will be with us as long as we use the English language.

WRONG, bit-breath!

Even today [i.e. 1990], a move is afoot to change to 16-bit chunks (words) for exchanging information. I’m certain that by the 24th century this will not only have occurred, but likely have gone beyond 16.

An 8-bit package allows for 256 different values. In the current system [i.e. ASCII], this allows for all 26 letters of the alphabet, plus the capitals (that makes 52), plus the digits (62), plus a bunch of symbols like @#$@%$! This seems more than enough for us, even if we throw in, say, 128 graphics symbols like IBM PCs do.

A 16-bit chunk allows 65,536 different values — hard to imagine why one would need that for just text… or is it? What about Japanese, or Arabic? It is the desire to include these languages in the “International Character Set” that makes us want to move to more than bytes. This will probably occur in our lifetimes. [Ha!]
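
[A quick Python sketch, written today rather than in 1990, of the arithmetic behind those two sizes:]

    # Distinct values representable in 8 and 16 bits
    for bits in (8, 16):
        print(f"{bits} bits -> {2**bits:,} values")   # 256 and 65,536

    # The ASCII-era tally from the paragraph above
    letters = 26 + 26   # lowercase plus capitals
    digits = 10
    print(letters + digits)   # 62, leaving the remaining slots for symbols (and control codes)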

Move ahead a few centuries, add Klingon and Romulan and Orion and dozens of other languages, and a big bit size seems a must. Our current hardware is structured around 4/8/16/24/32 bit sizes, but that’s not to say that this will be true in the 24th century.

[Unicode and current graphics images work best with 32-bit groups, so here in 2015, “32 is the new 8” (the sane minimum size), and 64 is all but standard.]

Perhaps 8 bits (with 256) and 16 bits (with 65,536) seem a little strange. There’s no real reason to count in eights. How about a bit size of 10 (like fingers!), which gives us 1024 — 1 K to computer folk. Double that to 20 bits, and you have a nice even million (well, not really, but darn close — what computer folk call 1 Meg).

My theory is that by TNG time, the smallest size will be a 10-bit package, which I propose to call a chunc (“chunk”). A chunc can count to 1024 (1 K). The real working size will be a 20-bit package called a dchunc (“dee-chunk”). A dchunc can count to 1 Meg.

Medium-range computers could double that size for a “quad-chunk” with 40 bits, called a qchunc (“que-chunk”). A qchunc can count to 1 Tera (roughly a trillion, or 1,000 Gigs).

Remember, you heard it here first!!
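
[Here is a quick Python sketch, written today, checking the arithmetic behind those invented sizes; the chunc/dchunc/qchunc names are just the 1990 proposal, nothing real:]

    # Distinct values for each proposed "chunc" size
    sizes = {"chunc": 10, "dchunc": 20, "qchunc": 40}
    for name, bits in sizes.items():
        print(f"{name}: {bits} bits -> {2**bits:,} values")
    # chunc:  10 bits -> 1,024 values                  (1 K)
    # dchunc: 20 bits -> 1,048,576 values              (1 Meg)
    # qchunc: 40 bits -> 1,099,511,627,776 values      (about 1 Tera)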


Well,… it was first back in 1990… It now seems naive. It’s looking like 64 bits might stay around for a while and may become the new minimum. That data width can count up to roughly 18 quintillion (enough to address 18 Exabytes), which is rather a lot!

AI might need address spaces in the Petabyte range, and 64 bits would support that okay.
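
A rough sanity check, as another small Python sketch (my own arithmetic, using the binary Petabyte of 2^50 bytes):

    # How far does a 64-bit address reach, measured in Petabytes?
    addresses = 2**64        # 18,446,744,073,709,551,616 distinct values
    petabyte = 2**50         # bytes in one binary Petabyte
    print(f"{addresses:,} addresses")
    print(f"= {addresses // petabyte:,} Petabytes of byte-addressable space")
    # 16,384 Petabytes, i.e. roughly 18 Exabytes in decimal terms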