I remember it as clearly as yesterday. There it was in my Weekly Reader: “By the year 2000, the United States and the rest of the world will be using the metric system.”
When you’re in the third grade, you typically don’t have a subscription to the Wall Street Journal, so the Weekly Reader was pretty much my go-to publication for all things current, and for whatever was coming decades down the road.
And believe me, in 1970, the year 2000 sounded like something out of the Twilight Zone. The movie Planet of the Apes had just come out a couple of years before I read this revelation about how we’d be getting rid of the inch, foot, and yard.
So, discovering that the worst thing coming was that we’d have to relearn how to measure things was preferable to coming back from space and finding out that apes had taken over the Earth.
But the year 2000 is now more than 20 years in our rearview mirror, and we still aren’t using the metric system.
We don’t even seem to be close to switching.
So, what was the deal, anyway? Why do some countries use the metric system and other countries do not?
I suspect that the biggest part of it is resistance to change and the cost. If there’s one thing people do not like doing, it’s changing. And people also don’t like spending their money.
When I went to work in the radio business in the 1970s, the music was played from vinyl. We actually played records on the radio. Whether it was a 33 1/3 or a 45, we had turntables in the control room, and that’s how songs were heard on the air.
There was a true art to being able to “cue up a record.” This was a learned skill required of all DJs: you would put the needle on the beginning of the record and turn the platter by hand until you heard the first note of the song. Then, based on how worn out the record player was, you’d back the song up just enough so that when you hit the start button, coming out of your introduction, another song, or a commercial, the beginning of the song hit perfectly, with no dead air in between.
But there were DJs who didn’t like it when we went from vinyl to tape cartridges, then to CDs, and eventually computers. Most dealt with the first two changes, but it was the computer that put a lot of jocks out of the radio business.
They just refused to learn how to use them. Their argument was that the old ways worked just fine, so why were we messing with things?
I personally worked with two guys who, rather than learn how to play everything off of a computer, refused, and lost their jobs. One wound up driving a school bus, and the other worked in the rental car industry.
Both are admirable ways to make a living, but my point is that maybe somewhere along the way, those folks who proudly proclaimed in the Weekly Reader that the United States would toss the inch, foot, and yard, and move to the millimeter, centimeter, and meter, discovered that they might like their idea, but the rest of the country didn’t.
I remember reading that story in the Weekly Reader and wondering how in the world someone even came up with exactly how long an inch is. On what was it based? Did they just make a couple of marks, vote on it while they had a few tankards of ale, and decide that this was an inch?
Since my theories are typically not accurate, I looked it up on the Encyclopaedia Britannica online at Britannica.com. Here’s what it says:
“Inch, unit of British Imperial and United States Customary measure equal to 1/36 of a yard. The unit derives from the Old English ince, or ynce, which in turn came from the Latin unit uncia, which was “one-twelfth” of a Roman foot, or pes. (The Latin word uncia was the source of the name of another English unit, the ounce.) The old English ynce was defined by King David I of Scotland about 1150 as the breadth of a man’s thumb at the base of the nail.
To help maintain consistency of the unit, the measure was usually achieved by adding the thumb breadth of three men—one small, one medium, and one large—and then dividing the figure by three. During the reign of King Edward II, in the early 14th century, the inch was defined as “three grains of barley, dry and round, placed end to end lengthwise.” At various times the inch has also been defined as the combined lengths of 12 poppyseeds. Since 1959 the inch has been defined officially as 2.54 cm.”
Aha. See that last part? They started out just fine by defining an inch as “…the breadth of a man’s thumb at the base of the nail…the thumb breadth of three men (small, medium, and large) and...three grains of barley, dry and round, placed end to end lengthwise…and the length of 12 poppyseeds.”
Those all make sense.
Then, they ran off the rails in 1959 by defining it as 2.54 centimeters.
So, there is our answer. We’re in a catch-22. Americans never really learned the metric system, so there’s no way we can figure out how much 2.54 centimeters is so that we can convert it to an inch and then back to…whatever it is we’re supposed to convert it back to.
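For anyone who does want to figure out how much 2.54 centimeters is, the arithmetic is mercifully simple, since the 1959 definition fixed the inch at exactly 2.54 centimeters. A minimal sketch (the function names are mine, not anything official):

```python
# Since 1959, the inch has been defined as exactly 2.54 cm,
# so converting is a single multiplication or division.
CM_PER_INCH = 2.54

def inches_to_cm(inches: float) -> float:
    """Convert inches to centimeters."""
    return inches * CM_PER_INCH

def cm_to_inches(cm: float) -> float:
    """Convert centimeters back to inches."""
    return cm / CM_PER_INCH

print(inches_to_cm(1))      # one inch is 2.54 cm
print(cm_to_inches(2.54))   # and 2.54 cm is one inch again
```

No barleycorns or poppyseeds required.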
Personally, I’m glad we never moved to the metric system. Who would have foot the bill?