Here’s something to think about: a dry goods dealer has a five-yard piece of thirty-six-inch-wide material, and wishes to sell a customer one-and-a-half yards. But neither a yardstick, nor a tape measure, nor any other measuring device is available. Can the dealer complete the sale?
Yes, it can be done, as you’ll discover later; but for now, take our word for it. Using a ruler or a tape measure would be a lot easier than figuring out this problem in your head. In fact, much of our commercial life would be next to impossible without implements for estimating length and distance. Why “estimate”? Well, if the American mission that landed a man on the moon had depended upon measuring devices only as accurate as the ruler in a schoolboy’s briefcase, our astronauts would most likely have missed the moon altogether!
No ruler or measuring device is completely accurate; in fact, no measurement of any kind is ever absolutely correct. A measuring device is, in effect, only a reproduction of an arbitrary standard. While the standard itself is perfectly accurate, the reproduction never is. For instance, the standard of length measure in the United States is the yard. Theoretically, a yard is a yard is a yard, but no yardstick ever measures exactly thirty-six inches.
The first common unit of length was the cubit, used by the Egyptians and Babylonians thousands of years ago. Originally, the cubit was defined as the length of a man’s arm from the elbow to the end of the middle finger (the word comes from the Latin word for “elbow”), but the actual length of the cubit varied from place to place and from time to time. Through most of Egyptian history, the cubit was equivalent to about 20.6 inches. One advantage of the cubit was that in the absence of a measuring device the unit could be easily, handily, you might say, approximated. Presumably, long-armed merchants were quite popular in Egypt.
In Egyptian, the cubit was called the meh, and was divided into units called sheps, or “palms.” There were seven sheps in a cubit, and the shep in turn was divided into four parts, called zebos, or “digits.” “Let’s see, I’ll have two mehs, six sheps, three zebos of that linen.”
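If you care to see what such an order comes to in modern terms, here is a rough sketch in Python. It assumes only the figures quoted above: a meh (cubit) of about 20.6 inches, seven sheps to the meh, and four zebos to the shep; the function name is ours, not ancient Egypt’s.

```python
# Rough sketch: convert an Egyptian meh/shep/zebo order into inches,
# assuming a 20.6-inch cubit, 7 sheps per meh, and 4 zebos per shep.

MEH_INCHES = 20.6
SHEPS_PER_MEH = 7
ZEBOS_PER_SHEP = 4

def to_inches(mehs, sheps, zebos):
    """Express a meh/shep/zebo quantity as inches."""
    total_mehs = mehs + sheps / SHEPS_PER_MEH + zebos / (SHEPS_PER_MEH * ZEBOS_PER_SHEP)
    return total_mehs * MEH_INCHES

# "Two mehs, six sheps, three zebos of that linen"
print(round(to_inches(2, 6, 3), 1))  # about 61.1 inches, just under two modern yards
```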
The ancient Greek cubit was about 20.7 inches, but another unit of measurement, the foot, was more widely used. The Greek foot was about 12.5 inches, divided into twenty-five digits. The Romans adopted a pes, or foot, of about 11.6 inches, divided into sixteen digits.
Prior to the nineteenth century, each country in Europe had its own system of weights and measures. In medieval England, for instance, the foot was equivalent to 13.2 inches, with six feet to a fathom, ten fathoms to a chain, ten chains to a furlong, and ten furlongs to an old mile, equivalent to about 6,600 feet. The English units of measure evolved from many origins, some as old as the Roman conquest.
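If you want to check how the old mile works out to roughly 6,600 of our feet, here is a quick Python sketch using only the figures just quoted; the variable names are ours.

```python
# Quick check of the medieval English chain of units quoted above:
# a 13.2-inch foot, 6 feet to the fathom, 10 fathoms to the chain,
# 10 chains to the furlong, 10 furlongs to the old mile.

MEDIEVAL_FOOT_INCHES = 13.2

old_mile_in_medieval_feet = 6 * 10 * 10 * 10                               # 6,000 medieval feet
old_mile_in_modern_feet = old_mile_in_medieval_feet * MEDIEVAL_FOOT_INCHES / 12

print(old_mile_in_medieval_feet)  # 6000
print(old_mile_in_modern_feet)    # 6600.0 modern (12-inch) feet, as stated
```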
By 1800, length measure standards were fixed in England at their present values. The unit of length measurement was the yard (from the Anglo-Saxon gyrd, “measure”), divided into feet and inches. English kings kept a bar known as the standard yard in London as the ultimate arbiter of the yard’s exact length.
The idea of a physical standard is an old one. Physical standards are kept, not to settle every measuring dispute by direct reference to the standard, but to have a permanent (or so it was thought) record of each measuring standard. But even a physical standard can be inaccurate. It’s been estimated that the new British standard yard has decreased by 0.0002 inch since it was cast a little over a hundred years ago.
The original standard yard was lost when the Houses of Parliament burned in 1834. Scientific studies followed to determine a new standard and to produce a unified system of weights and measures. In 1878, the British imperial yard was defined as the distance at a specified temperature between two lines engraved on gold studs sunk in a certain bronze bar.
Measurement systems from England, France, Spain, and Portugal were all brought to America by colonists, and all were used here briefly in various places. By the time of the American Revolution, standards of measurement here were identical to those current in England. In 1832, an American physical standard was adopted, defining the yard as the distance between the twenty-seventh and sixty-third inch of a certain eighty-two-inch brass bar kept in Washington, D.C. Since 1893, the U.S. standard yard has been defined in terms of the meter.
The yard is still the standard of length measurement in this country, of course; but most other nations, including England, have gone over to the metric system. Although a decimal system for measurement was first proposed as long ago as 1670, the essentials of today’s metric system were embodied in a report made by the Paris Academy of Science in 1791.
The original plan for a metric system defined the meter as one ten-millionth part of a meridional quadrant of the earth. Larger units were formed by the addition of Greek prefixes (deca-, hecto-, kilo-), and smaller units with Latin prefixes (deci-, centi-, milli-). A platinum bar was cast according to this definition for use as the physical standard, although few scientists at the time were satisfied with its accuracy.
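The prefixes do all the work of scaling. Here is a tiny Python illustration of the idea; the sample length is arbitrary.

```python
# The Greek prefixes scale the meter up, the Latin prefixes scale it down.
PREFIXES = {
    "deca": 10, "hecto": 100, "kilo": 1_000,      # Greek: larger units
    "deci": 0.1, "centi": 0.01, "milli": 0.001,   # Latin: smaller units
}

length_m = 3.2  # an arbitrary length in meters
for name, factor in PREFIXES.items():
    print(f"{length_m} meters = {length_m / factor:g} {name}meters")
```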
The new measuring system did not catch on immediately. Many years passed before it was widely adopted in France and other countries. In 1875, a treaty was signed assuring international unification and improvement of the system. Four years later, the standard meter was newly defined with reference to a bar of platinum-iridium alloy. In 1927, a supplementary definition described the meter in terms of light waves, for while a platinum bar can be destroyed or altered, light wave measurement will always be available. With such a standard, a unit of measurement can be verified anywhere in the world without risking damage in transit to the physical standard.
Along with the United States, only Liberia, Southern Yemen, and a handful of other small nations still employ the so-called British system, and the metric system will soon be introduced in most of these countries as well. The exact date for total metric conversion here has not yet been fixed, but Canada began the switchover in 1977.
A number of other terms are sometimes used for length measurement. For instance, have you ever wondered what a furlong is? The term originated with the word “furrow,” and the furlong was at one time considered the length most suitable for a plow furrow. It is now defined as 220 yards.
The word league has varied in meaning in different times and countries. In England, the league was equivalent to about three miles. For centuries now, the word league has been used chiefly in a figurative sense.
The knot is not a measurement of length, so if you ever hear a sailor say “knots per hour” you’ll know he’s a landlubber in disguise. The knot is actually a measurement of speed, equivalent to one nautical mile per hour.
The nautical mile has been defined in different ways at different times, since navigators once regarded it as the length of one minute of arc along the meridian at the place where they were taking the reading, and that length varies with latitude. The U.S. nautical mile has been identical to the international unit since 1959, measuring about 6,076 feet.
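If you’d like to see what that means for speed, here is a minimal Python sketch converting knots to ordinary miles per hour, assuming the 6,076-foot nautical mile quoted above and the familiar 5,280-foot statute mile.

```python
# Minimal sketch: knots (nautical miles per hour) to statute miles per hour.
FEET_PER_NAUTICAL_MILE = 6076   # the figure quoted above
FEET_PER_STATUTE_MILE = 5280

def knots_to_mph(knots):
    """One knot is one nautical mile per hour; scale by the ratio of the two miles."""
    return knots * FEET_PER_NAUTICAL_MILE / FEET_PER_STATUTE_MILE

print(round(knots_to_mph(20), 1))  # a 20-knot ship makes about 23.0 land miles per hour
```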
The fathom was originally thought to be the length to which a man can extend his arms, if you can fathom that. Today, the fathom is equivalent to six feet, and is used most often in reference to water depth and cable length.
The smallest unit of length measurement in the world is the attometer, equivalent to a mere quintillionth of a meter. Your pinky is probably about 70,000,000,000,000,000 attometers long.
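If you doubt the figure, here is a one-line Python check, assuming a pinky of roughly seven centimeters.

```python
# Back-of-the-envelope check: a 7-centimeter pinky, with the attometer
# taken as one quintillionth (10**-18) of a meter.
pinky_meters = 0.07
attometers_per_meter = 10**18
print(f"{pinky_meters * attometers_per_meter:,.0f} attometers")  # 70,000,000,000,000,000
```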
For a measuring device, early man probably first used his arms or his foot, but even during the Egyptian era, when the cubit was the unit of length measurement, a ruler of some kind was more often used. There are two basic kinds of length-measuring devices: the end standard, and the line standard. With an end standard, the unit is defined as the length from one end of the device to the other; with a line standard, from one calibration on the device to another. Most modern rulers are end standards; that is, a foot ruler is one foot long from end to end, and a yardstick is one yard long.
How accurate is a modern ruler? Suppose you want to draw a two-inch line on a piece of paper. First of all, the calibrations on the ruler are a fraction of an inch thick. Thus, a dot that looks correctly placed to you may appear off-center to another eye. The calibrations themselves are never absolutely precise, and the edges of most rulers are slightly warped. The chances of your line measuring precisely two inches, then, are slim indeed.
Rulers come in many shapes and sizes, among them the three-edged or triangular ruler so often used by students, both for measuring and for launching rubber bands across the classroom.
A tape measure is used for very long measurement, or for the measurement of a bent surface. Calipers consist of two prongs joined by a pin; they are best for measuring thickness or the distance between two surfaces.
As any draftsman can tell you, there’s a world of difference between a ruler and a straight edge. No draftsman worth his T-square would use a ruler for drawing an accurate straight line, since rulers are designed chiefly for measuring, not line drawing. For a crisp, straight line, it’s best to use a triangle or other device specifically designed for line drawing.
You remember our dry goods dealer and his measuring dilemma? Well, here’s what you would have to go through to cut the material without a measuring device.
First, take the five-yard piece of material and stretch it along the edge of a piece of paper, marking off five yards on the paper. Then use the thirty-six-inch width of the material, which is exactly one yard, to extend the mark on the paper to a total of six yards. Fold the paper in half for a three-yard measure, then fold the three-yard measure in half for a yard-and-a-half measure. Finally, lay the yard-and-a-half measure over the five-yard piece of material, and mark off a yard and a half on the material.
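For the skeptical, here is a short Python sketch of the same folding arithmetic, working in yards throughout.

```python
# The dealer's trick, in yards.
material_length = 5.0            # the five-yard piece, marked off on the paper
material_width = 36 / 36         # the thirty-six-inch width is exactly one yard
paper_measure = material_length + material_width   # six yards marked on the paper

first_fold = paper_measure / 2   # folding in half gives a three-yard measure
second_fold = first_fold / 2     # folding again gives the yard-and-a-half measure

print(second_fold)  # 1.5 yards, just what the customer ordered
```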