You might not realize it, but certain fundamental units are the basis for all of our everyday measurements as well as those of the most precise scientific experiments. Physicists recognize seven fundamental units: mass, length, time, electric current, amount of substance, temperature and luminous intensity. These, either singly or in combination, describe everything physical that we can measure.
No one knows exactly why the fundamental units are fundamental or even why they are constant. The relationships among them are complex and sometimes surprising.
The interrelationships are too complex to detail in this limited space, but the following example of length should serve as a beginning in understanding both the establishment of the units and the types of relationship among them.
The idea of fundamental units is a sound one. Measurements of physical quantities have little meaning unless everyone uses the same standards. Of the seven, length is the most fundamental, the most tangible and the simplest.
A meter must mean the same to everyone, from the most precise scientist to the elementary-school student, in order for anything physical to make sense at all. More specifically, nuts and bolts must have matching threads, and parts machined on different continents must fit together to build automobile engines and jetliners.
The precision required in defining the seven units depends on the type and precision of the measurements that can be made, and that is where the problem of defining fundamental units begins in the first place.
Until the late 1700s there were no standard units. After the French Revolution the French Academy of Sciences defined a meter as equal to one ten-millionth of the distance between the North Pole and the equator, favoring this over the length of a standard pendulum because of the dependence of the pendulum on the strength of gravity.
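The arithmetic behind that original definition is easy to check. A minimal sketch in Python; note that the 10,002-kilometer modern quadrant figure below is an approximation I am supplying for comparison, not a number from the column:

```python
# 1790s definition: 1 meter = 1/10,000,000 of the distance
# from the North Pole to the equator along a meridian.
quadrant_by_definition_km = 10_000_000 / 1000  # 10,000 km, exactly, by decree

# Modern geodesy puts that meridional quadrant near 10,002 km
# (approximate figure), so the survey behind the original meter
# was off by only about 0.02 percent.
quadrant_modern_km = 10_002
error_percent = ((quadrant_modern_km - quadrant_by_definition_km)
                 / quadrant_by_definition_km * 100)

print(quadrant_by_definition_km)  # 10000.0
print(round(error_percent, 2))    # 0.02
```

The surprising part is how well the 18th-century surveyors did: the meter they produced was short by only a fraction of a millimeter.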
In 1875 a permanent International Bureau of Weights and Measures — known by its French acronym, BIPM — was established in Sèvres, France, to construct and preserve a prototype meter bar, distribute national metric prototypes (copies) and maintain comparisons between them and nonmetric measurement standards.
In 1889 the General Conference on Weights and Measures — whose French acronym is CGPM — created the International Prototype Meter as the distance between two lines on a metal bar composed of an alloy
of 90 percent platinum and 10 percent iridium, measured at the melting point of ice. The bar still exists, but only as a historical artifact, since there are now much more precise ways to measure the fundamental units.
By 1925 physicists used interferometry regularly to measure the meter, but the IPM platinum bar remained the standard until 1960. At that time the 11th CGPM defined the meter in the new International System of Units, or SI, as equal to 1,650,763.73 wavelengths of the orange-red emission line in the electromagnetic spectrum of the krypton-86 atom in a vacuum.
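That 1960 definition can be inverted to recover the wavelength it rests on. A minimal sketch, assuming only the figure quoted above:

```python
# 1960 SI definition: 1 meter = 1,650,763.73 wavelengths of the
# orange-red krypton-86 emission line in a vacuum.
wavelengths_per_meter = 1_650_763.73

# Invert to get the wavelength itself, converted to nanometers.
wavelength_nm = 1 / wavelengths_per_meter * 1e9

print(round(wavelength_nm, 2))  # 605.78 -- squarely in the orange-red
```

In other words, the definition pinned the krypton line at about 605.78 nanometers, and the meter was whatever 1,650,763.73 of those waves added up to.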
In 1983 it was redefined again to the current definition, one derived from the speed of light, which had been measured with ever-increasing precision over hundreds of years. The CGPM fixed the speed of light at exactly 299,792,458 meters per second, which in turn defines a meter as the distance light travels in 1/299,792,458th of a second.
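The current definition runs in the opposite direction from every earlier one: the speed of light is fixed by decree, and the meter falls out of it. A minimal sketch of that arithmetic:

```python
# 1983 SI definition: the speed of light is exactly 299,792,458 m/s,
# so 1 meter is the distance light travels in 1/299,792,458 of a second.
c = 299_792_458            # meters per second, exact by definition

meter = c / 299_792_458    # distance covered in that fraction of a second

# A handy consequence: in one nanosecond, light covers about 30 cm --
# roughly one foot, a rule of thumb engineers actually use.
cm_per_ns = c * 1e-9 * 100

print(meter)                # 1.0
print(round(cm_per_ns, 1))  # 30.0
```

The definition is deliberately circular in one direction only: measure the speed of light more precisely and you do not change the number 299,792,458; you change the realized length of the meter.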
What exactly is a second? That will have to wait for a later column.