A look back through the history of mathematics and technology reveals that many fundamental conceptual breakthroughs involve measurement in one form or another. Technical Editor Clay Gordon investigates.
In Euclid’s time (~300 BCE), the available “measuring” tools were a straightedge and a compass. Euclid’s straightedge carried no divisions: there was no ‘zero’ point, no ‘one’ at a specified distance from the zero, no ‘two’ at the same distance again beyond the one, and so on. Only straight line segments and arcs could be drawn. What mattered were the relationships that could be established and the computations that could be performed with just these two instruments – addition, subtraction, multiplication, division, and the extraction of square roots. These constructions were powerful and are still in use today, but they remained confined to plane geometry; they were not extended to abstract numbers until long after Euclid, once mathematical techniques far more advanced than any available to him and his peers had been developed.
While physical measuring units existed at the time, they played no part in Euclid’s geometry. These units were not based on absolute references; they were based on familiar but arbitrary referents that varied from person to person, such as the distance from the nose to the tip of the outstretched index finger, the length or width of a thumb, or the length of a foot.
Not surprisingly, these referents changed over time. In the 10th century in Wales, the inch was defined as “three lengths of [a] barleycorn.” In the 12th century, King David I of Scotland defined the inch as the width of an average man’s thumb at the base of the nail. In the 14th century, King Edward II of England defined the inch as “three grains of barley, dry and round, placed end to end, lengthwise.” Over time, standardized inch measures kept in places such as the Exchequer chamber and the Guildhall superseded the grain-count referents. In 1866, the US adopted the conversion 1 metre = 39.37 inches, but it was not until 1959 that the US inch was officially defined as exactly 25.4 millimetres. Today, there is a move away from physical referents towards defining length, weight, time, and other units in terms of physical constants that do not change, such as a specific wavelength of light or the number of atoms of a particular element.
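To see why the 1959 redefinition mattered, a few lines of arithmetic are enough. The 1866 conversion (1 metre = 39.37 inches) and the 1959 definition (1 inch = 25.4 mm exactly) imply two slightly different inch lengths; the sketch below (illustrative, not from the article) works out the gap, which comes to roughly two parts per million.

```python
# Inch length implied by the 1866 US conversion: 1 metre = 39.37 inches.
inch_1866_mm = 1000 / 39.37        # works out to about 25.4000508 mm

# Inch length fixed by the 1959 international definition: exact.
inch_1959_mm = 25.4

# Relative difference between the two definitions, in parts per million.
ppm = (inch_1866_mm - inch_1959_mm) / inch_1959_mm * 1e6

print(f"1866 inch: {inch_1866_mm:.7f} mm")
print(f"1959 inch: {inch_1959_mm:.7f} mm")
print(f"difference: {ppm:.2f} ppm")
```

A difference of two millionths sounds negligible for a thumb or a foot, but over the distances used in land surveying it adds up, which is why the older conversion survived in the US as the separate “survey foot” for decades after 1959.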
When the native peoples in the Americas started processing cacao beans thousands of years ago using the available tools, the metate and mano, there were no (known) measuring instruments or standards to guide them. They would have had an intuitive understanding of how hot the comal needed to be to toast the beans without burning them, but there were no thermometers. Likewise, while they would have had an intuitive feel for the texture the cacao paste needed to reach, there were no micrometers to measure particle size, nor any tools to measure particle size distribution.
Read the full feature in our magazine.
Never miss a story… Follow us on:
International Confectionery
@InConfectionery
Media contact
Hannah Larvin
Editor, International Confectionery
Tel: +44 (0) 1622 823 920
Email: editor@in-confectionery.com