Most consumers in North America—in fact, most of the world—have lived in an era in which the food they buy has some type of product label attached to it. This label goes beyond just identifying what the food item is; it also provides information as to its nutritional value, ingredients, and other important consumer notices. However, this has not always been the case.
June/July 2017 issue
For most of modern history, there were few to no labels on food. People produced much of their own food and purchased the rest from the farmer, the butcher, or the baker up the road, so they knew the items were fresh and local. There were no government inspections or labels. The astute consumer knew what to look for in a piece of fruit or a slab of meat and could tell whether it was fresh by poking it, smelling it, or simply looking at it.
And there were few “trust” issues when it came to selecting food. The farmer sold or bartered many of his offerings with the same person who made clothes for his family, taught his children in school, or built his farm equipment.
However, as the world’s population grew, much of this trust began to evaporate, and concerns about the purity, safety, and quality of food increased. These concerns are what led to a history of food rules and regulations, along with the food labeling systems that are in place today—all enacted to help protect the consumer.
A History of Food Labeling
One of the first examples of a food-quality labeling system, of sorts, appeared around A.D. 400 in the Roman Empire. At that time, vendors would stand on the steps of a central location in the city to sell their goods, and those with the highest-quality bread and other food products stood on the highest steps. For the most part, this system worked: consumers who could afford it knew to climb the steps for the highest-quality food items.