(Editor’s Note: This is an online-only article attributed to the June/July 2017 issue.)
Manufacturers, especially in the food and beverage industry, count on customer loyalty to drive recurring revenue from repeat purchases. But in a time when consumers have more brand options than ever and just as many social channels to voice their complaints, it’s harder than ever to retain their business. Convenient packaging, quick preparation, new flavors, and celebrity endorsements only go so far. The product must be great, rather than simply good, to edge out the competition. And what makes any product great? Quality.
To establish and maintain a level of quality that meets consumer (and federal) demands, manufacturers have to pay close attention to every detail of the product lifecycle. This commitment requires extensive time and resources to conduct incoming, spot, and final inspections; export, compile, and analyze data; and make adjustments to inventory levels, process parameters, and packaging strategies.
Stop Suffering from Detached Quality Initiatives
Most manufacturers consider quality a plant-level function that’s managed at each individual location. As a result, data is rarely standardized from one location to the next; the same measurement might be labeled “weight,” “oz.,” or “ounces.” While these variations make sense to the human eye, data that is not standardized cannot be aggregated, and there is far too much of it to compile manually. The vice president of quality reviews reports from the individual plants, but those reports never add up to a true view of overall operations.
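To make the standardization problem concrete, here is a minimal sketch of mapping plant-specific labels onto one canonical field name so measurements can be aggregated. The label variants and sample records are hypothetical, not drawn from any particular system.

```python
# Map every known plant-specific label onto one canonical field name.
# These aliases are illustrative examples only.
LABEL_ALIASES = {
    "weight": "weight_oz",
    "oz.": "weight_oz",
    "oz": "weight_oz",
    "ounces": "weight_oz",
}

def standardize(record: dict) -> dict:
    """Rewrite a record so inconsistent labels share one canonical key."""
    out = {}
    for label, value in record.items():
        canonical = LABEL_ALIASES.get(label.strip().lower(), label)
        out[canonical] = value
    return out

# Two plants reporting the same measurement under different labels:
plant_a = {"Weight": 16.1}
plant_b = {"ounces": 15.9}
combined = [standardize(r) for r in (plant_a, plant_b)]
# Both records now share the key "weight_oz" and can be aggregated.
```

Once every record uses the same canonical keys, roll-ups across plants become a simple grouping operation instead of a manual reconciliation exercise.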
Further, data collection strategies often predate the third industrial revolution, leaving many manufacturers at least one industrial revolution behind current technological capabilities. In a recent survey by InfinityQS of 260 manufacturers, including some of the world’s largest manufacturing organizations, 75 percent of respondents noted that they are still manually collecting data. Astoundingly, 47 percent of those rely on pencil and paper.
This means that once a quality check is complete, the data gets lost in siloed databases and overflowing filing cabinets. When the auditor comes to check on compliance, quality professionals have to scramble to find the right file or piece of paper to satisfy requirements.
Compiling reports for auditors or management takes hours of splicing together information from disparate sources. And the resulting information can only summarize what has already happened. That means quality professionals can only make recommendations based on intuition and best guesses about what to change to prevent a recurrence.
According to a joint report by ASQ and APQC, this strategy has led to “approximately 60 percent of organizations [saying that] they don’t know or don’t measure the financial impact of quality. This lack of measurement may be attributed to not having a common method for capturing the financial impact.”
Quality professionals aren’t fortunetellers, but in a business world that lives and dies by numbers, they must prove their value like any other department. What if you could predict how processes would react to incremental changes in specifications? Would it be possible to streamline these processes, while maintaining quality and decreasing costs? You’d have to completely re-imagine quality.
Embrace the Excellence Loop
The first step in the pursuit of manufacturing excellence is automation. With the sheer volume of data that manufacturers collect, it is imperative to implement automated data collection strategies that gather and standardize data from all sources, including devices, databases, OPC servers, text files, and enterprise systems.
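As a rough illustration of that gather-and-standardize step, the sketch below pulls readings from two heterogeneous sources (a CSV text export and a simulated device feed) into one uniform stream. The source layouts, field names, and sample values are all assumptions for the example; a real deployment would add connectors for databases, OPC servers, and enterprise systems.

```python
import csv
import io

# Every source is normalized to records with these canonical keys.
CANONICAL_FIELDS = ("plant", "part", "weight_oz")

def from_csv_text(text: str):
    """Yield standardized records from a CSV export (one assumed layout)."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"plant": row["Plant"], "part": row["Part"],
               "weight_oz": float(row["Ounces"])}

def from_device(samples):
    """Yield standardized records from a simulated device feed."""
    for plant, part, oz in samples:
        yield {"plant": plant, "part": part, "weight_oz": oz}

def collect(*streams):
    """Merge every source into a single uniform list ready for analysis."""
    return [rec for stream in streams for rec in stream]

csv_export = "Plant,Part,Ounces\nA,lid,1.02\nA,lid,0.98\n"
device_feed = [("B", "lid", 1.01)]
data = collect(from_csv_text(csv_export), from_device(device_feed))
# All records now share the keys ("plant", "part", "weight_oz").
```

The design choice that matters is the single canonical record shape: each connector owns the messy details of its source, and everything downstream (reports, analysis, audits) sees one consistent stream.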