Why Data Comes First in Manufacturing Analytics

When I was nine, I wanted to be a fighter pilot. Fortunately for me, I came across the physical requirements early enough to allow me to focus on other passions instead. Because no matter how hard I tried, my poor vision, height, and assortment of allergies made me ill-suited to play the next Tom Cruise in a sequel to Top Gun.

Applying the metaphor to the topic at hand, many manufacturers with whom I work attempt to resolve specific problems, only to discover — too late — that their data is ill-suited for the task.

In a previous blog, Nate Oostendorp and I introduced how data fitness and data readiness are determining factors in the long-term success of manufacturing analytics efforts. If the manufacturer doesn’t have the right data for a specific use case, the project won’t succeed. That’s why data comes first in manufacturing analytics. Having data that is “fit” for the purpose you want to use it for is what I refer to as data fitness.

If, in fact, you do have data aligned with your purpose, the next step is to see if it’s actually ready for use. That’s data readiness. If your data fails this test, you can’t access and consume it to drive your use cases. (Check out our blog on how you can audit your data for readiness.)

In this article, I’ll dive deeper into the topic of data fitness. I’ll also provide a checklist of commonplace examples at the end.

The importance of data fitness

For a manufacturer, the first step in data fitness is assessing whether you’ve got the right data on hand to support your objectives. That may sound obvious, but I’ve seen far too many projects fall short because of a misalignment between objectives and data requirements.

For example, a common manufacturing analytics objective is figuring out the cause of rejects. This requires data showing what happened to parts that were rejected. Sounds simple, right? The thing is: the data needs are anything but.

Reject analysis entails associating specific produced parts or batches with both production process data (what happened when the part or batch went through a given process) and quality data (was the part or batch rejected and, if so, why). The difficulty here is that many manufacturers just don’t generate or collect production process or quality data for particular parts or batches, which is a deal-breaker for performing reject analysis.

Sometimes a manufacturer will have the right data to investigate a specific problem, only to discover, in the middle of the project, that it’s not at the required level of resolution to be useful. For example, I’ve been involved in many quality-related projects for which the manufacturer believed they possessed the data inputs to determine the root cause of rejects. Unfortunately, their data was often not specific to individual parts. If your quality data is collected at the work-shift or day level, it’s extremely difficult to associate reject information with individual parts, meaning the data is ill-suited for root-cause reject analysis.
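
To make the resolution point concrete, below is a minimal sketch in Python (pandas). The table and column names (part_serial, press_temp_c, reject_reason) are hypothetical rather than drawn from any particular system; the point is simply that a part-level key is what lets quality outcomes be joined back to process conditions.

```python
import pandas as pd

# Hypothetical part-level process data: one row per produced part.
process = pd.DataFrame({
    "part_serial": ["A001", "A002", "A003"],
    "press_temp_c": [182.1, 190.4, 178.9],
    "cycle_time_s": [31.2, 29.8, 33.5],
})

# Hypothetical serialized quality data: pass/fail plus a reject reason per part.
quality = pd.DataFrame({
    "part_serial": ["A001", "A002", "A003"],
    "result": ["pass", "fail", "pass"],
    "reject_reason": [None, "flash", None],
})

# The part-level join is what makes root-cause reject analysis possible.
# If quality were only logged per shift or per day, there would be no key
# to join on and this association could not be made.
joined = process.merge(quality, on="part_serial", how="inner")

# Compare process conditions for rejected vs. accepted parts.
print(joined.groupby("result")[["press_temp_c", "cycle_time_s"]].mean())
```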

Let’s take a more complicated use case. Currently, predictive maintenance is top of mind for many manufacturers. The cost savings can be significant and aren’t difficult to quantify. But predictive maintenance requires both process data and maintenance records for machines and their components. It also requires downtime data with reason codes that can be traced back to component failures. All of this data must be captured in a consistent and automated manner at the machine and component level. Once again, most manufacturers just don’t have reliable data at this level of granularity.
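
As a rough sketch of what those associations can look like in practice, the snippet below labels each machine process record with whether a component-level failure followed within the hour. All names (machine_id, reason_code, vibration_rms) are hypothetical and the logic is deliberately simplified; a table like this is the raw material a predictive model would be trained on.

```python
import pandas as pd

# Hypothetical component-level downtime events with failure reason codes.
downtime = pd.DataFrame({
    "machine_id": ["M1", "M1", "M2"],
    "component": ["bearing", "belt", "bearing"],
    "reason_code": ["FAIL_BRG", "FAIL_BELT", "FAIL_BRG"],
    "failed_at": pd.to_datetime([
        "2024-03-02 08:10", "2024-03-09 14:55", "2024-03-04 11:20",
    ]),
})

# Hypothetical machine-level process readings.
process = pd.DataFrame({
    "machine_id": ["M1", "M1", "M1", "M2"],
    "timestamp": pd.to_datetime([
        "2024-03-02 08:00", "2024-03-05 09:00", "2024-03-09 14:00", "2024-03-04 11:00",
    ]),
    "vibration_rms": [0.81, 0.30, 0.42, 0.95],
})

# Label each process record with the next failure (if any) on the same
# machine within one hour. This is only possible because the downtime
# events carry machine, component, and reason-code detail.
labeled = pd.merge_asof(
    process.sort_values("timestamp"),
    downtime.rename(columns={"failed_at": "timestamp"}).sort_values("timestamp"),
    on="timestamp",
    by="machine_id",
    direction="forward",
    tolerance=pd.Timedelta("1h"),
)
labeled["failure_within_1h"] = labeled["reason_code"].notna()
print(labeled[["machine_id", "vibration_rms", "component", "failure_within_1h"]])
```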

The Key Takeaway

It’s important to recognize that in most instances, at least some of the data required for manufacturing analytics will not be readily available. In my experience, there’s a fine line between data gaps that can be worked around and those that put a use case out of reach. Remember, data comes first. An accurate understanding of data requirements is essential to realistically assess the feasibility of a project and the adjustments to scope and budget that may be necessary.

Below are some examples of data requirements for typical use cases. To learn more about how manufacturers are leveraging the Sight Machine platform to generate actionable business insights from production data, visit our use case page here.

Data requirements for typical Digital Manufacturing use cases

Downtime analysis

  • Machine-specific process data
  • Machine-specific downtime signal (event) data
  • Downtime reason association
  • Downtime event-to-process data association (indication of when downtime occurred during the production process; see the sketch below)
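
A minimal sketch of that event-to-process association in Python (pandas), using hypothetical run and event tables: each downtime event is attached to the production run that was in progress when it began, and downtime minutes are then summarized by reason.

```python
import pandas as pd

# Hypothetical production runs per machine.
runs = pd.DataFrame({
    "machine_id": ["M1", "M1"],
    "run_id": ["R100", "R101"],
    "run_start": pd.to_datetime(["2024-03-01 06:00", "2024-03-01 14:00"]),
})

# Hypothetical downtime events with reason codes.
downtime = pd.DataFrame({
    "machine_id": ["M1", "M1", "M1"],
    "event_start": pd.to_datetime([
        "2024-03-01 07:15", "2024-03-01 15:30", "2024-03-01 16:10",
    ]),
    "duration_min": [12, 45, 8],
    "reason": ["jam", "changeover", "jam"],
})

# Backward as-of join: attach each event to the most recent run start
# on the same machine, i.e. the run during which the downtime occurred.
associated = pd.merge_asof(
    downtime.sort_values("event_start"),
    runs.sort_values("run_start"),
    left_on="event_start",
    right_on="run_start",
    by="machine_id",
    direction="backward",
)

# Downtime minutes by run and reason.
print(associated.groupby(["run_id", "reason"])["duration_min"].sum())
```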

Predictive maintenance

  • Component-specific process data
  • Component-specific maintenance records (historical information to build predictive algorithms)
  • Component-to-machine mappings
  • Machine-specific downtime (event) signals
  • Downtime reason association (at component level)
  • Downtime event-to-process data association (indication of when downtime occurred during the production process)

Reject analysis

  • Machine-specific process data
  • Serialized quality data (pass/fail at part/batch level)
  • Quality-state reason mapping
  • Process-to-quality data association (serial — part/batch level)

Material/input use optimization

  • Machine-specific process data
  • Inputs/materials (volume) for each process and what impacts them (this can be complex; for some inputs, time and temperature can impact input use)
  • Process-to-quality data association (serial — part/batch level) to understand how inputs impact yield (see the sketch below)
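
A simplified sketch of those associations in Python (pandas), with hypothetical batch-level tables: material consumption joined to process conditions and batch quality, so that input use per good unit can be related back to process settings.

```python
import pandas as pd

# Hypothetical material consumption per batch.
materials = pd.DataFrame({
    "batch_id": ["B1", "B2", "B3"],
    "resin_kg": [102.0, 97.5, 110.2],
})

# Hypothetical process conditions per batch.
process = pd.DataFrame({
    "batch_id": ["B1", "B2", "B3"],
    "oven_temp_c": [210, 205, 218],
    "dwell_min": [42, 40, 47],
})

# Hypothetical batch-level quality outcomes.
quality = pd.DataFrame({
    "batch_id": ["B1", "B2", "B3"],
    "units_out": [980, 1010, 890],
    "units_rejected": [12, 8, 41],
})

# Join all three at the batch level and compute material use per good unit,
# which can then be analyzed against the process settings.
batch = materials.merge(process, on="batch_id").merge(quality, on="batch_id")
batch["resin_kg_per_good_unit"] = batch["resin_kg"] / (
    batch["units_out"] - batch["units_rejected"]
)
print(batch[["batch_id", "oven_temp_c", "dwell_min", "resin_kg_per_good_unit"]])
```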

Andrew Home

Andrew Home is a data scientist and product manager with a passion for understanding the relationships between data and manufacturing processes. Andrew has served as a data science fellow with both Cap Gemini and Galvanize Inc. He holds a BA from Southern Methodist University.

Curious about how we can help? 
Schedule a chat about your data and transformation needs.