The Four Secrets to Accelerating Continuous Flow Manufacturing Analytics Efforts

Despite the industry-wide excitement associated with manufacturing analytics, I often see continuous manufacturers get bogged down early in these efforts.

They get stuck discussing theoretical trade-offs between project feasibility and impact, and the uncertainty inherent in hypothetical projects ultimately slows large manufacturers’ ability to move forward.

What I’ve learned is that it’s fruitless to get mired in detailed up-front analysis, because many of the variables associated with these efforts are unknowable until you actually start the project. Only after diving in does the manufacturer discover the intricacies of the data: What data is readily available? How ready is it to be used in analysis? What types of data modeling and preparation are needed? What questions can be addressed? What types of projects are achievable?

So how are innovative process manufacturers overcoming this and accelerating their time to impact?

Here are four common approaches I’ve seen successful continuous manufacturers use:

1. Explore, dive in

When it comes to manufacturing analytics, the leaders use an agile strategy that lets them start learning and executing right away. By gathering and aggregating production data into an analytics platform, manufacturers immediately get a better understanding of their overall data readiness. Building a data platform provides visibility into what can easily be addressed, where the data gaps are, and what needs to be built out.
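
As a rough illustration of that first readiness check, here is a minimal Python sketch that profiles an aggregated export of production data. The file name, the "timestamp" and "machine" columns, and the five-minute gap threshold are assumptions for illustration, not a prescribed schema.

```python
# Minimal readiness check on aggregated production data (illustrative schema).
import pandas as pd

# Assumed: a CSV with a timestamp column, a machine identifier, and one column per sensor tag.
df = pd.read_csv("production_data.csv", parse_dates=["timestamp"])

# Coverage: how much of each signal is actually populated?
missing = df.isna().mean().sort_values(ascending=False)
print("Fraction missing per column:\n", missing)

# Gaps: how often does sampling fall behind the expected cadence?
df = df.sort_values("timestamp")
intervals = df["timestamp"].diff().dropna()
print("Median sampling interval:", intervals.median())
print("Gaps longer than 5 minutes:", (intervals > pd.Timedelta(minutes=5)).sum())

# Uneven record counts per machine often point to instrumentation gaps.
print(df.groupby("machine").size())
```

Even a simple profile like this surfaces the gaps and readiness questions described above far faster than debating them in the abstract.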

2. Don’t follow the hype – fit the right solution to the job

Too often, manufacturers initially focus on complex analytic approaches when the simplest solution would work best. One example is the current hype associated with deep learning. In most cases, a combination of simpler techniques is much more appropriate. At a minimum, more basic data exploration methods may provide a better understanding of the problem space, and hence make it easier to develop and tune more complex algorithms.

Deep learning and AI have so much market buzz that many of our customers are looking to apply them to every issue. However, for initial data exploration, there are far faster and easier options. Simpler algorithms such as linear regression often give you a “good enough” answer much more quickly.
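
To make that concrete, here is a minimal sketch of a linear-regression baseline in Python. The file and column names ("line_speed", "temperature", "yield") are placeholders, not data from any particular customer.

```python
# A quick linear-regression baseline before reaching for deep learning.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Assumed: a batch-level summary table with numeric process inputs and an outcome column.
df = pd.read_csv("batch_summary.csv").dropna()
X = df[["line_speed", "temperature"]]
y = df["yield"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_train, y_train)

# If held-out R^2 is already acceptable, a more complex model may not be worth the added opacity.
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
print("Coefficients:", dict(zip(X.columns, model.coef_)))
```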

For example, one of our current customers approached us looking for a predictive maintenance solution. They were looking for an intelligent model that would help them reduce the cost associated with the catastrophic failure of equipment. The customer’s original idea involved the creation of a model that would provide complex rare event analysis. Instead, we quickly addressed their challenge using a simple model for anomaly detection.
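
The sketch below shows one generic way to flag anomalies in a single sensor signal with a rolling three-sigma rule. It illustrates the general idea of simple anomaly detection; it is not the model built for that customer, and the file, the "vibration" column, the one-hour window, and the threshold are all assumptions.

```python
# Simple rolling three-sigma anomaly detection on one sensor signal (illustrative).
import pandas as pd

df = pd.read_csv("sensor_log.csv", parse_dates=["timestamp"]).set_index("timestamp")

# A rolling one-hour mean and standard deviation define "normal" recent behavior.
rolling_mean = df["vibration"].rolling("1h").mean()
rolling_std = df["vibration"].rolling("1h").std()

# Flag readings more than three standard deviations from the recent mean.
df["anomaly"] = (df["vibration"] - rolling_mean).abs() > 3 * rolling_std
print(df[df["anomaly"]].head())
```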

One additional benefit of using simple techniques is that they often help establish trust with the users of the solution. Modern advances in AI can do incredible things with great accuracy, but it is often difficult to understand why the algorithms make a particular decision. In many cases, a simpler analytic that is more easily understood by the operations and engineering teams is more likely to be adopted broadly and incorporated into daily workflows.

3. Build an engine for improving data hygiene

It’s well known that data scientists spend three-quarters of their time cleaning, preparing, and modeling data. This is necessary data housekeeping before performing an analysis. Yet since it is the analysis phase that drives actual business value, it is tempting for manufacturers to look for shortcuts or to bypass data preparation altogether. I often see teams design algorithms they hope can manage noisy data.

But when the algorithms inside the analytic tools are expected to do that work, they become more cumbersome, slower, less accurate, and more difficult to improve. More importantly, this approach increases the maintenance and upkeep required to keep the algorithms current with inevitable changes to the production process.

If you give the algorithm clean, well-modeled data, you reduce computation time, achieve higher accuracy in predictions, and are more likely to have easily interpreted links between the underlying data features and the predicted outcomes.
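
Here is a minimal sketch of what that up-front hygiene can look like, with the file, tag names, valid ranges, and one-minute cadence all assumed for illustration.

```python
# Clean and model the data before it reaches the algorithm (illustrative ranges and cadence).
import pandas as pd

raw = pd.read_csv("raw_tags.csv", parse_dates=["timestamp"]).set_index("timestamp")

# Drop readings outside a plausible physical range instead of asking the model to cope with them.
clean = raw[(raw["temperature"] > 0) & (raw["temperature"] < 300)]

# Resample to a consistent cadence so downstream features line up.
clean = clean.resample("1min").mean(numeric_only=True)

# Forward-fill only short gaps; long outages stay missing and visible.
clean = clean.ffill(limit=5)

print(clean.describe())
```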

I’ve been involved in numerous projects where the manufacturer expects the algorithm itself to segment and massage the data to discover patterns, only to find that this limits their ability to scale the project across use cases and increases the ongoing support requirements.

Leading process manufacturers start by building out a platform that turns their raw data into higher-level elements such as digital twins of machines, lines, and facilities. You can read more tips on how to approach data readiness here.
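
As a simplified illustration of what those higher-level elements can look like in code, the sketch below rolls raw machine tags up into a line-level metric. The Machine/Line structure and tag names are assumptions for this example, not Sight Machine’s data model.

```python
# Toy digital-twin structure: machines roll up into a line-level metric.
from dataclasses import dataclass, field


@dataclass
class Machine:
    name: str
    tags: dict[str, float] = field(default_factory=dict)  # latest reading per tag


@dataclass
class Line:
    name: str
    machines: list[Machine] = field(default_factory=list)

    def throughput(self) -> float:
        # Aggregate machine-level counters into a line-level figure.
        return sum(m.tags.get("units_per_hour", 0.0) for m in self.machines)


line = Line("Filler Line 1", [
    Machine("Filler", {"units_per_hour": 1200.0}),
    Machine("Capper", {"units_per_hour": 1180.0}),
])
print(line.throughput())
```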

4. Know your users – and only give them what they need

Data scientists love flashy visualizations. But operators or technicians are primarily interested in the concrete actions they need to take to respond to the analysis.

To accelerate the success of early projects, take some time to understand the data sources and visualization systems users are already working with. Then map the results into reporting that fits their existing visualizations, vocabulary, and workflow.

For example, at a dairy manufacturer, we developed an interface that analyzed the 10 tanks involved in the production process, with sophisticated graphics of expected completion times, uncertainty windows, and potential production issues. However, it was not clear to the operator which data was relevant.

We changed this to a simple graphic: a table with one line for each tank and when it needed to be drained. Just the information the operator needed. Because of this, the analytic project’s output immediately became ingrained in the operator’s workflow.
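
A minimal sketch of that kind of reduction, using made-up tank names and drain times, might look like this:

```python
# Reduce model output to the one table the operator needs (made-up illustration data).
import pandas as pd

predictions = pd.DataFrame({
    "tank": ["T1", "T2", "T3"],
    "drain_by": pd.to_datetime(["2024-05-01 14:30", "2024-05-01 16:45", "2024-05-01 15:10"]),
})

# One row per tank, sorted by urgency; no uncertainty bands, no extra charts.
print(predictions.sort_values("drain_by").to_string(index=False))
```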

These four tips will help accelerate your initial digital transformation efforts. However, this is just the start. To truly scale projects, you need a comprehensive technical and organizational foundation in place. Sight Machine has compiled our learnings from multiple engagements into our Digital Readiness Index (DRI). The DRI will help you evaluate the factors that will impact the long-term success of your digital manufacturing efforts.

Kurt DeMaagd

Chief AI Officer and Co-Founder – Kurt co-founded Slashdot.org and has served as a professor at Michigan State University in information management, economics, and policy. Kurt is an accomplished analytics programmer.
