The 4 Best Ways to Use Data to Optimize Continuous Flow Processes



In my previous blog, I discussed why it is critical for continuous flow manufacturers to create end-to-end data models, or digital twins, of their entire production process. In this blog, I want to give you insight into how these digital twins help manufacturers optimize production.

At Sight Machine, we have been working for years with continuous manufacturers struggling to use data to lower costs and address quality issues. Most have yet to develop scalable methods of analyzing the massive amounts of sensor data generated during production.

So how can an end-to-end digital twin transform the way continuous manufacturers use data to drive decision-making? Here are the 4 biggest impact areas:

1. Improving overall production efficiency vs. optimizing a single process or piece of equipment


Traditionally, process manufacturers have focused their efficiency improvement efforts on single machines or processes within a production line. But this approach often creates challenges:

  • If that process/machine is not a bottleneck in the overall process, then the manufacturer isn’t improving overall production efficiency.
  • Optimizing one piece of equipment or process will have upstream or downstream effects that may negatively impact the overall system.

I’ve seen this problem firsthand: After analyzing a particular piece of equipment, manufacturers often find that they can relax tolerances at a given machine, tank, or process to improve output while still maintaining quality at that step. However, the increased variability inevitably impacts downstream processes, causing more defects overall.

Tracing data across the whole process allows manufacturers to take these upstream and downstream dependencies into account and determine the most effective improvements. Read more in Sight Machine’s customer case studies.
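To make the bottleneck point concrete, here is a minimal sketch of how you might locate the rate-limiting step before deciding where to invest in optimization. The stage names and throughput figures are invented for illustration, not taken from a real line:

```python
# Hypothetical per-stage sustained throughput, in units/hour.
# Improving any stage other than the slowest one will not raise
# overall line output.
stage_throughput = {
    "mixer": 120.0,
    "cooker": 95.0,
    "separator": 80.0,   # slowest stage limits the whole line
    "packager": 110.0,
}

def find_bottleneck(throughput):
    """Return the stage with the lowest sustained throughput."""
    return min(throughput, key=throughput.get)

print(find_bottleneck(stage_throughput))  # separator
```

In practice the rates would come from the end-to-end data model rather than a hand-entered dictionary, but the principle is the same: compare stages against each other before optimizing any one of them.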

Lesson learned: Optimizing one machine at a time can actually make things worse, resulting in more defects overall.

2. Evaluating the trade-offs involved with changes in production settings

I’ve found that continuous manufacturers often struggle to balance the trade-offs between different production KPIs.

Here’s an example from my work with a large glass manufacturer. Typically, when downstream quality issues arose, the operator of an upstream furnace would increase the furnace temperature (higher temperatures generally mean fewer defects). This focus on one process allowed the production facility to address one KPI – quality.

However, increasing the temperature also led to higher energy costs, which hurt a different KPI – cost. By looking only at furnace quality, they were seeing half of the problem; the manufacturer needed to evaluate downstream batch quality data alongside overall energy use data to understand the best trade-off to make.

By having integrated end-to-end data, they can optimize the whole process.  
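To illustrate the kind of trade-off analysis this enables, here is a minimal sketch that scores candidate furnace setpoints by combining energy cost and defect cost. All coefficients are toy values invented for illustration, not real furnace data:

```python
import math

def total_cost_per_batch(temp_c, energy_price=0.05, defect_cost=40.0):
    """Toy cost model: higher temperature raises energy use but
    lowers the defect rate. Both relationships are assumed shapes,
    not fitted to real data."""
    energy_kwh = 50.0 + 2.0 * temp_c               # assumed linear energy use
    defect_rate = 0.5 * math.exp(-temp_c / 50.0)   # assumed defect response
    return energy_kwh * energy_price + defect_rate * defect_cost

# Scan candidate setpoints and pick the cheapest overall, balancing
# energy cost against defect cost rather than minimizing defects alone.
best = min(range(50, 151, 5), key=total_cost_per_batch)
print(best)  # 70 under these toy coefficients
```

The point is not the specific numbers but the objective: once quality and energy data live in one model, you can minimize total cost instead of driving a single KPI to its extreme.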

Lesson learned: Creating an end-to-end process model allows manufacturers to evaluate the trade-offs of proposed optimizations, answering the question: Is it really worth it?

3. Combining end-to-end process data with batch data to optimize downstream quality and output

By combining a digital twin of the production process with batch data, manufacturers can optimize for quality and output by looking at how process changes impact downstream activity.

Here’s an example of what I mean. One of our food processing customers was looking to drive efficiency and improve quality in its separation processes. Separation, where milk is divided into various products (whey, cheese curds, etc.), was often a bottleneck. The length and efficiency of separation were a function of many batch variables, including:

  • Blend of protein and fat in the raw materials
  • Temperature of the batches coming out of an upstream process (in this case, a cooker)
  • pH of the output from an upstream fermentation process

We were able to create a data model or digital twin of production that integrated equipment/process sensor data with batch and quality data. This model was then used to analyze upstream cooker and fermentation data for specific batches.

By having an end-to-end model, the processor was able to do chemistry-based modeling to determine optimal temperatures and release times for the fermentor to maximize both quality and throughput of the downstream separation process.
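As a rough illustration of this kind of batch modeling, the sketch below fits a simple linear model of separation time on the upstream variables listed above. The batch values and times are invented, and a real chemistry-based model would be considerably richer than a linear fit:

```python
import numpy as np

# Hypothetical batch records: protein/fat ratio, cooker exit temp (°C),
# fermentation pH. All numbers are made up for illustration.
X = np.array([
    [0.80, 72.0, 6.4],
    [0.75, 74.0, 6.2],
    [0.90, 70.0, 6.5],
    [0.85, 73.0, 6.3],
    [0.70, 75.0, 6.1],
])
y = np.array([41.0, 38.5, 44.0, 42.0, 36.0])  # separation time, minutes

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predicted_separation_time(ratio, temp, ph):
    """Predict separation time for proposed upstream settings."""
    return float(coef @ [1.0, ratio, temp, ph])
```

With a model like this in hand, you can search over feasible cooker temperatures and fermenter release times to find settings that shorten the downstream separation step, which is the optimization described above.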

Lesson learned: Integrating process and batch data allows you to optimize the production settings for an upstream process to improve downstream efficiency.

4. Using integrated process and batch data to identify upstream causes

With an integrated end-to-end view of all process and batch data, continuous manufacturers can identify upstream root causes of downstream issues.

Here’s an example of how this worked. Like many continuous manufacturers, one of our customers was challenged with identifying the root cause of production alarm signals.

The process engineering team was continuously working to address issues on equipment that was generating alarms. But unfortunately, much of this work had limited impact, as the root cause of the alarm was actually associated with upstream production processes. The team was addressing a symptom, not a cause.

By combining an end-to-end model of production with quality data, the manufacturer was able to:

  • Weed out non-relevant alarms to focus their analysis
  • Look for clusters of co-occurring alarms
  • Develop an understanding of cascades of alarms (alarm interdependencies and how one alarm may lead to a sequence of other alarms)

Ultimately, this enabled us to determine the root alarm and the corresponding activity that generated it, resolving the issue once and for all.
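The co-occurrence step might be sketched like this, bucketing alarm events into time windows and counting which pairs fire together. The alarm tags and timestamps are invented for illustration:

```python
from collections import Counter
from itertools import combinations

# Hypothetical alarm log: (timestamp in seconds, alarm tag).
events = [
    (0, "FURNACE_TEMP_HIGH"), (2, "VISCOSITY_LOW"),
    (3, "DEFECT_RATE_HIGH"), (60, "FURNACE_TEMP_HIGH"),
    (62, "VISCOSITY_LOW"), (120, "PACKAGER_JAM"),
]

def co_occurring_pairs(events, window=10):
    """Count unordered alarm pairs whose events land in the same
    time window. Frequent pairs hint at alarm interdependencies."""
    buckets = {}
    for ts, alarm in events:
        buckets.setdefault(ts // window, set()).add(alarm)
    pairs = Counter()
    for alarms in buckets.values():
        pairs.update(combinations(sorted(alarms), 2))
    return pairs

pairs = co_occurring_pairs(events)
```

A pair that recurs across many windows, with one alarm consistently firing first, is a candidate cascade: the earlier alarm points toward the upstream cause, the later ones are symptoms.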

Lesson learned: An integrated end-to-end view of process and batch data enables you to identify upstream root causes.

The power of a digital twin can transform the way continuous manufacturers operate. It enables process engineers and plant managers to look at old problems in entirely new ways. Check out our use case page to read about some of our recent work. I look forward to hearing from you about the challenges you’re facing in optimizing your environment.



Kurt DeMaagd

Chief AI Officer and Co-Founder – Kurt co-founded Sight Machine and has served as a professor at Michigan State University, where he focused on information management, economics, and policy. Kurt is an accomplished analytics programmer.
