The ability to determine the root cause of defect creation, the frequency of unplanned downtime, the factors causing machine malfunctions… these are just a few examples of what is possible with algorithm-powered manufacturing analytics. It’s clear that algorithms are and will continue to be an important tool in the arsenal of Industry 4.0. But what exactly are these algorithms and how do they deliver valuable, analytic insights?
Put simply, an algorithm is “a process or set of rules to be followed in calculations or other problem-solving operations” (Oxford English Dictionary). In the computing world, software engineers or Data Scientists write a set of rules that tells servers, embedded devices, or other computers how to perform a desired task. That set of rules is the algorithm. Once it is written, the user selects the inputs he or she wants to analyze and processes them through the algorithm, which outputs the results.
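To make that definition concrete, here is a minimal, hypothetical sketch in Python: a fixed set of rules (the algorithm) applied to user-selected inputs (machine cycle times) that produces an output (which cycles ran too long). The function name, data, and threshold are illustrative assumptions, not taken from any real system.

```python
# Hypothetical illustration: an "algorithm" is just a precise set of steps.
# Input: a list of machine cycle durations in seconds, chosen by the user.
# Rule:  flag any cycle that ran longer than a threshold.
# Output: the indices of the flagged cycles.

def flag_slow_cycles(cycle_times_s, threshold_s=30.0):
    """Return indices of cycles whose duration exceeds threshold_s."""
    return [i for i, t in enumerate(cycle_times_s) if t > threshold_s]

flagged = flag_slow_cycles([12.1, 45.0, 28.3, 61.7], threshold_s=30.0)
# flagged is [1, 3]: the second and fourth cycles exceeded 30 seconds
```

The same pattern scales up: swap in richer inputs and more sophisticated rules, and you have the algorithms behind manufacturing analytics.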
The concept is simple enough: to enable manufacturing analytics, Data Scientists write step-by-step instructions that tell software programs how to run calculations on the selected inputs and then deliver the results. But computer algorithms and AI technology have been available since the late 1970s. Why are we only now applying analytics to the manufacturing industry to better understand efficiency, performance, and scrap? What’s changed?
The factory floor is overflowing with data from CNCs, MESs, PLCs… insert your factory’s favorite three-letter acronym here. As a result of the IIoT, we have more potential inputs – or data – available for analysis than ever before. This abundance of data makes it possible to virtually represent parts, machines, and entire manufacturing lines, creating ‘digital twins’ of these physical artifacts. [UPDATE: See CTO Nate Oostendorp’s post Why Your Digital Twin Should Have a Macro Scope for more on the digital twin]. These digital twins then serve as the inputs for Data Science-generated algorithms. Looking to understand why a CNC on your line has a high rate of unplanned downtime? Create a digital twin of that CNC and you’ll be able to visualize the isolated performance of that machine. Interested in understanding the root cause of your line’s high scrap rate? Create a digital twin of the entire manufacturing line and you’ll be able to analyze the data using a Root Cause Analysis (RCA) algorithm.
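A digital twin can be sketched in a few lines of code. The example below is a deliberately simplified assumption of how one might be structured: a record of a single machine’s state over time, assembled from hypothetical PLC/MES feeds, that an analytics algorithm (such as a downtime counter) can then consume. The class, field names, and sensor keys are all illustrative.

```python
# Hedged sketch: a "digital twin" modeled as a structured time series of a
# machine's state, built from hypothetical controller feeds.
from dataclasses import dataclass, field

@dataclass
class MachineTwin:
    machine_id: str
    # Each reading is a (timestamp, sensor-values) pair.
    readings: list = field(default_factory=list)

    def record(self, timestamp, sensors):
        """Append one snapshot of the machine's sensor state."""
        self.readings.append((timestamp, sensors))

    def downtime_events(self):
        """Count snapshots where the machine reported it was not running."""
        return sum(1 for _, s in self.readings if not s.get("running", True))

# Build a twin of a single CNC and query its isolated performance.
cnc = MachineTwin("CNC-7")
cnc.record("08:00", {"running": True, "spindle_rpm": 8000})
cnc.record("08:05", {"running": False, "spindle_rpm": 0})
cnc.record("08:10", {"running": True, "spindle_rpm": 7950})
```

A line-level twin would follow the same idea, aggregating many such machine twins so an RCA algorithm can compare stations against each other.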
When your data is taken from its raw format and structured appropriately – via digital twin and data modeling technology – you can quickly apply the decades-old power of the computer algorithm to unlock the full potential of manufacturing analytics. Conversely, if your data is not automatically formatted for analysis, a substantial amount of Data Scientist time will be needed to manually format it before any analytics can be developed and applied. As many manufacturing companies are learning, the effort required to manually condition data is so extensive that it renders analytics cost-prohibitive for most manufacturing problems.
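What does “conditioning” raw data actually involve? A small, hypothetical sketch: records arriving from different sources rarely share field names or units, so they must be normalized into one analysis-ready schema before any algorithm can run. The field names and unit conversion below are invented for illustration.

```python
# Hedged sketch of the data-conditioning step: normalize heterogeneous raw
# records (hypothetical field names and units) into one consistent schema.

def condition(raw_records):
    """Map raw rows from different sources into a single schema."""
    out = []
    for r in raw_records:
        out.append({
            # Different systems name the machine field differently.
            "machine": r.get("machine") or r.get("machine_id"),
            # Some sources report Celsius, others Fahrenheit.
            "temp_c": r["temp_c"] if "temp_c" in r
                      else (r["temp_f"] - 32) * 5 / 9,
        })
    return out

rows = condition([
    {"machine_id": "CNC-7", "temp_f": 212.0},   # Fahrenheit source
    {"machine": "CNC-8", "temp_c": 95.5},       # Celsius source
])
```

Doing this by hand for every field, sensor, and source is exactly the manual effort that makes analytics cost-prohibitive; scalable data modeling automates it.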
It is for this exact reason that algorithms alone are not enough. The key to realizing the ROI of digital manufacturing is a one-two punch: using scalable technology to do the heavy lifting of data formatting, then applying the applicable algorithm(s). Only when data is formatted properly can analytics be applied to solve high-value manufacturing problems: Why has my defect rate increased? Which parts of my manufacturing line are contributing to the problem? And, most importantly, how do I modify my factory line to increase efficiency, optimize performance, and reduce scrap?