Many manufacturers turn to edge AI because it lets them process and filter data locally, reducing the amount of data sent to a central server (i.e. the cloud), which saves time and cost while improving reliability and safety. The cloud, on the other hand, offers an excellent platform for storing historical industrial data, inventory lists, and demand data, as well as an easy means to scale.
In this first article, drawn from the panel "AI at the edge vs the cloud" held during the Industrial AI Summit, featuring Alex West (Principal Analyst at Omdia), Matteo Dariol (Lead Innovation Strategist at Bosch Rexroth), Anders Rahm-Nilzon (Director of the Cloud Center of Excellence at Volvo Group), and our CEO and Data Science Director Eric Topham, we explore the state of artificial intelligence in manufacturing today.
The status quo in manufacturing
Currently, 70% of potential clients interested in applying new technologies to their processes don't understand what it takes to build an AI system: they don't know what data is required and consequently don't have suitable data pipelines ready. Conversely, some customers have too much data available; their data lakes are turning into data swamps, and they are being overwhelmed with data.
On top of the data element, the human element is overlooked by many manufacturers today. To obtain good results in any AI project, it's crucial to have experts who can analyse the data, highlight its important features, and communicate with the end-users. Ultimately, AI is the art of asking the right question: if businesses aim in the wrong direction and ask the wrong questions, they will not get the results they expect. That is where expertise comes into play.
Manufacturers and industrialists, like almost every other business, need several different stakeholders sitting at the same table when embarking on a new and challenging project, as the application of AI can initially appear to be. Those stakeholders should include the people who will actually use the new systems, such as end-users and operators, who can give their input. Manufacturers should also aim to have a team of data scientists who can choose the right solution for the task at hand.
Companies need to plan carefully before deciding. Everything comes back to the questions: What are you trying to do with the data? How does it need to be moved, and can things be done in a distributed manner? What is the purpose of the analysis being performed on the data? The application of AI should be considered a team effort and an exercise in setting expectations, as well as a data effort.
An extra element that defines the status quo of AI for manufacturers, and one that customers often forget, is that machine learning models are not static. The key to delivering continuous improvement is the closed-loop feedback at the heart of machine learning operations (MLOps): as you gather more data, you refine your model and produce a better one for your customer. The distributions in the data flowing through a production pipeline may drift; the mean and standard deviation may change, for instance.
In that scenario, the model in production was not trained on the new distribution, and consequently it stops performing as well as it could. The model must then be retrained with the right adjustments, and this is where its dynamic nature becomes clear. Just like employees, ML models need to be trained and kept up to date to make sure they perform as we would like them to.
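To make the drift idea concrete, here is a minimal, illustrative sketch (not from the panel) of how a pipeline might flag when the mean of live data has moved too far from the training distribution. Real MLOps stacks typically use more robust statistical tests (e.g. Kolmogorov–Smirnov or population stability index), but the principle is the same: compare production data against the data the model was trained on, and trigger retraining when they diverge.

```python
import statistics

def detect_drift(train_values, live_values, z_threshold=3.0):
    """Flag drift when the live mean departs from the training mean
    by more than z_threshold training standard deviations.
    Illustrative only; production systems use stronger tests."""
    train_mean = statistics.fmean(train_values)
    train_std = statistics.stdev(train_values)
    live_mean = statistics.fmean(live_values)
    z_score = abs(live_mean - train_mean) / train_std
    return z_score > z_threshold

# Hypothetical sensor readings: training data centred around 10.0
train = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]

stable = [10.1, 9.9, 10.0, 10.2]    # still matches training
shifted = [13.0, 13.2, 12.8, 13.1]  # distribution has drifted upward

print(detect_drift(train, stable))   # False: no retraining needed
print(detect_drift(train, shifted))  # True: trigger retraining
```

In a closed-loop MLOps setup, a `True` result would kick off the retraining step described above rather than just print a flag.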
At T-DAB, we recognise the common challenges and barriers manufacturers face in building and scaling effective AI applications, and we use an ML accelerator to deliver value into the business and transition standard deployments to best-practice MLOps.
Stay tuned for the second article in the series, on the paradigm between cloud and edge AI, what it takes to strike the right balance, and what to look for.