Maintenance Mindset: How generative AI could transform manufacturing maintenance and operations
Welcome to Maintenance Mindset, our editors’ takes on things going on in the worlds of manufacturing and asset management that deserve some extra attention. This will appear regularly in the Member’s Only section of the site.
We’ve all seen firsthand, or at least heard about, the potential of generative artificial intelligence (AI), and every industry is experimenting with AI as an assistant, manufacturers included. I recently talked with machine builders, system integrators and machine component vendors about how they are using generative AI and how their manufacturing customers are thinking about this new tool.
In the world of controls engineering, generative AI is making strides at easing the programming load and assisting in machine design. Manufacturers are also exploring how to use generative AI to help troubleshoot equipment issues in industrial settings in real time. The potential is considerable, but the industry faces some hurdles to scaling this kind of technology.
If you need a refresher on the difference between artificial intelligence, machine learning (ML), natural language processing, deep learning or large language models (LLMs), this article from our sister publication Control Design explores their interdependence. Many are familiar with popular generative AI chatbots and LLMs like OpenAI’s ChatGPT or Google’s Gemini, but they won’t do much to help with industrial machine programming or equipment troubleshooting yet, since those models are trained on a finite set of data. It is an enormous amount of data (about 570 GB, or approximately 300 billion words), but finite all the same. That training is what makes the models so effective at reproducing human language and generating mostly accurate text, but also what makes them less effective at tasks like troubleshooting site-specific industrial equipment. And what makes them occasionally hallucinate.
Maintaining and operating a generative AI tool for manufacturing, and specifically for equipment maintenance monitoring and recommendations, takes a highly customized AI. That AI tool needs real-time updates, not a static chatbot trained on a finite amount of data, no matter how large.
What is retrieval-augmented generation (RAG)?
So to continually update and customize LLMs, we need retrieval-augmented generation (RAG), another AI framework. Generative AI tools have also been known to hallucinate incorrect or misleading information, so they should be used as assistants rather than replacements, but RAG can help mitigate some of the risk of inaccuracy. RAG models work with LLMs to make responses more accurate and relevant by searching external data sources and pre-processing information and prompts for the LLM. That makes the approach particularly suited to industrial environments with vast amounts of historical and ever-changing equipment data.
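To make the idea concrete, here is a minimal sketch of that retrieve-then-generate pattern in Python. It is illustrative only: the keyword-overlap scoring stands in for the vector search a production RAG system would use, and call_llm is a hypothetical placeholder for whatever model endpoint a plant actually deploys.

```python
# Minimal retrieval-augmented generation sketch (illustrative only).
# A real system would use embeddings and a vector database; simple
# keyword overlap stands in for that retrieval step here.

KNOWLEDGE_BASE = [
    "Pump P-101 manual: high vibration is often caused by cavitation or a worn bearing.",
    "Maintenance log 2024-03-12: replaced P-101 outboard bearing after vibration alarm.",
    "Operating guideline: verify suction pressure before restarting any centrifugal pump.",
]

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Rank knowledge-base entries by how many question words they share."""
    words = set(question.lower().split())
    scored = sorted(KNOWLEDGE_BASE,
                    key=lambda doc: len(words & set(doc.lower().split())),
                    reverse=True)
    return scored[:top_k]

def build_prompt(question: str) -> str:
    """Prepend the retrieved plant-specific context to the user's question."""
    context = "\n".join(retrieve(question))
    return f"Use only this plant context:\n{context}\n\nQuestion: {question}"

def call_llm(prompt: str) -> str:
    # Hypothetical placeholder: swap in whichever LLM endpoint the site uses.
    return f"[LLM response grounded in a prompt of {len(prompt)} characters]"

print(call_llm(build_prompt("Why is pump P-101 vibrating?")))
```

The key point of the pattern is that the model answers from plant-specific context retrieved at question time, not just from whatever was frozen into its training data.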
Last summer, at the 7th European Conference on Industrial Engineering and Operations Management, held July 16 to 18 in Augsburg, Germany, Ali Narimani and Steffen Klarmann of Valeo Sensors and Switches published a paper on the “Integration of Large Language Models for Real-Time Troubleshooting in Industrial Environments based on Retrieval-Augmented Generation (RAG).”
“RAG models allow one to build recommendations that are specific and actionable on a huge database of knowledge and information confined to the domain that is captured inside technical manuals and sensor-based, real-time data,” the authors say.
Industrial settings are challenging, dynamic environments, and root cause analysis is complex because of the many different technologies involved and the cascading relationships between equipment and processes. Narimani and Klarmann outline some of the other challenges for industrial use of RAG models, such as data overload from the sheer volume of data, poor-quality data, and outliers. In many facilities, operational reliance on human expertise, combined with the shortage of skilled workers, means worker knowledge remains with the workers instead of in the system. Many facilities also have outdated or static documentation. Troubleshooting requires real-time analysis, but integrating many different kinds of technologies and legacy equipment becomes an interoperability issue across machine and component vendors, communication protocols, and standards.
Building a knowledge library
In many cases, the needed information is out there, but it needs to be properly gathered and organized, a process the authors call “creating the knowledge library.” For plants and factories, this includes technical manuals that provide detailed information on equipment operation and maintenance, operational guidelines outlining standard procedures, historical maintenance records documenting past repairs and interventions, and sensor data logs capturing real-time operational metrics.
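As a rough illustration of what that knowledge library might look like in practice, the sketch below normalizes those four source types into one searchable record format. The field names, asset tags, and example entries are assumptions made for illustration, not anything specified in the paper.

```python
# Sketch of a unified "knowledge library" record (assumed structure): manuals,
# guidelines, maintenance records, and sensor logs all become documents the
# retriever can search the same way.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class KnowledgeRecord:
    source_type: str   # "manual", "guideline", "maintenance_record", "sensor_log"
    asset_id: str      # equipment tag the record refers to
    timestamp: datetime
    text: str          # the searchable content handed to the retriever

library = [
    KnowledgeRecord("manual", "P-101", datetime(2022, 1, 5),
                    "Seal replacement procedure for the model X centrifugal pump."),
    KnowledgeRecord("maintenance_record", "P-101", datetime(2024, 3, 12),
                    "Replaced outboard bearing after repeated vibration alarms."),
    KnowledgeRecord("sensor_log", "P-101", datetime(2024, 6, 1),
                    "Vibration RMS 7.2 mm/s, above the 4.5 mm/s alert threshold."),
]

# Filter by asset so retrieval only searches records tied to the equipment in question.
p101_docs = [r.text for r in library if r.asset_id == "P-101"]
print(len(p101_docs), "documents indexed for P-101")
```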
The LLM will need more than the raw data, so the information needs to be organized and properly correlated, and the model then fine-tuned and tested with domain-specific use cases. “Through this training, the generator model learns the specific language and terminologies used in the industrial setting, understands typical problems that may arise, and becomes familiar with effective solutions that have been employed in the past. As a result, the model is better equipped to generate high-quality, relevant responses that are tailored to the nuances of the industrial environment,” Narimani and Klarmann say.
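One simple way to picture that domain-specific testing step: hold out a handful of known troubleshooting cases and check whether the tuned system's answers point to the expected fix. The test cases and the answer function below are hypothetical stand-ins for a real plant's evaluation set and its tuned RAG pipeline.

```python
# Hypothetical spot-check of a tuned RAG pipeline against known troubleshooting cases.
test_cases = [
    {"question": "Why is pump P-101 vibrating?", "expected_keyword": "bearing"},
    {"question": "What should I check before restarting P-101?", "expected_keyword": "suction pressure"},
]

def answer(question: str) -> str:
    # Stand-in for the tuned retrieve-and-generate pipeline sketched above.
    return "Check the outboard bearing and verify suction pressure before restart."

passed = sum(1 for case in test_cases
             if case["expected_keyword"] in answer(case["question"]).lower())
print(f"{passed}/{len(test_cases)} domain test cases passed")
```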
If you’ve experimented with ChatGPT or other generative AI chatbots, you know they can produce amazing results, and they can also regurgitate nonsense. They can sound like they don’t know what they’re talking about because, in a sense, they don’t. LLMs understand language only in terms of how words relate to each other statistically; they don’t really understand what words mean in context. RAG can supplement an LLM with external sources and a structured framework, and help it sound more like it knows what it’s talking about.