The defining feature of Industry 4.0 is the advancing digitalization of production: machines and processes are becoming ever more intelligently networked, generating more and more data. Artificial intelligence (AI) is increasingly used to make profitable use of this flood of data, as it has the potential to extract information from the data that benefits both production and services. Possible application scenarios include predictive maintenance, process optimization and automation, and quality control. However, this potential of AI in Industry 4.0 cannot yet be fully exploited, because several technological barriers limit how information is generated and processed.

The three major problems with the use of AI in industry

The first obstacle is the so-called multi-vendor landscape found in many current production facilities: machines from different manufacturers and different technology generations with different - and often proprietary - communication interfaces and protocols. These heterogeneous structures prevent standardized data access; instead, there are many specific isolated solutions, each of which requires domain knowledge.

The second obstacle is the lack of support for data scientists. They usually lack domain knowledge and therefore need support in obtaining real-time or historical data. Added to this are incompatible, inconsistent and incomplete data sets as well as missing metadata. As a result, data preparation is often tedious, lengthy, manual and coordination-intensive.

Finally, the third obstacle is inflexible AI operation. AI applications are often run rigidly in the cloud or on a local server, so they cannot make optimum use of the available resources. In addition, updates to the AI applications are necessary to react appropriately to changes in the production facility or its processes. All of this challenges the further digitalization of Industry 4.0, and these problems therefore need a solution.

A framework for the data and AI lifecycle

To overcome these problems, researchers at the Fraunhofer Institute for Cognitive Systems IKS in Munich are working on an open, interoperable and technology-neutral framework that supports and optimizes the data and AI lifecycle. The aim of the project "REMORA - Multi-Stage Automated Continuous Delivery for AI-based Software & Services Development in Industry 4.0" is to ensure an automated, continuous and dynamic process. Specifically, the framework pursues the following goals: support for data scientists, automated and flexible AI integration, and automation of AI processes.

First step: the interface

The first step is to develop an interface for data scientists that supports the AI development process. This interface should make it possible to query data easily and uniformly without having to deal with technology-specific aspects such as communication interfaces and protocols; the mapping to the concrete technologies and the required data transformations is handled internally by the interface. It also provides an overview of the plant topology and the metadata, as well as an interface for training and operating an AI model. The interface can be used not only by data scientists but also by non-experts, for example in conjunction with an AutoML framework.
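To make the idea of such a uniform query layer more concrete, the following Python sketch shows one possible shape it could take. It is an illustrative assumption only: the names DataCatalog, MachineBackend, query() and the mention of OPC UA or Modbus adapters are hypothetical and do not describe the actual REMORA interface.

```python
# Hypothetical sketch of a uniform data-access layer for data scientists.
# All class and method names here are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime
from typing import Protocol


@dataclass
class SensorReading:
    machine_id: str
    signal: str
    timestamp: datetime
    value: float


class MachineBackend(Protocol):
    """Adapter that hides a vendor-specific protocol (e.g. OPC UA, Modbus)."""

    def read(self, signal: str, start: datetime, end: datetime) -> list[SensorReading]: ...


class DataCatalog:
    """Single entry point: query data by machine and signal,
    independent of the underlying communication technology."""

    def __init__(self) -> None:
        self._backends: dict[str, MachineBackend] = {}

    def register(self, machine_id: str, backend: MachineBackend) -> None:
        # The mapping to concrete protocols is handled by the adapters.
        self._backends[machine_id] = backend

    def query(self, machine_id: str, signal: str,
              start: datetime, end: datetime) -> list[SensorReading]:
        backend = self._backends[machine_id]
        readings = backend.read(signal, start, end)
        # Required data transformations (unit conversion, resampling)
        # would also be applied here, internally to the interface.
        return readings
```

In such a design, a data scientist only ever calls query(); each vendor-specific protocol is wrapped once in an adapter, which is what removes the need for domain knowledge at analysis time.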

Second step: integration of artificial intelligence

An application management component should then enable automated and flexible integration of AI into the Industry 4.0 environment - from component level up to the cloud - based on the required resources and the optimization goals. In addition, the AI application manager works together with the data interface to network the AI applications and thus ensure the flow of data.
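The following minimal sketch illustrates what "placement based on required resources and optimization goals" could mean in practice. The target names, the fields and the selection rule are assumptions for illustration, not the REMORA implementation.

```python
# Illustrative sketch: choosing a deployment target from component level
# up to the cloud. Targets, fields and the scoring rule are assumptions.
from dataclasses import dataclass


@dataclass
class Target:
    name: str          # e.g. "edge-gateway", "on-prem-server", "cloud"
    cpu_cores: float
    memory_gb: float
    latency_ms: float  # typical round-trip latency to the machine


@dataclass
class AIApp:
    name: str
    cpu_cores: float
    memory_gb: float
    max_latency_ms: float  # optimization goal: real-time constraint


def place(app: AIApp, targets: list[Target]) -> Target:
    """Pick a target that satisfies the app's resource and latency needs,
    preferring the one closest to the machine."""
    feasible = [t for t in targets
                if t.cpu_cores >= app.cpu_cores
                and t.memory_gb >= app.memory_gb
                and t.latency_ms <= app.max_latency_ms]
    if not feasible:
        raise RuntimeError(f"no deployment target can host {app.name}")
    return min(feasible, key=lambda t: t.latency_ms)


targets = [
    Target("edge-gateway", cpu_cores=4, memory_gb=8, latency_ms=5),
    Target("cloud", cpu_cores=64, memory_gb=256, latency_ms=80),
]
print(place(AIApp("anomaly-detector", 2, 4, max_latency_ms=20), targets).name)
# -> edge-gateway
```

A latency-critical application thus lands close to the machine, while a resource-hungry training job would fall through to the cloud; the data interface then wires the deployed application to its data sources.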

Third step: automation of AI processes

Finally, an AI management component should automate the AI processes, i.e. the automatic re-training and re-deployment of an AI model, to ensure continuous improvement of the data analysis. For example, new training data can be collected automatically when machines are replaced, so that a new AI model can be trained. Furthermore, automated actions can be triggered in response to the data analysis (e.g. cooling down in the event of overheating) or to increase the efficiency of real-time AI analysis (e.g. adjusting the sampling rate).
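A hedged sketch of such an automation loop is shown below. The trigger names, thresholds and the retrain/actuate/sampling hooks are hypothetical, chosen only to illustrate automatic re-training and rule-based reactions to analysis results as described above.

```python
# Sketch of an AI-management reaction loop; all names and thresholds
# are illustrative assumptions, not the REMORA implementation.
from dataclasses import dataclass
from typing import Callable


@dataclass
class AnalysisResult:
    machine_id: str
    temperature_c: float
    prediction_error: float  # drift indicator of the deployed model


def handle(result: AnalysisResult,
           retrain: Callable[[str], None],
           actuate: Callable[[str, str], None],
           set_sampling_hz: Callable[[str, float], None]) -> None:
    # Re-train and re-deploy when the model no longer fits the process,
    # e.g. after a machine has been replaced.
    if result.prediction_error > 0.2:
        retrain(result.machine_id)

    # React directly to the analysis, e.g. cool down on overheating.
    if result.temperature_c > 90.0:
        actuate(result.machine_id, "activate_cooling")
        # Sample faster while the machine is in a critical state.
        set_sampling_hz(result.machine_id, 10.0)
    else:
        set_sampling_hz(result.machine_id, 1.0)
```

The point of the sketch is the separation of concerns: the analysis only reports its result, while the management component decides whether to re-train, act on the plant, or adjust how data is collected.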
