In the industrial production of goods it is crucial that, as far as possible, the end product is free of defects and works correctly. To this end, manufacturing processes are monitored systematically by humans, but increasingly also by sensors, cameras, software and hardware in the context of Industry 4.0. While humans naturally use all their senses, automated quality control has so far usually been limited to visual and physical criteria. The additional sensory dimension in humans does not offer only benefits, however. A worker can react quickly and shut down a machine as a precaution if it sounds abnormal, but every person perceives sounds differently. Whether something sounds wrong is therefore a subjective judgment, which increases the error rate.

Researchers at the Fraunhofer Institute for Digital Media Technology IDMT are therefore developing cognitive systems that use acoustic signals to identify defects precisely. At HANNOVER MESSE 2017, the IDMT is exhibiting a demonstration model that lets visitors experience how individual sounds from individual manufacturing processes are perceived and correctly attributed, how the data is processed and analyzed, and how it can be managed reliably.
The technology on show combines intelligent acoustic measuring technology and signal analysis, machine learning, and secure, flexible data storage. "We're integrating the intelligence of hearing into the machinery's industrial status control or automated product testing systems," says Steffen Holly from the IDMT Industrial Media Applications business unit. Once trained, cognitive systems can hear more objectively than the human ear: instead of just two ears, many thousands are available, in the form of millions of neutral datasets. The first pilot projects with industry are already underway, and in them the researchers have succeeded in detecting up to 99 percent of defects purely acoustically.
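The pipeline described here, acoustic measurement, signal analysis, and a model of what "normal" sounds like, can be sketched in minimal form. The sketch below is an illustration under stated assumptions, not IDMT's actual method: it assumes a hypothetical defect signature at 1200 Hz, measures that frequency's energy with the Goertzel algorithm, and learns a threshold from recordings of healthy machines. Real systems use far richer spectral features and trained classifiers.

```python
import math

def goertzel(samples, sample_rate, freq):
    """Goertzel algorithm: energy of a single frequency bin."""
    n = len(samples)
    k = int(0.5 + n * freq / sample_rate)  # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def tone(freqs, sr=8000, dur=0.25):
    """Synthetic machine sound: a sum of sine components (stand-in for a recording)."""
    n = int(sr * dur)
    return [sum(math.sin(2 * math.pi * f * i / sr) for f in freqs) for i in range(n)]

SR = 8000
MONITOR_FREQ = 1200.0  # hypothetical acoustic defect signature

# "Training": learn normal energy at the monitored frequency from healthy recordings
normal_recordings = [tone([440.0]) for _ in range(5)]
energies = [goertzel(s, SR, MONITOR_FREQ) for s in normal_recordings]
mean = sum(energies) / len(energies)
std = (sum((e - mean) ** 2 for e in energies) / len(energies)) ** 0.5
threshold = mean + 3.0 * std + 1e-9  # small margin for numerically identical samples

def is_defective(samples):
    """Flag a recording whose defect-band energy exceeds the learned threshold."""
    return goertzel(samples, SR, MONITOR_FREQ) > threshold

print(is_defective(tone([440.0])))          # healthy part: False
print(is_defective(tone([440.0, 1200.0])))  # part with rattling overtone: True
```

The design choice here mirrors the article's point about objectivity: once the threshold is fixed from data, every recording is judged by the same numeric criterion rather than by an individual listener's impression.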