
HMG Develops the World’s First AI-Based Diagnostics for Fault Detection

- Detects and diagnoses abnormal powertrain noise
- Targets diagnostic accuracy exceeding 90%, a critical breakthrough for the automobile maintenance industry
- Likely to boost consumer confidence and reduce car maintenance costs

Experienced mechanics can often identify very specific faults in a car just from the sound of an idling engine. They are attuned to the baseline sounds of a well-maintained machine, so anything abnormal stands out to the trained ear. Diagnosing faults by sound or vibration is common practice: the automotive sound scope is a staple of the mechanic’s toolbox, and when one is not available, a long flat screwdriver held against the engine serves a similar purpose. Building on this principle, the Hyundai and Kia Motors Namyang R&D Center has developed an AI-powered fault detection and diagnosis system that is now nearly complete. Finishing touches are being put on the system, with plans to deploy it to the front lines of automobile service. Here is a closer look at the system’s core ideas and the work that enabled its development.

Listening comprehension for auto faults

Every sound has distinct, measurable properties: wavelength (or frequency) and amplitude can be identified objectively. Animals, for example, are known to produce wavelength signatures more complex than those of humans, and automobiles produce sounds that are more complex still. A car is an intricate machine whose components work together, generating many different sounds at the same time. Think of the powertrain’s hum as a chorus of individual parts; a fault in even a minor component introduces dissonance. Can a skilled listener pick out which component is at fault from that sound alone? It is a difficult challenge even for the most experienced mechanics, because the sounds coming from the engine room or powertrain are far more varied and nuanced than human (or even animal) hearing can fully resolve. Can artificial intelligence make the diagnosis? The Engine NVH Research Lab at the Hyundai and Kia Motors Namyang R&D Center has developed a technology that lets an AI comprehensively learn automobile sounds and identify which component is faulty.
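
To make “measurable properties of sound” concrete, here is a minimal sketch that mixes a few made-up component tones into one synthetic signal and recovers their frequencies with an FFT. It uses no real engine audio, and every frequency and amplitude below is an illustrative assumption, not HMG’s signal processing.

```python
# Minimal sketch: separating a mixed "powertrain" signal into frequency components.
import numpy as np

fs = 44_100                       # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)     # one second of synthetic audio

# Pretend each component hums at its own frequency (all values illustrative).
engine_order = 0.8 * np.sin(2 * np.pi * 120 * t)    # e.g. firing frequency
belt_whine   = 0.3 * np.sin(2 * np.pi * 950 * t)    # e.g. accessory belt
fault_tone   = 0.1 * np.sin(2 * np.pi * 3200 * t)   # hypothetical fault noise
signal = engine_order + belt_whine + fault_tone + 0.05 * np.random.randn(t.size)

# The FFT turns the mixture back into peaks at 120, 950 and 3200 Hz.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
peaks = freqs[np.argsort(spectrum)[-3:]]
print(sorted(peaks.round()))      # expected: [120.0, 950.0, 3200.0]
```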

How AI distinguishes sounds

The human brain can easily tell cats from dogs or apples from oranges, but an AI must be taught with countless example images. That is a gross simplification, but it captures the general idea of machine learning, and earlier AI development relied on exactly this approach. With copious amounts of data now available, deep learning has become possible, in which the AI essentially teaches itself which characteristics distinguish one thing from another. AlphaGo is a well-known example of a deep learning AI: it used Qipu (also known as Kifu, the game records of Go matches) to teach itself to play Go at a level previously thought unimaginable. High-quality data is essential for deep learning; without a sufficient body of data to draw from, it cannot reach its potential. An automobile fault-diagnosing AI would, of course, need a rich body of audio data.
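
As a toy illustration of that supervised-learning idea, the sketch below trains a generic classifier on synthetic “normal” and “faulty” feature vectors. None of this is the lab’s data or model; the point is only that the classifier can tell the two apart because it was first shown many labeled examples.

```python
# Minimal sketch: learning "normal" vs. "faulty" from labeled examples (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 16))   # features of healthy engines (made up)
faulty = rng.normal(loc=1.5, scale=1.0, size=(500, 16))   # features of fault-induced engines (made up)
X = np.vstack([normal, faulty])
y = np.array([0] * 500 + [1] * 500)                        # labels: 0 = normal, 1 = faulty

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
print(clf.predict(rng.normal(1.5, 1.0, size=(1, 16))))     # expected: [1], i.e. "faulty"
```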

After years of research, the engineers at the Engine NVH Research Lab succeeded in extracting high-quality data, based on a technique that accurately establishes baseline and faulty engine sounds by category. To obtain raw data suitable for AI learning, the engineers recorded sounds from various parts of engines ranging from fully functional units to engines with deliberately induced faults. Once these recordings were processed, analyzed, and categorized, they became part of a growing database for the AI to learn from.
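
As an illustration of that processing step, here is a minimal sketch of how one recording could become a labeled entry in such a database. The WAV input, label, and spectrogram settings are assumptions for the example, not HMG’s actual tooling.

```python
# Minimal sketch: turning a raw recording into one labeled training example.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def make_training_example(wav_path: str, label: str):
    fs, audio = wavfile.read(wav_path)             # e.g. "engine_idle_001.wav" (hypothetical file)
    if audio.ndim > 1:                             # collapse stereo to mono
        audio = audio.mean(axis=1)
    # Time-frequency representation: a common input format for audio deep learning.
    freqs, times, sxx = spectrogram(audio, fs=fs, nperseg=1024, noverlap=512)
    features = np.log1p(sxx)                       # compress the dynamic range
    return {"features": features, "label": label}

# Each processed recording becomes one row in the growing database, e.g.:
# database.append(make_training_example("engine_idle_001.wav", "piston ring noise"))
```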

The engineers at the Engine NVH Research Lab have collected 830 sound samples so far, categorized into 18 types and 44 sub-types according to the component involved and the nature of the fault. For example, a common fault type such as piston noise can be further divided into piston ring noise, piston friction noise, and so on. Once the AI has learned enough of the variations an engine can produce, it begins to recognize similar sound patterns and to infer a diagnosis from what its sound sensors “hear”.
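
A minimal sketch of what such a classifier could look like, assuming a spectrogram input and a small convolutional network; the architecture and layer sizes are assumptions, not the lab’s model. Only the 44 sub-type count comes from the article, and a sub-type prediction could then be mapped back to one of the 18 coarser types with a simple lookup table.

```python
# Minimal sketch: a CNN mapping a log-spectrogram to one of 44 fault sub-types.
import torch
import torch.nn as nn

NUM_SUBTYPES = 44   # from the article; everything else below is assumed

class FaultSoundClassifier(nn.Module):
    def __init__(self, num_classes: int = NUM_SUBTYPES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, num_classes)

    def forward(self, x):                        # x: (batch, 1, freq_bins, time_steps)
        x = self.features(x)
        return self.classifier(x.flatten(1))     # raw score per fault sub-type

model = FaultSoundClassifier()
dummy_spectrogram = torch.randn(1, 1, 128, 256)  # one preprocessed recording (random stand-in)
print(model(dummy_spectrogram).shape)            # torch.Size([1, 44])
```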

How accurate is AI?

What matters here is the accuracy of the sound analysis, and whether its efficiency outweighs what a human mechanic can provide. Recently, about a dozen engine-noise experts went head-to-head with the lab’s AI in a sound-analysis challenge. The result was as stunning as the AlphaGo matches: the experts’ correct-diagnosis rate was only 8.6%, while the AI’s accuracy was an overwhelming 87.6%, roughly ten times higher. The AI’s accuracy will only increase as more data is collected and better inferences can be drawn from it, and highly accurate diagnostics can dramatically improve the maintenance experience.

Today, pinpointing a failure’s cause and the component responsible takes hours, sometimes days, of skilled engineers combing over every inch of the machine; an accurate human diagnosis requires a thorough checkup, often with time-consuming disassembly for confirmation. With the AI tool, the technician simply places its contact surface on the engine body for an immediate analysis. Based on that analysis of the input sounds, the AI reports the most likely causes of the abnormal sound, up to three of them, ranked from most to least likely. For example, it might attribute a faulty powertrain sound to the turbocharger with 87% probability, the transmission with 12%, and a valve with 1%.
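
For illustration, here is a minimal sketch of how raw model scores could be turned into that kind of ranked probability readout. The component names and scores are made up to mirror the article’s example and are not real diagnostic output.

```python
# Minimal sketch: converting raw scores into a ranked top-3 probability readout.
import numpy as np

components = ["turbocharger", "transmission", "valve", "piston", "belt"]
raw_scores = np.array([4.1, 2.1, -0.3, -1.0, -2.5])       # hypothetical model output

probs = np.exp(raw_scores) / np.exp(raw_scores).sum()      # softmax over components
top3 = np.argsort(probs)[::-1][:3]                         # indices of the three largest
for idx in top3:
    print(f"{components[idx]}: {probs[idx]:.0%}")
# prints roughly: turbocharger: 87%, transmission: 12%, valve: 1%
```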

Mechanics and service experts can then redirect their time and effort to more pressing aspects of repair and maintenance, and the vehicle is serviced and returned to its owner more quickly, which is always good for customer service. This approach to fault detection and diagnosis can readily be extended beyond Hyundai and Kia vehicles to other automakers’ combustion systems, and even to electric cars, aircraft, tankers, and trains. Hyundai Motor and Kia Motors have filed patent applications for this sound analysis technology in Korea, Germany, and the United States.

This AI-based powertrain fault detection and diagnosis technology currently works at approximately 88% accuracy. The lab’s engineers aim to surpass 90% within the year and then release the technology to automobile service centers. We might see mechanics holding a microphone up to the engine room for an in-depth fault “interview” as soon as 2019.