In the real world, business operations generate data continuously and exponentially. In the virtual world, artificial intelligence translates real-world data into actionable evidence, but nuances of language are easily lost in translation. If we can teach computers to understand syntax, can we improve information intelligence?
A colleague of mine likes to say that understanding language is easier said than done. It’s an idea I’d find easy to dismiss, if not for the fact that his catchphrase itself is evidence that the spoken and written word is often more than meets the eye – or the AI.
“We have forgotten this as adults, but we spent several years at school learning how to read and write,” François-Régis Chaumartin, founder of natural language processing (NLP) software company Proxem told me when I first met him in 2020. “Why is it so difficult? Because ambiguities are omnipresent in human language. A single word can describe many concepts, and a given concept can be described by many words.”
Human language is beautiful, but its intricacies—the slang, clichés, turns of phrase, ironies and homonyms, just to name a few—have proven difficult for computers to comprehend. Around the world, each language and dialect also has its own nuances, leaving room for error with every translation and interpretation. That’s why NLP has been an elusive achievement in data science and AI for many years.
As much as 80% of a company’s knowledge is implicit and hidden in text documents, including regulations, requirements, contracts, emails and social media. Without proper semantic analysis, weak signals are easily overlooked. Misspellings, industry-speak and internet shorthand lead to miscategorization or, worse, disregard of valuable data. Being able to turn mountains of textual data into accurate sets of related concepts and actionable insights at lightning speed is a key business advantage.
Strong semantic intelligence that combines natural language and machine learning technologies automates knowledge interpretation, transforming it from implicit to explicit. When explicit knowledge is available to everyone, the business forges closer connections with its consumers, patients, partners or employees, capturing and contextualizing insights from their experiences and expectations. Because NLP translates huge information flows into clear, sharp insights and trends, individual experiences become collective, reusable knowledge. These insights and trends can be used not only to improve operational excellence, but also to help identify opportunities for innovation. Innovation, in turn, increases customer satisfaction and loyalty and, thus, improves business performance.
Historically, NLP has been used for analyzing reports, responding to customer feedback and improving search results. But now we’re seeing advanced applications that enable predictive maintenance of equipment and improve operating efficiencies. For example, the ability to automatically identify similar customer claims enables early detection of quality defects in any industry. Thanks to NLP, we can layer unlimited knowledge and know-how into virtual twin experiences – the accurate 3D experiences that allow designers and engineers to create and test new products and processes without breaking them – leading to endless possibilities.
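To make the claim-similarity idea concrete, here is a minimal, hypothetical sketch (not the approach any specific product uses) of grouping customer claims by textual similarity. It uses a simple bag-of-words cosine similarity with an assumed threshold of 0.3; production systems would use richer semantic representations, but the principle of flagging near-duplicate claims is the same.

```python
from collections import Counter
import math

def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two short texts (0.0 to 1.0)."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

# Illustrative claims only; real data would come from a claims system.
claims = [
    "Paint peeling on the driver-side door after six months",
    "Door paint is peeling off on the driver side",
    "Engine makes a rattling noise on cold starts",
]

# Flag pairs above an assumed similarity threshold as potential duplicates,
# which could signal a recurring quality defect.
THRESHOLD = 0.3
for i in range(len(claims)):
    for j in range(i + 1, len(claims)):
        if cosine_similarity(claims[i], claims[j]) >= THRESHOLD:
            print(f"Similar claims: {claims[i]!r} <-> {claims[j]!r}")
```

Run as-is, this flags the two paint-peeling claims as similar while leaving the engine-noise claim ungrouped, illustrating how clusters of related claims can surface a defect early.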
We’re in the age of Big Data, but not all the information we need is quite at our fingertips. As we inch closer to teaching AI to truly understand human language, we also begin to unlock the potential for businesses to be even more intelligent, innovative and customer-centric.
Morgan Zimmermann, Chief Executive Officer, Dassault Systèmes NETVIBES
Learn how companies are using NLP to improve business intelligence and customer experience