Rob Toews’ article “Language is the next great frontier in AI”, published on Forbes.com in February this year, caught my attention because of my interest in the subject. My Ph.D. research was on Natural Language Processing and Text Mining. Building machines that can understand language has been a central goal of artificial intelligence for a very long time. If some predictions had come true, we would by now have machines fully capable of understanding human language. But the goal has proven “maddeningly elusive”, says Toews.

Yet, there have been major breakthroughs in language technologies in recent years. For example, I have been fascinated by the freely available Google Translate app: you can point your phone camera at foreign-language text, such as Chinese or Japanese, and get an instant translation.




“The technology is now at a critical inflection point, poised to make the leap from academic research to widespread real-world adoption. In the process, broad swaths of the business world and our daily lives will be transformed. Given language’s ubiquity, few areas of technology will have a more far-reaching impact on society in the years ahead”, adds Toews.

What are Language Technologies?

Language Technologies (LTs) are computer applications that help us do useful things with human language, whether in spoken or written form; hence, they are also known as Human Language Technologies (HLT). Text-to-speech converters, speech-to-text converters, text classification programs and machine translation systems are just a few examples of LTs.

LTs deal with the computational processing of human language to ease both our interaction with machines and the processing of large amounts of textual information.
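
To make one of the examples above concrete, here is a minimal text classification sketch in Python. It assumes the scikit-learn library and a toy four-sentence training set; the categories and messages are invented purely for illustration.

```python
# A minimal text classification sketch using scikit-learn (an assumed,
# illustrative choice; any machine-learning toolkit would work).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data: label short customer messages as "complaint" or "praise".
texts = [
    "the service was terrible",
    "I love this product",
    "my order arrived broken",
    "excellent support, thank you",
]
labels = ["complaint", "praise", "complaint", "praise"]

# Bag-of-words features fed into a Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["my order arrived damaged"]))  # prints ['complaint']
```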




Why Language Technologies matter

Why are advances in language technologies (LTs) important? The answer is simple. Without language, we cannot reason abstractly, develop complex ideas or communicate them to others. Civilization as we know it simply would not have evolved without language.

LTs form a key technology that will drive advances in AI and computing in general in the near future. An article in the Harvard Business Review published in September 2020 titled “The Next Big Breakthrough in AI Will Be Around Language” has this to say: “The 2010s produced breakthroughs in vision-enabled technologies, from accurate image searches on the web to computer vision systems for medical image analysis or for detecting defective parts in manufacturing and assembly, as we described extensively in our book and research. GPT3, developed by OpenAI, indicates that the 2020s will be about major advances in language-based AI tasks”.




“Imagine being able to talk to your car and have it respond intelligently, giving detailed advice on routes or summarising up-to-date news you just missed on the radio. Or, being able to speak or type queries to your Web search engine in ordinary language, just as you would ask a person, and have it return just the document you were looking for, perhaps in summarised form for easy reading, translated from another language and with the key points for your purposes highlighted. Some of these capabilities are already here, and others are on the horizon”, according to the Centre for Language Technology, Macquarie University, Australia. 

With advances in Machine Learning and Robotics, some predict that human-like robots will one day do our household chores. Advances in LTs could enable seamless two-way communication between humans and such robots, much like communication between two people.

Recent breakthroughs in LTs

The invention of the transformer by a group of Google researchers in 2017 is considered a major breakthrough. It is a new neural network architecture that has unleashed vast new possibilities in AI. “Transformers’ great innovation is to make language processing parallelized, meaning that all the tokens in a given body of text are analyzed at the same time rather than in sequence”, says Toews’ article mentioned above. Many innovations have since been built on top of Google’s original architecture, including Facebook’s RoBERTa model, released in 2019.
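
As a small, hands-on illustration, the sketch below loads a pretrained RoBERTa model and asks it to fill in a masked word. It assumes the Hugging Face transformers library (my choice for the example, not something mentioned in the article) and downloads the model the first time it runs.

```python
# A minimal sketch of using a pretrained transformer, here RoBERTa, through the
# Hugging Face "transformers" library (an assumed dependency).
from transformers import pipeline

# RoBERTa is trained with a masked-language-modelling objective: it predicts the
# word hidden behind <mask> while attending to all the other tokens at once.
fill_mask = pipeline("fill-mask", model="roberta-base")

for candidate in fill_mask("Language is the next great <mask> in AI."):
    print(candidate["token_str"], round(candidate["score"], 3))
```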




Some common applications of LTs

Text to Speech systems

As already pointed out, text-to-speech systems are quite well developed for major languages like English. One app I often use reads aloud the e-books I have downloaded in PDF format: it simply opens an e-book saved on my phone and reads it to me as I drive. The app is free, yet the quality is quite good; it doesn’t really feel like a robot reading to me. The app I am referring to is @Voice Aloud Reader (TTS Reader) by Hyperionics Technology, which you can download for free from the Play Store and try out.
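
For readers who want to experiment, the sketch below shows the idea behind such apps in a few lines of Python. It uses the gTTS library (an assumed, illustrative choice, not the app mentioned above), which sends the text to Google’s text-to-speech service and saves the result as an MP3 file.

```python
# A minimal text-to-speech sketch using the gTTS (Google Text-to-Speech) library.
from gtts import gTTS

text = "Text to speech systems read written text aloud."
gTTS(text=text, lang="en").save("speech.mp3")  # writes an MP3 file you can play back
```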

Speech to Text systems

These systems capture what you say and convert it to text. This is already possible on most smartphone keyboards, and you can also search Google by voice, which largely eliminates the need for typing.
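
A rough sketch of how this looks in code is given below, using the SpeechRecognition library and a microphone (both assumed for illustration); the actual recognition is done by Google’s free Web Speech API.

```python
# A minimal speech-to-text sketch using the SpeechRecognition library.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:           # microphone access requires PyAudio
    print("Say something...")
    audio = recognizer.listen(source)     # record until a pause is detected

# Send the recording to Google's Web Speech API and print the transcript.
print(recognizer.recognize_google(audio))
```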




Chatbots

Chatbots are used by many companies to answer common customer queries. In 2019, the media widely reported that a Chinese software engineer had designed a chatbot to chat with his girlfriend while he was busy at work.
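
At their simplest, such bots can be little more than keyword matching. The toy sketch below shows that idea; the keywords, replies and web address are invented purely for illustration, and real customer-service bots use far richer language understanding.

```python
# A deliberately simple, rule-based customer-service chatbot sketch.
RULES = {
    "price": "Our pricing is listed at example.com/pricing.",
    "refund": "Refunds are processed within five working days.",
    "delivery": "Standard delivery takes two to three working days.",
}

def reply(message: str) -> str:
    """Return a canned answer for the first keyword found in the message."""
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return "Sorry, I did not understand that. A human agent will contact you."

print(reply("When will my delivery arrive?"))   # prints the delivery answer
print(reply("How do I get a refund?"))          # prints the refund answer
```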

Spoken Language Dialog systems

These systems enable you to talk to a computer over the telephone. “These can be used to call up on the phone and talk to a machine in order to buy or sell stocks and shares, or to get route directions from one city to another”, according to sources.

Automatic code generation

OpenAI has announced Codex, a transformer-based model that can write computer code astonishingly well. Users can give it a plain-English description of a command or function, and Codex turns the description into working code.
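
To give a feel for what that means, here is the kind of input-output pair involved: a plain-English description (the docstring) and the sort of Python function a code-generation model might produce from it. The function shown is hand-written for illustration, not an actual Codex response.

```python
# Illustration only: a plain-English description and the kind of code a model
# like Codex might generate from it (this example is hand-written).
def average_word_length(sentence: str) -> float:
    """Return the average length of the words in a sentence."""
    words = sentence.split()
    return sum(len(word) for word in words) / len(words) if words else 0.0

print(average_word_length("Language is the next great frontier in AI"))  # 4.25
```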




Machine Translation

Machine translation technology takes a document in one language and translates it into another language. The best-known example is Google Translate, which I am sure most of you have already tried. Machine translation is still not perfect, but it is improving. A day may not be far off when the need to learn another language is eliminated entirely!
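
For those who prefer a programmable alternative to the Google Translate website, the sketch below translates a sentence from English to French using a freely available pretrained model via the Hugging Face transformers library (an assumed choice for illustration).

```python
# A minimal machine-translation sketch using a pretrained Marian model through
# the Hugging Face "transformers" library (downloads the model on first run).
from transformers import pipeline

translate = pipeline("translation", model="Helsinki-NLP/opus-mt-en-fr")
result = translate("Machine translation is still not perfect, but it is improving.")
print(result[0]["translation_text"])
```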

What about LTs for Dzongkha?

We have to carry out NLP research for Dzongkha and develop language technologies that can promote its use and help preserve it. Sadly, big research projects take little interest in minority languages like Dzongkha because the potential market for products or services in such languages is small.
