What is the big deal with natural language processing?


Recently here at the University of Manchester, in a class attended by all PhD students, we realized that almost half of the students in the group were doing some kind of natural language processing, and almost everyone was doing something related to machine learning (even the hardware people are building neural-network-like multi-processor architectures). Unfortunately, these efforts are not joint but are spread across several research groups (the NLP and text mining research group, the National Centre for Text Mining with its own research students, and probably one more group). Still, there is a lot of effort going on here around natural language understanding. So what is the big deal? Why are so many projects funded in this particular field? I cannot say how competent I am to give an answer, but I will try. My own PhD project is in text mining.


What is Natural language processing?

Natural language processing (NLP) is a field of computer science, artificial intelligence, and linguistics concerned with the interactions between computers and human (natural) languages. As such, NLP is related to the area of human–computer interaction. Many challenges in NLP involve natural language understanding, that is, enabling computers to derive meaning from human or natural language input; others involve natural language generation. In other words, NLP deals with analyzing, understanding, and generating the languages that humans use naturally, so that people can interface with computers, in both written and spoken contexts, using human languages instead of computer languages. Although the field has its roots in the 1950s, and one of its pioneers was Alan Turing, many challenges remain unsolved. In recent years natural language processing has seen its first commercial applications. Google’s whole business is built on NLP, and Facebook, Yahoo!, Microsoft, and many others use it too. NLP is also the key technology behind Apple’s Siri.

Most NLP applications, such as information extraction, machine translation, sentiment analysis, and question answering, require both syntactic and semantic analysis at various levels.
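To make the idea of "analysis at various levels" concrete, here is a minimal sketch in plain Python. The tiny part-of-speech lexicon and the relation rule are invented for illustration only; real systems use trained taggers and parsers rather than lookup tables.

```python
# A toy illustration of NLP analysis levels: tokenization (syntax),
# part-of-speech tagging (syntax), and crude relation extraction (semantics).
import re

# Toy POS lexicon -- an assumption for illustration, not a real tagger.
POS_LEXICON = {
    "the": "DET", "cat": "NOUN", "sat": "VERB", "on": "ADP", "mat": "NOUN",
}

def tokenize(text):
    """Syntactic level 1: split text into lowercase word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def pos_tag(tokens):
    """Syntactic level 2: assign part-of-speech tags from the lexicon."""
    return [(t, POS_LEXICON.get(t, "UNK")) for t in tokens]

def extract_relation(tagged):
    """Semantic level: pull out a crude (subject, verb) pair."""
    nouns = [w for w, tag in tagged if tag == "NOUN"]
    verbs = [w for w, tag in tagged if tag == "VERB"]
    return (nouns[0], verbs[0]) if nouns and verbs else None

tagged = pos_tag(tokenize("The cat sat on the mat."))
print(tagged)
print(extract_relation(tagged))  # ('cat', 'sat')
```

Each level builds on the one below it, which is why applications like question answering need the whole stack rather than any single step.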

What are the benefits?

Better communication – Natural language processing helps us communicate with machines. Even now, there are applications like Siri or Google Now that can understand and respond to queries in natural language. These applications are not perfect yet, but they are getting better. Quite soon, as the field advances, we will be able to communicate with machines as naturally as we do with humans. This means better usability and more intuitive applications.

Less work – Computers have already done a great job of reducing the effort needed for simple and repetitive tasks. But in archiving, searching, and finding information, they can do even more and reduce human effort to a bare minimum. The subfield of summarization can greatly help when people need to read many long articles, cutting down the reading load for slow readers. Many applications use natural language processing to work out the semantics of the data a user provides, so they can serve the content the user is actually interested in. And there are many more applications of NLP that reduce human work by anywhere from 30% to 90%.
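The summarization mentioned above can be sketched in a few lines: a frequency-based extractive summarizer scores each sentence by how often its words occur in the whole text and keeps the top sentences. The stop-word list and scoring here are simplistic assumptions for illustration, not a production method.

```python
# A minimal frequency-based extractive summarizer: score sentences by the
# corpus frequency of their content words, keep the n best in original order.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "is", "are", "of", "to", "and", "in", "it"}

def summarize(text, n_sentences=1):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Count content words across the whole text.
    words = [w for w in re.findall(r"[a-z]+", text.lower())
             if w not in STOP_WORDS]
    freq = Counter(words)

    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z]+", sentence.lower()))

    ranked = sorted(sentences, key=score, reverse=True)
    chosen = set(ranked[:n_sentences])
    # Emit the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in chosen)

print(summarize("NLP is useful. NLP reduces work. Cats are nice."))
```

Real summarizers add sentence position, redundancy handling, and often abstractive generation, but the core idea of ranking sentences by informativeness is the same.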

Research and hypothesis generation – Natural language processing can find new knowledge in large collections of scientific papers. This is already used in biomedicine to generate hypotheses about relationships between diseases and genes; it is the first application of this kind I am aware of. However, text mining and natural language processing can generate hypotheses in other fields as well, giving researchers more room to do research. This can also be applied outside science: for example, computers can scan evidence and support decision making.
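The simplest form of this literature-based hypothesis generation is co-occurrence mining: count how often a gene and a disease are mentioned in the same abstract, and treat frequent pairs as candidate relationships. The abstracts and term lists below are invented for illustration; real systems work over millions of PubMed abstracts with proper named-entity recognition.

```python
# A toy co-occurrence miner for gene-disease hypothesis generation.
from collections import Counter
from itertools import product

# Invented mini-corpus and term lists (assumptions for illustration).
abstracts = [
    "brca1 mutation linked to breast cancer risk",
    "tp53 is studied in breast cancer and lung cancer",
    "brca1 expression observed in lung cancer samples",
]
genes = {"brca1", "tp53"}
diseases = {"breast cancer", "lung cancer"}

pairs = Counter()
for text in abstracts:
    for gene, disease in product(genes, diseases):
        if gene in text and disease in text:
            pairs[(gene, disease)] += 1

for (gene, disease), count in pairs.most_common():
    print(f"{gene} <-> {disease}: co-mentioned in {count} abstract(s)")
```

Co-occurrence alone is noisy, so real pipelines add statistical significance tests and relation extraction, but even this crude counting surfaces pairs worth a researcher's attention.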

As the examples above show, natural language processing is a great tool that can make life easier and help science progress. Its applications in biomedicine are starting to save lives, and there is more to come in the near future, especially considering how much work is currently being done in the field. It is also worth mentioning that the text mining and natural language market grows worldwide by more than 20% every year.

Born in Bratislava, Slovakia, he lived in Belgrade, Serbia, then in Manchester, UK, and is now visiting the world. Nikola is a great enthusiast of AI, natural language processing, machine learning, web application security, open source, and mobile and web technologies, looking forward to creating the future. Nikola did his PhD in natural language processing and machine learning at the University of Manchester, where he worked for two years. In 2020, Nikola moved to Berlin and works in Bayer Pharma R&D as a computational scientist.
