Natural Language Processing’s Crazy Busy Start to 2019
March 15, 2019

by Gino Diño

Natural language processing (NLP) and natural language generation (NLG) continue to boom, powered by rapid advances in machine learning. Slator continually monitors NLP and NLG as the umbrella categories to which machine translation (MT) belongs, because developments in these areas may eventually impact the language services market.

Additionally, machine translation, along with language services and technology more broadly, is increasingly intersecting with the wider AI and machine learning scene. In January 2019, for instance, the Applied Machine Learning Days conference featured AI & Language as one of its four main tracks.

Before we launch breathlessly into an update on the most recent NLP launches and funding rounds, a recent MIT Technology Review interview with an NLP pioneer provides some much-needed perspective.

Boris Katz, principal research scientist at MIT and one of the earliest researchers to contribute to the ideas that underpin today's NLP and NLG, explained: “If you look at machine-learning advances, all the ideas came 20 to 25 years ago.”

So complex is language, according to Katz, that the virtual assistants most people would consider intelligent today are essentially “just counting words and numbers.”

He further explained that the technology of today has simply caught up to the ideas of the past. Moving forward, however, may require a fundamentally new approach...

Read the full article on Slator's website using the link below.
