In the following article, the author announces Google's open-source release of SyntaxNet, a technology that uses TensorFlow for NLU:
Google’s Parsey McParseface is a serious IQ boost for computers
SyntaxNet
How does this help with actual understanding? Is TensorFlow only being used to label POS and functional roles more accurately? Suppose we had a tool that was 100% accurate in labeling the POS and each word's role in a sentence: then what? What do we do with a correctly tagged and labeled parse tree? How do we use it to interface with an ontology or world knowledge, and reason about the user input in order to provide an insightful response based on the user's or the chatbot's goals?
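To make the question concrete, here is a minimal sketch of the kind of post-parse step I have in mind. It assumes the dependency parse has already been produced (by SyntaxNet or anything else) and is handed over as plain tuples; the triple extraction and the toy IS_A ontology are hypothetical illustrations of mine, not features of any of these tools:

```python
# Hypothetical sketch: turning a finished dependency parse into an
# ontology lookup. The parse format and the ontology are made up for
# illustration; SyntaxNet itself stops at the parse.

# One token per tuple: (index, word, pos, head_index, dependency_label)
# for the sentence "Rex chased the cat."
parse = [
    (0, "Rex",    "NOUN", 1, "nsubj"),
    (1, "chased", "VERB", 1, "ROOT"),
    (2, "the",    "DET",  3, "det"),
    (3, "cat",    "NOUN", 1, "dobj"),
]

# Toy world knowledge: word -> parent concept.
IS_A = {"Rex": "dog", "dog": "animal", "cat": "animal"}

def extract_svo(parse):
    """Pull a (subject, verb, object) triple out of the labeled parse."""
    subj = next(w for _, w, _, _, dep in parse if dep == "nsubj")
    verb = next(w for _, w, _, _, dep in parse if dep == "ROOT")
    obj = next(w for _, w, _, _, dep in parse if dep == "dobj")
    return subj, verb, obj

def generalize(word):
    """Walk the IS_A hierarchy so the bot can reason about categories."""
    chain = [word]
    while chain[-1] in IS_A:
        chain.append(IS_A[chain[-1]])
    return chain

subj, verb, obj = extract_svo(parse)
print(subj, verb, obj)    # Rex chased cat
print(generalize(subj))   # ['Rex', 'dog', 'animal']
```

Even this trivial lookup is already outside what SyntaxNet delivers, and that gap is exactly what I am asking about.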
I am not seeing how statistical training and neural nets provide real understanding in AI beyond linguistic POS and role tagging. Perhaps their use in sentiment analysis starts to cross over into some sort of semantic interpretation or understanding.
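Even there, the semantics is shallow. A toy lexicon-based scorer (entirely hypothetical, just to make the point concrete) produces a label that is about meaning rather than syntax, yet involves nothing like reasoning:

```python
# Hypothetical toy sentiment scorer: counts lexicon hits, nothing more.
# Real systems learn the signal statistically rather than from hand
# lists, but the output is the same kind of shallow label.

POSITIVE = {"great", "love", "insightful", "helpful"}
NEGATIVE = {"terrible", "hate", "useless", "broken"}

def sentiment(sentence):
    words = sentence.lower().strip(".!?").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this helpful parser"))   # positive
print(sentiment("The install is broken"))        # negative
```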
Are there any tools that actually address extracting meaning from sentences, or that match the input to conceptual knowledge beyond syntax and a word's function in a sentence? A tool that would make this easier than, say, hardcoding each use case in a scripting language according to the author's needs?
I know that ChatScript at least tags words with user-defined concepts in addition to linguistic concepts such as "Verb", "Noun", "Main Subject", "Main Object", etc. It also uses WordNet, and you can hierarchically define concepts and facts so that your chatbot has some ability to generalize and not every case has to be hardcoded.
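ChatScript has its own rule syntax, but roughly, the generalization mechanism I mean works like this Python sketch (the concept names and members here are my own examples, not ChatScript's built-ins):

```python
# Rough sketch of ChatScript-style generalization: user-defined
# concepts form a hierarchy, so one rule covers many words.
# Concept names and members are invented for illustration.

CONCEPTS = {
    "~pet":    {"dog", "cat", "hamster"},
    "~animal": {"~pet", "horse", "wolf"},
}

def in_concept(word, concept):
    """True if word is in the concept directly or via a nested sub-concept."""
    for member in CONCEPTS.get(concept, set()):
        if member == word:
            return True
        if member.startswith("~") and in_concept(word, member):
            return True
    return False

# One rule generalizes over every ~animal instead of hardcoding each word.
def respond(sentence):
    for word in sentence.lower().split():
        if in_concept(word, "~animal"):
            return f"Tell me more about your {word}."
    return "I see."

print(respond("my dog ate my homework"))   # Tell me more about your dog.
print(respond("my horse is fast"))         # Tell me more about your horse.
```

Because concepts nest, a single rule written against ~animal covers dogs, cats, and horses without a separate hardcoded case for each.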
Having a tool that can correctly parse a sentence is a good start, but are there any tools that assist with actual semantic reasoning and are already integrated within an NLU framework?