Google does a great deal of research into natural language processing and synthesis, but not every project has to be a new Assistant feature or voice improvement. The company has a little fun now and then, when the master AI permits it, and today it has posted a few web experiments that let you engage with its word-association systems in a playful way.

First is an interesting way of searching through Google Books, that fabulous database so rarely mentioned these days. Instead of searching for text or a title verbatim, you can ask questions like "Why was Napoleon exiled?" or "What is the nature of consciousness?" It returns passages from books that, based on their language alone, are closely associated with your question.

And while the results are hit and miss, they are nicely flexible. Sentences answering my questions appeared even when they were not directly adjacent to keywords or particularly specific about answering them. I found, however, that it's not a very intuitive way to interact with a body of knowledge, at least for me. When I ask a question, I generally want to receive an answer, not an assortment of quotes that may or may not bear on my inquiry. So while I can't really picture using this regularly, it's an interesting way to demonstrate the flexibility of the semantic engine at work here. And it may very well expose you to some new authors, though the 100,000 books included in the database are something of a mixed bag.

The second project Google highlights is a game it calls Semantris, though I must say it's rather too simple to deserve the -tris moniker. You're given a list of words with one in particular highlighted. You type the word you most associate with it, and the list reorders with, as Google's AI understands it, the closest matches to your word at the bottom. If you move the target word to the bottom, the game blows up a few words and adds some more.
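The reordering mechanic can be sketched in a few lines: rank the on-screen words by how similar their embeddings are to the player's typed clue, with the best matches sinking to the bottom. The tiny hand-made vectors below are purely illustrative stand-ins for a trained word-embedding model, and `reorder` is a hypothetical helper, not Google's actual implementation.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-d "embeddings" standing in for a real word-vector model.
EMBED = {
    "boat":  (0.9, 0.1, 0.0),
    "water": (0.8, 0.2, 0.1),
    "desk":  (0.1, 0.9, 0.2),
    "lamp":  (0.0, 0.8, 0.5),
}

def reorder(board, clue):
    """Sort the board so the words most similar to the player's
    clue sink to the bottom (end of the list)."""
    return sorted(board, key=lambda w: cosine(EMBED[w], EMBED[clue]))

print(reorder(["water", "desk", "lamp"], "boat"))  # "water" ends up last
```

Typing "boat" pulls "water" to the bottom of the list; in the real game, words that reach the bottom this way are cleared and replaced.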
It's a nice little time waster, but I couldn't help but feel I was basically just a guinea pig providing testing and training for Google's word-association agent. It was also pretty easy — I didn't feel much of an achievement for associating water with boat — but maybe it gets harder as it goes on. I've asked Google if our responses are feeding into the AI's training data. For the coders and machine learning enthusiasts among you, Google has also provided some pre-trained TensorFlow modules, and of course documented its work in a couple of papers linked in the blog post.
The tech giant's other AI experiment is all about word association. Google Research is giving us a (fun) glimpse of how far natural language processing in artificial intelligence has come. Mountain View's research division has rolled out a couple of what it calls Semantic Experiences: websites with interesting activities that demonstrate AI's ability to understand how we speak.

One of the two experiences is called "Talk to Books," because, well, you can use the website to talk to books, to a certain extent. You simply type in a statement or a question, and it will find whole sentences in books related to what you typed. In the announcement post, notable futurist and Google Research Director of Engineering Ray Kurzweil and Product Manager Rachel Bernstein said the system doesn't depend on keyword matching. They trained its AI by feeding it a "billion conversation-like pairs of sentences," so it can learn to identify what a good response looks like. Talk to Books can help you find titles simple keyword searches might not surface -- for instance, when I searched "He says he's the greatest detective who ever lived," one of the results highlighted a sentence that doesn't contain any of my query's keywords, because the AI associated the word "detective" with "investigator."

Google Research's other new website, Semantris, offers word-association games, including a Tetris-like break-the-blocks experience. The games can recognize both opposite and neighboring concepts, even sounds like "vroom" for a motorcycle or "meow" for a cat. The development of word vectors, a model-training technique that enables algorithms to learn relationships between words based on actual language usage, has driven the advancement of natural language processing over the past few years. According to Kurzweil and Bernstein, these websites show how AI's "new capabilities can drive applications that weren't possible before."
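That keyword-free matching is the essence of semantic search: the query and every candidate sentence are mapped to vectors by an encoder, and the best response is simply the nearest vector, shared words or not. Here is a minimal sketch of the idea, with invented three-dimensional "embeddings" standing in for a trained sentence encoder -- the numbers and passages are illustrative only, not Google's model or data.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Toy sentence embeddings: the "investigator" passage sits near the
# detective query in vector space even though they share no keywords.
PASSAGES = {
    "He was the finest investigator the city had known": (0.9, 0.2, 0.1),
    "The recipe calls for two cups of flour":            (0.1, 0.9, 0.3),
}

# Invented vector for the query "he's the greatest detective who ever lived".
QUERY_VEC = (0.85, 0.25, 0.05)

best = max(PASSAGES, key=lambda s: cosine(PASSAGES[s], QUERY_VEC))
print(best)  # the "investigator" passage wins
```

The retrieval step never compares strings, only vectors, which is why a query about a "detective" can surface a sentence about an "investigator."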
They said other potential applications include "classification, semantic similarity, semantic clustering, whitelist applications (selecting the right response from many alternatives) and semantic search (of which Talk to Books is an example)." Google has released a pre-trained module on TensorFlow that other researchers and developers can use, so the tech giant's work could lead to more AI-powered applications that understand how we wield words better than their older counterparts do.