Facebook is developing a Talk the Walk AI capable of giving walking directions without knowing a user's location.

What it is: A team made up of a researcher from the University of Montreal in Canada and Facebook Artificial Intelligence Research (FAIR) scientists recently published a white paper describing a neural network capable of giving a person plain-language directions without the use of GPS or other location-tracking aids. According to the researchers: "We introduce the Talk the Walk dataset, where the aim is for two agents, a guide and a tourist, to interact with each other via natural language in order to achieve a common goal: having the tourist navigate towards the correct location. The guide has access to a map and knows the target location, but does not know where the tourist is; the tourist has a 360-degree view of the world, but knows neither the target location on the map nor the way to it. The agents need to work together through communication in order to successfully solve the task."

How it works: Hypothetically, if this neural network were fully fleshed out, it could provide end-to-end directions to a person even if location services and internet connectivity were unavailable. In such a case, it would work by letting users hold a conversation with an AI in much the same way they would with a human. The tourist describes the landmarks they see, such as "I'm standing next to a theater," and the AI tries to determine where they are. It can ask questions in return; for example, it may ask whether the user sees a shop on the corner to help narrow down which theater they're looking at. Once it determines where the user is, it gives a plain-language response guiding them to the next waypoint.

When it's coming: Maybe never, as this isn't a new feature the company is rolling out. It's early research that appears to lay the groundwork for future development.
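The guide's side of that exchange can be thought of as narrowing down candidate locations as the tourist reports landmarks. Here's a minimal toy sketch of that idea (not FAIR's actual model — the grid, landmark names, and filtering logic are all invented for illustration):

```python
# Toy illustration of localization by dialogue: the guide filters candidate
# map cells against the landmarks the tourist reports seeing.
# The map layout and landmark names here are invented for this example.

GRID = {
    (0, 0): {"theater"},
    (0, 1): {"theater", "shop"},
    (1, 0): {"bank"},
    (1, 1): {"shop"},
}

def filter_candidates(candidates, observed):
    """Keep only cells whose landmarks include everything the tourist reports."""
    return [c for c in candidates if observed <= GRID[c]]

# The guide starts by considering every cell a possible tourist location.
candidates = list(GRID)

# Tourist: "I'm standing next to a theater." -> two theaters remain.
candidates = filter_candidates(candidates, {"theater"})

# Guide asks a clarifying question; tourist confirms a shop on the corner.
candidates = filter_candidates(candidates, {"theater", "shop"})

print(candidates)  # [(0, 1)] -- the tourist is localized to one cell
```

Once a single candidate remains, the guide can start issuing directions toward the target; the real system replaces this hard filtering with learned neural models on both sides of the conversation.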
The Talk the Walk white paper establishes a dataset and some basic algorithms to prove that the concept works, but it's far from being ready for prime time. The major significance of this work is its focus on creating AI capable of working together with humans to achieve a goal. To learn more about neural networks, read our primer here. And don't forget to visit our artificial intelligence section for all the latest news and updates in machine learning.
Virtual guides help a 'lost' AI find its way.

As a general rule, AI isn't great at using new info to make better sense of existing info. Facebook thinks it has a clever (if unusual) way to explore solutions to this problem: send AI on a virtual vacation. It recently conducted an experiment that had a "tourist" bot with 360-degree photos try to find its way around New York City's Hell's Kitchen area with the help of a "guide" bot using 2D maps. The digital tourist had to describe where it was based on what it could see, giving the guide a point of reference it could use to offer directions.

The project focused on collecting info through regular language ("in front of me there's a Brooks Brothers"), but it produced an interesting side discovery: the team learned that the bots were more effective when they used a "synthetic" chat made of symbols to communicate data. In other words, the conversations they'd use to help you find your hotel might need to be different from those used to help, say, a self-driving car.

The research also helped Facebook's AI make sense of visually complex urban environments. A Masked Attention for Spatial Convolutions (MASC) system could quickly parse the most relevant keywords in the bots' responses, so they could more accurately convey where they were or needed to go.

As our TechCrunch colleagues observed, this is a research project that could improve AI as a whole rather than the immediate precursor to a navigation product. With that said, it's easy to see practical implications. Self-driving cars could use this to find their way when they can't rely on GPS, or offer directions to wayward humans using only vague descriptions.
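To give a rough feel for the keyword-masking idea, here is a heavily simplified sketch (not the paper's MASC implementation): map cells are encoded as feature channels per landmark type, and words in a message up-weight the matching channels before scoring cells. The landmark vocabulary and map contents are invented for the example.

```python
# Rough sketch of masking map features by message keywords -- a toy
# simplification inspired by, but not equivalent to, the MASC mechanism.

import numpy as np

LANDMARKS = ["theater", "shop", "bank"]      # invented channel vocabulary
H, W = 2, 2

# One channel per landmark type; 1.0 means that landmark is at that cell.
map_features = np.zeros((H, W, len(LANDMARKS)))
map_features[0, 1, LANDMARKS.index("theater")] = 1.0
map_features[0, 1, LANDMARKS.index("shop")] = 1.0
map_features[1, 0, LANDMARKS.index("bank")] = 1.0

def attend(message_words):
    """Weight map channels by which landmark words appear in the message."""
    mask = np.array([1.0 if w in message_words else 0.0 for w in LANDMARKS])
    return map_features @ mask               # (H, W) relevance score per cell

scores = attend({"theater", "shop"})
print(np.unravel_index(scores.argmax(), scores.shape))  # (0, 1)
```

The real mechanism learns these masks from language and applies them inside convolutions over the map, but the intuition is the same: words in the conversation select which parts of the spatial representation matter.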