Google Home in Spanish: An incomplete understanding

Google’s smart speaker, Google Home, has recently learned a new language. Launched at the end of June this year, Google Home now speaks Spanish. Spanish speakers can now play all the summer hits without lifting a finger, or find nearby restaurants simply by saying, ‘Ok Google, tengo hambre’ (Ok Google, I’m hungry). This language update may help Google gain ground in the market for artificial intelligence-powered devices, since Spanish is the second most widely spoken language in the world. Nevertheless, Google Home’s Spanish skills leave much to be desired. The Bitext team has been testing it, and there is still plenty of room for improvement.


Google Home is now well equipped to listen and respond to basic voice commands in Spanish, such as those related to the weather, news or entertainment. However, when it comes to more complex queries, the virtual agent says little more than ‘sorry, I can’t help you further’ or ‘I’m afraid I don’t understand’. Even though this sleekly designed intelligent device is all the rage right now, users end up quite frustrated when they receive that kind of answer to their requests. People do not want to struggle to be properly heard by a virtual assistant; they want it to truly understand and talk the way a human being would: through natural language.


As mentioned above, our Bitext team in Madrid wanted to go further and test Google Home in Spanish to get first-hand information about its abilities. The smart device could, among many other simple things, tell the time, the temperature and the forecast for the coming days, answer some general questions and even tell jokes. However, its linguistic competence is still not mature enough or, at least, not as developed as it is for English:

It’s not Greek, it’s Spanish!

  • Long sentences reap short answers

Telling the time and temperature is not exactly rocket science. The big issue comes when a sentence is longer than expected. If you ask ‘¿Cómo es el tiempo en Saint Tropez?’ (What’s the weather like in Saint Tropez?), it will give you very detailed meteorological information. However, if you want to be more precise and ask ‘¿Qué tiempo hace en Saint Tropez en septiembre?’ (What’s the weather like in Saint Tropez in September?), the device gets flustered by the extra information and its answer will be something like ‘sorry, but I didn’t understand you’.

  • Multiple intents make it fail

Simple sentences like ‘Dime las noticias’ / ‘Dime la hora’ (Tell me the news / Tell me the time), composed of a verb plus a single direct object, are a piece of cake for the assistant. Nevertheless, if you include a double intent, as in ‘Dime las noticias y la hora’ (Tell me the news and the time), you will surely get this as an answer: ‘Sorry, but I can’t help you with that’.

  • Negation is not spotted

Most conversational agents do not understand negation in a phrase because they have been built on a keyword approach. This is a big problem, since it can produce responses that are the complete opposite of what the user expects (see the sketch after this list). In one of the tests, our team was playing trivia with Google Home. At the end of the game, the assistant asked if they would play another round: ‘¿Jugamos otra?’ (Shall we play another one?), it asked. ‘No. No queremos jugar más’ (No. We don’t want to play any more), the Bitext team responded. ‘Ok. ¡Vamos a jugar!’ (Ok. Let’s play!), the device replied. This misunderstanding led to an endless loop of commands followed by wrong answers, forcing the Bitext team to restart the device in order to keep testing.

  • Robot-like interaction comes into play

Most of the time, users need to find the right command for the device to understand what they are saying. At the presentation of the Google Home Spanish Assistant, it was announced that the gadget would easily be able to inform you about the traffic in your city. The Bitext team kept asking ‘¿Cómo está el tráfico en la A-3?’ (How is the traffic on the A-3?) using many sentence variations, and the answers were always along the lines of ‘sorry, I can’t help/understand you’. After multiple trials, the team realized that the only way to be understood was to add the common noun ‘highway’ before A-3; only then did the agent give a proper answer.
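To see why a keyword-based approach runs into both the negation and the multiple-intent problems described above, here is a minimal, purely illustrative sketch in Python. The intents, keywords and matching logic are invented for this post and have nothing to do with how Google Home is actually implemented; they simply show what happens when an agent listens for keywords instead of analysing the whole sentence.

    # Purely illustrative sketch of a keyword-based intent matcher.
    # The intents and keywords are invented for this post; this is not Google's code.
    KEYWORD_INTENTS = {
        "jugar": "play_trivia",    # 'to play'
        "noticias": "read_news",   # 'the news'
        "hora": "tell_time",       # 'the time'
    }

    def detect_intent(utterance):
        """Return the first intent whose keyword appears in the utterance."""
        for word in utterance.lower().split():
            intent = KEYWORD_INTENTS.get(word.strip("¿?¡!.,"))
            if intent:
                # Stops at the first hit: negation words and any second
                # intent later in the sentence are simply ignored.
                return intent
        return None

    # Negation is missed: 'jugar' still matches, so the agent "hears" a yes.
    print(detect_intent("No. No queremos jugar más"))    # -> play_trivia

    # Multiple intents collapse into one: only the first keyword wins.
    print(detect_intent("Dime las noticias y la hora"))  # -> read_news

A linguistic analysis of the full sentence, by contrast, would attach the negation ‘no’ to the verb ‘jugar’ and split the coordinated objects ‘las noticias y la hora’ into two separate intents, which is exactly the kind of processing that keyword matching cannot do.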


See for yourself one of the tests carried out by the Bitext team in the following video:


Though the examples above pose a great challenge for intelligent devices, Bitext technology is already ahead of most of these hurdles. Both the negation issue and the multiple-intent failure can be smoothly handled by Bitext Natural Language Processing tools for conversational agents. Looking for real solutions? Check them here.

