Venture capital firm Loup Ventures recently asked Amazon's Alexa, Apple's Siri, and Google Assistant the same 800 questions. Google Assistant was the most successful of the bunch, answering 93% of the questions correctly. In comparison, Siri got 83% of the questions right, and Alexa got 80%. Samsung's Bixby and Microsoft's Cortana, both lesser-used voice assistants, didn't even make the cut.
That said, all three assistants improved over their results from a year ago. When Loup Ventures ran the same test last year, Google Assistant answered 86% of the questions correctly, Siri scored 79%, and Alexa managed just 61%.
The assistants were judged both on whether they correctly understood the question being asked and on whether they delivered a correct response.
Questions were broken into five categories:
• Local – Where is the nearest coffee shop?
• Commerce – Order me more paper towels.
• Navigation – How do I get to Uptown on the bus?
• Information – Who do the Twins play tonight?
• Command – Remind me to call Jerome at 2 pm today.
This year's test was done with Siri on iOS 12.4, Google Assistant on a Pixel XL running Android 9 Pie, and Alexa via its iOS app.
The only area where Google Assistant trailed the competition was the Command category: Siri handled phone functions like calling, adding things to your calendar, and email better than any of the other assistants. And while one might presume Alexa would win when it came to commerce questions, Google Assistant led that category as well.
Despite all three assistants missing the mark in some categories, Loup Ventures says their overall rate of improvement continues to be impressive.
Each platform has improved dramatically since the firm started tracking them, which suggests we'll see further gains across the board over the next six months, before the next round of testing.