Google Assistant vs Apple Siri vs Amazon Alexa: Which Is The Smartest?

Google, Apple, and Amazon have been investing heavily in their respective voice assistants. All three companies have drawn widespread criticism for allowing employees and third-party contractors to listen to users' conversations in order to improve their AI assistants. Google Assistant, Amazon Alexa, and Apple's Siri are all getting better at understanding and answering queries and commands.

AI assistants are touted as the next frontier of computer technology. Analysts at Loup Ventures have been tracking the performance of three of the world's most popular voice assistants to see how they are evolving. On Friday, Loup Ventures published the results of its 2019 Digital Assistant IQ Test, finding that all three leading digital assistants have improved significantly since last year.

Loup Ventures evaluated the assistants by asking each of them the same set of 800 questions, tracking whether the assistant correctly understood the question and whether it delivered a correct response. The questions were divided into five categories: Local, Commerce, Navigation, Information, and Command. Google Assistant was tested on a Pixel 3 XL, while the other two assistants ran on iPhones.
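
Loup Ventures has not published the tooling behind its scorecard, but the bookkeeping it describes is simple to picture. Below is a minimal sketch, assuming a hypothetical per-question record with made-up field names, of how understanding and answer accuracy could be tallied overall and per category.

```python
from collections import defaultdict

# Hypothetical record format for one scored question; Loup Ventures has not
# published its tooling, so these field names are assumptions for illustration.
results = [
    {"category": "Local", "understood": True, "answered_correctly": True},
    {"category": "Commerce", "understood": True, "answered_correctly": False},
    # ... one entry per question, 800 in total
]

def score(results):
    """Tally understanding and answer accuracy overall and per category."""
    per_category = defaultdict(lambda: {"asked": 0, "understood": 0, "correct": 0})
    for r in results:
        bucket = per_category[r["category"]]
        bucket["asked"] += 1
        bucket["understood"] += int(r["understood"])
        bucket["correct"] += int(r["answered_correctly"])

    asked = sum(b["asked"] for b in per_category.values())
    correct = sum(b["correct"] for b in per_category.values())
    print(f"Overall: {correct / asked:.1%} answered correctly")
    for name, b in per_category.items():
        print(f"{name}: understood {b['understood'] / b['asked']:.1%}, "
              f"correct {b['correct'] / b['asked']:.1%}")

score(results)
```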

The analysts didn’t include Microsoft’s Cortana in the test because Cortana is vanishing not only from third-party accessories but also from Microsoft’s own products. Microsoft CEO Satya Nadella said earlier this year that he no longer saw Cortana as a direct competitor to Google Assistant, Amazon Alexa, or Apple's Siri.

Just like last year, Google Assistant remains the front-runner. It understood every single one of the 800 questions asked by the analysts and answered 92.9% of them correctly, up from 85.5% last year. If it keeps improving at this rate, its score on this test will soon approach 100%.

Apple’s Siri was the closest competitor to Google Assistant. It understood questions with 99.8% accuracy and answered 83.1% of them correctly, a healthy improvement over last year’s 99% understanding accuracy and 78.5% correct answers.

Amazon Alexa occupied the third spot, the same as last year. It understood questions with 99.9% accuracy but answered only 79.8% of them correctly. Still, that is a steep rise in answering accuracy from last year’s 61.4%.
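
To put these percentages in perspective, the following back-of-the-envelope sketch (assuming, as Loup Ventures states, the same 800 questions for each assistant) converts the reported answer-accuracy figures into approximate question counts.

```python
# Rough conversion of the reported accuracy percentages into approximate
# question counts, assuming the full 800-question set for each assistant.
TOTAL_QUESTIONS = 800

reported_accuracy = {
    "Google Assistant": 92.9,
    "Apple Siri": 83.1,
    "Amazon Alexa": 79.8,
}

for assistant, pct in reported_accuracy.items():
    correct = round(TOTAL_QUESTIONS * pct / 100)
    print(f"{assistant}: roughly {correct} of {TOTAL_QUESTIONS} answered correctly")
```

By this rough count, Google Assistant answered about 743 questions correctly, Siri about 665, and Alexa about 638.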

Google Assistant performed better than its rivals in four of the five categories. It fell behind Siri only in the Command category, where Siri answered 92% of questions correctly compared to Google Assistant’s 86%.

Both Siri and Google Assistant did better than Amazon Alexa in the Command category because Siri is baked into iOS and Google Assistant is deeply integrated into Android, which allows them to make phone calls, send text messages, and perform other system-level functions. Alexa lives as a third-party app on iOS and Android, which means it can’t initiate phone calls or send text messages or emails.

One area where Google Assistant was far ahead of its competitors was Commerce: Google’s AI assistant answered 92% of Commerce questions correctly, compared to 71% for Alexa and 68% for Siri. Siri occupied the top spot in one category, came second in two, and finished last in the remaining two. Alexa ranked second in two categories and third in three.

Loup Ventures noted that each assistant has improved in every category with each new round of testing. The research-driven VC firm tests the AI assistants twice a year. Google Assistant tends to lead in the Information category, while Siri leads in Command. Looking at correct answers, Alexa registered the most significant jump, from 61.4% last year to 79.8% in the latest test.

Each of the three voice assistants is moving towards 100% understanding and answering accuracy. For now, Google Assistant is the smartest. But Loup Ventures cautions that these assistants are still far from “intelligent.” They can understand and answer you within their primary use cases. They will have to evolve to work with additional use cases in the future.
