
Google Assistant’s answers prove more accurate than Siri’s in latest test


Ever asked Siri to play a song and received directions to the nearest restaurant instead? Siri’s mishaps can be entertaining, and a new report shows Apple’s digital assistant came second to Google Assistant in an accuracy test.

Siri vs Google Assistant

In Loup Ventures’ latest annual report comparing the accuracy of the leading digital assistants, Siri answered 78.5% of questions correctly, up from 66% last year. While that certainly shows Siri is getting smarter, it is apparently still not smart enough: Google Assistant beat Siri’s score with 85%. Of the 800 questions asked, Siri understood 99%, while Google Assistant again came out on top, understanding 100%.

How were they tested?

Loup Ventures asked each digital assistant the same 800 questions and graded them on whether they understood the question and whether they delivered a correct response. The questions were split into five categories; a simple illustration of how such per-question grades add up to the headline scores follows the list.

Local – Where is the nearest coffee shop?
Commerce – Can you order me more paper towels?
Navigation – How do I get to uptown on the bus?
Information – Who do the Twins play tonight?
Command – Remind me to call Steve at 2pm today.
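For readers curious how the two grading criteria turn into the reported percentages, here is a minimal Swift sketch. The five category names come from the report, but the data structures, sample grades, and numbers below are purely illustrative assumptions, not Loup Ventures’ actual data or methodology code.

// Minimal, illustrative sketch of rolling per-question grades up into
// overall and per-category percentages. Sample grades are hypothetical.
enum Category: String, CaseIterable {
    case local, commerce, navigation, information, command
}

struct GradedQuestion {
    let category: Category
    let understood: Bool         // did the assistant parse the query?
    let answeredCorrectly: Bool  // did it deliver a correct response?
}

func percent(_ part: Int, of total: Int) -> Double {
    total == 0 ? 0 : Double(part) / Double(total) * 100
}

// Hypothetical sample standing in for the 800 real questions.
let graded: [GradedQuestion] = [
    GradedQuestion(category: .command, understood: true, answeredCorrectly: true),
    GradedQuestion(category: .local, understood: true, answeredCorrectly: false),
    GradedQuestion(category: .navigation, understood: false, answeredCorrectly: false),
]

let understoodRate = percent(graded.filter { $0.understood }.count, of: graded.count)
let correctRate = percent(graded.filter { $0.answeredCorrectly }.count, of: graded.count)
print("Understood: \(understoodRate)%  Correct: \(correctRate)%")

// Per-category breakdown, mirroring the report's five categories.
for category in Category.allCases {
    let inCategory = graded.filter { $0.category == category }
    guard !inCategory.isEmpty else { continue }
    let rate = percent(inCategory.filter { $0.answeredCorrectly }.count, of: inCategory.count)
    print("\(category.rawValue): \(rate)% correct")
}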

Siri performed particularly well in the Command category, but Google Assistant outperformed the other assistants in all other areas.

What about others on the market?

Amazon’s Alexa and Microsoft’s Cortana scored poorly in comparison to the two leading digital assistants. However, both were tested through their iOS apps rather than on their own hardware, so the comparison is not entirely fair.

What is important is that all of the assistants are continuing to improve. Apple recently hired John Giannandrea, who previously led Google’s search and AI efforts, as VP of Machine Learning and AI Strategy to oversee Siri’s future development.
