A prototype of Apple’s new HomePod is displayed during the 2017 Apple Worldwide Developer Conference (WWDC) at the San Jose Convention Center on June 5, 2017 in San Jose, California.
Apple says it’s using local AI to power Siri. While some data — such as your name, your contacts, the names of your photo albums and the songs in your music library — is sent to Apple so that you’ll get a better Siri experience, most data is stored only on the device. That includes data about your music tastes, news preferences, the things shown in your photos and more.
This is philosophically different from what Amazon, Google and Microsoft do: data those companies collect is generally shared with their remote servers and, in some cases, mixed with hundreds of millions of other people’s data so that those companies can make their systems smarter with collective knowledge. Apple says it doesn’t care about seeing its users’ data, so all the inference is done locally on an iPad or iPhone.
That may be the Achilles’ heel of Siri in some respects. While she can gain skills, she’s limited to the data on your device — a pro for folks who are worried about security, but a con for those who want a smarter voice assistant. At least your personal version of Siri is synced across all of your Apple devices with iOS 11.
So Siri is improving, but it’s also evolving. Siri has become more about learning what you want while also providing information on command. That trend should continue when Apple launches its HomePod smart speaker later this year, which will compete with the Amazon Echo and Google Home.
Source: Tech CNBC
Siri is better in iOS 11: Here's what's different and what Apple is doing under the hood