Apple’s Privacy Pledge Complicates Its AI Push

A puny iPhone is no match for a cloud server.
Apple CEO Tim Cook at Apple's global headquarters in Cupertino, California. Andrew Burton/The Washington Post/Getty Images

It’s the simple bargain that made companies like Google and Facebook into giants: in exchange for the convenience of running your life from a smartphone, you hand over gobs of data on your every activity. It zips up into the cloud, where algorithms do…well, it’s hard to be exactly sure, but everyone's at it. Oh, except Apple.

Tim Cook has aggressively positioned the company as uninterested in collecting user data, and boasts that this restraint sets Apple apart. “They’re gobbling up everything they can learn about you and trying to monetize it,” he said in a 2015 speech. “We think that’s wrong.”

“They,” of course, refers mostly to Google and Facebook, which rely heavily on cloud computing for search, recommendations, and other features. Apple, on the other hand, promises to do its machine-learning-powered stuff, like photo searching and predicting which emoji you want, right there on your smartphone or tablet.

You can see the logic here. Apple makes its money selling gadgets, not targeting ads. And denigrating competitors for monetizing your data is a handy marketing and PR tool. Who among us doesn’t want to reduce our privacy risk?

But Cook’s steadfast aversion to the cloud presents a challenge as Apple tries to build new features powered by machine learning and AI. To build and run machine learning services you need computing power and data, and the more you have of each, the more powerful your software can be. The iPhone is beefy as mobile devices go, and it’s a good bet Apple will add dedicated hardware to support machine learning. But it's tough for anything it puts in your hand to compete with a server, particularly one using Google’s custom machine learning chip.

Compare the photo management apps from Apple and Google to see how this can play out. Both use neural networks to parse your photos so you can search for dogs and trees and your best friend. Apple’s Photos does this entirely on your iPhone. Google Photos does it all in the cloud.

Of the two, only Apple’s app will let you search your iPhone snaps for “dog” while in airplane mode at 30,000 feet, and not having to wait while your query and the response travel across the internet can in theory make searches snappier. But Google Photos has generally been favored by reviewers (including our own) impressed by the power of the search company’s image-parsing algorithms. Local processing works great for many things, but if you want to push the envelope, it's hard for a mobile device to outsmart cloud AI, says Eugenio Culurciello, a professor at Purdue University who works on hardware to accelerate machine learning. “In a server you can do so much more work in any second,” he says.

Companies that haven’t pledged cloud celibacy also have an easier time making their artificial intelligence more, well, intelligent. The most direct way to build a smart new thing that works on your customers' data is to train it on lots and lots of that same data, says Chris Nicholson, CEO of Skymind, a startup that helps companies use machine learning. “The more data you have the more valuable your thing gets,” he says. “Google, Amazon and others are benefiting from that and Apple is not.” It’s also easier to continuously update neural networks in the cloud, so they’re always improving, than it is to push updates to ones that reside in people’s pockets, says Nicholson. Apple has started using a technology called differential privacy to pull in some anonymized data on how people use their phones, such as your favorite emoji, but it’s unclear how broadly that approach can be applied.
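For a sense of the idea behind differential privacy, here is a toy randomized-response sketch in Swift. It is only an illustration of the general principle, not Apple's actual system, which uses more elaborate noise-adding and hashing schemes; the function names and the flip probability are made up for the example.

```swift
import Foundation

// Toy randomized response: each device reports whether the user's favorite
// emoji matches a candidate, but flips the answer at random, so no single
// report reveals the truth. Over many users the noise averages out and the
// overall popularity can still be estimated.
// (Illustrative sketch only; not Apple's deployed implementation.)

func privatizedReport(truth: Bool, flipProbability: Double = 0.25) -> Bool {
    // With probability `flipProbability`, report the opposite of the truth.
    return Double.random(in: 0..<1) < flipProbability ? !truth : truth
}

func estimateTrueRate(reports: [Bool], flipProbability: Double = 0.25) -> Double {
    // Invert the known noise rate to recover an estimate of the real rate.
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return (observed - flipProbability) / (1 - 2 * flipProbability)
}

// Simulate 100,000 users, 30 percent of whom actually favor the emoji.
let reports = (0..<100_000).map { _ in
    privatizedReport(truth: Double.random(in: 0..<1) < 0.3)
}
print(estimateTrueRate(reports: reports)) // ≈ 0.3, without trusting any one report
```

The trade-off is baked into the math: the more aggressively each report is scrambled, the more users you need before the aggregate estimate becomes useful, which is one reason the approach doesn't obviously scale to every kind of training data.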

To be fair, pocket neural networks have improved tremendously lately, and for some use cases they make a lot of sense. Image recognition is particularly good on mobile devices, says Song Han, a Stanford University graduate student working on compressing neural networks. He developed one such system that helps Facebook’s augmented reality platform track objects. For applications like that, where a game’s virtual zombies need to be in exact sync with your coffee table, running everything locally can be a necessity.

Apple now has its own technology to optimize AI for iDevices in the CoreML platform it released last month (it also launched its own augmented reality toolkit). And it's reasonable to expect future iPhone models to sport new hardware that powers up machine learning, but that may not give Apple a unique advantage. If it plugs into the new CoreML platform, then Google, Facebook, and anyone else with an iPhone app will be able to tap into it, too. And Qualcomm, the leading chipmaker for Android devices, has been working on hardware tricks to speed up neural networks on mobile devices for some time.
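To see what that looks like for a developer, here is a minimal sketch of on-device inference using CoreML via Apple's Vision framework. The `FlowerClassifier` model is hypothetical; Xcode generates a Swift class like it for whatever trained .mlmodel file a developer bundles with an app.

```swift
import CoreML
import Vision
import UIKit

// Minimal sketch of on-device image classification with CoreML + Vision.
// `FlowerClassifier` is a hypothetical bundled .mlmodel; everything here
// runs locally on the phone, with no photo leaving the device.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: FlowerClassifier().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Results come back ranked by confidence, all computed on the device.
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Because any third-party app can call the same framework, whatever silicon Apple adds to speed this up would lift Google's and Facebook's iPhone apps along with Apple's own.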

Han says many people thinking about the future of pocketable AI are looking at a hybrid approach that combines the speed and convenience of mobile algorithms with the power and sophistication of those in the cloud. Google already does this with speech recognition, using local algorithms to almost instantly produce a quick-and-dirty transcription before a distant data center provides a more precise answer a split second later. Apple’s insistence on keeping data on your device appears to rule out such an approach. Being able to promise your data stays private helps the company keep up its PR war on data gobblers, and it won't hurt some uses of AI. But as machine learning becomes more important to all consumer tech companies, Apple devices may think different, but less deeply.