And you know what? It's managing to pull it off, as the company's latest AI startup acquisition underlines.

This week, Apple purchased Seattle-based AI company Xnor.ai for a reported $200 million. While there's no shortage of startups doing machine learning (ML) in the current AI boom, Xnor.ai's ambition is a bit different. Whereas everyone else is combing through massive amounts of data to create smart tools, Xnor.ai focuses on building AI algorithms that run locally on devices rather than in remote data centers.
That means getting the benefits of AI (the things made possible by modern ML technology) without the negatives.

Apple gave its usual non-statement when asked to comment on the Xnor.ai deal. It said it "buys smaller technology companies from time to time," but would not say why. We know why, of course: privacy is the reason.
AI with extra privacy

This isn't the first time Apple has shown its commitment to this balancing act when it comes to AI. In late 2018, it acquired Silk Labs, an AI startup whose image and audio recognition tools handle people detection, facial recognition, and more. All of this is carried out locally, without sending data to the cloud. "Privacy and security is built into our company's DNA," read Silk's now-defunct website. "With every line of code we write and in every design decision we make, Silk takes great measures to ensure that user data on the Silk Intelligence Platform is fully protected at all times."
Apple has also started to share some details about its privacy-focused AI prowess. Late last year, Apple published a paper describing a technique called federated learning, which trains machine learning algorithms across multiple local datasets without exchanging the data samples themselves. This lets Apple do things like teach Siri to respond to its wake word only when spoken in your voice, without sharing that voice data with a data center. All that gets shared are the updated model weights, which are aggregated to improve the overall master network.
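The core loop described above is often called federated averaging: each device trains on its own private data, and only the resulting weight updates travel back to be averaged into the master model. Here is a minimal sketch with a toy linear model in NumPy (all function and variable names are illustrative, not Apple's actual implementation):

```python
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One step of gradient descent on a device's private data.
    Linear model with squared loss; the raw (X, y) never leaves the device."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_average(global_weights, device_datasets):
    """Each device refines the global model locally; only the updated
    weights are sent back and averaged into the new master model."""
    updates = [local_update(global_weights, X, y) for X, y in device_datasets]
    return np.mean(updates, axis=0)

# Three simulated "devices", each holding private data drawn from y = 2x.
rng = np.random.default_rng(0)
devices = []
for _ in range(3):
    X = rng.normal(size=(20, 1))
    y = 2 * X[:, 0] + rng.normal(scale=0.01, size=20)
    devices.append((X, y))

# The master model converges toward the true weight (2) even though
# the server never sees any device's raw data.
w = np.zeros(1)
for _ in range(100):
    w = federated_average(w, devices)
```

In a real deployment the "devices" would train more complex networks over many samples, and the server would typically weight each update by the device's dataset size, but the data-stays-local property is the same.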
Apple additionally uses differential privacy, which adds a small amount of noise to raw data so that it's harder to reverse-engineer, say, audio files from a trained model.

It's a way for Apple to differentiate itself from data-hungry tech giants, which CEO Tim Cook has repeatedly spoken out against. More importantly, it does this without ignoring the importance of AI, something Apple was previously guilty of. That has allowed Apple to implement tools like deep learning across its product line while minimizing the negatives along the way.
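The noise-adding idea behind differential privacy is commonly realized with the Laplace mechanism: an aggregate statistic is perturbed with noise scaled to how much any one person could change it. A toy sketch, with parameters chosen for illustration rather than taken from Apple's system:

```python
import numpy as np

def private_mean(values, lower, upper, epsilon):
    """Release the mean of bounded values with epsilon-differential privacy.
    After clipping to [lower, upper], one individual can shift the mean of
    n values by at most (upper - lower) / n, so Laplace noise with scale
    sensitivity / epsilon masks any single contribution."""
    values = np.clip(values, lower, upper)
    sensitivity = (upper - lower) / len(values)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return values.mean() + noise

# Example: average of 1,000 usage counts, each clipped to [0, 10].
rng = np.random.default_rng(1)
data = rng.integers(0, 11, size=1000)
released = private_mean(data, lower=0, upper=10, epsilon=1.0)
```

The released average is still useful in aggregate, but the noise makes it statistically hard to tell whether any individual's value was in the dataset at all, which is the guarantee differential privacy formalizes.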
Is Apple really a privacy-first company?