The Next Privacy War Will Happen in Our Homes

But an add-on that stems the flow of data is, at best, a band-aid. Once Alexa, Google Assistant, or Siri morph from standalone pieces of technology that sit on our tables or counters into software built into nearly every essential gadget in our homes, temporary modifications and add-ons will be insufficient. When we’re fully surrounded by all-listening ears, the privacy breaches we’re witnessing now — the random recordings sent to the wrong people — will increase not only in frequency but in magnitude.

Just as we once assumed all Facebook knew was the information we willingly gave it, we’re unprepared for the myriad ways smart speakers tuned into our surroundings could someday be exploited, whether for our convenience or not. We’re not equipped to fully appreciate the trade-offs we’re making. We don’t know what we need to do to protect ourselves, how much protection we even need, or whether the tools to protect ourselves are available at all.

Mentally, we’re equally unprepared for what’s to come. As long as smart speakers remain a visible, external piece of furniture, we can mentally separate them from our lives. They are not yet seamlessly integrated into our days, but what happens when they are? What happens when we go from merely interacting with Alexa to living with her? When we go from inviting Alexa into our home to accepting the program as a necessary feature of life?

“We envision a world where the consumer devices around us are more helpful, more intelligent, more… human,” says Audio Analytic, a company that has created software capable of recognizing a range of sounds. It reportedly hopes this technology will soon be able to detect the sound of consumer products when they’re being used around smart speakers.

Anyone who’s chatted with Alexa knows the feeling of wanting artificial intelligence programs to feel more human. And despite their stilted speech and limited range of responses, they already do feel somewhat human. That is, of course, by design. Manufacturers want us to feel connected to this tech, not just pragmatically but emotionally. Alexa and Siri can’t just be computer programs. They need to be trusted. They need to be friends we invite into our homes, or that we allow to sleep beside us.

But when voice-activated computer assistants eventually become a necessary feature of our lives, we may notice a profound irony.

When every noise in our lives is a search prompt, the sounds of our homes, the symphony of life — laughing, crying, talking, shouting, sitting in silence — will no longer be considered memories, but data. The more we humanize technology — the more it becomes not just part of the furniture, but part of the family — the less human our lives will become.
