A study, albeit from competitor DuckDuckGo, finds that Google search results can vary significantly
Personalization runs deep in any one of Google’s many massive software products, drawing on everything from your search history to your location to every search link you have ever clicked. And avoiding that personalization seems to have become more difficult over the years. According to a new study conducted by Google competitor DuckDuckGo, it does not seem possible to avoid personalization when using Google search, even by logging out of your Google account and using the private browsing “incognito” mode.
DuckDuckGo conducted the study in June of this year, at the height of the US midterm election season. It did so with the ostensible goal of determining whether Google’s search results exacerbate ideological bubbles by feeding you only information you’ve signaled you want to consume via past behavior and the data collected about you.
It’s not clear whether that question can be reliably answered with these findings, and it’s also obvious that DuckDuckGo is a biased source with something to gain by pointing out how flawed Google’s approach may be. But the study’s findings are nonetheless interesting because they highlight just how much variance there is in Google search results, even when controlling for factors like location.
DuckDuckGo found that a majority of participants in its study saw different results when searching three divisive terms: “gun control,” “immigration,” and “vaccinations.” According to the company, “these discrepancies could not be explained by changes in location, time, by being logged in to Google, or by Google testing algorithm changes to a small subset of users.” DuckDuckGo says it controlled for location by treating local results that varied across regions as though they were identical. “Interestingly, this adjustment didn’t affect overall variation significantly,” the study reads.
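The location adjustment the study describes can be sketched roughly like this. Everything here is illustrative: the domain names and the `LOCAL_DOMAINS` set are assumptions, not data from the study.

```python
# Hypothetical sketch of the study's location adjustment: local results that
# vary by region are collapsed into a single placeholder before comparing
# pages, so purely regional variation is not counted as personalization.
# LOCAL_DOMAINS and the result lists are illustrative, not from the study.

LOCAL_DOMAINS = {"localnews-denver.com", "localnews-boston.com"}

def normalize_local(results):
    """Replace region-specific results with one placeholder entry."""
    return ["<local>" if r in LOCAL_DOMAINS else r for r in results]

page_denver = ["wikipedia.org", "localnews-denver.com", "cdc.gov"]
page_boston = ["wikipedia.org", "localnews-boston.com", "cdc.gov"]

# After normalization, the two pages compare as identical, so any remaining
# variation between participants cannot be explained by location alone.
print(normalize_local(page_denver) == normalize_local(page_boston))  # True
```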
For the study, DuckDuckGo compiled 87 result sets (76 on desktop and 11 on mobile), with all participants conducting the searches simultaneously, starting at 9PM ET on June 24th, 2018. Each participant ran a private, logged-out test first and a logged-in test immediately after, so as not to influence the private test with prior results. What DuckDuckGo found was that using private browsing and logging out of Google had almost no effect on the variation in search results: users saw a roughly comparable amount of variation across all three searches, whether searching privately or while logged in.
Key elements of the variation included changes in news sources and the placement of sometimes identical links in different positions, which has a drastic impact on the likelihood that they get clicked. The study also found variations in how news articles and videos were laid out among standard text links, and that as many as 22 different domains showed up on the first page of results for “vaccinations,” even though a standard search result page typically contains 10 organic links.
Perhaps most importantly, there doesn’t seem to be any way to get a single, objective search result from Google that is easily replicable across users or locations. To further drive home the point that private, logged-out searches are just as variable, DuckDuckGo took the text results each private, logged-out user saw and compared them both to the results of other anonymous users and to the same users’ standard, logged-in results.
“We saw that when randomly comparing people’s private modes to each other, there was more than double the variation than when comparing someone’s private mode to their normal mode,” the study reads. The data indicates that between a user’s standard, logged-in results and their private, logged-out ones, there was typically a difference of only one or two domains, while the variation between anonymous users ranged from three (for gun control) to five (for vaccinations) domain changes. That indicates that some heavy personalization is going on even for people using private browsing while logged out, and it’s not clear why that is or what effect it’s having on user behavior.
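The kind of comparison described above can be sketched as counting how many result domains differ between two search result pages. This is a rough illustration of the idea, not DuckDuckGo’s actual methodology, and the domain lists are made up for the example.

```python
# Hypothetical sketch: measure "variation" between two search result pages
# as the number of domains that appear on one page but not the other.
# All domain lists below are illustrative, not data from the study.

def domain_variation(results_a, results_b):
    """Count domains appearing in exactly one of the two result pages."""
    return len(set(results_a).symmetric_difference(set(results_b)))

user1_private = ["cnn.com", "nytimes.com", "wikipedia.org", "cdc.gov"]
user1_normal  = ["cnn.com", "nytimes.com", "wikipedia.org", "who.int"]
user2_private = ["foxnews.com", "wikipedia.org", "cdc.gov", "webmd.com"]

# Same user, private vs. normal mode: small difference.
print(domain_variation(user1_private, user1_normal))   # 2

# Two different anonymous users in private mode: larger difference,
# mirroring the study's finding of more variation between strangers.
print(domain_variation(user1_private, user2_private))  # 4
```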
Ultimately, the study suggests that there’s no easy way to use Google search without the software trying to determine who you are and whether it can better serve your needs. That’s not necessarily a bad thing; technology companies all over the world have built businesses around being smarter, faster, better automated, and more personal. (And by giving away really useful, ad-supported products for free.)
A majority of internet users wouldn’t use Google search if it weren’t consistently a best-in-class search engine, with a number of tangential benefits, like the Chrome browser tie-ins, that make it a lot easier and more convenient than, say, DuckDuckGo. But it’s still unsettling to know that there’s so much unseen, algorithmic tinkering going on when you perform a basic action, like searching a one- or two-word term on Google, and that there’s really nothing we can do to get a neutral result.
Google could not comment on a study it did not have access to, but the company says that search results can change by the minute and sometimes even by the second, especially for news topics. Google also says that personalization is done on only a small fraction of the total number of queries entered into search, and that recent queries are often relied on to determine the context for a search, as when a word may refer to a sports team and a city simultaneously. The company did confirm that it does not personalize results for incognito searches using signed-in search history.
It’s important to note that this isn’t a new feature or something Google has been doing behind the scenes without anyone’s knowledge. The company began personalizing search results for every user, even those without a Google account, way back in 2009 using an anonymous cookie that would take into account information like your location, search history, and other factors.
More recently, the company says it’s moved away from personalizing search results because it didn’t seem to meaningfully improve the search experience, even as it increasingly uses personalization for products like Google Assistant and Gmail. “A query a user comes with usually has so much context that the opportunity for personalization is just very limited,” Pandu Nayak, the head of ranking at Google search, told CNBC in September. Google says it still personalizes for location and “immediate context from a prior search.”
Regardless, Google has come under fire in the last couple of years for engaging in practices that, while not exactly clandestine, were certainly not widely known and likely kept out of the limelight to avoid too much public scrutiny.
One notable example, surfaced by an investigation from The Wall Street Journal, was how app developers were often able to read your personal Gmail messages, which prompted concern from some users and a congressional inquiry. Google also felt the need to stop scanning Gmail messages to target advertisements last year because it was worried that the practice, when it came to light in contract negotiations, might scare away enterprise companies.
Google also failed to disclose a flaw in a Google+ API that could have exposed hundreds of thousands of users’ personal information for roughly six months, an action that might result in a Federal Trade Commission investigation. And in September, Google was forced to reverse a controversial Chrome login feature that would help it better target ads by automatically logging users into the browser without their express permission.
The common thread here is that Google’s products, while often free to use and usually quite well designed, make money only by using a vast trove of personal information to sell targeted online advertisements. And to improve how much money it makes from each ad sale, and the total quantity of ads it sells, Google often resorts to tactics its users have little insight into or understanding of.
Personalized search results aren’t necessarily nefarious on their own. But the ways Google’s algorithms function, especially those now aided by complex and often unexplainable artificial intelligence software, are and always will be outside the understanding of the common user. That means most people, including lawmakers and members of privacy watchdogs and advocacy groups, have no idea how they work. That could be a problem for a company like Google as it tries to navigate a consumer market and a legislative environment that’s not quite as forgiving or negligent as it used to be.