
Google Tracks Users in Incognito Mode, Study Finds

On Tuesday, Google competitor DuckDuckGo released a study showing that Google gives users personalized search results even when they browse in "incognito mode." Robert Epstein, a psychologist who studies search engine manipulation, drew the logical conclusion DuckDuckGo declined to state.

"The incognito mode is a lie, that's what they found," Epstein, whose research features prominently in the recent film "The Creepy Line," told PJ Media on Wednesday. "The kind of search results they were getting from people in incognito mode and in normal mode were extremely similar, and of course the results that one person got were extremely different than the results that another person got."

"That's another way of saying that incognito mode is a lie. It's an illusion," the psychologist explained. "They didn't say that, but that's what they found."

While incognito mode "may hide certain kinds of things from certain kinds of websites," letting users get around various paywalls, it does not hide a user's identity from Google. Google can still track what users search for and leverage that data for targeted advertising, even in "incognito mode."

Epstein admitted that the study had problems. "We have a credibility problem for sure — they didn't tell us how they found their volunteers, for example," he said. He also faulted the study for "just showing you the raw numbers without using statistical methods to see whether these numbers are significant statistically. They didn't do that, for some reason."
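The study's core comparison can be sketched in a few lines. This is an illustration only, with made-up URLs standing in for real search results; the study did not publish code, and the Jaccard overlap measure here is an assumption about how "similar" and "different" result sets might be quantified:

```python
import itertools

def jaccard(a, b):
    """Overlap between two lists of result URLs (0 = disjoint, 1 = identical)."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

# Hypothetical data: top result URLs per volunteer, normal vs. incognito mode.
normal = {
    "user1": ["u1", "u2", "u3", "u4"],
    "user2": ["u1", "u5", "u6", "u7"],
}
incognito = {
    "user1": ["u1", "u2", "u3", "u8"],
    "user2": ["u1", "u5", "u6", "u9"],
}

# Within-user similarity (normal vs. incognito): high values suggest
# incognito mode does not suppress personalization.
within = [jaccard(normal[u], incognito[u]) for u in normal]

# Between-user similarity in the same mode: low values suggest
# results are personalized per user.
between = [jaccard(normal[a], normal[b])
           for a, b in itertools.combinations(normal, 2)]

print(sum(within) / len(within))    # 0.6 with this toy data
print(sum(between) / len(between))  # about 0.14 with this toy data
```

A high within-user score alongside a low between-user score is the pattern the study reported; Epstein's criticism is that such raw numbers were published without a significance test (for example, a permutation test on the similarity scores).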

Still, after examining the results closely, the psychologist found them credible, provided the volunteers were truly chosen at random. Even so, Epstein did not side with DuckDuckGo over Google.

"I don't recommend DuckDuckGo to people, even though I believe they don't track data and I think that's great," he explained. "I don't recommend it because I don't find the quality of their search results to be very good."

Google may offer a better product, but the company does not deserve users' trust. "I don't think Google is an honest company and I think this study is another indicator of that," Epstein said.

"They routinely favor search results that benefit the company," he explained, noting multiple fines from the European Union, India, and Russia for abusing its dominant position on the Google Play store.

Some people might see the potential for shenanigans in re-ordering the search results, but Epstein warned that manipulation starts even with the underlying algorithm. "It doesn't matter whether someone is deliberately re-ranking. What matters is how they build the algorithm," he explained. "It favors the company's own values, products and services. That's how they build the algorithm."

The psychologist responded to an article by the Guardian's Oscar Schwartz, which reassures audiences that Google harbors no bias against conservatives or anyone else; it simply gives users what they want.

"All the article does is toe the line. It just repeats Google's own claims about what it does and it repeats them uncritically," Epstein shot back. He argued that Schwartz "doesn't understand when a search algorithm favors one product, candidate, or cause more than another. To say that it's because of user behavior is to say nothing. That's Google's defense, and it's absurd."

Epstein cited a recent example of bias in Google's autocomplete suggestions. He typed in "The republican party is," and the top suggestions included "the single greatest threat," "dead in california," and "about to drown." The top results for "The democratic party is" were "an example of a 527 organization" and "based on the following ideas."

"If you looked up Google Trends and you looked for these phrases, you'd find that virtually no one has ever searched for 'the democratic party is based on the following ideas,'" the psychologist explained. Indeed, I checked Google Trends, and sure enough, there was not enough data.

Google Trends screenshot for "The Democratic Party is based on the following ideas."

As Epstein suggested, however, there were a great many results for "The Democratic Party is corrupt."

Google Trends screenshot for "The Democratic Party is corrupt."

If the "corrupt" angle gets far more searches from users than the "ideas" angle, why does the search suggestion favor "ideas" over "corrupt"?

"Why don't we see that? Because we know from research that the simplest way to support any cause or candidate is to suppress negatives," the psychologist explained. "It's a simple manipulation which we know can dramatically shift the opinions and preferences of undecided voters."

Epstein also referenced a study he had performed in 2016, showing that Google suppressed negative search results about Hillary Clinton, while Bing and Yahoo did not.

The psychologist's research has shown that suppressing negative results can shift a 50-50 split among undecided voters to a 90-10 split. "Negatives draw attention, negatives can draw ten to fifteen times as many clicks," thanks to "negativity bias."

His research also concluded that this kind of manipulation is "subliminal" — people don't notice it.

Ironically, when someone like Epstein cries foul over one-sided results, the search giant merely alters the system.

"When they get caught, they usually make a change. It's absurd for them to say that all of this is just occurring because of impartial algorithms. That's just not true," the psychologist said.

Then there are the leaks. Google executives bragged about increasing Latino turnout in 2016, hoping it would help Hillary Clinton. In September, The Wall Street Journal uncovered emails in which Google workers discussed manipulating search results to disfavor President Trump's travel ban. Last week, the Daily Caller unveiled more emails, showing Google executives scheming against conservative media outlets.

Yet journalists like Schwartz can't accept the fact of manipulation. Epstein explained that, too. "People can see the human hand when they're reading an article and on a television or radio show, but when they're dealing with algorithmic output, they don't see the human hand so they trust algorithms more than any other type of media."

Google has three major manipulation strategies. The company "personalizes search suggestions, search results, and answer boxes." The "answer box" effect proves most powerful, the psychologist explained.

"When you query Google either using the search engine's 'I'm feeling lucky' or the home device or Google assistant, they just give you the answer," Epstein said. "When they give you the answer, three things happen: people spend less time searching; they click on fewer search results; and the shift in opinions increases between 10 and 30 percent."

"Giving someone the answer has a more powerful effect than search results that favor one cause or candidate or company," Epstein explained. Google is excellent at personalizing results, and "the better you are at personalizing them, the greater impact you're going to have on people's opinions."

The company also knows how to mask its manipulations, and the psychologist suggested that it may use more obvious manipulations as a red herring, to distract from the more fundamental attempts to shape public opinion. The "fake news" freakout has only made this strategy more effective.

"Google and Facebook have co-operated to combat fake news stories and Russian-placed ads," Epstein noted. "I think these companies are loving that our attention is drawn away from what they themselves are doing."

The psychologist often refers to Google as "GSA." "I call it Google Surveillance and Advertising, L.L.C.," he told PJ Media. "Surveillance is what they do and advertising is how they make their money."

This surveillance and advertising combine to form an effective money-making machine and give the company tremendous power to influence public opinion. Indeed, Epstein's research suggests that Google's bias in favor of Hillary Clinton could account for her margin of victory in the popular vote.

Thanks to DuckDuckGo, Americans know that "incognito mode" cannot hide users from Google's personalized results. If "incognito mode" is a lie, what else is?

Follow the author of this article on Twitter at @Tyler2ONeil.