“Filter bubbles” in internet use: unhealthy info nuggets

Anyone who uses Google or Facebook ends up in a filter bubble: supposedly unwelcome information is filtered out, and what remains is always more of the same.

Everywhere you look, everything is similar. Photo: dpa

Enter the search term "Egypt" in Google. And see what comes up. News about the situation on the ground? Or travel tips with pictures of dunes and camels?

That is the example with which author Eli Pariser illustrated the term "filter bubble", which he coined in 2011. The idea: many online services wrap the user in a bubble. They serve him the morsels the service believes he wants, and in exchange withhold other, supposedly unwanted pieces.

At first glance, that sounds like a service. After all, the user's request to the search engine is, in effect: show me the links relevant to my search term at the top of the list, and the less relevant ones further down. The problem: the search engine decides for itself what counts as "relevant". And it does so on a basis that is not transparent to the user.

When Pariser coined the term filter bubble four years ago, it was still primarily about information: news, Google, Facebook, Yahoo. But the bubble keeps growing. Today, there is hardly a commercialized area of the web that does without one. Amazon has been running a filter bubble for goods successfully for years, and most online stores have followed suit.

The past determines the future

Streaming services are tuned to past listening habits, video services to the series and genres preferred in the past. App stores suggest applications that users might need, and hotel booking portals are unlikely to propose a trip to Scandinavia for two to a vacationer who always goes to Italy with his family. The motto: more and more of the same. Pariser spoke of "information junk food" at the time: an unbalanced diet instead of a little bit of everything, including things one might not like so much.
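
In code, this principle fits into a few lines. The following Python sketch is purely illustrative: the catalog, tags and scores are invented, and real recommendation systems are far more elaborate, but the logic is the same. Items that resemble past choices rank high; everything else scores zero and effectively never surfaces.

    # A minimal sketch of "more of the same": a toy recommender that ranks
    # catalog items purely by overlap with a user's past consumption.
    # All names and data are hypothetical illustrations.
    from collections import Counter

    def recommend(history, catalog, n=3):
        """Rank unseen items by how many tags they share with past picks."""
        # Taste profile: how often each tag appeared in the history.
        profile = Counter(tag for item in history for tag in catalog[item])
        # Score every unseen item by the summed weight of its matching tags.
        scores = {item: sum(profile[tag] for tag in tags)
                  for item, tags in catalog.items() if item not in history}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]

    catalog = {
        "Italy trip":     {"italy", "beach", "family"},
        "Tuscany tour":   {"italy", "culture", "family"},
        "Sicily package": {"italy", "beach"},
        "Norway fjords":  {"scandinavia", "nature"},
        "Lapland winter": {"scandinavia", "winter"},
    }

    # A vacationer who always books Italy:
    print(recommend(["Italy trip", "Sicily package"], catalog))
    # [('Tuscany tour', 3), ('Norway fjords', 0), ('Lapland winter', 0)]
    # Scandinavia scores zero -- with a larger catalog it would simply
    # never make the list.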

The filter bubble rests on two mechanisms, each of which is problematic enough on its own. The first is the mass collection of personal data about users: anyone who moves around the internet without special anonymization tools leaves traces behind, from interests, preferences and financial situation to presumed age, gender and location. The second problem: companies evaluate this data and draw conclusions from it, on a basis the user does not know.
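
What such conclusion-drawing can look like is easiest to show with a deliberately crude sketch. The sites and rules below are invented for illustration; real trackers work with far larger datasets and statistical models, but the pattern is the same: the service never asks, it infers.

    # A crude sketch of the second mechanism: opaque inference. The tracker
    # never asks for income or family status; it guesses them from traces.
    # These rules and domains are invented, not real targeting logic.

    TRACES = ["parenting-forum.example", "luxury-watches.example",
              "mortgage-calculator.example"]

    RULES = [
        (lambda v: "parenting-forum.example" in v,     ("household", "has_children")),
        (lambda v: "luxury-watches.example" in v,      ("income", "high")),
        (lambda v: "mortgage-calculator.example" in v, ("life_stage", "homebuyer")),
    ]

    def infer_profile(visits):
        """Derive presumed attributes from raw browsing traces."""
        return {key: value for test, (key, value) in RULES if test(visits)}

    print(infer_profile(TRACES))
    # {'household': 'has_children', 'income': 'high', 'life_stage': 'homebuyer'}
    # The user sees none of these rules -- only their consequences, in the
    # form of what is and is not shown to him.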

The Egypt example is not the only one to show how much impact this can have in practice. A team from Carnegie Mellon University and the International Computer Science Institute investigated which job ads Google's advertising network presents to its users. According to the study, published in the spring, users whom Google identified as male were shown ads for high-paying executive jobs more often than those identified as female.

The world is getting smaller

This does not have to be Google's fault; after all, advertisers themselves can define the criteria that decide which users are shown their ads. But the ethical problem is the same as with the filter bubble: an algorithm spits out results on a data basis the user cannot inspect, and thereby limits him.
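
How advertiser-defined criteria and inferred attributes interact can again be reduced to a toy model. The following is not Google's actual matching logic; the profiles and criteria are invented to make the mechanism visible: whoever the network has labeled differently simply never sees the ad, and never learns why.

    # A toy model of advertiser-defined targeting -- not any real ad system.
    # The advertiser sets criteria; the network matches them against the
    # attributes it has inferred about each user. All profiles are invented.

    def eligible(ad_criteria, user_profile):
        """Serve the ad only if every criterion matches the inferred profile."""
        return all(user_profile.get(k) == v for k, v in ad_criteria.items())

    executive_ad = {"inferred_gender": "male", "inferred_income": "high"}

    user_a = {"inferred_gender": "male",   "inferred_income": "high"}
    user_b = {"inferred_gender": "female", "inferred_income": "high"}

    print(eligible(executive_ad, user_a))  # True  -- the ad is shown
    print(eligible(executive_ad, user_b))  # False -- silently withheld
    # Neither user can see the criteria, so neither can tell why the ad
    # appeared for one and not the other.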

And the bubble is already expanding into the next area: the home. Thermostats that heat or cool the home automatically according to schedules, weather conditions and sleeping habits already exist; more networked household appliances will be the next step. They shrink the world inside the filter not only in terms of knowledge, news, entertainment and consumption; they also directly narrow the options for action. Why go out in the evening when the apartment is already heated? Why buy cherry yogurt when the automatic order, strawberry yogurt included, has already gone out?

Of course, all of this can be turned off, changed, ignored. You can also delete browser cookies, install anti-tracking tools and switch search engines to escape the bubble. However, according to the analytics portal Statcounter, Google's share of the search market in Germany stood at 93 percent in June.