Untangling Personalization, Polarization, Privacy, and the Filter Bubble Theory

In: The Human Program. A Transatlantic AI Agenda for Reclaiming Our Digital Future.

The author, activist, and entrepreneur Eli Pariser argues that personalized search results and the content recommendations made by social media algorithms ensure that people read only political opinions they already agree with, leaving entrenched views constantly validated and rarely challenged. The result, he contends, is polarized worldviews that endanger democracy.

But even though this argument has gained currency over the last decade, the more it is researched, the less it holds up. The more complicated question is: What roles do search engines and social media actually play in our democracies?

