Facebook was looking at which links I clicked on, and it was noticing that I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.
About This Quote
Eli Pariser, an internet activist and former executive director of MoveOn.org, popularized the idea of the “filter bubble” in the early 2010s: the claim that personalization algorithms shape what information people see online. The quote recounts his discovery that Facebook’s News Feed algorithm had learned his clicking habits and then silently reduced or removed posts from friends with opposing political views. Pariser used this example in talks and interviews while promoting his 2011 book The Filter Bubble, arguing that opaque ranking systems can narrow users’ exposure to diverse perspectives without their awareness or consent.
Interpretation
Pariser’s point is not simply that Facebook shows people what they like, but that it does so invisibly and unilaterally. The phrase “without consulting me” underscores the loss of agency: a platform’s optimization for engagement becomes an editorial function, deciding what counts as relevant and what disappears. The political detail (liberal vs. conservative friends) illustrates how personalization can intensify ideological sorting by reducing incidental contact with disagreement. The quote thus frames algorithmic curation as a democratic and epistemic problem: when the information environment is tailored to predicted preferences, individuals may become less able to encounter corrective facts, competing arguments, or the lived realities of others.