Our Very Human Need to Pop “Online Filter Bubbles”
A long time ago, when I was a nerd in grad school (totally not a nerd anymore), one of the concepts I researched most was the idea of the “filter bubble.” That is to say: does the massive amount of content on the web – and the means by which we go about accessing that content – reshape people’s pre-existing worldviews, or cement them?
Two professors – Bruce Bimber of the University of California, Santa Barbara, and Richard Davis of Brigham Young University – have been exploring the idea of the online filter bubble since the 2000 presidential election.
In their study “The Internet In Campaign 2000: How Political Web Sites Reinforce Partisan Engagement,” they put this quite elegantly – thanks to a quote from singer-songwriter (and household favorite) Paul Simon:
“People tend to select out for attention those stories and claims that confirm their existing beliefs and predispositions. And when confronted with news or other information that tends to conflict with their assumptions about public life, people are especially likely to disbelieve what they see or hear.
These political habits call to mind lyrics to Paul Simon’s 1970 song entitled The Boxer: ‘a man hears what he wants to hear and disregards the rest.'”
A few years later, Bimber and Davis explored the idea further in their book Campaigning Online: The Internet in U.S. Elections. Narrowcasting – the niche version of broadcasting, in which a fragmented media environment makes it possible to deliver content designed for a smaller, more specific audience – is “one of the defining features of the Internet.” When users have a nearly infinite range of outlets and opinions to choose from, they can more easily seek out the stories that interest them. But as a result, they consume content that reinforces their existing worldview.
In a TED talk released this week, MoveOn.org board president Eli Pariser takes up this very topic. Looking at personalized Google search results and the “relevant” Facebook feed – both of which are built from what we click when these pages return results to us – Pariser says we’re increasingly in danger of “algorithmically editing the web.”
As a result, Pariser says, we’re moving “toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.” He calls this a “filter bubble” – “your own unique universe of information that you live in online” – a universe where you don’t decide what gets in, and you don’t see what gets edited out.
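To make that feedback loop concrete, here is a minimal sketch – in Python, and purely a toy illustration under my own assumptions, not anything resembling Google’s or Facebook’s actual ranking systems – of a feed that orders stories by the topics we have clicked before:

from collections import Counter
import random

# Toy model: rank stories by how often we've clicked their topic in the past.
# Purely illustrative -- not any real site's algorithm.
STORIES = [
    {"title": "Local team wins big", "topic": "sports"},
    {"title": "New climate report released", "topic": "science"},
    {"title": "Candidate unveils tax plan", "topic": "politics"},
    {"title": "Celebrity couple splits", "topic": "gossip"},
]

click_history = Counter()  # topic -> number of past clicks

def rank_feed(stories, history):
    """Put the topics we've clicked most at the top of the feed."""
    return sorted(stories, key=lambda s: history[s["topic"]], reverse=True)

def simulate_sessions(rounds=20):
    for _ in range(rounds):
        feed = rank_feed(STORIES, click_history)
        # An impulsive reader: usually clicks whatever is on top, occasionally strays.
        story = feed[0] if random.random() < 0.8 else random.choice(feed)
        click_history[story["topic"]] += 1
    return click_history

print(simulate_sessions())
# After a few rounds, one topic dominates the click history, so it keeps
# getting ranked first: the feedback loop behind the "filter bubble."

The point of the toy isn’t the code; it’s that a ranking signal built only from past clicks has no way to surface the stories we ought to see but never click.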
Watch the talk here:
The “online filter bubble” problem that Pariser describes demonstrates the disconnect between “our future aspirational selves and our more impulsive present selves.”
Human behavior doesn’t always sync up with our dreams, goals, and visions. What we believe morally certainly shapes how we try to behave, but when we’re clicking around the web, are those even decisions we’re conscious of? I wonder how much of my aspirations shows up in what I consume online. It would be a snapshot, for sure – like, say, looking through a garbage bag – but I’m willing to bet it’s mostly the crumbs and wrappers of all the Internet junk food I eat on an ongoing basis.
As Pariser points out, this means we need to re-evaluate the mythology of the Internet – that it will be some great enhancer of democracy, connecting us with everything that’s happening. That isn’t the case if Internet algorithms edit out the content that challenges us or forces us to re-evaluate what we believe.
As humans, we long for more than relevance scores computed from our clicks. We need an experience that mixes the snack food of relevance with the sustenance of our human aspirations.