More than most of us know, our online lives are shaped by “personal algorithms,” computer programs that track what we do and show us the Internet (they think) we want to see. The ads we view, the search results we get, even the news we read – they’re served up based on a prediction of what we’ll click; and because we’re all different, we’re more and more likely to be experiencing different versions of the Internet. There’s an upside to this “personalization”: we cut through the online clutter and save time. But, argues Eli Pariser in a new book, The Filter Bubble, this convenience comes at a steep cost: We each wind up living in our own information universe (the “filter bubble” of the title), sealed off from views that don’t conform to our existing beliefs. Pariser, the former head of MoveOn.org, says what’s happening is nothing less than an “invisible revolution” that’s changing “how we learn, what we know, and even how our democracy works.” We caught up with him by phone the other day. Below, highlights from our conversation.
You start the book by describing Google’s switch in 2009 to “personalized search.” Why was that such a big deal?
Well, what it meant was that, even if you weren’t logged in to Gmail or other Google services, Google was going to show you the results that you were most likely to click, rather than the results that were sort of the consensus “right” results. And that’s a big shift, because Google started with this intense focus on an algorithm called PageRank, which was explicitly described in democratic terms; it was the whole web voting on which pages answered which queries the best.
And so, different people can get different results from the same search query. But how different?
Sometimes very different. I had two friends – both female, around the same age – Google the phrase “BP” at the time of the BP oil spill. One got lots of information about the oil spill and the environmental consequences; the other got almost nothing about the oil spill, just investment tips. And when I had two people do this more recently, Googling “Egypt,” one got a lot of information about the democracy protests there, while the other didn’t get anything about the protests. So you can Google BP or Egypt, and not know the most important political fact about them! And what this illustrates is not only that this is really hard to see in action (you can only see it if you hold two computers side by side and start Googling things), but also that you can’t tell what is being left out. You don’t know what you’re not seeing.
Facebook is doing this, too. But how much of the rest of the Web?
Well, more and more of the Web, and for a simple reason: if you can provide people with something that is more personally relevant and reflects their own views, then they’re more likely to come back. These algorithms can actually say, “Well, we showed this link to ten people and three of them clicked, and we showed this link to ten people and seven of them clicked, so let’s move up the one that more people clicked.” So if you imagine that happening on an individual basis – “Ah, we’ve noticed that Eli clicks more on news about Apple computers than on news about Afghanistan” – then you can presumably get me to click more overall. And that’s what these companies want to do, because most of them make money off people viewing ads. And the more clicks, the more ad views.
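The ranking logic Pariser describes here can be sketched in a few lines of code. This is a toy illustration under stated assumptions, not any company’s actual system: it tracks click-through rates per link, falls back to the global rate when it knows nothing about a user, and sorts links by whichever rate it has. All the names (`ClickRanker`, the link labels, the user names) are invented for the example.

```python
from collections import defaultdict

class ClickRanker:
    """Toy click-through-rate ranker (illustrative only, not a real system)."""

    def __init__(self):
        # (user, link) -> [times shown, times clicked]; user=None holds global stats
        self.stats = defaultdict(lambda: [0, 0])

    def record(self, user, link, clicked):
        # Update both the global tally and this user's personal tally
        for key in ((None, link), (user, link)):
            self.stats[key][0] += 1
            if clicked:
                self.stats[key][1] += 1

    def ctr(self, user, link):
        shows, clicks = self.stats[(user, link)]
        if shows == 0:  # no personal history yet: fall back to the global rate
            shows, clicks = self.stats[(None, link)]
        return clicks / shows if shows else 0.0

    def rank(self, user, links):
        # Links with higher (predicted) click rates float to the top
        return sorted(links, key=lambda l: self.ctr(user, l), reverse=True)

ranker = ClickRanker()
# Globally: 3 of 10 people click the Afghanistan story, 7 of 10 click the Apple story
for i in range(10):
    ranker.record("someone", "afghanistan-news", clicked=(i < 3))
    ranker.record("someone", "apple-news", clicked=(i < 7))

# A new user gets the crowd's ordering: the more-clicked Apple story first
print(ranker.rank("eli", ["afghanistan-news", "apple-news"]))
# → ['apple-news', 'afghanistan-news']
```

Once the system has even a little personal history (say, Eli clicking the Afghanistan story), his ordering diverges from everyone else’s, which is exactly the quiet divergence the book is about.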
Is that such a bad thing, giving people more of what they want?
I don’t particularly have a problem with the fact that, say, when you Google “pizza,” your favorite local pizza joint will come up, not the definition of the word pizza. But when you’re Googling “climate change” or “Barack Obama” or “9/11,” you want those queries to build in some sense of showing people the whole picture and not just reflecting back to them their own views.
You say this is increasingly being built into news sites.
Right. Different people get different news home pages, and increasingly, the New York Times and the Washington Post are playing with this kind of technology. There’s so much money in it that it’s very hard to resist the temptation to personalize, even though in some ways it goes against the journalistic ethics that those companies try to maintain.
You think this is undermining our democracy?
Well, it just isn’t as comfortable to be confronted with people who disagree with you politically, and if all you’re trying to do is make people as happy and comfortable as possible, then you would want to kind of filter that out of the equation. Alternatively, if what you’re trying to do is actually build a way of processing information that supports a good democracy and a good society, then you really want people to get used to engaging with other points of view. But that runs against personalization.
How have these companies responded?
There’s been a range of responses. Some of them try to justify what they’re doing by saying, “Eh, we’re just giving people what they want.” Which I don’t find very compelling. Others recognize that it’s a problem, but it’s just low on the priority list; yes, it would be better to have better algorithms, but it won’t make nearly as much money for these companies as moving more in this first direction. So, my hope with the book is to help make it clear that there actually are a lot of us who want these companies to make this a priority.
And while we wait for that to happen, what can individuals do to push back?
I posted a list of tips that people can use to avoid some of the most basic tracking and data collection that happens, but I think we all have the responsibility to balance our own information diets, and that means seeking out news sources that may be uncomfortable but expose you to a different way of looking at the world. And I think the good news is, after you get over that initial hump, it’s actually a very satisfying experience.