More than most of us realize, our online lives are shaped by “personal algorithms,” computer programs that track what we do and then serve up the web experience we (supposedly) want. The ads we view, the search results we get, even the news we read – they’re served up to maximize the likelihood we'll "click," and the more we click, the more valuable we are to advertisers and marketers.
There’s an upside to this “personalization”: we cut through the online clutter and save time. But there's a steep cost, too, as former MoveOn.org director Eli Pariser argues in a great new book, The Filter Bubble. Increasingly, he says, we each end up in our own information universe (the “filter bubble” of the title), sealed off from views that don’t jibe with ours. That's the opposite of a healthy democracy, he says, where there's "room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas."
What's happening is nothing less than an “invisible revolution” that’s changing “how we learn, what we know, and even how our democracy works.” Below, a Q&A with Pariser and an excerpt from The Filter Bubble. Check back for updates through the week.
Q&A with Eli Pariser
The one-time MoveOn director talks about how online marketing is shaping how we learn and what we know, and why that's bad news for American democracy.
Book Excerpt: The Filter Bubble: What the Internet Is Hiding From You
"Starting in December 2009, Google would use fifty-seven signals—everything from where you were logging in from to what browser you were using to what you had searched for before—to make guesses about who you were and what kinds of sites you’d like. Even if you were logged out, it would customize its results, showing you the pages it predicted you were most likely to click on."