I’ve always thought of personalization as a good thing. If Google knows something about me, then it can provide results that I’ll find more relevant, right?
Watch this TED talk by Eli Pariser and, like me, you might start having second thoughts.
Pariser is the former executive director of MoveOn and is now a senior fellow at the Roosevelt Institute. His book The Filter Bubble is set for release on May 12, 2011. In it, he asks how modern search tools, the filter through which many of us see the wider world, are getting better and better at screening that world from us by returning only the search results they “think” we want to see.
Here’s the very thought-provoking first paragraph of the talk:
Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, “Why is this so important?” And Zuckerberg said, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” And I want to talk about what a Web based on that idea of relevance might look like.
Here’s another provocative statement:
“When confronted with a list of results from Google, the average user (including myself until I read this article) tends to assume that the list is exhaustive. Not knowing that it isn’t … is equivalent to not having a choice. Depending on the quality of the search results, it can be said that I am being fed junk — because I don’t know I have other choices that Google filtered out.”
Aubrey Pek, commenting on Kim Zetter’s “Junk Food Algorithms”: http://www.wired.com/epicenter/2011/03/eli-pariser-at-ted
Pariser’s talk brings up important ethical questions. Who decides what’s relevant for me? Is it okay for a website to filter out content it thinks I won’t want to see? Does that insulate me from what’s really happening in the world, leading me to live in a bubble? Isn’t it my responsibility to know what’s happening in the world outside of my bubble of “news”? Shouldn’t I have the right to know who is filtering out what information, and by what algorithm? Shouldn’t I have the power to change the filter settings?
If you think the effects of personalization are subtle, or not very significant, then this little piece of the talk might shock or disturb you.
But a couple of weeks ago, I asked a bunch of friends to Google “Egypt” and to send me screen shots of what they got. So here’s my friend Scott’s screen shot. And here’s my friend Daniel’s screen shot. When you put them side-by-side, you don’t even have to read the links to see how different these two pages are. But when you do read the links, it’s really quite remarkable. Daniel didn’t get anything about the protests in Egypt at all in his first page of Google results. Scott’s results were full of them. And this was the big story of the day at that time. That’s how different these results are becoming.
We’re getting our search results seriously edited and, I bet, most of us don’t even know it. I didn’t. One Google engineer says that their search engine uses 57 signals to personalize your search results, even when you’re logged out.
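To make the idea of “signals” a little more concrete, here’s a minimal sketch of how signal-based personalization might work. This is not Google’s actual algorithm; the signal names, weights, and scoring below are entirely hypothetical, just enough to show how two people issuing the same query can get different rankings.

```python
# Toy illustration of signal-based personalization.
# All signals and weights here are made up for illustration.

from dataclasses import dataclass, field

@dataclass
class UserSignals:
    location: str = ""
    recent_clicks: set = field(default_factory=set)  # topics the user clicked before
    preferred_language: str = "en"

@dataclass
class Result:
    url: str
    topic: str
    base_relevance: float  # query-only relevance, identical for every user

def personalized_score(result: Result, user: UserSignals) -> float:
    score = result.base_relevance
    # Boost topics the user has clicked on before.
    if result.topic in user.recent_clicks:
        score += 0.5
    # Boost pages that mention the user's location (toy check).
    if user.location and user.location.lower() in result.url:
        score += 0.2
    return score

def rank(results, user):
    return sorted(results, key=lambda r: personalized_score(r, user), reverse=True)

# Two users issue the identical query "egypt" against the same candidate pages.
results = [
    Result("news.example/egypt-protests", "politics", 0.80),
    Result("travel.example/egypt-vacations", "travel", 0.78),
]

scott = UserSignals(recent_clicks={"politics"})
daniel = UserSignals(recent_clicks={"travel"})

print([r.url for r in rank(results, scott)])   # protests ranked first
print([r.url for r in rank(results, daniel)])  # vacations ranked first
```

Even with identical queries and identical candidate pages, the two users see different first results, which is exactly the effect Pariser demonstrates with the “Egypt” screenshots.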
Do we really want to live in a web bubble?