Like many people, I am spending more and more time working, educating myself, socializing, and shopping online. Search engines like Google and Yahoo, online companies like Amazon and eBay, and social media networks like Facebook and Twitter are making my online experience more relevant, individualized, and convenient in more ways than I can enumerate here. Last month, for example, I was pleasantly surprised to notice that my New York Times account has a recommended list of articles that I may find interesting, so I do not have to browse through endless articles to find something relevant to my interests. All of this is great, isn’t it? It all seemed great to me until I began having some privacy concerns. How come, I began to wonder, I see so many shoe advertisements after I search for a pair online? They must have superb tracking mechanisms.
These and some of my other concerns are discussed in Eli Pariser’s book The Filter Bubble: What the Internet Is Hiding from You. I heard an interview with Pariser earlier this month. He talked about how online companies, Internet search engines, and social media manipulate Internet content to fit the tastes and beliefs of individual users, about the detrimental effects this may have, and about what could be done about it. Pariser says, for example, that Facebook shows you only the news feed from people with whom you interact, so you may never see the news feed from your ‘friends’ who are more politically conservative or liberal than you are.
I was somewhat naïve and, like many other people, believed that information flows on the Internet were still like the wild, wild west. But the reality is very different. If you and I google the same term but differ in age, political views, gender, and browsing history, we will get different results. Pariser says that Google tracks 57 categories of information about you, even if you do not have a Google account.
What’s so bad about this? On a personal level, it creates homogeneity of thought and limits your opportunities to be exposed to views that are fundamentally different from your own. A simple fact of life, encountering and being exposed to different people, ideas, and products, no longer seems to be something that Google values. It also makes our online experience passive, atomized, and uninformed by other views, while our democracy needs us to be citizens who are active, diverse, and able to appreciate the different views of our fellow citizens. On a social level, Pariser says that we may, as a society, lose the ability to feel empathy for one another and to understand what other perspectives look like. That sounds like too high a price to pay for relevance, convenience, and mental comfort, doesn’t it?
All of this made me think about the Interactivity Foundation’s small group public discussions and our reports, where we present, explore, and develop different and fundamentally contrasting policy possibilities. It also made me think about how important it is to foster IF-style discussions where people meet face to face to explore the different beliefs, interests, values, and goals that are reflected in conceptually contrasting policy possibilities.
It seems that these kinds of discussions are becoming increasingly rare in today’s world, where the customization of content on the Internet happens without our even being aware of it, and where the Internet tailors the world to our individual beliefs, interests, values, and goals like never before.