Eli Pariser's new book, The Filter Bubble: What the Internet Is Hiding From You, confirms what many internet users have suspected for some time: not only search engines but also social sites, online retailers, and other services tailor their results to the individual user. While this idea may sound like fodder for vast conspiracies and suspect motives, directing specific information at the user -- in whatever medium -- is the whole point of advertising.
What makes the new form of information filtering practiced by search engines and other online services different, however, is that in most cases the internet user isn't even aware that search results are being manipulated.
In December 2009, Google began customizing its search results for each user. Instead of giving searchers the most broadly popular results, Google now tries to predict what you are most likely to click on. MoveOn.org board president Eli Pariser writes that Google's change in policy is symptomatic of the most significant shift to take place on the Web in recent years: the rise of personalization.
This makes advertising more than just a pitch: by selectively determining what a search engine displays according to the user's past searches (a function of algorithmic ranking), Google prevents the user from seeing all the choices available -- only pre-selected, automated results appear. Pariser calls this phenomenon a "filter bubble," and it restricts the internet user's access to information that would otherwise be available.
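To see how quietly this kind of filtering can operate, here is a toy sketch -- mine, not Pariser's and certainly not Google's actual algorithm -- of how a history of past searches might re-rank the same set of results for two different users. Every name, weight, and data value below is invented purely for illustration.

```python
# Toy illustration of history-based re-ranking (not any real search engine's method).
from collections import Counter

def personalized_rank(results, past_queries):
    """Re-rank candidate results by how much their titles overlap with past queries."""
    # Count how often each word has appeared in the user's search history.
    history = Counter(word for q in past_queries for word in q.lower().split())

    def score(result):
        # Results whose titles echo past interests float to the top.
        return sum(history[w] for w in result["title"].lower().split())

    return sorted(results, key=score, reverse=True)

# Two users issue the same query but see different orderings.
results = [
    {"title": "Egypt travel deals"},
    {"title": "Egypt protests and politics"},
]
print(personalized_rank(results, ["cheap flights", "beach travel deals"]))
print(personalized_rank(results, ["election politics", "protests news"]))
```

The point of the sketch is the one Pariser makes with his well-known example of two friends searching for "Egypt": neither user chose a filter, and neither is shown that the other ordering exists.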
Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook, which is becoming a primary news source for an increasing number of Americans, prioritizes the links it believes will appeal to you, so that if you are a liberal, you can expect to see only progressive links.
There is a real danger in this, according to Pariser. His thesis is that in an information-personalized world, we will increasingly be "typed" by advertisers and content providers, and fed only news that is pleasant, familiar, and confirms our beliefs. Because these filters are invisible, we won't (and most of the time don't) know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.
For now, the off-base advertising a reader might see on his own Facebook page ("Like Donald Trump? Click here!") can still raise a smile. Increasingly, though, I wonder what small nugget of information lodged in the mighty search engines of the universe makes Facebook think I'm a fan of The Donald. All I've ever wondered about him is what his hair is made of.