Second Place Social and Natural Sciences Upper Level
The Hunter Games
Sociological Theory (SOCI 301)
Professor Beth Gill
The Filter Bubble: How the New Personalized Web is Changing What We Read and How
We Think, written by Eli Pariser (2011), is a thought-provoking and socially
significant book that addresses the potential dangers of the increasing personalization
of social media and the internet; of all the risks and consequences discussed
in the book, the most disconcerting is the personalization of search-engine
marketing and targeted consumer advertising, and its impact on the isolated
consumer. This powerful method of marketing tracks our perceived interests
via search-engine and social-networking information; in doing so, it can limit
our access to information across the internet by personalizing our experiences
to match those perceived interests: no two web users will view the same results,
nor will they be exposed to the same information.
By slowly and quietly assembling our personal tidbits, information brokers such
as Acxiom and Facebook are able to accumulate personal information such as telephone
numbers, addresses and social security numbers, which can then be used to market
products to consumers without their consent (Pariser 2011: 43; Singel 2003: 1;
Sullivan 2012: 1). The advancement of technology also enables companies and
competitors to market their products to us by purchasing our browsing and online
consumer histories and by tracking our “clicks” (Pariser 2011: 114; Singel 2003: 2).
Although advertisements are often little more than a visual annoyance, it is
impossible to ignore the implications of having our own personal information
marketed back to us: that information creates a preconceived notion of who we
are based on what we search, share and buy on the internet.
Our virtual world is colliding with a capitalistic force that directs our attention
towards our perceived personal interests and narrows our standpoint; at the same
time, it directs us away from perspectives and products that have the potential
to broaden our views, expand our minds, and heighten our creativity. If we continue
to ignore the potential risks of our personalized internet experiences, will we
not perpetuate a “me” society that keeps us trapped within our own filter bubbles
and ignorant of the thoughts and experiences of others?
The Filter Bubble does an excellent job of alerting the public to our
personalized web bubbles and their potential consequences. Once we notice how
Google and other search engines, as well as social media sites like Facebook,
track and monitor our every click and share, the personalization that has already
occurred becomes apparent: for example, Google can infer from my past searches
that I am young and interested in swimming, and market that information;
consequently, I am often bombarded with ads for “cheap” swimming outlet companies
even though I have not competed in well over a year. Similarly, Facebook markets
products to its users based on their “likes” and “shares” (Pariser 2011: 44,
114; Sullivan 2012: 1). Whether we realize it or not, the voluntary personalization
of our web-space is becoming an informative market used by techno-giants like Google
and Facebook to find out who we are and to sell us products in line with our
personal interests (Pariser 2011: 40). The chance of accidentally discovering other
products, brands, and competitor information grows slimmer as our interests are
sold to the highest bidder. All of this information is compiled in order to
construct a theory of “me.”
The “me” theory constructed from this information is not an accurate
representation of the “real me”; the lives and interests perceived
by Google and other search engines are not true representations of us, and they
only further alienate our internet experiences by limiting us to the information
that Google (not the user) has assumed to be apposite (Pariser 2011: 115). Not only
does this surround us in a world occupied only by “me,” but it also severely
cuts us off from the variety of information that the internet has to offer.
Since our online “me” is constructed from small snippets of our actual selves,
it is important to note that what we choose to reveal to the internet (the independent
variable) helps explain what Google deems appropriate for our viewing
pleasure (the dependent variable). Google wants to know the “me” behind the screen.
Once it has an idea of who “me” is, it will begin to filter information relating
to our searches based on how it thinks we will value that information (Pariser 2011:
192). Although we willingly choose to put this information onto the internet, we
are often ignorant of the marketable consequences that accompany our personal internet
business and interests. In exchange for the convenience and capability of the internet,
we must pass on pieces of ourselves to be processed and molded into personalized
data; with this voluntary sharing, we also instill a sense of trust in
the companies that demand our identities in return for service (Pariser 2011: 213).
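To make this relationship concrete, consider a toy sketch in Python of how revealed
signals (the independent variable) might drive what a filter shows (the dependent
variable). The sketch is entirely hypothetical: the function names, topics and
scoring are invented for illustration and do not describe Google's actual systems.

    # A minimal, hypothetical sketch of profile-based filtering.
    from collections import Counter

    def build_profile(revealed_signals):
        # Tally the topics a user has revealed through searches and "likes".
        return Counter(revealed_signals)

    def filter_results(results, profile):
        # Rank results by how often their topic appears in the inferred
        # "me" profile. Topics the profile has never seen score zero and
        # sink to the bottom: the essence of the filter bubble.
        return sorted(results, key=lambda r: profile[r[1]], reverse=True)

    profile = build_profile(["swimming", "swimming", "college", "swimming"])
    results = [("Swim gear outlet sale", "swimming"),
               ("Local election coverage", "politics"),
               ("Campus swim meet recap", "swimming")]
    for title, topic in filter_results(results, profile):
        print(title)

Even in this tiny example, the election coverage sinks to the bottom simply because
politics never appeared in the revealed signals; nothing about its quality or
importance is ever considered.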
In the theoretical construct of “me,” our personalized interests remain static
and are based entirely on what we choose to share with the internet world. Interestingly,
the “me” algorithm was originally created to aid our internet searches.
With the ever-growing, already overwhelming quantity of data on the internet,
a filter is necessary to help us track down what we are actually looking
for; suggestion, in this case, is handy and helpful. The personalization that has
accompanied this well-intentioned implement, however, is easy to manipulate and
market if we do not keep a watchful eye on where it is trying to direct our
attention (Pariser 2011: 30-31, 215; Terdiman 2011).
A major social consequence of living inside our own personalized browsing world
is that we are not exposed to other cultures, ideas or conflicting opinions that
might stretch our creativity and advance our thoughts past our own linear and biased
patterns (Pariser 2011: 100-101). Search engines like Google are good at
“helping us find what we know we want, but not at finding what we don’t know we
want.” It should not come as a surprise that the current algorithms behind these
filter bubbles lack variability and “drift” within their results (Pariser 2011: 104).
Rather than expose users to conflicting yet relevant and creative viewpoints
that may accompany their original search (for example, offering information about
multiple diets in a search for “South Beach Diet”), traditional and current algorithms
lack the sophistication to represent multiple viewpoints within our “me” identity,
satisfying only the most basic desire for information. We will be perpetually trapped
within our “me” if we continue to remain unaware of its informative power;
we will likewise remain ignorant of the plight and opinions
of others unless we actively seek them out.
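One can imagine how a small amount of “drift” might be grafted onto such a ranker.
The sketch below extends the earlier hypothetical example (it reuses its profile
and results); the drift probability and the promotion of an unseen topic are
invented for illustration, not drawn from Pariser or from any real search engine.

    import random

    def rank_with_drift(results, profile, drift=0.2, seed=None):
        # Rank by profile match as before, but with probability `drift`
        # promote one result whose topic the profile has never seen,
        # deliberately puncturing the bubble.
        rng = random.Random(seed)
        ranked = sorted(results, key=lambda r: profile[r[1]], reverse=True)
        unseen = [r for r in ranked if profile[r[1]] == 0]
        if unseen and rng.random() < drift:
            pick = rng.choice(unseen)
            ranked.remove(pick)
            ranked.insert(0, pick)
        return ranked

That so simple a change is even conceivable underscores the point above: the
absence of drift is a design choice, not a technical necessity.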
If information brokers continue to compile marketable profiles based on their perceptions
of our “me,” society will gradually slip into a one-dimensional mindset explicitly
tailored to its members’ own thoughts and interests; the resulting society
will remain trapped and isolated within a bubble of “me” that is devoid of
conflicting viewpoints, opinions and experiences.
The personalization of our internet lives is slowly becoming a data-monger that
may soon spin madly out of control despite our best efforts to calm its powerful
influence. The juggernaut of modernity, a concept credited to Anthony Giddens, helps
illustrate the reliance that society places on its technological achievements: it
is difficult to imagine our lives without all of our technological gadgets and widgets
(Ritzer 2007: 130-131). With ever more personal information being stored
away in the many bytes of the internet rather than in traditional filing
cabinets, it would also be too socially complicated to transition to any method
other than our current techno-modern routine.
Applying the implications of the juggernaut of modernity to today’s tricky and profitable
consumption of our own personal information is alarming. While we, as humans, would
like to believe that we have control over the machines that we have created, it
is difficult to ignore our increasing reliance on their capabilities as the years
pass. Man-made algorithms attempt to make sense of our encoded e-world of
data by processing thousands of gigabytes of information in order to help us find
what we are looking for; although incredibly convenient in terms of
speed and application, our reliance on these technological contraptions places much
of society in a position vulnerable to whoever controls them. We all
fall victim in one way or another because we need these machines for modern social
life.
The one-dimensional society theory, credited to Herbert Marcuse (1898-1979),
implies that society is selectively being whittled down into viewing only the specific
perceptions deemed appropriate for consumption (Ritzer 2007: 111). In
addition, Marcuse theorized about technocratic thinking, an
increase in irrational thinking grounded in rational conceptions; this type of
thinking replaces social creativity with rationalized
technological efficiency (Ritzer 2007: 112).
In this case, personalized search engines are able to discredit and misdirect human
creativity by displaying the same biased search results based on their perceptions
of our “me”; moreover, society consumes this information happily, often
ignorant of its own plight. We become one-dimensional by falling under the
control of the selective information that our own technological inventions consider
most relevant. By remaining unexposed to a variety of stimuli, human creativity
remains limited to the individual. Society also becomes one-dimensional in this
application because it lacks the social creativity that typically breeds social
innovation and technological advancement. A solution to this problem would be to
take back control of our technological innovations and to regulate which information
may legally be shared, with or without the consumer’s consent, by the data dealers
that take stock of our private information.
The increasing rationality of the personalized web falls under the methods of technocratic
thinking. While there certainly needs to be an aid for processing the zettabytes
of information spanning the web, personalized search engines also leave
the user lacking by excluding information that the user might have considered
important or useful; instead, this information is glossed over and filtered out by
the search engine because of its perceived notions of our individual “me.” This
type of technocratic thinking is rationalized under a seemingly altruistic purpose,
but it simultaneously fails to address the concerns raised about these “filtered” results.
Efficiency trumps human and social creativity, and spawns a society trapped within
its own cubicle of self-interest and seclusion.
References

Pariser, Eli. 2011. The Filter Bubble: How the New Personalized Web is Changing What
We Read and How We Think. New York: Penguin Group.
Ritzer, George. 2007. Contemporary Social Theory and its Classical Roots: The Basics.
New York: McGraw-Hill.
Singel, Ryan. 2003. “Acxiom Opts Out of Opt-Out.” Accessed 19 November 2012.
Sullivan, Mark. 2012. “Data Snatchers! The Booming Market for Your Online Identity.”
Accessed 18 November 2012.
Terdiman, Daniel. 2011. “Why a hyper-personalized web is bad for you (Q&A).”
Accessed 18 November 2012. http://news.cnet.com/8301-13772_3-20063402-52.html