Research Presentation: Curtis Stanford

On the advice of Professor Collier, I chose a traditional format for my research presentation, but decided to give it a tone that blends the formal and the informal, pairing the scholarly aspects of the project with the theme of the argument.

The Reference Culture: Pop Culture Trending Has Gone Mobile
Curtis Stanford

Keywords: pop culture, mobile, trending, social, conversation

Based on my research and experience, I argue that social standing and grouping have transitioned away from the traditional paradigms of popularity, class, and educational hierarchy and are now determined by how many aspects of pop culture one can reference and respond to, and by which ones. This argument assumes that the shift has been made possible by the widespread integration of mobile computing and interconnectivity, and that, because of this, pop culture now refers to literally anything and everything that has captured the public eye. Specifically, I will be working within the physical realm and will be excluding scenarios in which members operate solely within online communities.

The reference culture refers to the idea that the interconnectivity of modern society has made communication paramount. It suggests that our natural selection is no longer survival of the fittest but is instead dictated by our ability to join and direct the conversation. It's a sort of battle of wits that proves mental acuity and establishes our ability to maintain relevance and awareness in today's culture.

The Way Things Are

Simply put, to know more is to be better. This is where mobile computing comes into play. Most of us aren't Matt Damon in Good Will Hunting, able to spew out quotes and author names from every book we've ever borrowed from the library, but we are able to reference that scene from the movie and the episode of It's Always Sunny in Philadelphia where Charlie Day parodies it—because we're linked in; we know all of the right sites for free television streaming. We know all of the right sites for everything. Did I know those actors' names off the top of my head, or did I simply look them up on IMDB.com? But let's not get too meta. Most of us aren't savants, but more relevantly, many of us couldn't name the author of the piece we're currently reading. What's my name, without scrolling back to the beginning? Why is this? Why do we internalize fewer and fewer details regarding even our own lives?

It's because we've effectively outsourced our own memories. The internet as it currently exists is largely a compendium of all human knowledge and experience, and it's at our fingertips. It's in our pockets. It's in our bags and on our desks. It's at our coffee shops, bars, and retail outlets. It's being beamed through space to and from towers and satellites, around the world, directly into our smartphones (which we now refer to simply as phones), tablets, and soon headsets. It's everywhere, and more importantly, always there, and so is the information accessed through it. We don't need to remember an author's name anymore; we only have to remember the keystrokes to look it up.

This practice, quite obviously, creates a dependency on our devices and on the expectation that they will be able to connect to the web. This dependency fosters habitual use, and for many, an addiction can form. Without going into the psychology of addiction, let's just assume that we're all a bit addicted to our digital devices. While there are instances in which we use our smartphones as a resource to answer or address a 'real'-world situation, we often use our phones just to use them—because it feels good, because it stimulates us. We look up YouTube videos for a quick laugh or look at Twitter to see if Daniel Tosh has gone too far yet. We go to Apple's homepage to see if our iPhones are outdated, and look at our own Facebook pages to see if we still like what we had to say. We dredge through millions of blogs, wikis, and news sites to see if anybody is talking about something interesting because we caught up with our friends in five minutes, no one is getting married, and our meals are taking forever. And frankly, if we have to take a nostalgic tour back to undergrad one more time, we're going to have a mental breakdown over our stagnant lives in front of everyone at this restaurant.

That's where the "have you seen…?"s come in. A "have you seen…?" usually refers to something that is trending, or something that is popular online. While scouring through the currents and archives, we stumble upon something mildly amusing, look at something we had bookmarked, or look into something we had been told about. Perhaps a famous actor just died, Anonymous thwarted something the government had been shadily doing behind our backs, or a cat squeezed itself into a paper towel roll. "Have you seen…?" we ask our friend. Lo and behold, she has! In fact, everyone at the table has. The conversation has now been downloaded from the internet and is trending in the 'real' world. More importantly, our friendships have been saved, at least for the time being, long after high school cliques or our degree choices abandoned us.

Slight adjustments to this scenario yield drastically different results. If the group is asked about its awareness of a particular popular reference, and one person is ignorant of the subject material, that person is now excluded from the conversation. He or she can either try to save face by accessing the information via a mobile device or, to the exasperation of the rest of the group, be "filled in." (Again, an analysis of online communities will vary in choices and outcomes.) If that same person is ignorant of too many references, he or she might cease to be invited to social gatherings or might, of his or her own accord, seek companionship elsewhere.

Likewise, if no one in the group recognizes the reference, the conversation will never take place. If this occurs too many times, then because of our dependency on the technology, the group likely will not attempt to seek commonality through other means and will disband into smaller groups or scatter individually in search of alternative social arenas—those in which their specific interests in popular culture are reciprocated.

Another case exists in which an individual suffers from something known as the Dunning-Kruger effect, whereby unskilled individuals suffer from illusory superiority (Dunning and Kruger). I attribute this tendency to those who refuse mobile technology or can't afford it and who, because of this, delude themselves into thinking that by lacking basic skills in the digital marketplace they are somehow better than the rest of us. We all know the type—the guy who likes to "remain off the grid," the acquaintance who prefers organic farmers markets to big-corporation consumerism, or even just the guy who is obsessed with his "indestructible" Nokia hand-me-down from 1997. We all have the same thing to say: "Get a job!" These types of individuals often form their own groups or exist within more modernized communities as a sort of caricature—as long as they aren't too insufferable.

There are also genuine inequalities that create segregation between those who can afford digital devices and those who can't. Those without access to mobile technologies have a harder time keeping pace in the reference culture, and consequently, their social standing suffers (Crossman). However, Sherry Turkle, in Alone Together, found during her studies of what she calls the "networked culture" that every student in her focus group, spanning a wide range of economic and social backgrounds, had a device capable of texting, and most had online capabilities (xiv).

A pattern similar to the Dunning-Kruger case emerges around individuals who seem to regurgitate cultural references uncontrollably. Some people excel at knowing what's going on as soon as it's going on. They've seen more movies than we have, read more books, know all of the memes, and can even quote commercials. Their social standing within a group is often precarious at best, as their comprehensive knowledge leaves others feeling inferior and eventually "turned off."

The Way They Will Be

Following the trend of my argument so far, you might think I would have you believe that the human brain no longer possesses the ability to remember. Of course, I don't think this. While younger generations seem to have diminished short-term memories, we still remember the important stuff, the things we focus on, and even some of the trivia that slips through and sticks during all of that mindless browsing. I would, however, have you believe that we might not have to possess the ability to remember for much longer. If to know is to be better, then to know off the top of one's head is to be even better.

Traditionally, this meant actually knowing the information, accessing it within our memories, and communicating it or using it in a skillful way. However, as the technology progresses, it may be able to do these things for us. Google Glass, unveiled by Google in April 2012, is a groundbreaking technology that could revolutionize mobile computing. Shaped and worn like eyeglasses, and weighing about the same, Glass operates as a hands-free smartphone. It's equipped with an augmented reality display, speaker, microphone, camera, and touchpad (for scrolling). Its capabilities as of now seem to consist of taking pictures and video, translating languages, performing Google search queries, providing directions and public transportation information, displaying weather, and sending and receiving messages. All of this is accessible through natural voice commands (Leonard).
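
To make that interaction model concrete, here is a minimal sketch of how a voice-command dispatcher for such a device might be structured. This is purely illustrative: the command phrases and handler functions are my own assumptions, not the actual Glass software.

```python
# Hypothetical voice-command dispatcher, loosely modeled on the
# capabilities listed above. All handlers are invented stand-ins.

def take_picture():
    return "camera: captured still"

def get_directions(destination):
    return f"maps: routing to {destination}"

def run_search(query):
    return f"search: results for {query!r}"

def show_weather():
    return "weather: 72F and clear"

# Spoken prefixes mapped to actions; the remainder of the phrase
# (if any) is passed along as the argument.
COMMANDS = {
    "take a picture": lambda rest: take_picture(),
    "get directions to": get_directions,
    "google": run_search,
    "what's the weather": lambda rest: show_weather(),
}

def dispatch(utterance):
    """Match a spoken phrase against known command prefixes."""
    text = utterance.lower().strip()
    for prefix, handler in COMMANDS.items():
        if text.startswith(prefix):
            return handler(text[len(prefix):].strip())
    return "no matching command"

print(dispatch("Take a picture"))
print(dispatch("Get directions to the museum"))
```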

While the technology currently seems limited in operation and means of input, it's only going to become increasingly sophisticated. Eventually, Glass or similar devices will be able to analyze their surroundings, cross-reference them against multiple online resources, and then provide feedback. Eventually, they'll do this at the flick of an eye or even automatically. Imagine that you're meeting a new acquaintance or interviewing for a job while wearing a headset. While you're shaking their hand, Glass has already used facial recognition to locate their online identity and activities, compiling their Facebook, Twitter, and Tumblr accounts into a bio that is displayed or read back to you in real time. Now aware of your new acquaintance's obsession with Sacha Baron Cohen, you reference Borat and gain their attention and respect.

Of course, this parlor trick won't work once the technology becomes widespread and online identities are known instantaneously to all. Formalities will have to be reconsidered and privacy and politeness observed. The conversation will turn to, "Tell me something about yourself that's not online." As people become more and more accustomed to the technology and to the day-to-day patterns of the people around them, individualistic interest will wane, and the conversation will again return to what's popular culturally. At this point, the reference culture will evolve from a matter of simply knowing to a matter of the speed with which its members know. How quickly will your Glass be able to provide answers to the current point of interest? Did you shell out for the newest hardware? Can you seamlessly find the most relevant or interesting information among the search results that your Glass provided automatically through its voice recognition software?
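
As a thought experiment only, the handshake scenario above might be sketched in code like this. Every function here is a stand-in I invented for illustration; no real facial-recognition or profile-lookup service is assumed to exist, and the canned data simply lets the imagined pipeline run end to end.

```python
# Purely hypothetical sketch of the handshake scenario described above:
# camera frame -> identity -> social accounts -> short spoken bio.

from dataclasses import dataclass, field

@dataclass
class Profile:
    network: str                          # e.g. "Twitter"
    handle: str
    interests: list = field(default_factory=list)

def face_match(camera_frame: bytes) -> str:
    """Stand-in for facial recognition: a frame resolves to an identity."""
    return "new.acquaintance"

def fetch_profiles(identity: str) -> list:
    """Stand-in for a public-profile lookup across social networks."""
    return [
        Profile("Facebook", identity, ["Borat", "film"]),
        Profile("Twitter", "@" + identity, ["Sacha Baron Cohen"]),
        Profile("Tumblr", identity, ["comedy"]),
    ]

def compile_bio(profiles: list) -> dict:
    """Reduce the accounts to a short bio the headset can read back."""
    interests = [i for p in profiles for i in p.interests]
    return {
        "accounts": [f"{p.network}: {p.handle}" for p in profiles],
        "talking_points": sorted(set(interests))[:3],
    }

def on_handshake(camera_frame: bytes) -> dict:
    """The whole imagined pipeline, run while you shake hands."""
    return compile_bio(fetch_profiles(face_match(camera_frame)))

if __name__ == "__main__":
    print(on_handshake(camera_frame=b""))  # fake frame; prints the bio
```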

The mere act of possessing the newest technologies contributes to the conversation. If you don't have an iPhone or an Android, you don't know what Snapchat is. If you don't have a laptop, or at the very least a computer, you have to wait for the newspaper or the television for your news—which will be outdated by the time it reaches you. While technological advancement can make it harder to maintain social standing for those who choose not to, or lack the wherewithal to, continuously educate themselves, a higher level of interconnectivity will hopefully lend itself to a higher standard of empathy. The reference culture, in its ceaseless endeavor to stay at the forefront of public interest, will hopefully stray from the trivial and direct its full attention toward the social, ethical, and moral endeavors that will further the human race.

Works Cited

Crossman, Ashley. "Sociology of the Internet." About.com, 2013. <http://sociology.about.com/od/Disciplines/a/Sociology-Of-Internet.htm>.
Dunning, David, and Justin Kruger. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." Journal of Personality and Social Psychology, 1999.
Leonard, Tom. "Google's Sinister Glasses Will Turn the Whole World into Search Giant's Spies." Mail Online, Associated Newspapers, 17 Mar. 2013. <http://www.dailymail.co.uk/news/article-2295004/Google-Glass-Googles-sinister-glasses-turn-world-search-giants-spies.html>.
Turkle, Sherry. Alone Together: Why We Expect More from Technology and Less from Each Other. Basic Books, 2011.
