Friday, February 22, 2013

Memes: Not really an in-joke anymore

One of my favorite parts of teaching Media Literacy is hearing and reading about what media my students use, what content they enjoy, and how that compares with my own experiences and those of my peers. As someone who is roughly twice their age, I don't really expect that we will engage in many of the same types of media experiences. Just as my teachers would make stilted references to M.C. Hammer in order to garner a laugh, I made a reference to Kendrick Lamar's Swimming Pools (... ... drank!) and got a hearty chuckle from the kids. We're in different age cohorts, at different stages of the life cycle, and we're living in an increasingly fragmented media environment. What could we possibly have in common?

This makes it all the more surprising when I discover that many of them are encountering the same memes as I am. In some cases, we're on the same website, but in others, we're on different sites (or highly personalized versions of the same site, like Twitter and Facebook) that are increasingly composed of viral jokes that re-purpose amateur or professional media content in order to comment on current events or a relatable situation (i.e., memes). Do we watch the same TV shows? No. In fact, I'm willing to bet that more students in my classes have the experience of seeing a Sweet Brown meme in common than will watch the Super Bowl, the Grammys, or the Oscars. One might suppose that these water-cooler TV events would remain a common cultural touchstone, and they likely will be the one thing (along with some big movies) that cuts across age groups. But there is something interesting going on with memes. They often originate in relatively tiny communities or obscure sources well outside the mainstream, yet they become the references that my students and I have in common. If I include a reference to Mad Men in my slides, I get blank stares, but a picture of Grumpy Cat gets them laughing every time.

The first thing that occurs to me is that, at least for certain populations (young people?), media users may not need TV and celebrities as a subject of common experience and conversation, at least not to the extent that previous generations did. I think the use of memes is partially substituting for TV and celebrities as a way to joke about norms, blow off steam, bond, etc. Based on my casual observations, I'd say that music and musicians as personalities are just as central to these young people as they were to me, and to my parents, at that age. But TV and celebrities? I'm not so sure.

This isn't to say that TV and celebs are going away, but they may not be as essential to leisure media use as they once were. Perhaps TV has already started to adapt, although the rise of memes to the point where they are something my students and I have in common seems to have happened so suddenly that I doubt anybody has had time to adjust. Like the TV content that serves/served as our common cultural reference point, these memes ultimately serve only as a vehicle for advertisers and websites to build audiences to sell stuff to. But the professional content producers have been cut out of the equation. Just how much time is spent creating, consuming, and distributing memes? And if more time is spent re-purposing and creating amateur content, regardless of how solipsistic and retrograde its humor may be, isn't that something worth celebrating?

Wednesday, February 20, 2013

What Future Does Professional Media Content Production Have?


In the Media Literacy class I’m teaching this semester, my students are engaging in a role-playing exercise in which they assume the roles of four groups that traditionally play a part in the development of a new medium: governments, advertisers, technology developers, and content producers. First, the students research the role these groups played in the creation and popularization of print, radio, TV, film, the internet, etc. Then, they form new groups composed of representatives of each of these four and discuss how to develop a heretofore undeveloped medium. The first example I thought of for the undeveloped medium is virtual reality.

Most of us have an idea of what virtual reality could be. This imagined VR could involve government regulation (determining how content will be distributed, whether the roll-out of VR will be subsidized like a utility, how to regulate violent/sexual content, etc.), advertising (product placement in virtual reality? Pretty much an advertiser’s dream!), technology developers (Apple VR might have a cleaner look than Microsoft’s somewhat cluttered-looking VR), and content producers (custom-made luxury environments for you to relax in). Sounds like a good fit!

I’m going to do this exercise again later in the semester, and I’m having no trouble coming up with several other “media technologies of tomorrow”: augmented reality glasses, superior surveillance technology, a portable instant-fMRI machine, affordable 3D printing. Some of these technologies are already gaining a foothold in the market. It’s easy to see how governments, advertisers, and technology developers would be involved in the creation and development of these technologies. But where would the content producers fit in?

The more I think about the exciting media technologies of the future, the more trouble I have imagining how professional content producers (e.g., screenwriters or the equivalent) will fit into the picture. I’m quite confident that there will always be an appetite for well-told stories. People skilled at telling these stories, through words or pictures or sounds, will have a place in our media environment. But I suspect that people will devote less time to consuming those stories than in years past. During the golden age of radio and television, people spent hours every day consuming content created by professionals. Increasingly, we spend more and more time on Facebook, Twitter, and other activities that don’t involve much in the way of professional content production (yes, I know, lots of conversations on FB & Twitter are about content produced by professionals, but still, most of the aggregate value of these sites, I would contend, is generated by the users and the creators of the venue, i.e., the technology developers). In thinking about the media technologies of the future, it’s hard to find a place for the writers, the producers, the directors. I’m sure there will be a handful of greats who produce content we all talk about, but perhaps a shrinking middle ground, and a shrinking window of attention and time we all spend consuming professionally produced content.

Aside from making my group role-playing project a bit more difficult to design, it's hard to think of a downside to this future. Definitely something I'll come back to in class.

Sunday, February 17, 2013

A media dieting manifesto

A majority of American adult Facebook users have taken a voluntary break from using Facebook, according to a recent Pew Research poll. Though only 8% of these people reported doing so because they felt they were spending too much time on the site, 38% of people 18-29 expressed a desire to spend less time using the site next year. So the desire to use less of some media to which we have access is there and, I think, it is likely to grow. But what are we doing about it, other than taking short breaks? Not much. Not yet.

I know it's a bit daft to speculate about media use in the future, but since this is a blog and not a job talk, here is my prediction: within the next 5 years, more than 50% of internet users in the United States over the age of 22 will use some form of self-restriction on media to which they would normally have access. Either they will use software restricting their use of the internet or phone, or they will follow a self-imposed schedule in which they do not allow themselves to use the internet, certain applications or websites, or their phone.

This is the start of an era in which we (adult Americans, perhaps others as well) look at our intake of information and social contact similarly to the way that we look at our intake of food. Food dieting is a billion dollar industry, one that is notorious for generating quick fixes that do not work out in the long run. This failure to generate lasting solutions to a public health problem is unsurprising to anyone who has reviewed the literature on habit and self-control. Habits are extremely robust; recidivism is the norm. Everything from cues in our environment to the chemistry of our brains causes habits to persist in the face of repeated attempts by the individual and others to change them. And yet sometimes, behaviors (particularly long-standing habits) change, permanently.

Though there are similarities between our developing view of media use as a kind of guilty pleasure or alluring, potentially addictive activity and our relationship with unhealthy foods, there are important differences. These differences, I believe, make it easier to alter media use behavior than to change our eating habits, provided we use the tools at our disposal.

First, our media use behavior is easier to passively track, far easier than counting calories. Chances are that right now, you could access information about how much and what type of media you are using just by looking at your browser history, on your laptop, tablet, or phone. More sophisticated tracking software is certainly available. That data may just make us feel guilty when we look at it, but if we know what to do with it, it can help us understand more precisely why we fail at media self-regulation and how we can change our use for the long term.
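To make the passive-tracking point concrete, here is a minimal sketch of what "looking at your browser history" could amount to programmatically. It assumes a Chrome-style History file, which is a SQLite database with a `urls` table containing `url` and `visit_count` columns; other browsers use different schemas, and the path to the file is left to the reader.

```python
import sqlite3
from collections import Counter
from urllib.parse import urlparse

def tally_visits(history_db_path):
    """Count recorded visits per domain from a browser-history SQLite file.

    Assumes a Chrome-style schema: a `urls` table with `url` and
    `visit_count` columns. Work on a *copy* of the file, since the
    browser may hold a lock on the live one.
    """
    conn = sqlite3.connect(history_db_path)
    counts = Counter()
    for url, visits in conn.execute("SELECT url, visit_count FROM urls"):
        counts[urlparse(url).netloc] += visits
    conn.close()
    return counts
```

Even a crude tally like this, run weekly, would tell a media dieter more about their actual consumption than any food dieter could learn about calories with comparable effort.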

Second, the same technologies that bring so many tempting, immediately gratifying options into close temporal and physical proximity can also deliver us from this problem. They provide a sufficiently motivated media user the opportunity to alter the timing and amount of access they have to many different kinds of media. As in the world of food consumption, the individual and the self-regulation industry are in a never-ending battle with those making and promoting tempting options. The more we try to regulate our environment, the more insidious their pitches become. But the fundamental malleability of new media, the bottom-up nature of Code, makes it more difficult for the purveyors of temptation to maintain a direct line to our Ids for very long than it is for advertisers in the still top-down universe of food production, promotion, and consumption in the US.

So far, those of us who have bothered to use or enhance these tools haven't used them in the most effective way. The first stabs at internet self-regulation technologies (SelfControl, Freedom, LeechBlock, StayFocusd) are all, in some sense, overcompensating for our newly empowered Ids. By totally restricting us from all media for a set time period, these programs (like the strategy of "unplugging" that seems popular among certain crowds these days) lead to reactance, and ultimately to workarounds (finding a computer that doesn't have the restricting program installed, justifying a little cheating here and there, etc.).
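The all-or-nothing logic these blockers enforce can be sketched in a few lines. The block list and the blocking window below are hypothetical, purely for illustration; the point is that within the window, access is a binary no:

```python
from datetime import datetime, time

# Hypothetical block list and blocking window -- illustrative only.
BLOCKED = {"facebook.com", "twitter.com"}
BLOCK_START, BLOCK_END = time(9, 0), time(17, 0)

def is_blocked(domain, now=None):
    """Return True if `domain` should be unreachable right now,
    mimicking the all-or-nothing schedule these blockers enforce."""
    now = (now or datetime.now()).time()
    return domain in BLOCKED and BLOCK_START <= now < BLOCK_END
```

Note what the sketch makes plain: there is no middle state. At 9:01 the tempting option vanishes entirely, which is exactly the deprivation that invites reactance.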

The way forward, I believe, is "nudging" (a la Sunstein and Thaler): designing our information environments so that they never deprive us of access to all tempting options, but instead present menus composed of both tempting options and less-tempting options that benefit us in the long term. Information and social contact would be available in various combinations on a regular schedule, so that we are never so utterly deprived of "fun" things that we break our diets. Each combination would be calibrated to the individual (customizability being another virtue of new media) to maximize not only productivity but happiness, social responsibility, or whatever the individual's long-term goals entail.
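As a contrast with total blocking, a nudge-style design might leave every option on the menu but attach a soft budget to the tempting ones. The class below is a toy sketch of that idea; the domains and minute budgets are invented for illustration, and a real tool would, as argued above, calibrate them to the individual:

```python
class MediaBudget:
    """Quota-style nudge: tempting sites stay on the menu but draw down
    a per-day time budget, instead of being blocked outright."""

    def __init__(self, daily_minutes):
        # daily_minutes: mapping of domain -> allowed minutes per day
        self.daily_minutes = dict(daily_minutes)
        self.used = {domain: 0 for domain in self.daily_minutes}

    def spend(self, domain, minutes):
        """Record `minutes` of use; return True while the domain's
        budget lasts, False once it is exhausted."""
        if domain not in self.daily_minutes:
            return True  # untracked sites are always available
        self.used[domain] += minutes
        return self.used[domain] <= self.daily_minutes[domain]
```

The user is never confronted with an empty menu, only with a gradually depleting allowance for the most tempting items, which is closer to portion control than to fasting.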

I make this seem simpler than it is. But that's typically what manifestos do, right? There is much research to be done.