Monday, December 06, 2010

Getting attention


I'm sitting here in a lecture. I should be listening, but instead, I'm writing. I'm doing this because I just thought of a flip side to the debate over attention span, distraction, and the use of networked media by young people. Even if we ban the laptops in lecture, their/our minds might have been altered in some way that makes lectures like this pointless.

Sure, maybe people who use networked media a lot are accustomed to instant gratification (the emphasis being on "gratification": experiences that offer pleasure). And yes, media experiences are briefer than ever before.

But maybe this just means they're engaged in the expression and consumption of important ideas in a more efficient manner. Rather than thinking of networked media users' listening, reading, and thinking skills as hobbled by networked media, we might see these users as having become more efficient at finding and using relevant information. Thus, when they are forced to sit through a linear lecture full of information that is not relevant and cannot be searched or scanned, they tune out, not necessarily out of boredom or lack of discipline, but because they prefer a more efficient flow of information.

So we sit here, slackjawed, unable to concentrate on a lecture. Whose fault is this: the speaker's or the listeners'? My hunch is that it's a little bit of both. Of course, those who use networked media a lot would tend to think the problem is with the old fogeys who still lecture in the same way and insist that we consume information in the same way we did hundreds if not thousands of years ago. Those who do not use networked media (the tendency to use such media more likely being an effect of age cohort and/or income than an effect of holding any ideological disposition; the ideological disposition is a justification for an existing behavior, not the other way around) would tend to think that young people cannot think with the same depth and rigor and are thus intellectually stunted by their near-constant use of networked media.

I think what we need to isolate are the types of media experience or content. If consumers of information choose indulgent experiences over virtuous ones, then they are at fault. But if there isn't a significant difference in the virtue/vice value that one seeks out when using networked media, then perhaps producers of information should be more efficient in the manner in which they convey it.

I keep returning to the questions: What does it mean to be efficient communicators? What is the goal of communicating information? There is a category of media use referred to by Uses and Gratifications researchers as "information seeking." It's time to start scrutinizing that category and splitting it up into smaller ones. So much of what we do now is information seeking, but clearly, some of it is in our collective long-term interest and some of it is not. Some of what might look like dawdling is really the kind of creative play that leads to new ideas. And some of it really is dawdling.

Two approaches might help us tell the difference between the two: experiments that restrict people's access to parts of the vast quantity of information and detailed tracking of people's work using networked media, tracing the paths of new ideas to certain types of free-association information seeking.

Monday, November 29, 2010

The Medium is the Message: Blog



Reading this recent article in the New Yorker on blog magnate Nick Denton prompted me to revisit McLuhan's adage that the medium is the message. I do not take this phrase to mean that content has no significant effect beyond the medium by which it is accessed, but rather that the medium can influence what content people end up choosing and thus alter the media experience. I also take "medium" not to mean all internet applications but to mean various websites or applications that allow users access to information in unique ways.

Denton's blog empire, which includes Gawker, Gizmodo, and Deadspin, features a subtle but important tweak to the standard blog format (eloquently encapsulated by Ben McGrath as "links with commentary, presented in reverse chronological order"): it presents statistics on how many people have read and commented on each article. Denton also pays his writers based on the number of readers their articles garner. There is a kind of herd mentality in readers: they read what is popular, and the more they do this, the more popular those particular articles become. It is worth asking whether the content preferred in this system is different from that of a system where writers and editors take it upon themselves to design the format and layout of an online publication (as is the case with traditional blogs and online news sites like the NYTimes). In the New Yorker article, there seems to be agreement among journalists on both sides of the traditionalist/new wave divide that such a format leads to a more "sensational" news preference.

This way of presenting news seems somehow more democratic than traditional editor-selected alternatives. It sounds as though it provides more choice, and that would be better, wouldn't it? In a way, Gawker's innovation is similar to what Google or Reddit or Digg do: they take tons of raw information and they privilege the most popular links. That judgment as to what is worthwhile is made very quickly by users. They scan headlines and if the headline is interesting enough, they click on it. The assumption that this is somehow good rests on some other fundamental assumptions worth questioning:
  • We are being held back by gatekeepers (editors) from what we really want. We are used to having the type of content or experiences we have access to being controlled by gatekeepers who may not have our best interests at heart. As long as we can choose for our "selves", we'll be fine. This is, in turn, based on another assumption:
  • As individuals, we are single, desiring, decision-making selves with stable, unique sets of desires.
Returning to the "medium is the message" mantra, we could consider the difference between the concerns over television and the concerns over the internet. The case against television was always a case against content. Watching television was bad because most of what was on television was so bad. By contrast, the case against the internet is a case against its users. There are wonderful, amazing, informative, pro-social, productive things one can do on the internet: create and run a business or a charity, disseminate useful, accurate information, take a course online. If people fail to do these things, it's not because there is any lack of positive options from which to choose, so it must be their fault.

Denton gives people what they want, and it's not the caliber of people that he's polling that leads to the move away from hard news toward gossip. It's the fact that he's accessing our inner hedonists, our immediate selves.

Tuesday, November 02, 2010

Discipline & Privacy


I was fortunate enough to have the opportunity to showcase some of my ideas on networked media, selection habits, and immediate gratification (which is shaping up to be my dissertation topic) in front of some esteemed colleagues from the psych department here at UMich. The great part about giving talks like this is that it can get you to look at a problem you've been staring at for way too long in a new way.

Something Ed O'Brien said prompted me to return to the issue of privacy. Research suggests that we keep our immediate desires (or inner hedonist, if you will) in check in various ways: by making a habit of things that we don't like doing but know will help us in the long term, by taking temptations out of our immediate grasp, by having reminders of what will happen to us if we fail to curb our immediate desires, or by exercising good ol' fashioned self control. There's also another thing that prevents our inner hedonists from taking over: public censure, or the judgment of others. We don't fill our faces with chocolate cake, fritter away our time on leisure pursuits instead of working, cheat on our husbands/wives, or buy boatloads of porn for various reasons, but one of them is likely that we will be looked down upon by others. We will be publicly embarrassed. But what if you could do all of those things in private, taking the punishment or barrier of public censure and embarrassment out of the equation?

I posit that new media allows us to do just that. We can do so many things that we would have been judged for without being judged because we can do them in private. In many cases, this is good. We can pursue our true passions without repressing who we really are for fear of some sort of cultural judgment. But in other cases, the selves that are being repressed by public scrutiny are just our inner hedonists. The inner hedonist is just one aspect of a self, an aspect that has been kept in check by public scrutiny for probably all of human history in one way or another. Give that aspect of the self space to do whatever it wants and you're going to get behavior that does not reflect what you consider to be "your self." You will get behavior that reflects one aspect of your self.

Here's the counter-intuitive part of this. Most people think our privacy is being eroded by networked media, that more and more of our lives are on display. There are two problems with this line of reasoning. First, though that information is out there and it is attainable, we must look at who is attaining it, how it's being used, the extent to which we know about those things, and the extent to which that feeds back to alter our behavior. I would submit that most of the information about our online habits either sits unused or is used in some way that ultimately doesn't embarrass us or punish us directly or indirectly (say, to show us various kinds of ads), and thus does not alter our patterns of behavior. It's all fine and well to quote Orwell and come up with possible future scenarios in which our data is used by others to alter our desirous behavior, but I haven't encountered any evidence that indicates that has happened or is happening.

Here's the second problem. We never had the opportunity to do so many things that we can do now with networked media. Any number of things that feel good in the moment but are frowned upon by society (be it cheating on your wife/husband, learning how to build a bomb, trading bigoted jokes, etc.) were just too logistically difficult to pursue. It would be too much of a pain in the ass to find the right people or goods to carry those things off, and you had to sneak around to do them. These things are much, much easier to do now with networked media. We didn't do them partly b/c we would not have had privacy when we did them and thus we would have been punished for doing them. Now, we can use the privacy that we should expect (privacy regarding our bank statements, voting behavior, private conversations with family members) to cloak the privacy that we never had (privacy regarding things that we know we shouldn't do but do anyway, not because they don't line up with the current public's social mores, but because they conflict with our own long-term goals). Online privacy is expected to cover a wider range of activities, some of them fine and some of them not so fine, than old school offline privacy. Is privacy eroded by these technologies? The technology allows it to be eroded, but I don't see that happening.

So if the eyes of the public can't keep your inner hedonist in check, then who can? Your inner disciplinarian, of course! And various kinds of technologies (like the program Freedom, or the Me timer or time-tracking software for Firefox) can do something that has never been done before with such efficiency and to such an extent: they can take your self at one point in time (the self that sits there not in direct proximity to temptation, well aware of what it should do) and allow it to exert power over your self at another time (the self that is surrounded by temptation and requires the discipline of some outside force to prevent it from acting). Think of it as a more nuanced, self-imposed chastity belt.
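To make that present-self/future-self contract concrete, here is a minimal sketch in Python of the commitment-device logic such tools embody. Everything here is invented for illustration (the state file, the function names, the sites); this is not how Freedom actually works, just the core idea: the planning self imposes a lock that the tempted self cannot lift early.

```python
import json
import time
from pathlib import Path

STATE = Path("block_state.json")  # hypothetical file recording the commitment

def impose_block(sites, hours):
    """The planning self commits now: these sites are off-limits until the deadline."""
    state = {"sites": sites, "until": time.time() + hours * 3600}
    STATE.write_text(json.dumps(state))
    return state

def is_blocked(site):
    """The tempted self checks later; the rule holds until the deadline passes."""
    if not STATE.exists():
        return False
    state = json.loads(STATE.read_text())
    return site in state["sites"] and time.time() < state["until"]

# The self not in proximity to temptation sets the rule once...
impose_block(["facebook.com", "espn.com"], hours=8)
# ...and the later, tempted self finds the door locked.
print(is_blocked("facebook.com"))  # True within the 8-hour window
```

The key design point is the asymmetry: imposing the block is easy, but there is deliberately no `lift_block` function, so the only way out is to wait for the deadline the earlier self chose.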

The most important thing to realize is that the inner hedonist was always being kept in check, but he was being kept in check by forces (i.e. the public, the government, mom, dad, etc.) that didn't always have our best interests at heart and had their own incentives for altering our behavior. We've found ways around them, but are we really any better off with our inner hedonists running the show? Now we can discipline our inner hedonist while preserving our autonomy and privacy. That's a big deal, and that's what I'll most likely be studying for the next couple of years.

Tuesday, October 12, 2010

Portable Technology: Size, Time, & Weight


My new computer - a netbook - has me thinking about how the physical characteristics of a device can influence how I feel about it and then what I do with it. I'm not talking about technological affordance - what the software/hardware allow me to do - but rather how the size and weight of the object influence my feelings about it.

My first working hypothesis: the smaller the device, the more "handy" it is, the more it will be suited for short bursts of use. It's hard to bust out anything bigger than my hand when I'm on the go, on a bus or walking around campus. Also, the use of these smaller devices for longer periods of time seems somehow fatiguing. Trying to block out all this other sensory information while concentrating on a smaller screen for a prolonged period of time is more difficult than concentrating on a larger laptop screen. For these reasons, smaller devices = shorter use sessions.

Then I thought about whether weight has anything to do with use. I don't bring my laptop everywhere I bring my netbook b/c of the weight of my laptop. It's not so much that it's literally too heavy for me to carry, but that it feels burdensome. I'm constantly reminded that it's in my backpack. If I had a super light MacBook Air, I might feel better about bringing it more places b/c I wouldn't feel burdened by its presence.

So far, I'm finding that the netbook is making me more productive b/c I can "sneak up on myself" and start working on a project. This is inspired by a project I'm embarking on regarding study habits and affirmation (with Emily Falk and Elliot Berkman) related to my dissertation work on self control and virtue/vice media habits. Basically, if I think about going to a place (usually my office in my house), sitting down and doing work, I don't feel good about it, and I tend to avoid that place. But if I can take the "place" of work out of the equation, if I can get to work as impulsively as I can engage in time-wasting leisure activity, if work becomes as accessible as play, then I think I can get to work before I have a chance to dread it. At least that's the way it worked today.

Wednesday, October 06, 2010

Flow 2010 Conference - Serial TV Narrative Wrap-up


Attending this year's Flow 2010 Conference reminded me of a few things. It reminded me how far I've wandered down the social science path of communication studies. It reminded me of the value of attending round-table conferences where the focus isn't on simply presenting your findings in a concise format, but rather on thinking out loud with many other informed individuals. Social science is often a matter of incremental gains in general knowledge. Where do the significant changes in theory and thinking about a topic come from? I think they come from places like the Flow 2010 Conference (or the blogosphere?).

I presented a brief position paper at one of two panels dedicated to serial TV narrative. My basic point was that a narrative doled out in semi-regular intervals over the span of years feels like the interactions we have with other people in a way that narratives doled out by other means do not. Something about reading a novel or watching a new Harry Potter movie once every 3 years feels inert and lifeless; I think some of that has to do with the evolution of TV stories over time (like our relationships with people evolving over time) and some of it has to do with our lack of total control over the pace and content of the story/relationship.

The discussion in the first panel got me thinking more about these gaps in between story parts. That seems to be a defining characteristic of narratives on TV as they are broadcast. What happens in those gaps? Janet Staiger noted that the gaps allow for extended periods of speculation and hypothesizing as to what is going to happen next in the story. The internet makes this kind of speculation collective, thereby heightening fan community, and I think this collective hypothesizing helps manage some of the anxiety people experience waiting for the next installment. The gaps also create the possibility for the creators of the story to adjust some of the elements in the story to respond to real life events (e.g. the incorporation of anti-terrorism federal agents in The Sopranos and The Wire post-9/11) while retaining a pre-established overall story arc (see Lindelof and Cuse's account of how they wrote Lost).

It's also important to note the regularity or irregularity of the gaps and how that affects the audience's experience of the story. I don't think it's the existence of gaps that makes audiences anxious so much as the irregularity of those gaps. Once I come to expect intervals of absence of certain durations at certain times, be it in a story I'm enjoying or a relationship of some sort, then I shape my expectations to conform to this. But if the intervals are of indeterminate length (is the show coming back? when is it coming back?), people get antsy, or at the very least, they are somewhat detached in their devotion to the show. I like the idea of looking at Facebook newsfeeds as narratives, ones that evolve with us and are relevant to us. We have choice over when we view them (although the bundling of these feeds together and with other sub-applications like party-planning makes it difficult to say "I want to check out how this one particular person's narrative is progressing"), but we can't choose when they update, and typically people update irregularly. So perhaps we cannot be as devoted to an evolving Facebook feed narrative as to a fictional TV narrative.

The second panel (in particular Ryan Lizardi's comments) got me thinking about the ideological potential of serial narratives, what I think Ryan referred to as a "long drip of ideology." There's something about a TV show, something about that frequent reinforcement of an idea over time and the ability to get everyone talking about a certain topic in a certain way, that has the potential to change people's minds about things. I provided two examples at the panel (The Wire changed how I thought about inner-city blight; The Up Series changed how I thought about success and happiness), but upon further reflection, I watched both of those on DVD, not as they were broadcast. I suppose I'm not saying that other forms of narrative (TV on DVD, novels, movies) can't alter the way people think, but that TV as it is broadcast has the potential to change the national conversation about a topic (again, the water cooler).

Serial TV narratives may have the power to shift people in a positive direction, to help them consider a topic from a new point of view. But what do we lose when more narratives become increasingly serialized? I think we potentially lose the diversity of sampling different kinds of content (this is the "commitment" issue Bordwell brought up on his blog when explaining why he doesn't write about TV). If I have to watch 10 hours of a show and then watch it every week to be part of the conversation (which is the case with serial TV narrative), then I can't use that time to try out other content. Hence, there is less overlap in what we watch. This leads to a loss of a type of discourse about TV and, by extension, culture. Readers or DVD viewers can plow through a serial in a few days. Watching a show as it is broadcast requires a kind of protracted, scheduled commitment, and the talk about the show in between episodes is part of that commitment. So, the serialization of narratives may be part of the social fragmentation phenomenon that comes with the expansion of media options, a phenomenon other social critics have feared.

There is, of course, much more to say about serial TV narrative, but these are just a couple of ideas that I picked up from the panels that I wouldn't mind expanding on.

Tuesday, September 28, 2010

Is this a Big Deal? Hysteria, Social Media, and the News


I am watching, listening, and reading (and now writing) about a news story as it unfolds: a gunman has shot himself in the undergrad library of UTexas. The story is especially salient to me because I'm going there for a conference tomorrow, I spent two years there as a Master's student, and I know a fair number of people who live there, some of whom attend the university.

This event (which may be in the past or may still be occurring, depending on whether reports of a second gunman who has yet to be apprehended are true) came to my attention through reddit. I woke up, fixed breakfast, watched a little Sportscenter on ESPN, and then opened my computer to go online. Although I have my browser homepage set to the NYTimes, I rarely start a day by opening a browser but instead click on a bookmarked page from my already-open browser, usually checking my email or going to reddit first out of habit and out of a desire for self-relevant information (email) and appealing, brief distractions (reddit). So if, say, some event of great consequence were to happen, it is likely that I would find out about it this way. But if it were truly of GREAT consequence (e.g. 9/11), I would've found out about it either on NPR (which I flipped on when I was barely awake, as is my habit) or ESPN.

So there was the headline on reddit: Shooter at University of Texas library RIGHT NOW. Rather than click on the link or the comments, I went to NYTimes to corroborate it. Oddly, the story didn't appear to be there. I thought that the shooter maybe didn't kill anyone, or that it was a hoax. But then, buried in smaller headlines on the page, I saw mention of it. I read that story and it said that the gunman fired shots and killed himself, but no mention of anyone else being hurt. I then turned on CNN and, oddly, they weren't talking about it. Neither was Fox News. Other news stations were playing commercials.

I didn't have all the details of this story, but the fact that it wasn't plastered all over the front page of the Times and that CNN wasn't covering it at that moment led me to this conclusion: it is not a big deal. Because of the lack of coverage, it is extremely unlikely that anyone I know was hurt in any way as a result of this event. It would be safe for me to assume that everything was fine with them, everything was fine with me going there tomorrow, and that everything would, in general, be fine.

This is not the picture that I got when I went to Facebook and Twitter. Many people in my online social network have connections to UT, and so many of them posted links to news stories about the shooting, just within the last hour, with very little supplementary information other than brief declarations of fear or concern. With Twitter, I had to search for Austin and UT, but the result was the same: brief expressions of fear, rumors of a second shooter still on the loose. From this information, it seemed like a big deal, a huge deal. Reading my Facebook feed, I was...well, not necessarily gripped with fear or concern, but I got the impression that I needed to keep monitoring the situation out of concern for people I knew and to see if I would still be traveling to Austin tomorrow. If I watched mainstream media, I wouldn't get that feeling. I would get the feeling that I could relax and go back to work.

There's this narrative that mainstream media is too big, too ad-supported, to get at the truth. The truth percolates up from the blogosphere, from citizen journalists. If this story isn't covered by CNN or the NYTimes right away, it's because they're slow-moving dinosaurs who just got scooped, who don't know what's really important but instead promote the values of their sponsors. But here, in terms of "level of appropriate concern," I think that the mainstream media has it right. If I step back and look at the situation, at least what I know of the situation (it seems like an isolated, botched shooting, not a coordinated, successful attack by several people), the appropriate reaction is: don't panic, it's okay, no one is in more danger than they were a day ago. I just don't get this from the little social network newsfeed that's evolved over the past few years. There have been legitimate criticisms of coverage of disasters like 9/11 and Katrina saying that certain frames used in the coverage gave viewers the impression that there was more panicking, and more reason for panicking, than there really was. I've had this feeling before about bias in the news, but I think it applies here, too: you thought this was a problem when mainstream interests were handling reporting the news? Just wait until 'the people' start doing the reporting.

I think that in some cases, crowdsourcing news helps information get out faster, and that the crowd can, counter to our intuition, promote accurate, relevant information over inaccurate information more efficiently than the MSM. But the emotion that seeps into our Facebook feeds, the undiluted fear that cannot help but contaminate our accounts of our experiences and is mistaken for "news," doesn't move us any closer to gaining an accurate impression of what the true risk is to ourselves and the people we know and love when a bad thing happens. It's self-relevant and is provided by many different people, and thus appeals to us as "truth," but it's laden with emotions that aren't present in detached MSM accounts of events. One may say that those emotions are part of the truth of an event. That sounds good, but I think that emotional reporting of an event as it is happening is likely to be associated with overestimates of how threatening the event is to ourselves and others (I'll avoid using the loaded term "hysteria").

One thing that really stuck with me about the coverage of 9/11 was the strangely detached monotone of many of the anchors as we all watched the World Trade Center towers collapse. Certainly not the way Twitter would've handled it. I feel like this change has happened in the past couple of years and that it is significant: many people no longer turn on the TV when disaster strikes. They go online. Online promises more viewpoints, more immediate updates, deeper and broader accounts of everything. Which is better? Which is truer? Which leads to worry and which leads to action?

Tuesday, September 14, 2010

The hidden choice of establishing your media environment or repertoire


When we think about any choice we make – whether to eat that apple, move to Omaha, or avoid writing our dissertation and check the headlines instead – we typically think about it in two ways: as an in-the-moment decision or as a general tendency. But there’s an in-between step that’s missing from this picture, a decision or series of decisions that preceded the one we’re examining: the decision to shape our media information environment. If characteristics of our environment affect the comparisons and decisions we make (e.g. how many options there are, how easily accessible they are, what their characteristics are), then we must ask how we ended up in the choice environments in which we find ourselves. Increasingly, I think we’re partially responsible for them, though we tend to forget this quickly.

When we decide to text someone or check our email or go on Facebook or do some work, that decision is shaped by earlier decisions: the decision to have the internet in our house, the decision to keep our cell phone by our side. These decisions feel obvious because they bundle so many different kinds of options together. You cannot give up the distraction of Facebook because it comes bundled with access to the internet, which you need in case a work-related email demands that you respond promptly or lose professional standing. Technologies that allow users to block certain websites or applications from their own future access seem initially absurd, but really, they represent an act of un-bundling. Sometimes you want access to social media; other times, it is not in your best interest. Such prioritizing and self-restriction will become necessary in a distraction-saturated media environment.

Today’s media environments are often characterized by the extent to which they are saturated by distractions: the vibrating cell phone, the regularly updated blogs waiting to be scanned, the pinging email inbox. We might instead consider the extent to which they are saturated by temptations, and how we might make a distinction between these two terms. All temptations surely are distracting, but are all distractions tempting? The term “distraction” is used to refer to messages that expect our particular, unique attention at a specific time (e.g. email correspondence, text messages), and it is used to refer to messages or characteristics of the environment that are generally directed at us (as members of a larger group or as individuals) but are always there, not needing our attention at that moment. Facebook, ESPN, and that chocolate chocolate-chip muffin just sit there, waiting for my self-control to ebb, at which point I will succumb to temptation and indulge in a distraction.

When I give in to the temptations – Facebook, ESPN, muffin – I feel responsible for the choice, as indicated by my feeling of guilt upon indulgence. But when I check, read, and respond to a work email, I feel no sense of responsibility. This is simply a condition of our collective environment! So, when we say we live in an era of distraction, we fold these two kinds of distractions together: the temptations (i.e. vices) and the expectations of immediate attention attached to ostensibly virtuous activities like work. We do this in part because of the way media is sold: in bundles of virtue and vice.

And yet I don’t think we should let ourselves off the hook that easily when it comes to work-related email or other virtuous online distractions. We have some agency in designing our media choice environment. We could’ve decided to put our cell phone on silent or not checked our email for 8 hours. One might complain that to do this would be to give up social and professional standing, and I think that if you just one day decided to do it, without giving the people you are in contact with any warning, then you would certainly give up social and professional standing. But if you simply gave the people around you the expectation that you are unreachable sometimes (either explicitly by saying that you will be “at work” during certain hours or implicitly by purposely delaying your responses to them), you could reduce the number of “necessary distractions” in your life. We have the tools and the ability to unbundle our lives but we have to acknowledge the self that can alter the media choice environment, the repertoire from which our future selves select an activity. That self has more and more say over what we end up doing with our time.


Saturday, August 14, 2010

Tattoos as Human Meta Tag


This is a bit more random than my usual posts, but I was just watching TV and an ad came on for the unfortunately named Lugz shoes (the male version of Uggs?). It caught my attention because of the way it combined sex appeal and violence: a man wakes up in bed next to a sexy dame and thinks about an MMA fight he had, the ad cutting back and forth between the sexy woman, the dude sitting in bed remembering whilst pondering what shoes to wear today, and a brutal fight. I didn't have the sound up loud enough to hear, but the message seemed to be that men earn their women through fighting other men...and these men wear Lugz. It was hard to tell if the guy in the ad was just some guy or a particular MMA fighter. Without any mention of his name, it would be hard to google him to find out. However, he happened to have a huge, ostensibly unique tattoo across his chest: BROWN PRIDE. How many guys have that tattoo? I googled the tattoo and, sure enough, I came up with the guy from the ad, who is, indeed, an MMA fighter: Cain Velasquez.

I'm sure the same thing happened with Tupac, or would have if Google had been more prominent when he was alive. It reminded me how text-based search (and the internet) still is. I keep waiting for facial recognition to get better, faster, and more accessible. Someday, I'll just be able to point my iPhone at the TV or at someone walking down the street and get their name and whatever else I want to know about them. Until then, I suggest we all get unique phrases tattooed in prominent places.

Tuesday, August 03, 2010

Discussing Inception


After seeing Inception for the 3rd time, I've become almost as interested in the way people discuss this film as I am in the film itself.

I first became interested in the discussion around psychological puzzle or "mindfuck" movies when I wrote a chapter of my thesis on the IMDB discussion of Mulholland Drive. I found that the arguments fell into two distinct categories:

Those who thought the film's creator was trying to create a coherent fictional universe in which cause-and-effect applied vs. those who thought the movie was like an abstract painting. The first group thought that the plot was like a puzzle and that if you understood which parts were "real" for the characters and which parts were dreams, fantasies, or hallucinations, and you understood how the "real" lives of the characters influenced those dreams/fantasies, then you could "understand" the film. The second group thought either that the artist is having a laugh at the audience by making them think that the art is somehow "important" b/c it's so inscrutable or that it has real beauty which cannot be analyzed the way a cause-and-effect narrative is typically analyzed.

In a way, the same thing is happening with Inception (based on my observations of a very limited sample of audience reactions: my friends and people on reddit). But I think b/c Inception has all the markings of a traditional Hollywood film as opposed to the art-house pedigree of Mulholland Drive and b/c the film's characters spend so much time explaining how everything fits together, most people debate what the final reality (for the characters) of the film was and/or whether the creator of the film was successful in his attempt to create a coherent fictional reality. Those who loved the film seem to believe that it all fits together and those who didn't like it either felt that it didn't all fit together or that its construction - its attempt to weld together an action film, a psychological puzzle, and an emotional story of loss and acceptance - is clumsy and draws attention to itself as artifice. Here's what I'm interested in: why do people fall into those groups?

Here's my pet theory. People think that some characteristic of the plot's coherence is to blame/praise for why they liked/disliked the film. They argue as if that were the case. But really, it's their ability to emotionally connect with the characters and some pre-existing disposition towards the type of movie they see Inception as (a summer blockbuster, a Christopher Nolan film, a smart film, a trying-to-be-too-clever film) that determine how hard they look for those flaws. This categorization and emotional identification with the situations and characters in the film, in some sense, precede viewers' desire or attempts to piece together what exactly happened in the fictional reality of the film, even though we're not aware that they do. With Inception, as with many sci-fi and/or mindfuck movies, there certainly are inconsistencies or blind spots to find. Those who were pre-disposed to liking the film could acknowledge these, but they would probably say that they don't matter, that it's all part of the willing suspension of disbelief, that all stories must leave out some information, and that the rules of the genre demand that we overlook some un-reality of the film's world. Those who were pre-disposed to disliking the film would call these "plot holes" and treat them as a kind of empirical evidence that the film is poorly constructed.

So when people have debates over whether the film made sense or what happened in the reality of the film, they're really just reflecting whether or not they identified with the film and whether they typically like the type of film that, before they entered the theater, they had categorized the film as.

I have this hunch b/c I try to imagine the relatively minor things in the movie that people talk about being altered (the spinning top falling over/not falling over, a bit more explanation here or there, things going a bit differently with the kicks, etc.) and what the people having these arguments might think of that altered movie. I am highly skeptical that their opinions would change. They talk as if these minor changes would alter their opinions of the movie, as if their arguments for why this movie was good/bad rest on these features of the film, but this seems ludicrous to me.

Perhaps it's easier and safer to talk about one's like or dislike of a movie in this way. It's really hard to be aware of how or why one categorizes a film as a certain type. Also, when you talk about which characters or situations you identify with, you can be revealing something very personal about who you are. Any statement against the film could be construed as a statement against someone who identified with the characters, situations, or sentiments in the movie, which would cause a whole lotta friction.

It should be obvious which group I fall into by now. But the more I think about the multi-layered pleasure I get out of this film, the more I think, "why can't I experience this more often with more movies, TV shows, songs, etc.?" We talk about these things as if it's the artists' responsibility to bring us a pleasurable experience, but I think we as audience members have some control over this, more than we think. If I'm arguing this for Inception, I'm arguing this for all art. What if I had gone to see Eclipse instead? My knee-jerk reaction would have been to hate it and to argue why this was a bad film. But I wouldn't have to have done that. I could've acknowledged the fact that I categorized the film as "awful teen chick romance" and understood that, on the face of it, I probably wouldn't identify with the characters, situations, or sentiments expressed in the movie. But that doesn't mean I couldn't have found something positive in the experience, or that I should have to wait for an artist to cater to my desires or for a fan to convince me of the work's merit. Everyone could love everything, and if loving something is more fun than hating something (I know that hating films, TV shows, and people can feel good, but as someone who really loves Inception, I can tell you that loving feels better than hating), why shouldn't we try? I think my arguments for the greatness of some works are arguments for enjoyment in general, arguments against not enjoying yourself at a movie.

Thursday, July 08, 2010

Hate the sin (use of media) and love the sinner (the media/the user)


Urban dictionary, the open-source authority on popular phraseology, defines the phrase "Don't hate the playa, hate the game" as:
"Do not fault the successful participant in a flawed system; try instead to discern and rebuke that aspect of its organization which allows or encourages the behavior that has provoked your displeasure."

This phrase popped into my head several times over the past few days. I've been visiting home, and each time I speak with people outside of communication/media studies about the media, I usually end up on the receiving end of a diatribe against "the media." Although "the media" is blamed for pretty much every social or psychological ill one can conceive of, it might help to focus on one recent example: the over-coverage of LeBron James's free agency. The problem, as I understand it, seems to be misplaced priorities. Why are we giving so much airtime/webspace to something so frivolous when other matters (e.g. global warming, the BP oil spill, the economy, Israel/Palestine) are clearly more pressing? Similar arguments have been leveled at reality TV, human interest news, soft news about television shows, celebrity news, etc. Usually, the people to blame are "the media," and the people who care about such things are helpless, ignorant addicts and dupes.

The argument is countered by many cultural critics or those with some reason to defend such fare (e.g. those in the entertainment business, fans, etc.) on the grounds that it serves as the basis for discussion and debate of important cultural mores. LeBron's free agency decision is about loyalty, reputation, and avarice. The Jersey Shore is about our love/hate relationship with our own bodies and those of others, classism, and ethnic identity. Sandra Bullock's divorce is about the meaning of family, sex, and marriage in 21st century America. Pretty much all so-called frivolous media fare is, in some way, about romantic love: how we define it, how we find it, how we keep it. When people put down these debased forms of culture, so the argument goes, they are performing an act of cultural elitism, holding preferred forms of discussion and debate of mores in high esteem because they were created by rich, white, heterosexual American men and not because those forms are inherently superior. To use another colloquialism: haters gonna hate.

I'd like to offer a third viewpoint on LeBron coverage, Jersey Shore and its coverage on the news and in the blogosphere, and coverage of Sandra Bullock's divorce. Yes, despite their apparent frivolity, they all contain elements which could, and indeed do, lead to what anyone might recognize as productive dialog about important issues. But they also contain elements that lead to negative outcomes: narcissism, hostility towards out-group members, and poor self-image, for starters. If we can't get to a point where we recognize these possible negative and positive outcomes of the content, then we can go no further in discussing whether any type of media is good or bad. But if we acknowledge those ground rules, then we can move forward. What determines whether one who watches or reads this stuff gets something positive or something negative out of it?

I think the answer lies not in exposure to the content or the mere existence and availability of the content, but in use: quantity, level of engagement, and motivations for use. Plenty of media effects research bears this out. If you read a lot of tabloids, comment on blogs about celebrity gossip, and do so after a hard day's work (showing signs of an "escapist" motivation as opposed to an "information seeking" or "social" motivation) and you exhibit higher levels of narcissism and lower levels of self-esteem and civic knowledge than someone who reads the same content but less often and for other reasons, well then, you've got something.

My hunch is that the people who use media in ways that end up being associated with negative outcomes have poor impulse control and trouble delaying gratification, and that these attributes were established early on in life. If they weren't watching too much celebrity news, they'd drink too much or spend too much time on Facebook, or overeat. They can still train themselves to steer clear of things that trigger the undesired outcomes, but first they have to recognize the links between the behavior and the undesired outcomes.

Let's get back to LeBron. I suppose members of the media are culpable on some level, in that each time an editor leads with a story about LeBron or another person tweets about it (making all tweeters members of the hated "media"), they make it easier for everyone to pay attention to LeBron and ignore more serious matters. Regarding the "if you don't like it, change the channel/website" counter-argument: this assumes that people freely choose what to watch or read (the old "rational agent" fallacy) when, in fact, they watch or read what is easiest to access. If all of the easy-to-access sources concern frivolous matters, it becomes more likely that many people (even very media-savvy people) will find themselves accessing more and more of this fare and exhibiting undesired outcomes for reasons unknown to them.

But if your objective were to curtail the negative outcomes of narcissism and the lack of civic awareness and engagement, trying to stamp out "frivolous" media seems like the wrong way to go about achieving it. Better to establish the links between certain kinds and amounts of use of certain kinds of media content and agreed-upon negative outcomes. Give these facts to people in language they can understand and in metrics that they care about: how does this affect your happiness, your lifespan, your ability to earn money? Let them make their own decisions. Consumer-driven change helped drive the recycling movement as well as the organic food movement. Why couldn't it change what we see in the news?

Tuesday, June 29, 2010

Paying for Liveness


One big stumbling block for television surrogates like the various incarnations of webTV, Apple TV, netflix, and hulu was the failure to recreate the relatively passive, lean-back, couch potato experience of sitting on a sofa and watching something on a big screen. Viewing something on your laptop makes it difficult to give oneself over to a semi-passive viewing experience. The opportunities for distraction go beyond other content that one may channel-surf through. They are different experiences, extremely personal and almost endless in variety: posting a comment on Facebook, reading a blog, checking your email. The laptop offers pleasure, but not the same pleasure as what we refer to as "television."

Finally, some of those TV surrogates are moving beyond that stumbling block and on to our TVs. The evolution of netflix and now hulu towards easy-to-install view-on-your-TV versions reveals something about value and the definition of television. All three platforms offer much of the same content, yet all three are based on different pay structures (cable = lots of ads, higher fees; hulu = some ads, lower fees; netflix = no ads, lower fees). While it should be noted that the content libraries are not exactly the same (netflix and hulu offer deeper, broader catalogs but do not have sports or news), it may seem as though they offer the same product, more or less. Why would anyone pay more for more ads and a smaller catalog of titles?

Basically, the consumer pays for live-ness, either with ads or money. This applies especially to sports and news, which lose value immediately after they're aired, and we might consider any kind of soft news (gossip) part of this. But even with shows that do not need to be viewed live in order to be enjoyed, there is some added value in being able to view them as they air. Wanting to watch a program as it is aired isn't just a matter of impatience. Being able to discuss the show with others matters, and it's easier to do this if everyone is viewing the show simultaneously.

Conversations about shows taken from a vast catalog (either netflix or hulu now) have a different tenor than those about television. They're often simply attempts to convert people who haven't seen the program or attempts to describe what happens in the show to the uninitiated. The catalog is just too vast for a lot of overlap in people's viewing experience. TV's appeal lies in its limiting of the available choices as well as its temporality. It's also something that doesn't really require our careful consideration. Like a more personal version of a newspaper, it should just blurt out what's happening so that viewers can talk about it. More reality TV, more sports, more direct address.

Saturday, June 05, 2010

Panic/Panek & Social Media Use


As I write, there is a tornado warning in my area. I have taken the appropriate precautions: I'm in my basement away from the west and south walls. And I brought my laptop. Aside from being the most valuable of all my possessions, my laptop (assuming it's connected to the internet, which it is) is quite a valuable thing in this situation.

First, I should say that I have very little personal experience w/ tornadoes. My closest brush was when I visited a friend in Iowa and woke up to an ungodly, sustained siren. Half-conscious, I thought the world might be ending. My knowledge of tornadoes - the odds of them occurring, the damage wrought - was informed by Weather Channel specials and mediocre Bill Paxton movies. It turned out they were just testing the siren and there was no tornado at all.

So, I had no personal experience w/ this danger, and the second-hand knowledge I had was from unreliable sources. There's an air of paranoia that comes w/ the over-preparedness that gets drummed into our heads by authority figures legitimately concerned for our safety. I would assume that this leads us to believe that the odds of encountering a life-threatening earthquake or tornado or hurricane are far higher than they actually are. Am I a fool to be in the basement?

Anyway, back to my laptop. I first knew about the warning via an almost comically old-fashioned medium - the siren. I then went online and tried to go to the weather channel website, which wasn't loading, so I turned on the TV and indeed there was a tornado warning. I googled "tornado safety" to confirm my hunch that being in the basement was a good idea. I felt like an idiot for doing this, partly b/c it suggested that I was dependent on technology and possessed no horse sense of my own. It's only a matter of time before I start asking google whether or not I'm in love or where I should move.

Then I went to Twitter, which, aside from being a terrific way of knowing what David Alan Grier had for lunch, really is a great resource for immediate local news and/or reaction to an event. The only other time Twitter was of use to me was when my phone wouldn't send/receive texts. I wanted to know whether it was just me or whether the network was down, and Twitter told me what was up. Similarly, now, I want to know if I'm a fool for being in my basement. Twitter tells me that, at the very least, I am a fool in good company.

It's funny to imagine people like me, in a basement, who feel the need and have the ability to broadcast something to the outside world about their unusual condition. For every remarkable tweet or blog post or YouTube video of an earthquake, a tornado, or an insurgency, there must be a growing number of false alarms, dispatches from intact basements.

There is something absurd about someone in a time of genuine peril using social media, kind of an "I know I'm about to die a horrible death, but just let me tweet about it first" thing. But it's not as absurd as it might initially seem. It's a natural instinct - to reach out, to try to connect w/ as many people as possible. 9/11 happened before the rise of web 2.0, mobile access to social networking sites, etc. People on the planes called the people they loved and left poignant messages. If they had the means, why wouldn't they also want to connect w/ other people they loved?

In reality, these dispatches provide valuable information to others and serve a primal instinct to connect when one's demise (however unlikely) might be near. I think of them as moments in which our survival instincts overcome social mores (the "boy, is this lame" view of the actions) and do something that we "should" be doing.

Now, with the oldest medium of all - my own two ears - I have detected that the storm has passed. That's how they did it in the old days, I suppose, before Twitter, before the emergency broadcast system, before the sirens. Weather.com still says there's a tornado warning. How long do I stay in the basement? Do I believe my ears or do I believe the website?

Sunday, May 23, 2010

Last Thoughts on Lost

I never really liked Lost all that much. Every now and then, I would become immersed in it the way anyone can become immersed in a good story, but mostly I enjoyed it as a popular experimental narrative. It is the type of show that makes one reflect on one's own journey, so I struggled to recall when I started watching it: while I was in Austin, off of bootlegged DVDs my roommate scored in Hong Kong. Since then, I've changed. Maybe I've become less tolerant of sentiment, or at least I look for it in different places (Eternal Sunshine of the Spotless Mind and the music I love give me all the deeply emotional experiences of memory that this show only began to give me). I've become more cerebral, less ashamed to engage in long debates about the possibility of time travel and social science experiments, and there again, I've found other places to scratch that itch (the internet, more school).

If we left behind the goal of loyalty or fidelity to a show's initial spirit, the flexibility of written-as-they-go narratives could be a strength: they can grow with or grow apart from the audience and the creators. Most shows seem to become victims of their own success, losing the original thread and exploiting the trust of the audience. What Lost added to this was the transparency of the audience's reaction to the show as it went along. Everyone could see the reactions to each episode. The debate over how good or bad the show was was itself affected by viewers' immediate knowledge of others' opinions. I would think this would hinder a show from growing or changing.

What seems more significant is the ways in which my TV watching and media use habits have changed since I started watching the show, along with everyone else's. When I started watching, the show, like pretty much every other show, wasn't easy to watch online. For the last season, I watched most of the episodes online, delaying watching as long as possible, reading synopses when I didn't feel like watching. For the series finale, I watched as a friend skyped with her sister hundreds of miles away, the laptop open, half of our attention focused on the show. In other words, I half-watched the show, something that wasn't possible then in the way that it's possible now. I did this not because I can't give TV shows my full attention anymore (I gave my full attention to Mad Men and will continue to do so next season). I only half-liked Lost, only cared about the philosophy and the scientific experiments, and mentally or physically tuned out when things got sappy. The show seems as ripe as any for a re-edit.

Then there are all the strange things happening outside of the show on ABC, the things that reminded me that I was watching television and not sitting in a darkened theater. The emotional ending gave way to a teaser for the local Detroit news: real people, really dead. Then, Jack Shephard walks out on the stage of Jimmy Kimmel's show for some incongruously semi-serious analysis of the show. Kimmel seemed unclear on whether to play things straight or crack wise, as was I while the last episode was airing (should I make a snarky comment? Should I cry?).

I suppose what I liked most about the ending was that it ended with a death. That's the only real ending, right? The end of an individual consciousness. And the more I think about it, it really reminded me of Donnie Darko: an airline-related disaster that does/does not kill the protagonist, a protagonist who engages in what may or may not be a prolonged hallucination about time travel, after which he finally comes to terms with his own death and, in a way that is uncharacteristic of most Western narratives about death, dies peacefully, fondly recalling the lives he touched. Both stories tap into some cultural preoccupations: post-9/11 anxiety about flying (DD was made before 9/11 and was oddly prescient; its popularity might have something to do with the cultural resonance of plane crashes after its release), considering the implications of recent developments in philosophy and physics, and trying to make sense of death and memory (these last are more universal, long-standing preoccupations).

Wednesday, March 10, 2010

Living while distracted


As I walked hurriedly down South University on my way to catch a bus, fishing my headphones out of my jacket pocket as I went, I passed an older man who was standing still. He was holding his hands behind his back, idly gazing in the window of an ice cream shop while students whizzed by him on their way to class. His idleness struck me - what might he be thinking about all the people around him? In particular, what might he be thinking of me or any of the many students fiddling with iPods and phones? The answer is: probably nothing (I tend to assume people are thinking about me when they're not, so egocentric am I), but it made for a productive jumping-off point for some musings on media distraction.

Media is an exceptional tool for distraction, perhaps the best that has ever existed. It is always available. It presents significant variety in terms of songs we can listen to, people we can text, shows we can watch, profiles we can check out. We can use it to distract ourselves (actively choosing the content or people we access) or we can be distracted by it (being confronted by people or content we didn't seek out). Usually, it's some combination of the two: going to a site that links to some unexpectedly interesting other site, texting one person and getting a text from another, etc. All of it serves to take our minds off whatever we happened to be thinking about or feeling before. It distracts us.

Distraction has a bad reputation. First, it is other people's doing: we wouldn't choose to be distracted, but more and more, we are confronted with a distracting environment, cluttered with advertisements and solicitations. It prevents us from working efficiently and from thinking deeply or for sustained periods of time about problems, the kind of thinking that helps us solve those problems (either personal or societal). It reduces us to attention-depleted pleasure junkies, incapable of reflection.

OK, fine. I'm not necessarily doubting this curmudgeonly way of thinking about our modern world, a view I probably falsely ascribed to the stationary man staring into the window of Stucchi's ice cream shop. But I think there is pleasure in distraction (or at least the potential reduction of pain). It's not something that advertisers forced upon us.

Jonah Lehrer cites the research of Walter Mischel on self-control, finding that the key to preventing yourself from indulging in something you desperately want but should not have is distraction (this is a gross oversimplification of it, but whatevs). Children were presented with a short-term reward - a marshmallow - and told that if they held out, they could receive a larger long-term reward (two marshmallows). Some kids were able to hold out, others were not. Those who were able to hold out, on average, went on to be more successful in life, having fewer behavioral problems, higher SAT scores, and better jobs.

What helped those children delay gratification wasn't that they desired the reward any less, but that they were skilled in the art of "strategic allocation of attention." In Lehrer's words, "instead of getting obsessed with the marshmallow—the 'hot stimulus'—the patient children distracted themselves by covering their eyes, pretending to play hide-and-seek underneath the desk, or singing songs from 'Sesame Street.' Their desire wasn’t defeated—it was merely forgotten." It isn't just that the children were distracted and that they resisted temptation b/c of this, but that they understood how their minds worked. They engaged in meta-cognition and were able to make themselves think of something else to help them hold out for a long-term reward.

In other words, self-imposed distraction was good. It helped prevent the children from indulging in immediate gratification and helped them hold out for long-term rewards, which, as it turns out, is a big part of succeeding at life. Could we then think of media as an aid in this process, a super-effective tool for self-distraction? When we are tempted to indulge in some immediate gratification that will hinder our ability to succeed in the long run, if we pop on our iPods or distract ourselves with an episode of Glee or text a friend, all other things being equal, does that help prevent us from indulging?

There are other effects of this kind of media use, ones that may hinder our ability to achieve our long-term goals (lowered attention span, expecting positive and novel stimuli at the push of a button at all times, etc.). But it is interesting to consider this particular incentive for using media: as a way of distracting ourselves from things that we should be distracted from, things that it would be in our best interest to forget about for a bit. If you distract yourself from working through an issue you have with, say, your father or your job, this isn't good. You should take time to reflect on those problems and think about how to resolve them in a way that benefits all parties. Otherwise, they will fester and grow bigger. But if you distract yourself from your immediate desire for the proverbial marshmallow, this may just help you hold out for long-term rewards. No amount of reflection on that desire will help you. It's just rumination, perseveration. It helps no one.

How, then, to tell the difference between the two, between escapist distraction and beneficial distraction?

Monday, January 18, 2010

Substitutes


In today's NYTimes, David Carr posits that one reason for the depressed ratings for The Tonight Show over the past couple of years is that the internet - specifically, sites like TMZ, blogs, YouTube, Facebook, Twitter, and even forwarded emails - serves the audience's desire for snarky commentary on current events, which is most of the reason why people watched late-night TV. The internet does so more efficiently, tailoring content to individual interests and providing it anytime people want, even right after the events happen, not to mention the fact that it doesn't have Broadcast Standards and Practices looking over its shoulder.

Maybe this is why ratings are down, and maybe it isn't. All we have are the Nielsen numbers, which indicate that, yes, people are watching less late-night TV and going online more, but we don't know if that's because Jeff Zucker screwed up when he switched Leno to 10pm and put Conan at 11:30, because people are satisfying their desire for snark online, or some combination of the two. But the possibility raised by Carr provides an ideal example of why scholars and critics cannot make claims about how an audience/user responds to a medium, or even to a media text (like The Tonight Show), that would hold true regardless of the other media options available to consumers. Certainly, there's a need for in-depth commentary that applies in-the-moment to an audience's relationship to a genre or a show or a medium at a given time, and certainly, it is impossible to anticipate what the mediascape will look like in 2 years, let alone 10 or 20, but there is an alternative to simply making claims about texts or genres or a medium: making claims about media experiences.

In an era with fewer media options, you could describe the audience's (or your own) experience with a medium such as television and assume that when someone read your analysis, that person would know what "television" was and would've experienced "television" in ways similar to the ways the audience did at the time the analysis was written. You could describe an audience's experience with a show or a genre, and even if the show or genre weren't available to the reader, the reader could find an equivalent in his or her own time and place. If I read a piece about what The Tonight Show meant to viewers in the 1960s, I would imagine that viewers of 2005's Tonight Show, or perhaps viewers of any late-night comedy show, might feel similarly. What Carr suggests is that although the show is still called The Tonight Show and hasn't radically altered its content, its relationship to its audience is not the same. Therefore, anything we said about the appeal of The Tonight Show 20 years ago doesn't really apply anymore.

In his essay, Carr says, "The show hasn’t changed, we have." I don't think this is quite right. We stay more or less the same. We experience the same basic set of emotions, our minds are just as capable or incapable of deciding between various options, and we relate to one another in ways we related to one another thousands of years ago; we are creatures of habit. To use the example of The Tonight Show, we're still in need of funny commentary on current, shared cultural events on a regular basis. So really, in the ways that count, in the ways that determine what media we choose and the ways we act in life, we have not changed.

So, if the show hasn't changed and we haven't changed, what has? The ways in which we are connected to each other have changed and will continue to change at a rapid pace. We're left with an environment that won't sit still long enough for us to make claims about the role of any one of its elements - be it a technology or a TV show - that will still be relevant to the world 10 years from now. Also, there are simply too many texts for any scholar or critic to keep up with. If Twitter or Facebook feeds are the functional equivalent of late-night comedy TV for some people, how do we get access to this information, and how do we preserve it? We can't.

What we need is a way of identifying and categorizing media experiences that is not contingent upon a text's or medium's relationship to the other options available at a given time - or rather, one that builds that relationship into the claims it allows scholars to make.

To stick with The Tonight Show example, here are two ways of making claims about people's use of that show (these claims are probably not true; they're just an illustration of the kinds of claims one could make):
  1. People who watch The Tonight Show do so because they want to relax, they want to stay informed about current events, they want to be entertained, and they want something to talk about with other people. We know this because they tell us. We also know that it's true especially of people who watch that particular show, because people who watch other shows do not rate these reasons for viewing as highly. We also know that watching the show is associated with higher levels of political knowledge and better relationships with others.
  2. Given a set of options that is roughly equivalent in monetary and temporal costs, if people are in a certain mood, if they have been encouraged to engage in an experience habitually by the way in which the experience is made available (once every night, 5 nights a week, year round), and if that experience allows them to relax, stay informed, be entertained, and maintain bonds with others, they will choose that media experience. That media experience, so long as it contains commentary related to current events, will result in higher levels of political knowledge and better relationships with others.
The first set of claims does not address the shifting media landscape. Yes, one can make sound claims that were true at the time, and maybe the person who reads the analysis 20 years after the fact can adjust the findings in his/her mind to suit the subsequent media landscape, but there are ways for the media scholar/critic to make claims that do not require such adjustments. The second set of claims is about a media experience, one that was chosen from a number of options that were harder or easier to access, and was chosen because the user was in a certain mood. The 20-years-later reader can substitute any media experience for the one mentioned in the study. That reader knows which qualities of the experience are important - current-ness, cost, regularity, the ability to provide relaxing entertainment.


The categories we're using now - genre, medium, text - either won't exist or won't mean what they mean right now. The state of rapid media change does not require that we be able to see into the future; no one in any other branch of science is any more capable of seeing into the future. It just requires us to concentrate on other variables, ones whose levels remain relatively stable in number. An incomplete list:
  • emotions
  • motivations
  • the number of options one can consider at a given time
  • monetary and temporal costs
  • whether or not some experience appeals to our ideology
  • whether or not we identify with characters
  • ease of use
Theoretically, we can locate any possible media experience somewhere along the spectrum of each of these variables, and we can determine which prior emotional states those experiences are associated with (thereby predicting usage patterns) and what the subsequent effects of such experiences are.