
Tuesday, July 12, 2016

What Does Everyday Media Use Look Like? (and why this might bias our perception of digital media's effects)

Two brief anecdotes:

1: I’m sitting in Railroad Park in Birmingham, enjoying a breezy, temperate Saturday morning. A few folks pose for a photo that someone else takes with a smartphone. I guess I notice these people first in part because I cannot take a picture with my smartphone - the camera in it is broken. There is nothing like being robbed of something that you have taken for granted to get you to notice it more and to think about it more. I see a few other people in the park typing messages on their smartphones. Other people talk to one another face-to-face. Others stand around and take in the landscape, pet other people’s dogs, read a magazine or a book. And here I sit, typing on a laptop. What do we look like to each other? What kinds of assumptions might we be making about each other based on what we're doing in a public place with or without media technologies?

2: I'm enjoying a beer and a musical performance at Band of Brothers, a new brewery here in Tuscaloosa. The beer, the music, and the general vibe are all great. At the end of the performance, the lead singer, an elegant, attractive, charismatic young woman, sits down at a table near the stage, gets out her smartphone, and seems to instantly transform into a zombie, hunched over a tiny screen that illuminates her dead-eyed stare.

It isn't hard to find people who are concerned about excessive media technology use (in particular, excessive smartphone use). The most popular books and articles about digital media use take a pretty dim view of it. If there was a utopian moment in the history of such technologies, that moment seems to have passed. This recent commercial for Cisco feels like a relic of a time before we became so deeply suspicious of digital media (or perhaps it's a response to that suspicion: why else would we need a "pep talk" about the virtues of technology?).

Where do we get these ideas about media use? Or: why are we so keen to agree with, and so reluctant to question, those who provide anecdotal evidence of its ill effects?

Oftentimes, we are heavily influenced by our direct observations: we look at the people around us, the students in our classes, the family members at our cookouts, the friends at parties, the kids at our friends' houses, and the people in the park. Direct observation informs our intuition, and we seek out confirmation of that intuition in the books, blogs, articles, and documentaries we consume.

I've been thinking about how the relationship between our typical first-hand observations of media use and the actual fact of media use (whether or not anyone observes it) has changed over the years. This shift might influence our opinions of media use and its effects, and may in part account for the dystopian view that seems to dominate public discourse on the topic.

One change: media use has become more public, and hence more visible. People watched a lot of TV before, and teens used instant messenger a lot, but they engaged in these activities mostly in their own homes. The ascendant variety of media use, mobile media use (primarily on smartphones), is much easier to observe than TV viewing or AIM use on a home computer. Thus, we may think that there is a lot more media use going on than there was before, but this impression may be distorted by the fact that such use is simply more easily observed. Evidence suggests that average screen time is increasing, but from the way people talk about digital media, you would think we spent very little time staring at screens before the smartphone, or that smartphones have doubled the amount of time people spend looking at screens (the real change in average screen time among Americans over the past four years is probably around 7%).

There are, of course, older media that are used in public: books, newspapers, and magazines. But there is an important difference between print media and smartphones: with print media, it is easy for observers to know precisely what the user is reading, while with mobile media it is difficult to tell what the user is doing. Observers usually cannot see the screens of smartphone or tablet users, and even when they physically can, there is an expectation that they not look too closely. So observers know that the other person is using media, but they don't know how: work email, liberal news, conservative news, nearby places to eat, connecting with family members on social media, bragging on social media, looking up how to build a bomb, how to build a raised-bed garden, etc.

What they are doing with media really matters. If they're being social and supporting others while staring at the plastic rectangles in their hands, then this is very different from obsessing over how many people like what they've just posted on social media, and both of these things can be done on the same website or application (e.g., Snapchat, Facebook, Reddit). There is such a broad range of activities in which a public media user could be engaged. My sense is that anxiety arises simply because observers know so little about the others with whom they share space.

Then there's the question of what people would be doing if they weren't using their phones. How do the particular activities in which they are engaged on their smartphones stack up to those other possibilities? If they weren't engaged in those media use activities, would they be talking to other strangers face-to-face? Would they be reading a magazine? Would they have spent time in deep, productive contemplation? Would they have stood there and dwelt on a mistake they made the day before? Would they have stayed home and watched TV? I ask these questions not rhetorically, to assert that the fear and resentment so many feel about media use are exaggerated. I raise them because I honestly don't know, and because I believe that one cannot honestly say whether the fears about increased screen time and smartphones are justified without evidence that speaks to these questions.

What about witnessing the media use not of strangers, but of someone more familiar to us: a friend, spouse, parent? Our observations of strangers are largely free of interpersonal influences; whether two people at the park talk to one another or ignore one another and stare at their smartphones doesn't directly affect our interpersonal relationships with them or anyone else. It's simply a snapshot of behavior in our society. When we witness the media use of someone with whom we have some sort of relationship, there is an emotional component to our judgment of their use. Commonly, we compare their media use to one particular alternative: having a good conversation with us.

This view ignores other alternatives. If they didn't have a smartphone or laptop, perhaps they would have elected to watch television, or would have elected to leave the room and call a friend on a landline phone in the other room, or would have read a book, newspaper, or magazine. Perhaps we would have had all of their attention instead of some of it, but perhaps we would have had none of it. We only see that they are not talking to us, which is something we wouldn't see and be reminded of if they were not in the room with us.

Compare the experience of sharing a room with someone you know while that person uses a smartphone to the experience of watching TV with someone you know. In the case of the TV-watching friend, we know what our friend is watching; we're watching it, too. In the case of the digital media user, we're likely to fear the worst when we can't see what the other person is doing on their smartphone or laptop, and the fear (of not really knowing this person and what they're up to) likely has a greater impact on us when it is a close friend, spouse, child, or parent. TV can spark conversation, but then again, so can smartphone use. During my casual observations of smartphone use at bars and coffee shops, I've noticed frequent "screen sharing" behavior in which phone use serves as the impetus for conversation rather than an alternative to it. I've also participated in such "phone-aided" conversations at home with my wife.

When we see other people using smartphones and laptops, we feel ignored. We often compare the situation to ideal alternatives rather than making the effort to determine what the likely alternatives might be. We don't think about what the person is doing with the media (often because the expectation of privacy prevents us from knowing this). When we see the elegant, charismatic performer transformed into a hunched-back zombie, we feel a visceral repulsion. This is what we do by default. We then seek out justification for these feelings in anecdotes, books, articles, documentaries, etc.

Making any sort of correct judgment about the impact of media technologies on society necessitates that we recognize the ways in which we respond emotionally to the sight of other people's media use. By the looks of the most popular opinions on smartphone and laptop use, many of us have yet to take that step.


Tuesday, March 17, 2015

Instant Gratification & Digital Media: An Assumed Connection


At this year's South by Southwest Interactive conference, I attended a panel about the connection between instant gratification and digital technology. While these types of gatherings are great because they bring together people from so many different disciplines (education technology designers, academics, filmmakers, bloggers, etc.), this can result in conversations in which folks talk past one another rather than listening and responding to one another.

The panelists at this talk tended to fall into two camps: "hand-wringers" and "digital media apologists". The hand-wringers spent their time listing concerns about the ways in which overuse of digital technologies would lead to a society in which people could not delay gratification (which was assumed to be necessary for forging lasting, fulfilling relationships and for general social harmony). They relied on the growing body of evidence supporting the importance of gratification delay and grit (i.e., persistence in the face of multiple setbacks) in a variety of domains, including work and relationships. The apologists pointed out how the instantly gratifying digital media badmouthed by the hand-wringers (e.g., Twitter) connects and empowers formerly disenfranchised members of our society and gives rise to important social movements like #blacklivesmatter.

I kept waiting for a more nuanced discussion to break out, but it never happened. The experience did, however, make me think about how the conversation on this topic would benefit from some clarification of arguments and concepts. So, here are some starting places:

1. Is there solid evidence of any kind of link between digital media use and any of the effects discussed (namely, reduced attention span and reduced ability to delay gratification)? The connection between these things is assumed to exist by almost everyone. Even many of the digital media apologists assume that it exists; they simply believe there are positive effects as well. I've found a link among American college students between self-control and social media use, as well as digital video viewing, but I didn't find a connection between self-control and cell phone use. This data was gathered before smartphones became truly dominant, so I might find different connections if I replicated the study.

But what about this assumed connection between grit (or lack thereof) and digital media use? Has anybody even tested this yet?

2. Does this affect young people or every user? There is plenty of evidence to suggest that the habits we acquire as younger people affect our behavior later in life, and that the greater neuroplasticity of younger brains means that media affect young minds, habits, and other behaviors more profoundly. But it is possible that adults who start using digital media in adulthood are affected by it as well (specifically, that they experience a reduced ability to delay gratification as a result of heavy use).

3. Lowered attention span vs. Inability to delay gratification. Many people seem to conflate these two. Some experimental designs would conflate the two (e.g., an experiment in which people had to choose between reading for homework, which often requires sustained attention AND an ability to forego something more immediately gratifying, and a video game, which provides greater engagement and novelty as well as an immediate sense of accomplishment and pleasure). But it is worth testing these two things separately. It could be that digital media presents us with short bursts of information, and so it hurts our ability to concentrate on or pay attention to anything for a sustained period of time, and/or it may hurt our abilities to forego more immediately gratifying options for less immediately gratifying ones.

4. Hedonic experiences vs. habitual "empty" experiences vs. social surveillance. As the hand-wringers were talking about how digital media provides us with so many opportunities for feelings of accomplishment and affirmation and stimulation, I thought, "what about email?" Email seems to be one of the hardest habits to break, and yet almost everyone I know hates using it. It may be "gratifying" to check one's email in the way that scratching an itch may be gratifying, but I wouldn't call it pleasurable or hedonic. I'd imagine many people feel similarly about social media use: they don't like it, and they don't want to be doing it, yet they feel compelled to do it.

This gets me thinking about distinctions among things we have to do (like work), things we want to do (like reading a book or climbing a mountain), and things we end up doing (like channel surfing or frittering away time online). It also gets me thinking about the use of the term "addiction" in the media context. When we say that we are addicted to some kind of media use, maybe this just means that it's something we do even though we neither have to do it (like work) nor want to do it for the pleasure it gives (like having a blast with friends). Its value isn't immediately apparent in the way that the value of work or the value of hanging out with friends is. And yet, it could present us with some value: the value of social surveillance, of knowing where we stand with those around us, our family, friends, and co-workers. Email and social media provide us with relevant information about where we stand with these folks.

At the same time, there may be a "purely habitual" component to email and social media use. That is, through repetition, one might do it without thinking about what value it holds. It just is what you do when you pick up your phone, when you sit down at your laptop, or when you aren't otherwise engaged. There is evidence to suggest that when we aren't otherwise engaged, our brains "default" to self-reflection. Perhaps our seeking out of information on where we stand with others (i.e., the standing of our social self) is a symptom or a consequence of this kind of thinking.

5. Do the effects of digital media use carry over to non-digital contexts (e.g., eating), or does the inability to delay gratification assume that digital media is available at all times? When we talk about the poor decisions made by people with a reduced ability to delay gratification, is it because they are choosing some proximate, instantly gratifying digital option, or because digital media use has reduced their ability to delay gratification of any kind (not just digital kinds)? If it's the former, then simply taking the digital temptation out of the environment would immediately reduce the harm. But if the effects of digital media use manifest themselves in other domains, then changing one's ability to delay gratification would take longer, and you would need to remove the digital temptations from the environment for a longer period of time to change the individual's habits in all domains.

So, going to the talk made me want to think more carefully about how to test these connections. It also reminded me of how easily discussions of this topic can fall into something repetitious, resembling age-old battles between hand-wringing finger-waggers and apologists. Keeping an open mind going into the process of inquiry is essential, but so is greater specificity regarding the concepts and claims we are putting to the test.

Thursday, February 13, 2014

Remote Controls

This moment keeps nagging at me, demanding that I think about it and write about it. First, I must acknowledge the ways in which metaphors, or the likening of one moment in history to the present moment, can hinder understanding. By cherry-picking the ways in which the two moments are alike based on our preconceived notions of the fundamental nature of the present moment, while ignoring all the ways in which they are not alike (or the ways in which the present moment resembles some other moment in history), we don't move any closer to understanding our current moment. But here I use the past moment not as a means of comparison or metaphor, but as a way of identifying how certain trends in media use got started.

I'm speaking of the invention and popularization of the television remote control. The remote, along with the increase in the number of channels, marked a crucial lowering of the barrier to toggling among choices. It was possible to browse entertainment options before, but not quite as easy, and that shift toward easy browsing marked a change from comparing several options to one another to what I call entertainment foraging. Our experiences of using media in an impulsive manner, and the attendant feelings of guilt, grow out of this moment. The internet and mobile devices have merely extended the logic of the remote control to more moments and areas of our lives. Even when we stay on a single website like Facebook or Buzzfeed, we are often hunting or foraging for some unknown thing. We tend to think of media use as content consumption or connection with an other, as individual experiences: skyping with a friend, watching a video, spending time on Facebook. But I'm interested in the moments in between, the time spent looking for something, the time spent choosing, the proliferation of what you might call "choice points". It's the glue that holds together the other moments, but it takes up a lot of time, perhaps as much time as the moments themselves.

When I started thinking about media choice, I thought that the change from the traditional media choice environment to the new one was a change from deliberative choice (System 2, in Kahneman's terms) to impulsive choice (System 1). But eventually I came to believe that even when the options are few, if it's a matter of how you spend your leisure time, the stakes are very low, and so you make a quick choice. There isn't much at stake, so why deliberate? Even when the choices were few, we probably still chose impulsively or ritualistically, without much careful consideration. So perhaps our media choices were always mostly impulsive, but they were impulsive within many borders and restrictions, different borders and restrictions than the ones we have now. The options from which we chose leisure media experiences were limited by bandwidth and shelf space. The times at which we chose such experiences were limited by synced schedules and clear demarcations between work and leisure times and places. Without the borders, without the restrictions, the options have changed. When the options change (and this is highly counter-intuitive, but supported by a ton of empirical evidence), our choice patterns change. Increasingly, our impulsive choices, collective or individual, feed back into the system that generates the option menus. Our options, and our selections, are dictated by the impulsive self with less interference from the outside world. This doesn't bode well for our long-term selves, our ability to achieve long-term goals.

What can we do about it? What are we doing about it? There are new technologies that form a middle layer between media applications that offer us options and our impulsive choosing selves. I call these software applications, like Freedom or Self Control, choice prostheses. Are they effective? That depends. In some ways, use of choice prostheses resembles dieting, and most diets do not work in the long term. In other ways, they resemble choice architecture or nudges, which are more effective in changing behavior in the long term. This is the next step in my research on media choice: to better understand how choice prostheses work and how they might best be used to change our choices for the better. 

Sunday, October 13, 2013

Mobile media revealing our selves to ourselves

Mobile phones have many, many purposes: entertainment, art, communication, education. One of the more controversial applications of this technology is monitoring. When certain people have the ability to covertly monitor others while not being monitored themselves, there is a power imbalance. It is possible for those doing the monitoring to judge and punish those whom they catch doing something bad when, in fact, it is something the monitors do themselves, just where no one can observe them. I'm not sure this actually happens all that often, or will be as likely to happen in the future as most people believe, but for the time being, let's assume this concern about mobile media as a monitoring device is a valid one.

But what about self-monitoring? The possibility of using mobile technology for self-monitoring and self-feedback has not yet been fully realized, and I think it may help people overcome two significant obstacles to behavior change.

The first obstacle is becoming aware of the patterns in your own behavior: the environmental cues that cause you to unconsciously respond in a way that isn't in your long-term interests, the moods and thoughts that precede worse moods and thoughts. It will be tricky to use mobile tech to reveal these patterns without being too obtrusive. I've tried out some experience sampling technology on my phone, and it's hard to even get it to work right; when it does work right, it may just be too much of a nuisance to put up with. It feels like one more thing you have to do, like a diet, and almost all diets fail. But it's conceivable that you could design a less intrusive way of tracking thoughts, feelings, and behavior throughout the day.

The second is one I've been thinking about a lot: people do not like to be told what is good for them by others. They may be a bit more accepting of such advice if it's coming from a trusted expert (say, a medical doctor), but even then, it doesn't feel good to be told what to do. People will start to look for reasons to doubt the expertise of the advice giver. If we give people the tools to make connections for themselves and to use that new knowledge to alter their choice environment, there is no external force telling them what to do. If anything, they are telling their selves, their future selves, what to do.

We all do this already. We resolve to do things in the future that are in our best long-term interests but then fail to do so in the face of temptation or distraction. Mobile media, because it is with us at all times in all contexts, can be a tool with which we cope with temptation and distraction. Ideally, it will not proscribe particular ways of being, but will merely be a tool for individuals to closely observe and then structure their lives.

Modern existence necessitates being part of an increasingly complex (i.e., hard to understand) set of interactions with an increasing number of people. It is difficult to know why you feel the way you feel, or do the things you do. Designing an unobtrusive, secure self-monitoring application and using it in tandem with some choice-limiting technology is a way to exist in such a society. Potentially, mobile tech is a layer of technology between all those aforementioned applications - entertainment, art, communication, education - and the self who is equipped with the older "technologies" of the brain, eyes, ears, and nose.

Saturday, September 21, 2013

Choose your metaphors wisely

The term "digital detox" is now in the Oxford English Dictionary. This suggests a certain cultural awareness of the concept of media overuse which, to someone studying self-control and media use, is heartening. But I wonder about the choice of words.

"Detox"'s meaning, recently, related to food consumption, or the lack thereof. When you traded in the fast food for celery and refrained from consuming anything but tea, you were "detoxing", or "cleansing". But, of course, the term detox originally achieved notoriety when it was used in reference to addictive drugs like heroin. When I think of the term, I still think of it in terms of drugs.

Even beyond that single term, we are apt to think of new experiences in terms of older, more familiar ones. More and more, I hear people talk of Facebook or Candy Crush "addiction" which, again, conjures thoughts of hopeless alcoholics or strung-out meth-heads. Perhaps, in our effort to blame anyone but ourselves for the fact that we're unable to eschew immediately gratifying options in favor of activities that will help us achieve our long-term goals, we want to inflate the power of our indulgences, and we do this by likening our habits to the almost-literally-irresistible urge of the crack addict to smoke more crack.

If we have to understand our new experiences with habitual digital media use in terms of older experiences (and, as much as people love to bad-mouth metaphors as some sort of crutch that keeps us from seeing new things as they truly are, I think this is the only way anything new can be understood at all, at least at first), then food and dieting would probably be a better metaphor than drugs.

Information (that is, the content of all media, digital or otherwise) is like food (and unlike, say, cocaine): we need it to survive. You can't really say no to media any more than you can say no to food. Getting too much of it is bad, but not nearly as bad as injecting large doses of certain drugs. Many more people struggle with tempting foods than with tempting drugs, and I think media over-use and habitual media use should be understood as things that are as common and as benign as bad eating habits, not as rare and harmful as drug addiction.

As time goes on, we'll understand our relationship with digital media on its own terms. But until then, it's important to at least consider how our comparisons to other experiences (in both thought and language) affect our perception of threats and responsibilities.

Monday, September 02, 2013

Do you have a Candy Crush Habit or a Candy Crush Addiction?

Most of my recent research concerns media habits and/or what you might call "unconscious" media use. These are the times when we open up a tab in our web browsers and go to a website without thinking too deeply about why we're doing this or considering the long-term value of such an act. Over time, such behaviors become habits, and habits can be hard to break. It seems that if habits are sufficiently difficult to break, we call them addictions.

But I, like many others in the field of psychology, am not too keen on applying the term "addiction" to habitual media use. Why not? If you're playing Candy Crush Saga five hours a day and you feel unable to stop playing, what is the difference if we call this a bad habit or an addiction?

Well, I suppose it has to do with how our culture currently understands addiction. We treat it as a disease that requires professional intervention. We assume, as is the case with most diseases, that the afflicted is not responsible for their affliction and that it is unlikely that they can get better on their own. They need help. This diagnosis is well-meaning in the sense that when people are going through something bad (and the feeling of being unable to stop doing something is usually bad) it would be worse to heap the extra guilt that comes with responsibility for their current condition (and for altering the condition) on top of their existing troubles. In addition, professionals have years of experience dealing with addictions and decades of research to help develop systems for fighting addiction.

But there's something that's lost: the individual's sense of self-efficacy, the sense that they can do something about the behavior. In some cases, self-efficacy can be an important part of altering habitual/addictive behavior; an individual finding a way to change their own behavior may be more effective and efficient than sending all of those individuals to professionals and/or through a series of institutions.

As more people find themselves with habits/addictions to games like Candy Crush, it's important to address the following questions: What role does self-efficacy play in breaking habits? If we call the habit an addiction, does this diagnosis reduce the person's sense of self-efficacy, thereby making it harder (or perhaps more expensive) to quit?

This isn't to say that simply labeling this kind of behavior a "habit" is without drawbacks. People may not take the threat that their behavior poses as seriously if they call it a habit (most of us have bad habits, after all). And even when we do call the behavior addiction, we're increasingly liberal in our use of that term, which waters it down (much like the term "stalking", which is now used in a casual, everyday sense).

So, whether we call this kind of behavior "addiction", I think, is not just a matter of semantics. It is possible that diagnosis affects self-efficacy, which affects likelihood of behavior change. Whether you call it addiction or habit, the end game should be the same: understanding how people stop doing things that they, at first, feel they are unable to stop doing. But it's important to recognize the role of words and diagnoses in that process.

Wednesday, July 20, 2011

Puppies & Iraq


I just saw Page One, a documentary about the New York Times, which raised some interesting (if oft-repeated) questions about journalism that come along with the financial instability of the industry: is there something about a traditional media outlet like the NYTimes that is superior to the various information-disseminating alternatives (news aggregation sites, Twitter, Facebook, Huffpo, Daily Kos, Gawker, etc.) and, if so, what is it? What is it about the New York Times (or the medium of newspapers in general) that would be missed if it were gone?

Bernard Berelson asked a similar question in a study of newspaper readers who were deprived of their daily newspaper due to a workers' strike in 1945. The reasons people liked (or perhaps even needed) the paper back then - social prestige, as an escape or diversion, as a welcome routine or ritual, to gather information about public affairs - are all met by various other websites and applications, some of which seem to be "better" - that is, more satisfying to the user - at one or all of these things than any newspaper is.

I want to pick apart this idea of what is "more satisfying" to the user, or what it means to say that people "want" something. The mantra of producers in the free market, no matter what they're selling, is that they must give the people what they want. Nick Denton of Gawker has a cameo in Page One in which he talks about his "big board", the one that provides Gawker writers with instant feedback about how many hits (and thus how many dollars) their stories are generating. Sam Zell, owner of the Tribune media company, voiced a similar opinion: those in the information dissemination business should give people what they want. Ideally, you make enough money to do "puppies and Iraq" - something that people want and something that people should want. To do anything else is, to use Zell's phrase, "journalistic arrogance".

Certainly, a large number of people are "satisfied" with the information they get from people like Denton and Zell. But Denton and Zell, like any businessmen, can only measure satisfaction in certain ways: money, or eyeballs on ads. There are other costs, often long-term and social, paid when people get what they supposedly want. When news is market-driven, the public interest suffers. So goes the argument of many cultural theorists. But who are they to say what the public interest is? Why do we need ivory tower theorists to save the masses from themselves?

Maybe that elitist - the one who would rather read a story about Iraq than look at puppies - is not in an ivory tower but inside all of us, along with an inner hedonist (that's the one that would rather look at puppies all day). There are many ways to measure what people like, want, need, or prefer. I'm not talking about measuring happiness as opposed to money spent/earned. I'm considering what happens when we're asked to pay for certain things (bundled vs. individually sold goods) at certain times (in advance of the moment of consumption vs. immediately before it). There is plenty of empirical evidence to suggest that those two variables, along with many other situational variables external to the individual, alter individuals' selection patterns. Want, or need, or preference does not merely emanate from individuals. When we take this into account, we recognize that a shift in the times at which individuals access options, and in the way those options are bundled together, ends up altering what we choose. We click on links to videos of adorable puppies instead of links to stories about Iraq because they're links (right in front of us, immediate) and because they've already been paid for (every internet site is bundled together, and usually bundled together with telephone and 200 channels of television). If it weren't like that, if we had to make a decision at the beginning of the year about whether we "wanted" to spend all year watching puppy videos or reading about Iraq...well, I guess not that many people would want to spend all year reading about Iraq. But I reckon many people would choose some combination of puppies and Iraq if they had to choose ahead of time. The internet is a combination of what we want and what we should want, and so is the NYTimes, but they represent a different balance between those two things. The Times is 100 parts puppies, 400 parts Iraq. The internet is 10,000,000,000 parts puppies, 100,000,000 parts Iraq (or something to that effect). When you change how things are sold, you may not change what people want, as many theorists claim, but you do change how we measure what people want.

Maybe we never have to defer to a theorist to tell us what we should be reading or watching in order to be a better citizen. Maybe we just need to tweak our media choice environment so that it gives the inner elitist a fighting chance against the inner hedonist.

Monday, July 04, 2011

The Ethical Issues of Analyzing Time, Desire, and Self-Control

My tentative dissertation project (becoming less tentative as my defense date draws closer) has to do with time, desire, and self-control. One basic premise of the project is that each of us has short-term desires and long-term desires, and that these desires are often in conflict with one another. We might say that "part of us" wants to eat that chocolate cake or spend time on our favorite leisure website, and another "part of us" wants to eat less fat and carbs and spend more time working on projects, exercising, or volunteering. This, in and of itself, doesn't seem that controversial.

Through parents/caregivers and the education system, most people learn at an early age the consequences of too-frequently indulging their short-term desires. The more immediate, painful, and affective the negative consequence, the easier it is to convince yourself to refrain from future indulgence. Even before our parents/caregivers, evolution gave us in-born, visceral reactions to things that are good for us in the long run (eating nutritious berries = yummy!) and things that are bad for us in the long run (eating poisonous berries = vomit). But evolution doesn't provide the fine tuning, and in a fast-changing, complex environment, our consequence estimations need outside assistance. A different kind of convincing-of-the-short-term-self needs to happen when the feedback isn't visceral and immediate.

People have been smoking tobacco for roughly 5,000-7,000 years, but it wasn't until the last hundred years that large numbers of people knew that it hastened their death. Of course, most ancient smokers died of other ailments; lifespans weren't long enough for them to die of lung cancer. Once lifespans grew long enough, and once scientists had found a connection between smoking and cancer, a large number of people who would have enjoyed smoking in the short term stopped or cut down (or at least felt guilty), because they had been informed by some trusted "other" that doing so would bring about long-term benefits. This isn't just self-control. It's informed self-control.

In some ways, this is the role of culture in general: to produce informed self-control (Freud's super-ego). We've all got the easy behavioral imperatives figured out: don't eat stuff that makes you puke; avoid situations that evoke terror. Rules exist because some of us (or all of us, under some circumstances) may be inclined to behave in ways that are prohibited by those rules. Rules are not so much "made to be broken" as made to correct what was "broken" about our perceptions of consequences. For better or worse, this has become the domain of doctors: first physicians and perhaps now psychologists and psychiatrists. They make rules based on observations of seemingly disconnected actions and consequences. They are experts in consequences. Did psychologists, educators, or scientists aspire to the role of rule-maker? Probably not, but they're a necessary by-product of a complex world in which our finite senses can't keep track of the many connections between actions and consequences. To believe otherwise is to succumb to nostalgia for a bygone world.

Things get messy when we get personal about our analysis of time, desire, and self-control: media use (my area of research) and, even more personal, marriage and sex. There have been some terrific articles and commentary about marriage and fidelity in the wake of Anthony Weiner's virtual infidelity and NY's passing of a gay marriage law. A defining characteristic of marriage is the pledge of individuals to stick together. It's an attempt by the long-term-thinking self to override future short-term-thinking selves so that the long-term self can benefit. But who is informing that long-term-thinking self? What is their evidence? What is their agenda?

This leaves us with an uncomfortable reality: those who can demonstrate the negative long-term consequences of things you know are pleasurable in the short term and believe are not harmful in the long term are telling you what to do, and people tend not to like being told what to do. For good reasons, too. Those in positions of power abuse it for their own gain. If I own stock in a cookie company, I'll fund research (and coverage of research) suggesting that another indulgence is particularly harmful, leading people away from that indulgence and toward cookies. Similarly, certain relationship experts might promote a certain view of monogamy because they benefit from its success in the marketplace of ideas, not because it's any more accurate at predicting negative consequences than any other theory. The same might be said of a media effects researcher. Those who reject the findings of so-called experts analyzing this complex causal world can simply blame another aspect of that complex world that isn't under their control, freeing their short-term selves from blame. If people who aren't in long-term, monogamous relationships aren't happy, it's not because they couldn't exert the self-control recommended by experts; it's because they're being judged by an unfair, retrograde society intent on maintaining a certain kind of social order. If people who play lots of violent video games are more aggressive, it's because you measured "aggression" wrong or because of some variable the researchers didn't control for. Basically, this leaves everybody believing what they want to believe, deferring to no one, and assessing consequences based on personal experience and the limited experience of those around them.

Since I don't want this entry (or anything else I write) to be an empty exercise in hand-wringing, I'll suggest some priorities for research and reporting on research.

We'll have to move from a proscription paradigm to an explanation paradigm, one that is supported by replicable empirical evidence. It is best to demonstrate how to find the links between short-term behavior and long-term consequences, to let people "see for themselves" as much as you can. Our society has become more complex, making it difficult to see the connections. Much of the study of the world, in science and the humanities, has become equally complex: full of impenetrable jargon and statistics. We've got to make explanations clearer, better educate ourselves so that we have some basic fluency in these languages, and support an education system that helps students understand how to find connections for themselves. Yes, we live in an extremely complex world, but the good news is that we've just scratched the surface of how technology can be used to explain concepts, patterns, and connections to large numbers of people in a customizable, individualized way, for free. Behavioral scientists and theorists might be at the forefront of finding patterns in behavior across time, but they can't maintain the trust of the public unless the public can see for themselves.

Not only can the public see for themselves, but maybe they can do the restricting themselves, too. We've all got a conscience. We just don't have the societal restrictions to assist it, and physically/temporally proximate temptation makes it harder to listen to that voice. We're not all tempted by the same thing, so the restrictions really shouldn't be one size fits all. If people can design their own restrictions, you avoid the possibilities of reactance and the totalitarian manipulation of taste that inspires it.

So, I'd like to run an outside-the-lab experiment to provide evidence that supports my dissertation hypothesis, but I'd also like people to be able to try the experiment on themselves, to plug in their own individual variables.

Tuesday, October 12, 2010

Portable Technology: Size, Time, & Weight


My new computer - a netbook - has me thinking about how the physical characteristics of a device can influence how I feel about it and then what I do with it. I'm not talking about technological affordance - what the software/hardware allow me to do - but rather how the size and weight of the object influence my feelings about it.

My first working hypothesis: the smaller the device, the more "handy" it is, and the more it will be suited for short bursts of use. It's hard to bust out anything bigger than my hand when I'm on the go, on a bus or walking around campus. Also, the use of these smaller devices for longer periods of time seems somehow fatiguing. Trying to block out all the other sensory information while concentrating on a smaller screen for a prolonged period of time is more difficult than concentrating on a larger laptop screen. For these reasons, smaller devices = shorter duration of use sessions.

Then I thought about whether weight has anything to do with use. I don't bring my laptop everywhere I bring my netbook because of the weight of my laptop. It's not so much that it's literally too heavy for me to carry, but that it feels burdensome. I'm constantly reminded that it's in my backpack. If I had a super-light MacBook Air, I might feel better about bringing it more places because I wouldn't feel burdened by its presence.

So far, I'm finding that the netbook is making me more productive because I can "sneak up on myself" and start working on a project. This is inspired by a project I'm embarking on regarding study habits and affirmation (with Emily Falk and Elliot Berkman), related to my dissertation work on self-control and virtue/vice media habits. Basically, if I think about going to a place (usually my office in my house), sitting down, and doing work, I don't feel good about it, and I tend to avoid that place. But if I can take the "place" of work out of the equation, if I can get to work as impulsively as I can engage in time-wasting leisure activity, if work becomes as accessible as play, then I think I can get to work before I have a chance to dread it. At least that's the way it worked today.