Friday, February 23, 2007
Lifelogging: How We Forget
Just read this fascinating article in the Chronicle of Higher Ed about lifelogging: recording audio (and eventually, video) of every moment of your waking life. You can then search through that data the way you would search through your memory, only it would not decay the way your memory does. Would this be a good thing?
I have kept a personal journal for 11 years now. I've made an effort to record my thoughts and many events from my life, no matter how embarrassing or mundane. Now that I've got 2,000 pages' worth of data, I can search for a person's name or an emotion (e.g. hate, love, crush) and analyze my life, my behavior, and my consciousness in ways that my decaying memory doesn't allow me to. So how is this any different from lifelogging? Isn't it just a matter of degree?
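(For what it's worth, here's a rough sketch of the kind of search I mean, assuming the journal lives in plain-text files named something like journal_1996.txt through journal_2007.txt; the file layout and the term list are just illustrative, not how my journal is actually organized.)

import glob
import re
from collections import Counter

TERMS = ["hate", "love", "crush"]  # emotions (or names) to track

def mentions_per_file(pattern="journal_*.txt"):
    """Count how often each tracked term appears in each journal file."""
    counts = {}
    for path in sorted(glob.glob(pattern)):
        with open(path, encoding="utf-8") as f:
            text = f.read().lower()
        # whole-word matches only, so "love" doesn't count "glove"
        counts[path] = Counter(
            term
            for term in TERMS
            for _ in re.finditer(r"\b" + re.escape(term) + r"\b", text)
        )
    return counts

if __name__ == "__main__":
    for path, tally in mentions_per_file().items():
        print(path, dict(tally))

Even something this crude turns up patterns (which year a crush peaked, when the word "hate" drops out of my vocabulary) that I'd never recover by just trying to remember.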
When I have a conversation with someone, even an especially private one, my memory is recording their every word. I then have the option of going back to my computer and recording those words. Those words can then end up on the Internet or who knows where (though I'm obviously very careful to guard them and not put them on the net). But we've had this option of recording private events...forever, right?
So then it is a matter of degree. But that doesn't make it any less significant in the ways in which it could potentially disrupt social life. In fact, this article finally convinced me of the worth of privacy in this age of surveillance. I was a longtime holdout only b/c the crux of most privacy advocates' arguments seemed to be invoking Orwell and leaving it at that. Indeed, it's an extremely hard argument to make b/c 1) the march of surveillance technology feels inexorable and 2) it's hard to point to many widespread instances of abuse or to make the chilling effect on behavior visible.
But that's why this lifelogging experiment that the people in the article engaged in was worth doing. By pushing it to an extreme, by making it personal rather than political, I could finally see the ways in which it would radically alter social behavior. We totally underestimate the role of forgetting and deception in our self-images and in the images we form of others. We are designed to underestimate these things. Perhaps we each need to record our lives (or read about someone else who has done this, in my case) to understand how much we forget and how much we distort our memories.
This brought me back to a thought I had after my hard drive crashed a few weeks ago. I was watching 2001 on TCM and considering the words of HAL, thinking about whether or not the fear of a sentient machine was still a fear of ours, nearly 40 years after the film was made. The big mistake HAL's programmers made (and indeed a fault of most programmers) was to think that they could design an infallible computer. No matter what, a computer, like a human, can screw up. The reason computers are inhuman (and perhaps why they strike fear in our hearts) is that they fail in different ways than we do. But if we recognized this and tried to design them to fail in ways more like the ways we fail (to design them to "forget" things gradually, to act erratically in certain situations, instead of aspiring to perfection), then computers and robots wouldn't be anything worth fearing. We should get to know the design of our minds, and then design computers in a similar but slightly less flawed fashion.
Really, what makes computers unlike humans is the ways in which they decay. In one of my classes, I'd claimed that digital technology did not decay: it either worked or it didn't. I was proven wrong the next week when we brought in several gaming consoles, including my old (roughly 18-year-old) Nintendo Entertainment System. One of my students played Mega Man 3 (I think it was 3, but I may be misremembering), and the game gradually became more "buggy," the screen increasingly clogged with glitchy graphics until finally, inevitably, it froze. So I realized that digital technology does gradually decay, but it decays in a different way than our minds do. And it is vulnerable in ways that we are not.
That is what defines us, or at least sets us apart from computers: the ways in which our minds and memories decay or become damaged. The ways we forget. Perhaps we shouldn't be working on computers that recall everything, but on computers that "forget" data in the same ways we do. I suppose that's the promise of meta-data: to get computers to recognize importance and meaning in the ways that our minds do.
As applied to my personal journal, I'd need some way to teach the computer that some information (my happiest memories) is more important than other information (whether it was Mega Man 3 or 4 that student was playing).
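Here's a toy sketch of what that might look like: each entry gets a hand-assigned importance score, and its "retention" fades along an exponential forgetting curve, with important entries getting a much longer half-life. (The entries, scores, and half-life numbers are all made up for illustration; this is one guess at the idea, not a real system.)

import math
from datetime import date

def retention(entry_date, importance, today=None):
    """Exponential forgetting curve: low-importance entries fade quickly,
    high-importance ones keep a long half-life."""
    today = today or date.today()
    age_days = (today - entry_date).days
    half_life = 30 + importance * 3650  # from ~1 month up to ~10 years
    return math.exp(-math.log(2) * age_days / half_life)

# hypothetical journal entries: (text, date written, importance 0..1)
entries = [
    ("happiest day of the trip", date(1999, 7, 4), 0.95),
    ("which Mega Man the student was playing", date(2007, 2, 16), 0.05),
]

for text, when, importance in entries:
    print(f"{text!r}: retention {retention(when, importance):.2f}")

Below some retention threshold, an entry might get summarized or dropped from search results entirely, which is a lot closer to how our memories seem to fade than to the all-or-nothing failure of my NES.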
http://chronicle.com/free/v53/i23/23a03001.htm