Wednesday, June 4, 2008

Dual Tasking

Yesterday's lab meeting opened with a message passed on from my advisor: people notice when grad students take laptops to colloquia/seminars/things where a crowd of people gather to sit and listen for an hour or more, then spend the time listening with half the brain and doing work with the other half. The general consensus of these "people" is that if you're going to do that, you might as well not come. My advisor is at least willing to admit that we don't actually have a choice about attendance at most of these events, but discouraged dual-tasking all the same.

There are four grad students in our lab, all of whom have laptops we carry around just about everywhere. It was a general message, but I know I am one of the worst offenders. I briefly considered making it a personal rule not to take the laptop to these meetings (unless I'm actually taking notes on the talk, which happens fairly often), but then I started wondering exactly why this is considered to be such a big problem.

There's disrespect to the speaker; this would seem to be the most likely culprit. And yet, the back rows of chairs at these events are lined with faculty who showed up 5, 10, 15 minutes late. How is that less disrespectful? I'm not just sitting there and ignoring the speaker, so I probably only miss as much of the talk as the people who show up late. When the speaker and others in the audience get into a nuanced debate I have no hope of understanding, I lose nothing by running a quick lit search or composing an email to someone.

There's missing the details of the talk; this was my advisor's tack. The point of attending these events is to learn more about other fields, and to learn how to ask good questions. The problem here is that I don't learn well from the colloquium format. There are too many details, and the slides are too poorly organized, to get a grasp of a talk without interrupting in the middle, which we're not supposed to do. I'm rarely going to do better than getting the gist of the talk anyway, and checking the weather or typing up a blog entry bit by bit is not going to change that.

It doesn't matter how often someone tells you that there are no stupid questions; you can tell how few graduate students buy it by how few actually ask questions. Surrounded by faculty, in a talk far out of my own area, I'm not going to ask what might turn out to be the most basic fact imaginable.

Then there's the appearance. Showing up at a talk and paying only half the attention it merits reflects on you as a student (and on your advisor as an advisor, which is probably why we got the lecture in the first place). All those other faculty members see that you aren't dedicated, and this is bad, so at least look like you're devoting your entire attention to the talk. Since I can see at least one faculty member doing the same thing, this angle is slightly hypocritical.

The problem with any of these reasons is that removing the laptop treats the symptom, not the cause: boredom, loss of attention, inability to follow what's being talked about. The worst case was when I watched my lab mate give a practice brown bag, then give the actual brown bag, and then give an extended version for the new summer series. The talk got better each time, but it also got old; the best strategy for attendance was to follow the new information and listen with one ear while he went over the unchanged slides. Hence, the laptop. I'm currently listening to the fourth or fifth iteration of another lab mate's talk, and obviously the umpteenth explanation of the phenomenon she studies does not hold my entire attention.

No one has ever suggested that I have ADHD, but I am a product of my generation. I put a CD or DVD on while I work, I move around cleaning house when I'm on the phone; I multi-task as a way of life. The only time I don't is when I read a book, and even that is accompanied by a lot of meta-analysis of writing style and possible upcoming plot turns. Perhaps I should cultivate the ability to put my entire mind behind a single task, but this almost seems counter-productive. I've been involved in several long discussions about "what is intelligence," and to me multi-tasking is a critical component. There may be arguments for how it can be better to get a lot out of the one thing, but there's no clear case for one or the other that might make me put the effort into changing my own thought patterns.

I suppose the real question is this: If hearing about research and findings stimulates my brain, and I'm stuck in an environment where that creative energy can't be channeled into discussion and brainstorming, why is it so wrong to capture as much of it as possible on my electronic notepad?

