Sunday, September 26, 2010

Going Off the Meds


The first time that I experienced panic attacks, I was seventeen, and I felt like I was living in a horror movie. I would be walking down the street in the middle of the day, when suddenly the light would go all funny. The ominous overture to Carl Orff's Carmina Burana would be playing faintly in the background, though I had only ever heard Carmina Burana in movie scenes depicting hell. The color of everything would be sickly and wrong, I would feel all the blood in my body drain to my feet, and there would be the presence of some grotesque thing that defied the normal order of the world, some dark thing come from the other side to suck me under.

This feeling would overcome me every few minutes, hundreds of times each day, so that life was rather like creeping my way through the hotel in The Shining, or maybe one of those new homemade-looking horror movies whose advertisements celebrate how “disturbing” they are.

The therapist I saw took me to the psychiatrist to see if he would recommend medication. He shot a dismissive glance at my gaunt figure and tattered thrift-store clothing and said, “If this is still happening in six months, we’ll give you some antidepressants.”

Six months, I remember thinking in horror. If this is still happening in six months, I’ll be dead.

Six months later, the attacks weren’t gone, but through weekly therapy, I was learning to control them, so that I could usually anticipate them and stop them before they got really bad. I learned that I needed to get a full night’s sleep and eat reasonable meals and not wallow in stress or immerse myself in bizarre, alienating music and literature. These were frustrating lessons, but ones that I suppose my body thought it was high time for me to learn.

Other than sedatives, I don’t think anti-anxiety medication had been developed back then. If it had been, I certainly wasn’t offered any. Since that time, I have had several other bad bouts of panic attacks, and I have never been offered any sort of medicine by the psychiatric and medical professionals with whom I have consulted.

From the stories that I’ve heard, that you’ve heard, too, I am an anomaly. Friends, books, TV shows, and magazines all tell us that psychotropic drugs are dispensed like corn syrup, an efficient solution to our culture’s wealth of mental disorders that avoids the expense of psychological treatment while providing easy revenue to the drug companies. At times, I have felt cheated—why haven’t they offered me any of these “overprescribed” drugs? Am I not screwed up enough to warrant a healthy dose of that Prozac they’re debating in all the magazines, that Paxil and Xanax that they seem to be prescribing for everyone else I know?

I especially feel cheated when I think of that horribly suffering teenager I once was, and wonder: couldn’t they have given me something? Surely there must have been something, some sedative, some kind of Valium or Quaalude or horse tranquilizer that could have calmed down my overactive brain, rather than leaving me to pull myself, with grasping bloody fingers and a year of therapy, up out of that horror movie and into some kind of sane, properly ordered mental functioning.

On the other hand, when I watch the struggles of the many people I know who have been placed on medication, I feel like I got off easy, relieved and lucky not to have been pulled into a deeper kind of morass—the engulfing cycle of being on medication and getting off of it.

I’ve watched a lot of people try to go off of their medication, and from what I’ve seen, the process of going off of medication for mental illness will in itself make a person mentally ill. I’ve watched some of the closest people in my life shake and cry and wail that they need the medicine, with all the desperation of heroin addicts. I’ve watched people adjust the dosage—just a little more now, now a little less, cut down by half, and if you completely freak out, go back up by a quarter. One friend who struggled with her romantic life went to her primary care physician to get referred to a therapist, and instead got prescribed Prozac; a year later, her struggle was not to maintain a healthy relationship but to get off the Prozac.

Of course, one obvious explanation for the anguish caused by going off medication is that the medication is needed, and that it is the only thing that was preventing the suffering in the first place. Your brain is like any other organ, and if your brain is sick, you may require medicine for the rest of your life, just as you would if you suffered from thyroid or heart disease.

This may be the case for some mentally ill people, but as far as I understand, it is impossible to diagnose conclusively who they are. There is no physical test to diagnose mental illnesses; they are diagnosed by symptom. It is as though I went to the doctor complaining of fatigue and thirst and, based on those complaints, I was assumed to have diabetes and put on a regimen of insulin. If my symptoms abated in a few months, it might be difficult to tell if this was due to the insulin or something else altogether. The only way to tell would be for me to stop taking the insulin, and see if I felt sick again. Except, since my body is accustomed to the insulin, I may feel faint and ill when I stop taking it, and this could be a normal sign of withdrawal, or a deadly symptom of my now-untreated illness, and it may only be my death that provides a conclusive answer.

I don’t mean any of this as an insult to psychiatry, imprecise science though it may be. I have seen many people I care about treated with drugs that seem to have saved their lives, and I will always be an avid supporter of anybody’s right to stay on a medication that is preventing them from being miserable, numb, non-functional, or delusional. I know firsthand what mental illness can be, and if the path to my own health lay in a pill, I would take it without hesitation.

A few weeks ago, on the second anniversary of the death of David Foster Wallace, I heard several radio programs discussing his life, his writing, and his battle with depression. Wallace had done something that I have seen so many people I care about do: he reached a happy, stable time in his life, and, no longer suffering the symptoms of his depression, he decided to go off of the medicine that he had taken for years.

The people in my life have done this with varying levels of success. A few people I know have actually managed to go cold turkey, and found that the depression they experienced as young adults seemed to have dissipated with maturity. But more often, my friends and family members have found that their depression returned. They may be able to take lower doses of the antidepressants, but can’t seem to go off of them entirely.

David Foster Wallace’s version of this process sends a chill through anyone who takes antidepressants or cares about somebody who does: when he stopped taking his medication, his depression returned full force; when he went back on the medicine, it no longer worked, and he committed suicide.

It’s the fear of this very sort of dependence that leads so many people I know to want to get off medication. What if there is some disaster? I don’t want to be unable to function without medicine. Unlike diabetics and other people who have a diagnosable, physiological need for a medication, people with illnesses like anxiety and depression are never sure just how dependent they really are, nor are their doctors. It’s impossible to find out just how dependent you are on antidepressants without risking your life or your sanity.

Now when I watch people I know try to reduce or go off their medicine, I give grudging thanks to that judgmental psychiatrist who did not want to give me medication when I was a teenager. Without a chemical remedy, I had to figure out how to make my own mind healthy. I learned a lot of skills that I have used throughout stressful times in my life, even times when my panic attacks have returned almost as strongly as the first ones I experienced. I have never had to wonder whether I really need a medication, to feel the temptation to turn to it when times are difficult, to try to distinguish anxiety caused by needing medication from anxiety caused by withdrawing from medication. I don’t believe that everyone’s mental health problems can be solved without medicine, but I will be forever grateful to know that mine can, and to have the tools to do it.

The illustration depicts Berkeley's Hate Man. I don't know if he ever went on or off his meds.

Friday, September 3, 2010

Inspirational YouTube Videos


These are the words I most dread at the end of a student presentation:

And now, we’re going to show a video that sums up our topic.

Then it comes: four minutes long, the mournful tones of Sarah McLachlan or Suzanne Vega playing in the background, the heartrending images, the atrocious use of punctuation.

As soon as the students announce the video, I cringe. Never mind that these videos are a kind of visual and rhetorical assault on my consciousness, making me feel violated for having to watch them. If my students had created them, I would still beam with pride. In fact, the videos represent the very sorts of projects that we hope the students will create—their assignment is to create a Powerpoint presentation creatively presenting their research on a social justice issue. They are first-semester college students, many of whom have severe learning disabilities or come from continuation high schools where they were never assigned homework. If their discussions of the issues are a bit melodramatic, that’s fine; we are happy if they simply complete the assignment without having a nervous breakdown or punching one of their group members.

The problem with the videos is that they are someone else’s overly sentimental Powerpoint presentations, posted on YouTube so that anyone with an Internet connection can watch them, absorb their message, overlook their poor organization, research, and spelling, and use them as part of their own presentation on the same topic.

By presenting the videos as part of a researched presentation, the students imbue them with a sense of credibility or authority, as though they count as research. I can envision the students watching the videos and finding them more powerful than the actual data they themselves have collected; “Hey, guys, I finally found something interesting,” I imagine them saying.

The fact that these videos always appear at the end of the presentations is even more frustrating from the perspective of a writing teacher. Rather than end with their own ideas, findings, and voice, the students give away that platform to someone else, and worse, someone no more informed than themselves. The videos would bother me much less if the students would speak after them, explaining why they had chosen that video and what it meant to them. But instead, the video is given the final word, this anonymous authority rising out of the ether to lend imagined credibility and emotional power.

I have seen the same one about domestic violence several years in a row, a rambling stream-of-consciousness polemic. It juxtaposes a series of distressing images—a woman with a horribly bruised and swollen face, for example—with meaningless statistics, or cryptic bits of narrative, apropos of nothing that came before them: “Forty percent of women say they have been abused by a partner”; “She is afraid to leave him.”

Finally one year, I remembered to put a policy about YouTube videos on the assignment instructions: “Your presentation may include no more than one minute of any video that you did not create yourself.”

When presentation day came, one group ended by introducing a video: “We know you said that we shouldn’t have more than a minute of a video, but this one is really good,” they said to me.

Then the familiar music began, and the familiar series of domestic violence images and factoids rolled once more.

I knew that every student in the group had experienced abuse, either in their parents’ homes or at the hands of high-school boyfriends, so I did not want to reprimand them for flagrantly dismissing the assignment instructions. But it saddened me to realize that they believed this cheesy video would be more powerful than their own words, ideas, and experiences. This is what years of schooling have drummed into their heads: that if something is worth saying, it has already been said by someone more important than you, and you’re better off using that person’s words rather than your own.

I’ve noticed that many of the power figures at my college also rely on the voices of others to support, or make, their arguments. Our former college president was a great fan of inspirational quotations. They adorned her office, engraved into glass paperweights and printed on postcards. She opened and closed every address with them, relying heavily on the classics: Your playing small doesn’t serve the world, she would say, or The definition of insanity is doing the same thing over and over and expecting different results. They were quotes attributed—verifiably or not—to great writers and thinkers. But I would bet my union-negotiated contract that these quotations were the only things that she, or most of the people who use them, had ever read by these esteemed authors.

The internet allows us to find a wealth of these kinds of dismembered quotations, inspirational words of advice from authors we have never heard of, without even having to leave our house and buy a book of them. My students have recently taken to opening their essays with barely-relevant passages by people ranging from George Bernard Shaw to Anthony Robbins, names that I am sure have no meaning at all to the students citing them.

With YouTube, we do not need to limit ourselves to using the words of published authors or well-known personalities or even people who know how to write. Anybody can put a video on YouTube, and it is possible to find a relevant, if poorly-constructed, opinion piece on any topic at all.

The college president took advantage of this expanded quotation marketplace in a speech welcoming all the school’s employees back to work after summer break.

“Last year, when I talked about leveraging abundance,” she said, “some people asked for specific examples to show what I meant. So this presentation has lots of examples.”

It turned out that these were not, as I presume “some people” (i.e. everybody) had wanted, examples of how colleges had leveraged their abundance, or suggestions as to how we might do so at our college.

No, the examples were YouTube videos. The first was a cartoon positing that “everyone can be an entrepreneur.” Anyone who creates their own business is an entrepreneur, the video explained, whether that business is a lemonade stand, a yoga studio, or a software corporation. “I like the positive spirit of this video,” said the president. “It shows people taking initiative.”

She then showed a video of a nine-year-old boy who had given a motivational speech to thousands of teachers. “Do you believe in me?” he asked them, pointing into the crowd. “I believe in me.” The thousands of teachers applauded wildly.

“This video reminds us to always think of our students first,” said the president.

I imagine that the president used these videos in her speech for the same reasons my students use them: because they are fast, easy, and vaguely reminiscent of something like actual research. Finding original examples of local colleges using innovative strategies to stretch minimal resources (which is what I think leveraging abundance is supposed to mean) would take a lot of time, energy, and thought, not only to track down the examples, but to thoroughly understand and explain them.

Don’t get me wrong: I actually love YouTube. I love the poorly-constructed opinion pieces, at least in theory, and often in practice. The democratizing of public expression is one of the best things about the internet. Everyone can express their views and have them read by strangers from around the world. Insecure students like mine can create a project that embodies their research, and then post it for their friends and families to see.

The problem comes when we allow things we find on the internet to stand in for well-thought-out arguments and evidence. With so much information instantly available, we begin to believe that doing research is as easy as shopping for shoes: point, click, it’s yours.

But just as with the shoes, it is easy to ignore the cost we will pay for our ideas, if we want them to be good. As an English teacher and a writer, I am painfully aware that good ideas are not free. We pay for them, not in money, but with our labor—the labor of thought, of real research, of reading real books cover to cover and formulating our own ideas about them rather than allowing them to speak for us. It’s a lot of work, but the reward is worth it: really understanding what it is we are talking about.