Friday, December 24, 2010

Christmas


Generally speaking, I’m a pretty privileged American. I am white, I’m able-bodied, I have a good education, a professional career, all the things that get me an invitation to participate in mainstream American culture.

But then, once a year, that culture engages in a month-long ritual that I have absolutely no part in. It starts in late November, as the decorations appear, and the stores launch their special sales and promotions, and every commercial on TV tries to convince some other viewer that something not normally thought of as a gift (a power tool, gold coins, a lottery ticket, a donation to a charity) would in fact be just the right gift to bring a glow to his loved one’s heart.

This cultural phenomenon starts out subtle and easy to ignore. By mid-December, though, it is all anyone can talk about. It is on everyone’s mind: “Are you ready for it? Have you finished your shopping?” People bring desserts to their workplaces and have parties to celebrate its impending arrival.

And then this fervor builds to a climax, and the day arrives. Everything goes abruptly quiet. Streets are empty, stores are closed, an eerie silence reminiscent of Super Bowl Sunday overtakes the city centers and shopping districts.

Then it’s all over, gone as enigmatically as it arrived, and I reacquaint myself with this culture that has seemed so alien and bizarre for the last month, my culture, once again.

Christmas isn’t even a religious holiday anymore, numerous snotty high school acquaintances of mine used to inform me. Why don’t you Jews just give it up and celebrate it already? It’s the same question that gets asked of every minority whose minority status is based on a behavior: Why can’t lesbians just suck it up and marry a man? Why can’t left-handed people just use their right hand? Why can’t those Muslim women stop wearing that crazy veil?

From the outside, the question makes sense: why not celebrate Christmas? It’s fun, it’s festive, there are presents; what’s the downside? But people don’t just start celebrating holidays they have never celebrated before. Who would be the first member of my family to declare, This year, we’re getting a tree and buying a bunch of ornaments for it! This year we’re going to sing Christmas carols and put up tinsel! This year, Santa’s coming for the first time!

My family has never done these things, and they don’t hold any meaning for me. The Jews I know who celebrate Christmas do so because they have Christians or former Christians in their families. I do know a few Jewish people whose families just love holidays and so decided to put up a tree or some decorations, but they’re a rarity. Religious Jewish people abstain from holidays celebrating Jesus on principle, but the rest of us view Christmas about the way we view Kwanzaa or Ramadan, just with a lot more publicity: just fine, but not something that belongs to us.

When I was in high school, I worked at a Jewish gift and book store. When customers would ask us if we were open on Christmas, my boss would arch her eyebrows and say, “We call that day December 25th.” This was meant to answer their question: that day is not a holiday around here.

The message wasn’t always received: “So, are you open on December 25th?” people would ask.

“Unless it’s the Sabbath,” she would reply.

Although I worked at the Jewish bookstore, my family was not very religious. We didn’t go to synagogue when I was growing up. My parents sent my sister and me to a secular Jewish Sunday school at the Jewish Community Center, meant to enhance our sense of Jewish identity. There we learned about Jewish history, holidays, and ceremonies, but not about what we should believe or what God—or G-d, as my teachers called him—wanted us to do.

Before Sunday school, I didn’t always know what it meant to be Jewish, but I learned early what it meant to not be Christian. When I came home from preschool one day excited for Easter, which we had been informed was taking place that weekend, my parents had the grim duty of telling me that we did not celebrate that holiday.

“Why not?” I asked, disappointed at the thought of the candy and Easter eggs I would not be receiving as predicted by my teachers.

“Because we don’t believe in Jesus,” my father said.

Jesus. I had heard that word before. I was pretty sure it referred to something important.

“What if I do?” I asked.

“You don’t,” said my mother.

Not only did Jews inherently not believe in Jesus; we didn’t believe in any of the other characters associated with Christian holidays, either. I had to learn to remember that some kids were pretty sensitive about their delusions. In second grade, I got kicked hard in the leg by a third-grade boy because I said that Santa Claus didn’t exist. But he’s eight, I thought to myself, baffled that someone could be so naïve at such an advanced age.

Now that I’m old enough not to miss the presents and desserts, I still feel like I miss out on something because of Christmas. My teaching semester ends about two days before Christmas Eve. This has usually been a dreaded time of year for me. I have all the free time in the world, but all my friends are out of town or occupied with visiting family. I’m too tired to do something useful like clean my apartment or write the syllabi for my spring courses. It would be a great time to go shopping for some clothes…but no, it’s two days before Christmas, a horrible time to try to shop for anything. It’s dark and usually raining, miserable weather for a run or a hike. Whenever I get my annual jury duty summons, I defer it to December 23rd. If they want to hold a trial that day, I’m there; it’s the most useless time of the year.

This year, I had several invitations to Christmas parties. They were all with people I like a lot, and would have been fun enough, so I wondered why I didn’t feel like going. Then I realized: I have started to look forward to not doing anything on Christmas. Just like the Super Bowl, it’s a rare chance to withdraw from the stream of our cultural calendar, to enjoy the meditative pleasure of disconnection and focus on ourselves. So maybe I do celebrate Christmas after all.

Sunday, November 21, 2010

Palo Alto


In a city of the future, it is difficult to concentrate.
—Radiohead, “Palo Alto”

During my first year in California, when I was nine, a group of local parents wrote and produced a theater piece called Perfect Palo Alto. It was a series of skits that lovingly mocked the eccentricities of my new home: Every single adult here works in computers! Our city is populated with well-to-do ex-sort-of-hippies! We’re all really liberal, overeducated, and self-righteous!

This is a weird place, I thought. No one would have ever written a play like that about Framingham, Massachusetts, or Nashua, New Hampshire, the most recent cities I had lived in. What would you even say about those places? We’re a suburb of Boston! We have a big mall! And what practical New England parent would ever decide to write that play, much less be seen by their children and peers performing in it?

Palo Alto, I came to understand, was a weird town that prided itself on its weirdness. It was like that kid who makes a point of doing everything as bizarrely as possible; and, not surprisingly, it was filled with kids like that. Every place I went seemed haunted by a strange hippy heritage that traced back to the Sixties. Grace Slick and Joan Baez both went to my high school. My favorite coffee shop, Saint Michael’s Alley, used to be a hangout for the Grateful Dead when they were still the Warlocks. Only back then, the coffee shop had been housed in a different building, one that had long since been converted into the Varsity Theater, where you could watch Rocky Horror or Spike and Mike’s Sick and Twisted Festival of Animation and imagine Jerry Garcia having a smoke there back when he looked like a Chasidic Jew.

The town reveled in its iconoclasm: the college town in the midst of Silicon Valley, the town only twenty minutes from San Jose that considered itself a steadfast satellite of San Francisco (an hour’s drive away).

When my economics and government teachers explained the difference between liberals and conservatives, they would always say, “Conservative voters tend to be more prevalent in wealthier areas. Where we live is an exception, though.”

And was it ever an exception. When I was growing up in Palo Alto, Republicans were like polar bears: I’d only ever seen one on TV. Before an election, every front yard sign, every bumper sticker, every snide comment from an adult seemed to be preaching to a city-wide choir: Mondale, Dukakis, Clinton. A few of my fellow students claimed doggedly to be Republicans, in what always struck me as an Alex P. Keaton-style act of contrarian rebellion. I never met a real Republican until I went to college—at U.C. Berkeley, a school I chose over a small liberal arts school primarily because I had heard there were Republicans there and I wanted to confirm that they really existed.

Palo Alto’s demographic and social weirdness, when I was growing up, seemed to stem from the fact that people mostly moved there for the school district. At a time when California’s schools were in steep decline, Palo Alto’s schools were consistently rated amongst the highest in the nation. These schools kept property values sky-high for the small, single-story, space-efficient tract houses that covered most of the city.

In nearby Atherton, every resident lived in a mansion, but they shared a lackluster school district with neighboring Menlo Park. This used to strike me as odd, until I realized that many affluent areas don’t care about the quality of their public education system, since the residents all send their children to private schools.

But in Palo Alto, all of my friends’ parents were like mine: they had sunk all of their money into a small, expensive house so that they could send their children to a top-ranked public school. So while I grew up in a town with a high average income and high property values, no one I knew ever seemed to have much money. We all had parents who carefully budgeted, who fretted over the money we spent on clothes and food, who considered every purchase seriously: Do you really need that?

Thinking back, I realize that our parents could have been living in bigger houses, driving fancier cars, not worrying about every dollar, if they had chosen to live in cheaper cities with lesser school districts. Palo Alto self-selected for people who valued education—public education—over every other luxury in life, and that was what made it truly a town of weirdos.

The values of our parents seemed to have rubbed off on most of the students I knew. Certainly the interest in education did. Grades were so high that my school could not publicize class rankings, for fear that the rankings would keep us from being admitted to universities. My 3.6 GPA put me in the 68th percentile of my graduating class.

But our view of money seemed to come from our parents as well. In a city with one of the highest real estate prices per square foot in the country, no one wanted to be seen as rich; instead, students tended to brag about how poor they were. Having a lot of money, or spending it frivolously, was something to be ashamed of. In my high school, it was considered horribly uncouth to have anything expensive or new. I knew one very rich kid whose parents bought him a fancy sports car for his sixteenth birthday, and everyone mocked him behind his back. He was one of the handful of people I knew who had a new car at all; most of my friends, like me, did not have a car, and none had a car manufactured after the Seventies. At the high school across town from mine, the newspaper ran a “Wreck of the Week” column featuring students boasting about the decrepit state of their vehicles. There seemed to be an acute understanding that we had not earned the money we spent on clothes or cars, and that wasting our parents’ money didn’t make us cool; it made us spoiled brats.

This rule probably did not hold amongst the small crowd of “popular” kids, but no one cared about them. While they held some sway in middle school, by high school they could no longer manage to lord despotically over the masses, hopelessly outnumbered as they were. All the horrors I hear reported from other high schools—the football players and cheerleaders and game days and school spirit—were reversed at my school. Sporting events were under-attended, with teachers begging us to show up. School spirit was for losers. I never once heard a girl admit to being a cheerleader without an embarrassed disclaimer: “Actually I’m a cheerleader. But it’s just because I’m really into dance.”

I moved away from Palo Alto at what ended up being the dawn of the dot-com era. When I applied to college, I had visited the internet one time. By the time I graduated, my father was asking me whether I might want to put off the Ph.D. program I had just been accepted to and make a bunch of quick money as a tech writer, just like every other jackass with an English degree.

Whenever I came back to visit Palo Alto, every other car was a BMW or Jaguar, and there was a new yuppie restaurant in the place of each quirky old diner or bookstore I used to love. Saint Michael’s Alley had been converted from a grungy coffeehouse to a stylish gourmet brunch spot. The Varsity Theater became a Borders bookstore. All the eccentric little corners were swept clean, as if the weird haunted hippy town of my youth had never existed.

During those boom years, when I would visit my mom, I used to go for a run around my old neighborhood—a run that took me past Steve Jobs’s house, an unassuming neighborhood landmark—and each time, I would pass at least four houses knocked all the way down to their foundations. They were about to be built up again from scratch, now taller and with basements and bloated out to the far edges of their lots.

Like every affluent place, Palo Alto seems culturally improved by economic downturn. The bloated houses still stand, but I don’t see so many knocked down when I visit now. The roving crowds of dot-com twenty-somethings in cocktail attire no longer swarm University Avenue in search of mates. Palo Alto still makes me a little sad, seen from the outside, with its sanitized plazas and former dives where my teenaged friends used to write on the walls with Sharpies. But that is only from the outside. I have a lot of hope that the high school kids are still traipsing around like lost hoboes somewhere I never go, that some young Grace Slick is there writing scandalous songs, hidden away somewhere my respectable grown-up eyes can no longer see.

Sunday, October 31, 2010

Empire Building


I never used to be able to stand A Prairie Home Companion. I would turn on the radio and there, like a cartoon hound dog with his mouth full of mud, would be the voice of Garrison Keillor, singing some old standard whose words he had slightly rewritten, or telling a nonsensical, rambling story about a mildly dysfunctional couple in Minnesota.

What is this, I would ask myself, quickly changing the station. And why would anyone listen to it?

Then, a year or two ago, I suddenly became fascinated with the show. I would listen every week, waiting to hear what extremely similar thing would happen. Would a cowboy meet up with his old flame…again? Would a Midwestern expatriate writer have a guilt-ridden phone conversation with his provincial parents…yet again?

Once you embrace the logic of A Prairie Home Companion, it’s easy to get sucked into the bizarre parallel universe it depicts. The show’s audience seems to love the predictability of it, the comforting if illogical repetition. They laugh hysterically at the same joke about Lutherans every week. They love the twisting personal narrative, often told in the second person (and right about then is when you realize…), that always ends up with the same song about rhubarb pie. They chuckle as Keillor inevitably finds himself romantically attached to a much younger, much more attractive woman, and even on the radio they can tell she’s out of his league. And when the show is over, they can even go on the show’s website if they want to delve more deeply into this make-believe world where not only are all the children above average, but where the red states are full of old lefties and everyone loves gospel music and spoken word poetry and choirs performing the native folk songs of former Soviet Bloc countries.

Predictability like this brings a certain comfort with it. That’s why people enjoy sitcoms, or watching the same Saturday Night Live characters play out variations on the same gag week after week. The joke ceases to be funny and instead becomes soothing like a lullaby.

That’s how I found this alternate prairie reality, a strangely calming bizarro world. Each time I heard that Keillor had written an offensive article in a magazine (why gay people shouldn’t get married; why Jews shouldn’t write Christmas songs), I would go look it up, eager to see what new levels of curmudgeonry he had achieved. And because he existed in a half-reality where it was never clear whether Keillor spoke as himself or some sort of parody of himself, the ridiculous beliefs he espoused were more quaint than upsetting.

The more I read and the more I listened, the more I thought: I want an empire. Not a big, scary, hegemonic empire like Rome or the USSR or McDonald’s. Just a small, self-contained empire with a legion of devoted followers who are willing to celebrate my every bizarre whim as utter genius, to lovingly embrace my foibles, to delight at the same joke for years and decades on end.

I have been admiring a number of these small empires lately, and the one I really want isn’t Garrison Keillor’s, but Dan Savage’s. Like Keillor, the sex-advice columnist and gay rights activist has his own brand of logic and language, including a number of acronyms for concepts so fundamental to his reasoning that they require shorthands. A good lover is GGG, “good, giving, and game,” which means that you had better let your boyfriend suck on your toes if he enjoys it, no matter how boring or gross it seems to you. If you don’t, Savage will urge him to DTMFA, “dump the motherfucker already.” These abbreviations are so well-known to Savage’s audience that they often deploy them in incorrect and even disturbing ways, forgetting what they actually stand for:

I love my boyfriend but he’s moving out of the country in two months and we’re going to break up then. Should I stay with him for these last months or DTMFA?

I’m a mother of a twelve-year-old son, and I’m doing my best to raise him in a GGG manner.


As these terms suggest, Savage’s advice and opinions follow certain well-worn paths. Like Keillor’s audience, Savage’s devotees, including myself, know what to expect from him. Yet I read and listen to him with a voracious appetite that speaks to either the comfort of the familiar or perhaps some sort of subliminal brainwashing. And I have come to realize that many people I know are equally brainwashed.

For example, when I visited my sister recently, I started to mention something that Dan Savage had said on his podcast, and began summarizing a phone call that he had taken and responded to.

“To save time,” my sister interrupted me, “you can just assume I’m familiar with every episode of Dan Savage’s podcast.”

Sometimes I wonder if the men who rule the empires I admire ever get sick of them—sick of the routines, the predictable logic, the cute terms and sayings. Does Garrison Keillor ever wake up and think, I don’t ever want to sing that song about rhubarb pie again? Does Dan Savage ever get sick of having to talk about sex every single day? Does Ira Glass ever get sick of saying, And what happened next was truly bizarre, as though this is the first time he has ever narrated a bizarre occurrence? As successful as they have been in building up their own recognizable brands and selling them to adoring audiences, do they ever get horribly, nauseatingly sick of themselves?

This is the reason that I would be a horrible emperor, not to mention a horrible CEO, middle manager, public relations officer, or cheerleader: I would get horribly sick of myself. To be a representative of something, to be an unceasing champion, is a really draining job, one that requires a kind of confidence and perseverance that I don’t have a lot of. This is part of the reason I admire the people who are able to maintain an empire: I appreciate how grueling it is to be the leader of a cult of personality.

I attend a number of schools, each run by a single person. Each of these schools reflects the vision and energy of its teacher: the flashy muay thai school with the blaring music and assertive display of clothing and equipment for sale; the tidy jiu jitsu school where there are specially designated spots for shoes and water bottles and sweaty students and dry guests; the yoga studio whose bare wooden floors are all business but whose ceiling is decorated with Christmas lights and flying monkey puppets; and the one I relate to most strongly, the kung-fu school treading so lightly in its rented gymnasium that it is only a school when we are practicing there.

Sometimes I see signs that my yoga teacher or my kung fu teacher is tired of the job, bored with us students, discouraged by the low energy of the class. Perhaps he is not really discouraged at all; perhaps I am projecting onto him the discouragement I feel as a student in a sluggish class, the discouragement I fear he feels because I would feel it in his place.

I know that if my suspicion is correct, if my teacher really is disheartened, bored, uninspired, he can simply close the school. I have recurrent dreams that my kung fu teacher announces at the end of class one day that he is closing the school. I suppose this isn’t such an unreasonable fear, given that I joined his school after my previous kickboxing teacher closed his school in just that way, except he didn’t wait until the end of class: “This will be the last day,” he said casually, as his students jumped rope to warm up. “I’m cancelling the class. But go ahead and work out on your own today. See you later.” And he walked out the door.

This is the danger of a monarchy: it ceases to exist without the monarch. My uneasiness with devoting myself to a single school that could disappear at any moment must have something to do with why I feel more comfortable with a diversified portfolio of schools, even though my heart is fully invested in one of them.

But more than that, noting the empires created by my teachers, I often think: I am really glad I don’t have that power or that responsibility. I am a teacher, too. But I could quit my job tomorrow and my school would keep on going without me, just as it did before it ever knew I existed, unaware that I was being born thousands of miles away the same year it was being founded, and just as it will after I retire and presumably long after I die.

If the only thing keeping the school alive was my own faith in it—if it ran on my energy alone—I don’t know if I could keep it open. Would I have given up during one of those semesters when all my students were disgruntled and bitter and I wondered what business I had being in charge of them? Would I have persevered through those times? Or would I have decided the entire enterprise was futile and found some other way to spend half my waking hours?

Every time I walk into a class and my teacher is still there, I know he could have decided not to be, and I am grateful. My teachers might not be able to imagine how grateful, just as I often imagine my students would hardly notice if I were replaced with some other teacher or a TV screen or a robot. And I’m also grateful that, nice as an empire sounds sometimes, I have the security and freedom of not having to be an emperor.

Tuesday, October 12, 2010

It's a Girl


As I sat on the couch holding Samantha’s newborn baby, her mother came through the front door carrying a small package wrapped in plastic.

“From Vivian,” her mother said, handing Samantha one of those ornate little envelopes that I had only seen on Chinese New Year.

“How much?” Samantha’s mother asked her.

Samantha opened the envelope and showed me the contents: a crisp new hundred dollar bill.

“Very good,” said Samantha’s mother.

She turned to me. “This is a Chinese tradition,” she told me. “When you have a boy baby, is traditional to bring money and meat.” She pointed at the package she was holding. I couldn’t see through the opaque white plastic wrapper to figure out what sort of meat it contained, whether it was chicken or beef, raw or cooked.

“For boy, you give money and meat,” she repeated.

“What if it’s a girl?” I asked. Samantha’s mother didn’t seem to understand my question. Samantha repeated it: “Ma. What do they give you if the baby is a girl?”

“Oh, if a girl,” said Samantha’s mother, nodding her head. “You still bring meat, but less money. Maybe twenty dollars.”

Samantha and I looked at each other and burst into hysterical laughter. Her mom joined in, laughing too. It was a sinister moment of female bonding as we laughed in shared acknowledgment of our lesser worth.

For two women raised in America, like Samantha and me, this laughter is partly directed at the quaint misogyny of less enlightened nations. A daughter is to be celebrated with one-fifth the enthusiasm of a son, we are thinking. How cute!

In America, at the moment of birth, a daughter is worth as much as a son. Nowadays, she could grow up to be anything—almost. As long as she doesn’t want to pilot a submarine or become a philosophy professor, the sky is the limit. Hell, if she’s white and Christian, we now have documented evidence that she could grow up to almost become president of the United States.

This is how I was raised—to believe I could be anything I wanted to be. My father imbued my sister and me with all the high expectations he would have had for a son, buying us erector sets and electrical engineering kits, staying up late doing mathematical proofs with us while we were still in elementary school, and expecting us to magically know how to throw a baseball properly because it came naturally to him.

School confirmed this impression of equality for me. The honors math and science classes I took were equally populated by boys and girls, and the girls were often the strongest students. Many of those girls went on to become scientists and engineers. While I have not talked to most of them about their experiences, I don’t get the impression that they had to crack any significant glass ceilings on their way to these positions. The main obstacle they felt was loneliness, as the numbers of their fellow women scientists and engineers dwindled, as women like me and my sister tossed aside our technical aptitudes in favor of more traditionally feminine careers.

The ideology of equality in high schools like mine is why I’m never surprised when my students believe sexism no longer exists.

“Men and women are equal now,” a girl in hot pants and a halter top will say, and her male classmate will nod serenely in his baggy sweats and oversized tee.

If you ask them if men and women are treated the same in our culture, though, their narrative is strikingly different.

“My friend has a twin brother, and he’s allowed to go out whenever he wants and do whatever he wants, but she has a curfew. They’re the same exact age.”

“How do her parents explain the difference?” I ask.

“They say, You’re a girl and he’s a boy.”

I ask the students why parents would treat their children differently based on their genders.

“They don’t want their daughters to get pregnant,” the students say.

What if their sons get their girlfriends pregnant? Is that as bad?

Noooo, they all shake their heads. Definitely not as bad.

My friend who teaches health told me today that during a discussion of types of contraception, one of these same students volunteered a chastity belt as an option.

“I’m not writing that one on the board,” she said.

“I’m serious,” the student said. “If I have a daughter, she’s never going to be allowed to have sex.”

This is the logic that explains the articles opposing abortion that my students often bring in as part of a debate assignment. Young people need to take responsibility for their actions, the articles say. A young woman needs to learn that if she wants to be sexually active, there are consequences to that decision.

I’ve scanned so many of these articles and found nary a mention of boys. Articles praising abstinence-only education will laud the positive outcome of far fewer high school girls having sex (or admitting that they do), without ever explaining whether the same result was seen in boys; evidently the main goal was to stop the girls from having sex.

These little inequalities and indignities are well-known and obvious, tiresome and uncouth to talk about, so we don’t. No one needs to hear that we still live in a society where women are largely judged on how they look, while men are largely judged on what they can do. If you go out to a bar where men and women are looking for mates, it’s not worth the breath it would take to point out that the men are off-handedly mentioning how they can shoot a gun and they’re working on their pilot’s license and they are an ace at poker, while the women are flashing their cleavage and batting their heavily-mascara’d eyelashes and making the perfect cute face whenever someone points a camera in their direction. If he’s marginally handsome, so much the better, and if she’s really good at playing pool, well, that’s a small enough transgression to be sexy, as long as she’s really pretty.

Only stupid people play out these tired gender roles, you’ll say, and you’ll be right, mostly. I know plenty of women who are considered quite attractive, and whose attractiveness lies in their intelligence and skill—as long as they are also pretty and thin. However, I don’t know any women who aren’t conventionally attractive but who are largely courted for their intelligence, sense of humor, athleticism, or power. There are a million billion trillion men like this, who are funny-looking, with protruding bellies or bad clothing or pockmarked faces or the wrinkles that give them character, who are still considered models of attractiveness, sex symbols, because they are amazing actors or athletes or singers or businessmen.

You’ll say this has less to do with social roles and more to do with whom we are trying to attract. Men are fundamentally visual in their attraction, you’ll say, and women are less so, which explains why gay men are more likely than lesbians to feel pressure to be good-looking. And you’ll be right, maybe. Maybe if we stopped feeling the need to be so damned attractive all the time, we could accomplish a lot more.

I think this hypothesis could explain the decided lack of femininity in the two recently appointed Supreme Court justices, Elena Kagan and Sonia Sotomayor. Perhaps they needed to renounce the distraction of feminine self-presentation in order to focus on the achievements that got them to the Supreme Court. Or maybe that's what we need to believe about them. It almost seems that, to be seen as credible in one of the most powerful jobs in the country, women have to present themselves as men. The politeness, the coyness, the flirtiness that characterize femininity all suggest a lack of credibility, a lack of focus on what's really important. Women can be judged by what we can do, but only if we are willing to renounce our femininity.

Maybe that’s why my friend and her mother and I laugh so bitterly at the greater promise that the Chinese see in their sons. It is not because the girl can’t grow up to do everything that the boy can; it’s because society won’t find her attractive if she does, and, we are horribly afraid, neither will we.

Sunday, September 26, 2010

Going Off the Meds


The first time that I experienced panic attacks, I was seventeen, and I felt like I was living in a horror movie. I would be walking down the street in the middle of the day, when suddenly the light would go all funny. The ominous overture to Carl Orff's Carmina Burana would be playing faintly in the background, though I had never heard Carmina Burana except in movie scenes depicting hell. The color of everything would be sickly and wrong, and I would feel all the blood in my body drain to my feet, and there would be the presence of some grotesque thing that defied the normal order of the world, some dark thing come from the other side to suck me under.

This feeling would overcome me every few minutes, hundreds of times each day, so that life was rather like creeping my way through the hotel in The Shining, or maybe one of those new homemade-looking horror movies whose advertisements celebrate how “disturbing” they are.

The therapist I saw took me to the psychiatrist to see if he would recommend medication. He shot a dismissive glance at my gaunt figure and tattered thrift-store clothing and said, “If this is still happening in six months, we’ll give you some antidepressants.”

Six months, I remember thinking in horror. If this is still happening in six months, I’ll be dead.

Six months later, the attacks weren’t gone, but through weekly therapy I was learning to control them, so that I could usually anticipate them and stop them before they got really bad. I learned that I needed to get a full night’s sleep and eat reasonable meals and not wallow in stress or immerse myself in bizarre, alienating music and literature: frustrating lessons, but ones that I suppose my body thought it was high time for me to learn.

Other than sedatives, I don’t think anti-anxiety medication had been developed back then. If it had been, I certainly wasn’t offered any. Since that time, I have had several other bad bouts of panic attacks, and I have never been offered any sort of medicine by the psychiatric and medical professionals with whom I have consulted.

From the stories that I’ve heard, that you’ve heard, too, I am an anomaly. Friends, books, TV shows, magazines all tell us that psychotropic drugs are dispensed like corn syrup, an efficient solution to our culture’s wealth of mental disorders that avoids the expense of psychological treatment while providing easy revenue to the drug companies. At times, I have felt cheated—why haven’t they offered me any of these “overprescribed” drugs? Am I not screwed up enough to warrant a healthy dose of that Prozac they’re debating in all the magazines, that Paxil and Xanax that they seem to be prescribing for everyone else I know?

I especially feel cheated when I think of that horribly suffering teenager I once was, and wonder: couldn’t they have given me something? Surely there must have been something, some sedative, some kind of Valium or Quaalude or horse tranquilizer that could have calmed down my overactive brain, rather than leaving me to pull myself, with grasping bloody fingers and a year of therapy, up out of that horror movie and into some kind of sane, properly-ordered mental functioning.

On the other hand, when I watch the struggles of the many people I know who have been placed on medication, I feel like I got out easy, relieved and lucky to have not been pulled into a deeper kind of morass—the engulfing cycle of being on medication and getting off of it.

I’ve watched a lot of people try to go off of their medication, and from what I’ve seen, the process of going off of medication for mental illness will in itself make a person mentally ill. I’ve watched some of the closest people in my life shake and cry and wail that they need the medicine, with all the desperation of heroin addicts. I’ve watched people adjust the dosage—just a little more now, now a little less, cut down by half, and if you completely freak out, go back up by a quarter. One friend who struggled with her romantic life went to her primary care physician to get referred to a therapist, and instead got prescribed Prozac; a year later, her struggle was not to maintain a healthy relationship but to get off the Prozac.

Of course, one obvious explanation for the anguish caused by going off medication is that the medication is needed, and that it is the only thing that was preventing the suffering in the first place. Your brain is like any other organ, and if your brain is sick, you may require medicine for the rest of your life, just as you would if you suffered from thyroid or heart disease.

This may be the case for some mentally ill people, but as far as I understand, it is impossible to diagnose conclusively who they are. There is no physical test to diagnose mental illnesses; they are diagnosed by symptom. It is as though I went to the doctor complaining of fatigue and thirst and, based on those complaints, I was assumed to have diabetes and put on a regimen of insulin. If my symptoms abated in a few months, it might be difficult to tell if this was due to the insulin or something else altogether. The only way to tell would be for me to stop taking the insulin, and see if I felt sick again. Except, since my body is accustomed to the insulin, I may feel faint and ill when I stop taking it, and this could be a normal sign of withdrawal, or a deadly symptom of my now-untreated illness, and it may only be my death that provides a conclusive answer.

I don’t mean to dismiss psychiatry as an imprecise science. I have seen many people I care about treated with drugs that seem to have saved their lives, and I will always be an avid supporter of anybody’s right to stay on a medication that is preventing them from being miserable, numb, non-functional, or delusional. I know firsthand what mental illness can be, and if the path to my own health lay in a pill, I would take it without hesitation.

A few weeks ago, on the second anniversary of the death of David Foster Wallace, I heard several radio programs discussing his life, his writing, and his battle with depression. Wallace had done something that I have seen so many people I care about do: he reached a happy, stable time in his life, and, no longer suffering the symptoms of his depression, he decided to go off of the medicine that he had taken for years.

The people in my life have done this with varying levels of success. A few people I know have actually managed to go cold turkey, and found that the depression they experienced as young adults seemed to have dissipated with maturity. But more often, my friends and family members have found that their depression returned. They may be able to take lower doses of the antidepressants, but can’t seem to go off of them entirely.

David Foster Wallace’s version of this process sends a chill through anyone who takes antidepressants or cares about somebody who does: when he stopped taking his medication, his depression returned full-force, and when he returned to his medicine, it no longer worked, and he committed suicide.

It’s the fear of this very sort of dependence that leads so many people I know to want to get off medication. What if there is some disaster? I don’t want to be unable to function without medicine. Unlike diabetics and other people who have a diagnosable, physiological need for a medication, people with illnesses like anxiety and depression are never sure just how dependent they really are, nor are their doctors. It’s impossible to find out just how dependent you are on antidepressants without risking your life or your sanity.

Now when I watch people I know try to reduce or go off their medicine, I give grudging thanks to that judgmental psychiatrist who did not want to give me medication when I was a teenager. Without a chemical remedy, I had to figure out how to make my own mind healthy. I learned a lot of skills that I have used throughout stressful times in my life, even times when my panic attacks have returned almost as strongly as the first ones I experienced. I have never had to wonder whether I really need a medication, to feel the temptation to turn to it when times are difficult, to try to distinguish anxiety caused by needing medication from anxiety caused by withdrawing from medication. I don’t believe that everyone’s mental health problems can be solved without medicine, but I will be forever grateful to know that mine can, and to have the tools to do it.

The illustration depicts Berkeley's Hate Man. I don't know if he ever went on or off his meds.

Friday, September 3, 2010

Inspirational YouTube Videos


These are the words I most dread at the end of a student presentation:

And now, we’re going to show a video that sums up our topic.

Then it comes: four minutes long, the mournful tones of Sarah McLachlan or Suzanne Vega playing in the background, the heartrending images, the atrocious use of punctuation.

As soon as the students announce the video, I cringe. Never mind that these videos are a kind of visual and rhetorical assault on my consciousness, making me feel violated for having to watch them. If my students had created them, I would still beam with pride. The videos represent the very sorts of projects that we hope the students will create, in fact—their assignment is to create a PowerPoint presentation that creatively presents their research on a social justice issue. They are first-semester college students, many of whom have severe learning disabilities or come from continuation high schools where they were never assigned homework. If their discussions of the issues are a bit melodramatic, that’s fine; we are happy if they simply complete the assignment without having a nervous breakdown or punching one of their group members.

The problem with the videos is that they are someone else’s overly sentimental PowerPoint presentations, posted on YouTube so that anyone with an Internet connection can watch them, absorb their message, overlook their poor organization, research, and spelling, and use them as part of their own presentation on the same topic.

By presenting the videos as part of a researched presentation, the students imbue them with a sense of credibility or authority, as though they count as research. I can envision the students watching the videos and finding them more powerful than the actual data they have collected; “Hey, guys, I finally found something interesting,” I imagine them saying.

The fact that these videos always appear at the end of the presentations is even more frustrating from the perspective of a writing teacher. Rather than end with their own ideas, findings, and voice, the students give away that platform to someone else, and worse, someone no more informed than themselves. The videos would bother me much less if the students would speak after them, explaining why they had chosen that video and what it meant to them. But instead, the video is given the final word, this anonymous authority rising out of the ether to lend imagined credibility and emotional power.

I have seen the same one about domestic violence several years in a row, a rambling stream-of-consciousness polemic. It juxtaposes a series of distressing images—a woman with a horribly bruised and swollen face, for example—with meaningless statistics, or cryptic bits of narrative, apropos of nothing that came before them: Forty percent of women say they have been abused by a partner; She is afraid to leave him.

Finally one year, I remembered to put a policy about YouTube videos on the assignment instructions: “Your presentation may include no more than one minute of any video that you did not create yourself.”

When presentation day came, one group ended by introducing a video: “We know you said that we shouldn’t have more than a minute of a video, but this one is really good,” they said to me.

Then the familiar music started, and the familiar series of domestic violence images and factoids rolled by once again.

I knew that every student in the group had experienced abuse, either in their parents’ homes or at the hands of high-school boyfriends, so I did not want to reprimand them for flagrantly dismissing the assignment instructions. But it saddened me to realize that they believed this cheesy video would be more powerful than their own words, ideas, and experiences. This is what years of schooling had drummed into their heads: that if something is worth saying, it has already been said by someone more important than you, and you’re better off using that person’s words rather than your own.

I’ve noticed that many of the power figures at my college also rely on the voices of others to support, or make, their arguments. Our former college president was a great fan of inspirational quotations. They adorned her office, engraved into glass paperweights and printed on postcards. She opened and closed every address with them, relying heavily on the classics: Your playing small doesn’t serve the world, she would say, or The definition of insanity is doing the same thing over and over and expecting different results. They were quotes attributed—verifiably or not—to great writers and thinkers. But I would bet my union-negotiated contract that these quotations were the only things that she, or most of the people who use them, had ever read by these esteemed authors.

The internet allows us to find a wealth of these kinds of dismembered quotations, inspirational words of advice from authors we have never heard of, without even having to leave our house and buy a book of them. My students have recently taken to opening their essays with barely-relevant passages by people ranging from George Bernard Shaw to Anthony Robbins, names that I am sure have no meaning at all to the students citing them.

With YouTube, we do not need to limit ourselves to using the words of published authors or well-known personalities or even people who know how to write. Anybody can put a video on YouTube, and it is possible to find a relevant, if poorly-constructed, opinion piece on any topic at all.

The college president took advantage of this expanded quotation marketplace in a speech welcoming all the school’s employees back to work after summer break.

“Last year, when I talked about leveraging abundance,” she said, “some people asked for specific examples to show what I meant. So this presentation has lots of examples.”

It turned out that these were not, as I presume “some people” (i.e. everybody) had wanted, examples of how colleges had leveraged their abundance, or suggestions as to how we might do so at our college.

No, the examples were YouTube videos. The first was a cartoon positing the premise that “everyone can be an entrepreneur.” Anyone who creates their own business is an entrepreneur, the video explained, whether that business is a lemonade stand, a yoga studio, or a software corporation. “I like the positive spirit of this video,” said the president. “It shows people taking initiative.”

She then showed a video of a nine-year-old boy who had given a motivational speech to thousands of teachers. “Do you believe in me?” he asked them, pointing into the crowd. “I believe in me.” The thousands of teachers applauded wildly.

“This video reminds us to always think of our students first,” said the president.

I imagine that the president used these videos in her speech for the same reasons my students use them: because they are fast, easy, and vaguely reminiscent of something like actual research. Finding original examples of local colleges using innovative strategies to stretch minimal resources (which is what I think leveraging abundance is supposed to mean) would take a lot of time, energy, and thought, not only to find the examples, but to thoroughly understand and explain them.

Don’t get me wrong: I actually love YouTube. I love the poorly-constructed opinion pieces, at least in theory, and often in practice. The democratizing of public expression is one of the best things about the internet. Everyone can express their views and have them read by strangers from around the world. Insecure students like mine can create a project that embodies their research, and then post it for their friends and families to see.

The problem comes when we allow things we find on the internet to stand in for well-thought-out arguments and evidence. With so much information instantly available, we begin to believe that doing research is as easy as shopping for shoes: point, click, it’s yours.

But just like with the shoes, it is easy to ignore the cost we will pay for our ideas, if we want them to be good. As an English teacher and a writer, I am painfully aware that good ideas are not free. We pay for them, not in money, but with our labor—the labor of thought, of real research, of reading real books cover-to-cover and formulating our own ideas about them rather than allowing them to speak for us. It’s a lot of work, but the rewards are worth it: really understanding what it is we are talking about.

Friday, July 30, 2010

Danger


After our weekly sparring class, my friend paraphrased an idea from Marcus Aurelius:

When a gladiator has to fight another gladiator who is unpredictable and dangerous, he does not resent or hate this other gladiator. Instead, he thinks to himself, “Danger.” This isn’t a negative judgment of his character but simply a factual statement: in the arena, this person is a danger to me, and so I must be careful.

Actually I’m quite certain that the word my friend used in place of unpredictable and dangerous was spazzy. This might sound immature, but for martial artists, spazziness has a very particular meaning. A spazzy fighter is one who makes abrupt, clumsy movements. If you spar with a spaz, he is likely to injure you with something that was not a purposeful technique. Perhaps he will elbow you in the forehead as you lean in to throw a body shot, or kick you hard in the Achilles tendon as he attempts to sweep your foot.

So the way my friend paraphrased Marcus Aurelius was something like: When a gladiator fights a spaz, he should not be annoyed with the spazzy opponent, but simply think, “Danger,” and try to avoid getting hurt, without any further negative judgments.

In citing this idea, my friend was thinking about his reaction to a specific incident of spazziness in our class that day. For most people, the instinctive reaction to a spaz is annoyance and frustration: why does he keep doing that? Controlling these sorts of emotions—frustration, anger, annoyance, hostility—is one of the main principles of fighting, since they will distract a fighter from performing well.

Sparring is as much a lesson in controlling emotions as in learning to attack and defend successfully. One of my friends put it like this: “In soccer, you get angry when somebody fouls you. But if you are boxing, the person is supposed to be hitting you, so you can’t get upset with them.” Sparring would be a pretty miserable activity if you got upset every time someone hit you—you’d be angry throughout all of your training, and would hate all your training partners.

Still, while I am usually quite content to be punched and kicked by graceful and clever fighters, I tend to think of spazzy sparring partners as my enemies. I don’t necessarily dislike them personally, but during the time I spend in the ring with them, they are likely to injure me and thus are my foes.

I took Aurelius’s hypothetical passage to mean that I should stop focusing on the antagonistic relationship between myself and the spaz, and instead direct my attention only to the immediate danger that he poses to me. When he does something that might injure me, I should not think, This guy is such an asshole, but rather, without emotion, Watch out for that move. It is sort of a love-the-sinner-hate-the-sin approach to sparring, except there is no hate for either the spaz or his spazzy moves: there is only the dispassionate assessment of the danger involved and how to avoid it.

This idea made a strong impression on me, and for months I searched in vain for the actual passage from Marcus Aurelius’s Meditations. When I asked my friend to help me find it, he could not remember ever having described it to me. You know, that passage about spazzy gladiators? I would ask him. I don’t remember any passage like that about gladiators, he would tell me.

Still, even without the exact quote, I loved this idea, which seemed to me equally applicable to fighting and to the rest of life. I would think of specific friends or ex-boyfriends who were prone to hurt my feelings: Danger, I would think, as I calmly distanced myself. I would meet people whose lives seemed filled with excessive melodramatics: Danger, I would think, deftly sidestepping their overtures of friendship. It was nothing personal, not an expression of dislike or personal affront. It was just what I needed to do to keep myself safe.

Increasingly, I could apply this principle to almost anyone. That girl whose backpack kept bumping into me on the bus? Danger. That guy smoking a cigarette slightly upwind of me? Danger. That frustrating coworker? Danger. That ex-girlfriend of my ex-boyfriend? Danger and danger. I could write off almost anyone I didn’t want to deal with as a purveyor of danger, even if it was only the danger that they would annoy me. I could bob and weave my way through life, refusing to engage, positively or negatively, with anyone who would cause me the least bit of irritation or distress.

I began to suspect, eventually, that I was interpreting this passage incorrectly. And then, a few weeks ago, after every kind of digital and analog search through Meditations, I finally found the passage my friend had been referring to. One reason that it had been difficult to find was that Aurelius had not used the word gladiator; instead, he had referred to the circus, which could be a reference to other types of performative combat:

“If an antagonist in the circus tears our flesh with his nails, or tilts against us with his head, we do not cry out foul play, nor are we offended, nor do we suspect him afterwards as a dangerous person. Let us act thus in the other instances of life. When we receive a blow, let us think that we are but at a trial of skill and depart without malice or ill will.”

But besides this circus/gladiator disparity, the idea of the passage is a bit different from the interpretation I had been embracing. It says that we should not see our opponent as dangerous, that we should not be offended by the danger he affords us. Instead, we should see him as a helper, someone who is collaborating with us to make us better.

While this is similar to my interpretation, I was still viewing the “dangerous” person as a kind of enemy, at least during the time that we were engaged in an activity (whether sparring, a work meeting, a conversation). My refusal to fully engage with that enemy was indeed a sign of malice and ill-will, even though I thought it was not. It’s nothing personal, I would think, but viewing someone as a dangerous enemy to be deflected and avoided is certainly personal, just as it is personal when “Christians” tell gay people that their relationships are a sin, no matter how much they profess to still love the sinner.

In Psychotherapy East and West, Alan Watts argued that our binary language structures prevent us from seeing that even our real enemies, the ones who truly wish to hurt us, are also helping us: “An inadequate system of classification has made it too difficult to understand that there can be an enemy/friend and a war/collaboration.” Watts describes how a war between two “enemy” societies might in fact have positive benefits for both sides: keeping their populations in check, forcing them to hone their martial skills. Even as they consider their interests to be diametrically opposed, the two sides are collaborating in a single system, and their war is a kind of partnership.

While Watts’s example of war/collaboration makes logical sense, emotionally it is difficult to think of the brutal violence and death of war as something positive. Still, this mental exercise points out the relative ease of understanding people like the spazzes as my partners and collaborators. Certainly, they are making me better at sparring by forcing me to truly defend myself against whatever might come, whether that is a sanctioned kickboxing technique or an accidental headbutt. But also, they are strengthening my ability to cooperate in any situation, to not shut myself off to people whom I find challenging, to compete with pure focus on my own performance rather than on the properness of their strategy. After all, my annoyance with them, no matter how detached, impersonal, or provisional, is really just a way of causing myself discomfort and grief, sensations that could be completely avoided if I simply refused to get annoyed. While I might think my opponent is the dangerous one, my reaction to him is the true danger to myself.

The illustration depicts a black eye I received from an accidental headbutt.

Thursday, July 15, 2010

Appetite

A woman gave my friend Tom the following advice:

You should never date the person whom you are most attracted to. If you go to a party and talk to a woman that you find incredibly attractive, don't pursue her. Skip past the woman you find second most attractive as well. The woman who comes in third--that's the woman you should date.

"It's ridiculous," Tom said to me. "It's saying that your appetites are fundamentally wrong. It's like saying, if you feel like eating broccoli, don't eat broccoli. Eat a food you feel more indifferent about, like, I dunno, cheese or cucumbers or something. No, I say if you want to eat broccoli, you should eat broccoli."

Like my childhood therapist, who had an eating disorder, Tom tends to use a lot of food analogies. Lately I’ve noticed myself using them more and more, too.

"But Tom," I said, "what if you crave ice cream? I think that's what she's getting at. Maybe the person who appeals most to your appetite is actually the worst thing for you."

Tom scoffed. "That's not an appetite--that's a sickness. If your appetite isn't sick, then you crave broccoli, not ice cream."

This point was easy for me to concede, because I never crave ice cream. I do crave cookies, especially chocolate chip with walnuts, or peanut butter, or anything with a fruit flavor, like the lime sugar cookies that I ate four of when my coworker brought them to a meeting. Luckily we weren’t talking about cookies, however; we were talking about ice cream, which I could do without, frankly.

"But Tom," I persisted, "A lot of people have unhealthy appetites; I think that's what she was getting at. I mean, look at me." I thought of a guy I had recently dated; let's call him Mr. Confused. "I had such a strong appetite for Mr. Confused, and he was horrible for me."

"No, no, no," Tom said. "You were not wrong in your appetite for Mr. Confused. He's smart, and attractive, and interesting. Hell, I congratulate you for bagging that guy." Tom sounded pretty enthusiastic. "Just because he couldn't handle being with you doesn't mean that your appetite was wrong. Your appetite was dead-on."

At the time, I took great comfort in this. What a nice thought--my appetite wasn't wrong. Mr. Confused just couldn't return my feelings for a complex array of really dumb-ass reasons. The problem lay with him, not with me.

There seem to be two predominant ways of understanding appetite. One is an optimistic, or we might call it Lockean, viewpoint: our appetites are fundamentally good, and will steer us towards what is healthiest for us if we will only listen to them. This model is promoted in a book I read recently about raw food diets. One of the benefits of this diet, the book claimed, is that after a brief, unpleasant “detoxification” phase, during which the dieter might feel nauseated, dizzy, listless, etcetera, the dieter will begin to crave healthy, raw foods and will no longer crave junk food and desserts. Tom’s reasoning followed this same model. A healthy person will have a healthy appetite for healthy foods; an appetite for unhealthy foods is a symptom of illness, or perhaps an illness in itself.

The pessimistic, or Hobbesian, view of appetite is the one the woman who advised Tom was expressing. In this view, appetites are fundamentally deceptive or misleading. She believes that if we find somebody extremely appealing, it is a sign that the person is bad for us. This is like when we are too full to finish our dinner but still hungry for dessert; our appetites lead us towards what is unhealthy and unnecessary. The doctor on the advice show Loveline agrees with this philosophy, often counseling women who are drawn to “bad-boy” types that they must choose someone who appeals less to their appetite in order to find a healthy relationship.

Paul Cameron, founder of the anti-gay Family Research Institute, takes the Hobbesian position regarding what he considers to be the danger of homosexuality: “If you isolate sexuality as something solely for one's own personal amusement, and all you want is the most satisfying orgasm you can get—and that is what homosexuality seems to be—then homosexuality seems too powerful to resist.” In contrast, he says that, “Marital sex tends toward the boring end. Generally, it doesn't deliver the kind of sheer sexual pleasure that homosexual sex does.” Sex with men is clearly what appeals most to Cameron’s appetite, yet he finds it morally superior to marry someone for whom he does not have much appetite at all, like people who force down their daily portion of vegetables for the sake of their health, even though the only things they want to eat are cheeseburgers and nachos.

The Lockean philosophy sounds more appealing and reasonable because it portrays human nature as innately tending towards good—given the right information and options, we will choose what is beneficial for us. But of course, public interest in food options like the KFC Double Down or Friendly’s Grilled Cheese Burger Melt might steer us towards the Hobbesian view that our appetites are inherently destructive and need guidance from a benevolent leader or perhaps dietitian.

Last week, I went to a workshop on sugar, led by a nutritional counselor I know. She asked us to write down our questions so she could answer them. Several people asked her about the sugars in fruits, a category of food which is both “healthy” (natural, unprocessed, full of vitamins) and easily appealing to many people’s appetites. Of course, the very feature that makes fruit appetizing is the one that might make it somewhat unhealthy, its high sugar content.

“How much fruit is it okay to eat?” one person asked. “Can I eat as much as I want?”

The counselor couldn’t give us a clear answer on this. Instead, to her credit, she asked us how our individual bodies responded when we ate fruit: “Does it ever give you that feeling of having a sugar rush and crash? Have you been gaining weight? If so, you might want to gradually cut back on the sugar you consume from fruit. Maybe try eating a sour apple like a Granny Smith, instead of a sweet apple like a Fuji.”

I was happy to see that I wasn’t the only person who grimaced. Fujis are my favorite apples; if I have to substitute Granny Smiths, I would prefer just to skip the apple altogether. I am someone who thinks a lot about what I eat and tries to make good choices, but I can’t stand the thought of putting limits on how much fruit I eat. In the summer, when strawberries and cherries and blueberries and mangoes and peaches and plums and every other kind of fruit I love best are in season, I eat really incredible quantities of them. I have no way of telling whether all this fruit is bad for me or whether I am eating too much of it. I know that high blood sugar has been shown to be a risk factor for cancer and other diseases, and I don’t eat much refined sugar, but all the fruit I eat could be undermining my health. I hate the idea that the one type of food that is wholesome, nutritious, and endlessly appealing to my appetite might be bad for me.

Here is the Hobbesian philosophy of appetite in action: unchecked by some instruction about what is good to eat, I would eat enough fruit to potentially damage my health.

Ultimately, the problem with assessing whether our appetites lead us towards what is good for us or bad for us is that we don’t actually know what is good and bad for us. We have so many choices to satisfy our appetites, some pretty obviously good (broccoli), some bad (grilled cheeseburger melt). Michael Pollan describes the concept of the omnivore’s dilemma primarily as the process of avoiding toxins: “When you can eat just about anything nature has to offer, deciding what you should eat will inevitably stir anxiety, especially when some of the potential foods on offer are liable to sicken or kill you.” This sounds as though our modern food practices should be simple: avoid things that will sicken and kill us. Rhubarb stems: good. Rhubarb leaves: bad.

But as the rest of his book demonstrates, even people who are trying to eat the most healthy food, both for themselves and for the planet, may not be able to do so. People buy “all natural” food, not realizing it is filled with unhealthy but natural additives. “Free range” chickens spend five of their seven weeks of life crowded into a warehouse, and are allowed access to a small yard, which they are too fearful to enter, for the last two weeks of their lives.

It is not so scary to indulge in our unhealthy appetites when we have willingly accepted their unhealthiness. Enthusiasts of the double down and the grilled cheeseburger melt are not trying to enhance their health when they eat those things; in fact, the appeal of those foods may lie in the willful flouting of health. I certainly had no delusions, when I dated Mr. Confused, that he was good for me, though I’d like to think of him more as a chewy, home-baked chocolate-chip cookie than as a greasy fast-food sandwich variant—something totally unhealthy but of such undeniably high quality that you’d be stupid to turn it down.

The real fear is that you might pick something you thought was good for you, but it turns out not to be. All that putrid, dyed-yellow margarine people ate in the seventies and eighties was terrible for them; meanwhile, they could have been eating actual butter. If Tom were to pass up the woman he was most attracted to and date the woman he found third most attractive, he might find her to be as bad for him as any other woman might be—because really, it’s almost impossible to tell who is good or bad for us, just as we can’t tell that our chicken did not ever go into its allotted bit of “range.” That would be the worst mistake, like forgoing the ice cream in favor of a salad covered in ranch dressing. It’s not what you wanted, and it’s terrible for you nevertheless.

Wednesday, June 30, 2010

Love is the Drug

My friend introduced me to a very useful but little-known word, limerence. It means the pain that we feel when we are obsessively in love with someone, when we can’t stop thinking about them, when our mood directly correlates with whether we’ve seen them and how we’ve been treated by them that day.

This term was coined by a relatively obscure psychologist in the Seventies, and it has no obvious etymological roots. This inscrutability mirrors the nature of the concept itself: like a prime number, the idea can’t be clarified by dividing it into parts. Nor is there any synonym in English that parallels the word’s meaning. The closest would be obsessive crush, or unreasonable pining, or love so bad it hurts.

And yet, in our culture that elevates romantic love over all other kinds, limerence is an everyday byproduct of our need to find our own mates, rather than having them assigned to us by our families or dictated by social constraints like caste, class, or proximity. We can pair ourselves with virtually anyone, and this unending possibility leads to unending disappointment as we project ourselves into the lives of inappropriate and uninterested potential partners, wondering why they can’t see how great we’d be together.

Using the famous Eskimos-have-fifty-two-words-for-snow logic, you would think English would have dozens of words meaning obsessive, unrequited love, love that makes people anguished, insane, irrational, love that distracts us from every meaningful thing in our lives, that makes us irresponsible, unreliable, unproductive, that makes our lives into a long visit to the oncologist’s office, hoping with all our being for good news, crushed but not surprised when the news is bad.

But aside from limerence, we don’t really have a word to describe this state of being, which might explain why the emotion itself is so difficult to remember or relate to until we are experiencing it ourselves. Every time I find myself in this condition, I am shocked by how physical it is, how immune to reason, how distraught I can be over somebody who I have absolutely no claim over.

Nor do we have simple words to describe all the other extreme psychological and physiological consequences of attraction: obsessive elation, a feeling that nothing else matters besides seeing the object of our affection, the diminished importance of everything we usually care about, horrifying loneliness when the person leaves town for two weeks, the certainty that we could never be enough for this person, wanting to kill the person, wanting to kill the person’s attractive friend/coworker/second cousin, wanting to kill ourselves, the lacklusterness of everything else once the person is gone, the forgetting of what the purpose of life was and how we got through our days before we met this person.

What we do have, however, are hundreds of thousands of songs alluding to this sort of love, songs that are as blank and inscrutable as the word limerence itself, except when we are experiencing love-related derangement, at which point the lyrics suddenly light up with flagrantly obvious meaningfulness. For example, crazy and baby are the most commonly rhymed words in pop songs, despite the fact that they don’t rhyme at all.
You drive me crazy, baby.
I go crazy for you baby.
Love is crazy, pretty baby.
Our stubborn insistence on forcing these two words together shows how acutely we feel love as a mental illness. And the consciousness-altering effects of love are evident in all the songs that link the words or concepts love and drug:
Love is the drug for me.
You are the perfect drug.
I want a new drug, one that makes me feel like I feel when I’m with you.
These songs all seem to describe the pleasant part of the love-trip, the drug “that makes me feel like I feel when I’m with you” rather than the drug that makes me feel like my life is completely pointless without you. We have to focus on this aspect of love or we would come to our senses and go cold turkey, just as we go out drinking for the high, not for the hangover.

Still, there’s a new love-drug song out right now, “Your Love Is My Drug,” that describes the negative side effects:
I’m looking down every alley.
I’m making those desperate calls.
I’m staying up all night hoping,
Banging my head against the walls.
Songs like these show that we instinctively understand the connection between love and drugs, that love makes us completely out of our minds, that it’s a bunch of chemicals that make us feel this way.

We tend to minimize and dismiss this insanity, but we feel it, and we see its results every day: students dropping out of college over break-ups, people ruining their hard-earned careers by having affairs with their coworkers, politicians sneaking out of the country to be with their lovers when they are supposed to be governing, people walking in front of trains because they can’t imagine a life without the person who has rejected them.

These are serious drugs, with consequences as bad as or worse than those of cocaine or heroin. But we can’t outlaw them or wage a war on them, because these drugs are in our own bodies, and we can’t get them out until we’re dead.

Friday, June 18, 2010

Verbal Jiu-Jitsu


“Are you challenging me to a fight?”

The question above is an official sample of verbal jiu-jitsu, as sanctioned and promoted by the Gracie Brazilian jiu-jitsu empire. In their Bullyproof instructional video, Ryron and Rener Gracie explain how kids can use the principles of Brazilian jiu-jitsu to stand up to bullies. Their program emphasizes respect, self-discipline and restraint, and the avoidance of physical force. Verbal jiu-jitsu is one tool a child can use to assert dominance over the bully without getting into a fight.

This question was not what I had expected from the “verbal jiu-jitsu” section of the curriculum. I had anticipated a statement that would allow the child to stand up for himself—something like, “I am not going to do what you tell me to”— not an invitation for the bully to take a swing at him.

The Gracies explain that this is a strategic question that might produce several outcomes. The bully might answer in the negative, in which case he or she has effectively backed down and can’t continue harassing the child (according to the Gracies, although I envisioned alternate scenarios, like, “No, I’m just going to carry on making your life a living hell, if that’s okay with you”). Or the bully might answer in the positive, in which case he or she will have to make the first move in the fight, at which point the child is justified in defending him or herself. That’s when the physical jiu-jitsu comes into play.

In this way the question truly does mimic Brazilian jiu-jitsu, which anticipates a range of outcomes and responses in reaction to any single move. A jiu-jitsu lesson often sounds something like: You will pull on your opponent’s right arm with your left arm. If he steps his left foot forward, you will put your right foot behind his knee. If he kicks his leg back for balance, you will put your foot behind his opposite ankle. If he falls to the side, follow him and take his back.

So I suppose the question about whether the bully is actually looking for a fight is a good example of verbal jiu-jitsu. Rather than waiting passively and without a strategy, the child asks a question that could provoke several predictable reactions, and then prepares for each possibility.

Whenever I hear the term verbal jiu-jitsu, I wonder to what degree it is possible, or advisable, to view a verbal interaction as a martial art. On the one hand, we often face situations where we are trying to use language to achieve some end, to gain power over a situation or get something we want, and strategic use of language could help us do that. On the other hand, all dialogue can be seen as trying to get our own interests met; so how do we know when to call off the battle and just talk?

Most of the time that I hear the term—and I hear it a lot, mostly on NPR—it seems to refer to a use of language that is misleading or tricky, in either a positive or a negative way. A common statement like, He is a master of verbal jiu-jitsu, might be a compliment or an insult.

When I hear the term used to mean trickery, I often wonder what the speaker thinks jiu-jitsu is. The word jiu-jitsu means “the art of yielding,” which refers to using an opponent’s force against him, usually to trap him into a technique that causes pain and damage without much force, such as a joint lock or choke.

I imagine that the concept of verbal jiu-jitsu came from someone who actually knew what jiu-jitsu entailed, someone who observed similar uses of deflection and submission in language. But now, the concept of jiu-jitsu is as blurry as the line we are toeing—or is it towing?—in that other dying metaphor. It reminds me of a recent complaint, sent in by an NPR listener, about the use of the word kabuki to refer to something that is a sham, a performance that is just for show. These Japanese terms may lend an air of worldliness to our critiques of our own culture, but not if we don’t know what the words actually mean.

Occasionally, though, I see verbal jiu-jitsu used to mean something related to the actual principles of jiu-jitsu. One good example is from a website critiquing an exchange between President Barack Obama and Senator John McCain:
MCCAIN: I would just make one comment. Why in the world, then, would we carve out 800,000 people in Florida that would not be—have their Medicare Advantage cut? Now, I proposed an amendment on the floor to say everybody would be treated the same. Mr. President, why should we carve out 800,000 people because they live in Florida to keep the Medicare Advantage program, and then want to do away with it?

OBAMA: I think you make a legitimate point.

MCCAIN: Well, maybe….

OBAMA: I think you do.

MCCAIN: Thank you very much.
Despite some typos and spelling errors on the website, the person who called this verbal jiu-jitsu was actually working with an idea of dialogue as a martial art. He writes, “When Obama does not meet rhetorical force with force of his own McCain quite simply does not know what to do next and so he just stops.”

In other words, Obama is using the force of McCain’s attack against him. McCain expects a disagreement, and when he is confronted instead with an agreement, he rotely disagrees with it, which leads him to disagree with his own initial statement: “Well, maybe [I make a legitimate point],” he says. This reflects a classic martial arts principle: if your opponent pushes, you should pull; if he pulls, you should push. McCain pushed Obama, expecting him to push back—when Obama pulled instead, McCain lost his balance and fell on his face.

One other place this metaphor gets fully teased out is in a Wall Street Journal interview with David Mamet. The article’s title, “Mamet’s Jiu-Jitsu Isn’t Just Verbal,” refers to the playwright and filmmaker’s six years studying Brazilian jiu-jitsu, which was the topic of his movie Redbelt.

In contrast to its title, the article spends quite a bit of time explaining how Mamet’s jiu-jitsu is verbal. Most obviously, as a playwright, he is known for his obsessive focus on the slipperiness and trickiness of language.

In the article, Mamet explains how he applies lessons from jiu-jitsu to conflicts in his regular life. For example, when Mamet was betrayed by a friend, he asked his teachers how he should handle the situation.

"My teacher Renato, of course, came back with 'Don't carry someone else's weight. Let him carry the weight; let it come back to haunt him.' This is one of the central tenets of jiu-jitsu. When you carry the other person's mass you tire yourself and so lose your ability to think clearly. That was the group's way of telling me to let the situation go, to walk away—which I did."

But when Mamet uses so-called verbal jiu-jitsu on the interviewer, he really just gives obnoxious, evasive answers to the interviewer’s questions. When the interviewer asks how Redbelt relates to Mamet’s other work, Mamet responds, “It’s later.” When the interviewer asks if there are differences between the use of language in this film and in his earlier ones, Mamet responds, “None.”

The interviewer considers these deflections—which many people would just call rude—to be linguistic martial arts moves: “Mr. Mamet proved as slippery as a well-oiled grappler,” he writes, in preface to this section of the article.

This made me wonder, when we think we are being martial artists with our speech, how often are we just being rude and obnoxious?

Like Mamet, and perhaps like most people who study martial arts, I have tried at times to apply these arts to verbal interactions. I sometimes feel a thrill of power as I sit through a heated debate in a meeting at work and keep myself from engaging in the pointless bashing of conflicting ideas against one another. I sink into a kind of meditation, waiting until the arguers have thrown all of their punches. Then, if there is still time, and if I think I have anything productive to add that will help resolve the problem, I will say it. If not, I don’t waste my energy countering force with force.

I know this strategy is working because it’s had the unfortunate side effect of getting me dragged to a lot of meetings—I’ve been told that I am seen as a good mediator. However, I am often also told that I freak people out when I am sitting, blank-faced and disengaged, waiting for the arguers to wear themselves out. Which makes me wonder—am I like Mamet? When I practice the martial art of discussion, am I being strategic, or am I just being an asshole?

Ultimately, the idea of showing off one’s clever wrestling moves in conversation is misguided. With the exception of some boxers and professional fighters, who like to use talk to bolster their confidence, real martial artists don’t talk a lot. They listen, and they speak selectively. That’s the real verbal jiu-jitsu, knowing when you need to assert your views and when you don’t. I’m not always so good at it, but I’m learning.

Thursday, June 10, 2010

Roads

Every morning, I drive on the 580 freeway thirty miles southeast to my workplace in Livermore. I spend at least an hour every day on this freeway—speeding past all sorts of plants and walls and dead animals and garbage and cityscapes and views of the bay that I could barely begin to describe, if I’ve noticed them at all. The freeways are a bizarre no-man’s land, somewhere we go every day, but not anywhere that we think of as a place. In fact, we do all we can to pretend the giant roads don’t exist, building high walls to shelter our neighborhoods from their noise and dirt.

I’ve been thinking a lot about freeways lately. Like approximately twenty percent of the people I know, I recently read Cormac McCarthy’s novel The Road. In case you’re one of the other eighty percent, The Road follows a father and his young son as they navigate the perils of a post-apocalyptic world where almost all living creatures have died in some unnamed disaster. The father and son travel by foot down a state highway, pursuing the bleak hope that conditions will be better further south and closer to a coast.

Lots of stories use roads as symbols. In the hero’s journey, the structure described by Joseph Campbell that underlies legends and adventure stories from around the world, the road of trials is the central part of the story, the series of mystical challenges that the hero must overcome.

George Lakoff’s theory of conceptual metaphors helps explain why roads are found in so many narratives. Lakoff points out that almost any event that follows a progression can be symbolized as a journey. People tend to talk about their lives, their romantic relationships, their education, and their careers as journeys. And the metaphor, or perhaps the metonym, for a journey is often the path that it follows, the road itself. So if I ask you how things are going with your significant other, you might say:

We’re at a crossroads.
It’s been bumpy.
We’re encountering obstacles.
We’ve reached a dead end.

This metaphor of road as progress explains why so many stories center on roads. The road becomes a manifestation of the forward movement of the plot. Dorothy follows the yellow brick road as it leads her to every character and obstacle she is destined to encounter; and when she gets to the end of the road, the story reaches its climax. In fact, we might see the road as a symbol for narrative itself.

Roads are such a fundamental part of so many stories that they tend to become invisible, just like the actual freeways we hide out of sight. But the road in McCarthy’s novel is something different. It isn’t just a pathway for the stories and the characters to follow. It’s a thing. The characters walk along it, but they also sleep on it. They kick ash out of it, and climb over dead tree limbs and abandoned vehicles that litter it.

It reminds me of another novel, Octavia Butler’s Parable of the Sower, which also depicts a journey down a road that has been largely abandoned by a fallen civilization. The two novels share a fascination with how one might use roads after the cars that they were built for are gone, along with all that went with those cars: the anonymity, the safe distance from one’s fellow travelers, the orderly rules. It is jarring to see a freeway as a place where people might walk, eat, sleep, fight, all the things that people usually do at their destinations, not on the roads that take us there.

Since the characters are not flying down the road at seventy miles per hour, readers can see it for what it really is: one of the last, most enduring signifiers of what it means to live in a society. Even after society has crumbled, the roads remain, providing an open path for travel and a promise that they will lead to a destination. When everything around has fallen into chaos and there is no promise of the food, shelter, protection that a society affords, the roads still provide a service, a trusted path, so that travelers don’t need to worry about navigating their journey.

Roads signify civilization because they assert mankind’s dominance over the mystery and disorderliness of nature. They take uncharted territory and impose human ideas of logic and organization onto it. The Roman Empire was famous for its use of roads to signify its imperial, “civilizing” power. The roads reflected the logic of the empire, all of them leading from Rome to a conquered territory.

In America, roads symbolize social welfare and democracy. They are one amenity that Americans across the political spectrum can agree our government should provide. Conservatives want to limit government’s role in providing education or health care, but no one argues that we should privatize the roads. Roads are for everyone, the ultimate public service, shared equitably by rich and poor alike. Of course, roads in rich areas are better maintained than roads in poor areas, and poor people are often harassed when they drive through affluent neighborhoods. Maybe that’s why freeways, rather than local streets, are the ultimate symbol of democracy: the connectors between places, taking us out of our own neighborhoods and cities and states and into any other place we wish to go.

During recent debates about nationalized health care, Whole Foods Market CEO John Mackey wrote, “A careful reading of both the Declaration of Independence and the Constitution will not reveal any intrinsic right to health care, food or shelter. That's because there isn't any. This ‘right’ has never existed in America.”

I thought at the time: we don’t have a right to a lot of things. There is no constitutional right to public education, or transportation, or a fire department, or police or military protection, or roads. But I’m really glad we have all of those things, since they connect us to one another and shape us into a community.