Josna Rege

Archive for the ‘1990s’ Category

466. Originals and Adaptations

In 1990s, 2000s, blogs and blogging, Books, history, Immigration, Media, postcolonial, Stories, storytelling, United States on April 18, 2020 at 7:16 pm

This is the fifteenth entry in a month-long series, Fifty years in the United States: An immigrant’s perspective, as part of the annual Blogging from A to Z Challenge.

Originals and Adaptations.

There was a period in the 1990s and early 2000s when it seemed that every other book was a contemporary rewriting of another book and every film either an adaptation of a book or a remake of an earlier film. Whether or not this was a new phenomenon, it was something everyone seemed to be talking about and, increasingly, a cause for concern. It was as if we were afraid that we’d run out of things to say, and that all we could do was recycle old stories in new guises.

There was Michael Cunningham’s 1998 novel, The Hours, each of whose three interwoven subplots engaged differently with Virginia Woolf’s brilliant 1925 novel, Mrs. Dalloway. One imaginatively follows the struggle of the author herself, another explores the inner emptiness of a 1949 version of Clarissa Dalloway, and a third focuses on a New York woman’s party preparations in the 1990s for a friend dying of AIDS. The Hours was wildly successful, winning the Pulitzer Prize for fiction and later being made into a 2002 Oscar-winning film, starring Nicole Kidman, Julianne Moore, and Meryl Streep. (Incidentally, Mrs. Dalloway is one of my favorite novels of all time and happens to be a favorite of readers in coronavirus quarantine.)

Alice Randall’s The Wind Done Gone (2001), billed as an “unauthorized parody” of Margaret Mitchell’s 1936 blockbuster novel, Gone with the Wind, challenged the powerful myth of a pre-Civil War good life among Southern white people, and got into legal trouble with Mitchell’s estate. Unfortunately, although the novel addressed important matters that were regularly swept under the rug (in this case the mixed-race children born of white slaveholders’ unwelcome dalliances with black women), the rewrite was unable to challenge the seductive nostalgia of the self-justifying original.

Aladdin was criticised for its Orientalist stereotypes of the Middle East and Asia (Credit: Alamy)

Disney’s blockbuster, Aladdin (1992), was a heavily Disneyfied cartoon adaptation of that oft-retold Arabian Nights tale. I must confess that I refused to watch it on principle, much as I adore Robin Williams, who voiced the genie, because of the Orientalist stereotyping in the movie’s representations and song lyrics. Sophia Smith Galer, writing about Dr. Jack Shaheen’s successful campaign to change some of them, noted that the original lyric in the first verse of the song “Arabian Nights” described Arabia as a place “Where they cut off your ear if they don’t like your face.” This lyric was changed; however, many of the crude stereotypes remained.

A well-known 1990s example of a contemporary movie based on a classic novel is Clueless, the 1995 remake of Jane Austen’s 1815 novel, Emma, set in the wealthy city of Beverly Hills, bordering Los Angeles and Hollywood, rather than among the landed gentry in the countryside around London, with spoiled, self-involved valley girl Cher (Alicia Silverstone) in the Emma role. It’s not clear whether the movie’s audiences were spurred to read the novel, or even how many of them made the link between the two. However, viewers of Clueless do make the link to Autumn DeWilde’s Emma (2020), a more direct movie adaptation of Austen’s beloved novel (which, another confession, my mother presented to me when I was about ten, but which I didn’t actually read until I was well into adulthood).

Kenneth Branagh’s Mary Shelley’s Frankenstein (1994) was just one of numerous film adaptations of Mary Shelley’s 1818 novel going all the way back to James Whale’s 1931 sci-fi horror film. Branagh’s adaptation, also a horror film, made much of its faithfulness to Shelley’s plot, as the title itself signals. And this is the point I want to make here regarding the relationship of an adaptation to the original. Mary Shelley’s Frankenstein protested too much, anxious to bill itself as closer to the original than many of its predecessors, and indeed it was, at least as far as the plot went. But the overall effect was merely gruesome. Consequently, I don’t remember much of the film except for the blood and gothic horror of some of the scenes featuring Helena Bonham Carter as Elizabeth.

In different ways, both the theorists Fredric Jameson and Jean Baudrillard point to a postmodern nostalgia in late twentieth-century works that signals the “loss of the real,” that is, the lost ability to distinguish between the real and the reality-effect, or simulation. Hence Branagh’s anxiety to present his film as authentic because of its faithfulness to the original. I don’t know if either Jameson or Baudrillard would have said this, but I wonder whether our need to link fictional works to their authors’ actual lives, as in Michael Cunningham’s The Hours, or the effort to make an adaptation “relevant” by setting it in the present moment, also bespeaks this anxious need for “authenticity.”

Postmodern works are particularly known for the self-reflexive “metacommentary”, that is, their tendency to dramatize and draw attention to what they are trying to do, instead of just doing it. The 2002 film Adaptation was the epitome of this genre, a “meta-film” that drew attention to the anxiety of adaptation, with one brother writing an adaptation of a novel while the other brother works on an original screenplay.

Back in the 1990s I was decidedly old-school. Having been raised amidst books and having hardly seen any film or television until I came to the U.S. as a teenager, I was a snob about literature, the literary, and the value of originals over adaptations. I never ceased to be horrified whenever my students hailed Disney’s Jungle Book as a classic, and I winced every time I heard Baloo the bear’s name mispronounced. Had they never heard of Rudyard Kipling’s The Jungle Book and All the Mowgli Stories? I would ask in scandalized tones, remembering my father lovingly reading them to us at bedtime. But they hadn’t, and their warm fuzzy feelings were routinely reserved for films that made me shudder.

However, the times have changed and so have I. Now I believe that a well-made film adaptation of a novel can actually improve on the original (case in point, director Mira Nair’s film of The Namesake over Jhumpa Lahiri’s original novel of the same name). I also argue against the very idea of an original. For example, can we ever reach back to an original of the much-travelled, much-translated Arabian Nights, which has always been a compilation of disparate tales, loosely connected by a frame story? Even my beloved Jungle Book—how authentic could it be, springing out of the nostalgic imagination of an Englishman full of contradictions, desperate to recover the beloved India of his childhood but known for his jingoistic nationalism and justification of British colonial rule? Nowadays I suspend my judgement a little and just set out to enjoy the novel or film. Whether it is faithful to a supposedly authentic original is no longer so much of an issue for me; I’m more interested in how and why it speaks to me now, in this moment.

Of course, as an immigrant and a postcolonial critic I’m attentive to whose story a given work tells and from what perspective, and how the different characters are represented. Postcolonial literature is known for challenging dominant elite and colonialist representations and for privileging the voices and stories of the marginalized.

If Baudrillard and other critics were warning us of the Loss of the Real in the late twentieth century, the manufacture and marketing of fake news on social media today has taken it to the nth degree. The press briefing is the reality show, with avid fans following the conversation on Twitter after the show. In this post-truth era, it doesn’t seem to matter whether or not the powerful are telling the truth, as long as what they say confirms the pre-existing beliefs of their target audience (confirmation bias).

In the current COVID-19 pandemic, the reality is so terrible that it is certainly understandable why one might want to deny or soften it, but unfortunately things will get a lot worse if it is not faced head-on. Nowadays, in search of originals, commentators are returning to earlier outbreaks and pandemics, such as the yellow fever outbreaks in Louisiana that kept recurring in the mid-19th century; or the global flu pandemic of 1918, which followed hard upon the (First) World War and killed at least 50 million people worldwide; or, much closer to the present, the question of why some countries have responded more effectively than others. Returning to these old stories is valuable if it teaches us how to avoid the mistakes of the past. Whether COVID-19 is like other coronaviruses or entirely different from them is for the scientists to determine. It doesn’t matter much whether it’s an original or an adaptation; what matters is how we respond to the crisis, how we learn from the experience, and how we protect the most vulnerable among us.

Tell Me Another (Contents to Date)

Chronological Table of Contents

464. Middle Age

In 1990s, Aging, Family, Immigration, parenting, reflections, Stories, women & gender on April 16, 2020 at 10:26 pm

This is the thirteenth entry in a month-long series, Fifty years in the United States: An immigrant’s perspective, as part of the annual Blogging from A to Z Challenge.

Middle Age.

In the late 1990s I officially entered middle age, if the authority of the Oxford English Dictionary and the United States Census is to be accepted. Since they both designate middle age as the years from about age 45 to 65, I am just moving out of that middle period now, and entering a whole new stage of life. But can I cast my mind back to those years in which I was still approaching it? To be honest, it is all a bit of a blur.

During the decade of the 1990s our son moved from starting kindergarten to finishing his first year of high school, with the dizzying array of activities that fill those years. How busy we keep our children! In parallel, I completed my doctoral work and started my first full-time faculty position, a 215-mile round-trip commute north of us. Rather than relocate our nuclear family, which was settled happily in a congenial community with our parents on both sides having recently retired nearby, I opted to drive up on Tuesday mornings, rent a room in a house for two nights a week, and return home on Thursday evenings. I suppose it worked, more or less, but it was exhausting, and the almost-continuous shuttling made it hard to simply rest in any one place for long. Sometimes I wonder what it was all for. Perhaps that’s the nature of the striving that defines so much of our working lives. At the time it seems essential; but in retrospect, not so much.

However officialdom defines age groups, perceptions of them also vary depending on place, education, and social class. In the mid-1970s, when I was looking into midwifery, one of the paths I considered for a time after college, the British midwifery manual labeled a thirty-year-old first-time mother an “elderly primipara.” (Now, by the way, that age has been scaled up to thirty-five.) In the 1980s, when we moved to a farm in a rural community, I was an ancient first-time mother at thirty; there were plenty of grandmothers not much older than I was. But when in 1990 we moved to the university town where we still live, I was enviably young to have a kindergartner at 35, since so many women had postponed having children until they were established in their professional careers.

The 1978 portrait of the Brown sisters (© 2014 Nicholas Nixon)

There’s another interesting thing about the relativity of age: one’s perception of one’s own age in relation to the rest of the population. In my twenties and early thirties, I felt that I was younger than most other people round me. Whether or not that was indeed the case, I was caught up in my youthful concerns and nobody else really mattered. In my later thirties and forties, I still felt on the young side, but noticed that there were about as many people younger than me as there were older than me. But increasingly, entering my fifties and on up into my sixties, I’ve become acutely aware that I am either the oldest person in the room or alternatively, one among many grey-haired or bald people in my age group, with nary a young face to be seen.

The 1988 portrait (© 2014 Nicholas Nixon)

How did my perceptions square with actual population demographics? In 1980, when I was 25, the median age of the U.S. population was 30, so I was younger than many others, but comparatively speaking not as young as I had thought. Ten years later, in 1990, when I was 35, the median age was 32.9, so I was just about in the middle; and by 2000, when, at 45, I was entering middle age, the median age of the U.S. population was 35.3, making me fully ten years older than the average American. I still didn’t feel my age.

The 1999 portrait (© 2014 Nicholas Nixon)

All through the 1990s I had the metabolism of my youth. I was pretty much the same weight as I had been in high school, and I still could and did eat anything, and as much of it as I liked, without the scales moving in the slightest. My hair was getting greyer, but I was dyeing it at home with a peroxide-free German product that looked very natural, so nobody noticed but me. I seemed to have boundless energy, too, although the long commutes were silently taking their toll on my system.

It turns out that I was a kind of Dorian Gray through most of my middle age, in that while until age 55 I was regularly considered the person in our group of friends who had aged the least, I was living as if there was no tomorrow in terms of diet, exercise, sleep, and stress reduction. The hidden painting was the one that was aging, not me. But sometime in my early 60s it all caught up with me at once, the middle-age spread, wrinkles, thinning hair, “senior moments,” the inability to concentrate after a certain hour in the evening. Suddenly, it seemed, far from looking young for my age, I looked considerably older than my agemates who had been steadily taking care of themselves. But perhaps that too is all a matter of self-perception.

Something else happened to me as I approached middle age that was less about self-perception than about how one is perceived by others. Not just anyone, though; I’m talking about women in particular. At a certain age, women just disappear; once they are no longer perceived as sexual beings, they are no longer noticed at all. I had read of this phenomenon of middle-aged women’s invisibility, and my mother had been telling me about it for years. She would storm in, furious at having been passed over while waiting for service in a store in favor of a much younger woman. “It was as if I wasn’t even there,” she would fume. “I complained, but then they looked at me as if I was crazy and answered in patronizing tones as if I were a child.” I would sympathize with her but had no idea of what it was really like until it started happening to me. With regularity.

Still, despite the messages from society, I persisted in feeling younger than I was. A 2009 survey by the Pew Research Center, Growing Old in America: Expectations vs. Reality, found that the older people get, the younger they feel; until they’re about 30 they feel their actual age, but by age 45 they feel ten years younger.

What has advancing middle age meant to me as an immigrant? Having come to the United States when there were very few immigrants here from anywhere except Europe, I feel like a living historical archive, with a lot to share with those who have arrived more recently. I also feel less lonely. As a 1.5-generation immigrant (so called because such immigrants bring with them or maintain characteristics from their home country while assimilating and socializing into their new one), I feel that I can understand both first-generation immigrants and their American-born children. And as I move into and beyond middle age, I delight in the fact that the demographics of the American population are starting to skew in favor of immigrants and people of color. I was part of a tiny minority when I first arrived in this country in 1970, when immigrants made up only 5 percent of the population; by 2020 that share had risen to nearly 15 percent, and if you also count the American-born children of immigrant parents, we are looking at fully 28 percent of the population.


Going back to that 2009 Pew Research Center survey about growing old in America: it found that people aged 75 and older had a count-my-blessings attitude when asked to look back over the full arc of their lives and measure it against their expectations. Younger people, by contrast, were much less forgiving of themselves. I am learning to replace judgement with acceptance. My invisibility—a magic cloak for older women. My steel-grey hair—I embrace it. As for my middle-aged spread, I’ve always been scarecrow-thin. Now I’m what Indians of an earlier generation would have called “healthy”, before Euro-American norms reshaped their standards of beauty.

Looking back, I feel protective toward the forty-year-old me, approaching middle age. I want to give her a gold star for effort, but also give her permission to slow down, breathe, and enjoy life a little more.


462. The Kuwait Phenomenon

In 1990s, blogs and blogging, history, Media, Politics, Stories, United States on April 14, 2020 at 10:11 pm

This is the eleventh entry in a month-long series, Fifty years in the United States: An immigrant’s perspective, as part of the annual Blogging from A to Z Challenge.

The Kuwait Phenomenon.

President George HW Bush and Secretary of Defense Dick Cheney, 1990. (Doug Mills/ AP)

George H.W. Bush was inaugurated President on January 20th, 1989, and, with Dick Cheney as his Secretary of Defense, wasted no time in giving the world a taste of things to come. On December 20, 1989, a month after the fall of the Berlin Wall, Bush launched the invasion of Panama to depose President Noriega. No other countries were informed in advance of this illegal action, which completely disregarded the then-hallowed principle of national sovereignty. As Greg Grandin wrote in Mother Jones on the 25th anniversary of the invasion, “it was George H.W. Bush’s invasion of that small, poor country . . . that inaugurated the age of preemptive unilateralism, using ‘democracy’ and ‘freedom’ as both justifications for war and a branding opportunity.”

This did not bode well for the 1990s.

On August 2, 1990, the day that Iraq occupied and later annexed Kuwait in an oil dispute between the two countries, President Bush launched Operation Desert Shield, followed, on 17th January 1991, by the military offensive, Operation Desert Storm. By 28th February 1991, after a remotely conducted bombing campaign and a brief but gruesome ground war, Iraqi forces had been decimated and the United States, announcing the liberation of Kuwait, proclaimed victory in what would come to be known, ominously, as the First Gulf War. Before that year was out, on December 26, 1991, the fall of the U.S.S.R. made the U.S.A. the world’s sole superpower, advancing the neoconservative agenda of a New World Order in which the United States would be the uncontested world leader, able to take unilateral action against any other nation.

What do I mean by the Kuwait Phenomenon? To start with, I mean that this global superpower has a habit of invading small countries which its own citizens cannot even point to on a map. When I began to write this I thought that perhaps when the United States invaded a country it would at least motivate Americans, notorious for their ignorance of geography, to find out where it was located. In fact, a 2017 survey of Americans on North Korea, which the Trump Administration was threatening to bomb at the time, suggested that those Americans who could identify North Korea correctly on a map were more likely to prefer diplomacy, and conversely, those Americans who could not identify North Korea on a map were more likely to support bombing it. Perhaps our ignorance suits the war hawks just fine.

To me, the Kuwait Phenomenon also refers to the way in which the United States uses the media to sell the American people on a given war. Not only were most Americans completely ignorant of Kuwait, a wealthy, authoritarian, oil-rich nation (in fact, the fourth-richest country in the world), but they were completely uninterested in it. This posed a problem for the Bush Administration, which was planning to launch a war against Iraq with the ostensible goal of liberating Kuwait, since people were unlikely to want to risk American lives to liberate a country they had no fellow-feeling for. Joshua Holland has written a devastating article about how the First Gulf War was sold to the American public on “a mountain of war propaganda,” including the utter falsehood, reported on by Scott Peterson in the Christian Science Monitor, that Iraq was engaged in a troop build-up to invade Saudi Arabia next and, in a campaign dreamed up by an expensive public relations firm, the fabrication that Iraqi occupying forces in Kuwait had ripped scores of babies out of incubators. In fact, the primary reason for the war was that the United States could not countenance the idea of Iraq controlling more than twenty percent of the world’s oil supply.

A third element of the Kuwait Phenomenon is the sad spectacle of U.S. might arrayed against any country in the world, given the global pre-eminence of the American military; in particular, the obscenely uneven death toll. Political scientist Steve Yetiv called the First Gulf War one of history’s most lopsided victories. U.S. aerial bombing of Baghdad killed approximately 3,500 Iraqi civilians and it is further estimated that 150,000 or more civilians died in the aftermath of the war due to the destruction of the electrical generating system. On the other side, Iraqi occupying forces are estimated to have killed more than 1,000 Kuwaiti civilians; no U.S. or coalition civilians were killed.

Highway of Death

Comparing combatant casualties, it is estimated that “at least 65,000 Iraqi soldiers were killed”, most of the killing taking place during the brutal ground war, on the so-called Highway of Death, where retreating soldiers were bombed with a neat device known as a ‘fuel-air explosive.’ In contrast, 378 coalition troops were killed, only 190 of whom were killed by Iraqi combatants, the rest in “friendly fire” or accidents. A 1991 United Nations report described the effect of the U.S. bombing campaign on Iraq as ‘near apocalyptic’ (The New York Times, cited in Gulf War/Iraqi Casualties). To Americans, who saw the bombing on cable news as surgical strikes from computers at a sanitized distance, the war was like a video game. Iraqis, on the other hand, bore the brunt of the strikes on the ground.

For 42 consecutive days and nights, the coalition forces subjected Iraq to one of the most intensive air bombardments in military history. The coalition flew over 100,000 sorties, dropping 88,500 tonnes of bombs, and widely destroying military and civilian infrastructure. (The Gulf War Air Campaign)

Tens of thousands in San Francisco, Saturday, Jan. 19, 1991, protesting the United States attack on Iraq and Kuwait.  (AP Photo/Eric Risberg)

We opposed the war, of course, as did so many other Americans, and even more people around the world. Nikhil, in kindergarten at the time, seemed to draw bigger and more menacing fighter planes every day, as warplanes from nearby Westover Air Force Base flew low and loud overhead several times a day.

The nineties had not got off to a good start.


257. Leaving on a Jet Plane

In 1960s, 1970s, 1980s, 1990s, Inter/Transnational, Music, people, places, Stories, travel, United States on April 11, 2014 at 8:43 pm




More than anything, the word jet suggests speed to me—speeding up and away, powered by a jet engine. It seems I’ve been leaving somewhere all my life, at first at a slower speed, by steam-powered ships and locomotives, then faster and faster with the revving-up of the jet age, starting in 1963 with a flight on a Boeing 707 from London to Bombay.

Jet Magazine cover, 1954

The song that first captured my feelings of having to leave a beloved place and to leave beloved people behind was Harry Belafonte’s 1957 Jamaica Farewell, a great favorite of my mother’s (see TMA #34, His Master’s Voice). Sung by a sailor, it was well suited to departures by sea. (Here’s a photograph of Harry Belafonte with Dorothy Dandridge in 1954 on the cover of Jet, invoking the other sense of the word: blackness.)

In 1966, less than a decade later and just a couple of years after my family’s first flight, the American singer-songwriter John Denver wrote Leaving on a Jet Plane. No other song quite evokes the wrenching sadness of having to blast off with jet-propelled force, perhaps never to return, leaving behind someone whom you love. Although John Denver wrote it, Peter, Paul and Mary were the first to make it a hit in 1969, when it reached #1 in late December. It was still in the air and on the charts in early 1970 when we arrived in the States.


I have always liked the Jamaican reggae deejay Yellowman’s 1982 version, perhaps because it has a little humor that balances out the sentimentality of the song and prevents it from getting too schmaltzy, and because it brings the song back to Jamaica. More recently, in 1998, it re-emerged in the soundtrack of the movie Armageddon (talk about speed!), sung by Chantal Kreviazuk.

There seem to be more songs about leaving than about coming home, perhaps because in this life, leaving wrenches our hearts again and again, while coming home, though longed for so deeply, is often attended by disappointment. Time speeds on, and the home to which we return can never be the same one that we remembered with such reverence.

[To counteract the sadness of that thought, you may want to listen to a boisterous rendition of Back in the USA, Chuck Berry’s celebration of homecoming (Well oh well, I feel so good today/We’ve just touched down on an international runway), sung here with Linda Ronstadt.]




247. Gauri Deshpande: A Distinctive Voice

In 1990s, 2000s, Books, India, people, postcolonial, reading, Stories, women & gender, writing on March 19, 2014 at 2:36 pm
Gauri Deshpande (1942-2003)

In Marathi or in English, in person or in print, the prolific poet, fiction-writer, and translator Gauri Deshpande (1942-2003) has a distinctive voice: strongly feminist, wryly humorous (usually at her own expense), confident yet self-critical, irreverent yet steeped in tradition, cosmopolitan yet grounded in her love for language and place. No matter who or where her audience is, she is bound to challenge their assumptions, producing both discomfort and delight.

In 1993, as a postgraduate student preparing with trepidation for our first meeting at the University of Poona’s English Department where she was teaching at the time, I carefully donned a traditional Pune sari to meet the daughter of the illustrious anthropologist Iravati Karve and the granddaughter of the illustrious social reformer D.K. Karve. To my embarrassed surprise, a tall, lanky, imperious-looking woman dressed in torn trousers came striding toward me and grasped my hand in a firm handshake. We became friends quickly, thanks to her openness and generosity, and my husband, son, and I have fond memories of our visits to her house during our stay in Pune, as we all ate and talked non-stop, and played fast and furiously competitive card games (the game of Running Demons I shall forever associate with Gauri Deshpande) with her and her daughters, son-in-law, and grandsons. Back in the United States a decade later when I heard the sad news of her untimely death over the internet, I could hardly imagine returning to Pune without her there.

While Gauri Deshpande was unquestionably one of the most important and innovative writers in contemporary Marathi literature, and was well-known and respected throughout India and among scholars of Maharashtra, she began her career writing well-received poetry in English. She published three collections with P. Lal’s Calcutta-based Writers Workshop and edited a collection of Indian poetry in English in the late sixties and early 1970s, but then switched over to writing fiction in Marathi and made her name with her stories and novellas (9 in all), and her translations. At the time of her death in 2003 she was relatively unknown beyond India; however, that was changing, since her work in English had been gaining greater exposure throughout the 1990s. One of her Marathi stories was translated into English and anthologized in the important two-volume Women Writing in India, published in 1993, and her first collection in English, The Lackadaisical Sweeper and Other Stories, was published in 1997. Several of her important Marathi-English translations were also published or re-issued in the late 1980s and 1990s, including Sumitra Bhave’s Pan on Fire: Eight Dalit Women Tell their Story (1988), Jayawant Dalvi’s searing social critique, Chakra: a novel (1974, 1993), and Sunita Deshpande’s …and Pine for What is Not (1995), a controversial memoir by the wife and secretary of the popular Marathi playwright P.L. Deshpande.

Like Gauri Deshpande herself, her stories confound readerly expectations—whether the readers are Indians or non-Indians—of Indian society, and specifically of women, and the stories are often profoundly unsettling, jarring the reader out of complacency. In addition, they continually shift perspective, from India to the United States and back, from gender to caste-class, from mother to daughter, from the rational to the emotional, from the abstractly philosophical to the earthily physical, and back again. Further, the categories themselves are unsettled, as women resist femininity, Indians refuse to behave in a stereotypically “Indian” manner, and the direction of global flows is reversed, as Americans migrate to India and become entirely assimilated.

In “That’s the Way It Is” (Ahe he ase ahe), the story published in Women Writing in India, the utterly rationalist first-person narrator gains a new perspective on herself at middle age, in a chance meeting with an old friend, an American long-settled in India. As she goes literally and figuratively to buy glasses to correct her far-sightedness, she discovers that she has understood nothing at all of life and love. When her friend observes, “You really do need glasses to see up close” (475), his comment prompts a shift in perspective, as the narrator, remembering so many incidents in the past, realizes that he has loved her silently ever since their childhood, while she has remained oblivious. “And suddenly, I saw…I was all wrong; I had missed my way in life. My constant arrogant insistence—“What I say is right!”—had kept me from knowing what it was that others understood about life. I didn’t let myself know. All this” (475-476). This capacity to be at once opinionated and self-critical is typical of Gauri Deshpande’s writing.

In the title story of The Lackadaisical Sweeper, two newlywed upper-middle-class wives, one Indian, the other American, stationed in Hong Kong with their businessman-husbands, meet and become friends as they take their daily morning walks. At first the American woman appears to be stereotypically brash and self-involved, the young Indian woman (aptly named Seeta) equally stereotypically meek and submissive. However, the Indian wife’s unquestioning submissiveness to her husband’s demands leads her to betray her American friend’s open confidences about her husband’s business dealings. Having learned from Seeta that her American friend and her husband are Jewish, Seeta’s businessman-husband is able to use anti-Semitism and his wife’s inside information to force the couple to flee the country, grabbing their real-estate holdings just as the property market is booming. The reader’s disgust shifts from the uninhibited sex talk of the American woman to the unethical behavior of the Indian woman. And then, in a characteristic shift, Gauri Deshpande gives a silent, sullen street sweeper the last word. Every morning the American woman has greeted him as the two women pass him on their morning walks, trying in vain to elicit a response from him. In the closing scene, Seeta greets him and he answers back, to her delight, though she understands nothing of what he has said. In her parting shot, Deshpande leaves us with a view from below: “It was fortunate that she did not know Cantonese” (28). Wealthy, sheltered Seeta’s naiveté does not excuse her complicity in her husband’s land-grab plot, nor does it excuse her total ignorance of the sweeper’s point of view. The reader is left pondering the sweeper’s judgment of Seeta, who may be a virtuous Indian wife, but is not a good human being.

In “Map”, a tribute to Edward Said, a middle-aged woman reclaims her body as her own territory after a love affair has ended. The story draws upon the postcolonial critique of colonial thought as a gendered discourse that designates the colonized as female, a blank canvas passively desiring to be conquered and mapped. Her ex-lover was the colonial explorer-cartographer, drawing the map of her body in his own, exoticized terms. As in Said’s Orientalism, where the “Orient” as represented by European Orientalism bears no resemblance to actuality, but is a projection, a “will-to-power”, of Europe itself, in Deshpande’s story the female first-person narrator now recognizes “that the me in his mind had nothing to do with the me in my mind” (55). Taking pleasure in self-discovery at last, she declares, “it’s my body now and my map” (61). Gauri Deshpande’s refreshing frankness in discussing the female body and female sexuality can never be pornographic, because pornography is a language of power and domination, while hers is a language of love and self-acceptance.

In “Insy Winsy Spider,” another story in the same collection (translated from the Marathi original “Bhijata Bhijata Koli”), a mother is forced to recognize her daughter’s difference from herself. The mother is a highly-educated professor of Buddhist philosophy, a scholar of the Self who, ironically, seems to have little self-knowledge. She and her husband, also a philosophy professor, who have named their daughter Maitreyi “to help her on her way to greatness,” are mortified when the daughter announces that she has no interest in studies and is going to get married without even having done her B.A. The next day, as the mother clears her mind to write an academic paper on the development of self-awareness in the ‘self’, the sight of her daughter chopping onions gives her a sudden revelation: while all growing children must learn to differentiate the ‘I’ from the ‘not-I’, she, in her self-involvement, has failed to differentiate herself from her daughter, despite her age and education. Like the spider in the nursery rhyme, climbing back up the water spout, “It was necessary to begin all over again…‘I’ am not this Maitreyi” (125).



I want to close with a few personal reminiscences of Gauri Deshpande that might shed some light on how her mind worked. With regard to the title of the story, “Insy Winsy Spider,” she once told me that one of her professors during her postgraduate studies in English literature insisted that his Indian students read English nursery rhymes in order to become as fully immersed in the language as a native speaker. She herself was in complete command of English, confident enough to reshape it in her own image. With regard to her firm commitment to write in Marathi, she once observed with a wry smile how much more money she could be making if she were writing in English. With regard to her exalted caste status and eminent parentage, although she rejected many upper-caste/class social and gender norms, Deshpande loved the language and culture of her community. Talking to fellow-Marathi writer Ambika Sirkar, she once observed sadly that, with the passing of their generation, certain turns of phrase particular to their community would disappear forever. When we visited her in Pune, even as she offered her guests a cold glass of beer, she also offered us a tumblerful of panha, a cooling green mango drink, explaining that it had to be drunk at this particular time of year.

As Shanta Gokhale wrote soon after her death, “How could this strapping, handsome, vibrant, gutsy, intense and intellectually passionate woman have just ceased to exist? Gauri had an insatiable zest for living, for experiencing new places and people, for friendship, for loving and giving” (Gokhale, “Woman of Substance”). As a writer and as a person, Gauri Deshpande has left a gap in English and Marathi fiction and society that is not easily filled.

First published in SPARROW Newsletter (SNL 14, August 2008)
Also posted on Gauri’s daughter Urmilla Deshpande’s blog. (Urmilla is also a writer: see her blog for all her titles and how to order them.)

Tell Me Another (Contents to Date)

Chronological Table of Contents

244. A Chip off the Old Block? If Only.

In 1990s, 2010s, Family, Inter/Transnational, Music, parenting, people, Stories, women & gender, Words & phrases, Work on February 18, 2014 at 9:01 pm
Another Chip off the Old Block (Artist: Robert Deyber)


Mama tried to raise me better, but her pleading I denied
That leaves only me to blame ’cause Mama tried.
—Merle Haggard

My dear mother has always had a preternaturally sensitive sense of smell, though now tempered somewhat by age. (She also has a keen sense of justice, but that’s another story.) I shall never forget her entering my kitchen one day, tilting her head slightly, nostrils twitching almost imperceptibly. She made a beeline for my fridge and, pulling the door open, drew forth an innocuous-looking cardboard carton and raised it to her nose. “Your milk’s gone off,” she announced.

Milk goes off easily in the tropics, but this was in temperate New England. Wherever in the world we have lived, Mum has always kept up her home, our home, to her exacting standards, clearing, cleaning, scouring fiercely. She has banished bric-a-brac and done battle with dirt as if life depended on it. Cleanliness has indeed been next to godliness for her, despite her being a lifelong agnostic.

When my husband and I moved into our current house, Mum was still working full-time in Boston some 90 miles away, but announced that she planned to come out to visit on the very first weekend thereafter. Her intention, she said, was to scrub, sand, and repaint the baseboards and the risers of the staircase all the way from the front hall to the second-floor landing. The previous owners had laid new carpeting up the stairs, but had neglected to refinish them first, so it would be tricky to do the work after the fact. A painstaking job, and Mum was itching to take it on. But I stopped her, indignant. How dare she make plans for my house! I would determine what needed to be done and when, and that painting job, merely decorative in my book, was nowhere near the top of my To Do list.

The thing is, every single time I walk up my front stairs, I notice the grime on the baseboards and the chipped paint on the risers. I still haven’t got round to doing that job. If Mum had had her way it would have been done, and done well, that very first weekend, nearly a quarter of a century ago.



I have absorbed some of my mother’s sensitivity, but not so much of her drive and determination, often tending to drift and dream rather than simply getting on with it. As a young householder, I must have noticed the dust gathering along the tops of the baseboards, but until my mother was coming to visit I would be quite content to let it lie there undisturbed. Then I would fly into a whirlwind of activity, my eye falling on all the little details I routinely ignored with ease, but which she would spot the instant she stepped into the house, just as she detected that whiff of sourness exuded by the milk in my fridge.

Mum’s habits of cleanliness have lasted a lifetime and, to this day, she instinctively picks up a kitchen cloth or paper towel and wipes down the counters, sinks, and stovetop. Wherever she sees piles of papers, scattered crumbs or a jumble of odds and ends, she attempts to restore order, folding, sweeping, stacking them with care. I see these things too, the inevitable flotsam and jetsam of life, although I have become adept at sweeping them out of my mind’s eye. There they pile up, out of sight, but alas, since I am my mother’s daughter, not entirely out of mind. They continue to trouble me until eventually the disturbance is too great to ignore.

I think of Pete Seeger singing We Are Climbing Jacob’s Ladder: “Every rung goes higher, higher.” I am now nearly the same age that Mum was when she set her mind to repainting my stairway. It’s time, high time.

Tell Me Another (Contents to Date)

Chronological Table of Contents

240. Heaven’s Gate: Two Degrees of Separation

In 1970s, 1990s, 2000s, Books, history, parenting, people, places, reading, Stories, travel on January 19, 2014 at 4:14 pm
The Pied Piper of Hamelin. Artist: Anette Bishop

Because it was generally felt that we needed to read something “uplifting” for a change, my book group chose Christopher Castellani’s 2013 novel All This Talk of Love for our first meeting of the new year. I’m still only halfway through it, and have already been brought to tears twice; the jury is still out on the question of whether or not it is uplifting. Whatever that means will also be up for discussion at the meeting. The novel, set in Boston and Wilmington, Delaware, in the late 1990s, raises close-to-the-bone issues of relationships in immigrant families; different ways of coping with death, illness, and aging (by the way, the large-print edition I checked out from the library is changing the reading experience for me); memories, secrets, and silence; differing values, perspectives, loyalties, and emotional attachments between generations, siblings, husbands and wives; and where Home can be found. But yesterday another kind of passage brought me up short, sending me down to the basement to rummage through old papers for nearly an hour and haunting me for the rest of the day:

He [Frankie, the still-unmarried youngest son of Italian immigrants, writing his doctoral dissertation in postcolonial literature] flips through the six channels that come with basic cable and settles on PBS. . . It’s a low-budget documentary on the Hale-Bopp comet, and though it’s yesterday’s news, it captivates him. The comet, the greatest natural spectacle of the nineties, is long gone and won’t be back for two thousand years. The thirty-nine brainwashed believers who followed it into oblivion won’t be back at all. Meanwhile, the earth remains in a perpetual state of loneliness, welcoming but never visited, a ghost whose friends drive by once in a while but don’t stop in.

Immediately I was back in 1997, when I first heard the hair-raising news of the mass suicide (some say murders) of 38 members of the Heaven’s Gate cult, along with their remaining leader, Marshall Applewhite, in Rancho Santa Fe, California. All the more unsettling for me because nearly two decades beforehand I had had a brush with Applewhite and his late partner, Bonnie Nettles. Well, not exactly a brush: more accurately, a near miss; but it was a near-enough miss that the news gave me a curiously contaminated feeling, and made it impossible for me to simply dismiss the dead cult members as another bunch of loonies who had drunk the Kool-Aid.

The news sent me down to a cardboard box in the basement, full of posters, flyers, notebooks, and newspaper clippings from our activist days in the 1970s, where I almost immediately laid my hands on what I was looking for: a photocopied flyer announcing a visit of “the Two” to UMass-Boston, and inviting interested people to come and meet them. Although I wasn’t aware of it at the time, these “Two” were Marshall Applewhite and Bonnie Nettles, later to be known as “Bo and Peep” or “Do and Ti.” And as strenuously as I reject the idea that I would ever be tempted to join any kind of cult, I cannot deny that back in late 1978 or early 1979 Andrew and I were intrigued enough about the Two to make our way one night to the deserted commuter campus of UMass Boston.

Between the summers of ’78 and ’79 Andrew and I lived in New Mexico, driving out and back in Andrew’s 1951 International Harvester milk truck. In-between we took a short trip back to the Boston area, mostly to see my uncle Nandu, who was the first of either of my parents’ siblings to visit us since we had immigrated to the United States. Frustratingly, I can’t remember exactly when we took that trip, but for the purposes of this story about the Two it matters whether it was late 1978 or early 1979, and if the former, then exactly how late in the year; for it was in November of 1978 that more than 900 members of the Peoples Temple committed suicide in Jonestown, Guyana under the direction of leader Jim Jones. But it didn’t cross my mind, as I decided to check out the Two, that I might well have been the next recruit for a suicide cult.

UMass-Boston campus


Looking down at that nearly 20-year-old flyer (which I have since mislaid) back in March 1997, I recalled what had transpired that evening: thankfully, not very much. We took the Southeast Expressway from Somerville to the lonely peninsula of Columbia Point where the stark concrete campus of UMass-Boston was located. It was only after we had arrived that we realized that the flyer had not mentioned a venue for the meeting, so we found a place to park and proceeded to wander around in the dark, looking for a sign. No sign. Then we looked around the empty campus for other lost-looking souls like ourselves to approach and ask whether they too were seeking the Two. I can’t remember now whether we did and, if so, whether they were; I do know that we made our way home rather disappointed, consoling ourselves with the thought that it had been a silly idea to follow up on the flyer in the first place.

But what was most chilling to me back in 1997 when the news first broke, and again yesterday after reading the passage in the novel (whose meaning I can’t fully contextualize until I’ve finished the book), was that rational, skeptical, educated people like Andrew and me, people who were socially engaged and had close, loving families, would nevertheless be interested enough in what we had heard of the Two that we would follow a cryptic flyer so as to hear first-hand what they had to say. They had been traveling the country speaking and recruiting, creating a bit of a buzz in the alternative and New Age youth cultures, and we had heard of them while we were out in the Southwest. Now, while we were on a short sojourn Back East, so were they. And that, it seems, was enough to draw us to them.

Other curious young people like us were not as fortunate as we were that night. Talking to our neighbor Bob that early spring of 1997, in the aftermath of Heaven’s Gate, I learned that while Andrew and I had had a near miss, he was only one degree of separation from the tragedy. When he had read the names of the dead, he had realized with a shock of recognition that one of them had briefly been a housemate of his, back in the early 1970s. Looking up the cult on the Internet yesterday, I found that one of them—in fact, the woman who became Applewhite’s nurse—had been a caring, compassionate young nursing student at UMass Amherst in 1975 when the Two had first begun their countrywide recruiting. Along with the leader himself, she was one of the last to die, as she had prepared the applesauce concoction that was the vehicle for their quick and painless deaths.

The lame child left behind. Artist unknown


Yesterday, before I was recalled to my work in the present, I watched part of a British documentary on the Heaven’s Gate cult. In it, the film-makers interviewed surviving members of the group, some of whom had left long before they had begun planning their last fatal action, and others who had originally been part of the suicide pact. As I listened to one of them, I was put in mind of the sole remaining child in Hamelin, the rest of whose playmates had followed the Pied Piper and never returned. His wistful words haunt Robert Browning’s poem:

‘It’s dull in our town since my playmates left!
I can’t forget that I’m bereft
Of all the pleasant sights they see,
Which the Piper also promised me.
For he led us, he said, to a joyous land,
Joining the town and just at hand,
Where waters gushed and fruit-trees grew,
And flowers put forth a fairer hue,
And everything was strange and new;
The sparrows were brighter than peacocks here,
And their dogs outran our fallow deer,
And honey-bees had lost their stings,
And horses were born with eagles’ wings:
And just as I became assured
My lame foot would be speedily cured,
The music stopped and I stood still,
And found myself outside the hill,
Left alone against my will,
To go now limping as before,
And never hear of that country more!’

In case you think it is rather a leap to connect the fate of those hapless cult members with the children of the poem, read the first lines of the very next stanza (with my emphasis):

Alas, alas for Hamelin!
There came into many a burgher’s pate
A text which says that heaven’s gate
Opens to the rich at as easy rate
As the needle’s eye takes a camel in!
The mayor sent East, West, North and South,
To offer the Piper, by word of mouth
Wherever it was men’s lot to find him,
Silver and gold to his heart’s content,
If he’d only return the way he went,
And bring the children behind him.

We nearly always recognize our mistakes when it is too late to correct them. While the parents and friends of those who had set their sights on Heaven’s Gate, along with most of us who read the story in the newspapers, were left as mystified as the burghers of Hamelin who had lost their children forever, I can never again distance myself from those children, who were earnest, disaffected young people not so very different from myself, seeking a better world, and only two degrees of separation away.

Tell Me Another (Contents to Date)

Chronological Table of Contents

239. No, It’s Not Political Incorrectness

In 1960s, 1980s, 1990s, 2000s, history, Media, Politics, Stories on January 7, 2014 at 12:35 am


The opening scene of Blake Edwards’ Breakfast at Tiffany’s (1961) is subtle, delicate, and totally charming, with Audrey Hepburn wandering waiflike along a deserted Manhattan street in shoulder-baring Givenchy, accompanied by Henry Mancini’s romantic rendition of “Moon River.” (Did you know, by the way, that “Moon River” was written expressly for that movie?) I stretched out luxuriously and prepared to relax for an hour or two, escaping into the dreamworld of this American classic, which, amazingly, I’d never before watched all the way through. But America had something else in store for me: as Holly Golightly, our gamine of a heroine, gets back to her apartment building she is met, not by a handsome beau, but by a loud, shrill caricature of an ugly, buck-toothed, lascivious Japanese man. It was such a jarring shift that I stopped the movie; it had completely spoiled the mood for me.


It has no doubt been argued in its defense that Breakfast at Tiffany’s was made more than 50 years ago. The 50th anniversary DVD of the film tries to make amends, confronting Mickey Rooney’s racist impersonation head-on with the inclusion of the video, Mr. Yunioshi: An Asian Perspective. But this heartening fact doesn’t help me personally. The movie provides plenty of fodder for an analysis of orientalist representations of Asians, but that would be a busman’s holiday. What it doesn’t do is allow me to kick back, half-close my eyes, and be carried away. Instead I feel like Bruce Lee, as he watches the same scene in Dragon: The Bruce Lee Story.

Such scenes are not limited to movies of yesteryear, but crop up with depressing regularity in movies that go on to become blockbusters. In fact, far from being aberrations, they are an integral part of the Hollywood formula. Everyone of a certain age knows of the embarrassing figure of Long Duk Dong in Sixteen Candles (1984). Is there some unwritten American movie rule that says that there has to be a ridiculously inappropriate Asian geek as a repulsive foil to the (white) American sweetheart?

When Nikhil was in elementary school, I had a similarly jarring experience, this time when a group of his friends were at our house for a sleepover. We had duly bought the requisite pizza, set up all the mattresses and folding cots, and rented a stack of movies suitable for pre-preteen boys. By popular acclaim, the evening’s choice was Happy Gilmore (1996), starring an up-and-coming young comic actor by the name of Adam Sandler. I agreed to the selection, despite its PG-13 rating (“for language and some comic sexuality,” which I felt sure would be innocuous). I had never heard of the lead actor before and sincerely wish it had stayed that way.

This movie, too, introduces the Asian caricature, this time a Chinese woman, early on, both to drive home the unredeemed crassness of the loser lead character and to serve as a foil for his pretty American girlfriend who won’t touch him with a ten-foot pole. Here’s the scene, so offensive and irredeemably crass itself that I hesitate to provide a link to it. I winced as I watched it that day, with a squirming, Dorito-munching knot of little boys stretched out on the floor in front of me, but didn’t say anything, hoping that it would go under their radar or right over their heads. Sadly, it did neither. In the car the next day as I drove the boys home they were chatting away in the back seat, and one of them said, in those bragging tones that boys assume when they’re trying to act sophisticated and worldly, “Wasn’t the scene funny in the movie last night, the one with the Chinese woman” (nod nod wink wink)? I cringed all over again, realizing that not only had they not missed it, but that in fact its brand of humor had probably been pitched at viewers of precisely their mental age. The whole source of the humor was that the Chinese woman who had thrown herself at our hero was so ugly that no one would ever desire her, yet he had not only happily used her for the night, but just as carelessly discarded her the next morning at the prospect of the all-American girl who was the true object of his desire, even as his Chinese doormat was preparing to do what Asian women are purported to do best: serve.


Perhaps an even more disturbing scene in Happy Gilmore involves an African American. It’s been a long time since that sleepover in 1996, and I have no intention of watching the movie again, but Chubbs Peterson, an African American former golf pro who lost his hand to an alligator, chooses to give his blessing and his special golf club to Happy Gilmore and, as if in repayment, is killed off completely gratuitously, after which Happy goes on to win the championship. For the life of me I can’t understand why so many viewers seem to find this scene funny.

The winning Hollywood archetype that Chubbs Peterson (played by football-player-turned-actor Carl Weathers) enacts is that of the Magical Negro, a black character who selflessly (and often for no apparent good reason) sacrifices him or herself to mentor and redeem a washed-up (and often utterly unworthy) white character. Once you identify this tired trope you will see it everywhere.


Still, aside from the adolescent humor of Adam Sandler and his ilk, there have been some encouraging developments in the past half-century of film. The Japanese American Gedde Watanabe playing the execrable role of Long Duk Dong was arguably some kind of advance, if a dubious one, over Mickey Rooney in yellowface as Mr. Yunioshi. But, no thanks to Sixteen Candles, 1984 was a breakthrough year for Asian American actors; it was the year when, as the character of Mr. Miyagi, Japanese American actor Pat Morita played a winning role in a Hollywood movie, The Karate Kid. Even though it was still a somewhat stereotyped role and remained a supporting one in which an elderly Asian American man mentors a callow white youth, Mr. Miyagi was a complex character, not a clown, and he paved the way for more and better roles for Asian American actors. (Unfortunately, 30 years on, their prospects are little better, and in 2010 Aly Morita, Pat Morita’s daughter, called for a boycott of the Karate Kid remake.)


Given that one of my main goals in sitting down to watch a movie is to relax and enjoy myself, I’m not afraid to admit that I have a distinct preference for romantic comedies. While I don’t discriminate racially among my male romantic heroes, and freely confess to a fondness for the late Christopher Reeve, Colin Firth, and even Hugh Grant (in his time), I do take exception to the common practice in the U.S. of emasculating Asian American male characters. If they’re not utter buffoons, they’re nerds or geeks, and they are never romantic leads (unless they’re martial artists). That’s why I welcomed the Harold & Kumar films (Harold & Kumar Go to White Castle (2004), Harold & Kumar Escape from Guantanamo Bay (2008), and a third, which I haven’t seen) with open arms, despite their adolescent-boy humor and silly stoner genre. Korean American John Cho and Indian American Kal Penn are intelligent, good-looking, romantic-hero material, and babes fall for them right and left. I take pleasure in this—and in them. I defy you to watch this trailer or this clip without doing the same.

In regard to my rant against Hollywood’s humor at the expense of brown and black folk, readers may wonder why I don’t just lighten up. It’s only comedy after all; why such tedious insistence on political correctness? My answer: because I watch movies to feel good, and characters like Mr. Yunioshi in Breakfast at Tiffany’s or the Chinese woman in Happy Gilmore completely spoil the fun.

No, it’s not political incorrectness that is the problem here: it’s bad old-fashioned racism.

Tell Me Another (Contents to Date)

Chronological Table of Contents

235. December 5th, 2013

In 1970s, 1980s, 1990s, 2010s, history, Inter/Transnational, Music, people, Politics, Stories on December 6, 2013 at 4:38 pm


December 5th, 2013: Nelson Mandela has died, at the age of ninety-five.

Nelson Mandela is dead. The news was to be expected, of course, at his age and after his long illness, but it was nonetheless so hard to grasp and accept the loss of this beloved statesman, harder still to let him go. In the outpouring of grief and praise for one whose life, work, and example changed and inspired the whole world, comes a rush of personal memories that might not have been made at all, and that certainly would not have been the same, had it not been for this one remarkable man.

The very first bumper sticker that Andrew made, years before we even started Whetstone Press, was in solidarity with the South African people’s struggle against apartheid, in memory of the Soweto Uprising the year before, when police had shot an estimated 200 schoolchildren among the thousands protesting for a better education. It read, Remember Soweto—June 16, 1976, and was a linoleum cut that he carved himself and printed in black ink on orange pressure-sensitive paper. It was never sold, just given away to whoever would display it on their car. We supported and identified with all the anti-colonial national liberation struggles in Africa, in Angola and Mozambique and in Rhodesia (not yet Zimbabwe), but none more so than in apartheid South Africa.

I don’t quite know why I have always felt such a personal connection with South Africa, never having been there and not having any family or close friends based there. And yet somehow its political struggles, its people, its history and culture, particularly its literature, have become as dear to my heart as if I were actually from there. Perhaps it is because the country’s history and make-up are a mirror of my own, in its internal diversity, in its populations of both English and Indian heritage, in being the place where Mahatma Gandhi first developed his theory and tactic of non-violent civil disobedience, in continuing to grapple with problems of inequality, social stratification, and traumatic collective memory, in having given birth to some of the greatest writers of our time, foremost among them the novelist Nadine Gordimer, who was one of the first people Nelson Mandela asked to see after his release from prison in February 1990. I credit my love of South African literature and culture to my graduate-school professors Ketu Katrak and Stephen Clingman, who also introduced me to the works of Bessie Head, Alex La Guma, Peter Abrahams, Sipho Sepamla, Njabulo Ndebele, Mongane Serote, and Miriam Tlali, whom I had the opportunity to meet. Also thanks to Stephen Clingman, we had the thrill of attending the first U.S. lecture given by Gordimer after the announcement of her Nobel Prize in Literature in 1991, the year after Nelson Mandela’s release from prison. She spoke then, as her country was moving toward the dismantling of apartheid, not as an individual, but as a representative of the African National Congress (ANC), of which she was a staunch supporter.

Later, after the fall of apartheid and the birth of the New South Africa, I was to discover and teach the work of still more South African writers: Olive Schreiner, Athol Fugard, Ellen Kuzwayo, Sindiwe Magona, Dennis Brutus, who donated his papers to the university where I now teach, and J. M. Coetzee, whom I had the honor of introducing when he visited the college where I was teaching at the time.

But throughout the clampdown of the bleak 1980s and the State of Emergency in the second half of the decade, the popular resistance continued to grow, not only at home, where thousands of people were incarcerated, tortured, and killed, but throughout the world, where opponents of apartheid pressured their governments to impose sanctions on the South African regime. On February 11, 1990, following long behind-the-scenes negotiations, the regime was finally forced to announce ANC leader Nelson Mandela’s unconditional release from prison after 27 years, and the whole world rejoiced.

Living then in the small rural-Massachusetts town of Winchendon, we too rejoiced. As members of the a cappella Noonday Singers, we had been singing South African freedom songs throughout the period of the State of Emergency: “Senzenina,” “Shosholoza,” “We Shall Not Give Up the Fight,” “Siyahamba,” “Freedom is Coming.” We had sung them in churches, at political demonstrations, conferences, fundraising events, even in a prison. But oh, what a feeling, when Nelson Mandela won his freedom at last, to be able to retire Bamthatha (He’s Locked Up): there was no further need for it!

Throughout the 1970s and 1980s, the global music of the African diaspora had also provided the anthems of freedom that fuelled and sustained a spirit of joyful resistance, not merely fighting against oppression, but fighting for a vision of freedom and justice for all. There was the Jamaican-born Reggae music of Bob Marley and the Wailers (Get Up, Stand Up, War, Zimbabwe), Peter Tosh (Equal Rights, Fight On, Rumors of War), Third World, Burning Spear, and many others; and in Britain, the anti-racist 2 Tone ska bands such as The Beat, The Selecter, and The Specials. In 1984, after the band had officially disbanded, The Special AKA released Nelson Mandela: another song that we no longer had to sing after February 11th, 1990 (except in celebration, as in this 90th birthday concert in London’s Hyde Park).

Just four months after his release, in June 1990, Nelson Mandela came to Boston, accompanied by his wife Winnie, and spoke at a joyful celebration on the Esplanade, where the Noonday Singers were invited to join the Amandla Chorus as one of the groups welcoming him in song. Watching the video on YouTube alone last night, with the news of his death flooding the airwaves, my tears joined those of millions in South Africa and around the world. That red-letter day had also happened to be my birthday, something I had completely forgotten until I noticed the date on one of the banners.

On the eve of the election of April 27th, 1994, South Africa’s first democratic election, the Noonday Singers helped to mark that hope-filled moment in history with a concert in which we opened for and were joined in song by the legendary civil-rights-era folksinger Odetta. The moment was both joyful and solemn, in view of all the sacrifice and loss that had led to this day, the countless lives ruined in prison, the incarceration and killing of activists like Stephen Biko, and most recently, the assassination of Chris Hani, who had come to my university while I was a graduate student, and whom, with many others, I had joined in singing Nkosi Sikelel’ iAfrika, the anthem of the African National Congress during the apartheid era and now part of the new, hybrid anthem of the Republic of South Africa.

Hearing the news crackling over the radio on my drive home from work yesterday evening, at first it failed to register. It had been pending for many months, and now it had come at last; I continued to drive, my mind tuning the words out. A message came in on my cell phone, but there wasn’t enough reception on that lonely stretch of road to listen to it. As soon as I got into cell phone contact again I went to my voicemail. It was Andrew:

—Hi Jo, I just heard on the news that . . . Nelson Mandela has died.
I thought I’d just let you know.

His voice faltered as he signed off; and then the memories began flooding in, starting with Andrew lovingly carving that block of linoleum all those years ago, when Nelson Mandela was only halfway through those 27 years in prison and already the same age as I am now.

Throughout his long struggle, this man, Nelson Mandela, had kept the faith, held his vision clearly in his mind's eye, steadfastly held on to his own humanity, and affirmed the humanity of others. The victory was not his, he always maintained, but that of the South African people who had given their lives for it, and of the people the world over who had stood in solidarity with them. But for the failures to achieve the vision, as millions of black South Africans remained in poverty, he claimed responsibility, and he continued to struggle toward that vision as long as he lived.

May we continue to hold fast to his vision, his humanity, and our own. May his example live on in our troubled world and in his beloved country.

Mayibuye iAfrika!
(Africa—may it return!)

Tell Me Another (Contents to Date)

Chronological Table of Contents

232. Before Interstates, Before Automobiles

In 1970s, 1990s, Books, history, Nature, places, reading, Stories, travel, United States on November 9, 2013 at 10:41 am
The Oxbow—Connecticut River Near Northampton by Thomas Cole, 1836 (Wikimedia Commons)

I’ll never forget the time Andrew and I drove cross-country with our world-traveled friend Peta, on her first trip to the United States in the nineteen seventies. Anticipating a rugged ride into the Wild West, she was singularly unimpressed with the reality of American highway travel. “Everything looks the same,” she complained, citing the HoJo’s restaurants and motor lodges with their ubiquitous orange roofs all along the Eastern highways and the Stuckey’s chain with their corn-syrup-filled pecan pies and log rolls regularly clogging the arteries through the South, Midwest, and Southwest. What blots on the magnificent landscape! Even as we argued with Peta, insisting that she would see wilderness aplenty as soon as we got off the Interstate, we couldn’t help agreeing with her.

The U.S. Interstate Highway System has blasted through rugged rock and ridge, cutting out the sinuous curves that follow the natural contours of ridges, valleys and rivers, and replacing them with straight-arrow lines; leveling out the ups and downs to create a smooth, easy ride across the country; making it possible—as long as you have the gasoline—to cover long distances in a very short period of time. While I certainly appreciate what has been achieved by the construction and maintenance of these travel and transportation networks, I often try to imagine what it might have been like to actually feel the topography, what the country looked like, and how different travel must have been before these networks existed—heck, before automobiles themselves existed.

The Connecticut River from French King Bridge, Erving-Gill, Massachusetts (photo: Wikimedia Commons)

When I’m on a long highway drive, especially on Interstate 91 through the Connecticut River Valley, I often wonder whether I would know where I was if I were to be dropped down at that very location in the past, say, during the colonial era. I realize that although I must have driven hundreds of times back and forth over the 100 miles of highway between the Pioneer Valley in South Deerfield, Massachusetts and the Upper Valley near White River Junction, Vermont, I still couldn’t describe with any precision the land I have travelled so often. The only natural landmark I can identify is the Connecticut River, since the Interstate runs parallel to it for much of that stretch. I know that there are hills and valleys along the way, because the car engine would strain at times, or a truck would barrel down behind me, forcing me over into the left lane, or I would suddenly be enveloped in a dense fog and have to slow down to a crawl or even pull over and stop until it cleared. But set me down at any point along the way without the highway signs to orient me, and I would be absolutely helpless.

Old-growth forest, Cummington, Massachusetts

I imagine myself surrounded by dense old-growth forest, trying to find a high overlook that would allow me to get a fix on my location; and then I imagine making on foot or by carriage the journey that now takes me all of two hours, fortified by a travel mug of tea and a Ben & Jerry’s Heath Bar Crunch picked up at the convenience store in Putney, Vermont at the halfway point of my drive. The same ground covered by horse and carriage would have required at least one overnight stay at an inn, and perhaps would have been impassable in a snowy winter or a wet mud season. No wonder visits to distant friends and relations are so protracted and emotionally charged in 19th-century novels. For instance, when, in Pride and Prejudice, Elizabeth Bennet goes to visit her dear friend Charlotte after Charlotte has married Mr. Collins, she does not know when she might be able to see her again, so she stays as long as she can. Rather than consisting of a quick overnight or a weekend at best, visits in the days before automobile travel lasted weeks on end, sometimes months.

Mount Holyoke Range near the Horse Caves, Hadley (photo by Alison Myra Ozer)

I once read something by Emily Dickinson in which she mentioned that, as youths, her brother and his friends spent their days rambling over the Mount Holyoke Range. To my shame, although I see those hills every day as I drive to the gas station or the supermarket, after having lived in the area for 23 years I can count on the fingers of one hand the number of times I have actually set foot on them. I was going to ask for a GPS (or sat nav) for Christmas, but I’ve changed my mind. Time to get out the topo map and re-orient myself, not to man-made networks, whether satellites or highways, but to the features of the natural landscape of this beautiful planet we are so privileged to inhabit.

