It’s late morning, and my wife Mardena and I are headed back to our hostel in Antalya, a city on Anatolia’s southwestern coast. We’ve just returned from a trip to the archeological museum, where we saw a stunning display of Roman mosaics set out under clear glass walkways. As we duck out of the 111-degree heat and into the hostel’s lobby, we come upon a young man, probably in his early twenties, standing with his head craned forward and eyes fixed on a TV mounted high on the wall. A Turkish news report is discussing the war raging in neighboring Syria. The camera footage shows smoke, rubble, and bombed-out buildings, but I have no idea what the reporter is saying. I ask the young man what is happening. “Assad is bombing Homs,” he says, his eyes still on the screen.
What Tomorrow Brings is an intimate portrait of a girls’ school in rural Afghanistan and the challenges its students face in trying to get an education.
In an early scene of What Tomorrow Brings, Pashtana, a seventh-grader at a girls’ school in rural Afghanistan, describes just how much her education means to her. “My biggest hope is to finish school,” she says, smiling brightly. “That’s how my life will turn the corner, and I’ll be on my way.”
Her smile fades. “But I’m worried there are people around me who will try to stop me.”
Chelsea Rudman is an international development professional and freelance writer who lives in Washington, DC. Her writing has previously been published in the NY Press and Matador Travel.
Hopewell-Mann is a predominantly Latino neighborhood in the predominantly Latino city of Santa Fe. Close enough to downtown to make it a short commute, yet a world away so that tourism doesn’t quite reach it, it’s a stark reminder of some of the inequalities present in this city. While the stunning adobe architecture downtown looks like it’s been preserved in aspic, Hopewell-Mann’s main drag is lined with big-box stores, fast-food restaurants, and cheap motels offering month-to-month leases. The neighborhood attracts a mix of the transient and the locally displaced, and not surprisingly, people downtown tend to avoid it.
Being a Peace Corps volunteer is about cultural exchange, but you don’t always get to decide what culture gets exchanged.
Hannah Jiang
“Hey, Ching-Chong. Bus fee.”
The driver’s words slapped me across the face. I handed him some money and waited for my change. Everyone on the bus was silent, watching. “Here you go, Ching-Chong,” he quipped while handing me a coin in return. My face turned red.
“Please don’t call me that. I really don’t appreciate it.” I felt my voice quivering but hoped that it sounded steady.
He laughed. “Okay, darling. What’s your name?”
“I’m Hannah.”
“Okay, Hannah.” He walked around the bus to collect the other passengers’ fares and muttered something in Kweyol, the local dialect. A few people chuckled. Then he climbed into the driver’s seat and we took off on the winding road toward my village. I put my headphones on and tried to calm my pounding heart.
I am a Peace Corps volunteer in St. Lucia, a small Caribbean island country north of Venezuela. Not many Americans know much about the island beyond the fact that the season finale of ABC’s The Bachelor was filmed at a resort here a couple years ago. Likewise, not many locals from my community know much about America aside from what they’ve seen on television.
One of the stated goals of the Peace Corps is to fill these sorts of gaps in cultural awareness. As volunteers, we explain what it means to be American to the people we meet here, and we share our experiences abroad with those back home. Instead of trading goods, we are trading cultures.
I teach English to schoolchildren in Canaries, a rural fishing village with a population under 2,000. It is one of St. Lucia’s poorest and most underdeveloped areas. Although luxurious hotels and resorts dot the island, jobs within tourism remain largely inaccessible to the people in my community. There are no attractions, restaurants, or gas stations in Canaries, so even when visitors do drive by they have no reason to stop.
Cut off from the tourist world, most villagers have never been exposed to the idea that America is ethnically diverse. Canaries has had Peace Corps volunteers in the past, but as far as anyone can remember, all of them have been white. As a result, confusion about me—the first Chinese American to live in the community—has been inevitable.
There was a teenage girl, fourteen or so, who approached me urgently as I walked to an evening aerobics class. “Miss, I just have to ask,” she said. “Are you from China or Japan?”
“I’m actually from America, but my family is from China.”
She said, “Oh,” and ran on.
There was an older man who gave me a ride to the grocery store and told me he was “also” a Buddhist. “If you don’t believe me, I have Buddha statues all over my home. Want to stop by to see?”
“No thank you, I’m actually not Buddhist.”
There was an elderly lady on the bus who mistook me for a member of a Japanese volunteer group visiting the island. “You did such a lovely job at the choir performance last weekend!”
I thanked her for the compliment.
In my application to join the Peace Corps, I wrote that one of the challenges I expected to face during my time abroad was having to answer the question, “If you’re from America, how come you look Chinese?” The possibility hadn’t discouraged me, though. In fact, before I left for St. Lucia, I was excited about sharing what I knew about America’s racial diversity with the locals I would meet. I assumed they would be open-minded and just as interested in American culture as I was in theirs.
Getting called “Ching-Chong” made me think I had been too idealistic.
One day, I was riding in a large food-supplier truck heading to Castries, the island’s capital, where I had a meeting scheduled that afternoon. It was a forty-five-minute drive to town, and to pass the time I chatted with the two men in the truck, Kenny and Shem. They soon discovered that I was a Peace Corps volunteer from the States. They had heard of the Peace Corps before, but they weren’t at all interested in learning anything about America. They wanted Mandarin lessons.
“Teach us some bad words in Chinese!” they prodded me. Kenny, a friendly man with cornrows, took out a pen and paper to jot down the phrases phonetically. “We’re going to say these to our boss next time he makes us angry!”
“Is your boss Chinese?” I asked, nervous that this would eventually be traced back to me.
“Nope. He’s Lucian. He won’t have any idea what we’re saying!”
When we reached Castries, I got out of the truck and waved goodbye. After Kenny and Shem drove off, I found myself thinking about our conversation. Thanks to me, the two of them had learned enough Mandarin phrases to get themselves fired. But they hadn’t learned anything about America, the country I was supposed to be representing. I began to realize that even though I’m here in St. Lucia to exchange cultures, I don’t necessarily get to decide which culture I exchange. Kenny and Shem had heard my story about being Chinese American, yet Chinese culture was what they had insisted on trading with me.
Perhaps this wasn’t a bad thing. Recently a local friend asked me, “You don’t know karate, do you?”
I told him no, not at all.
“Well, you know, people here watch a lot of kung fu movies and that’s the main thing they know about Chinese people. They probably all assume you know karate.” He paused. I wasn’t sure how to fill the silence. “You should let them think that,” he continued. “It might keep you safe here because nobody’s going to want to harm you.”
I smiled. His words were unexpectedly reassuring. Maybe, I thought, it was okay to leave the choice in their hands.
One of my favorite things about St. Lucians is their love of sharing food—something they have in common with the Chinese, as my own family’s experiences have taught me. For people here, sharing food establishes trust and a sense of community. My school principal will buy fish from our village, clean it, cook it with local seasoning, and give it to me in a Tupperware container to take home. A teacher will surreptitiously hand me mangos in the middle of class while the children are doing their work, whispering, “These are from our tree.” Students will come to school with a small bag of love apples, present one to me and say, “Teacher Hannah, look.” “For me?” “For you.”
The first time I brought food for the school staff, I chose something quintessentially American: toasted blueberry bagels with Philadelphia cream cheese. None of them had ever eaten bagels before, and they all wanted seconds to take home. At the end of the school year (some trays of brownies and Christmas sugar cookies later), I decided to share something completely different: mantou, a steamed bun that my grandfather would make for us every time our family visited him in China.
I remembered how he would spend all day making the mantou. Carefully mixing the flour with water, yeast, and a little bit of sugar. Kneading the dough meticulously with his hands. Rolling, cutting, and forming it into the bun-shape so familiar to us. We would eat his mantou every day for breakfast until the batch was gone, and then he would happily make more.
I told the teachers that the mantou was Chinese bread that they could either eat plain or with any kind of butter, jam, or sauce. They were amazed that the bread didn’t need to bake in the oven, and that it was so powdery white, without a trace of brown. They loved it. Most of them ate it with local cheese.
Though none of them knew it, bringing my grandfather’s mantou to my school was an important moment for me. It was the first time I chose to share my Chinese culture with the people here. This time, they hadn’t needed to prompt me with their questions. This time, I hadn’t agonized over whether I, their cultural ambassador from America, was exchanging something “un-American.”
I was two years old when my parents and I immigrated to the States. Growing up, I felt as if we were all learning what it meant to be American together. The ways of my parents were often at odds with the ways of my classmates’ parents. My classmates went to church on Sundays; I went to Chinese language school to learn Mandarin. My classmates brought PB&J sandwiches for lunch; I brought rice and vegetables, with a pair of chopsticks. My classmates had turkey, stuffing, and pie for Thanksgiving; I had Chinese hot pot.
As my brothers and I got older, our family started traveling to China during the summers to visit relatives. For Mom and Dad, these trips were like going home. Everything in China was familiar to them. I could tell that a sense of peace washed over them when we were there—they became less anxious, laughed more easily, and seemed to know everything intuitively. For me, though, these trips were the opposite of peaceful. They made me feel even more displaced, even more conscious of the fact that as a Chinese American, neither culture was truly mine.
When I started living on my own, I decided that being in this sort of limbo wasn’t healthy—I needed to commit to one culture. Because I felt that Chinese culture had isolated me from my peers when I was younger, as soon as I had the choice to turn away from it, I did. Aside from the occasional Mandarin conversation with a cab driver, late-night order of Chinese takeout, or short trip to visit my family, nothing about my adult life was culturally Chinese. The lunches I brought to work, the holidays I celebrated, the movies, books, and music I consumed—all of it was American.
By the time I arrived in St. Lucia, I had developed a tense, almost in-denial relationship with my Chinese heritage. I had put so much effort into belonging to something else that when people here reminded me of my ethnicity—when they asked about Chinese culture and insisted on exchanging it with me—I felt like they were challenging my fundamental sense of self.
Sharing my grandfather’s mantou was the moment I made peace with my identity. I realized that my culture isn’t confined to a particular country—not America, not China. It’s the blend of values, customs, and traditions that I’ve absorbed throughout my life, from all of my surroundings.
For other volunteers—those who conform to what people in more remote parts of the world imagine Americans to look and behave like—maybe the act of cultural exchange is straightforward. But for me, it cannot happen so simply. And just because my experience is different, I’ve learned, doesn’t mean that it’s wrong.
The bus driver and I have never spoken about our “Ching-Chong” encounter, and we probably never will. But I still ride his bus all the time. He knows the exact curve of the hill where I call out, “Stopping, please,” and will drop me right in front of my house. On my way home the other day, he glanced up at me in the rearview mirror before that turn in the road, and I nodded at him. Without my saying a word, he pulled over. “Thank you,” I said to him as I climbed out. “Take care, darling,” he said back.
Hannah Jiang is currently a Peace Corps volunteer in St. Lucia, where she teaches at a school and contributes to the national news station. A Yale graduate, she previously worked in Manhattan for an executive search firm.
It’s a small coffee shop, a Shingle-style shack with blue trim, listed by Yelp as one of Laguna Beach’s best. Cookies and biscotti lie in a basket in front of the order window. The barista, an upbeat blonde woman in her late fifties, early sixties, comes over to me. As I’m trying to choose what flavor to put in my coffee, we start talking. She finds out I’m from Phoenix and asks what brought me to Laguna.
“My friend passed away two weeks ago. I’m here to clear my head,” I tell her. Hal, a pastor, was one of the first friends I’d made after moving to Phoenix a year and a half ago with my fiancé. He had helped us through some tough times.
She’s curious about where my accent is from. I tell her I was born in Iran. “But I have lived here longer than I have lived there,” I quickly add.
It’s a cool, sunny November morning. As she’s making my coffee, the woman spots the book I’m carrying in my hand, The Ministry of Guidance Invites You to Not Stay, by Hooman Majd. She asks me what it’s about. I tell her it was written by an Iranian immigrant who had left Iran when he was eight months old. When he turned fifty, he decided to go looking for his grandmother’s house halfway around the world, hoping to find his roots. He found the area, the familiar scents, the leftover mud walls. But he couldn’t find the actual house.
His story is not much different from mine, I say. Several years ago, I visited the neighborhood where my family used to live in Tehran. For the first time in more than two decades, I walked our old block, looking for the home I had grown up in. But it wasn’t there anymore.
Describing my trip to Iran reminds me of a passage I read in Majd’s book: “Maybe it was better that the house and even the street weren’t there. Reality could not possibly rival a childhood memory, and my memory was intact, if rose-colored. Not finding the house also kept me somewhat rootless—now there truly was nothing for me to directly claim as mine.”
“My grandparents emigrated from Denmark,” the woman tells me. She starts talking about how her grandparents worked hard, how they worked their way up—until one day they owned their own business.
I expect her to go on about her family, yet she abruptly turns the conversation elsewhere. “I don’t see my culture here anymore,” she says, testily. At first, I think she’s talking about Danish culture. But I’m mistaken. “Have you been to Heisler Park?” she asks me.
“No.” I know from my friends in Laguna that Heisler is the park Iranians go to for their New Year celebrations.
“When I went to Heisler Park, I had to pass all these Asian tents to be able to celebrate my Memorial Day. Before the Asians, it used to be Mexicans.”
There’s an awkward pause. I’m not sure how to respond. “You know what?” I finally say. “It’s too cold to sit outside. I’ll come back later.”
As I turn away, I feel disappointed with myself for not saying what’s really on my mind. I want to tell her that she should have compassion for those who leave their countries to come here. I want to remind her that her ancestors were also immigrants. But I don’t have the courage to speak up.
Instead, I walk away, thinking about what I should have said, feeling like the outsider I still imagine myself to be—twenty-seven years after coming to America.
• • •
I was born in Tehran. When I was seventeen, my family decided to leave Iran. We immigrated to America not because of conflict—the Iran–Iraq War had already ended by then—but for opportunity. My father was an engineer, and my mother had studied law before becoming a stay-at-home mom. They wanted their children to have a good education.
Growing up in Houston in the nineties, I fought with my mom because I didn’t want her to pack Persian food in my lunchbox. The salt-laden Lunchables would do—anything to fit in among my new classmates. During lunch breaks, I used to hide in the piano rooms to avoid the humiliation of not speaking English. At home, I practiced pronouncing words properly, without the thick Iranian accent. “The,” not “de”—so what if we didn’t use the sound th in Farsi? We were living in America. We needed to respect its language.
As I grew older, I distanced myself from the Iranian community and embraced American culture. Though I was born a Muslim, most of my family and friends had lost interest and trust in religion, thanks to Iran’s Islamic Republic. From early on, I had steered clear of mosques. Yet in America, whenever friends invited me to church, I went. Defying my parents and their Iranian values, I dated American boys without plans to marry. Once, my mom—playing matchmaker—asked me to meet her friend’s son, who had traveled from Switzerland to see me. I refused to go.
When it came to school, however, my two brothers and I were good Iranian children. My father, who had gone from building factories to manufacturing blinds, had no time for nonsense, and demanded hard work and excellence in whatever we did. He was the type of immigrant dad who, if I got a 98 on a test, would ask me, “Who got the 100?” Once the managing director of an engineering firm in Iran, he had been forced to take a job as a low-level supervisor at the blinds plant when we moved to America. He expected much more for us, and his high expectations paid off: two of us earned doctorates, and the third, an MBA.
At dental school, I’d been one of a large number of immigrants in my class. Being around people who shared my experience as a newcomer had been good for me, and by the time I graduated I was no longer as anxious about sticking out as an Iranian. After I finished school, I started an oral and maxillofacial surgery residency at a hospital in New Jersey.
I was in my second year there when the September 11 terrorist attacks happened. A few days afterward, a group of us were in the operating-room holding area waiting for a patient. News streamed on a small television set. We congregated around it, watching footage of the collapse of the Twin Towers over and over.
Soon the anesthesiologist and nurses were peppering me with questions about why the terrorists had done this. “You are from the Middle East, aren’t you?” someone asked me.
For years, every time I traveled by air, I was pulled out of the line for a “random search”—something my then-husband, also Iranian, avoided because of his light complexion.
After my hospital residency, we settled down in Atkinson, a tiny New Hampshire town along the Massachusetts border, and I became a US citizen. When I was finally able to vote in 2008, I felt such a sense of joy as I waited at the polling station to cast my ballot—only to have that feeling vanish when I overheard someone in line say, “We oughta show these towelheads who’s the boss.”
Let it go, I told myself. Why bother about what a couple of people in a small town think?
• • •
In Iran, we grieve the loss of a loved one for forty days, with ceremonies on the third, seventh, and fortieth days. We even have specific foods for funerals—halva, a sweet dish made with flour, butter, sugar, and rosewater and decorated with pistachios, is often served to mourners with tea.
In America, a culture of positivity, I don’t really know how to mourn. Right after Hal died, I disconnected from my emotions. Eventually, I became angry—at others for their platitudes, and with myself for how long it was taking me to get over his death. I needed to get away from it all, and so I went to Laguna Beach, where many of my old Iranian friends live.
I tell them how much Hal’s death has shaken me—and how much I’ve struggled to properly grieve for him. I need the ceremonies we used to practice, I tell them. I crave the taste of halva with tea.
They listen. Ironically, out of all my Iranian friends, the one who has the least nostalgia about her former life best understands the pain of that old wound. “You know, our parents are Iranian,” she says. “Our children are American. We, I’m afraid, are neither. We are orphans.”
I think back to my search for my childhood home in Tehran. I remember coming to the block where my family’s house had once stood, and seeing the sterile apartment complex they had built in its place. Good, I told myself. Who has the energy to feel emotional?
Afterward, a friend drove me around the neighborhood. We passed my middle school. The old mosque. The pastry shop I used to stop by on my way to school.
Suddenly, tears started streaming down my cheeks.
It wasn’t there anymore. It wasn’t there.
Bahar Anooshahr is an Iranian American writer and recovering oral and maxillofacial surgeon. Twitter: @banooshahr
A protester-made statue with the Spanish words "dignity" and "fight" stands outside the Chicago Board of Trade building following a march in favor of a higher minimum wage. Scott L, via Flickr
This weekend, low-wage workers from around the country will be arriving in my city, Richmond, to make a case for increasing the minimum wage. It’s the first-ever national convention for the Fight for $15 movement, which in the past few years has launched wide-ranging strikes and protests to raise awareness about how a $7.25-an-hour wage—the current federal minimum—just doesn’t cut it for many workers struggling to make ends meet for themselves and their families.
There’s a long line of economic arguments in favor of, and opposed to, increases in the minimum wage. Among other things, opponents say it will raise prices for consumers, cause employers to slash jobs or cut back on workers’ hours, and put many companies out of business. Advocates say it will help the economy by giving workers more money to spend in their communities, encouraging the unemployed to seek out work, and reducing the stress and anxiety the working poor deal with, as well as their reliance on government benefits.
As important as the economic impacts of this policy are, however, it’s even more important to consider its cultural and moral implications. After all, that’s what drives much of the widespread public support for increasing the minimum wage, even among people who have never heard of, say, the elasticities of labor supply and demand. Many Americans just don’t think it is right that people who work hard should have to struggle so hard.
To be sure, the research on the minimum wage gives us little reason to despair—or cheer—over its impact on the economy. The most rigorous studies seem to suggest that it doesn’t make a big difference in terms of employment and growth. A 2014 open letter signed by 600 economists, including seven Nobel laureates, advocated raising the minimum wage to $10.10, noting that the “weight of evidence” showed “little or no negative effect” on employment for minimum-wage workers. Meanwhile, the increase would lift wages for them and likely “spill over” to other low-wage workers, too, possibly stimulating the economy to a “small” degree, the economists wrote.
Most recently, a University of Washington study of the increase in Seattle’s minimum wage to $11—on its way to $15 in 2017—tried to sort out the impact of the wage hike alone, sifting away the effects of other changes in the economy occurring at the same time. It found mixed results: slightly higher wages, but fewer hours; somewhat less employment, but no increase in business closings.
Make of these studies what you will, but it’s hard to argue that the sky is falling in places where wage policies have changed. And while a higher minimum wage will give low-wage workers fatter paychecks, it obviously cannot, by itself, pull the working class out of its decades-long malaise of stagnant wages and growing insecurity.
These economic analyses provide important context, but the policy question really boils down to one of values. America has always prided itself on being founded on principles rather than a single cultural persuasion, and Americans have held onto few principles as steadfastly as the value of hard work. An honest day’s toil should get you by. And yet we have millions of Americans who work full-time and are still in poverty. We have millions working at global corporations like Walmart and McDonald’s that pay their workers so little that their business models rely on government to pick up the tab—by providing Medicaid, food stamps, refundable tax credits, and the like.
Adapting our laws and our economy to match our principles will take time. With any change, there will be some who gain, and some who lose out, more than others. But overall society will be better off—and it’s not just because some people will make more than they used to.
When we pay living wages, the culture changes, too. As Katherine Newman found in her classic study of fast-food workers, No Shame in My Game, part of what makes it hard to take a low-wage job is not that people don’t want to work—it’s that society has such disdain for those making chump change behind a McDonald’s counter or in a Walmart stockroom. (This is also one reason that immigrants—who aren’t under the same sorts of social pressures as the native-born—will do the poorly paid jobs others won’t.)
In the research for my book about the long-term unemployed in America and Canada, I came across one man out of work for more than a year after the car-parts plant that employed him shut down. He had avoided having to live on the street by moving into his mom’s house. When I spoke to him, he had just given away his last unemployment check to his daughter so that she could have something of a normal Christmas.
“I’m forty-three years old and living off my mother,” he told me. He was ashamed about accepting his family’s help, but he felt he had to do it. What he wasn’t willing to do, though, was work at a fast-food restaurant. He had put in twelve years at a respectable job, he pointed out. “I don’t want to throw on a goofy hat.”
If we believe that certain jobs are so undignified that we won’t even pay someone a decent wage to do them, then we shouldn’t be surprised that people with a decent amount of self-respect won’t do them. Opponents of raising the minimum wage seem to be blind to this. They talk about the economic pros and cons of wage laws as if those were the only things that matter. But people in the real world don’t just have balance sheets; they also have pride.
If you don’t think that making economic policy based on principle is realistic, then consider the extent to which it has already occurred—in the direction of greater income inequality. In 1965, CEOs made 20 times more than a typical worker, according to the Economic Policy Institute; in 2014, they made 300 times more. Part of this shift was due to global competition and changes in labor and financial markets, but some of it can be linked to the dwindling sense of obligation that those at the top now have toward their workers, as Mark Mizruchi and other scholars have noted.
As many of today’s corporate leaders see it, making obscenely larger amounts of money than their employees do is no longer cause for guilt. The boardroom culture tells them they deserve it. And so they continue to push for changes in tax laws to make sure the economy’s outcomes reflect their own principles of self-profit.
Indeed, in other rich countries with different social norms, the gap between CEO and worker pay is nowhere near as extreme—and the minimum wage tends to be much higher, too. These countries have clear notions of what’s fair and appropriate to pay for a day’s work, and they have chosen to pursue practices and policies in line with those beliefs.
Even those of us who want government to do more for the working poor often forget the importance of this broader cultural context. Yes, we should take advantage of targeted, technocratic solutions such as earned-income tax credits that make low-wage work pay better. But it should trouble us that these policies often amount to having the government subsidize employers who refuse to foot any extra labor costs. Furthermore, having a company pay a higher wage and having the government supplement that wage are very different things. Or at least they are when we look from the vantage point of flesh-and-blood human beings—as opposed to that of the rational-actor stick men in economic models. We brag about our paychecks, not our tax credits.
What we pay those at the bottom also has something to say about the dignity and connectedness of our society as a whole. If every wage is a living wage, those of us who are more fortunate won’t be living in such a different world from those sweeping our floors and serving our food. An entry-level job won’t be such a laughable and undignified proposition that a kid in a poor town or neighborhood won’t even consider taking it over a flashier (and deadlier) gig on the corner. If we think people are worth more than a pittance, they will act that way—and treat others that way.
In a sense, it’s fitting that Richmond, the former capital of the Confederacy, a city with a history of stark racial and economic inequalities, should host the Fight for $15 convention. The old plantation-based economy disappeared not because it wasn’t profitable. It disappeared because it wasn’t just. If we truly believe in our values, we should make our economy reflect them.
Hillary Clinton formally accepts the Democratic Party's nomination for president on the fourth night of the 2016 Democratic National Convention in Philadelphia. Ali Shaker/Voice of America, via Wikimedia Commons
Hillary Clinton’s acceptance speech on Thursday brought to mind the wide gap that separates those in this country who want sweeping change and those who favor incremental reform. It’s played out during the presidential campaign, obviously, in the fierce primary clashes between Bernie Sanders and Clinton, and between Donald Trump and his Republican rivals. But it’s also a tension that can be seen in Clinton’s own politics.
Today, Clinton is the centrist foil to Sanders’s bold and radical idealism. She has explicitly described herself that way. “You know, I get accused of being kind of moderate and center,” Clinton told supporters last September. “I plead guilty.”
It’s easy to make the case Clinton has never really been a liberal, much less a progressive. As she noted in her autobiography, she was once a “Goldwater girl.” Raised in a conservative household, she volunteered for the campaign of Republican presidential candidate Barry Goldwater in 1964 (whose archconservatism later inspired the Reagan Revolution). In her first year of college, she served as the president of the Wellesley Young Republicans.
By then, however, she was supporting moderates—Rockefeller Republicans like John Lindsay, the mayor of New York, and Edward Brooke, a Massachusetts Republican and the first African American popularly elected to the US Senate. Her politics shifted further, as they did for many young people, as the Vietnam War and the civil rights movement made her question the policies and norms of the day. In his biography of Clinton, Carl Bernstein quotes a letter from around this time in which Clinton described herself as “a mind conservative and a heart liberal.”
Her first major speech—one that unexpectedly made headlines—was the address she gave at her Wellesley College graduation in 1969, the first year that Wellesley featured a student speaker at its commencement. Clinton went up on stage following the commencement speaker—who that year was Senator Brooke. In her speech, a young Clinton mingles her characteristic pragmatism with an uncharacteristic idealism—a bold demand for transformative change.
Brooke had criticized the disruptive protests of the day as “unnecessary” and “ineffective.” “Potential allies are more often alienated than enlisted by such activities, and their empathy for the professed goals of the protesters is destroyed by their outrage at the procedures employed,” he said. Brooke then highlighted the “measurable progress” of recent years, including the drop in the poverty rate over the past decade. Change within the system works, he concluded.
In impromptu remarks at the beginning of her speech—words that incensed university officials—Clinton chided the senator:
Part of the problem with just empathy with professed goals is that empathy doesn’t do us anything. We’ve had lots of empathy; we’ve had lots of sympathy, but we feel that for too long our leaders have viewed politics as the art of the possible. And the challenge now is to practice politics as the art of making what appears to be impossible possible. What does it mean to hear that 13.3 percent of the people in this country are below the poverty line? That’s a percentage. We’re not interested in social reconstruction; it’s human reconstruction. How can we talk about percentages and trends? The complexities are not lost in our analyses, but perhaps they’re just put into what we consider a more human and eventually a more progressive perspective.
A practitioner of the “art of the possible”—that seems to describe perfectly Hillary Clinton’s reformist politics of recent years. And yet five decades ago, she was talking—eloquently and off the cuff—about a more profound kind of change.
Glimpses of that younger, idealistic Clinton came out in her husband’s remarks on Tuesday, as he described her legal work on behalf of children and the poor. But even after Bill Clinton was elected president, Hillary Clinton could still sound at times like the socialist Vermont senator she’d face decades later in the primaries. When White House advisers critical of her single-payer health care plan called it unfriendly to business, she bluntly told her husband, “You didn’t get elected to do Wall Street economics.”
How much things have changed for Clinton: from a First Lady berating her husband for doing Wall Street’s bidding, to a presidential candidate being berated for doing Wall Street’s bidding. By the time of her 2000 Senate campaign, Clinton was projecting an image of being anything but business-unfriendly—one that she further cemented by developing, as New York’s junior senator, close ties to the financial sector. In terms of policy, she advocated piecemeal reforms. “I now come from the school of small steps,” she said.
Her critics might call this shift a sign of her inveterate duplicity—her willingness to do anything to get elected. More charitably, you could call it a symptom of a political post-traumatic stress disorder. It’s clear that she was chastened by the catastrophic failure of health care reform. She was humbled, too, by the disastrous 1994 midterm election that swept Republicans into power.
But in her acceptance speech on Thursday, Clinton seemed to be trying to bridge the gap between her younger and older selves. She spoke of “big ideas.” She spoke of “understanding.” She spoke of “healing.”
I refuse to believe we can’t find common ground here. We have to heal the divides in our country. Not just on guns. But on race. Immigration. And more. That starts with listening to each other. Hearing each other. Trying, as best we can, to walk in each other’s shoes.
It was the sort of touchy-feely rhetoric that might have come from the lips of George McGovern, the unabashedly liberal Democratic senator whom she and Bill Clinton campaigned for after college.
Of course, even as she appealed to ideals rather than policies, Clinton turned to what her campaign calls the central motif of her career: action. As in her 1969 commencement speech—when she made the brash statement that “empathy doesn’t do us anything”—she stressed the long and hard struggle for political change. But also like in that earlier speech, she made a conscious effort to balance that pragmatism with idealism—in her words, “action” with “understanding.”
I went to work for the Children’s Defense Fund, going door-to-door in New Bedford, Massachusetts on behalf of children with disabilities who were denied the chance to go to school. I remember meeting a young girl in a wheelchair on the small back porch of her house. She told me how badly she wanted to go to school—it just didn’t seem possible. And I couldn’t stop thinking of my mother and what she went through as a child. It became clear to me that simply caring is not enough. To drive real progress, you have to change both hearts and laws. You need both understanding and action. So we gathered facts. We built a coalition. And our work helped convince Congress to ensure access to education for all students with disabilities.
It’s a big idea, isn’t it? Every kid with a disability has the right to go to school. But how do you make an idea like that real? You do it step-by-step, year-by-year … sometimes even door-by-door.
Clinton has been described—and has described herself—as a “work horse, not a show horse.” In Thursday’s speech, she could have said more about her background and experiences to soften her hard-nosed public image and connect with voters. After all these years in the political spotlight, she still comes across as (at least relative to most politicians) a private person, one who is uncomfortable with making a personal connection from far across a stage. As she noted, “The truth is, through all these years of public service, the ‘service’ part has always come easier to me than the ‘public’ part.”
Luckily for her, many Americans can, in theory, relate to that kind of personality type—because it describes who they are, too. Clinton hasn’t done enough to relate to voters in this way, but her speech on Thursday was a step in that direction, stressing to them her indefatigable determination—an oft-ignored, almost folksy trait in a political system increasingly fueled by Hollywood-style celebrity and telegenic charisma.
But I’m here to tell you tonight—progress is possible. I know because I’ve seen it in the lives of people across America who get knocked down and get right back up. And I know it from my own life. More than a few times, I’ve had to pick myself up and get back in the game. Like so much else, I got this from my mother. She never let me back down from any challenge. When I tried to hide from a neighborhood bully, she literally blocked the door. “Go back out there,” she said. And she was right. You have to stand up to bullies. You have to keep working to make things better, even when the odds are long and the opposition is fierce.
Bill Clinton is a master of the politics of personal connection; Obama, a master of the politics of inspiration. Those traits matter mightily for any president, who must often rely on charm offensives and the bully pulpit to advance policy. Obama once implied that his hope was to “change the trajectory of America” and put the nation on a “fundamentally different path,” in the ways that his predecessors Reagan and Kennedy did. How would he do that? Through persuasion as much as policy—by tapping into the culture of the moment as much as altering the structure of law.
If Hillary Clinton wins the presidency, we’ll learn whether a modern president of a quite different temperament can also succeed in this task. Her strength—as she admitted half-jokingly on stage—is in the unglamorous work of rolling policy boulders up political hills.
And yet her speech also reminds us that there is still something of that young college grad in Hillary Clinton. There’s still a belief in big ideas—a boldness that the Sanders campaign, among other things, has helped stir in her again. There’s still an ambition for something more than small steps—for a politics of the impossible. If she can rekindle that part of her, she may put this country on a fundamentally different path.
Sub-Saharan Africans started sleeping in these concrete pipes after they were forcibly evicted from their homes in Tangier’s Boukhalef neighborhood (visible in the background of the photo). Many of them had their belongings thrown out or burned by police, according to local activists. Some were carried off to other cities in Morocco.
Before they manage to reach Spain or Italy or Greece, people fleeing poverty and war in Sub-Saharan Africa head to port cities like Tangier. There, they face the risk of beatings and repression at the hands of authorities—or dying on the crossing to Europe.
“This is a picture of the forests,” Michael says, flicking through the photos on his laptop. “At night they will struggle. I mean, how can a plastic bag save you from the cold? If it’s cold, it will be cold on you. If it rains, it will rain on you. If the police come there, they will burn down all this.”
Michael, a twenty-something man from Gambia, is showing me photos that he has collected during his year and five months in Morocco: some taken by him, others by journalists that he has met and befriended, others by friends who are migrants like himself. At great personal risk, he has been documenting human rights abuses and the daily struggles that migrants undergo in Morocco. (Some of the photos from his collection are interspersed throughout this story.) The images are unsettling. Young children sleeping in the cold. Men who have been beaten half to death. Families living in squalor in the forests.
Jo Magpie is a freelance journalist, travel writer, and long-term wanderer currently based in Granada, Spain. Blog: agirlandherthumb.wordpress.com
After the birth of his daughter Adeline, Dustin Davis became a “work-at-home dad”—picking up his design career after a layoff by doing freelance work, but making his daughter his chief priority.
Today’s stay-at-home dads have little in common with Mr. Mom. Responsible, nurturing, and home by choice, they are eager to prove that—aside from the breastfeeding—they can do whatever a woman can.
“We’re gonna be on this airplane,” R. C. Liley says, showing his two-year-old daughter a pink, two-seater toy plane. Liley, twenty-nine, is tall and fit and towers over Avery, a toddler in a light-green T-shirt with the words “Never Mess.” “We’re gonna start from the ground, and w-o-o-o-o-sh!” he says, mimicking the sound of the jet engines as he lifts the plane higher and higher.
Liley ends his demonstration. “Okay, Avery, that’s an airplane,” he says. “We’re gonna fly on it—are we gonna be good?”
“Yes,” Avery says, a bit hesitant, her dimples sinking into her cheeks as she smiles.
Liley is a stay-at-home dad. He looks after Avery when his wife Kelley, Avery’s mother, is working in the finance department of a large corporation. Unlike some stay-at-home dads, who feel awkward about taking on a role that many people still consider feminine, Liley is open about being the primary caretaker for his child—so open, in fact, that he regularly shares his experiences on his blog.
Since 1989, the number of stay-at-home dads, or SAHDs, has nearly doubled, according to the Pew Research Center. About two million fathers in America now care for children younger than eighteen while not working outside the home. They account for 16 percent of at-home parents. Likewise, across Europe and even in more traditional cultures around the world, men who take on this role are increasingly visible.
In the United States, many men who lost their jobs during the recession wound up staying at home with their kids, at least temporarily. For a growing number of men, however, their choice to become stay-at-home dads is actually that—a choice. Surveys support this view that fathers’ attitudes are changing. For example, just 5 percent of stay-at-home dads in 1989 said that the main reason they were home was to care for their home or family; today, 21 percent do.
These fathers have little in common with Jack Butler, the hapless stay-at-home dad played by Michael Keaton in the 1983 comedy Mr. Mom—still the cultural reference many people turn to when thinking of men at work at home. Forced to care for his kids after being sacked, Butler seemed bent on wreaking havoc in the house.
Today’s more gender-equal generation of stay-at-home dads shoulders domestic tasks more responsibly. For his part, Liley grew up in an upper-middle-class family in Texas. He studied finance and got a well-paying job as an accountant at a mutual fund. But in 2013, he decided to quit his job and care for Avery. “My wife always made more,” he says. “I was already the one doing the cooking and the rest of the household.” Staying home with Avery wasn’t something Liley felt forced into. Quite the contrary, he says—“I was counting down the days till I became a SAHD.”
But as much as attitudes about parenting have changed, stay-at-home dads still find themselves facing skepticism and derision, often subtle in form—the ways that stay-at-home moms steer clear of them at the playground, or the media portrays them as clueless and dumb, or friends and family drop hints that what they’re doing is strange.
“Being the man, it sounded crazy for me to quit my job,” says Matt Dudzinski, thirty-six, a former interior designer for an architectural firm in Detroit who now cares for his two daughters, six and three. He and his wife Aya, a trim engineer for an automotive company, had each thought—to themselves—that having Dad at home would work best for them as a couple. But they avoided talking about it. “We were both worried about being judged—her, for wanting to keep her career while being a mother, and me, for not being a breadwinning man.”
Then Dudzinski was laid off. “The arrangement we both knew we wanted, but were afraid to voice, was decided for us.”
In certain parts of the world, men (and women) have an easier time staying at home with their kids. For more than two decades, Canada has granted paid leave to fathers who want to be the primary caretakers of their children. In Japan, a country known for its stark gender divide, the law nonetheless requires employers to give their workers—men and women—time off after the birth of a child. In Sweden, one of the most SAHD-friendly countries, both moms and dads can receive government benefits for up to 480 days if they choose to care for their kids at home.
In America and Australia, there is much less in the way of support. Stay-at-home dads have fewer role models or resources to help them, and when government policies do exist to assist families with young children, they tend to treat these men as second-class parents.
Regardless of what their governments do, however, broad economic and cultural shifts seem to be pushing new dads in all these countries to consider what their own fathers would not.
In the United States, the number of stay-at-home dads peaked at 2.2 million in 2010, but then fell slightly once the economy picked up. Clearly, household decisions about who does what have much to do with the state of a family’s finances: in an uncertain economy, men who wouldn’t otherwise stay home are willing to do so when it seems practical. The massive unemployment of the economic downturn is only part of this story, though. Years after the official end of the recession, the typical American household makes less income, adjusted for inflation, than it did in 2007. Having a parent stay at home sometimes makes more sense than paying for a nanny or daycare—and now that women often make more than their partners, the sensible choice in some cases is for the dad, not the mom, to stay home.
Changing values may also be drawing men out of the workplace and into the home. Today, parenting is seen as both the cause of and the solution to a wide variety of social ills, says writer and sociologist Tiffany Jenkins, while “work is not as important as it used to be for one’s identity and purpose.” As work inside the home becomes, as Jenkins puts it, “professionalized,” more men may think of it as a worthy life calling.
In turn, some of today’s new fathers may be reacting to what they think their own dads got wrong. GenXers and millennials, who grew up at a time when dual-income families were the norm, are already more comfortable with the idea of a woman breadwinner. Like every generation before them, they are finding their own ways to rebel—and in the case of the stay-at-home dads among them, this may involve rejecting their fathers’ workaholic schedules, which left little time for children. “I think a lot of people from my generation grew up without dads, or without good dads, and we are trying to change that—showing that we can be great dads,” says Josh Hardt, twenty-eight, a stay-at-home dad in Durham, North Carolina.
For his part, Hardt never felt close to his biological father, he says. After he moved away from home, he did find a fatherly role model in his stepdad, who was a more hands-on parent. Now that he’s a father himself, Hardt works on a freelance basis as a filmmaker but focuses on caring for his two-year-old daughter. His wife works as a retail store manager and provides most of the family’s income.
Hardt enjoys his role at home. The idea of a woman supporting a man financially isn’t that far a cultural leap for someone young like him, but Hardt knows that others—especially the older generation—think otherwise. “They come from a different time, so I understand why it’s hard to understand,” he says.
If capable stay-at-home dads like Hardt are growing in numbers, though, you wouldn’t know it by watching TV. From Fred Flintstone to Homer Simpson, from Al Bundy to Alan Harper, the most popular on-screen dads of the past several decades have been roundly portrayed as doofuses. And the stay-at-home dads among them have not been spared the low expectations that both men and women have concerning male parenting skills. Even when TV dads are praised for being practically minded problem solvers in the home, the compliments are woefully backhanded—in a controversial 2011 detergent commercial, for instance, the savvy stay-at-home dad has to qualify his competence by calling himself a “dad-mom.”
Paul Schwartz knows the stereotype of the bumbling dad well: he was asked to play one on TV. Schwartz, a forty-two-year-old former labor lawyer, has gained a large following on his blog, which chronicles his adventures as a stay-at-home dad in Paris. A few months ago, a cable channel asked him if he wanted to be in a reality show they were developing about stay-at-home dads. The idea was interesting, but in the end Schwartz backed out. “They insisted that we act like morons,” he says.
Perhaps the negative portrayals of stay-at-home dads in the media aren’t so surprising, though, given how prevalent these stereotypes are in the public at large. It needs to be stressed that perceptions of stay-at-home dads tend to be much more hostile outside of America and Europe: in China, SAHDs often hide their status, fearing humiliation, and in many Muslim nations, such a role for men is considered religiously subversive. Nevertheless, large numbers of people in rich Western countries continue to have a lopsided view of who should be taking care of the kids. In a 2013 Pew survey, for instance, 51 percent of Americans said that children are better off if their mothers are home, while only 8 percent said the same of fathers.
Stay-at-home dads are regularly reminded that other people see them as, at best, an oddity. “I usually get one of two responses when people ask what I do for a living,” Dudzinski, the stay-at-home dad from Detroit, writes in an email. “‘Oh, that’s great’ (with a straight face, changes subject and stops talking to me). Or: ‘That’s awesome! If I didn’t have to work, I’d totally stay home all day!’ (assuming I watch TV and order pizza every day).”
Schwartz has stayed at home with his son Malcolm for a decade, but he still gets his share of clueless and patronizing questions from people he meets—inquiries along the lines of, “How does it work? Do you do laundry, too?” “Most are a bit shocked to learn that I have been a stay-at-home parent for all of Malcolm’s life,” Schwartz says. At PTA events, parent gatherings, and playdates, Schwartz is still frequently the only man in the room. “I don’t have a problem with it, although it occasionally means that my sense of humor doesn’t go over well.”
Schwartz and his family used to live in San Francisco. After Malcolm was born, he quit his job as a lawyer to take care of him. Then, in 2013, an international software company offered his wife Amy an executive position in Paris. She decided to take the job, and the family relocated overseas. Once in Paris, Schwartz immediately set to work establishing a new support network for Malcolm. He reached out to a local moms’ group about joining—only to learn that he, as a man, wasn’t invited.
When he did meet other stay-at-home parents, their interactions were “a bit weird,” he says. At a coffee for parents new to Paris, the group talked for half an hour about breastfeeding, vaginal births versus C-sections, and similar topics. “You’d think that sitting around with a bunch of women talking about their intimate body parts would be terribly exciting, but to tell you the truth, I was bored.” To find more parents he could relate to, Schwartz eventually turned to the SAHD networks that have sprouted up around the globe in recent years. The people he’s met in this virtually connected community have been an important source of support, he says.
There is some irony to the fact that stay-at-home moms can be some of the least understanding people whom SAHDs encounter. One obvious reason for the distance these women keep is apprehensiveness about sexual tensions—fears, for instance, that SAHDs must get lonely and want to hit on them. “Women are afraid they are forming a relationship that’s more [than] a friendship, so they don’t want any part of that,” says Michelle P. Maidenberg, president and clinical director of Westchester Group Works, a community center in New York focused on group therapy.
The awkwardness, however, may have to do with more than just unwanted sexual attention. Women may see stay-at-home dads as threats—interlopers in a domestic sphere they thought was theirs alone, Maidenberg says. Or, they may see the SAHD as a sign of their own inferiority. The modern woman faces a daunting work-life balancing act: the need to juggle a thriving career and a thriving family. Meeting a stay-at-home dad, then, might raise some unsettling questions about how others have succeeded where she has failed—questions like, “Who is the high-powered female married to this man? How incredibly successful and rich is she that she has her husband at home?”
There is a joke going around his circles, Schwartz says. “The new status symbols for women are driving a hybrid car and having a stay-at-home dad for a spouse.”
Among other things, skepticism about stay-at-home dads is rooted in the widespread view that women are just more caring and empathetic than men, and thus better suited to be caregivers. Science backs that view up—though the degree to which it does, and the degree to which any gender difference is due to nature or nurture, are hotly contested.
Women tend to have higher levels of activity in their mirror neurons, brain cells linked to the workings of empathy. But scientists disagree about whether empathy is determined by mirror neurons alone. Furthermore, research finds that these mirror neurons can be altered through simple and brief training tasks. This suggests that empathy is not impervious to the power of culture, and that the gender differences we see may be due, at least in part, to the way children are socialized, not their innate traits. Indeed, studies of infant boys and girls find that boys are equally sensitive and attentive to other people at this early stage in their development.
What happens, then, when men care for their kids at home? Not surprisingly, studies find that, generally speaking, children benefit when their fathers get involved in their lives. For example, one British study gathered a sample of 11,000 adults and asked their mothers how often the children’s fathers had read to them, gone out with them, and otherwise spent time with them during their childhoods. The researchers found that, on average, adults whose fathers had been more involved when they were growing up had higher IQs, were more sociable, and enjoyed a healthier sense of self. Perhaps being raised by the most involved kind of father—a stay-at-home dad—can lead to even greater benefits for children’s sense of self-worth.
That is a hypothesis that researchers are evaluating, says Dr. Michael B. Donner, president of the San Francisco Center for Psychoanalysis. Another is that men who care for their kids have personality traits that distinguish them from other men. For instance, there is anecdotal evidence that stay-at-home dads are more connected to, and comfortable with, their feminine side, Donner says. (For their part, many stay-at-home dads delight in the idea they are different: they want to show other people—especially their own children—that masculinity is also about compassion and nurturing, they told me.)
As interesting as this research can be, Donner is quick to add that the debate over gender differences can obscure the larger point: children just need supportive parents. “It’s not about gender or testosterone levels, or who nurtures or challenges. It is about feeling safe and secure in your parents’ hands, and these properties have no gender.” The bottom line is that children raised in nurturing environments exude confidence when they become adults, he says. “Two can play at that game, moms or dads.”
Dustin Davis has spent the last few years proving just how nurturing a dad can be. In 2013, Davis was laid off from his job as a designer. When his daughter Adeline was born two years ago, he decided he would use the opportunity to become—as he puts it on his personal blog—a “work-from-home dad.” During the day, his wife Jessica works as a designer at a marketing agency, while Davis cares for Adeline in their St. Louis home.
Davis, thirty-three, is as manly as you can get, as evidenced by his impressive ZZ Top beard. But like any stay-at-home parent, he revels in the milestones he’s been able to see first-hand—the other day, it was the five steps Adeline took, in a moment of particularly good coordination. Like many stay-at-home dads (and for that matter, like many stay-at-home moms), he has a career he continues to pursue. But now he is a freelancer working part-time from home, and his chief priority, he says, is Adeline.
“While I cannot breastfeed a child, I can do everything else a woman does. I can be nurturing and loving. I can raise a child.”
Stav Dimitropoulos Stav Dimitropoulos is a writer and journalist whose work has appeared in major US, UK, Australian, and Canadian outlets. A native of Greece, she received the Athens Medal of Honor at the age of seventeen and went on to receive a master’s degree. She experimented with journalism along the way, and has been writing ever since.
Facebook | Twitter: @TheyCallMeStav
It was the last night of my conference in Paris, and I was sitting with some new friends in a Brazilian restaurant near the Avenue de la République. We had just wrapped up a day of panels and presentations on the topic of race at the Sorbonne, and the six of us—two Dutch scholars, an Italian, a Belgian, a French woman, and me, the American—had gone out to celebrate. I felt a bit sheepish, as an American, to be eating food from the Americas in Paris, but a few drinks erased that feeling.
We had just finished eating and were sitting around chatting when the restaurant, which had been slowly emptying, suddenly filled with people again. A young French couple hurriedly slipped inside and sat down at the table next to us. The man spoke to us in English. “Don’t go outside,” he said.
The people at my table huddled anxiously around him. People were running in the streets away from something, he told us. I glanced around the restaurant and saw that everyone was already staring at their phones. Looking at my own, I saw a news alert that said that several bombs had gone off in the Bataclan concert hall.
“That is just 1,000 meters from here,” the French man said, eyes wide. Some of the women around me gasped.
Chinyere Osuji Chinyere Osuji is the author of Boundaries of Love: Interracial Marriage and the Meaning of Race, uses social science to understand how Blacks interact with ethnic and racial “others,” and has watched Something in the Rain five times. Site | Instagram | Twitter | Clubhouse
Dr. Martin Luther King delivering his “I Have a Dream” speech in Washington on August 28, 1963. National Archives and Records Administration, via Wikimedia
All the discussions today of how much racial progress we’ve made since Dr. Martin Luther King’s day reminded me of a disturbing point about the black−white health gap raised in recent research, some of which I discussed in an Atlantic essay over the weekend.
According to the Centers for Disease Control, African Americans have been catching up with whites in terms of life expectancy at birth. So things are looking up, right?
Yes, and no. To a sizeable extent, what explains the narrowing of the life-expectancy gap in the last couple decades is not just that things are better for African Americans (though they have improved), but also that things are worse for whites—working-class whites above all.
A New York Times piece over the weekend highlighted this fact. “A once yawning gap between death rates for blacks and whites has shrunk by two-thirds”—but that’s not because both groups are doing better, according to the article. Overall mortality has declined for African Americans of all ages, but it has risen for most whites (specifically, all groups except men and women ages 54-64 and men ages 35-44).
Furthermore, younger whites (ages 25-34) have seen the largest upticks in deaths, largely because of soaring rates of drug overdoses, and those who have little education are dying at the highest rates. The mortality rate has dropped for younger African Americans, a decline apparently driven by lower rates of death from AIDS. Together these trends have cut the demographic distance between the two groups substantially.
For middle-age African Americans, the progress in improving health outcomes implied by the shrinking black−white mortality gap is also less cause for celebration than it might seem at first.
A much-discussed study last year by the economists Anne Case* and Angus Deaton found that huge spikes in deaths by suicide and drug poisonings over the last couple decades have meant that the trend of declining mortality rates we’ve seen for generations actually reversed for whites ages 45-54 between 1999 and 2013. Again, those with little education were hit the hardest.
In my Atlantic piece, I pointed out that the growing social isolation and economic insecurity of the white working class might explain some of these trends. One of the caveats I mentioned is that death and disease rates remain much higher among African Americans and Latinos. (I should have been more precise in the article: although Latinos have higher rates of chronic liver disease, diabetes, obesity, and poorly controlled high blood pressure, they have lower rates of cancer and heart disease, and lower or at least equivalent rates of death).
But it’s not just that the black−white gap persists. Here’s an important passage from Case and Deaton’s paper:
Over the 15-[year] period, midlife all-cause mortality fell by more than 200 per 100,000 for black non-Hispanics, and by more than 60 per 100,000 for Hispanics. By contrast, white non-Hispanic mortality rose by 34 per 100,000. CDC reports have highlighted the narrowing of the black−white gap in life expectancy. However, for ages 45–54, the narrowing of the mortality rate ratio in this period [1999−2013] was largely driven by increased white mortality; if white non-Hispanic mortality had continued to decline at 1.8% per year, the ratio in 2013 would have been 1.97. The role played by changing white mortality rates in the narrowing of the black−white life expectancy gap (2003−2008) has been previously noted. It is far from clear that progress in black longevity should be benchmarked against US whites.
Let me reiterate their point: for Americans ages 45-54, the narrowing in the black−white gap in life expectancy in recent decades was “largely driven” by more deaths among whites.
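To make their arithmetic concrete, here is a rough, back-of-the-envelope version of that counterfactual. The starting levels below are approximations in the ballpark of what the paper reports (roughly 381 deaths per 100,000 among whites ages 45-54 in 1999, and roughly 582 per 100,000 among African Americans in that age group in 2013), so treat this as a sketch of the logic, not the paper’s exact tabulations:

\begin{align*}
\text{White rate, 2013 (actual)} &\approx 381 + 34 = 415 \text{ per } 100{,}000\\
\text{White rate, 2013 (had the 1.8\% annual decline continued)} &\approx 381 \times 0.982^{14} \approx 296\\
\text{Actual black−white mortality ratio} &\approx 582 / 415 \approx 1.40\\
\text{Counterfactual ratio} &\approx 582 / 296 \approx 1.97
\end{align*}

Notice that the black rate is the same in both scenarios; only the white rate changes. That is exactly Case and Deaton’s point: the ratio moved toward 1 not because black mortality fell unusually fast, but because white mortality stopped falling.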
It’s heartening that overall life expectancy is increasing for many Americans, including African Americans. But it’s also important to remember that, almost a half century after King’s death, people of all races continue to be left out of this country’s progress, and some—whites and nonwhites—may, in fact, be seeing an unprecedented step backward.
* I want to apologize to Dr. Anne Case for mistakenly identifying her as “Susan Case” in the original version of my article in the Atlantic. (The only reason I can think of for why I made that dumb mistake is that a friend of mine is named Susan Caisse.) This brilliant scholar has already suffered the injustice of having her study erroneously called the “Deaton and Case study” rather than the “Case and Deaton study” (for better or worse, first authorship is everything to us academics), and here I’ve added insult to indignity. My sincere apologies.