After London’s Terrorist Killing, Asking the Big ‘Why?’

Lee Rigby, murdered in the London district of Woolwich. UK Ministry of Defence, via Wikimedia

What is it that makes people capable of hacking another human being to death on a peaceful street? It is a question that demands asking after last week’s brutal murder of a British soldier. The suspects, captured on cellphone video, are two men who claimed they were avenging Muslims killed by British armed forces.

One easy answer is: Islam, or a bit more subtly, radical Islam. After the bombing of the Boston Marathon — whose perpetrators similarly cited U.S. military aggression against Muslims — conservative commentator Erik Rush called Islam “wholly incompatible with Western society.” Another alternative is to take the terrorists at their word and characterize these murderous acts as “blowback” resulting from Western imperialism. This is the position taken by people like Glenn Greenwald, who argues that although the U.S. isn’t totally to blame for the attacks by extremist Muslims on Western targets, it must accept the lion’s share of that blame.

Greenwald recently clashed with Bill Maher, another liberal commentator, on this matter. Greenwald certainly has a point, and is far more thoughtful than extremists like Erik Rush. Going back at least to the U.S.-backed coup in 1953 that ousted Iran’s democratically elected prime minister and made the Shah an absolute monarch, U.S. policy has indirectly fueled radicalism in Muslim countries. Britain and other European states have mucked around in the Middle East even longer. Nevertheless, Maher also makes the point that — in the twenty-first century at least — only Muslims react with such widespread violence to blasphemous writings or cartoons. This kind of fanaticism may be waning, however, and Western Christians are perfectly willing to murder innocents as well: recent examples include the Norwegian who massacred seventy-seven people to “protect” his country from Islam and multiculturalism, and the white supremacist who gunned down six of his fellow Americans in a Wisconsin Sikh temple. What motivates any of these terrorist murders?

The question is not easy to answer. What it comes down to, I suspect, is a combination of hate and fear, which feed upon each other. The violence begets more violence, a vicious cycle of bloodshed that becomes increasingly difficult to halt. What I can say with more certainty, though, is that we should be highly suspicious of anyone who claims to have a simple answer.

As many moderate Muslims know well, their communities need to be more vocal in standing up to fanaticism, and more willing to tolerate those who have different beliefs. Yes, there are religious extremists and terrorists among non-Muslims in the West as well. But the Muslim world has numerous theocratic states (Saudi Arabia, Iran, etc.), along with radical Islamist movements and insurgencies in a number of countries. So let’s not make false equivalencies.

On the other hand, Western governments need to help the moderates fight extremism in their countries by shrinking their own military footprints abroad. With the end of the U.S. presence in Iraq and, a year from now, a drastic reduction in the number of coalition soldiers in Afghanistan, that is already happening, but a greater drawdown there and elsewhere is needed.

At the same time, the U.S. cannot wall itself off from the world’s problems. It must protect its citizens (who include millions of Muslim Americans) from those violent extremists who would harm them — whatever the reason. And it has to figure out a way to do so that does not simply end up increasing the number of such people. Squaring that circle is the only way to end the cycle of violence and hate that has plagued relations between the Western and Muslim worlds for far too long.

Ian Reifowitz is the author of Obama’s America: A Transformative Vision of Our National Identity. Twitter: @IanReifowitz

 

Is it Time to Put Morality on the Market?

What Money Can't Buy, book cover

Over the last thirty years, Americans have seen an infusion of market thinking into areas that were previously governed by collective ethics and morality. Today, the drive to make a profit dictates the way we view things like health, education, national security, criminal justice, environmental protection, and even procreation. In What Money Can’t Buy: The Moral Limits of Markets, Harvard University professor Michael J. Sandel argues that markets have become detached from morals, and that it’s time we reconnect them. The book is an engaging exploration of where to draw the line between having a market economy and being a market society.

In the introduction, Sandel makes it clear that providing definitive answers to the questions he raises is not his intention. Instead, he views himself as the kickstarter of a much-needed public debate on markets and morality, and offers a philosophical framework in which we might have the conversation. The inquisitive title of Sandel’s book reinforces this position. For now, his focus is on highlighting the questions we haven’t been asking over the last three decades, but probably should have been.

So, what does economics have to do with morality? Since he’s the expert, I’ll let Sandel explain:

“Some of the good things in life are corrupted or degraded if we turn them into market commodities,” he argues.

If the role of markets were simply to allocate goods, Sandel would be hard-pressed to find an ethical objection to using an economic rationale to solve all our problems — but, he explains, the reach of markets goes beyond goods allocation to express and promote attitudes toward whatever is being exchanged. It is our job as members of a just society to interrogate what those attitudes are, and whether they reflect the values we want to promote in our culture. If we determine that the values are out of sync with the ethical standards of our culture, then we need to regulate the markets to avoid the unintentional promotion of morally questionable social norms.

For many Americans, regulation is a dirty word. But Sandel asks us to consider the idea of regulation in the context of the parameters we’ve already placed on things that currently cannot be bought and sold, such as human beings and civic duties. For example, it is illegal in the United States to sell one’s vote in an election or a child through adoption processes. These boundaries were not established by the rules of economics; they were established by our moral compass as citizens in a participatory democracy.

So, what values do our markets presently exude? And are we satisfied with that? Because Sandel isn’t. He believes we need more robust engagement in civic discourse around these issues.

“When we think of the morality of markets, we think first of Wall Street banks and their reckless misdeeds, of hedge funds and bail-outs and regulatory reform,” he writes. “But the moral and political challenge we face today is more pervasive and mundane — to rethink the role and reach of markets in our social practices, human relationships, and everyday lives.”

As funny as it is intellectually engaging, What Money Can’t Buy is an excellent point of entry for those concerned with addressing the challenges of markets and morality. It will sharpen your view of laissez-faire economics and of what is at stake for our society if we don’t intervene.

Mandy Van Deven was previously In The Fray’s managing editor. Site: mandyvandeven.com | Twitter: @mandyvandeven

 

The Pendulum of Curiosity: Why I Am a Writer

Graphic of Tina Vasquez

I recently came to the realization that my life is full of extremes, and those extremes facilitate my work as a writer. This revelation struck while I was sitting in bed on a Saturday night, simultaneously editing an e-learning course on fair housing laws and watching the America’s Cutest Cat countdown on Animal Planet. This brief indulgence in the hilarious and heartwarming antics of curious cats provoked a moment of self-reflection. I was compelled to consider the ways my own curiosity drives me, personally and professionally. Writers are known to be troublemakers, after all — though perhaps that label is unfair unless viewed in the right light.

As evidence of my unruly ways, I’d spent the previous weekend with a group of friends in San Francisco’s Castro District. I threw back doubles of Crown Royal in wonderfully seedy dives and chatted up the oddest strangers I could find. Essentially, I was looking for trouble. But in a way, I’m always looking for trouble, with or without the alcohol.

By all accounts, I am a responsible adult. During the day, I work, write, and volunteer for a women’s rehabilitation program. I go grocery shopping and cook for my aging father and great uncle. I walk the dog and feed the cat. When the sun sets, however, I get an all-too-familiar itch to seek out the untamed.

So, what does being a troublemaker mean anyway? For me, it means going places I’ve been told not to go, doing things I’ve been told not to do, talking to people I’ve been told not to talk to, and writing about it all with humility and compassion. This lifestyle is deemed unsuitable for a “good Latina” like me. Sometimes you have to toe the line, but other times you have to be willing to step over it and see where the other side leads.

My connection to outsiders started when I was young. I was always attracted to things that seemed out of place, pushed boundaries, or had clearly gone awry. When driving in downtown Los Angeles with my dad, he would lock the car doors and tell me to avert my eyes from the people who were struggling with homelessness, mental illness, addiction, and disease. But his warnings only widened my field of vision and amplified my interest in the troubled lives that were being vehemently ignored.

As a young adult, I spent hours driving around the same dodgy areas with a friend in the middle of the night. When that wasn’t getting me close enough to the action, I ditched the car to walk around on the streets. (This was about the same time Los Angeles Times columnist Steve Lopez wrote a series about Skid Row that would become one of my favorite pieces of journalism.)

I developed an unquenchable desire to understand how this hell on earth came to be. My questions eventually led to anger that my city had failed so many. That anger led me to discover that I had a gift for deep inquiry and exploration through writing.

Today, my curiosity fuels what I do for a living. It pushes me to want to know the who, what, where, when, why, and how of everything — the more disputed the topic, the more engaging it is to me. My goal is to write about people’s lives respectfully, never dehumanizing or exploiting them. I want to tell their stories as honestly as I can and shed a bit of light into some of society’s darker corners.

In many ways, I have been lucky that my curiosity hasn’t gotten me killed. It has placed me in more than a few unsafe situations. I’ve been in cars I shouldn’t have been in, with people I shouldn’t have been with. I’ve been cornered in dark alleys. I’ve been followed. I’ve had my life threatened. My flirtation with danger wasn’t a healthy courtship, and I am fortunate to have sidestepped a messy ending. Still, I go on to the next story.

Not all of my work is focused on situations of heartbreak and melancholy. In fact, much of what I write to pay the bills takes a lighter tone. Juggling this odd combination has landed me with innumerable moments of absurdity. Accidental offense is an on-the-job hazard.

While writing an article for my local newspaper, I went to an elementary school to observe a class of fourth graders. When fishing in my purse for a business card to give the classroom teacher, I accidentally pulled out one for a self-proclaimed “anal expert” I’d met in a bar a week earlier. The card pictured the man in a latex dog suit. Although I quickly pushed the card back into my bag — hoping the teacher hadn’t seen it — the look on her face indicated otherwise. I smiled self-consciously as I handed her the correct one.

I didn’t go to college to learn how to write. In fact, I didn’t finish college at all. Instead, I built my career on being curious and trusting my instincts. As a writer, the only thing about which you can be certain is that those two traits will guide you to where you need to be. And just like those comical kitties, I always seem to land on my feet.


When Our Information Changes

Shinzo Abe in crowd
Economic Abe. Via Wikimedia

It’s rare to see a macroeconomics experiment play out in real time in the way we are seeing it right now in Japan and Europe.  Prime Minister Shinzo Abe has embarked on aggressive measures to stimulate Japan’s long-moribund economy since he took office in December, and the result so far has been strong growth — and, perhaps, liftoff after a triple-dip recession. Europe, on the other hand, remains mired in the muck of austerity and economic contraction.

To briefly recap Japan’s economic woes: the Japanese economy has been largely stagnant for the last two decades. Since the financial crisis in 2008, it has gone through three bouts of negative growth. Its economic output per person — GDP per capita — was actually lower in 2012 than it was in 2008.

In the economics profession, this is what they refer to in technical terms as “not good.”

However, Japan’s economy surged in the first quarter of this year, growing at an annualized rate of 3.5 percent. For its part, the Abe administration credits a three-pronged economic strategy, dubbed Abenomics: “unprecedented monetary stimulus, a big boost to government spending, and structural reforms designed to make Japanese industry and institutions more competitive.”

Then there’s Europe, which refuses to shift away from austerity. Its economy shrank for the sixth consecutive quarter — its longest downturn since World War II.

“The real economy is responding [in Japan],” said Adam S. Posen, president of the Peterson Institute for International Economics in Washington. “The last five, six months, there’s been a mini consumer boom. All the things that people said could never happen in Japan have turned around.”

He added: “Japan’s central bank is supporting recovery, and it’s working. The European Central Bank is supporting stagnation, and it’s working.”

Some in Europe understand that austerity is the problem, not the solution. Unfortunately, that “some” does not include the people making the decisions:

“The elites in Europe don’t learn,” said Stephan Schulmeister, an economist with the Austrian Institute of Economic Research. “Instead of saying, ‘Something goes wrong, we have to reconsider or find a different navigation map, change course,’ instead what happens is more of the same.”

Schulmeister added that German Chancellor Angela Merkel — austerity’s champion and the one person who could push Europe to change course — is “not willing to learn” the lesson offered by Japan’s recent switch from contraction to growth.

Change in GDP, Japan: 2007-present
Change in GDP, Europe and the U.S.: 2005-2012

Apparently, Europe (read: Germany) sees austerity as a kind of “morality play” whereby the profligate must suffer for their sins. And yet the people most responsible for Europe’s economic crisis are the ones suffering the least from austerity. Although unemployment in the euro zone reached a new high in March, you don’t see bankers and politicians on the unemployment line. What’s really immoral is an austerity policy that punishes the innocent while one guilty party bails out the other.

Regardless of who is hurting, austerity often fails to achieve even its supposed goal: reducing government deficits. As Europe reminds us, it prevents recession-battered economies from growing. The alternative is to prime the economic pump by having governments engage in fiscal and monetary stimulus. When economies grow under this approach, Keynesian economists like Paul Krugman argue, governments collect more in the way of revenues, straightening out their finances faster than they would by reducing their spending.

Once a country’s economy is again operating at capacity, government should cut spending — and increase taxes on those who can afford it — in order to deal with the problem of deficits in a balanced, moral way that neither grievously harms the economically vulnerable nor sacrifices the long-term investments by government that are necessary to further growth over time.

The lessons to be drawn from the recession are counterintuitive. The dominant morality tells us to tighten our belts and save up. But if the government as well as the private sector hoards cash during a recession, the economy slows to a crawl. That is the kind of economic suicide Europe has committed: painful cuts, no growth, and rampant unemployment. America has avoided the worst of Europe’s fate thanks in part to the stimulus passed in 2009, and Japan, at last, looks to be hurtling in the opposite direction due to its recent stimulative policies. The key question is whether the pro-austerity politicians who currently control the purse strings in Washington and Brussels will take a hard look at the evidence accumulating around them — or retreat back into their comfortable, self-righteous views of the world.

John Maynard Keynes, the father of the proactive approach to economic policy that now bears his name, had something to say on this topic as well.  Responding to a critic who questioned his shifting position on monetary policy during the Great Depression, the British economist answered: “When my information changes, I alter my conclusions. What do you do, sir?”

Ian Reifowitz is the author of Obama’s America: A Transformative Vision of Our National Identity. Twitter: @IanReifowitz


The Blunter Edge of the Racial Wedge

 

Rand Paul speaks at a New Hampshire town hall
Senator Rand Paul speaks at a town hall in New Hampshire. Last month the Kentucky Republican visited Howard University, a historically black college, in an effort to reach out to the African American community. Gage Skidmore, via Wikimedia

The U.S. Census Bureau just released its report on voter turnout in America’s 2012 presidential elections. For the first time, the percentage of eligible blacks who voted surpassed that of eligible whites. Meanwhile, explosive growth in the country’s Asian and Hispanic populations continues to mean that those who go to the polls are increasingly nonwhite.

The turnout story is not just about Barack Obama running for president. In 1996, when the government began to collect this kind of data, whites outvoted blacks by eight percentage points. Black turnout has increased in every election since then.

The turnout rates for Hispanics and Asians — both just shy of 50 percent — continue to lag far behind the other two groups, with much smaller gains over the years. And yet their share of the voting public almost doubled over that same span of sixteen years, even as the white share of voters dropped nine percentage points, to 74 percent.

Furthermore, partisanship is becoming more racial and regional. In the last four elections, Republicans have tended to get just under three-fifths of the white vote, while Democrats have consistently drawn about nine-tenths of the black vote (only slightly higher with Obama on the ballot). Meanwhile, Hispanic and Asian voters have moved significantly toward Democrats. Between 2004 and 2012, the Asian Democratic vote jumped 17 points, to 73 percent, while the Hispanic Democratic vote jumped 18 points, to 71 percent. Across that same period of time, the white vote for Democrats was lower in the South than any other region, and lowest in the deepest Southern states (Louisiana, Mississippi, Alabama). 

It does not bode well for the GOP that its voters were almost 90 percent white in 2012.  If America’s minority voters continue to turn out for Democrats, and their share of the population continues to grow as rapidly as projected, it will become ever harder for Republicans to win the White House.

I am a progressive, but I don’t celebrate these trends. For the sake of this country’s multiethnic democracy, I want Republicans to do better among nonwhite voters. A society where ethnicity defines the political parties is doomed to disaster. The political process becomes a zero-sum game where each ethnic group fights for its share of the pie. Any commitment to a broader common good is lost, as is any sense that citizens of different backgrounds can come together and feel a strong patriotic bond.

My hope is that the GOP’s leaders read these numbers and adopt both a tone and policy stances that unite rather than divide. Too many on the right — from Rush Limbaugh to Mitt Romney to Sarah Palin — have sought to gin up white anxiety over demographic changes, to motivate white voters by fear.

Giving up this losing strategy is the best way to win over the growing ranks of minority voters. We’ll see in the coming months whether that happens. The impending vote over immigration reform will be a crucial test. But for the health of their party — and the health of our country — Republicans need to change.

Ian Reifowitz is the author of Obama’s America: A Transformative Vision of Our National Identity. Twitter: @IanReifowitz

 

A Month Burned from Memory


What does it feel like to go insane and not know why? In her memoir, Brain on Fire: My Month of Madness, author Susannah Cahalan describes what it is like in terrifying detail: “My body continued to stiffen as I inhaled repeatedly, with no exhale. Blood and foam began to spurt out of my mouth through clenched teeth.… This moment, my first serious blackout, marked the line between sanity and insanity. Though I would have moments of lucidity over the coming weeks, I would never again be the same person. This was the start of the dark period of my illness, as I began an existence in purgatory between the real world and a cloudy, fictitious realm made up of hallucinations and paranoia.”

At the time, Cahalan was twenty-four years old and working at the New York Post. Having climbed up slowly from an intern to a full-time news reporter, she was young, ambitious, and known for being confident and professional. Cahalan’s future was bright when she was suddenly struck by an affliction that stumped her, her family, and most medical professionals.

Cahalan uses her reporter’s skills to reconstruct the incidents surrounding her downward spiral, piecing together a period about which she has little or no recollection. Her few existing memories range from fuzzy half-truths to full-out hallucinations. She recounts paging through her father’s diary as if she were reading about a stranger.

Cahalan deftly weaves intimate moments together with intricate medical explanations of her condition; at times the book reads like a detective story. By meticulously retracing her own footsteps through seizures, rampant paranoia, and delusions, Cahalan engages her passion for research. She walks readers through her various misdiagnoses — including one doctor who insisted that alcoholism was to blame — before arriving at an accurate diagnosis, treatment, and recovery.

Brain on Fire makes for a gripping read. As Cahalan describes in her introduction, the book is “a journalist’s inquiry into that deepest part of the self — personality, memory, identity — in an attempt to pick up and understand the pieces left behind.”

After interviewing a host of doctors and experts around the globe, Cahalan was able to report every aspect of her illness and treatments — including her own brain surgery — in detailed yet accessible terms. “With a scalpel, Dr. Doyle made an S-shape incision, four centimeters from the midline of the scalp over the right frontal region. The arm of the S extended just behind my hairline,” she writes. “He parted the skin with a sharp blade and gripped each side with retractors.… The whole procedure took four hours.”

Brain on Fire is divided into three parts. The first leads us from the murky confusion of Cahalan’s initial seizures and bouts of paranoia through the fragmented and reconstructed memories of her time in the hospital. Cahalan writes with unflinching honesty, piecing together hospital records, her parents’ shared diary, video footage from her time in a monitored epilepsy ward, and her own disjointed scribblings. This timeline of events is interwoven with the narrative occurring within Cahalan’s own distorted mind, which is set apart in italics to differentiate the two realities.

During these highly personal accounts, Cahalan describes her hallucinations and paranoia. At one point, she obsessively searches her boyfriend’s apartment for proof of his alleged infidelity. We feel her panic and confusion escalate as the book progresses, and Cahalan struggles to maintain a sense of her authentic identity. “No one wants to think of herself as a monster,” she writes.

As Cahalan’s situation worsens, the heroic Dr. Souhel Najjar arrives on the scene. After a seemingly endless battery of tests using the latest technology, Dr. Najjar is able to solve the puzzle. Cahalan is diagnosed with a little-known, recently discovered autoimmune disease called anti-NMDAR encephalitis.

“Her brain is on fire,” Dr. Najjar tells Cahalan’s parents. “Her brain is under attack by her own body.”

Cahalan goes on to detail her bumpy road to recovery, in which she deals with the burden of “survivor’s guilt” — a kind of post-traumatic stress disorder — and a fear of relapse, which is said to occur in twenty percent of cases. It is frightening how little is known about this rare disease, and as Cahalan writes, “It just begged the question: If it took so long for one of the best hospitals in the world to get to this step, how many other people were going untreated, diagnosed with a mental illness or condemned to a life in a nursing home or a psychiatric ward?”

Aside from being an excellently written memoir, Brain on Fire is also a valuable case study of a rare neurological disease. Cahalan is the 217th person ever to be diagnosed with anti-NMDAR encephalitis, and her diagnosis occurred just two years after the disease was discovered. Brain on Fire and “My Mysterious Lost Month of Madness,” the article from which the book emerged, have been instrumental in helping more people receive a correct diagnosis and treatment for the disease.

Cahalan’s work raises many questions about the root of “madness” and how easily we sling the term about. For those struggling to make sense of a disease like hers, Brain on Fire offers guidance and understanding. For the rest of us, it’s a fascinating and well-told cautionary tale.

Jo Magpie is a freelance journalist, travel writer, and long-term wanderer currently based in Granada, Spain. Blog: agirlandherthumb.wordpress.com


Civilization and Its Peacemakers

Serb leader Slobodan Milošević
Serb leader Slobodan Milošević, who died in 2006. Via Wikimedia

After months of protracted negotiations, Kosovo and Serbia recently agreed to a compromise on sovereignty and autonomy that would end two decades of conflict. In extinguishing the last embers of war in what was Yugoslavia — the volatile, ethnically divided nation where the assassination of an Austrian archduke launched World War I, and where civil war throughout the nineties led to ethnic cleansing and other atrocities — Europe is nearing the end of its long journey to overcome its tribal enmities and build a cohesive, peaceful civilization.

These hopeful developments overseas have been on my mind recently. This semester, I’ve been teaching a course built around the debate within the West over human nature: What are we? What can we be? Why do we act the way we do? John Locke argued that we are born a blank slate, that our experiences and interactions form our character. Overall, Enlightenment thinkers believed people could, if properly educated, learn to act solely based on reason.

My students later encounter Friedrich Nietzsche, who praised the “will to power” as motivating the strong to dominate, and Sigmund Freud, who feared that our inclination toward aggression could destroy civilization. Freud believed that although we could be rational at times, we’d never “enlighten” away our instinctual impulses. He recognized Nazism as the extreme manifestation of these impulses, a system based on hate that rejected the idea of justice — that the strong must be prevented from subjugating the weak — on which rested his definition of civilization.

Centuries after the Enlightenment, we’ve arrived at a more humble view of the possibilities of reforming human nature. We’ve seen too much evil — above all, in the cataclysm of World War II — to expect a paradise of reason. Yet democracy, significant warts and all, stands virtually alone in a West that has rejected Nazism and communism. Although they don’t always live up to them, democracies operate from principles centered on equality before the law. Democracy proclaims that the strong cannot — by virtue of their will to power — claim the right to dominate the weak.

Serbia holds some of the last vestiges of Europe’s ancient blood feuds. In the 1990s, Serbs clung to the idea that racial superiority justified their rule over supposedly inferior neighbors. Serbian ethnic nationalism stirred up people’s base instincts, fomenting hate as a motivation for murder and conquest.

The European Union, by contrast, appealed to reason. It offered little in the way of emotional attraction or visceral triumphs, and drew on no traditional identities. During the 1990s, the EU’s expansion into Eastern Europe stood alongside the tribal bloodshed unleashed in the former Yugoslavia by Serb leader Slobodan Milošević.

One question I’ve posed in my class is whether Europeans will ultimately choose EU integration over ethnic nationalism. Membership in the European Union — which, despite the travails of Greece and Cyprus, offers the promise of greater prosperity — is a strong incentive to choose peace. And yet Freud’s concern about humanity’s indelible aggressive urges remains relevant. One country can drag a continent into darkness.

The key question for Serbia, in Freud’s terms, has been which part of its “mind” will triumph: the id — its nonrational instincts — or the superego — the part that suppresses those instincts in favor of pursuing the norms of “civilization” and the material benefits that accompany it.

The compromise between Serbia and Kosovo is a sign that reason has won out. The EU brokered the agreement, and made clear that its acceptance removes the existing roadblocks to membership for both countries. Each one moved off its maximalist positions — despite the emotional cost of those concessions — because the benefits outweighed that cost. That’s a rational decision of the kind Freud wasn’t confident societies would make.

Civilization will always have challenges to overcome, but the end of racial wars of conquest in a continent long riven by them gives hope that humanity is, finally, making progress.

Ian Reifowitz is the author of Obama’s America: A Transformative Vision of Our National Identity. Twitter: @IanReifowitz