Stanford University is one of eight schools where wealthy parents fraudulently secured spots for their children as part of a nationwide college admissions bribery scheme. HarshLight, via Flickr
The college admissions scandal that implicated Hollywood stars and other wealthy parents produced its first convictions in September, with actor Felicity Huffman among the growing list of those sentenced to prison time (in Huffman’s case, a term of just fourteen days) for engaging in bribery and fraud to get their children into a selective college. The nature of this scandal—which involved FBI wiretaps, paid-off SAT proctors, and even doctored photos of students playing sports—turned an intense media spotlight on the spectacularly unethical behavior of certain well-off families. But the scandal is a symptom of a much deeper problem in modern American life: widening income inequality and the destructive competition it engenders across the class divide.
When income inequality rises, the stakes of the economic game rise. Where children end up along a steep gradient of academic achievement matters all the more for their chances later in life. For example, in 2018, edging your way into the top 5 percent of earners would have made your household $119,000 richer than one that had just made it into the top 20 percent; back in 1978, that difference was just $56,000 in inflation-adjusted dollars. Because every step up the ladder pays off more, parents feel greater pressure to do all they can to improve their kids’ prospects. The payoff for cheating grows, too—even elaborate frauds of the sort that William Rick Singer and his team allegedly perpetrated to get his high-profile clients’ kids into Stanford, Yale, the University of Southern California, and other schools. (Singer, who pleaded guilty to fraud and a host of other criminal charges in March, admitted to bribing university administrators and colluding with wealthy parents to secure admission for their children.)
Beyond the ranks of celebrities and the elite, economic anxieties abound. It has become commonplace to observe that children from middle-class families are less likely to achieve a better standard of living than their parents. And as those chances dwindle, a greater burden falls on children and their parents to ensure their future success.
Not all millennials are this damn good-looking—and not all of them are struggling in the ways that working-class millennials generally are. Photo via Flickr.
Why can’t millennials afford their own homes? Reading much of the popular press, one is led to believe it’s their unrealistic expectations, indulgent spending, and general allergy to adulthood that have trapped them in a renter’s purgatory. Nebraska senator Ben Sasse wrote a whole book, The Vanishing American Adult, in which he argued that young people today are stuck in a Peter Pan–like state of carefree childhood, spending their time playing video games, buying stuff, and snapping selfies—even posting ironic memes about “adulting”—rather than seeking meaning in career, family, and a stable home.
It is true that millennials have been slower to reach various milestones on the way to an all-American adulthood—including buying a home—than prior generations at the same point in their lives. For example, Americans ages eighteen to thirty-four are now more likely to be living with their parents than in any other housing arrangement, according to 2014 data from the Pew Research Center (which defines millennials as those born between 1981 and 1996). That has never been the case before, according to census data going back to 1880.
A more rigorous explanation for their failure to launch is that the Great Recession stunted millennials’ economic lives at a critical age. As a result, they’re still struggling to obtain gainful employment in a more demanding labor market, or find affordable housing in a contracted mortgage market. In other words, it’s not their lousy values—it’s their lousy economic prospects.
But this line of argument, too, misses something crucial. Its focus on middle-class, if downwardly mobile, millennials obscures just how diverse a generation millennials are. Not all of them are into Snapchat and kombucha, that’s for sure, but what’s less appreciated is that not even a majority of them are college-educated. Barely four out of ten Americans between ages twenty-five and thirty-four—members of what will be the most academically credentialed generation ever—have a bachelor’s degree. Thanks in part to the country’s widening income gap, the picture of “how millennials are doing” is dramatically different depending on which segment of the population you happen to be looking at. And to an overlooked degree, what determines whether, when, and how members of this generation attain the traditional markers of adulthood—a house and career, marriage and kids—is one factor: class.
The Captured Economy: How the Powerful Enrich Themselves, Slow Down Growth, and Increase Inequality
By Brink Lindsey and Steven M. Teles
Oxford University Press. 232 pages.
The rents are too damn high. That’s the conclusion of Brink Lindsey (of the center-right Niskanen Center) and Steven M. Teles (of Johns Hopkins University and Niskanen) in their book The Captured Economy: How the Powerful Enrich Themselves, Slow Down Growth, and Increase Inequality. By “rents,” Lindsey and Teles don’t mean what you’re late in paying your landlord, but rather “rent” as economists understand it: profits in excess of what a free market would normally allow. In recent years, they argue, large corporations and wealthy individuals have taken larger and larger slices of the economic pie not by creating things of value—inventing the next iPhone-like innovation, say—but by using government policies to quash competition. This involves not just “regulatory capture” (a social-science term for when the industry fox watches the consumer henhouse) but a broader takeover, with all levels of the government—both those who write the rules, and those who enforce them—bending the knee to particular business interests or organized elites.
In his new book The Vanishing Middle Class, MIT economist Peter Temin provides a short and accessible take on this country’s deeply unequal economy, which he argues now represents two different Americas. The first comprises the country’s elite workers: well-educated bankers, techies, and other highly skilled workers and managers, members of what he calls the “finance, technology, and electronics sector” (FTE)—the leading edges of the modern economy. A fifth of America’s population, these individuals command six-figure incomes and dominate the nation’s political system, and over the past half-century, they have taken a greater and greater share of the gains of economic growth. The other America, what he calls the “low-wage sector,” is the rest of the population—the dwindling ranks of clerks, assemblers, and other middle-income workers, and an expanding class of laborers, servers, and other poorly paid workers.
While there are many reasons why Donald Trump won the election, it’s clear that the movement of the white working class away from the Democratic Party had something to do with it. Given that this demographic seems to have put Trump over the top in the Electoral College, what do we expect his administration’s policies to do for this group—and for the working class (which, importantly, is increasingly nonwhite) more broadly?
Hopewell-Mann is a predominantly Latino neighborhood in the predominantly Latino city of Santa Fe. Close enough to downtown to make it a short commute, yet a world away so that tourism doesn’t quite reach it, it’s a stark reminder of some of the inequalities present in this city. While the stunning adobe architecture downtown looks like it’s been preserved in aspic, Hopewell-Mann’s main drag is lined with big-box stores, fast-food restaurants, and cheap motels offering month-to-month leases. The neighborhood attracts a mix of the transient and the locally displaced, and not surprisingly, people downtown tend to avoid it.
A protester-made statue bearing the Spanish words for "dignity" and "fight" stands outside the Chicago Board of Trade building following a march in favor of a higher minimum wage. Scott L, via Flickr
This weekend, low-wage workers from around the country will be arriving in my city, Richmond, to make a case for increasing the minimum wage. It’s the first-ever national convention for the Fight for $15 movement, which in the past few years has launched wide-ranging strikes and protests to raise awareness about how a $7.25-an-hour wage—the current federal minimum—just doesn’t cut it for many workers struggling to make ends meet for themselves and their families.
There’s a long line of economic arguments in favor of, and opposed to, increases in the minimum wage. Among other things, opponents say it will raise prices for consumers, cause employers to slash jobs or cut back on workers’ hours, and put many companies out of business. Advocates say it will help the economy by giving workers more money to spend in their communities, encouraging the unemployed to seek out work, and reducing the stress and anxiety the working poor deal with, as well as their reliance on government benefits.
As important as the economic impacts of this policy are, however, it’s even more important to consider its cultural and moral implications. After all, that’s what drives much of the widespread public support for increasing the minimum wage, even among people who have never heard of, say, the elasticities of labor supply and demand. Many Americans just don’t think it is right that people who work hard should have to struggle so hard.
To be sure, the research on the minimum wage gives us little reason to despair—or cheer—over its impact on the economy. The most rigorous studies seem to suggest that it doesn’t make a big difference in terms of employment and growth. A 2014 open letter signed by 600 economists, including seven Nobel laureates, advocated raising the minimum wage to $10.10, noting that the “weight of evidence” showed “little or no negative effect” on employment for minimum-wage workers. Meanwhile, the increase would lift wages for them and likely “spill over” to other low-wage workers, too, possibly stimulating the economy to a “small” degree, the economists wrote.
Most recently, a University of Washington study of the increase in Seattle’s minimum wage to $11—on its way to $15 in 2017—tried to sort out the impact of the wage hike alone, sifting away the effects of other changes in the economy occurring at the same time. It found mixed results: slightly higher wages, but slightly fewer hours; somewhat less employment, but no increase in business closings.
Make of these studies what you will, but it’s hard to argue that the sky is falling in places where wage policies have changed. And while a higher minimum wage will give low-wage workers fatter paychecks, it obviously cannot, by itself, pull the working class out of its decades-long malaise of stagnant wages and growing insecurity.
These economic analyses provide important context, but the policy question really boils down to one of values. America has always prided itself on being founded on principles rather than a single cultural persuasion, and Americans have held onto few principles as steadfastly as the value of hard work. An honest day’s toil should get you by. And yet we have millions of Americans who work full-time and are still in poverty. We have millions working at global corporations like Walmart and McDonald’s that pay their workers so little that their business models rely on government to pick up the tab—by providing Medicaid, food stamps, refundable tax credits, and the like.
Adapting our laws and our economy to match our principles will take time. With any change, there will be some who gain, and some who lose out, more than others. But overall society will be better off—and it’s not just because some people will make more than they used to.
When we pay living wages, the culture changes, too. As Katherine Newman found in her classic study of fast-food workers, No Shame in My Game, part of what makes it hard to take a low-wage job is not that people don’t want to work—it’s that society has such disdain for those making chump change behind a McDonald’s counter or in a Walmart stockroom. (This is also one reason that immigrants—who aren’t under the same sorts of social pressures as the native-born—will do the poorly paid jobs others won’t.)
In the research for my book about the long-term unemployed in America and Canada, I came across one man out of work for more than a year after the car-parts plant that employed him shut down. He had avoided having to live on the street by moving into his mom’s house. When I spoke to him, he had just given away his last unemployment check to his daughter so that she could have something of a normal Christmas.
“I’m forty-three years old and living off my mother,” he told me. He was ashamed about accepting his family’s help, but he felt he had to do it. What he wasn’t willing to do, though, was work at a fast-food restaurant. He had put in twelve years at a respectable job, he pointed out. “I don’t want to throw on a goofy hat.”
If we believe that certain jobs are so undignified that we won’t even pay someone a decent wage to do them, then we shouldn’t be surprised that people with a decent amount of self-respect won’t do them. Opponents of raising the minimum wage seem to be blind to this. They talk about the economic pros and cons of wage laws as if those were the only things that matter. But people in the real world don’t just have balance sheets, they also have pride.
If you don’t think that making economic policy based on principle is realistic, then consider the extent to which it has already occurred—in the direction of greater income inequality. In 1965, CEOs made 20 times as much as a typical worker, according to the Economic Policy Institute; in 2014, they made 300 times as much. Part of this shift was due to global competition and changes in labor and financial markets, but some of it can be linked to the dwindling sense of obligation that those at the top now have toward their workers, as Mark Mizruchi and other scholars have noted.
As many of today’s corporate leaders see it, making obscenely larger amounts of money than their employees do is no longer cause for guilt. The boardroom culture tells them they deserve it. And so they continue to push for changes in tax laws to make sure the economy’s outcomes reflect their own principles of self-profit.
Indeed, in other rich countries with different social norms, the gap between CEO and worker pay is nowhere near as extreme—and the minimum wage tends to be much higher, too. These countries have clear notions of what’s fair and appropriate to pay for a day’s work, and they have chosen to pursue practices and policies in line with those beliefs.
Even those of us who want government to do more for the working poor often forget the importance of this broader cultural context. Yes, we should take advantage of targeted, technocratic solutions such as earned-income tax credits that make low-wage work pay better. But it should trouble us that these policies often amount to having the government subsidize employers who refuse to foot any extra labor costs. Furthermore, having a company pay a higher wage and having the government supplement that wage are very different things. Or at least they are when we look from the vantage point of flesh-and-blood human beings—as opposed to that of the rational-actor stick men in economic models. We brag about our paychecks, not our tax credits.
What we pay those at the bottom also has something to say about the dignity and connectedness of our society as a whole. If every wage is a living wage, those of us who are more fortunate won’t be living in such a different world from those sweeping our floors and serving our food. An entry-level job won’t be such a laughable and undignified proposition that a kid in a poor town or neighborhood won’t even consider taking it over a flashier (and deadlier) gig on the corner. If we think people are worth more than a pittance, they will act that way—and treat others that way.
In a sense, it’s fitting that Richmond, the former capital of the Confederacy, a city with a history of stark racial and economic inequalities, should host the Fight for $15 convention. The old plantation-based economy disappeared not because it wasn’t profitable. It disappeared because it wasn’t just. If we truly believe in our values, we should make our economy reflect them.
Dr. Martin Luther King delivering his "I Have a Dream" speech in Washington on August 28, 1963. National Archives and Records Administration, via Wikimedia
All the discussions today of how much racial progress we’ve made since Dr. Martin Luther King was alive reminded me of a disturbing point about the black−white health gap mentioned in recent research, some of which I discussed in an Atlantic essay over the weekend.
According to the Centers for Disease Control, African Americans have been catching up with whites in terms of life expectancy at birth. So things are looking up, right?
Yes, and no. To a sizeable extent, what explains the narrowing of the life-expectancy gap in the last couple decades is not just that things are better for African Americans (though they have improved), but also that things are worse for whites—working-class whites above all.
A New York Times piece over the weekend highlighted this fact. “A once yawning gap between death rates for blacks and whites has shrunk by two-thirds”—but that’s not because both groups are doing better, according to the article. Overall mortality has declined for African Americans of all ages, but it has risen for most whites (specifically, all groups except men and women ages 54-64 and men ages 35-44).
Furthermore, younger whites (ages 25-34) have seen the largest upticks in deaths, largely because of soaring rates of drug overdoses, and those who have little education are dying at the highest rates. The mortality rate has dropped for younger African Americans, a decline apparently driven by lower rates of death from AIDS. Together these trends have cut the demographic distance between the two groups substantially.
For middle-aged African Americans, the progress in improving health outcomes implied by the shrinking black−white mortality gap is also less cause for celebration than it might seem at first.
A much-discussed study last year by the economists Anne Case* and Angus Deaton found that huge spikes in deaths by suicide and drug poisonings over the last couple decades have meant that the trend of declining mortality rates we’ve seen for generations actually reversed for whites ages 45-54 between 1999 and 2013. Again, those with little education were hit the hardest.
In my Atlantic piece, I pointed out that the growing social isolation and economic insecurity of the white working class might explain some of these trends. One of the caveats I mentioned is that death and disease rates remain much higher among African Americans and Latinos. (I should have been more precise in the article: although Latinos have higher rates of chronic liver disease, diabetes, obesity, and poorly controlled high blood pressure, they have lower rates of cancer and heart disease, and lower or at least equivalent rates of death.)
But it’s not just that the black−white gap persists. Here’s an important passage from Case and Deaton’s paper:
Over the 15-[year] period, midlife all-cause mortality fell by more than 200 per 100,000 for black non-Hispanics, and by more than 60 per 100,000 for Hispanics. By contrast, white non-Hispanic mortality rose by 34 per 100,000. CDC reports have highlighted the narrowing of the black−white gap in life expectancy. However, for ages 45–54, the narrowing of the mortality rate ratio in this period [1999−2013] was largely driven by increased white mortality; if white non-Hispanic mortality had continued to decline at 1.8% per year, the ratio in 2013 would have been 1.97. The role played by changing white mortality rates in the narrowing of the black−white life expectancy gap (2003−2008) has been previously noted. It is far from clear that progress in black longevity should be benchmarked against US whites.
Let me reiterate their point: for Americans ages 45-54, the narrowing in the black−white gap in life expectancy in recent decades was “largely driven” by more deaths among whites.
It’s heartening that overall life expectancy is increasing for many Americans, including African Americans. But it’s also important to remember that, almost a half century after King’s death, people of all races continue to be left out of this country’s progress, and some—whites and nonwhites—may, in fact, be seeing an unprecedented step backward.
* I want to apologize to Dr. Anne Case for mistakenly identifying her as “Susan Case” in the original version of my article in the Atlantic. (The only reason I can think of for why I made that dumb mistake is that a friend of mine is named Susan Caisse.) This brilliant scholar has already suffered the injustice of having her study erroneously called the “Deaton and Case study” rather than the “Case and Deaton study” (for better or worse, first authorship is everything to us academics), and here I’ve added insult to indignity. My sincere apologies.
Best of In The Fray 2015. As they head into what should be their golden years, many older immigrants still work low-wage jobs and remain undocumented. Unable to save up or receive benefits for the elderly, they can do little but hope they stay healthy and employable. Part two of a two-part series.
“Nursing homes are sad places. People are abandoned there,” says Gloria Murray, sixty-six, a Jamaican immigrant who worked for more than two decades as a health aide at a nursing home. Over the course of her career, Murray became close to many of her clients. It was important to her that they be shown kindness and respect. In Jamaica, she says, “we take care of our old.”
Yet as Murray grew old herself, she quickly learned that no one was going to take care of her. In 2010, a fire destroyed her home in New York. Homeless for two years, she struggled to navigate the city’s shelter system. Life there was unbearable, she says: “It was drugs, pimps, the whole lot. I never knew it would come to this.”
Dana Ullman is a freelance photographer based in Brooklyn. Her photography is focused on social engagement: chronicling everyday epics, investigating subjects cross-culturally, and humanizing faceless statistics through storytelling. Site: ullmanphoto.com
I wrote an essay that appeared in the Atlantic yesterday. Based on the research for my book on unemployment, the piece talks about the debate over Denmark in last week’s Democratic presidential debate—and how the real debate should be over Canada:
Clearly, America won’t expand its social safety net to anywhere near the scale of Denmark’s over the next president’s time in office. Judging from their rhetoric in the debate, though, Clinton and Sanders both agree that government can and should play an important role in extending economic opportunities more broadly. Canada’s approach to policy shows us some of the practical ways a country can do that—without having to go far from our roots as a New World society of dreamers and strivers.
Today’s federal election in Canada should be interesting: will Canada move in the direction of America, or vice versa? (That said, as my friend Barry Eidlin reminded me, the provinces have a lot of say in putting forward policies of their own—to help the employed and unemployed alike—and so some things probably won’t change, regardless of the outcome.)
Unfortunately, writing the headline for this post put the South Park song “Blame Canada” in my head. Here is the video, so that you can share in my pain (NSFW, obviously):