At one extreme, there is the for-profit company owned by investors and run by managers in a top-down fashion. At the other extreme, there are what the sociologist Joyce Rothschild calls “collectivist-democratic organizations.” This latter category includes worker cooperatives, consumer cooperatives, and social movements built on democratic principles. What these groups share is some form of collective ownership, a commitment to democratic decision-making, a communal spirit, and a focus on values and goals other than just making a profit.
The economic impact of market restrictions prompted by the pandemic—not to mention the coronavirus’s broader toll of more than 200,000 American deaths and other losses from ruined health and well-being—will likely linger well into the next president’s term. In the meantime, the pandemic appears to be accelerating trends toward greater income and wealth inequality within the country. U.S. billionaires have fared spectacularly well under the lockdown, having increased their wealth by $931 billion since March, according to data from Forbes analyzed by Chuck Collins and his collaborators. A report by the anti-poverty group Oxfam estimates that Amazon CEO Jeff Bezos now has so much money that he could pay each of his employees a six-figure bonus and still have more wealth than he had in March. Meanwhile, less advantaged Americans have been hit hard by the lingering downturn. Although stimulus checks and temporary expansions of unemployment benefits for a time worked well to mitigate the damage, poverty rates have recently spiked. Covid-19 cases, hospitalizations, and fatalities are disproportionately high among people of color. And while high-wage earners have recouped almost all their job losses, employment among low-wage earners remains almost a fifth lower than it was at the start of the pandemic, according to an analysis by Raj Chetty and other researchers.
Amid this upheaval, the next president will make policy decisions with major implications for whether the gap between the rich and poor in this country grows or narrows. Joe Biden and Donald Trump have put forward two starkly different visions for the country’s economy—particularly with regard to tax policy, which will dramatically shape income and wealth inequality over the next decade. In general, Trump argues that the tax cuts on high earners that his administration pushed through in 2017 should be extended, which he believes will lead to greater economic growth. Biden supports rolling back tax cuts for those who earn more than $400,000, saying on the campaign trail that the wealthy need to pay their “fair share.” The continued impact of the coronavirus on the economy will complicate these policy decisions moving forward, but we can sketch out the sort of agenda each candidate will likely pursue once in office—based on their stated proposals as well as their records in office—and the possible impact of a Biden or Trump presidency on economic inequality.
Timothy Beryl Bland, PhD, is a writer based in Richmond. For his doctorate in public policy and administration from Virginia Commonwealth University, he researched the influence of think tanks.
In his new book The Vanishing Middle Class, MIT economist Peter Temin provides a short and accessible take on this country’s deeply unequal economy, which he argues now represents two different Americas. The first comprises the country’s elite workers: well-educated bankers, techies, and other highly skilled workers and managers, members of what he calls the “finance, technology, and electronics sector” (FTE)—the leading edges of the modern economy. A fifth of America’s population, these individuals command six-figure incomes and dominate the nation’s political system, and over the past half-century, they have taken a greater and greater share of the gains of economic growth. The other America, what he calls the “low-wage sector,” is the rest of the population—the dwindling ranks of clerks, assemblers, and other middle-income workers, and an expanding class of laborers, servers, and other lowly paid workers.
While there are many reasons why Donald Trump won the election, it’s clear that the movement of the white working class away from the Democratic Party had something to do with it. Given that this demographic seems to have put Trump over the top in the Electoral College, what do we expect his administration’s policies to do for this group—and for the working class (which, importantly, is increasingly nonwhite) more broadly?
A protester-made statue with the Spanish words "dignity" and "fight" stands outside the Chicago Board of Trade building following a march in favor of a higher minimum wage. Scott L, via Flickr
This weekend, low-wage workers from around the country will be arriving in my city, Richmond, to make a case for increasing the minimum wage. It’s the first-ever national convention for the Fight for $15 movement, which in the past few years has launched wide-ranging strikes and protests to raise awareness about how a $7.25-an-hour wage—the current federal minimum—just doesn’t cut it for many workers struggling to make ends meet for themselves and their families.
There’s a long line of economic arguments in favor of, and opposed to, increases in the minimum wage. Among other things, opponents say it will raise prices for consumers, cause employers to slash jobs or cut back on workers’ hours, and put many companies out of business. Advocates say it will help the economy by giving workers more money to spend in their communities, encouraging the unemployed to seek out work, and reducing the stress and anxiety the working poor deal with, as well as their reliance on government benefits.
As important as the economic impacts of this policy are, however, it’s even more important to consider its cultural and moral implications. After all, that’s what drives much of the widespread public support for increasing the minimum wage, even among people who have never heard of, say, the elasticities of labor supply and demand. Many Americans just don’t think it is right that people who work hard should have to struggle so hard.
To be sure, the research on the minimum wage gives us little reason to despair—or cheer—over its impact on the economy. The most rigorous studies seem to suggest that it doesn’t make a big difference in terms of employment and growth. A 2014 open letter signed by 600 economists, including seven Nobel laureates, advocated raising the minimum wage to $10.10, noting that the “weight of evidence” showed “little or no negative effect” on employment for minimum-wage workers. Meanwhile, the increase would lift wages for them and likely “spill over” to other low-wage workers, too, possibly stimulating the economy to a “small” degree, the economists wrote.
Most recently, a University of Washington study of the increase in Seattle’s minimum wage to $11—on its way to $15 in 2017—tried to sort out the impact of the wage hike alone, sifting away the effects of other changes in the economy occurring at the same time. It found mixed results: somewhat higher wages but somewhat fewer hours, and somewhat less employment but no increase in business closings.
Make of these studies what you will, but it’s hard to argue that the sky is falling down in places where wage policies have changed. And while a higher minimum wage will give low-wage workers fatter paychecks, it obviously cannot, by itself, pull the working class out of its decades-long malaise of stagnant wages and growing insecurity.
These economic analyses provide important context, but the policy question really boils down to one of values. America has always prided itself on being founded on principles rather than a single cultural persuasion, and Americans have held onto few principles as steadfastly as the value of hard work. An honest day’s toil should get you by. And yet we have millions of Americans who work full-time and are still in poverty. We have millions working at global corporations like Walmart and McDonald’s that pay their workers so little that their business models rely on government to pick up the tab—by providing Medicaid, food stamps, refundable tax credits, and the like.
Adapting our laws and our economy to match our principles will take time. With any change, there will be some who gain, and some who lose out, more than others. But overall society will be better off—and it’s not just because some people will make more than they used to.
When we pay living wages, the culture changes, too. As Katherine Newman found in her classic study of fast-food workers, No Shame in My Game, part of what makes it hard to take a low-wage job is not that people don’t want to work—it’s that society has such disdain for those making chump change behind a McDonald’s counter or in a Walmart stockroom. (This is also one reason that immigrants—who aren’t under the same sorts of social pressures as the native-born—will do the poorly paid jobs others won’t.)
In the research for my book about the long-term unemployed in America and Canada, I came across one man out of work for more than a year after the car-parts plant that employed him shut down. He had avoided having to live on the street by moving into his mom’s house. When I spoke to him, he had just given away his last unemployment check to his daughter so that she could have something of a normal Christmas.
“I’m forty-three years old and living off my mother,” he told me. He was ashamed about accepting his family’s help, but he felt he had to do it. What he wasn’t willing to do, though, was work at a fast-food restaurant. He had put in twelve years at a respectable job, he pointed out. “I don’t want to throw on a goofy hat.”
If we believe that certain jobs are so undignified that we won’t even pay someone a decent wage to do them, then we shouldn’t be surprised that people with a decent amount of self-respect won’t do them. Opponents of raising the minimum wage seem to be blind to this. They talk about the economic pros and cons of wage laws as if those were the only things that matter. But people in the real world don’t just have balance sheets, they also have pride.
If you don’t think that making economic policy based on principle is realistic, then consider the extent to which it has already occurred—in the direction of greater income inequality. In 1965, CEOs made 20 times more than a typical worker, according to the Economic Policy Institute; in 2014, they made 300 times more. Part of this shift was due to global competition and changes in labor and financial markets, but some of it can be linked to the dwindling sense of obligation that those at the top now have toward their workers, as Mark Mizruchi and other scholars have noted.
As many of today’s corporate leaders see it, making obscenely larger amounts of money than their employees do is no longer cause for guilt. The boardroom culture tells them they deserve it. And so they continue to push for changes in tax laws to make sure the economy’s outcomes reflect their own principles of self-profit.
Indeed, in other rich countries with different social norms, the gap between CEO and worker pay is nowhere near as extreme—and the minimum wage tends to be much higher, too. These countries have clear notions of what’s fair and appropriate to pay for a day’s work, and they have chosen to pursue practices and policies in line with those beliefs.
Even those of us who want government to do more for the working poor often forget the importance of this broader cultural context. Yes, we should take advantage of targeted, technocratic solutions such as earned-income tax credits that make low-wage work pay better. But it should trouble us that these policies often amount to having the government subsidize employers who refuse to foot any extra labor costs. Furthermore, having a company pay a higher wage and having the government supplement that wage are very different things. Or at least they are when we look from the vantage point of flesh-and-blood human beings—as opposed to that of the rational-actor stick men in economic models. We brag about our paychecks, not our tax credits.
What we pay those at the bottom also has something to say about the dignity and connectedness of our society as a whole. If every wage is a living wage, those of us who are more fortunate won’t be living in such a different world from those sweeping our floors and serving our food. An entry-level job won’t be such a laughable and undignified proposition that a kid in a poor town or neighborhood won’t even consider taking it over a flashier (and deadlier) gig on the corner. If we think people are worth more than a pittance, they will act that way—and treat others that way.
In a sense, it’s fitting that Richmond, the former capital of the Confederacy, a city with a history of stark racial and economic inequalities, should host the Fight for $15 convention. The old plantation-based economy disappeared not because it wasn’t profitable. It disappeared because it wasn’t just. If we truly believe in our values, we should make our economy reflect them.
Dr. Martin Luther King delivering his "I Have a Dream" speech in Washington on August 28, 1963. National Archives and Records Administration, via Wikimedia
All the discussions today of how much racial progress we’ve made since Dr. Martin Luther King was alive reminded me of a disturbing point about the black−white health gap mentioned in recent research, some of which I discussed in an Atlantic essay over the weekend.
According to the Centers for Disease Control, African Americans have been catching up with whites in terms of life expectancy at birth. So things are looking up, right?
Yes, and no. To a sizeable extent, what explains the narrowing of the life-expectancy gap in the last couple decades is not just that things are better for African Americans (though they have improved), but also that things are worse for whites—working-class whites above all.
A New York Times piece over the weekend highlighted this fact. “A once yawning gap between death rates for blacks and whites has shrunk by two-thirds”—but that’s not because both groups are doing better, according to the article. Overall mortality has declined for African Americans of all ages, but it has risen for most whites (specifically, all groups except men and women ages 54-64 and men ages 35-44).
Furthermore, younger whites (ages 25-34) have seen the largest upticks in deaths, largely because of soaring rates of drug overdoses, and those who have little education are dying at the highest rates. The mortality rate has dropped for younger African Americans, a decline apparently driven by lower rates of death from AIDS. Together these trends have cut the demographic distance between the two groups substantially.
For middle-aged African Americans, the progress in improving health outcomes implied by the shrinking black−white mortality gap is also less cause for celebration than it might seem at first.
A much-discussed study last year by the economists Anne Case* and Angus Deaton found that huge spikes in deaths by suicide and drug poisonings over the last couple decades have meant that the trend of declining mortality rates we’ve seen for generations actually reversed for whites ages 45-54 between 1999 and 2013. Again, those with little education were hit the hardest.
In my Atlantic piece, I pointed out that the growing social isolation and economic insecurity of the white working class might explain some of these trends. One of the caveats I mentioned is that death and disease rates remain much higher among African Americans and Latinos. (I should have been more precise in the article: although Latinos have higher rates of chronic liver disease, diabetes, obesity, and poorly controlled high blood pressure, they have lower rates of cancer and heart disease, and lower or at least equivalent rates of death).
But it’s not just that the black−white gap persists. Here’s an important passage from Case and Deaton’s paper:
Over the 15-[year] period, midlife all-cause mortality fell by more than 200 per 100,000 for black non-Hispanics, and by more than 60 per 100,000 for Hispanics. By contrast, white non-Hispanic mortality rose by 34 per 100,000. CDC reports have highlighted the narrowing of the black−white gap in life expectancy. However, for ages 45–54, the narrowing of the mortality rate ratio in this period [1999−2013] was largely driven by increased white mortality; if white non-Hispanic mortality had continued to decline at 1.8% per year, the ratio in 2013 would have been 1.97. The role played by changing white mortality rates in the narrowing of the black−white life expectancy gap (2003−2008) has been previously noted. It is far from clear that progress in black longevity should be benchmarked against US whites.
Let me reiterate their point: for Americans ages 45-54, the narrowing in the black−white gap in life expectancy in recent decades was “largely driven” by more deaths among whites.
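Case and Deaton’s counterfactual can be made concrete with a little arithmetic. The sketch below uses approximate midlife mortality levels as illustrative placeholders—the quoted passage gives only the changes (a rise of 34 and a fall of more than 200 per 100,000), not the exact baseline rates—to show how a continued 1.8 percent annual decline in white mortality would have left the black−white ratio near the 1.97 figure they report:

```python
# Illustrative sketch of the Case-Deaton counterfactual: what would the
# black-white mortality ratio for ages 45-54 have been in 2013 if white
# non-Hispanic mortality had kept falling 1.8% per year instead of rising?
# Baseline levels below are approximate placeholders (deaths per 100,000);
# the quoted passage reports only the changes, not the exact levels.

white_1999 = 381.0            # assumed white non-Hispanic midlife rate, 1999
black_1999 = 797.0            # assumed black non-Hispanic midlife rate, 1999

white_2013_actual = white_1999 + 34    # rate rose by 34 per 100,000
black_2013 = black_1999 - 215          # rate fell by more than 200 per 100,000

# Counterfactual: 1.8% annual decline compounded over the 14 year-to-year
# steps between 1999 and 2013
white_2013_counterfactual = white_1999 * (1 - 0.018) ** 14

actual_ratio = black_2013 / white_2013_actual
counterfactual_ratio = black_2013 / white_2013_counterfactual

print(f"actual 2013 ratio: {actual_ratio:.2f}")
print(f"counterfactual 2013 ratio: {counterfactual_ratio:.2f}")
```

With these placeholder levels, the counterfactual ratio comes out at roughly 1.97, in line with the paper’s figure, while the actual ratio is much lower—illustrating that the apparent narrowing owes as much to rising white mortality as to falling black mortality.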
It’s heartening that overall life expectancy is increasing for many Americans, including African Americans. But it’s also important to remember that, almost a half century after King’s death, people of all races continue to be left out of this country’s progress, and some—whites and nonwhites—may, in fact, be seeing an unprecedented step backward.
* I want to apologize to Dr. Anne Case for mistakenly identifying her as “Susan Case” in the original version of my article in the Atlantic. (The only reason I can think of for why I made that dumb mistake is that a friend of mine is named Susan Caisse.) This brilliant scholar has already suffered the injustice of having her study erroneously called the “Deaton and Case study” rather than the “Case and Deaton study” (for better or worse, first authorship is everything to us academics), and here I’ve added insult to indignity. My sincere apologies.
Our nonprofit magazine very much needs donations from readers like you if we’re going to continue publishing in 2016. Please support our efforts to produce the kinds of content you don’t find elsewhere—stories that further our understanding of other people and encourage empathy and compassion—by making a tax-deductible donation.
And while you contemplate the importance of independent media, check out some of the great articles from around the globe you might have missed in our pages this year. Here are the In The Fray pieces that our editors judged to be the best.
Amid all the controversy over the recent push in New York and elsewhere for a $15 minimum wage, it’s important to remember the big picture.
In the decades after World War II, the United States had powerful policies and popular movements that lifted up working men and women. A third of employed Americans were members of unions, and a pro-worker lobby pushed Washington to raise the minimum wage to more than $10 in today’s dollars.
That culture has changed—so much so that today we’re even debating whether a worker should, at a minimum, earn enough to make ends meet.
I’ll also be doing a radio interview with Shep Cohen on The World of Work today at 4 p.m. ET. Listen live at WDVR (89.7 FM in Sergeantsville, NJ, and 96.9 FM in Trenton, NJ) or WPNJ 90.5 (Easton, PA).
This week, after 167 years, the futures trading pits in Chicago closed down. Computers now handle the work that shouting traders flashing hand signals used to do. I was struck by this part of the story:
What’s also disappearing is a rich culture of brazen bets, flashy trading jackets and kids just out of high school getting a shot at making it big. The pits were a ruthless place, but they were also a proving ground where education and connections counted for nothing next to drive and, occasionally, muscle.…
Grant, the runner turned clerk who now oversees his own trading firm, says he has embraced change, too. But he mourns the loss of the kind of entry-level positions that gave kids without much education a chance to prove themselves, just as he did.
“The customer doesn’t have to call anyone to execute a trade,” he says.
Sullivan, the broker, puts it bleakly.
“It’s kind of a slow death for people,” he says. “Maybe I am holding on to something that needs to go.”
In my latest book, I talk about the dwindling away of these sorts of high-paying jobs for people with less education. In many ways, this is a positive development. The futures market is undoubtedly faster and more efficient now that computers are running the show. It’s good for people to get more education and find better-paid, more personally gratifying work—for instance, jobs running and fixing the machines.
But it’s important to remember how critical these sorts of jobs are in halting a widening gap between the rich and poor. After all, unionized factory jobs helped build a strong and broad middle class in this country in the decades after the Second World War. And as much as we tout education as a cure-all for all the problems that arise from these sorts of economic transitions, the fact remains that educational opportunities are wildly unequal. People largely get the quantity and quality of education that their parents did, and the academic gap is growing between the children of more and less privileged families.
Technological change always creates more good jobs, but for whom exactly? Greater efficiency makes our lives easier as consumers, but what are its consequences for us as members of families and communities? The middle-class jobs that sustained many households and neighborhoods and cities are being automated and outsourced away. In our vast economy the loss of these sorts of jobs barely makes the daily headlines, but in the long run it matters. Perhaps it’s the slow death of something important.
Here is a short piece I wrote recently for a Zócalo Public Square discussion on the question “Is Rising Inequality Slowly Poisoning Our Democracy?” The discussion included experts from the Brennan Center for Justice, Cato Institute, Economic Policy Institute, and Georgetown University Center on Poverty and Inequality.
When Michael Young coined the term “meritocracy” half a century ago, he meant it to be an insult, not an ideal. In his view, a society where only the best and brightest can advance would soon become a nightmare. Young predicted that democracy would self-destruct as the talented took power and the inferior accepted their deserved place at the bottom.
Of course, the world we live in today is still no meritocracy. Even as most Americans are expected to go it alone, without the help of government or unions, elites continue to block competitors and manipulate the rules—as Wall Street did in spectacular fashion in the lead-up to the 2008 financial crisis.
Celebrated French economist Thomas Piketty argues that even when—or especially when—the market operates efficiently, inherited wealth becomes an ever more potent force within the economy, slowly strangling the opportunities for ordinary individuals to advance.
Nevertheless, the myth of meritocracy tells us that the rich are rich because they—like Young’s talented ruling class—are smarter and better. They worked their way up. They are the “makers” growing the economy. Anyone who can’t do it on his or her “own” is just a “taker,” suckling on the government’s teat.
I found hints of this viewpoint when I interviewed the long-term unemployed for my book. Some felt enormous shame and blamed themselves for their inability to land another job. Often, the sense of failure had a negative impact on their personal relationships and their belief that they had anything at all to contribute to society.
Preserving our democracy will require forceful government regulation and strong unions. Such approaches have their own flaws, but there is no other way to restore balance to an economy and society increasingly under the sway of an elite class.
Beyond that, we need to tackle head-on the culture of judgment, materialism, and ruthless advancement used to justify extreme inequality—and temper it with a measure of grace.