By Joe Pinsker

Reducing hours without reducing pay would reignite an essential but long-forgotten moral project: making American life less about work.
The 89 people who work at Buffer, a company that makes social-media management tools, are used to having an unconventional employer. Everyone’s salary, including the CEO’s, is public. All employees work remotely; their only office closed down six years ago. And as a perk, Buffer pays for any books employees want to buy for themselves.
So perhaps it is unsurprising that last year, when the pandemic obliterated countless workers’ work-life balance and mental health, Buffer responded in a way that few other companies did: It gave employees an extra day off each week, without reducing pay—an experiment that’s still running a year later. “It has been such a godsend,” Essence Muhammad, a customer-support agent at Buffer, told me.
Miraculously—or predictably, if you ask proponents of the four-day workweek—the company seemed to be getting the same amount of work done in less time. It had scaled back on meetings and social events, and employees picked up the pace of their work. Nicole Miller, who works in human resources at Buffer, also cited “the principle of work expanding to the time you give it”: When we have 40 hours of work a week, we find ways to work for 40 hours. Buffer might never go back to a five-day week.
At a moment when the future of work is being decided—when businesses are questioning the value of physical office space and when lower-paid workers are agitating for better treatment as the economy reopens—what worked for this small, somewhat quirky tech company might be much less radical than the rest of the American workforce has been led to believe. People who work a four-day week generally report that they’re healthier, happier, and less crunched for time; their employers report that they’re more efficient and more focused. These companies’ success points to a tantalizing possibility: that the conventional approach to work and productivity is fundamentally misguided.
“We live in a society in which overwork is treated as a badge of honor,” Alex Soojung-Kim Pang, an author and consultant who helps companies try out shorter workweeks, told me. “The idea that you can succeed as a company by working fewer hours sounds like you’re reading druidic runes or something.” But, he said, “we’ve had the productivity gains that make a four-day week possible. It’s just that they’re buried under the rubble of meetings that are too long and Slack threads that go on forever.”
Regardless of any benefits to businesses, stripping away all of work’s extra scaffolding and paying people the same amount for fewer hours—whether they’re salaried or paid hourly—would genuinely nurture human flourishing. It would make caregiving, personal development, and the management of modern life easier for people across the economic spectrum. And it would reignite an essential but long-forgotten moral project: making American life less about work.
Over the past couple of years, companies and governments around the world have become more open to the possibility that a four-day workweek could be better for businesses and the people who make them run. Before the pandemic, Microsoft Japan and the burger chain Shake Shack tried the schedule out with some employees, with positive results. The international conglomerate Unilever’s New Zealand offices are currently in the middle of a year-long four-day experiment, the results of which could inform the schedules of the company’s 155,000 employees worldwide. The governments of Spain and Scotland are planning trials that would subsidize employers that give workers an additional day off, and politicians in Japan and New Zealand have spoken favorably of the idea of a shorter workweek.
Later this month, Jon Leland, an executive at Kickstarter, and Jon Steinman, who works in political advocacy, will launch, along with Pang, a nationwide campaign promoting the four-day workweek. Their plan is to spark interest among workers, and then use that interest to recruit companies to participate in a pilot program next year, which will be advised by academic researchers and which Leland and Steinman hope will generate a more robust body of data on four-day weeks.
Still, four days’ work for five days’ pay is a rarity in the landscape of American business—Pang is aware of only a few dozen organizations in the U.S. with this arrangement. Many—though not all—of them match the profile of Buffer: They are relatively small, they do analytical, computer-based “knowledge” work, and they are still run by their founder, a setup that makes big changes easier to implement. But their experiences suggest that when done right, reducing workers’ hours doesn’t necessarily hurt profitability.
In 2018, Andrew Barnes approached the employees of his company, a New Zealand firm called Perpetual Guardian that manages wills, estates, and trusts, with an offer: If they could figure out how to get more done in a day, they could work one fewer day per week. In consultation with employees, the company installed lockers in which workers could voluntarily stash their phones for the day, and soundproofed meeting spaces to cut down on ambient chatter. Meetings were shortened; employees started putting little flags in their pencil holders whenever they wanted to signal to coworkers that they didn’t want to be disturbed. It worked: Perpetual Guardian’s business didn’t suffer, and the four-day workweek is still in place three years later.
When employees are given a good reason to work harder, they often focus more ruthlessly on their most important tasks. Barnes found that even though weekly working hours were cut by 20 percent, employees’ time spent on nonwork websites fell by 35 percent. It also helped that employees had more time outside of work to manage the rest of their lives, so nonwork responsibilities were less likely to intrude on the workday. “Because people have no time for home duties—trying to track down that plumber or sorting things out with the kids—all of that was eating into the day,” he told me. “So if I gave people more time outside of work to do those tasks, that would stop those things interfering in the business hours.”
Natalie Nagele, the CEO of Wildbit, a small software company, introduced a four-day, 32-hour week in 2017, after reading about research indicating that the optimal amount of intense cognitive work is no more than four hours a day. (The four-day schedule even applies to Wildbit’s customer-support team; their days off are staggered so they can respond to inquiries all week.) “I have this dream that knowledge workers can get to a point where we can clearly define what enough means,” Nagele told me. “We don’t do a good job of saying, ‘This is done,’ or ‘I can put it away.’” She wonders if Wildbit’s next schedule could be four six-hour days.
That may sound preposterous, but schedules like this intrigue productivity experts. Cal Newport, the author of Deep Work: Rules for Focused Success in a Distracted World, has written that the current version of office work, defined by long hours and “always-on electronic chatter,” seems poorly suited to cognitive labor. This mode of working has been around for only a decade or two, and we have found better ways to work before; it would be “arrogant and ahistoric,” he says, to assume that the current approach is best.
This model doesn’t just work for computer programmers and other knowledge workers. In his book Shorter: Work Better, Smarter, and Less—Here’s How, Pang writes about a nursing home near Roanoke, Virginia, that was struggling to hire and retain nursing assistants, who do important but unglamorous, often low-paid work. To improve retention, the facility tried giving them 40 hours of pay for 30 hours of work, which necessitated hiring more nursing assistants to compensate for the reduced hours. That came at a price, but the change also yielded substantial savings on recruitment expenses and overtime pay, such that the overall cost worked out to only about $20,000 a year. Plus, call-bell response times, residents’ infection rates, and the number of falls and skin tears all declined.
Last year, Diamondback Covers, a Pennsylvania-based manufacturer of metal truck-bed covers, shaved five hours off its factory team’s 40-hour week, but didn’t decrease pay, as it hired more workers to meet rising demand during the pandemic. The company expected that the 12.5 percent drop in working hours would lead to a rise of similar magnitude in the labor costs for each cover it made. But the cost increase was only 3 percent, due to increased efficiency.
“It’s not by running a sweatshop … it’s more about working smarter,” Diamondback’s CEO, Ben Eltz, told me. During a 40-hour week, “very rarely does a person say, ‘I got my work done—now I’m going to go see how else I can help.’ It’s that teamwork idea of, everyone’s shooting for that common goal of ‘Let’s make this work.’” On top of that, with shorter days, the company is shedding its employees’ least-productive hours, when they’re worn out near the end of a shift. With the expected savings from reduced turnover and fewer safety incidents, Diamondback’s schedule change could end up saving the company money.
Success stories like Diamondback’s—and Buffer’s, and Wildbit’s—point to a failure of imagination on the part of America’s bosses at a moment when they should be ready to reimagine corporate culture. Barnes thinks the same insight that is inspiring the post-pandemic spread of remote and hybrid work—that productivity is not a function of time spent in the office, under managers’ supervision—should also make business leaders more amenable to shorter workweeks.
Pandemic aside, when he hears from people who doubt that a four-day week would work in their industry, Barnes is unpersuaded: “They’re saying nothing can be better than the way we work today,” he told me. “That’s a pretty closed-minded view.”
There is nothing sacred about a five-day, 40-hour workweek—which, in actuality, is more than 40 hours for about half of full-time U.S. workers—but it is certainly an improvement over what came before it. For most of the 19th century, the typical American worker was a male farmer who worked 60 to 70 hours per week. The precipitous decline in working hours since then was made possible by productivity growth: The internal-combustion engine, electrification, and other advances meant that workers were able to get things done more quickly.
The tempo of early factory work led to a push for a 10-hour day starting in the late 1820s; unions, which gained strength in the ensuing decades, fought for, as a popular slogan put it, “eight hours for work, eight hours for rest, eight hours for what we will” closer to the end of the century. The standard workweek in that era was still six days, and the shift to five occurred gradually, over the course of decades. According to Benjamin Hunnicutt, a historian at the University of Iowa and the author of Work Without End, the transition actually began in England, where in the 19th century it became normal for people to show up late to work, or skip it entirely, on Monday, basically because they would rather do other things. To discourage workers from “keeping Saint Monday,” as it was called, employers started agreeing to give them half of Saturday off.
In the U.S., one of the earliest instances of a business implementing a five-day week was a mill in New England that in 1908 gave its Jewish workers a two-day weekend, to cover their Saturday sabbath. The practice caught on more widely in the following two decades, when unions pushed for it and business owners, applying the principles of “scientific management,” studied their production processes and concluded that a shorter week could increase productivity. In 1926, the Ford Motor Company adopted the five-day week, doubling the number of American workers with that schedule.
Not all business leaders favored the change. “Any man demanding the forty hour week should be ashamed to claim citizenship in this great country,” the chairman of the board of the Philadelphia Gear Works wrote shortly after Ford rolled out its new hours. “The men of our country are becoming a race of softies and mollycoddles.” Less aggressive but just as resistant, the president of the National Association of Manufacturers, a trade group, wrote, “I am for everything that will make work happier but against everything that will further subordinate its importance.”
It took a crisis to cement the five-day week as a standard. During the Great Depression, reducing hours was considered a way to spread the finite amount of work available among more people. The appetite for shorter schedules was so great that, in 1933, the U.S. Senate passed a bill that would have temporarily capped the workweek at 30 hours. President Franklin D. Roosevelt and his administration found it too extreme, however, and instead tried to provide economic relief to workers in the form of the New Deal—rather than limit work, they sought to create more of it. Five years after the 30-hour week fell apart, Roosevelt signed the Fair Labor Standards Act, which mandated higher pay beyond 40 hours in certain industries, effectively formalizing the five-day workweek.
During this span of roughly 100 years, the notion that Americans could spend less and less time working didn’t elicit the same widespread sense of impossibility that it might today—it was in keeping with the common belief that expanding leisure time was a mark of moral progress. And for a time, it seemed that the workweek would continue to shrink. In 1930, the renowned British economist John Maynard Keynes predicted that in a hundred years, productivity growth would permit people to work as few as 15 hours per week. A quarter century later, Richard Nixon, as vice president, said he expected a four-day week soon enough. “These are not dreams or idle boasts,” he said. “They are simple projections of the [recent] gains we have made.” In the mid-1960s, a contributor to The New York Times Magazine surveyed the state of technological progress and concluded that it was “unlikely that the four-day week will be postponed indefinitely.”
There isn’t one straightforward explanation for why it is still being postponed. One reason might be that working hours have fallen to the point that pushing them down further wouldn’t bring such a large payoff—it’s less vital to move from 40 hours to 30 hours than it was to move from 60 to 50. Another might be that, once salaried workers started receiving benefits such as pensions and health insurance through their jobs, hiring an additional worker became more expensive, so employers were incentivized to squeeze more hours out of their existing staff rather than bringing on someone else. And perhaps the workweek would have continued to shrink if unions’ influence hadn’t waned nationwide.
A somewhat fuzzier explanation is that Americans’ fundamental aspirations changed. Hunnicutt argues that before the early 20th century, “work and wealth had a destination—that was a richer, fuller human life.” But after a cultural shift, he told me, “work was for more work … wealth was for more wealth, for ever and ever,” as a job became a religion-like source of meaning for many people. Hunnicutt also notes a blossoming of advertising and consumerism around this same time, which set people on a course of working more in order to buy more.
Whatever the underlying causes, the standard American workweek is the same as it was when Roosevelt signed the Fair Labor Standards Act some 80 years ago, even as productivity has continued to shoot up. Some of those gains did get distributed to workers—Nelson Lichtenstein, a labor historian at UC Santa Barbara, told me that the working class’s buying power doubled between 1947 and 1973. But consider what happened to productivity growth after that. Starting in the mid-’70s, productivity continued to rise, but median pay stopped rising with it. Instead of going to the typical worker, much of the additional income flowed to highly paid workers—those with college degrees, particularly college grads in industries such as tech and finance.