
Job growth


Author: Jonathan Grudin
Posted: Mon, January 11, 2016 - 10:03:16

Automation endangers blue and white collar work. This refrain is heard often, but could new job creation keep pace with job loss? Some leading technologists forecast that few of us will find work in fifteen years. They describe two possible paths to universal unemployment.

1. Robots or computers become increasingly capable. They have already replaced much human labor in farms, factories, and warehouses. Hundreds of thousands of telephone operators and travel agents were put out of work. Secretarial support in organizations thinned. In this view, jobs will be eradicated faster than new ones are created.

2. The technological singularity is reached: We produce a machine of human intelligence that then educates itself around the clock and designs unimaginably more powerful cousins. Human beings have nothing left to do but wonder how long machines will keep us around. Wikipedia has a nice article on the singularity. The concept arose in its current form in the mid-1960s. Many leading computer scientists predicted that the artificial intelligence explosion would occur by 1980 or 1990. Half a century later, leading proponents are more cautious. Some say ultra-intelligence will arrive before 2030. The median forecast is 2040. Ray Kurzweil, an especially fervent analyst, places it at 2045.

If the singularity is never reached, the jobs question centers on the effect of increasingly capable machines. If the singularity appears, all bets are off, so our discussion is limited to employment between now and its arrival.

My view is that the angst is misplaced: The singularity won’t appear [1] and job creation will outpace job loss. I apologize in advance for a U.S.-centric discussion. It is the history I know, but in our increasingly globalized economy much of it should generalize.

Occupational categories such as farming, fishing, and forestry are in long-term decline. Automation eliminates manufacturing jobs and reduces the need for full-time staff to handle some white collar jobs. Even when more new jobs appear than are lost, the transition will be hard on some people. Not everyone had an easy time when computerization displaced telephone operators and digital cameras eliminated the kiosks where we dropped off film canisters and picked up photos a day later. Nevertheless, jobs increased overall. Productivity rose, and could provide resources for safety nets to help us through disruptions.

The first massive employment disruption

For hundreds of thousands of years, until agriculture arose in the Fertile Crescent, China, Mesoamerica, and South America, our ancestors were hunters and gatherers. To shift from hunting to domesticating animals, from gathering to planting and tending crops, required a significant retooling of job skills. Suddenly, fewer people could produce enough food for everyone! Populations soared. With no television or social media, what would former hunters and gatherers do with their time? 

The parallel is strong. Existing jobs were not needed—more efficient new production systems could be handled by fewer people, in a time of population growth. Some people could continue to hunt and gather, and decry change. The effect was not mass unemployment but an unprecedented rise in new occupations.

These included working to improve agriculture and animal husbandry, breed more productive plant and animal species, and develop irrigation systems. But most new occupations were outside food production. Music, arts, and crafts flourished. Pottery and weaving reached exquisite levels; the Inca developed light, tightly woven garments superior to the armor worn by the Spanish. Metallurgy flourished, serving both practical and aesthetic ends. Trade in these goods employed many. Accounting systems were developed: Literacy and mathematics arose in agricultural communities. Stadiums were built for professional athletes. Surplus labor was used to build pyramids, which involved developing and applying engineering methods and management practices. Armies and navies of a scale previously unimaginable appeared on different continents. Political, religious, and medical professions arose.

Charles Mann’s 1491 describes what our species accomplished in the western hemisphere following the annihilation of traditional jobs. Before diseases arrived from Europe, western hemisphere populations were far larger than was long believed. Archaeologists have only recently discovered the extent of their accomplishments. Mann identifies fascinating distinctions between the agricultural civilizations in the south and the hunter-gatherers who held sway in the north.

Prior to the transition to agriculture, relatively primitive tool-making, healing, cave-painting, and astronomy were part- or full-time occupations for some [2]. When agriculture automated the work of hunting and gathering, side activities exploded into organized occupations. Self-sufficiency in food made possible Chinese philosophers, Greek playwrights, and Incan architects.

Industrial revolutions

I lived in Lowell, Massachusetts, where ample water power in the 1820s (somewhat before I took residence) gave rise to the first industrial revolution in the U.S., built on pirated 50-year-old British technology. The transition from hand-crafted to machine production started with textiles and came to include metals, chemicals, cement, glass, machine tools, and paper. This wide-scale automation put many craft workers out of jobs. The Luddite movement in England focused on smashing textile machines. However, efficient production also created jobs—and not only factory jobs. In Lowell, the initial shortage of workers led to the extensive hiring of women, who at first received benefits and good working conditions [3]. Over time, they were replaced by waves of immigrant men who were not treated as well. Other jobs included improving factory engineering, supplying raw materials, and product distribution and sales. Inexpensive cement and glass enabled construction to boom. Despite the toll on craft work, the first industrial revolution is credited with significantly raising the overall standard of living. Of course, pockets of poverty persisted. As is true today, wealth distribution is a political issue.

The second industrial revolution began in the late 19th century. This rapid industrialization was called “the technological revolution,” though we might like to repurpose that title for the disruption now underway. Advances in manufacturing and other forms of production led to the spread of transportation systems (railroads and cars); communication systems (telegraph and telephone); farm machinery starting with tractors; utilities including electricity, water, and sewage systems; and so on. Not only buggy whip manufacturers were put out of business. Two-thirds of Americans were still employed in agriculture at the outset; now it is 2%. The U.S. population quadrupled between 1860 and 1930, largely through immigration. Job creation largely kept pace and the overall standard of living continued to rise, although many people were adversely affected by the changes, exacerbated by economic recessions. In developed countries, democracies offset disruptions and imbalances in wealth distribution by constraining private monopolies and creating welfare systems.

Since the end of the second industrial revolution in 1930, the U.S. population has tripled. Technological advances continue to eradicate jobs. Nevertheless, unemployment is lower than it was in the 1930s. How can this be?

A conspiracy to keep people employed

Productivity increases faster than the population. People have an incentive to work and share in the overall rise in the standard of living. When machines become capable of doing what we do, we have an incentive to find something else to do. Those who own the machines benefit by employing us to do something they would like done. They do not benefit from our idle non-productivity; in fact, they could be at risk if multitudes grow dissatisfied. The excesses of the U.S. robber barons gave rise to a socialist movement. High unemployment in the Great Depression spawned radical political parties. The U.S. establishment reacted by instituting a sharply progressive tax code, Social Security, and large jobs programs (WPA, CCC), with World War II subsequently boosting employment. Should machines spur productivity and unemployment loom, much-needed infrastructure repair and improvement could employ many.

That is, if we face an employment crisis; the U.S. does not at present. The Federal Reserve raised interest rates in part to keep unemployment from falling further, fearing that wages would rise and spur inflation.

Many new jobs are in the service sector; some say these are “not good jobs.” Really? What makes a job good? Is driving a truck or working an assembly line more pleasant than interacting with people? “Good” means “pays well,” and pay is a political matter as much as anything else. Raise the minimum wage enough and many jobs suddenly get a lot better. Service jobs that are not considered great in one country are prestigious in others, with relative income the key determinant.

Where will new jobs come from?

The agricultural revolution parallel suggests that activities that already have value will be refined and professionalized and entirely new roles will develop. Risking a charge of confirmation bias, let me say that I see this everywhere. For example, in the past, parents and teachers coached Little League and high school teams for little or no compensation (and often had little expertise). Today, there is a massive industry of paid programs for swimming, gymnastics, soccer, dance (ballet, jazz, tap), martial arts, basketball, football, yoga, and other activities; if kids don’t start very young they won’t be competitive in high school. There is a growing market for paid scholastic tutors. Technology can help with such instruction, but ends up as tools for human coaches who also address key motivational elements (for both students and parents). At the other end of the age spectrum, growth is anticipated in care for elderly populations; again, machines will help, but many prefer human contact when they can afford it. For those of us who are between our first and second childhoods, there are personal trainers and personal shoppers, financial planners and event planners, Uber drivers and Airbnb proprietors, career counselors and physical therapists, website designers and remodel coaches. Watch the credits roll for Star Wars: The Force Awakens—over 1000 people, many in jobs that did not exist until recently.

My optimism is not based solely on past analogies. It comes from confidence in human ingenuity and in the Web, which makes it possible to train quickly for almost any occupational niche. Documents, advice repositories, YouTube videos, and other resources facilitate expertise acquisition, whether you choose teaching tennis, preparing food, designing websites, or something else. Yes, anyone who wants to design a new website can find the know-how online, but most will hire someone who has already absorbed it. The dream of “end-user programming” has been around for decades; the reality will never arrive because however good the tools become, people who master them will have skills that merit being paid to do the work quickly and effectively. For any task, you can propose that a capable machine could do it better. But a capable machine in the hands of someone who has developed some facility will often do better still, and developing facility becomes ever easier.

For example, language translators and interpreters are projected to be a major growth area as globalization continues. Machine translation has improved, but is not error-free. Formal business discussions will seek maximum accuracy. Automatic translation will improve the efficiency of the human translators who will still be employed for many exchanges.

A challenge to the prophets of doom

When well-known technologists predict that most of their audience will live to see zero employment, I wonder what they think the political reaction to even 50% unemployment would be. The revolt of the 19th-century Luddites with torches and sledgehammers could be small potatoes compared to what would happen in the land of Second Amendment rights.

Fortunately, it won’t come to that. Instead of predicting when all the jobs will be gone, let the prophets of job loss tell us when the number of jobs will peak and begin its descent. Until that mathematically unavoidable canary sings, most of us can safely toil in our coal mines.

Let’s assume that machines grow more capable every year. It doesn’t always seem that way, but I don’t use industrial robots. The amusing Amazon warehouse robot videos do show automation of reportedly not-great jobs. Despite our more capable machines, the U.S. economy has added jobs every single month for more than five years. Millions more are working than ever before, despite fewer government workers, a smaller military, and no national work projects. Once or twice a decade a “market correction” reduces jobs temporarily, then the upward climb resumes [4].
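The PAYEMS series cited in endnote [4] (total nonfarm payrolls, in thousands) makes this claim easy to check. As a minimal sketch of how one might measure a streak of consecutive monthly job gains, here is a short Python snippet; the payroll values below are illustrative placeholders, not actual FRED data:

```python
# Sketch: counting consecutive months of job gains in a payroll series
# such as FRED's PAYEMS (total nonfarm payrolls, thousands of jobs).
# The values below are made-up placeholders for illustration only.
payems = [140_000, 140_180, 140_350, 140_500, 140_720, 141_000]

# Month-over-month changes: a positive value means jobs were added.
changes = [b - a for a, b in zip(payems, payems[1:])]

# Longest run of consecutive positive months.
longest_streak = streak = 0
for delta in changes:
    streak = streak + 1 if delta > 0 else 0
    longest_streak = max(longest_streak, streak)

print(changes)         # monthly gains in thousands → [180, 170, 150, 220, 280]
print(longest_streak)  # → 5
```

Run against the real series downloaded from the chart in endnote [4], `longest_streak` would report the run of monthly gains described above.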

Is it a coincidence that as the population doubled over and over again, so did the jobs? Of course not.

Endnotes

1. We can’t prove mathematically that the singularity will not be reached, but the chance of it happening in the 21st or 22nd century seems close to zero, a topic for a different blog post.

2. Why did these appear so late in human evolution? Possibly a necessary evolutionary step had to be taken first. Perhaps a reduction in predators and/or climate stabilization made hunting and gathering less of a full-time struggle.

3. The national park in Lowell covers the remarkable women’s movement that arose and was suppressed in the mills.

4. Use the slider on this chart: https://alfred.stlouisfed.org/series?seid=PAYEMS

Thanks to John King for discussions on this topic; his concerns about short-term disruptions have tempered my overall optimism.




Jonathan Grudin

Jonathan Grudin is a principal design researcher at Microsoft.