Thursday, October 27, 2011
The past decade has been terrible in terms of job growth and median wage growth, and sadly that was true even before it culminated in the worst recession since the 1930s.
But not all the news is bad. Although it’s not much discussed, this has actually been the best decade since the 1960s for productivity growth. Last year, labor productivity grew by over 4%, and it averaged over 2.5% over the preceding ten years.
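A back-of-the-envelope sketch makes the figure concrete: an average annual growth rate compounds, so even a modest-sounding 2.5% adds up over a decade. The numbers below are purely illustrative, using the rate quoted above.

```python
# Illustrative compounding: what a 2.5% average annual productivity
# growth rate accumulates to over ten years.
annual_growth = 0.025
years = 10

cumulative = (1 + annual_growth) ** years
print(f"Output per hour after {years} years: {cumulative:.2f}x")
# Output per hour after 10 years: 1.28x
# i.e., roughly 28% more output from the same inputs
```

In other words, a decade at that pace makes the economy about a quarter more productive, which is why seemingly small differences in growth rates matter so much.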
Why does this matter? Simply this: productivity, output per unit input, is by far the most important determinant of our living standards. As Bob Solow showed in his Nobel Prize-winning work, the main thing that makes an economy richer is not working harder or even using more capital or other resources. Instead, the main driver is innovations in products, services, and business processes that let us create more value without using more inputs. Productivity comes from new technologies and new techniques of production. The most important of these are what economists call general-purpose technologies, like the steam engine or electricity. They contribute to productivity directly, but more importantly, they also spur countless complementary innovations that can keep driving productivity growth for decades.
Our era is fortunate to work with one of the most important and powerful general-purpose technologies in history, information technology, in all its forms. Some of my research suggests that IT has been driving the lion’s share of productivity growth in recent years. What’s more, there is no sign that the digital revolution is slowing. On the contrary, I think we are only in the early stages of a transformation that will be no less important than the ones engendered by the steam engine and electricity.
Unfortunately, not everyone is benefitting from strong productivity growth. In fact, many have been directly hurt as their jobs are automated. This is one of the main themes of my new ebook with Andrew McAfee, Race Against the Machine. Grappling with this paradox, high productivity but stagnating employment, is one of the great challenges for our generation.
How do you think we should address it?
Sunday, October 23, 2011
Andy McAfee and I have just released a short e-book, Race Against the Machine. In it, we try to reconcile two important facts: 1) technology continues to progress rapidly; in fact, the past decade has seen the fastest productivity growth since the 1960s. Yet 2) median wages and employment have both stagnated, leaving millions of people worse off than before. This presents a paradox: if technology and productivity are improving so much, why are millions being left behind?
In the book, we document remarkable advances in digital technologies in particular. Innovations like IBM’s Watson, Google’s self-driving car, and Apple’s Siri are turning science fiction into reality. Machines are doing more and more tasks that once only humans could do.
The good news is that this has radically increased the economy’s productive capacity – productivity is at record highs and increasing at an accelerating rate. The 2000s had faster productivity growth than even the booming 1990s. However, technological progress does not automatically benefit everyone in a society. In particular, incomes have become more uneven, as have employment opportunities. Recent technological advances have favored some skill groups over others, particularly “superstars” in many fields, and probably also increased the overall share of GDP accruing to capital relative to labor. While trillions of dollars of value were created between 2002 and 2007, over 60% of the increase went to the top 1%, as technology made it easier for them to leverage their talents globally.
The stagnation in median income and employment is not because of a lack of technological progress. On the contrary, the problem is that our skills and institutions have not kept up with the rapid changes in technology. In the past, as each successive wave of automation eliminated jobs in some sectors and occupations, entrepreneurs identified new opportunities where labor could be redeployed and workers learned the necessary skills to succeed. In the 19th and 20th centuries, millions of people left agriculture, but an even larger number found employment in manufacturing and services.
In the 21st century, technological change is both faster and more pervasive. While the steam engine, electric motor, and internal combustion engine were each impressive technologies, none of them improved continuously at anything near the pace seen in digital technologies. Already, computers are thousands of times more powerful than they were 30 years ago, and all evidence suggests that this pace will continue for at least another decade, and probably longer. Furthermore, the computer is, in some sense, a “universal machine” with applications in almost every industry and task. In particular, digital technologies now perform mental tasks that were once the exclusive domain of humans. General-purpose computers are directly relevant not only to the 60% of the labor force engaged in information-processing tasks but also to more and more of the remaining 40%.
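The “thousands of times more powerful” claim follows directly from the arithmetic of repeated doubling. As an illustration only, assuming computing power doubles roughly every two years (a common reading of Moore’s law), 30 years of progress looks like this:

```python
# Illustrative only: repeated doubling over 30 years, assuming one
# doubling in computing power roughly every two years.
doubling_period_years = 2
years = 30

doublings = years // doubling_period_years  # 15 doublings
gain = 2 ** doublings
print(gain)  # 32768
# Fifteen doublings yield a factor in the tens of thousands,
# consistent with "thousands of times more powerful."
```

The striking thing about exponential growth is that each doubling is as large as all the previous ones combined, which is why the later stages of such a process feel so much faster than the early ones.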
As the digital revolution marches on, each successive doubling in computing power expands the range of applications where it can affect work and employment. As a result, our skills and institutions will have to improve faster to keep up, lest more and more of the labor force face technological unemployment. We need to find more ways to race with machines, not against them.
In the end, Andy and I are optimistic that we can harness the benefits of accelerating innovation. But addressing the problem starts with a correct diagnosis, and that’s what our e-book sets out to provide.
Do you agree with our diagnosis? What is your prescription?
Tuesday, October 18, 2011
If you’re interested in technology, employment and the economy, you might be interested in three events happening in the next few weeks.
The first is the Compass Summit in Palos Verdes, California, on October 23-26. There will be an impressive array of technologists, business leaders, visionaries and policymakers coming together to discuss how innovation can lead us out of some of the messes we’ve created lately. I’ll be giving a talk called “Race Against the Machine: How the Digital Revolution Irreversibly Transforms Employment and the Economy,” and my colleagues Tom Malone, Andy McAfee and many others will also be participating.
The second event is a whole symposium on technology and employment that the MIT Center for Digital Business is hosting on October 31 (yes, Halloween), followed by a game of Jeopardy! between a team from the MIT Sloan School, a team from Harvard Business School, and IBM’s Watson. Dave Ferrucci, the “father” of Watson, will be speaking, along with some amazing technologists and economists. We’ll look at how technologies like Watson, Google’s self-driving car, Apple’s Siri, Heartland Robotics and other amazing technologies have gone from fiction to reality, and what it means for jobs, wealth and the economy.
The morning sessions are at the new MIT Media Lab building on October 31, starting at 9:00 am. In the afternoon, we’ll head over to Harvard Business School. Space is limited, so if you’d like to attend please email Joanne Batziotegos (jtegos at mit dot edu) to reserve a spot.
The third event is Techonomy. This was the best non-MIT conference I attended last year, and I’m really looking forward to this year’s edition in Tucson, November 13-15. I’m going to debate Tyler Cowen, the uber-blogger and economist, on the question “Can Technology Be Society’s Economic Engine?” It should be a lot of fun.
Let me know if you plan to attend any or all of the events, but if you miss them, watch this space for a summary of some of the highlights afterwards.
Friday, October 14, 2011
Information technology has created a data explosion. We now record virtually every click of every visitor to a website, every search on Google or Bing, every transaction at every cash register, every call or text on cellphones, every inventory change in our supply chains, and petabytes of other data on what we buy, sell, or even consider. This creates a level of visibility that managers and economists have never had before. And it creates enormous opportunities to use data to change the way decisions are made.
There have been some great case studies of how analytics have affected specific companies, but ironically, there has been relatively little systematic data on this question. Working with Heekyung Kim and Lorin Hitt, we sought to help address this gap in a research paper. We found that publicly-traded companies that were more data-driven were about 5% more productive than their competitors, a statistically significant difference.
I talk a bit about "big data" and how it can change decision-making in this short video that McKinsey recorded when they visited me a couple of months ago.
Is your organization using data more aggressively for decision-making? If not, what's holding you back? If so, what have been the results?