
Cross-posted at Education Week.

Readers of this blog will recall that, many moons ago, I summarized a couple of papers I wrote for the New South Wales (Australia) Department of Education on the implications of advances in artificial intelligence, robotics, neural networks, machine learning and the like for education.  Though I was careful to make it clear that the experts in this area disagree violently with one another on the likely effect of these technologies on jobs, work and society as a whole, I concluded that the effects on all these things are more likely than not to be profound.  And, if that is so, I said, they will have dramatic effects on the context for setting education policy.  We would have to reconsider the purposes of education and greatly ramp up our game, both because much of the work that can be done with the skills most high school graduates now have will be done by machines, not people, and because the political upheaval caused by that fact could spell the end of democratic government as we know it if we do not give our students a much better understanding of history and politics than they have now.

That conclusion obviously assumes that these new technologies will have a revolutionary impact on the way we live, an impact that should be felt first in our economy.  If that is true, if these technologies are going to do all sorts of work that humans do now, then it should be possible for the same number of humans to produce far more in the way of goods and services than they do now or did yesterday.  Or maybe the same amount of goods and services would be produced by the same number of people working many fewer hours.  Either way, what the economists call productivity—the amount of goods and services produced by the average worker—should be going up.

But it isn’t.  In fact, productivity growth has been declining since the 1970s and has been declining more rapidly in recent years than it was earlier.  That worries a lot of economists because, in the long run, slowing productivity growth means ever-slower improvement in our standard of living, the very opposite of what should be happening as we introduce all this labor-saving technology.

For many economists, this contrast between the onrush of these labor-saving technologies and the decline in productivity growth is paradoxical.  We can, they say, have one or the other, but it is hard to explain both at the same time.  And that suggests that all this talk about the revolutionary impact of these new technologies may be no more than a mountain of hype.

That view would be embraced quickly by Robert Gordon, author of The Rise and Fall of American Growth, perhaps the most celebrated book on economics of the last couple of years.  In it, Gordon suggests that the growth rates we saw in the American economy, and the growth in productivity that accompanied them, between the American Civil War and the 1970s were a fluke, the likes of which had never been seen before in human history and are unlikely ever to be seen again.  He points to the invention of gasoline engines and electric motors, the telegraph and the telephone, motion pictures, insurance, synthetics, antibiotics, electric lighting, airplanes, ready-made clothing, the automobile, modern finance and investment systems and on and on and on.  He concludes that this flood of invention powered a rate of improvement in the human standard of living that will prove to be unique in human history, so we had better get used to much lower rates of growth in productivity and therefore in our standard of living.

It is quite clear that the iPhone has made some people startlingly rich, but less clear that it has improved anyone’s productivity.  Put the iPhone and the internet together and you may have sharply decreased the productivity of the average worker by increasing the time employees spend playing computer games, checking social media and playing the market when they are supposed to be working.  Being able to stream every movie ever made may be great, but how does that compare to replacing the horse-drawn carriage with the automobile?  Getting your recipes by computer may make it unnecessary to buy a lot of cookbooks, but how does that compare to getting your water pumped into your house from the municipal water system instead of filling a bucket at the well?  Sharing your photos through the internet is fun, but how does that compare with the opportunity to travel by plane rather than by stagecoach?  Gordon clearly had a point.

Or did he?  I just read a fascinating paper by Erik Brynjolfsson and some colleagues of his that does a very nice job of addressing the paradox. Sounds a bit wonky, and it is, but I thought you might find its key points interesting.

Brynjolfsson and his colleagues posit four possible ways to resolve the apparent paradox.  The first is false hopes (it was all hype and no substance); the second is mismeasurement (the gains in productivity were there, but the statisticians failed to capture them in the stats); the third is redistribution (the advances in productivity in some areas have been offset by losses in others, as described above); and the last is lags in implementation (the gains will be real and profound, but they are not here yet).  In a nutshell, having analyzed and largely dismissed the first three explanations, the authors put most of their marbles on the last of these.

Their case is, I think, very strong.  The technologies I dismissed above as contributing little or nothing to the core productivity of the American economy (cell phones, streaming and so on) are not, save the internet, the technologies of interest here.  Nor is the first wave of artificial intelligence.  The technologies to watch now are the second wave of artificial intelligence, advanced robotics, neural networks, advanced sensors and machine learning, all of them coupled to very fast processors, cloud data storage and the internet.  It is the synergy among all of these related technologies that is creating the future.  And it is the astonishing amount of money now being poured into this concatenation of technologies that is accelerating their development at what amounts to lightning speed.

Did you know that companies are now selling robots that will do surgery on your spine with better success rates than good surgeons who do not use the robots?  Did you know that software is now available that will do a more accurate job of transcribing spoken English than human transcribers?  That there are now machines that can produce original works of popular and classical music that experts cannot distinguish from music written by the masters?  That people who have lost limbs are being fitted with artificial limbs they can control with their thoughts?  That machines can now diagnose many diseases more accurately than very good doctors?  That other machines can write accounts of sports events that readers cannot distinguish from those written by professional sports writers?  That machines can pick out a face in a very large crowd and identify its owner?

Robert Gordon was wrong.  This new round of technology is not the iPhone connected to the internet or a file-sharing service or the next round of social media.  None of the technologies developed between the Civil War and the 1970s was more significant in its potential for productivity improvement, general welfare (both positive and negative) and social change (ditto) than the technologies I have just listed.  They don’t show up in the statistics yet because they are so new.  People who study technological change have learned that what takes time is not the introduction of a technology itself but the endless changes that must be made in countless processes and institutions to make use of it.  Not least among the reasons is that the people who stand to lose the most from new technologies are typically the most powerful people in the society, precisely because they were the ones who most successfully mastered the old technologies.  But watch out as the disruptors take over.  This time, they run some of the most powerful organizations in the world and are investing unprecedented sums in their revolution.

The consequences for education will be profound.