April 16, 2015

...AND CHEAPER...:

Moore's Law has well exceeded Moore's expectations (Bret Swanson, 4/16/15, AEI)


[T]he outlook for information technology is much brighter than either the pessimistic or the catastrophically optimistic scenarios suggest. (See our new report: Moore's Law: A 50th Anniversary Assessment.)

It's true that to avoid excessive heat, clock speeds of chips leveled off around 2005. To compensate, however, Intel and other firms started designing chips with, among other innovations, multiple processor cores. Steve Oliner, a former Federal Reserve economist now with the American Enterprise Institute, shows that annual chip performance gains over the last decade have actually continued at 32%, near the five-decade average of 40%. A microprocessor in 2013 was thus 1.5 million times more powerful than Intel's first in 1971.
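
A quick, illustrative back-of-envelope check of that compounding (the 1971-2013 span, the roughly 40% long-run average, and the 32% recent rate are the article's figures; the sketch below is mine, not from the report):

```python
# Rough check of the compounding claims above; rates and spans are the
# article's figures, the rounding is illustrative.

def growth_factor(annual_rate, years):
    """Total multiple delivered by a constant annual performance gain."""
    return (1 + annual_rate) ** years

# ~40% average annual gain from Intel's first microprocessor (1971) to 2013
print(f"{growth_factor(0.40, 2013 - 1971):,.0f}x")  # ~1.4 million-fold,
                                                    # i.e. roughly the
                                                    # "1.5 million times"

# ~32% annual gain over the last decade still compounds to a large multiple
print(f"{growth_factor(0.32, 10):,.1f}x")           # ~16x in ten years
```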

In addition, Google, Amazon, and others spent the last decade linking tens of thousands of chips and disks to deliver "warehouse scale computing" - or supercomputing for the masses. Parallelism on chips was thus matched and augmented with parallelism across Internet data centers, now accessible via two billion smartphones. In 1965, the only computers were large, centralized mainframes at universities and big corporations, still fed by punch cards. A small number of lucky workers, professors, and students had to wait in line to access scarce computer time. Today, the centralized computers of our era - those in "the cloud" - generate 4.7 zettabytes (10^21 bytes) of data per year.

The economic benefits have likewise been underestimated. The government's producer price index (PPI), for example, says microprocessor prices have virtually stopped declining. But Oliner and his colleagues believe this reflects a quirk in Intel's pricing strategy. They estimate that the actual annual price drop for 2000-13 was 44 percent. According to the official government measure, $100 worth of computing power in 2000 could be purchased for $1.40 in 2013, which sounds impressive. The actual cost in 2013, according to Oliner, however, may be just 5 cents ($0.05). By this measure, consumers can purchase 28 times more computing power per dollar than the official data suggests.
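
A short check of that price arithmetic (the $100 baseline, the 2000-13 span, the official $1.40 figure, and the 44% annual decline all come from the passage; the rounding notes are mine):

```python
# Rough check of the price-decline arithmetic above; all input figures
# come from the passage, the rounding notes are mine.

start_price = 100.0     # $100 of computing power bought in 2000
years = 13              # 2000-2013

official_2013 = 1.40    # official (PPI-based) 2013 price of that bundle

# Oliner et al.: prices actually fell roughly 44% per year
oliner_2013 = start_price * (1 - 0.44) ** years
print(f"${oliner_2013:.2f}")     # ~$0.05, i.e. about 5 cents

# Gap between the two 2013 prices
print(f"{official_2013 / oliner_2013:.0f}x")  # ~26x unrounded; the article's
                                              # 28x uses the rounded 5 cents
```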

Dale Jorgenson of Harvard, using new data from the Bureau of Economic Analysis, shows that over the last 40 years nearly all the gains in total factor productivity (a proxy for technical innovation) have come from information technology, and that IT accounts for between 50% and 70% of all productivity growth. In 2007, William Nordhaus of Yale, looking back two centuries, found that the labor cost of computing (the cost of a unit of computation measured in hours of income) had declined by a factor of 73 trillion. Based on his numbers, I estimate that since Gordon Moore plotted those first few data points the labor cost of computing has fallen by a factor of roughly one trillion.
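
As a rough illustration of what those factors imply as annual rates (the roughly 200-year and 50-year spans are assumptions for the sake of the arithmetic, not figures from Nordhaus or Swanson):

```python
# Illustrative only: the 73-trillion-fold and trillion-fold declines come
# from the passage; the spans (~200 years for Nordhaus, ~50 years since
# Moore's 1965 article) are assumptions for this back-of-envelope.

def implied_annual_decline(total_factor, years):
    """Constant annual rate of decline that compounds to total_factor."""
    return 1 - (1 / total_factor) ** (1 / years)

print(f"{implied_annual_decline(73e12, 200):.0%}")         # ~15% per year
print(f"{implied_annual_decline(1e12, 2015 - 1965):.0%}")  # ~42% per year
```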

Posted by at April 16, 2015 2:21 PM
  
