Friday, September 30, 2011

Kindle-ing Competition


We don’t think of the Kindle Fire as a tablet. We think of it as a service.

– Jeff Bezos

The analysts predicted that Amazon would introduce its new Kindle Fire tablet today at an aggressively low price of $250 to $300, in line with low-margin competitors like Samsung.

They were wrong. Amazon priced it at $199, with some versions of the Kindle selling for as little as $79.

How can Amazon afford to price it so low? Is their manufacturing and supply chain that much more efficient than Samsung's, RIM's and Apple's? In a word, no. The key is the increasingly important economics of two-sided networks and information complements, as analyzed in the seminal work of Geoff Parker and Marshall Van Alstyne.

Amazon isn’t simply selling a device; it’s selling a portal into a cornucopia of books, music, movies and other media, all available a click away at Amazon. Kindle owners trust Amazon with their credit cards, and with an easy, enticing user interface that steers them toward Amazon media, eerily accurate recommendations, and virtually instant delivery, it’s hard for infovores to resist spending far more via the Kindle than they ever did via the web. Believe me, I know from personal experience.

Of course, Amazon knows this and makes a healthy, but not unreasonable, margin on every media sale. What’s more, they avoid the 30% commission that Apple extracts when Amazon sells ebooks via the iPad. Because the profit stream from Amazon’s media products grows every time another customer buys a Kindle, Amazon can afford to price the device at very low, or even negative, margins. That gives them an advantage over standalone competitors. What’s more, Amazon can skimp on memory in the Kindle Fire (only 8 gigabytes) because owners can store an unlimited number of books, songs, movies and documents on Amazon’s cloud servers at no cost. They even throw in a 30-day trial of Amazon Prime, the two-day delivery program that boosts loyalty among customers of Amazon’s non-digital goods.
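To make the cross-subsidy concrete, here is a rough back-of-the-envelope sketch. Every figure in it is a hypothetical assumption, not an actual Amazon number; the point is only that a modest stream of media margin can recoup a small per-device loss fairly quickly.

```python
# Illustrative cross-subsidy math for a two-sided device-plus-media business.
# All figures below are hypothetical assumptions, not Amazon's actual numbers.

device_price = 199.00            # Kindle Fire retail price
device_cost = 210.00             # assumed hardware + logistics cost per unit
media_spend_per_year = 150.00    # assumed annual media purchases per owner
media_margin = 0.25              # assumed margin kept on that media spend

loss_per_device = device_cost - device_price
media_profit_per_year = media_spend_per_year * media_margin
years_to_break_even = loss_per_device / media_profit_per_year

print(f"Loss per device sold:     ${loss_per_device:.2f}")
print(f"Media profit per year:    ${media_profit_per_year:.2f}")
print(f"Years to recoup the loss: {years_to_break_even:.2f}")
```

Under these made-up numbers the device loss is paid back in well under a year, which is why a standalone hardware maker with no media stream cannot match the price.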

The battle of the tablets is not a battle of devices, but a battle of ecosystems. Jeff Bezos and his team at Amazon have learned well the lessons of two-sided markets.

Tuesday, September 27, 2011

The Dismal Economics of Moneyball


Moneyball is a huge hit, which doesn’t happen often to movies featuring an economics major who’s good at statistics. It tells the true story of how the Oakland A’s became a competitive team despite having a payroll less than a third of the Yankees’. They did it by using statistical techniques pioneered by sabermetrician Bill James and applied by Harvard economics grad Paul DePodesta. For instance, old-school baseball scouts undervalued the simple talent of getting on base via walks. That’s not very exciting, but a team that does it over and over tends to win more than its competitors. By combing through the data to find undervalued traits like drawing walks, the Oakland A’s were able to find talented players without spending a lot of money.
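For readers who want a feel for what "combing through the data" means in practice, here is a minimal sketch of the kind of regression a sabermetric analyst might run. It assumes a hypothetical teams.csv file with one row per team-season and columns for runs scored and a few batting statistics; nothing here is the A's actual model.

```python
# Sketch of a Moneyball-style analysis: regress team run production on
# candidate skill statistics and see which ones actually carry weight.
# "teams.csv" is a hypothetical file with one row per team-season and
# columns: runs, batting_avg, on_base_pct, slugging_pct.
import csv
import numpy as np

rows = list(csv.DictReader(open("teams.csv")))
y = np.array([float(r["runs"]) for r in rows])          # runs scored
X = np.array([[1.0,                                      # intercept
               float(r["batting_avg"]),
               float(r["on_base_pct"]),
               float(r["slugging_pct"])] for r in rows])

coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, c in zip(["intercept", "batting_avg", "on_base_pct", "slugging_pct"],
                   coefs):
    print(f"{name:>14}: {c:8.1f}")
# If on-base percentage carries a large coefficient relative to what the
# player market charges for it, it is an undervalued trait worth buying.
```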

To an economist, that’s a story not only about the power of information, but also about the importance of innovation in creating competitive advantage. Oakland’s General Manager, Billy Beane, didn’t compete the same way as all the other teams; he did something new and different, and that gave the A’s an edge.

However, that’s not the end of the story. Competition leads others to match the innovation, and over time the excess returns are competed away. Oakland’s competitive secret didn’t remain a secret for long. In 2003, when Michael Lewis's book Moneyball was published, the Boston Red Sox hired Bill James to advise them and to apply analytic techniques to optimize their much larger payroll. They promptly won the World Series the next year, and again in 2007. Today there are whole conferences, like the MIT Sloan Sports Analytics Conference, devoted to these techniques. So does Moneyball still provide an edge?

According to an academic study by Jahn Hakes and Raymond Sauer:

…certain baseball skills were valued inefficiently [in 1999-2002] and this inefficiency was profitably exploited by managers with the ability to generate and interpret statistical knowledge. Consistent with Lewis’s story and economic reasoning, as knowledge of the inefficiency became increasingly dispersed across baseball teams the market corrected the original mispricing.

Sadly, the insights Bill James identified no longer provide a measurable advantage. This year, Oakland will finish with another losing season. And my beloved Red Sox? They lost their lead in the wildcard race tonight and may not make the playoffs. Time to hunt for the next big innovation.

Monday, September 12, 2011

Is Koomey's Law eclipsing Moore's Law?


Most people are familiar with Moore's Law, the doubling of computer power roughly every 18 months. But as technology becomes more mobile, "Koomey's Law" may be more relevant to consumers. Dr. Jon Koomey and his colleagues recently completed a study showing that the energy efficiency of computing is improving just as fast as processing power. At left is a chart from their paper, "Implications of Historical Trends in the Electrical Efficiency of Computing". The paper is highlighted in the latest issue of MIT's Technology Review, where Dr. Koomey explains:

"The idea is that at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half," says Jonathan Koomey, consulting professor of civil and environmental engineering at Stanford University and lead author of the study. More mobile computing and sensing applications become possible, Koomey says, as energy efficiency continues its steady improvement.

Battery size and battery life are among the biggest design constraints on an iPad or Android phone. Furthermore, in the next five years we may see a trillion small computing devices blanket the planet as the Internet of Things awakens. Understanding Koomey's Law will be key to making that possible. Progress is happening on many different fronts.

Sunday, September 11, 2011

What CAN'T computers do?

Not too long ago, there was a relatively long list of things machines couldn't do by themselves: play chess, read legal briefs, translate poetry, vacuum floors, drive cars, etc. But that list is getting shorter and shorter every year. The latest casualty may be writing newspaper articles.

Kris Hammond and Larry Birnbaum at Northwestern's Intelligent Information Laboratory have started a company called Narrative Science that does just that. Here's a sample, produced within 60 seconds of the end of the third quarter of a recent football game:

“WISCONSIN appears to be in the driver’s seat en route to a win, as it leads 51-10 after the third quarter. Wisconsin added to its lead when Russell Wilson found Jacob Pedersen for an eight-yard touchdown to make the score 44-3 ... . ”

According to Steve Lohr, in the New York Times:

The Narrative Science software can make inferences based on the historical data it collects and the sequence and outcomes of past games. To generate story “angles,” explains Mr. Hammond of Narrative Science, the software learns concepts for articles like “individual effort,” “team effort,” “come from behind,” “back and forth,” “season high,” “player’s streak” and “rankings for team.” Then the software decides what element is most important for that game, and it becomes the lead of the article, he said. The data also determines vocabulary selection. A lopsided score may well be termed a “rout” rather than a “win.”
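To see the general shape of the idea, here is a toy sketch of a template-driven recap generator. It is emphatically not Narrative Science's software, just an illustration of the pattern Hammond describes: derive an angle from the box score, let the margin drive word choice, and fill in a template. The team names are placeholders.

```python
# Toy template-driven game recap. NOT Narrative Science's actual system;
# just a sketch of: pick vocabulary from the score margin, fill a template.

def margin_word(margin: int) -> str:
    """Choose vocabulary based on how lopsided the score is."""
    if margin >= 28:
        return "rout"
    if margin >= 14:
        return "comfortable win"
    return "narrow win"

def recap(winner: str, loser: str, w_score: int, l_score: int) -> str:
    word = margin_word(w_score - l_score)
    return (f"{winner.upper()} appears headed for a {w_score}-{l_score} "
            f"{word} over {loser}.")

print(recap("Wisconsin", "the visitors", 51, 10))
# WISCONSIN appears headed for a 51-10 rout over the visitors.
```

A real system adds the inference layer on top of this: learning which angle matters most for a given game and leading the story with it, rather than hard-coding a single template.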

He ends his article with a prediction by Dr. Hammond:

“In five years,” he says, “a computer program will win a Pulitzer Prize — and I’ll be damned if it’s not our technology.”

That may be a bit ambitious, but one nearly certain prediction is that computer power will increase roughly 10-fold in the next five years, and 100-fold within a decade.
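Those figures are just the compounding from a doubling roughly every 18 months, the same assumption behind the Koomey's Law post above:

$$2^{5/1.5} \approx 10, \qquad 2^{10/1.5} \approx 102.$$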

You can also be sure that journalism won't be the only job affected.