Tuesday, October 18, 2016

Video on AI and the future of jobs: Humans Need Not Apply

Deb Olsson shares the video below. While it may be viewed as somewhat pessimistic about the impact of Artificial Intelligence on jobs in the future, it contains many good points.

Monday, October 10, 2016

Congrats to Oliver Hart and Bengt Holmstrom on the Nobel Prize

Congratulations to my friends and advisors Oliver Hart and Bengt Holmstrom on winning the Nobel Prize in Economics.

Oliver was on my PhD thesis committee back when he taught at MIT and helped me understand the Theory of the Firm. I used his work a lot in my paper "Information Assets, Technology and Organization," but my biggest contribution to economic science might be the small bit of feedback I gave Oliver on his paper "Property Rights and the Nature of the Firm".

Bengt is a colleague at MIT, and we share an interest in how information affects economic organization. I remember getting advice from him when I was working on my thesis and he was a professor at Yale.

It seems like Nobel Prize winners are disproportionately very nice people. Oliver and Bengt certainly fit that mold.

Thursday, October 6, 2016

Amazon Prime benefits keep growing

Amazon has just added free books to its Prime offering. For more information, check out this article shared by Dennis Schwedhelm.

A summary of Google's strategies

Jacob Loewenstein shared this article with us summarizing some of Google's most recent announcements. It's particularly interesting how they describe Google as competing directly with Apple, Amazon, Samsung and others.

Another article, sent in by Christian Umbach, takes a somewhat more pessimistic view of Google's strategy and points out some of the challenges the company faces.

Monday, October 3, 2016

The Blurring Lines Between the Tech Titans

The tech titans are all getting into each other's core businesses.
Here's the latest example: Facebook launches "marketplace"

Facebook launches Marketplace, a friendlier Craigslist

Thursday, September 29, 2016

Google pushes its digital assistant to the next level

Check out this article about Google's next move with Google Assistant, and about how the company is seeking to stay relevant in a world of declining web searches. Thanks to Ana Maria Sanchez for sending it in!

Amazon's battle for grocery dollars

John Martell shares this article about Amazon's fight in the grocery space. Despite an influx of funding for grocery delivery startups and lots of room for growth, traditional brick-and-mortar grocers have proven very difficult to displace.

Thursday, September 22, 2016

How Microsoft competes with Apple & Google in product connectivity

April Baker shares an interesting article about the interconnectivity of products among three big tech firms: Apple, Google, and Microsoft. We found the visualization of product connections particularly interesting. Have things changed much in the year since the article was posted?

Learning from Video Games: Micropayments and Unbundling

Thanks to David Hong for sharing an interesting article on “micropayments”: not a new idea, but an increasingly viable alternative to advertising and paywall revenue models. Specifically, how can digital content providers combat the squeeze in profits that has resulted from Facebook and Google’s dominance in advertising? Freemium pricing models from the gaming sector offer a lot of promise.

Tuesday, September 20, 2016

Deep and Cheap Learning: Why does it work so well?

Henry Lin and Max Tegmark have a fascinating new paper arguing that the success of deep learning in so many domains has deep connections to the fundamental laws of the universe. Both deep learning and the laws of physics take potentially enormous sets of possible data and simplify them down to a tiny set of outcomes, governed by just a few parameters. Luckily, they both simplify to much the same tiny subset.

As the authors put it:

We will see below that neural networks perform a combinatorial swindle, replacing exponentiation by multiplication: if there are say n = 10^6 inputs taking v = 256 values each, this swindle cuts the number of parameters from v^n to v×n times some constant factor. We will show that the success of this swindle depends fundamentally on physics: although neural networks only work well for an exponentially tiny fraction of all possible inputs, the laws of physics are such that the data sets we care about for machine learning (natural images, sounds, drawings, text, etc.) are also drawn from an exponentially tiny fraction of all imaginable data sets. Moreover, we will see that these two tiny subsets are remarkably similar, enabling deep learning to work well in practice.
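To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python using the example numbers from the quote (n = 10^6 inputs, each taking v = 256 values); since v^n is far too large to compute directly, it compares the two parameter counts in log10 terms:

    import math

    # Example numbers from the paper's quote: n = 10^6 inputs,
    # each taking v = 256 values.
    n = 10**6
    v = 256

    # A fully generic function of all the inputs needs on the order of
    # v^n parameters -- too large to compute directly, so use log10.
    log10_generic = n * math.log10(v)

    # The "combinatorial swindle" needs only about v*n parameters
    # (times some constant factor).
    swindle = v * n

    print(f"generic function:     ~10^{log10_generic:,.0f} parameters")
    print(f"combinatorial swindle: {swindle:,} parameters "
          f"(~10^{math.log10(swindle):.1f})")

Running this gives roughly 10^2,400,000 parameters for the generic function versus about 2.6×10^8 for the swindle, which shows just how much the multiplicative structure buys you.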