Friday, June 26, 2009

Netflix prize is (nearly) awarded! A model in crowdsourcing

The Netflix $1,000,000 prize, offered to the first team able to improve the accuracy of its recommendation system by 10%, is now nearly certain to be awarded to BellKor's Pragmatic Chaos, a super-team of four teams that today claimed to have reached the 10.5% mark.


Tracking the Netflix prize has been fascinating. Back in November 2008, one of the leaders, who at the time was 8.8% better than Netflix's own Cinematch, estimated that the movie Napoleon Dynamite alone accounted for 15% of his remaining error rate. Why? Because people either love it or hate it: it has received almost exclusively 1- or 5-star ratings, and it is nearly impossible to predict whether someone will like it based on their rating history.
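A toy calculation shows why a polarizing film hurts so much. The ratings below are made up for illustration, but they capture the point: when a film's ratings are split between 1s and 5s, even the best single prediction (the average) is off by about 2 stars, while a consensus film can be predicted within a fraction of a star.

```python
import statistics

# Hypothetical ratings (illustrative only):
# a polarizing film rated almost exclusively 1 or 5,
# vs. a consensus film most viewers rate around 4.
polarizing = [1, 5, 1, 5, 5, 1, 5, 1]
consensus = [4, 4, 3, 4, 5, 4, 4, 4]

def rmse_of_mean(ratings):
    """RMSE when predicting every viewer's rating with the film's average."""
    mean = statistics.fmean(ratings)
    return (sum((r - mean) ** 2 for r in ratings) / len(ratings)) ** 0.5

print(rmse_of_mean(polarizing))  # 2.0 stars of error
print(rmse_of_mean(consensus))   # 0.5 stars of error
```

Since the prize was scored on RMSE across all predictions, a handful of such love-it-or-hate-it titles can dominate a competitor's remaining error.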

Netflix was right to open the competition to the public. It inspired teams around the country -- from AT&T engineers to father-son duos -- and the 10% improvement will be worth far more than $1M to the company.

This is one of the best examples of crowdsourced innovation and problem-solving out there. The winning team began as four disparate pairs and individuals who realized they had complementary skills -- machine learning, computer science, engineering -- and decided to collaborate. Throughout the competition, as the leaderboard tracked the top performers, teams routinely shared lessons learned. Even with $1M at stake, the market can indeed be collaborative and arrive at a better solution than a single company working alone.




Reader Comments (2)

Strange that all the emphasis is on the techies and Netflix and not the customer. From my limited and highly biased point of view (although I was a customer at one point), this is an exercise in futility that seems to run rampant in the tech business.

Napoleon Dynamite is the solution, not the problem. Simply give me a Napoleon Dynamite type film every time and I can guarantee you that more than 50% of the time, I will love the film.

Given Netflix's current formula, adding a pitiful 10+% would still give them about an 88% error rate in my case.

Perhaps if they took a look at string theory or chaos theory or even talked to their customers they might come up with a better formula.

June 29, 2009 | Unregistered CommenterEcogordo - twitter

Ecogordo-

Part of the Napoleon Dynamite "problem" is what exactly IS a "Napoleon Dynamite type film"? The movie seems to defy typecasting. On the other hand, some movies ARE a lot like others (I sense this is part of your frustration). I've found the Amazon book recommendation system to be remarkably useful. They have 100% of my book buying business because of it -- because the system knows my tastes.

By putting the algorithm out to the public, I would say that they were "talking" to their customers. I would love to see how you apply string theory to recommendation engines. Let us know if you do!

June 29, 2009 | Registered CommenterMelody