Michael Wu, Ph.D. is Lithium's Principal Scientist of Analytics, digging into the complex dynamics of social interaction and group behavior in online communities and social networks.
Michael was voted a 2010 Influential Leader by CRM Magazine for his work on predictive social analytics and its application to Social CRM. He's a regular blogger on the Lithosphere's Building Community blog and previously wrote in the Analytic Science blog. You can follow him on Twitter or Google+.
The recent business success of social games has put the spotlight firmly on this subject. The hype often blinds people to the fact that gamification is in reality very hard to get right. Behind every successful gamification effort, there are probably hundreds that fail. In fact, if we view our lives in the context of a big game, then school and work are great examples of failed gamification that produced many bored students and dispassionate employees. Today, I’d like to talk about an unintended side effect of gamification that could undermine its success.
Justice for All is Essential
Commercial applications of gamification often use material rewards or cash prizes as motivators to drive certain consumer actions. Because these motivators are tangible, visible and often highly desirable, people often fixate on these rewards and prizes. One of the side effects of people getting too focused on the rewards is that they may start gaming the gamification system. Two examples come to mind:
On Foursquare, I know quite a few people who will check in to any shop or restaurant they happen to walk by without actually visiting it. They are checking in to Foursquare for the points and badges rather than the serendipity of finding friends in the vicinity.
Some online communities reward their members with points when they post content. And some members will game the system by posting random junk that is useless to the community. Again, these members are posting for the sake of points rather than contributing value to the community.
According to Prof. Byron Reeves, author of Total Engagement and renowned game researcher at Stanford University, “competition under explicitly enforced rules” is a critical ingredient of a good game. When “cheaters” game the system they break these rules and make the game less enjoyable for everyone else, because the game is no longer fair. It is important to realize, then, that it only takes a small fraction of cheaters to ruin the experience for the majority of players. When that happens, people will eventually stop participating, and the whole gamification strategy fails. Not only is this a waste of money and time, it can create a detrimental backlash that may be very difficult to repair or overcome.
Defeat the Cheats
So how can we prevent cheaters from gaming the system? The answer to this question is both good and bad news. The bad news is that you can’t stop cheating. No system, regardless of its sophistication, is completely immune to being gamed. For example, despite all of Google’s effort to design a fair PageRank algorithm that is resistant to gaming (e.g. link farming, link-baiting, etc.), it can still be gamed. It’s just disguised under a different name (i.e. search engine optimization, SEO).
We can certainly follow Google’s approach and continue to tackle this problem like engineers, by making the algorithms harder to cheat. This works, but ultimately you will reach a point of diminishing returns.
Since cheating is a human problem, we can also tackle it like psychologists or economists. That is the good news! It turns out that we don’t need to make the gamification system completely bulletproof against gaming. We just need to make it hard enough. So, how hard is enough? Strictly from an economics perspective, we only need to make the system robust enough that the effort required to game it is greater than the perceived value that people can gain from gaming it. Naturally, most people wouldn’t bother spending the time and effort to game the system, because the reward they would get is far outweighed by their effort. In theory they could still game the system if they wanted to; most people just wouldn’t.
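This cost–benefit threshold can be sketched as a one-line decision rule. The function and the effort/value numbers below are purely hypothetical, for illustration; real systems would have to estimate these quantities empirically:

```python
def cheating_is_rational(effort_to_cheat, perceived_reward_value):
    """A rational actor games the system only when the perceived
    value of the reward exceeds the effort required to cheat."""
    return perceived_reward_value > effort_to_cheat

# Hypothetical numbers: if faking dozens of check-ins takes real work
# for a badge of little perceived value, cheating stops paying off.
print(cheating_is_rational(effort_to_cheat=10, perceived_reward_value=2))   # False: not worth it
print(cheating_is_rational(effort_to_cheat=1, perceived_reward_value=20))   # True: system will be gamed
```

The point of the sketch is that you only need to move one side of the inequality, not eliminate cheating outright.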
This is actually harder to achieve than it sounds, but there are two ways to do this:
Decrease the perceived value of the reward
Increase the effort required to game the system
Lowering the perceived value of the rewards is easier, but you can’t make the perceived value too low. Otherwise people would not be motivated enough to do the gamified task in the first place. The real challenge is finding the right balance between these two levers.
Since I promised to try to keep my posts shorter, I will stop now and we’ll talk about how to control these two levers next time. For now, let me summarize what we’ve learned:
An effective gamification scheme must be fair and relatively immune to gaming (i.e. cheating), but gaming of commercial gamification seems inevitable, especially when the rewards are big
It is nearly impossible to make a gamification system completely immune to gaming, but we don’t need a bulletproof gamification scheme to stop the cheaters
We only need to make the gamification system hard enough to cheat that the reward is not worth the cheaters’ efforts
There are two ways to do this. We can either (a) decrease the perceived value of the reward, or (b) increase the effort required to game the system
Next time, I will show you practical things that you can do to lower the perceived value of reward and metrics that you can use to make cheating harder.
Finally, I will be on the road again for Lithium’s “Likes to Loves” World Tour in Amsterdam and London in early October. And prior to that, I will be taking a one-week vacation around the Netherlands during the last week of September. So my post frequency may drop a bit while I am traveling. But I will still respond if you comment. :-) If you are in the Netherlands or the UK, drop me a line; maybe we can meet up for a bike ride. In the meantime, I welcome any discussion. Stay tuned and see you next time.