Back in 2006, I took the undergraduate course Individual and Organizational Behaviour, offered to students in the commerce programme at the University of Toronto. One of the highest-quality lectures I have ever attended was given in this class by a guest speaker, a professor at the Rotman School of Management. During this lecture, students were put into groups and pitted against each other in a friendly competition. We faced off in what seemed to be a tragedy of the commons scenario, and our gameplay was analysed by the speaker. However, I argue that what we were playing was not actually a tragedy of the commons scenario and that the analysis given in class was incorrect. Apparently, what I thought was extremely obvious was not. Or perhaps I’m just wrong; I’m no game theoretician.
In Part I of this blog post, I describe the game scenario. In Part II, I will describe the professor’s analysis followed by my take in Part III. In Part IV, I will ruminate about how this might apply to some contemporary social issues.
Note: I have forgotten some of the details of the game, such as the exact payoff ratios and the number of turns, but the mechanics described below are accurate enough for the purposes of discussion.
The Game Scenario
In the game, lasting 20 turns, each of four players starts off with no money. Each turn, every player receives $100 and can choose either to put the $100 into a group pool (“cooperate”) or to keep it (“defect”). Money in the pool accrues 200% interest at the end of the turn, and the pool is then distributed equally among all four players, defectors included. For example, if one player defects, the pool contains $900 (the $300 contributed plus $600 in interest). The defector receives $100 + $225 (a quarter of the pool), while each cooperator receives $225. If only one person cooperates, the pool is $300 after interest (the $100 contributed plus $200 in interest). Each of the three defectors receives $100 + $75, while the lone cooperator, who contributed $100, gets back only $75.
This yields the following payoff matrix (change in money at the end of the turn):
| # of defectors | Each defector | Each cooperator |
|---|---|---|
| 0 | N/A | +$300 |
| 1 | +$325 | +$225 |
| 2 | +$250 | +$150 |
| 3 | +$175 | +$75 |
| 4 | +$100 | N/A |
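If you want to sanity-check the table or play with a different interest rate, here is a minimal Python sketch of the payoff rule as I have described it; the constants and variable names are my own, not the professor’s.

```python
# Reproduce the payoff table from the stated rules: 4 players, $100 each per
# turn, the pooled money earns 200% interest and is split equally among all 4.
PLAYERS = 4
STIPEND = 100    # dollars handed to each player at the start of a turn
INTEREST = 2.0   # 200% interest on the pooled contributions

for defectors in range(PLAYERS + 1):
    cooperators = PLAYERS - defectors
    pool = cooperators * STIPEND * (1 + INTEREST)  # contributions plus interest
    share = pool / PLAYERS                         # everyone gets an equal cut
    defector_gain = f"+${STIPEND + share:.0f}" if defectors else "N/A"
    cooperator_gain = f"+${share:.0f}" if cooperators else "N/A"
    print(f"{defectors} defectors: defector {defector_gain}, cooperator {cooperator_gain}")
```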
At the end of the 20 turns, the professor converted each game dollar into a ticket for a draw for one of three prizes of real money. It matters little, but I think the amounts were one $100 prize and two $50 prizes. In any case, we will consider the optimal strategy to be the one which maximizes one’s chances of winning the real money.
During the class, each of the 100 or so students enrolled in the course was seated at a computer terminal and faced off against three other players for one game (20 turns); only each player’s actions were relayed to the others, and no other communication with them was permitted. We were told that the other players were either students from our class or students receiving the same lecture at the University of Waterloo. ((At the end of the game, it was revealed that we had faced other students in the room and computer “bots” with various pre-programmed playing strategies such as “always cooperate”. That is, there were no students at the University of Waterloo.))
Before we continue, what strategy do you think the professor proposed as optimal? What do you think the optimal strategy is? ((You might like to read about the concept of superrationality that Douglas Hofstadter puts forth, though I can’t claim any knowledge of it. When I discussed my take with the professor immediately after the lecture, we both assumed the other players were rational, not superrational (the word didn’t even come up), though perhaps I’m too optimistic about the reasoning capabilities of the average rational human being.)) Stay tuned for Part II of this four-part series to hear the professor’s take.
Comments

Jonathan, I think this game should be forbidden because it is unethical. There’s absolutely nothing to learn from it for real life, because this game is a “negative-sum” game – yes, even worse than a zero-sum game! – while real life is a positive-sum game.
Playing this game often enough will make everybody always defect – except for some dreamers. Even when you take the global draw-ticket view, it doesn’t really become cooperative. Imagine other people play like you and get $3000 instead of $2000. Then the other half of the people will get $5000, and your chances of winning will be less than if everybody had been defecting.
Do you know why those computer players are in the game? Because otherwise everybody would always be defecting after the second or third game.
I bet you could use this game to document cultural differences: in the US and among young people in China, defector rates will probably be much higher than for Japanese people or old people in China. That’s because Americans view the world as this kind of game! And as the game spreads, societies’ morals go down. Continue the game and soon there won’t be much of anything but crime and the rule of the strongest.
————————————————————
Of course, my last words have been exaggerated a bit, but it is clearly and scientifically proven that this kind of study has a strong effect on subjects’ (and students’) mental models, and those mental models in turn influence their behavior.
Actually, Robert, this is a positive-sum game. Each player receives $100 per turn, and the pool only ever adds money, so there is no way the players can, collectively, end up with less than $100 per player per turn on average. W.r.t. the 5:3 split: the tickets earned in a game range from 2000 per player (worst case, everyone defects) to 6000 per player (best case, everyone cooperates). Suppose there is exactly one game other than yours being played. Suppose you are one of two cooperators at your table and everyone defected at the other table. You take home 3000 of the 24000 tickets in play, or 1/8 of the tickets. Since there are 8 players in total, you’re no worse off than if everyone had defected. If there are two other tables that always defect, you’d have 3000/32000 tickets, a little over a 9% chance of winning; with 12 players in total, that beats the 1/12 baseline, so you have an edge!
Suppose instead the other table is full of cooperators. In the first case (you are one of two cooperators at your table), you’d have 3000/40000 tickets, a 7.5% chance of winning. This is better than if your whole table had defected: you’d have 2000/32000 tickets, a 6.25% chance of winning.
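To make the arithmetic easy to check (and to tweak for other scenarios), here is a small back-of-the-envelope Python sketch of my own; it simply treats a player’s chance of winning as their fraction of all tickets in the draw.

```python
# Per-player totals over 20 turns: $2000 if everyone at a table defects, $6000
# if everyone cooperates, and $3000 (cooperator) vs $5000 (defector) at a table
# with two of each.

def win_chance(my_tickets, my_table_total, other_table_totals):
    """My fraction of all draw tickets across every table in play."""
    return my_tickets / (my_table_total + sum(other_table_totals))

# I cooperate at a 2-cooperator / 2-defector table: 3000 of its 16000 tickets are mine.
mine, my_table = 3000, 2 * 3000 + 2 * 5000

print(win_chance(mine, my_table, [4 * 2000]))            # one all-defect table:    0.125 (= 1/8)
print(win_chance(mine, my_table, [4 * 2000, 4 * 2000]))  # two all-defect tables:   0.09375 (> 1/12)
print(win_chance(mine, my_table, [4 * 6000]))            # one all-cooperate table: 0.075
print(win_chance(2000, 4 * 2000, [4 * 6000]))            # my table all defects:    0.0625
```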