**Tit for Tat**

This strategy depends on four conditions that have allowed it to become the most prevalent strategy for the prisoner's dilemma:

- Unless provoked, the agent will always cooperate
- If provoked, the agent will retaliate
- The agent is quick to forgive
- The agent must have a good chance of competing against the opponent more than once

In the last condition, the definition of "good chance" depends on the payoff matrix of the prisoner's dilemma. The important thing is that the competition continues long enough for repeated punishment and forgiveness to generate a long-term payoff higher than the possible loss from cooperating initially.

A fifth condition applies to make the competition meaningful: if an agent knows that the next play will be the last, it should naturally defect for a higher score. Similarly, if it knows that the next two plays will be the last, it should defect twice, and so on. Therefore the number of competitions must not be known to the agents in advance.
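Taken together, the conditions above reduce to a very short decision rule. A minimal sketch in Python (the function name and the "C"/"D" move encoding are illustrative assumptions, not from any standard library):

```python
def tit_for_tat(my_history, opponent_history):
    """Cooperate first; thereafter mirror the opponent's previous move."""
    if not opponent_history:        # unprovoked, so cooperate
        return "C"
    # Retaliate after a defection, forgive as soon as the opponent cooperates.
    return opponent_history[-1]
```

Note how the four conditions map directly onto the code: the first branch is "unless provoked, cooperate", and the single mirroring line implements both retaliation and quick forgiveness.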

Against a variety of alternative strategies, tit for tat was the most effective, winning several annual automated tournaments against (generally far more complex) strategies created by teams of computer scientists, economists, and psychologists. Game theorists informally believed the strategy to be optimal, although no proof was presented.

It is important to note that tit for tat remains the most effective strategy when the average performance of each competing team is compared. The team that recently beat a pure tit-for-tat team outperformed it only with some of its algorithms, because it submitted multiple algorithms that recognize each other and assume a master-and-slave relationship: one algorithm "sacrifices" itself, accepting a very poor result so that another can outperform tit for tat on an individual basis, though not as a pair or group. Still, this "group" victory illustrates an important limitation of the prisoner's dilemma in representing social reality, namely that it includes no natural equivalent of friendship or alliances. The advantage of tit for tat thus pertains only to a Hobbesian world of rational solutions, not to a world in which humans are inherently social.^{[citation needed]} However, the fact that this exploit does not work against groups of agents all running tit for tat illustrates the strategy's strength when employed in a team: when every agent cooperates, the team does better overall, and every agent on it also does well individually.

## Example of play

| | Cooperate | Defect |
|---|---|---|
| **Cooperate** | 3, 3 | 0, 5 |
| **Defect** | 5, 0 | 1, 1 |

*Prisoner's dilemma example (row player's score listed first)*

Assume there are four agents: two use the Tit for Tat strategy, and two are "defectors" who will simply try to maximize their own winnings by always giving evidence against the other. Assume that each player faces the other three over a series of six games. If one player gives evidence against a player who does not, the former gains 5 points and the latter nets 0. If both refrain from giving evidence, both gain 3 points. If both give evidence against each other, both gain 1 point.

When a tit-for-tat agent faces off against a defector, the former refrains from giving evidence in the first game while the defector does the opposite, gaining 5 points. In the remaining five games, both players give evidence against each other, netting 1 point each per game. The defector scores a total of 10 points, and the tit-for-tat agent scores 5.

When the tit-for-tat agents face off against each other, each refrains from giving evidence in all six games. Both agents win 3 points per game, for a total of 18 points each.

When the defectors face off, each gives evidence against the other in all six games. Both defectors win 1 point per game, for a total of 6 points each.

Across its three matches, each tit-for-tat agent scores a total of 28 points (18 + 5 + 5). Each defector scores only 26 points (6 + 10 + 10).

Although the tit-for-tat agents never won a match and the defectors never lost one, the tit-for-tat strategy still came out ahead, because the final standing is determined not by the number of match wins but by the total points scored. Simply put, the tit-for-tat agents gained more points tying with each other than they lost to the defectors.

The more tit-for-tat agents there are in the described game, the more advantageous the tit-for-tat strategy becomes.
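The whole four-agent example can be checked with a small round-robin simulation (a hypothetical sketch: the payoff table matches the matrix above, and all function names are illustrative):

```python
# Payoffs from the example matrix: (row player's score, column player's score).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(own, opp):
    return "C" if not opp else opp[-1]   # cooperate first, then mirror

def defector(own, opp):
    return "D"                           # always give evidence

def play_match(s1, s2, rounds=6):
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = s1(h1, h2), s2(h2, h1)
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        score1 += p1; score2 += p2
    return score1, score2

# Two tit-for-tat agents, two defectors; each faces the other three.
agents = [tit_for_tat, tit_for_tat, defector, defector]
totals = [0, 0, 0, 0]
for i in range(4):
    for j in range(i + 1, 4):
        si, sj = play_match(agents[i], agents[j])
        totals[i] += si
        totals[j] += sj
print(totals)  # each tit-for-tat agent totals 28, each defector 26
```

Running it reproduces the arithmetic of the example: the defectors win every head-to-head match yet finish the tournament behind.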

## Implications

The success of the strategy, which is largely cooperative, took many by surprise. In successive competitions various teams produced complex strategies which attempted to "cheat" in a variety of cunning ways, but Tit for Tat eventually prevailed in every competition.

Some theorists believe this result may give insight into how groups of animals (and particularly human societies) have come to live in largely (or entirely) cooperative societies, rather than the individualistic "red in tooth and claw" way that might be expected from individuals engaged in a Hobbesian state of nature. This, and particularly its application to human society and politics, is the subject of Robert Axelrod's book *The Evolution of Cooperation*.

## Problems

While Axelrod has empirically shown that the strategy is optimal in some cases, two agents playing tit for tat remain vulnerable. A one-time, single-bit error in either player's interpretation of events can lead to an unending "death spiral". In this symmetric situation, each side perceives itself as preferring to cooperate, if only the other side would. But each is forced by the strategy into repeatedly punishing an opponent who continues to attack despite being punished in every game cycle. Both sides come to think of themselves as innocent and acting in self-defense, and their opponent as either evil or too stupid to learn to cooperate.

This situation frequently arises in real world conflicts, ranging from schoolyard fights to civil and regional wars. Tit for two tats could be used to avoid this problem.^{[citation needed]}
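Tit for two tats, mentioned above, retaliates only after two consecutive defections, so a single stray defection (or misread move) is absorbed rather than echoed indefinitely. A minimal sketch (illustrative names, same "C"/"D" encoding as before):

```python
def tit_for_two_tats(opponent_history):
    """Defect only after two consecutive opponent defections."""
    if len(opponent_history) >= 2 and opponent_history[-2:] == ["D", "D"]:
        return "D"
    return "C"   # a single defection is forgiven, preventing a death spiral
```

Against a plain tit-for-tat partner, one accidental defection is answered with cooperation, so both agents return immediately to mutual cooperation instead of entering the alternating cycle.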

"Tit for Tat with forgiveness" is sometimes superior. When the opponent defects, the player will occasionally cooperate on the next move anyway. This allows for recovery from getting trapped in a cycle of defections. The exact probability that a player will respond with cooperation depends on the line-up of opponents.

The reason for these issues is that tit for tat is not a subgame perfect equilibrium.^{[1]} If one agent defects and the opponent cooperates, then both agents will end up alternating cooperate and defect, yielding a lower payoff than if both agents were to continually cooperate. While this subgame is not directly reachable by two agents playing tit for tat strategies, a strategy must be a Nash equilibrium in all subgames to be subgame perfect. Further, this subgame may be reached if any noise is allowed in the agents' signaling. A subgame perfect variant of tit for tat known as "contrite tit for tat" may be created by employing a basic reputation mechanism.^{[2]}
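The payoff loss in that alternating subgame can be verified with a few lines of arithmetic using the example matrix above (an illustration, not a formal proof):

```python
# After a single unilateral defection, two tit-for-tat agents alternate:
# each round one plays D (scoring 5) while the other plays C (scoring 0),
# and the roles swap next round.  Each agent thus averages 2.5 points per
# game, versus 3 per game under unbroken mutual cooperation.
rounds = 10
alternating = sum((5, 0)[i % 2] for i in range(rounds))  # one agent's score
cooperating = 3 * rounds
print(alternating, cooperating)  # 25 vs 30
```

Because both agents would prefer the all-cooperate path from that point on, the alternating continuation is not a Nash equilibrium of the subgame, which is exactly why plain tit for tat fails subgame perfection.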
