When can we incentivize a group of selfish individuals to take cooperative actions? Can cooperation be sustained when people interact with different partners over time?
In communities with relatively few players ... players can cooperate even when they have no information about others’ identities and histories. In communities that consist of a large number of players (e.g., a continuum of players), sustaining cooperation requires players to have some information about their partners’ histories...
...when players can selectively include signals into their records [e.g., delete their posts], the maximal level of cooperation a community can sustain is not monotone with respect to the expected lifespans of its members and that it is easier to sustain cooperation when players’ actions are substitutes.
Emphasis is mine.
Corollaries
  • Deleting SN posts might have different consequences as territories get more populated.
  • Open-source projects with larger contributor bases might operate differently when contributors keep long contribution histories (as opposed to starting fresh under new pseudonyms).
It's an interesting theory. I would be far more convinced if the author had also run an experiment and validated the math against real observed behavior.
Observed behavior can diverge sharply from theory in cases like this, where people bring their own moral frameworks.
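For what it's worth, below is the kind of toy simulation one could start from: random pairwise matching into a prisoner's dilemma, a public record of defection signals per player, and a chance for defectors to delete the bad signal (the "selective record-keeping" from the quote). The payoffs, matching protocol, and mechanical non-equilibrium strategies are my own assumptions, so this probes the intuition rather than the paper's theorem:

```python
"""Toy simulation of record-based cooperation with selective record
manipulation. Payoff structure, matching, and strategies are assumed
for illustration -- nothing here computes an equilibrium."""
import random

def run(n_players=200, periods=2000, survive=0.95,
        delete_prob=0.5, cheat_prob=0.02, seed=0):
    """Each period players are matched in random pairs for a prisoner's
    dilemma. A player's public record counts defection signals partners
    filed about them. Assumed strategy: cooperate iff the partner's
    record is clean, except a `cheat_prob` chance of defecting anyway.
    An unjustified defection generates a signal, which the defector
    deletes with probability `delete_prob` (selective record-keeping).
    Returns the realized cooperation rate."""
    rng = random.Random(seed)
    records = [0] * n_players  # surviving bad signals per player
    coop = total = 0

    for _ in range(periods):
        order = list(range(n_players))
        rng.shuffle(order)
        for i in range(0, n_players - 1, 2):
            a, b = order[i], order[i + 1]
            for me, partner in ((a, b), (b, a)):
                clean = records[partner] == 0
                # Punish dirty records; occasionally cheat a clean partner.
                defect = (not clean) or rng.random() < cheat_prob
                coop += not defect
                total += 1
                # Only unjustified defection (against a clean partner)
                # creates a bad signal -- and it may get deleted.
                if defect and clean and rng.random() >= delete_prob:
                    records[me] += 1
        # Each player dies w.p. 1 - survive and is replaced by a
        # newcomer with an empty record.
        for p in range(n_players):
            if rng.random() > survive:
                records[p] = 0
    return coop / total

# Sweep expected lifespan 1/(1 - survive), the axis along which the
# quoted non-monotonicity is claimed.
for survive in (0.50, 0.80, 0.90, 0.95, 0.99):
    print(f"survive={survive:.2f}  coop rate={run(survive=survive):.3f}")
```

Sweeping `survive` varies the expected lifespan, but since the agents here follow fixed rules rather than best-responding, an equilibrium analysis (or an actual experiment) would still be needed to test the claim properly.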