Why We're Smart
Most people believe humans evolved intelligence because using tools was an advantage. However, I believe tool use was secondary. Group cooperation was the primary advantage conferred by intelligence. You see, cooperation is fundamentally difficult.
This insight coalesced when I was reading about Mark Satterthwaite, an economist at Northwestern’s Kellogg School of Management. He’s famous for two important impossibility theorems: (1) the Myerson-Satterthwaite Theorem and (2) the Gibbard-Satterthwaite Theorem.
Informally, (1) says that there is no bargaining mechanism that can guarantee a buyer and seller will trade if there are potential gains from trade, while (2) says that there is no voting mechanism for determining a single winner that can induce people to vote their true preferences. In both cases, the reason for the impossibility is that people have incentives to hide their actual values to achieve a strategic advantage.
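To make the incentive to misreport concrete, here’s a minimal sketch of the misreporting problem behind (1). The numbers are my own toy values, not from the theorem itself; the rule is the standard split-the-difference double auction, where trade happens only if the buyer’s reported value meets the seller’s reported cost:

```python
def outcome(buyer_report, seller_report, buyer_value=10.0, seller_cost=6.0):
    """Split-the-difference double auction (toy numbers, for illustration).

    Trade occurs only if the buyer's report is at least the seller's report,
    at a price halfway between the two reports. Returns (buyer_profit,
    seller_profit) given the players' TRUE value and cost.
    """
    if buyer_report < seller_report:
        return (0.0, 0.0)  # no trade, even if real gains from trade exist
    price = (buyer_report + seller_report) / 2
    return (buyer_value - price, price - seller_cost)

# Truthful reports: trade at price 8, buyer and seller each profit.
print(outcome(10.0, 6.0))  # (2.0, 2.0)

# The buyer gains by under-reporting his value: the price drops to 7.
print(outcome(8.0, 6.0))   # (3.0, 1.0)

# But if both sides shade strategically, trade can fail entirely,
# destroying the 4 units of real surplus on the table.
print(outcome(7.0, 7.5))   # (0.0, 0.0)
```

Each party’s shading is individually rational, yet it is exactly what makes the guaranteed-efficient mechanism impossible.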
Add these to the Prisoner’s Dilemma and Arrow’s Impossibility Theorem on the list of fundamental barriers to cooperation (Holmström’s Theorem is another good one; it explains why you can’t get everyone in a firm to exert maximum effort). By “fundamental”, I mean there is no general solution. So the evolutionary process cannot just discover a mechanism that guarantees cooperation when it is efficient. There will always be an opportunity for individuals to subvert the cooperative process to promote themselves, thus creating selection pressure against the cooperation mechanism.
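The Prisoner’s Dilemma makes the same point in four numbers. A quick check with the usual textbook payoffs (my choice of the standard 5/3/1/0 values) confirms that defection dominates even though mutual cooperation pays more:

```python
# Row player's payoffs in the Prisoner's Dilemma (standard textbook values).
# "C" = cooperate, "D" = defect.
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # sucker's payoff
    ("D", "C"): 5,  # temptation to defect
    ("D", "D"): 1,  # mutual defection
}

# Whatever the other player does, defecting pays strictly more...
for other in ("C", "D"):
    assert PAYOFF[("D", other)] > PAYOFF[("C", other)]

# ...yet mutual defection leaves both players worse off than mutual cooperation.
assert PAYOFF[("C", "C")] > PAYOFF[("D", "D")]
print("Defection dominates, but mutual cooperation is better for both.")
```

No rearrangement of incentives within the game escapes this: the selection pressure toward defection is built into the payoff structure itself.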
(Note that there is a hack: make sure each individual has the same genes. This is how multicellular and hive organisms get around the problem. But the existence of cancer in the former case and the reduced genetic diversity in the latter case make them limited solutions.)
To achieve extensive cooperation in large groups, individuals need the ability to model the strategic situation, estimate the payoffs to various group members, and continuously assess what strategies other members may be playing. On top of that, there’s an arms race between deceiving and detecting deception. It’s the old, “I know that you know that I know…” schtick. The smarter you are, the further you can compute this series.
Bottom line: the impossibility theorems mean the only way to achieve cooperation is to have the machinery in place to make detailed case-by-case determinations. We’ve talked about the Dunbar Number before: the maximum size of primate groups is determined in large part by a species’ average neocortical volume. I claim you need to be smarter to process more complex strategic configurations and maintain models of more individuals’ goals.
If I’m right, there are two interesting implications. First, politics will be with us forever. No magical technology or philosophical enlightenment will eliminate it. Second, if we ever encounter intelligent aliens, they’ll have politics too. Nothing else about them may be recognizable, but they’ll have analogs of haggling over price and building political coalitions.