Eric van Damme
Stability and Perfection of Nash Equilibria
With 105 Figures
Springer-Verlag Berlin Heidelberg New York London Paris Tokyo

Prof. Dr. Eric van Damme, Department of Economics, University of Bonn, Adenauerallee 24-42, D-5300 Bonn 1, FRG

ISBN-13: 978-3-642-96980-5
e-ISBN-13: 978-3-642-96978-2
DOI: 10.1007/978-3-642-96978-2

Library of Congress Cataloging-in-Publication Data. Damme, Eric van. Stability and perfection of Nash equilibria / Eric van Damme. p. cm. Bibliography: p. Includes index. 1. Game theory. 2. Equilibrium (Economics). I. Title. II. Title: Nash equilibria. HB144.D36 1987 339.5--dc19 87-27292

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in other ways, and storage in data banks. Duplication of this publication or parts thereof is only permitted under the provisions of the German Copyright Law of September 9, 1965, in its version of June 24, 1985, and a copyright fee must always be paid. Violations fall under the prosecution act of the German Copyright Law.

© Springer-Verlag Berlin Heidelberg 1987
Softcover reprint of the hardcover 1st edition 1987

The use of registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Typesetting: With a system of the Springer Produktions-Gesellschaft, Berlin. Data conversion: Brühlsche Universitätsdruckerei, Giessen. Printing: Saladruck, Berlin. Bookbinding: Lüderitz & Bauer, Berlin.
2142/3020-543210

To Jeroen and Jessica

The rules of rational behavior must provide definitely for the possibility of irrational conduct on the part of others. In other words: Imagine that we have discovered a set of rules for all participants - to be termed as "optimal" or "rational" - each of which is indeed optimal provided that the other participants conform. Then the question remains as to what will happen if some of the participants do not conform. If that should turn out to be advantageous for them - and, quite particularly, disadvantageous to the conformists - then the above "solution" would seem very questionable. We are in no position to give a positive discussion of these things as yet - but we want to make it clear that under such conditions the "solution," or at least its motivation, must be considered as imperfect and incomplete. In whatever way we formulate the guiding principles and the objective justification of "rational behavior," provisos will have to be made for every possible conduct of "the others." Only in this way can a satisfactory and exhaustive theory be developed. But if the superiority of "rational behavior" over any other kind is to be established, then its description must include rules of conduct for all conceivable situations - including those where "the others" behaved irrationally, in the sense of the standards which the theory will set for them.

John von Neumann and Oskar Morgenstern, Theory of Games and Economic Behavior, Princeton University Press, Princeton, N.J. (2nd ed. 1947, p. 32)

Preface

The last decade has seen a steady increase in the application of concepts from noncooperative game theory to such diverse fields as economics, political science, law, operations research, biology and social psychology.
As a byproduct of this increased activity, there has been a growing awareness that the basic noncooperative solution concept, that of Nash equilibrium, suffers from severe drawbacks. The two main shortcomings of this concept are the following: (i) in extensive form games, a Nash strategy may prescribe behavior off the equilibrium path that is manifestly irrational (specifically, Nash equilibria may involve incredible threats); (ii) Nash equilibria need not be robust with respect to small perturbations in the data of the game. Confronted with the growing evidence against the Nash concept, game theorists were prompted to search for more refined equilibrium notions with better properties, and they have come up with a wide array of alternative solution concepts.

This book surveys the most important refinements that have been introduced. Its objectives are fourfold: (i) to illustrate desirable properties as well as drawbacks of the various equilibrium notions by means of simple specific examples, (ii) to study the relationships between the various refinements, (iii) to derive simplifying characterizations, and (iv) to discuss the plausibility of the assumptions underlying the concepts.

The book is addressed primarily to researchers who want to apply game theory, but who do not know their way through the myriad of noncooperative solution concepts. It can also be used as the basis for an advanced course in game theory at the graduate level. It will be successful if it enables the reader to separate the wheat from the chaff and if it can direct him to the concepts that are really innovative.

Acknowledgements. Many colleagues and friends helped to shape my thinking on these topics in the last several years. I am sure that, had I listened to their invaluable advice more carefully, I would have understood the subject better and I would have written a better book. Especially I want to express my thanks to Stef Tijs, Jaap Wessels, Jan van der Wal, Reinhard Selten, Werner Güth, Roger Myerson and Ehud Kalai for their help at crucial stages of the development. Of course, the views expressed are not necessarily theirs, and I alone am responsible for errors and mistakes.

Sincere appreciation is also extended to my wife Suzan, Ellen Jansen, Lieneke Lekx, Netty Zuidervaart and Rolf-Peter Schneider, who shared the effort of typing the several versions of the manuscript. A special thanks also goes to the editorial staff of Springer-Verlag, especially to Werner Müller, for exerting just sufficient pressure to get this book finished. Finally, I thank Jeroen and Suzan for their patience, understanding and encouragement while the book was being written.

Bonn, July 1987
Eric E. C. van Damme

Organization

Chapters 1-6 of this book are virtually identical to the monograph "Refinements of the Nash equilibrium concept" that I published with Springer-Verlag in 1983. Werner Müller, the economics editor of Springer, kindly asked me to extend that monograph with some chapters illustrating the various equilibrium concepts in specific examples. I was pleased to honor that request and I gladly took this opportunity to clear up some inaccuracies and to include some recent developments.

This book consists of four parts: Part 1 (Chap. 1) provides a general introduction. It is argued that the solution of a noncooperative game should be a Nash equilibrium, but that not every Nash equilibrium is eligible for the solution.
Various refinements of the Nash concept are introduced informally and simple examples are used to illustrate these concepts.

Part 2 (Chaps. 2-5) deals with normal form games. A great variety of refined equilibrium concepts is introduced, and relationships between these refinements are derived, as well as characterizations of several of them. For a quick overview, the reader should consult the Survey Diagrams 1 and 2 at the end of the book. A main result, however, is that for normal form games there is actually little need to refine Nash's concept, since generically all Nash equilibria satisfy all properties one could hope for.

Part 3 (Chap. 6) provides an introduction to extensive form games. Formal definitions are given and elementary properties of several concepts (such as (subgame) perfect equilibria and sequential equilibria) are derived. The main result is that a proper equilibrium of the normal form induces a sequential equilibrium in the extensive form. However, normal form properness does not eliminate all "unreasonable" equilibria.

Part 4 (Chaps. 7-10) is devoted to specific applications, illustrating the strength (resp. weakness) of the various concepts. These chapters are independent of each other, and familiarity with the basic notions from Chaps. 2 and 6 suffices to follow the discussion. The main theme of Chap. 7 is the question of how to implement concepts from cooperative game theory by noncooperative methods. The power of the subgame perfectness concept is illustrated by means of a simple fair division problem, by means of the Rubinstein bargaining model (which implements the Nash solution) and by means of the Moulin model (which implements the Kalai/Smorodinsky solution). Furthermore, Nash's bargaining model is used to illustrate the essential equilibrium concept.

In Chap. 8 we prove the Folk Theorem, which states that the set of cooperative outcomes of the one-shot game coincides with the set of noncooperative outcomes of the repeated game. This chapter shows that the subgame perfectness concept has certain drawbacks and that it is not as restrictive as one might initially think.

In Chap. 9 we turn to the biological branch of game theory. Here, the main solution concept is that of evolutionarily stable strategies (ESS), i.e. of symmetric Nash equilibrium strategies that satisfy a certain condition of neighborhood stability. The relationships between (refinements of) the ESS concept and concepts of strategic stability are studied. (An overview is given in Diagram 3.) The ESS concept is, however, very restrictive (especially in extensive games), and it is shown that to coarsen this concept one can use the same methods as those that are used to refine the Nash concept.

In Chap. 10 we return to the question of whether a game is adequately represented by its normal form, i.e. whether knowledge of the normal form is sufficient to define rational behavior. In particular, it is investigated what kind of restrictions the Kohlberg/Mertens concept of stability imposes on the beliefs in a sequential equilibrium. Special attention is paid to signalling games, for which several "intuitive criteria" for eliminating "unintuitive equilibria" are discussed (see Diagram 4 for an overview). Many examples are given to illustrate the various concepts.

Notational Conventions

We have tried to use standard notation as much as possible. The notation that is specific to normal form games (resp. extensive form games) is introduced in Sect. 2.1 (resp. 6.1).
Chapter 9 uses the notation that is conventional in the biological branch of game theory; this differs from our general notation for normal form games. At this point, we confine ourselves to introducing some terminology that is used throughout the book and that may not be completely standard.

N denotes the set of positive integers {1, 2, ...} (positive will always mean strictly greater than 0). When dealing with an n-person game we will frequently write N for {1, ..., n}. R denotes the set of real numbers and R^m is the m-dimensional Euclidean space. For x, y ∈ R^m, we write x ≤ y if x_i ≤ y_i for all i. Furthermore, x < y means x_i < y_i for all i. We write R^m_+ for the set of all x ∈ R^m which satisfy 0 ≤ x, and R^m_{++} for the set of all x ∈ R^m for which 0 < x. Euclidean distance on R^m is denoted by ρ, and λ denotes Lebesgue measure on R^m.

The set of all mappings from A to B is denoted by ℱ(A, B). If f ∈ ℱ(R^m_{++}, R^l), then "y is a limit point of f(x) as x tends to 0" is used as an abbreviation for "there exists a sequence {x(t)}_{t∈N} such that x(t) converges to 0 and f(x(t)) converges to y as t tends to infinity". If A is a subset of some Euclidean space, then conv A denotes its convex hull and 2^A denotes its power set. Let A and B be subsets of Euclidean spaces. A correspondence from A to B is an element of ℱ(A, 2^B). The correspondence F from A to B is said to be upper semi-continuous if it has a closed graph, i.e. if {(x, F(x)); x ∈ A} is closed. The number of elements of a finite set A is denoted by |A|. If A is finite and f ∈ ℱ(A, R), then f(A) := Σ_{a∈A} f(a).

Indices can occur as subscripts or superscripts. Lower indices usually refer to players; upper indices usually stem from a certain numbering. For instance, when dealing with an n-person normal form game, we write s_i^k for the probability which the mixed strategy s_i of player i assigns to the k-th pure strategy of this player. To avoid misunderstandings between exponents and indices, we will sometimes write the base of a power in brackets. Hence, (s_i^k)^2 denotes the square of s_i^k.

Definitions are indicated by using italics. The symbol := is used to define quantities. The symbol □ denotes the end of a proof.
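As a concrete instance of the last two conventions (the index notation s_i^k and the finite-sum convention f(A)), the following LaTeX fragment works out a small example; the particular mixed strategy (1/2, 1/4, 1/4) is chosen here purely for illustration and does not appear in the text, and the calligraphic F stands for the script F used above.

```latex
% Added illustration, not part of the original text: a worked instance of
% the index convention s_i^k and the finite-sum convention f(A).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Suppose player $i$ has three pure strategies and uses the mixed strategy
\[
  s_i = (s_i^1, s_i^2, s_i^3) = \left(\tfrac{1}{2}, \tfrac{1}{4}, \tfrac{1}{4}\right),
  \qquad (s_i^2)^2 = \tfrac{1}{16}.
\]
% With A = {1,2,3} and f(k) = s_i^k, the finite-sum convention gives f(A) = 1,
% since a mixed strategy's components sum to one.
With $A = \{1, 2, 3\}$ and $f \in \mathcal{F}(A, \mathbb{R})$ given by $f(k) = s_i^k$,
\[
  f(A) := \sum_{a \in A} f(a) = s_i^1 + s_i^2 + s_i^3 = 1.
\]
\end{document}
```

The first display also shows why the base of a power is written in brackets: the brackets in (s_i^2)^2 distinguish the exponent 2 from the strategy index 2.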