Research Support, School of Business Administration, August 1994
THE BLIND LEADING THE BLIND: SOCIAL INFLUENCE, FADS, AND INFORMATIONAL CASCADES
Working Paper #9405-30
David Hirshleifer, University of Michigan


August 22, 1994
Comments Welcome

The Blind Leading the Blind: Social Influence, Fads, and Informational Cascades

forthcoming in The New Economics of Human Behavior, Cambridge University Press, M. Tommasi, ed.

David Hirshleifer
School of Business Administration, University of Michigan
701 Tappan Street, Ann Arbor, MI 48109-1234
(313) 763-1169

I thank Sushil Bikhchandani, Julian Franks, Jack Hirshleifer, Susanne Lohmann, Manuel Santos, Mariano Tommasi and Ivo Welch for helpful information and comments.

Abstract The Blind Leading the Blind: Social Influence, Fads, and Informational Cascades An informational cascade occurs when it is optimal for an individual, having observed the actions of those ahead of him, to follow the behavior of the preceding person without regard to his own information. Among the phenomena that can be explained by informational cascades are conformism at specific times and places, error-prone behavior, and fragility of behaviors.

The Blind Leading the Blind: Social Influence, Fads, and Informational Cascades

1 Introduction

Why do teenagers at one school take drugs while at another they "just say no"? Why do Americans act American, Germans act German, and Indians act Indian? Why did English and American youths enthusiastically enlist to fight in World War I, whereas pacifist sentiments were more popular before World War II and in the 1960's? These are all examples of one of the most striking regularities of human society: localized conformity.

Localization in either time or place seems to cast doubt upon the idea that people make rational and intelligent choices. If illegal drug use is a bad idea, why do people participate in waves of abuse of different kinds of drugs at different times? If the sale of marijuana is illegal in America, why is it sold legally in Amsterdam drug cafes? If the Chinese know that cold drinks harm the digestion, are Americans blundering when they guzzle Coke?

Part of the explanation is that people in different places don't observe each other's behavior. I know that Germans do many things differently from me, but I don't know specifically what most of these things are. A high school student who sees his friends taking drugs may not realize that members of other cliques do not take drugs. Or even ignoring this, he may wrongly attribute these divergences to differences in the costs and benefits from taking drugs.

Rapid changes over time pose an even greater challenge. Why was cohabitation of unmarried couples in the U.S. viewed as scandalous in the 1950's, flaunted in the 1960's, and hardly noticed in the 1980's? College students who flirted with the counterculture in the 60's were succeeded by pre-MBA go-getters in the 80's. The recent collapse of communism in the Eastern Bloc was rapid and unexpected. Religious movements, revivals, and reformations, started by a few zealots, sometimes sweep across populations with astonishing rapidity.
Social attitudes toward and popularity of alcohol, cigarettes, and illegal drugs have also

fluctuated rapidly.

The main theme of this essay is that learning by observing others can explain the conformity, idiosyncrasy, and fragility of social behavior. When people can observe one another's behavior, they very often end up making the same choices; thus, localized conformity. If the early movers erred, followers are likely to imitate the mistake; hence idiosyncrasy. If later on a few people start behaving differently for whatever reason, then a sudden phase change can occur in which the old convention is swept away by the new; hence fragility. Such imitation can explain either transient fads or permanent choices among alternative products, sexual and marital options, scientific theories, and religious beliefs.

My first crucial point is that imitation can be sophisticated, being based upon a rational weighing of pros and cons. Even if the blind are leading the blind, as in Breughel's painting (Figure 1), the followers need not be fools: each individual realizes that he is somewhat ill-informed, and that his predecessors are also.

How "social influences" such as imitation can lead to instability was analyzed by Gary Becker in "A Note on Restaurant Pricing and Other Examples of Social Influences on Price," which carries a heavy punch for a paper of 8 pages.1 The puzzle that interested Becker: A popular seafood restaurant in Palo Alto, California, does not take reservations, and every day it has long queues for tables during prime hours. Almost directly across the street is another seafood restaurant with comparable food, slightly higher prices, and similar service and other amenities. Yet this restaurant has many empty seats most of the time. Why doesn't the popular restaurant raise prices, which would reduce the queue for seats but expand profits? The explanation Becker provides is that people would rather dine at the "in" restaurant where others are dining as well.
1 After catching the attention of academic economists, his paper was profiled in a respected news magazine, The Economist. By popular demand, Becker published some further thoughts in a paper called "Son of Fish Market:... A Further Note on Restaurant Pricing and Other Examples of Social Influence on Price," in honor of the restaurant that triggered his thinking on the topic.

The problem with raising prices is that

[Figure 1: Pieter Breughel, The Parable of the Blind]

if this drives away a few customers, the lines get shorter, making the restaurant less fashionable. This lower "in-ness" itself deters a few more customers. If the price is raised just a little too high, this process can feed upon itself (so to speak) until the restaurant's clientele suddenly vanishes.2 If the restaurant manager knows precisely how much customers value the "in-ness" of the establishment, he can boost the price to the maximum level that avoids unravelling. At that price there are long lines. But if the manager does not know demand perfectly, he may set the price just a tad too high, and the restaurant fails. A seller's attempt to boost revenue by raising price pushes buyers near the edge of the precipice.3 Hence, Becker's analysis explains at the same time why popular restaurants (or movies) don't just raise the price more, and why, surprisingly often, fashionable restaurants that are packed one year vanish the next.

As with restaurant seats, initial public offerings (sales of stock by new companies) are sometimes rationed: bidders do not receive as many shares as they would like to buy. Why don't the sellers just raise the price? Welch (1992) focuses on the fact that buyers arrive in sequence, so that later buyers gain information from earlier buyers. Translated into the restaurant context, Welch assumes that each person knows something about the quality of the two restaurants. Thus, the choices of the diners who arrive ahead of you provide information about which restaurant is better. If the queue is long at one restaurant, then other people think the food is good, so perhaps you should follow, even if your own limited information possibly suggested otherwise!4 Because of this self-reinforcing tendency, the seller has a strong incentive to induce early movers to buy. New restaurants sometimes employ people to hang out and eat long meals to make their establishments look popular.
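Becker's unraveling dynamic described above can be sketched numerically. The demand specification below is hypothetical (it is not Becker's own model): each period, a consumer with a uniformly distributed taste visits if her taste plus the social pull of last period's crowd exceeds the price. When the social pull is strong enough, occupancy has two stable resting points, packed and empty, and a tiny price increase past the threshold tips the restaurant from one to the other.

```python
# Stylized sketch of Becker's restaurant unraveling; the demand
# function is a hypothetical illustration, chosen to exhibit tipping.
def long_run_occupancy(price, social_pull=1.5, steps=200):
    """Iterate the crowd share x in [0, 1] to its long-run level.

    A consumer with taste v ~ Uniform[0, 1] visits when
    v + social_pull * x_prev >= price, so the visiting share is
    x = clip(1 - price + social_pull * x_prev, 0, 1).
    """
    x = 1.0  # start from a packed restaurant
    for _ in range(steps):
        x = min(1.0, max(0.0, 1.0 - price + social_pull * x))
    return x

if __name__ == "__main__":
    for p in (1.49, 1.50, 1.51):
        print(f"price {p:.2f} -> long-run occupancy {long_run_occupancy(p):.2f}")
```

With `social_pull = 1.5`, any price up to 1.50 leaves the restaurant packed (with excess demand rationed by queues), while a price of 1.51 unravels the clientele to zero: the knife-edge Becker describes.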
2 But contrast Yogi Berra's remark about a once-popular restaurant: "no one goes there anymore; it's too crowded."
3 There is, of course, a trade-off here. A rational manager will balance the benefits of gaining higher revenues when the price is high against the risk of killing the goose that lays the golden eggs.
4 Banerjee (1992), Chamley and Gale (1992), and Lohmann (1992a) provide related theories of imitation.

Musicians and stage companies have hired claques to provide enthusiastic applause (or, in the case of

competitors, to heckle), and professional mourners were engaged at ancient Roman funerals. Another way to get the ball rolling is to charge a low price aimed especially at early purchasers. Thus the familiar phenomenon of new products offered at drastically reduced introductory prices.

Bikhchandani, Welch, and I (1992) have analyzed more generally how limited information can explain conformity and fads. As first defined by Welch, an informational cascade occurs when the information implicit in predecessors' actions is so conclusive that a rational follower will unconditionally imitate them, without regard to information from other sources. We show that cascades often spontaneously develop on the basis of very little information. People converge upon one action quite rapidly, and their decisions are idiosyncratic.5

In our model, a sequence of individuals make successive choices (e.g., between two restaurants) based both on private information (information received by a single individual) and on the observed decisions of earlier movers. If there are many individuals, then we show that with virtual certainty a point in the chain of decisions will be reached where an individual ignores his private information and bases his decision solely upon what he sees predecessors do. (He's not ignoring his information foolishly; it's just that the accumulated evidence from many previous individuals dominates even when his private information points in the opposite direction.) Once this point is reached for some individual n, his decision becomes uninformative to later choosers. Everyone knows that individual n was just following the bandwagon, so people do not take his action as reflecting any additional evidence pro or con. In consequence, individual n+1 is in the same position as individual n, so she also joins the cascade. This reasoning extends to all later individuals.
5 But for forces that operate to overcome idiosyncratic outcomes, see Liebowitz and Margolis (1990).

(Of course, this conclusion does not always follow: later movers may have different costs and benefits from adopting, or different accuracy of their private information signals.)

Consider the submission of a manuscript for publication. If a journal's referee is aware that a paper had previously been rejected elsewhere, he should rationally tilt toward rejection. After all, the previous rejection indicates that some other expert reviewer disliked the paper, very likely (though not necessarily) for good reason. If it becomes widely known that several journals have rejected a

paper, even a good submission may become unpublishable. The publication or failure of a few papers can make or break a young professor's career: "Up or out, publish or perish!"

But the problem of cascades is hardly limited to academics. Consider the stigma placed upon gaps in a personal resume. "Well," the interviewer thinks, "you haven't held a job for over 16 months. Clearly other employers have rejected you - I wonder why!" He reasonably infers that other employers have probably detected something negative about the applicant. Thus, even an otherwise-strong applicant may be virtually frozen out.

Cascades are often involved in the formation of a crowd or a queue. In communist Eastern Europe, it is said, long shopping lines would suddenly precipitate whenever a few people happened to stand together. On the other hand, since cascades are based upon very little information, they can be easily reversed. If an imitator realizes that the rationale for his choice is weak, then it only takes a very small piece of news to change his mind. So the restaurant that is "in" this week may be "out" next week for no clearly visible reason. Much the same may hold for clothing styles and political election campaigns.6

The crucial point is that the system bounces around randomly until it reaches a point of precarious stability. As decisions are made, evidence (as reflected in people's decisions) gradually accumulates in favor of one action or another. An action is fixed upon when the weight of the evidence grows to be just enough to overcome one person's opposing information. At that point, if the next individual is similar, he is also just barely willing to ignore his own information signal, i.e., he is in a cascade. This reasoning extends indefinitely, so all further individuals do the same thing. Thus, a very small preponderance of evidence causes a landslide majority to take one action over the other.
6 People observe the decisions of others when lining up behind political candidates through opinion polls, bumper stickers, and the comments of loudmouths at cocktail parties.

In this situation, a very small shock to the system, such as new public information, can affect the behavior of many people. Owing to informational cascades, society often lands precariously close to the borderline, like a car teetering at the edge of a cliff. This is unlikely to last forever, of course. Aside from external shocks, in some cases new information is almost sure to arrive to tip things one way or the other. Consider a newly-opened trendy restaurant that is packed with first-time

customers. If all they care about is the food, then after a few meals they will learn quality quite accurately. So an initially mistaken cascade will be corrected quite quickly. Of course, if the cascade is for a non-repeated activity (such as seeing a given movie or buying shares of a given initial public offering) or an infrequent activity (such as buying a car), this corrective force is less powerful.

In contrast with the cascades model, most theories of conformity (Becker's model being an exception) imply rigid behavior in the face of changing circumstances. For example, the different though related theories of Akerlof, Kuran, and Coleman are founded on the threat of sanctions upon deviants.7 Sanctions can lock social policy in place, even when it becomes evident to most people that a change would be desirable. A small shock can affect the behavior of many people only in very rare circumstances in which society happens to be at a razor's-edge balance between alternatives.8

A second class of theories is based on payoff interactions, in which one person's action directly increases the benefit to someone else of doing the same thing (see Schelling [1978], Arthur [1989]). For example, conventions such as driving on the right (or left) hand side of the road are self-enforcing.9

A third theory, conformity preference, holds that people directly prefer to do the same things that others are doing. As everyone hits upon the same action, that action is stabilized, but it may not be the best one (see Jones [1984], Becker [1991]).

The fourth and fifth theories can be mentioned together: parallel reasoning and direct communication. In parallel reasoning, everyone independently is wise enough to figure out the best choice. In direct communication, those who figure out the best choice simply explain the benefits of alternatives to others.
7 Some relevant articles and books are Akerlof (1976, 1980), Kuran (1989, 1991), and Coleman (1987). See also my own work with Eric Rasmusen (Hirshleifer and Rasmusen [1989]) on enforcing cooperation through ostracism.
8 In Kuran's model of revolution, a small spark can cause a prairie fire because, owing to sanctions on deviants or a direct desire to conform, people hide their true beliefs. In Eastern Europe, for example, many people who opposed the communist regime remained quiet. Only when public opposition attained a critical mass did change occur. Kuran's prairie fires are sudden and sweeping but occur only in rare circumstances in which society happens to be at the edge of change.
9 Becker's model can also be viewed as a payoff interaction theory.

This also implies convergence toward the correct outcome if communication is credible and

costless. Neither theory explains why mass behavior is error-prone.

Among the questions to be covered here are (1) how likely cascades are, (2) how likely it is that the wrong cascade occurs (can a good job candidate be so badly stigmatized by failure as to become unemployable?), (3) what makes fashions change (why did the business major replace the "counter-culture" in college life?), (4) how effective public information releases are (e.g., a campaign to publicize the health effects of smoking), and (5) to what extent subsequent individuals free-ride on information purchased by earlier individuals (would an employer who knows that a job applicant has a large gap in his resume even bother inviting the applicant to an interview?).

2 The Basic Model

Consider a setting where each individual observes only the actions of predecessors, not their information signals. (Since actions speak louder than words, the information conveyed by actions may be more credible than verbal reports in any case.) Assume a sequence of individuals, each deciding between two actions: say, smoking or not smoking. Each individual observes the decisions of all those ahead of him. People are lined up in sequence, and everyone knows his position in the queue. Everyone has the same costs and benefits from adopting the behavior. If smoking has no adverse health consequences, let its net value be +$1, but if smoking is harmful, let its value be -$1. These possibilities are equally likely to begin with in each person's private calculations.

Each person then privately observes his own information signal: H (high, or favorable to smoking) versus L (low, or unfavorable). An example of an L signal could be reading about an adverse scientific study. An example of an H signal could be seeing an advertisement in which smoking seems pleasurable. Suppose all individuals observe similar types of signal.
Specifically, suppose that adoption is the right thing to do, and that each person has a 3/4 chance of observing H, and a 1/4 chance of observing L. But given the true value of adopting, the signals are independent, so Arnie may see H, and Betsy may see L. Each person forms a probability belief about whether smoking or not smoking is superior based on his information signal and on what he sees his predecessors do. He then makes his own decision. (If exactly indifferent, he flips a

fair coin.)

In Diagram 1, Arnie, the first person, adopts if his signal is H and rejects if it is L. So Betsy can deduce that if Arnie adopted, his signal was H; if he rejected, his signal was L. If Arnie adopted, Betsy should also adopt if her signal was H; in this case the H's outnumber the L's by 2 to 0. However, if Betsy's signal is L, her L signal is offset by Arnie's H signal, so the expected value of smoking is exactly 0. So Betsy tosses a coin to decide. By similar reasoning, if Arnie had rejected, Betsy rejects if her signal is also L, and tosses a coin if her signal is H.

When we get to Clarence, cascades can start. There are three possible situations: (1) both predecessors have adopted; (2) both have rejected; or (3) one has adopted and the other rejected. If both Arnie and Betsy have adopted, then Clarence will adopt too. He knows that Arnie observed H, and that quite likely Betsy did too (although she may have seen L and flipped a coin). A probabilistic calculation shows that Clarence will adopt even if he sees an L signal. This situation, in which Clarence adopts regardless of his signal, may be termed an UP cascade.10 Similarly, in the second case, if both Arnie and Betsy rejected, then Clarence will reject too, even if he sees an H signal. This is a DOWN cascade. In the third case, where Arnie adopted and Betsy rejected (or vice versa), Clarence knows that Arnie observed H and Betsy observed L (or vice versa). These signals cancel out, so Clarence will follow his own signal, just as if he were the first person in the queue.

Returning to the cascade case where both predecessors adopted: since Clarence adopts regardless of his signal, his action provides no information to his successors about the desirability of smoking. Deedee, who has seen adverse medical reports, would reject smoking if only she knew that Clarence had an L signal. Instead, she continues the cascade. This reasoning extends to Fifi, Gerard, Harriet, Ivan, Joy, and so on.
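The decision rule just described can be simulated directly. In this symmetric model, each individual's Bayesian rule reduces to a simple count of actions: adopt if adoptions lead rejections by two or more, reject if rejections lead by two or more, follow your own signal when the counts are tied, and flip a coin when your signal conflicts with a one-action lead. The sketch below (function and variable names are mine, not the paper's) uses the 3/4 signal accuracy from the example above:

```python
import random

def run_sequence(n=10, p=0.75, rng=random):
    """Simulate n individuals when adopting is truly superior.

    Each individual sees a private signal (H with probability p) plus
    all prior actions. Returns the final lead = (#adopt - #reject).
    """
    lead = 0
    for _ in range(n):
        if lead >= 2:        # UP cascade: adopt regardless of signal
            action = +1
        elif lead <= -2:     # DOWN cascade: reject regardless of signal
            action = -1
        else:
            signal = +1 if rng.random() < p else -1
            total = lead + signal
            if total > 0:
                action = +1
            elif total < 0:
                action = -1
            else:            # exactly indifferent: flip a fair coin
                action = rng.choice((+1, -1))
        lead += action
    return lead

if __name__ == "__main__":
    rng = random.Random(0)
    trials = 20000
    up = sum(1 for _ in range(trials) if run_sequence(rng=rng) >= 2)
    print(f"fraction of runs ending in an UP cascade: {up / trials:.3f}")
```

Even with 75%-accurate signals, roughly a fifth of runs lock into the wrong (DOWN) cascade, and essentially none remain cascade-free after ten individuals.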
10 Clarence, by the way, is a sophisticated smoker. Carefully weighing the probabilities, he knows full well that there is a chance that Betsy also observed an L signal, so that Clarence really ought not to smoke. But he can't be sure, so he makes an overall judgement based on his information and the actions of his predecessors. This often leads him to act differently from what he would have done had he known not just Betsy's action, but her information and that of his predecessors.

Since opposing information remains hidden, even a mistaken cascade lasts forever. An early preponderance toward either adoption or rejection, which may have occurred by mere coincidence or for trivial reasons, can feed upon itself until the air is choking with smoke (or becomes particle-free). The overall outcome here is idiosyncratic and history-dependent. Actions

[Diagram 1: Decision tree showing Arnie's and Betsy's adopt/reject/coin-flip choices after H and L signals, and the resulting situations in which Clarence adopts regardless of his signal (UP cascade), rejects regardless of his signal (DOWN cascade), or follows his own signal.]

may be poorly attuned to costs and benefits even if the information possessed by individuals would, if combined, suffice to make very accurate decisions. To understand this last point, suppose that a restaurant rating bureau were to interview Arnie, Betsy, Clarence, and so on. Walter's Restaurant Guide could then identify with extremely high accuracy which restaurant was better, and could inform later patrons. Jointly, the diners have enough information to decide correctly, but in fact they may end up flocking to the wrong place.

As evidence of the sort of imitative behavior that can lead to cascades, consider the adoption of hybrid corn from 1928-41. Ryan and Gross (1943) interviewed Iowa farmers and found that the most frequent reason given for adoption of hybrid corn was that neighbors adopted. (The importance of neighbors' adoption is a common finding in studies of the adoption of innovations.)11

The fallibility of cascades causes them to be fragile. If some people have more precise signals than others (as will be discussed in Section 2.1), or if public news is revealed at a later date (as in Section 3.1), or if the relative desirability of adopting versus rejecting changes (as in Section 5), then cascades can easily be broken.

An important proposition in our setting is that cascades will always arise eventually. To see why, suppose that someone fairly late in the sequence, Zorro, is not in a cascade. This means that he and everyone before him must be making decisions that make use of their personal signals. So the action of each and every predecessor conveys some information to Zorro. If Zorro is far enough down the line, then by combining the information of all his predecessors, with virtual certainty he can infer almost perfectly the true value of adoption. But then Zorro's own signal contributes virtually nothing to his overall assessment, so he should ignore his signal and follow the action taken by most of his predecessors. In short, he is in a cascade.
11 It is quite possible that many of these farmers observed mainly whether their neighbors adopted, rather than the entire sequence of previous decisionmakers. Cascades can still arise in settings where only some of the previous decisions are observable. Cascades can also arise when people see only a summary of previous actions; for example, someone deciding which of two types of cars to buy may look at reports on total sales.

One may be tempted to think from this line of reasoning that people will ultimately converge on the correct action. This is far from the case. A cascade will take a long time to form only if early information is conflicting. As soon as a fairly mild preponderance of evidence favors one or the other action, a

cascade starts, preventing decisions from becoming very well-informed. Typically cascades begin surprisingly soon. Calculations show that even when information signals are very noisy (so that H is only slightly more likely to arrive when adoption is good than when it is bad), the probability of a cascade forming after ten individuals is greater than 99%! The probabilities of an UP cascade, no cascade, or a DOWN cascade after two individuals, given that in fact adopting is superior, are easily calculated in the model. The probability of a correct cascade is increasing in the accuracy of the signal, but even for very informative signals, the probability of the wrong cascade can be remarkably high.12

Owing to cascades, final outcomes depend heavily on chance early choices. It is tempting to search for deep logical causes of massive social trends, but these may not exist. Early origins may be trivial.13 Such history-dependence is not unique to the cascades theory; it can arise from some of the other theories of conformity mentioned in the introduction as well.

An important extension of the cascades model allows for differing costs and benefits to different individuals (see Lohmann [1992a]). In such a scenario, an adoption cascade that is correct for early adopters may be undesirable for followers who have a lower value of adoption. Followers can be fooled because they don't know for sure whether predecessors have greater or lesser gain from adoption.

A more radical deviation from the basic model would allow people to observe previous information signals, not just previous actions. This makes the outcome less uniform, idiosyncratic, and fragile. Even if someone smokes although his private signal suggested he should not do so (a cascade), that signal joins the common pool of knowledge. A long enough series of adverse information signals eventually causes later movers to choose not-smoking.
12 Suppose that the signal accuracy is 60%, so that when adopting is preferable, an H signal is observed 60% of the time. Then about 1/3 of the time the wrong cascade occurs. Even if the signal accuracy is 70%, about 1/4 of the time the wrong cascade occurs.
13 According to the theory of psychohistory in Isaac Asimov's Foundation science fiction series, the random behaviors of individuals average out to mass behaviors that are almost perfectly predictable in the long run. Thus, the mathematician Hari Seldon was able to forecast events in the decline of the Galactic Empire over a period of thousands of years. The cascades theory, if correct, dashes the hopes of developing an accurate psychohistory.
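The probabilities cited in footnote 12 follow from elementary calculations in the two-signal model. After the first two individuals, an UP cascade has begun with probability p(1+p)/2 (both adopt), a DOWN cascade with probability (1-p)(2-p)/2 (both reject), and no cascade with probability p(1-p) (one of each, which restarts the process), where p is the signal accuracy and adopting is assumed truly superior. Since each cascade-free pair of decisions restarts the process, the chance that no cascade has formed after n individuals is (p(1-p))^{n/2}, and the eventual probability of the wrong cascade is the DOWN share of the two decisive outcomes. A short script (assuming the symmetric coin-flip model of this section):

```python
def cascade_probs_after_two(p):
    """Probabilities of (UP, none, DOWN) after two individuals,
    given signal accuracy p and that adopting is truly superior."""
    up = p * (1 + p) / 2          # both adopt
    down = (1 - p) * (2 - p) / 2  # both reject
    none = p * (1 - p)            # one adopt, one reject: process restarts
    return up, none, down

def prob_wrong_cascade(p):
    """Eventual probability of locking into the wrong (DOWN) cascade."""
    up, _, down = cascade_probs_after_two(p)
    return down / (up + down)

def prob_no_cascade(p, n):
    """Probability that no cascade has formed after n (even) individuals."""
    return (p * (1 - p)) ** (n // 2)

if __name__ == "__main__":
    for p in (0.6, 0.7, 0.75):
        print(f"p={p:.2f}: wrong cascade {prob_wrong_cascade(p):.3f}, "
              f"no cascade after 10: {prob_no_cascade(p, 10):.2e}")
```

For 60% accuracy the wrong cascade occurs about 37% of the time (roughly 1/3); for 70%, about 25%; and even for p barely above 1/2, the chance that no cascade has formed after ten individuals is below 1%.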

2.1 Fashion Leaders

Cascades will often be wrong. This blind-leading-the-blind phenomenon is bad enough when someone follows a series of similar predecessors. But when people are different, it may take only a single person to start the ball rolling. As an absurd extreme, one person's decision to eat oat bran (for example) can cause millions of others to follow.

Suppose different people have different information precisions (accuracy).14 Suppose Arnie is a doctor, while Betsy is a computer programmer. Everyone thinks that Arnie has at least slightly better information about whether or not to eat oat bran. Assume that there is an equal prior probability, before signals are observed, as to which is superior. Then if Arnie observes his signal and decides first, a cascade starts immediately. Betsy should eat oat bran if and only if Arnie does, because she knows he is better informed. So Betsy's action is uninformative to the next person. So if Clarence, like Betsy, has a lower-precision signal, he will follow Arnie just as Betsy did, and so on indefinitely. Thus, everyone will blindly follow Arnie's action, even if Arnie is better informed than the others only by the slimmest of margins. Betsy's willingness to imitate Arnie is reasonable as far as it goes. But Arnie's information is not necessarily better than the combined information of ten people like Betsy, or a thousand.

While I have taken the order of moves as given, often the highest-precision person will decide first, because people who are poorly informed have stronger reason to wait and see what others do. Thus cascades should form extremely rapidly (after one person). But unless the first person's information is very much superior to that of followers, this may be disastrous.

In contrast, suppose the person who goes first has slightly lower information precision than those who follow. Then Arnie's action does not start a cascade, because Betsy should follow her own information rather than Arnie's.
14 A more precise signal has a higher probability of H when adopting is superior, and a higher probability of an L signal when rejecting is superior.

This is too bad for Betsy; she'd rather that Arnie were better informed, so she could exploit his information by imitating him. But Clarence, on the other hand, is glad that Arnie is slightly less well-informed. Clarence gets to see the actions of both Arnie and

Betsy, and now both of these actions are informative. In fact, everyone from Clarence onward benefits substantially from Arnie being slightly less well-informed instead of slightly better-informed than the others. So small early differences in precision can make a huge difference later. A slightly higher precision of an early individual can lead to an immediate cascade that is even less informative (and so, potentially even more fragile) than when people have equally accurate signals. If a higher-precision individual shows up later, he can shatter a cascade, because he is more inclined than his predecessors to use his own information. Cascade reversal tends to improve matters, because more information can be aggregated than if only a single cascade occurred.

Thus, from a social point of view it will often be desirable to order decision-makers inversely with their precisions. As one example, in U.S. Navy courts-martial, the judges vote in inverse order of seniority. More generally, in judicial systems with courts of appeal, judges sitting in lower-level courts making initial judgements are normally less prestigious and experienced (less precise?) than those in higher courts.15

There is a lot of evidence that low-precision decisionmakers imitate high-precision ones. In setting up territories, animals new to a habitat tend to locate close to previous settlers more than do animals that are familiar with the habitat (Stamps [1988]). Psychological experiments have shown that a human subject's previous failure in a task raises the probability that in further trials he will imitate a role model (see Thelen, Dollinger, and Kirkland [1979]). Young children instinctively imitate their parents and teachers. Teenagers discover that in many areas their parents' information precision is no longer greater than their own, a rational reason for them to rebel.
The tendency to imitate high-prestige individuals may be based on a belief that high-prestige individuals are well-informed decisionmakers.16

15 A fruitful extension of cascades theory would be to study information flows in organizations. If subordinates have useful but noisy information, then an upward flow of project recommendations may preserve their information much more effectively than a decision process that moves downward through the hierarchy.
16 Bandura (1977) states that "in situations in which people are uncertain about the wisdom of modeled courses of action, they must rely on such cues as general appearances, speech, style, age, symbols of socioeconomic success, and signs of expertise as indicators of past successes."
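The fashion-leader logic of this section can be checked with Bayes' rule. With a 50-50 prior and conditionally independent signals, if Arnie (accuracy p_A) adopts, revealing an H signal, and Betsy (accuracy p_B) sees L, her posterior probability that adopting is good is p_A(1-p_B) / [p_A(1-p_B) + (1-p_A)p_B], which exceeds 1/2 exactly when p_A > p_B. So Betsy rationally defers to Arnie however slim his informational edge. A minimal check (function names are illustrative, not the paper's):

```python
def betsy_posterior(p_arnie, p_betsy):
    """P(adopting is good | Arnie adopted, i.e. saw H; Betsy saw L),
    starting from a 50-50 prior with conditionally independent signals."""
    good = p_arnie * (1 - p_betsy)  # likelihood of the evidence if adopting is good
    bad = (1 - p_arnie) * p_betsy   # likelihood of the evidence if adopting is bad
    return good / (good + bad)

if __name__ == "__main__":
    # Arnie only slightly better informed: Betsy still defers to him.
    print(betsy_posterior(0.71, 0.70))  # just above 1/2, so she adopts
    print(betsy_posterior(0.70, 0.71))  # just below 1/2, so she follows her own L
```

The tiny gap between 0.70 and 0.71 flips Betsy's decision, which is why a marginally better-informed first mover triggers an immediate, and potentially disastrous, cascade.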

3 Fragility of Cascades: Further Considerations

The uniformity stemming from the factors described in the introduction usually becomes more robust as the number of adopters increases. Cascades, on the other hand, remain brittle, so that the arrival of a little information, or the mere possibility of a value change (even if the change does not actually occur), can shatter an informational cascade.

3.1 The Public Release of Information

Cascades can be sensitive to public information releases, for example, on the hazards of smoking or the effects of medical procedures (e.g., tonsillectomy), drugs (e.g., aspirin), and diet (e.g., oat bran). This subsection examines three questions: (1) Does the release of a single item of information, e.g., a report on aspirin and heart disease, make all people who have not yet decided better off? (2) How easily can such information reverse a cascade? (3) Does the release of multiple items of information over time eventually make people better off? For example, if medical science gradually generates information about the adverse consequences of tonsillectomy without special medical indications, will all doctors eventually reject this practice?

The answer to the first question is that the release of public information prior to the start of a cascade can make some people worse off (in an ex ante sense). For example, a newspaper restaurant review, although providing some valid information, can still make some diners worse off. Public information release has two effects on an individual: (1) it directly provides information, and (2) it changes the decisions of predecessors, and thus the information conveyed by their decisions. It may be that the restaurant review provides very noisy information, so that it is only slightly useful to Betsy; but it does affect Arnie's decision. In particular, it can work out that Arnie's choice becomes less informative to Betsy. This indirect disadvantage to Betsy can outweigh her direct benefit from the review.
Thus, it is not obvious that public health authorities should act quickly to disseminate noisy information. Sketchy disclosures of advantages of oat bran and fish-oil, by triggering fads, can be harmful.7 Of course, clearcut information, such as the evidence that cigarettes cause illness, is likely to benefit all consumers.

7 Early medical reports that oat bran lowers cholesterol levels led to sudden popularity of oat bran products. Newer studies contradicted the efficacy of oat bran, and the fad collapsed. A recent study has suggested that oat bran is moderately effective after all (Consumer Reports Health Letter, p. 31, April 1991).

So much for disclosures made before a cascade. In contrast, once a cascade starts, and if people are identical, then everyone welcomes public information. To see this, suppose there is no disclosure. If everyone is identical, once a cascade starts people's decisions don't convey any further information. So a public disclosure at this point directly conveys information without reducing the information conveyed by people's decisions. Such a public disclosure may therefore be beneficial. Of course, if the cascade is very resilient, the disclosure might not make any difference. If Ivan is attached to the belief that Le Burger is better, based on his observation that many earlier people selected Le Burger, then perhaps a restaurant review won't sway him.

But, as we have seen, cascades are delicate. How much public information does it take to shatter an established cascade? A public signal can do the trick even if it is less informative than the private signal of just one person. The restaurant reviewer doesn't need to be a genius of connoisseurship. As long as his information is about as good as yours or mine, he can change the behavior of thousands of people. Recall the illustrative model where a cascade of adoption ensues as soon as there is a majority of two for adopt over reject. Suppose that adoptions and rejections are evenly split until Edgar and Fifi both decide to adopt. At this point, everyone recognizes two possibilities: (1) both of them received favorable (H) signals, or (2) Edgar observed an H signal and Fifi observed L, leading her aggregate evidence to be HL, so that she flipped a coin. Consider a restaurant review that is as informative as a typical diner's information signal. Gerard can reason that even if Edgar and Fifi both observed H, a negative restaurant review cancels out one H. So if Gerard observes L, he is on the fence, and may well switch to reject. Since there is a significant probability that in fact Fifi observed L, Gerard should in fact reject. This reasoning remains valid even if the restaurant review is slightly less accurate than an ordinary diner's signal. The public signal is still enough to break the cascade.

Since even a minor public information release can shatter a cascade, one may
wonder whether the arrival of many useful public disclosures will eventually guide people into the correct cascade. This is in fact the case, because eventually the weight of evidence would tip the balance. Although the proposition strictly holds only "eventually," numerical calculation suggests that a moderate amount of public information release can go a long way toward bringing people around to the right decision.

3.2 Summary

Each person with a bit of information is a valuable resource to those who decide later. An individual benefits from his ability to observe predecessors, which provides him with useful information. However, his ability to observe predecessors can harm his successors, because if he is in a cascade his action is uninformative. Cascades therefore can lead to bad decisions in which most of the potential gain from observing others is lost. For example, if the first 100 people were prevented from observing each other's decisions, each would decide based on his own signal. This could greatly benefit the next 10,000 people if they get to observe the decisions of the first 100. Even with cascades, some of the benefit of information diversity will generally be recovered because cascades are so easily broken. The blunders of an early few can be rapidly reversed if someone with more precise information appears later in the sequence. The public release of information can also break incorrect cascades, and eventually can persuade people to do the right thing.

4 Some Examples

Cascades occur when individuals with private information make decisions sequentially. Cascades can still occur even when non-informational factors (sanctions against deviants, payoff interdependence, desire to conform) are present. Having recognized that factors other than information are important, I will focus on cascades and try to avoid repeated qualifications in the examples that follow.
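The counting logic behind this fragility can be made concrete in a few lines of code. The sketch below is an illustrative reconstruction, not code from the paper: agents receive H (+1) or L (-1) signals, a cascade forms once one action leads the other by two among informative choices, and a public disclosure counts as exactly one more signal for everyone thereafter. For simplicity, ties are broken by following one's own signal rather than by the coin flip used in the text (this keeps every pre-cascade action fully revealing).

```python
def run_cascade(signals, public=None):
    """signals: a list of private signals, +1 for H and -1 for L, one per agent.
    public: optional (position, value) pair; a public disclosure worth exactly
    one private signal, observed by everyone from that position onward.
    Returns the list of actions taken in sequence."""
    count = 0                       # inferred H signals minus inferred L signals
    actions = []
    for i, s in enumerate(signals):
        if public is not None and i == public[0]:
            count += public[1]      # the disclosure adds one signal's evidence
        if count >= 2:
            actions.append("adopt")     # adopt cascade: own signal is ignored
        elif count <= -2:
            actions.append("reject")    # reject cascade: own signal is ignored
        else:
            actions.append("adopt" if s > 0 else "reject")
            count += s              # an informative action reveals the signal
    return actions

# Edgar and Fifi adopt on H signals; everyone after imitates, even with L signals.
print(run_cascade([+1, +1, -1, -1, -1]))
# -> ['adopt', 'adopt', 'adopt', 'adopt', 'adopt']

# A negative public signal, only as informative as one diner's own signal,
# arrives before the third person: it cancels one inferred H, his own L tips
# him to reject, and the adopt cascade collapses.
print(run_cascade([+1, +1, -1, -1, -1], public=(2, -1)))
# -> ['adopt', 'adopt', 'reject', 'reject', 'reject']
```

As the second run shows, the reversal does not require an especially well-informed reviewer; one signal's worth of public evidence is enough to put the next person back on the margin.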
Several criteria are relevant for evaluating the applicability of the cascades model to actual behavior. The first group of criteria pertains to model assumptions: (i) Do individuals observe the actions of others?; (ii) Do individuals learn from direct
discussion with others?; and (iii) Are informational effects more important than other effects? Point (ii) carries little weight, since actions speak louder than words. With regard to Point (iii), sometimes one individual's action reduces another's gain from that action, in which case the payoff interaction opposes uniformity. If Wellcome observes that Merck is working on an Alzheimer's drug, then Wellcome has a good reason not to do so: it would have a competitor. If Wellcome still imitates, this suggests that information is involved. Even when payoff interactions support uniformity, cascades may still play a role in determining which alternative is fixed upon. The second group of criteria pertains to implications of the theory: (i) is behavior localized and idiosyncratic (often mistaken)?; (ii) is behavior fragile?; and (iii) do some individuals follow the crowd and ignore their own private information?

4.1 Politics

The political scientist Bartels (1988) has discussed "cue-taking" in presidential nomination campaigns, in which one person's assessment of a candidate is influenced by the choices of others: "... the operative logic is, roughly, that '25,000 solid New Hampshirites (probably) can't be too far wrong.'" Political scientists who have studied how candidates build political momentum have developed survey measures (called "thermometer ratings") in which respondents score numerically how much they like the candidate. Several studies have found that after controlling for relevant factors, a candidate is rated more favorably when respondents are aware of more favorable poll results. Bartels points out (consistent with the cascades theory) that "There need not be any actual process of persuasion... the fact of the endorsement itself motivates me to change my substantive opinion of him [the candidate]." An alternative hypothesis is that voters strategically throw support behind candidates who have a better shot at winning.
A vote for a sure loser is "wasted," after all. Strategic voting does explain why early successes of a candidate may lead people to vote for him. However, it does not explain why the candidate's thermometer ratings should increase. In the cascades model, the numerical evaluation of the candidate should go up after he wins early victories. In a study of
the 1984 Hart-Mondale contest for the Democratic nomination, Bartels found that there was an "internalized effect" on individuals' thermometer ratings, particularly in the early stages of the campaign. In the 1976 U.S. presidential campaign, an obscure Democratic candidate named Jimmy Carter focused on the Iowa caucus to obtain an early success. In the 1988 presidential contest, Democratic candidate Richard Gephardt campaigned for one year in the first two states (Iowa and New Hampshire), with less success. A common criticism of the primary system is that voters in early primaries carry more weight than voters in late primaries. As a result, many Southern states have coordinated their primaries on the same date ("Super-Tuesday"). In Spain, publishing opinion polls within 5 days of local or national elections is prohibited. Belgium, France and Portugal have similar restrictions. Similarly, it has been argued that the early reporting of election results is undesirable because later voters may be influenced (either in their choice of candidate, or in their decisions of whether to vote). Voting is only one of many kinds of political action. Susanne Lohmann has examined a model of political revolution in which public protests, demonstrations and riots occur repeatedly over time, and turnout fluctuates until a cascade forms. Since different people have different gains from a regime change, and since people choose when to protest, protests convey information bit by bit over time. She shows that "In some cases, a small number of political actions may have a large impact on public opinion; in other cases, a huge turnout may be followed by the sudden collapse of the protest movement." If more people turn out than expected, even if the numbers are very small, this communicates to others that opposition to the regime is greater than expected, which can stimulate still others to turn out. 
Conversely, if a huge turnout is expected, then even a large turnout may be a negative surprise, in which case the protest can peter out. Eventually, either revolt or acceptance of the current regime dominates and a cascade forms. Based on evidence from sequences of polls of demonstrators in 5 cycles of protest in Leipzig and of the public at large, Lohmann (1992a,b) argues that a dynamic informational cascades model provides the most satisfactory explanation for the process by which communism fell in then-East Germany.

4.2 Zoology

Zoologists have found that animals frequently copy the behavior of other animals in territory choice, mating, and foraging. Imitation has obvious adaptive benefit, since it is better to learn by watching than by hard experience. Innovations known to have spread by imitation include sweet potato washing by Japanese macaques (Kawai 1965), and milk-bottle-opening by British tits (Hinde and Fisher 1952). Various studies have found that animals cluster in their territory choice more than can be accounted for by the fact that high-quality sites may be located close together. Zoologists have found that territories are clumped idiosyncratically, in the sense that they are not necessarily located at the best available sites (Stamps [1988] discusses the relevant studies). According to the cascades theory, the location of clusters will be fairly arbitrarily determined by the choices of a few early settlers. Indeed, some biologists have argued that clustering occurs because males use the presence of other males as an indicator of high resource quality in nearby territories. Numerous studies have shown that females copy other females in choosing a male to mate with. Pomiankowski (1990) discusses how "in both fallow deer and sage grouse, the rate at which females enter male territories correlates with the number of females already present." When experimenters placed a stuffed female grouse on the territory of an unattractive male, the number of females entering the territory increased. A recent study by Gibson, Bradbury and Vehrencamp [1992] found that sage grouse females became markedly more unanimous in their mate choices when they arrived at the male mating display area together, so that they could observe each other's choices. The unanimity of mate choice is not explained by the characteristics of males or their sites observable to human researchers. This arbitrariness is consistent with informational cascades.
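A toy simulation suggests how such arbitrary unanimity can arise from copying alone. This is a hypothetical sketch, not a model from any of the studies cited: all males are of identical quality, each female's private assessment is pure noise, and the copy_weight parameter (an assumption) scales how much weight she puts on the number of females a male has already attracted.

```python
import random

def lek_choices(n_females=20, n_males=5, copy_weight=2.0, seed=0):
    """Each female in turn scores every male as (private noise) +
    copy_weight * (females already with him) and picks the highest score.
    Since true male quality is identical, any clustering in the outcome
    reflects copying of early, essentially arbitrary choices."""
    rng = random.Random(seed)
    mates = [0] * n_males               # females attracted by each male so far
    choices = []
    for _ in range(n_females):
        scores = [rng.random() + copy_weight * mates[m] for m in range(n_males)]
        pick = scores.index(max(scores))
        mates[pick] += 1
        choices.append(pick)
    return choices

# With strong copying, the first female's noisy pick snowballs into unanimity.
unanimous = lek_choices(copy_weight=10.0)
# With no copying, choices scatter across the identical males.
scattered = lek_choices(copy_weight=0.0)
```

With copy_weight large relative to the noise, every female after the first imitates her, so which male "wins" is determined by a single arbitrary early draw, the zoological analogue of a cascade; rerunning with different seeds moves the winner around without changing the unanimity.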
Copying in these species occurs despite some significant waiting costs. Lee Alan Dugatkin and his coauthors provide some remarkable evidence on the mate choices of the Trinidadian guppy Poecilia reticulata. Dugatkin shows that "... females copy the choice of mates made by other females by viewing such interactions, remembering the identity of the chosen male, and subsequently choosing that male in future sexual endeavors."18

18 Dugatkin refers to several studies of female mating choices in fishes indicating that females prefer males that already have broods from prior matings.

In a study of guppies, Dugatkin
and Godin (1993) find that "... younger females copy the mate choice of older females, but older females do not appear to be influenced by the mate choice of younger individuals." This is consistent with the "fashion leader" model at the end of Section 2, since older females presumably have more precise information than younger females. Greater experience in choosing mates presumably allows older females to interpret environmental cues more accurately. Most remarkable of all, in a 1992 paper Dugatkin and Godin have established that "... copying can even override a female's original preference of mates... That is, a female's preference for a particular male... can be reversed if she has the opportunity to see a (model) female choose the male she herself did not choose previously." This experiment identifies a key aspect of an informational cascade, that an individual's private information can be overridden by the observation of others' actions. It would be interesting to verify whether a female's mate choice is more likely to be reversed if she observes multiple females choosing differently than when she observes only a single other female.

4.3 Medical Practice and Scientific Theory

The cascades theory predicts fads, idiosyncrasy, and imitation in medical treatments. (The practice of bleeding to remove bad blood, popular until the 19th century, is a familiar example.) Most doctors are not at the cutting edge of research; their inevitable reliance upon what colleagues have done or are doing leads to numerous surgical fads and treatment-caused illnesses ("iatroepidemics") (Robin [1984] and Taylor [1979]). It appears that many dubious practices were initially adopted based on very weak information.
An example is the popularization in the 1970's of elective hysterectomy, the surgical removal, without any special medical indications, of the uterus of women past childbearing age.19 In the New England Journal of Medicine, John Burnum discusses "bandwagon diseases" diagnosed by physicians who behave "... like lemmings, episodically and with a blind infectious enthusiasm pushing certain diseases and treatments primarily because everyone else is doing the same." In the fashion leader version of the cascades model, even one adoption can start a cascade. So our advice to patients seeking a second opinion is to withhold the first doctor's diagnosis.

19 Routine tonsillectomy, the surgical removal of tonsils, seems to be another remarkably unfounded practice. According to Taylor, its adoption was not associated with any definitive supporting evidence, such as controlled studies. An English panel asserted that tonsillectomy was being "... performed as a routine prophylactic ritual for no particular reason and with no particular result." Also notable are the extreme differences in tonsillectomy frequencies in different countries and regions. The routine tonsillectomy of millions of children was not without cost, since some children were injured and died as a result. As with hysterectomy, the rate of tonsillectomy has declined in recent years.

Even science (not to mention academia in general) is subject to fads and fashions. The sheer complexity and volume of material a scientist must deal with makes it impossible to examine critically the evidence bearing on all major theories. At best scientists can investigate thoroughly only narrow subfields. But knowledge is interrelated, so a scientist is forced to accept useful ideas and theories because others have done so. Furthermore, there is a strong "fashion leader" phenomenon in academics, including the sciences. In the fashion leader model, the decision by the first (well-informed) individual was enough to persuade everyone after to imitate. In academics and science as well, nascent theories of course enjoy much greater success when the initiator is famous and from a top university than when he is unknown and from a minor school. This can explain the "Matthew effect" described by Robert K. Merton, a well-known sociologist of science.20 When a minor league researcher proposes a good idea, it is often dismissed by editors and reviewers as wrong or uninteresting, whereupon it slips into oblivion. The idea is not linked to any name in people's minds, because almost no one has even noticed its existence. In contrast, when a distinguished scientist proposes the same idea, it gains credibility by virtue of the endorser, and becomes linked to the distinguished name in people's minds. The cascades theory implies that the rise of academics to eminence is itself idiosyncratic.

20 The Bible tells us that "For whosoever hath, to him shall be given, and he shall have abundance; but whosoever hath not, from him shall be taken away even that he hath."

It is very costly, even for a pro, to assess accurately the achievements of another researcher. So to a large extent we view someone as eminent because we
observe that others have done so.21 For example, if you are hired at a top school, prestige rubs off on you (and vice versa). Just as the eminence of academics may be the result of cascades of esteem, a job applicant who receives early job offers may become a "star." In the annual rookie market for professors, later schools give close attention to the interviews or offers granted by earlier schools. Similarly, to be granted tenure or receive a chaired professorship at a university it is helpful to receive tenured or chaired offers elsewhere. Eminence is of course far from meaningless. But academic stature can be surprisingly noisy. Since eminent academics become academic fashion leaders, these errors in turn can cause substantive errors in the acceptance or rejection of new ideas.

21 As Michael Ghiselin has put it, "The mere fact of eminence provides a cheap substitute for inquiring as to the basis upon which that eminence rests. The main reason why a scholar gets an honorary degree is that somebody else has already given him an honorary degree."

4.4 Finance

Foresi and Mei (1991, 1992) provide evidence consistent with firms imitating each other in choosing levels of investment. They report in both U.S. and Japanese datasets that a firm's investment level can be explained partly by the investment levels and profitability of its competitors.22 A very important kind of discrete investment is the purchase of another firm. A firm that receives a takeover bid is said to be put "into play," and very frequently receives sudden competing offers. Yet from the bidder's viewpoint, competing for an in-play target is more expensive than buying another target that is not sought after by a competitor. This suggests that potential bidders learn from the first bid that the target is an attractive candidate for takeover. More broadly, takeover markets have been subject to booms and crashes, such as the wave of conglomerate mergers in the 1960's, in which firms diversified across different industries, and the subsequent refocusing of firms through restructuring and bustup takeovers in the 1980's.

22 These studies suggest that imitation is important. However, the finding that the profitability of competitors helps explain own-investment behavior suggests that firms may be observing each others' profits, not just their actions. This suggests that the situation is more like the model with public information releases than the basic cascades model. This is not too surprising. We don't really think that investment decisions, once made, last forever.

Providers of capital to firms also may imitate in their investment decisions. When a distressed firm asks creditors to renegotiate the terms of its debt, the refusal of one creditor may affect the decisions of others. Similarly, if some bank depositors withdraw their funds from a troubled bank, others may follow, leading to a bank run. In both these examples, there is a payoff interaction as well as an informational interaction between different people. If I get my money out first, this leaves less for you. However, at the very start of the bank run, when only a few people have withdrawn, the information conveyed by their actions may be the dominating influence upon others. An analysis of informational cascades in the start of bank runs is provided by Corb (1993). Chen (1993) examines contagious cascades of runs between banks.

It is natural to wonder whether cascades apply to stock market investments. After all, market price fluctuations have been described with such phrases as "manias," "panics," "fads," "animal spirits," "investor sentiment" and "bubbles." The basic cascades model is too simple to capture these phenomena, because the cost of "adopting" the action of buying a stock is not constant. As the price rises, it becomes more expensive to buy the stock. Lee (1993) has shown that market booms and crashes can occur in a modified cascades model. Currently there is little evidence as to the validity of this explanation.

4.5 Peer Influence and Stigma

In The Mask of Command, John Keegan describes Alexander the Great as "... the supreme hero. Nowhere do the dimensions of his heroic effort show more clearly than in his personal conduct on the battlefield." For example, "at Multan, he attempted to take the city virtually single-handed. It was thus that he suffered his nearly fatal wound." After confusion led to a delay in bringing up the siege ladders, Alexander... seized one himself, set it against the wall, held his shield over his head and started up...
Reaching the battlements, he pushed some of the Indians off it with his shield, killed others with his sword and waited for his followers to join him in the foothold he had won. They were so anxious to reach him... that they overcrowded the ladder,
which broke, decanting those at the top on to those at the bottom and so stopping anyone getting to Alexander's help. He, 'conspicuous both by the splendor of his arms and by his miraculous courage,' was now under attack by bowmen at close range. He could not remain where he was. He would not jump down to safety. He therefore jumped into the city and began to lay about him with his sword like Gulliver among the Lilliputians. As Keegan's book makes clear, Alexander's courage was "exemplary": his example encouraged others to fight harder.23

23 According to Packenham (1979, p. 133), "There were times in the wars of the nineteenth and earlier centuries when a general had to sacrifice his life to rally the troops." He describes the disproportionately high English officer casualties in the Boer War.

More generally, in battle, waves of optimism or pessimism often make the difference between victory and defeat. If others desert, those remaining may not only lose the aid of their comrades (payoff interaction), but may infer that their comrades viewed victory as unlikely (du Picq 1921). The ancient emphasis on heroic leadership has an informational interpretation. As in the fashion leader model, officers will normally know better than their troops about the prospects for victory. Thus, courage by officers should have a disproportionate effect on the morale of the troops. Rank distinctions aside, it would be very interesting to study whether the actions of a very few early deserters or early fighters can have a disproportionate effect on the morale and behavior of the others. It may be objected that information is not the only explanation for a motivating effect of heroism. Doesn't brave behavior by officers shame the troops into bravery? Possibly, but on the other hand, many troops would rather be shamefully alive than honorably dead. Given the intense emotions and challenges to self-command associated with battle, it is easy to see how some commentators could misinterpret a rational information effect as "shaming". Conformity to peers in general is often assumed automatically to be due to coercion rather than informational effects. Contrast the clichéd, judgmental phrase "peer pressure," with the neutral "peer influence." Conformity may often occur voluntarily when people faced with similar decisions, especially those with little information or experience, obtain information from the decisions of
others. In a famous set of experiments, the social psychologist Solomon Asch found that people asked to compare the lengths of lines tended to follow the comparisons made by other members of their group. While this is usually viewed as due to a pressure to conform, it is also possible that people genuinely change their beliefs after observing others' choices. Genuine coercion can arise from the threat of stigma, a shared negative treatment of someone who violates the norm of the group. But those who stigmatize may have informational reasons for doing so. Social psychologists have found evidence suggesting that stigma is specific to the group, and that people learn to stigmatize by observing the actions of others such as parents (see Ainlay, Becker, and Coleman [1986]). Just as gaps in one's resume are damaging, a frequent job-switcher may be treated with suspicion. The traditional (and now weakened) stigma carried by divorced persons provides another example.

5 Fads

Customs and standards sometimes shift abruptly without obvious reason. In the basic cascades model, people very quickly start to do the same thing, which is quite often a mistake. The initial cascade forms based on very little information. Because of this, if the model is changed by adding small shocks to the system, a persistent behavior among early individuals can become unpopular among later individuals.24 This section considers one kind of shock to the system that leads to seemingly whimsical shifts in behavior. Suppose that there is a small probability that the underlying value of adopting versus rejecting can change after the 100th person (say). Then the cascade can very easily switch. So much is obvious: if it used to be better to adopt, and now it is better to reject, then it is possible people will notice this fact, and act accordingly. What is interesting is that the cascade can switch not just because the right action has changed, but because people think it may have changed.
This means that even if adopting was the right thing to do and the original cascade correctly involved adopting, individual 101 may happen to observe an L signal and wrongly switch to rejecting because he thinks that rejection has become better. In effect, the possibility of the value change shakes up a precarious balance. As a result, the likelihood of an action change (from a cascade of rejection to one of acceptance, or vice versa) can be far greater than the probability that the correct choice has changed. Bikhchandani, Welch, and I provide a numerical example in which, after the 100th person, there is a 5% chance of a switch in best action (from Adopt to Reject, or vice versa). This leads to a probability of over 9.35% that the cascade is at some point reversed, 87% higher than the probability that the best action actually changed. The effect of cascades is to create temporary uniformity (during the time after the initial cascade starts in the first 100 individuals), but the situation becomes highly volatile at the time of the shock (just before individual 101). Since people were not very sure that the cascade was correct in the first place, they shift at the slightest provocation. I have focused on one kind of shock, value changes, that causes fads. Other types of noise or shocks can have the same effect. For example, if I can't observe or remember perfectly what previous people did, or if I think their costs and benefits differ from mine, I may be inclined to switch. Again, this can lead to abrupt shifts in the behavior of many people.

24 In this model, since each person makes a single decision in strict sequence, fads are defined as shifts in behavior between early and later individuals. However, as Chamley and Gale (1992) and Lohmann (1992a) have shown, the cascades concept extends to settings in which a given individual chooses when to switch from one behavior to another.

6 The Decision to Acquire Information

Information is costly to acquire; it is cheaper to rely on the information conveyed by the decisions of others. Suppose that each person decides whether or not to investigate (which is costly), and then decides whether to adopt or reject. Then whenever anyone declines to acquire information, everyone who follows will do likewise. To see this, suppose that Deedee does not buy information.
Then Edgar's decision about whether to invest in information is based on precisely the same public information that Deedee had. Since Deedee didn't buy information, neither should Edgar. Repeating this reasoning shows that no one later in the queue buys information. Since a cascade is virtually sure to form by the time a late individual is reached (see Section 2), any information he might purchase would have no value. So individuals who are late in the queue virtually never acquire information. An
early individual, for whom a private signal may prove decisive, is of course more likely to acquire information. In confirmation, Rogers and Shoemaker (1971) conclude from 12 empirical studies on diffusion of innovations that "early adopters seek more information about innovations than later adopters." Interestingly, even when people can observe the signals of their predecessors (not just their actions), cascades can form and bring about idiosyncratic behavior and fragility. As soon as early investigating individuals generate information with a fairly mild preponderance of evidence in favor of one action or another, a point is reached where later individuals will imitate regardless of their own signals. At this point, they have no reason to investigate, so the general pool of public knowledge stops growing. As with the basic model, society therefore ends up fixed upon an action based on weak evidence; and of course, it only takes a small shock to dislodge such an ill-informed cascade.

7 Conclusion

People often end up doing the same thing without any obvious sanctions against deviants. The theory of informational cascades helps explain why conformity occurs, how it is maintained, and how it is broken. Furthermore, cascades explain why the conformist outcome is often wrong. The reason is that cascades start readily based on very small amounts of information. Once the past history of adoptions or rejections becomes just informative enough to outweigh a person's private information signal, he follows his predecessors. At this point his action is uninformative to later decisionmakers, so later followers join the bandwagon. But while this cascade of identical or conformist behavior can become quite long, it is not strong. A small shock, such as a public information disclosure, a value change, or even the possibility of such a change, can lead to an abrupt shift. The cascades theory should be contrasted with some other theories of instability.
Many models of behavior imply that actions of large groups can be fragile under special circumstances. In these alternative models, there is instability only if the system coincidentally is balanced near a knife-edge. In contrast, when there are informational cascades the system systematically moves to a precarious
position: everyone is doing the same thing, but just barely prefers to do so. In most actual conformist situations, one or more forces may be operative: information transfers, sanctions against deviants, payoff interactions, and direct desire to conform. None of these are inconsistent with informational cascades. In a big group, the behavior of the first few individuals probably doesn't impose sanctions on others, nor directly affect others' payoffs very much, nor does it create much immediate pressure to conform. However, according to cascades theory, the actions of the first few individuals will still be extremely influential. The basic cascades model implies occasional, irregular bouts of sudden change because people re-estimate the costs and benefits of different alternatives. In some theories of fads or fashion, change occurs regularly because how much some people value different alternatives directly depends on what others are doing. For example, some people may have a direct preference for change. Or, some people may have a direct preference for deviating from the actions of others. For example, whether a short skirt is acceptable this season depends on who else decides to wear a short skirt. In a setting where most people want to conform, many outcomes may be possible. Will pink be the fashionable color next year or black? It may be that neither color is superior to the other, but people do care about wearing the popular color. If so, people who need to buy clothes early will try to forecast what others will be doing. Even in such cases, the choices of early individuals can start cascades, because the followers may infer from the early choice of black (say) that early people have reason to suspect that black will be "in". Thus, even though this setting is rather distant from the basic cascades model, cascades can still help to explain choices.
The key ingredient of the cascades approach is that individuals make decisions more or less in sequence, with later decision makers observing something about the choices of earlier ones. This sequentiality is probably present in the introduction of clothing fashions and in the start of political revolutions, as well as in other applications mentioned earlier. Thus, cascades can help explain the process by which society switches from one steady state to another.

I will mention two possible extensions of the model. The first is to include liaison individuals, i.e., people who link two or more cliques. For example, a cascade in France may go in the opposite direction from a cascade in England. If someone can observe both cascades, and if his decision can be observed in both countries, then he may break one of the two cascades. As the world becomes more of a global village, the cascades analysis predicts that such linkage can reverse local cascades. U.S. "cultural imperialism" (in television, cinema, fast food, sneakers, and blue jeans) may be a case in point. Socially, it may be desirable to have separate groups that are only later combined, so that later individuals can aggregate the information of several cascades instead of just one. The other extension is to consider varying costs of adoption. If costs of adoption are increasing over time, for example, then cascades of adoption can be broken.

In work in progress (Hirshleifer and Welch 1993a,b), Welch and I show that cascades are just a special case of a more general phenomenon which we call inertia. When an individual such as a corporate manager can observe previous decisions (of his predecessor) but not previous signals, the new manager is often biased in favor of continuing and even escalating the old policies. A new manager will rationally tend to invest in costly expansions of projects-in-place based on the likelihood that his predecessor had good reason for initiating the original project. We argue that failures of institutional memory (as when a manager is replaced) lead to such problems as sunk-cost biases (continuing investment in failing projects), retention of deadwood, the Peter Principle (promoting employees to their level of incompetence), and undervaluation of future opportunities for growth.
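The liaison-individual extension discussed above can be illustrated with a small decision rule. The sketch below is mine, not the paper's, and it rests on a stylized assumption: that the public information of each group's cascade is summarized by a net signal count (an up cascade reveals roughly two net favorable signals, a down cascade two net unfavorable ones). A liaison who observes both groups pools these counts with his own signal; ties are broken by following the liaison's own signal.

```python
def liaison_choice(d_home, d_abroad, own_signal):
    """Choice of a hypothetical 'liaison' who observes the net
    inferred signal counts of two separated groups plus his own
    private signal (+1 favors adoption, -1 opposes).

    A stylized sketch: each group's cascade is assumed to be
    summarized by its pre-cascade net count (+2 for an up
    cascade, -2 for a down cascade)."""
    total = d_home + d_abroad + own_signal
    if total > 0:
        return +1
    if total < 0:
        return -1
    return own_signal   # indifferent: follow one's own signal

# Opposite cascades cancel, so the liaison's own signal decides;
# his publicly observed action thus injects genuine information
# into both groups and may break one of the cascades.
assert liaison_choice(+2, -2, -1) == -1
assert liaison_choice(+2, -2, +1) == +1
# Aligned cascades reinforce each other: his own signal is swamped.
assert liaison_choice(+2, +2, -1) == +1
```

This is why linking two groups locked in opposite cascades can reverse one of them, whereas linking two groups in the same cascade only deepens it.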

References

Ainlay, Stephen C.; Becker, Gaylene; and Coleman, Lerita M. The Dilemma of Difference: A Multidisciplinary View of Stigma. New York: Plenum Press, 1986.

Akerlof, George A. "The Economics of Caste and of the Rat Race and Other Woeful Tales." Quarterly Journal of Economics 90 (November 1976): 599-617.

Akerlof, George A. "A Theory of Social Custom, of which Unemployment May Be One Consequence." Quarterly Journal of Economics 94 (June 1980): 749-75.

Arthur, W. Brian. "Competing Technologies, Increasing Returns, and Lock-in by Historical Events." The Economic Journal 99 (March 1989): 116-31.

Asch, Solomon E. Social Psychology. Englewood Cliffs, New Jersey: Prentice-Hall, 1952.

Bandura, Albert. Social Learning Theory. Englewood Cliffs, New Jersey: Prentice-Hall, 1977.

Banerjee, Abhijit. "A Simple Model of Herd Behavior." Quarterly Journal of Economics 107 (August 1992): 797-818.

Bartels, Larry M. Presidential Primaries and the Dynamics of Public Choice. Princeton, New Jersey: Princeton University Press, 1988.

Becker, Gary S. "A Note on Restaurant Pricing and Other Examples of Social Influences on Price." Journal of Political Economy 99 (October 1991): 1109-16.

Becker, Gary S. "Son of Fish Market: A Further Note on Restaurant Pricing and Other Examples of Social Influence on Price." University of Chicago, Economics Department Working Paper, May 1992.

Bikhchandani, Sushil; Hirshleifer, David; and Welch, Ivo. "A Theory of Fads, Fashion, Custom, and Cultural Change as Informational Cascades." Journal of Political Economy 100 (October 1992): 992-1026.

Burnum, John F. New England Journal of Medicine 317 (19) (November 1987): 1220-22.

Chamley, Christophe, and Gale, Douglas. "Information Revelation and Strategic Delay in Irreversible Decisions." Preliminary draft, Boston University, July 1992.

Chen, Yehning. "Payoff Externality, Information Externality, and Banking Panics." UCLA Anderson Graduate School of Management, October 1993.

Coleman, James. "Norms as Social Capital." In Economic Imperialism: The Economic Approach Applied Outside the Field of Economics, G. Radnitzky and P. Bernholz, eds. New York: Paragon House, 1987.

Corb, Howard M. "The Nature of Bank Runs." Graduate School of Business, Stanford University, 1993.

du Picq, Charles Ardant. Battle Studies: Ancient and Modern Battle. New York: Macmillan, 1921.

Dugatkin, Lee A. "Sexual Selection and Imitation: Females Copy the Mate Choice of Others." American Naturalist 139 (1992): 1384-89.

Dugatkin, Lee A., and Godin, Jean-Guy J. "Reversal of Female Mate Choice by Copying in the Guppy (Poecilia reticulata)." Proceedings of the Royal Society of London, Series B (1992, in press).

Dugatkin, Lee A., and Godin, Jean-Guy J. "Female Copying in the Guppy (Poecilia reticulata): Age-Dependent Effects." Behavioral Ecology (1993, in press).

Foresi, Silverio, and Mei, Jianping. "Do Firms 'Keep up with the Joneses?': Evidence on Cross-Sectional Variations in Investment." NYU Salomon Center Working Paper S-91-41, June 1991.

Foresi, Silverio, and Mei, Jianping. "Interaction in Investment among Japanese Rival Firms." NYU mimeo, August 1992.

Galef, Bennett G., Jr. "Social Transmission of Acquired Behavior: A Discussion of Tradition and Social Learning in Vertebrates." Advances in the Study of Behavior 6 (1976): 77-100.

Gibson, Robert M.; Bradbury, Jack W.; and Vehrencamp, Sandra L. "Mate Choice in Lekking Sage Grouse Revisited: The Roles of Vocal Display, Female Site Fidelity, and Copying." Behavioral Ecology 2 (1992): 165-80.

Hinde, R. A., and Fisher, J. "Further Observations on the Opening of Milk Bottles by Birds." British Birds 44 (1952): 393-96.

Hirshleifer, David, and Rasmusen, Eric. "Cooperation in a Repeated Prisoners' Dilemma with Ostracism." Journal of Economic Behavior and Organization 12 (August 1989): 87-106.

Hirshleifer, David, and Welch, Ivo. "Institutional Memory, Inertia, and Impulsiveness." Work in progress.

Hirshleifer, David, and Welch, Ivo. "Institutional Memory, Escalation, and Investment Decisions." Work in progress.

Jones, Stephen R. G. The Economics of Conformism. Oxford: Basil Blackwell, 1984.

Kawai, M. "Newly Acquired Pre-Cultural Behavior of the Natural Troop of Japanese Monkeys on Koshima Inlet." Primates 6 (August 1965): 1-30.

Keegan, John. The Mask of Command, Ch. 1 (pp. 13-91). New York: Elisabeth Sifton Books/Viking, 1987.

Kuran, Timur. "Sparks and Prairie Fires: A Theory of Unanticipated Political Revolution." Public Choice 61 (April 1989): 41-74.

Kuran, Timur. "Now out of Never: The Element of Surprise in the East European Revolution of 1989." World Politics 44 (October 1991): 7-48.

Lee, In Ho. "Market Crashes and Informational Cascades." UCLA Economics Department, September 1992.

Liebowitz, S. J., and Margolis, Stephen E. "The Fable of the Keys." Journal of Law and Economics 33 (April 1990): 1-25.

Lohmann, Susanne. "Rationality, Revolution and Revolt: The Dynamics of Informational Cascades." Graduate School of Business Research Paper No. 1213, Stanford University, October 1992a.

Lohmann, Susanne. "The Dynamics of Regime Collapse: A Case Study of the Leipzig Monday Demonstrations." Graduate School of Business Research Paper No. 1225, Stanford University, October 1992b.

Merton, Robert K. The Sociology of Science: Theoretical and Empirical Investigations. Chicago: University of Chicago Press, 1973.

Pakenham, Thomas. The Boer War. New York: Random House, 1979.

Pomiankowski, Andrew. "How to Find the Top Male." Nature 347 (October 1990): 616-17.

Robin, Eugene D. Matters of Life and Death: Risks vs. Benefits of Medical Care. New York: Freeman and Co., 1984.

Rogers, Everett M., and Shoemaker, F. Floyd. Communication of Innovations: A Cross-Cultural Approach, 2d ed. New York: Macmillan, 1971.

Rogers, Everett M. Diffusion of Innovations, 3rd ed. New York: Free Press, 1983.

Ryan, Bryce, and Gross, Neal C. "The Diffusion of Hybrid Seed Corn in Two Iowa Communities." Rural Sociology 8 (March 1943): 15-24.

Schelling, Thomas C. Micromotives and Macrobehavior. New York: Norton, 1978.

Stamps, J. A. "Conspecific Attraction and Aggregation in Territorial Species." American Naturalist 131 (March 1988): 329-47.

Taylor, Richard. Medicine Out of Control. Melbourne: Sun Books, 1979.

Welch, Ivo. "Sequential Sales, Learning and Cascades." The Journal of Finance 47 (1992).
