First published in print format by Cambridge University Press. Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication, and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.
Bertrand Russell

About 15 years ago, I wrote the book Essentials of Probability, which was designed for a one-semester course in probability taken by math majors and students from other departments. This book is, in some sense, the second edition of that book, but there are several important changes:

- Chapter 1 quickly introduces the notions of independence, distribution, and expected value, which previously made their entrance in Chapters 2, 3, and 4.
This makes it easier to discuss examples; for instance, we can now talk about the expected value of bets. This material is usually covered in an undergraduate stochastic processes course, if you are fortunate enough to have one in your department, but in our experience it is popular with students. This decision originated in a desire to minimize the reliance on calculus, but over time I have grown to enjoy abandoning the boring mechanics of marginal and conditional distributions to spend more time talking about probability.
This book, like its predecessor, takes the philosophy that the best way to learn probability is to see it in action. There are a large number of problems and examples. These include all the old standards: the World Series, dice and card games, the birthday problem, Monty Hall, posterior probabilities in medical testing, and various applications of the central limit theorem.
Newer examples include the trials of O. J. Simpson and Sally Clark, how to play blackjack, cognitive dissonance in monkeys, the hot hand in basketball, and option pricing. I have done my best to explain things clearly, but the reader and the instructor should be warned that thinking is required. How should you teach from this book? At this point, one can go on to Markov chains in Chapter 4 (which I prefer) or to continuous distributions in Chapter 5. From Chapter 4 you can go to Chapter 5, or leave this boring topic to the instructor of the statistics course that follows yours and go on to the law of large numbers and central limit theorem in Chapter 6.
The all-important normal is a continuous distribution, of course, but all computations for it are done with tables, so the only concept one needs is the distribution function. Finally, Chapter 7 is a brief introduction to option pricing.
I find this makes a nice final lecture before one turns to the business of reviewing material in preparation for the final exam.

Supporting cast

The writing of this book benefited from the comments of several people paid by Cambridge University Press to read various chapters, and in particular from the efforts of one reader who made hundreds of comments on the writing style. In the spring quarter, Ed Waymire used the book at Oregon State, and in the fall quarter, Michael Phelan used the book at U.
I am grateful to Michael for his many comments and his enthusiasm for the book. Now David is a senior at Ithaca College, one semester away from graduating with a major in journalism, and wondering if the economic collapse brought on by 8 years of the Bush administration will keep him from getting a job. Greg, who is a junior at MIT double majoring in computer science and math, has better long-term job prospects, but he will probably go to graduate school before deciding how close he wants to be to the real world.
This brings me to the two women who are the most important for this book. The first is my wife, Susan. After 28 years of marriage and almost a dozen prefaces, I have run out of clever things to say. When the kids are home from college, as they are now during winter break, she is a flurry of activity. In between, she fills her empty nest with the New York Times, its crossword puzzles (an addiction I share), and tending to her parents, who moved to Ithaca from the Sacramento area about 5 years ago.
In December, we had our first vacation away together since David was born.

The other important woman is my editor, Lauren Cowles. After seeing Essentials of Probability moved from Wadsworth to Duxbury Press and on to International Thomson Publishing and then go out of print without my being told, it is nice to be in the hands of someone who cares about my books.
Even though I and others have spent a lot of effort debugging the book, it is inevitable that there will be typos. Email them to rtd1@cornell.edu.

The notions of independence, distribution, and expected value are studied in more detail later, but it is hard to discuss examples without them, so we introduce them quickly here.
As we will see, the range of applications extends beyond games of chance into business decisions, insurance, law, medical tests, and the social sciences.
The telephone network, call centers, and airline companies with their randomly fluctuating loads could not have been economically designed without probability theory.
To quote Pierre-Simon, marquis de Laplace, from two centuries ago: "It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge. The most important questions of life are, for the most part, really only problems of probability." In order to address these applications, we need to develop a language for discussing them. Euclidean geometry begins with the notions of point and line.
The corresponding basic object of probability is an experiment: an activity or procedure that produces distinct, well-defined possibilities called outcomes.
Here and throughout the book, boldface type indicates a term that is being defined.

Example 1. Roll two dice. If we suppose, for convenience, that they are red and green, then we can write the outcomes of this experiment as (m, n), where m is the number on the red die and n is the number on the green die.
The goal of probability theory is to compute the probability of various events of interest. Intuitively, an event is a statement about the outcome of an experiment. The formal definition is: an event is a subset of the sample space. In general, the probability of an event C concerning the roll of two dice is the number of outcomes in C divided by 36, the total number of outcomes. For the moment, we use this interpretation of P(A) to explain the definition.
Assumption (iii) implies that (iv) holds for a finite number of events, but for infinitely many events the last argument breaks down, and this is a new assumption. Not everyone believes that Assumption (iv) should be used. However, without (iv) the theory of probability becomes much more difficult and less useful, so we impose this assumption and do not apologize further for it. In many cases the sample space is finite, so (iv) is not relevant anyway. To describe the probability, it is enough to give the values for the individual outcomes, since (iii) implies that P(A) is the sum of the probabilities of the outcomes in A.
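To make the last point concrete, here is a small sketch computing P(A) as the sum of the probabilities of the outcomes in A, using the two-dice experiment above. The particular event chosen (the dice sum to 7) is our own illustrative example, not one from the text:

```python
from fractions import Fraction

# Sample space for two fair dice: 36 equally likely outcomes (m, n).
omega = [(m, n) for m in range(1, 7) for n in range(1, 7)]
p = Fraction(1, len(omega))  # each outcome has probability 1/36

# A hypothetical event A: the two dice sum to 7.
A = [(m, n) for (m, n) in omega if m + n == 7]

# P(A) is the sum of the probabilities of the outcomes in A.
prob_A = sum(p for _ in A)
print(prob_A)  # 1/6, since 6 of the 36 outcomes sum to 7
```

Because every outcome is equally likely here, this sum reduces to counting: 6 favorable outcomes out of 36.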
Property 1. Let A^c be the complement of A, that is, the set of outcomes not in A. Since A and A^c are disjoint and their union is the whole sample space, P(A) + P(A^c) = 1. Subtracting P(A) from each side of the equation gives P(A^c) = 1 - P(A).
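As a quick check of the complement rule, consider a standard example of our own choosing: the probability that at least one of two fair dice shows a six. Counting the complement (no sixes at all) is easier than counting the event directly:

```python
from fractions import Fraction

omega = [(m, n) for m in range(1, 7) for n in range(1, 7)]
p = Fraction(1, 36)  # each of the 36 outcomes is equally likely

# A = "at least one die shows a six", computed directly...
direct = sum(p for (m, n) in omega if m == 6 or n == 6)

# ...and via the complement A^c = "neither die shows a six".
via_complement = 1 - sum(p for (m, n) in omega if m != 6 and n != 6)

print(direct, via_complement)  # both equal 11/36
```

The complement has 5 * 5 = 25 outcomes, so P(A) = 1 - 25/36 = 11/36, agreeing with the direct count.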
This formula is useful because sometimes it is easier to compute the probability of A^c. Property 2 is the inclusion-exclusion formula: P(A or B) = P(A) + P(B) - P(A and B). A typical question of this type: how many people own either a car or a stereo? Given a set A, we use |A| to denote the number of points in A. The reasoning that led to the formula above applies to counts as well: |A or B| = |A| + |B| - |A and B|. In the World Series, the first team to win four games wins the championship. Obviously, the series may last 4, 5, 6, or 7 games. However, a fan who wants to buy a ticket would like to know the probabilities of each of these outcomes.
In short, we suppose that the games are decided by tossing a fair coin to determine whether team A or team B wins.

Four games. There are two possible ways this can happen: A wins all four games or B wins all four games.

Five games. Here and in the next case we compute the probability that A wins in the specified number of games and then multiply by 2.

Six games. We then move the remaining win for B through its possibilities.
Seven games. The numbers in parentheses give the number of series in our sample. On the other hand, the NBA finals data look like what we expect to see. The excess of six-game series could be due just to chance.

There are 30 people at a party. Should you take the bet? To pose a mathematical problem, we ignore February 29, which comes only in leap years, and suppose that each person at the party picks their birthday at random from the calendar.
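Returning to the World Series: the series-length probabilities worked out case by case above can be checked by brute-force enumeration. This is a sketch under the text's fair-coin assumption, enumerating all 2^7 win/loss patterns:

```python
from fractions import Fraction
from itertools import product

# Probability that the series ends after exactly 4, 5, 6, or 7 games.
length_prob = {n: Fraction(0) for n in (4, 5, 6, 7)}

for pattern in product("AB", repeat=7):  # all 128 full-length patterns
    wins = {"A": 0, "B": 0}
    for game, winner in enumerate(pattern, start=1):
        wins[winner] += 1
        if wins[winner] == 4:
            # Each full pattern has probability (1/2)^7; the 2^(7-g)
            # patterns sharing a decisive prefix of length g together
            # contribute (1/2)^g, as they should.
            length_prob[game] += Fraction(1, 2 ** 7)
            break

print(length_prob)  # lengths 4, 5, 6, 7 have probabilities 1/8, 1/4, 5/16, 5/16
```

Note that under the fair-coin model, six- and seven-game series are equally likely, which is the baseline against which the observed excess of six-game series is judged.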
There are 365^30 possible outcomes for this experiment. Let D be the event that all birthdays are different. Let p_k be the probability that k people all have different birthdays.

Two examples of events that are not independent were given in Example 1. Rearranging the definition of conditional probability gives P(A and B) = P(A) P(B|A). There are two ways of extending the definition of independence to more than two events. We have already seen an example of events that are pairwise independent but not independent.
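The birthday computation above can be carried out numerically. A minimal sketch, using the product formula p_k = (365/365)(364/365)...((365-k+1)/365) for the probability that k people all have different birthdays:

```python
def all_different(k: int) -> float:
    """p_k: probability that k randomly chosen birthdays are all distinct."""
    p = 1.0
    for i in range(k):
        p *= (365 - i) / 365  # the (i+1)st person avoids the first i birthdays
    return p

p30 = all_different(30)
print(round(p30, 4))      # ~0.2937: all 30 birthdays are different
print(round(1 - p30, 4))  # ~0.7063: at least two people share a birthday
```

With 30 people, a shared birthday occurs with probability about 0.71, which is why the bet in the text is tempting to the uninitiated.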
Counting the ways 2 of the girls can have the same birthday, as in Example 1, shows that A and B are independent. Similar arguments show that B and C are independent and that A and C are independent. Let (i, j, k) be the values for the three dice. To check this, note that if j is odd then i and k are even, while if j is even then i and k are odd. The last two examples are somewhat unusual.
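The exact events for the three dice are abridged in the text above, so the following sketch uses one standard instance consistent with the parity remark: A = {i + j is even}, B = {j + k is even}, C = {i + k is even}. These are pairwise independent, yet any two of them force the third:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=3))  # 216 equally likely triples

def prob(event) -> float:
    """Probability of an event, given as a predicate on a triple (i, j, k)."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: (o[0] + o[1]) % 2 == 0
B = lambda o: (o[1] + o[2]) % 2 == 0
C = lambda o: (o[0] + o[2]) % 2 == 0

# Pairwise independent: P(A and B) = P(A) P(B) = 1/4, and similarly for the
# other pairs.
print(prob(lambda o: A(o) and B(o)), prob(A) * prob(B))  # 0.25 0.25

# Not independent: A and B together force C, so the triple intersection has
# probability 1/4, not P(A) P(B) P(C) = 1/8.
print(prob(lambda o: A(o) and B(o) and C(o)))  # 0.25
```

This is exactly the parity mechanism the text describes: knowing the parities of i + j and j + k determines the parity of i + k.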
Elementary Probability for Applications, Edition 1