
Probability and Statistics (4th Edition)

Cover
Title Page
Copyright Page
Contents
Preface
Acknowledgments
1 INTRODUCTION TO PROBABILITY
1.1 The History of Probability
References
1.2 Interpretations of Probability
The Frequency Interpretation of Probability
The Classical Interpretation of Probability
The Subjective Interpretation of Probability
1.3 Experiments and Events
Types of Experiments
The Mathematical Theory of Probability
1.4 Set Theory
The Sample Space
Relations of Set Theory
Operations of Set Theory
Summary
1.5 The Definition of Probability
Axioms and Basic Theorems
Further Properties of Probability
Summary
1.6 Finite Sample Spaces
Requirements of Probabilities
Simple Sample Spaces
Summary
1.7 Counting Methods
Multiplication Rule
Permutations
The Birthday Problem
Summary
1.8 Combinatorial Methods
Combinations
Binomial Coefficients
Summary
1.9 Multinomial Coefficients
Summary
1.10 The Probability of a Union of Events
The Union of a Finite Number of Events
Summary
1.11 Statistical Swindles
Misleading Use of Statistics
Perfect Forecasts
Guaranteed Winners
Improving Your Lottery Chances
1.12 Supplementary Exercises
2 CONDITIONAL PROBABILITY
2.1 The Definition of Conditional Probability
The Multiplication Rule for Conditional Probabilities
Conditional Probability and Partitions
Summary
2.2 Independent Events
Definition of Independence
Independence of Two Events
Independence of Several Events
Conditionally Independent Events
Summary
2.3 Bayes’ Theorem
Statement, Proof, and Examples of Bayes’ Theorem
Prior and Posterior Probabilities
Summary
2.4 The Gambler’s Ruin Problem
Statement of the Problem
Solution of the Problem
Summary
2.5 Supplementary Exercises
3 RANDOM VARIABLES AND DISTRIBUTIONS
3.1 Random Variables and Discrete Distributions
Definition of a Random Variable
The Distribution of a Random Variable
Discrete Distributions
Uniform Distributions on Integers
Binomial Distributions
Summary
3.2 Continuous Distributions
The Probability Density Function
Nonuniqueness of the p.d.f.
Uniform Distributions on Intervals
Other Continuous Distributions
Summary
3.3 The Cumulative Distribution Function
Definition and Basic Properties
Determining Probabilities from the Distribution Function
The c.d.f. of a Discrete Distribution
The c.d.f. of a Continuous Distribution
The Quantile Function
Summary
3.4 Bivariate Distributions
Discrete Joint Distributions
Continuous Joint Distributions
Mixed Bivariate Distributions
Bivariate Cumulative Distribution Functions
Summary
3.5 Marginal Distributions
Deriving a Marginal p.f. or a Marginal p.d.f.
Independent Random Variables
Summary
3.6 Conditional Distributions
Discrete Conditional Distributions
Continuous Conditional Distributions
Construction of the Joint Distribution
Summary
3.7 Multivariate Distributions
Joint Distributions
Mixed Distributions
Marginal Distributions
Independent Random Variables
Conditional Distributions
Histograms
Summary
3.8 Functions of a Random Variable
Random Variable with a Discrete Distribution
Random Variable with a Continuous Distribution
The Probability Integral Transformation
Simulation
Summary
3.9 Functions of Two or More Random Variables
Random Variables with a Discrete Joint Distribution
Random Variables with a Continuous Joint Distribution
Summary
3.10 Markov Chains
Stochastic Processes
Markov Chains
The Transition Matrix
The Initial Distribution
Stationary Distributions
Summary
3.11 Supplementary Exercises
4 EXPECTATION
4.1 The Expectation of a Random Variable
Expectation for a Discrete Distribution
Expectation for a Continuous Distribution
Interpretation of the Expectation
The Expectation of a Function
Summary
4.2 Properties of Expectations
Basic Theorems
Expectation of a Product of Independent Random Variables
Summary
4.3 Variance
Definitions of the Variance and the Standard Deviation
Properties of the Variance
The Variance of a Binomial Distribution
Interquartile Range
Summary
4.4 Moments
Existence of Moments
Moment Generating Functions
Properties of Moment Generating Functions
Summary
4.5 The Mean and the Median
The Median
Comparison of the Mean and the Median
Minimizing the Mean Squared Error
Minimizing the Mean Absolute Error
Summary
4.6 Covariance and Correlation
Covariance
Correlation
Properties of Covariance and Correlation
Summary
4.7 Conditional Expectation
Definition and Basic Properties
Prediction
Summary
4.8 Utility
Utility Functions
Examples of Utility Functions
Selling a Lottery Ticket
Some Statistical Decision Problems
Summary
4.9 Supplementary Exercises
5 SPECIAL DISTRIBUTIONS
5.1 Introduction
5.2 The Bernoulli and Binomial Distributions
The Bernoulli Distributions
The Binomial Distributions
Summary
5.3 The Hypergeometric Distributions
Definition and Examples
The Mean and Variance for a Hypergeometric Distribution
Comparison of Sampling Methods
Summary
5.4 The Poisson Distributions
Definition and Properties of the Poisson Distributions
The Poisson Approximation to Binomial Distributions
Poisson Processes
Summary
5.5 The Negative Binomial Distributions
Definition and Interpretation
The Geometric Distributions
Properties of Negative Binomial and Geometric Distributions
Summary
5.6 The Normal Distributions
Importance of the Normal Distributions
Properties of Normal Distributions
The Standard Normal Distribution
Comparisons of Normal Distributions
Linear Combinations of Normally Distributed Variables
The Lognormal Distributions
Summary
5.7 The Gamma Distributions
The Gamma Function
The Gamma Distributions
The Exponential Distributions
Life Tests
Relation to the Poisson Process
Summary
5.8 The Beta Distributions
The Beta Function
Definition of the Beta Distributions
Moments of Beta Distributions
Summary
5.9 The Multinomial Distributions
Definition and Derivation of Multinomial Distributions
Relation between the Multinomial and Binomial Distributions
Means, Variances, and Covariances
5.10 The Bivariate Normal Distributions
Definition and Derivation of Bivariate Normal Distributions
Properties of Bivariate Normal Distributions
Linear Combinations
Summary
5.11 Supplementary Exercises
6 LARGE RANDOM SAMPLES
6.1 Introduction
6.2 The Law of Large Numbers
The Markov and Chebyshev Inequalities
Properties of the Sample Mean
The Law of Large Numbers
Summary
6.3 The Central Limit Theorem
Statement of the Theorem
The Delta Method
Summary
6.4 The Correction for Continuity
Approximating a Discrete Distribution by a Continuous Distribution
Approximating a Bar Chart
Summary
6.5 Supplementary Exercises
7 ESTIMATION
7.1 Statistical Inference
Probability and Statistical Models
Examples of Statistical Inference
General Classes of Inference Problems
Definition of a Statistic
References
7.2 Prior and Posterior Distributions
The Prior Distribution
The Posterior Distribution
The Likelihood Function
Sequential Observations and Prediction
Summary
7.3 Conjugate Prior Distributions
Sampling from a Bernoulli Distribution
Sampling from a Poisson Distribution
Sampling from a Normal Distribution
Sampling from an Exponential Distribution
Improper Prior Distributions
Summary
7.4 Bayes Estimators
Nature of an Estimation Problem
Definition of a Bayes Estimator
Different Loss Functions
The Bayes Estimate for Large Samples
More General Parameters and Estimators
Summary
7.5 Maximum Likelihood Estimators
Introduction
Definition of a Maximum Likelihood Estimator
Examples of Maximum Likelihood Estimators
Summary
7.6 Properties of Maximum Likelihood Estimators
Invariance
Consistency
Numerical Computation
Method of Moments
M.L.E.’s and Bayes Estimators
7.7 Sufficient Statistics
Definition of a Sufficient Statistic
The Factorization Criterion
Summary
7.8 Jointly Sufficient Statistics
Definition of Jointly Sufficient Statistics
Minimal Sufficient Statistics
Maximum Likelihood Estimators and Bayes Estimators as Sufficient Statistics
Summary
7.9 Improving an Estimator
The Mean Squared Error of an Estimator
Conditional Expectation When a Sufficient Statistic Is Known
Summary
7.10 Supplementary Exercises
8 SAMPLING DISTRIBUTIONS OF ESTIMATORS
8.1 The Sampling Distribution of a Statistic
Statistics and Estimators
Purpose of the Sampling Distribution
Summary
8.2 The Chi-Square Distributions
Definition of the Distributions
Properties of the Distributions
Summary
8.3 Joint Distribution of the Sample Mean and Sample Variance
Independence of the Sample Mean and Sample Variance
Estimation of the Mean and Standard Deviation
Summary
8.4 The t Distributions
Definition of the Distributions
Relation to Random Samples from a Normal Distribution
Relation to the Cauchy Distribution and to the Standard Normal Distribution
Summary
8.5 Confidence Intervals
Confidence Intervals for the Mean of a Normal Distribution
One-Sided Confidence Intervals
Confidence Intervals for Other Parameters
Summary
8.6 Bayesian Analysis of Samples from a Normal Distribution
The Precision of a Normal Distribution
The Marginal Distribution of the Mean
A Numerical Example
Improper Prior Distributions
Summary
8.7 Unbiased Estimators
Definition of an Unbiased Estimator
Unbiased Estimation of the Variance
Summary
8.8 Fisher Information
Definition and Properties of Fisher Information
The Information Inequality
Efficient Estimators
Properties of Maximum Likelihood Estimators for Large Samples
Summary
8.9 Supplementary Exercises
9 TESTING HYPOTHESES
9.1 Problems of Testing Hypotheses
The Null and Alternative Hypotheses
Simple and Composite Hypotheses
The Critical Region and Test Statistics
The Power Function and Types of Error
Making a Test Have a Specific Significance Level
The p-value
Equivalence of Tests and Confidence Sets
Likelihood Ratio Tests
Summary
9.2 Testing Simple Hypotheses
Introduction
The Two Types of Errors
Optimal Tests
Summary
9.3 Uniformly Most Powerful Tests
Definition of a Uniformly Most Powerful Test
Monotone Likelihood Ratio
One-Sided Alternatives
Two-Sided Alternatives
9.4 Two-Sided Alternatives
General Form of the Procedure
Selection of the Test Procedure
Other Distributions
Composite Null Hypothesis
Summary
9.5 The t Test
Testing Hypotheses about the Mean of a Normal Distribution When the Variance Is Unknown
Properties of the t Tests
The Paired t Test
Testing with a Two-Sided Alternative
Summary
9.6 Comparing the Means of Two Normal Distributions
The Two-Sample t Test
Power of the Test
Two-Sided Alternatives
Summary
9.7 The F Distributions
Definition of the F Distribution
Properties of the F Distributions
Comparing the Variances of Two Normal Distributions
Properties of F Tests
Two-Sided Alternative
Summary
9.8 Bayes Test Procedures
Simple Null and Alternative Hypotheses
Tests Based on the Posterior Distribution
One-Sided Hypotheses
Two-Sided Alternatives
Testing the Mean of a Normal Distribution with Unknown Variance
Comparing the Means of Two Normal Distributions
Comparing the Variances of Two Normal Distributions
Summary
9.9 Foundational Issues
The Relationship between Level of Significance and Sample Size
Statistically Significant Results
Summary
9.10 Supplementary Exercises
10 CATEGORICAL DATA AND NONPARAMETRIC METHODS
10.1 Tests of Goodness-of-Fit
Description of Nonparametric Problems
Categorical Data
The χ² Test
Testing Hypotheses about a Continuous Distribution
Summary
10.2 Goodness-of-Fit for Composite Hypotheses
Composite Null Hypotheses
The χ² Test for Composite Null Hypotheses
Determining the Maximum Likelihood Estimates
Testing Whether a Distribution Is Normal
Summary
10.3 Contingency Tables
Independence in Contingency Tables
The χ² Test of Independence
Summary
10.4 Tests of Homogeneity
Samples from Several Populations
The χ² Test of Homogeneity
Comparing Two or More Proportions
Correlated 2 × 2 Tables
Summary
10.5 Simpson’s Paradox
An Example of the Paradox
The Paradox Explained
Summary
10.6 Kolmogorov-Smirnov Tests
The Sample Distribution Function
The Kolmogorov-Smirnov Test of a Simple Hypothesis
The Kolmogorov-Smirnov Test for Two Samples
10.7 Robust Estimation
Estimating the Median
Contaminated Normal Distributions
Trimmed Means
Robust Estimation of Scale
M-Estimators of the Median
Comparison of the Estimators
Large-Sample Properties of Sample Quantiles
Summary
10.8 Sign and Rank Tests
One-Sample Procedures
Comparing Two Distributions
Ties
Summary
10.9 Supplementary Exercises
11 LINEAR STATISTICAL MODELS
11.1 The Method of Least Squares
Fitting a Straight Line
The Least-Squares Line
Fitting a Polynomial by the Method of Least Squares
Fitting a Linear Function of Several Variables
Summary
11.2 Regression
Regression Functions
Simple Linear Regression
The Distribution of the Least-Squares Estimators
Prediction
Summary
11.3 Statistical Inference in Simple Linear Regression
Joint Distribution of the Estimators
Tests of Hypotheses about the Regression Coefficients
Confidence Intervals
The Analysis of Residuals
Summary
11.4 Bayesian Inference in Simple Linear Regression
Improper Priors for Regression Parameters
Prediction Intervals
Tests of Hypotheses
Summary
11.5 The General Linear Model and Multiple Regression
The General Linear Model
Maximum Likelihood Estimators
Explicit Form of the Estimators
Mean Vector and Covariance Matrix
The Joint Distribution of the Estimators
Testing Hypotheses
Prediction
Multiple R²
Analysis of Residuals
Summary
11.6 Analysis of Variance
The One-Way Layout
Partitioning a Sum of Squares
Testing Hypotheses
Analysis of Residuals
Summary
11.7 The Two-Way Layout
The Two-Way Layout with One Observation in Each Cell
Estimating the Parameters
Partitioning the Sum of Squares
Testing Hypotheses
Summary
11.8 The Two-Way Layout with Replications
The Two-Way Layout with K Observations in Each Cell
Partitioning the Sum of Squares
Testing Hypotheses
The Two-Way Layout with Unequal Numbers of Observations in the Cells
Summary
11.9 Supplementary Exercises
12 SIMULATION
12.1 What Is Simulation?
Proof of Concept
Examples in which Simulation Might Help
Summary
12.2 Why Is Simulation Useful?
Examples of Simulation
Which Mean Do You Mean?
Assessing Uncertainty about Simulation Results
Summary
12.3 Simulating Specific Distributions
The Probability Integral Transformation
Acceptance/Rejection
Generating Functions of Other Random Variables
Some Examples Involving Simulation of Common Distributions
Simulating a Discrete Random Variable
Summary
12.4 Importance Sampling
Summary
12.5 Markov Chain Monte Carlo
The Gibbs Sampling Algorithm
Some Theoretical Justification
When Does the Markov Chain Converge?
Estimation Based on Gibbs Sampling
Some Examples
Prediction
Summary
12.6 The Bootstrap
Introduction
The Bootstrap in General
The Nonparametric Bootstrap
The Parametric Bootstrap
Summary
12.7 Supplementary Exercises
Tables
Answers to Odd-Numbered Exercises
References
Index
A
B
C
D
E
F
G
H
I
J
K
L
M
N
O
P
Q
R
S
T
U
V
W
Y
Z
Probability and Statistics Fourth Edition
Probability and Statistics
Fourth Edition

Morris H. DeGroot
Carnegie Mellon University

Mark J. Schervish
Carnegie Mellon University

Addison-Wesley
Boston Columbus Indianapolis New York San Francisco Upper Saddle River
Amsterdam Cape Town Dubai London Madrid Milan Munich Paris Montréal Toronto
Delhi Mexico City São Paulo Sydney Hong Kong Seoul Singapore Taipei Tokyo
Editor in Chief: Deirdre Lynch
Acquisitions Editor: Christopher Cummings
Associate Content Editors: Leah Goldberg, Dana Jones Bettez
Associate Editor: Christina Lepre
Senior Managing Editor: Karen Wernholm
Production Project Manager: Patty Bergin
Cover Designer: Heather Scott
Design Manager: Andrea Nix
Senior Marketing Manager: Alex Gay
Marketing Assistant: Kathleen DeChavez
Senior Author Support/Technology Specialist: Joe Vetere
Rights and Permissions Advisor: Michael Joyce
Manufacturing Manager: Carol Melville
Project Management, Composition: Windfall Software, using ZzTEX
Cover Photo: Shutterstock/© Marilyn Volan

The programs and applications presented in this book have been included for their instructional value. They have been tested with care, but are not guaranteed for any particular purpose. The publisher does not offer any warranties or representations, nor does it accept any liabilities with respect to the programs or applications.

Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where those designations appear in this book, and Pearson Education was aware of a trademark claim, the designations have been printed in initial caps or all caps.

Library of Congress Cataloging-in-Publication Data
DeGroot, Morris H., 1931–1989.
Probability and statistics / Morris H. DeGroot, Mark J. Schervish.—4th ed.
p. cm.
ISBN 978-0-321-50046-5
1. Probabilities—Textbooks. 2. Mathematical statistics—Textbooks. I. Schervish, Mark J. II. Title.
QA273.D35 2012
519.2—dc22
2010001486

Copyright © 2012, 2002 Pearson Education, Inc. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Printed in the United States of America. For information on obtaining permission for use of material in this work, please submit a written request to Pearson Education, Inc., Rights and Contracts Department, 75 Arlington Street, Suite 300, Boston, MA 02116, fax your request to 617-848-7047, or e-mail at http://www.pearsoned.com/legal/permissions.htm.

1 2 3 4 5 6 7 8 9 10—EB—14 13 12 11 10

www.pearsonhighered.com

ISBN 10: 0-321-50046-6
ISBN 13: 978-0-321-50046-5
To the memory of Morrie DeGroot. MJS
Contents

Preface xi

1 Introduction to Probability 1
1.1 The History of Probability 1
1.2 Interpretations of Probability 2
1.3 Experiments and Events 5
1.4 Set Theory 6
1.5 The Definition of Probability 16
1.6 Finite Sample Spaces 22
1.7 Counting Methods 25
1.8 Combinatorial Methods 32
1.9 Multinomial Coefficients 42
1.10 The Probability of a Union of Events 46
1.11 Statistical Swindles 51
1.12 Supplementary Exercises 53

2 Conditional Probability 55
2.1 The Definition of Conditional Probability 55
2.2 Independent Events 66
2.3 Bayes’ Theorem 76
2.4 The Gambler’s Ruin Problem 86
2.5 Supplementary Exercises 90

3 Random Variables and Distributions 93
3.1 Random Variables and Discrete Distributions 93
3.2 Continuous Distributions 100
3.3 The Cumulative Distribution Function 107
3.4 Bivariate Distributions 118
3.5 Marginal Distributions 130
3.6 Conditional Distributions 141
3.7 Multivariate Distributions 152
3.8 Functions of a Random Variable 167
3.9 Functions of Two or More Random Variables 175
3.10 Markov Chains 188
3.11 Supplementary Exercises 202