
Test Bank for Mathematical Statistics with Applications (7th Edition) by Dennis Wackerly

By: William Mendenhall, Dennis Wackerly, Richard L. Scheaffer
ISBN-10: 0495110817 / ISBN-13: 9780495110811

Study Guide Details

Format: Downloadable ZIP file
Authors: William Mendenhall, Dennis Wackerly, Richard L. Scheaffer

TEST BANK

$30.00 (regular price $35.00)

Instant Download to your account.

Description

Table of Contents:

Half Title
Title
Statement
Copyright
Contents
Preface
Note to the Student
Ch 1: What Is Statistics?
1.1: Introduction
1.2: Characterizing a Set of Measurements: Graphical Methods
1.3: Characterizing a Set of Measurements: Numerical Methods
1.4: How Inferences Are Made
1.5: Theory and Reality
1.6: Summary
Ch 1: References and Further Readings
Ch 1: Supplementary Exercises
Ch 2: Probability
2.1: Introduction
2.2: Probability and Inference
2.3: A Review of Set Notation
2.4: A Probabilistic Model for an Experiment: The Discrete Case
2.5: Calculating the Probability of an Event: The Sample-Point Method
2.6: Tools for Counting Sample Points
2.7: Conditional Probability and the Independence of Events
2.8: Two Laws of Probability
2.9: Calculating the Probability of an Event: The Event-Composition Method
2.10: The Law of Total Probability and Bayes’ Rule
2.11: Numerical Events and Random Variables
2.12: Random Sampling
2.13: Summary
Ch 2: References and Further Readings
Ch 2: Supplementary Exercises
Ch 3: Discrete Random Variables and Their Probability Distributions
3.1: Basic Definition
3.2: The Probability Distribution for a Discrete Random Variable
3.3: The Expected Value of a Random Variable or a Function of a Random Variable
3.4: The Binomial Probability Distribution
3.5: The Geometric Probability Distribution
3.6: The Negative Binomial Probability Distribution (Optional)
3.7: The Hypergeometric Probability Distribution
3.8: The Poisson Probability Distribution
3.9: Moments and Moment-Generating Functions
3.10: Probability-Generating Functions (Optional)
3.11: Tchebysheff’s Theorem
3.12: Summary
Ch 3: References and Further Readings
Ch 3: Supplementary Exercises
Ch 4: Continuous Variables and Their Probability Distributions
4.1: Introduction
4.2: The Probability Distribution for a Continuous Random Variable
4.3: Expected Values for Continuous Random Variables
4.4: The Uniform Probability Distribution
4.5: The Normal Probability Distribution
4.6: The Gamma Probability Distribution
4.7: The Beta Probability Distribution
4.8: Some General Comments
4.9: Other Expected Values
4.10: Tchebysheff’s Theorem
4.11: Expectations of Discontinuous Functions and Mixed Probability Distributions (Optional)
4.12: Summary
Ch 4: References and Further Readings
Ch 4: Supplementary Exercises
Ch 5: Multivariate Probability Distributions
5.1: Introduction
5.2: Bivariate and Multivariate Probability Distributions
5.3: Marginal and Conditional Probability Distributions
5.4: Independent Random Variables
5.5: The Expected Value of a Function of Random Variables
5.6: Special Theorems
5.7: The Covariance of Two Random Variables
5.8: The Expected Value and Variance of Linear Functions of Random Variables
5.9: The Multinomial Probability Distribution
5.10: The Bivariate Normal Distribution (Optional)
5.11: Conditional Expectations
5.12: Summary
Ch 5: References and Further Readings
Ch 5: Supplementary Exercises
Ch 6: Functions of Random Variables
6.1: Introduction
6.2: Finding the Probability Distribution of a Function of Random Variables
6.3: The Method of Distribution Functions
6.4: The Method of Transformations
6.5: The Method of Moment-Generating Functions
6.6: Multivariable Transformations Using Jacobians (Optional)
6.7: Order Statistics
6.8: Summary
Ch 6: References and Further Readings
Ch 6: Supplementary Exercises
Ch 7: Sampling Distributions and the Central Limit Theorem
7.1: Introduction
7.2: Sampling Distributions Related to the Normal Distribution
7.3: The Central Limit Theorem
7.4: A Proof of the Central Limit Theorem (Optional)
7.5: The Normal Approximation to the Binomial Distribution
7.6: Summary
Ch 7: References and Further Readings
Ch 7: Supplementary Exercises
Ch 8: Estimation
8.1: Introduction
8.2: The Bias and Mean Square Error of Point Estimators
8.3: Some Common Unbiased Point Estimators
8.4: Evaluating the Goodness of a Point Estimator
8.5: Confidence Intervals
8.6: Large-Sample Confidence Intervals
8.7: Selecting the Sample Size
8.8: Small-Sample Confidence Intervals for μ and μ1 − μ2
8.9: Confidence Intervals for σ²
8.10: Summary
Ch 8: References and Further Readings
Ch 8: Supplementary Exercises
Ch 9: Properties of Point Estimators and Methods of Estimation
9.1: Introduction
9.2: Relative Efficiency
9.3: Consistency
9.4: Sufficiency
9.5: The Rao–Blackwell Theorem and Minimum-Variance Unbiased Estimation
9.6: The Method of Moments
9.7: The Method of Maximum Likelihood
9.8: Some Large-Sample Properties of Maximum-Likelihood Estimators (Optional)
9.9: Summary
Ch 9: References and Further Readings
Ch 9: Supplementary Exercises
Ch 10: Hypothesis Testing
10.1: Introduction
10.2: Elements of a Statistical Test
10.3: Common Large-Sample Tests
10.4: Calculating Type II Error Probabilities and Finding the Sample Size for Z Tests
10.5: Relationships Between Hypothesis-Testing Procedures and Confidence Intervals
10.6: Another Way to Report the Results of a Statistical Test: Attained Significance Levels, or p-Values
10.7: Some Comments on the Theory of Hypothesis Testing
10.8: Small-Sample Hypothesis Testing for μ and μ1 − μ2
10.9: Testing Hypotheses Concerning Variances
10.10: Power of Tests and the Neyman–Pearson Lemma
10.11: Likelihood Ratio Tests
10.12: Summary
Ch 10: References and Further Readings
Ch 10: Supplementary Exercises
Ch 11: Linear Models and Estimation by Least Squares
11.1: Introduction
11.2: Linear Statistical Models
11.3: The Method of Least Squares
11.4: Properties of the Least-Squares Estimators: Simple Linear Regression
11.5: Inferences Concerning the Parameters βi
11.6: Inferences Concerning Linear Functions of the Model Parameters: Simple Linear Regression
11.7: Predicting a Particular Value of Y by Using Simple Linear Regression
11.8: Correlation
11.9: Some Practical Examples
11.10: Fitting the Linear Model by Using Matrices
11.11: Linear Functions of the Model Parameters: Multiple Linear Regression
11.12: Inferences Concerning Linear Functions of the Model Parameters: Multiple Linear Regression
11.13: Predicting a Particular Value of Y by Using Multiple Regression
11.14: A Test for H0: βg+1 = βg+2 = ··· = βk = 0
11.15: Summary and Concluding Remarks
Ch 11: References and Further Readings
Ch 11: Supplementary Exercises
Ch 12: Considerations in Designing Experiments
12.1: The Elements Affecting the Information in a Sample
12.2: Designing Experiments to Increase Accuracy
12.3: The Matched-Pairs Experiment
12.4: Some Elementary Experimental Designs
12.5: Summary
Ch 12: References and Further Readings
Ch 12: Supplementary Exercises
Ch 13: The Analysis of Variance
13.1: Introduction
13.2: The Analysis of Variance Procedure
13.3: Comparison of More Than Two Means: Analysis of Variance for a One-Way Layout
13.4: An Analysis of Variance Table for a One-Way Layout
13.5: A Statistical Model for the One-Way Layout
13.6: Proof of Additivity of the Sums of Squares and E(MST) for a One-Way Layout (Optional)
13.7: Estimation in the One-Way Layout
13.8: A Statistical Model for the Randomized Block Design
13.9: The Analysis of Variance for a Randomized Block Design
13.10: Estimation in the Randomized Block Design
13.11: Selecting the Sample Size
13.12: Simultaneous Confidence Intervals for More Than One Parameter
13.13: Analysis of Variance Using Linear Models
13.14: Summary
Ch 13: References and Further Readings
Ch 13: Supplementary Exercises
Ch 14: Analysis of Categorical Data
14.1: A Description of the Experiment
14.2: The Chi-Square Test
14.3: A Test of a Hypothesis Concerning Specified Cell Probabilities: A Goodness-of-Fit Test
14.4: Contingency Tables
14.5: r × c Tables with Fixed Row or Column Totals
14.6: Other Applications
14.7: Summary and Concluding Remarks
Ch 14: References and Further Readings
Ch 14: Supplementary Exercises
Ch 15: Nonparametric Statistics
15.1: Introduction
15.2: A General Two-Sample Shift Model
15.3: The Sign Test for a Matched-Pairs Experiment
15.4: The Wilcoxon Signed-Rank Test for a Matched-Pairs Experiment
15.5: Using Ranks for Comparing Two Population Distributions: Independent Random Samples
15.6: The Mann–Whitney U Test: Independent Random Samples
15.7: The Kruskal–Wallis Test for the One-Way Layout
15.8: The Friedman Test for Randomized Block Designs
15.9: The Runs Test: A Test for Randomness
15.10: Rank Correlation Coefficient
15.11: Some General Comments on Nonparametric Statistical Tests
Ch 15: References and Further Readings
Ch 15: Supplementary Exercises
Ch 16: Introduction to Bayesian Methods for Inference
16.1: Introduction
16.2: Bayesian Priors, Posteriors, and Estimators
16.3: Bayesian Credible Intervals
16.4: Bayesian Tests of Hypotheses
16.5: Summary and Additional Comments
Ch 16: References and Further Readings
Appendix 1: Matrices and Other Useful Mathematical Results
Appendix 2: Common Probability Distributions, Means, Variances, and Moment-Generating Functions
Appendix 3: Tables
Answers
Index


Additional Information

Resource Type: Test Bank
Ebook Title: Mathematical Statistics with Applications (7th Edition)
Authors: William Mendenhall, Dennis Wackerly, Richard L. Scheaffer
Publisher: