## Description

This book offers balanced coverage of both theory and applications, helping beginners acquire a thorough knowledge of the concepts of mathematical statistics taught to Arts, Science, Commerce and Engineering students in Indian universities.

## Table of Contents

**Chapter 1 INTRODUCTION TO STATISTICS – PRESENTATION OF STATISTICAL DATA**

1.1 Scope of Statistics

1.2 Limitations of Statistics

1.3 Organization of Chapters in the Book

1.4 Classification of Raw Data and Frequency Distribution

1.5 Frequency Graphs

1.5.1 Line Diagram

1.5.2 Rectangular Histogram

1.5.3 Frequency Polygon

1.5.4 Frequency Curve

1.5.5 Cumulative Frequency Distributions and Ogives

Worked Examples 1

Exercise 1

**Chapter 2 MEASURES OF CENTRAL TENDENCY – AVERAGES**

2.1 Averages

2.1.1 Arithmetic Mean (A.M.)

2.1.2 Properties of the Arithmetic Mean

2.2 Median

2.2.1 Formula for the Median of a Frequency Distribution

2.3 Mode

2.3.1 Empirical Relation between Mean, Median and Mode

2.4 Geometric Mean

2.5 Harmonic Mean

2.6 Requisites of a Satisfactory Average

2.7 Relative Merits and Demerits of Different Averages

Worked Examples 2

Exercise 2

Answers

**Chapter 3 MOMENTS, MEASURES OF DISPERSION, SKEWNESS AND KURTOSIS**

3.1 Moments – Definitions

3.2 Relations between Central and Raw Moments and Vice Versa

3.3 Computation of Higher Order Moments

3.4 Sheppard’s Correction for Moments

3.5 Dispersion

3.5.1 Range

3.5.2 Quartile Deviation

3.5.3 Mean Deviation

3.5.4 Minimal Property of Mean Deviation

3.6 Standard Deviation (S.D.)

3.6.1 Minimum Value of the R.M.S.D. s_A

3.6.2 Requisites of a Satisfactory Measure of Dispersion

3.6.3 Formula for the Computation of σ for a Frequency Distribution with Classes of Equal Class Interval

3.6.4 Empirical Relation between Q.D., M.D. and S.D.

3.6.5 S.D. of the Combination of Two Groups of Observations

3.6.6 An Alternative Formula for σ²

3.6.7 Relative Measures of Dispersion

3.7 Skewness

3.8 Kurtosis

3.9 Pearson’s β- and γ-Coefficients

Worked Examples 3

Exercise 3

Answers

**Chapter 4 THEORY OF PROBABILITY**

4.1 Deterministic and Random Experiments

4.2 Different Ways of Defining Probability

4.2.1 Mathematical or A Priori Definition of Probability

4.2.2 Statistical or A Posteriori Definition of Probability

4.2.3 Axiomatic Definition of Probability

4.3 Some Theorems on Probability

4.4 Related and Independent Events

4.4.1 Conditional Probability and Product Theorem of Probability

4.4.2 Independent Events

4.4.3 Property

4.5 Theorem of Total Probability

4.6 Bayes’ Theorem on Inverse Probability or the Theorem of Probability of Causes

4.7 Bernoulli’s Trials

4.8 Bernoulli’s Theorem

4.9 De Moivre-Laplace Approximation

4.10 Multinomial Probability – Extension of Bernoulli’s Theorem

Worked Examples 4(a)

Worked Examples 4(b)

Exercise 4(a)

Exercise 4(b)

Answers

**Chapter 5 RANDOM VARIABLES**

5.1 Discrete Random Variable

5.1.1 Probability Mass Function

5.1.2 Special Discrete Distributions

5.2 Continuous Random Variable

5.2.1 Probability Density Function

5.2.2 Special Continuous Distributions

5.3 Cumulative Distribution Function (C.D.F.)

5.3.1 Properties of the c.d.f. F(x)

5.4 Two-Dimensional Random Variables

5.4.1 Definitions

5.4.2 Joint Probability Mass Function of (X, Y )

5.4.3 Joint Probability Density Function of ( X, Y )

5.4.4 Cumulative Distribution Function of (X, Y )

5.4.5 Properties of F(x, y)

5.4.6 Marginal Probability Distribution/Density

5.4.7 Conditional Probability Distribution/Density

5.4.8 Independent Random Variables

5.4.9 Random Vectors or n-Dimensional R.V.’s

Worked Examples 5(a)

Worked Examples 5(b)

Exercise 5(a)

Exercise 5(b)

Answers

**Chapter 6 TRANSFORMATION OF RANDOM VARIABLE**

6.1 Transformation of One R.V. into Another R.V.

6.1.1 Probability Density Function of the R.V. Y, where Y = g(X), in Terms of the P.D.F. of X

6.2 Transformation from Two R.V.’s into a Single R.V.

6.2.1 P.D.F. of Z = X + Y , where X and Y are Independent R.V.’s

6.2.2 P.D.F. of Z = XY , where X and Y are Independent R.V.’s

6.2.3 P.D.F. of Z = X/Y, where X and Y are Independent R.V.’s

6.3 Transformation of Two R.V.’s into Two Other R.V.’s

6.3.1 An Alternative Method to Find the p.d.f. of Z = g(X, Y )

Worked Examples 6

Exercise 6

Answers

**Chapter 7 MATHEMATICAL EXPECTATION**

7.1 Definitions of Mean and Variance

7.2 Elementary Properties of Mean and Variance

7.3 Expected Value of a Function of a R.V.

7.3.1 Other Statistical Measures for Continuous Probability Distributions

7.4 Expected Values of a Two-Dimensional R.V.

7.4.1 Property 1

7.4.2 Property 2

7.4.3 Property 3

7.4.4 Property 4

7.5 Covariance of X, Y

7.5.1 Properties of Covariance

7.6 Conditional Expected Values – Definitions

7.6.1 Property 1

7.6.2 Property 2

7.6.3 Property 3

7.6.4 Property 4

Worked Examples 7

Exercise 7

Answers

**Chapter 8 MOMENT GENERATING FUNCTION, INEQUALITIES AND CONVERGENCE**

8.1 Definitions

8.2 Properties of Characteristic Function

8.2.1 Property 1

8.2.2 Property 2

8.2.3 Property 3

8.2.4 Property 4

8.2.5 Property 5

8.2.6 Property 6

8.2.7 Property 7

8.3 Cumulant Generating Function (C.G.F.)

8.4 Joint Characteristic Function of a 2-Dimensional R.V.

8.5 Probabilistic Inequalities

8.5.1 Tchebycheff Inequality

8.5.2 Markov’s Inequality

8.5.3 Bienaymé’s Inequality

8.5.4 Cauchy-Schwarz Inequality

8.6 Convergence Concepts in Probability Theory

8.7 Central Limit Theorem (Lindeberg-Levy’s Form)

8.7.1 Central Limit Theorem (Liapounoff’s form)

Worked Examples 8(a)

Worked Examples 8(b)

Exercise 8(a)

Exercise 8(b)

Answers

**Chapter 9 SPECIAL DISCRETE PROBABILITY DISTRIBUTIONS**

9.1 Binomial Distribution

9.1.1 Characteristic Function of Binomial Distribution, Mean and Variance

9.1.2 Recurrence Formula for the Central Moments of the Binomial Distribution and Values of μ₂, μ₃ and μ₄

9.1.3 Mode of the Binomial Distribution

9.1.4 Cumulants of the Binomial Distribution

9.2 Poisson Distribution

9.2.1 Poisson Distribution as the Limiting Form of Binomial Distribution

9.2.2 Occurrence of Poisson Distribution

9.2.3 Moment Generating Function and Values of the First Four Central Moments of the Poisson Distribution

9.2.4 Recurrence Formula for the Central Moments of the Poisson Distribution and Values of μ₂, μ₃ and μ₄

9.2.5 Mode of the Poisson Distribution

9.2.6 Cumulants of the Poisson Distribution

9.2.7 Additive or Reproductive Property of Independent Poisson R.V.’s

9.3 Geometric Distribution

9.3.1 Characteristic Function, Mean and Variance of G*(p) Distribution

9.3.2 The First Four Cumulants and Central Moments of G*(p) Distribution

9.3.3 Memoryless Property of G*(p) Distribution and its Converse

9.3.4 Median and Mode of the G*(p) Distribution

9.4 Negative Binomial Distribution

9.4.1 Moment Generating Function, Mean and Variance

9.4.2 Alternative Form of the P.M.F. of the Negative Binomial Distribution

9.4.3 The First Four Cumulants and Central Moments of the Negative Binomial Distribution

9.4.4 Deduction of Central Moments of the Negative Binomial Distribution from Those of Binomial Distribution

9.4.5 Poisson Distribution as a Limiting Form of the Negative Binomial Distribution

9.4.6 Recurrence Formula for the Central Moments of the Negative Binomial Distribution

9.4.7 Reproductive Property of the Negative Binomial Distribution

9.5 Hypergeometric Distribution

9.5.1 Mean and Variance of the Hypergeometric Distribution

9.5.2 Binomial Distribution as Limiting Form of the Hypergeometric Distribution

Worked Examples 9

Exercise 9

Answers

**Chapter 10 SPECIAL CONTINUOUS PROBABILITY DISTRIBUTIONS**

10.1 Uniform or Rectangular Distribution

10.1.1 Moment Generating Function and Mean, Variance, β₁ and β₂

10.1.2 The Absolute Moments and Mean Deviation of U(a, b)

10.2 Exponential Distribution

10.2.1 Moment Generating Function and Cumulant Generating Function

10.2.2 Memoryless Property of the Exponential Distribution and its Converse

10.3 Erlang Distribution/General Gamma Distribution

10.3.1 Mean and Variance of General Gamma Distribution

10.3.2 M.G.F., C.G.F. and Central Moments

10.3.3 Reproductive Property

10.3.4 Relation between the Distribution Functions of the Simple Gamma Distribution with Parameter (k + 1) and Poisson Distribution with Parameter λ

10.4 Weibull Distribution

10.4.1 Probability Density Function of the Weibull Distribution

10.4.2 Mean and Variance of Weibull Distribution

10.5 Double Exponential or Laplace Distribution

10.5.1 M.G.F., Mean and Variance of the Two-Parameter Laplace Distribution

10.6 Beta Distribution of the First Kind

10.6.1 Simple Moments and Harmonic Mean of the Beta I Distribution

10.7 Beta Distribution of the Second Kind

10.7.1 Simple Moments and Harmonic Mean of the Beta II Distribution

10.8 Cauchy Distribution

10.8.1 Characteristic Function of Cauchy’s Distribution

10.8.2 Mean, Median and Mode of the Two Parameter Cauchy Distribution

10.8.3 Reproductive Property of the General Cauchy Distribution

10.9 Normal (or Gaussian) Distribution

10.9.1 Normal Probability Curve and its Characteristics

10.9.2 Mean and Variance of the Normal Distribution N(μ, σ)

10.9.3 Median and Mode of the Normal Distribution N(μ, σ)

10.9.4 Central Moments of the Normal Distribution N(μ, σ)

10.9.5 Mean Deviation about the Mean of the Normal Distribution N(μ, σ)

10.9.6 Quartile Deviation of the Normal Distribution N(μ, σ)

10.9.7 Moment and Cumulant Generating Functions of N(μ, σ)

10.9.8 Additive Property of the Normal Distribution

10.9.9 Normal Distribution as Limiting Form of Binomial Distribution

10.9.10 Importance of Normal Distribution

Worked Examples 10(a)

Worked Examples 10(b)

Exercise 10(a)

Exercise 10(b)

Answers

**Chapter 11 CORRELATION AND REGRESSION**

11.1 Scatter Diagram

11.2 Correlation Coefficient

11.2.1 Property 1

11.2.2 Property 2

11.2.3 Property 3

11.2.4 Alternate Formula for r_XY in Terms of Variances

11.3 Spearman’s Formula for the Rank Correlation Coefficient

11.4 Equation of the Regression Line of Y on X

11.4.1 Properties of Regression Coefficients

11.4.2 Standard Error of Estimate of Y

Worked Examples 11(a)

Worked Examples 11(b)

Exercise 11(a)

Exercise 11(b)

Answers

**Chapter 12 MULTIPLE AND PARTIAL CORRELATIONS**

12.1 A Note on Yule’s Subscript Notation

12.2 Plane of Regression

12.3 Properties of Residuals

12.4 Coefficient of Multiple Correlation

12.4.1 Multiple Correlation Coefficient in Terms of Simple Correlation Coefficients

12.5 Partial Correlation Coefficient in Terms of Simple Correlation Coefficients

Worked Examples 12

Exercise 12

Answers

**Chapter 13 TESTS OF SIGNIFICANCE FOR LARGE SAMPLES**

13.1 Sampling Distribution

13.2 Standard Errors

13.3 Tests of Significance

13.4 Critical Region and Level of Significance

13.5 One-tailed and Two-tailed Tests

13.6 Interval Estimation of Population Parameters

13.7 Assumptions used in Large Sample Tests

13.7.1 Test of Significance of the Difference between Sample Proportion and Population Proportion

13.7.2 Test of Significance of the Difference between Two Sample Proportions

13.7.3 Test of Significance of the Difference between Sample Mean and Population Mean

13.7.4 Test of Significance of the Difference between the Means of Two Samples

13.7.5 Test of Significance of the Difference between the Sample S.D. and the Population S.D.

13.7.6 Test of Significance of the Difference between the S.D.’s of Two Large Independent Samples

13.7.7 Test of Significance of the Difference between the Sample Correlation Coefficient and the Population Correlation Coefficient

13.7.8 Test of Significance of the Difference between the Correlation Coefficient of Two Large Samples

Worked Examples 13

Exercise 13

Answers

**Chapter 14 EXACT SAMPLING DISTRIBUTIONS**

14.1 Derivation of the Density Function of the Chi-square (χ²) Distribution

14.1.1 Simple Moments and Cumulants of the χ²-Distribution

14.1.2 Mode and Skewness of the Chi-square Distribution

14.1.3 Limiting Form of Chi-Square Distribution for Large Values of n

14.1.4 Additive Property of Chi-square Distribution

14.1.5 Uses of χ²-Distribution

14.2 Derivation of the Density Function of Student’s t-Distribution

14.2.1 Moments of the Student’s t-Distribution and β₁- and β₂-Coefficients

14.2.2 Recurrence Relation for the Moments and Moment Generating Function

14.2.3 Mode of the t-Distribution

14.2.4 Limiting Form of t-Distribution for Large Values of n

14.2.5 Uses of t-Distribution

14.3 Snedecor’s F-Distribution – Definition and Derivation of P.D.F.

14.3.1 Moments of F-Distribution

14.3.2 Recurrence Formula for the Simple Moments and Value of the Variance

14.3.3 Mode of the F-Distribution and Form of F-Probability Curve

14.3.4 Uses of F-Distribution

14.3.5 Relations among t, F and χ²-Distributions

14.4 Definition and Derivation of the Density Function of Fisher’s Z-Distribution

14.4.1 Moments of Fisher’s Z-Distribution

14.4.2 Uses of Z-Distribution

Worked Examples 14

Exercise 14

Answers

**Chapter 15 TESTS OF SIGNIFICANCE FOR SMALL SAMPLES**

15.1 Critical Values of t and the t-Table used in t-Tests

15.1.1 Test of Significance of the Difference between Sample Mean and Population Mean

15.1.2 Test of Significance of the Difference between Means of Two Small Samples Drawn from the Same Normal Population

15.1.3 Test of Significance of Sample Correlation Coefficient

15.2 F-Test of Significance and F-Table

15.3 Chi-square (χ²) Tests

15.3.1 χ²-Test of Goodness of Fit

15.3.2 Conditions for the Validity of χ²-Test

15.3.3 χ²-Test of Independence of Attributes

Worked Examples 15(a)

Worked Examples 15(b)

Exercise 15(a)

Exercise 15(b)

Answers

**Chapter 16 ESTIMATION THEORY**

16.1 Interval Estimation

16.2 Point Estimation

16.2.1 Unbiased Estimator

16.2.2 Consistent Estimator

16.2.3 Efficient Estimator

16.2.4 Most Efficient Estimator

16.2.5 Sufficient Estimator

16.2.6 Neyman’s Factorisation Criterion for a Sufficient Estimator

16.3 Methods of Finding Estimators

16.3.1 Method of Maximum Likelihood

16.3.2 Method of Moments

Worked Examples 16(a)

Worked Examples 16(b)

Exercise 16(a)

Exercise 16(b)

Answers

**Chapter 17 TESTING OF HYPOTHESIS**

17.1 Simple and Composite Hypotheses

17.2 Null Hypothesis and Alternative Hypothesis

17.3 Critical and Acceptance Regions

17.4 Errors in Hypothesis Testing and Power of a Test

17.5 Best Test, MP Test and UMP Test for a Simple Hypothesis

17.6 Neyman-Pearson Lemma

17.6.1 Likelihood Ratio Test (L.R.T.)

17.6.2 Two-tailed LRT for the Mean of a Normal Population

17.6.3 One-tailed Test for the Mean of a Normal Population

17.6.4 Two-tailed Test for the Variance of a Normal Population

17.6.5 One-tailed Test for the Variance of a Normal Population

17.7 Sequential Probability Ratio Test (SPRT)

17.8 Basic Assumption Associated with N.P. Tests

17.9 Advantages and Disadvantages of N.P. Methods over Parametric Methods

17.9.1 Sign Test

17.9.2 Wald’s Run Test

17.9.3 Rank Sum Test / Mann-Whitney-Wilcoxon Test

17.9.4 Median Test

17.9.5 Test for Randomness

Worked Examples 17(a)

Worked Examples 17(b)

Exercise 17(a)

Exercise 17(b)

Answers
