
Stats Formulas

The document provides formulas and concepts related to probability and statistics, including summary statistics for both ungrouped and grouped data, as well as properties of discrete and continuous random variables. It covers various distributions such as binomial, geometric, and Poisson, along with the Central Limit Theorem and sampling techniques. Additionally, it includes critical values for the normal distribution and probability generating functions.


PROBABILITY & STATISTICS

Summary statistics
For ungrouped data:

$\bar{x} = \dfrac{\Sigma x}{n}$,  standard deviation $= \sqrt{\dfrac{\Sigma (x - \bar{x})^2}{n}} = \sqrt{\dfrac{\Sigma x^2}{n} - \bar{x}^2}$
For grouped data:

$\bar{x} = \dfrac{\Sigma xf}{\Sigma f}$,  standard deviation $= \sqrt{\dfrac{\Sigma (x - \bar{x})^2 f}{\Sigma f}} = \sqrt{\dfrac{\Sigma x^2 f}{\Sigma f} - \bar{x}^2}$
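
A minimal Python sketch of both calculations (the data sets below are illustrative, not from the booklet; the divisor is n, matching the definitions above):

```python
from math import sqrt

# Ungrouped data: mean and standard deviation with divisor n
x = [2.0, 3.5, 5.0, 6.5]                      # illustrative values
n = len(x)
mean = sum(x) / n
sd = sqrt(sum(xi * xi for xi in x) / n - mean ** 2)

# Grouped data: (value, frequency) pairs, also illustrative
xf = [(1, 4), (2, 7), (3, 5)]
total_f = sum(f for _, f in xf)
mean_g = sum(v * f for v, f in xf) / total_f
sd_g = sqrt(sum(v * v * f for v, f in xf) / total_f - mean_g ** 2)
```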

Discrete random variables


$\mathrm{E}(X) = \Sigma xp$,  $\mathrm{Var}(X) = \Sigma x^2 p - \{\mathrm{E}(X)\}^2$
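
A short sketch of the two sums, using a hypothetical probability table:

```python
# E(X) and Var(X) computed directly from a probability table (values illustrative)
pmf = {0: 0.2, 1: 0.5, 2: 0.3}
E = sum(x * p for x, p in pmf.items())
Var = sum(x * x * p for x, p in pmf.items()) - E ** 2
```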
For the binomial distribution B(n, p):

$p_r = \dbinom{n}{r} p^r (1 - p)^{n - r}$,  $\mu = np$,  $\sigma^2 = np(1 - p)$
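
A sketch of the binomial formulas, with illustrative n and p:

```python
from math import comb

# Binomial B(n, p): P(X = r), mean np and variance np(1 - p)
n, p = 10, 0.3                                 # illustrative parameters

def binom_pmf(r):
    return comb(n, r) * p ** r * (1 - p) ** (n - r)

mu = n * p
var = n * p * (1 - p)
assert abs(sum(binom_pmf(r) for r in range(n + 1)) - 1.0) < 1e-12   # pmf sums to 1
```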
For the geometric distribution Geo(p):

$p_r = p(1 - p)^{r - 1}$,  $\mu = \dfrac{1}{p}$
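
A sketch of the geometric formulas, with an illustrative p:

```python
# Geometric Geo(p): first success on trial r, so P(X = r) = p(1 - p)**(r - 1)
p = 0.25                                       # illustrative success probability

def geom_pmf(r):
    return p * (1 - p) ** (r - 1)

mu = 1 / p                                     # expected number of trials
```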
For the Poisson distribution Po(λ):

$p_r = \mathrm{e}^{-\lambda} \dfrac{\lambda^r}{r!}$,  $\mu = \lambda$,  $\sigma^2 = \lambda$
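
A sketch of the Poisson formulas, with an illustrative λ:

```python
from math import exp, factorial

# Poisson Po(lam): P(X = r) = e**(-lam) * lam**r / r!; mean and variance both lam
lam = 4.0                                      # illustrative rate

def pois_pmf(r):
    return exp(-lam) * lam ** r / factorial(r)

mu = var = lam
print(sum(pois_pmf(r) for r in range(50)))     # partial sum, close to 1.0
```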

Continuous random variables



$\mathrm{E}(X) = \int x \, \mathrm{f}(x) \, \mathrm{d}x$,  $\mathrm{Var}(X) = \int x^2 \, \mathrm{f}(x) \, \mathrm{d}x - \{\mathrm{E}(X)\}^2$
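
Both integrals can be checked numerically; a sketch assuming an illustrative density f(x) = 2x on [0, 1]:

```python
# Numerical check of E(X) and Var(X) for f(x) = 2x on [0, 1]
def f(x):
    return 2 * x

def integrate(g, a, b, steps=100_000):
    h = (b - a) / steps
    return h * sum(g(a + (i + 0.5) * h) for i in range(steps))     # midpoint rule

E = integrate(lambda x: x * f(x), 0, 1)                  # exact value 2/3
Var = integrate(lambda x: x * x * f(x), 0, 1) - E ** 2   # exact value 1/18
```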

Sampling and testing


Unbiased estimators:

$\bar{x} = \dfrac{\Sigma x}{n}$,  $s^2 = \dfrac{\Sigma (x - \bar{x})^2}{n - 1} = \dfrac{1}{n - 1}\left( \Sigma x^2 - \dfrac{(\Sigma x)^2}{n} \right)$
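
A sketch showing that the two forms of s² agree, on an illustrative sample:

```python
# Unbiased sample variance: both versions of the formula give the same value
data = [4.1, 5.0, 6.3, 5.8, 4.6]               # illustrative sample
n = len(data)
xbar = sum(data) / n
s2_direct = sum((x - xbar) ** 2 for x in data) / (n - 1)
s2_short = (sum(x * x for x in data) - sum(data) ** 2 / n) / (n - 1)
assert abs(s2_direct - s2_short) < 1e-9
```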

Central Limit Theorem:


$\bar{X} \sim \mathrm{N}\!\left( \mu, \ \dfrac{\sigma^2}{n} \right)$

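A simulation sketch of the theorem, using an illustrative uniform population:

```python
import random
from statistics import mean, pstdev

# Means of samples from a non-normal (uniform) population are roughly
# N(mu, sigma**2 / n); sample size and replication count are illustrative.
random.seed(1)
n = 30
means = [mean(random.uniform(0, 1) for _ in range(n)) for _ in range(5000)]
print(mean(means))     # close to mu = 0.5
print(pstdev(means))   # close to sigma / sqrt(n) = (1 / 12 ** 0.5) / 30 ** 0.5 ≈ 0.053
```
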
Approximate distribution of sample proportion:


$\mathrm{N}\!\left( p, \ \dfrac{p(1 - p)}{n} \right)$

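A sketch of the approximation, with illustrative p and n:

```python
from math import sqrt

# Sample proportion is approximately N(p, p(1 - p) / n)
p, n = 0.4, 200                                # illustrative proportion and sample size
se = sqrt(p * (1 - p) / n)                     # standard error of the sample proportion
print(p - 1.96 * se, p + 1.96 * se)            # approximate 95% range for the proportion
```
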
FURTHER PROBABILITY & STATISTICS

Sampling and testing


Two-sample estimate of a common variance:
$s^2 = \dfrac{\Sigma (x_1 - \bar{x}_1)^2 + \Sigma (x_2 - \bar{x}_2)^2}{n_1 + n_2 - 2}$
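
A sketch of the pooled estimate, on two illustrative samples:

```python
# Pooled (common) variance estimate from two samples
x1 = [12.1, 13.4, 11.8, 12.9]                  # illustrative sample 1
x2 = [10.7, 11.5, 12.0, 11.1, 10.9]            # illustrative sample 2
m1, m2 = sum(x1) / len(x1), sum(x2) / len(x2)
s2 = (sum((x - m1) ** 2 for x in x1) + sum((x - m2) ** 2 for x in x2)) \
     / (len(x1) + len(x2) - 2)
```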

Probability generating functions


$\mathrm{G}_X(t) = \mathrm{E}(t^X)$,  $\mathrm{E}(X) = \mathrm{G}_X'(1)$,  $\mathrm{Var}(X) = \mathrm{G}_X''(1) + \mathrm{G}_X'(1) - \{\mathrm{G}_X'(1)\}^2$
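
A numerical sketch, assuming the PGF of an illustrative Poisson distribution:

```python
from math import exp

# For Po(lam) the PGF is G(t) = exp(lam * (t - 1)), so G'(1) = lam and
# G''(1) + G'(1) - G'(1)**2 = lam, matching E(X) and Var(X) above.
lam, h = 3.0, 1e-4
G = lambda t: exp(lam * (t - 1))
G1 = (G(1 + h) - G(1 - h)) / (2 * h)            # numerical G'(1)  ≈ lam
G2 = (G(1 + h) - 2 * G(1) + G(1 - h)) / h ** 2  # numerical G''(1) ≈ lam ** 2
E, Var = G1, G2 + G1 - G1 ** 2                  # both ≈ lam
```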

THE NORMAL DISTRIBUTION FUNCTION

If Z has a normal distribution with mean 0 and variance 1, then, for each value of z, the table gives the value of Φ(z), where Φ(z) = P(Z ⩽ z).

For negative values of z, use Φ(−z) = 1 − Φ(z).

z      0      1      2      3      4      5      6      7      8      9      ADD  1  2  3  4  5  6  7  8  9
(Columns 0–9 give the second decimal place of z; the ADD columns give the figure to add to the fourth decimal place for the third decimal place of z.)
0.0 0.5000 0.5040 0.5080 0.5120 0.5160 0.5199 0.5239 0.5279 0.5319 0.5359 4 8 12 16 20 24 28 32 36
0.1 0.5398 0.5438 0.5478 0.5517 0.5557 0.5596 0.5636 0.5675 0.5714 0.5753 4 8 12 16 20 24 28 32 36
0.2 0.5793 0.5832 0.5871 0.5910 0.5948 0.5987 0.6026 0.6064 0.6103 0.6141 4 8 12 15 19 23 27 31 35
0.3 0.6179 0.6217 0.6255 0.6293 0.6331 0.6368 0.6406 0.6443 0.6480 0.6517 4 7 11 15 19 22 26 30 34
0.4 0.6554 0.6591 0.6628 0.6664 0.6700 0.6736 0.6772 0.6808 0.6844 0.6879 4 7 11 14 18 22 25 29 32

0.5 0.6915 0.6950 0.6985 0.7019 0.7054 0.7088 0.7123 0.7157 0.7190 0.7224 3 7 10 14 17 20 24 27 31
0.6 0.7257 0.7291 0.7324 0.7357 0.7389 0.7422 0.7454 0.7486 0.7517 0.7549 3 7 10 13 16 19 23 26 29
0.7 0.7580 0.7611 0.7642 0.7673 0.7704 0.7734 0.7764 0.7794 0.7823 0.7852 3 6 9 12 15 18 21 24 27
0.8 0.7881 0.7910 0.7939 0.7967 0.7995 0.8023 0.8051 0.8078 0.8106 0.8133 3 5 8 11 14 16 19 22 25
0.9 0.8159 0.8186 0.8212 0.8238 0.8264 0.8289 0.8315 0.8340 0.8365 0.8389 3 5 8 10 13 15 18 20 23

1.0 0.8413 0.8438 0.8461 0.8485 0.8508 0.8531 0.8554 0.8577 0.8599 0.8621 2 5 7 9 12 14 16 19 21
1.1 0.8643 0.8665 0.8686 0.8708 0.8729 0.8749 0.8770 0.8790 0.8810 0.8830 2 4 6 8 10 12 14 16 18
1.2 0.8849 0.8869 0.8888 0.8907 0.8925 0.8944 0.8962 0.8980 0.8997 0.9015 2 4 6 7 9 11 13 15 17
1.3 0.9032 0.9049 0.9066 0.9082 0.9099 0.9115 0.9131 0.9147 0.9162 0.9177 2 3 5 6 8 10 11 13 14
1.4 0.9192 0.9207 0.9222 0.9236 0.9251 0.9265 0.9279 0.9292 0.9306 0.9319 1 3 4 6 7 8 10 11 13

1.5 0.9332 0.9345 0.9357 0.9370 0.9382 0.9394 0.9406 0.9418 0.9429 0.9441 1 2 4 5 6 7 8 10 11
1.6 0.9452 0.9463 0.9474 0.9484 0.9495 0.9505 0.9515 0.9525 0.9535 0.9545 1 2 3 4 5 6 7 8 9
1.7 0.9554 0.9564 0.9573 0.9582 0.9591 0.9599 0.9608 0.9616 0.9625 0.9633 1 2 3 4 4 5 6 7 8
1.8 0.9641 0.9649 0.9656 0.9664 0.9671 0.9678 0.9686 0.9693 0.9699 0.9706 1 1 2 3 4 4 5 6 6
1.9 0.9713 0.9719 0.9726 0.9732 0.9738 0.9744 0.9750 0.9756 0.9761 0.9767 1 1 2 2 3 4 4 5 5

2.0 0.9772 0.9778 0.9783 0.9788 0.9793 0.9798 0.9803 0.9808 0.9812 0.9817 0 1 1 2 2 3 3 4 4
2.1 0.9821 0.9826 0.9830 0.9834 0.9838 0.9842 0.9846 0.9850 0.9854 0.9857 0 1 1 2 2 2 3 3 4
2.2 0.9861 0.9864 0.9868 0.9871 0.9875 0.9878 0.9881 0.9884 0.9887 0.9890 0 1 1 1 2 2 2 3 3
2.3 0.9893 0.9896 0.9898 0.9901 0.9904 0.9906 0.9909 0.9911 0.9913 0.9916 0 1 1 1 1 2 2 2 2
2.4 0.9918 0.9920 0.9922 0.9925 0.9927 0.9929 0.9931 0.9932 0.9934 0.9936 0 0 1 1 1 1 1 2 2

2.5 0.9938 0.9940 0.9941 0.9943 0.9945 0.9946 0.9948 0.9949 0.9951 0.9952 0 0 0 1 1 1 1 1 1
2.6 0.9953 0.9955 0.9956 0.9957 0.9959 0.9960 0.9961 0.9962 0.9963 0.9964 0 0 0 0 1 1 1 1 1
2.7 0.9965 0.9966 0.9967 0.9968 0.9969 0.9970 0.9971 0.9972 0.9973 0.9974 0 0 0 0 0 1 1 1 1
2.8 0.9974 0.9975 0.9976 0.9977 0.9977 0.9978 0.9979 0.9979 0.9980 0.9981 0 0 0 0 0 0 0 1 1
2.9 0.9981 0.9982 0.9982 0.9983 0.9984 0.9984 0.9985 0.9985 0.9986 0.9986 0 0 0 0 0 0 0 0 0
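
The tabulated values can be reproduced from the error function in Python's standard library; a brief sketch:

```python
from math import erf, sqrt

# Phi(z) in terms of the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2
def phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

print(round(phi(1.96), 4))    # 0.9750, in agreement with the table
print(round(phi(-1.0), 4))    # 0.1587, the same as 1 - Phi(1.0)
```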

Critical values for the normal distribution

If Z has a normal distribution with mean 0 and variance 1, then, for each value of p, the table gives the value of z such that P(Z ⩽ z) = p.

p    0.75    0.90    0.95    0.975   0.99    0.995   0.9975  0.999   0.9995
z    0.674   1.282   1.645   1.960   2.326   2.576   2.807   3.090   3.291
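
A sketch that reproduces the tabulated critical values from the standard library:

```python
from statistics import NormalDist

# Each tabulated z is the p-quantile of N(0, 1)
for p in (0.75, 0.90, 0.95, 0.975, 0.99, 0.995, 0.9975, 0.999, 0.9995):
    print(p, round(NormalDist().inv_cdf(p), 3))
```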

