This page contains the basic Rules for the Mean, Variance,
Covariance, and Correlation of random variables. This summary
can be extremely helpful if you do not work regularly in statistics or are a new
student. The proofs of these Rules can be purchased for a nominal fee from the Order page.
The proofs are usually required problems or test questions in a non-calculus first
course in statistics. The Rules and their proofs are part of a Statistical Review,
which is approximately 27 pages in 10-point type. It is a handy review for someone
who has been away from statistics for a while but suddenly finds an article using these
Rules. Students will find them helpful as well. If you have MathType, you may
edit the file.
FORMULAS AND RULES FOR EXPECTATIONS OF RANDOM VARIABLES
Formulas and Rules for the Mean of the Random Variable X
Formulas for the Mean
The mean, or expected value, of a discrete random variable X is
E(X) = μ = Σ xi pi
where pi is the probability of the occurrence of the value xi.
Rules for the Mean
The expectation of a constant, c, is the constant.
E(c) = c
Adding a constant value, c, to each term increases the mean, or expected value, by that constant.
E(X+c) = E(X)+c
Multiplying a random variable by a constant value, c, multiplies the expected value or
mean by that constant.
E(cX) = cE(X)
The expected value or mean of the sum of two random variables is the sum of the means.
This is also known as the additive law of expectation.
E(X+Y) = E(X)+E(Y)
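As a quick numerical sanity check, the Rules for the Mean can be verified in Python; the distribution values, probabilities, and the joint distribution below are invented purely for illustration:

```python
# Numerical check of the Rules for the Mean on a small discrete distribution.
# All values and probabilities here are invented for illustration.

def expectation(values, probs):
    """E(X) = sum of xi * pi over the distribution."""
    return sum(x * p for x, p in zip(values, probs))

xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]
c = 4.0

ex = expectation(xs, ps)                                             # E(X)
assert abs(expectation([x + c for x in xs], ps) - (ex + c)) < 1e-12  # E(X+c) = E(X)+c
assert abs(expectation([c * x for x in xs], ps) - c * ex) < 1e-12    # E(cX) = cE(X)

# The additive law needs a joint distribution over (x, y) pairs.
joint = {(0, 1): 0.1, (0, 2): 0.3, (1, 1): 0.4, (1, 2): 0.2}
e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
e_sum = sum((x + y) * p for (x, y), p in joint.items())
assert abs(e_sum - (e_x + e_y)) < 1e-12                              # E(X+Y) = E(X)+E(Y)
```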
Formulas and Rules for the Variance, Covariance and Standard Deviation of Random Variables
Formulas for the Variance
Var(X) = σ² = E[(X − μ)²] = E(X²) − [E(X)]²
Formulas for the Standard Deviation
SD(X) = σ = √Var(X)
Formulas for the Covariance
Cov(X,Y) = σXY = E[(X − μX)(Y − μY)] = E(XY) − E(X)E(Y)
Rules for the Variance
The variance of a constant is zero.
Var(c) = 0
Adding a constant value, c, to a random variable does not change the variance, because
the expectation (mean) increases by the same amount.
Var(X+c) = Var(X)
Multiplying a random variable by a constant increases the variance by the square of the constant.
Var(cX) = c²Var(X)
The variance of the sum of two or more random variables is equal to the sum of each of
their variances only when the random variables are independent. In general,
Var(X+Y) = Var(X)+Var(Y)+2Cov(X,Y)
and in terms of the sigma notation
σ²X+Y = σ²X + σ²Y + 2σXY
When two random variables are independent, Cov(X,Y) = 0, so that
Var(X+Y) = Var(X)+Var(Y)
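The Rules for the Variance can likewise be checked numerically; the short Python sketch below uses the same invented discrete distribution as before:

```python
# Numerical check of the Rules for the Variance on a small discrete distribution.
# The values and probabilities are invented for illustration.

def variance(values, probs):
    """Var(X) = E[(X - mu)^2] for a discrete distribution."""
    mu = sum(x * p for x, p in zip(values, probs))
    return sum((x - mu) ** 2 * p for x, p in zip(values, probs))

xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]
c = 4.0

v = variance(xs, ps)
assert abs(variance([c] * len(xs), ps)) < 1e-12                     # Var(c) = 0
assert abs(variance([x + c for x in xs], ps) - v) < 1e-12           # Var(X+c) = Var(X)
assert abs(variance([c * x for x in xs], ps) - c ** 2 * v) < 1e-12  # Var(cX) = c^2 Var(X)
```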
Rules for the Covariance
The covariance of two constants, c and k, is zero.
Cov(c,k) = 0
The covariance of two independent random variables is zero.
Cov(X,Y) = 0 when X and Y are independent
The covariance is commutative, as is obvious from the definition.
Cov(X,Y) = Cov(Y,X)
The covariance of a random variable with a constant is zero.
Cov(X,c) = 0
Adding a constant to either or both random variables does not change their covariance.
Cov(X+c,Y+k) = Cov(X,Y)
Multiplying a random variable by a constant multiplies the covariance by that constant.
Cov(cX,Y) = cCov(X,Y)
The additive law of covariance holds that the covariance of a random variable with a
sum of random variables is just the sum of the covariances with each of the random
variables in the sum.
Cov(X,Y+Z) = Cov(X,Y)+Cov(X,Z)
The covariance of a variable with itself is the variance of the random variable.
Cov(X,X) = Var(X)
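The Rules for the Covariance can be verified from a joint probability table; the joint distribution in the Python sketch below is invented for illustration:

```python
# Numerical check of the Rules for the Covariance using a joint pmf over
# (x, y) pairs. The joint distribution is invented for illustration.

def cov(joint):
    """Cov(X,Y) = E(XY) - E(X)E(Y) from a joint pmf."""
    e_x = sum(x * p for (x, y), p in joint.items())
    e_y = sum(y * p for (x, y), p in joint.items())
    e_xy = sum(x * y * p for (x, y), p in joint.items())
    return e_xy - e_x * e_y

joint = {(0, 1): 0.1, (0, 2): 0.3, (1, 1): 0.4, (1, 2): 0.2}
cxy = cov(joint)

swapped = {(y, x): p for (x, y), p in joint.items()}
assert abs(cov(swapped) - cxy) < 1e-12        # Cov(X,Y) = Cov(Y,X)

shifted = {(x + 5, y - 3): p for (x, y), p in joint.items()}
assert abs(cov(shifted) - cxy) < 1e-12        # Cov(X+c,Y+k) = Cov(X,Y)

scaled = {(2 * x, y): p for (x, y), p in joint.items()}
assert abs(cov(scaled) - 2 * cxy) < 1e-12     # Cov(cX,Y) = cCov(X,Y)
```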
Formulas and Rules for the Correlation Coefficient of Random Variables
Formulas for the Correlation Coefficient
ρ = Corr(X,Y) = Cov(X,Y)/(σXσY) = σXY/(σXσY)
Rules for the Correlation Coefficient
Adding a constant to a random variable does not change the correlation coefficient
between it and another random variable. Multiplying a random variable by a positive
constant also leaves the correlation coefficient unchanged; a negative constant
reverses its sign. For two random variables
Z = a+bX and W = c+dY, where a, b, c, and d are constants with b > 0 and d > 0,
Corr(Z,W) = Corr(X,Y)
Because the square root of the variance is always positive, the correlation coefficient
can be negative only when the covariance is negative. This leads to the final rule:
the correlation coefficient is always at least −1 and no more than +1.
−1 ≤ ρ ≤ +1
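These correlation Rules can also be checked numerically; in the Python sketch below, the joint distribution and the transform constants a, b, c, and d are invented for illustration:

```python
import math

# Numerical check that a positive linear transform leaves the correlation
# coefficient unchanged, and that it lies in [-1, +1].
# The joint distribution and the constants are invented for illustration.

def corr(joint):
    """Population correlation coefficient from a joint pmf over (x, y) pairs."""
    e_x = sum(x * p for (x, y), p in joint.items())
    e_y = sum(y * p for (x, y), p in joint.items())
    cov = sum((x - e_x) * (y - e_y) * p for (x, y), p in joint.items())
    var_x = sum((x - e_x) ** 2 * p for (x, y), p in joint.items())
    var_y = sum((y - e_y) ** 2 * p for (x, y), p in joint.items())
    return cov / math.sqrt(var_x * var_y)

joint = {(0, 1): 0.1, (0, 2): 0.3, (1, 1): 0.4, (1, 2): 0.2}
rho = corr(joint)

# Z = a+bX, W = c+dY with b > 0 and d > 0: the correlation is unchanged.
a, b, c, d = 10.0, 2.0, -5.0, 3.0
transformed = {(a + b * x, c + d * y): p for (x, y), p in joint.items()}
assert abs(corr(transformed) - rho) < 1e-12

# The correlation coefficient always lies between -1 and +1.
assert -1.0 <= rho <= 1.0
```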
Formulas and Rules for the Sample Mean, Variance, Covariance and Standard
Deviation, and Correlation Coefficient of Random Variables
Rules for Sampling Statistics
The sample mean, x̄, is
x̄ = (1/n) Σ xi
The sample variance is
s² = Σ (xi − x̄)² / (n − 1)
The sample standard deviation, s, is
s = √s²
The sample covariance is
sXY = Σ (xi − x̄)(yi − ȳ) / (n − 1)
The sample correlation coefficient is computed in the same way as the population
correlation coefficient, with the sample statistics in place of the population values:
r = sXY/(sXsY)
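These sample formulas can be checked against Python's standard library statistics module; the data below are invented for illustration:

```python
from statistics import mean, stdev

# Sample statistics computed from scratch (n - 1 divisors) and checked
# against the standard library. The data are invented for illustration.
x = [2.0, 4.0, 4.0, 5.0, 7.0]
y = [1.0, 3.0, 2.0, 5.0, 6.0]
n = len(x)

xbar, ybar = mean(x), mean(y)
s2_x = sum((xi - xbar) ** 2 for xi in x) / (n - 1)   # sample variance
assert abs(s2_x - stdev(x) ** 2) < 1e-9              # agrees with statistics.stdev squared

s_xy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / (n - 1)  # sample covariance
r = s_xy / (stdev(x) * stdev(y))                     # sample correlation coefficient
assert -1.0 <= r <= 1.0
```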