The Summation Rule: The most critical principle of any probability distribution is that the sum of the probabilities of all possible outcomes must equal 1. This is expressed mathematically as $\sum_x P(X = x) = 1$.
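As a quick illustration, a probability table (the distribution below is a hypothetical example) can be checked against the summation rule in a few lines of Python; exact fractions avoid floating-point drift:

```python
from fractions import Fraction

# Hypothetical distribution for a discrete random variable X:
# each outcome is mapped to its probability.
pmf = {1: Fraction(1, 5), 2: Fraction(1, 2), 3: Fraction(3, 10)}

# The summation rule: the probabilities must add up to exactly 1.
total = sum(pmf.values())
print(total == 1)  # True
```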
Expected Value ($E(X)$): This represents the theoretical mean or long-term average of the random variable. It is calculated by weighting each possible outcome by its probability: $E(X) = \sum_x x \, P(X = x)$.
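With the same kind of hypothetical probability table, the weighted sum for $E(X)$ is a one-liner:

```python
from fractions import Fraction

# Hypothetical distribution: outcome -> probability.
pmf = {1: Fraction(1, 5), 2: Fraction(1, 2), 3: Fraction(3, 10)}

# E(X) = sum of x * P(X = x) over every outcome.
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 21/10, i.e. 2.1
```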
Expectation of a Function: The expected value of a function of a random variable, $E(g(X))$, is found by applying the function to each outcome before weighting: $E(g(X)) = \sum_x g(x) \, P(X = x)$.
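For example, taking $g(x) = x^2$ with a hypothetical distribution (this particular expectation, $E(X^2)$, is exactly the quantity the variance formula below needs):

```python
from fractions import Fraction

# Hypothetical distribution: outcome -> probability.
pmf = {1: Fraction(1, 5), 2: Fraction(1, 2), 3: Fraction(3, 10)}

# E(g(X)) with g(x) = x^2: apply g to each outcome, then weight.
e_x_squared = sum(x**2 * p for x, p in pmf.items())
print(e_x_squared)  # 49/10, i.e. 4.9
```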
Calculating Variance ($\mathrm{Var}(X)$): Variance measures the spread of the distribution around the mean. The most efficient computational formula is $\mathrm{Var}(X) = E(X^2) - [E(X)]^2$, which is the 'mean of the squares minus the square of the mean'.
Standard Deviation: To return the measure of spread to the original units of the variable, take the square root of the variance: $\sigma = \sqrt{\mathrm{Var}(X)}$.
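Putting the last two definitions together with a hypothetical distribution, the 'mean of the squares minus the square of the mean' formula looks like this:

```python
from fractions import Fraction
import math

# Hypothetical distribution: outcome -> probability.
pmf = {1: Fraction(1, 5), 2: Fraction(1, 2), 3: Fraction(3, 10)}

mean = sum(x * p for x, p in pmf.items())        # E(X)   = 21/10
mean_sq = sum(x**2 * p for x, p in pmf.items())  # E(X^2) = 49/10

variance = mean_sq - mean**2    # 49/10 - (21/10)^2 = 49/100
std_dev = math.sqrt(variance)   # back in the original units: 0.7
print(variance, std_dev)
```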
Cumulative Distribution Function (CDF): Denoted as $F(x)$, this represents the probability that the random variable is less than or equal to a specific value: $F(x) = P(X \le x)$.
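The CDF of a discrete variable is just a running sum of the probability table; a minimal sketch over a hypothetical distribution:

```python
from fractions import Fraction

# Hypothetical distribution: outcome -> probability.
pmf = {1: Fraction(1, 5), 2: Fraction(1, 2), 3: Fraction(3, 10)}

def cdf(x):
    """F(x) = P(X <= x): sum the probabilities of all outcomes up to x."""
    return sum(p for value, p in pmf.items() if value <= x)

print(cdf(2))  # 1/5 + 1/2 = 7/10
print(cdf(3))  # the full sum: 1
```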
| Inequality | Meaning | Inclusion |
|---|---|---|
| $P(X \le x)$ | At most $x$ | Includes $x$ |
| $P(X < x)$ | Fewer than $x$ | Excludes $x$ |
| $P(X \ge x)$ | At least $x$ | Includes $x$ |
| $P(X > x)$ | More than $x$ | Excludes $x$ |
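The four inequality types in the table above can be checked side by side; the distribution is again a hypothetical example:

```python
from fractions import Fraction

# Hypothetical distribution: outcome -> probability.
pmf = {1: Fraction(1, 5), 2: Fraction(1, 2), 3: Fraction(3, 10)}

def prob(condition):
    """Sum the probabilities of every outcome satisfying the condition."""
    return sum(p for x, p in pmf.items() if condition(x))

at_most_2    = prob(lambda x: x <= 2)  # includes 2 -> 7/10
fewer_than_2 = prob(lambda x: x < 2)   # excludes 2 -> 1/5
at_least_2   = prob(lambda x: x >= 2)  # includes 2 -> 4/5
more_than_2  = prob(lambda x: x > 2)   # excludes 2 -> 3/10
```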
The Table Method: Always start by constructing a probability table if one is not provided. This visual organization prevents errors when calculating the sums for $E(X)$ and $E(X^2)$.
Sanity Checks: The calculated mean must always fall within the range of the possible values of $X$. If your $x$ values are between 1 and 10 and your mean is 15, an error has occurred.
Complement Rule: For inequalities like $P(X \ge k)$, it is often faster to calculate $1 - P(X < k)$ rather than summing multiple individual probabilities.
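A small sketch of the complement rule, using a hypothetical fair six-sided die, where the direct sum needs five terms but the complement needs only one:

```python
from fractions import Fraction

# Hypothetical fair die: six equally likely outcomes.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# Direct: P(X >= 2) = P(2) + P(3) + P(4) + P(5) + P(6) -- five terms.
direct = sum(p for x, p in pmf.items() if x >= 2)

# Complement: P(X >= 2) = 1 - P(X < 2) = 1 - P(X = 1) -- one term.
complement = 1 - sum(p for x, p in pmf.items() if x < 2)

print(direct == complement)  # True; both equal 5/6
```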
Variance Non-Negativity: Variance is a mean of squared distances and can never be negative (it equals zero only when $X$ is constant). A negative result in a calculation is a definitive indicator of an arithmetic error.