Monotonicity: The cumulative probability F(x) = P(X ≤ x) is a non-decreasing function. As x increases, the cumulative probability can only stay the same or increase, never decrease.
Boundary Conditions: For the smallest possible outcome x_min, the cumulative probability is simply the probability of that outcome: F(x_min) = P(X = x_min).
Total Probability: For the largest possible outcome x_max, the cumulative probability must always equal 1, as it accounts for the entire sample space: F(x_max) = 1.
Discrete Jumps: In a discrete distribution, the cumulative probability only changes at the specific values the random variable can take, creating a 'step' effect in graphical representations.
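The four properties above can be checked directly on a small distribution. The sketch below uses a fair six-sided die as an assumed example (the die is not from the text, just a convenient discrete distribution) and builds the CDF by accumulating the PMF:

```python
from fractions import Fraction

# Assumed example: PMF of a fair six-sided die.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

# Build the CDF by accumulating the PMF in order of outcome.
cdf = {}
running = Fraction(0)
for outcome in sorted(pmf):
    running += pmf[outcome]
    cdf[outcome] = running

# Monotonicity: cumulative values never decrease left to right.
values = [cdf[k] for k in sorted(cdf)]
assert all(a <= b for a, b in zip(values, values[1:]))

# Boundary conditions: F(smallest) = P(X = smallest).
assert cdf[1] == pmf[1]

# Total probability: the final cumulative value equals 1.
assert cdf[6] == 1

print(cdf[3])  # 1/2
```

Using exact `Fraction` arithmetic avoids floating-point rounding, so the final value is exactly 1 rather than 0.9999….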
It is vital to distinguish between the Probability Mass Function (PMF) and the Cumulative Distribution Function (CDF).
| Feature | Probability Mass Function (PMF) | Cumulative Distribution Function (CDF) |
|---|---|---|
| Notation | p(x) = P(X = x) | F(x) = P(X ≤ x) |
| Meaning | Probability of an exact outcome | Probability of a range of outcomes up to and including x |
| Summation | Sum of all values equals 1 | Final value in the sequence equals 1 |
| Graph | Individual bars or points | Step-like function or increasing bars |
In discrete variables, the inclusion of the endpoint matters significantly. P(X < 3) is NOT the same as P(X ≤ 3), because the latter includes the probability of the variable being exactly 3.
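The endpoint distinction can be made concrete with a fair die (an assumed example, not from the text): P(X < 3) covers only the outcomes 1 and 2, while P(X ≤ 3) also covers 3.

```python
from fractions import Fraction

# Assumed example: PMF of a fair six-sided die.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

p_lt_3 = sum(pmf[k] for k in pmf if k < 3)   # P(X < 3)  = 2/6
p_le_3 = sum(pmf[k] for k in pmf if k <= 3)  # P(X <= 3) = 3/6

# The difference between the two is exactly P(X = 3).
assert p_le_3 - p_lt_3 == pmf[3]

print(p_lt_3, p_le_3)  # 1/3 1/2
```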
Check the Endpoints: Always verify if the last value in your cumulative distribution table is exactly 1.0. If it is not, there is an error in your individual probability calculations or your summation.
Read Inequality Signs Carefully: Exams often switch between <, ≤, >, and ≥. For discrete variables, P(X < k) is calculated as P(X ≤ k − 1), not P(X ≤ k).
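All four inequality forms reduce to the CDF. A minimal sketch, again assuming a fair die as the working distribution (the die and the helper `F` are illustrative, not from the text):

```python
from fractions import Fraction

# Assumed example: PMF of a fair six-sided die.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def F(k):
    """Cumulative probability P(X <= k) for the assumed distribution."""
    return sum(p for outcome, p in pmf.items() if outcome <= k)

k = 4
assert F(k) == Fraction(4, 6)          # P(X <= 4)
assert F(k - 1) == Fraction(3, 6)      # P(X < 4)  = P(X <= 3)
assert 1 - F(k) == Fraction(2, 6)      # P(X > 4)  = 1 - P(X <= 4)
assert 1 - F(k - 1) == Fraction(3, 6)  # P(X >= 4) = 1 - P(X <= 3)

print(F(k - 1))  # 1/2
```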
Sanity Check: Ensure that your cumulative probabilities never decrease as you move from left to right across the outcomes.
Missing Values: If a table has a missing individual probability, you can recover it from the 'jump' between consecutive cumulative values: P(X = k) = F(k) − F(k − 1).
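The jump technique can be sketched on a hypothetical cumulative table (the specific probabilities below are invented for illustration):

```python
from fractions import Fraction

# Hypothetical cumulative table F(k) = P(X <= k); suppose the exam
# left the individual probability P(X = 3) blank.
F = {1: Fraction(1, 10), 2: Fraction(3, 10), 3: Fraction(6, 10),
     4: Fraction(8, 10), 5: Fraction(1)}

# The jump between consecutive cumulative values recovers it:
# P(X = 3) = F(3) - F(2).
p3 = F[3] - F[2]

print(p3)  # 3/10
```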