Identifying Methodological Flaws: Researchers must scrutinize their sampling strategy (e.g., was it truly random or biased?) and their data collection timing (e.g., did a single day of rain skew the results?).
Equipment Audit: Evaluate whether the tools used were precise enough. For example, a digital flow meter is generally more accurate than timing a floating object to measure water velocity.
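The float-timing method in this example can be sketched numerically. This is a minimal illustration, not a standard implementation: the 0.85 correction factor (surface floats travel faster than the river's mean flow) and the timing values are assumptions for demonstration.

```python
def float_velocity(distance_m, time_s, correction=0.85):
    """Estimate mean river velocity (m/s) from one timed float run.

    Surface velocity = distance / time; a correction factor
    (~0.85 here, an assumed textbook value) approximates mean flow.
    """
    return (distance_m / time_s) * correction

# Three hypothetical timed runs over a 10 m stretch; the spread in
# timings reflects human reaction error, a reliability limitation
# a digital flow meter would largely remove.
timings = [12.1, 11.4, 13.0]
estimates = [float_velocity(10, t) for t in timings]
mean_estimate = sum(estimates) / len(estimates)
```

Comparing the spread of the three estimates against a single flow-meter reading is one way to make the "equipment audit" point concrete in a write-up.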
Human Error Analysis: Acknowledge mistakes in reading scales, recording numbers, or subjective bias in qualitative surveys like Environmental Quality Surveys (EQS).
Proposing Extensions: Suggest how the study could be broadened, for example by comparing different seasons or enlarging the geographical area of the study.
| Feature | Reliability | Validity |
|---|---|---|
| Core Question | Can the results be reproduced? | Does the method measure what it is intended to measure? |
| Main Threat | Human error, inconsistent timing | Poor equipment, biased sampling |
| Improvement | Standardize methods, repeat tests | Use better tools, refine hypothesis |
Be Specific: When identifying a limitation, do not just say 'the equipment was bad.' Explain why it was bad and how it specifically affected the data (e.g., 'The tape measure sagged in the wind, leading to an overestimation of river width').
Link to Conclusions: Always explain how a limitation impacts the final answer to the enquiry question. If the data is unreliable, the conclusion must be treated with caution.
Suggest Realistic Improvements: Propose changes that are practical, such as 'increasing the sample size from 10 to 50' or 'using a flow meter for more precise velocity readings.'
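The benefit of raising the sample size from 10 to 50 can be shown with the standard error of the mean, SE = s / √n: a larger sample shrinks the uncertainty around the average. The standard deviation of 4 cm used below is a hypothetical value for illustration.

```python
import math

def standard_error(sd, n):
    """Standard error of the mean: sd / sqrt(n)."""
    return sd / math.sqrt(n)

# Hypothetical spread of pebble-size measurements (sd = 4 cm).
se_small = standard_error(4, 10)  # sample of 10
se_large = standard_error(4, 50)  # sample of 50
```

Quintupling the sample more than halves the standard error, which is why "increase the sample size" counts as a realistic, evidence-backed improvement rather than a vague one.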
Check for Anomalies: In the exam, look for data points that do not fit the trend. Evaluating why these occurred (e.g., a sudden localized event) demonstrates high-level analytical skill.
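A simple way to flag points that do not fit the trend is to test how far each value sits from the mean. This is one common rule of thumb (values beyond two standard deviations), not the only valid approach; the discharge readings below are hypothetical.

```python
import statistics

def flag_anomalies(values, threshold=2.0):
    """Return values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > threshold * sd]

# Hypothetical river discharge readings (m³/s); the 9.8 reflects a
# sudden localized storm upstream, i.e. an anomaly worth evaluating.
readings = [1.2, 1.4, 1.3, 1.5, 9.8, 1.4]
outliers = flag_anomalies(readings)
```

In an exam answer, identifying the flagged value is the starting point; the marks come from explaining the likely cause (here, the assumed storm event) and whether the point should be excluded or investigated.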