Observation involves watching users interact with the current system to understand authentic workflows. This method is valuable because it reveals actions users may not mention in interviews, especially routine shortcuts or workarounds.
Interviews involve structured conversations in which analysts ask probing questions to uncover hidden requirements and deeper insights. They are best used when detailed explanations or clarifications are needed.
Questionnaires gather information from large groups quickly using standardized questions. They help reveal trends and general attitudes, especially when anonymity encourages honest responses.
Document examination involves reviewing existing records, forms, manuals, and logs. This method reveals formal procedures, data structures, and operational constraints that may not be visible in day‑to‑day activity.
Data Flow Diagrams (DFDs) model how data moves through the system, showing inputs, processes, data stores, and outputs. They help analysts visualize inefficiencies and understand how information is transformed.
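Before a DFD is drawn, its elements can be captured informally as structured data, which makes the inputs, processes, stores, and outputs explicit. The sketch below is only a minimal illustration under assumed names (the "Validate order" process, "Orders store", and so on are invented for a hypothetical order-handling system), not a standard DFD notation.

```python
# Minimal sketch of a Data Flow Diagram captured as Python data.
# Element names such as "Validate order" and "Orders store" are hypothetical,
# chosen only to illustrate inputs, processes, data stores, and outputs.

dfd = {
    "external_entities": ["Customer"],
    "processes": ["Validate order", "Update stock"],
    "data_stores": ["Orders store", "Stock file"],
    "flows": [
        ("Customer", "Validate order", "order details"),       # input
        ("Validate order", "Orders store", "accepted order"),  # write to store
        ("Validate order", "Update stock", "order items"),
        ("Update stock", "Stock file", "stock adjustment"),
        ("Update stock", "Customer", "order confirmation"),    # output
    ],
}

# Listing every flow makes each transformation explicit, which helps an
# analyst spot inefficiencies such as data written to a store but never read.
for source, target, data in dfd["flows"]:
    print(f"{source} --[{data}]--> {target}")
```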
| Feature | Observation | Interviews | Questionnaires | Document Examination |
|---|---|---|---|---|
| Best for | Real behavior | Deep insights | Large groups | Formal structure |
| Weakness | Observer effect | Time‑intensive | Low detail | May be outdated |
Functional vs non-functional requirements differ in purpose: functional requirements specify what the system must do, while non‑functional requirements define quality attributes such as reliability, usability, or performance. For example, "the system must record each booking" is functional, whereas "the system must respond within two seconds" is non‑functional. Understanding both ensures the new system is effective as well as efficient.
User requirements vs information requirements distinguish what users want to accomplish from what data the system must manage. Keeping these separate prevents confusion between actions and data needs.
Current issues vs future needs must be distinguished so the new system does not merely replicate existing workflows but supports long‑term improvements.
Identify the most suitable research method based on user characteristics, such as availability, group size, or need for detail. Exams often test the ability to match methods to scenarios.
Clarify whether a question refers to user requirements or system requirements, as mixing these leads to vague answers. Always specify what the user wants to achieve and how the system will support it.
When describing analysis tasks, include inputs, outputs, processing steps, and problems. Examiners expect full coverage rather than partial descriptions.
Use precise terminology such as "processing", "data flow", or "functional requirement" to show technical understanding. Vague language suggests weak conceptual grasp.
Always justify research method choices by linking the method to the situation—such as the need for anonymity, speed, depth, or accuracy.
Assuming users always describe their behavior accurately is a misconception; many real workflows rely on undocumented shortcuts. Analysts must confirm statements with observation or documentation.
Believing the current system must be replicated ignores the purpose of analysis, which is to identify improvements. A new system should solve problems, not preserve them.
Confusing problems with symptoms leads analysts to propose fixes that address surface issues but not underlying causes. Correct analysis requires examining workflow logic and data needs.
Over‑reliance on a single research method can produce biased or incomplete requirements. A balanced combination yields more accurate insights.
Assuming technical constraints are secondary overlooks how limitations affect feasibility. Realistic solutions depend on early constraint identification.
Analysis connects directly to the design stage, where findings such as data requirements and process summaries guide file structures, interfaces, and validation routines.
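As a small illustration of how an analysis finding feeds design, suppose analysis documents the data requirement "quantity ordered must be a whole number between 1 and 100". The field name and limits below are hypothetical; the sketch simply shows how such a requirement becomes a validation routine at the design stage.

```python
# Hypothetical example: turning a documented data requirement into a
# validation routine. The field and its limits (1 to 100) are invented
# purely for illustration.

def validate_quantity(raw_value: str) -> tuple[bool, str]:
    """Type check followed by a range check, as specified during analysis."""
    if not raw_value.strip().isdigit():       # type/format check
        return False, "Quantity must be a whole number."
    quantity = int(raw_value)
    if not 1 <= quantity <= 100:              # range check
        return False, "Quantity must be between 1 and 100."
    return True, "OK"

print(validate_quantity("25"))   # (True, 'OK')
print(validate_quantity("abc"))  # (False, 'Quantity must be a whole number.')
print(validate_quantity("500"))  # (False, 'Quantity must be between 1 and 100.')
```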
Analysis principles apply beyond ICT, such as in business process improvement or workflow redesign. The same logic—understand before changing—applies universally.
Well-structured analysis supports testing, because clear requirements make it easier to define expected outcomes for test plans.
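To illustrate, a single documented requirement can be translated directly into test cases with expected outcomes. The discount rule, test data, and figures below are assumptions made up for this sketch, not taken from any real system.

```python
# Hypothetical illustration: a requirement from analysis ("10% discount on
# orders of 50 items or more") restated as test-plan rows with expected outcomes.

def apply_discount(quantity: int, unit_price: float) -> float:
    """Assumed rule: 10% discount when 50 or more items are ordered."""
    total = quantity * unit_price
    return total * 0.9 if quantity >= 50 else total

# Test plan rows: (description, inputs, expected outcome)
test_plan = [
    ("normal data, below threshold", (10, 2.00), 20.00),
    ("boundary data, at threshold",  (50, 2.00), 90.00),
    ("normal data, above threshold", (60, 1.00), 54.00),
]

for description, (qty, price), expected in test_plan:
    actual = apply_discount(qty, price)
    status = "PASS" if abs(actual - expected) < 0.005 else "FAIL"
    print(f"{status}: {description} -> expected {expected}, got {actual}")
```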
Analysis contributes to evaluation, since final system performance is judged against the requirements documented in this stage.
Modelling tools like DFDs extend into advanced topics, such as system architecture mapping and process optimization.