Cache is a small amount of very fast memory built directly into, or located very close to, the CPU to store frequently used data and instructions.
It acts as a buffer between the fast CPU and the relatively slow main memory (RAM), reducing the time the CPU spends waiting for data to be fetched.
Increasing the cache size allows more data to be stored locally, which improves the 'hit rate' (the proportion of memory accesses served from cache rather than RAM) and significantly boosts overall system responsiveness.
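The link between cache size and hit rate can be sketched with a toy LRU (least recently used) cache simulation. The access pattern, capacities, and resulting figures below are illustrative assumptions, not real CPU behaviour:

```python
from collections import OrderedDict
import random

def hit_rate(capacity, accesses):
    """Simulate an LRU cache of the given capacity and return its hit rate."""
    cache = OrderedDict()
    hits = 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)          # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)    # evict the least recently used entry
            cache[addr] = True
    return hits / len(accesses)

# A skewed access pattern: a few 'hot' addresses are requested far more often,
# mimicking the locality of reference that makes real CPU caches effective.
random.seed(0)
accesses = [random.choice(range(10)) if random.random() < 0.8
            else random.choice(range(1000)) for _ in range(10_000)]

small = hit_rate(capacity=8, accesses=accesses)
large = hit_rate(capacity=64, accesses=accesses)
print(f"8-entry cache hit rate:  {small:.2%}")
print(f"64-entry cache hit rate: {large:.2%}")
```

The larger cache holds the whole 'hot' working set, so it serves far more accesses without going to (simulated) main memory.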
A Core is an independent processing unit within the CPU that can execute its own fetch-decode-execute (FDE) cycle.
Multi-core processors (e.g., dual-core, quad-core) can process multiple instructions simultaneously through parallel processing.
While more cores allow for better multitasking, the performance gain depends on whether the software is designed to split tasks across multiple cores.
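Splitting a task across cores can be illustrated with Python's `concurrent.futures.ProcessPoolExecutor`. The workload and chunking scheme here are hypothetical, and real speed-ups depend on overheads and how evenly the work divides:

```python
from concurrent.futures import ProcessPoolExecutor
import os

def sum_of_squares(chunk):
    """CPU-bound work for one core: sum the squares of a range of numbers."""
    start, end = chunk
    return sum(n * n for n in range(start, end))

def parallel_sum_of_squares(limit, workers=None):
    """Split [0, limit) into one chunk per worker and process the chunks in parallel."""
    workers = workers or os.cpu_count() or 1
    step = limit // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else limit)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_of_squares, chunks))

if __name__ == "__main__":
    # Same answer as a serial loop, but the chunks run on separate cores.
    print(parallel_sum_of_squares(100_000))
```

This only helps because the task splits cleanly into independent chunks; software that cannot be divided this way gains little from extra cores, which is exactly the caveat above.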
| Feature | Primary Function | Unit of Measure | Impact on Performance |
|---|---|---|---|
| Clock Speed | Determines cycle frequency | Hertz (GHz) | Increases speed of single tasks |
| Cache Size | Reduces data access latency | Megabytes (MB) | Speeds up repetitive data access |
| Cores | Enables parallel execution | Count (e.g., 4, 8) | Improves multitasking and throughput |
- Identify Bottlenecks: Doubling one characteristic (such as cores) does not necessarily double performance if another component (such as clock speed or RAM) is the bottleneck.
- Calculation Precision: When asked about GHz, convert to billions (1 GHz = 10⁹ cycles per second) to show the scale of instructions processed per second.
- Software Context: In exam questions on multi-core performance, mention that software must be 'multi-threaded' or optimised to utilise multiple cores effectively.
- Common Mistake: Do not confuse cache with RAM; cache is much faster, smaller, and located on the CPU chip itself.
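The GHz conversion tip above can be shown as a short worked example, assuming a hypothetical 3.2 GHz processor:

```python
# 1 GHz = 10**9 Hz, so a 3.2 GHz clock completes 3.2 billion cycles per second.
clock_speed_ghz = 3.2
cycles_per_second = clock_speed_ghz * 10**9
print(f"{clock_speed_ghz} GHz = {cycles_per_second:,.0f} cycles per second")
# → 3.2 GHz = 3,200,000,000 cycles per second
```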