A cache is a small, fast memory placed close to the CPU: it is faster than main memory, but also much smaller. Cache organization is the problem of mapping data in main memory to a location in the cache. A Simple Solution: One way to do this mapping is to take the last few bits of the long memory address as the small cache address, and place the data at that location. Problems With Simple Solution: With this approach we lose the information carried by the high-order bits, so we have no way to tell which high-order bits (i.e., which memory block) a cached entry belongs to. This is why a cache must also store a tag (the remaining high-order bits) alongside each block.
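The address split described above can be sketched as follows. The line size and line count here are illustrative assumptions, not fixed by the text:

```python
# Sketch: how a cache splits an address into tag, index, and offset
# bits. Parameters are assumptions for illustration: 64-byte lines
# and 256 cache lines.
LINE_SIZE = 64        # bytes per cache line -> 6 offset bits
NUM_LINES = 256       # lines in the cache   -> 8 index bits

OFFSET_BITS = LINE_SIZE.bit_length() - 1   # 6
INDEX_BITS = NUM_LINES.bit_length() - 1    # 8

def split_address(addr):
    offset = addr & (LINE_SIZE - 1)                # low-order bits
    index = (addr >> OFFSET_BITS) & (NUM_LINES - 1)  # cache address
    tag = addr >> (OFFSET_BITS + INDEX_BITS)       # high-order bits
    return tag, index, offset
```

The index selects the cache line; the tag must be stored with the line so a later access can check whether the cached data really belongs to this address.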
We will soon discuss cache organization in more detail. This article is contributed by Ankur Gupta.
Advantages and disadvantages of different cache organization techniques:
Direct-Mapped Cache:
Advantages:
- Simple and easy to implement
- Low hardware overhead
- Fast hit time
Disadvantages:
- High miss rate because of the limited number of cache blocks
- Increased conflict misses because of block collisions
- Limited flexibility in terms of block placement
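The conflict misses above can be demonstrated with a minimal direct-mapped cache sketch. The parameters (4 lines, 16-byte blocks) are assumptions chosen to make two addresses collide:

```python
# Minimal direct-mapped cache sketch (assumed parameters: 4 lines,
# 16-byte blocks) illustrating conflict misses from block collisions.
class DirectMappedCache:
    def __init__(self, num_lines=4, block_size=16):
        self.num_lines = num_lines
        self.block_size = block_size
        self.tags = [None] * num_lines  # one tag per line

    def access(self, addr):
        block = addr // self.block_size
        index = block % self.num_lines   # line is fixed by the address
        tag = block // self.num_lines
        hit = self.tags[index] == tag
        self.tags[index] = tag           # fill/replace on miss
        return hit

cache = DirectMappedCache()
# Addresses 0 and 64 map to the same index (blocks 0 and 4, both
# index 0), so they keep evicting each other: every access misses.
hits = [cache.access(a) for a in [0, 64, 0, 64]]
```

Even though the cache has four lines, alternating between these two addresses never hits, because block placement is completely inflexible.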
Set-Associative Cache:
Advantages:
- Higher hit rate than direct-mapped cache, because multiple blocks can be stored in each set
- More flexible block placement than direct-mapped cache
- Fewer conflict misses compared to direct-mapped cache
Disadvantages:
- Higher hardware overhead than direct-mapped cache
- Longer hit time than direct-mapped cache, because multiple blocks must be searched
- Limited scalability due to the fixed number of ways per set
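To see how set associativity reduces conflict misses, here is a sketch of a 2-way set-associative cache with LRU replacement. The parameters (2 sets, 2 ways, 16-byte blocks) are assumptions; the same two addresses that thrash the direct-mapped cache can now coexist in one set:

```python
from collections import OrderedDict

# Sketch of a 2-way set-associative cache with LRU replacement
# (assumed parameters: 2 sets, 2 ways, 16-byte blocks).
class SetAssociativeCache:
    def __init__(self, num_sets=2, ways=2, block_size=16):
        self.num_sets = num_sets
        self.ways = ways
        self.block_size = block_size
        self.sets = [OrderedDict() for _ in range(num_sets)]

    def access(self, addr):
        block = addr // self.block_size
        index = block % self.num_sets
        tag = block // self.num_sets
        s = self.sets[index]             # search all ways in this set
        hit = tag in s
        if hit:
            s.move_to_end(tag)           # refresh LRU order
        else:
            if len(s) >= self.ways:
                s.popitem(last=False)    # evict least recently used
            s[tag] = True
        return hit

cache = SetAssociativeCache()
# Addresses 0 and 64 both land in set 0, but the two ways hold both
# blocks, so after the initial (compulsory) misses every access hits.
hits = [cache.access(a) for a in [0, 64, 0, 64]]
```

The cost is visible in the lookup: every way of the selected set must be searched, which is why the hit time is longer than in a direct-mapped cache.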
Fully-Associative Cache:
Advantages:
- Highest hit rate among these cache organizations
- Most flexible block placement
- No conflict misses, because any block can be placed in any cache location
Disadvantages:
- Highest hardware overhead among these cache organizations
- Longest hit time, due to searching all blocks in the cache
- Limited scalability because of limited physical area and large tag storage requirements
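For completeness, a fully-associative cache can be sketched the same way. The capacity (4 blocks of 16 bytes) is an assumption; note that the lookup must compare against every stored tag, and misses arise only from capacity, never from index collisions:

```python
from collections import OrderedDict

# Fully-associative cache sketch with LRU replacement (assumed
# capacity: 4 blocks of 16 bytes). Any block can occupy any slot.
class FullyAssociativeCache:
    def __init__(self, capacity=4, block_size=16):
        self.capacity = capacity
        self.block_size = block_size
        self.blocks = OrderedDict()      # block number -> present

    def access(self, addr):
        block = addr // self.block_size
        hit = block in self.blocks       # search every block's tag
        if hit:
            self.blocks.move_to_end(block)
        else:
            if len(self.blocks) >= self.capacity:
                self.blocks.popitem(last=False)  # evict LRU block
            self.blocks[block] = True
        return hit

cache = FullyAssociativeCache()
# Four distinct blocks fit simultaneously regardless of their
# address bits, so repeat accesses hit: only compulsory misses occur.
hits = [cache.access(a) for a in [0, 64, 128, 192, 0, 64]]
```

In hardware this "search every tag" step is what drives up both the hit time and the overhead: a real fully-associative cache needs a comparator per entry, which is why the design scales poorly.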