Assume that for a certain processor, a read request takes 50 ns on a cache miss and 5 ns on a cache hit.
Suppose while running a program, it was observed that 80% of the processor's read requests result in a cache hit. The average read access time in nanoseconds is _____.
Now, I know that the answer is 0.8(5) + 0.2(50) = 14 ns.
But there’s another way of calculating this, the one we usually use for multilevel caches, given as:
T(avg) = Hit Time of L1 + MissRate(L1)*(Miss Penalty for L1)
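(If I'm not mistaken, for two cache levels this expands to T(avg) = HitTime(L1) + MissRate(L1)*(HitTime(L2) + MissRate(L2)*MissPenalty(L2)), but the single-level form above is what matters here.)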
If we apply the same logic to this question,
T(avg) = 5 + 0.2*(50), which gives us 15 ns.
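To make the discrepancy concrete, here's a quick Python sketch of both calculations (variable names are mine, and I'm treating the 50 ns from the question as the total time on a miss):

```python
hit_time = 5.0          # ns, total time on a cache hit (from the question)
miss_time = 50.0        # ns, total time on a cache miss (from the question)
hit_rate = 0.8
miss_rate = 1.0 - hit_rate

# First formula: weighted average of the two total access times.
t_weighted = hit_rate * hit_time + miss_rate * miss_time
print(t_weighted)       # 14.0 ns

# Second formula: hit time paid on every access, plus the miss term on top.
t_penalty = hit_time + miss_rate * miss_time
print(t_penalty)        # 15.0 ns
```

If my arithmetic is right, the two results differ by exactly miss_rate * hit_time = 1 ns, i.e. the second formula charges the 5 ns hit time even on misses.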
So, how do we know which formula to use in which context?