HPCA Assignment: Why does increasing cache size increase the hit time?
Submitted by Indranil Nandy, MTech 2006, Roll no.: 06CS6010
Direct-mapped Cache
First, let us see how a direct-mapped cache is organized. To read a word from the cache, the processor drives the input address. The index portion of the address is then decoded to access the proper row in the tag memory array and in the data memory array. The selected tag is compared with the tag portion of the input address to determine whether the access is a hit. At the same time, the corresponding cache block is read out and the requested word is selected from the block through a MUX.
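The address decomposition described above can be sketched as follows. This is an illustrative example, not part of the assignment: the field widths (a 32-bit address, a 2^14-byte cache, 2^6-byte blocks) are assumed values chosen only to make the arithmetic concrete.

```python
def split_address(addr, m=14, n=6):
    """Split an address for a direct-mapped cache of 2^m bytes
    with 2^n-byte blocks (m=14, n=6 are assumed example values:
    a 16 KB cache with 64-byte blocks)."""
    offset_bits = n                 # selects a byte within the block
    index_bits = m - n              # selects one of 2^(m-n) cache lines
    offset = addr & ((1 << offset_bits) - 1)
    index = (addr >> offset_bits) & ((1 << index_bits) - 1)
    tag = addr >> (offset_bits + index_bits)
    return tag, index, offset

tag, index, offset = split_address(0x12345678)
```

The index picks the row in both the tag and data arrays; the tag field is what the comparator checks against the stored tag in that row.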
In the tag and data arrays, each row corresponds to a line in the cache. For example, a row in the tag memory array contains one tag and two status bits (valid and dirty) for the cache line. For direct-mapped caches, a row in the data array holds one cache line. Now let us see the delays of the components that together make up the access time, i.e., the hit time. The delay equations are as follows, where each ci is a constant:
Decoder:             c1 * (# of index bits) + c2
Memory array:        c1 * log2(# of rows) + c1 * log2(# of bits per row) + c2
Comparator:          c1 * (# of tag bits) + c2
N-to-1 MUX:          c3 * log2(N) + c2
Valid output driver: c2
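These equations can be expressed as a small numeric model. The constants c1, c2, c3 below are assumed unit placeholders (the assignment gives no values), so only relative trends between configurations are meaningful, not the absolute numbers.

```python
from math import log2

C1, C2, C3 = 1.0, 0.5, 1.0   # assumed placeholder constants

def decoder_delay(index_bits):
    return C1 * index_bits + C2

def array_delay(rows, bits_per_row):
    return C1 * log2(rows) + C1 * log2(bits_per_row) + C2

def comparator_delay(tag_bits):
    return C1 * tag_bits + C2

def mux_delay(n_inputs):             # N-to-1 MUX
    return C3 * log2(n_inputs) + C2

def driver_delay():                  # valid output driver
    return C2
```

The hit time is then the sum of these component delays along the critical path for a given cache configuration.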
Now, when we increase the cache size keeping the block size fixed, the number of index bits increases. If the cache size is 2^m bytes and the block size is 2^n bytes, then the data array must hold 2^(m-n) blocks, so m-n index bits are required. Whenever we increase the cache size with n fixed, the number of index bits must grow, which increases the number of input lines to the decoder; by the decoder equation above, this lengthens the hit time. Moreover, since the number of rows also increases, the memory-array delay grows as well, further increasing the hit time.