

4.3. Optimal Caching: A More Complex Exchange Argument

The Problem

  • caching: The process of storing a small amount of data in a fast memory to reduce the time spent interacting with a slow memory.
  • cache maintenance algorithms determine what to keep in the cache and what to delete from it when new data is brought in.
  • The setting:
    • Let U = set of n pieces of data stored in main memory.
    • We have a faster memory, the cache, that can hold k < n pieces of data at once
    • The cache initially holds some set of k items.
    • The sequence of data we must process: D = d1, d2, …, dm, drawn from U.
    • When processing D, at each time we must decide which k items to keep in the cache
    • When di is presented, if it is already in the cache we can access it very quickly; otherwise we must bring it from main memory into the cache.
    • cache miss: occurs when di is not already in the cache; if the cache is full, we must evict some other item from the cache to make room for di.
    • On a particular sequence of memory references, a cache maintenance algorithm determines an eviction schedule: a specification of which items should be evicted from the cache at which points in the sequence. This schedule determines the contents of the cache and the number of misses over time.
  • So the goal of the algorithm is to produce an eviction schedule that incurs as few cache misses as possible (a small simulation sketch of this model follows below).
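
To make the model concrete, here is a minimal Python sketch (not from the text) that simulates a cache on a request sequence and counts the misses incurred by a given eviction rule; the names count_misses and choose_victim are just illustrative.

<code python>
def count_misses(D, k, initial_cache, choose_victim):
    """Simulate a cache of size k on request sequence D and count the misses.

    choose_victim(cache, i, D) is the eviction rule (an assumption of this
    sketch): given the current cache contents, the current position i, and
    the full sequence, it returns the item to evict when room must be made.
    """
    cache = set(initial_cache)
    misses = 0
    for i, d in enumerate(D):
        if d in cache:
            continue                      # hit: d is already in the cache
        misses += 1                       # miss: d must be brought in
        if len(cache) >= k:               # cache is full, so evict something
            cache.remove(choose_victim(cache, i, D))
        cache.add(d)
    return misses
</code>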

Designing the Algorithm

Algorithm

When di needs to be brought into the cache,
evict the item that is needed farthest into the future (the Farthest-in-Future rule).
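
Here is a sketch of this Farthest-in-Future rule, written so it can plug into the count_misses simulator sketched above; the helper name farthest_in_future is just illustrative.

<code python>
def farthest_in_future(cache, i, D):
    """Return the cached item whose next request after position i is latest."""
    def next_use(item):
        for j in range(i + 1, len(D)):
            if D[j] == item:
                return j
        return float('inf')               # never requested again: evict first
    return max(cache, key=next_use)

# Example: cache size k = 2, initial cache {a, b}, request sequence a b c b a.
# count_misses(['a', 'b', 'c', 'b', 'a'], 2, {'a', 'b'}, farthest_in_future)
# returns 2: the request for c is an unavoidable miss, and whichever of a or b
# is evicted is requested again later, so 2 misses is optimal on this instance.
</code>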