==== The Problem ====

  * //caching//: The process of storing a small amount of data in a fast memory to reduce the time spent interacting with a slow memory.
  * //cache maintenance algorithms// determine what to keep in the cache and what to delete from it when new data is brought in.
  * The setting:
    * Let U = set of n pieces of data stored in main memory.
    * We have a faster memory, the //cache//, that can hold k < n pieces of data at once.
    * The //cache// initially holds some set of k items.
    * The sequence of data items we must process: D = d<sub>1</sub>,d<sub>2</sub>,...,d<sub>m</sub>, each drawn from U.
    * When processing D, at each step we must decide which k items to keep in the cache.
    * When d<sub>i</sub> is presented, we can access it quickly if it is already in the //cache//; otherwise we must bring it from main memory into the //cache//.
    * //cache miss//: occurs when d<sub>i</sub> is not already in the //cache//; if the //cache// is full, we must evict some other item from the //cache// to make room for d<sub>i</sub>.
    * On a particular sequence of memory references, a cache maintenance algorithm determines an //eviction schedule//: it specifies which items should be evicted from the cache at which points in the sequence, and this determines the contents of the cache and the number of misses over time.
  * So the goal of the algorithm is to produce an //eviction schedule// that incurs as few cache misses as possible (a small simulation sketch follows below).
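
To make the setup concrete, here is a minimal Python sketch of the bookkeeping described above (the function name and interface are my own, not from the text): it replays a request sequence against a cache that starts with k items and counts the misses under an arbitrary eviction rule.

<code python>
def count_misses(initial_cache, requests, evict):
    """Replay `requests` against a cache that starts full with k items.

    `evict(cache, requests, i)` is any rule that names the cached item to
    drop when request i is a miss and the cache is full.
    """
    cache = set(initial_cache)
    k = len(cache)                      # the cache always holds at most k items
    misses = 0
    for i, d in enumerate(requests):
        if d in cache:                  # hit: d is already in fast memory
            continue
        misses += 1                     # miss: d must be brought in
        if len(cache) == k:             # cache full, evict one item first
            cache.remove(evict(cache, requests, i))
        cache.add(d)
    return misses

# Example with k = 2: always evict the alphabetically smallest item
# (an arbitrary, clearly non-optimal rule) -> 2 misses.
print(count_misses({"a", "b"}, ["a", "c", "b", "a"],
                   lambda cache, reqs, i: min(cache)))
</code>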

==== Designing the Algorithm ====
=== Algorithm ===

> When d<sub>i</sub> needs to be brought into the cache, evict the item that is needed the farthest into the future.

  * This algorithm is called the //Farthest-in-Future Algorithm//.
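
A minimal sketch of this rule in Python (my own code, assuming the whole request sequence is known in advance, as the algorithm requires): on a miss with a full cache, it evicts the cached item whose next request lies farthest in the future, or an item that is never requested again.

<code python>
def farthest_in_future_misses(initial_cache, requests):
    cache = set(initial_cache)
    k = len(cache)
    misses = 0
    for i, d in enumerate(requests):
        if d in cache:                  # hit: nothing to do
            continue
        misses += 1                     # miss: d must be brought in
        if len(cache) == k:
            def next_use(item):
                # step of the next request for `item` after step i; infinity if none
                for j in range(i + 1, len(requests)):
                    if requests[j] == item:
                        return j
                return float("inf")
            # evict the cached item needed farthest in the future
            cache.remove(max(cache, key=next_use))
        cache.add(d)
    return misses

# Example with k = 2: only bringing in c causes a miss; a is evicted because
# it is never requested again.
print(farthest_in_future_misses({"a", "b"}, ["a", "b", "c", "b", "c", "b"]))   # -> 1
</code>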

=== Analysis ===

  * A //reduced schedule//: a schedule in which an item d is brought into the cache in a step //i// only if d is requested in step //i// and d is not already in the cache. Such a schedule does the minimal amount of work necessary in a given step.
  * In a schedule that is not reduced, an item d can be brought into the cache in step //i// even if there was no request for d in step //i//.
  * Let S be a schedule that is not reduced:
    * To form a reduction of S (call it S**): whenever S brings in an item d that has not been requested in step //i//, S** pretends to bring it into the cache but doesn't actually do so; instead, it leaves d in main memory.
    * S** brings d into the cache only in the next step j at which d is actually requested.
    * So the cache miss incurred by S** in step j can be charged to the earlier cache operation performed by S in step //i//, when S brought d in.
    * Thus S** is a reduced schedule that brings in at most as many items as the schedule S.
  * Proving optimality of Farthest-in-Future:
    * Let S<sub>FF</sub> be the schedule produced by the Farthest-in-Future algorithm.
    * Let S* be a schedule that incurs the minimum possible number of misses.
    * -->Let S be a reduced schedule that makes the same eviction decisions as S<sub>FF</sub> through the first j items in the sequence, for a number j.
    * -->Then there is a reduced schedule S' that makes the same eviction decisions as S<sub>FF</sub> through the first j+1 items, and incurs no more misses than S does.
    * Applying this exchange step repeatedly, S<sub>FF</sub> incurs no more misses than any other schedule S* and hence is optimal.
    * Proof: course book, pages 135-136.
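
As an illustrative check of this claim on small inputs (my own code, not from the book), the sketch below brute-forces the minimum possible number of misses by trying every eviction choice at every miss; it considers only reduced schedules, which the reduction argument above shows loses nothing. On the small example used earlier its answer matches Farthest-in-Future.

<code python>
def min_misses(cache, requests, i=0):
    """Minimum misses achievable from step i onward, by exhaustive search.

    Only reduced schedules are considered: items are brought in only when
    they are requested and missing from the cache.
    """
    cache = frozenset(cache)
    if i == len(requests):
        return 0
    d = requests[i]
    if d in cache:                      # hit: no choice to make
        return min_misses(cache, requests, i + 1)
    # miss: try every possible victim and keep the cheapest continuation
    return 1 + min(min_misses((cache - {victim}) | {d}, requests, i + 1)
                   for victim in cache)

# On the example used above, the optimum is 1 miss, matching Farthest-in-Future.
print(min_misses({"a", "b"}, ["a", "b", "c", "b", "c", "b"]))   # -> 1
</code>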
\\
\\
  * The caching algorithm can be extended to settings where eviction decisions must be made without knowing the future.
  * Caching algorithms under this requirement are variants of the //Least-Recently-Used (LRU) Principle//, which proposes evicting from the cache the item that was referenced //longest ago//.
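
A rough sketch of the LRU rule in Python (my own code; real caches implement recency tracking in hardware or in the operating system): an ordered dictionary keeps items in order of last use, so the least recently used item sits at the front and is evicted on a miss.

<code python>
from collections import OrderedDict

def lru_misses(initial_cache, requests):
    k = len(initial_cache)
    cache = OrderedDict((item, None) for item in initial_cache)
    misses = 0
    for d in requests:
        if d in cache:
            cache.move_to_end(d)        # hit: mark d as most recently used
            continue
        misses += 1
        if len(cache) == k:
            cache.popitem(last=False)   # evict the least recently used item
        cache[d] = None                 # d is now the most recently used
    return misses

print(lru_misses({"a", "b"}, ["a", "b", "c", "b", "a"]))   # -> 2
</code>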
\\
I give this section a 7/10.