Can someone please help me with "Replacement Algorithms"?

i_hate_toms

New Member
I'm a second-year computer science engineering student.
I'm taking a subject called "Computer Architecture and System Software",
and I'm presently in the chapter called "The Memory System".
I'm studying from McGraw Hill's book "Computer Organization" by Hamacher, Vranesic, and Zaky (5th Edition).
On page 321, section 5.5.2,
it says: "Performance of the LRU algorithm can be improved by introducing a small amount of randomness in deciding which block to replace".
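
For reference, here's roughly how I picture plain LRU working on one cache set, and where I guess the randomness would go (my own sketch in C, not from the book, so please correct me if I've got it wrong):

#include <stdlib.h>

#define WAYS 4   /* a 4-way set-associative set, just as an example */

/* Per-block age counters for one set:
 * 0 = most recently used, WAYS-1 = least recently used. */
static int age[WAYS] = {0, 1, 2, 3};

/* On a hit to block b: blocks that were newer than b get older by one,
 * and b becomes the most recent. */
void lru_touch(int b)
{
    for (int i = 0; i < WAYS; i++)
        if (age[i] < age[b])
            age[i]++;
    age[b] = 0;
}

/* Plain LRU: on a miss, replace the oldest block. */
int lru_victim(void)
{
    for (int i = 0; i < WAYS; i++)
        if (age[i] == WAYS - 1)
            return i;
    return 0;   /* not reached if the counters stay consistent */
}

/* My guess at the book's variant: once in a while, replace a
 * randomly chosen block instead of the strict LRU one. */
int lru_victim_with_randomness(void)
{
    if (rand() % 16 == 0)        /* e.g. 1 time in 16 */
        return rand() % WAYS;
    return lru_victim();
}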

My questions:
1. What causes this performance improvement?
2. Where is this randomness generated? In the CPU or the memory controller?
3. How is this randomness produced? How does a chip generate truly random events/numbers that can't be explained mathematically?

Thanks.
 
Fellow engineering student here.
Imagine you have an array of the heads of 100 000 singly linked lists, and you have to store and retrieve 10 000 000 data items as fast as you can. The heads of the lists can be accessed in O(1), while reaching the rest of a list takes up to O(n), depending on how long the list is. If you stack all your data in the first list, the last element takes 10 000 000 steps to retrieve. If you distribute them evenly, any item takes at most 100 iterations to reach.
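
Here's a quick sketch of that idea in C (sizes scaled down and the "hash" is just rand(), purely to show the chain-length difference):

#include <stdio.h>
#include <stdlib.h>

#define BUCKETS 1000     /* scaled down from 100 000 */
#define ITEMS   100000   /* scaled down from 10 000 000 */

struct node { int key; struct node *next; };
static struct node *head[BUCKETS];
static int chain_len[BUCKETS];

/* Insert at the head of the chosen bucket: O(1). */
static void insert(int key, unsigned bucket)
{
    struct node *n = malloc(sizeof *n);
    n->key = key;
    n->next = head[bucket];
    head[bucket] = n;
    chain_len[bucket]++;
}

static int longest_chain(void)
{
    int max = 0;
    for (int i = 0; i < BUCKETS; i++)
        if (chain_len[i] > max)
            max = chain_len[i];
    return max;
}

int main(void)
{
    /* Spread the keys (pseudo-)evenly: the worst lookup walks roughly
     * ITEMS/BUCKETS nodes. Put everything into bucket 0 instead and the
     * worst lookup walks all ITEMS nodes. */
    for (int key = 0; key < ITEMS; key++)
        insert(key, (unsigned)rand() % BUCKETS);
    printf("longest chain with an even spread: %d\n", longest_chain());
    return 0;
}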

The logic here is quite simple: you want your elements distributed as evenly as possible, and the purest distribution comes from pure probabilistic randomness. The computer generates pseudo-random numbers, for example by reading the current time to nanosecond precision and calculating a number on top of that.
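
As an illustration, a toy pseudo-random generator seeded from the clock could look like this (a simple linear congruential generator using the POSIX clock; real libraries and hardware random sources work differently):

#include <stdio.h>
#include <time.h>

/* Toy linear congruential generator seeded from the current time.
 * Recurrence: state = state * a + c (mod 2^64), with Knuth's constants. */
static unsigned long long state;

static void prng_seed(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);   /* nanosecond-resolution clock */
    state = (unsigned long long)ts.tv_sec * 1000000000ull + ts.tv_nsec;
}

static unsigned prng_next(void)
{
    state = state * 6364136223846793005ull + 1442695040888963407ull;
    return (unsigned)(state >> 33);       /* keep the higher-quality top bits */
}

int main(void)
{
    prng_seed();
    for (int i = 0; i < 5; i++)
        printf("%u\n", prng_next());
    return 0;
}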

I think there's a function that, if iterated over Z, gives better probabilistic results, but I'm not sure. (See Vi Hart's video on Fibonacci numbers on YouTube.)
 
OK, got your point. Makes sense when we're talking about arrays.
But does this logic hold for other operations that don't involve an array data structure?
Didn't get the second part of your answer though: "iterated over Z"? What's Z?
They haven't taught us that yet.
And the random event/number generation technique you wrote about above, that's not in the book. At least I haven't come across it yet. The concept sounds interesting. I'll ask the professor tomorrow for a detailed description. Thanks :-)

Thanks for your answer, very good explanation of that array thing, makes perfect sense :-)
 
We call Z the set of all integers: {..., -2, -1, 0, 1, 2, ...}.

I also have some professors who just recite their papers like bad rappers. Yet some professors, like the one who taught us this trick, really know how to keep us interested. Good luck in your quest for knowledge.
 
Z = {integers}, how could I miss that?!
No wonder I suck at mathematics :-p
Thanks for clarifying :)
Are you in WBUT?
 