
UGC-NET | UGC NET CS 2015 Dec – II | Question 27

Suppose that the number of instructions executed between page faults is directly proportional to the number of page frames allocated to a program. If the available memory is doubled, the mean interval between page faults is also doubled. Further, consider that a normal instruction takes one microsecond, but if a page fault occurs, it takes 2001 microseconds. If a program takes 60 sec to run, during which time it gets 15,000 page faults, how long would it take to run if twice as much memory were available?
(A) 60 sec
(B) 30 sec
(C) 45 sec
(D) 10 sec

Answer: (C)
Explanation:
The original run takes 60 sec = 60,000,000 microseconds. The 15,000 page faults account for 15,000 × 2001 = 30,015,000 microseconds, leaving 60,000,000 − 30,015,000 = 29,985,000 microseconds for normal instructions, i.e. 29,985,000 instructions. The program therefore executes 29,985,000 + 15,000 = 30,000,000 instructions in total.
Doubling the memory doubles the mean interval between page faults, so the number of faults halves to 7,500. The new run time is (30,000,000 − 7,500) × 1 + 7,500 × 2001 = 29,992,500 + 15,007,500 = 45,000,000 microseconds = 45 sec. Hence option (C).
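The arithmetic above can be checked with a minimal sketch; all figures are taken from the question, and the variable names are only illustrative.

```python
NORMAL_US = 1        # time of a normal instruction (microseconds)
FAULT_US = 2001      # time of an instruction that causes a page fault (microseconds)

total_time_us = 60 * 1_000_000   # original run time: 60 seconds
faults = 15_000                  # page faults in the original run

# Instructions executed = faulting instructions + normal instructions
# that fill the remaining time.
normal_instrs = (total_time_us - faults * FAULT_US) // NORMAL_US
total_instrs = normal_instrs + faults            # 30,000,000 instructions

# Doubling memory doubles the mean interval between faults,
# so the fault count halves.
new_faults = faults // 2                         # 7,500 faults
new_time_us = (total_instrs - new_faults) * NORMAL_US + new_faults * FAULT_US

print(new_time_us / 1_000_000, "sec")            # 45.0 sec -> option (C)
```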
