
UGC-NET | UGC NET CS 2015 Dec - II | Question 27

Suppose that the number of instructions executed between page faults is directly proportional to the number of page frames allocated to a program, so if the available memory is doubled, the mean interval between page faults is also doubled. Further, assume a normal instruction takes one microsecond, but if a page fault occurs, it takes 2001 microseconds. If a program takes 60 sec to run, during which time it incurs 15,000 page faults, how long would it take to run if twice as much memory were available?

(A) 60 sec
(B) 30 sec
(C) 45 sec
(D) 10 sec

Answer: (C) 45 sec

Explanation: The original run takes 60 sec = 60,000,000 μs and incurs 15,000 page faults. Each faulting instruction takes 2,001 μs instead of 1 μs, so each fault adds 2,000 μs of overhead. Total fault overhead = 15,000 × 2,000 μs = 30 sec, leaving 30 sec of pure instruction execution. Doubling the memory doubles the mean interval between faults, so the fault count halves to 7,500, giving an overhead of 7,500 × 2,000 μs = 15 sec. The instruction-execution time is unchanged, so the new run time is 30 + 15 = 45 sec.
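The arithmetic above can be checked with a short script. This is only an illustrative sketch; the variable names are chosen here and are not part of the original question.

```python
# Sketch of the calculation above; all input values come from the question statement.
total_time_us = 60 * 1_000_000   # original run time: 60 sec in microseconds
faults = 15_000                  # page faults during the original run
fault_cost_us = 2001             # time taken by an instruction that page-faults
normal_cost_us = 1               # time taken by a normal instruction

# Overhead each fault adds beyond a normal instruction.
overhead_per_fault = fault_cost_us - normal_cost_us            # 2,000 us

# Pure instruction-execution time once fault overhead is removed.
exec_time_us = total_time_us - faults * overhead_per_fault     # 30,000,000 us

# Doubling memory doubles the interval between faults, halving the fault count.
new_faults = faults // 2                                       # 7,500
new_total_us = exec_time_us + new_faults * overhead_per_fault  # 45,000,000 us

print(new_total_us / 1_000_000, "sec")  # -> 45.0 sec
```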
