There were three rounds, of which the first two were system-design based and the third was an engineering manager round.
Questions asked in the first round were as follows:
- Explain the complete architecture of your current project.
- Explain one challenge you faced while developing a feature and how you solved it.
- Difference between SQL and NoSQL databases.
- What is sharding?
- Since I had worked on Elasticsearch: design an Elasticsearch cluster, given the number of active shards, the number of instances, and the number of replicas.
- How is data actually stored in Elasticsearch?
- Design a per-client rate limiter: given that at most R requests are allowed from a client in T seconds, how would you design the system, and how would you handle the failure scenario when more requests are received? I explained this using a queue of request timestamps and a binary search on timestamp to count how many requests fell within the window when a new request arrived.
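The sliding-window idea above can be sketched in Python. This is my own minimal illustration, not code discussed in the interview; the class and parameter names are invented for clarity. A per-client deque of timestamps replaces the binary search, since evicting expired entries from the front achieves the same count:

```python
import time
from collections import deque

class SlidingWindowRateLimiter:
    """Allow at most max_requests per window_sec, tracked per client."""

    def __init__(self, max_requests, window_sec):
        self.max_requests = max_requests
        self.window_sec = window_sec
        self.timestamps = {}  # client_id -> deque of accepted request times

    def allow(self, client_id, now=None):
        """Return True if this request is within the client's quota."""
        now = time.monotonic() if now is None else now
        q = self.timestamps.setdefault(client_id, deque())
        # Evict timestamps that have fallen out of the window.
        while q and q[0] <= now - self.window_sec:
            q.popleft()
        if len(q) < self.max_requests:
            q.append(now)
            return True
        return False  # failure scenario: reject (or queue) the extra request
```

With R=2 and T=1, a third request inside the same second is rejected, and requests are accepted again once older timestamps age out of the window.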
Questions asked in the second round were somewhat similar to those in the first round and are listed below:
- Again, an explanation of my current project and a feature I had recently worked on.
- Benefits of storing data in Elasticsearch compared to databases.
- What is indexing and how is it implemented internally?
- Given two DBs A and B, each with a table T (columns C1, C2), where C1 is the primary key in both and C2 is additionally an indexed column in DB B. The query `SELECT C2 FROM T WHERE C1 = X` runs on both DBs. Which DB serves the data faster?
- Follow-up: for another query, say `SELECT * FROM T WHERE C1 = X AND C2 = Y`, which DB serves the data faster?
- Given two clouds, Azure and AWS, where data must be stored on AWS as a backup while all read and write operations happen on Azure only: how would you manage syncing the data?
a) First solution: update the AWS backup asynchronously by maintaining a queue of all write requests; whenever a worker thread is free, it writes the data to AWS.
b) What if so many requests arrive at once that the queue can't handle them? Then write a cron scheduler that triggers a Lambda on an hourly basis to sync data between the two clouds.
- Given a log file of all requests received on a single day, how would you calculate the maximum number of requests per second and the maximum number of active concurrent users?
a) My solution to the first problem: read the data from the log file and sort it by timestamp. After sorting, group the entries into one-second buckets (say 12:00:00-12:00:01) and take the maximum group size.
b) For the second problem, again sort the data and sweep through it: for every login do count++, and for every logout do count--. If count > max, update max_concurrent_users.
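The asynchronous backup-sync idea (solution a above) can be sketched as a queue drained by a worker thread. This is a hedged, in-process illustration only: the dict stands in for the AWS backup store, and the class and method names are my own, not an actual cloud SDK:

```python
import queue
import threading

class BackupSyncer:
    """Replays primary-side writes to a backup store asynchronously.

    `backup_store` is a plain dict here, standing in for the AWS side;
    in a real system the worker would call the cloud storage API instead.
    """

    def __init__(self, backup_store):
        self.backup_store = backup_store
        self.pending = queue.Queue()
        worker = threading.Thread(target=self._drain, daemon=True)
        worker.start()

    def record_write(self, key, value):
        # The primary (Azure) write happens elsewhere; we only enqueue
        # the change so the worker can mirror it to the backup.
        self.pending.put((key, value))

    def _drain(self):
        while True:
            key, value = self.pending.get()
            self.backup_store[key] = value  # mirror to backup
            self.pending.task_done()

    def flush(self):
        """Block until every queued write has reached the backup."""
        self.pending.join()
```

The queue decouples request latency from backup latency; the hourly cron/Lambda reconciliation in solution b then covers anything the queue drops under overload.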
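Both log-file answers above can be sketched in a few lines of Python. This is my own illustration under simplifying assumptions: timestamps are integer epoch seconds, and login/logout events are pre-parsed into `(timestamp, kind)` tuples:

```python
from collections import defaultdict

def max_requests_per_second(timestamps):
    """Bucket request timestamps (epoch seconds) and return the largest bucket."""
    counts = defaultdict(int)
    for t in timestamps:
        counts[t] += 1
    return max(counts.values(), default=0)

def max_concurrent_users(events):
    """Sweep sorted (timestamp, kind) events, kind in {'login', 'logout'}.

    count++ on login, count-- on logout; track the peak along the way.
    """
    current = peak = 0
    for _, kind in sorted(events):
        if kind == "login":
            current += 1
            peak = max(peak, current)
        else:
            current -= 1
    return peak
```

Note that sorting the tuples processes logins before logouts at the same timestamp ('login' < 'logout' lexicographically), which counts a same-second handover as overlapping, a conservative choice.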
The third round was mainly a discussion of my current project.