I feel like the first example can be solved more elegantly by doing something like
l = [0] * 10;
i = 0;
if now() > l[i]:
    l[i] = now() + 1 minute
    i += 1
    if i == l.size:
        i = 0
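For concreteness, here's one way the circular-array sketch could look as runnable Python. The class and parameter names are my own, and I pass `now` explicitly (defaulting to a monotonic clock) so it's easy to test; the sketch itself doesn't specify any of this.

```python
import time

class CircularRateLimiter:
    """Circular-array limiter: one slot per allowed call, each slot
    holding the time at which it frees up again. With 10 slots and a
    60-second window this admits at most 10 calls per minute."""

    def __init__(self, slots=10, window=60.0):
        self.expiries = [0.0] * slots  # the `l = [0] * 10` array
        self.i = 0                     # current slot index
        self.window = window           # seconds a slot stays occupied

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        if now > self.expiries[self.i]:       # slot has expired: take it
            self.expiries[self.i] = now + self.window
            self.i += 1
            if self.i == len(self.expiries):  # wrap around
                self.i = 0
            return True
        return False
```

Since the array has a fixed size, there are no allocations after construction, which is the main appeal over a linked list.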
It’s roughly the same as in the post. The main difference is using a circular array to keep track of the times instead of a linked list:
l = new Queue()
while l.peek() < now():
    l.pop()
if l.len() < 10:
    l.push(now() + 10 minutes)
In fact, you can replace the while loop with an if statement and it will still work.
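Here's a minimal runnable sketch of the queue version using `collections.deque`; the function and parameter names are illustrative, and the explicit `now` argument is my addition for testability.

```python
import time
from collections import deque

def make_queue_limiter(limit=10, window=600.0):
    """Queue-based limiter: store one expiry timestamp per allowed
    call, and drop expired timestamps on each check."""
    q = deque()

    def allow(now=None):
        now = time.monotonic() if now is None else now
        while q and q[0] < now:    # drop timestamps that have expired
            q.popleft()
        if len(q) < limit:         # room left in the window
            q.append(now + window) # this call occupies a slot
            return True
        return False

    return allow
```

Because the queue never grows past `limit` and each allowed call adds exactly one entry, popping a single expired entry per check (an `if` instead of the `while`) does indeed keep the length bounded, as noted above.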
You’re only allowing one call per minute; you need to allow N calls per minute. But the spirit is the same.
There are n calls per minute.
Oh I see, n = 10 in this case.
As an interviewer, I definitely wouldn’t provide the “expected running time” hint until I had seen what the candidate came up with on their own.
Sounds okay for pre-screening coding tests, but doesn’t seem to be enough for on-site interviews at high-tier companies. What do you think?
I would say it still applies for high-tier companies. I’ve interviewed with Google twice and have gotten questions like “merge two sorted arrays” and “write a program to play tic-tac-toe”. Both times I got offers. Of course there were also questions out of left field like “solve a maze with 100 servers”.
I think it may have been a Lobster that gave me this insight, many moons ago: since there’s usually an upper limit on how many elements can be stored, in-memory binary trees essentially have constant-time access and update. Ruling them out as “too slow” may be a bit pessimistic.
If you take things in a very literal sense, sure, but that’s usually not what people mean when they talk about big-O. By that definition, any algorithm that runs in finite memory is constant-time.
The formal definition of big-O describes what happens as the size of a data structure approaches infinity. In this context, people are usually talking about an idealized computer with infinite memory.
Sure, I know that, and my comment was out of scope for “give the expected answer to this interview question”, but you’re presumably interviewing to work on a non-ideal computer with finite memory.
It’s just as important to be able to figure out when worst-case complexity is rendered irrelevant by context and usage patterns. For instance, in the rate limiter, N is the constant upper bound for n, making everything O(1) anyway - you may even find that N=1 everywhere it’s used.
It would be interesting to attach this to some sort of chatbot, then I can be constantly interviewing and getting job offers.