I’m glad you liked it and found it helpful!
In this case the “API box” refers to the timer module, which is part of Node.js itself, in other words a “global API”. When, for example, setTimeout is called, a timer is scheduled (a Timeout object is created internally) and the event loop keeps running as long as the timer is active.
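To make that concrete, here is a small sketch you can run with Node.js (the comments state what I'd expect, not guarantees about internals): an active timer keeps the process, and thus the event loop, alive until it fires, and the value setTimeout returns is a regular Timeout object.

```javascript
// An active timer keeps the Node.js process alive: the process won't
// exit until this callback has fired (or the timer is cleared).
const timeout = setTimeout(() => {
  console.log('fired after 100 ms');
}, 100);

// setTimeout returns a Timeout object, which also exposes methods
// like unref() to tell the event loop not to wait for this timer.
console.log(timeout.constructor.name); // "Timeout"
console.log(typeof timeout.unref);     // "function"
```

If you comment the callback's body and call timeout.unref() instead, the process exits immediately, which shows it really is the pending timer holding the loop open.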
I’ve never dug that deep into the implementation, so I’m not 100% sure, but since the timer is just an object, I believe it (and related information like the callback, which is just a function) is stored in memory, i.e. on the memory heap.
The implementation of
setTimeout interacts with libuv, the C library that actually implements the event loop and the thread pool, and thus how callbacks are scheduled. This means that non-blocking I/O operations can run on separate threads, but we can’t control or observe that directly; the runtime and its dependencies handle it.
So this behavior is AWS Lambda specific, because Lambda keeps the execution context (including the memory heap) alive for some time as an optimization. It “freezes” the execution context after the Lambda function finishes and may or may not “thaw” it when the function runs again. This won’t happen when you run it locally on your laptop, or if you kill the Node.js process while waiting for the callback to fire.
How exactly AWS Lambda optimizes this used to be proprietary, but AWS recently open sourced Firecracker, the virtualization technology that powers Lambda, so we can probably now get a better understanding of how it works.