If you look at the docs, capacity is defined as: “The upper bound on the number of elements that may be stored in this queue.”
Yet, if you push 6 elements into a queue with capacity 3, none of them will be lost. Instead, the enqueue operation simply blocks: the extra data is held back and fed into the queue as soon as other elements are dequeued (no need to re-run the enqueue op to re-feed it).
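This blocking behavior can be illustrated with an analogy (not TensorFlow's actual API): Python's standard `queue.Queue` with a `maxsize` behaves the same way, in that `put()` blocks on a full queue rather than dropping data. A minimal sketch:

```python
import queue
import threading

# A bounded queue of capacity 3, analogous to a TensorFlow
# FIFOQueue created with capacity=3.
q = queue.Queue(maxsize=3)

def producer():
    for i in range(6):  # push 6 items into a queue of capacity 3
        q.put(i)        # blocks once 3 items are buffered

t = threading.Thread(target=producer)
t.start()

# Draining the queue unblocks the producer; all 6 items
# arrive in order and none are lost.
received = [q.get() for _ in range(6)]
t.join()

print(received)  # -> [0, 1, 2, 3, 4, 5]
```

The producer never fails or drops an element; it just waits whenever the queue is full, which matches the behavior you observed.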
So I don’t really know for sure: my best guess is that it’s about memory handling, i.e. capacity tells TensorFlow how much memory to keep ready for the queue. But take this with a grain of salt.