I could list several types of applications here that process arrays of numbers so large we can’t even pronounce them, such as:
garbage collectors, stack-tracing applications, log processors, big-data mining, or even simpler ones, such as feeds or messaging systems.
It’s also worth emphasizing that iterating through such a dataset doesn’t mean you have to keep it all in memory or send it all over the network. We may simply read through it and dispose of it afterwards or, as I said before, use lazy-loading strategies.
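As a minimal sketch of what I mean by “read through and dispose”, here is a hypothetical batching helper built on a JavaScript generator (the function name `inBatches` and the batch size are my own illustration, not from any specific library). Only one batch is held in memory at a time; once processed, it can be garbage-collected.

```javascript
// Hypothetical sketch: walk a large data source in fixed-size batches,
// so we never retain more than one batch in memory at a time.
function* inBatches(source, batchSize) {
  let batch = [];
  for (const item of source) {
    batch.push(item);
    if (batch.length === batchSize) {
      yield batch;   // hand the batch to the consumer...
      batch = [];    // ...and drop our reference so it can be collected
    }
  }
  if (batch.length > 0) yield batch; // flush the final partial batch
}

// Usage: process each batch, then let it go; nothing is accumulated.
let processed = 0;
for (const batch of inBatches([1, 2, 3, 4, 5, 6, 7], 3)) {
  processed += batch.length; // stand-in for real per-batch work
}
```

The same shape works for paginated APIs or file streams: swap the array for any iterable and the loop body for the real work.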
There are systems that are essentially based on an infinite loop (which doesn’t mean everything is looped over at once); instead, they process data in batches or via lazy loading. Also, these kinds of systems usually don’t use regular arrays, but rather more specific structures such as:
After all, any waste of resources matters in this case.
I apologize for the lack of clarity in not mentioning in which cases we might use such an approach, but my goal was simply to emphasize that there is a significant performance pitfall hidden very subtly in reducers, and many people are unaware of it.
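To make the pitfall concrete, here is a small sketch, assuming the reducers in question are JavaScript’s `Array.prototype.reduce` combined with an object spread (the data here is made up for illustration). Spreading the accumulator copies it on every iteration, so an apparently linear pass silently becomes quadratic work.

```javascript
// Hypothetical data: turn 1,000 numbers into a lookup object.
const items = Array.from({ length: 1000 }, (_, i) => i);

// Subtle O(n^2): every `{ ...acc }` copies the whole accumulator so far,
// so the total work grows quadratically with the array length.
const slow = items.reduce((acc, n) => ({ ...acc, [n]: n * 2 }), {});

// O(n): mutate the one accumulator instead of copying it each time.
const fast = items.reduce((acc, n) => {
  acc[n] = n * 2;
  return acc;
}, {});
```

Both produce the same object; only the copying behavior differs, and the gap widens dramatically as the dataset grows.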
But even in cases where we have smaller datasets, of a couple of thousand items, being accessed by a large number of users at once, the performance discrepancy can show up significantly.
Last but not least, I’d like to mention that there’s a threshold for everything. I don’t want to sound like a performance freak 😜, but if we have a small system used by a couple of hundred users, or datasets of a couple of dozen items, there’s no need to worry about performance, as it won’t make any noticeable difference!
The goal is to always evaluate carefully and be deliberate about big decisions, not trivial ones ☺️.
I hope it clarifies.
Sharing is caring!