A persistent multiset that is built up sequentially and processed all at once

Imagine your system wants to communicate with another system that isn't built for an event-driven architecture, so you cannot forward each request immediately. You need something to temporarily buffer and accumulate all requests. Introducing persistent-bag. Its behaviour is as follows:

Over 15 minutes, several jobs/requests are accumulated in the application; after this 15-minute window, they are all emitted at once. At the same time, the next window is opened and filled.

Why bag? Well, I felt that the abstract data type multiset describes the intended behaviour best: the jobs/requests in the multiset (an alternative name for a bag) can have duplicates (hence the multi-), they are grouped by the time window, but have no index.

Why persistent? To store the values within the bag, we use a MySQL database. For services running on AWS this has the advantage that a managed instance can be created easily.

Use it today

persistent-bag can easily be installed via npm:

npm install persistent-bag --save

Obviously, a MySQL database needs to be available. On first instantiation of the object, the required table is created in the provided MySQL database if necessary.
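
Instantiation could look roughly like the following sketch. The connection option names (host, user, password, database) are assumptions for illustration only; the actual constructor options are documented in the package's README.

var PersistentBag = require('persistent-bag');

// Sketch only: the option names below are assumptions, not the confirmed API.
var bag = new PersistentBag({
  host: '127.0.0.1',
  user: 'root',
  password: 'secret',
  database: 'bag'
});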

Then items can be .add()ed during the runtime of any number of applications:
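A minimal sketch of what adding an item might look like, assuming a Node-style callback; the payload shape is arbitrary and the callback signature is an assumption:

// Add a job/request to the currently open 15-minute window.
// Payload shape and callback signature are assumptions for illustration.
bag.add({ type: 'email', to: 'user@example.com' }, function (err, id) {
  if (err) throw err;
  console.log('added item', id);
});
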

Working with the emitted, aggregated items every 15 minutes is done like in kue for redis by subscribing to .process():
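
A sketch of a consumer, modelled on kue's .process() style mentioned above; the exact callback parameters (an array of the accumulated items and a done callback) are assumptions:

// Called once per window with everything that was added during that window.
// The parameter list is an assumption modelled on kue's .process().
bag.process(function (items, done) {
  items.forEach(function (item) {
    // Forward the accumulated requests to the downstream system here.
    console.log('processing', item);
  });
  done();
});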
