Published in Oracle Developers

Asynchronous processing using Worker applications on Oracle Application Container Cloud

This blog demonstrates Worker applications on Oracle Cloud with the help of an example which makes use of

  1. Redis as a job queue (deployed on Oracle Cloud Infrastructure Classic), and
  2. a Java-based Worker application (deployed on Oracle Application Container Cloud) which processes items from the queue

Worker applications?

Traditionally, any service deployed to Oracle Application Container Cloud was required to bind to $PORT and was given a public URL. But this is not actually required for all kinds of workloads/app types, so you were forced to do things like expose a REST endpoint for a batch processing app (I know I have ;-) )

Worker applications were built to tackle these kinds of problems (and more), and they differ from traditional ACCS applications.

They will continue to enjoy all the other capabilities of the ACCS platform such as service bindings etc.

More in the documentation

When should I use them?

  • Asynchronous processing e.g. a consumer application processing events from Oracle Event Hub Cloud Service (Kafka) topics

In the process, you can use your Worker apps to

  • Invoke traditional/regular web apps deployed on Oracle Application Container Cloud

Going forward, Worker applications will be enriched to closely integrate with other public (and fellow worker) applications as well

The sample application for this blog demonstrates how you can use Worker applications for asynchronous processing

Sample application

The code is available on GitHub.


It has two components

  • Redis — serves as the (work/job) queue which helps decouple the producers from the workers. The list data structure and pub-sub channel capabilities in Redis have been leveraged
  • Worker application — a Java application which consumes items from the queue and publishes a processing notification for each one

From a platform perspective

  • Redis was deployed on Oracle Cloud Infrastructure Classic using the Bitnami integration for Oracle Cloud Marketplace (read more here)
High level system overview


  • Once deployed, the worker app establishes a (blocking) connection to Redis, where it listens for messages in the list/queue using BRPOP

More in the Test section


Pretty straightforward — the logic consists of two classes

  • — it implements Runnable and uses the Jedis client for interacting with Redis
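Since the repository's class names are not reproduced here, the following is a minimal sketch of what such a Runnable worker can look like. To keep it self-contained and runnable without a live Redis instance, a local BlockingQueue stands in for the Redis list: against Redis, the blocking take() call would be Jedis's brpop(0, "workQ"), and the notification would be a jedis.publish(...) to the pub-sub channel. The QueueWorker name, queue name, and message format are illustrative, not taken from the repository.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

// Illustrative stand-in for the worker's Runnable. Against a real Redis
// instance, take() below would be jedis.brpop(0, "workQ") and the
// notification list would be a jedis.publish(...) on the pub-sub channel.
public class QueueWorker implements Runnable {
    private final BlockingQueue<String> workQ;      // stands in for the Redis list
    private final List<String> notifications;       // stands in for the pub-sub channel
    private final String nodeId;                    // e.g. worker.1

    public QueueWorker(BlockingQueue<String> workQ, List<String> notifications, String nodeId) {
        this.workQ = workQ;
        this.notifications = notifications;
        this.nodeId = nodeId;
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                String item = workQ.take();          // blocks, like BRPOP with timeout 0
                notifications.add("processed " + item + " on " + nodeId);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();      // exit cleanly on shutdown
        }
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> q = new LinkedBlockingQueue<>();
        List<String> out = new CopyOnWriteArrayList<>();
        Thread worker = new Thread(new QueueWorker(q, out, "worker.1"));
        worker.start();
        q.put("test1");                              // producer side: the LPUSH equivalent
        Thread.sleep(200);                           // give the worker time to drain the item
        worker.interrupt();
        worker.join();
        System.out.println(out.get(0));
    }
}
```

The blocking consume-then-notify loop is the whole pattern; swapping the local queue for Jedis calls changes the transport, not the structure.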

Build and deployment


The build process will create the deployable application artifact in the target directory

Push to cloud

With Oracle Application Container Cloud, you have multiple options for deploying your applications, ranging from the console/UI and REST API to Oracle Developer Cloud (CI/CD)

This blog will make use of the PSM CLI — a powerful command line interface for managing Oracle Cloud services. Start with download and setup of PSM CLI on your machine (using psm setup) — details here


  • cd <code_directory>

Once executed, an asynchronous process is kicked off and the CLI returns its Job ID for you to track the application creation

Push to cloud using PSM CLI

Check your application

Your application should be deployed on Oracle Application Container Cloud (notice the highlighted application type i.e. worker)

Worker Application on Oracle Application Container Cloud

Testing the application

Subscribe to the Redis pub-sub channel

In order to receive processing notifications, you would need to subscribe to that (Redis) channel. We will use the redis-cli for this

In case you are new to redis-cli, you can refer to its documentation

  • point the CLI to your Redis instance redis-cli -h redis_host -p redis_port

Push items to the Redis work queue

  • (in a new window/shell) use redis-cli to add items to the list (the work queue) lpush workQ "test1" (where workQ is the name of the list)

Notice the worker app node (instance) identifier (worker.1 in this case) in the message

Channel notification

Keep going…

Send a few more messages before moving on to the next step — you will see the response in the pub-sub notification shell

lpush workQ "test2"
lpush workQ "test3"
lpush workQ "test4"
lpush workQ "test5"
lpush workQ "test6"

Scale out

Let’s scale out (horizontally) to two instances to demonstrate the worker load balancing feature i.e. the messages from the Redis list (work queue) will be processed by individual worker app instances

Scaling out to two instances

Publish messages to the Redis list

lpush workQ "test7"
lpush workQ "test8"
lpush workQ "test9"
lpush workQ "test10"
lpush workQ "test11"
lpush workQ "test12"

Worker load balancing in action

The node identifier (worker.n) has been highlighted — notice that a different instance handles the processing, hence the load is balanced
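The load balancing described above can be simulated locally under the same stand-in assumption as before (a BlockingQueue in place of the Redis list, blocking take() in place of BRPOP): two worker threads block on one shared queue, and each message is delivered to exactly one of them. All names here are illustrative.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Two workers draining one shared queue: every message is handed to
// exactly one worker, mirroring how scaled-out instances share the load.
public class ScaleOutDemo {
    public static void main(String[] args) throws Exception {
        BlockingQueue<String> workQ = new LinkedBlockingQueue<>();
        ConcurrentMap<String, String> processedBy = new ConcurrentHashMap<>();
        ExecutorService pool = Executors.newFixedThreadPool(2);

        for (int n = 1; n <= 2; n++) {
            final String nodeId = "worker." + n;
            pool.submit(() -> {
                try {
                    while (true) {
                        String item = workQ.take();    // blocks, like BRPOP
                        if (item.equals("STOP")) {
                            return;                    // poison pill: shut this worker down
                        }
                        processedBy.put(item, nodeId); // each item lands on one node only
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }

        for (int i = 7; i <= 12; i++) {
            workQ.put("test" + i);                     // producer side: the LPUSH equivalent
        }
        workQ.put("STOP");                             // one poison pill per worker
        workQ.put("STOP");
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);

        // Six messages, each processed exactly once across the two workers
        System.out.println(processedBy.size());
    }
}
```

Whether a given message lands on worker.1 or worker.2 depends on which happens to be blocked on the queue first, which is exactly the nondeterministic distribution visible in the channel notifications above.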

Don’t forget to…


The views expressed in this post are my own and do not necessarily reflect the views of Oracle.



Aggregation of articles from Oracle engineers, Groundbreaker Ambassadors, Oracle ACEs, and Java Champions on all things Oracle technology. The views expressed are those of the authors and not necessarily of Oracle.

Abhishek Gupta

Principal Developer Advocate at AWS | I ❤️ Databases, Go, Kubernetes