Secure Kafka with Keycloak: SASL OAuth Bearer

This post walks through a step-by-step configuration of the Strimzi operator (Apache Kafka) on OpenShift: exposing an external listener as an OpenShift route over TLS and securing the Kafka cluster with Keycloak using SASL OAUTHBEARER.

If you don’t want to do all of this configuration yourself, an easier option is OpenShift Streams for Apache Kafka.

Let’s begin the journey of securing Kafka. If you aren’t familiar with SASL for OAuth:

SASL OAuth Bearer: [Simple Authentication and Security Layer (SASL) Mechanisms for OAuth]

OAuth 2.0 Protocol Flow


The setup: Apache Kafka deployed using the Strimzi operator and Keycloak deployed using the Keycloak operator, both on OpenShift, plus two Quarkus-based clients, a producer and a consumer application, running on an external local system.

  • During provisioning of the Kafka cluster, the broker fetches the JWKS certificates from Keycloak.
  • Configure a service account in Keycloak for the producer/consumer.
  • Using the service account, the client sends a request to Keycloak (the OpenID Connect endpoint) to fetch a token.
  • The producer/consumer sends the token when producing or consuming messages.
  • The broker verifies the token, authenticates the client, and allows it to produce/consume.
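The token fetch in the steps above is a standard OAuth 2.0 client-credentials call against Keycloak’s OpenID Connect token endpoint. Here is a minimal sketch of building that request with the JDK’s HttpClient API; the host, realm, and client credentials are placeholders, and actually sending the request requires a live Keycloak:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class TokenRequestSketch {
    public static HttpRequest buildTokenRequest(String keycloakHost, String realm,
                                                String clientId, String clientSecret) {
        // Keycloak's OpenID Connect token endpoint for the given realm
        URI tokenEndpoint = URI.create(
            keycloakHost + "/auth/realms/" + realm + "/protocol/openid-connect/token");
        // client_credentials grant: the service account authenticates with its id/secret
        String form = "grant_type=client_credentials"
            + "&client_id=" + clientId
            + "&client_secret=" + clientSecret;
        return HttpRequest.newBuilder(tokenEndpoint)
            .header("Content-Type", "application/x-www-form-urlencoded")
            .POST(HttpRequest.BodyPublishers.ofString(form))
            .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildTokenRequest(
            "https://keycloak.example.com", "demo",
            "kafka-client-service-acc", "secret");
        System.out.println(req.method() + " " + req.uri());
    }
}
```

In the real setup the Strimzi OAuth client library performs this exchange for you; the sketch only shows what happens on the wire.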


Installation: OLM: One-click install

  • Keycloak Operator
  • Strimzi Operator

Strimzi Local Installation

git clone
cd strimzi-kafka-operator

Update the namespace in which you want to deploy the operator:

sed -i '' 's/namespace: .*/namespace: kafka-demo/' install/cluster-operator/*RoleBinding*.yaml

Login & Create the namespace/project (kafka-demo)

oc login <cluster>
oc new-project <projectname> //kafka-demo as project name

Create the cluster operator in the namespace kafka-demo

kubectl apply -f install/cluster-operator -n kafka-demo

Check the status of the operator pod

oc get pods -n kafka-demo

Keycloak Local Installation

git clone
cd keycloak-operator

Login & Create the namespace/project

oc login <cluster>
oc new-project <projectname> //kc
  • Run make cluster/prepare
  • Run kubectl apply -f deploy/operator.yaml
  • Run kubectl apply -f deploy/examples/keycloak/keycloak.yaml

Once Keycloak is up and running, create a realm: demo

Creating Kafka Cluster

Whether you used OLM or followed the manual steps, we can now create the instance.

git clone kafka-sasl-oauth-keycloak/CR

Update the authentication spec in the Kafka Custom Resource. Open my-cluster.yaml in your preferred editor.

Configure the keycloak end-points:

  • Fetch the Keycloak route

export NAMESPACE=<>

export KEYCLOAK_ROUTE=$(oc get route keycloak -n $NAMESPACE --template='{{ .spec.host }}')
  • Fetch the keycloak route SSL certificate.
echo "" | openssl s_client -servername $KEYCLOAK_ROUTE -connect $KEYCLOAK_ROUTE:443 -prexit 2>/dev/null| openssl x509 -outform PEM > keycloak.crt
  • Create a secret with this Keycloak certificate
oc create secret generic ca-keycloak --from-file=keycloak.crt -n kafka-demo
  • Create a truststore that can be used by the Producer/Consumer application
keytool -keystore keycloak.jks -alias root -import -file keycloak.crt -storepass password -noprompt

Authentication Listener Spec:

  • update <keycloak-host>
  • realm: demo
    - name: external
      port: 9094
      tls: true
      type: route
      authentication:
        type: oauth
        validIssuerUri: >-
          https://<keycloak-host>/auth/realms/demo
        jwksEndpointUri: >-
          https://<keycloak-host>/auth/realms/demo/protocol/openid-connect/certs
        userNameClaim: preferred_username
        checkIssuer: true
        checkAccessTokenType: true
        accessTokenIsJwt: true
        enableOauthBearer: true
        tlsTrustedCertificates:
          - secretName: ca-keycloak
            certificate: keycloak.crt

Create the Kafka cluster in the same or a different namespace. In this case, we use the namespace kafka-demo.

oc create -f keycloak-integrations/kafka-sasl-oauth-keycloak/CR/my-cluster.yaml

Accessing the Cluster using routes

$ oc get -n kafka-demo routes my-cluster-kafka-bootstrap -o=jsonpath='{.status.ingress[0].host}{"\n"}'
// output: Kafka broker bootstrap URL, e.g. my-cluster-kafka-bootstrap-kafka-demo.<openshift-cluster-domain-url>

Fetch the ca-cert from the kafka-demo namespace

oc extract -n kafka-demo secret/my-cluster-cluster-ca-cert --keys=ca.crt

Create a truststore for the Producer/Consumer App

keytool -import -trustcacerts -alias root -file ca.crt -keystore truststore.jks -storepass password -noprompt


cd kafka-sasl-oauth-keycloak/producer or cd kafka-sasl-oauth-keycloak/consumer

Open both applications in your favourite editor as Maven-based projects.

Service Account in Keycloak

Create a service account in Keycloak: kafka-client-service-acc (or any other name). Make sure to enable the flag “Service Accounts Enabled: ON”.

Copy the Client ID and credentials.

Export the following properties, or update the corresponding values in the application configuration:

export KEYCLOAK_HOST=https://<keycloak-host>
export KAFKA_BOOSTRAP_HOST=my-cluster-kafka-bootstrap-kafka-demo.apps.<domain>:443
export KEYCLOAK_CLIENT_ID=kafka-client-service-acc
export KEYCLOAK_REALM=demo
export KEYCLOAK_TRUSTSTORE=keycloak.jks
export KAFKA_TRUSTSTORE=truststore.jks
export KAFKA_TOPIC=new-topic
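The exported values above map onto the Kafka client’s SASL/OAuth settings. The following is a minimal sketch of the corresponding client properties, assuming Strimzi’s OAuth client library (io.strimzi.kafka.oauth) is on the classpath; hosts, secrets, and file paths are placeholders taken from this post:

```java
import java.util.Properties;

public class OauthClientConfigSketch {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers",
            "my-cluster-kafka-bootstrap-kafka-demo.apps.<domain>:443");
        props.put("security.protocol", "SASL_SSL");   // TLS route + SASL authentication
        props.put("sasl.mechanism", "OAUTHBEARER");
        // Strimzi's callback handler fetches the token from Keycloak on login
        props.put("sasl.login.callback.handler.class",
            "io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required"
            + " oauth.client.id=\"kafka-client-service-acc\""
            + " oauth.client.secret=\"<client-secret>\""
            + " oauth.token.endpoint.uri=\"https://<keycloak-host>/auth/realms/demo/protocol/openid-connect/token\""
            + " oauth.ssl.truststore.location=\"keycloak.jks\""
            + " oauth.ssl.truststore.password=\"password\";");
        // Trust the Kafka route's cluster CA (truststore.jks created earlier)
        props.put("ssl.truststore.location", "truststore.jks");
        props.put("ssl.truststore.password", "password");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("sasl.mechanism"));
    }
}
```

These are the same settings the Quarkus apps read from the exported environment variables.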

Run the producer & consumer app

./mvnw quarkus:dev

Congratulations!! If you can see the above output, you are producing/consuming messages secured by SASL over the OAuth mechanism.

Security Isolation Concern

Our setup works great, but any service account created in that Keycloak realm (demo) can authenticate against your Kafka cluster.

How do we restrict authentication to a specific set of service accounts?

A new field called customClaimCheck was recently added to the authentication listener spec in the Strimzi operator. It lets you define a check against a custom claim in the token, which is evaluated during the authentication flow.

e.g.: customClaimCheck: "@.userId == '123'"

Let’s update the CR and re-deploy the cluster.

    - name: external
      port: 9094
      tls: true
      type: route
      authentication:
        type: oauth
        clientId: kafka-broker
        clientSecret:
          secretName: clientsecret
          key: kafka-broker
        validIssuerUri: >-
          https://<keycloak-host>/auth/realms/demo
        jwksEndpointUri: >-
          https://<keycloak-host>/auth/realms/demo/protocol/openid-connect/certs
        userNameClaim: preferred_username
        checkIssuer: true
        checkAccessTokenType: true
        checkAudience: false
        accessTokenIsJwt: true
        enableOauthBearer: true
        customClaimCheck: '''kafka-user'' in @.realm_access.roles'
        tlsTrustedCertificates:
          - secretName: ca-truststore
            certificate: keycloak.crt

Now try to run the producer or consumer application. You will get this error:

org.apache.kafka.common.errors.SaslAuthenticationException: Authentication failed due to an invalid token: io.strimzi.kafka.oauth.validator.TokenValidationException: Token validation failed: Custom claim check failed

Authentication failed! To authenticate successfully, we need to add a realm role to our service account configuration.

Create a role in the Keycloak demo realm.

Now add the realm role to our service account.

If you inspect the token claims, you can see the role “kafka-user” added to the roles array in the realm_access object:

"realm_access": {
  "roles": ["kafka-user", ".."]
}
The custom claim check rule will then be: 'kafka-user' in @.realm_access.roles
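As an illustration only, you can mimic the broker-side check by decoding the token’s payload segment and looking for the role. This naive string search is a stand-in for Strimzi’s actual JsonPath-style evaluation, and the JWT here is a hand-built fake:

```java
import java.util.Base64;

public class ClaimCheckSketch {
    // Decode the JWT payload (second dot-separated segment, base64url-encoded)
    // and check whether "kafka-user" appears after the realm_access claim.
    public static boolean hasKafkaUserRole(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length < 2) return false;
        String payload = new String(Base64.getUrlDecoder().decode(parts[1]));
        int i = payload.indexOf("\"realm_access\"");
        return i >= 0 && payload.indexOf("\"kafka-user\"", i) > i;
    }

    public static void main(String[] args) {
        // Build a fake token whose payload matches the claim structure above
        String payload = "{\"realm_access\":{\"roles\":[\"kafka-user\"]}}";
        String fakeJwt = "header." + Base64.getUrlEncoder().withoutPadding()
            .encodeToString(payload.getBytes()) + ".sig";
        System.out.println(hasKafkaUserRole(fakeJwt));
    }
}
```

The broker performs this validation itself during the SASL handshake; clients never need to implement it.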

Let’s run the producer app again.

Great: we can now authenticate, and any other service account without the “kafka-user” role won’t be able to.


We deployed a secure Kafka cluster using Keycloak with the help of SASL over the OAuth mechanism. Access is easier to manage through the Keycloak admin interface. We also addressed the advanced security challenge of restricting which service accounts can authenticate against a Kafka instance.

If you like this post, give it a Cheer!!!

Follow the Collection: Keycloak for learning more…

Happy Secure Coding ❤



