Published in Geek Culture

Authentication + Authorization + Encryption

Confluent Platform: Kafka Security

JAAS CONFIGURATION + ACL + SECURE SOCKETS LAYER

Confluent Platform: Kafka Security Architecture

Scope

This article helps beginner and intermediate readers gain insight into the Kafka security architecture. It covers the following concepts with hands-on implementation; the implementation code is available on GitHub.

a. Simple Authentication and Security Layer (SASL) framework
b. SASL PLAIN mechanism with Transport Layer Security (TLS) for encryption
c. Setting a SASL ACL on the Zookeeper node
d. JAAS configuration with a KafkaServer section to provide SASL configuration options for the broker
e. JAAS configuration with a Client section to authenticate the SASL connection with Zookeeper
f. JAAS configuration for producers and consumers to connect with the broker
g. Enabling SASL_SSL listeners and advertised listeners
h. Client connections over TLS (producer and consumer) through the Kafka CLI and a Java Spring Boot application
i. KeyStore, TrustStore, cacerts, Certificate Authority (CA)

Prerequisite (To try this exercise)

a. Confluent Platform v7.1.1
b. Ubuntu v20.04 or any Linux distro of your choice
c. OpenSSL and keytool
d. Java v1.8 or higher
e. Spring Boot Framework

How are Authentication and Authorization enabled in Kafka?

Authentication

For a Kafka broker to communicate with other broker nodes, Zookeeper nodes, producers, and consumers, a JAAS configuration is put in place with a JAAS login module and authentication credentials.
In a JAAS configuration file, the KafkaServer, Client, and KafkaClient sections hold the credentials used to authenticate against the Kafka broker, Zookeeper, and (for producers/consumers) the broker, respectively.
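As a sketch, the overall shape of the broker-side JAAS file looks like the following. The section names (KafkaServer, Client) are fixed by Kafka; the login modules, usernames, and passwords shown here are placeholders:

```
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="admin-secret"
    user_admin="admin-secret";
};

Client {
    org.apache.zookeeper.server.auth.DigestLoginModule required
    username="zkclient"
    password="zkclient-secret";
};
```

The KafkaClient section lives in a separate JAAS file (or inline `sasl.jaas.config` property) on the producer/consumer side, as shown later in this article.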

Authorization

Producers and consumers are authorized using ACLs (Access Control Lists).
To implement authorization, it is imperative to secure Zookeeper. Kafka stores all ACLs in Zookeeper, and only the admin has full access to create/describe/modify/delete them.
The kafka-acls tool can be used to grant a principal access to operations such as create/write/delete/describe/read on a topic.
Apache Kafka ships with an out-of-the-box, pluggable Authorizer implementation that uses Zookeeper to store all ACLs. It is important to set ACLs, because once an Authorizer is configured, access to resources is limited to super users. ACLs provide important authorization controls for your enterprise cluster data.
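As a hedged sketch of how the tool is invoked (topic, group, principal names, and the bootstrap address are illustrative; admin.properties is an assumed client config holding the admin credentials):

```shell
# Allow user "producer" to write to demo-topic
kafka-acls --bootstrap-server localhost:9093 \
  --command-config admin.properties \
  --add --allow-principal User:producer \
  --operation Write --operation Describe \
  --topic demo-topic

# Allow user "consumer" to read demo-topic as part of demo-group
kafka-acls --bootstrap-server localhost:9093 \
  --command-config admin.properties \
  --add --allow-principal User:consumer \
  --operation Read --topic demo-topic --group demo-group
```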

What is an Authorizer?

An authorizer is a server plugin used by Apache Kafka to authorize operations. An authorizer controls whether or not to authorize an operation based on the principal and the resource being accessed. The default Kafka authorizer implementation is AclAuthorizer (kafka.security.authorizer.AclAuthorizer), which was introduced in Apache Kafka 2.4/Confluent Platform 5.4.0.
The authorizer must be enabled in the Kafka broker's properties file.
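A minimal sketch of the relevant server.properties lines (the super-user name is a placeholder):

```properties
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
super.users=User:admin
allow.everyone.if.no.acl.found=false
```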

The sequence diagram below shows how ACLs are loaded.

Load Access Control List

Why encryption, and how is it enabled in Kafka?

Let's look at a use case:
If a producer or consumer is configured to use the SASL PLAIN mechanism on its own, the authentication credentials are sent over the wire in plaintext, with no TLS.
This could expose clear passwords in transit.
SASL/PLAIN should therefore only be used with TLS/SSL as the transport layer, so that clear passwords are never transmitted on the wire without encryption.

Steps to enable Encryption: Setting up SASL with SSL

Creating SSL Keys and Certificates

Create JAAS configuration : Authentication

Kafka Broker/Kafka Server

JAAS configuration: Kafka broker
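The broker's JAAS file might look like the following sketch (all users and passwords are placeholders; each `user_<name>` entry defines a credential the broker will accept). It is typically passed to the broker via `KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/kafka_server_jaas.conf"`:

```
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="admin-secret"
    user_admin="admin-secret"
    user_producer="producer-secret"
    user_consumer="consumer-secret";
};

Client {
    org.apache.zookeeper.server.auth.DigestLoginModule required
    username="zkclient"
    password="zkclient-secret";
};
```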

Zookeeper

JAAS configuration: Zookeeper
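A sketch of the Zookeeper-side JAAS file; the `user_zkclient` entry must match the Client section in the broker's JAAS file (names and secrets are placeholders). SASL also has to be enabled in zookeeper.properties, typically with `authProvider.sasl=org.apache.zookeeper.server.auth.SASLAuthenticationProvider` and `requireClientAuthScheme=sasl`:

```
Server {
    org.apache.zookeeper.server.auth.DigestLoginModule required
    user_zkclient="zkclient-secret";
};
```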

Producer

JAAS configuration: Producer
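In modern clients the KafkaClient JAAS section can be supplied inline through the `sasl.jaas.config` property. An illustrative producer.properties (user, password, and paths are placeholders):

```properties
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="producer" \
    password="producer-secret";
ssl.truststore.location=/path/to/kafka.client.truststore.jks
ssl.truststore.password=changeit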

Consumer

JAAS configuration: Consumer
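The consumer configuration mirrors the producer's, with its own credentials and a consumer group (which must be covered by a Read ACL). An illustrative consumer.properties:

```properties
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="consumer" \
    password="consumer-secret";
ssl.truststore.location=/path/to/kafka.client.truststore.jks
ssl.truststore.password=changeit
group.id=demo-group
```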

Create ACL : Authorization

ACL Definitions

Let’s Test the application end to end

The demonstration below includes both the Kafka CLI and the Spring Boot application.

Start Zookeeper

Zookeeper instance
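Zookeeper is started with its JAAS file exported, along these lines (paths are placeholders; `zookeeper-server-start` is the Confluent Platform launcher script):

```shell
export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/zookeeper_jaas.conf"
zookeeper-server-start /path/to/zookeeper.properties
```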

Start Kafka Broker

Kafka Broker instance
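The key SASL_SSL listener settings in server.properties look roughly like this (host, port, paths, and passwords are placeholders):

```properties
listeners=SASL_SSL://0.0.0.0:9093
advertised.listeners=SASL_SSL://localhost:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
ssl.keystore.location=/path/to/kafka.broker.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
```

The broker is then started with its JAAS file exported, e.g. `KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/kafka_server_jaas.conf" kafka-server-start /path/to/server.properties`.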

Spring Boot Application

Start Producer Application

Producer Application: Spring Boot
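With Spring Boot, the same client settings can be expressed in application.yml. A sketch of the producer side (bootstrap address, credentials, and paths are placeholders):

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9093
    properties:
      security.protocol: SASL_SSL
      sasl.mechanism: PLAIN
      sasl.jaas.config: >-
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="producer" password="producer-secret";
      ssl.truststore.location: /path/to/kafka.client.truststore.jks
      ssl.truststore.password: changeit
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
```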

Start Consumer Application

Consumer Application: Spring Boot
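The consumer application's application.yml is analogous, swapping in the consumer's credentials and adding the group and deserializers (values are placeholders; the same `ssl.truststore.*` settings as the producer also apply):

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9093
    properties:
      security.protocol: SASL_SSL
      sasl.mechanism: PLAIN
      sasl.jaas.config: >-
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="consumer" password="consumer-secret";
    consumer:
      group-id: demo-group
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
```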

Kafka CLI

Console Producer

console producer
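The console producer picks up the SASL_SSL settings from the client properties file shown earlier (topic name and port are illustrative):

```shell
kafka-console-producer --bootstrap-server localhost:9093 \
  --topic demo-topic \
  --producer.config producer.properties
```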

Console Consumer

console consumer
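Likewise for the console consumer, which authenticates with the consumer credentials and reads within the ACL-authorized group:

```shell
kafka-console-consumer --bootstrap-server localhost:9093 \
  --topic demo-topic --from-beginning \
  --consumer.config consumer.properties
```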

Conclusion

To summarize, we have seen how Kafka security can be implemented using the SASL mechanism and TLS. Authentication and authorization are implemented through JAAS configuration and ACLs, respectively. We created SSL keys and certificates and loaded them into a keystore and truststore. We also used both the Kafka CLI and a Java Spring Boot application so that producers and consumers send messages with authentication, authorization, and encryption in place.

Published on 19th June 2022


A button that says 'Get it on, Google Play', and if clicked it will lead you to the Google Play store
Ganesh Nagalingam

https://in.linkedin.com/in/ganeshnagalingam
