
TL;DR

How do you update your data model in Apache Flink? If you run long-lived Flink jobs whose schema changes over time, you will sooner or later need to update the data model you use to write data to managed state. As of now, Flink does not support this out of the box. This blog post presents a solution to the problem that uses Apache Avro to serialize and deserialize your state data.

We’ll show you how to extend your own managed state serializer, how to make your data support schema evolution with Avro, and how to update your Flink job so that all existing state is migrated successfully. …
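To give a taste of the Avro side of the approach, here is a minimal sketch of how Avro schema resolution handles an evolved data model. The `Person` record, its fields, and the default value are hypothetical examples, not the schemas from the actual job; the point is that a record written with the old writer schema can still be read after the job ships with a new reader schema, as long as every added field declares a default.

```scala
import java.io.ByteArrayOutputStream

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

object AvroEvolutionSketch {

  // Schema the running job used to write state (hypothetical example).
  val writerSchema: Schema = new Schema.Parser().parse(
    """{"type": "record", "name": "Person", "fields": [
      |  {"name": "name", "type": "string"}
      |]}""".stripMargin)

  // Evolved schema of the updated job: the new field carries a default,
  // so bytes written with the old schema can still be decoded.
  val readerSchema: Schema = new Schema.Parser().parse(
    """{"type": "record", "name": "Person", "fields": [
      |  {"name": "name", "type": "string"},
      |  {"name": "age", "type": "int", "default": -1}
      |]}""".stripMargin)

  def serialize(record: GenericRecord): Array[Byte] = {
    val out = new ByteArrayOutputStream()
    val encoder = EncoderFactory.get().binaryEncoder(out, null)
    new GenericDatumWriter[GenericRecord](writerSchema).write(record, encoder)
    encoder.flush()
    out.toByteArray
  }

  def deserialize(bytes: Array[Byte]): GenericRecord = {
    val decoder = DecoderFactory.get().binaryDecoder(bytes, null)
    // Passing both schemas lets Avro resolve old data against the new model.
    new GenericDatumReader[GenericRecord](writerSchema, readerSchema).read(null, decoder)
  }

  def main(args: Array[String]): Unit = {
    val old = new GenericData.Record(writerSchema)
    old.put("name", "Alice")
    val migrated = deserialize(serialize(old))
    println(migrated) // {"name": "Alice", "age": -1}
  }
}
```

In the actual job, this writer/reader pair would sit inside the custom managed state serializer described below, so that bytes restored from a savepoint are decoded against the current data model.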
