Caching with Spring Boot and Redis can be tricky!

Jérôme Waibel
5 min read · May 31, 2023


Caching, like so much else in Spring Boot, is very easy to configure. Add the proper dependencies, sprinkle @EnableCaching and @Cacheable annotations here and there, and you have a fully working cache. If you also want to use Redis for persistence, simply activate the corresponding setting, and that's it.

Or so I thought.

In this article I will show you which pitfalls can occur when caching with Spring Boot and Redis, especially in conjunction with JSON mapping, Hibernate, and cache performance monitoring.

Let’s start with my initial setup:

The starting point is a REST API controller that receives a data structure from a service. This data structure is managed by Hibernate via a Spring Data JPA repository. It is a customer object which also contains a reference to its company data (the real data structure of the project this originates from is of course much more complex). The controller uses data from the customer, the company, and many more related objects to create a report. Here is a simplified version:

@Service
class CustomerService(private val customerRepository: CustomerRepository) {
    fun readCustomerForReport(uuid: UUID): Customer =
        customerRepository.getReferenceById(uuid)
}

@Entity
class Customer(
    @Id
    val uuid: UUID,

    var name: String,

    @ManyToOne(fetch = FetchType.LAZY)
    var company: Company? = null
)

@Entity
class Company(
    @Id
    val id: Long,

    var name: String,

    var lastUpdate: ZonedDateTime
)

As getting those objects from the database is quite expensive, and the report is requested quite frequently (in particular, more often than the data changes), I decided it would be best to cache the result of readCustomerForReport().

Enabling the caching

Enabling the cache is straightforward. Firstly, include the spring-boot-starter-cache dependency in the project (Redis is already integrated into the application). Next, add the @EnableCaching annotation to a configuration class. Finally, annotate the service method with @Cacheable(cacheNames = ["customers"]). Additionally, I have created a caching configuration class to define a TTL (Time To Live) and ensure that my objects in Redis are serialized as JSON. This allows for easier debugging of entries since, let's be honest, Java object serialization, well, sucks.

@Configuration
@EnableCaching
class CachingConfig {

    @Bean
    fun cacheConfiguration(): RedisCacheConfiguration {
        return RedisCacheConfiguration.defaultCacheConfig()
            .entryTtl(Duration.ofMinutes(5))
            .disableCachingNullValues()
            .serializeValuesWith(
                SerializationPair.fromSerializer(
                    GenericJackson2JsonRedisSerializer()
                )
            )
    }
}
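For completeness, the annotated service method then looks like this. This is a sketch assembled from the snippets above; nothing here is new besides the annotation itself:

```kotlin
@Service
class CustomerService(private val customerRepository: CustomerRepository) {

    // The return value is stored in the "customers" cache, keyed by uuid.
    // Further calls with the same uuid are served from the cache until the
    // 5 minute TTL from the cache configuration expires.
    @Cacheable(cacheNames = ["customers"])
    fun readCustomerForReport(uuid: UUID): Customer =
        customerRepository.getReferenceById(uuid)
}
```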

The date/time problem

When testing my configuration I got this error when calling my new cache enhanced method:

org.springframework.data.redis.serializer.SerializationException:
Could not write JSON: Java 8 date/time type `java.time.ZonedDateTime`
not supported by default: add Module
"com.fasterxml.jackson.datatype:jackson-datatype-jsr310" to enable handling

Which is quite puzzling, as this dependency is automatically added by Spring. I even double-checked my Maven dependency tree, and yes, the module is there. And so far Spring had no problems serializing that data, including the date/time property, to JSON at the REST controller.

Well, the cause was quickly found: I configured the serialization by constructing a GenericJackson2JsonRedisSerializer myself. Calling its empty constructor means it creates its own instance of a Jackson ObjectMapper, which doesn't configure and register the fancy stuff the way Spring does. So let's simply inject "the good one" from Spring and tell the JSON serializer to use it:

@Configuration
@EnableCaching
class CachingConfig {

    @Bean
    fun cacheConfiguration(mapper: ObjectMapper): RedisCacheConfiguration {
        return RedisCacheConfiguration.defaultCacheConfig()
            .entryTtl(Duration.ofMinutes(5))
            .disableCachingNullValues()
            .serializeValuesWith(
                SerializationPair.fromSerializer(
                    GenericJackson2JsonRedisSerializer(mapper)
                )
            )
    }
}

The Hibernate problem

Another test, another error message:

org.springframework.data.redis.serializer.SerializationException:
Could not write JSON: No serializer found for class
org.hibernate.proxy.pojo.bytebuddy.ByteBuddyInterceptor
and no properties discovered to create BeanSerializer
(to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS)

Ah yes, our customer object is a Hibernate-generated proxy. Hibernate and its magic proxy things: I kind of expected problems with that.

After a lot of manually resolving lazy-loaded properties, googling how to "unhibernate" an object, and lots of trial and error, I found out there is a simple solution: there is already a Jackson module that deals with Hibernate objects. As the old saying goes: an hour of debugging can save you five minutes of reading documentation.
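The module lives in its own artifact, so it has to be on the classpath first. These are the Maven coordinates for the Hibernate 6 variant (from memory, so double-check them; depending on your dependency management you may also need to specify a version matching your Jackson version):

```xml
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-hibernate6</artifactId>
</dependency>
```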

So here is the next evolution of my CachingConfig:

@Bean
fun cacheConfiguration(mapper: ObjectMapper): RedisCacheConfiguration {
    val myMapper = mapper.copy()
        .registerModule(
            Hibernate6Module()
                .enable(Hibernate6Module.Feature.FORCE_LAZY_LOADING)
                .enable(Hibernate6Module.Feature.REPLACE_PERSISTENT_COLLECTIONS)
        )
    return RedisCacheConfiguration.defaultCacheConfig()
        .entryTtl(Duration.ofMinutes(5))
        .disableCachingNullValues()
        .serializeValuesWith(
            SerializationPair.fromSerializer(
                GenericJackson2JsonRedisSerializer(myMapper)
            )
        )
}

As you can see, I am now using a copy of Spring's ObjectMapper to avoid altering the global Jackson configuration of my application. The copy() method creates an instance of the ObjectMapper that inherits all the configuration from the original one but allows separate customization.

But we are not done yet ...

The class problem

The initial test showed great promise. Upon invoking my controller, I discovered a serialized entry in Redis. The object had been successfully serialized to JSON, all lazily loaded entities had been resolved, and there were no indications of any Hibernate classes.

However, to my dismay, on my subsequent call to the service, I encountered the following exception:

java.lang.ClassCastException: class java.util.LinkedHashMap cannot be cast
to class my.package.Customer

After successfully resolving all the serialization issues, it appears that we are now confronted with deserialization problems. A closer look at the JSON data shows that it is simply plain JSON without any indication of the originating class, so Jackson lacks the necessary information to determine which class this data represents. In hindsight this makes sense: the no-args GenericJackson2JsonRedisSerializer configures its internal ObjectMapper to embed type information, and by replacing it with a copy of Spring's mapper, that setting was lost.

{
    "uuid": "a1d1ce1a-c2e3-4910-adbd-5ae140f5f49d",
    "name": "Tony Test",
    "company": {
        "id": 1,
        "name": "Evil Corp",
        "lastUpdate": "2023-05-30T18:11:25.660997+02:00"
    }
}

It is time for another adjustment to our CachingConfig. We need to activate the inclusion of type information during the serialization:

@Bean
fun cacheConfiguration(mapper: ObjectMapper): RedisCacheConfiguration {
    val myMapper = mapper.copy()
        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
        .registerModule(
            Hibernate6Module()
                .enable(Hibernate6Module.Feature.FORCE_LAZY_LOADING)
                .enable(Hibernate6Module.Feature.REPLACE_PERSISTENT_COLLECTIONS)
        )
        .activateDefaultTyping(
            jacksonObjectMapper().polymorphicTypeValidator,
            ObjectMapper.DefaultTyping.EVERYTHING,
            JsonTypeInfo.As.PROPERTY
        )
    return RedisCacheConfiguration.defaultCacheConfig()
        .entryTtl(Duration.ofMinutes(5))
        .disableCachingNullValues()
        .serializeValuesWith(
            SerializationPair.fromSerializer(
                GenericJackson2JsonRedisSerializer(myMapper)
            )
        )
}

I also disabled failing on unknown properties, because I don't care if my JSON contains additional properties that have since been removed from my model, but that's just a matter of taste.

So this is what the final JSON looks like, which now can be successfully deserialized. As a result, my caching mechanism is finally operational.

{
    "@class": "my.package.Customer",
    "uuid": [
        "java.util.UUID",
        "a1d1ce1a-c2e3-4910-adbd-5ae140f5f49d"
    ],
    "name": "Tony Test",
    "company": {
        "@class": "my.package.Company",
        "id": 1,
        "name": "Evil Corp",
        "lastUpdate": [
            "java.time.ZonedDateTime",
            "2023-05-30T18:18:58.558128+02:00"
        ]
    }
}

The monitoring problem

Now that the caching is working, I would like to “reward” myself by examining the statistics provided by Spring. There is information about cache hit rates, which I hope will be as high as possible. A high cache hit rate would demonstrate that all the effort invested in implementing caching has been worth it.

But to my disappointment, my cache does not appear in the Spring metrics, even though I set the spring.cache.redis.enable-statistics property to true as the documentation suggests.

It seems caches that are created dynamically do not get their metrics published. "Created dynamically" means the cache is created on first use by the @Cacheable annotation, and not during application initialization by explicitly defining it at the CacheManager.

So I added this little class, which explicitly adds each dynamically created cache to the metrics registrar after application startup.

@Configuration
class EnableCacheMetrics(
    val cacheMetricsRegistrar: CacheMetricsRegistrar,
    val cacheManager: CacheManager
) {
    @EventListener(ApplicationStartedEvent::class)
    fun addCachesMetrics() {
        // getCache() may return null if the cache does not exist (yet)
        cacheManager.getCache("customers")?.let {
            cacheMetricsRegistrar.bindCacheToRegistry(it)
        }
    }
}

I'm not sure if this is the optimal solution, as I have to add each cache name manually. I found an old issue where someone suggested that Spring should emit a CacheCreatedEvent to automate this task, but the idea has been dropped.
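A possible alternative (an assumption on my part, not something I verified in depth): declaring the cache names statically in the application properties. Caches listed there are created by the CacheManager at startup, so the auto-configured metrics binding should pick them up without extra code:

```properties
# Caches listed here exist as soon as the application starts,
# so Spring Boot's cache metrics binding should find them.
spring.cache.cache-names=customers
spring.cache.redis.enable-statistics=true
```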

After adding my cache to the metrics registrar, I was able to get the metrics from both /actuator/metrics and /actuator/prometheus, and, much to my delight, the hit rates were high and the application load dropped significantly.
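For reference, the hit rate can be read off the cache.gets counter and its result tag. The metric and tag names below are what Micrometer's cache instrumentation produces in my setup; host and port are of course setup-specific:

```shell
# gets that were served from the cache
curl 'http://localhost:8080/actuator/metrics/cache.gets?tag=name:customers&tag=result:hit'
# gets that fell through to the service
curl 'http://localhost:8080/actuator/metrics/cache.gets?tag=name:customers&tag=result:miss'
```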

Conclusion

Spring is an incredible framework that I absolutely enjoy working with.

However, as you can see, there are instances where the promised automatic configuration requires additional effort once you dig deeper or venture onto less-traveled paths. I hope this article proves helpful to those who run into the same or similar obstacles along their journey.
