How I detected an out-of-memory error in a Spring Boot + Spring Data web application.

Truong Nhu Khang
5 min read · Jul 4, 2019


I recently joined a freelance project as a part-time job. As usual, it doesn't have any special features. Just a simple flow:

Receive data -> Do some business -> Return results

Put simply, it is a CRUD application.

I have done many CRUD projects at my full-time job, and my strongest area is the Java ecosystem, so I picked Spring Boot for this application. It is nothing more than the example web app we usually see in internet articles.

We have these packages:

  • Domain
  • Repository
  • Service
  • Controller

Just like a normal Spring Boot MVC web application: we receive requests in the controller, manipulate data in the service, and save it to the DB through the repository layer.

Because of limited resources, I only have a 4 GB, 2-CPU VPS for this project, but I think it is enough, because at the beginning of the project we don't have many users.

  • Set a 2 GB heap for the Backend web -> receives data from users and saves it
  • Set a 1 GB heap for the Admin web -> extracts data from users for the dashboard
  • The rest goes to other applications, such as MySQL
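The split above translates into JVM heap flags roughly like this (the jar names are placeholders, not from the project):

```shell
# Backend web: 2 GB heap
java -Xmx2g -jar backend-web.jar

# Admin web: 1 GB heap
java -Xmx1g -jar admin-web.jar

# The remaining ~1 GB is left for MySQL and the OS.
```

Note that -Xmx caps only the heap; the JVM also needs memory for thread stacks and metaspace, so the real footprint of each process is somewhat higher.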

It ran normally for several days. Day by day the amount of data grew, but the concurrency stayed the same: 5–8 users at a time. After one week I got an Out of Memory error in the Backend web application. It is really weird: this app doesn't have any memory-hungry feature. Actually, the Admin web handles almost all of the heavy tasks, such as fetching data for charts or generating receipts.

I have met this error many times before, but always in console applications: crawlers, maybe, or worker-role applications. Those apps load tons of data (including images or binary files) into the heap and manipulate it there. It is easy to understand why Out of Memory happens in that case, and almost every time I hit it, I increased the heap size and the error was gone.

But not this time. My customer is poor; they only provide $20 a month for hosting, and of course I cannot ask them for more, because it is nonsense to request 8 GB of RAM for 8 concurrent users and 1,000 views a day. They would wonder whether I'm mining Bitcoin or doing something dirty.

Day 1: Start reviewing the code

As we know from my previous article, the GC collects all unreachable objects. So if too many objects stay reachable after we no longer need them, our heap fills up, and an Out of Memory Error is thrown when the heap is full and the application requests space to allocate a new object. So I have to review my code to find leaked objects. Because almost all of our business objects are created in method scope, I will:

  • Find connections or input streams that are opened but never closed

=> Okay, I had 2 or 3 input streams still not closed. I closed them.
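The standard fix for those unclosed streams is try-with-resources, which closes the resource automatically even when an exception is thrown. A minimal sketch (the helper name and usage are illustrative, not code from the project):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class StreamDemo {
    // try-with-resources closes the reader (and the wrapped stream)
    // when the block exits, so no file handles or buffers leak.
    static String readAll(InputStream in) throws IOException {
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line);
            }
            return sb.toString();
        } // reader.close() is called here automatically
    }
}
```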

  • Find static or global objects and singletons throughout the project

=> Some static DateTimeFormat and ObjectMapper objects, plus the Repository and Service singleton Spring beans. I did nothing with them.

  • Find caches. Uhmm, we don't use a cache (but another one did... let's talk about it later)
  • Reduce some memory usage in the Spring Boot configuration

=> The default max thread count in embedded Tomcat is 200. I reduced it to 10, and the thread stack size to 512 KB.
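In Spring Boot 2.x that limit is one property (newer versions renamed it to server.tomcat.threads.max); the thread stack size is a JVM flag, not a Spring property:

```properties
# application.properties
# Cap the embedded Tomcat worker pool at 10 threads
server.tomcat.max-threads=10
```

The 512 KB stack size is then passed at startup with the -Xss512k JVM flag.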

  • Turn on heap dumps on OutOfMemoryError

=> Add these flags to the entrypoint, or wherever you run the Java application: -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/path/to/log/folder. The JVM will write a heap dump of all live objects when the program gets an OutOfMemoryError.

That's all I did on the first day I met this error.
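For reference, the kind of leak the first two checklist items hunt for looks like this: a method-scope object is fine, but anything reachable from a static root survives every GC cycle. A toy sketch, not code from the project:

```java
import java.util.ArrayList;
import java.util.List;

public class LeakExample {
    // A static collection is a GC root: everything added here
    // stays reachable forever, so the GC can never collect it.
    static final List<byte[]> CACHE = new ArrayList<>();

    static void handleRequest() {
        byte[] scratch = new byte[1024]; // method scope: reclaimable after return
        byte[] kept = new byte[1024];
        CACHE.add(kept);                 // reachable from a static root: a leak
    }
}
```

Each call grows CACHE by another kilobyte that the GC can never reclaim; enough calls and the heap fills no matter how large it is.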

After 7 days, it came back to me.

Day xxx: Start analyzing the heap dump

The SessionFactoryImpl class accounts for 99 percent of my heap size.
Shown in the object tree: the majority of the SessionFactoryImpl object is a HashMap in QueryPlanCache, and it contains a ton of query plans.

I found some information about this class. It seems I'm not the only one who ran into this trouble; we can refer to this problem here.

Looking at QueryPlanCache, we see that Hibernate puts a plan into the cache whenever the query isn't already there.

The default cache size is 2048 queries. When I analyzed the heap dump, each query was stored in an HQLQueryPlan object of roughly 3 MB. That means a full cache needs ~3 MB * 2048 = ~6 GB, and I have only 2 GB for this application.
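One well-known way this cache fills up (an assumption about this app; the heap dump doesn't say) is IN-clause queries: Hibernate generates a distinct query string for every IN-list size, so each size gets its own cached plan. A toy illustration of why the cache keys multiply (parameter syntax simplified):

```java
import java.util.HashSet;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PlanCacheDemo {
    // Build the kind of query string Hibernate expands an IN clause into:
    // one positional parameter per element, so every list size yields a
    // distinct string -- and a distinct entry in QueryPlanCache.
    static String hqlFor(int listSize) {
        String params = IntStream.range(0, listSize)
                .mapToObj(i -> "?" + (i + 1))
                .collect(Collectors.joining(", "));
        return "select u from User u where u.id in (" + params + ")";
    }

    public static void main(String[] args) {
        Set<String> cacheKeys = new HashSet<>();
        for (int size = 1; size <= 100; size++) {
            cacheKeys.add(hqlFor(size));
        }
        // 100 different list sizes -> 100 different cached plans
        System.out.println(cacheKeys.size()); // prints 100
    }
}
```

At 3 MB per plan, even a few hundred distinct shapes like this is enough to exhaust a 2 GB heap.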

The solution is quite easy: just lower the QueryPlanCache size limit. Maybe 50–100 fits my application. We can set it with the following properties:

  • hibernate.query.plan_cache_max_size — controls the maximum number of entries in the plan cache (defaults to 2048)
  • hibernate.query.plan_parameter_metadata_max_size — manages the number of ParameterMetadata instances in the cache (defaults to 128)

One more thing: Spring Boot doesn't expose these as dedicated configuration properties. We have to create a class implementing HibernatePropertiesCustomizer:

```java
import java.util.Map;

import org.springframework.boot.autoconfigure.orm.jpa.HibernatePropertiesCustomizer;
import org.springframework.stereotype.Component;

@Component
public class MyHibernateCustomizer implements HibernatePropertiesCustomizer {

    @Override
    public void customize(Map<String, Object> hibernateProperties) {
        hibernateProperties.put("hibernate.query.plan_cache_max_size", 40);
        hibernateProperties.put("hibernate.query.plan_parameter_metadata_max_size", 40);
    }
}
```

Conclusion

From the day I configured plan_cache_max_size, our application has run smoothly. No more OutOfMemory every week.

After investigating this problem, I learned one more thing: OutOfMemory does not always mean there is a bug; maybe our application genuinely lacks memory for some feature.
