I have a toolkit that helps me do my job. It is not easy to earn a place in my toolkit because I want tools that make my life easier. This means that they must be easy to use (or easy to learn). Unfortunately, the world is full of useful development tools that are not easy to use or easy to learn. Luckily for me, there are exceptions to this rule, and today I will identify one such exception.

JProfiler has been my trusted friend and ally for many years. I like its user interface, which looks good and is (in my opinion) quite easy to use. But more importantly, I like JProfiler because it has saved my skin many times over the years. This blog post describes three disasters that I solved with JProfiler. Let's get started.

Disclaimer: This is a sponsored post, but I recommend only products that I use myself. Also, this post was written by me, and all these scenarios are real. Some of the details have been changed to protect the people involved.

I was asked to take a look at a search function that was very slow. I started by taking a quick look at the code and found out that it fetched a list of entities from the database by using Hibernate, converted these entities into DTOs, and returned the search results. Because I noticed that these entities had a lot of one-to-one relationships and all of them were loaded eagerly, I configured Hibernate to write the invoked SQL statements to the log and tested the search function. Hibernate invoked so many SQL statements that it was obvious that this search function was suffering from the N+1 SELECTs problem.

Before I started to fix this problem, I wanted to know what information was fetched from the database and how long it took to fetch that information. Once again, the easiest way to get this information was to use JProfiler. JProfiler has a built-in JPA/Hibernate probe that was able to give me the information I needed. After I got the data, I fixed the problem by using a combination of lazy fetching and joins (this function was later replaced with an implementation that used SQL).

Also, it's important to understand that when you fix a problem like this, you have to make many incremental changes and profile your code after every change. This way you can ensure that you don't make the situation worse.

One project had a batch job that processed a lot of entities inside a single transaction. The problem was that the batch job was very slow and took too much memory. In fact, sometimes it crashed because the JVM ran out of memory.

When I started to investigate this problem, I knew what was wrong. The batch job was updating too many entities inside a single transaction, and since we used Hibernate, Hibernate had to persist all of these changes to the database when the transaction was committed. It was obvious that I could fix the problem by modifying the batch job to use many small transactions. However, I didn't know how small the transactions should be. Because I wanted to make a decision based on facts, I had to test different transaction sizes and measure how much memory the batch job used. I started JProfiler and looked for the "best" transaction size by using its VM heap telemetry view. It took me some time, but I was able to select a transaction size that solved the problem (at least for the time being).

I had implemented a simple service method that fetched the information of an entity from the database by using its id as a search criterion. The problem was that this method was extremely slow, and I couldn't understand why.
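As an aside, the SQL-statement logging mentioned above (the quickest way to spot an N+1 SELECTs problem) can be switched on with two standard Hibernate settings. This is a minimal sketch using plain `hibernate.properties` keys; the same keys also work in `persistence.xml`, and enabling `DEBUG` logging for the `org.hibernate.SQL` logger has a similar effect:

```properties
# Write every SQL statement Hibernate executes to the log.
hibernate.show_sql=true
# Pretty-print the logged statements so they are easier to read.
hibernate.format_sql=true
```

If a simple list query triggers one extra SELECT per returned row, the eagerly fetched relationships are a likely suspect.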
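The "many small transactions" fix described above can be sketched in plain Java. This is an illustration under my own assumptions, not the project's actual code: the entity loading and updating is reduced to comments, and the method only splits the work into fixed-size chunks (one short transaction per chunk, so the persistence context stays small) and returns how many transactions were used, which is handy when comparing chunk sizes under a profiler:

```java
import java.util.List;

public class ChunkedBatchJob {

    // Process the given ids in chunks of chunkSize; each chunk is meant to
    // run in its own short transaction. Returns the number of transactions.
    static int processInChunks(List<Integer> entityIds, int chunkSize) {
        int transactions = 0;
        for (int start = 0; start < entityIds.size(); start += chunkSize) {
            int end = Math.min(start + chunkSize, entityIds.size());
            List<Integer> chunk = entityIds.subList(start, end);

            // In the real job, a transaction would begin here
            // (e.g. entityManager.getTransaction().begin()).
            for (Integer id : chunk) {
                // Load and update one entity (omitted in this sketch).
            }
            // Commit here: Hibernate flushes only this chunk's changes, and
            // the persistence context can be cleared before the next chunk.
            transactions++;
        }
        return transactions;
    }

    public static void main(String[] args) {
        List<Integer> ids = List.of(0, 1, 2, 3, 4, 5, 6, 7, 8, 9);
        // 10 ids in chunks of 3 -> 4 transactions.
        System.out.println(processInChunks(ids, 3));
    }
}
```

The chunk size is exactly the knob I had to tune with JProfiler's VM heap telemetry view: smaller chunks mean less memory but more commits.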