HotSpot JVM JIT Profiling Using JitWatch Tool

Enter Just In Time Compilation – JIT.

Java code is usually compiled into platform-independent bytecode (class files) using the “javac” command. This “javac” command is the Java programming language compiler.

The JVM loads the class files and executes the Java bytecode via the Java interpreter. Even though this bytecode is usually interpreted, it can also be compiled into native machine code using the JVM’s Just-In-Time (JIT) compiler.

Unlike a static (ahead-of-time) compiler, the JIT compiler compiles bytecode only when required. With the JIT compiler, the JVM monitors the methods executed by the interpreter and identifies the “hot methods” worth compiling. After identifying these methods, the JVM compiles their bytecode into more efficient native code.

In this way, the JVM avoids interpreting a method on every execution and thereby improves the run-time performance of the application. For each method, the JVM maintains a call count, which is incremented every time the method is called. The JVM interprets a method until its call count exceeds a JIT compilation threshold. Therefore, often-used methods are compiled soon after the JVM has started, while less-used methods are compiled much later, or not at all. The JIT compilation threshold helps the JVM start quickly while still delivering improved performance; it has been carefully chosen to strike an optimal balance between startup time and long-term performance.

After a method is compiled, its call count is reset to zero, and subsequent calls to the method continue to increment it. When the call count of a method reaches a JIT recompilation threshold, the JIT compiler compiles it a second time, applying a larger selection of optimizations than on the previous compilation. This process is repeated until the maximum optimization level is reached. The busiest methods of a Java program are always optimized most aggressively, maximizing the performance benefits of using the JIT compiler. The JIT compiler can also measure operational data at run time and use that data to improve the quality of further recompilations.
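The call-count mechanism described above is easy to observe yourself. Here is a minimal sketch (the class name, method, and iteration count are invented for illustration); HotSpot’s real -XX:+PrintCompilation flag prints a line each time a method is queued for or finishes compilation:

```java
// HotLoop.java -- illustrative example, not from the JitWatch distribution.
// Run with:  java -XX:+PrintCompilation HotLoop
// and watch for lines mentioning HotLoop::square once its call count
// crosses the JIT compilation threshold.
public class HotLoop {

    // A tiny method that becomes "hot" because it is called a million times.
    static long square(long n) {
        return n * n;
    }

    public static void main(String[] args) {
        long sum = 0;
        for (long i = 0; i < 1_000_000; i++) {
            sum += square(i);   // repeated calls drive the call counter up
        }
        System.out.println(sum); // prints 333332833333500000
    }
}
```

Early in the run the method is interpreted; only after enough calls does a `HotLoop::square` compilation event appear in the PrintCompilation output.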

To learn more about the JIT compilation process, see “Understanding the Java HotSpot VM Code Cache,” and “Introduction to JIT Compilation in Java HotSpot VM.”


HotSpot has a large number of different optimization techniques for JIT compilation, but one of the most important is inlining. This is the process of removing virtual method calls by effectively hoisting the body of a method into the caller’s scope. For example, consider the following:

 public int add(int x, int y) {
    return x + y;
 }

 int result = add(a, b);

With inlining this code is effectively transformed just to:

 int result = a + b;  

The values a and b have been substituted for the method parameters, and the code comprising the body of the method has been copied into the caller’s scope. This speeds up frequently executed method calls.
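To see inlining decisions for yourself, here is a small self-contained sketch (class and method names are invented for illustration). HotSpot’s diagnostic flags -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining report which call sites were inlined:

```java
// InlineDemo.java -- illustrative example. To observe inlining decisions:
//   java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining InlineDemo
// Small, hot methods like add() are prime inlining candidates.
public class InlineDemo {

    static int add(int x, int y) {
        return x + y;
    }

    public static void main(String[] args) {
        int result = 0;
        for (int i = 0; i < 5_000_000; i++) {
            // After enough calls, the JIT typically inlines add() here,
            // turning the virtual call into a plain addition in the loop body.
            result = add(result, 1);
        }
        System.out.println(result); // prints 5000000
    }
}
```

In the PrintInlining output you would look for a line mentioning `InlineDemo::add` marked as inlined (the exact wording varies by JVM version).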

For more on JIT optimization phases see –

Ah, but is there a catch?

So the JIT process marks a method for compilation to machine instructions. After compilation, invocations of that method are generally fast, since the JIT has analyzed what happens during its invocation and can optimize accordingly. But there is a catch in this business of marking methods for compilation. An 8,000-bytecode-instruction catch, to be exact. If a method contains more than 8,000 bytecode instructions, it will never be marked for compilation, period. You can try the “-XX:-DontCompileHugeMethods” flag when invoking the JVM, but I have read in many places that this is bad practice: it forces the JVM to dig through all that oversized code and try to do something with it, which can hurt performance rather than help it. Refactoring, or better still not writing methods that huge in the first place, is the way forward.
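The refactoring advice can be sketched concretely. The example below is invented for illustration (the class and method names are hypothetical): instead of one enormous method whose bytecode would exceed the huge-method limit, the work is split into small, focused methods, each of which stays well under the limit and can therefore be JIT-compiled (and inlined) independently:

```java
// Illustrative sketch only -- names are hypothetical.
// Rather than one giant processOrder() with thousands of bytecode
// instructions, delegate each step to a small helper method.
public class OrderProcessor {

    public int processOrder(int quantity, int unitPrice, int discountPercent) {
        int subtotal = computeSubtotal(quantity, unitPrice);
        return applyDiscount(subtotal, discountPercent);
    }

    // Each helper compiles to a handful of bytecode instructions,
    // far below the JIT's huge-method limit.
    private int computeSubtotal(int quantity, int unitPrice) {
        return quantity * unitPrice;
    }

    private int applyDiscount(int subtotal, int discountPercent) {
        return subtotal - (subtotal * discountPercent / 100);
    }

    public static void main(String[] args) {
        System.out.println(new OrderProcessor().processOrder(10, 250, 20)); // prints 2000
    }
}
```

As a bonus, small methods are also better inlining candidates, so the split version can end up faster than the monolith even before the huge-method limit comes into play.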

Types of JIT Compilers

There are two different JIT compilers in Java, one aimed at client systems and one at server systems. A server application runs for a long time and therefore benefits from more optimizations, whereas a client application may not need as many optimizations as a server application.

Prior to some of the later Java SE 7 releases, these two modes were selected using the -client and -server switches, respectively.

A very good explanation of the internals is available in “A Window into the World of Dynamic Compilation.”

JITWatch is a log analyser for the Java HotSpot JIT compiler. It consumes JIT log files and visualizes the compiler’s activity.

JarScan is a tool included with JITWatch that analyzes JAR files and counts the bytecode size of each method.

With this tool, we can identify the methods that are too large to be JIT-compiled.

JITWatch can be downloaded from GitHub.

The JITWatch tool can analyze the compilation logs generated with the “-XX:+LogCompilation” flag (a diagnostic flag, so it also requires “-XX:+UnlockDiagnosticVMOptions”).

The logs generated by LogCompilation are XML-based and contain a lot of information related to JIT compilation, so these files can become very large.
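To give a feel for the format, here is a short, hand-written fragment representative of the XML that LogCompilation emits; the element names come from HotSpot, but the exact attributes vary by JVM version, so treat this as a sketch rather than real output:

```xml
<!-- A method is queued for compilation once its counters trip the threshold -->
<task_queued compile_id='2' method='java/lang/String hashCode ()I'
             bytes='55' count='512' stamp='0.052' comment='tiered'/>

<!-- The resulting native method (nmethod) after compilation completes -->
<nmethod compile_id='2' compiler='c1' level='3'
         method='java/lang/String hashCode ()I' bytes='55' size='1720' stamp='0.055'/>
```

JITWatch parses thousands of entries like these and turns them into the timelines and per-method views shown in its UI.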


What is great is that the download includes a sandbox folder with a few examples of JIT-relevant code that can be used to test-drive the tool.


Here is an example of examining the logs after running one of the provided samples:


Chris Newland, who created this tool, has an excellent video presentation on it and on the internals of the Java HotSpot JIT; there is also a great article from Ben Evans on InfoQ with great JarScan examples.

