Java on Truffle — Going Fully Metacircular

Oleg Šelajev
Published in graalvm · 8 min read · Jan 19, 2021

Up until now, GraalVM has offered two ways to run Java programs: on the Java HotSpot VM with the GraalVM JIT (just-in-time) compiler, or compiled ahead of time into a native executable with GraalVM Native Image.

Today, we’re happy to announce a new way to run Java on GraalVM. GraalVM 21.0 introduces a new installable component, named espresso, which provides a JVM implementation written in Java.

Java on Truffle's place in the GraalVM architecture

Espresso makes it possible to run Java code via the Truffle framework, elevating Java to the level of the other languages supported on GraalVM.

Try it out

Trying out Java on Truffle is extraordinarily straightforward. It is available as a component that can be installed into a base GraalVM distribution with the gu utility.

gu install espresso
gu install -L espresso-installable.jar # if downloaded manually

After installation, to run your favorite app on Java on Truffle, you just need to pass -truffle to the java command.

java -truffle -jar myApp.jar

Download Java on Truffle and give it a spin! There are example apps with instructions that illustrate particular capabilities of Java on Truffle.

Note that the current raw performance of Java on Truffle isn't representative of what it will be capable of in the near future. Peak performance is several times lower than running the same code in the usual JIT mode, and warmup hasn't been optimized yet. In this initial release we focused entirely on functionality, compatibility, and making Java on Truffle available as open source to the broader community.

Expect both warmup and peak performance to improve rapidly with each of our upcoming 21.x releases.

Let's now look in more detail at what Java on Truffle is, explore some notable use cases where it can help you, and place the project within the larger GraalVM and Java ecosystems.

Java on Truffle

Java on Truffle is a JVM implemented using the Truffle language implementation framework. It provides all core components of a Java virtual machine:

  • Bytecode interpreter
  • Bytecode verifier
  • Single-pass .class file parser
  • Simple object model
  • Java Native Interface (JNI) implementation in Java
  • Virtual machine implementation in Java
  • Java Debug Wire Protocol (JDWP)

A very important detail of this implementation: it is itself written in Java. Java on Truffle is Java on Java! Self-hosting is the holy grail of Java virtual machine research and development.

What it can do is run Java programs, and of course programs written in other JVM languages too. As you can see from the list above, it also supports the debug protocol, so you can debug Java applications running on it as well.

Java on Truffle is available for both the Java 8- and Java 11-based GraalVM distributions, so technically you can use it as a replacement for the JVM of your choice. Java on Truffle is currently experimental and not very fast yet, so it is not recommended for production workloads today, but let's explore what you can get from running applications with it.

Java on Java

As mentioned previously, Java on Truffle is implemented in Java. It is a virtual machine implementation, so in order to actually run Java code it needs access to the class library and the native libraries and methods that the JDK provides. Java on Truffle reuses the JARs and native libraries from the GraalVM distribution.

Java on Truffle is a metacircular Java VM.

Being implemented in Java and being able to run Java gives Java on Truffle a very interesting property: it can run itself. Indeed, Java on Truffle is a metacircular VM; it can run itself several levels deep, albeit more and more slowly each time.

Being a Java program brings a number of advantages. One of them is the ability to be compiled to a native executable with Native Image; we'll explore an interesting use case for that in a following section.

Another advantage is that the code is nice, familiar, and understandable to Java developers. Consider going to the GitHub repository and looking at the source code. Your day-to-day tools work on it, your IDE supports it, and you can explore the code base the same way you explore any other Java dependency. This transparency and familiarity should allow Java on Truffle to evolve rapidly for the better.

Embedding 11 in 8

Java on Truffle is an actual JVM, and it's also a Java program, which means you can run it within another Java program. This opens very interesting avenues for compartmentalising the different components of your application. For example, if you point Java on Truffle at a JDK 11 distribution, it can run Java 11; with access to Java 8, it becomes Java 8. When you have both distributions available, you can run Java on Truffle in the context of a Java 8 app and use it to run Java 11 bytecode, and vice versa. If a library is only available for Java 8, you can migrate to a newer base JDK and still run that particular library on a compatible JDK 8 within the same Java process, with some programmatic effort to establish the interoperability.
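To make this more concrete, here is a minimal sketch of what embedding a Java on Truffle context inside a host Java application could look like, using the GraalVM polyglot API with the "java" language id. The option names, paths, and the guest class below are illustrative assumptions, not a verified recipe; check the Java on Truffle embedding documentation for the exact names.

import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;

public class EmbeddedJavaDemo {
    public static void main(String[] args) {
        // Sketch only: the option names ("java.JavaHome", "java.Classpath") and the
        // paths are assumptions for illustration.
        try (Context context = Context.newBuilder("java")
                .allowAllAccess(true)
                .option("java.JavaHome", "/path/to/jdk-11")       // guest JDK to embed
                .option("java.Classpath", "libs/legacy-lib.jar")  // guest-only classpath
                .build()) {

            // Look up a guest class through the polyglot bindings and call into it.
            // "com.example.LegacyEntryPoint" is a hypothetical class on the guest classpath.
            Value guestClass = context.getBindings("java").getMember("com.example.LegacyEntryPoint");
            Value result = guestClass.invokeMember("run", "hello from the host");
            System.out.println("Guest returned: " + result);
        }
    }
}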

Mixing AOT and JIT

Since Java on Truffle, the Truffle framework, the GraalVM compiler, and all the other components necessary to run Java on Truffle efficiently are written in Java, it is possible to build a native executable that includes the infrastructure to run Java on Truffle.

This means that you can take a Java app, build a JVM into it, and then run that app either on a JVM or as a native image. Note that in the latter case Java on Truffle can execute arbitrary Java code that doesn't necessarily need to be known at build time.

That’s right, Java on Truffle can bring the JIT compiler and the dynamic Java runtime to an ahead-of-time compiled binary.

Java on Truffle allows adding dynamic language features to apps built with Native Image

We've prepared a sample application to illustrate this concept. It takes a normal JShell app, which consists of two separate parts, a frontend CLI and a backend computation engine, and replaces the latter with a Java on Truffle implementation.

You can also follow this demo in a video tutorial:

Mixing AOT and JIT with GraalVM

It neatly reuses all the classes from the original implementation by simply loading them; the part written specifically for this sample is the "glue" code that connects the host Java part of the app to the Java on Truffle part.

The sample can be compiled into a native executable, resulting in a binary that starts faster than the usual JShell, thanks to the startup characteristics of native executables, and can still execute whatever Java code we throw at it.

Here’s a screenshot of the Tetris game loaded and started from the JShell implemented with Java on Truffle.

Mixing AOT and JIT is a fascinating option for applications that otherwise couldn't leverage Native Image's performance improvements because their functionality depends on dynamic code, which doesn't work easily with Native Image.

Advanced class redefinition

Another really cool feature, where Java on Truffle is more powerful than HotSpot, is the enhanced HotSwap capability: changing classes at runtime during a debugging session.

As of GraalVM 21.0, the following changes are supported:

  1. Add and remove methods
  2. Add and remove constructors
  3. Add and remove methods from interfaces
  4. Change access modifiers of methods
  5. Change access modifiers of constructors
  6. Changes to lambdas
  7. Add new anonymous inner classes
  8. Remove anonymous inner classes

What will make HotSwap even more powerful is the ability to make changes to class fields. That is in the works and will be added in a future release.

Reload changed classes action in the IntelliJ IDEA debugger.

The setup is the same as with HotSpot: you start the debugger, change the code, recompile the class, hit "Reload Changed Classes" in your IDE's debugger, and resume the program; the new code runs the next time the changed class is used.
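As a small illustration, here is the kind of structural change this enables during a debug session. The class and method names below are made up for the example; the comments mark what was added after the session started, a change that stock HotSpot HotSwap would reject but Java on Truffle reloads in place.

public class GreetingService {
    public String greet(String name) {
        // Edited during the debug session to call the new helper below.
        return decorate("Hello, " + name);
    }

    // Newly added method: adding methods is a structural change that plain
    // HotSpot HotSwap refuses, but Java on Truffle applies it on reload.
    // The new code runs the next time greet() is invoked.
    private String decorate(String message) {
        return ">>> " + message + " <<<";
    }
}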

Follow this demo to see the enhanced HotSwap feature in action:

Enhanced HotSwap with Java on Truffle

GraalVM ecosystem support

Java on Truffle benefits out of the box from the developer tooling support that GraalVM languages get from the Truffle framework.

For example, you can run your application with java -truffle --cpusampler to profile it with the sampling profiler. You can also enable the tracing profiler or the memory tracer to see which parts of the code run most often and which generate more memory pressure than others.

Another facet of the ecosystem is the other supported languages. Java on Truffle allows you to create polyglot programs where different components are written in different languages. The details of how to load code written in other languages, export and import objects between languages, and so on are a bit beyond the scope of this article, but they can be found in the docs.
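To give a flavor of the general mechanism, here is a minimal sketch that evaluates a snippet of JavaScript from Java using the standard org.graalvm.polyglot API; the specifics of doing interop from code running on Java on Truffle itself are covered in the docs.

import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;

public class PolyglotDemo {
    public static void main(String[] args) {
        // Requires a GraalVM distribution with the JavaScript language installed.
        try (Context context = Context.create("js")) {
            // Evaluate a JavaScript function and call it from Java.
            Value doubler = context.eval("js", "(n) => n * 2");
            int result = doubler.execute(21).asInt();
            System.out.println("JavaScript computed: " + result); // prints 42
        }
    }
}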

Next steps

GraalVM 21.0 is the initial release of Java on Truffle. It’s an experimental component right now and there are major improvements planned for it in upcoming releases.

There are many things to improve, from supporting Java agents, to a better implementation of the interop protocol with other languages, to major performance improvements, and so on.

We'll be working on these and other improvements, and would be absolutely thrilled to hear any and all feedback: feature requests, potential use cases, discovered issues, and shortcomings of the current version. You can share your feedback via Slack, GitHub, or Twitter. To get started, head to graalvm.org/java-on-truffle.

Java on Truffle is a new and very exciting way to run your Java code. Take a look and consider the possibilities!
