Comparing Implementations of the Monkey Language XI: Going Native (and JS) with Kotlin.

Previously

In the last episode, we reviewed some updates to the performance of several languages and runtimes.

Kotlin Native

Many moons ago, I tried to compile my Kotlin code to Native, but the resulting binary crashed with a segmentation fault when I tried to run it.

Recently, the Kotlin team released Kotlin 1.8.0, and I thought I would give it another try. It turns out my problems had already been fixed in the 1.7.* releases, and now I can compile and run Kotlin natively.

Now, we have three different runtimes/compilation targets: JVM, GraalVM and Native.
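For context, adding the Native target is mostly declarative Gradle configuration. Here is a minimal sketch of what such a setup looks like (illustrative only, not this project's actual build file; the target name and entry point are assumptions, and GraalVM needs no target of its own because it consumes the regular JVM artifact):

```kotlin
// build.gradle.kts — minimal multiplatform sketch (illustrative, not the project's real build file)
plugins {
    kotlin("multiplatform") version "1.8.0"
}

repositories {
    mavenCentral()
}

kotlin {
    jvm()                    // regular JVM bytecode; GraalVM runs this same artifact
    linuxX64("native") {     // Kotlin/Native; pick macosArm64, mingwX64, etc. for other hosts
        binaries {
            executable {
                entryPoint = "main"   // assumes a top-level main() in the root package
            }
        }
    }
}
```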

Let's talk about performance.

Kotlin Native Interpreter

First, interpreter mode:

And from this post on, thanks to some feedback I received, I'll start including graphs.

The tool that I use, hyperfine, includes some scripts that generate graphs with matplotlib, but using them requires many more steps and disrupts my usual flow.

So I wrote my own script using unicode_plot.
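I won't reproduce the unicode_plot script here, but the idea is simple enough to sketch in Kotlin: read the JSON that hyperfine writes with --export-json and print one unicode bar per command. This is purely illustrative (it assumes hyperfine's "results"/"command"/"mean" JSON fields and a kotlinx-serialization-json dependency), not the script I actually use:

```kotlin
// Illustrative sketch, not the real unicode_plot script: print a crude bar chart
// from hyperfine's --export-json output. Needs kotlinx-serialization-json on the classpath.
import java.io.File
import kotlinx.serialization.json.*

fun main(args: Array<String>) {
    val report = Json.parseToJsonElement(File(args[0]).readText()).jsonObject
    val results = report.getValue("results").jsonArray.map {
        val r = it.jsonObject
        r.getValue("command").jsonPrimitive.content to r.getValue("mean").jsonPrimitive.double
    }
    val slowest = results.maxOf { (_, mean) -> mean }
    for ((command, mean) in results) {
        val bar = "▇".repeat(((mean / slowest) * 40).toInt().coerceAtLeast(1))
        println("%-30s %s %.3f s".format(command.take(30), bar, mean))
    }
}
```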

To generate graphs like this one:

Native is more than seven times slower than the JVM (which is the fastest implementation) and is the slowest of the compiled implementations. Not great, but at least we can run it now.

What about memory consumption?

Well, Native consumes less than one hundredth of the memory the JVM needs. Not bad if you want to run on a platform that doesn't support the JVM, or on one with minimal resources, such as a microcontroller.

Kotlin Native VM

Now, VM mode:

The numbers are better now (note that GraalVM in VM mode is slower than in interpreter mode).

Kotlin JS

If I already got Native working, it shouldn't be too difficult to compile to JS, right?

And it isn't: just some Gradle trickery here and there.
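To give an idea of the kind of trickery involved, this is roughly the extra block that goes into the multiplatform configuration (a sketch, not the project's exact settings): declare a JS target and ask for an executable bundle instead of a library.

```kotlin
// Sketch of the JS target block (illustrative, not the project's exact configuration)
kotlin {
    js(IR) {
        nodejs()                 // build and test against Node.js
        binaries.executable()    // emit a runnable .js file instead of a library
    }
}
```

The resulting .js file can then be fed to Node or Bun.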

Kotlin JS Interpreter

I also managed to run it with Node and Bun in interpreter mode.

As expected, Bun is faster than Node. But either way, it is slower than my TypeScript implementation by a long shot.

Let's see if I can do anything to improve those numbers a bit. After all, my Kotlin implementation isn't the same as my TypeScript implementation, so we aren't comparing apples to apples (if you want to know the changes I made, check this PR).

We save a couple of seconds, but nothing significant.

What is happening?

The TypeScript compiler is very good at generating clean, efficient, modern JavaScript.

Let's have a look at a couple of examples.
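As a reference point, the Kotlin side of the first example looks roughly like this (a simplified sketch; the real classes carry more members, such as token handling):

```kotlin
// Simplified sketch of the Kotlin source; the real implementation has more members.
interface Statement {
    fun tokenLiteral(): String
}

class Program(val statements: List<Statement>) {
    fun tokenLiteral(): String =
        if (statements.isNotEmpty()) statements.first().tokenLiteral() else ""

    override fun toString(): String = statements.joinToString("")
}
```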

The class Program (a program is a collection of statements) is compiled by Kotlin to this JavaScript code:

Old-school code. On the other hand, TypeScript compiles an equivalent class to this JavaScript code:

Let's look at a more extended example, the main eval loop.
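As a reminder of its shape on the Kotlin side, the eval loop is essentially one big when over node types that recurses into sub-nodes. Here is a self-contained miniature of that pattern (it is not the real Monkey AST or evaluator, just the structure):

```kotlin
// Miniature of the eval loop's structure (not the real Monkey AST or evaluator).
sealed interface Node
data class IntegerLiteral(val value: Long) : Node
data class PrefixExpression(val operator: String, val right: Node) : Node
data class InfixExpression(val operator: String, val left: Node, val right: Node) : Node

fun eval(node: Node): Long = when (node) {
    is IntegerLiteral -> node.value
    is PrefixExpression -> if (node.operator == "-") -eval(node.right) else error("unknown operator")
    is InfixExpression -> {
        val left = eval(node.left)
        val right = eval(node.right)
        when (node.operator) {
            "+" -> left + right
            "-" -> left - right
            "*" -> left * right
            else -> error("unknown operator ${node.operator}")
        }
    }
}

fun main() {
    // (1 + 2) * 3
    val ast = InfixExpression("*", InfixExpression("+", IntegerLiteral(1), IntegerLiteral(2)), IntegerLiteral(3))
    println(eval(ast)) // prints 9
}
```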

Kotlin compiles it into this code:

And TypeScript compiles it into this code:

The code generated by TypeScript is cleaner, uses modern features such as const and arrow functions, and performs better.

Wait! Why is the code generated by TypeScript faster?

First, interpreters benefit significantly from reducing the number of else clauses, as sketched below. (Remember that trend on YT a few years ago?)

Second, fewer instances. The code generated by Kotlin creates more object instances.

Third, the code generated by Kotlin calls several functions from its standard library. That looks like overhead.

Fourth, const yields some performance as well.
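Here is the toy sketch of the first point promised above. It shows Kotlin source, not the generated JavaScript, and is only meant to illustrate the difference in shape between an else-if chain and early returns:

```kotlin
// The same dispatch written as an if/else chain...
fun typeNameChained(obj: Any): String =
    if (obj is Long) {
        "INTEGER"
    } else if (obj is Boolean) {
        "BOOLEAN"
    } else if (obj is String) {
        "STRING"
    } else {
        "UNKNOWN"
    }

// ...and with early returns, which keeps each path flat and drops the else clauses.
fun typeNameFlat(obj: Any): String {
    if (obj is Long) return "INTEGER"
    if (obj is Boolean) return "BOOLEAN"
    if (obj is String) return "STRING"
    return "UNKNOWN"
}
```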

Kotlin JS VM

I can also run it in VM mode, something that I didn't implement in TypeScript.

It is faster than interpreter mode, but still slower than TypeScript in interpreter mode (in case you missed the previous entries: my TypeScript implementation running on Bun is faster than my Go implementation).

Conclusion

Writing a Kotlin Multiplatform application from a single codebase is possible without too much hassle. That is fantastic news for a lot of teams.

The results with Kotlin Native are a little underwhelming. I thought it would be much faster, but the team has room to make improvements in the next few years.

It is not fair to compare the Kotlin JS and TypeScript compilers. The amount of money and human resources Microsoft can pour into TypeScript is orders of magnitude bigger than what JetBrains can devote. So there is a lot of room for improvement.