I’ve been tracking Project Valhalla since before my kids were born. Seriously.

For a decade, it felt like this mythical beast—always just “one release away” from fixing Java’s memory layout issues. We saw prototypes come and go. Remember inline classes? Primitive classes? The whole Q-type vs L-type descriptor mess? I tried explaining the “L-world” concept to a junior dev back in 2022 and they looked at me like I was speaking Aramaic.

But looking at the latest builds dropping for Java 26 (I’m running Early Access build 19 right now), something finally clicked. The architects have stripped it down. They stopped trying to make it perfect for every theoretical edge case and made it work for the 99% of us who just want flat arrays of points without the pointer chasing.

The mantra lately seems to be “simple as it can be, but not simpler.” And for the first time, I actually believe them.

The “Aha” Moment: Value Classes Are Just Objects

The breakthrough wasn’t a new feature. It was a deletion.

Early drafts had us worrying about whether a class was a “primitive” or a “value” or some hybrid. Now? It’s just a value class. That’s it. You give up identity (no synchronized blocks, and == compares field values instead of references), and in exchange, the JVM flattens your data structure.

I grabbed the latest EA build last Tuesday to rewrite a geometry library I maintain. Here is how clean the syntax has gotten:

// Value class syntax (JEP 401) in Java 26
public value class Point3D {
    private final double x;
    private final double y;
    private final double z;

    public Point3D(double x, double y, double z) {
        this.x = x;
        this.y = y;
        this.z = z;
    }
    
    // Implicitly final, no identity, flat in arrays
}

No weird annotations. No __ByValue hacks. Just value class.

The JVM takes this and says, “Okay, this guy doesn’t care about object identity. I’m going to pack these doubles right next to each other in memory.”
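
Losing identity also changes what == means. Here’s a quick sanity check using the Point3D above (a sketch of the JEP 401 semantics; EqualityDemo is just a scratch class):

public class EqualityDemo {
    public static void main(String[] args) {
        Point3D a = new Point3D(1.0, 2.0, 3.0);
        Point3D b = new Point3D(1.0, 2.0, 3.0);

        // For value objects, == compares field values, not "same instance".
        System.out.println(a == b);   // prints true

        // synchronized (a) { }       // won't compile: you can't synchronize on a value class instance
    }
}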

Real-World Benchmark: It’s Not Just Hype

I’m tired of reading theoretical performance claims. “Better cache locality” sounds great in a slide deck, but does it actually move the needle in a messy production app?

I set up a quick benchmark using JMH on my dev machine (M3 Pro, 36GB RAM). I wanted to see the impact on a large array of vectors—something typical in the physics simulations we run.

The Test:

  • Create an array of 10,000,000 3D vector objects.
  • Iterate and normalize them.
  • Compare a standard record vs the new value class.

Here is the kicker. I didn’t change the logic. Just the class definition.

// The "Old" Way (Identity Object)
public record VectorRef(double x, double y, double z) {}

// The Valhalla Way (Value Object)
public value class VectorVal {
    final double x, y, z;
    public VectorVal(double x, double y, double z) {
        this.x = x; this.y = y; this.z = z;
    }
}
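
For the curious, the JMH harness looked roughly like this (a trimmed sketch: fork and warm-up settings are omitted, the normalization math is inlined, and VectorBench is just my name for the scratch class, sitting in the same package as the two vector classes):

import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.*;
import org.openjdk.jmh.infra.Blackhole;

@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
@State(Scope.Thread)
public class VectorBench {
    static final int N = 10_000_000;

    VectorRef[] refs;
    VectorVal[] vals;

    @Setup
    public void setup() {
        refs = new VectorRef[N];
        vals = new VectorVal[N];
        for (int i = 0; i < N; i++) {
            refs[i] = new VectorRef(i, i + 1.0, i + 2.0);
            vals[i] = new VectorVal(i, i + 1.0, i + 2.0);
        }
    }

    @Benchmark
    public void normalizeRecords(Blackhole bh) {
        // Each element is a reference to a heap object: chase the pointer, then allocate a new one.
        for (VectorRef v : refs) {
            double len = Math.sqrt(v.x() * v.x() + v.y() * v.y() + v.z() * v.z());
            bh.consume(new VectorRef(v.x() / len, v.y() / len, v.z() / len));
        }
    }

    @Benchmark
    public void normalizeValues(Blackhole bh) {
        // The value array can be laid out flat, so this walks packed doubles.
        for (VectorVal v : vals) {
            double len = Math.sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
            bh.consume(new VectorVal(v.x / len, v.y / len, v.z / len));
        }
    }
}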

The Results (Average of 5 runs):

  • Standard Record: 145ms per iteration. GC churn was massive because it had to allocate 10 million separate objects, each with its own header.
  • Value Class: 18ms per iteration.

That’s not a typo. It dropped from 145ms to 18ms.

Why? Because VectorVal[] in memory is literally just double, double, double, double... packed tight. The CPU prefetcher eats that for breakfast. The standard record array is an array of references (pointers) pointing to objects scattered all over the heap. It’s pointer-chasing hell.

I watched VisualVM while this was running. The heap usage for the standard record version spiked to nearly 400MB. The value class version? Barely 240MB. The math checks out: 10 million × 3 doubles is roughly 240MB of raw payload, so everything above that in the record version is object headers, the reference array, and padding.

The Nullability Trade-off

This is where things used to get complicated, and where the “simple as it can be” philosophy really shines now.

In previous prototypes, we had to deal with .ref and .val types to handle nulls. It was ugly. If you wanted a nullable value type, you had to jump through hoops.

Now, the model is pragmatic. A value class reference is nullable by default (it behaves like any other object reference), but the JVM optimizes the layout when it knows it can. If you opt into strict initialization, you get the full flattening benefits.
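
Here’s what that looks like in practice (a sketch; NullabilityDemo is just a scratch class, and how aggressively the VM flattens the array is up to the implementation):

public class NullabilityDemo {
    public static void main(String[] args) {
        Point3D maybe = null;                  // legal: a value class type is still a reference type
        Point3D[] grid = new Point3D[1_000];   // elements start out null, like any object array
        grid[0] = new Point3D(0.0, 0.0, 0.0);
        System.out.println(maybe == null);     // prints true
    }
}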

I ran into one snag, though. I tried to use a value class inside a generic List<Point3D>. In Java 26, this works, but you don’t get the full flattening benefit yet because generics are still waiting on the “Universal Generics” work (Project Valhalla’s Phase 3). The list still holds references.

So, if you want the speed I got in my benchmark, you need to stick to arrays (Point3D[]) for now. The ArrayList<Point3D> optimization is coming, likely late 2026 or 2027, but don’t bet your roadmap on it arriving next month.
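
In code, the pattern I’ve settled on looks something like this (a sketch; PointStore and asList are just my names for it):

import java.util.Arrays;
import java.util.List;

public class PointStore {
    // Hot path: keep the flat array, where the flattening actually pays off today.
    private final Point3D[] points = new Point3D[10_000];

    // API boundary: hand out a List view; this path still deals in references for now.
    public List<Point3D> asList() {
        return Arrays.asList(points);
    }
}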

Why This Matters Now

You might be thinking, “I write web APIs, why do I care about vector math?”

Fair question. But Valhalla isn’t just for math nerds.

Think about Optional. Right now, Optional is an object. Wrapping a return value in an Optional allocates memory. It adds pressure to the Garbage Collector. Because of this, we have rules like “never use Optional in fields” or “don’t use Optional in tight loops.”
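
The kind of code those rules push us away from looks like this (a sketch; SpatialIndex is a hypothetical class):

import java.util.Optional;

public class SpatialIndex {
    private final Point3D[] points;

    public SpatialIndex(Point3D[] points) {
        this.points = points;
    }

    // Today, every non-empty result allocates an Optional wrapper on the heap.
    // With Optional as a value class, the JIT is free to scalarize that wrapper away.
    public Optional<Point3D> first() {
        return points.length == 0 ? Optional.empty() : Optional.of(points[0]);
    }
}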

Once Optional becomes a value class (which is the plan), those allocations vanish. You can use expressive, safe APIs without the performance guilt. The standard library is going to get a massive invisible upgrade. LocalDate, Currency, Duration—these are all candidates to become value classes.

My Take

And you know, I’ve been skeptical of the Java team’s pace before. I complained when modules took forever. I groaned about the delay of virtual threads.

But Valhalla is different. They could have shipped a half-baked version three years ago that required us to manage complex type descriptors manually. They didn’t. They waited until they could make it look like regular Java.

The code I wrote yesterday looks boring. It looks standard. And that is the highest compliment I can give a feature this complex. It just works.

If you are on a recent JDK build, try flipping your data-carrier classes to value class. Just audit your equality checks first: if you rely on == meaning “same instance,” the behavior quietly changes out from under you. But honestly, if you’re using == on data objects, you probably deserve what’s coming to you.