
Wednesday, 5 November 2025

Embedded records - extracting data from classes

At Devoxx Belgium 2025 I discussed the idea of embedded records with a few people. The idea is to take what is great about records, and extend that to classes that represent data.

This responds to a pain point in Java, where there is a bit of a cliff-edge between records and classes. While millions of classes could and should be converted to records, millions more cannot. Yet they still represent data, and it would be a Good Thing to be able to capture that. Especially with Serialization 2.0 on the horizon.
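
As a sketch of the kind of class I mean, here is a hypothetical mutable bean that clearly models data, yet cannot become a record because records are final and immutable:

  public class PersonData {
    private String name;
    private int age;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
  }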

Please see the proposal document for more details.

Wednesday, 15 October 2025

Type conversion in Java - an alternative proposal for primitive type patterns

A lot of good work has been done by the core Java team on patterns, providing new ways to explore data. The latest extension, in JEP 507, is the idea that primitive type patterns should be supported. Today I'm publishing an alternative approach.

Primitive Types in Patterns, instanceof, and switch

The current proposal is as follows:

  long val = createLong();
  int i = (int) val;  // cast long to int, potentially silently losing information
  switch (val) {
    case int j -> IO.println("Long fits in an int");
    case long v -> IO.println("Long does not fit in an int");
  };

I like the idea of being able to tell if a long value fits into an int without loss. But I hate the syntax.

The key problem is that type patterns check the supertype/subtype relationship, and int is not a subtype of long. The result is code that doesn't seem to make sense.

The official explanation is based on the notion that developers use instanceof String before a cast to String all the time. Thus a parallel can be drawn to have an instanceof int before a cast to int. Effectively the aim is to extend the meaning of type patterns to cover primitive type casts, which are type conversions, not type checks.
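
For reference, this is the familiar reference-type idiom that explanation appeals to (createValue() is simply a stand-in for any method returning Object):

  Object obj = createValue();
  if (obj instanceof String) {
    String s = (String) obj;  // safe cast, guarded by the preceding type check
    IO.println(s.length());
  }
  // the type pattern form declares the variable as part of the check
  if (obj instanceof String s) {
    IO.println(s.length());
  }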

I know I am not alone in finding this argument weak, and in finding the proposed syntax highly confusing. But it took a while, and an 8 page document, to figure out exactly why.

Type conversion in Java

In response to the JEP and subsequent discussions, I have written up a detailed proposal for type conversion casts and type conversion patterns. These allow developers to more clearly express the difference between type checks (that check the supertype/subtype relationship) and type conversions (where primitive types are changed to a different type).

The big idea is to introduce a new kind of cast, the type conversion cast that operates like a standard primitive type cast, but throws an exception when the conversion would be lossy.

  long val = createLong();
  int i = (int) val;   // cast long to int, potentially losing information by truncation
  int j = (~int) val;  // cast long to int, throwing TypeConversionException if lossy

As can be seen, the new kind of cast blends in well, but immediately offers safety benefits. Millions of primitive type casts in Java applications are intended to be fully safe, but are unchecked and could silently lose information. Simply adding one extra character ~ would upgrade them to a safer alternative.

The related type conversion pattern allows switch and instanceof to safely check for type conversions:

  long val = createLong();
  switch (val) {
    case ~int j -> IO.println("Long fits in an int");
    case long v -> IO.println("Long does not fit in an int");
  };

This allows pattern matching in switch for primitive types, but highlights the different things the pattern is checking.

  • Type patterns check the supertype/subtype relationship.
  • Type conversion patterns check point-to-point conversions, defined for primitive types or value types.

Please read the full proposal and provide feedback.

Comments at Reddit.

Tuesday, 20 February 2024

Pattern match Optional in Java 21

I'm going to describe a trick to get pattern matching on Optional in Java 21, but one you'll probably never actually use.

Using Optional

As of Java 21, pattern matching in Java allows us to check a value against a type, as with instanceof, with a new variable being declared of the correct type. Pattern matching can handle simple types and the deconstruction of records. But pattern matching of arbitrary classes like Optional is not yet supported. (Work to support pattern match methods is ongoing).
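
For example, record deconstruction already works, where Point and fetchValue() are just hypothetical stand-ins:

  record Point(int x, int y) {}

  Object obj = fetchValue();
  if (obj instanceof Point(int x, int y)) {
    // x and y are bound by deconstructing the record
    System.out.println(x + y);
  }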

In normal code, the best way to use Optional is with one of the functional methods:

  var addressOpt = findAddress(personId);
  var addressStr = addressOpt
        .map(address -> address.format())
        .orElse("No address available");

This works well in most cases. But sometimes you want to use the Optional with a return statement. This results in code using get() like this:

  var addressOpt = findAddress(personId);
  if (addressOpt.isPresent()) {
    // early return if address found
    return addressOpt.get().format();
  }
  // lots of other code to handle case when address not found

One way to improve this is to write a simple method:

  /**
   * Converts an optional to an iterable for use in the for-each statement.
   *
   * @param <T> the type of optional element
   * @param optional the optional
   * @return an iterable representation of the optional
   */
  public static <T> Iterable<T> inOptional(Optional<T> optional) {
    return optional.isPresent() ? List.of(optional.get()) : List.of();
  }

Which allows the following neat form:

  for (var address : inOptional(findAddress(personId))) {
    // early return if address found
    return address.format();
  }
  // lots of other code to handle case when address not found

This is a great approach providing that you don't need an else branch.

Using Optional with Pattern matching

With Java 21 and pattern matching we have a new way to do this!

  if (findAddress(personId).orElse(null) instanceof Address address) {
    // early return if address found
    return address.format();
  } else {
    // lots of other code to handle case when address not found
  }

This makes use of the fact that instanceof rejects null. Any expression can be used that creates an Optional - just pop .orElse(null) on the end.

Note that this does not work with Java 17 in most cases, because unconditional patterns were not permitted. (In most cases, the expression will return a value of the type being checked for, and the compiler rejects it as being "always true". Of course this was a simplification in earlier Java versions, as the null really needed to be checked for.)

Is this a useful trick?

In reality, this is of marginal use. Where I think it could be of use is in a long if-else chain, for example:

  if (obj instanceof Integer value) {
    // use value
  } else if (obj instanceof Long value) {
    // use value
  } else if (obj instanceof Double value) {
    // use value
  } else if (lookupConverter(obj).orElse(null) instanceof Converter conv) {
    // use converter
  } else {
    // even more options
  }

(It is valuable here as it avoids calling lookupConverter(obj) until necessary.)

Anyway, it is a fun trick even if you never use it!

One day soon I imagine we will use something like this:

  if (lookupConverter(obj) instanceof Optional.of(Converter conv)) {
    // use converter
  } else {
    // even more options
  }

which I think we'd all agree is the better approach.

Comments at Reddit

Monday, 4 November 2019

Java switch - 4 wrongs don't make a right

The switch statement in Java is being changed. But is it an upgrade or a mess?

Classic switch

The classic switch statement in Java isn't great. Unlike many other parts of Java, it wasn't properly rethought when pulling features across from C all those years ago.

The key flaw is "fall-through-by-default". This means that if you forget to put a break clause in each case, processing will continue on to the next case clause.

Another flaw is that variables are scoped to the entire switch, thus you cannot reuse a variable name in two different case clauses. In addition, the default clause is not required, which leaves readers of the code unclear as to whether a clause was forgotten or not.
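
For example, this sketch does not compile, because the first declaration is in scope for the whole switch block (code is just a hypothetical int):

 switch (code) {
   case 1:
     String message = "one";  // scoped to the entire switch block
     break;
   case 2:
     String message = "two";  // compile error: message is already defined
     break;
 }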

And of course there is also the key limitation - that the type to be switched on can only be an integer, enum or string.

 String instruction;
 switch (trafficLight) {
   case RED:
     instruction = "Stop";
   case YELLOW:
     instruction = "Prepare";
     break;
   case GREEN:
     instruction = "Go";
     break;
 }
 System.out.println(instruction);

The code above does not compile because there is no default clause, leaving instruction undefined. But even if it did compile, it would never print "Stop" due to the missing break. In my own coding, I prefer to always put a switch at the end of a method, with each clause containing a return to reduce the risks of switch.
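
For example, the style I prefer, sketched with the same hypothetical TrafficLight enum:

 private String instructionFor(TrafficLight trafficLight) {
   switch (trafficLight) {
     case RED:
       return "Stop";
     case YELLOW:
       return "Prepare";
     default:
       return "Go";
   }
 }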

Upgraded switch

As part of Project Amber, switch is being upgraded. But sadly, I'm unconvinced as to the merits of the new design. To be clear, there are some good aspects, but overall I think the solution is overly complex and with some unpleasant syntax choices.

The key aim is to add an expression form, where you can assign the result of the switch to a variable. This is rather like the ternary operator (eg. x != null ? x : ""), which is the expression equivalent of an if statement. An expression form would reduce problems like the undefined variable above, because it makes it more obvious that each branch must produce a value for the variable.

The current plan is to add not one, but three new forms of switch. Yes, three.

Explaining this in a blog post is, unsurprisingly, going to take a while...

  • Type 1: Statement with classic syntax. As today. With fall-through-by-default. Not exhaustive.
  • Type 2: Expression with classic syntax. NEW! With fall-through-by-default. Must be exhaustive.
  • Type 3: Statement with new syntax. NEW! No fall-through. Not exhaustive.
  • Type 4: Expression with new syntax. NEW! No fall-through. Must be exhaustive.

The headline example (type 4) is of course quite nice:

 // type 4
 var instruction = switch (trafficLight) {
   case RED -> "Stop";
   case YELLOW -> "Prepare";
   case GREEN -> "Go";
 };
 System.out.println(instruction);

As can be seen, the new syntax of type 3 and 4 uses an arrow instead of a colon. And there is no need to use break if the code consists of a single expression. There is also no need for a default clause when using an enum, because the compiler can insert it for you provided you've included all the known enum values. So, if you missed out GREEN, you would get a compile error.

The devil of course is in the detail.

Firstly, a clear positive. Instead of falling through by listing multiple labels, they can be comma-separated:

 // type 4
 var instruction = switch (trafficLight) {
   case RED, YELLOW -> "Stop";
   case GREEN -> "Go";
 };
 System.out.println(instruction);

Straightforward and obvious. And avoids many of the simple fall-through use cases.

What if the code to execute is more complex than an expression?

 // type 4
 var instruction = switch (trafficLight) {
   case RED -> "Stop";
   case YELLOW -> {
     revYourEngine();
     yield "Prepare";
   }
   case GREEN -> "Go";
 };
 System.out.println(instruction);

yield? Shrug. For a long time it was going to be break {expression}, but this clashes with labelled break (a syntax feature that is rarely used).

So what about type 2?

 // type 2
 var instruction = switch (trafficLight) {
   case RED: yield "Stop";
   case YELLOW:
     System.out.println("Prepare");
   case GREEN: yield "Go";
 };
 System.out.println(instruction);

Oops! I forgot the yield. So, an input of YELLOW will output "Prepare" and then fall-through to yield "Go".

So, why is it proposed to add a new form of switch that repeats the fall-through-by-default error from 20 years ago? The answer is orthogonality - a 2x2 grid with expression vs statement and fall-through-by-default vs no fall-through.

A key question is whether being orthogonal justifies adding an almost totally useless form of switch (type 2) to the language.

So, type 3 is fine then?

Well, no. Because of the insistence on orthogonality, and thus an insistence on copying the historic rules relating to type 1 statement switch, there is no requirement to list all the case clauses:

 // type 3
 switch (trafficLight) {
   case RED -> doStop();
   case GREEN -> doGo();
 }

So, what happens for YELLOW? The answer is nothing, but as a reader I am left wondering if the code is correct or incomplete. It would be much better if the above was a compile error, with developers forced to write a default clause:

 // type 3
 switch (trafficLight) {
   case RED -> doStop();
   case GREEN -> doGo();
   default -> {}
 }

The official argument is that since type 1 statement switch (the current one) does not force exhaustiveness, neither can the new type 3 statement switch. My view is that keeping a bad design from 20 years ago is a worse sin.

What else? Well, one thing to bear in mind is that expressions cannot complete early, thus there is no way to return directly from within a switch expression (type 2 or 4). Nor is there a way to continue/break a loop. Trust me when I say there is an endless supply of Java exam questions in the rules that actually apply.
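
A quick sketch of those restrictions, using a hypothetical loop over a list of lights - neither marked line compiles:

 String firstInstruction(List<TrafficLight> lights) {
   for (var light : lights) {
     var instruction = switch (light) {
       case RED -> { return "Stop"; }  // error: cannot return from inside a switch expression
       case YELLOW -> { continue; }    // error: cannot continue the loop from inside a switch expression
       case GREEN -> "Go";
     };
     System.out.println(instruction);
   }
   return "none";
 }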

Summarizing the types

Type 1: Classic statement switch

  • Statement
  • Fall-through-by-default
  • return allowed, also continue/break a loop
  • Single scope for variables
  • Logic for each case is a sequence of statements potentially ending with break
  • Not exhaustive - default clause is not required
  • yield is not allowed

Type 2: Classic syntax expression switch

  • Expression
  • Fall-through-by-default
  • return not allowed, cannot continue/break a loop
  • Single scope for variables
  • Logic for each case can be a yield expression, or a sequence of statements potentially ending with yield
  • Exhaustive - default clause is required
  • Must use yield to return values

Type 3: Arrow-form statement switch

  • Statement
  • Fall-through is not permitted
  • return allowed, also continue/break a loop
  • No variable scope problems, logic for each case must be a statement or a block
  • Not exhaustive - default clause is not required
  • yield is not allowed

Type 4: Arrow-form expression switch

  • Expression
  • Fall-through is not permitted
  • return not allowed, cannot continue/break a loop
  • No variable scope problems, logic for each case must be an expression or a block ending with yield
  • Exhaustive - default clause is required
  • Must use yield to return values, but only from blocks (it is implied when not a block)

Are you confused yet?

OK, I'm sure I didn't explain everything perfectly, and I may well have made an error somewhere along the way. But the reality is that it is complex, and there are lots of rules hidden in plain sight. Yes, it is orthogonal. But I really don't think that helps in comprehending the feature.

What would I do?

Type 4 switch expressions are fine (although I have real issues with the extension of the arrow syntax from lambdas). My problem is with type 2 and 3. In reality, those two types of switch will be very rare, and thus most developers will never see them. Given this, I believe it would be better to not include them at all. Once this is accepted, there is no point in treating the expression form as a switch, because it won't actually have many connections to the old statement form.

I would drop type 2 and 3, and allow type 4 switch expressions to become what is known as statement expressions. (Another example of a statement expression is a method call, which can be used as an expression or as a statement on a line of its own, ignoring any return value.)

 // Stephen's expression switch
 var instruction = match (trafficLight) {
   case RED: "Stop";
   case YELLOW: "Prepare";
   case GREEN: "Go";
 }
 // Stephen's expression switch used as a statement (technically a statement expression)
 match (instruction) {
   case "Stop": doStop();
   case "Go": doGo();
   default: ;
 }

My approach uses a new keyword match, as I believe extending switch is the wrong baseline to use. Making it a statement expression means that there is only one set of rules - it is always an expression, it is just that you can use it as though it were a statement. What you can't do with my approach is use return in the statement version, because it isn't actually a statement (you can't use return from any expression in Java today, so this would be no different).

Summary

If you ignore the complexity, and just use type 4 switch expressions, the new feature is quite reasonable.

However, in order to add the one form of switch Java needed, we've also got two other duds - type 2 and 3. In my view, the feature needs to go back to the drawing board, but sadly I suspect it is now too late for that.

Friday, 22 March 2019

User-defined literals in Java?

Java has a number of literals for creating values, but wouldn't it be nice if we had more?

Current literals

These are some of the literals we can write in Java today:

  • integer - 123, 1234L, 0xB8E817, 077, 0b1011_1010
  • floating point - 45.6f, 56.7d, 7.656e6
  • string - "Hello world"
  • char - 'a'
  • boolean - true, false
  • null - null

Project Amber is also considering adding multi-line and/or raw string literals.

But there are many other data types that would benefit from literals, such as dates, regex and URIs.

User-defined literals

In my ideal future, I'd like to see Java extended to support some form of user-defined literals. This would allow the author of a class to provide a mechanism to convert a sequence of characters into an instance of that class. It may be clearer to see some examples using one possible syntax (using backticks):

 Currency currency = `GBP`;
 LocalDate date = `2019-03-29`;
 Pattern pattern = `strata\.\w+`;
 URI uri = `https://blog.joda.org/`;

A number of semantic features would be required:

  • Type inference
  • Raw processing
  • Validated at compile-time

Type inference

Type inference is of course a key aspect of literals. It would have to work in a similar way to the existing literals, but with a tweak to handle the new var keyword. ie. these two would be equivalent:

 LocalDate date = `2019-03-29`;
 var date = LocalDate`2019-03-29`;

The type inference would also work with methods (compile error if ambiguous):

 boolean inferior = isShortMonth(`2019-04-12`);

 public boolean isShortMonth(LocalDate date) { return date.lengthOfMonth() < 31; }

Raw processing

Processing of the literal should not be limited by Java's escape mechanisms. User-defined literals need access to the raw string. Note that this is especially useful for regex, but would also be useful for files on Windows:

 // user-defined literals
 var pattern = Pattern`strata\.\w+`;
 // today
 var pattern = Pattern.compile("strata\\.\\w+");

Today, the `\` needs to be escaped, making the regex difficult to read.

Clearly, the problem with parsing raw literals is that there is no mechanism to escape. But the use cases for user-defined literals tend to have constrained formats, eg. a date doesn't contain random characters. So, although there might be edge cases where this would be a problem, they would very much be edge cases.

Validated at Compile-time

A key feature of literals is that they are validated at compile-time. You can't use an integer literal to create an int if the value is larger than the maximum allowed integer (2^31 - 1).
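
For example, the second of these lines is rejected by the compiler:

 int ok = 2147483647;   // Integer.MAX_VALUE - compiles
 int bad = 2147483648;  // compile error: integer number too large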

User-defined literals would also need to be parsed and validated at compile-time. Thus this code would not compile:

 LocalDate date = `2019-02-31`;

Most types which would benefit from literals only accept specific input formats, so being able to check this at compile time would be beneficial.

How would it be implemented?

I'm pretty confident that there are various ways it could be done. I'm not going to pick an approach, as ultimately those that control the JVM and language are better placed to decide. Clearly though, there is going to need to be some form of factory method on the user class that performs the parse, with that method invoked by the compiler. And ideally, the results of the parse would be stored in the constant pool rather than re-parsed at runtime.
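
As a purely illustrative sketch - the @LiteralFactory annotation below is invented for this example and nothing like it exists today - the factory method might look like this:

 import java.lang.annotation.*;

 // hypothetical marker telling the compiler which static factory to invoke at compile time
 @Retention(RetentionPolicy.CLASS)
 @Target(ElementType.METHOD)
 @interface LiteralFactory {}

 public final class Isbn {
   private final String value;

   private Isbn(String value) { this.value = value; }

   // the literal is rejected at compile time if this method throws
   @LiteralFactory
   public static Isbn parse(String raw) {
     if (!raw.matches("\\d{13}")) {
       throw new IllegalArgumentException("invalid ISBN literal: " + raw);
     }
     return new Isbn(raw);
   }
 }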

What I would say is that user-defined literals would almost be a requirement for making value types usable, so something like this may be on the way anyway.

Summary

I like literals. And I would really like to be able to define my own!

Any thoughts?

Wednesday, 31 October 2018

Should you adopt Java 12 or stick on Java 11?

Should you adopt Java 12 or stick on Java 11 for the next 3 years? Seems like an innocuous question, but it is one of the most important decisions out there for those running on the JVM. I'll try to cover the key aspects of the decision, with the assumption that you care about running with the latest set of security patches in production.

TL;DR: it is vital to fully understand and accept the risks before adopting Java 12.

The Java release train

There is now a new release of Java every six months, so Java 12 is less than five months away despite Java 11 having just been released. As part of the process of moving to more frequent releases, certain releases are designated to be LTS (long term support) and as such will have security patches available for four years or more. This makes them "major" releases, not because they have a bigger feature set but because they have multi year support.

It is expected that Java 11 patches (11.0.1, 11.0.2, 11.0.3, etc.) will be smaller and simpler than Java 8 updates (8u20, 8u40, 8u60, etc.) - Java 11 updates will be more focused on security patches, without the internal enhancements of Java 8 updates. Instead, Oracle want us to think of Java 12, 13, 14 etc. as small upgrades, similar to an imaginary Java 11u20, 11u40 etc. To be blunt, I find this nonsensical.

Senior Oracle employees have repeatedly argued that updates such as 8u20 and 8u40 often broke code. This was not my experience. In fact my experience was that update releases primarily contained bug fixes. The only break I can remember was the addition of --allow-script-in-comments to Javadoc, which isn't a core part of Java. As a result, I have never feared picking up the latest update release - and this has been a core benefit of the Java platform.

Drilling down into why update releases tend to cause no problems, let's examine the differences between release types:

Model       Old model                                      New model
Upgrade     Java major releases    Java update releases    Java release train    Java patches
Frequency   Every 3 years or so    Every 6 months          Every 6 months        Every 3 months
Versions    6 -> 7 -> 8            8 -> 8u20 -> 8u40       11 -> 12 -> 13        11 -> 11.0.1 -> 11.0.2

The table also compares the kinds of change each release type can contain: language changes, JVM changes, major enhancements, added classes/methods, removed classes/methods, new deprecations, internal enhancements, JDK tool changes, bug fixes and security patches. Old major releases and the new release train can contain essentially all of these (the release train adds API removals); old update releases were limited to internal enhancements, JDK tool changes, bug fixes and security patches; and the new patch releases are limited to bug fixes and security patches.

Given the table above, I find it amazing that anyone would claim moving from 11 to 12 to 13 is anything like moving from 8u20 to 8u40. But that is the official Oracle viewpoint:

Going from Java 9->10->11 is closer to going from 8->8u20->8u40 than from 7->8->9.
Oracle FAQ

As the table clearly shows, each version in the Java release train can contain any change traditionally associated with a full major version. These include language changes and JVM changes, both of which have major impacts on IDEs, bytecode libraries and frameworks. In addition, there will not only be additional APIs, but also API removals (something that did not happen prior to Java 9).

Oracle's claim is that because each release is only 6 months after the previous one, there won't be as much "stuff" in it, thus it won't be as hard to upgrade. While true, it is also irrelevant. What matters is whether the upgrade has the potential to damage your code stack or not. And clearly, going from 11 -> 12 -> 13 has much greater potential for damage than 8 -> 8u20 -> 8u40 ever did.

The key differences compared to updates like 8u20 -> 8u40 are the changes to the bytecode version and the changes to the specification. Changes to the bytecode version tend to be particularly disruptive, with most frameworks making heavy use of libraries like ASM or ByteBuddy that are intimately linked to each bytecode version. And moving from 8u20 -> 8u40 still had the same Java SE specification, with all the same classes and methods, something that cannot be relied on moving from 12 to 13. I simply do not accept Oracle's argument that the "amount of stuff" in a release is more significant than the "type of stuff".

Note however that another one of Oracle's claims really does matter. They point out that if you stick with Java 11 and plan to move to the next LTS version when it is released (ie. Java 17), you might find your code doesn't compile. Remember that the Java development rules now state that an API method can be deprecated in one version and removed in the next one. Rules that do not take LTS releases into account. So, a method could be deprecated in 13 and removed in 15. Someone upgrading from 11 to 17 would simply find a deleted API having never seen the deprecation. Let's not panic too much about removal though - the only APIs likely to be removed are specialist ones, not those in widespread use by application code.

Considerations before adopting the release train

In this section, I try to outline some of the considerations/risks that must be considered before adopting the release train.

Locked in to the train
If you adopt Java 12 and use a new language feature or new API, then you are effectively locking your project in to the release train. You have to adopt Java 13, 14, 15, 16 and 17. And you have to adopt each new release within one month of the next release coming out.

Remember that with the new release train, each release has a lifetime of six months, and is obsolete just seven months after release. That is because there will be only six months of security patches for each release, the first patch 1 month after release and the second 4 months after release. After 7 months, the next set of security patches come out but the older release will not get them.

Do your processes allow for a Java upgrade, any necessary bug fixing, testing and release within that narrow 1 month time window? Or are you willing to run in production on a Java version below the security baseline?

Upgrade blockers
There are many possible things that can block an upgrade of Java. Let's make a list of some of the common ones.

Insufficient development resources: Your team may get busy, or be downsized, or the project may go to production and the team disbanded. Can you guarantee that development time will be available to do the upgrade from Java 15 to 16 in two years time?

Build tools and IDEs: Will your IDE support each new version on the day of release? Will Maven? Gradle? Do you have a backup plan if they don't? Remember, you only have 1 month to complete the upgrade, test it and get it released to production. Under this section other tools include Checkstyle, JaCoCo, PMD, SpotBugs and many more.

Dependencies: Will your dependencies all be ready for each new version? Quickly enough for you to meet the 1 month deadline? Remember, it is not just your direct dependencies, but everything in your stack. Bytecode manipulation libraries are particularly affected for example, such as ByteBuddy and ASM.

Frameworks: Another kind of dependency, but a large and important one. Will Spring produce a new version every six months, within the narrow one month time window? Will Jakarta EE (formerly Java EE)? What happens if they don't?

Now the traditional approach to any of these blockers was to wait. With versions of Java up to 8, a common approach was to wait 6 to 12 months before starting the upgrade to give tools, libraries and frameworks the chance to fix any bugs. But of course the waiting approach is incompatible with the release train.

Cloud / Hosting / Deployment
Do you have control of where and how your code runs in production? For example, if you run your code in AWS Lambda you do not have control. AWS Lambda has not adopted Java 9 or 10, and they don't even have Java 11 even though it is over a month after release. Unless AWS give a public guarantee to support each new Java version, then you simply can't adopt Java 12. (My working assumption is that AWS Lambda will only support major LTS versions, supported by the Amazon Corretto JDK announcement.)

What about hosting of your CI system? Will Jenkins, Travis, Circle, Shippable, GitLab be updated quickly? What do you do if they are not?

Predicting the future
Perhaps you have read through the list above and are happy your code and processes today can cope. Great! But it is critical to understand that you are also restricting your ability to change in the future.

For example, maybe your code doesn't run on AWS Lambda today. But are you willing to say you can't do so for the next three years?

Planning for the release train

If you are considering adopting the release train, I recommend preparing a list of all the things you depend on now, or might depend on in the next 3 years. You need to be confident that everything on that list will work correctly and be upgraded along with the release train, or have a plan if that dependency is not upgraded. The list for my day job is something like this:

  • Amazon AWS
  • Eclipse
  • IntelliJ
  • Travis CI
  • Shippable CI
  • Maven
  • Maven plugins (compile, jar, source, javadoc, etc)
  • Checkstyle, & associated IDE plugins and maven plugin
  • JaCoCo, & associated IDE plugins and maven plugin
  • PMD, & associated maven plugin
  • SpotBugs, & associated maven plugin
  • OSGi bundle metadata tool
  • Bytecode tools (Byte buddy / ASM etc)
  • Over 100 jar file dependencies

And I've probably forgotten something.

Don't get me wrong. I think it is perfectly possible to make a choice to say that you are willing to take the risk. That the benefits of new language features, and probable enhanced performance, make the effort worthwhile. But I strongly believe it is more risky than remaining on Java 11.

A middle ground?

One possible middle ground is to develop your application for Java 11, but run it in production on Java 12, 13, 14 etc. as soon as they come out. Sadly, this approach is less viable than it should be.

The removal of APIs and changes to the bytecode version add uncertainty to the stack. Even if your code doesn't use one of the removed APIs, one of your libraries might. Or a bytecode manipulation library might need upgrading, with knock on effects. So while the middle ground is a possible fallback if you get stuck, it is far from a no-risk solution.

Some additional links

Spring framework has expressed its policy wrt Java 12 in a video. The key sections are:

Java 8 and 11 as the LTS branches officially supported from our end. Best efforts support for the releases in between. ... if you intend to upgrade to 12 ... we are very willing to work with you ... but they are not officially production supported. ... The long term support releases are what we are primarily focussed on. Java 12 and higher will be best effort from our side.

As an example of a typical software vendor, Liferay states:

Liferay has decided we will not certify every single major release of the JDK. We will instead choose to follow Oracle's lead and certify only those marked for LTS.
Liferay blog

Oracle's official "misconceptions" slide about the new release model.

Summary

I'm sure some development teams will adopt the Java release train. My hope is that they do so with their eyes wide open. I know we won't be adopting the release train at my day job any time soon, a key blocker being our use of AWS Lambda, but I'd be concerned about all the other points too.

Feel free to leave a comment, especially if you think I've missed any points that should be on the considerations list.

Wednesday, 26 September 2018

Oracle's Java 11 trap - Use OpenJDK instead!

TL;DR: Java is still available at zero-cost, you just need to stop using Oracle JDK and start using an OpenJDK build, such as this one or this one.

The trap

Java 11 has been released. It is a major release because it has long-term support (LTS). But Oracle have also set it up to be a trap (either deliberately or accidentally).

For 23 years, developers have downloaded the JDK from Oracle and used it for $free. Type "JDK" into your favourite search engine, and the top link will be to an Oracle Java SE download page (I'm deliberately not providing a link). But that search and that link is now a trap.

Oracle JDK, the one all web searches take you to, is now commercial not $free.

The key part of the terms is as follows:

You may not: use the Programs for any data processing or any commercial, production, or internal business purposes other than developing, testing, prototyping, and demonstrating your Application;

The trap is as follows:

  1. Download Oracle JDK (because that is what you've always done, and it is what the web-search tells you)
  2. Use it in production (because you didn't realise the license changed)
  3. Get a nasty phone call from Oracle's license enforcement teams demanding lots of money

In other words, Oracle can rely on inertia from Java developers to cause them to download the wrong (commercial) release of Java. Unless you read the text/warnings/legalese very carefully you might not even realise Oracle JDK is now commercial, and that you are therefore liable to pay Oracle for using this particular JDK in production.

(Update, 2018-10-03: Searches for Java 11 and JDK 11 now seem to be resolving to OpenJDK builds, not commercial ones!)

Is this trap malicious behaviour on the part of Oracle? Readers will have their own opinions. I do suggest bearing in mind that Oracle invests huge amounts in developing Java, so it is reasonable to have a commercial plan available for those that want it. And they do provide a $free alternative completely valid for commercial use...

The solution

The solution is simple!

Use an OpenJDK build.

There are many different $free OpenJDK builds of Java 11, so you need to choose the one that best fits your needs.

The Adoptium (formerly AdoptOpenJDK) build is $free, GPL licensed (with Classpath exception so safe for commercial use), and a good choice as it is vendor-neutral and is intended to have 4+ years of security patches.

Download $free Java from Adoptium here

The OpenJDK build from Oracle is $free, GPL licensed (with Classpath exception so safe for commercial use), and provided alongside their commercial offering. It will only have 6 months of security patches, after that Oracle intends you to upgrade to Java 12.

Download $free Java from Oracle here

Many more OpenJDK builds are available, including ones available via your package manager. See this post for a list covering the wide variety of OpenJDK builds. And see my post on zero-cost Java for background info.

Download other OpenJDK builds here

And for a counterpoint, see Marcus' great summary of why the underlying changes here are actually good news.

Summary

Do NOT download or use the Oracle JDK unless you intend to pay for it.

For Java 11, download and use an OpenJDK build, from AdoptOpenJDK, Oracle or elsewhere.

(No comments on this post. There are plenty of other places to express opinions.)

Thursday, 20 September 2018

Java release chains - Splitting features from security

There is now a Java release every 6 months - March and September. It started with Java 9 and we're about to get Java 11. But should you jump on the release train? To answer that, we need to look at how Java's release chains are being split.

Looking back at Java 8

In the olden days life was simple. There was a "major" Java release every few years and it contained lots of new features, for example Java 5, 6, 7 and 8. Each major release included new JDK methods, new JDK classes, deprecations, new JVM features and new language features. However, life wasn't actually as simple as it seemed.

Looking at Java 8, once it was released there was a regular frequency of "update" releases. The most well-known of these were 8u20, 8u40 and 8u60. But there were also many others - 8u5, 8u11, 8u25, 8u31, 8u45, 8u51, 8u65, 8u66, 8u71, 8u73, 8u74, 8u77, etc. So, what was going on?

Well the plan was quite simple, just not that widely known. 8u20, 8u40, 8u60 and so on were "feature" releases, while all the rest were security patch releases. See the full table.

  • 8u20, 8u40, 8u60 and so on were "feature" releases - every six months
  • 8u5, 8u11, 8u25, 8u31 and so on were "security" releases - every three months, plus additional emergency releases

If you look closely, you can see a pattern. The first security release after a feature release had a number 5 greater (8u25 is 5 greater than 8u20). The second security release after a feature release had a number 11 greater (8u31 is 11 greater than 8u20). This left space for emergency security releases like 8u66.

So what was a Java 8 feature release?

Well a feature release was allowed to contain anything that didn't impact the Java SE specification. So, JVM or tool enhancements might be allowed, particularly if covered by a flag that was disabled by default. For example, the "endorsed-standards override mechanism and the extension mechanism" was deprecated in 8u40, 8u60 added a new IBM character set, and 8u181 removed the Derby database from the JDK bundle.

If you did notice these feature releases, it might have been because Java 8 lambdas were pretty buggy until 8u40 (in my experience). As such, I've often commented that 8u40 should have been named 8.1 (although logically 8u20 would have been 8.1 and 8u40 would have been 8.2). Such a version numbering scheme would have been easy to grasp for the whole Java ecosystem, but sadly didn't happen with Java 8.

Java 11 and the release train

The observant will notice that there was a feature release every six months and a security release every three months. But this is exactly like the "new" release train post Java 9!

The key difference between then and now is that from Java 9 onwards each six-monthly "feature" release can now contain any change. Thus major JVM, library and language changes can be introduced every six months, which wasn't allowed before. It also means there is a new Java SE specification for each feature release.

To handle this high frequency and still provide stability, every three years, which is every six releases, there will be an "LTS" release - long term support. As I've discussed before, the LTS name is rather confusing given that Oracle will not be providing security patches beyond six months on a "long-term" LTS release for $free. (Read the linked blog to understand how the OpenJDK will be performing that role from now on.)

Obviously with this new release schedule the question of naming arose. Firstly, according to Oracle we are not supposed to refer to any release as a "major" release. Instead, all should be considered to be "feature" releases, and of equal importance. This is emphasised by using a single incrementing number - 9, 10, 11, 12, 13 and so on.

But are all releases really of equal importance when some have long-term support (LTS)?

The answer is clearly no. For many large companies, the LTS releases are the only ones that they will consider adopting, either to manage churn or because they insist on purchasing a support contract for Java. Thus only one conclusion can be drawn:

Java 11 is a major release because it has LTS, not because of its feature set. Java 12, 13 and so on are less significant simply because they do not have LTS.

Now this doesn't mean that Java 12 is any less tested or is in some way unsuited for general use. Nor does it mean that Java 11 is bigger than any of the other versions. It is simply a statement of fact that because Java 11 has LTS (long-term support) it will be more significant, and see adoption by a different type of user.

Release chains

The support model is built on chains of releases. In order to pick up a security patch, you may also have to pickup other changes. The support chain for any Java release is simple enough. The higher the number, the more fixes/features it contains. Thus 8u20 contains everything from 8u5 and 8u11.

Using bold for major/feature releases and italic for security patches we can see the pattern:

  • 8, 8u5, 8u11, 8u20, 8u25, 8u31, 8u40, 8u45, 8u51, 8u60, 8u65, 8u71, ...

From Java 9 onwards there are effectively two separate patterns.

Firstly, there is the "release train" pattern if you want both features and security patches:

  • 9, 9.0.1, 9.0.4, 10, 10.0.1, 10.0.2, 11, 11.0.1, 11.0.2, 12, 12.0.1, 12.0.2, ...

Secondly, there is the "LTS" pattern if you only want security patches:

  • 11, 11.0.1, 11.0.2, 11.0.3, 11.0.4, 11.0.5, 11.0.6, 11.0.7, ...

Oracle is focussing its $free efforts on the first "release train" pattern. After 11.0.2, it is ultimately the job of the OpenJDK community (probably led by Red Hat) to keep the chain in the second "LTS" pattern going.

Note the key difference though. The two chains effectively split features from security.

In Oracle's view, the old Java 8 chain and the new "release train" chain are essentially equivalent. I disagree, but I need another blog article to explain why (as this one is already too long!).

Summary

There are now two separate chains of releases - the "release train" with both features and security patches, and the "LTS" with just security patches. Neither chain is equivalent to how it worked in Java 8, so you need to choose which chain to adopt carefully.

Comments welcome. For example, did an 8u feature release break your code?

Thursday, 6 September 2018

From Java 8 to Java 11

Moving from Java 8 to Java 11 is trickier than most upgrades. Here are a few of my notes on the process.

(And here are a couple of other blogs - Benjamin Winterberg and Leonardo Zanivan.)

Modules

Java 9 introduced one of the largest changes in the history of Java - modules. Much has been said on the topic, by me and others. A key point is sometimes forgotten however:

You do not have to modularise your code to upgrade to Java 11.

In most cases, code running on the classpath will continue to run on Java 9 and later where modules are completely ignored. This is terrible for library authors, but great for application developers.

So my advice is to ignore modules as much as you can when upgrading to Java 11. Turning your application into Java modules may be a useful thing to consider in a few years time when open source dependencies really start to adopt modules. Right now, attempting to modularise is just painful as few dependencies are modules.

(The main reason I've found to modularise your application is to be able to use jlink to shrink the size of the JDK. But in my opinion, you don't need to fully modularise to do this - just create a single jar-with-dependencies with a simple no-requires no-exports module-info.)
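
A minimal sketch of that module-info, with a hypothetical module name:

 // module-info.java - java.base is required implicitly; nothing is exported
 module com.example.app {
 }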

Deleted parts of the JDK

Some parts of the JDK have been removed. These were parts of Java EE and Corba that no longer fitted well with the JDK, or could be maintained elsewhere.

If you use Corba then there is little anyone can do to help you. However, if you use the Java EE modules then the fix for the deleted code should be simple in most cases. Just add the appropriate Maven jars.

On the Java client side, things are more tricky with the removal of Java WebStart. Consider using Getdown or Update4J instead.

Unsafe and friends

Sun and Oracle have been telling developers for years not to use sun.misc.Unsafe and other sharp-edge JDK APIs. For a long time, Java 9 was to be the release where those classes disappeared. But this never actually happened.

What you might get with Java 11 however is a warning. This warning will only be printed once, on first access to the restricted API. It is a useful reminder that your code, or a dependency, is doing something "naughty" and will need to be fixed at some point.

What you will also find is that Java 11 has a number of new APIs specifically designed to avoid the need to use Unsafe and friends. Make it a priority to investigate those new APIs if you are using an "illegal" API. For example, Base64, MethodHandles.privateLookupIn, MethodHandles.Lookup.defineClass, StackWalker and Variable Handles.
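
For example, an atomic field update that might once have reached for Unsafe can use a VarHandle instead (Counter is just a hypothetical class):

 import java.lang.invoke.MethodHandles;
 import java.lang.invoke.VarHandle;

 public class Counter {
   private volatile int count;

   private static final VarHandle COUNT;
   static {
     try {
       COUNT = MethodHandles.lookup().findVarHandle(Counter.class, "count", int.class);
     } catch (ReflectiveOperationException ex) {
       throw new ExceptionInInitializerError(ex);
     }
   }

   public int incrementAndGet() {
     // atomic add without sun.misc.Unsafe
     return (int) COUNT.getAndAdd(this, 1) + 1;
   }
 }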

Tooling and Libraries

Modules and the new six-monthly release cycle have conspired to have a real impact on the tooling and libraries developers use. Some projects have been able to keep up. Some have struggled. Some have failed.

When upgrading to Java 11, a key task is to update all your dependencies to the latest version. If there hasn't been a release since Java 9 came out, then that dependency may need extra care or testing. Make sure you've updated your IDE too.

But it is not just application dependencies that need updating, so does Maven (and Gradle too, but I don't know much about Gradle myself). Most Maven plugins have changed major versions to a v3.x, and upgrading Maven itself to v3.5.4 is also beneficial.

Sadly, the core maven team is very small, so there are still some bugs/issues to be solved. However, if your Maven build is sensible and simple enough, you should generally be OK. But do note that upgrading a plugin from a v2.x to a v3.x may involve changes to configuration beyond that just associated with modules. For example, the Maven Javadoc plugin has renamed the argLine property.

A key point to note is the way Maven operates with modules. When the Maven compiler or surefire plugin finds a jar file that is modular (ie. with a module-info.class) it can place that jar on the modulepath instead of the classpath. As such, even though you might intend to run the application fully on the classpath, Maven might compile and test the code partly on the classpath and partly on the modulepath. At present, there is nothing that can be done about this.

Sometimes your build will need some larger changes. For example, you will need to change Findbugs to SpotBugs. And change Cobertura to JaCoCo.

If you use Web Start, there is an open source alternative - OpenWebStart.

These build changes may take some time - they did for me. But the information available by a simple web search is increasing all the time.

Summary

I've upgraded a number of Joda/ThreeTen projects to support Java 9 or later now. It was very painful. But that is because as a library author I have to produce a jar file with module-info that builds and runs on Java 8, 9, 10 and 11. (In fact some of my jar files run on Java 6 and 7 too!)

Having done these migrations, my conclusion is that the pain is primarily in maintaining compatibility with Java 8. Moving an application to Java 11 should be simpler, because there is no need to stay tied to Java 8.

Comments welcome, but note that most "how to" questions should be on Stack Overflow, not here!

Monday, 3 September 2018

Time to look beyond Oracle's JDK

From Java 11, it's time to think beyond Oracle's JDK. Time to appreciate the depth of the ecosystem built on OpenJDK. Here is a list of some key OpenJDK builds.

This is a quick follow-up to my recent zero-cost Java post.

OpenJDK builds

In practical terms, there is only one set of source code for the JDK. The source code is hosted in Mercurial at OpenJDK.

Anyone can take that source code, produce a build and publish it on a URL. But there is a distinct certification process that should be used to ensure the build is valid.

Certification is run by the Java Community Process, which provides a Technology Compatibility Kit (TCK, sometimes referred to as the JCK). If an organization produces an OpenJDK build that passes the TCK then that build can be described as "Java SE compatible".

Note that the build cannot be referred to as "Java SE" without the vendor getting a commercial license from Oracle. For example, builds from AdoptOpenJDK that pass the TCK are not "Java SE", but "Java SE compatible" or "compatible with the Java SE specification". Note also that certification is currently on a trust-basis - the results are not submitted to the JCP/Oracle for checking and cannot be made public. See Volker's excellent comment for more details.

To summarise, the OpenJDK + Vendor process turns one sourcebase into many different builds.

In the process of turning the OpenJDK sourcebase into a build, the vendor may, or may not, add some additional branding or utilities, provided these do not prevent certification. For example, a vendor cannot add a new public method to an API, or a new language feature.

Oracle JDK

http://www.oracle.com/technetwork/java/javase/downloads/

From Java 11 this is a branded commercial build with paid-for support. It can be downloaded and used without cost only for development use. It cannot be used in production without paying Oracle (so it is a bit of a trap for the unwary). Oracle intends to provide full paid support until 2026 or later (details). Note that unlike in the past, the Oracle JDK is not "better" than the OpenJDK build (provided both are at the same security patch level). See here for more details of the small differences between Oracle JDK and the OpenJDK build by Oracle.

OpenJDK builds by Oracle

http://jdk.java.net/

These are $free pure unbranded builds of OpenJDK under the GPL license with Classpath Exception (safe for use in companies). These builds are only available for the first 6 months of a release. For Java 11, the expectation is there will be Java 11.0.0, then two security patches 11.0.1 and 11.0.2. To continue using the OpenJDK build by Oracle with security patches, you would have to move to Java 12 within one month of it being released. (Note that the provision of security patches is not the same as support. Support involves paying someone to triage and act upon your bug reports.)

Eclipse Temurin by Adoptium (formerly AdoptOpenJDK)

https://adoptium.net/

These are $free pure unbranded builds of OpenJDK under the GPL license with Classpath Exception. Unlike the OpenJDK builds by Oracle, these builds will continue for a much longer period for major releases like Java 11. The Java 11 builds will continue for 4 years, one year after the next major release (details). Adoptium is a project at Eclipse backed by some major industry companies. They will provide builds provided that other groups create and publish security patches in a source repository at OpenJDK.

Red Hat OpenJDK builds

Red Hat provides builds of OpenJDK via Red Hat Enterprise Linux (RHEL) which is a commercial product with paid-for support (details). They are very good at providing security patches back to OpenJDK, and Red Hat has run the security updates project for Java 6 and 7. The Red Hat build is integrated better into the operating system, so it is not a pure OpenJDK build (although you wouldn't notice the difference).

Other Linux OpenJDK builds

Different Linux distros have different ways to access OpenJDK. Here are some links for common distros: Debian, Fedora, Arch, Ubuntu.

Azul Zulu

https://www.azul.com/downloads/zulu/

Zulu is a branded build of OpenJDK with commercial paid-for support. In addition, Azul provides some Zulu builds for $free as "Zulu Community", however there are no specific commitments as to the availability of those $free builds. Azul has an extensive plan for supporting Zulu commercially, including plans to support Java 9, 13 and 15, unlike any other vendor (details).

Amazon Corretto

https://aws.amazon.com/corretto/

In preview from 2018-11-14, this is a zero-cost build of OpenJDK with long-term support that passes the TCK. It is under the standard GPL+CE license of all OpenJDK builds. Amazon will be adding their own patches and running Corretto on AWS, so it will be very widely used. Java 8 support will be until at least June 2023.

IBM Semeru (formerly AdoptOpenJDK OpenJ9 builds)

https://developer.ibm.com/languages/java/semeru-runtimes/

In addition to the standard OpenJDK builds, AdoptOpenJDK offered builds with OpenJ9 instead of HotSpot. OpenJ9 was originally IBM's JVM, but OpenJ9 is now Open Source at Eclipse. These builds are now available from IBM directly. They also provide commercial paid-for support for the AdoptOpenJDK builds with OpenJ9.

SAP

https://sap.github.io/SapMachine/

SAP provides a JDK for Java 10 and later under the GPL+CE license. They also have a commercial closed-source JVM. Information on support lifetimes can be found here.

Liberica

https://bell-sw.com/java.html

Liberica from Bellsoft is a $free TCK verified OpenJDK distribution for x86, ARM32 and ARM64. Information on support can be found here.

ojdkbuild project

https://github.com/ojdkbuild/ojdkbuild

A community project sponsored by Red Hat that produces OpenJDK builds. They are the basis of Red Hat's Windows builds. The project has no customer support but does intend to provide builds so long as Red Hat supports a Java version. No information is provided about the TCK.

Others

There are undoubtedly other builds of OpenJDK, both commercial and $free. Please contact me if you'd like me to consider adding another section.

Summary

There are many different builds of OpenJDK, the original upstream source repository. Each build offers its own unique take - $free or commercial, branded or unbranded.

Choice is great. But if you just want the "standard", currently my best advice is to use the OpenJDK builds by Oracle, Adoptium builds or the one in your Operating System (Linux).

Tuesday, 28 August 2018

Java is still available at zero-cost

The Java ecosystem has always been built on a high quality $free (zero-cost) JDK available from Oracle, and previously Sun. This is as true today as it always has been - but the new six-monthly release cycle does mean some big changes are happening.

Six-monthly releases

Java now has a release every six months, something which greatly impacts how each version is supported. By support, I mean the provision of update releases with security patches and important bug fixes.

Up to and including Java 8, $free security updates were provided for many years. Certainly up to and beyond the launch of the next version. With Java 9 and the six-monthly release cycle, this $free support is now much more tightly controlled.

In fact, Oracle will not be providing $free long-term support (LTS) for any single Java version at all from Java 11 onwards.

Version   Release date   End of $free updates from Oracle
Java 8    March 2014     January 2019 (for commercial use)
Java 9    Sept 2017      March 2018
Java 10   March 2018     Sept 2018
Java 11   Sept 2018      March 2019 (might be extended, see below)
Java 12   March 2019     Sept 2019

The idea here is simple. Oracle wants to focus its energy on moving Java forward with the cost of long-term support directly paid for by customers (instead of giving it away for $free). To do this, they need developers to continually upgrade their version of Java, moving version every six months (and picking up the patch releases in-between). Of course, for most development shops, such rapid upgrade is not feasible. But Java is now developed as OpenJDK, which means that Oracle's support dates are not the only ones to consider.

OpenJDK

A key point to grasp is that most JDK builds in the world are based on the open source OpenJDK project. The Oracle JDK is merely one of many builds that are based on the OpenJDK codebase. While it used to be the case that Oracle had additional extras in their JDK, as of Java 11 this is no longer the case.

Many other vendors also provide builds based on the OpenJDK codebase. These builds may have additional branding and/or additional non-core functionality. Most of these vendors also contribute back to the upstream OpenJDK project, including the security patches.

The impact is that the JDK you use should now be a choice you actively make, not passively accept. How fast can you get security patches? How long will it be supported? Do you need to be able to apply contractual pressure to a vendor to help with any issues?

In addition, there are two main ways that the JDK is obtained. The first is an update mechanism built into the operating system (eg. *nix). The second is to visit a URL and download a binary (eg. Windows).

To examine this further, let's look at Java 8 and Java 11 separately.

Staying on Java 8

If you want to stay on Java 8 after January 2019, here are the choices as I see them:

1) Don't care about security.

It is entirely possible to remain on the last $free release forever. Just don't complain when hackers destroy your company.

2) Rely on Operating System updates.

On *nix platforms, you may well obtain your JDK via the operating system (eg. Red Hat, Debian, Fedora, Arch, etc.). And as such, updates to the JDK are delivered via the operating system vendor. This is where Red Hat's participation is key - they promise Java 8 updates until June 2023 in Red Hat Enterprise Linux - but they also have an "upstream first" policy, meaning they prefer to push fixes back to the "upstream" OpenJDK project. Whether you get security patches to the JDK or not will depend on your operating system vendor, and whether they need you to pay for those updates.

3) Pay for support.

A number of companies, including Azul, IBM, Oracle and Red Hat, offer ongoing support for Java. By paying them, you get access to the stream of security patches and update releases with certain guarantees (as opposed to volunteer-led approaches). If you have cash, maybe it is fair and reasonable to pay for Java?

4) Use the non-commercial build in a commercial setting.

Oracle will provide builds of Java 8 for non-commercial use until December 2020, so you could use those. But you don't want Oracle's software licensing teams chasing you, do you?

5) Build OpenJDK yourself.

The stream of security patches * is published to a public Mercurial repository under the GPL license. As such, it is perfectly possible to build OpenJDK yourself by keeping track of commits to that repository. I suspect this is not a very realistic choice for most companies.

6) Use the builds from Adoptium (formerly AdoptOpenJDK).

Update 2021-11-04: AdoptOpenJDK is now Adoptium. The links below have not been updated as there is not a matching page in all cases.

The community team at AdoptOpenJDK has been busy over the past few years creating a build farm and testing rig. As such, they are now able to take the stream of security patches * and turn them into releases, just like you would get from the commercial offerings. They also intend to run the Java TCK (Technology Compatibility Kit) to allow these builds to be fully certified as compatible with the Java SE specification. Their plan is to produce Java 8 builds until September 2023 or later (two years after Java 17 comes out). Obviously, you don't get a warranty or genuine support - it's a community build farm project. But for most users that want to use Java 8 without paying, this is likely the best choice.

Note that Azul also offers $free OpenJDK release builds under the name Zulu. A key advantage of Azul's offering is that you can pay for support later if you need it without changing your JDK.

* The last two options assume that a group actually will step forward and take over the "JDK 8 updates" OpenJDK project once Oracle stop. While the exact project details are not yet confirmed, this IBM statement indicates real backing for the approach:

Recognizing the impact that the release cycle changes will have with Java developers, IBM will partner with other members of the OpenJDK community to continue to update an OpenJDK Java 8 stream with security patches and critical bug fixes. We intend to keep the current LTS version secure and high quality for 4 years. This timescale bridges the gap between LTS versions with 1 year to allow for a migration period. IBM has also invested in an open build and test project (AdoptOpenJDK.net) along with many partners and Java leaders to provide community binaries across commonly used platforms of OpenJDK with Hotspot and OpenJDK with Eclipse OpenJ9. These community binaries are TCK (Java SE specification) compliance tested and ready for developers to download and use in production.
IBM statement

And here is further indication of Red Hat's support for the June 2023 date, based on their "upstream first" policy.

> Red Hat has said it may step forward to be the maintainer for JDK 11 - might it also step forward to be the maintainer for JDK 8?
Yes.
> How long will JDK 8 be maintained?
June 2023 is right for JDK 8, but I wouldn't be surprised if it goes on beyond that.
Andrew Haley, Red Hat

And finally, here is the official Oracle view on that transition.

Note that the process around security issues will be managed by the newly formed vulnerability group (which formally codifies what was happening anyway).

Staying on Java 9 or Java 10

Don't.

No-one wants to provide builds or support for Java 9 or Java 10. And anyway, there is no good reason not to upgrade to Java 11 in my opinion.

(Actually, Azul are providing medium-term support for Java 9, if you really need it.)

Staying on Java 11

It is a brave new world and not 100% clear, but this is how things look likely to play out.

Firstly, it is not clear that there will be an Oracle JDK that is $free to download. Despite my best attempts, I could not get 100% clarity on this point. However, this tweet and other sources indicate that it will be accessible for development purposes, just not for use in production (which is a bit of a trap for the unwary). But in reality, it is now irrelevant whether Oracle JDK is $free to download or not.

Now that Java 11 is released, we can see that Oracle JDK can be downloaded for $free. However, the license has changed, explicitly ruling out use in a production environment (without paying). Given that Oracle JDK has been the main JDK in use for 23 years, this is a huge trap for the unwary.

But this is not actually a problem, because Oracle JDK is no longer that important. That is because from Java 11, developers can treat Oracle JDK and OpenJDK as being equivalent. (See here for the detailed differences.) It is no longer appropriate or correct to consider the OpenJDK build to be secondary or less important. In fact, the most important build is now the OpenJDK one.

To be more specific, as of the release date, Java 11 developers should consider using AdoptOpenJDK or jdk.java.net to obtain a binary download, not any page on oracle.com.

So, for how long will Oracle provide security patches to Java 11?

Again, the answer to this is not 100% clear:

> What will Java 11 get from Oracle?
At least six months of free, GPL-licensed updates with binaries at https://jdk.java.net.
(I say “at least” because that’s the current plan. The plan could change, to a longer time period, if the situation warrants.)
Mark Reinhold, Oracle

Clearly, six months of security updates is not long enough to treat Java 11 as a "long term support" (LTS) release. The promise of the period being potentially longer doesn't really help, as companies need longer timelines and more certainty. So the working assumption should be that Java 11 has just 6 months of releases containing security patches from Oracle.

At this point, things move into the realms of probabilities. In all likelihood, when Oracle steps down from managing the "JDK 11 updates" project at OpenJDK (the one containing the security patches), someone else will take over. Exactly as with Java 8, discussed above. This has happened before with Java 6 and 7. And the evidence is that it will happen for Java 11 too:

OpenJDK is a community project. It's up to the community to support it. In practice this means that a group of organizations and individuals will maintain each OpenJDK LTS release for some period (TBA for 11, but it's sure to be a *lot* longer than six months.) I am certain that there will be a jdk11u project, and it will be properly and professionally run. I think it's likely that I'll be leading the project, but someone else may be chosen.
Andrew Haley, Red Hat

See also this Red Hat blog on the topic.

That covers the repository containing the security patches. (Red Hat have an excellent record in maintaining old releases of OpenJDK for the wider community.) But there is still the question of producing actual releases to download that have been certified as passing the Java SE TCK.

This is where the AdoptOpenJDK build farm is critical:

As part of the discussions Andrew mentioned, AdoptOpenJDK offered to build, test and make available OpenJDK LTS binaries for the major (and several minor) platforms. This isn't yet set in concrete but folks broadly thought that was a good idea. So the challenge of having a build and test farm for this joint effort is solved.
Martijn Verburg, AdoptOpenJDK

And AdoptOpenJDK are currently planning to create releases until September 2022, one year after Java 17 comes out.

If people do what they say they will, then we can therefore conclude that it will be possible to use Java 11 for 4 years from release, with security patches, for $free (zero-cost). (I would imagine that if volunteers came forward, the September 2022 date could be moved even further into the future.)

Of course, only you and your company can decide if it is right and proper to use Java without giving back to the ecosystem. It could be argued that it is more ethical to either pay for support, or assist either the AdoptOpenJDK or "JDK 11 updates" OpenJDK project.

This is therefore the updated table of $free updates:

Version  | Release date | End of Oracle $free updates | End of OpenJDK-based $free updates
Java 8   | March 2014   | January 2019                | June 2023 (or later)
Java 9   | Sept 2017    | March 2018                  | N/A
Java 10  | March 2018   | Sept 2018                   | N/A
Java 11  | Sept 2018    | March 2019 (or later)       | September 2022 (or later)
Java 12  | March 2019   | Sept 2019                   | N/A

(June 2023 is the date Red Hat has provided for the end of JDK 8 security patches, September 2022 is the date AdoptOpenJDK have provided - one year after the expected release of the next LTS (long-term support) version, Java 17).

Platforms

The OpenJDK builds by Oracle at jdk.java.net only cover three platforms. But this doesn't mean that they are the only platforms OpenJDK runs on. For example, AdoptOpenJDK provides Java 8 builds on 9 platforms with Hotspot (including ARM, Windows 32-bit and Solaris) and more platforms with OpenJ9.

Summary

All the pieces are in place for Java 11 to be maintained as a long-term support release. However, unlike Java 6, 7 and 8, Oracle will not be leading the long-term support effort. In all likelihood, Red Hat will take over this task - they have said publicly that they would like to.

For the first 6 months of Java 11's life, Oracle will be providing GPL+CE licensed $free zero-cost downloads at jdk.java.net with security patches.

To get GPL+CE licensed $free zero-cost update releases of Java 11 after the first six months, you are likely to need to obtain them from a different URL and a different build team. Currently, my view is that your package manager or Adoptium is the best place to look for those builds.

Feel free to comment if I've missed something obvious.

Thursday, 13 November 2014

No properties in the Java language

Here at Devoxx 2014, Brian Goetz went a little further than he has before about whether properties will ever be added to Java as a language feature.

No properties for you

Properties as a language feature in Java has been talked about for many years. But ultimately, it is Oracle, the Java stewards, who get to decide whether it should be in or out.

Today, at Devoxx, Brian Goetz, Java Language Architect, went a little further than he has before to indicate that he sees little prospect of properties being added to the language at this point.

Oracle have chosen to prioritize lambdas, modularization and value types as key language features over the JDK 8 to 10 timescale. As such, there was no realistic prospect of properties before JDK 11 anyway. However, the indications today were reasonably clear - that it isn't on the radar at all.

When quizzed, Brian indicated two main reasons. First, that it is perhaps too late for properties to make a difference as a language feature, with compatibility with older getter/setter beans being a concern. Second, that there are many different, competing visions of what properties means, and there is an unwillingness to pick one.

Competing visions include:

  • Just generate getters, setters, equals, hashCode and toString
  • Generate observable properties for GUIs
  • Raising the level of abstraction, treating a property as a first class citizen
  • Immutability support, with builders for beans
  • New shorter syntax for accessing properties

While it is easy to argue that these are competing visions, it is also possible to argue that these are merely different requirements, and that all could be tackled by a comprehensive language enhancement.

(Just to note that "beans and properties" does not imply mutable objects. It is perfectly possible to have immutable beans, with or without a builder, but immutable beans are harder for frameworks to deal with. Value types are intended for small values, not immutable beans.)

As is well known, I have argued for properties as a language feature for many years. As such, I find the hardening of the line against properties to be intensely disappointing.

Every day, millions of developers interact with beans, getters, setters and the frameworks that rely on them. Imagine if someone today proposed a naming convention mechanism (starts with "get") for properties? It would be laughable in any language, never mind the world's most widely used enterprise development language.

Instead, developers use IDEs to generate getters, setters, equals and hashCode, adding no real value to their code, just boilerplate of the kind Java is notorious for.
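
To illustrate the scale of that boilerplate, here is a minimal sketch of a typical hand-written (or IDE-generated) bean - the hypothetical Person class and its two fields are invented purely for illustration:

  import java.util.Objects;

  // two fields of actual data - everything below them is boilerplate
  public class Person {
    private String name;
    private int age;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }

    @Override
    public boolean equals(Object obj) {
      if (this == obj) {
        return true;
      }
      if (!(obj instanceof Person)) {
        return false;
      }
      Person other = (Person) obj;
      return age == other.age && Objects.equals(name, other.name);
    }

    @Override
    public int hashCode() {
      return Objects.hash(name, age);
    }

    @Override
    public String toString() {
      return "Person[name=" + name + ", age=" + age + "]";
    }
  }

Every line beyond the two field declarations adds no business value, yet must be written, reviewed and maintained.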

Consider that with 9 million Java developers, there could easily be 50,000 new beans created every day, each with perhaps 100 lines of additional boilerplate getters/setters/equals/hashCode/toString - easily 5 million lines of pointless code created, per day.

(And there is plenty more code in serialization and mapping frameworks that either tries to deduce what the properties are, or forces lots of additional manual mapping code to be written.)

Realistically however, we have to accept that in all likelihood Java will never have properties as a language feature.

Summary

The Java language is not getting properties.

Comments welcome, but please no "Foo language has properties" comments!

Friday, 21 March 2014

VALJOs - Value Java Objects

The term "value object" gets confusing in Java circles with various different interpretations. When I think about these I have my own very specific interpretation, which I'm christening VALJOs.

Introduction

The Wikipedia article on value objects gives this definition:

In computer science, a value object is a small object that represents a simple entity whose equality is not based on identity: i.e. two value objects are equal when they have the same value, not necessarily being the same object.

Classic examples of values are numbers, amounts, dates, money and currency. Martin Fowler also gives a good definition, and related wiki.

I agree with the Wikipedia definition, however there is still a lot of detail to be added. What does it take to write a value object in Java - is there a set of rules?

Recently, life has got more complicated with talk of adding value types to a future Java release:

"A future version of Java" will almost certainly have support for "value types", "user-defined primitives", "identity-less aggregates", "structs", or whatever you would like to call them.

Work on value types is focussed on extending the JVM, language and libraries to support values. Key benefits revolve around memory usage and performance, particularly in enabling the JVM to be much more efficient. Since this post isn't about value types, I'll direct curious readers to the JEP and various articles on John Rose's blog.

JVM supported value types may be the future, but how far can we go today?

VALue Java Objects - VALJOs

What I want to achieve is a set of criteria for well-written value objects in Java today. Although written for today, the criteria have one eye on what value types of the future may look like.

As the term "value object" is overloaded, I'm giving these the name VALJO. Obviously this is based on the popular POJO naming.

  • The class must be immutable, which implies the final keyword and thread-safety.
  • The class name must be simple and direct, focussed on the value.
  • Instances must be obtained using static factory methods. All constructors must be private.
  • The state that defines the value must be clearly specified. It will consist of one or more elements.
  • The elements of the state must be other values, which includes primitive types.
  • The equals() and hashCode() methods must check all the elements of the state and nothing else.
  • If the class implements Comparable, the compareTo() method must check all the elements of the state and nothing else.
  • If the class implements Comparable, the compareTo() method must be consistent with equals.
  • The toString() method must return a formally-defined string fully exposing the state and nothing else.
  • The toString() for two equal values must be the same. For two non-equal values it must be different.
  • There must be a static factory method capable of creating an instance from the formal string representation. It is strongly recommended to use the method name parse(String) or of(String).
  • The clone() method should not be public.
  • Provide methods, typically simple getters, to get the elements of the state.
  • Consider providing with() methods to obtain a copy of the original with different state.
  • Other methods should be pure functions, depending only on the arguments and the state or derived state.
  • Consider providing a link to the JDK 8 value-based classes documentation.
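
To make these criteria concrete, here is a minimal sketch of a hand-written VALJO. The CountryCode class is a hypothetical example invented purely for illustration, not a proposal for a real API:

  import java.util.Objects;

  // a hypothetical two-letter country code, written as a VALJO
  public final class CountryCode implements Comparable<CountryCode> {

    // the state is a single element: the two-letter code
    private final String code;

    // static factory method; all constructors are private
    public static CountryCode of(String code) {
      Objects.requireNonNull(code, "code");
      if (!code.matches("[A-Z]{2}")) {
        throw new IllegalArgumentException("Invalid country code: " + code);
      }
      return new CountryCode(code);
    }

    private CountryCode(String code) {
      this.code = code;
    }

    // simple getter exposing the element of state
    public String getCode() {
      return code;
    }

    // equals and hashCode check all the elements of the state and nothing else
    @Override
    public boolean equals(Object obj) {
      return obj instanceof CountryCode && code.equals(((CountryCode) obj).code);
    }

    @Override
    public int hashCode() {
      return code.hashCode();
    }

    // compareTo checks the same state and is consistent with equals
    @Override
    public int compareTo(CountryCode other) {
      return code.compareTo(other.code);
    }

    // toString fully exposes the state and can be round-tripped via of(String)
    @Override
    public String toString() {
      return code;
    }
  }

Note how toString() and of(String) round-trip the single element of state, satisfying the formal string representation rules.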

It is important to note that the state of a VALJO is not necessarily identical to the instance variables of the Class. This happens in two different ways:

Firstly, the state could be stored using different instance variables to those in the definition. For example, the state definition and public methods may expose an int but the instance variable might be a byte, because the validated number only needs 8 bit storage, not 32 bit. Similarly, the state definition might expose an enum, but store the instance variable as an int. In this case, the difference is merely an implementation detail - the VALJO rules apply to the logical state.

Secondly, there may be more instance variables than those in the state definition. For example, a Currency is represented in ISO 4217 with two codes, a three letter string and a three digit number. Since it is possible to derive the numeric code from the string code (or vice versa), the state should consist only of the string. However, rather than looking up the number each time it is needed, it is perfectly acceptable to store the numeric code in an instance variable that is not part of the state. What would not be acceptable for Currency is including numeric code in the toString (as the numeric code is not part of the state, only the string code is).

In effect, the previous paragraph simply permits caching of related data within a VALJO so long as it can be derived from the state. Extending this logic further, it is acceptable to cache the toString() or hashCode().
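
As a sketch of this caching of derived state, here is a hypothetical CurrencyUnit class (with a deliberately tiny lookup table) invented purely for illustration:

  public final class CurrencyUnit {

    // the state: the three-letter ISO 4217 code
    private final String code;
    // not part of the state: a cached value derived from the code
    private final int numericCode;

    public static CurrencyUnit of(String code) {
      // illustrative only - a real class would use the full ISO 4217 table
      int numeric;
      switch (code) {
        case "GBP": numeric = 826; break;
        case "USD": numeric = 840; break;
        case "EUR": numeric = 978; break;
        default: throw new IllegalArgumentException("Unknown currency: " + code);
      }
      return new CurrencyUnit(code, numeric);
    }

    private CurrencyUnit(String code, int numericCode) {
      this.code = code;
      this.numericCode = numericCode;
    }

    public String getCode() { return code; }
    public int getNumericCode() { return numericCode; }

    // equals, hashCode and toString use only the code, never the cached numeric code
    @Override
    public boolean equals(Object obj) {
      return obj instanceof CurrencyUnit && code.equals(((CurrencyUnit) obj).code);
    }

    @Override
    public int hashCode() {
      return code.hashCode();
    }

    @Override
    public String toString() {
      return code;
    }
  }

The numeric code is stored for convenience, but plays no part in equals(), hashCode() or toString(), because it is not part of the state.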

On parsing, I recommend using of(String) if the state of the VALJO consists entirely of a single element which is naturally a string, such as Currency. I recommend parse(String) where the state consists of more than one element, or where the single element is not a string, such as Money.

It is acceptable for the parse method to accept alternate string representations. Thus, a Currency parsing method might accept both the standard three letter code and the string version of the numeric code.

On immutability, it is acceptable to have an abstract VALJO class, provided the constructor is private and all the implementations are immutable private static final nested classes. The need for this is rare however.

Using VALJOs

It is best practice to use VALJOs in a specific way tailored for potential future conversion to JVM value types:

  • Do not use ==, only compare using equals().
  • Do not rely on System.identityHashCode().
  • Do not synchronize on instances.
  • Do not use the wait() and notify() methods.

The first of these rules is stronger than absolutely necessary given we are talking about normal Java objects in a current JVM. Basically, so long as your VALJO implementation promises to provide a singleton cached instance for each distinct state then using == is acceptable, because == and equals() will always return the same result. The real issue to be stopped here is using == to distinguish between two objects that are equal via equals().
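
As a sketch of that approach, here is a factory that interns instances so there is exactly one instance per distinct state - the hypothetical Symbol class and its cache strategy are illustrative assumptions, not a requirement of VALJOs:

  import java.util.concurrent.ConcurrentHashMap;
  import java.util.concurrent.ConcurrentMap;

  public final class Symbol {

    // one cached instance per distinct state, so == and equals() always agree
    private static final ConcurrentMap<String, Symbol> CACHE = new ConcurrentHashMap<>();

    private final String name;

    public static Symbol of(String name) {
      return CACHE.computeIfAbsent(name, Symbol::new);
    }

    private Symbol(String name) {
      this.name = name;
    }

    @Override
    public boolean equals(Object obj) {
      return obj instanceof Symbol && name.equals(((Symbol) obj).name);
    }

    @Override
    public int hashCode() {
      return name.hashCode();
    }

    @Override
    public String toString() {
      return name;
    }
  }

With this pattern, Symbol.of("FOO") == Symbol.of("FOO") holds, but callers relying on that are still tied to an implementation detail, so equals() remains the better habit.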

Separately from these restrictions, the intention of VALJOs is that users should refer to the concrete type, not an interface. For example, users refer to the JSR-310 LocalDate, not the Temporal interface that it implements. The reason for this is that VALJOs are, of necessity, simple basic types, and hiding them behind an interface is not helpful.

Tooling

Actually writing VALJO classes is boring and tedious. The rules and implications need to be carefully thought through on occasion. Some projects provide tools which may assist to some degree.

Joda-Convert provides annotations for @ToString and @FromString which can be used to identify the round-trippable string format.
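
A minimal sketch of how those annotations are typically applied, reusing the hypothetical CountryCode example from above (validation omitted for brevity):

  import org.joda.convert.FromString;
  import org.joda.convert.ToString;

  public final class CountryCode {

    private final String code;

    // marks the factory that parses the formal string format
    @FromString
    public static CountryCode of(String code) {
      return new CountryCode(code);
    }

    private CountryCode(String code) {
      this.code = code;
    }

    // marks the method that produces the round-trippable string format
    @ToString
    public String getCode() {
      return code;
    }
  }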

Joda-Beans provides a source code generation system that includes immutable beans. They are not instant VALJOs, but can be customised.

Auto-Value provides an annotation-processing code generation system that converts abstract classes into immutable value objects. They are not instant VALJOs, but can be customised.

Project Lombok provides a customised compiler and IDE plugin which includes value object generation. They are not instant VALJOs, but can be customised.

In general however, if you want to write a really good VALJO, you need to do it by hand. And if that VALJO will be used by many others, such as in JSR-310, it is worth doing correctly.

Summary

The aim of this blog was to define VALJOs, a specific set of rules for value objects in Java. The rules are intended to be on the strict side - all VALJOs are value objects, but not all value objects are VALJOs.

Did I miss anything? Get something wrong? Let me know!