Are you scared of Java language change? Why? I'm going to try to debunk some of the arguments against change, and express some of my frustrations with Java as-is.
"It was designed simple"
There is a commonly held view that Java was designed to be this simple, perfect language for all known tasks. That's not how I read history.
If we go back and re-read about the creation of Java we see some interesting points. These articles cover how Java (Oak) was written for set-top boxes, a market that didn't work out. Thus, the plan changed to focus on Applets on the new internet. Then the internet exploded, Java caught the wave and the rest is history.
Yes folks, Java was originally designed for set-top boxes and applets. Yet today, it is probably the most widely used enterprise language, and applets are dead. Since the fundamental use-case has changed, why shouldn't the language?
My point is that those who claim Java's 'simplicity' was its reason for success are wrong. I contend Java just got lucky.
Still not convinced? Try reading Patrick Naughton's account, The Long Strange Trip to Java (via Artima).
Java succeeded because it hit that critical window of being in the right place at the right time. But to achieve it, compromises were made. In particular, many language features were dropped - assertions, closures, enums, generics (sound familiar?). By all accounts, they weren't dropped to keep the language 'simple' so much as because the timeline dictated it.
Thus Java's so-called simplicity is a fallacy. Language changes now are simply completing the job that was unfinished back then and meeting the realities of Java as an enterprise language.
"Nothing beyond Java"
Some in the Java community seem to have become zealots, extremely passionate about the language, and vehemently rejecting all change. This may stem from the battles between Sun and Microsoft, where some became religiously committed to Java. This has left them unwilling to look over the fence at other programming languages, with a belief that anything from outside the Java ecosystem must inherently be bad.
I reject that view. Other programming languages do exist. Each has its plus points, and each its negatives. But they can teach us what works and what doesn't - assuming that we are willing to look Beyond Java and learn.
"Just look at generics"
Java 5 introduced generics amongst many other items. Unfortunately, generics are probably the most troublesome change ever made to Java. When the well-known generics FAQ needs over 400 pages to explain the weird corner cases, we know something went wrong.
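To make those 'corner cases' concrete, here is a small sketch (the class name GenericsCorners is mine, purely illustrative) of the kind of behaviour the FAQ has to explain:

```java
import java.util.ArrayList;
import java.util.List;

public class GenericsCorners {
    public static void main(String[] args) {
        // Corner case 1: List<String> is NOT a List<Object>,
        // even though String is an Object.
        List<String> strings = new ArrayList<>();
        // List<Object> objects = strings;   // does not compile

        // Corner case 2: erasure means both lists share one runtime class.
        List<Integer> ints = new ArrayList<>();
        System.out.println(strings.getClass() == ints.getClass()); // true

        // Corner case 3: raw types only produce a warning, so the
        // wrong element type can sneak in ("heap pollution").
        List raw = strings;   // unchecked warning, not an error
        raw.add(42);          // no failure here...
        // String s = strings.get(0); // ...it would blow up later, here
    }
}
```

None of this is obvious from the syntax alone, which is exactly why the FAQ is so long.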
The negative take on this is that we shouldn't change Java ever again because 'we might get another generics'. I believe that is a very reactionary point of view. So long as any change is well specified, and avoids weird corner cases, it should be fine. And the Java community should be testing that and enforcing it.
"Code isn't important"
Another group takes the view that, in the big picture, code and syntax aren't important. Instead, the focus should be on process, teamwork, risk and testing.
The problem with this is two-fold. Firstly, these process issues affect you whichever language you use, so they are a non-argument when discussing language changes.
Secondly, they fail to take into account that fewer lines of code actually does matter. Fewer LOC means fewer lines to test. Fewer if clauses means fewer places for code to go wrong, and fewer tests. Abstracted loops mean more predictable code, and fewer tests. More errors caught at compile-time mean fewer problems in production, and fewer tests. And fewer LOC apparently also means more secure systems (the number of bugs being roughly proportional to LOC).
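As a concrete sketch of the 'abstracted loops' point (the class name and sample list are mine, purely illustrative): the Java 5 for-each loop removes the index variable, and with it a whole class of off-by-one bugs that would otherwise each need a test:

```java
import java.util.Arrays;
import java.util.List;

public class LoopExample {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("ant", "bee", "cat");

        // Index-based loop: wrong bounds or wrong index are both
        // possible, and each possibility needs a test to rule out.
        StringBuilder a = new StringBuilder();
        for (int i = 0; i < names.size(); i++) {
            a.append(names.get(i));
        }

        // Abstracted loop (for-each): there is no index to get wrong,
        // so there is simply less to test.
        StringBuilder b = new StringBuilder();
        for (String name : names) {
            b.append(name);
        }

        System.out.println(a.toString().equals(b.toString())); // true
    }
}
```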
You could misread that last paragraph to suggest I believe LOC is the only measure of a language change - I certainly don't believe that (auto-boxing being the classic example of getting it wrong by obscuring intent). But it does turn out to be far more important than is often thought.
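A minimal sketch of how auto-boxing obscures intent (the class name is mine): == on boxed Integers compares references, not values, but the small-value cache makes it appear to work:

```java
public class AutoboxingPitfall {
    public static void main(String[] args) {
        // Values in -128..127 are boxed to cached objects,
        // so == happens to give the "right" answer...
        Integer a = 127, b = 127;
        System.out.println(a == b);      // true (same cached object)

        // ...but outside the cache, == compares references.
        Integer c = 128, d = 128;
        System.out.println(c == d);      // false on a standard JVM
        System.out.println(c.equals(d)); // true: the comparison you meant
    }
}
```

The syntax looks like a value comparison but isn't - intent obscured, exactly as described above.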
"Junior developers won't understand this"
The issue with training and language change seems overblown. Developers are not stupid - even the junior ones or so-called Java-Joes. Every day they interact with multiple frameworks consisting of hundreds of classes which are way more complicated than the language changes being talked about.
In fact, most language changes actually try to simplify that interaction. By abstracting out common patterns found in many frameworks into the language, knowledge from one framework becomes transferable to another framework and the overall interaction becomes simpler and often more compile-safe.
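Java 5 enums are a good concrete case of this: the hand-written 'typesafe enum' pattern, duplicated across many frameworks and codebases, was folded into the language. A minimal sketch (the Status values are hypothetical):

```java
public class EnumExample {
    // Before Java 5, this was pages of boilerplate (private
    // constructor, static final instances, values() by hand).
    enum Status { NEW, SHIPPED, DELIVERED }

    public static void main(String[] args) {
        Status s = Status.SHIPPED;

        // switch over an enum is compile-checked: a typo in a case
        // label is an error, unlike the old int/String constants.
        switch (s) {
            case NEW:       System.out.println("queued");     break;
            case SHIPPED:   System.out.println("in transit"); break;
            case DELIVERED: System.out.println("done");       break;
        }

        System.out.println(Status.valueOf("NEW") == Status.NEW); // true
    }
}
```

Knowledge of the pattern now transfers everywhere, and the compiler enforces what each framework previously enforced by convention.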
What if we don't change
An increasing number of developers are seeing life beyond the high walls of Java, most notably through exposure to Ruby or Groovy. When these developers come back to Java to write code, they tend to find it very frustrating. I think this quote from Ray Cromwell on Javalobby expresses the frustrations well:
"Every Java IDE, every Java framework, every JSR and ultimately every Java developer is finding a way to work around issues in the language, many of which could be solved by relatively small, directed language changes."
Java is still the enterprise choice. But more and more architects are seeing life beyond Java. They're realising that writing something in half the code, or less, is actually a good thing - the result is often far clearer and more understandable than the equivalent Java, and it takes less time to write too.
Java will never be Ruby or Groovy, but it does need to learn some of their lessons. The process of language change really doesn't need to be scary - and, if done well, the upsides will far outweigh any downsides.