Monday, October 22, 2012

Using Core Groovy AST Transforms

Have you ever used an Abstract Syntax Tree (AST) transform? If you're using Groovy or Grails, chances are you have, and maybe you just didn't know it. An AST transform allows code to be modified at compile time. Groovy provides the capability to write both local (annotation driven) and global (globally applied) AST transforms, and the Groovy documentation has a nice tutorial on writing them.

Even if writing AST transforms is not something you are doing, you can benefit from the numerous built-in transforms that Groovy provides. Many of these eliminate the boilerplate code that we've all written before. This post does not aim to list all of them, but rather to give you an overview of some of them. Check out all implementors of the ASTTransformation interface in the Groovy API to see a full list.

Consider a Money class that looks like this:

class Money {
    int amount
    String type // USD, Euro, etc

    boolean equals(Object other) {
        if (this.is(other)) { // identity check; == would call equals recursively in Groovy
            return true
        }
        if (other instanceof Money) {
            return amount == other.amount && type == other.type
        }
        return false
    }

    int hashCode() {
        return new Integer(amount).hashCode() + type.hashCode()
    }
}
This is fairly verbose at best and, at worst, very error prone. What if I add a new field to the Money class and update equals but forget to update hashCode (yes, tests should catch this)? What if I forget to add the field to equals and hashCode altogether?

In Java, you might use the reflection-based methods of Apache Commons' EqualsBuilder and HashCodeBuilder. This certainly simplifies your code, but the performance can be pretty bad if these methods are called frequently.

Enter the AST transform.

import groovy.transform.EqualsAndHashCode

@EqualsAndHashCode
class Money {
    int amount
    String type // USD, Euro, etc
}


The @EqualsAndHashCode annotation adds the code at compile time, so you don't take the runtime hit, but you get the benefit of not having to write the code yourself.
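To get a sense of what the transform saves you from writing, here is a rough Java equivalent of the generated methods (a sketch for illustration only; the code Groovy actually generates differs in its details):

```java
import java.util.Objects;

// A rough Java sketch of what @EqualsAndHashCode generates for Money.
// Illustrative only -- Groovy's actual generated code differs in detail.
class Money {
    int amount;
    String type;

    Money(int amount, String type) {
        this.amount = amount;
        this.type = type;
    }

    @Override
    public boolean equals(Object other) {
        if (this == other) return true;
        if (!(other instanceof Money)) return false;
        Money m = (Money) other;
        return amount == m.amount && Objects.equals(type, m.type);
    }

    @Override
    public int hashCode() {
        return Objects.hash(amount, type);
    }
}
```

Two Money objects with the same field values compare equal and hash identically, which is exactly the contract the annotation maintains for you as fields come and go.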

Some other useful transforms include
@ToString - Adds a toString method
@Bindable - Adds property change support
@Mixin - Mixes in code to classes

As you can see, AST transformations can provide a lot of benefit and reduce the amount of boilerplate code you need to write. If you haven't used them before, I encourage you to at least take a look.

Thursday, October 4, 2012

Build quality in from the start

Have you ever joined a large project in progress only to find that it's a bit on the messy side? There's no testing, no automated builds, and really no quality control. Did you wonder how it got that way? Then you try to correct some of these problems, only to realize you're making just a minor dent (better than nothing). Once a project is in that state, it's likely going to be an uphill battle.

Even the largest projects start small, and it is much easier to build quality control into the product at the beginning. At the start of a project, use "iteration 0" to set up the test and automation infrastructure. There is a minimum set of tools that I prefer to use on projects.

Version control - this one should be obvious, but I'll include it here anyway. Whether you use a traditional VCS or distributed VCS may vary based on your needs, but use something.

Unit testing framework - again, hopefully this one is obvious too. I typically use JUnit, but other tools such as Spock can be useful. Also included here are other unit testing frameworks, such as DbUnit or XMLUnit.

Acceptance testing framework - have a look at Selenium/WebDriver coupled with Spock or Cucumber. Cucumber is a little more complex but lets you write business requirements in plain language rather than in code.

Automated build/dependency management tools - Ant/Ivy, Maven, or Gradle. Pick your poison. I like Maven because it is more standardized than Ant and has a rich set of plugins, but the million lifecycle phases and the fact that it downloads the entire internet can be a little unnerving. On my next project, I'll be trying out Gradle.
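Part of Gradle's appeal is how little you need to write. A minimal build file for a Java project looks something like this (a sketch; the JUnit version is illustrative):

```groovy
// build.gradle -- minimal sketch for a Java project (version number illustrative)
apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    testCompile 'junit:junit:4.10'
}
```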

Continuous integration server - I really only have experience with Jenkins, but a variety of others exist. I've been quite happy with Jenkins, so I don't see myself switching any time soon.

Deployment tools - This is an important, but often overlooked category of automation tools. Maybe your deployment is simple, but it should absolutely be scripted, even if you write those scripts yourself. Just as with any of the other tools, your deployment infrastructure will evolve as your project does. 

Infrastructure tools - using virtual machines is a great way to make your infrastructure easier to manage. I've been using Chef to provision my VMs, and it makes reconfiguring and standing up new machines much easier. VMs are a great way to stand up test environments as well. You may additionally want to look at infrastructure monitoring tools such as Nagios.

Code quality control tools - Cobertura (code coverage), Checkstyle, FindBugs, PMD, CPD (copy-paste detector), and Sonar.

This is just a sampling of the common tools I use; there are a seemingly endless number of solutions in these areas. It may seem a bit overwhelming, and even if you can't use or don't need all of these tools, that's OK. The important part is to step into automation and quality control. The more you can get in place early in the project, the easier it will be in the long run.

Friday, July 20, 2012

Hamcrest Matchers

Hamcrest matchers have been available for some time now, but I still don't see them being used regularly on projects. If you're not familiar with them, they are a set of matchers that can be used with JUnit's assertThat syntax to make assertions easier to read and easier to diagnose when there are errors. Consider a test where we want to check whether an object is a String. A traditional JUnit assertion might look like
Object o; // assume this is set during the test
assertTrue(o instanceof String);
The default failure message is just AssertionError. I could make it more meaningful by supplying a message (though this still won't work if o is null):
assertTrue("expected String but was " + o.getClass(), o instanceof String);
Hamcrest matchers take care of this for us. The assertion can be rewritten as
assertThat(o, is(instanceOf(String.class)));
Now the error message is java.lang.AssertionError: Expected: an instance of java.lang.String got: <4>. Note that is is just syntactic sugar and is optional. Hamcrest matchers also help to resolve the ambiguity between the expected and actual parameters. A typical assertion for comparing two strings might look like:
assertEquals(expected, actual);
The assertEquals method makes it easy to accidentally reverse actual and expected, which produces backwards error messages. Using Hamcrest matchers, rewrite this as
assertThat(actual, is(equalTo(expected)));
All Hamcrest matchers follow this pattern of assertThat(actual, matcher), which makes them read fluently. More information can be found in the Hamcrest documentation. If you're writing Java unit tests, these matchers are a nice addition to your toolkit.
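If you're curious how matchers produce those readable failure messages, here is a minimal Java sketch of the underlying idea (simplified and dependency-free; the real org.hamcrest interfaces are richer than this):

```java
// A minimal sketch of the idea behind Hamcrest matchers.
// Simplified for illustration -- NOT the real org.hamcrest API.
interface Matcher<T> {
    boolean matches(Object actual);
    String description();
}

// A matcher mirroring instanceOf(String.class) from the example above.
class IsInstanceOf implements Matcher<Object> {
    private final Class<?> expected;

    IsInstanceOf(Class<?> expected) { this.expected = expected; }

    public boolean matches(Object actual) { return expected.isInstance(actual); }

    public String description() { return "an instance of " + expected.getName(); }
}

class MiniAssert {
    // assertThat delegates both the check and the failure message to the matcher,
    // which is why every assertion gets a descriptive error for free.
    static <T> void assertThat(Object actual, Matcher<T> matcher) {
        if (!matcher.matches(actual)) {
            throw new AssertionError(
                "Expected: " + matcher.description() + " got: <" + actual + ">");
        }
    }
}
```

With this sketch, MiniAssert.assertThat(4, new IsInstanceOf(String.class)) fails with a message of the same shape as the one shown above: the matcher itself knows how to describe what was expected.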

Monday, July 2, 2012

Writing clean code

When I interview developers, one question I always ask is "What do you think makes code well written?" Almost without fail, I get the answer "It has a lot of comments." While I'm not saying that good code shouldn't have any comments, having a lot of comments is usually indicative of code that is hard to read. Take this simple example of a method that checks if a number is an even number less than 100 or an odd number greater than 200.
boolean isValidNumber(int number) {
    // number is valid if it is even and less than 100 or odd and greater than 200
    return (number % 2 == 0 && number < 100) || (number % 2 != 0 && number > 200)
}
Sure, with a little bit of thought, or by reading the comment (if I trust it; I'll get to that shortly), I can understand what this does. Compare this version of the code:
boolean isValidNumber(int number) {
    return (isEven(number) && number < 100) || (isOdd(number) && number > 200)
}
With just this simple change, the code reads better even without the documentation. The comment is valid in the first case, and reading it makes the code just as easy to understand as the second version. But what if the validation rule changes and my code now looks like this:
boolean isValidNumber(int number) {
    // number is valid if it is even and less than 100 or odd and greater than 200
    return (number % 2 == 0 && number < 100) || (number % 2 != 0 && number > 201)
}
Now the documentation doesn't match the code. Six months later, I'm reading this code and I have a question: which is correct? Writing cleaner code makes the code easier to read and removes the potential for ambiguity. Code should be written to be read by people; the computer doesn't care what it looks like.
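For completeness, here is the refactored version written out in Java with the helper methods filled in (a sketch; the isEven/isOdd names come from the example above):

```java
// The refactored validation from the example above, with the helpers written out.
class NumberValidator {

    static boolean isEven(int number) {
        return number % 2 == 0;
    }

    static boolean isOdd(int number) {
        return !isEven(number);
    }

    // valid: even and less than 100, or odd and greater than 200
    static boolean isValidNumber(int number) {
        return (isEven(number) && number < 100) || (isOdd(number) && number > 200);
    }
}
```

The helpers cost two extra methods but let the validation rule read as a sentence, so the explanatory comment becomes optional rather than load-bearing.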

I started off by saying that I think good code can still have comments, but comments that are internal to methods should say why the code does what it does, not what it does. The code should be easy enough to read on its own. On a side note, I do think that documentation describing what the code does is appropriate in the javadocs of public methods (or even private methods, since IDEs typically show the documentation when hovering over a method).

Talking about writing clean code is easy; doing it is much more difficult (which is why so few of us actually do). Practice and think about it consciously while writing code until it becomes second nature. I also recommend that all developers read Clean Code by Bob Martin at least once.

Monday, May 21, 2012

Automated provisioning of development environment

It's pretty common to use automated scripting tools (Chef, Puppet, etc.) for provisioning your servers, but what about your development environment tools? It's far too common that developers are developing and testing with different versions of the software that is used in production. It also leads to each developer having different versions of tools, which often produces "It works on my machine."

I've split this problem into two parts - IDE and tooling. Most of our developers use Eclipse, so I've geared my automation efforts toward Eclipse. Using the Eclipse p2 director, it is easy to script the installation of common plugins, settings, etc. Rather than mandating that developers download a pre-packaged distribution of Eclipse (which we would then need to maintain), they can download whichever one they want (though most of our developers use the Java EE version).

Using the p2 director application, we have a simple Groovy script that runs the following command:
${eclipseExecutable} -application org.eclipse.equinox.p2.director -repository ${repoString} -installIU ${featuresString} -tag InstallInitialPlugins -destination ${eclipsePath} -profile ${profile}
Hopefully the variable names are self explanatory.

For tooling standardization (Maven, Java, etc.), I've set up some simple Puppet scripts to pull in those tools (I'm currently working in Windows, and Puppet seems to work better than Chef on Windows). This does not connect to a Puppet server; rather, we just have a directory on our server with the necessary files. The script just installs a couple of executables and copies some zipfile distributions. Since our toolsets are not changing all that frequently, the script is run on demand, but it could be set up with a Puppet server on a polling interval if desired. For more complex environments, a virtual machine could be configured with Vagrant/Chef.