Archive for the ‘Tools’ Category

A quick tutorial on how to promote/release artifacts in a Gradle project to Maven Central, without clicking through the Nexus GUI, using the Gradle Nexus Staging Plugin.

Introduction

Maven Central (aka The Central Repository) is (probably) the world’s largest repository of open source artifacts used by Java and JVM-based projects. It was founded by the creators of Apache Maven and has been serving artifacts since 2002. Nowadays there are some alternatives (listed below), but for many users Maven Central is still the primary source of project dependencies (and sometimes the only one whitelisted in their corporations).

The Central Repository logo

Problem

To perform a release to The Central Repository, Maven users can use the Nexus Staging Maven Plugin – a free, but not fully open source plugin. With Gradle, however, it was required to log into the Nexus GUI and manually invoke two actions (close repository and release/promote repository). Quite boring and, in addition, highly problematic with a Continuous Delivery approach. Luckily, Nexus exposes a REST API which, with some work, allows doing the same. The Gradle Nexus Staging Plugin arose to do that job.

Quick start

Important. Please note that the prerequisite is an active and configured account in Sonatype OSSRH (OSS Repository Hosting) as well as a Gradle project configured to publish release artifacts to the staging repository. If you don’t have that already, please follow the Gradle section in the official guide.
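
For reference, such a publishing setup with the old maven plugin usually looks more or less like this (a minimal sketch based on the pattern from the official guide; nexusUsername and nexusPassword are assumed to be defined e.g. in gradle.properties):

apply plugin: 'maven'
apply plugin: 'signing'

signing {
    sign configurations.archives
}

uploadArchives {
    repositories {
        mavenDeployer {
            //sign the generated POM before deployment
            beforeDeployment { MavenDeployment deployment -> signing.signPom(deployment) }
            repository(url: "https://oss.sonatype.org/service/local/staging/deploy/maven2/") {
                authentication(userName: nexusUsername, password: nexusPassword)
            }
        }
    }
}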

To set up automatic release/promotion in your project, add gradle-nexus-staging-plugin to the buildscript dependencies in the build.gradle file of your root project:

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath "io.codearte.gradle.nexus:gradle-nexus-staging-plugin:0.5.1"
    }
}

Apply the plugin:

apply plugin: 'io.codearte.nexus-staging'

Configure it:

nexusStaging {
    packageGroup = "org.mycompany.myproject"
    stagingProfileId = "yourStagingProfileId" //when not defined, will be fetched from the server using "packageGroup"
}

After a successful upload of the archives (with the maven, maven-publish or nexus plugin) to Sonatype OSSRH, call:

./gradlew closeRepository promoteRepository

to close the staging repository and promote/release it and its artifacts. If synchronization with Maven Central is enabled, the artifacts should automatically appear in Maven Central within several minutes.

Details

The plugin provides two main tasks:

  • closeRepository – closes the open repository with the uploaded artifacts. There should be just one open repository available in the staging profile (possible old/broken repositories can be dropped in the Nexus GUI)
  • promoteRepository – promotes/releases a closed repository (required to get artifacts into Maven Central)

And one additional:

  • getStagingProfile – gets and displays the staging profile id for a given package group. This is a diagnostic task to get the value and put it into the configuration closure as stagingProfileId. To see the result it is required to call Gradle with the --info switch.
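
For example:

./gradlew getStagingProfile --info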

It has to be mentioned that a call to the Nexus REST API returns immediately, while the closing operation takes a moment, so to make it possible to call closeRepository promoteRepository together there is a built-in retry mechanism.
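
If the default timing is too tight for your Nexus instance, the retries can be tuned in the configuration closure. A sketch below – to the best of my knowledge the parameters are named numberOfRetries and delayBetweenRetriesInMillis, but please verify them against the project webpage:

nexusStaging {
    packageGroup = "org.mycompany.myproject"
    numberOfRetries = 7                  //assumed parameter name - verify in the plugin docs
    delayBetweenRetriesInMillis = 1000   //assumed parameter name - verify in the plugin docs
}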

The plugin is “upload mechanism agnostic” and can be used together with the maven, maven-publish or nexus plugins.

For more details and configuration parameters see the project webpage or the working example in the plugin’s own release configuration.

Alternatives to Maven Central?

There is a much younger, but promising alternative – Bintray – which can also serve artifacts. It is free for open source projects and I have personally used it for some other projects, and even created an automatic release mechanism for Bintray, Travis and Gradle. It works OK, but to get artifacts also into Maven Central it is required to store the private key used for signing on their servers and, in addition, provide Nexus credentials. That increases the risk of having them stolen, and at Codearte we prefer to use a private Jenkins instance to perform the release directly to Maven Central.

Summary

With the Gradle Nexus Staging Plugin the whole release process to Maven Central can be performed with Gradle from a command line and, with some additional work, completely automatically from a CI server. No more buttons to push in the Nexus GUI. In addition to Sonatype OSSRH, the plugin can also be used with private Nexus instances with staging repositories enabled.

Btw, there are possibly many things that could be enhanced in the plugin. If you need something or have found a bug, feel free to use the issue tracker to report it.

Thanks to Kuba Kubryński for the motivation and help with analyzing the not-so-well documented Nexus REST API.

Update 20170904. The plugin is actively developed. Check the project webpage to read about new features added since version 0.5.1.


When writing integration tests it is sometimes required to set up the environment’s initial conditions/state before/after a given test or the whole specification. The upcoming Spock 1.0 expands the number of options available to do it in a convenient way.

This is the second part of the series about new and noteworthy in (upcoming) Spock 1.0.

New extension @RestoreSystemProperties

System properties provide information about the JVM configuration and the environment, like the JVM vendor and version, operating system, classpath, locale or time zone. Some of them can impact the way our application works. The following example checks if the protection against running a dangerous program as root works properly on Unix machines. To not affect other tests, @RestoreSystemProperties restores the original system properties.

@Stepwise
class RestoreSystemPropertiesSpec extends Specification {

    @RestoreSystemProperties
    def "should deny perform action when running as root"() {
        given:
            System.setProperty("user.name", "root")
        and:
            def sut = new DangerousActionPerformer()
        when:
            sut.squashThemAll()
        then:
            thrown(IllegalStateException)
    }

    def "should perform action when running as 'normal' user"() {
        given:
            def sut = new DangerousActionPerformer()
        when:
            sut.squashThemAll() //current user name was restored
        then:
            noExceptionThrown()
    }

    (...)
}

The extension is activated using the @RestoreSystemProperties annotation. It can be placed on a feature method to be applied right after that test only, or on a class to restore system properties after every test in the specification. The behavior in the second case is identical to placing the annotation on every test method.

Internally, Spock makes a copy of the system Properties structure before a test and restores it afterwards.
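
Conceptually it boils down to something like this simplified sketch (not Spock’s actual implementation):

//make a defensive copy of the current system properties
def saved = new Properties()
saved.putAll(System.getProperties())
try {
    System.setProperty("user.name", "root")   //a test mutating the environment
    //... the feature method runs here ...
} finally {
    System.setProperties(saved)               //bring back the original state
}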

@ConfineMetaClassChanges (since 0.7)

One of the things which make the Groovy language so powerful is metaprogramming – an ability to modify classes (e.g. add/modify methods) at runtime. It may look like hacking (and that is true), but in some specific cases it is very useful (see how Grails and GORM create database queries from non-existing methods – like findByFirstNameAndSecondName(...) – or how a custom DSL like 6.minutes.ago is written).

MetaClass operations are usually executed at the class (not instance) level, which generally means that they affect every instance of the class in a given class loader. GORM-like changes done in multiple tests could interact with each other, causing tests to fail. For those cases the @ConfineMetaClassChanges extension (available already in the previous Spock version) was created.

@Stepwise
class ConfineMetaClassChangesSpec extends Specification {

    @ConfineMetaClassChanges(EmptyClass)
    def "should allow to call added method"() {
        given:
            EmptyClass.metaClass.returnFoo = { "foo" }
        when:
            def fooValue = new EmptyClass().returnFoo()
        then:
            fooValue == "foo"

    }

    def "should throw exception on not available method"() {
        when:
            new EmptyClass().returnFoo()
        then:
            thrown(MissingMethodException)
    }
}

class EmptyClass {
}

The @ConfineMetaClassChanges annotation can be placed on a feature method to restore the MetaClass to the state from just before that test, or on a class to restore the MetaClass after the last test in the specification to the state from before the setupSpec method was called.

Note that the behavior of @ConfineMetaClassChanges placed at the specification level is different (once, after the last test) than that of @RestoreSystemProperties (every time, after every test).

@AutoCleanup (since 0.7)

Another already existing, but rarely known and used extension is @AutoCleanup. Usually external resources are allocated in setup/setupSpec methods and released in cleanup/cleanupSpec methods. But Spock, the same as plain JUnit, supports auto-initialization of instance variables (fields) before every test/feature (which overrides the default Java behavior where a field is initialized only once, when a class instance is created).

class ManualCleanupSpec extends Specification {

    @Shared
    private DBConnection dbConnection = new H2DBConnection()

    void cleanupSpec() {
        dbConnection.close()    //boilerplate code
    }

    def "should do fancy thing on DB"() {
        ...
    }
}

interface DBConnection {
    void close()
}

To make releasing them easier, those inlined resources can be annotated with @AutoCleanup, which calls the close method (or any other method defined as a parameter) in the cleanup phase.

class AutoCleanupSpec extends Specification {

    @AutoCleanup
    @Shared
    private DBConnection dbConnection = new H2DBConnection()

    def "should do fancy thing on DB"() {
        ...
    }
}

@AutoCleanup supports both instance variables (the method is called after every test) and static/@Shared fields (the cleanup is performed after the last test). All errors during cleanup are caught to not interrupt the test execution. By default they are logged, which can be disabled using the quiet parameter: @AutoCleanup(quiet = true).
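
A method with a different name can be pointed at via the annotation value. A short sketch (EmbeddedServer is a made-up class used only for illustration):

class CustomCleanupSpec extends Specification {

    //call shutdown() instead of close() and suppress logging of cleanup errors
    @AutoCleanup(value = "shutdown", quiet = true)
    @Shared
    private EmbeddedServer server = new EmbeddedServer()   //hypothetical resource class

    def "should respond to ping"() {
        ...
    }
}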

Where can I find Spock 1.0?

Please note that this blog post is about features available in the Git branch planned to become Spock 1.0 sometime in the future. They are currently available only in a 1.0-SNAPSHOT version in a separate repository. As they have not been released (at the time of writing), they could look different or even (in the extreme case) not be available in the final version. Be aware.

To add Spock 1.0 as a dependency to your project, define the snapshot repository https://oss.sonatype.org/content/repositories/snapshots/ and the Spock artifact in version 1.0-groovy-2.0-SNAPSHOT. A sample configuration in Gradle:

repositories {
    ...
    maven { url "https://oss.sonatype.org/content/repositories/snapshots/" }
}

dependencies {
    ...
    testCompile ('org.spockframework:spock-core:1.0-groovy-2.0-SNAPSHOT')
}

Summary

The mentioned extensions were designed for integration tests and, because of class loader (or system) wide changes, it is not safe to execute more than one test at the same time. Therefore it is important to separate pure, very fast and independent unit tests from integration tests to (inter alia) allow unit tests to run in parallel.

After @Requires and @IgnoreIf this is the second post about changes in the upcoming Spock 1.0. The examples were written using Spock 1.0-groovy-2.0-SNAPSHOT and they can be cloned from GitHub.

May the Spock be with you!

Introduction

Some tests (especially integration tests) should be run only if certain conditions are (or are not) met. The upcoming Spock 1.0 provides the new @Requires and the improved @IgnoreIf extension to handle that requirement in a convenient way.

Historical background – waiting for Spock 1.0

The latest stable Spock version (0.7) was released in October 2012 (~2 years ago at the time of writing). Hundreds of commits have been made to the Git repository since then and many new features and improvements are already available in the current SNAPSHOT version. Unfortunately the “New and Noteworthy” section in the documentation doesn’t cover those changes and the only way (as far as I know) to get to know them is digging into the Git commits. As a Spock user I was curious what new things are around the corner, so I did that job, which resulted in a presentation at Confitura (in Polish). As a bonus I planned a series of blog entries describing the more interesting features and possibly also an update of the Spock documentation.

This is just the first part of the changes in the (upcoming) Spock 1.0. There is much more to discover and I will be presenting it in future blog posts.

New extension @Requires

@Requires allows running a given test (or the whole specification) only if the given criteria are met.

    @Requires({ System.getProperty("os.arch") == "i386" })
    def "should use native libraries on 32-bit JVM"() { ... }

    @Requires({ env.containsKey("ENABLE_CRM_INTEGRATION_TESTS") })
    def "should do complicated things with CRM"() { ... }

It is the opposite of the @IgnoreIf extension, already known from Spock 0.7, which allows ignoring a given test (or the whole specification) using access to system properties (properties – System.getProperties()), environment variables (env – System.getenv()) and the Java version (javaVersion – System.getProperty("java.version")). In the Spock 1.0 branch they are both much more powerful.
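
For comparison, the effect of the @Requires CRM example above could be achieved with the negated condition in @IgnoreIf:

    @IgnoreIf({ !env.containsKey("ENABLE_CRM_INTEGRATION_TESTS") })
    def "should do complicated things with CRM"() { ... }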

Testing my examples in practice I was surprised that the “os.arch” system property returns not the processor architecture taken from the OS, but the architecture of the JVM. So, be aware that running a 32-bit JVM on a 64-bit system will return “i386”. Fortunately that knowledge is usually not important (unless, for example, native libraries are used).

New features in @Requires and @IgnoreIf

In addition to the new @Requires extension, both it and its twin @IgnoreIf have got a new internal abstraction to simplify the way the operating system information and the JVM version can be accessed.

Operating system

Operating system information is accessible in Spock 0.7 using the os.name and os.version system properties. Unfortunately it can be complicated and error prone in some cases:

    //Run only on Linux and MacOS - in Spock 0.7 style
    @Requires({ System.getProperty("os.name").toLowerCase().contains("mac os") || 
                System.getProperty("os.name").toLowerCase().contains("darwin") || 
                System.getProperty("os.name").toLowerCase().contains("linux")  })
    def "should use fancy text console features"() { ... }

In the meantime Spock has got the new OperatingSystem abstraction which provides methods like:

    String getName() //Linux
    String getVersion() //8.1
    Family getFamily() //SOLARIS (enum)

    boolean isLinux()
    boolean isWindows()
    boolean isMacOs() //also Solaris and Other
    boolean isSolaris()
    boolean isOther()

With their help the mentioned example can be simplified to very readable code that precisely explains our intentions:

    @Requires({ os.linux || os.macOs  })    //Spock 1.0 edition
    def "should use fancy text console features"() { ... }

JVM version

In addition to OperatingSystem there is also a Jvm abstraction which provides methods like:

    boolean isJava7() //true only if Java 7
    boolean isJava8Compatible() //true if Java 8+

or more generic:

    String getJavaVersion() //e.g. "1.8.0_05"
    String getJavaSpecificationVersion() //e.g. "1.8"

The original JVM version dependent test case declaration:

    @IgnoreIf({ javaVersion < 1.8 })
    def "should find at runtime and use CompletableFuture for Java 8+"() { 
        ... 
    }

can be replaced with the more verbose, but clearer:

    @IgnoreIf({ !jvm.java8Compatible })
    def "should find at runtime and use CompletableFuture for Java 8+"() { 
        ... 
    }

Using static methods

@Requires (the same as @IgnoreIf) can also use static methods to define a condition (which is available, but less known, in Spock 0.7). Those methods can be declared inside the given class, in another class or (when using Groovy 2.3+) in a trait.

import static FancyFeatureRequiredIntranetConnectionSpecIT.isUrlAvailable

@Requires({ isUrlAvailable("http://some-host.intranet/") })
class FancyFeatureRequiredIntranetConnectionSpecIT extends Specification {

    //could be also from trait or other class
    static boolean isUrlAvailable(String url) { 
        //...
    }

    def "should do one thing in intranet"() { ... }

    def "should do another thing in intranet"() { ... }

    ...
}

Please pay attention to the static import in this example. Even though the code without the import looks OK and IntelliJ IDEA resolves the method, it will fail at runtime with groovy.lang.MissingMethodException: No signature of method: (...). This is caused by the way Spock internally tries to resolve references in a Closure. For a detailed explanation see the next section.

groovy.lang.MissingMethodException: No signature of method:
FancyFeatureRequiredIntranetConnectionSpecIT$_closure1.isUrlAvailable()
is applicable for argument types: (java.lang.String) values: [http://some-host.intranet/]

How does it internally work (for the curious)?

You may wonder how that code even compiles if jvm or os are not known to the Specification class. The key thing is that the code inside the @Requires (and @IgnoreIf) annotation is wrapped with {} (curly brackets). In Groovy that syntax means an “anonymous code block” and it is called a Closure. The @Requires annotation accepts a Closure as a parameter:

@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD})
@ExtensionAnnotation(RequiresExtension.class)
public @interface Requires {
  Class<? extends Closure> value(); // <- important line
}

Two important aspects of a Groovy Closure are used in Spock to implement those features:

  • a block of code is executed “at a later point”
  • an execution context can be delegated

The first point means that the Groovy compiler ignores unresolved references. At compilation time the execution context can be unknown (a Closure can be passed as a parameter far away from its declaration place). The used references will be resolved at runtime, and here the second point applies.

By default, property references and methods are attempted to be resolved to the owner (an enclosing class or a surrounding Closure) and later to the delegate (by default the same as the owner). The delegate can be changed, and that is used in RequiresExtension:

    (...)
    condition.setDelegate(new PreconditionContext());
    condition.setResolveStrategy(Closure.DELEGATE_ONLY);
    return condition.call();

PreconditionContext is the delegate class for @Requires and @IgnoreIf providing an execution context. Its getOS() or getJvm() methods are resolved and executed when the os or jvm properties are used in the annotation Closure. In addition to setting a dedicated delegate, the resolve strategy is changed to not bother with the enclosing class and to resolve references only using PreconditionContext.
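
The same mechanism can be observed in isolation in a standalone sketch (plain Groovy, not Spock’s code):

class OwnPreconditionContext {       //a stand-in for Spock's PreconditionContext
    String getOs() { "Linux" }
}

def condition = { os == "Linux" }    //"os" is unresolved at compilation time
condition.delegate = new OwnPreconditionContext()
condition.resolveStrategy = Closure.DELEGATE_ONLY
assert condition()                   //"os" is resolved at runtime via the delegate's getOs()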

If you are new to Groovy and feel confused by the internals, I suggest taking a look at an introduction to Groovy Closures. If you like it, there are a lot more things to learn about Groovy Closures and their power.

Where can I find Spock 1.0?

Please note that this blog post is about features available in the Git branch planned to become Spock 1.0 sometime in the future. They are currently available only in a 1.0-SNAPSHOT version in a separate repository. As they have not been released (at the time of writing), they could look different or even (in the extreme case) not be available in the final version. Be aware.

To add Spock 1.0 as a dependency to your project, define the snapshot repository http://oss.sonatype.org/content/repositories/snapshots/ and the Spock artifact in version 1.0-groovy-2.0-SNAPSHOT. A sample configuration in Gradle:

repositories {
    ...
    maven { url "http://oss.sonatype.org/content/repositories/snapshots/" }
}

dependencies {
    ...
    testCompile 'org.spockframework:spock-core:1.0-groovy-2.0-SNAPSHOT'
}

Summary

The new @Requires extension and the enhancements in @IgnoreIf are the first part of the series about new and noteworthy features in the upcoming Spock 1.0. When will it be released? Citing John Carmack: “It’ll be done when it’s done”. If you would like to bring the release closer, donate your time and contribute to the project.

This post was written using Spock 1.0-groovy-2.0-SNAPSHOT. The examples can be cloned from GitHub.

Test quickly and prosper :).

AssertJ and Awaitility are two of my favorite tools used in automated code testing. Unfortunately, until recently it was not possible to use them together. But then Java 8 entered the game and several dozen lines of code were enough to make it happen in Awaitility 1.6.0.

Awaitility logo

AssertJ provides a rich set of assertions with very helpful error messages, all available through a fluent, type-aware API. Awaitility allows expressing expectations of asynchronous calls in a concise and easy to read way, leveraging an active wait pattern which shortens the duration of tests (no more sleep(5000)!).

The idea to use them together came to my mind a year ago when I was working on an algo trading project using Complex Event Processing (CEP), and I didn’t want to learn Hamcrest assertions just for asynchronous tests with Awaitility. I was able to do a working PoC, but it required some significant duplication of AssertJ (then FEST Assert) code and I shelved the idea. A month ago I was preparing my presentation about asynchronous code testing for the 4Developers conference and asked myself a question: how could Java 8 simplify the usage of Awaitility?

For the few examples I will use an asynchronousMessageQueue which can be used to send a ping request and return the number of received packets. One of the ways to test it with Awaitility in Java 7 (besides a proxy based condition) is to create a Callable class instance:

    @Test
    public void shouldReceivePacketAfterWhileJava7Edition() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(receivedPackageCount(), equalTo(1));
    }

    private Callable<Integer> receivedPackageCount() {
        return new Callable<Integer>() {
            @Override
            public Integer call() throws Exception {
                return asynchronousMessageQueue.getNumberOfReceivedPackets();
            }
        };
    }

where equalTo() is a standard Hamcrest matcher.

The first idea to reduce verbosity is to replace Callable with a lambda expression and inline the private method:

    @Test
    public void shouldReceivePacketAfterWhile() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(() -> asynchronousMessageQueue.getNumberOfReceivedPackets(), equalTo(1));
    }

Much better. Going further, the lambda expression can be replaced with a method reference:

    @Test
    public void shouldReceivePacketAfterWhile() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(asynchronousMessageQueue::getNumberOfReceivedPackets, equalTo(1));
    }

Someone could go even further and remove the Hamcrest matcher:

    @Test
    public void shouldReceivePacketAfterWhile() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(() -> asynchronousMessageQueue.getNumberOfReceivedPackets() == 1);  //poor error message
    }

but while it still works, the error message becomes much less meaningful:

ConditionTimeoutException: Condition with lambda expression in
AwaitilityAsynchronousShowCaseTest was not fulfilled within 2 seconds.

instead of the very clear:

ConditionTimeoutException: Lambda expression in AwaitilityAsynchronousShowCaseTest
that uses AbstractMessageQueueFacade: expected <1> but was <0> within 2 seconds.

The solution would be to use an AssertJ assertion inside the lambda expression:

    @Test
    public void shouldReceivePacketAfterWhileAssertJEdition() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(() -> assertThat(asynchronousMessageQueue.getNumberOfReceivedPackets()).isEqualTo(1));
    }

and thanks to the new AssertionCondition, initially hacked together within a few minutes, it became a reality in Awaitility 1.6.0. Of course the AssertJ fluent API and meaningful failure messages for different data types are preserved.

As a bonus, all assertions that throw AssertionError (so in particular TestNG and JUnit standard assertions) can be used in the lambda expression as well (but I don’t know anyone who has come back to “standard” assertions after getting to know the power of AssertJ).

The nice thing is that the change itself leverages the Runnable class to implement the lambda and AssertJ support, so Awaitility 1.6.0 is still Java 5 compatible. Nevertheless, for the sake of readability, it is only sensible to use the new constructs in Java 8 based projects.

Update. Be aware that in the meantime until() has been renamed to untilAsserted().

Btw, here are the “slides” from my presentation at 4Developers.

4developers logo

Mutation testing allows checking the quality (effectiveness) of automated tests. PIT (aka pitest) is the leading mutation testing tool for the Java environment. My last blog post about PIT, in January 2013, covered version 0.29. Since then the PIT development team has been busy and 4 releases have introduced various new features (besides fixed bugs). In this post I will cover the most important (in my opinion) changes in the project (up to the recently released version 0.33).

PIT mutation testing tool logo

New features

– preliminary support for Java 8 bytecode – PIT can be used with code which contains Java 8 syntax and constructs (including lambdas)
– internal refactoring resulting in much faster “standard” line coverage calculation
– support for parameterized JUnit tests written with Spock (in Groovy) and JUnitParams
– ability to define a coverage threshold (both line and mutation) below which the build will fail – see the sketch after this list
– ability to use PIT with Robolectric
– a new Remove Conditionals Mutator (a conditional statement will always be true – not enabled by default as of 0.33)
– a new Remove Increments Mutator (an increment operation will be removed – not enabled by default as of 0.33)
– ability to choose the JVM to be used for mutation testing
– ability to run PIT only for locally changed files in a Maven build with a configured SCM plugin
– demanding users can define their own strategies for test selection, output format and test prioritization – PIT provides extension points which allow writing custom implementations
– partial support for JUnit categories
– support for mutating static initializers in TestNG
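
As a taste of the threshold feature mentioned in the list, a configuration in gradle-pitest-plugin could look more or less as follows – a sketch only; I assume the plugin exposes PIT’s coverageThreshold and mutationThreshold options under the same names:

pitest {
    targetClasses = ['our.base.package.*']
    coverageThreshold = 80   //assumed name - fail the build below 80% line coverage
    mutationThreshold = 70   //assumed name - fail the build below 70% mutation coverage
}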

In the meantime there were also releases of plugins/tools based on PIT. My plugin for Gradle was enhanced with dynamic task dependency resolution (just “gradle pitest” takes care of all the prerequisites in the Gradle build lifecycle) and support for additional main and test source sets. The plugin for Eclipse has got (inter alia) a new mutation view and an ability to run PIT against all the projects in a workspace.

Not only releases

Besides the new releases, PIT has got a brand new Bootstrap based webpage, a logo (see above) and the source code was migrated from Mercurial on Google Code to GitHub. The nice thing is that the move resulted in a few contributions within the first weeks.

Henry Coles, the author of PIT, also started a new commercial project, FaultSeed – “better mutation testing tools for the JVM” – which will be based on PIT, with the goals of being 50% faster than PIT and also supporting Groovy and Scala. Very promising.

PIT (and mutation testing in general) is becoming more and more popular and recently a number of talks about it were given (including my talk at DevConf.cz 2014 – slides). The number of questions on the project’s mailing list has also significantly increased. And you, have you tried PIT in your project yet?

DevConf.cz 2014 logo

AppFuse 3.0 has been released. It refreshes the technologies used (Java 7+, Spring 4, Spring Security 3.2) and adds a bunch of new ones (Bootstrap 3, PrimeFaces, wro4j, WebJars and finally Apache Wicket).

AppFuse logo

From the project home page:

AppFuse is a full-stack framework for building web applications on the JVM. It was originally developed to eliminate the ramp-up time found when building new web applications for customers. Over the years, it has matured into a very testable and secure system for creating Java-based webapps. At its core, AppFuse is a project skeleton, similar to the one that’s created by your IDE when you click through a wizard to create a new web project.

If you are looking for a foundation for your project, or just would like to see how the same things look in different technologies, you can give AppFuse a try. The quick start page should be a good starting point.

There is also a personal thread in this story. Steadfast readers may remember that over 3 years ago I started working on the Wicket frontend for AppFuse. I definitely prefer working on backends, but I wanted to get to know Wicket (and its famous ability to be tested without Selenium) better, and an engagement in AppFuse development seemed to be a good way to practice those skills. There are still places in the Wicket frontend which need to be polished, but the work is mostly done (which I’m happy about :) ).

As a summary of my work I can say that even Wicket (where all page logic is written in Java or Groovy classes – no more c:forEach tags!) cannot completely remove the pain which comes with the limitations of HTTP (which wasn’t designed for “enterprise applications”) and the differences between browsers (although with jQuery and Bootstrap it is much easier). In addition, 3 years is a lot of time in IT and currently there are even more use cases where component-based server side frameworks aren’t the best solution for making a good looking, responsive, scalable and trendy UI. Maybe it is time to work on an AngularJS frontend ;).

Mutation testing can efficiently detect places in code which are insufficiently covered by tests. The price we have to pay for it is time – a number of mutations have to be tested with a set of unit tests, which takes much longer than calculating “normal” code coverage. The newest PIT 0.29 provides a long awaited feature – incremental analysis. When enabled, PIT stores the results of previous runs on disk and tracks changes in the code and tests, to avoid rerunning analyses whose results should stay the same.

To start using incremental analysis it is necessary to set the historyInputLocation and historyOutputLocation configuration properties. For example, in the Pitest Maven Plugin it could be:

<plugin>
    <groupId>org.pitest</groupId>
    <artifactId>pitest-maven</artifactId>
    <version>0.29</version>
    <configuration>
        <threads>4</threads>
        <historyInputLocation>target/pitHistory.txt</historyInputLocation>
        <historyOutputLocation>target/pitHistory.txt</historyOutputLocation>
    </configuration>
</plugin>

It is worth noticing that for basic usage (i.e. running the analysis from time to time) the input and output history locations will point to the same file. Therefore, in the Gradle plugin for PIT 0.29.0 an additional parameter enableDefaultIncrementalAnalysis was added which, when enabled, automatically sets historyInputLocation and historyOutputLocation to build/pitHistory.txt, simplifying the configuration.

buildscript {
    (...)
    dependencies {
        classpath 'info.solidsoft.gradle.pitest:gradle-pitest-plugin:0.29.0'
        classpath 'org.pitest:pitest:0.29'
    }
}

apply plugin: 'pitest'

pitest {
    targetClasses = ['our.base.package.*']
    threads = 4
    enableDefaultIncrementalAnalysis = true
}

There are a number of optimizations already implemented in PIT. The author warns of potential errors which can be introduced into the analysis this way, but although it is currently an experimental feature, it can dramatically reduce calculation time, especially when used on very large codebases.

Update 2013-04-13: Added a missing line which applies the plugin to a project. Thanks to Bruno de Carvalho for reporting the issue.

Looking for a way to use mutation testing and PIT with your Gradle-based project? Your search is over. The recently released gradle-pitest-plugin makes it possible in a very comfortable way.

In short, the idea of mutation testing is to modify the production code (introduce mutations), which should change its behavior (produce different results) and cause unit tests to fail. The lack of a failure may indicate that the given part of the production code was not covered well enough by the tests. To read more about mutation testing take a look at my previous post or directly at the PIT webpage.

To start using PIT add the following configuration to the build.gradle file in your project:

buildscript {
    repositories {
        mavenLocal()
        mavenCentral()
        //Needed to use a plugin JAR uploaded to GitHub (not available in a Maven repository)
        add(new org.apache.ivy.plugins.resolver.URLResolver()) {
            name = 'GitHub'
            addArtifactPattern 'http://cloud.github.com/downloads/szpak/[module]/[module]-[revision].[ext]'
        }
    }
    dependencies {
        classpath 'info.solidsoft.gradle.pitest:gradle-pitest-plugin:0.28.0'
        classpath 'org.pitest:pitest:0.28'
    }
}

This will add the required dependencies to the build script together with a proper repository configuration.

The second thing is to configure the plugin itself:

pitest {
    targetClasses = ['our.base.package.*']
    threads = 4
}

The only required parameter is “targetClasses” – a package containing the code which should be mutated (usually the base package of your project), but in case your tests are written in a thread safe manner I encourage you to give the “threads” parameter a try. It can decrease the time required for the mutation analysis dramatically. gradle-pitest-plugin supports all reasonable parameters available in PIT.
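
For instance, a slightly extended configuration could look like the sketch below – outputFormats and timeoutConstInMillis are assumed here to mirror PIT’s corresponding command line options (check the plugin README for the authoritative list):

pitest {
    targetClasses = ['our.base.package.*']
    threads = 4
    outputFormats = ['HTML', 'XML']   //assumed to mirror PIT's outputFormats option
    timeoutConstInMillis = 4000       //assumed to mirror PIT's timeoutConst option
}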

Having everything configured running mutation testing is as easy as:

gradle pitest

After a while you should see a PIT summary similar to:

================================================================================
- Statistics
================================================================================
Generated 59 mutations Killed 52 (88%)
Ran 161 tests (2.73 tests per mutation)
================================================================================

Detailed reports with information about survived mutations and the corresponding parts of code are written to the build/reports/pitest/ directory relative to your project root.

Sample PIT report

Btw, version 0.29 of PIT is just around the corner, providing such interesting features as incremental analysis (i.e. running mutation tests only for code that has changed). I plan to write a post about it when it is released, so stay tuned.

Disclaimer. I am the author of gradle-pitest-plugin.

Mutation testing is a technique which allows discovering which parts of our code are not covered by tests. It is similar to code coverage, but mutation testing is not limited to the fact that a given line was executed during the tests. The idea is to modify the production code (introduce mutations), which should change its behavior (produce different results) and cause unit tests to fail. The lack of a failure may indicate that a given part was not covered well enough by the tests. The idea of mutation testing is quite old, but it is rather unpopular. Despite the fact that I am rather experienced in testing, I found it only recently, while reviewing a beta version of a new book about testing.

PIT is “a fast bytecode based mutation testing system for Java that makes it possible to test the effectiveness of your unit tests”. It is a very young, but very promising project. It offers a set of mutation operators which, among others, modify conditional statements, mathematical operations, return values and method calls.
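
To visualize what a mutation is, consider this hand-written illustration (not actual PIT output) of a conditionals boundary mutation:

//original production code
int max(int a, int b) {
    return a > b ? a : b
}

//a possible mutant - ">" changed into ">="
//a test suite without the a == b boundary case will not kill it
int maxMutated(int a, int b) {
    return a >= b ? a : b
}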

Starting with the recently released version 0.25, PIT (experimentally) supports TestNG based tests (in addition to JUnit based ones). To use it from Maven it is required to add the pitest-maven plugin to pom.xml:

<plugin>
    <groupId>org.pitest</groupId>
    <artifactId>pitest-maven</artifactId>
    <version>0.25</version>
<!--
    <configuration>
        <inScopeClasses>
            <param>info.solidsoft.blog.pitest.*</param>
        </inScopeClasses>
        <targetClasses>
            <param>info.solidsoft.blog.pitest.*</param>
        </targetClasses>
    </configuration>
-->
</plugin>

In many cases this would be enough. inScopeClasses (mutable classes and tests to run) and targetClasses (only candidates for mutation) by default use the project groupId and usually can be omitted. There are several options that can be set in the plugin configuration. “mvn org.pitest:pitest-maven:mutationCoverage” runs the modified tests and generates a mutation report which by default is saved in the target/pit-reports/yyMMddHHmm directory.

A sample report (click to enlarge) for a specified class shows both line coverage and mutation coverage. Despite 100% line coverage (lines with a light green background), PIT found that the test data set does not properly cover boundary conditions.

Sample PIT report

On New Year’s Day I released version 0.5.2 of AppInfo – a library providing an easy way to version your application. I wrote more about AppInfo in my previous post – a step by step tutorial on how to use it. In 0.5.2 the ManifestReader hierarchy was internally refactored (duplication was removed, better coverage provided). I also made the test and code coverage reports, created automatically by the CI server, publicly available – I have no reason to be ashamed of my code. After all, working code is not enough :).