Posts Tagged ‘testng’

AssertJ and Awaitility are two of my favorite tools for automated code testing. Unfortunately, until recently it was not possible to use them together. But then Java 8 entered the game and a few dozen lines of code were enough to make it happen in Awaitility 1.6.0.


AssertJ provides a rich set of assertions with very helpful error messages, all available through a fluent, type-aware API. Awaitility makes it possible to express expectations of asynchronous calls in a concise and easy to read way, leveraging an active wait pattern which shortens the duration of tests (no more sleep(5000)!).
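To give the flavour of that API, here is a minimal, purely illustrative snippet (the collection and the values are made up):

    //assumes: import static org.assertj.core.api.Assertions.assertThat;
    //         plus java.util.Arrays and java.util.List
    @Test
    public void shouldIllustrateFluentAssertJApi() {
        List<String> tools = Arrays.asList("AssertJ", "Awaitility");
        //the type aware API offers only assertions valid for the asserted type
        //and on failure reports the actual content together with the expectation
        assertThat(tools)
                .hasSize(2)
                .contains("AssertJ")
                .doesNotContain("Hamcrest");
    }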

The idea to use them together came to my mind a year ago when I was working on an algo trading project using Complex Event Processing (CEP) and I didn't want to learn Hamcrest assertions just for asynchronous tests with Awaitility. I was able to do a working PoC, but it required duplicating a significant amount of AssertJ (then FEST Assert) code, so I shelved the idea. A month ago, while preparing my presentation about asynchronous code testing for the 4Developers conference, I asked myself a question: how could Java 8 simplify the usage of Awaitility?

For the examples I will use an asynchronousMessageQueue which can be used to send a ping request and to return the number of received packets. One of the ways to test it with Awaitility in Java 7 (besides a proxy based condition) is to create a Callable class instance:

    @Test
    public void shouldReceivePacketAfterWhileJava7Edition() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(receivedPackageCount(), equalTo(1));
    }

    private Callable<Integer> receivedPackageCount() {
        return new Callable<Integer>() {
            @Override
            public Integer call() throws Exception {
                return asynchronousMessageQueue.getNumberOfReceivedPackets();
            }
        };
    }

where equalTo() is a standard Hamcrest matcher.

The first idea to reduce verbosity is to replace Callable with a lambda expression and inline the private method:

    @Test
    public void shouldReceivePacketAfterWhile() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(() -> asynchronousMessageQueue.getNumberOfReceivedPackets(), equalTo(1));
    }

Much better. Going further, the lambda expression can be replaced with a method reference:

    @Test
    public void shouldReceivePacketAfterWhile() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(asynchronousMessageQueue::getNumberOfReceivedPackets, equalTo(1));
    }

Someone could go even further and remove the Hamcrest matcher:

    @Test
    public void shouldReceivePacketAfterWhile() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(() -> asynchronousMessageQueue.getNumberOfReceivedPackets() == 1);  //poor error message
    }

but while it still works the error message becomes much less meaningful:

ConditionTimeoutException: Condition with lambda expression in
AwaitilityAsynchronousShowCaseTest was not fulfilled within 2 seconds.

instead of very clear:

ConditionTimeoutException: Lambda expression in AwaitilityAsynchronousShowCaseTest
that uses AbstractMessageQueueFacade: expected <1> but was <0> within 2 seconds.

The solution is to use an AssertJ assertion inside the lambda expression:

    @Test
    public void shouldReceivePacketAfterWhileAssertJEdition() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(() -> assertThat(asynchronousMessageQueue.getNumberOfReceivedPackets()).isEqualTo(1));
    }

and thanks to the new AssertionCondition, initially hacked together within a few minutes, it became a reality in Awaitility 1.6.0. Of course the AssertJ fluent API and meaningful failure messages for different data types are preserved.

As a bonus, all assertions that throw AssertionError (so in particular standard TestNG and JUnit assertions) can be used in the lambda expression as well (though I don't know anyone who has gone back to "standard" assertions after getting to know the power of AssertJ).
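For instance, a plain TestNG assertion could be dropped into the same place (just a sketch relying on the Awaitility 1.6.0 support described above; it assumes a static import of org.testng.Assert.assertEquals):

    @Test
    public void shouldReceivePacketAfterWhileTestNGAssertEdition() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(() -> assertEquals(asynchronousMessageQueue.getNumberOfReceivedPackets(), 1));
    }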

The nice thing is that the change itself leverages the Runnable class to implement the lambda and AssertJ support, so Awaitility 1.6.0 remains Java 5 compatible. Nevertheless, for the sake of readability, it only makes sense to use the new constructs in Java 8 based projects.
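For illustration only, the lambda passed to until() above is effectively a Runnable, so the same condition could be written pre-Java 8 with an anonymous class (a sketch):

    @Test
    public void shouldReceivePacketAfterWhileAssertJJava7Edition() {
        //when
        asynchronousMessageQueue.sendPing();
        //then
        await().until(new Runnable() {
            @Override
            public void run() {
                assertThat(asynchronousMessageQueue.getNumberOfReceivedPackets()).isEqualTo(1);
            }
        });
    }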

Btw, here are the “slides” from my presentation at 4Developers.



Recently I wanted to configure the ability to run both TestNG and JUnit tests in one Maven module (project). In the end I worked out how to do it in a clean and short way, but before that I found a few different solutions on the web (top 5 in Google), some of which didn't work and the rest applied to earlier versions of the Surefire plugin and were overly complicated (e.g. two separate executions). Therefore I decided to write this short post to show how it can be done with Surefire 2.13 – the newest version available in March 2013.

Mixing those test frameworks in one module can be done just by adding both the JUnit and TestNG Surefire providers as plugin dependencies (not as project dependencies):

<plugin>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>${surefire.version}</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.maven.surefire</groupId>
            <artifactId>surefire-junit47</artifactId>
            <version>${surefire.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.maven.surefire</groupId>
            <artifactId>surefire-testng</artifactId>
            <version>${surefire.version}</version>
        </dependency>
    </dependencies>
</plugin>

As a result, both test types are executed.

[INFO] --- maven-surefire-plugin:2.13:test (default-test) @ junit-testng-poc ---
[INFO] Surefire report directory: /tmp/junit-testng-poc/target/surefire-reports
[INFO] Using configured provider org.apache.maven.surefire.junitcore.JUnitCoreProvider
[INFO] Using configured provider org.apache.maven.surefire.testng.TestNGProvider

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
parallel='none', perCoreThreadCount=true, threadCount=2, useUnlimitedThreads=false
Running info.solidsoft.rnd.junit.testng.SampleJUnitTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.253 sec

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0


-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running TestSuite
Configuring TestNG with: TestNG652Configurator
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.869 sec

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
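For reference, the tests picked up above could be as simple as the following sketch (the JUnit class name mirrors the log output; the TestNG class name and the assertions are purely illustrative):

// src/test/java/info/solidsoft/rnd/junit/testng/SampleJUnitTest.java
public class SampleJUnitTest {

    @org.junit.Test
    public void shouldBeExecutedByJUnitProvider() {
        org.junit.Assert.assertEquals(4, 2 + 2);
    }
}

// src/test/java/info/solidsoft/rnd/junit/testng/SampleTestNGTest.java
public class SampleTestNGTest {

    @org.testng.annotations.Test
    public void shouldBeExecutedByTestNGProvider() {
        org.testng.Assert.assertEquals(2 + 2, 4);
    }
}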

Why use TestNG together with JUnit? TestNG has a few features which are unavailable or less flexible in JUnit (just to mention a few: dependencies between tests and groups of tests (irreplaceable for integration tests with a long startup), parametrized tests, concurrent execution, or per suite/group/class init/shutdown operations). Therefore it is tempting to migrate existing tests from JUnit to TestNG. With a large code base it may not be easy to migrate all of them at once, and the presented configuration allows writing new tests in TestNG and rewriting the old ones when appropriate.
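A few of those features in action (a condensed, purely illustrative class – the names are made up):

import org.testng.annotations.BeforeSuite;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class TestNGFeaturesShowcaseTest {

    @BeforeSuite    //one-time setup shared by the whole suite
    public void startExpensiveResource() { /* ... */ }

    @Test(groups = "integration")
    public void shouldConnect() { /* ... */ }

    @Test(dependsOnMethods = "shouldConnect")    //skipped (not failed) when the dependency fails
    public void shouldQueryData() { /* ... */ }

    @DataProvider(name = "sums")
    public Object[][] sums() {
        return new Object[][]{{1, 1, 2}, {2, 3, 5}};
    }

    @Test(dataProvider = "sums")    //parametrized test - executed once per data row
    public void shouldAdd(int a, int b, int expectedSum) {
        org.testng.Assert.assertEquals(a + b, expectedSum);
    }
}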

The whole working example can be found in my GitHub repository.

Btw, it is worth mentioning that, thanks to the fact that TestNG also generates reports in JUnit's XML format, all the tools compatible with JUnit (Jenkins, Sonar, …) will merge test results from JUnit and TestNG and display all of them properly.

Btw2, the same configuration also works with the Failsafe plugin.

Btw3, thanks to the fact that the Spock Framework is under the hood a runner for JUnit, the presented trick can also be used to mix it with TestNG. This requires some additional work to integrate Groovy as well, and I plan to write about it in one of my future posts.

Mutation testing is a technique which makes it possible to discover which parts of our code are not covered by tests. It is similar to code coverage, but mutation testing is not limited to checking that a given line was executed during tests. The idea is to modify the production code (introduce mutations) in a way which should change its behavior (produce different results) and cause unit tests to fail. The lack of such a failure may indicate that a given part is not covered well enough by the tests. The idea of mutation testing is quite old, but it is rather unpopular. Despite being rather experienced in testing, I found it only recently while reviewing a beta version of a new book about testing.

PIT is "a fast bytecode based mutation testing system for Java that makes it possible to test the effectiveness of your unit tests". It is a very young but very promising project. It offers a set of mutation operators which, among others, modify conditional statements, mathematical operations, return values and method calls.

Starting with the recently released version 0.25, PIT (experimentally) supports TestNG based tests (in addition to JUnit based ones). To use it from Maven it is required to add the pitest-maven plugin to pom.xml:

<plugin>
    <groupId>org.pitest</groupId>
    <artifactId>pitest-maven</artifactId>
    <version>0.25</version>
<!--
    <configuration>
        <inScopeClasses>
            <param>info.solidsoft.blog.pitest.*</param>
        </inScopeClasses>
        <targetClasses>
            <param>info.solidsoft.blog.pitest.*</param>
        </targetClasses>
    </configuration>
-->
</plugin>

In many cases that would be enough. inScopeClasses (classes to mutate and tests to run) and targetClasses (candidates for mutation only) by default use the project groupId and usually can be omitted. There are several more options that can be set in the plugin configuration. "mvn org.pitest:pitest-maven:mutationCoverage" runs the tests against the mutated code and generates a mutation report which by default is saved in the target/pit-reports/yyMMddHHmm directory.

A sample report (click to enlarge) for a selected class shows both line coverage and mutation coverage. Despite 100% line coverage (lines with a light green background), PIT found that the test data set does not properly cover boundary conditions.

Sample PIT report
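To illustrate the kind of gap visible in such a report, consider a trivial, made up example. The test below gives 100% line coverage, yet PIT's conditionals boundary mutator (changing > to >=) produces a mutant which survives, because the boundary value itself is never checked:

public class NumberClassifier {
    public boolean isPositive(int value) {
        return value > 0;    //the ">= 0" mutant is not killed by the test below
    }
}

    //assumes static imports of assertTrue/assertFalse (TestNG or JUnit - both work)
    @Test
    public void shouldClassifyNumbers() {
        assertTrue(new NumberClassifier().isPositive(5));
        assertFalse(new NumberClassifier().isPositive(-5));
        //the boundary value 0 is never tested, so the surviving mutant is reported by PIT
    }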

To be useful, automatic tests should run very fast. Otherwise they will not be run often during development, or will even be ignored in the default configuration on developer workstations. The simplest rule is to write only small unit tests which test a given class with its neighborhood mocked. Nevertheless, sometimes it is useful/required to test something in an IoC container context (Spring, CDI, Guice) or using an embedded database (H2, HyperSQL, Derby). Unfortunately even a few tests of that kind can significantly increase the overall test execution time. I had that situation in one of my projects, and to avoid resorting to the skipTests flag I developed a solution using groups from TestNG and the Maven Surefire plugin.

Tests were divided into three groups:
  – very fast real unit tests (all tests by default) – should be run very often during development (from the IDE or by mvn test, mvn package)
  – slower but self-sufficient integration tests (setting up a Spring context and/or using an embedded H2 database) – should be run at least before a commit/push or while working on a given part (from the IDE or by mvn integration-test, mvn install)
  – real integration tests (requiring access to remote servers – e.g. testing web services or REST) – should be run daily by a CI server or by developers working on the integration (mvn install, mvn integration-test with an additional profile enabled)

To achieve that, the given tests (or test classes) have to be marked as "self-integration" or "integration" (at method or class level):

    @Test(groups = "self-integration")
    public void shouldInitializeChainedAppInfoProperly() {
@Test(groups = "integration")
public class FancyWebServiceIntegrationTest {

The Maven Surefire plugin should be configured to exclude the "self-integration" and "integration" test groups from the default execution and to add a "self-integration" execution bound to the "integration-test" phase:

    <build>
        <plugins>
            (...)
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>${ver.surefire-plugin}</version>
                <executions>
                    <execution>
                        <id>default-test</id> <!-- to override default configuration - in fact: unit tests -->
                        <configuration>
                            <excludedGroups>self-integration,integration</excludedGroups>
                        </configuration>
                    </execution>
                    <execution>
                        <id>self-integration</id>
                        <phase>integration-test</phase>
                        <goals>
                            <goal>test</goal>
                        </goals>
                        <configuration>
                            <groups>self-integration</groups>
                            <reportsDirectory>target/self-integration-surefire-reports/</reportsDirectory>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

In addition (if needed) a separate profile with the "integration" test group configured in the "integration-test" phase can be created:

    <profiles>
        (...)
        <profile>
            <id>integration</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>${ver.surefire-plugin}</version>
                        <executions>
                            <execution>
                                <id>integration</id>
                                <phase>integration-test</phase>
                                <goals>
                                    <goal>test</goal>
                                </goals>
                                <configuration>
                                    <groups>integration</groups>
                                    <reportsDirectory>target/integration-surefire-reports/</reportsDirectory>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>

A working example can be found in AppInfo's artificial branch (pom.xml and a sample test class). It's easy to adapt it to your needs.

All three test groups have separate report directories so that they do not override each other. As an extension it should be possible to merge them into one aggregated test report.

TestNG is a testing framework created as an annotation driven alternative to JUnit 3, in times when "extends TestCase" was an indispensable part of writing tests. Even now it provides some interesting features like data providers, parallel test execution or test groups. When our tests are not executed from an IDE, it's often useful to take a look at the test results in an HTML report. The original TestNG report looks… raw. What is more, it is not very intuitive or readable. There is an alternative – ReportNG. It provides better looking and more lucid HTML test reports.

More information about ReportNG can be found on its webpage, but when I tried to use it for my AppInfo library in Maven builds running on a CI server I had a problem finding any at-a-glance guide on how to use it with Maven. Fortunately there are samples for Ant and Gradle, so I was able to figure it out, and I hope that with this post everyone wanting to use ReportNG with Maven will be able to achieve it without any problem within a few minutes.

First, an additional dependency has to be added to pom.xml:

<dependencies>
    <dependency>
        <groupId>org.uncommons</groupId>
        <artifactId>reportng</artifactId>
        <version>1.1.2</version>
        <scope>test</scope>
        <exclusions>
            <exclusion>
                <groupId>org.testng</groupId>
                <artifactId>testng</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    (...)
</dependencies>

Usually a newer TestNG version is already used in the project, so the TestNG dependency pulled in transitively by ReportNG should be excluded.

Next, the Surefire plugin has to be configured:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.5</version>
            <configuration>
                <properties>
                    <property>
                        <name>usedefaultlisteners</name>
                        <value>false</value>
                    </property>
                    <property>
                        <name>listener</name>
                        <value>org.uncommons.reportng.HTMLReporter, org.uncommons.reportng.JUnitXMLReporter</value>
                    </property>
                </properties>
                <workingDirectory>target/</workingDirectory>
            </configuration>
        </plugin>
        (...)
    </plugins>
</build>

ReportNG uses two reporters pluggable into TestNG. JUnitXMLReporter generates an XML summary of the executed tests, intended for tools (like a CI server). HTMLReporter creates a human readable HTML report. The default TestNG listeners should be disabled.

After the first test run I also added a workingDirectory property, which causes velocity.log (a file created by the Velocity engine used internally by ReportNG) to be placed in target instead of the main project directory (and therefore to be deleted by the "mvn clean" command).

One more thing. Unfortunately the ReportNG jar isn't available in the Maven Central Repository, so it may be required to add the java.net repository to your pom.xml or settings.xml.

<repositories>
    <repository>
        <id>java-net</id>
        <url>http://download.java.net/maven/2</url>
    </repository>
    (...)
</repositories>

That’s all. Now “mvn clean test” should generate a nice looking HTML report for lots of tests covering our project :).

Update 2012-08-23. This post was written with TestNG 5.x in mind. With TestNG 6.0+ you can run into a "ClassNotFoundException: com.google.inject.Module" exception. In that case the Guice dependency needs to be added. Thanks to Alexander Schikora for pointing it out.

<dependency>
    <groupId>com.google.inject</groupId>
    <artifactId>guice</artifactId>
    <version>3.0</version>
    <scope>test</scope>
</dependency>