Pitest: Measure the Quality of your Unit Tests with Mutation Testing

It is not uncommon for developers to discuss the quality of automated unit tests: are they testing enough of the application code? And, more importantly, are they really verifying the expected behavior?

The first question has a relatively simple answer: use automated code coverage tools that track which lines of code and which branches in the execution flow are exercised by tests. Code coverage reports are very helpful to 1) determine which portions of the application code are not being tested; and 2) if code coverage is measured per individual test, determine whether each test is effectively testing the appropriate piece of application code. If you are interested in techniques for that, you may want to look at this other blog post: http://deors.wordpress.com/2014/07/04/individual-test-coverage-sonarqube-jacoco/

However, no matter how useful it is to measure code coverage, these reports will not tell you one fundamental thing about your tests: which behavior is being verified!

Simply put, your test code may execute every single line of your application code without verifying anything. If you are familiar with the JUnit framework: your test code may not contain a single assertion!
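As a trivial illustration (the class and method names are made up), the following JUnit test executes the code under test, so coverage looks perfect, yet it verifies absolutely nothing:

import org.junit.Test;

public class PriceCalculatorTest {

    @Test
    public void testApplyDiscount() {
        PriceCalculator calculator = new PriceCalculator();
        // every line of applyDiscount() is executed, so coverage is reported...
        calculator.applyDiscount(100.0, 10);
        // ...but there is not a single assertion: any result would pass this test
    }
}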

To overcome this limitation of automated unit testing, one technique that can be of great help is Mutation Testing.

Mutation Testing… Explained

Let’s assume you have your application code and your test code as usual. A mutation testing tool will take your application code and make small, surgical changes, one at a time, each one a so-called “mutation”. It could be changing a relational operator in an if statement (e.g. > is changed to <=), removing some service call, changing some for loop, altering some return value, and so forth.

Mutation testing is, therefore, based on a simple assumption: if your tests verify the behavior with the right assertions, then re-executing them against a mutated version of the application code should make at least one of them fail.
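A hypothetical example (names are made up) of the kind of change a mutation testing tool introduces, and of the assertion needed to detect it:

// application code under test
public class AgeChecker {
    public boolean isAdult(int age) {
        return age > 18;   // a typical mutation changes this to: return age <= 18;
    }
}

// test code: the assertion fails when the mutant runs, so the mutation is killed;
// without the assertion, the mutant would survive unnoticed
import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class AgeCheckerTest {
    @Test
    public void adultIsDetected() {
        assertTrue(new AgeChecker().isAdult(30));
    }
}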

Pitest – A Mutation Testing Framework for Java

Although very interesting, such a technique would be useless without the proper tools. There are several, for different languages, like Jester, Jumble or NinjaTurtles, but probably the most mature and powerful we have seen to date is Pitest (http://pitest.org).

Working with Pitest is very simple and requires minimal effort to start. It can be integrated with build tools like Maven, Ant or Gradle, with IDEs like Eclipse (Pitclipse plug-in) or IntelliJ, and with quality tools like SonarQube.

Regardless of the way you execute it, Pitest will analyze the application bytecode and decide which mutations will be introduced (for a full list and description of the available mutators in Pitest, check their site here: http://pitest.org/quickstart/mutators/).

To optimize the test execution as much as possible, Pitest gathers code coverage metrics in a “normal execution” and then, for each mutation, re-executes only the test cases that cover the mutated code. Even so, total execution time is noticeably longer than a normal unit test run, basically because of the extensive test harness that Pitest wraps around even the simplest of code bases.

As a result, Pitest generates a fully detailed report showing which mutations “lived” after the execution, that is, which mutations were not detected by any existing assertion. These “lived” mutations are your main focus, because they mean that there is some logic, some return value, or some call that is not being verified.

Of course, not all of the mutations will be meaningful. Some may produce out-of-memory errors or infinite loops. Pitest does its best to detect those cases and remove them from the resulting reports. This behavior can be fine-tuned if needed, for example by adjusting timeouts and other parameters, but the sensible defaults work really well to start with.

Pitest in Action

Seeing is believing, so we put Pitest to work on a simple 10-class Java library. We decided to use the Maven plug-in, as this method requires zero configuration to start. We opened a command prompt at the project directory and just executed this command:

> mvn org.pitest:pitest-maven:1.0.0:mutationCoverage

After a few minutes (5 to 6 for this project) and lots of iterations showing in the console, the build finishes and the reports are generated in the target directory:

> target\pit-reports\201408181908\index.html

When the report loaded in the browser, the first thing that caught our attention was that one class we had worked hard to test fully, AbstractContext, showed one lived mutation despite having 100% code coverage. Oops, something was not properly verified. Was Pitest right?

mutation-1

After clicking the class name, we could see the details of where the lived mutation was found:

mutation-2

Pitest was right! Although that method is fully tested, and there are test cases for every single execution flow, we were missing the proper assertion for that if statement. Really hard to catch without a good tool helping us find out more about our unit tests.

Of course, the next step was to add the forgotten assertion to the relevant test method. Once done, we re-launched Pitest. After a few minutes, a new set of reports was created and, once loaded in the browser… a clean result for that class!

mutation-4

Conclusion

Although we were arguably a bit fortunate to obtain such a telling result on the first try, it is true that a more thorough inspection of the reports revealed many other places where assertions were missing.

Our view is that Pitest is a very valuable tool for writing really meaningful and truly useful automated unit test suites, and it should be standard gear for Java projects going forward. It is simple to use, requires zero or minimal configuration, and produces valuable results that directly impact the quality of the tests we create, and therefore the quality of our deliverables.

To mutate, or not to mutate: that is the question.
Whether 'tis nobler in the mind to suffer
The slings and arrows of outrageous unit tests.

Script to List Key Job Settings in Jenkins at a Glance

One can get addicted to scripting in Jenkins quickly! ;-)

When you have dozens or even hundreds of jobs in Jenkins, it is really important to have a way to review or change job settings in one shot. One of my favorite scripts, which I use when I want to get key settings from all jobs at a glance, is this one:

import hudson.model.*
import hudson.maven.*

for (job in Hudson.instance.items) {
  if (job instanceof MavenModuleSet) {
    mms = (MavenModuleSet) job
    def name = mms.name
    // "def" in the output means the job uses the default JDK
    def jdk = "def"
    if (mms.JDK != null) {
      jdk = mms.JDK.name
    }
    // a null value in the output means the job uses the default Maven installation
    def mvn = mms.mavenName
    def goals = mms.goals
    printf("%-50s | %-10s | %-15s | %-50s\n", name, jdk, mvn, goals)
  }
}

And this is an example output. I love it! :-D (Be sure to scroll right to see the full output.)

deors.demos.annotations.base                       | jdk-8      | null            | clean install                                     
deors.demos.annotations.base.client                | jdk-8      | null            | clean test                                        
deors.demos.annotations.base.processors            | jdk-8      | null            | clean install                                     
deors.demos.annotations.beaninfo                   | jdk-8      | null            | clean install                                     
deors.demos.annotations.beaninfo.client            | jdk-8      | null            | clean test                                        
deors.demos.annotations.beaninfo.processors        | jdk-8      | null            | clean install                                     
deors.demos.annotations.velocity.client            | jdk-8      | null            | clean test                                        
deors.demos.annotations.velocity.processors        | jdk-8      | null            | clean install                                     
deors.demos.batch.springbatch2                     | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.cloud.gae                              | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.cloud.heroku                           | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.cloud.rhc                              | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.cloud.vmc                              | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.java8                                  | jdk-8      | maven-3.2.1     | clean verify                                      
deors.demos.testing.arquillian                     | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.testing.arquillian-glassfish-embedded  | jdk-7      | null            | clean verify -Parquillian-glassfish-embedded      
deors.demos.testing.arquillian-glassfish-remote    | jdk-7      | null            | clean verify -Parquillian-glassfish-remote,!arquillian-glassfish-embedded
deors.demos.testing.arquillian-jboss-managed       | jdk-7      | null            | clean verify -Parquillian-jboss-managed,!arquillian-glassfish-embedded
deors.demos.testing.arquillian-jboss-remote        | jdk-7      | null            | clean verify -Parquillian-jboss-remote,!arquillian-glassfish-embedded
deors.demos.testing.arquillian-weld-embedded       | jdk-7      | null            | clean verify -Parquillian-weld-embedded,!arquillian-glassfish-embedded
deors.demos.testing.htmlunit                       | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.testing.htmlunit-cargo-glassfish       | jdk-7      | null            | -P glassfish cargo:redeploy                       
deors.demos.testing.htmlunit-cargo-jboss           | jdk-7      | null            | -P jboss cargo:redeploy                           
deors.demos.testing.htmlunit-cargo-tomcat          | jdk-7      | null            | -P tomcat cargo:redeploy                          
deors.demos.testing.htmlunit-deploy-glassfish      | jdk-7      | null            | clean install                                     
deors.demos.testing.htmlunit-deploy-tomcat         | jdk-7      | null            | clean install                                     
deors.demos.testing.mocks                          | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.testing.selenium                       | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.testing.selenium-cargo-glassfish       | jdk-7      | null            | -P glassfish cargo:redeploy                       
deors.demos.testing.selenium-cargo-jboss           | jdk-7      | null            | -P jboss cargo:redeploy                           
deors.demos.testing.selenium-cargo-tomcat          | jdk-7      | null            | -P tomcat cargo:redeploy                          
deors.demos.testing.selenium-deploy-glassfish      | jdk-7      | null            | clean install                                     
deors.demos.testing.selenium-deploy-tomcat         | jdk-7      | null            | clean install                                     
deors.demos.web.gwt2                               | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.web.gwt2spring                         | jdk-7      | maven-3.2.1     | clean verify                                      
deors.demos.web.springmvc3                         | jdk-7      | maven-3.2.1     | clean verify                                      
petclinic-1-build-test                             | jdk-7      | maven-3.2.1     | clean test                                        
petclinic-2-package                                | jdk-7      | null            | package -DskipTests=true                          
petclinic-3-tomcat-run                             | jdk-7      | null            | cargo:run -Pcargo-tomcat                          
petclinic-4-verify-selenium-htmlunit               | jdk-7      | maven-3.2.1     | failsafe:integration-test -P selenium-tests       
petclinic-5-verify-jmeter                          | jdk-7      | maven-3.2.1     | jmeter:jmeter -P jmeter-tests                     
petclinic-6-tomcat-stop                            | jdk-7      | null            | cargo:stop -Pcargo-tomcat                         
petclinic-9a-verify-selenium-openshift             | jdk-7      | maven-3.2.1     | failsafe:integration-test -P selenium-tests       
petclinic-9b-verify-selenium-heroku                | jdk-7      | maven-3.2.1     | failsafe:integration-test -P selenium-tests       
petclinic-full-all-browsers                        | jdk-7      | maven-3.2.1     | clean verify -P cargo-tomcat,selenium-tests       
petclinic-full-htmlunit-sonar                      | jdk-7      | maven-3.2.1     | clean verify -P cargo-tomcat,selenium-tests,jmeter-tests

Script to Update Jenkins Jobs to Use a Different Maven Instance

One of the features of Jenkins that I like a lot, very useful when you need to make bulk changes to your jobs' configuration, is the script console.

The script console allows you to run Groovy scripts that can read and alter the state of the jobs, or any other piece of configuration or state exposed through the Jenkins API or through its plug-ins' respective APIs.

Today I decided to take a look at Apache Maven 3.2.1 and I wanted to easily test all my existing jobs with this new version. As I’m “lazy” ;-) and didn’t want to update the jobs one by one, I created this script to do the job for me in no time. Hope you enjoy it!

import hudson.maven.*
import hudson.model.*
import hudson.tasks.*
import hudson.tools.*

oldMavenName = "maven-3.0.4"
newMavenName = "maven-3.2.1"

Maven.MavenInstallation oldMaven
Maven.MavenInstallation newMaven

// look for old and new Maven installations
// useful to detect that something is not well configured
for (ti in ToolInstallation.all()) {
  if (ti instanceof Maven.MavenInstallation.DescriptorImpl) {
    for (i in ti.installations) {
      if (i.name.equals(oldMavenName)) {
        oldMaven = i
      } else if (i.name.equals(newMavenName)) {
        newMaven = i
      }
    }
  }
}

println("migrating jobs from Maven: " + oldMaven.name)
println("to Maven: " + newMaven.name)
println()

// locate the jobs and update the Maven installation
// optionally filter them by some prefix or regex
for (job in Hudson.instance.items) {
  if (job instanceof MavenModuleSet) {
    mms = (MavenModuleSet) job
    if (mms.name.startsWith("some.prefix")) {
      println("job " + mms.name + " currently using: " + mms.mavenName)
      // if name is null, it means the default Maven installation
      if (mms.mavenName == null || mms.mavenName == oldMaven.name) {
        mms.mavenName = newMaven.name
        // persist the configuration change to disk
        mms.save()
        println(" migrate to: " + mms.mavenName)
      } else {
        println(" no migration needed")
      }
    }
  }
}

Code Coverage of Individual Tests with SonarQube and JaCoCo

This post explains how to enable SonarQube to gather test code coverage metrics for individual tests. Code coverage tools typically produce a report showing the code coverage (by line, branch, etc.) for the combined effect of all the tests executed during a given test session. This is the case, for example, when you run unit tests in continuous integration. With the help of SonarQube and JaCoCo, it is possible to gather coverage metrics split at the level of the individual test case (a test method in JUnit or TestNG). To enable this, there is some special configuration required, which we are showing in this post.

The Environment

The following process has been verified with SonarQube versions 4.1.2 and 4.3.2, but it should work with SonarQube 3.7.x (the latest LTS release), too. The application code we have used to verify the setup is the familiar Spring Pet Clinic application, enhanced to support Tomcat 7 and Spring 3 (see this post for reference on the updates needed in Pet Clinic: http://deors.wordpress.com/2012/09/06/petclinic-tomcat-7/). The code can be downloaded from GitHub in the repository: https://github.com/deors/deors.demos.petclinic

The Instructions

The instructions are really simple, once you have figured out how to connect all the dots. All that is required is to add some specific configuration to the Maven Surefire plug-in (Surefire is the plug-in tasked with unit test execution, and it supports both JUnit and TestNG). As this specific configuration should not impact the regular unit test execution, it is recommended to include it in a separate profile that will be activated only when the SonarQube analysis is performed. Let’s describe the required changes in the pom.xml file, section by section.

Build Section

No changes are needed here. However, you should take note of any customised configuration of Maven Surefire, to be sure it is also applied to the profile we are going to create. In the case of Spring Pet Clinic, this is the relevant portion of the POM, written down here for reference:

<build><plugins>
...
 <plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.13</version>
  <configuration>
   <argLine>-XX:-UseSplitVerifier</argLine>
   <includes>
    <include>**/*Test.java</include>
    <include>**/*Tests.java</include>
   </includes>
   <excludes>
    <exclude>**/it/*IT.java</exclude>
   </excludes>
  </configuration>
 </plugin>
...
</plugins></build>

This piece of configuration tells Surefire to: 1) exclude the integration tests from the unit test execution (integration tests are covered by Surefire’s twin plug-in, Failsafe); and 2) disable the byte code verifier, preventing runtime errors when classes are instrumented (e.g. when adding mocks, or TopLink enhancements).

Dependencies Section

Again, no changes are needed in this section. We just wanted to note that if your project is already leveraging JaCoCo to gather integration test coverage metrics, and is explicitly referring to the JaCoCo artefact in this section, it can be left in place – no conflicts have been identified so far. In any case it should not be needed here, so it is probably safer to remove it from this section.

Profiles Section

All the required changes come in this section, and they are very clean to make, as all that is needed is a new profile in the POM. This profile configures a special listener for Surefire that ensures that coverage metrics for each individual test case are appropriately gathered. To guarantee a successful test execution, we keep here the same configuration that appears in the build section of the POM. Finally, the profile adds a new dependency on the artefact that contains the listener code. The result is this:

<profile>
 <!-- calculate coverage metrics per test with SonarQube and JaCoCo -->
 <id>coverage-per-test</id>
  <build>
   <plugins>
    <plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-surefire-plugin</artifactId>
     <version>2.13</version>
     <configuration>
      <!-- same configuration as in the regular test execution goal -->
      <!-- plus argLine parameter configured by JaCoCo prepare-agent -->
      <argLine>${argLine} -XX:-UseSplitVerifier</argLine>
      <includes>
       <include>**/*Test.java</include>
       <include>**/*Tests.java</include>
      </includes>
      <excludes>
       <exclude>**/it/*IT.java</exclude>
      </excludes>
      <!-- new configuration needed for coverage per test -->
      <properties>
       <property>
        <name>listener</name>
         <value>org.sonar.java.jacoco.JUnitListener</value>
       </property>
      </properties>
     </configuration>
    </plugin>
   </plugins>
  </build>
 <dependencies>
  <dependency>
   <groupId>org.codehaus.sonar-plugins.java</groupId>
   <artifactId>sonar-jacoco-listeners</artifactId>
   <version>2.3</version>
   <scope>test</scope>
  </dependency>
 </dependencies>
</profile>

A word of warning about the JaCoCo listener artefact version. Although it is unclear in the documentation, it seems that the best results are obtained when the JaCoCo listener version matches that of the Java plug-in installed in SonarQube. In this case, as the Java plug-in we have installed in SonarQube is version 2.3, we have used version 2.3 of the listener artefact. We also tested with listener 1.2 with the same good results, but to prevent any future conflict, we recommend keeping the versions aligned.

Running the Analysis

Once the changes in the project configuration are done, you just need to re-execute a SonarQube analysis to see the new reports.

Depending on which version of the SonarQube Java plug-in you have installed, the configuration differs a bit.

Running the Analysis in Older Versions

When the Java plug-in version in use is 2.1 or earlier, the profile should be enabled when the analysis executes, and only when the analysis executes. This means that it is now a requirement to launch the sonar:sonar goal as a separate Maven build (it was already recommended to do so, but in many cases you could execute all the goals in one run). In the case of our version of Pet Clinic:

>mvn clean verify -P cargo-tomcat,selenium-tests,jmeter-tests
>mvn sonar:sonar -P coverage-per-test

If your build is triggered by a Jenkins job, then the new profile should be added to the post-build action, as can be seen in this screenshot:

sonar-post-build

Running the Analysis in Newer Versions

When the Java plug-in version in use is 2.2 or newer, code coverage is no longer executed during the analysis. Therefore you should configure the build to gather the code coverage metrics first:

>mvn clean org.jacoco:jacoco-maven-plugin:0.7.0.201403182114:prepare-agent verify -P coverage-per-test,cargo-tomcat,selenium-tests,jmeter-tests
>mvn sonar:sonar -P coverage-per-test

If your build is triggered by a Jenkins job, then the JaCoCo prepare-agent goal and the new profile should be added to the build action, as can be seen in this screenshot:

sonar-maven-modern

Analysis Results

Once the analysis is completed, the code coverage reports get some interesting new views. When clicking on any test in the test view, a new column labelled ‘Covered Lines’ shows the individual hits for each test method in the class:

sonar-test-summary

When the link on the Covered Lines value is followed, a new widget shows all the classes hit by that test method, and the touched lines per class:

sonar-test-detail

When the link under each of the classes is followed, a new widget appears showing the class source coloured with the actual line/branch hits:

sonar-test-code

Users can also get to this view by navigating through other views, such as the components or violations drill-downs. Once the class level is reached, users can use the ‘Coverage’ tab to get this information:

sonar-class-coverage
By default, the decoration shown is ‘Lines to cover’, showing the code coverage from all tests combined. Use the drop-down list to select ‘Per test -> Covered lines’ and then select the right test case in the new drop-down list that will appear:
sonar-class-select-decoration
sonar-class-select-testcase
sonar-class-final

Conclusion

Measuring the code coverage of individual tests is a very useful feature to have in development projects. Code coverage metrics alone may not be sufficient to confirm that the right tests are being executed and that they are touching the right functionality. With the ability to identify which portions of the code are executed by each test case, developers and testers can ensure that the expected code logic is tested, versus what can be obtained with other code coverage tools that only give a combined coverage report.

Writing Your Own SonarQube Plug-ins – Part 4: Testing and Deploying

In part 1 of this series here, I explained the basics of SonarQube plug-ins and how to start writing your own plug-in. In part 2 here, I explained how to define custom metrics and sensor classes to collect metric data (measures). In part 3 here, I showed the basics of how to create custom widgets to present to users the information we are collecting from analysed projects.

Now, finally, in part 4, I will show how to test and deploy your plug-ins, plus some tips to make the development and testing workflow quicker, like configuring JRebel to speed up development.

Part 4 – Testing and Deploying

If you have been following this series of blog posts, you already have a simple structure for a plug-in, hopefully doing something useful and showing valuable insights from your code and configuration.

So what’s next? How can we test and deploy our plug-ins? The simple, naïve answer is: package it and deploy it to a working SonarQube instance, as you would do with any other plug-in.

Packaging the plug-in is as easy as executing Maven’s package goal. The Maven configuration, governed by the sonar-plugin packaging type, will ensure that the right classes and configuration files are packaged in the right way. The resulting Jar file will be ready for deployment into a SonarQube instance.

Deploying the plug-in is also an easy task. If you are a seasoned SonarQube user, you probably know how to install and update public plug-ins from the SonarQube update center. Unfortunately, this way is only available for plug-ins listed in the public SonarQube directory, and is not suitable for our own in-house, internal-only plug-ins. In our case, we would need to manually copy the packaged plug-in file to our SonarQube instance, into the folder extensions\downloads, and then restart SonarQube. This is actually what the update center does: downloading a plug-in file from the Internet and placing it in the downloads folder, so it is picked up automatically on the next restart.
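As an illustration, assuming the packaged artefact is named my-sonar-plugin-1.0.jar and SonarQube is installed under C:\sonarqube (both names are made up), the manual deployment boils down to copying the file into the downloads folder and restarting SonarQube:

> copy target\my-sonar-plugin-1.0.jar C:\sonarqube\extensions\downloads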

Although the process above is simple and easy to automate with a script, it is lengthy. Moreover, it needs a local SonarQube instance that you have to manage, and that is subject to getting polluted by other plug-ins or just by normal user activity. Fortunately, SonarQube provides a better alternative to test a plug-in.

SonarQube Development Mode

To preserve developers’ sanity, SonarQube provides a development mode. This development mode automates the deployment process through a Maven plug-in, downloading a SonarQube instance from the Internet if needed and running it as a child process of the Maven process.

To launch the development mode, just issue this command:

mvn install org.codehaus.sonar:sonar-dev-maven-plugin:1.7:start-war -Dsonar.runtimeVersion=3.7.3

You may set up an alternative version of SonarQube to host and run the plug-in as needed. The SonarQube instance spawned by this command will run with default settings: listening on port 9000, with the embedded H2 database.

Once ready, just launch a SonarQube analysis, either using the SonarQube Maven plug-in or the standalone SonarQube Runner. The analysis will connect by default to the server in development mode and run your plug-in as part of the analysis process.
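For instance, if the project to analyse is built with Maven, running this from the project folder is enough, as the analysis connects by default to http://localhost:9000, where the development mode instance is listening:

mvn sonar:sonar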

Optimized Testing with JRebel

Although the SonarQube development mode is nice and very convenient, it has a fundamental problem – it is slow! It takes quite a few minutes to have the environment prepared, and if you find a bug or want to improve something that requires changing some Java code, you have to drop it and start the process again.

Fortunately, JRebel comes to the rescue and can dramatically speed up the process. In the same way that JRebel monitors Java web containers and injects changes to Java bytecode directly into the running JVM that hosts the web container, without losing state – effectively saving you the few (or many) minutes needed to redeploy and/or restart the web container – it can monitor our plug-in under development.

To configure JRebel, just add the JRebel configuration and agent to the development mode launch command.

The JRebel configuration is just a simple rebel.xml file under the src/main/resources folder, whose content is set to monitor the folder where the Java classes are compiled to:

<?xml version="1.0" encoding="UTF-8"?>
<application xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.zeroturnaround.com" xsi:schemaLocation="http://www.zeroturnaround.com http://www.zeroturnaround.com/alderaan/rebel-2_0.xsd">
    <classpath>
        <dir name="C:/projects/xxxxx/target/classes"/>
    </classpath>
</application>

Then, add the JRebel agent to the development mode launch command:

mvn install org.codehaus.sonar:sonar-dev-maven-plugin:1.7:start-war -Dsonar.runtimeVersion=3.7.3 -Dsonar.containerArgs=-javaagent:C:\java\tools\jrebel-5.5.2\jrebel.jar -Drebel.log=true

This is possible thanks to the sonar.containerArgs command-line option, kindly provided by the ZeroTurnaround support team as a merge request to the open-source SonarQube development mode plug-in. Who said vendor support does not work?

Once launched with JRebel, the development mode may remain active during the full development and testing session, without needing to redeploy or restart it again, as it will pick up changes made to the plug-in classes.

Even better, if the SonarQube analysis is launched with the JRebel agent active, each subsequent analysis process will pick up changes to the plug-in, effectively ‘reloading’ the plug-in code automatically, without needing to redeploy it again. To do that, don’t forget to add this to the SonarQube analysis command:

mvn sonar:sonar -javaagent:c:\java\tools\jrebel-5.5.2\jrebel.jar

Conclusion

It is easy to get started with custom plug-in development in SonarQube if you know where to start. Although open-source plug-ins are a good source of useful information, the online documentation lacks a good plug-in 101 guide. I really hope that this short blog post series has been of help, that it allows you to connect all the pieces that conform a plug-in, and that it encourages you to conceive and bring to life your own custom plug-ins, whether analysers, reporters or integrations with other existing tools.

Happy coding!

Idiom for Browser-Selectable Selenium Tests

For some time I have wanted to share an idiom I personally use and recommend when building Selenium tests. This idiom allows you to control which browsers are used to run the tests without needing to update test sources or configuration.

The simple ideas behind this idiom are:

  • Test code and configuration should not depend on the test environment.
  • Tests can be executed in any given browser, independently from others.
  • To change the browsers used for test execution, it is not needed to update test sources or configuration.
  • Selenium Grid URL and application URL are also configurable.
  • Both environment variables and Java system properties can be used.
  • All settings have sensible defaults.

I call this idiom ‘Browser-Selectable Tests’. I promise I will keep thinking of a better name :-)
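A minimal sketch of how the idiom could look, assuming Selenium WebDriver and JUnit; the property names (test.browser, test.grid.url, test.target.url) and their defaults are illustrative assumptions, not the actual implementation:

import java.net.URL;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public final class BrowserSelectableDriverFactory {

    // read a setting from a Java system property first, then from an environment
    // variable (dots become underscores, upper case), then fall back to a default
    private static String setting(String name, String defaultValue) {
        String value = System.getProperty(name);
        if (value == null || value.isEmpty()) {
            value = System.getenv(name.replace('.', '_').toUpperCase());
        }
        return (value == null || value.isEmpty()) ? defaultValue : value;
    }

    // the browser, the Selenium Grid URL and the application URL are all selectable
    public static WebDriver createDriver() throws Exception {
        String browser = setting("test.browser", "firefox");
        String gridUrl = setting("test.grid.url", "");

        if (gridUrl.isEmpty()) {
            // no grid configured: run locally (only Firefox shown in this sketch)
            return new FirefoxDriver();
        }
        // grid configured: ask the hub for the requested browser
        DesiredCapabilities capabilities = new DesiredCapabilities();
        capabilities.setBrowserName(browser);
        return new RemoteWebDriver(new URL(gridUrl), capabilities);
    }

    public static String targetUrl() {
        return setting("test.target.url", "http://localhost:8080/");
    }
}

Test classes then obtain the driver from the factory in their setup method, and the browser, grid and application URLs can be switched from the command line (e.g. -Dtest.browser=chrome) or from environment variables, without touching test sources or configuration files.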


Installing Sonar in OpenShift as a DIY application

Note: this is an excerpt extracted from my talk at Red Hat Developer Day London. You can see more about the talk in my post here:

http://deors.wordpress.com/2012/10/03/developer-day/

Sonar is a popular code profiler and dashboard that excels when used along with a Continuous Integration engine:

  • Seamless integration with Maven.
  • Leverages best-of-breed tools such as Checkstyle, PMD or FindBugs.
  • Configurable quality profiles.
  • Re-execution of tests and test code coverage (UT, IT).
  • Design Structure Matrix analysis.
  • Flexible and highly customisable dashboard.
  • Action plans / peer reviews.
  • Historic views / run charts.
  • Can be used with Java, .Net, C/C++, Groovy, PHP,…
