Script to Update Jenkins Jobs to Use a Different Maven Instance

One of the Jenkins features I like a lot, and which is very useful when you need to make bulk changes to your job configurations, is the script console.

The script console allows you to run Groovy scripts that can read and alter the state of the jobs, or of any other piece of configuration or state exposed through the Jenkins API or through the APIs of its plug-ins.

Today I decided to take a look at Apache Maven 3.2.1 and I wanted to easily test all my existing jobs with this new version. As I’m “lazy” ;-) and didn’t want to update the jobs one by one, I created this script to do the job for me in no time. Hope you enjoy it!

import hudson.maven.*
import hudson.model.*
import hudson.tasks.*
import hudson.tools.*

oldMavenName = "maven-3.0.4"
newMavenName = "maven-3.2.1"

Maven.MavenInstallation oldMaven
Maven.MavenInstallation newMaven

// look for old and new Maven installations
// useful to detect that something is not well configured
for (ti in ToolInstallation.all()) {
  if (ti instanceof Maven.MavenInstallation.DescriptorImpl) {
    for (i in ti.installations) {
      if (i.name.equals(oldMavenName)) {
        oldMaven = i
      } else if (i.name.equals(newMavenName)) {
        newMaven = i
      }
    }
  }
}

println("migrating jobs from Maven: " + oldMaven.name)
println("to Maven: " + newMaven.name)
println()

// locate the jobs and update the Maven installation
// optionally filter them by some prefix or regex
for (job in Hudson.instance.items) {
  if (job instanceof MavenModuleSet) {
    mms = (MavenModuleSet) job
    if (mms.name.startsWith("some.prefix")) {
      println("job " + mms.name + " currently using: " + mms.mavenName)
      // if name is null, it means the default Maven installation
      if (mms.mavenName == null || mms.mavenName == oldMaven.name) {
        mms.mavenName = newMaven.name
        println(" migrate to: " + mms.mavenName)
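        // note: depending on your Jenkins setup, you may also want to call
        // mms.save() here to persist the change to the job's config.xml on disk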
      } else {
        println(" no migration needed")
      }
    }
  }
}
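
To run the script, open Manage Jenkins > Script Console, paste it in and click Run; the output will list each matching job, the Maven installation it currently uses and whether it was migrated.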

Code Coverage of Individual Tests with SonarQube and JaCoCo

This post explains how to enable SonarQube to gather test code coverage metrics for individual tests. Code coverage tools typically produce a report showing the code coverage (by line, branch, etc.) for the combined effect of all the tests executed during a given test session. This is the case, for example, when you run unit tests in continuous integration. With the help of SonarQube and JaCoCo, it is possible to gather coverage metrics split at the level of the individual test case (test method in JUnit or TestNG). To enable this, some special configuration is required, which we show in this post.

The Environment

The following process has been verified with SonarQube versions 4.1.2 and 4.3.2, but it should work with SonarQube 3.7.x (the latest LTS release), too. The application code we have used to verify the setup is the familiar Spring Pet Clinic application, enhanced to support Tomcat 7 and Spring 3 (see this post for reference on the updates needed in Pet Clinic: http://deors.wordpress.com/2012/09/06/petclinic-tomcat-7/). The code can be downloaded from GitHub from this repository: https://github.com/deors/deors.demos.petclinic

The Instructions

The instructions are really simple once you’ve figured out how to connect all the dots. All that is required is to add some specific configuration to the Maven Surefire plug-in (Surefire is the plug-in tasked with unit test execution, and it supports both JUnit and TestNG). As this specific configuration should not impact the regular unit test execution, it is recommended to put it in a separate profile that will be activated only when the SonarQube analysis is performed. Let’s describe the required changes in the pom.xml file, section by section.

Build Section

No changes are needed here. However, you should take note of any customised configuration of Maven Surefire to be sure it is also applied to the profile we are going to create. In the case of Spring Pet Clinic, this is the relevant portion of the POM, reproduced here for reference:

<build><plugins>
...
 <plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.13</version>
  <configuration>
   <argLine>-XX:-UseSplitVerifier</argLine>
   <includes>
    <include>**/*Test.java</include>
    <include>**/*Tests.java</include>
   </includes>
   <excludes>
    <exclude>**/it/*IT.java</exclude>
   </excludes>
  </configuration>
 </plugin>
...
</plugins></build>

This piece of configuration tells Surefire to: 1) exclude the integration tests from the unit test execution (integration tests are covered by Surefire’s twin plug-in, Failsafe); and 2) disable the bytecode verifier, preventing runtime errors when classes are instrumented (e.g. when adding mocks, or TopLink enhancements).

Dependencies Section

Again, no changes are needed in this section. We just wanted to note that if your project is already leveraging JaCoCo to gather integration test coverage metrics, and is explicitly referring to the JaCoCo artefact in this section, it can be left in place – no conflicts have been identified so far. In any case, it should not be needed here, so it is probably safer to remove it from this section.

Profiles Section

All the required changes come in this section, and they are very clean to add, as they only require adding a new profile to the POM. This profile will configure a special listener for Surefire that ensures coverage metrics are gathered for each individual test case. To guarantee a successful test execution, we keep here the same configuration that appears in the build section of the POM. Finally, the profile adds a new dependency on the artefact that contains the listener code. The result is this:

<profile>
 <!-- calculate coverage metrics per test with SonarQube and JaCoCo -->
 <id>coverage-per-test</id>
 <build>
  <plugins>
   <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.13</version>
    <configuration>
     <!-- same configuration as in the regular test execution goal -->
     <argLine>-XX:-UseSplitVerifier</argLine>
     <includes>
      <include>**/*Test.java</include>
      <include>**/*Tests.java</include>
     </includes>
     <excludes>
      <exclude>**/it/*IT.java</exclude>
     </excludes>
     <!-- new configuration needed for coverage per test -->
     <properties>
      <property>
       <name>listener</name>
       <value>org.sonar.java.jacoco.JUnitListener</value>
      </property>
     </properties>
    </configuration>
   </plugin>
  </plugins>
 </build>
 <dependencies>
  <dependency>
   <groupId>org.codehaus.sonar-plugins.java</groupId>
   <artifactId>sonar-jacoco-listeners</artifactId>
   <version>2.3</version>
   <scope>test</scope>
  </dependency>
 </dependencies>
</profile>

A word of warning about the JaCoCo listener artefact version. Although it is unclear in the documentation, it seems that the best results are obtained when the JaCoCo listener version matches that of the Java plug-in installed in SonarQube. In this case, as the Java plug-in we have installed in SonarQube is version 2.0, we have used listener artefact version 2.0. We also tested with listener 1.2 with the same good results but, to prevent any future conflicts, we recommend keeping the versions aligned.

Running the Analysis

Once the changes in the project configuration are done, you just need to re-execute a SonarQube analysis to see the new reports.

Depending on which version of the SonarQube Java plug-in you have installed, the configuration differs a bit.

Running the Analysis in Older Versions

When the Java plug-in version in use is 2.1 or earlier, the profile should be enabled when the analysis executes, and only when the analysis executes. This means that it is now a requirement to launch the sonar:sonar goal as a separate Maven build (it was already recommended to do so, but in many cases you could execute all the targets in one run). In the case of our version of Pet Clinic:

>mvn clean verify -P cargo-tomcat,selenium-tests,jmeter-tests
>mvn sonar:sonar -P coverage-per-test

If your build is triggered by a Jenkins job, then the new profile should be added to the post-build action, as can be seen in this screenshot:

[screenshot: sonar-post-build]

Running the Analysis in Newer Versions

When the Java plug-in version in use is 2.2 or newer, code coverage is no longer gathered during the analysis itself. Therefore you should configure the build to gather the code coverage metrics first:

>mvn clean org.jacoco:jacoco-maven-plugin:0.7.0.201403182114:prepare-agent verify -P coverage-per-test,cargo-tomcat,selenium-tests,jmeter-tests
>mvn sonar:sonar -P coverage-per-test

If your build is triggered by a Jenkins job, then the JaCoCo prepare-agent goal and the new profile should be added to the build action, as can be seen in this screenshot:

[screenshot: sonar-maven-modern]

Analysis Results

Once the analysis is completed, the code coverage reports get some interesting new views.

When clicking on any test in the test view, a new column labelled ‘Covered Lines’ shows the individual hits for each test method in the class:

[screenshot: sonar-test-summary]

When the link on the Covered Lines value is followed, a new widget shows all the classes hit by that test method, and the lines touched in each class:

[screenshot: sonar-test-detail]

When the link under each of the classes is followed, a new widget appears showing the class source coloured with the actual line/branch hits:

[screenshot: sonar-test-code]

Users can also get to this view by navigating through other views, such as the components or violations drill-downs. Once the class level is reached, use the ‘Coverage’ tab to get this information:

[screenshot: sonar-class-coverage]

By default, the decoration shown is ‘Lines to cover’, showing the code coverage from all tests combined. Use the drop-down list to select ‘Per test -> Covered lines’, and then select the right test case in the new drop-down list that will appear:

[screenshot: sonar-class-select-decoration]
[screenshot: sonar-class-select-testcase]
[screenshot: sonar-class-final]

Conclusion

Measuring the code coverage of individual tests is a very useful feature to have in development projects. Code coverage metrics alone may not be sufficient to verify that the right tests are being executed and that they are touching the right functionality. With the ability to identify which portions of the code are executed by each test case, developers and testers can ensure that the expected code logic is tested, versus what can be obtained with other code coverage tools that only give a combined coverage report.

Writing Your Own SonarQube Plug-ins – Part 4: Testing and Deploying

In part 1 of this series here, I explained the basics of SonarQube plug-ins and how to start writing your own plug-in. In part 2 here, I explained how to define custom metrics and sensor classes to collect metric data (measures). In part 3 here, I showed the basics of how to create custom widgets to present to users the information we are collecting from analysed projects.

Now, finally, in part 4, I will show how to test and deploy your plug-ins, along with some tips to make the development and testing turnaround quicker, like configuring JRebel to speed up development.

Part 4 – Testing and Deploying

If you have been following this series of blog posts, you already have a simple structure for a plug-in, hopefully doing something useful and showing valuable insights from your code and configuration.

So what’s next? How can we test and deploy our plug-ins? The simple, naïve answer is: package the plug-in and deploy it to a working SonarQube instance, as you would do with any other plug-in.

Packaging the plug-in is as easy as executing Maven’s package goal. The Maven configuration, governed by the sonar-plugin packaging type, will ensure that the right classes and configuration files are packaged in the right way. The resulting Jar file will be ready for deployment into a SonarQube instance.

Deploying the plug-in is also an easy task. If you are a seasoned SonarQube user, you probably know how to install and update public plug-ins from the SonarQube update center. Unfortunately, this route is only available for plug-ins listed in the public SonarQube directory, and is not suitable for our own in-house, internal-only plug-ins. In our case, we need to manually copy the packaged plug-in file to our SonarQube instance, into the folder extensions\downloads, and then restart SonarQube. This is actually what the update center does: download a plug-in file from the Internet and place it in the downloads folder, so it is picked up automatically on the next restart.

Although the process above is simple and easy to automate with a script, it is lengthy. Moreover, it needs a local SonarQube instance that you have to manage, and that is subject to getting polluted by other plug-ins or just by normal user activity. Fortunately, SonarQube provides a better way to test a plug-in.

SonarQube Development Mode

To preserve developers’ sanity, SonarQube provides a development mode. This development mode automates the deployment process as a Maven plug-in, downloading a SonarQube instance from the Internet if needed and running it as a child process of the Maven process.

To launch the development mode, just issue this command:

mvn install org.codehaus.sonar:sonar-dev-maven-plugin:1.7:start-war -Dsonar.runtimeVersion=3.7.3

You may set up an alternative SonarQube version to host and run the plug-in as needed. The SonarQube instance spawned by this command will run with default settings: listening on port 9000, with the embedded H2 database.

Once it is ready, just launch a SonarQube analysis, either with the SonarQube Maven plug-in or with a standalone SonarQube Runner analysis. The analysis will connect by default to the server running in development mode and run your plug-in as part of the analysis process.

Optimized Testing with JRebel

Although the SonarQube development mode is nice and very convenient, it has a fundamental problem – it is slow! It takes quite a few minutes to get the environment prepared, and if you find a bug or want to improve something that requires changing some Java code, you have to drop it and start the process again.

Fortunately, JRebel comes to the rescue and can dramatically speed up the process. In the same way that JRebel monitors Java web containers and injects changed Java bytecode directly into the running JVM that hosts the web container, without losing state – effectively saving you the few (or many) minutes needed to redeploy and/or restart the web container – it can monitor our plug-in under development.

To configure JRebel, just add the JRebel configuration and agent to the development mode launch command.

The JRebel configuration is just a simple rebel.xml file under the src/main/resources folder, whose content is set to monitor the folder that the Java classes are compiled into:

<?xml version="1.0" encoding="UTF-8"?>
<application xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.zeroturnaround.com" xsi:schemaLocation="http://www.zeroturnaround.com http://www.zeroturnaround.com/alderaan/rebel-2_0.xsd">
    <classpath>
        <dir name="C:/projects/xxxxx/target/classes"/>
    </classpath>
</application>

Then, add the JRebel agent to the development mode launch command:

mvn install org.codehaus.sonar:sonar-dev-maven-plugin:1.7:start-war -Dsonar.runtimeVersion=3.7.3 -Dsonar.containerArgs=-javaagent:C:\java\tools\jrebel-5.5.2\jrebel.jar -Drebel.log=true

This is possible thanks to the sonar.containerArgs command-line option, kindly provided by the ZeroTurnaround support team as a merge request to the open-source SonarQube development mode plug-in. Who said vendor support does not work?

Once launched with JRebel, the development mode may remain active during the whole development and testing session, without needing to be redeployed or restarted, picking up changes made to the plug-in classes as they happen.

What is more, if the SonarQube analysis is launched with the JRebel agent active, each subsequent SonarQube analysis process will pick up changes in the plug-in, effectively ‘reloading’ the plug-in code automatically, without needing to redeploy it. To do that, don’t forget to add the agent to the SonarQube analysis command:

mvn sonar:sonar -javaagent:c:\java\tools\jrebel-5.5.2\jrebel.jar

Conclusion

It is easy to get started with custom plug-in development in SonarQube if you know where to start. Although open-source plug-ins are a good source of useful information, the online documentation lacks a good plug-in 101 guide. I really hope that this short blog post series has been of help, that it allows you to connect all the pieces that make up a plug-in, and that it encourages you to conceive and bring to life your own custom plug-ins, be they analysers, reporters or integrations with other existing tools.

Happy coding!

Idiom for Browser-Selectable Selenium Tests

For some time I’ve wanted to share an idiom I personally use and recommend when building Selenium tests. This idiom allows you to control which browsers are used to run the tests without needing to update test sources or configuration.

The simple ideas behind this idiom are:

  • Test code and configuration should not depend on the test environment.
  • Tests can be executed in any given browser, independently from others.
  • To change the browsers used for test execution, there is no need to update test sources or configuration.
  • Selenium Grid URL and application URL are also configurable.
  • Both environment variables and Java system properties can be used.
  • All settings have sensible defaults.

I call this idiom ‘Browser-Selectable Tests’. I promise I’ll keep thinking of a better name :-)
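
As a rough illustration of the idiom, here is a minimal sketch in Java. The setting names (test.browser, test.grid.url, test.target.url) and their defaults are hypothetical placeholders, not an established convention; adapt them to your own needs. Each setting is resolved from a Java system property first, then from an environment variable, and finally falls back to a default:

import java.net.URL;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public final class TestEnvironment {

    private TestEnvironment() {
    }

    // resolve a setting: system property first, then environment variable, then default
    static String getSetting(String name, String defaultValue) {
        String value = System.getProperty(name);
        if (value == null || value.isEmpty()) {
            // e.g. the property test.browser maps to the environment variable TEST_BROWSER
            value = System.getenv(name.toUpperCase().replace('.', '_'));
        }
        return (value == null || value.isEmpty()) ? defaultValue : value;
    }

    // create a driver for the configured browser, pointing to the configured grid
    public static WebDriver createDriver() throws Exception {
        String browser = getSetting("test.browser", "firefox");
        String gridUrl = getSetting("test.grid.url", "http://localhost:4444/wd/hub");

        DesiredCapabilities capabilities = new DesiredCapabilities();
        capabilities.setBrowserName(browser);

        return new RemoteWebDriver(new URL(gridUrl), capabilities);
    }

    // base URL of the application under test
    public static String getTargetUrl() {
        return getSetting("test.target.url", "http://localhost:8080/petclinic");
    }
}

With something like this in place, running the same suite against a different browser is just a matter of passing -Dtest.browser=chrome to the build, or exporting TEST_BROWSER=chrome before the run, without touching test sources or configuration.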

Continue reading

Installing Sonar in OpenShift as a DIY application

Note: this is an excerpt extracted from my talk at Red Hat Developer Day London. You can see more about the talk in my post here:

http://deors.wordpress.com/2012/10/03/developer-day/

Sonar is a popular code profiler and dashboard that excels when used alongside a Continuous Integration engine:

  • Seamless integration with Maven.
  • Leverages best-of-breed tools such as Checkstyle, PMD or FindBugs.
  • Configurable quality profiles.
  • Re-execution of tests and test code coverage (UT, IT).
  • Design Structure Matrix analysis.
  • Flexible and highly customisable dashboard.
  • Actions plans / peer reviews.
  • Historic views / run charts.
  • Can be used with Java, .Net, C/C++, Groovy, PHP,…

Continue reading

The Usual Suspects – Talk at Red Hat Developer Day London, Nov 1st

On Nov 1st, I will be presenting at Red Hat Developer Day London.

The title of my talk is: “The Usual Suspects – Creating a Cloud Development Environment with Sonar, Selenium and JMeter on OpenShift Origin”. During the session I will show how to extend the basic development environment offered by OpenShift (Git, Maven, Jenkins) and create a more powerful environment on OpenShift featuring “usual suspects” such as Sonar for continuous quality assurance, Selenium for functional testing and JMeter for performance/load testing, as well as Arquillian for in-container testing. The session includes a live demo built on OpenShift Origin.

For more information about the event, full agenda and registration, visit: http://www.redhat.com/developerday/

See you there!

Edit 2012-11-01: These are the slides for the presentation. Until they are published on the conference site, I have uploaded them here: The Usual Suspects – Red Hat Developer Day 2012-11-01

 

First Steps with Heroku – The New-Old Boy in the Cloud

Since my previous posts about Java cloud platforms, I have wanted to spend some time with Heroku and compare it with the others.

Heroku is a veteran among cloud platforms, but it was not until a few months ago that they launched a Java offering.

In this post I will share my experiences getting started with Heroku and making an existing application work on it.

Continue reading

Test Automation with Selenium WebDriver and Selenium Grid – part 3: Continuous Integration

In part 1 of the series (read it here) I discussed Selenium, the widely used tool for browser test automation, and showed how easy it is to set up a testing grid with multiple operating systems and browsers. In part 2 (read it here) I showed how to leverage the WebDriver API to create and execute tests distributed across the grid that was created.

Now in part 3 I will show how to execute Selenium tests under a Continuous Integration process with Maven, Cargo and Jenkins, and how to gather code coverage metrics for those tests using Sonar and JaCoCo.

Continue reading

Test Automation with Selenium WebDriver and Selenium Grid – part 2: Creating and Executing Tests

In part 1 of the series (read it here) I presented Selenium, a widely known tool for browser test automation.

Starting with Selenium 2, the most important components of the suite are Selenium WebDriver and Selenium Grid. In part 1 I showed how easy it is to set up a testing grid with multiple operating systems and browsers. Now in part 2 I will show how to leverage the WebDriver API to create and execute tests.
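
To give a flavour of what the WebDriver API looks like before diving into the details, here is a minimal, self-contained sketch; the URL and the element locator are placeholders, not taken from any real application:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class SmokeTest {

    public static void main(String[] args) {
        // open a local Firefox browser controlled by WebDriver
        WebDriver driver = new FirefoxDriver();
        try {
            // navigate to the page under test
            driver.get("http://localhost:8080/myapp/");

            // locate an element on the page and interact with it
            driver.findElement(By.id("welcome-link")).click();

            // read some state back for a simple verification
            System.out.println("page title after click: " + driver.getTitle());
        } finally {
            // always close the browser session
            driver.quit();
        }
    }
}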

Continue reading

Test Automation with Selenium WebDriver and Selenium Grid – part 1: Setting Up the Grid

For a long while I’ve been “dying to play” with Selenium (www.seleniumhq.org and code.google.com/p/selenium/). I’ve heard and read very good things about this tool from colleagues and from the blogosphere.

Selenium is, in short, an open source tool to automate web browser interactions. A primary use case is, of course, browser test automation.

Selenium has greatly evolved over time, especially since the 2.0 release, when the legacy Selenium project merged with Google’s WebDriver. Nowadays, Selenium offers a wide range of supported programming languages for writing tests, an impressive browser compatibility list, the ability to record tests from user interactions and, above all in my opinion, the ability to execute tests across a grid of machines with various operating systems, browser families and versions.

Although Selenium seems to be primarily chosen for functional/regression test automation, it is also a great choice – precisely because of the grid feature – for cross-browser compatibility testing: ensuring, in an easy and cost-effective way, that our web applications are usable across all sorts of operating systems and browsers.

In this and forthcoming posts in a short series, I will share my experiences setting up a Selenium Grid, building some automated tests for a simple Spring application, re-executing them from the Eclipse IDE and finally re-executing them in continuous integration (including code coverage) with Maven, Cargo, Jenkins, Sonar and JaCoCo.

Continue reading