Test Automation with Selenium WebDriver and Selenium Grid – part 3: Continuous Integration

In part 1 of the series (read it here) I discussed Selenium, the widely used tool for browser test automation, and showed how easy it is to set up a testing grid spanning multiple operating systems and browsers. In part 2 (read it here) I showed how to leverage the WebDriver API to create and execute tests distributed across that grid.

Now in part 3 I will show how to execute Selenium tests as part of a Continuous Integration process with Maven, Cargo and Jenkins, and how to gather code coverage metrics for those tests using Sonar and JaCoCo.

Adding Selenium WebDriver tests to a Continuous Integration process is easy once you know how. When Maven is used for build automation and life-cycle management, having the tests executed is as easy as adding a few configuration lines to the project’s pom.xml. Actually, most of what I am going to show in this post is already published in some previous posts about HtmlUnit (read them here and here) – the approach is precisely the same. It does not matter which tool is used to write and execute the tests, as long as they run on top of JUnit.

Reasons for Having Independent Tests

If you stop to think about the weakest points of running Selenium tests as shown in the previous posts in the series, you will most probably agree with me that there are two: the Selenium Grid must already exist, and the application under test must be deployed on an application server accessible to the test nodes.

Let’s focus on the second one – Maven and Cargo will come to the rescue. These are the things that need to be done:

  • Configure Maven Failsafe plug-in to automate the execution of the integration tests.
  • Configure Maven Cargo plug-in to automate the application server provisioning for the tests and to automate the deployment of the application under test. The application server will exist only during the Maven process (embedded runtime).
  • Optionally, leverage an embedded database. This is out of scope for this post and actually a subject for another one. In short, use Spring JDBC embedded database support (you can read about this nice feature here).

The first step is useful to separate unit tests, which run without a container or database, from integration tests, which may need (or not, depending on the integration scope) an existing runtime. The second and third steps are useful to build an integration test suite that is independent from any pre-existing runtime.

Independence is critical to ensure a smooth workflow across the development team and the continuous integration engine. If integration tests required an existing application server and database, two developers (or one developer and the CI engine) would clash when running tests at the same time. Independence is also useful to create tests that do not rely on the previous one finishing successfully. Each test may leverage its own clean database, loaded with the input data required to exercise different code branches and achieve reasonable code coverage. Thus, it does not matter if a previous test failed and left the database in an inconsistent state: that state is discarded, a new input data set is loaded, and the feedback from subsequent tests remains meaningful.
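As an illustration of the clean-database idea, here is a minimal sketch of a per-test embedded database using the Spring JDBC support mentioned above. It assumes spring-jdbc and an embedded driver (H2 here) on the test classpath; the script names are hypothetical placeholders:

import org.junit.After;
import org.junit.Before;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabase;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

public abstract class CleanDatabaseTestBase {

    protected EmbeddedDatabase db;

    @Before
    public void createDatabase() {
        // build a fresh in-memory database with a known data set before each test
        db = new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.H2)
            .addScript("schema.sql")    // hypothetical DDL script
            .addScript("test-data.sql") // hypothetical input data set
            .build();
    }

    @After
    public void destroyDatabase() {
        // discard whatever state the test left behind; the next test starts clean
        db.shutdown();
    }
}

Test classes extending this base get a pristine database on every run, regardless of what previous tests did to it.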

Configuring Failsafe

Let’s start by configuring the Maven Failsafe plug-in. Failsafe is no more than a fork of the well-known Surefire plug-in that provides a mechanism for differentiating unit tests from integration tests.

Although some people prefer having integration tests in a separate Maven artifact, if the project is not big or has a simple layout I prefer having them in the same artifact and just differentiating both types of tests with a different name pattern.

A possible configuration would be like this:

<project...>
  ...
  <build>
    ...
    <plugins>
      ...
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.8.1</version>
        <configuration>
          <includes>
            <include>**/test/*TestCase.java</include>
          </includes>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-failsafe-plugin</artifactId>
        <version>2.8.1</version>
        <configuration>
          <includes>
            <include>**/integrationtest/*IntegrationTestCase.java</include>
          </includes>
        </configuration>
        <executions>
          <execution>
            <id>integration-test</id>
            <goals>
              <goal>integration-test</goal>
            </goals>
          </execution>
          <execution>
            <id>verify</id>
            <goals>
              <goal>verify</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      ...
    </plugins>
    ...
  </build>
  ...
</project>

This configuration will trigger the following actions in a Maven build:

  • Surefire will execute, during the test phase (default binding), only those test classes which are located in *.test packages and whose name ends with TestCase.
  • Failsafe will execute, during the integration-test and verify phases (defined bindings), only those test classes which are located in *.integrationtest packages and whose name ends with IntegrationTestCase (a minimal example follows).
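Here is a minimal sketch of an integration test matching those conventions. The package, class name and assertion are hypothetical; the two system properties are the ones used throughout this series, injected as described later in this post:

package com.example.integrationtest;

import static org.junit.Assert.assertTrue;

import java.net.URL;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public class HomePageIntegrationTestCase {

    private RemoteWebDriver driver;
    private String targetUrl;

    @Before
    public void setUp() throws Exception {
        // both URLs are injected as system properties (see the argLine discussion below)
        URL hubUrl = new URL(System.getProperty("test.selenium.hub.url"));
        targetUrl = System.getProperty("test.target.server.url");
        driver = new RemoteWebDriver(hubUrl, DesiredCapabilities.firefox());
    }

    @Test
    public void homePageHasTitle() {
        driver.get(targetUrl);
        assertTrue(driver.getTitle().length() > 0);
    }

    @After
    public void tearDown() {
        // always release the grid node, even if the test failed
        driver.quit();
    }
}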

Configuring an Embedded Server with Cargo

The next step is to configure Cargo for provisioning an embedded server that will host the application under test. Cargo is capable of remotely deploying artifacts to a server, or of downloading and installing one locally for the test session if we do not have one available. Cargo can also automatically start and stop the servers (both remote and local). With the appropriate bindings, the start-deploy-test-stop flow can be achieved automatically.

A possible configuration, which provisions a Tomcat 7.0.22 runtime, would be like this:

<project...>
  ...
  <build>
    ...
    <plugins>
      ...
      <plugin>
        <groupId>org.codehaus.cargo</groupId>
        <artifactId>cargo-maven2-plugin</artifactId>
        <version>1.1.4</version>
        <configuration>
          <container>
            <containerId>tomcat7x</containerId>
            <zipUrlInstaller>
              <url>http://archive.apache.org/dist/tomcat/tomcat-7/v7.0.22/bin/apache-tomcat-7.0.22.zip</url>
            </zipUrlInstaller>
          </container>
          <configuration>
            <properties>
              <cargo.servlet.port>8180</cargo.servlet.port>
              <cargo.tomcat.ajp.port>8109</cargo.tomcat.ajp.port>
            </properties>
          </configuration>
        </configuration>
        <executions>
          <!-- start server before integration tests -->
          <execution>
            <id>start-container</id>
            <phase>pre-integration-test</phase>
            <goals>
              <goal>start</goal>
            </goals>
          </execution>
          <!-- stop server after integration tests -->
          <execution>
            <id>stop-container</id>
            <phase>post-integration-test</phase>
            <goals>
              <goal>stop</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      ...
    </plugins>
    ...
  </build>
  ...
</project>

This configuration will trigger the following actions in a Maven build:

  • Cargo’s start goal is bound to the pre-integration-test phase. During this phase, Cargo will download a Tomcat 7.0.22 server and configure it to run on ports 8180 and 8109 (not using the default ports is a good idea to prevent clashing with another server process, e.g. Jenkins). After that, the application under test is deployed.
  • Cargo’s stop goal is bound to the post-integration-test phase. During this phase, which runs once Failsafe finishes executing the integration tests, Cargo will stop the Tomcat server.

At this point we can execute a Maven build up to the verify phase, which will run the integration tests against the application hosted in the embedded server. Obviously, the Tomcat server process listening at port 8180 must be visible from the test nodes, so check firewall and proxy settings carefully if nodes cannot connect to the application server.

Only one thing is left: we need to pass the hub and target server URLs into the JUnit processes spawned by Failsafe. If you remember from part 2, when we executed the tests from the IDE we added some system properties to the launcher configuration:

-Dtest.selenium.hub.url=http://xxxxx:4444/wd/hub
-Dtest.target.server.url=http://xxxxx:8080/sdc.samples.selenium

If we proceed the same way it will not work – Failsafe does not propagate system properties to the spawned JUnit processes. To get the job done, we need to leverage the argLine parameter:

mvn verify -DargLine="
\ -Dtest.selenium.hub.url=http://xxxxx:4444/wd/hub
\ -Dtest.target.server.url=http://xxxxx:8180/sdc.samples.selenium"

Failsafe will take the argLine parameter value and propagate it to each JUnit process. The application will be packaged and deployed, and the tests will be executed from the nodes available in our test grid.
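As an alternative (a sketch of a standard Failsafe feature, not part of the original setup), the two properties can be declared in the Failsafe plug-in configuration with systemPropertyVariables, which the plug-in does propagate to the forked JVMs:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  ...
  <configuration>
    ...
    <systemPropertyVariables>
      <test.selenium.hub.url>${test.selenium.hub.url}</test.selenium.hub.url>
      <test.target.server.url>${test.target.server.url}</test.target.server.url>
    </systemPropertyVariables>
  </configuration>
</plugin>

With this in place, the values can be supplied as plain -D flags on the mvn command line and Maven will interpolate them into the plug-in configuration.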

Configuring the Build Job in Jenkins

To configure the build job in Jenkins, no special setup is needed. Just add the verify phase to the Maven goals and add Failsafe’s argLine parameter (either as a regular option or as part of the MAVEN_OPTS definition):
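For illustration, the Maven goals field could look like this (using the same placeholder URLs as before):

clean verify -DargLine="-Dtest.selenium.hub.url=http://xxxxx:4444/wd/hub -Dtest.target.server.url=http://xxxxx:8180/sdc.samples.selenium"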

Validate the job configuration by running it from the Jenkins dashboard: integration tests should be executed and the results shown in Jenkins’ build results page.

Gathering Code Coverage Metrics with Sonar and JaCoCo

At this point, gathering code coverage metrics is a simple task:

  • In pom.xml, add a dependency to the JaCoCo agent.
  • In pom.xml, add a command line argument to the Tomcat startup to attach the JaCoCo agent to it.
  • In pom.xml, add the sonar.jacoco.itReportPath property telling Sonar where the coverage report can be found.
  • In the Jenkins job configuration, be sure that Sonar’s post build action is enabled.

The dependency entry is added to the project’s pom:

<project...>
  ...
  <dependencies>
    ... 
    <dependency>
      <groupId>org.jacoco</groupId>
      <artifactId>org.jacoco.agent</artifactId>
      <version>0.5.5.201112152213</version>
      <type>jar</type>
      <classifier>runtime</classifier>
      <scope>test</scope>
    </dependency>
    ...
  </dependencies>
  ...
</project>

The command line argument is added to Tomcat startup:

<project...>
  ...
  <build>
    ...
    <plugins>
      ...
      <plugin>
        <groupId>org.codehaus.cargo</groupId>
        <artifactId>cargo-maven2-plugin</artifactId>
        ...
          <configuration>
            <properties>
              <cargo.servlet.port>8180</cargo.servlet.port>
              <cargo.tomcat.ajp.port>8109</cargo.tomcat.ajp.port>
              <cargo.jvmargs>-javaagent:${settings.localRepository}/org/jacoco/org.jacoco.agent/0.5.5.201112152213/org.jacoco.agent-0.5.5.201112152213-runtime.jar=destfile=${project.build.directory}/itest.jacoco</cargo.jvmargs>
            </properties>
          </configuration>
        ...
      </plugin>
      ...
    </plugins>
    ...
  </build>
  ...
</project>

JaCoCo’s code coverage report is passed into Sonar:

<project...>
  ...
  <properties>
    <sonar.jacoco.itReportPath>${project.build.directory}/itest.jacoco</sonar.jacoco.itReportPath>
  </properties>
  ...
</project>

And finally, enable Sonar’s post build action in the Jenkins job configuration page.

Once the configuration changes are done, launch the Jenkins job again, and after a few minutes you should see the project analyzed in the Sonar dashboard, including code coverage for the Selenium integration tests. Awesome!
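If you prefer to reproduce the analysis locally instead of waiting for Jenkins, a sketch of the equivalent command would be (assuming a reachable Sonar server and the same placeholder URLs as before):

mvn clean verify sonar:sonar -DargLine="-Dtest.selenium.hub.url=http://xxxxx:4444/wd/hub \
  -Dtest.target.server.url=http://xxxxx:8180/sdc.samples.selenium"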

Conclusion

Praise for Selenium is well deserved. Selenium, through its WebDriver and Grid components, provides a powerful mechanism to automate browser tests. Tests can be easily distributed across a grid, allowing multi-browser, multi-OS testing. Tests can be easily created and executed within our favorite IDEs and added to a Continuous Integration process, including code coverage metrics for Selenium tests.

Selenium helps to achieve important productivity improvements. The time invested in writing an automated test is rapidly returned by the fact that no effort is needed to re-execute it. Moreover, being automated, tests are likely to be executed more often than if they were manual, contributing to increased quality and predictability of our software releases.

Author: deors

Senior technology architect at Accenture, with a passion for technology-related stuff, Celtic music and the best sci-fi, among a thousand other things!

16 thoughts on “Test Automation with Selenium WebDriver and Selenium Grid – part 3: Continuous Integration”

  1. What to say.. other than this was a great article. Thank you very much for the information. I’ll be following it to the letter and will be sure to credit you. Well worded and clear.

  2. You might consider extending this article to include handling the display on the node side.

  3. Nice presentation. I am using JBehave + Selenium WebDriver + Maven + JUnit for the automation project. I have a requirement of running test cases in different browsers. How to pass a parameter through the command prompt choosing the browser name to run a test? How to set up a similar configuration in a Jenkins job? Awaiting the reply…

  4. First, I want to thank you for the great job. I read all your Selenium Grid articles and got a lot of useful information.
    Could you please continue the Selenium Grid line, especially logger configuration?
    I’m trying to set up log4j, but have some problems with it. It seems like log4j usage for Selenium Grid is rather different than the one for WebDriver. Isn’t it?

  5. Awesome!
    I wonder if I can do all the steps you described using Visual Studio instead of Eclipse?

  6. Hi All Automation Testing experts,

    I am currently writing an automation framework. Where I work, we have several environments, viz. Development, Staging and Live, and understandably we have different base URLs, user names and passwords for the different environments. I plan to automate the daily tests on the different environments using Jenkins. At the moment, I have hard-coded all the variables in the step definition files, committed them to the repo, and run them on Jenkins.

    What I am thinking at this stage is that I should be able to pass the environment I want to test under as a parameter in Jenkins, and based on this a correct configuration with the appropriate variables (URLs, user names and passwords) is chosen, which is in turn used by the step definition files for execution.

    I did some basic looking up in Google and I am not sure if I am searching for the correct things based on my requirements.

    I am not looking for shortcuts here, I am willing to spend the time it takes to learn things by reading docs and trying out examples.

    Can someone point me in the right direction as to how to approach this?

    Your help is much appreciated, thanks a ton in advance for your time.

    1. Of course you can. You just need to adapt the test execution phase so it is launched by the Gradle build instead of the Maven Failsafe plugin. The rest – test scripts, test data, the link with Jenkins or SonarQube – will remain the same.

      1. Thanks a ton!
        I think this will work, in a couple of days I’ll trial this and let you know how I went.
