Friday, 15 August 2014

Skip TestNG test based on a condition

Sometimes you might want to skip tests based on a condition. The following snippet shows a simple way to do it by throwing an org.testng.SkipException. This integrates seamlessly with the test reports, and skipped tests are counted and displayed correctly in them.

//....code snippet.....
// Filter out tests that should not run
for (String c : conditions) {
    if (c.matches("some condition 1") || c.matches("some other condition")) {
        throw new org.testng.SkipException(c);
    }
}
//....code snippet.....
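
For completeness, here is a self-contained sketch of what this can look like inside a test method; the class name and the OS-based condition are made up for illustration:

import org.testng.SkipException;
import org.testng.annotations.Test;

public class ConditionalSkipTest {

    @Test
    public void runsOnlyOnLinux() {
        // Hypothetical condition: skip this test unless we are running on Linux
        if (!System.getProperty("os.name").toLowerCase().contains("linux")) {
            throw new SkipException("Not running on Linux, skipping test");
        }
        // ...actual test logic...
    }
}

TestNG catches the SkipException and marks the test as skipped rather than failed, so it shows up in the skipped count of the report.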

Custom logger for TestNG

Another small post, this time on a custom TestNG logger. I have a TestNG test suite that runs for hours going through integration tests. At the end of the run I can look at the TestNG reports to see what passed and what failed, but during the run I get no feedback as to whether all tests are failing due to some configuration issue or whether it is a normal test run. This happens because it is basically only one test, with a DataProvider supplying the different test parameters. If you have separate tests then you do get a message for each test.

Anyhow, I wanted a custom TestNG logger that logs a message I can make sense of at the end of each test invocation, and here is a way to do so.


Extend the TestListenerAdapter class and override a few methods.
package com.clearqa.utils;

import org.testng.ITestResult;
import org.testng.TestListenerAdapter;

public class TestNgCustomLogger extends TestListenerAdapter {

    @Override
    public void onTestFailure(ITestResult tr) {
        logToStdOut(tr, "FAILED");
    }

    @Override
    public void onTestSkipped(ITestResult tr) {
        logToStdOut(tr, "SKIPPED");
    }

    @Override
    public void onTestSuccess(ITestResult tr) {
        logToStdOut(tr, "PASS");
    }

    private void logToStdOut(ITestResult tr, String result) {
        Object[] parameters = tr.getParameters();
        System.out.println("Test with parameters " + result);
        for (Object o : parameters) {
            System.out.println("\t -" + o.toString());
        }
    }
}


Add a custom Listener to your testng xml config:
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="REST api Integration Tests" verbose="1" data-provider-thread-count="10">
    <listeners>
        <listener class-name="com.clearqa.utils.TestNgCustomLogger" />
    </listeners>
    <test name="Rest API - Json schema validation Tests" >
        <classes>
            <class name="com.clearqa.restapi.test.RestApiITCase" />
        </classes>
    </test>
</suite>
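
Alternatively, if you prefer not to touch the suite xml, TestNG also lets you register the listener with the @Listeners annotation directly on the test class. A minimal sketch (the test method is just a placeholder):

package com.clearqa.restapi.test;

import org.testng.annotations.Listeners;
import org.testng.annotations.Test;

// Registers the custom logger without any testng.xml changes
@Listeners(com.clearqa.utils.TestNgCustomLogger.class)
public class RestApiITCase {

    @Test
    public void someJsonSchemaValidationTest() {
        // ...test logic...
    }
}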


and voilà, I get the much-needed indication on stdout:

$ mvn test-compile failsafe:integration-test
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building XXXX Webapp 1.0.5-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
....blah blah blah....
....blah blah blah....
[INFO] Failsafe report directory: XXXX\target\failsafe-reports
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Running TestSuite
Test with parameters PASS
-XXXX
Test with parameters FAILED
-YYYY
...
...


Wednesday, 13 August 2014

Increase JBehave story timeout

A small post on how to increase the default story timeout in JBehave. This might be required if you have a long running story, as I have, with many Steps/Examples etc., and you see an error message similar to the following:

Story my.story duration of 301 seconds has exceeded timeout of 300 seconds

or

STORY CANCELLED (DURATION 301 s)

The solution is to increase the story timeout in your maven config (the default is 300s). Note this will only take effect when you run the integration tests via the mvn command, and not when you run the stories via Eclipse/IntelliJ, which bypasses your pom.xml and other maven config.

The full pom.xml with the timeout setting can be found here: https://github.com/jbehave/jbehave-core/blob/master/examples/threads/pom.xml


Relevant portion below:
<build>
    <plugins>
        <plugin>
            <groupId>org.jbehave</groupId>
            <artifactId>jbehave-maven-plugin</artifactId>
            <executions>
                <execution>
                    <id>embeddable-stories</id>
                    <phase>integration-test</phase>
                    <configuration>
                        <includes>
                            <include>${embeddables}</include>
                        </includes>
                        <excludes />
                        <skip>${skip}</skip>
                        <batch>false</batch>
                        <threads>${threads}</threads>
                        <storyTimeoutInSecs>600</storyTimeoutInSecs> <!-- Overrides the default story timeout of 300 secs -->
                        <generateViewAfterStories>true</generateViewAfterStories>
                        <ignoreFailureInStories>${ignore.failure.in.stories}</ignoreFailureInStories>
                        <ignoreFailureInView>true</ignoreFailureInView>
                        <metaFilters>
                            <metaFilter>${meta.filter}</metaFilter>
                        </metaFilters>
                    </configuration>
                    <goals>
                        <goal>run-stories-as-embeddables</goal>
                    </goals>
                </execution>
            </executions>
            <dependencies>
                <!-- Only needed if groovy-based MetaFilters are used -->
                <dependency>
                    <groupId>org.codehaus.groovy</groupId>
                    <artifactId>groovy-all</artifactId>
                    <version>1.8.4</version>
                </dependency>
            </dependencies>
        </plugin>
    </plugins>
</build>
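
If you also need the higher timeout when running the stories from Eclipse/IntelliJ (where, as noted above, the maven config is bypassed), the same setting can be applied programmatically on the embedder controls. A minimal sketch, assuming a JUnitStories-based runner; the class name and story path pattern are made up for illustration:

package com.clearqa.stories;

import java.util.List;

import org.jbehave.core.io.CodeLocations;
import org.jbehave.core.io.StoryFinder;
import org.jbehave.core.junit.JUnitStories;

// Hypothetical runner class; the embedderControls() call is the relevant part
public class LongRunningStories extends JUnitStories {

    public LongRunningStories() {
        // Raise the story timeout from the 300s default to 600s
        configuredEmbedder().embedderControls().useStoryTimeoutInSecs(600);
    }

    @Override
    protected List<String> storyPaths() {
        // Pick up all *.story files on the classpath (adjust the pattern to your layout)
        return new StoryFinder().findPaths(
                CodeLocations.codeLocationFromClass(this.getClass()), "**/*.story", "");
    }
}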