Friday, 15 August 2014

Skip TestNG test based on a condition

Sometimes you might want to skip tests based on a condition. The following snippet shows a simple way to do it by throwing an org.testng.SkipException. This integrates seamlessly with test reports, and skipped tests are counted and displayed correctly in them.

//....code snippet.....
// Filter out bad tests
for (String c : conditions) {
    if (c.matches("some condition 1") || c.matches("some other condition")) {
        throw new org.testng.SkipException(c);
    }
}
//....code snippet.....
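One thing to watch in a snippet like this: String.matches() anchors the regex to the whole string, so a substring hit is not enough. A quick standalone check (the condition strings here are just the made-up examples from the snippet):

```java
public class MatchDemo {
    public static void main(String[] args) {
        // matches() succeeds only if the regex covers the ENTIRE string...
        System.out.println("some condition 1".matches("some condition 1"));            // true
        // ...so a longer string merely containing the pattern does not match...
        System.out.println("prefix some condition 1".matches("some condition 1"));     // false
        // ...unless the regex itself allows surrounding text.
        System.out.println("prefix some condition 1".matches(".*some condition 1.*")); // true
    }
}
```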

Custom logger for TestNG

Another small post on TestNG, this time on a custom logger. I have a TestNG test suite that runs for hours going through integration tests. At the end of the run I can look at the TestNG reports and see what failed and what passed, but during the run I get no feedback on whether all tests are failing due to some configuration issue or it is a normal test run. This happens because the suite is essentially a single test with a DataProvider supplying different test parameters. If you have separate tests then you will get a message for each test.

Anyhow, I wanted a custom TestNG logger that logs a message I understand at the end of each test, and here is a way to do so.


Extend the TestListenerAdapter class and override a few methods:
package com.clearqa.utils;

import org.testng.ITestResult;
import org.testng.TestListenerAdapter;

public class TestNgCustomLogger extends TestListenerAdapter {

    @Override
    public void onTestFailure(ITestResult tr) {
        logToStdOut(tr, "FAILED");
    }

    @Override
    public void onTestSkipped(ITestResult tr) {
        logToStdOut(tr, "SKIPPED");
    }

    @Override
    public void onTestSuccess(ITestResult tr) {
        logToStdOut(tr, "PASS");
    }

    private void logToStdOut(ITestResult tr, String result) {
        Object[] parameters = tr.getParameters();
        System.out.println("Test with parameters " + result);
        for (Object o : parameters) {
            System.out.println("\t -" + o.toString());
        }
    }
}


Add the custom listener to your testng.xml config:
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="REST api Integration Tests" verbose="1" data-provider-thread-count="10">
    <listeners>
        <listener class-name="com.clearqa.utils.TestNgCustomLogger" />
    </listeners>
    <test name="Rest API - Json schema validation Tests" >
        <classes>
            <class name="com.clearqa.restapi.test.RestApiITCase" />
        </classes>
    </test>
</suite>


and voilà, I get the much-needed indication on stdout:

$ mvn test-compile failsafe:integration-test
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building XXXX Webapp 1.0.5-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
....blah blah blah....
....blah blah blah....
[INFO] Failsafe report directory: XXXX\target\failsafe-reports
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Running TestSuite
Test with parameters PASS
-XXXX
Test with parameters FAILED
-YYYY
...
...


Wednesday, 13 August 2014

Increase JBehave story timeout

A small post on how to increase the default story timeout in JBehave. This might be required if you have a long-running story, as I do, with many Steps/Examples etc., and you see an error message similar to the following:

Story my.story duration of 301 seconds has exceeded timeout of 300 seconds

or

STORY CANCELLED (DURATION 301 s)

The solution is to increase the story timeout in your Maven config (the default is 300 s). Note this will only take effect when you run integration tests via the mvn command and not when you run the stories via Eclipse/IntelliJ, which bypasses your pom.xml and other Maven config.

Full pom.xml with timeout setting can be found here: https://github.com/jbehave/jbehave-core/blob/master/examples/threads/pom.xml


Relevant portion below:
<build>
    <plugins>
        <plugin>
            <groupId>org.jbehave</groupId>
            <artifactId>jbehave-maven-plugin</artifactId>
            <executions>
                <execution>
                    <id>embeddable-stories</id>
                    <phase>integration-test</phase>
                    <configuration>
                        <includes>
                            <include>${embeddables}</include>
                        </includes>
                        <excludes />
                        <skip>${skip}</skip>
                        <batch>false</batch>
                        <threads>${threads}</threads>
                        <storyTimeoutInSecs>600</storyTimeoutInSecs> <!-- Overrides the default story timeout of 300 secs -->
                        <generateViewAfterStories>true</generateViewAfterStories>
                        <ignoreFailureInStories>${ignore.failure.in.stories}</ignoreFailureInStories>
                        <ignoreFailureInView>true</ignoreFailureInView>
                        <metaFilters>
                            <metaFilter>${meta.filter}</metaFilter>
                        </metaFilters>
                    </configuration>
                    <goals>
                        <goal>run-stories-as-embeddables</goal>
                    </goals>
                </execution>
            </executions>
            <dependencies>
                <!-- Only needed if groovy-based MetaFilters are used -->
                <dependency>
                    <groupId>org.codehaus.groovy</groupId>
                    <artifactId>groovy-all</artifactId>
                    <version>1.8.4</version>
                </dependency>
            </dependencies>
        </plugin>
    </plugins>
</build>

Thursday, 10 April 2014

restIT: Testing Framework for REST based web services

I would like to call this testing framework restIT (REST Integration Test).

This test framework gives you a simple way to verify that a REST web service is responding with the right JSON responses. All you need to provide is a couple of yml config files declaring the REST URLs, and JSON schema files for the expected JSON responses.

Adding new tests should not take more than 10 minutes of your time. The framework is flexible enough for you to add more advanced features, such as authentication headers and complex JSON response validation which cannot be done via JSON schema files.

Project is hosted on github at: https://github.com/njaiswal/restITframework

Steps to make use of this framework.

  • Git clone the project.
git clone git@github.com:njaiswal/restITframework.git restIT

  • Import the project in Eclipse or your favourite IDE as an existing Maven project.
  • Replace src/test/resources/testcases.yml file with your own.
    • Make sure you change baseurl to map to the base URL of your REST service.
    • Add the services which you want to test, with/without parameters; this depends on your REST service. See examples in the file.
---
baseurl: 'http://yourWebSite/AppName'
services:
- /customers
- /customer/One
- /customer/Two

  • For each of the test cases you have added in the above step, you need to add a schema map. Replace the file src/main/resources/schemaMap.yml with your own.
---
schemas:
  - id: Customers List
    file: "/schemas/customers_list_schema.json"
    regex: "^.*/customers$"
  - id: Any particular customer
    file: "/schemas/customer_schema.json"
    regex: "^.*/customer/.+$"

Each schema map in this yml file has 3 fields:
    • id: This is just a name; it appears in logs/reports.
    • file: This is the JSON schema file path.
    • regex: If the URL matches this regex, the response from this URL will be validated against the above JSON schema file.
  • In the src/test/resources/schemas directory, add all JSON schema files defined in the schemaMap.yml file.
    • See examples in the git project.
    • Find more details on how to write schema files at: http://json-schema.org/
    • You can add as much or as little JSON response validation as you want, depending on what json-schema provides out of the box. Think of this like an XML schema for XML responses.
    • If you want to do more complicated response validation, extend the response validation code, or if you have specific requests let me know and I might accommodate them in my free time.
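For illustration, a minimal schema file for one of the customer endpoints might look like the following. The field names here are invented; your real schemas depend entirely on what your service returns, and the examples in the git project are the authoritative reference.

```json
{
    "$schema": "http://json-schema.org/draft-04/schema#",
    "type": "object",
    "required": ["id", "name"],
    "properties": {
        "id":   { "type": "integer" },
        "name": { "type": "string" }
    }
}
```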

Steps to run the test framework and view reports.

Easy: 
mvn test

View reports at: target/surefire-reports/index.html

Make sure to check the 'Reporter Output' tab for any JSON schema validation error reports. It shows all JSON schema validation errors pretty-printed.

Enjoy. Let me know if you have any questions/suggestions.


Friday, 4 April 2014

Small but useful utilities

Many times, working inside a corporate environment, you cannot use online utilities like URL decoders, base64 decoders etc., where you copy-paste the encoded string into a webpage and voilà, it gets decoded in the browser. Though these websites say all decoding is done on the client side, which is true most of the time, it is always smart not to copy-paste company data onto random websites.

I have a set of small utilities which I have shared here; they might prove useful to you.


  • Decode an encoded URL string. Useful for me to read encoded URLs and make sense of them at times.
    $ cat url_decoder
    #!/usr/bin/perl
    use strict;
    use warnings;
    use URI::Encode qw(uri_encode uri_decode);
    while (1) {
        my $url = <STDIN>;
        print uri_decode($url) . "\n\n";
    }

    $ ./url_decoder
    https://www.google.co.uk/search?q=encoded+urls&oq=encoded+urls&aqs=chrome.0.69i57j0l3j69i62l2.2861j0&sourceid=chrome&ie=UTF-8#q=GBP+%C2%A3+%2F+USD+%24
    https://www.google.co.uk/search?q=encoded+urls&oq=encoded+urls&aqs=chrome.0.69i57j0l3j69i62l2.2861j0&sourceid=chrome&ie=UTF-8#q=GBP+£+/+USD+$
  • Base64 decoder. Useful when you have quick decoding requirements.
    #!/usr/bin/perl
    use strict;
    use warnings;
    use MIME::Base64 qw/decode_base64/;
    while (1) {
        my $text = <STDIN>;
        print decode_base64($text) . "\n\n";
    }

    $ ./base64_decoder
    cGxhaW5UZXh0wqM=
    plainText£
  • json to yml: a useful tool to quickly convert JSON to yml.
    #!/usr/bin/perl
    use strict;
    use warnings;
    use JSON qw/decode_json/;
    use YAML;
    my $json_file = $ARGV[0];
    my $json_text;
    open(IN, "<", $json_file) or die("Cannot open file: $json_file");
    {
        local $/;    # slurp the whole file
        $json_text = <IN>;
    }
    close(IN);
    my $hash = decode_json($json_text);
    print YAML::Dump $hash;

These are very simple but useful utilities; obviously you will need your Perl modules and setup ready to run them.
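As an aside, if Perl is not handy, the base64 case at least has a built-in equivalent in the JDK (Java 8+). A minimal sketch, decoding the same example string used above:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64Decoder {
    public static void main(String[] args) {
        // Same example input as the Perl base64_decoder above
        byte[] decoded = Base64.getDecoder().decode("cGxhaW5UZXh0wqM=");
        System.out.println(new String(decoded, StandardCharsets.UTF_8)); // plainText£
    }
}
```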

Making Unit and Integration Tests coexist - Maven/TestNG

Recently I had to make unit and integration tests coexist in the same Maven project (pom); up till then integration tests resided in a separate project (nice and easy). However, ideally these should coexist in the same project to make versioning/branching possible.

In this small post I will share the steps taken to make this possible.

Technologies used here are:

- Maven
- TestNG

Goals: 

- mvn test should only run unit tests
- mvn integration-test should only run integration tests, so that they do not cause issues building the project and are not included in the Maven lifecycle for mvn install

Steps:


  • Make 2 different testng.xml files called:
    • ITtestng.xml (for integration tests)
      <!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
      <suite name="TPCC REST api Integration Tests" verbose="1" >
          <test name="Rest API - Json schema validation Tests" >
              <classes>
                  <class name="com.clearqa.test.RestApiITCase" />
              </classes>
          </test>
      </suite>
    • UTtestng.xml (for unit tests)
      <!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
      <suite name="TPCC Unit Tests" verbose="1" >
          <test name="Dummy Tests" >
              <classes>
                  <class name="com.clearqa.test.DummyTest" />
              </classes>
          </test>
      </suite>
  • Include the maven-failsafe-plugin in your pom.xml to run just the integration test cases. Please note that maven-failsafe-plugin is designed to run integration tests, while the Surefire plugin is designed to run unit tests; more info on this can be found here.
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-failsafe-plugin</artifactId>
        <version>2.17</version>
        <configuration>
            <suiteXmlFiles>
                <suiteXmlFile>target/test-classes/ITtestng.xml</suiteXmlFile>
            </suiteXmlFiles>
        </configuration>
        <executions>
            <execution>
                <id>failsafe-integration-tests</id>
                <phase>integration-test</phase>
                <goals>
                    <goal>integration-test</goal>
                </goals>
            </execution>
        </executions>
    </plugin>
  • That is it! 
    • Now your mvn test command will run only the unit tests. Also, mvn install will not run the integration tests and cause the build to fail.
    • mvn integration-test will run both unit and integration tests.
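For completeness: this post does not show the Surefire side of the setup. Assuming you want Surefire to pick up only UTtestng.xml (the version number below is illustrative, matching the Failsafe one above), a plausible configuration would be:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.17</version>
    <configuration>
        <suiteXmlFiles>
            <suiteXmlFile>target/test-classes/UTtestng.xml</suiteXmlFile>
        </suiteXmlFiles>
    </configuration>
</plugin>
```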