JUnit 4 test reporting

If you are not using JUnit 4, view the Java docs for information about integrating with a lower-level library.

We recommend you run your JUnit 4 tests using junit-vintage-engine in JUnit 5 as outlined here: https://junit.org/junit5/docs/current/user-guide/#migrating-from-junit4.

This is a great way to run your existing JUnit 4 tests while moving towards JUnit 5, because you do not need to make changes to your tests. Simply add the junit-vintage-engine as a dependency; examples for Gradle and Maven are shown below, and full details are available at the link above.

Gradle: build.gradle

dependencies {
  // use the junit-vintage-engine version that matches your JUnit 5 version;
  // 5.10.2 here is illustrative
  testImplementation 'org.junit.vintage:junit-vintage-engine:5.10.2'
}

Maven: pom.xml

<dependency>
  <groupId>org.junit.vintage</groupId>
  <artifactId>junit-vintage-engine</artifactId>
  <version>5.10.2</version>
  <scope>test</scope>
</dependency>

You can then make use of the Tesults JUnit 5 extension to push test results data to Tesults. The only caveat is that the extension works best with JUnit 4 tests if your tests are in a package, which is easy to add if they are not already; this avoids an issue with how the junit-vintage-engine reports test suite names.
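If your JUnit 4 tests are not already in a package, adding one is a small change. The sketch below is a project-layout fragment; the package, directory, and class names are illustrative, not part of Tesults:

```java
// Place the file under a matching directory, e.g.
// src/test/java/com/example/tests/ExampleTest.java
package com.example.tests;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class ExampleTest {
    @Test
    public void addition() {
        assertEquals(4, 2 + 2);
    }
}
```

Only the package declaration (and the matching directory) is new; the test body is unchanged.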

Once you are running your JUnit 4 tests using the JUnit 5 provided junit-vintage-engine continue following the instructions for setting up Tesults with JUnit 5: https://www.tesults.com/docs/junit5.

Alternatively, if you do not want to run your JUnit 4 tests using the junit-vintage-engine as suggested above, you can integrate with Tesults by setting up a listener and registering it within your JUnit 4 project.

JUnit 4 Listener

As outlined above, we do not recommend this method but it may be useful in specific situations.


The Tesults Java API Library is available from the JCenter and Maven Central repositories or as a JAR you can download from here.


If you are using Gradle add this dependency snippet to your build.gradle file:

dependencies {
  compile 'com.tesults:tesults:1.0.2'
}

Also ensure you have the JCenter or Maven Central repository referenced in your build.gradle file.


If you are using Maven, add this dependency snippet to your pom.xml file:

<dependency>
  <groupId>com.tesults</groupId>
  <artifactId>tesults</artifactId>
  <version>1.0.2</version>
</dependency>

Alternatively a JAR is available for download directly here: tesults.jar (compatible with Java 7+)


Make the tesults.jar available as a library and then import the com.tesults.tesults package in your code:

import com.tesults.tesults.*;


Tesults recommends extending the JUnit 4 RunListener to listen to test events as your tests run. This way results data can be generated and then uploaded. Only two methods need to be implemented: testFinished and testRunFinished.

Tesults provides the TesultsListener below, ready to be copied and pasted into your project. Use it as is or edit it. To upload files, uncomment the files section and provide the paths to the generated files for each test case. If you require help with this, contact help@tesults.com and we will write this for you.

import org.junit.runner.Description;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.tesults.tesults.*;

public class TesultsListener extends RunListener {
  Map<String, Integer> testCaseIndices = new HashMap<String, Integer>();
  List<Map<String,Object>> testCases = new ArrayList<Map<String, Object>>();

  public void testFinished(Description description) throws Exception {
    Map<String, Object> testCase = new HashMap<String, Object>();
    testCase.put("name", description.getMethodName());
    testCase.put("desc", description.getMethodName());
    testCase.put("suite", description.getClassName());
    testCase.put("result", "pass");

    // (Optional) For uploading files:
    //List<String> files = new ArrayList<String>();
    //testCase.put("files", files);


    testCases.add(testCase);
    testCaseIndices.put(description.getDisplayName(), testCases.size() - 1);
  }

  public void testRunFinished(Result result) throws Exception {
    List<Failure> failures = result.getFailures();
    for(Failure failure : failures){
      int index = testCaseIndices.get(failure.getDescription().getDisplayName());
      Map<String, Object> testCase = testCases.get(index);
      testCase.put("result", "fail");
      testCase.put("reason", failure.getMessage());
    }

    // Map<String, Object> to hold your test results data.
    Map<String, Object> data = new HashMap<String, Object>();
    data.put("target", "token");

    Map<String, Object> results = new HashMap<String, Object>();
    results.put("cases", testCases);
    data.put("results", results);

    // Upload
    Map<String, Object> response = Results.upload(data);
    System.out.println("success: " + response.get("success"));
    System.out.println("message: " + response.get("message"));
    System.out.println("warnings: " + ((List<String>) response.get("warnings")).size());
    System.out.println("errors: " + ((List<String>) response.get("errors")).size());
  }
}

The target value 'token' above should be replaced with your Tesults target token. If you have lost your token you can regenerate one from the config menu.

The upload method returns a response of type Map<String, Object> that you can use to check whether the upload was a success.

Value for key "success" is a Boolean: true if results were successfully uploaded, false otherwise.

Value for key "message" is a String: if success is false, check the message to see why the upload failed.

Value for key "warnings" is a List<String>: if its size is not zero there may be issues with file uploads.

Value for key "errors" is a List<String>: if "success" is true this will be empty.
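Putting those checks together, a minimal sketch of handling the response map is shown below. The map constructed in main is a stand-in for what Results.upload(data) returns; the helper method name is illustrative:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ResponseCheck {
    // Inspect the upload response map (keys per the Tesults docs above).
    @SuppressWarnings("unchecked")
    static boolean uploadSucceeded(Map<String, Object> response) {
        Boolean success = (Boolean) response.get("success");
        if (success == null || !success) {
            System.out.println("Upload failed: " + response.get("message"));
            return false;
        }
        List<String> warnings = (List<String>) response.get("warnings");
        if (warnings != null && !warnings.isEmpty()) {
            System.out.println("Warnings (possible file upload issues): " + warnings.size());
        }
        return true;
    }

    public static void main(String[] args) {
        // Stand-in for the map returned by Results.upload(data)
        Map<String, Object> response = new HashMap<String, Object>();
        response.put("success", true);
        response.put("message", "Success");
        response.put("warnings", new ArrayList<String>());
        response.put("errors", new ArrayList<String>());
        System.out.println("success: " + uploadSucceeded(response));
    }
}
```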

In order to use a RunListener you must register it with the test runner. If you are not already using a test runner, it's easy to add one:

public static void main(String[] args) {
  JUnitCore core = new JUnitCore();
  core.addListener(new TesultsListener());
  core.run(ExampleTestOne.class, ExampleTestTwo.class); // replace with your own test classes
}

Caution: if uploading files, the time taken to upload depends entirely on your network speed. Typical office upload speeds of 100-1000 Mbps should allow even hundreds of files to upload in just a few seconds, but if you have slower access it may take hours. We recommend uploading a reasonable number of files for each test case. The upload method blocks at the end of a test run while uploading test results and files. When starting out, test without files first to ensure everything is set up correctly.

Consolidating parallel test runs

If you execute multiple test runs in parallel or serially for the same build or release and results are submitted to Tesults within each run, separately, you will find that multiple test runs are generated on Tesults. This is because the default behavior on Tesults is to treat each results submission as a separate test run. This behavior can be changed from the configuration menu. Click 'Results Consolidation By Build' from the Configure Project menu to enable and disable consolidation by target. Enabling consolidation will mean that multiple test runs submitted with the same build name will be consolidated into a single test run.
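To consolidate runs, each submission must report the same build name. A minimal sketch is shown below; per the Tesults docs, build information can be reported as an extra case whose suite is set to "[build]". The method name and build name string are illustrative, and you should confirm the exact convention for your library version:

```java
import java.util.HashMap;
import java.util.Map;

public class BuildInfoSketch {
    // Sketch: report build information as a case with suite "[build]".
    static Map<String, Object> buildCase(String buildName) {
        Map<String, Object> buildCase = new HashMap<String, Object>();
        buildCase.put("name", buildName); // same build name across all parallel runs
        buildCase.put("suite", "[build]");
        buildCase.put("result", "pass");
        return buildCase;
    }

    public static void main(String[] args) {
        // Add this case to the "cases" list alongside your test cases
        Map<String, Object> b = buildCase("1.0.0-build.42");
        System.out.println(b.get("suite") + " " + b.get("name"));
    }
}
```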

Dynamically created test cases

If you dynamically create test cases, such as test cases with variable values, we recommend that the test suite and test case names themselves be static. Provide the variable data information in the test case description or other custom fields but try to keep the test suite and test name static. If you change your test suite or test name on every test run you will not benefit from a range of features Tesults has to offer including test case failure assignment and historical results analysis. You need not make your tests any less dynamic, variable values can still be reported within test case details.
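A minimal sketch of this pattern is shown below, using the same Map-based case structure as the listener above. The suite, test, and field values are illustrative; the point is that "suite" and "name" stay identical between runs while the variable input goes into "desc":

```java
import java.util.HashMap;
import java.util.Map;

public class DynamicCaseSketch {
    // Keep "suite" and "name" static across runs; put variable values into
    // "desc" so Tesults can track the case's history between runs.
    static Map<String, Object> caseFor(int inputValue, boolean passed) {
        Map<String, Object> testCase = new HashMap<String, Object>();
        testCase.put("suite", "CalculatorSuite");         // static
        testCase.put("name", "additionHandlesAllInputs"); // static
        testCase.put("desc", "Checked input value: " + inputValue); // variable detail
        testCase.put("result", passed ? "pass" : "fail");
        return testCase;
    }

    public static void main(String[] args) {
        Map<String, Object> c = caseFor(7, true);
        System.out.println(c.get("name") + " | " + c.get("desc"));
    }
}
```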

Proxy servers

Does your corporate/office network run behind a proxy server? Contact us and we will supply you with a custom API Library for this case. Without it, results will fail to upload to Tesults.

Have questions or need help? Contact us

Result Interpretation

Result interpretation is not currently supported by this integration. If you are interested in support please contact help@tesults.com.