Espresso test reporting

Using the Espresso test framework? Follow the instructions below to report results. If you are not using Espresso, please see the Java docs for integrating with a lower-level library.

An alternative way to report results is to use the JUnit XML output from Espresso, but the method below is recommended because it also supports uploading files.


Espresso uses the JUnit 4 test framework for writing tests by default and the instructions here assume that is the case for your Espresso tests.

Add the following dependency in your build.gradle file:

dependencies {
  implementation 'com.tesults:tesults:1.0.2'
}

Create a new Java file called TesultsListener.java in your test package. Copy and paste the code below into the new file you have created. To upload files, uncomment the files section and provide the paths to the generated files for each test case.

This listener extends the JUnit 4 RunListener to listen to test events as your tests run. This way results data can be generated and then uploaded. Three methods are overridden: testStarted (to record start times), testFinished, and testRunFinished.

import org.junit.runner.Description;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.tesults.tesults.*;

public class TesultsListener extends RunListener {
  Map<String, Integer> testCaseIndices = new HashMap<String, Integer>();
  List<Map<String,Object>> testCases = new ArrayList<Map<String, Object>>();
  Map<Integer, Long> times = new HashMap<Integer, Long>();

  @Override
  public void testStarted(Description description) throws Exception {
    times.put(description.hashCode(), System.currentTimeMillis());
  }

  @Override
  public void testFinished(Description description) throws Exception {
    Map<String, Object> testCase = new HashMap<String, Object>();
    testCase.put("name", description.getMethodName());
    testCase.put("desc", description.getMethodName());
    testCase.put("suite", description.getClassName());
    testCase.put("result", "pass");

    testCase.put("start", times.get(description.hashCode()));
    testCase.put("end", System.currentTimeMillis());

    // (Optional) For uploading files:
    //List<String> files = new ArrayList<String>();
    //files.add("/full/path/to/generated/file");
    //testCase.put("files", files);

    testCases.add(testCase);
    testCaseIndices.put(description.getDisplayName(), testCases.size() - 1);
  }


  @Override
  public void testRunFinished(Result result) throws Exception {
    List<Failure> failures = result.getFailures();
    for(Failure failure : failures){
      int index = testCaseIndices.get(failure.getDescription().getDisplayName());
      Map<String, Object> testCase = testCases.get(index);
      testCase.put("result", "fail");
      testCase.put("reason", failure.getMessage());
    }

    // Map<String, Object> to hold your test results data.
    Map<String, Object> data = new HashMap<String, Object>();
    data.put("target", "token");

    Map<String, Object> results = new HashMap<String, Object>();
    results.put("cases", testCases);
    data.put("results", results);

    // Upload
    System.out.println("Tesults results upload...");
    Map<String, Object> response = Results.upload(data);
    System.out.println("success: " + response.get("success"));
    System.out.println("message: " + response.get("message"));
    System.out.println("warnings: " + ((List<String>) response.get("warnings")).size());
    System.out.println("errors: " + ((List<String>) response.get("errors")).size());
  }
}

The target value 'token' above should be replaced with your Tesults target token. If you have lost your token, you can regenerate one from the config menu.
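Rather than hardcoding the token in source, you may prefer to inject it at run time, for example from an environment variable. The sketch below is illustrative only; the variable name TESULTS_TOKEN and the helper are assumptions, not part of the Tesults API:

```java
public class TokenResolver {

    // Return the candidate token if it is set and non-empty,
    // otherwise fall back to the provided default.
    public static String resolveToken(String candidate, String fallback) {
        if (candidate != null && !candidate.isEmpty()) {
            return candidate;
        }
        return fallback;
    }

    public static void main(String[] args) {
        // e.g. data.put("target", resolveToken(System.getenv("TESULTS_TOKEN"), "token"));
        System.out.println(resolveToken(System.getenv("TESULTS_TOKEN"), "token"));
    }
}
```

This keeps the target token out of version control while preserving a local fallback for development.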

The upload method returns a response of type Map<String, Object> that you can use to check whether the upload was a success.

Value for key "success" is a Boolean: true if results successfully uploaded, false otherwise.

Value for key "message" is a String: if success is false, check message to see why upload failed.

Value for key "warnings" is a List<String>, if size is not zero there may be issues with file uploads.

Value for key "errors" is a List<String>, if "success" is true then this will be empty.
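Putting the four keys together, a small helper can condense the response into a single log line. The helper below is a sketch based on the response shape described above; only the key names come from the library:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class UploadResponseCheck {

    // Summarize a Tesults upload response map into one log line.
    @SuppressWarnings("unchecked")
    public static String summarize(Map<String, Object> response) {
        Boolean success = (Boolean) response.get("success");
        String message = (String) response.get("message");
        List<String> warnings = (List<String>) response.get("warnings");
        List<String> errors = (List<String>) response.get("errors");
        return "success=" + success
                + " message=" + message
                + " warnings=" + warnings.size()
                + " errors=" + errors.size();
    }

    public static void main(String[] args) {
        // Simulated successful response for demonstration.
        Map<String, Object> response = new HashMap<String, Object>();
        response.put("success", true);
        response.put("message", "Success");
        response.put("warnings", new ArrayList<String>());
        response.put("errors", new ArrayList<String>());
        System.out.println(summarize(response));
    }
}
```

You could call a helper like this from testRunFinished in place of the four separate println statements.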

Register TesultsListener

Create a new AndroidManifest.xml at the root of your Espresso tests folder. The name of the file must be AndroidManifest.xml. This file will then show up under the manifests folder in the IDE. If it does not, please ensure you have created this file at the root of your Espresso tests folder (at the same level as the Java folder). Please note, this is not the same AndroidManifest.xml that Android Studio generates for your application, this will be a new AndroidManifest.xml for your tests package.

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.myapplication">
    <instrumentation
        android:name="androidx.test.runner.AndroidJUnitRunner"
        android:targetPackage="com.example.myapplication">
        <meta-data
            android:name="listener"
            android:value="com.example.myapplication.TesultsListener" />
    </instrumentation>
</manifest>

Replace com.example.myapplication with your application's package name.

Finally, ensure internet permissions are set in your application's AndroidManifest.xml. Without them, Tesults cannot be reached and you will see an error about internet permissions under Logcat in Android Studio.

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />

Now run your tests and results will be uploaded to Tesults to the target that maps to the token supplied.


You can submit build information by adding a build case to the TesultsListener. This is a test case with the suite set to the value '[build]'.


Map<String, Object> buildCase = new HashMap<String, Object>();
buildCase.put("name", "1.0.0");
buildCase.put("desc", "build description");
buildCase.put("suite", "[build]");
buildCase.put("result", "pass");

// Add to the same list the test cases are added to (testCases), before upload in testRunFinished.

Result Interpretation

Result interpretation is not currently supported by this integration. If you are interested in support, please contact us.

Consolidating parallel test runs

If you execute multiple test runs in parallel or serially for the same build or release and results are submitted to Tesults within each run, separately, you will find that multiple test runs are generated on Tesults. This is because the default behavior on Tesults is to treat each results submission as a separate test run. This behavior can be changed from the configuration menu. Click 'Results Consolidation By Build' from the Configure Project menu to enable and disable consolidation by target. Enabling consolidation will mean that multiple test runs submitted with the same build name will be consolidated into a single test run.

Dynamically created test cases

If you dynamically create test cases, such as test cases with variable values, we recommend that the test suite and test case names themselves be static. Provide the variable data information in the test case description or other custom fields but try to keep the test suite and test name static. If you change your test suite or test name on every test run you will not benefit from a range of features Tesults has to offer including test case failure assignment and historical results analysis. You need not make your tests any less dynamic, variable values can still be reported within test case details.
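For example, rather than embedding a variable value in the test name, keep the name and suite static and report the variable detail in the description. A minimal sketch (the suite and name values are placeholders, not a prescribed convention):

```java
import java.util.HashMap;
import java.util.Map;

public class DynamicCaseExample {

    // Build a test case map with static name/suite; the variable
    // value is reported only in the description.
    public static Map<String, Object> testCaseFor(String userId) {
        Map<String, Object> testCase = new HashMap<String, Object>();
        testCase.put("name", "loginTest");                 // static across runs
        testCase.put("suite", "AuthSuite");                // static across runs
        testCase.put("desc", "Login for user " + userId);  // variable detail
        testCase.put("result", "pass");
        return testCase;
    }

    public static void main(String[] args) {
        System.out.println(testCaseFor("user-42").get("desc"));
    }
}
```

Because the name is identical on every run, Tesults can match the case across runs for failure assignment and historical analysis.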

Proxy servers

Does your corporate/office network run behind a proxy server? Contact us and we will supply you with a custom API library for this case. Without it, results will fail to upload to Tesults.

Have questions or need help? Contact us