Mocha test reporting

Using the Mocha test framework? Follow the instructions below to report results. If you are not using Mocha, see the Node.js docs for integrating with a lower-level library.


Install the mocha-tesults-reporter

npm install mocha-tesults-reporter --save


Now that the mocha-tesults-reporter is installed, you must configure Mocha to use it. This is done with Mocha's --reporter arg. In your package.json file, add this snippet:

"scripts": {
  "test": "mocha * --reporter mocha-tesults-reporter -- tesults-target=token"
}

Then run Mocha as usual with the npm test command. Use npm run script-name if you named the script something other than test in package.json. If running Mocha from the command line, run:

mocha * --reporter mocha-tesults-reporter -- tesults-target=token

Replace 'token' above with your Tesults target token, which you received when creating a project or target; it can be regenerated from the configuration menu. The * after mocha tells Mocha to run all the tests it finds. You may want to narrow this down; see the Mocha docs for more details.

To supply multiple args to the reporter use this format:

mocha * --reporter mocha-tesults-reporter -- arg1=val1 arg2=val2 arg3=val3

for example:

mocha * --reporter mocha-tesults-reporter -- tesults-target=token tesults-files=path tesults-config=configpath

The reporter treats each describe as a test suite and each it as a test case.

const assert = require('assert');

describe('SuiteA', function () {
  it('Test1', function () {
    assert.equal(2, 2);
  });

  it('Test2', function () {
    assert.equal(1, 2);
  });
});

describe('SuiteB', function () {
  it('Test3', function () {
    assert.equal(3, 3);
  });
});

Using Protractor? The configuration instructions above do not apply. Follow the Protractor configuration instructions instead and then return here to continue.


tesults-target

Required arg for uploading to Tesults. If this arg is not provided, the reporter does not attempt an upload, effectively disabling it. Get your target token from the configuration menu in the Tesults web interface.

Inline method


In this case, the target token is supplied directly (inline) in the command-line args. This is the simplest approach.

Key method


In this case, the reporter automatically looks up the value of the token from a config.json file. See below for more details.
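For example, assuming a config.json at ./config.json that contains a key named target1, a key-method invocation might look like this (the path and key name are illustrative):

```shell
mocha * --reporter mocha-tesults-reporter -- tesults-target=target1 tesults-config=./config.json
```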


tesults-config

You can optionally provide a path to a config.json file. Configuration JSON files make it possible to store many targets and select one with the tesults-target arg (see above). All args other than tesults-target and tesults-config can be supplied in a config.json file instead of on the mocha command, including the file and build related args detailed below.

Example contents of a config.json:

{
  "target1": "eyJ0eXAiOiJKV1QiLCJhb1",
  "target2": "eyJ0eXAiOiJKV1QiLCJhb2",
  "target3": "eyJ0eXAiOiJKV1QiLCJhb3",
  "target4": "eyJ0eXAiOiJKV1QiLCJhb4"
}

Or something more descriptive about the targets:

{
  "web-qa-env": "eyJ0eXAiOiJKV1QiLCJhb1",
  "web-staging-env": "...",
  "web-prod-env": "...",
  "ios-qa-env": "eyJ0eXAiOiJKV1QiLCJhb2",
  "ios-staging-env": "...",
  "ios-prod-env": "...",
  "android-qa-env": "eyJ0eXAiOiJKV1QiLCJhb3",
  "android-staging-env": "...",
  "android-prod-env": "..."
}

Basic configuration complete

At this point the mocha-tesults-reporter will push results to Tesults when you run your mocha command. The tesults-target arg must be supplied to indicate which target to use.


During a test run, save test-generated files such as logs and screenshots to a local temporary directory. At the end of the test run, all files will be uploaded to Tesults automatically, as long as you save them in the directory structure below:
  • temporary
    • Test suite name
      • Test case name
        • file 1
        • file 2

For example:

  • temporary
    • Test Suite A
      • Test 1
        • test.log
        • screenshot.png
      • Test 2
        • test.log
        • screenshot.png
    • Test Suite B
      • Test 3
        • metrics.csv
        • error.log
      • Test 4
        • test.log

If not using test suites then omit the suite directories:

  • temporary
    • Test 1
      • test.log
      • screenshot.png
    • Test 2
      • test.log
      • screenshot.png

tesults-files

Provide this arg to save files generated during a test run. Supply the path to the top-level directory where files generated during testing are saved for the running test run. Files, including logs, screen captures and other artifacts, will be uploaded automatically.


This is one area where the mocha-tesults-reporter is opinionated: it requires that files generated during a test run be saved locally, temporarily, within a specific directory structure.

Store all files in a temporary directory as your tests run. After Tesults upload is complete, delete the temporary directory or just have it overwritten on the next test run.

Within this temporary directory create subdirectories matching the test suite (describe) and test case (it) name.

Be aware that if providing build files, the build suite is always set to [build] and files are expected to be located in temporary/[build]/buildname

Caution: If uploading files, the time taken to upload depends entirely on your network speed. Typical office upload speeds of 100 to 1000 Mbps allow even hundreds of files to upload in just a few seconds, but if you have slower access it may take hours. We recommend uploading a reasonable number of files for each test case. The upload method blocks while uploading test results and files. When starting out, test without files first to ensure everything is set up correctly.



tesults-build-name

Use this arg to report a build version or name.



tesults-build-result

Use this arg to report the build result; the value must be one of [pass, fail, unknown].



tesults-build-description

Use this arg to report a build description.

tesults-build-description='added new feature'


tesults-build-reason

Use this arg to report a build failure reason.

tesults-build-reason='build error line 201 somefile.js'
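Putting the build args together, a full invocation might look like this (the build name, result and reason values are illustrative):

```shell
mocha * --reporter mocha-tesults-reporter -- tesults-target=token tesults-build-name=1.0.0 tesults-build-result=fail tesults-build-reason='build error line 201 somefile.js'
```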

Consolidating multiple test results submissions into one test run

If you execute multiple test runs, in parallel or serially, for the same build or release, and each run submits results to Tesults separately, you will find that multiple test runs are generated on Tesults. This is because the default behavior on Tesults is to treat each results submission as a separate test run. This behavior can be changed from the configuration menu: click 'Results Consolidation By Build' in the Configure Project menu to enable or disable consolidation per target. Enabling consolidation means that multiple test runs submitted with the same build name are consolidated into a single test run.

Dynamically created test cases should have static names

If you dynamically create test cases, such as test cases with variable values, we recommend that the test suite and test case names themselves be static. Provide the variable data in the test case description or other custom fields, but keep the test suite and test name static. This way you will benefit from a range of Tesults features, including test case failure assignment and historical results analysis. You need not make your tests any less dynamic; variable values can still be reported within test case details.

Proxy servers

Does your corporate/office network run behind a proxy server? Contact us and we will supply you with a custom API library for this case. Without it, results will fail to upload to Tesults.

Have questions or need help? Contact us