Nightwatch test reporting

If you are not using the Nightwatch test framework but are using JavaScript/Node.js, see the Node.js docs instead for integrating with a lower-level library.

Installation

Tesults supplies a reporter specifically designed for Nightwatch to make integration quick and easy.

Install nightwatch-tesults:

npm install nightwatch-tesults --save

Configuration

In your nightwatch.conf.js file, under test_settings and any environment for which you are testing, add a property called tesults (in this example the default environment is used):

nightwatch.conf.js
module.exports = {
  ...
  test_settings: {
    default: {
      ...
      globals: {
        tesults: {
          target: token
        }
      }
      ...
    }
  }
  ...
}

The 'token' above should be replaced with your Tesults target token, which you received when creating a project or target, and can be regenerated from the configuration menu.

Then run your tests as usual and pass the --reporter parameter to select nightwatch-tesults:

npx nightwatch tests --reporter nightwatch-tesults

Basic configuration complete

At this point nightwatch-tesults will push results to Tesults after your tests complete. The target arg must be supplied in the nightwatch.conf.js file; otherwise reporting is disabled.
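One way to supply the token without committing it to source control is to read it from an environment variable. A minimal sketch, assuming an environment variable named TESULTS_TOKEN (the variable name is illustrative, not something the reporter requires):

```javascript
// nightwatch.conf.js - minimal sketch. TESULTS_TOKEN is an assumed
// environment variable name, used to keep the token out of source control.
module.exports = {
  src_folders: ['tests'],
  test_settings: {
    default: {
      globals: {
        tesults: {
          target: process.env.TESULTS_TOKEN
        }
      }
    }
  }
};
```

If the variable is unset, target will be undefined and reporting is disabled, which can be convenient for local runs.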

Files generated by tests

Any files attached during a test using the built-in Nightwatch screenshot mechanism (enabled in nightwatch.conf.js) will be automatically uploaded to Tesults for reporting.

Example of enabling Nightwatch screenshots on failure:
module.exports = {
  ...
  test_settings: {
    default: {
      ...
      screenshots: {
        enabled: true,
        path: 'screens',
        on_failure: true
      }
      ...
    }
  }
  ...
}
Other files

To upload other files, make use of the files property in the tesults object in the nightwatch.conf.js file. The value for this property is an absolute path to a directory where you store all of your test generated files during a test run. This directory can be temporary; after the test run has completed and the files have been uploaded, it can be deleted or overwritten.

These other files must be stored in this directory under a specific structure:
  • temporary folder
    • Test Suite A
      • Test 1
        • test.log
        • screenshot.png
      • Test 2
        • test.log
        • screenshot.png
    • Test Suite B
      • Test 3
        • metrics.csv
        • error.log
      • Test 4
        • test.log
Caution: If uploading files, the time taken to upload depends entirely on your network speed. Typical office upload speeds of 100 - 1000 Mbps should allow even hundreds of files to upload in just a few seconds, but on slower connections it may take much longer. We recommend uploading a reasonable number of files for each test case. The upload method blocks at the end of a test run while uploading test results and files. When starting out, test without files first to ensure everything is set up correctly.

Handling multiple test jobs with targets

For supporting multiple test jobs, Tesults suggests creating multiple targets, as outlined in the project structuring advice. You can specify different target tokens for different environments within your nightwatch.conf.js file, or even use multiple configuration files.
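For example, each Nightwatch environment can carry its own target token. A sketch, where the environment names and the TESULTS_TOKEN_* variable names are placeholders:

```javascript
// nightwatch.conf.js - sketch of per-environment targets. The environment
// names and TESULTS_TOKEN_* variable names are illustrative.
module.exports = {
  src_folders: ['tests'],
  test_settings: {
    default: {
      globals: {
        tesults: { target: process.env.TESULTS_TOKEN_DEV }
      }
    },
    staging: {
      globals: {
        tesults: { target: process.env.TESULTS_TOKEN_STAGING }
      }
    }
  }
};
```

Selecting the environment then selects the target, for example: npx nightwatch tests --env staging --reporter nightwatch-tesults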

Full parameter list

module.exports = {
  ...
  test_settings: {
    default: {
      ...
      globals: {
        tesults: {
          target: token,
          build_name: "1.0.0",
          build_result: "pass",
          build_desc: "Build description",
          build_reason: "Build failure reason",
          files: "/full/path/to/temp/files/directory"
        }
      }
      ...
    }
  }
  ...
}
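In CI, the build parameters are often derived from the pipeline rather than hard-coded. A sketch, assuming TESULTS_TOKEN and CI_BUILD_NUMBER environment variables (both names are illustrative):

```javascript
// nightwatch.conf.js - sketch of populating build fields from CI.
// TESULTS_TOKEN and CI_BUILD_NUMBER are assumed environment variable names.
module.exports = {
  test_settings: {
    default: {
      globals: {
        tesults: {
          target: process.env.TESULTS_TOKEN,
          build_name: process.env.CI_BUILD_NUMBER || 'local', // fall back for local runs
          build_result: 'unknown',
          build_desc: 'Nightly regression run'
        }
      }
    }
  }
};
```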

Target

target (required)

Required in order to upload to Tesults; if this arg is not provided, the reporter does not attempt an upload, effectively disabling it. Get your target token from the configuration menu in the Tesults web interface.

Build

build_name (optional)

Use this to report a build version or name for reporting purposes.


build_result (optional)

Use this to report the build result; it must be one of pass, fail, or unknown.


build_desc (optional)

Use this to report a build description for reporting purposes.


build_reason (optional)

Use this to report a build failure reason.


files (optional)

Absolute path to the folder containing additional files to upload. The directory structure must match the layout described in the 'Other files' section above.

Result Interpretation

Result interpretation is not currently supported by this integration. If you are interested in support please contact help@tesults.com.

Consolidating parallel test runs

If you execute multiple test runs in parallel or serially for the same build or release and results are submitted to Tesults within each run, separately, you will find that multiple test runs are generated on Tesults. This is because the default behavior on Tesults is to treat each results submission as a separate test run. This behavior can be changed from the configuration menu. Click 'Results Consolidation By Build' from the Configure Project menu to enable and disable consolidation by target. Enabling consolidation will mean that multiple test runs submitted with the same build name will be consolidated into a single test run.

Dynamically created test cases

If you dynamically create test cases, such as test cases with variable values, we recommend that the test suite and test case names themselves be static. Provide the variable data information in the test case description or other custom fields but try to keep the test suite and test name static. If you change your test suite or test name on every test run you will not benefit from a range of features Tesults has to offer including test case failure assignment and historical results analysis. You need not make your tests any less dynamic, variable values can still be reported within test case details.

Proxy servers

Does your corporate/office network run behind a proxy server? Contact us and we will supply you with a custom API library for this case. Without this, results will fail to upload to Tesults.

Have questions or need help? Contact us