CodeceptJS test reporting

Powering up your CodeceptJS tests with Tesults reporting is easy. This page explains it all. If you have any questions please reach out to the Tesults team. If you are not using CodeceptJS, view the Node.js and JavaScript docs for information about integrating with a lower-level library.

Installation

npm install codeceptjs-tesults --save

Configuration

In your codecept.conf.js file add codeceptjs-tesults in the plugins section:

exports.config = {
  // ...
  plugins: {
    'tesults': {
      'require': 'codeceptjs-tesults',
      'enabled': true,
      'target': 'token'
    }
  },
  // ...
}

target - Required

You must provide your target token to push results to Tesults. Replace 'token' in the code above with the value of your target token. If this property is not provided the plugin does not attempt an upload and is disabled. You receive a target token when creating a project or a new target and can regenerate one at any time from the configuration menu. Find out more about targets
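
To avoid committing the token to source control, one option is to read it from an environment variable instead. This is a minimal sketch; the variable name TESULTS_TARGET is an assumption and can be anything you choose:

exports.config = {
  // ...
  plugins: {
    'tesults': {
      'require': 'codeceptjs-tesults',
      'enabled': true,
      // Assumed environment variable; set it in your shell or CI system
      'target': process.env.TESULTS_TARGET
    }
  },
  // ...
}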

To support multiple reporting targets, you can use multiple configuration files.
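
For example, you could keep a second configuration file with a different target token and select it at run time with the --config (-c) option. The file name below is just an illustration:

// codecept.staging.conf.js - hypothetical second configuration, identical apart from the target token
exports.config = {
  // ...
  plugins: {
    'tesults': {
      'require': 'codeceptjs-tesults',
      'enabled': true,
      'target': 'staging-target-token'
    }
  },
  // ...
}

npx codeceptjs run -c codecept.staging.conf.js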

Basic configuration complete

At this point, if you run tests, results data will be uploaded to Tesults:

npx codeceptjs run

Enhanced reporting

To associate a file with a test case and upload it for reporting (requires codeceptjs-tesults version 1.2.0+), use I.say with a string consisting of the prefix "tesults:file:" followed by the full path to the file:

Feature('Feature A');

Scenario('Scenario 1', ({ I }) => {
  I.amOnPage('https://www.google.com');
  I.say("tesults:file:/full/path/to/file/log.txt");
  I.say("tesults:file:/full/path/to/file/screenshot.png");
  I.see('does not exist');
});
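
If you prefer not to hard-code absolute paths, one option is to build them at run time with Node's path module. This is a sketch only; the artifacts directory and file names are assumptions:

const path = require('path');

Feature('Feature A');

Scenario('Scenario 2', ({ I }) => {
  // Assumed location: an "artifacts" directory next to this test file
  const artifactsDir = path.join(__dirname, 'artifacts');
  I.say('tesults:file:' + path.join(artifactsDir, 'log.txt'));
  I.say('tesults:file:' + path.join(artifactsDir, 'screenshot.png'));
});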

Caution: If uploading files, the time taken to upload is entirely dependent on your network speed. Typical office upload speeds of 100 - 1000 Mbps should allow even hundreds of files to upload in just a few seconds, but if you have slower access it may take hours. We recommend uploading a reasonable number of files for each test case. The upload method blocks at the end of a test run while uploading test results and files. When starting out, test without files first to ensure everything is set up correctly.

Build

Optionally report build information by passing build data as additional properties in the plugin configuration:

exports.config = {
  // ...
  plugins: {
    'tesults': {
      'require': 'codeceptjs-tesults',
      'enabled': true,
      'target': 'token',
      'build-name': '1.0.0',
      'build-description': 'build description',
      'build-result': 'pass',
      'build-reason': 'failure reason',
      'build-files': ['/full/path/to/file']
    }
  },
  // ...
}

Build properties

build-name - Optional

Use this to report a build version or name for reporting purposes.


build-result - Optional

Use this to report the build result; it must be one of [pass, fail, unknown].


build-description - Optional

Use this to report a build description for reporting purposes.


build-reason - Optional

Use this to report a build failure reason.


build-files - Optional

Use this to upload build files as an array of paths to specific files; see the example above.


Result Interpretation

Result interpretation is supported for CodeceptJS. Install codeceptjs-tesults version 1.1.0 or newer. If you use result interpretation we recommend you add these minimum mapping values:

passed -> pass
success -> pass
failed -> fail

Consolidating parallel test runs

If you execute multiple test runs in parallel or serially for the same build or release, and results are submitted to Tesults separately from each run, you will find that multiple test runs are generated on Tesults. This is because the default behavior on Tesults is to treat each results submission as a separate test run. This behavior can be changed from the configuration menu. Click 'Results Consolidation By Build' from the Configure Project menu to enable and disable consolidation by target. Enabling consolidation means that multiple test runs submitted with the same build name are consolidated into a single test run.
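
With consolidation enabled, each run only needs to report the same build name. One way to do this is to pass a shared build identifier from your CI system through an environment variable; the variable name BUILD_NUMBER below is an assumption and depends on your CI provider:

exports.config = {
  // ...
  plugins: {
    'tesults': {
      'require': 'codeceptjs-tesults',
      'enabled': true,
      'target': 'token',
      // Assumed CI-provided variable shared across all parallel runs
      'build-name': process.env.BUILD_NUMBER
    }
  },
  // ...
}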

Dynamically created test cases

If you dynamically create test cases, such as test cases with variable values, we recommend that the test suite and test case names themselves be static. Provide the variable data information in the test case description or other custom fields but try to keep the test suite and test name static. If you change your test suite or test name on every test run you will not benefit from a range of features Tesults has to offer including test case failure assignment and historical results analysis. You need not make your tests any less dynamic, variable values can still be reported within test case details.
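
As an illustration only, a scenario can keep a static name while logging its variable data with I.say so the values still appear in the test case details; the data below is hypothetical:

Feature('Checkout');

// Hypothetical variable data; in practice this might come from a fixture or generator
const amounts = [10, 250, 9999];

Scenario('Checkout handles a range of amounts', ({ I }) => {
  for (const amount of amounts) {
    I.say('Checking amount: ' + amount); // recorded in the test case details
    I.amOnPage('/checkout?amount=' + amount);
  }
});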

Proxy servers

Does your corporate/office network run behind a proxy server? Contact us and we will supply you with a custom API library for this case. Without it, results will fail to upload to Tesults.

Have questions or need help? Contact us