/results

You may use the REST API to upload test results data to Tesults.

Tesults recommends using the Tesults API library for your language or test framework plugin instead.

A Tesults API library or test framework plugin provides greater control and is the only way to upload files (logs, screenshots etc.) along with your test results.

Use the /results API to upload results:

API

https://www.tesults.com/results
Method: 'POST'

Headers

"Content-Type": "application/json"

Ensure no other headers are supplied.

Body

Submit test results data in JSON format in the request POST body:

{
  "target": "token",
  "results": {<results>}
}

Both 'target' and 'results' values are required.

The "token" placeholder should be replaced with the target token string provided to you on creation of your project. If you did not note down the token when you created your project, you can regenerate a token for the target in the configuration menu.
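As a sketch, the request described above can be assembled with Python's standard library. The token and test case shown are placeholders; the request object is only constructed here, not sent:

```python
import json
import urllib.request

# Placeholder values: replace with your real target token and results.
payload = {
    "target": "token",
    "results": {"cases": [{"name": "Test A", "result": "pass"}]},
}

req = urllib.request.Request(
    "https://www.tesults.com/results",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},  # no other headers
    method="POST",
)

# Uncomment to actually upload:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))

print(req.get_method(), req.full_url)
```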

<results> must be replaced with your test results. Test results are formatted as an array of test case objects.


"results": { "cases": [
  {
    "name": <name>,
    "desc": <desc>,
    "suite": <suite>,
    "result": <result>,
    "reason": <reason>,
    "params": <params>
  },
  ...more test cases here
  ]
}

In each test case object, "name" and "result" are required; "desc", "suite", "reason" and "params" are optional.

<name> should be replaced with the name of the test case. (Required)

<desc> should be replaced with the description of the test case. (Optional)

<suite> should be replaced with the suite of the test case. (Optional)

<result> should be replaced with the result of the test case. (Required)

<reason> should be replaced with the reason for the failure of the test case. (Optional)

<params> should be replaced with the parameters of the test case if it is a parameterized test. (Optional)

To report a build pass or failure add a case with the suite value set to '[build]'. If a case has suite set to '[build]' it will be treated as build data rather than as a test case. Use the name or description to report a build number, version or revision.
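For example, a build entry is just another case object whose suite is set to '[build]'. The values below are illustrative:

```python
import json

# Illustrative build entry: because suite is "[build]", Tesults treats
# this as build data rather than as a test case.
build_case = {
    "name": "1.0.0-rc2",           # build number, version, or revision
    "result": "pass",              # "pass" for build success, "fail" for failure
    "suite": "[build]",
    "desc": "Nightly build from main",
}
print(json.dumps(build_case))
```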

Here is a complete example:

{
  "target": "0123456789",
  "results": { "cases": [
    {
        "name": "Test A",
        "desc": "Test A description",
        "suite": "Suite X",
        "result": "pass"
    },
    {
        "name": "Test B",
        "desc": "Test B description",
        "suite": "Suite X",
        "result": "fail",
        "reason": "Assert in line 203, example.cpp.",
        "_CustomField": "Custom field value"
    },
    {
        "name": "Test C",
        "desc": "Test C description",
        "suite": "Suite Y",
        "result": "pass",
        "params": {
          "param1": "value1",
          "param2": "value2"
        }
    }
    ]
  }
}

Within 'results' the 'cases' array is required. If you are only uploading one test result you must still send it as one element in the array.

"0123456789" should be replaced with the target token string provided to you on creation of your project. If you did not note down the token when you created your project, you can regenerate a token for the target in the configuration menu.

If the request is successful, the response will have a status code of 200 and include a JSON response as shown below.


status code 200 with JSON response:

{
  "data": {
     "code": 200,
     "message": "Success"
  }
}

If there is an error, there will be a status code other than 200 with a JSON response as shown below.


status code other than 200, for example 400, 401, 403, 429, 500 with JSON response:

{
  "error": {
     "code": 400,
     "message": "Missing required parameters - target and results are required"
  }
}

The error message will provide detail as to the specific reason for the failed request.

Fix the error displayed in the error message and try again to get a successful response.
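A minimal sketch of handling both response shapes follows. The sample bodies mirror the documented success and error responses; no real request is made:

```python
import json

def summarize(status_code, body):
    """Return a short summary of a Tesults /results response body."""
    parsed = json.loads(body)
    if status_code == 200:
        return "ok: " + parsed["data"]["message"]
    return "error %d: %s" % (parsed["error"]["code"], parsed["error"]["message"])

print(summarize(200, '{"data": {"code": 200, "message": "Success"}}'))
print(summarize(400, '{"error": {"code": 400, "message": '
                     '"Missing required parameters - target and results are required"}}'))
```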

Your team members will be able to view your uploaded test results immediately after a successful response. Any notification recipients you have configured will be notified about the availability of the new results by email.

Test case properties

This is a complete list of test case properties for reporting results. The required fields must have values; otherwise the upload will fail with an error message about missing fields.

name (required): Name of the test.
result (required): Result of the test. Must be one of: pass, fail, unknown. Set to 'pass' for a test that passed, 'fail' for a failure.
suite (optional): Suite the test belongs to. This is a way to group tests.
desc (optional): Description of the test.
reason (optional): Reason for the test failure. Leave this empty or do not include it if the test passed.
params (optional): Parameters of the test if it is a parameterized test.
files (optional): Files that belong to the test case, such as logs, screenshots, metrics and performance data.
steps (optional): A list of test steps that constitute the actions of a test case.
start (optional): Start time of the test case in milliseconds from Unix epoch.
end (optional): End time of the test case in milliseconds from Unix epoch.
duration (optional): Duration of the test case running time in milliseconds. There is no need to provide this if start and end are provided; it will be calculated automatically by Tesults.
rawResult (optional): Report a result to use with the result interpretation feature. This can give you finer control over how to report result status values beyond the three Tesults core result values of pass, fail and unknown.
_custom (optional): Report any number of custom fields. To report a custom field, add a field name starting with an underscore ( _ ) followed by the field name.
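Putting several of these properties together, a single case might look like this. The field values are illustrative, and duration is omitted because Tesults derives it from start and end:

```python
import json

case = {
    "name": "Login succeeds",
    "result": "pass",
    "suite": "Auth",
    "desc": "User can log in with valid credentials",
    "params": {"browser": "chrome"},
    "start": 1700000000000,    # milliseconds from Unix epoch
    "end": 1700000004200,      # duration (4200 ms) is calculated by Tesults
    "rawResult": "passed",     # for the result interpretation feature
    "_Environment": "staging"  # custom field: name starts with an underscore
}
print(json.dumps(case))
```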

Build properties

To report build information, add another case to the cases array with suite set to [build]. This is a complete list of build properties for reporting results. The required fields must have values; otherwise the upload will fail with an error message about missing fields.

name (required): Name of the build, revision, version, or change list.
result (required): Result of the build. Must be one of: pass, fail, unknown. Use 'pass' for build success and 'fail' for build failure.
suite (required): Must be set to the value '[build]', otherwise the case will be registered as a test case instead.
desc (optional): Description of the build or changes.
reason (optional): Reason for the build failure. Leave this empty or do not include it if the build succeeded.
params (optional): Build parameters or inputs if there are any.
files (optional): Build files and artifacts such as logs.
start (optional): Start time of the build in milliseconds from Unix epoch.
end (optional): End time of the build in milliseconds from Unix epoch.
duration (optional): Duration of the build time in milliseconds. There is no need to provide this if start and end are provided; it will be calculated automatically by Tesults.
_custom (optional): Report any number of custom fields. To report a custom field, add a field name starting with an underscore ( _ ) followed by the field name.
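A small local helper can catch missing required build fields before upload. This is a convenience sketch, not part of the Tesults API:

```python
def validate_build_case(case):
    """Check the required build-case fields described above."""
    errors = []
    if not case.get("name"):
        errors.append("name is required")
    if case.get("result") not in ("pass", "fail", "unknown"):
        errors.append("result must be pass, fail, or unknown")
    if case.get("suite") != "[build]":
        errors.append("suite must be '[build]' to be treated as build data")
    return errors

print(validate_build_case({"name": "build-42", "result": "pass", "suite": "[build]"}))
print(validate_build_case({"name": "build-42", "result": "ok"}))
```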

Consolidating parallel test runs

If you execute multiple test runs in parallel, or serially for the same build or release, and results are submitted to Tesults separately within each run, you will find that multiple test runs are generated on Tesults. This is because the default behavior on Tesults is to treat each results submission as a separate test run. This behavior can be changed from the configuration menu. Click 'Results Consolidation By Build' from the Configure Project menu to enable and disable consolidation by target. Enabling consolidation means that multiple test runs submitted with the same build name will be consolidated into a single test run.

Dynamically created test cases

If you dynamically create test cases, such as test cases with variable values, we recommend that the test suite and test case names themselves be static. Provide the variable data in the test case description or other custom fields, but try to keep the test suite and test name static. If you change your test suite or test name on every test run, you will not benefit from a range of features Tesults offers, including test case failure assignment and historical results analysis. You need not make your tests any less dynamic; variable values can still be reported within test case details.
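One way to keep names static while still reporting variable data is to move the variability into params or desc. The names and values here are illustrative:

```python
import json

def make_case(user_id):
    # Static name and suite; the variable value goes into params, so
    # failure assignment and history track the same test over time.
    return {
        "name": "Fetch user profile",   # not "Fetch user profile for user 42"
        "suite": "Profile",
        "result": "pass",
        "params": {"user_id": str(user_id)},
    }

cases = [make_case(uid) for uid in (42, 99)]
print(json.dumps({"cases": cases}))
```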