
Test Manager user guide

Last updated Jun 26, 2025

Reports

Understand the progress, detect problems at a glance, and review KPIs through reports.

Report Types

The following report types are available:

  • Default Test Manager dashboards provided by Insights.
  • Dashboard reports that provide a daily and weekly breakdown of test results, including the passed/failed ratio, the test execution list, and KPIs such as total coverage or open defects.

Reporting with Insights

Prerequisites

To enable reporting with Insights, follow these steps:

  1. Enable the Insights service on the same tenant as your Test Manager instance.

    You need an Automation Cloud Organization Administrator role to enable a service.

  2. From Test Manager, activate the Enable reporting with Insights tenant-level setting.

    You need a Test Manager Administrator tenant role to enable the integration with Insights.

    For more information about activating the setting, visit Tenant level settings.

Overview

Once you enable the Insights integration in your Test Manager tenant, you will be able to access analytics for all your testing projects within that tenant. Insights retrieves data from Test Manager, based on a specific data model, and presents it through the Test Manager Execution Report predefined dashboard. This dashboard provides an overview of all your test executions within your tenant.

Figure 1. Test Manager Execution Report dashboard

Note: When you enable reporting with Insights, data from all Test Manager projects is uploaded to Insights. Therefore, all users with access to Insights can generate reports on those projects, independently of their permissions in Test Manager.

Data model

Insights uses test case logs from Test Manager to generate the customizable dashboards. For more details on the data model used by Insights to generate the dashboards, check the Test Manager data model section.

Note: In the Test Manager data model, test case logs contain two types of results: technical and functional. These results are usually the same, unless a technical error occurs during execution. Technical errors can include infrastructure disruptions, automation errors, or other non-functional triggers. In any of these scenarios, the technical result labels the test as failed. Functional results are meant to reflect only the outcome of business verifications; therefore, they indicate no result in case of errors, because a reliable outcome is lacking.
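The relationship between the two result types can be sketched as follows. This is a minimal illustration only; the function, field names, and result labels are assumptions made for the example, not the actual Test Manager schema:

```python
# Illustrative sketch of technical vs. functional results.
# Names and values here are assumptions, not the Test Manager data model.

def classify_log(verification_passed, had_technical_error):
    """Derive the two result types for a single test case log.

    - The technical result marks the run as Failed on any error,
      functional or infrastructural.
    - The functional result reflects only business verifications and
      reports NoResult when a technical error makes the outcome unreliable.
    """
    if had_technical_error:
        return {"technical": "Failed", "functional": "NoResult"}
    outcome = "Passed" if verification_passed else "Failed"
    return {"technical": outcome, "functional": outcome}

# A run interrupted by an infrastructure disruption:
print(classify_log(verification_passed=True, had_technical_error=True))
# A run where the business verification itself failed:
print(classify_log(verification_passed=False, had_technical_error=False))
```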

Test Manager dashboard reports

You can examine the reports of each project individually. To view dashboard reports, open Test Manager and select a project from the list to view the Dashboard page.



The Dashboard page contains the following sections:

  1. Results: Bar chart displaying the results of your test runs. To switch between daily and weekly views, click the focus indicator at the upper-right side of the dashboard and choose an option from the dropdown. The weekly report goes back 14 weeks.

  2. Current Day/Week Indicator: Day/week indicator for the results shown in the bar chart. This section is not captured in the above figure.

  3. KPIs:

    • Open Defects: Available only for connected projects, for example, when defect reporting comes from a Jira project connected to Test Manager. The number of defects with the highest priority in Jira is shown in the Critical section. You can click the hyperlink to open the Jira filter. If you unlink the connection, Open Defects is no longer shown.

    • Total Coverage: The percentage of requirements, out of the total number of requirements, that have at least one test case.

    • Automation Rate: The percentage of automated test cases, manual test cases, and test cases without automation or manual steps defined, respectively.

  4. Latest Results: The latest test executions that contributed to the test results displayed in the bar chart.
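The Total Coverage and Automation Rate KPIs described above reduce to simple ratios. The following sketch illustrates the calculations; the data structures are assumptions for the example, not Test Manager's API:

```python
# Illustrative KPI calculations; input shapes are assumed for the example.

def total_coverage(requirements):
    """Percentage of requirements that have at least one test case."""
    if not requirements:
        return 0.0
    covered = sum(1 for r in requirements if r["test_case_count"] > 0)
    return 100.0 * covered / len(requirements)

def automation_rate(test_cases):
    """Breakdown of test cases by automation status, as percentages."""
    total = len(test_cases) or 1
    counts = {"automated": 0, "manual": 0, "none": 0}
    for tc in test_cases:
        counts[tc["kind"]] += 1
    return {kind: 100.0 * n / total for kind, n in counts.items()}

reqs = [{"test_case_count": 2}, {"test_case_count": 0}, {"test_case_count": 1}]
print(total_coverage(reqs))  # 2 of 3 requirements are covered
```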

Use Case Scenario

To take advantage of the dashboard reports, consider the following scenario:

Issue

For example, you have a high number of defects that need to be dealt with, alongside an equally high number of manual test cases.

Solution

You can glance over the dashboard to check the number of defects, then click the Critical section within the Open Defects report to open the filter in Jira. The Automation Rate report indicates the degree of automation of your test project. Using this information, you can start reducing the number of manual test cases in favor of automated ones, and allocate more testers to deal with the high number of defects.

Using external reporting tools

This page describes the object types and their corresponding attributes available for reporting purposes. To connect your reporting solution to those views, contact your system administrator. You can either use native SQL connections or your own reporting tools.
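As a minimal illustration of querying such views over a SQL connection, the following sketch uses an in-memory SQLite database as a stand-in; in practice, the actual database engine, connection details, and view names come from your system administrator:

```python
import sqlite3

# Stand-in for the reporting database: an in-memory SQLite instance
# with a table mimicking the Projects view described below.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE Projects (
           Id TEXT, Name TEXT, ProjectPrefix TEXT, isActive INTEGER
       )"""
)
conn.execute(
    "INSERT INTO Projects VALUES ('guid-1', 'Billing Tests', 'BIL', 1)"
)

# Typical reporting query: names and prefixes of all active projects.
rows = conn.execute(
    "SELECT Name, ProjectPrefix FROM Projects WHERE isActive = 1"
).fetchall()
print(rows)  # [('Billing Tests', 'BIL')]
conn.close()
```

The same query shape applies to any of the object types listed below; only the view name and the selected attributes change.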

The following sections describe the object types and the attributes that you can check when using an external reporting tool.

Projects

The following table lists the available attributes of a Test Manager project in external reporting tools.

Table 1. Project attributes
| Name | Data type | Description |
| --- | --- | --- |
| Id | String | Identifier for the project. |
| Name | String | Name of the project. |
| description | String (supports Markdown) | Description of the project. |
| ProjectPrefix | String | The specific prefix of the project. |
| isActive | Boolean | Indicates whether the project has been modified in the past six months. |
| isAuthorizationEnabled | Boolean | Indicates whether authorization is required for accessing the project. |
| isSapConfigured | Boolean | Indicates whether there is an active connection to an SAP instance. |
| createdBy | String | The email address of the Test Manager user who created the project. |
| created | DateTime | Timestamp indicating when the project was created. |
| updated | DateTime | Timestamp indicating the most recent update of the project. |

Requirements

The following table lists the available attributes of a requirement in external reporting tools.

Table 2. Requirement attributes
| Name | Data type | Description |
| --- | --- | --- |
| Id | String | Identifier for a requirement created directly in Test Manager. |
| Name | String | Name of the requirement. |
| Description | String (supports Markdown) | Description of the requirement. |
| ProjectId | String | Identifier (GUID) for the project to which the requirement belongs. |
| ExternalRequirementId | String | Identifier for a requirement synchronized from an external ALM tool in Test Manager. |
| ObjKey | String | Key of the requirement. |
| Created | DateTime | Timestamp indicating when the requirement was created. |
| Updated | DateTime | Timestamp of the most recent update to the requirement. |

Test cases

The following table lists the available attributes of a test case in external reporting tools.

Table 3. Test case attributes
| Name | Data type | Description |
| --- | --- | --- |
| Id | String | Identifier for a test case created directly in Test Manager. |
| ExternalTestCaseId | String | Identifier for a test case synchronized from an external ALM tool in Test Manager. |
| ProjectId | String | Identifier (GUID) for the project to which the test case belongs. |
| Name | String | Name of the test case. |
| Description | String (supports Markdown) | Description of the test case. |
| ObjKey | String | Key of the test case. |
| preCondition | String (supports Markdown) | Precondition of the test case, if one exists. |
| Requirements | Collection or Relationships | Object keys of the requirements assigned to this test case. |
| updated | DateTime | Timestamp of the most recent update to the test case. |
| created | DateTime | Timestamp indicating when the test case was created. |
| AutomationProjectName | String | Name of the Studio project which contains the automation assigned to this test case. |
| AutomationTestCaseName | String | Name of the test case in Studio which serves as the automation assigned to this Test Manager test case. |

Test sets

The following table lists the available attributes of a test set in external reporting tools.

Table 4. Test set attributes
| Name | Data type | Description |
| --- | --- | --- |
| Id | String | Identifier of a test set created directly in Test Manager. |
| ExternalTestSetId | String | Identifier of a test set synchronized from an external ALM tool in Test Manager. |
| ProjectId | String | Identifier (GUID) of the project to which the test set belongs. |
| Name | String | Name of the test set. |
| Description | String (supports Markdown) | Description of the test set. |
| ObjKey | String | Key of the test set. |
| Source | String | Source of the test set, which can be either Test Manager or Orchestrator. |
| NumberOfTestCases | Integer | Number of test cases assigned to the test set. |
| TestCases | List<String> | List of the test cases assigned to the test set. |
| Version | Double | Version of the test set. |
| Url | String | The URL of the test set inside the Test Manager project. |
| enableCoverage | Boolean | Indicates whether activity coverage is enabled for the test set, that is, whether coverage information should be collected. |
| enforceExecutionOrder | Boolean | Indicates whether execution order is enforced for the test set. |
| created | DateTime | Timestamp indicating when the test set was created. |
| updated | DateTime | Timestamp indicating the most recent update to the test set. |

Test executions

The following table lists the available attributes of a test execution in external reporting tools.

Table 5. Test execution attributes
| Name | Data type | Description |
| --- | --- | --- |
| Id | String | Identifier for the test execution. |
| Name | String | Name of the test execution. |
| ProjectId | String | Identifier (GUID) for the project to which the test execution belongs. |
| Description | String (supports Markdown) | Description of the test execution. |
| ExecutionType | String | Type of execution, which can be Manual, Automated, or Mixed. |
| ExecutionStart | DateTime | Timestamp indicating when the test execution started. |
| ExecutionFinished | DateTime | Timestamp indicating when the test execution finished. |
| duration | Integer | Duration of the test execution. |
| Source | String | Source of the test execution, which can be Test Manager or Orchestrator. |
| Status | String | Status of the test execution: Pending, Running, Cancelling, Passed, Finished, or Cancelled. |
| TestSetId | String | Identifier for the test set executed during this test execution. |
| testSetObjKey | String | Key of the test set executed during this test execution. |
| Total Duration (seconds) | Double | Total time of the test execution in seconds. |
| Result Summary (P/F/N) | String | Final result of the test execution: Passed, Failed, or None. |
| Reporting Date | DateTime | Timestamp indicating the reporting date. |
| Execution Order Enforced | Boolean | Indicates if execution order was enforced. |
| Error Rate | Double | Error rate of the test execution. |
| Coverage Enabled | Boolean | Indicates whether activity coverage was enabled for the executed test set. |
| Created | DateTime | Timestamp indicating when the test execution was created. |
| Results | List<String> | List containing the results of the test execution. |
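To illustrate how the execution attributes above might be aggregated in a report, the following sketch counts result summaries and averages durations. The dictionary keys are assumptions mirroring the attribute names, not an actual Test Manager export format:

```python
# Illustrative aggregation over test execution records; the record
# shape is an assumption for the example, not a real export format.

def summarize_executions(executions):
    """Aggregate executions into result counts and average duration (s)."""
    summary = {"Passed": 0, "Failed": 0, "None": 0}
    total_seconds = 0.0
    for ex in executions:
        summary[ex["result_summary"]] += 1
        total_seconds += ex["total_duration_seconds"]
    average = total_seconds / len(executions) if executions else 0.0
    return summary, average

runs = [
    {"result_summary": "Passed", "total_duration_seconds": 120.0},
    {"result_summary": "Failed", "total_duration_seconds": 80.0},
]
print(summarize_executions(runs))
```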

Test case logs

The following table lists the available attributes of a test case log in external reporting tools.

Table 6. Test case log attributes
| Name | Data type | Description |
| --- | --- | --- |
| automationId | String | Automation ID assigned to the test case responsible for generating this test case log after its execution. |
| automationProjectName | String | Name of the Studio project which contains the automation assigned to this test case. |
| automationTestCaseName | String | Name of the test case in Studio which serves as the automation assigned to this Test Manager test case. |
| Business Result | String | The business result of the test case responsible for generating this test case log after its execution. |
| DefectId | String | Identifier for the defect that resulted from the test case log. |
| DetailLink | String | Test Manager URL of the test case log. |
| Duration | Double | Duration of the test case execution. |
| Duration (seconds) | Double | Duration of the test case execution in seconds. |
| ExecutedBy | String | The name of the user or robot that executed the test. |
| ExecutionEnd | DateTime | Timestamp indicating when the execution of the test case ended. |
| executionOrder | String | Order of test cases enforced during execution. |
| ExecutionStart | DateTime | Timestamp indicating when the execution of the test case started. |
| executionType | String | Type of execution: Manual, Automated, or Mixed. |
| HasError | Boolean | Indicates whether the test case encountered an error. |
| hostMachineName | String | The name of the machine that executed the test case. |
