
Reporting with Insights

Prerequisites

To enable reporting with Insights, follow these steps:

  1. Enable the Insights service on the same tenant as your Test Manager instance.

    You need an Automation Cloud Organization Administrator role to enable a service.

  2. From Test Manager, activate the Enable reporting with Insights tenant-level setting.

    You need a Test Manager Administrator tenant role to enable the integration with Insights.

    For more information about activating the setting, visit Tenant level settings.

Overview

Once you enable the Insights integration in your Test Manager tenant, you can access analytics for all testing projects within that tenant. Insights retrieves data from Test Manager based on a dedicated data model and presents it through the predefined Test Manager Execution Report dashboard, which provides an overview of all test executions within your tenant.

Figure 1. Test Manager Execution Report dashboard

Note: When you enable reporting with Insights, data from all Test Manager projects is uploaded to Insights. Therefore, all users with access to Insights can generate reports on those projects, regardless of their permissions in Test Manager.

Data model

Insights uses test case logs from Test Manager to generate the customizable dashboards. For more details on the data model used by Insights to generate the dashboards, check the Test Manager data model section.

Note: In the Test Manager data model, test case logs contain two types of results: technical and functional. These results are usually identical, unless a technical error appears during execution. Technical errors can include infrastructure disruptions, automation errors, or other non-functional triggers. In any of these scenarios, the technical result labels the test as failed. Functional results are meant to reflect only the outcome of business verifications; therefore, they indicate no result in case of errors, because no reliable outcome exists.
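
To make the distinction concrete, here is a minimal Python sketch of how a technical and a functional result can diverge for the same run. The field names and logic are illustrative assumptions based on the note above, not the actual Test Manager schema.

```python
from typing import Optional

def technical_result(outcome: Optional[str], had_technical_error: bool) -> str:
    # A technical error (infrastructure disruption, automation error, ...)
    # always marks the technical result as Failed.
    if had_technical_error:
        return "Failed"
    return outcome or "None"

def functional_result(outcome: Optional[str], had_technical_error: bool) -> str:
    # Functional results reflect only business verifications, so a technical
    # error yields no reliable outcome and is reported as None.
    if had_technical_error:
        return "None"
    return outcome or "None"

print(technical_result("Passed", had_technical_error=False))  # Passed
print(technical_result(None, had_technical_error=True))       # Failed
print(functional_result(None, had_technical_error=True))      # None
```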

Test Manager data model

Terms and concepts

The structure of the Test Manager data model is based on the following concepts:

Explore

The starting point for exploration. Data is surfaced through Explores, which you can think of as general entities that group the fields within them.

View

A view represents a table of data, whether that table is native to your database or was created using Looker’s derived table functionality. Within each view are field definitions, each of which typically corresponds to a column in the underlying table or a calculation in Looker.

Dimension

As part of a view within the Explore, a dimension is a groupable field that can be used to filter query results. It can be one of the following:

  • An attribute, which has a direct association to a column in an underlying table.
  • A fact or numerical value.
  • A derived value, computed based on the values of other fields in a single row.

Measure

As part of a view within the Explore, the measure parameter declares a new measure (aggregation) and specifies a name for it. Examples of measure types: integer, string.
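
The following is a minimal Python sketch of how these concepts relate to each other. The class and attribute names are assumptions for illustration only; they do not mirror Looker's internal API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Dimension:
    name: str  # a groupable field, usable as a query filter
    type: str  # for example "string", "date", or "integer"

@dataclass
class Measure:
    name: str  # a declared aggregation, such as a count
    type: str  # for example "integer"

@dataclass
class View:
    name: str  # corresponds to a table, native or derived
    dimensions: List[Dimension] = field(default_factory=list)
    measures: List[Measure] = field(default_factory=list)

@dataclass
class Explore:
    name: str  # the starting point for exploration
    views: List[View] = field(default_factory=list)

# Example: a fragment of a hypothetical test case logs view.
test_case_logs = View(
    name="test_case_logs",
    dimensions=[Dimension("Result", "string"), Dimension("Reporting Date", "date")],
    measures=[Measure("Total Count", "integer")],
)
explore = Explore(name="test_manager", views=[test_case_logs])
```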

Dimensions and measures

Note: The Requirements, Test cases, Test sets, and Test executions dimensions and measures are available only in Test Manager via Test Cloud.

Test case logs

The following tables describe the available dimensions and measures for test case logs.

Table 1. Test Case Logs dimensions

Dimension | Type | Description
Assignee Email | string | The email address of the user to whom a test case was assigned.
Automation Project Name | string | The name of the automation linked to the test case.
Due Date | date | The date when a manual test case was scheduled to be executed.
Execution Start | date | The date when the test execution started.
Executed By | string | The name of the user or robot that executed the test.
Execution End | date | The date when the test execution ended.
Execution Type | string | The type of the execution: Automated for automated tests, Manual for manual tests, or None for test cases that are not executed and are in the Pending state.
Host Machine Name | string | The name of the machine.
Project Name | string | The name of the Test Manager project.
Project Prefix | string | The prefix of the Test Manager project.
Reporting Date | date | The date when the test was executed, without time or time zone.
Result | string | The test case result: Passed, Failed, or None.
Robot Name | string | The name of the robot that executed the test.
Test Execution Name | string | The name of the test execution.
Table 2. Test Case Logs measures

Measure | Type | Description
Total Count | integer | The total number of test case logs.
Passed Count | integer | The number of passed test case logs.
Failed Count (Technical) | integer | The number of test case logs failed for technical reasons; a test is marked as failed when an exception occurs during execution.
No Result Count (Technical) | integer | The number of test case logs with no result when an exception occurs during execution.
Duration in Seconds | integer | The total runtime, in seconds.
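
As an illustration of how these measures aggregate over individual test case logs, here is a hedged Python sketch. The record keys (result, technical_result, duration_s) are assumptions, since the real data model is only exposed through the Insights dashboards.

```python
# Illustrative test case logs; each record stands in for one log entry.
logs = [
    {"result": "Passed", "technical_result": "Passed", "duration_s": 12},
    {"result": "None",   "technical_result": "Failed", "duration_s": 3},
    {"result": "Passed", "technical_result": "Passed", "duration_s": 8},
]

total_count = len(logs)
passed_count = sum(1 for log in logs if log["result"] == "Passed")
failed_count_technical = sum(1 for log in logs if log["technical_result"] == "Failed")
no_result_count = sum(1 for log in logs if log["result"] == "None")
duration_in_seconds = sum(log["duration_s"] for log in logs)

print(total_count, passed_count, failed_count_technical, duration_in_seconds)
# 3 2 1 23
```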
Requirements

The following tables describe the available dimensions and measures for requirements.

Table 3. Requirement dimensions

Dimension | Type | Description
Project Name | string | A combination of the project name and project prefix, to make it unique and more readable.
Name | string | A combination of the requirement key and its name, to make it unique and more readable.

Table 4. Requirement measures

Measure | Type | Description
Total test case count | integer | The number of test cases assigned to a requirement.
Automated test case count | integer | The number of automated test cases assigned to a requirement.
Manual test case count | integer | The number of manual test cases assigned to a requirement.
Passed test case count | integer | The number of test cases assigned to a requirement that passed in the latest test run.
Failed test case count | integer | The number of test cases assigned to a requirement that failed in the latest test run.
None test case count | integer | The number of test cases assigned to a requirement with no result in the latest test run.
Total requirements count | integer | The total number of requirements.
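
Several of these measures count each test case once, based on its latest test run. Here is a small Python sketch of that "latest run wins" selection; the field names are illustrative assumptions, not the actual schema.

```python
from datetime import date

# Illustrative runs for test cases assigned to one requirement.
runs = [
    {"test_case": "TC-1", "executed": date(2025, 9, 1), "result": "Failed"},
    {"test_case": "TC-1", "executed": date(2025, 9, 5), "result": "Passed"},
    {"test_case": "TC-2", "executed": date(2025, 9, 2), "result": "None"},
]

# Keep only the most recent run per test case.
latest = {}
for run in runs:
    key = run["test_case"]
    if key not in latest or run["executed"] > latest[key]["executed"]:
        latest[key] = run

passed = sum(1 for r in latest.values() if r["result"] == "Passed")
failed = sum(1 for r in latest.values() if r["result"] == "Failed")
none_count = sum(1 for r in latest.values() if r["result"] == "None")
print(passed, failed, none_count)  # 1 0 1 -> TC-1 counts only as Passed
```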
Test sets

The following tables describe the available dimensions and measures for test sets.

Note: For assigned test cases, the test set view contains information about static test assignments.

Table 5. Test set dimensions

Dimension | Type | Description
Project Name | string | A combination of the project name and project prefix, to make it unique and more readable.
Name | string | A combination of the test set key and its name, to make it unique and more readable.
Robot name | string | The name of the robot that executed the test set.
Source | string | The source of the test set: Orchestrator or Test Manager.

Table 6. Test set measures

Measure | Type | Description
Total test set count | integer | The number of test sets.
Automated test case count | integer | The number of automated test cases in the test set.
Manual test case count | integer | The number of manual test cases in the test set.
Test executions

The following tables describe the available dimensions and measures for test executions.

Table 7. Test execution dimensions

Dimension | Type | Description
Project Name | string | A combination of the project prefix and name, to make it unique and more readable.
Name | string | A combination of the execution ID and name, to make it unique and more readable.
Test set name | string | A combination of the test set key and name, to make it unique and more readable.
Execution type | string | The execution type: Manual, Automated, None, or Mixed.
Source | string | The source of the execution: TestManager, Orchestrator, or Studio.

Table 8. Test execution measures

Measure | Type
Total test execution count | integer
Manual execution type count | integer
Automated execution type count | integer
Mixed execution type count | integer
Duration in seconds of test execution | integer
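
To illustrate how the Mixed execution type can arise, here is a hypothetical Python sketch that derives an execution's type from the types of its test cases. The logic is an assumption based on the definitions above, not the actual Insights implementation.

```python
from typing import Iterable

def execution_type(case_types: Iterable[str]) -> str:
    # Ignore unexecuted (None) cases when classifying the execution.
    kinds = {t for t in case_types if t != "None"}
    if not kinds:
        return "None"       # nothing was executed
    if kinds == {"Manual"}:
        return "Manual"
    if kinds == {"Automated"}:
        return "Automated"
    return "Mixed"          # both manual and automated test cases

print(execution_type(["Manual", "Automated"]))  # Mixed
print(execution_type(["Automated"]))            # Automated
```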
Test cases

The following tables describe the available dimensions and measures for test cases.

Table 9. Test case dimensions

Dimension | Type | Description
Project name | string | A combination of the project name and project prefix, to make it unique and more readable.
Name | string | A combination of the test case key and its name, to make it unique and more readable.
Package name | string | N/A

Table 10. Test case measures

Measure | Type
Total number of test cases | integer
Number of passed test case logs | integer
Number of failed test case logs | integer
