
Test Manager user guide

Last updated Jul 2, 2025

Reporting with Insights

Prerequisites

To enable reporting with Insights, follow these steps:

  1. Enable the Insights service on the same tenant as your Test Manager instance.

    You need an Automation Cloud Organization Administrator role to enable a service.

  2. From Test Manager, activate the Enable reporting with Insights tenant-level setting.

    You need a Test Manager Administrator tenant role to enable the integration with Insights.

    For more information about activating the setting, visit Tenant level settings.

Overview

Once you enable the Insights integration in your Test Manager tenant, you can access analytics for all testing projects within that tenant. Insights retrieves data from Test Manager based on a specific data model and presents it through the predefined Test Manager Execution Report dashboard. This dashboard provides an overview of all test executions within your tenant.

Figure 1. Test Manager Execution Report dashboard

Note: When you enable reporting with Insights, data from all Test Manager projects is uploaded to Insights. As a result, all users with access to Insights can generate reports on those projects, regardless of their permissions in Test Manager.

Data model

Insights uses test case logs from Test Manager to generate the customizable dashboards. For more details on the data model used by Insights to generate the dashboards, check the Test Manager data model section.

Note: In the Test Manager data model, test case logs contain two types of results: technical and functional. These results are usually the same, unless a technical error occurs during execution. Technical errors can include infrastructure disruptions, automation errors, or other non-functional triggers. In any of these scenarios, the technical result labels the test as failed. Functional results are meant to reflect only the outcome of business verifications; therefore, they indicate no result when an error occurs, because no reliable outcome is available.
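The relationship between the two result types can be sketched as follows. This is an illustrative example only; the field names (error_type, verification_outcome) and the error categories are hypothetical stand-ins, not the actual Test Manager schema.

```python
# Hypothetical sketch of how technical and functional results relate.
# Field names and TECHNICAL_ERRORS values are illustrative, not the
# actual Test Manager data model.

TECHNICAL_ERRORS = {"infrastructure_disruption", "automation_error"}

def results_for(log: dict) -> tuple[str, str]:
    """Return (technical_result, functional_result) for a test case log."""
    if log.get("error_type") in TECHNICAL_ERRORS:
        # A technical error marks the technical result as Failed, but the
        # functional outcome is unreliable, so it is reported as None.
        return "Failed", "None"
    # With no technical error, both results reflect the business verification.
    outcome = log["verification_outcome"]  # "Passed" or "Failed"
    return outcome, outcome
```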

Test Manager data model

Terms and concepts

The structure of the Test Manager data model is based on the following concepts:

  • Explore: The starting point for exploration. Data is surfaced through Explores, which you can think of as general entities that group the fields within them.

  • View: A view represents a table of data, whether that table is native to your database or was created using Looker's derived table functionality. Each view contains field definitions, each of which typically corresponds to a column in the underlying table or a calculation in Looker.

  • Dimension: As part of a view within the Explore, a dimension is a groupable field that can be used to filter query results. It can be one of the following:
      • An attribute, which has a direct association to a column in an underlying table.
      • A fact or numerical value.
      • A derived value computed from the values of other fields in a single row.

  • Measure: As part of a view within the Explore, the measure parameter declares a new measure (aggregation) and specifies a name for that measure. Examples of measure types: integer, string.
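The distinction between dimensions and measures can be illustrated in plain Python (this is not Looker code, and the sample records are made up): a dimension is a field you group by, while a measure is an aggregation computed over each group.

```python
# Illustrative sketch: grouping by a dimension and computing measures.
# The sample log records below are invented for demonstration.
from collections import defaultdict

logs = [
    {"project": "Billing", "result": "Passed", "duration": 12},
    {"project": "Billing", "result": "Failed", "duration": 30},
    {"project": "Orders",  "result": "Passed", "duration": 8},
]

# Group rows by the "project" dimension.
groups = defaultdict(list)
for log in logs:
    groups[log["project"]].append(log)

# Compute measures (aggregations) per group.
for project, rows in groups.items():
    total = len(rows)                                    # count measure
    passed = sum(r["result"] == "Passed" for r in rows)  # filtered count
    seconds = sum(r["duration"] for r in rows)           # sum measure
    print(project, total, passed, seconds)
```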

Test Case Logs

Table 1. Test Case Logs dimensions

  • Assignee Email (string): The email address of the user to whom a test case was assigned.
  • Automation Project Name (string): The name of the automation linked to the test case.
  • Due Date (date): The date by which a manual test case was scheduled to be executed.
  • Execution Start (date): The date when the test execution started.
  • Executed By (string): The name of the user or robot that executed the test.
  • Execution End (date): The date when the test execution ended.
  • Execution Type (string): The type of the execution:
      • Automated for automated tests.
      • Manual for manual tests.
      • None for test cases that have not been executed and are in the Pending state.
  • Host Machine Name (string): The name of the host machine.
  • Project Name (string): The name of the Test Manager project.
  • Project Prefix (string): The prefix of the Test Manager project.
  • Reporting Date (date): The date when the test was executed, without a time or time zone component.
  • Result (string): The test case result: Passed, Failed, or None.
  • Robot Name (string): The name of the robot that executed the test.
  • Test Execution Name (string): The name of the test execution.
Table 2. Test Case Logs measures

  • Total Count (integer): The total number of test case logs.
  • Passed Count (integer): The number of passed test case logs.
  • Failed Count (Technical) (integer): The number of test case logs with a failed technical result; technical results mark a test as failed when an exception occurs during execution.
  • No Result Count (Technical) (integer): The number of test case logs with no result, which occurs when an exception during execution leaves no reliable outcome.
  • Duration in Seconds (integer): The total runtime in seconds.
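The measures in Table 2 can be sketched as simple aggregations over a list of test case logs. The field names below (technical_result, duration_seconds) are illustrative stand-ins for the actual data model columns, and the sample records are invented.

```python
# Hedged sketch: computing the Table 2 measures from test case logs.
# Field names are assumptions, not the actual Test Manager schema.

logs = [
    {"technical_result": "Passed", "duration_seconds": 14},
    {"technical_result": "Failed", "duration_seconds": 55},
    {"technical_result": "None",   "duration_seconds": 0},
]

total_count = len(logs)
passed_count = sum(l["technical_result"] == "Passed" for l in logs)
failed_count_technical = sum(l["technical_result"] == "Failed" for l in logs)
no_result_count_technical = sum(l["technical_result"] == "None" for l in logs)
duration_in_seconds = sum(l["duration_seconds"] for l in logs)
```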