Test Manager user guide
Reporting with Insights
To enable reporting with Insights, follow these steps:
- Enable the Insights service on the same tenant as your Test Manager
instance.
You need an Automation Cloud Organization Administrator role to enable a service.
- From Test Manager, activate the Enable reporting with Insights
tenant-level setting.
You need a Test Manager Administrator tenant role to enable the integration with Insights.
For more information about activating the setting, visit Tenant level settings.
Once you enable the Insights integration in your Test Manager tenant, you will be able to access analytics for all your testing projects within that tenant. Insights retrieves data from Test Manager, based on a specific data model, and presents it through the Test Manager Execution Report predefined dashboard. This dashboard provides an overview of all your test executions within your tenant.
Insights uses test case logs from Test Manager to generate the customizable dashboards. For more details on the data model used by Insights to generate the dashboards, check the Test Manager data model section.
The structure of the Test Manager data model is based on the following concepts:
| Concept | Description |
|---|---|
| Explore | The starting point for exploration. Data is surfaced through Explores, which you can think of as general entities that group the fields within them. |
| View | A view represents a table of data, whether that table is native to your database or was created using Looker’s derived table functionality. Within each view are field definitions, each of which typically corresponds to a column in the underlying table or a calculation in Looker. |
| Dimension | A groupable field within a view in the Explore; it can be used to filter query results. |
| Measure | A parameter within a view in the Explore that declares a new measure (aggregation) and specifies a name for it. Examples of measure types: `integer`, `string`. |
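The relationship between dimensions (groupable, filterable fields) and measures (aggregations over groups of rows) can be illustrated with a plain-Python sketch. The record fields and the `passed_count` measure below are illustrative assumptions, not the actual Insights model:

```python
from collections import defaultdict

# Illustrative test case log records; each key plays the role of a dimension.
logs = [
    {"project_name": "Billing", "result": "Passed"},
    {"project_name": "Billing", "result": "Failed"},
    {"project_name": "Payroll", "result": "Passed"},
]

def passed_count(rows):
    """A measure: an aggregation computed over a group of rows."""
    return sum(1 for r in rows if r["result"] == "Passed")

# Group by the "project_name" dimension, then apply the measure per group.
groups = defaultdict(list)
for row in logs:
    groups[row["project_name"]].append(row)

summary = {name: passed_count(rows) for name, rows in groups.items()}
print(summary)  # {'Billing': 1, 'Payroll': 1}
```

In a dashboard, selecting a dimension defines the grouping and a measure defines the number computed for each group, which is exactly the group-then-aggregate pattern above.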
| Dimension | Type | Description |
|---|---|---|
| Assignee Email | string | The email address of the user to whom a test case was assigned. |
| Automation Project Name | string | The name of the automation linked to the test case. |
| Due Date | date | The date by which a manual test case was scheduled to be executed. |
| Execution Start | date | The date when the test execution started. |
| Executed By | string | The username or robot name that executed the test. |
| Execution End | date | The date when the test execution ended. |
| Execution Type | string | The type of the execution. |
| Host Machine Name | string | The name of the machine. |
| Project Name | string | The name of the Test Manager project. |
| Project Prefix | string | The prefix of the Test Manager project. |
| Reporting Date | date | The date when the test was executed, without a time or time zone component. |
| Result | string | The test case result: Passed, Failed, or None. |
| Robot Name | string | The name of the robot that executed the test. |
| Test Execution Name | string | The name of the test execution. |
| Measure | Type | Description |
|---|---|---|
| Total Count | integer | The total number of test case logs. |
| Passed Count | integer | The number of passed test case logs. |
| Failed Count (Technical) | integer | The number of test case logs marked as failed because an exception occurred during execution. |
| No Result Count (Technical) | integer | The number of test case logs with no result because an exception occurred during execution. |
| Duration in Seconds | integer | The total runtime in seconds. |
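As a rough sketch of how these measures aggregate over test case logs (the field names `result` and `duration_seconds` are assumptions for illustration; the real Insights model may name them differently):

```python
# Hypothetical test case log records, one per executed test case.
logs = [
    {"result": "Passed", "duration_seconds": 12},
    {"result": "Failed", "duration_seconds": 30},
    {"result": "None", "duration_seconds": 5},
    {"result": "Passed", "duration_seconds": 8},
]

total_count = len(logs)                                         # Total Count
passed_count = sum(1 for r in logs if r["result"] == "Passed")  # Passed Count
failed_count = sum(1 for r in logs if r["result"] == "Failed")  # Failed Count (Technical)
no_result_count = sum(1 for r in logs if r["result"] == "None") # No Result Count (Technical)
duration = sum(r["duration_seconds"] for r in logs)             # Duration in Seconds

print(total_count, passed_count, failed_count, no_result_count, duration)
# 4 2 1 1 55
```

Each count partitions the logs by the Result dimension, so Passed Count, Failed Count (Technical), and No Result Count (Technical) always sum to Total Count.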