Reporting with Insights
Prerequisites
To enable reporting with Insights, follow these steps:
- Enable the Insights service on the same tenant as your Test Manager instance. You need an Automation Cloud Organization Administrator role to enable a service.
- From Test Manager, activate the Enable reporting with Insights tenant-level setting. You need a Test Manager Administrator tenant role to enable the integration with Insights. For more information about activating the setting, visit Tenant level settings.
Overview
Once you enable the Insights integration in your Test Manager tenant, you can access analytics for all your testing projects within that tenant. Insights retrieves data from Test Manager, based on a specific data model, and presents it through the predefined Test Manager Execution Report dashboard. This dashboard provides an overview of all your test executions within your tenant.
Data model
Insights uses test case logs from Test Manager to generate the customizable dashboards. For more details on the data model used by Insights to generate the dashboards, check the Test Manager data model section.
Test Manager data model
Terms and concepts
The structure of the Test Manager data model is based on the following concepts:
| Concept | Description |
|---|---|
| Explore | The starting point for exploration. Data is surfaced through Explores, which you can think of as general entities that group the fields within them. |
| View | A view represents a table of data, whether that table is native to your database or was created using Looker’s derived table functionality. Within each view are field definitions, each of which typically corresponds to a column in the underlying table or a calculation in Looker. |
| Dimension | As part of a view within the Explore, a dimension is a groupable field that can be used to filter query results. |
| Measure | As part of a view within the Explore, the measure parameter declares a new measure (an aggregation) and specifies a name for that measure. Examples of measure types include integer and string. |
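To see how these concepts fit together, here is a minimal, illustrative LookML sketch declaring a view with one dimension and one measure, surfaced through an Explore. Every name in it (Explore, view, table, and column) is hypothetical; the actual Test Manager model is predefined by Insights rather than authored by you.

```lookml
# Hypothetical names for illustration only; the real model ships with Insights.
explore: test_case_logs {}

view: test_case_logs {
  sql_table_name: insights.test_case_logs ;;  # assumed source table

  # Dimension: a groupable field, typically backed by a column.
  dimension: result {
    type: string
    sql: ${TABLE}.result ;;
  }

  # Measure: a named aggregation over the rows of the view.
  measure: total_count {
    type: count
  }
}
```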
Dimensions and measures
Test case logs
The following tables describe the available dimensions and measures for test case logs.
| Dimension | Type | Description |
|---|---|---|
| Assignee Email | string | The email address of the user to whom a test case was assigned. |
| Automation Project Name | string | The name of the automation linked to the test case. |
| Due Date | date | The date when a manual test case was scheduled to be executed. |
| Execution Start | date | The date when the test execution started. |
| Executed By | string | The name of the user or robot that executed the test. |
| Execution End | date | The date when the test execution ended. |
| Execution Type | string | The type of the execution. |
| Host Machine Name | string | The name of the machine. |
| Project Name | string | The name of the Test Manager project. |
| Project Prefix | string | The prefix of the Test Manager project. |
| Reporting Date | date | The date when the test was executed, without time or time zone. |
| Result | string | The test case result: Passed, Failed, or None. |
| Robot Name | string | The name of the robot that executed the test. |
| Measure | Type | Description |
|---|---|---|
| Total Count | integer | The total number of test case logs. |
| Passed Count | integer | The number of passed test case logs. |
| Failed Count (Technical) | integer | The number of test case logs whose technical result is failed. Technical results indicate failed when an exception occurs during execution. |
| No Result Count (Technical) | integer | The number of test case logs with no technical result. Technical results indicate no result when an exception occurs during execution. |
| Duration in Seconds | integer | The total runtime, in seconds. |
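In LookML terms, Passed Count and Failed Count behave like filtered count measures, and Duration in Seconds like a sum measure. The sketch below continues the hypothetical example above; the filter value mirrors the Result dimension, while the duration column name is an assumption.

```lookml
view: test_case_logs {
  # ... dimensions as in the earlier sketch ...

  # Filtered count: only logs whose Result is "Passed" are counted.
  measure: passed_count {
    type: count
    filters: [result: "Passed"]
  }

  # Sum measure: adds up per-log runtimes (assumed column name).
  measure: duration_in_seconds {
    type: sum
    sql: ${TABLE}.duration_seconds ;;
  }
}
```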
Requirements
The following tables describe the available dimensions and measures for requirements.
| Dimension | Type | Description |
|---|---|---|
| Project Name | string | A combination of the project name and project prefix, intended to make it unique and improve readability. |
| Name | string | A combination of the requirement key and its name, intended to make it unique and improve readability. |

| Measure | Type | Description |
|---|---|---|
| Total test case count | integer | The number of test cases assigned to a requirement. |
| Automated test case count | integer | The number of automated test cases assigned to a requirement. |
| Manual test case count | integer | The number of manual test cases assigned to a requirement. |
| Passed test case count | integer | The number of test cases assigned to a requirement that passed in the latest test run. |
| Failed test case count | integer | The number of test cases assigned to a requirement that failed in the latest test run. |
| None test case count | integer | The number of test cases assigned to a requirement with no result in the latest test run. |
| Total requirements count | integer | The total number of requirements. |
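Measures such as these span two entities: they aggregate test cases per requirement. In Looker, that kind of relationship is typically modeled as a join within the Explore. The sketch below is again illustrative only, with assumed view and key names.

```lookml
# Assumed join: each requirement has many assigned test cases.
explore: requirements {
  join: test_cases {
    sql_on: ${test_cases.requirement_id} = ${requirements.id} ;;
    relationship: one_to_many
  }
}
```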
Test sets
The following tables describe the available dimensions and measures for test sets.

| Dimension | Type | Description |
|---|---|---|
| Project Name | string | A combination of the project name and project prefix, intended to make it unique and improve readability. |
| Name | string | A combination of the test set key and its name, intended to make it unique and improve readability. |
| Robot name | string | The name of the robot that executed the test set. |
| Source | string | The source of the test set: Orchestrator or Test Manager. |

| Measure | Type | Description |
|---|---|---|
| Total test set count | integer | The number of test sets. |
| Automated test case count | integer | The number of automated test cases in the test set. |
| Manual test case count | integer | The number of manual test cases in the test set. |
Test execution
The following tables describe the available dimensions and measures for test executions.
| Dimension | Type | Description |
|---|---|---|
| Project Name | string | A combination of the project prefix and name, intended to make it unique and improve readability. |
| Name | string | A combination of the execution ID and name, intended to make it unique and improve readability. |
| Test set name | string | A combination of the test set key and name, intended to make it unique and improve readability. |
| Execution type | string | The execution type: Manual, Automated, None, or Mixed. |
| Source | string | The source of the execution: TestManager, Orchestrator, or Studio. |

| Measure | Type |
|---|---|
| Total test execution count | integer |
| Manual execution type count | integer |
| Automated execution type count | integer |
| Mixed execution type count | integer |
| Duration in seconds of test execution | integer |
Test case
The following tables describe the available dimensions and measures for test cases.
| Dimension | Type | Description |
|---|---|---|
| Project name | string | A combination of the project name and project prefix, intended to make it unique and improve readability. |
| Name | string | A combination of the test case key and its name, intended to make it unique and improve readability. |
| Package name | string | N/A |

| Measure | Type |
|---|---|
| Total number of test cases | integer |
| Number of passed test case logs | integer |
| Number of failed test case logs | integer |