Test Manager user guide
Last updated Oct 16, 2025
Load groups define the specific test cases that will run in parallel at scale. Each group can represent a distinct workload (e.g., login flow vs. transaction flow).
In real-world systems, application traffic rarely comes from a single type of user or follows one uniform pattern. By designing scenarios with varied test case types, load profiles, and execution timings, you simulate realistic usage conditions—making it more likely to uncover bottlenecks and validate that the application can handle true production-scale demands.
By combining multiple load groups within a scenario, you can:
- Mix different automation types: Include API, Web UI, and Desktop test cases together to reflect the diversity of how users and systems interact with the application. For example, API calls may run in high volume while fewer desktop or web users perform more complex transactions.
- Simulate realistic traffic patterns: Different load groups let you model varied timings, ramp-ups, and peak loads. Some users may log in continuously, while others generate activity in bursts or spikes; for example, one group can run steadily while others ramp up and down at different times.
- Detect interaction effects: Running different workloads in parallel highlights how one process may affect the performance of another. For instance, a surge in API requests may slow down web transactions, or heavy login activity could impact back-end processing times.
- Stress the system holistically: Multiple groups together create a more authentic end-to-end workload, ensuring the system is validated against combined stress, not just isolated test cases. This improves confidence that performance in the test environment reflects production behavior.
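To make these combinations concrete, the sketch below models a scenario as plain data. This is only an illustration, not Test Manager's actual configuration format: the class names, field names, and values are hypothetical and exist solely to show how mixed automation types, virtual user counts, and ramp-up profiles can sit side by side in one scenario.

```python
from dataclasses import dataclass
from typing import List


# Hypothetical model of a performance scenario; not Test Manager's real schema.
@dataclass
class LoadGroup:
    name: str
    automation_type: str   # e.g. "API", "Web", or "Desktop"
    virtual_users: int     # peak number of parallel virtual users
    ramp_up_minutes: int   # time to reach peak load
    hold_minutes: int      # time to hold peak load before ramping down


@dataclass
class PerformanceScenario:
    name: str
    load_groups: List[LoadGroup]


# A mixed workload: high-volume API traffic, a steady web login flow,
# and a small number of desktop users running complex transactions.
scenario = PerformanceScenario(
    name="Checkout peak hour",
    load_groups=[
        LoadGroup("API orders", "API", virtual_users=500, ramp_up_minutes=5, hold_minutes=30),
        LoadGroup("Web logins", "Web", virtual_users=100, ramp_up_minutes=0, hold_minutes=35),
        LoadGroup("Desktop back office", "Desktop", virtual_users=10, ramp_up_minutes=10, hold_minutes=20),
    ],
)

# Print a quick summary of the combined workload.
for group in scenario.load_groups:
    print(f"{group.name}: {group.virtual_users} x {group.automation_type} "
          f"(ramp {group.ramp_up_minutes} min, hold {group.hold_minutes} min)")
```

In this sketch, the steady web login group starts at full load immediately, while the API and desktop groups ramp up at different rates, which is the kind of staggered pattern that tends to surface interaction effects between workloads.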
- Log in to Test Manager.
- Go to Performance Scenarios and open a scenario.
- Select Add Load Group or select the Test case field in an empty load group to open the selection dialog.
- To set the execution context, under Default folder, select the Orchestrator folder where your user and robot are assigned.
- Choose the test case you want to execute at scale.
- Select the robot type.
- Cloud serverless robots consume Platform Units and are recommended for Web and API testing.
- On-premises robots consume runtimes (250 per Virtual Users Bundle), which must be configured in machine templates in Orchestrator (see the sizing sketch after these steps).
- Select the package version of the automation you want to run. By default, the latest version is chosen automatically.
- Confirm your selection.
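For on-premises robots, it can help to estimate how many Virtual Users Bundles a scenario requires before you run it. The short sketch below does that arithmetic using the 250-runtimes-per-bundle figure mentioned above; the function name and the assumption that each virtual user consumes one runtime are illustrative simplifications, not product guarantees.

```python
import math
from typing import List

RUNTIMES_PER_BUNDLE = 250  # runtimes provided per Virtual Users Bundle, as noted above


def bundles_needed(virtual_users_per_group: List[int]) -> int:
    """Estimate how many Virtual Users Bundles a scenario needs.

    Assumes one runtime per virtual user, summed across all load groups,
    then rounds up to whole bundles.
    """
    total_runtimes = sum(virtual_users_per_group)
    return math.ceil(total_runtimes / RUNTIMES_PER_BUNDLE)


# Example: 500 API users + 100 web users + 10 desktop users = 610 runtimes,
# which rounds up to 3 bundles (3 x 250 = 750 runtimes available).
print(bundles_needed([500, 100, 10]))  # -> 3
```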