
Test Manager user guide

Last updated Apr 1, 2026

Software and system requirements

Before you create and execute performance testing scenarios in Test Manager, make sure your organization and tenant meet the required infrastructure, robot, and test automation prerequisites. The configuration depends on the type of robot that executes the load: serverless cloud robots or unattended, on-premises robots.

Studio requirements

Performance testing reuses your existing automated functional test cases. Make sure that your Studio tests meet the following conditions:
  • Use UiPath Studio version 2025.10.3 or higher.
  • Use the Test Automation project type.
  • Use the latest activity package versions:
    • System.Activities version 25.4.2 or higher
    • Testing.Activities version 25.10.0 or higher
    • UIAutomation.Activities version 25.10.2 or higher
    • WebAPI.Activities version 2.1.0 or higher
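For reference, the version floor above translates into the dependency pins of a Test Automation project's project.json. The sketch below is illustrative only: the project name is made up, the field set is abbreviated, and you would normally let Studio's Manage Packages dialog write these entries rather than editing the file by hand.

```json
{
  "name": "PerformanceTests",
  "dependencies": {
    "UiPath.System.Activities": "[25.4.2]",
    "UiPath.Testing.Activities": "[25.10.0]",
    "UiPath.UIAutomation.Activities": "[25.10.2]",
    "UiPath.WebAPI.Activities": "[2.1.0]"
  }
}
```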

Robot requirements

Configure the robot types you need.
  • Serverless cloud robots consume Platform Units and are best suited for Web and API testing. They scale automatically in Test Cloud, but do not support desktop automations.
  • On-premises robots consume unattended runtimes (each Virtual Users Bundle includes 250 runtimes) and are required for desktop performance tests. These runtimes must be assigned within machine templates in Orchestrator. Refer to the system requirements for on-premises robots.
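For on-premises sizing, the bundle math above reduces to a ceiling division. A quick sketch (the 250-runtimes-per-bundle figure comes from the text above; the target user counts are made-up examples):

```python
import math

RUNTIMES_PER_BUNDLE = 250  # runtimes included per Virtual Users Bundle

def bundles_needed(virtual_users: int) -> int:
    """Smallest number of Virtual Users Bundles covering the target load."""
    return math.ceil(virtual_users / RUNTIMES_PER_BUNDLE)

print(bundles_needed(250))  # → 1
print(bundles_needed(600))  # → 3 (600 / 250 = 2.4, rounded up)
```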

Data sources (Data Fabric entities)

Performance scenarios often require dynamic test data at scale. Test Manager can connect to Data Fabric entities, which serve as the source of parameterized test data during scenario execution. This ensures realistic and varied input for concurrent virtual users.
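As an illustration of why parameterized data matters at scale, a script like the following could pre-generate one distinct input row per virtual user, for example for bulk import into a Data Fabric entity. Everything here is an assumption for the sketch: the field names (Username, OrderId, Comment), the row count, and the CSV import path are hypothetical and not part of the UiPath product surface.

```python
import csv
import random
import string

def generate_virtual_user_rows(count, seed=42):
    """Generate one input row per virtual user so that concurrent
    users do not collide on the same test data (fields are hypothetical)."""
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    return [
        {
            "Username": f"perf_user_{i:04d}",        # unique per virtual user
            "OrderId": rng.randint(100000, 999999),  # varied numeric input
            "Comment": "".join(rng.choices(string.ascii_letters, k=12)),
        }
        for i in range(count)
    ]

def write_csv(rows, path):
    """Write the rows to a CSV file suitable for bulk upload."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

rows = generate_virtual_user_rows(250)  # e.g. one row per runtime in a bundle
write_csv(rows, "virtual_users.csv")
```

The key property is that every virtual user receives distinct data, which is what keeps the simulated load realistic instead of 250 users replaying identical inputs.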
