Process Mining user guide
With DataUploader, you can upload data files of up to 5 TB each directly into a Process Mining process app. Loading data with DataUploader is more stable than using the Upload data option in Process Mining, even for smaller data files. If an upload fails, for example due to an unstable connection, DataUploader retries the upload up to 4 times.
Always make sure that the data is in the required format for the app template used for the process app. Refer to App Templates.
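The retry behavior described above can be sketched as a simple retry loop. This is an illustrative sketch only, not DataUploader's actual implementation; the `upload_with_retries` helper and its arguments are hypothetical:

```python
# Illustrative sketch (not DataUploader's real code): retry a flaky upload
# up to 4 additional times before giving up, mirroring the documented behavior.
MAX_RETRIES = 4

def upload_with_retries(upload, max_retries=MAX_RETRIES):
    """Call upload() and, on failure, retry up to max_retries more times."""
    last_error = None
    for attempt in range(1 + max_retries):  # initial attempt + retries
        try:
            return upload()
        except ConnectionError as exc:
            last_error = exc
    raise last_error  # all attempts failed

# Example: an upload that fails twice before succeeding on the third attempt.
attempts = {"n": 0}
def flaky_upload():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("unstable connection")
    return "uploaded"

print(upload_with_retries(flaky_upload))
```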
Parameters
The following table describes the parameters for DataUploader.
| Parameter | Format | Description | Mandatory | Example |
|---|---|---|---|---|
| -c / --csv-dir | <csv dir value> | The directory containing the CSV files you want to upload. This can also be a relative path. | Y | C:\P2P data |
| -s / --sas-url | <sas url value> | The Shared Access Signature (SAS) URL for the Azure Blob Storage container where the files need to be uploaded. Refer to Retrieving the credentials for the Azure blob storage. | Y | |
| -e / --end-of-upload-api | <value> | URL of the end-of-upload API that is called when the files have been successfully uploaded. Refer to Retrieving the credentials for the Azure blob storage. | Y | |
| -d / --delimiter | <delimiter value> | The ASCII code of the delimiter used in the input files. Must be an ASCII value between 0 and 127. Default is 9 (Tab). | N | 44 (Comma), 9 (Tab) |
| -p / --proxy | <proxy value> | The proxy URL. | N | |
| -u / --proxy-username | <proxy username> | The username, if authentication is needed to connect to the proxy server. | N | |
| -P / --proxy-password | <proxy password> | The password, if authentication is needed to connect to the proxy server. | N | |
| -t / --tables | <tables value> | Table details in JSON format, required for incremental sync. The table prefix must not contain underscores. | N | [{"prefix": "pfx01", "name": "Table01", "load-type": "incremental"}, {"prefix": "pfx02", "name": "Table02", "load-type": "full"}] |
| -r / --recursive | | Look for all files in the given folder, including sub-folders. If files with duplicate names are present in different folders, the behavior can be unpredictable. By default, only files in the given folder are uploaded and sub-folders are ignored. | N | |
| -f / --config-file | <config file value> | Allows you to use a configuration file instead of inline parameters. Inline parameters override the values in the config file. | N | |
| -h | | Displays help for a DataUploader command. | N | |
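The -t / --tables value is a JSON array with one object per table. The helper below is hypothetical (not part of DataUploader); it builds that JSON string and enforces the documented rules that the table prefix must not contain underscores and that load-type is either incremental or full:

```python
# Hypothetical helper (not part of DataUploader) that builds the JSON value
# for the -t / --tables parameter and validates the documented constraints.
import json

def build_tables_arg(tables):
    """tables: list of (prefix, name, load_type) tuples -> JSON string."""
    entries = []
    for prefix, name, load_type in tables:
        if "_" in prefix:
            raise ValueError(f"table prefix must not contain underscores: {prefix!r}")
        if load_type not in ("incremental", "full"):
            raise ValueError(f"unknown load-type: {load_type!r}")
        entries.append({"prefix": prefix, "name": name, "load-type": load_type})
    return json.dumps(entries)

# Reproduces the example value from the table above.
arg = build_tables_arg([("pfx01", "Table01", "incremental"),
                        ("pfx02", "Table02", "full")])
print(arg)
```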
Configuration file
Instead of specifying all parameters inline, you can use a YAML configuration file and pass it with the --config-file parameter. Inline parameters override values in the config file.
The following shows an example configuration file:
```yaml
csv-dir: "path\\to\\CSV\\folder"
sas-url: "https://my.sas.url"
end-of-upload-api: "https://my.end.of.upload.api"
tables:
  - prefix: pfx01
    name: Table01
    load-type: incremental
  - prefix: pfx02
    name: Table02
    load-type: full
delimiter: 9
full-reload: false
recursive: true
```
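The precedence rule stated above (inline parameters override config-file values) amounts to a simple merge where only explicitly supplied inline values win. The sketch below is illustrative, not DataUploader's actual code; the `effective_settings` helper is hypothetical:

```python
# Illustrative sketch of the documented precedence: values given inline on the
# command line override values read from the configuration file.
def effective_settings(config_file_values, inline_values):
    merged = dict(config_file_values)   # start from the config file
    for key, value in inline_values.items():
        if value is not None:           # only flags actually given inline win
            merged[key] = value
    return merged

config = {"csv-dir": "path\\to\\CSV\\folder", "delimiter": 9, "recursive": True}
inline = {"delimiter": 44, "csv-dir": None}   # only --delimiter given inline
print(effective_settings(config, inline))
# delimiter becomes 44; csv-dir and recursive keep the config-file values
```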
Downloading DataUploader
Use the following link to download DataUploader: https://download.uipath.com/ProcessMining/versions/2.0.5/DataUploader/process-mining-data-uploader.exe