UiPath Documentation

Process Mining user guide

Last updated May 6, 2026

Loading data using DataUploader

With DataUploader you can upload data files of up to 5 TB each directly into a Process Mining process app. Loading data with DataUploader is more stable than using the Upload data option in Process Mining, even for smaller data files. If an upload fails, for example due to an unstable connection, DataUploader retries the upload up to 4 times.

Note:

Always make sure that the data is in the required format for the app template used for the process app. Refer to App Templates.
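As a sketch, a typical invocation might look like the following. The executable name (DataUploader.exe) and all values shown are placeholders, not values prescribed by this guide; the individual parameters are described in the table below.

```shell
# Hypothetical invocation; executable name and all values are placeholders.
DataUploader.exe --csv-dir "C:\P2P data" --sas-url "https://my.sas.url" --end-of-upload-api "https://my.end.of.upload.api" --delimiter 44 --recursive
```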

Parameters

The following table describes the parameters for DataUploader.

-c / --csv-dir <csv dir value>
  The directory containing the CSV files you want to upload. This can also be a relative path.
  Mandatory: Yes. Example: C:\P2P data

-s / --sas-url <sas url value>
  The Shared Access Signature (SAS) URL for the Azure Blob Storage container where the files need to be uploaded. Refer to Retrieving the credentials for the Azure blob storage.
  Mandatory: Yes.

-e / --end-of-upload-api <value>
  URL of the end-of-upload API that is called when the files have been successfully uploaded. Refer to Retrieving the credentials for the Azure blob storage.
  Mandatory: Yes.

-d / --delimiter <delimiter value>
  The ASCII code of the delimiter used in the input files. Must be an ASCII value between 0 and 127. Default is 9 (Tab).
  Mandatory: No. Example: 44 (Comma), 9 (Tab)

-p / --proxy <proxy value>
  The proxy URL.
  Mandatory: No.

-u / --proxy-username <proxy username>
  The username if authentication is needed to connect to the proxy server.
  Mandatory: No.

-P / --proxy-password <proxy password>
  The password if authentication is needed to connect to the proxy server.
  Mandatory: No.

-t / --tables <tables value>
  Table details in JSON format, required for incremental sync. The table prefix must not contain underscores.
  Mandatory: No. Example: [{"prefix": "pfx01", "name": "Table01", "load-type": "incremental"}, {"prefix": "pfx02", "name": "Table02", "load-type": "full"}]

-r / --recursive
  Look for all files in the given folder, including sub-folders. If files with duplicate names are present in different folders, the behavior can be unpredictable. By default, only files in the given folder are uploaded and sub-folders are ignored.
  Mandatory: No.

-f / --config-file <config file value>
  Allows you to use a configuration file instead of inline parameters. Inline parameters override the values in the config file.
  Mandatory: No.

-h
  Displays help for a DataUploader command.
  Mandatory: No.
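The value of -t / --tables is a JSON array. The following is a minimal sketch of building and sanity-checking that value; the no-underscores rule for prefixes is the one stated above, but the helper function itself is hypothetical and not part of DataUploader.

```python
import json

def build_tables_arg(tables):
    """Build the JSON value for -t / --tables.

    Each entry needs "prefix", "name", and "load-type" ("incremental"
    or "full"); prefixes must not contain underscores.
    Hypothetical helper, not part of DataUploader itself.
    """
    for t in tables:
        if "_" in t["prefix"]:
            raise ValueError(f"table prefix must not contain underscores: {t['prefix']}")
        if t["load-type"] not in ("incremental", "full"):
            raise ValueError(f"unknown load-type: {t['load-type']}")
    return json.dumps(tables)

arg = build_tables_arg([
    {"prefix": "pfx01", "name": "Table01", "load-type": "incremental"},
    {"prefix": "pfx02", "name": "Table02", "load-type": "full"},
])
print(arg)
```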

Configuration file

Instead of specifying all parameters inline, you can use a YAML configuration file and pass it with the --config-file parameter. Inline parameters override values in the config file.

The following shows an example configuration file:

csv-dir: "path\\to\\CSV\\folder"
sas-url: "https://my.sas.url"
end-of-upload-api: "https://my.end.of.upload.api"
tables:
  - prefix: pfx01
    name: Table01
    load-type: incremental
  - prefix: pfx02
    name: Table02
    load-type: full
delimiter: 9
full-reload: false
recursive: true
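The precedence rule can be illustrated with a small sketch. The merge function below is hypothetical, not DataUploader's actual implementation; it only mirrors the documented behavior that inline parameters win over config-file values.

```python
# Sketch of the documented precedence: inline parameters override
# config-file values; options given only in the file are kept.
# (Hypothetical helper, not DataUploader's actual implementation.)
def effective_options(config_file_opts, inline_opts):
    merged = dict(config_file_opts)
    # Only inline parameters that were actually supplied take precedence.
    merged.update({k: v for k, v in inline_opts.items() if v is not None})
    return merged

file_opts = {"csv-dir": "path\\to\\CSV\\folder", "delimiter": 9, "recursive": True}
inline_opts = {"delimiter": 44, "recursive": None}  # only -d given inline

print(effective_options(file_opts, inline_opts))
# delimiter comes from the command line; csv-dir and recursive from the file
```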

Downloading DataUploader

