AI Center User Guide
General AI Center troubleshooting and FAQs
Message: Failed to upload item(s), it may be due to a slow or lost internet connection
When uploading dataset files, the following error can occur:
Failed to upload item(s), it may be due to a slow or lost internet connection
Possible cause
This error message can appear because of certain browser configurations.
Solution
- Open the browser console and get the DNS name of the objectstore URL. It has the form objectstore.xxx.xx.
- Make sure the objectstore DNS name is resolvable, either by adding it to the hosts file (see the example after this list) or by contacting your network administrator.
- Once the DNS name resolves, if the certificate is not trusted, trust the certificate in your browser before uploading the item again.
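For example, assuming the objectstore hostname is objectstore.mycluster.example.com and it should resolve to 10.0.0.5 (both values are hypothetical placeholders, not values from this guide), the hosts entry and a quick resolution check on a Linux machine could look like this:
# Hypothetical values: replace the IP address and hostname with the ones from your own cluster.
echo "10.0.0.5 objectstore.mycluster.example.com" | sudo tee -a /etc/hosts
nslookup objectstore.mycluster.example.com
On Windows, the equivalent entry goes into C:\Windows\System32\drivers\etc\hosts.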
Issue: Error on Pipelines pages even though permissions are in place for running pipelines
When trying to view or run pipelines, an error can occur even though permissions to run pipelines are in place.
Solution
To run and view pipelines, Read permissions on the ML Packages are also required, in addition to the Pipelines permissions.
Issue: Service deployment can get stuck because of the DATABASECHANGELOGLOCK lock not being released by one service
On rare occasions, if you restart the machine twice in a row, service deployment can get stuck because the DATABASECHANGELOGLOCK lock is not released by one of the services. In this case, the AI Center pods restart continuously.
Solution
Run the following SQL command in the AI Center database to release the lock:
UPDATE DATABASECHANGELOGLOCK SET LOCKED = 0, LOCKGRANTED = null, LOCKEDBY = null
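As a sketch, assuming the AI Center database is hosted on SQL Server and is named AICenterDB (the server name, database name, and credentials below are assumptions, not values from this guide), the statement could be executed with sqlcmd:
# Hypothetical connection details: replace the server, database, user, and password with your own.
sqlcmd -S sql.mycompany.local -d AICenterDB -U aicenter_admin -P '<password>' \
  -Q "UPDATE DATABASECHANGELOGLOCK SET LOCKED = 0, LOCKGRANTED = null, LOCKEDBY = null"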
Issue: Import/export script fails
The import/export script fails with the following error message:
cookfile_new.txt: Permission denied
Solution
Remove the cookfile.txt and cookfile_new.txt files generated locally by the import/export script (or make sure the current user has write permissions to them), then run the script again.
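For example, assuming the script was run from the current working directory (the path is an assumption), the stale files can be removed like this:
# Remove the cookie files left behind by a previous run of the import/export script.
rm -f ./cookfile.txt ./cookfile_new.txt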
Message: ./export.sh: line 2: $'\r': command not found
When running the import or export scripts, the following error message can occur:
./export.sh: line 2: $'\r': command not found
This error message is displayed when importing or exporting ML Packages using scripts; it indicates that the script file contains Windows-style (CRLF) line endings.
Solution
Run the following command on the script file before running the import or export script:
dos2unix <filename>
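For example, to fix the export script named in the error message above and then run it:
# Convert the line endings in place, then execute the script.
dos2unix export.sh
./export.sh
If dos2unix is not installed, stripping the carriage returns with sed achieves the same result:
sed -i 's/\r$//' export.sh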
Issue: Signed URL for public datasets is failing
This issue can occur when running a UiPath Studio automation and uploading validation data for training using a public dataset.
Solution
If you encounter this issue, try one of the following solutions:
- Upload the package manually.
- Use a private dataset, so you can connect through a robot that can leverage it.
Issue: The update-mlskills-cm cronjob is missing
The update-mlskills-cm cronjob is missing in AI Center versions 2021.10.1 and 2021.10.2.
Solution
Create the missing cronjob by applying the YAML file below.
apiVersion: batch/v1beta1
kind: CronJob
metadata:
  name: update-mlskill-cm
  namespace: uipath
spec:
  concurrencyPolicy: Forbid
  failedJobsHistoryLimit: 1
  jobTemplate:
    spec:
      template:
        metadata:
          annotations:
            sidecar.istio.io/inject: "false"
        spec:
          containers:
          - args:
            - -XPOST
            - ai-deployer-svc.uipath.svc.cluster.local/ai-deployer/v1/system/mlskills:update-cm
            image: registry.uipath.com/aicenter/alpine-curl:7.78.0
            imagePullPolicy: IfNotPresent
            name: update-mlskill-cm
            securityContext:
              allowPrivilegeEscalation: false
              capabilities:
                drop:
                - NET_RAW
              privileged: false
              readOnlyRootFilesystem: true
              runAsNonRoot: true
          dnsPolicy: ClusterFirst
          imagePullSecrets:
          - name: regcred
          restartPolicy: OnFailure
          schedulerName: default-scheduler
          securityContext: {}
          terminationGracePeriodSeconds: 30
      ttlSecondsAfterFinished: 120
  schedule: "0 */2 * * *"
  startingDeadlineSeconds: 200
  successfulJobsHistoryLimit: 1
  suspend: false
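Assuming the YAML above is saved locally (the file name below is arbitrary), the cronjob can be created and then verified as follows:
# Create the cronjob and confirm it shows up in the uipath namespace.
kubectl apply -f update-mlskill-cm-cronjob.yaml
kubectl get cronjob update-mlskill-cm -n uipath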
Disabling streaming logs
Versions up to 2021.10.4
To disable streaming logs for versions up to 2021.10.4, set the LOGS_STREAMING_ENABLED environment variable to false. Alternatively, you can add a logsStreamingEnabled global variable with the value set to false using ArgoCD, under the aicenter app details. Make sure to sync ArgoCD after the change is done.
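As a sketch of the ArgoCD route, assuming the aicenter application is Helm-based and the global variable maps to the Helm value global.logsStreamingEnabled (this value path is an assumption based on the description above, not a documented name), the change could be made with the argocd CLI:
# Hypothetical value path: confirm the exact parameter name in the aicenter app details first.
argocd app set aicenter --helm-set global.logsStreamingEnabled=false
argocd app sync aicenter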
Versions from 2021.10.5
To disable streaming logs for versions starting with 2021.10.5, toggle the corresponding streaming logs values in ArgoCD, under the aicenter app details.