Communications Mining User Guide
Last updated Nov 19, 2024

Adding new labels to existing taxonomies

User permissions required: 'View Sources' AND 'Review and annotate'.

If you have a pre-existing mature taxonomy with many reviewed messages, adding a new label requires additional training to bring it in line with the rest of the labels in the taxonomy.

When adding a new label to a well-trained taxonomy, make sure to apply it to any previously reviewed messages where it is relevant.

If you do not, the model is effectively taught that the new label should not apply to those messages and will struggle to predict the new label confidently.

The more reviewed examples there are in the dataset, the more training adding a new label will require (unless it covers an entirely new concept that does not appear in older data but is common in more recent data).
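To see why back-applying the label matters, here is a minimal sketch (illustrative only, not the platform's actual training code) of how reviewed messages that lack a label act as implicit negative examples for it:

```python
# Illustrative sketch: each reviewed message is either a positive or an
# implicit negative example for every label in the taxonomy.
# The message texts and label names below are hypothetical.
reviewed = [
    {"text": "please confirm receipt of payment", "labels": {"Payment"}},
    {"text": "claim approved, payment sent",      "labels": {"Payment"}},
    {"text": "update my postal address",          "labels": {"Address Change"}},
    {"text": "what is the status of my claim",    "labels": {"Claim Status"}},
]

def training_signal(label):
    """Pair each reviewed message with whether it is a positive for `label`."""
    return [(m["text"], label in m["labels"]) for m in reviewed]

# A label created later but never back-applied yields only negatives:
for text, is_positive in training_signal("Confirmation"):
    print(is_positive, "-", text)
# Every reviewed message counts against 'Confirmation', so the model
# learns to suppress it even on messages where it should apply.
```

This is why a new label in a mature dataset needs to be applied retrospectively: until it is, every existing reviewed message pushes its predictions down.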

Key steps:

Create the new label when you find an example where it should apply

Find other examples where it should apply using a few different methods:

  1. You can search for key terms or phrases using the search function in Discover to find similar instances - this way you can apply the label in bulk if the search results contain many similar examples
  2. Or you can search for key terms or phrases in Explore - this is potentially a better method as you can filter to 'Reviewed' messages, and searching in Explore returns an approximate count of the number of messages that match your search terms
  3. You can also select labels that you think might often appear alongside your new label, and review the pinned examples for that label to find examples where your new label should be applied
  4. Once you have a few pinned examples, see if it starts to get predicted in 'Label' mode - if it does, add more examples using this mode
  5. Lastly, if you're annotating in a sentiment-enabled dataset and your new label is typically either positive or negative, you can also filter reviewed examples by positive or negative sentiment (though at present you cannot combine a text search with both the 'Reviewed' filter and a sentiment filter)

Then use 'Missed label' to find more messages where the platform thinks the new label should have been applied:

  • Once you have annotated quite a few examples using the methods above and the model has had time to retrain, use the 'Missed label' functionality in Explore by selecting your label and then selecting 'Missed label' from the dropdown menu
  • This will show you previously reviewed messages where the model thinks the selected label may have been missed
  • In these instances, the model will show the label as a suggestion (as shown in the example below)
  • Apply the label to every message where the model's suggestion is correct
  • Keep training on this page until you have annotated all of the correct examples and this mode no longer shows you messages where the label should actually apply
Example message where the model correctly suggests that 'Claim > Confirmation > Payment' has been missed
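Conceptually, 'Missed label' surfaces reviewed messages where the model's confidence for a label is high but the label was never applied. The sketch below is a hypothetical illustration of that idea (the platform's own logic and thresholds are internal; the data, confidence scores, and threshold here are made up):

```python
# Hypothetical sketch of the idea behind 'Missed label': flag reviewed
# messages whose predicted confidence for a label is high even though
# the label was not applied during review.
reviewed = [
    {"text": "payment confirmed, thank you",
     "labels": {"Payment"},
     "confidence": {"Payment": 0.95, "Confirmation": 0.81}},
    {"text": "please change my address",
     "labels": {"Address Change"},
     "confidence": {"Confirmation": 0.05}},
]

def missed_label_candidates(messages, label, threshold=0.5):
    """Reviewed messages the model thinks should carry `label` but don't."""
    return [m["text"] for m in messages
            if label not in m["labels"]
            and m["confidence"].get(label, 0.0) >= threshold]

print(missed_label_candidates(reviewed, "Confirmation"))
# The first message is surfaced: its 'Confirmation' confidence is high,
# but the label was never applied during review.
```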

Then check how the new label performs in the Validation page (once the model has had time to retrain and calculate the new validation statistics) and see if more training is required.
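The Validation page reports per-label performance statistics; precision and recall are the standard pair for this kind of check. As a rough illustration (computed here from made-up data, not from the platform):

```python
# Illustrative sketch: precision and recall for a label, the kind of
# per-label statistics a validation view reports. Data is hypothetical.
def precision_recall(predicted, actual):
    """predicted/actual are sets of message ids carrying the label."""
    tp = len(predicted & actual)          # correctly predicted messages
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(actual) if actual else 0.0
    return precision, recall

predicted = {1, 2, 3, 4}      # messages the model predicts the label on
actual    = {2, 3, 4, 5, 6}   # messages reviewers applied the label to
print(precision_recall(predicted, actual))  # (0.75, 0.6)
```

Low recall on a newly added label is the typical symptom of the implicit-negative problem described above, and means more reviewed examples are needed.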
