Testing text predictions with batch tests

Updated on July 5, 2022

Before you deploy your text prediction to a production environment, you can test the prediction against a batch of sample sentences that you upload as a CSV file. You can then use the results of the analysis to improve the configuration of your text prediction.

For example, you can create additional topics and entities or add more training data or keywords for topics and entities.

  1. Open the text prediction:
    1. In the navigation pane of App Studio, click Channels.
    2. In the Current channel interfaces section, click the icon that represents a channel for which you want to configure the text prediction.
    3. On the channel configuration page, click the Behavior tab, and then click Open text prediction.
  2. In the header of the prediction workspace, click the arrow to the right of the Test button, and then click Batch test.
  3. If you do not have an input file ready, create an input file by using the template:
    1. In the Batch test prediction dialog box, in the Input file template field, click Download to download the template.
    2. In the template, enter sample content for batch testing of your text prediction.
      An input file contains the following columns:
      • ID (mandatory)
      • Content (mandatory)
      • Tag1, Tag2, Tag3, and Tag4 (optional)

      Tags 1-4 can contain pass-through values that are included in the test results. You can use these columns to annotate individual content items, and then use the tags to search or filter the test results.

      For example:

      ID  Content                                              Tag1     Tag2   Tag3  Tag4
      1   I want to book a flight ticket                       Booking
      2   What is the price for a ticket from London to Dubai  Booking  Price
      3   I want to take a trip to Tokyo                       Booking
  4. In the Batch test prediction dialog box, click Choose File.
  5. Select the CSV file with the sample content, and then click Open.
  6. Click Test.
    Result: The prediction analyzes the input for the configured outcomes. When the system completes the batch test run, the Download button becomes available.
    Batch test results: the progress bar shows completion at 100 percent. In this example, the test processed five rows in 17 seconds.
  7. Click Download to save a CSV file with the test results.
    Result: Depending on your configuration, the results cover the detected topics, sentiments, and entities with confidence scores.
  8. Optional: To see the details of the data flow run that the system completed to perform the batch test, click Click to see more details.
    Result: The batch test data flow run page provides information about the number of processed records and their processing time. From this page, you can edit the data flow run settings and restart the batch test run.

    The information on the data flow run page can help you debug cases in which some or all rows in the data flow fail. You can identify the step at which the data flow failed and view the logs.

    Details of a batch test data flow run: the page shows progress and component statistics. In this example, the system successfully processed five records.
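If you prepare input files programmatically, the template columns described above (ID and Content mandatory, Tag1 through Tag4 optional) can be generated with a short script. The sketch below is a minimal example using Python's standard csv module; the file name is an arbitrary choice.

```python
import csv

# Build a batch-test input file for a text prediction.
# Columns ID and Content are mandatory; Tag1-Tag4 are optional
# pass-through values that you can later use to filter the results.
rows = [
    {"ID": 1, "Content": "I want to book a flight ticket", "Tag1": "Booking"},
    {"ID": 2, "Content": "What is the price for a ticket from London to Dubai",
     "Tag1": "Booking", "Tag2": "Price"},
    {"ID": 3, "Content": "I want to take a trip to Tokyo", "Tag1": "Booking"},
]

with open("batch_test_input.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["ID", "Content", "Tag1", "Tag2", "Tag3", "Tag4"]
    )
    writer.writeheader()
    writer.writerows(rows)  # missing tag columns are left empty
```

DictWriter fills any omitted tag columns with empty strings, which matches the optional status of Tag1 through Tag4 in the template.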
What to do next:

If the test results show that the predictions are inaccurate, reconfigure your text prediction (for example, by adding more training data for topics or entities). For more information, see Reviewing and adding training data for text predictions.

If you are satisfied with the test results, a system architect can deploy the prediction to the production environment. For more information, see Moving applications between systems.
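When reviewing the downloaded results before reconfiguring, it can help to flag rows with low confidence scores. The exact column layout of the results file depends on which outcomes (topics, sentiments, entities) your prediction is configured to detect, so the sketch below assumes hypothetical Topic and TopicConfidence columns purely for illustration.

```python
import csv

# Flag low-confidence rows in a batch-test results file for review.
# The headers "Topic" and "TopicConfidence" are assumed names for this
# example; the real headers depend on the prediction's configuration.
CONFIDENCE_THRESHOLD = 0.7

def low_confidence_rows(path, threshold=CONFIDENCE_THRESHOLD):
    """Return result rows whose topic confidence falls below the threshold."""
    with open(path, newline="") as f:
        return [
            row for row in csv.DictReader(f)
            if float(row["TopicConfidence"]) < threshold
        ]
```

Rows returned by such a filter are good candidates for extra training data or keywords on the affected topics.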
