Viewing a predictive model report

Updated on July 5, 2022

To ensure that your predictive models perform well, you can create your own reports in addition to the default charts on the Monitor tab of each predictive model. You can view examples of such reports in Prediction Studio.

Before you begin: To monitor a predictive model, ensure that a system architect creates a response strategy that references the model and defines the values for the .pyOutcome and .pyPrediction properties, where:
  • The .pyPrediction value is the same as the model objective that is visible in the Model tab for that predictive model (applies to all model types).
  • For binary models, the .pyOutcome value is the same as one of the outcome labels that are visible in the Model tab for that predictive model. For continuous and categorical models, this property value does not need to correspond to the model settings (see the sketch below).
For more information, see Headless decisioning.
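The following Python sketch is a hypothetical illustration of the matching rules above; it is not Pega API code, and the model name, outcome labels, and data are invented. It shows that .pyPrediction must equal the model objective for all model types, while .pyOutcome must match one of the configured outcome labels only for binary models.

```python
# Hypothetical illustration only -- not Pega API code.
# Shows the matching rules for responses captured by a response strategy.

BINARY_MODEL = {
    "objective": "Churn",                    # model objective from the Model tab
    "type": "binary",
    "outcome_labels": ["Churned", "Loyal"],  # outcome labels from the Model tab
}

def is_valid_response(model: dict, py_prediction: str, py_outcome: str) -> bool:
    """Check a captured response against the model's configuration."""
    # .pyPrediction must equal the model objective (all model types).
    if py_prediction != model["objective"]:
        return False
    # For binary models, .pyOutcome must be one of the configured labels.
    if model["type"] == "binary":
        return py_outcome in model["outcome_labels"]
    # Continuous and categorical outcomes need not match the model settings.
    return True

print(is_valid_response(BINARY_MODEL, "Churn", "Churned"))  # True
print(is_valid_response(BINARY_MODEL, "Churn", "Maybe"))    # False
```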
  1. In the header of Prediction Studio, click Actions > Reports > Predictive, and select the predictive model report type that you want to view:
    • To verify models that predict two predefined possible outcome categories, click List of binary models.
    • To verify models that predict three or more predefined possible outcome categories, click List of categorical models.
    • To verify models that predict a range of possible outcome values, click List of continuous models.
    • To compare the accuracy of all predictive models, click Latest performance per model.
  2. Optional: In the list of models, decide what data you want to see in the report by clicking Edit report and choosing the columns to display.
    For more information, see Editing a report.
  3. In the list of models, click the model you want to analyze in detail.
  4. In the detailed model view, review the predicted and actual outcome data. A simplified sketch of such a comparison follows these steps.
    For more information on how to interpret the monitoring data, see Metrics for measuring predictive performance.
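As a rough illustration of the kind of comparison the detailed model view supports, the hypothetical Python sketch below tallies predicted against actual outcomes for a binary model and derives a simple accuracy figure. The data and the metric choice are invented for illustration; the charts in Prediction Studio use the metrics described in Metrics for measuring predictive performance.

```python
# Hypothetical illustration only -- not how Prediction Studio computes its metrics.
# Compares predicted outcomes with actual (captured) outcomes for a binary model.

from collections import Counter

predicted = ["Churned", "Loyal", "Loyal", "Churned", "Loyal"]   # invented data
actual    = ["Churned", "Loyal", "Churned", "Churned", "Loyal"]

pairs = Counter(zip(predicted, actual))
accuracy = sum(n for (p, a), n in pairs.items() if p == a) / len(predicted)

print(f"accuracy: {accuracy:.0%}")  # accuracy: 80%
for (p, a), n in sorted(pairs.items()):
    print(f"predicted={p:8} actual={a:8} count={n}")
```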
