Adaptive models monitoring

Updated on July 5, 2022

To monitor all models that are part of an adaptive model, use the Monitor tab of an adaptive model in Prediction Studio. The predictive performance and success rate of individual models provide information that can help business users and strategy designers refine decision strategies and adaptive models.

You can use adaptive models to predict customer behavior by calculating the propensity of customers to accept a proposition, respond to a message, click a web banner, and so on. Because the models are self-learning predictive models, their predictive performance improves with every customer interaction; for example, when a customer accepts an offer, the next version of the model uses that information to provide better predictions.

The propensity values that an adaptive model returns facilitate offer prioritization, which ensures that the next best action is relevant and personalized.
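To make the self-learning behavior concrete, the following minimal Python sketch shows how a propensity estimate can be refreshed from running counts of positive and negative responses. It is an illustration only: Pega adaptive models use a Bayesian algorithm over many predictors, and all names in the sketch are hypothetical.

```python
class PropensityTracker:
    """Toy self-learning propensity estimate for a single proposition."""

    def __init__(self) -> None:
        self.positives = 0  # accepts, clicks, and other positive outcomes
        self.negatives = 0  # rejects, ignores, and other negative outcomes

    def record(self, outcome_positive: bool) -> None:
        # Every captured response immediately updates the evidence,
        # so the next prediction reflects the latest customer behavior.
        if outcome_positive:
            self.positives += 1
        else:
            self.negatives += 1

    @property
    def propensity(self) -> float:
        # Laplace smoothing: with no evidence yet, start at a neutral 0.5.
        return (self.positives + 1) / (self.positives + self.negatives + 2)


tracker = PropensityTracker()
tracker.record(True)   # a customer accepted the offer
tracker.record(False)  # another customer ignored it
print(f"propensity: {tracker.propensity:.2f}")  # 0.50, drifts as evidence grows
```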

You can monitor the models to check their predictive performance, to obtain insight into how they work and which predictors they use, and to learn how these factors correlate to the outcome.

Use Prediction Studio and the standard reports in the Report Browser to better understand adaptive models and monitor their performance.

Note: Learn how to use adaptive models in decision strategies from the Data Scientist mission on Pega Academy.

Models chart

The following example illustrates a chart that shows the relationship between the success rate and the performance of each adaptive model:

Figure: Adaptive models monitoring chart. The Monitor tab of an adaptive model shows a bubble chart in which each bubble represents a model.

In the bubble chart that is displayed on the Monitor tab, each bubble represents a model for a specific proposition. The size of a bubble indicates the number of responses (positive and negative). When you hover over a bubble, you can view the number of responses, the performance, and the success rate.

The Performance axis indicates the accuracy of the outcome prediction. Model performance is expressed as the Area Under the Curve (AUC), rescaled to a range between 50 and 100. The higher the AUC, the better the model is at predicting the outcome.

The Success rate axis indicates the success rate, expressed as a percentage. The system calculates this rate by dividing the number of positive responses by the total number of responses.
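Both axes can be recomputed from raw monitoring data. The following sketch (with illustrative numbers, and assuming that the 50-100 performance scale corresponds to 100 × AUC) derives the success rate from response counts and the performance from recorded outcomes and model propensities, using scikit-learn:

```python
from sklearn.metrics import roc_auc_score

# Example monitoring figures for one model (illustrative numbers).
positives = 120
negatives = 2880

# Success rate: positive responses divided by total responses.
success_rate = positives / (positives + negatives) * 100
print(f"success rate: {success_rate:.1f}%")  # 4.0%

# Performance: AUC computed from recorded outcomes (1 = positive)
# and the propensities that the model returned for each interaction.
outcomes = [1, 0, 1, 0, 0, 1, 0, 0]
propensities = [0.82, 0.30, 0.75, 0.45, 0.20, 0.66, 0.70, 0.10]
auc = roc_auc_score(outcomes, propensities)  # 0.5 = random, 1.0 = perfect
performance = 100 * auc  # maps the 0.5-1.0 AUC range onto 50-100
print(f"performance (AUC): {performance:.1f}")  # ~93
```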

Adaptive models table

The data that is used to visualize the models in the bubble chart is also displayed below the chart. Models are identified by their model context (business issue, group, proposition name, and so on). For each model, the response count, success rate, and performance are shown.

When to use adaptive models monitoring

Use adaptive models monitoring for the following purposes; a short triage sketch after this list shows how the checks can be combined:

  • Identify technical problems.

    In the Models overview tab, look for adaptive models with a success rate of zero. This means that the propositions for these models do not have any positive responses.

  • Identify propositions for which the model is not predictive.

In the Models overview tab, look for adaptive models with low performance; these are the models on the left side of the chart. Review the reports for these models and consider adding more data as predictors.

  • Identify propositions that have a low number of responses.

    In the Models overview tab, look for adaptive models with a low number of responses; these models are represented by the small circles in the chart. Investigate the eligibility criteria in the decision strategy and change exclusion settings to increase the proposition frequency. For more information, see the component categories in Strategy rule form - Completing the Strategy tab.

  • Identify propositions that are proposed so often that they dominate other propositions.

    In the Models overview tab, look for adaptive models with a high number of responses; these models are represented by the big circles in the chart. A high number of responses might be fine from the business point of view. However, if necessary, you can adjust prioritization in the decision strategy to decrease the proposition frequency. For more information, see Arbitration.

  • Identify propositions with a low success rate.

    In the Models overview tab, look for adaptive models with a low success rate; these are the models that are close to the Performance axis. If the model performance is also low, you can try to offer the proposition to more relevant customers to increase the success rate. If the model performance is already high, the relevance to the customers is high, but the proposition is unattractive and you might want to dismiss it. For more information, see Propositions.

  • Inspect an adaptive model.

    Open the Model report to inspect your model after introducing a new proposition, adding or removing a predictor, or changing prioritization in a decision strategy. In the Model report, you can view the active and inactive predictors, investigate the top predictors, or check the uplift among the top-scoring customers. For more information, see About Adaptive Model rules, Propositions, and Strategy rule form - Completing the Strategy tab.

  • Inspect a predictor.

Check the details of a predictor with a low performance score; a possible cause is too many missing values for the predictor. Look at the top predictors and at the bins that have a particularly high or low success rate. To inspect a predictor, open the Model report.

  • Identify predictors that are never used.

    In the Predictors tab, look for predictors that are never used by any model. Because unused predictors have only a minor effect on model performance, you do not need to remove them from an adaptive model configuration; however, you can conduct an occasional cleanup as part of your maintenance activities. An unused predictor might still become relevant for a future proposition.
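As an illustration of how these checks can be combined, the following minimal sketch triages a model's monitoring figures into the diagnostic buckets above. All thresholds, field names, and the example proposition are illustrative assumptions, not Pega settings:

```python
from dataclasses import dataclass


@dataclass
class ModelStats:
    proposition: str
    responses: int       # total positive + negative responses
    performance: float   # AUC rescaled to the 50-100 range
    success_rate: float  # percentage of positive responses


def triage(m: ModelStats,
           min_responses: int = 200,     # illustrative threshold
           low_performance: float = 55,  # tune to your business targets
           low_success: float = 1.0) -> list[str]:
    """Map one model's monitoring figures to the checks described above."""
    findings = []
    if m.success_rate == 0:
        findings.append("no positive responses: check for technical problems")
    if m.performance < low_performance:
        findings.append("low performance: review predictors and add data")
    if m.responses < min_responses:
        findings.append("few responses: review eligibility and exclusions")
    if 0 < m.success_rate < low_success:
        findings.append("low success rate: review targeting or the offer")
    return findings or ["no obvious issues"]


print(triage(ModelStats("GoldCard", responses=50,
                        performance=52.0, success_rate=0.0)))
```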

To ensure the accuracy of your adaptive models, perform the following tasks regularly:

  • Check the performance of your models every two weeks.
  • Check the success rate of your models every two weeks.
  • Inspect predictors every two or three months.

The following related topics describe the monitoring reports in more detail:

  • Viewing summarized adaptive model reports

    You can create customized reports that pertain to all adaptive models that you created in Prediction Studio. You can download those reports as PDF or CSV files to view them outside of your application.

  • Generating and downloading a model report

Download a report that contains a summary of all predictive, adaptive, and text analytics models in your application. Reports contain data that can help you evaluate the predictive performance of the models (for example, the area under the curve). You might need to store model reports for auditing purposes, or share them with people in your organization who do not have access to Pega Platform. In the reports, you can check the status and predictive performance of the models and identify who last changed a model and when.

  • Viewing a model report

    To analyze an adaptive model, you can view a detailed model report that lists active predictors, inactive predictors, the score distribution, and a trend graph of model performance. You can also zoom into a predictor distribution.

  • Viewing the predictors overview

In the predictors overview, you can see how often a predictor is actively used in a model. This overview can help you identify predictors that are not used in any of the models.

  • Adaptive model methods

You can manage and train adaptive models through a rule-based API.
