
What's new in decision management 8.7

Updated on January 18, 2022

Significant usability enhancements improve the overall user experience of decision management and enable you to meet the ever-changing needs of your customers. The key enhancements include the capability to export real-time data directly to files and daily Prediction Studio email notifications.

Higher predictive power with adaptive gradient boosting

Pega Platform version 8.7 introduces a new adaptive gradient boosting algorithm with higher predictive power in Pega Adaptive Decision Manager.

Adaptive boosting is an alternative to the existing adaptive modeling technique that is based on a Bayesian core algorithm. Adaptive boosting achieves higher predictive power to deliver more accurate predictions. This algorithm increases the acceptance rate of propositions and improves the customer experience by ensuring that the proposed actions are more relevant to customers.

Adaptive boosting is an online self-learning predictive model in Adaptive Decision Manager that predicts propensities for all the available actions. It provides highly personalized and relevant actions to individual customers, achieving true one-to-one customer engagement.

For more information, see Adaptive gradient boosting overview and Adaptive Gradient Boosting - a Pega Whitepaper.
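The adaptive gradient boosting algorithm in Adaptive Decision Manager is a proprietary, online self-learning variant, but the underlying gradient-boosting idea can be sketched in a few lines: each stage fits a weak learner (here, a one-split "stump") to the residuals of the ensemble built so far, and the stages are summed to form the prediction. The following pure-Python sketch is illustrative only and is not Pega's implementation:

```python
# Minimal gradient-boosting sketch (illustrative only; Pega's adaptive
# gradient boosting in ADM is a proprietary, online self-learning variant).
# Each stage fits a one-split "stump" to the residuals of the current ensemble.

def fit_stump(xs, residuals):
    """Find the threshold split that best reduces squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def boost(xs, ys, stages=50, lr=0.1):
    """Additively combine stumps, each fitted to the current residuals."""
    base = sum(ys) / len(ys)
    stumps = []
    predict = lambda x: base + sum(lr * s(x) for s in stumps)
    for _ in range(stages):
        residuals = [y - predict(x) for x, y in zip(xs, ys)]
        stumps.append(fit_stump(xs, residuals))
    return predict

# Toy propensity-style data: the positive-outcome rate rises with the predictor.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
model = boost(xs, ys)
```

After boosting, the model assigns a higher propensity-like score to inputs that historically led to positive outcomes, which is the behavior that makes proposed actions more relevant to individual customers.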

Monitor model predictors and output without capturing responses

Prediction Studio has new monitoring capabilities to help you observe whether your models behave as expected. You can now evaluate the execution of your models irrespective of responses (such as clicks or conversions), by monitoring the predictors (input) and output of your models.

Previous versions of Pega Platform provided monitoring capabilities based on the feedback that models receive, and on tracking the accuracy of the models over time. However, certain types of models, such as the predictive models used in Process AI, receive feedback irregularly or not at all. Capturing model responses can also be technically difficult. For example, to retrieve conversion data, you might need access to an external fulfillment system and might have to wait a long time for the data to accumulate.

However, data scientists can draw important conclusions from model execution data, without capturing responses. By monitoring model predictors (such as age, income, gender, or location) and output (such as propensity or labels), you can identify performance issues soon after model deployment and update your models on a regular basis. Prediction Studio reports any significant changes in predictor and output values through notifications and charts that provide evidence for the notifications.

Sample predictor charts in a prediction
Line charts show age percentiles and minimum and maximum values.

For more information, see Monitoring predictions.
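The general idea behind this kind of monitoring can be sketched simply: summarize a predictor's distribution (percentiles, minimum, maximum) over a reference window, recompute the same profile over a recent window, and flag statistics that shift by more than a tolerance. The thresholds and windowing that Prediction Studio actually uses are product-internal; the values below are illustrative assumptions:

```python
# Hedged sketch of distribution monitoring for a numeric predictor such as age.
# Prediction Studio's actual thresholds and windowing are product-internal;
# the 25% tolerance here is an illustrative assumption.
from statistics import quantiles

def profile(values):
    """Summarize a predictor window with min, quartiles, and max."""
    p25, p50, p75 = quantiles(values, n=4)
    return {"min": min(values), "p25": p25, "p50": p50,
            "p75": p75, "max": max(values)}

def drift_alerts(reference, recent, tolerance=0.25):
    """Flag summary statistics that shifted by more than the tolerance."""
    ref, cur = profile(reference), profile(recent)
    alerts = []
    for stat, ref_val in ref.items():
        scale = abs(ref_val) or 1.0
        if abs(cur[stat] - ref_val) / scale > tolerance:
            alerts.append(f"{stat} shifted from {ref_val:.1f} to {cur[stat]:.1f}")
    return alerts

# Example: the "age" predictor drifts upward after deployment.
reference_ages = [22, 30, 35, 41, 47, 52, 58, 63]
recent_ages = [35, 44, 51, 58, 64, 70, 75, 81]
```

Running `drift_alerts(reference_ages, recent_ages)` reports the shifted statistics, while comparing a window against itself reports nothing, which mirrors how a stable predictor generates no notifications.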

Update impact
After you update your Pega Platform to version 8.7, the monitoring of predictors and model output is enabled by default, and the system is configured to monitor 5% of all model executions. Typically, this amount of data should be sufficient for meaningful model analysis without having a negative impact on system performance.
What steps are required to update the application to be compatible with this change?
  1. Ensure that your system meets the following disk space requirements:
    • The disk space allocated to the Stream service is 250 GB.
    • The analytics repository is configured in Prediction Studio with 15 GB of disk space.
  2. Review the assumptions and calculations that are the basis of the disk space requirements and the default 5% monitor percentage, and verify whether they apply to the data volumes in your system. For more information, see Estimating the model monitoring payload and Configuring the monitoring of model input and output.
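Conceptually, monitoring 5% of model executions amounts to recording each execution with 5% probability, which keeps the payload roughly proportional to traffic. The following sketch is a generic illustration of that sampling concept, not Prediction Studio's internal mechanism:

```python
# Illustrative sketch of monitoring a fixed fraction of model executions.
# Prediction Studio's actual sampling mechanism is internal; this only shows
# the concept of recording ~5% of executions for later analysis.
import random

MONITOR_PERCENTAGE = 5  # default monitor percentage after updating to 8.7

def maybe_record(execution, monitored, pct=MONITOR_PERCENTAGE):
    """Record this execution with probability pct/100."""
    if random.random() < pct / 100:
        monitored.append(execution)

random.seed(42)
monitored = []
for i in range(100_000):
    maybe_record({"id": i, "propensity": random.random()}, monitored)

sample_rate = len(monitored) / 100_000  # close to 0.05
```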

Stay focused and well-informed with enhanced notifications

Prediction Studio has new notification features to help you monitor and manage your models and predictions in an intentional and efficient manner.

Prediction Studio provides new options to categorize and sort notifications. You can now filter notifications by category to focus on a specific aspect of model execution (responses, performance, output, predictors) or to attend to tasks that await your attention (model approval, prediction deployment). The Impact column now shows the priority level of each notification (high, medium, or low) to help you identify the most important issues and prioritize them accordingly.

Category selection and impact labels for notifications
The Notifications page shows several messages with High impact labels. The Categories menu contains the filtering options.

Notifications that apply to predictions can now link to one or more predictions. For example, significant changes in the performance of a model that is used in multiple predictions generate a single notification that links to the impacted predictions. When you click the notification, you have the option to choose the prediction that you want to view.

Each prediction now contains the Notifications tab with all the associated notifications, to help you identify performance issues and interpret the monitoring charts on the Analysis tab.

For more information, see Monitoring predictions and Viewing Prediction Studio notifications.

Daily email notifications provide actionable insights into your models

Data scientists can now receive daily emails with the high-priority notifications that Prediction Studio generated for their models and predictions in the past 24 hours. The system sends these daily summaries to the users in the data scientists' work group. The emails contain CSV files with the notification details. For each notification, the file provides the following information:

  • Model name
  • Source type, for example, adaptive model, predictive model, or prediction
  • Notification type
  • Insight
  • Time
Sample file attached to the Daily Prediction Studio Notifications email
The sample contains notifications for sales models regarding low performance or lack of responses.
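Because the attachment is a plain CSV file, you can process it with standard tooling. The sketch below filters notifications by source type; the exact column headers and the sample rows are assumptions based on the fields listed above, not a verbatim copy of a real attachment:

```python
# Hedged sketch: filtering the CSV attachment from a daily Prediction Studio
# notification email. The column headers and sample rows are assumptions
# based on the fields listed above (Model name, Source type, Notification
# type, Insight, Time); a real attachment may differ.
import csv
import io

sample_csv = """Model name,Source type,Notification type,Insight,Time
SalesModel,Adaptive model,Performance,Model performance dropped,2022-01-18 06:00
ChurnModel,Prediction,Responses,No responses in the last 24 hours,2022-01-18 06:00
"""

def notifications_for_source(text, source_type):
    """Return the notification rows that match the given source type."""
    reader = csv.DictReader(io.StringIO(text))
    return [row for row in reader if row["Source type"] == source_type]

adaptive = notifications_for_source(sample_csv, "Adaptive model")
```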

Prediction Studio email notifications provide manageable chunks of monitoring information to help data scientists keep track of important events and issues in the system, such as a candidate model awaiting approval or a significant drop in model performance. By receiving notification of an event or issue within 24 hours, data scientists can perform important tasks or resolve problems in a timely manner.

For more information, see Enabling Prediction Studio email notifications.

Stream service enhancements

Pega Platform version 8.7 introduces the following enhancements to the Stream service:

Six partitions per topic

To improve the balance between resource utilization and performance, Pega Platform now defaults the number of partitions per topic to six. Pega Platform uses the Kafka concepts of topics and partitions to achieve high data consumption throughput and concurrency. Partitions are a mechanism for providing scalability and redundancy. Typically, a greater number of partitions enables more clients to consume messages. However, a higher partition count increases resource utilization and latency, for example, due to the larger number of open file handles on the server and the added replication latency. That is why there is a limit on the maximum number of partitions that a broker can handle.
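The role of the partition count can be sketched with Kafka's key-to-partition routing: a record's key is hashed modulo the partition count, so records with the same key always land on the same partition (preserving per-key ordering), while distinct keys spread across partitions, letting up to that many consumers work in parallel. Kafka's default partitioner uses a murmur2 hash; the sketch below substitutes a stable stdlib hash for illustration:

```python
# Minimal sketch of Kafka-style key-to-partition assignment. Kafka's default
# partitioner hashes the record key (murmur2) modulo the partition count;
# this illustration uses a stable stdlib CRC32 hash instead.
import zlib

NUM_PARTITIONS = 6  # new default per topic in Pega Platform 8.7

def partition_for(key, num_partitions=NUM_PARTITIONS):
    """Route a record key to a partition deterministically."""
    return zlib.crc32(key.encode()) % num_partitions

# The same key always routes to the same partition; distinct keys spread
# across partitions, so up to num_partitions consumers can work in parallel.
assignments = {k: partition_for(k) for k in ("cust-1", "cust-2", "cust-3")}
```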

Update impact
In Pega Platform versions 8.6 and earlier, the default number of partitions per topic is 20. After a system update to Pega Platform version 8.7 or later, the number of partitions of the topics associated with existing Stream data sets or queue processors does not change. However, if you truncate the pr_data_stream_nodes table, the number of partitions reverts to the new default setting of six partitions. Newly created topics have six partitions.
What steps are required to update the application to be compatible with this change?
No special steps are necessary. After the update, you can customize the global default number of partitions and size individual queue processors and Stream data sets by changing the number of partitions of the associated topic. Before you change the partition count, review the information on the maximum number of partitions per broker to ensure optimal performance of your Stream service.

Individual sizing of queue processors and Stream data sets

You can size individual queue processors and Stream data sets by changing the number of partitions of the associated topic, with the help of the AlterStreamPartitions activity. By changing the partition count, you can manage data consumption throughput to address higher or lower traffic for some queue processors, for example, during a new product launch. For more information, see Sizing queue processors and Stream data sets individually.

AlterStreamPartitions activity
The number of partitions for a Real Time Interactions data set is set to five. The Allow data loss checkbox is selected.

Other enhancements

Read about minor enhancements in Pega Platform version 8.7.

Exporting real-time data to a repository

You can now export data from real-time data sets (Kafka, Kinesis, Stream) to files in a repository. For more information, see Creating a data flow.

Encryption and decryption of files using File data sets

Pega Platform now provides a built-in method of data encryption and decryption. You can encrypt data and save the encrypted files to a repository, and you can decrypt and process data from those encrypted files within Pega. For more information, see Creating a File data set record for files on repositories.

Asynchronous processing of records in single-case data flow runs

You can now enable asynchronous processing of records when saving data to a Stream data set as part of a single-case data flow run. Asynchronous processing ensures higher throughput. For more information, see Creating a data flow.
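The throughput gain from asynchronous processing comes from decoupling the producing step from the write latency: instead of blocking on each write, records are handed to a background writer. The following sketch illustrates that generic producer/writer pattern with a queue and a worker thread; it is not Pega's implementation:

```python
# Hedged sketch of the asynchronous-write pattern: the producer hands records
# to a background writer instead of blocking on each write, which raises
# throughput. This is a generic illustration, not Pega's implementation.
import queue
import threading

def start_async_writer(sink):
    """Start a background thread that drains records into the sink."""
    q = queue.Queue()

    def worker():
        while True:
            record = q.get()
            if record is None:        # sentinel: stop the writer
                break
            sink.append(record)       # stands in for a Stream data set write

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return q, t

sink = []
q, t = start_async_writer(sink)
for i in range(1000):
    q.put({"id": i})                  # returns immediately; no write latency
q.put(None)                           # signal the writer to finish
t.join()                              # all 1000 records are now in the sink
```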
