Monitoring the predictive performance of a model
Capture customer interactions and analyze the performance of a predictive model by using a robust set of metrics to improve your customer experience.
Use case
uPlusTelco wants to improve the experience of their customer support by predicting the reason for each customer call. To achieve that goal, the data analytics team built a predictive model and uploaded it to Prediction Studio. A system architect created a decision strategy with that model, deployed that strategy in a decision data flow, and then created a response data flow with a strategy that references the Predict Call Context model.
After you gather responses by running the decision and response data flows for customer support interactions, your responsibility as a data scientist is to analyze the monitoring data.
Before you begin
Create a response strategy that defines the .pyPrediction and .pyOutcome properties of the model that you want to monitor. See Creating a response strategy.
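Outside of Pega, the pairing that the response strategy captures, a model's prediction together with the actual outcome, can be sketched as a simple record. The class and field names below are illustrative analogues of the .pyPrediction and .pyOutcome properties, not part of the product:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MonitoredResponse:
    # Analogous to .pyPrediction: the call reason that the model predicted
    prediction: str
    # Analogous to .pyOutcome: the call reason that was actually recorded
    outcome: str
    # When the interaction happened; used later to filter by time span
    timestamp: datetime

r = MonitoredResponse("BillingQuestion", "BillingQuestion", datetime(2023, 5, 1))
print(r.prediction == r.outcome)  # True for a correct prediction
```

Monitoring then amounts to aggregating many such records over a chosen time span.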
Gathering customer interactions
To monitor predictive analytics on a live system, run the decision and response data flows. In this way, you collect customer interaction data for monitoring and apply the decision and response strategies to analyze that data.
- Run the decision data flow:
- In Dev Studio, click .
- On the list of the Data Flow rule instances, locate and click MonitorMyPredictiveModel.
- On the data flow tab, click .
- In the data flow test run dialog box, click .
- Wait until the process completes and close the dialog box.
- Run the response data flow:
- On the list of the Data Flow rule instances, locate and click SetResponsesToMonitorMyModels.
- On the data flow tab, click .
- In the data flow test run dialog box, click .
- Wait until the process completes and close the dialog box.
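Conceptually, the two runs above are a decision pass that records a prediction for each interaction, followed by a response pass that attaches the actual outcome. A minimal stand-in, with all function and field names hypothetical rather than taken from the product:

```python
# Hypothetical stand-in for the decision and response data flows:
# the decision pass records a prediction per interaction, and the
# response pass later attaches the observed outcome for monitoring.

def decision_pass(interactions, predict):
    """Record a prediction for each customer interaction."""
    return [{"id": i["id"], "prediction": predict(i)} for i in interactions]

def response_pass(decisions, actual_outcomes):
    """Attach the observed outcome to each recorded decision."""
    return [
        {**d, "outcome": actual_outcomes[d["id"]]}
        for d in decisions
        if d["id"] in actual_outcomes
    ]

interactions = [{"id": 1, "topic_hint": "bill"}, {"id": 2, "topic_hint": "modem"}]
predict = lambda i: "Billing" if "bill" in i["topic_hint"] else "TechSupport"
decisions = decision_pass(interactions, predict)
responses = response_pass(decisions, {1: "Billing", 2: "Billing"})
print(len(responses))  # 2
```

Only decisions that receive an outcome contribute to the monitoring data, which is why both data flows must complete before you analyze the results.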
Analyzing the predictive performance of a model
After gathering customer interactions, use different charts and reports to verify the predictive performance of your model.
- In the navigation panel of Prediction Studio, click .
- In the Predictions work area, click the My Predict Call Context model.
- On the predictive model page, open the Monitor tab and click .
- Specify the time span in which you want to analyze the model by selecting one of the available options, for example: All time or Week.
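Filtering monitored responses to a time span such as All time or Week can be sketched as follows; this is illustrative code, not the product's implementation:

```python
from datetime import datetime, timedelta

def filter_by_time_span(responses, span, now=None):
    """Keep only responses within the selected span ('All time' or 'Week')."""
    now = now or datetime.now()
    if span == "All time":
        return list(responses)
    if span == "Week":
        cutoff = now - timedelta(days=7)
        return [r for r in responses if r["timestamp"] >= cutoff]
    raise ValueError(f"Unknown time span: {span}")

now = datetime(2023, 6, 15)
responses = [
    {"timestamp": datetime(2023, 6, 14), "outcome": "Billing"},
    {"timestamp": datetime(2023, 5, 1), "outcome": "TechSupport"},
]
print(len(filter_by_time_span(responses, "Week", now)))      # 1
print(len(filter_by_time_span(responses, "All time", now)))  # 2
```

Narrow spans make recent drift visible; All time gives the long-run baseline.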
- Analyze the predictive metrics:
- In the Performance area, verify how accurately your model predicted the outcomes in the specified time, compared to the expected value.
- In the Total responses area, analyze the number of responses that were gathered in the specified time.
- In the Performance area, click and analyze a contingency table of actual outcomes versus the expected outcomes (in percentages or in the number of responses).
- Optional: If you want to store the confusion chart offline for further analysis, click and save the .csv file.
For more information on how to interpret the data and on predictive performance metrics for other model types, see Metrics for measuring predictive performance.
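The contingency (confusion) table and its CSV export can be reproduced outside the product as a sketch; the function names and CSV columns below are illustrative, not the format that Prediction Studio emits:

```python
import csv
from collections import Counter

def contingency_table(responses):
    """Count (actual outcome, predicted outcome) pairs."""
    return Counter((r["outcome"], r["prediction"]) for r in responses)

def save_as_csv(table, path):
    """Write the table to a .csv file, one cell per row."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["actual", "predicted", "responses"])
        for (actual, predicted), count in sorted(table.items()):
            writer.writerow([actual, predicted, count])

responses = [
    {"prediction": "Billing", "outcome": "Billing"},
    {"prediction": "Billing", "outcome": "TechSupport"},
    {"prediction": "TechSupport", "outcome": "TechSupport"},
]
table = contingency_table(responses)
# Accuracy is the share of responses on the table's diagonal.
accuracy = sum(c for (a, p), c in table.items() if a == p) / sum(table.values())
print(round(accuracy, 2))  # 0.67
save_as_csv(table, "confusion.csv")
```

Off-diagonal cells show which call reasons the model confuses with each other, which is more actionable than the accuracy figure alone.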
Conclusion
You have imported a PMML model, created a strategy that uses that model to make predictions, captured responses, and analyzed the predictive performance of your model. Through these activities, you now have a baseline for further analysis of the model's accuracy.
What to do next
If you are unsatisfied with the performance of the model, create or import a new one to verify whether the predictions it makes help you drive your business results and fulfill your commitments to customers. For more information, see Predictive models monitoring.
To view the main process outline for this tutorial, see Monitoring predictive models.