Evaluating candidate models with MLOps

Updated on May 17, 2024

After you add a candidate model to a prediction, Prediction Studio configures and validates the new model, and provides comparison data to help you evaluate the new model. Decide whether you want to approve the new model for deployment to production or reject it.

Model evaluation is part of the Machine Learning Operations (MLOps) feature of Pega Platform.

  1. In the navigation pane of Prediction Studio, click Predictions.
  2. From the list of predictions, open a prediction that contains a candidate model, and then click the Models tab.
  3. Display the candidate model by clicking the expand arrow to the left of the current model.
  4. Ensure that the status of the candidate model is READY FOR REVIEW. If the configuration or validation of the model is still in progress, wait several seconds, and then refresh the page by clicking Actions > Refresh.
  5. Open the candidate model by clicking its name.
    Result: In the Model comparison window, the Validation tab displays a basic comparison of the two models. Prediction Studio marks the better model as Recommended based on the receiver operating characteristic (ROC) and transparency. If a data set was added for use in the comparison, Prediction Studio generates charts that show how the two models compare in terms of score distribution, ROC, lift, and gains. For more information, see Active and candidate model comparison charts.
    Figure: The Model comparison window shows the names, types, and transparency indexes of the two models, with score distribution and ROC curve charts below.
  6. Optional: If you want to add a data set for the comparison or use a different data set, click Configure data set.
    1. In the Validation data set window, select the data set and the outcome column to compare the two models.
      The outcome column contains the response labels for the outcome.
    2. Review the properties in the data set and their values.
      In the Records field, you can select the number of records for the data set preview. The values are recalculated after each change.
    3. Click Submit.
    4. In the top-right corner of the Model comparison window, click Save.
    5. In the Confirm save dialog box, click Yes.
      Result: You return to the Prediction window.
    6. Wait several seconds for the validation to complete, and then refresh the page by clicking Actions > Refresh.
    7. When the status of the model changes to READY FOR REVIEW, open the model by clicking its name, and review the new comparison charts.
  7. Review the comparison charts to decide which model is better for your use case.
  8. Optional: Export the analysis data to a CSV file by clicking Download analysis data.
  9. In the top-right corner, click Evaluate.
  10. In the evaluation window, approve or reject the model:
    • To let the new model shadow the current model, click Approve new candidate model and start shadowing (recommended).
    • To replace the current model, click Approve candidate model and replace current active model.
    • To reject the new model, click Reject candidate model.
    Note: Shadowing means that the new model receives production data and the system tracks the outcomes of the model, but does not use the outcomes to make business decisions. This option is only available for predictive models. For more information, see Active and candidate model type combinations.

    Approving a candidate model deletes other candidate models that are associated with the active model.

    A rejected model and its related assets are archived. You can view them in the model history.

  11. In the Reason field, provide a comment on the approval or rejection.
  12. Click Save.
Result:

In Pega Customer Decision Hub environments, if you approve a model update, the system automatically creates and resolves a Change prediction request in Pega 1:1 Operations Manager. A revision manager can then deploy the model to production.

In environments without Pega Customer Decision Hub, the branch that contains the approved model is updated. A system architect can then deploy the model to production.
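As noted in the procedure, Prediction Studio marks the better model as Recommended based in part on the receiver operating characteristic (ROC). As a rough, hypothetical illustration of how an ROC-based comparison works (not Prediction Studio's internal implementation), the following Python sketch scores an "active" and a "candidate" model on the same synthetic validation outcomes and computes the area under the ROC curve for each:

```python
# Hypothetical, simplified AUC comparison between two models scored on the
# same validation outcomes. All data here is synthetic; this only illustrates
# the metric behind the "Recommended" label, not the product's implementation.
import numpy as np

def roc_auc(y, scores):
    """Area under the ROC curve via the rank (Mann-Whitney) formula."""
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    n_pos = int(y.sum())
    n_neg = len(y) - n_pos
    return (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(42)
y_true = rng.integers(0, 2, size=1000)        # 1 = positive outcome label

# Synthetic scores: the candidate separates the two classes more cleanly.
scores_active = 0.3 * y_true + 0.7 * rng.random(1000)
scores_candidate = 0.6 * y_true + 0.4 * rng.random(1000)

auc_active = roc_auc(y_true, scores_active)
auc_candidate = roc_auc(y_true, scores_candidate)
recommended = "candidate" if auc_candidate > auc_active else "active"
print(f"active AUC={auc_active:.3f}, candidate AUC={auc_candidate:.3f}")
print(f"recommended model: {recommended}")
```

A higher AUC means the model ranks positive outcomes above negative ones more reliably; the comparison charts in the Model comparison window (score distribution, lift, gains) view the same validation scores from other angles.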

What to do next:

When the deployment is complete, log in to the production instance and verify that the model is updated.

If you approved a predictive model in shadow mode, review its performance after a period of time. For more information, see Monitoring predictions.

Based on your analysis and your business needs, you can decide to promote the model as the active model or reject it.
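The shadow option described above can be pictured as a thin wrapper around the two models: the active model's output drives the business decision, while the candidate's output for the same input is only recorded for later review. The sketch below uses entirely hypothetical names (it is not a Pega API) to show that pattern:

```python
# Hedged sketch of shadow mode: the candidate ("shadow") model sees the same
# production inputs as the active model, but only the active model's output
# is returned as the decision. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ShadowedPredictor:
    active: callable               # model whose output drives decisions
    shadow: callable               # candidate model, monitored only
    shadow_log: list = field(default_factory=list)

    def predict(self, features):
        decision = self.active(features)    # used for the business decision
        shadow_out = self.shadow(features)  # tracked, never acted on
        self.shadow_log.append((features, shadow_out, decision))
        return decision

# Usage with trivial stand-in models:
predictor = ShadowedPredictor(
    active=lambda f: f["score"] > 0.5,
    shadow=lambda f: f["score"] > 0.4,
)
result = predictor.predict({"score": 0.45})
print(result)                       # decision comes from the active model only
print(len(predictor.shadow_log))    # the shadow's output was still recorded
```

Reviewing the logged shadow outputs against actual outcomes over time is what lets you decide whether to promote the candidate to the active model.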
