Replacing models in predictions with MLOps
Improve a prediction by replacing a low-performing model with a high-accuracy model. You can also replace a model with a scorecard or a field in the data model that contains a score.
For example, in a customer engagement context, you can replace a scorecard that is used to predict churn with a PMML or H2O machine learning model.
You can replace both the outcome-based and supporting models in predictions.
Replacing models is part of the Machine Learning Operations (MLOps) feature of Pega Platform.
Ensure that Prediction Studio and your Business Change pipeline are configured to support the model update feature. For a list of prerequisites, see Updating active models in predictions with MLOps.
Plan your changes by checking which model types you can use to replace your current models, and which mode you can use to deploy the new models (shadow mode, replace, direct replace). For more information, see Active and candidate model type combinations.
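Later in this procedure, the file-based option accepts a PMML, H2O MOJO, or Pega OXL file. As a minimal sketch of how such an artifact might be produced outside Pega Platform, assuming an H2O installation and an illustrative churn.csv training set with a Churn label column, the following Python snippet trains a gradient boosting model and downloads it as a MOJO file that you could then upload:

```python
import h2o
from h2o.estimators import H2OGradientBoostingEstimator

# Start a local H2O cluster and load a training set.
# The file name, predictor columns, and Churn label are illustrative assumptions.
h2o.init()
train = h2o.import_file("churn.csv")
train["Churn"] = train["Churn"].asfactor()  # treat the binary outcome as categorical

predictors = [c for c in train.columns if c != "Churn"]
model = H2OGradientBoostingEstimator(ntrees=50, seed=42)
model.train(x=predictors, y="Churn", training_frame=train)

# Export the model as a MOJO file that can be uploaded in Prediction Studio.
mojo_path = model.download_mojo(path=".")
print("MOJO artifact written to:", mojo_path)
```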
- Log in to your application:
  - Log in as a data scientist to the Business Operations Environment (BOE).
  - In the header of Dev Studio, click the name of your current application, and then click Switch Application.
  - From the list, select the overlay application.
  - Switch the portal to Prediction Studio.
- In the navigation pane of Prediction Studio, click Predictions.
- From the list of predictions, open a prediction that you want to change, and then click the Models tab.
- To the right of the model that you want to change, click the More icon, and then click Replace model.
- In the Replace model dialog box, choose the entity with which you want to replace the active model:
  - Replace with a machine learning model:
    - Click Model, and then continue with the following steps to select the new model.
  - Replace with a scorecard:
    - Select Scorecard, and then click Next.
    - Select the scorecard to directly replace the current model, and then click Replace.
      Result: The scorecard replaces the current model. This is the end of the procedure.
  - Replace with a field in the data model:
    - Select Field, and then click Next.
    - Select the field that you want to use to predict outcomes, and then click Replace.
      Result: The field replaces the current model. This is the end of the procedure.
- Optional: To use a validation data set to compare the active model and the candidate model, in the Replace model dialog box, select the Compare the models field. If you clear the Compare the models field, the Replace model window does not prompt you to select a data set for the comparison. You can add a validation data set later when you evaluate the model.
- Select the source of the new model to replace the current model:
  - Use a file-based model:
    - Click Upload.
    - In the Select a PMML, H2O MOJO or Pega OXL file section, add a model to replace the current model.
  - Use a cloud-based model:
    - Click Machine learning service.
    - In the Machine learning service field, select the machine learning service from which you want to run the model. Pega Platform currently supports Google AI Platform and Amazon SageMaker models. For a sketch of verifying that a SageMaker endpoint returns scores, see the example after this procedure.
    - In the Model field, select a model to replace the current model.
    - Optional: In the Upload model metadata file section, add a metadata file with input mapping and outcome categories for the model. You can create a metadata file by clicking Download template and mapping the JSON input fields to Pega Platform fields in the template. For information about the metadata file properties and the available values, see Metadata file specification for predictive models.
  - Choose a model from Pega Platform:
    - Click Model list.
    - Select a model to replace the current model.
- Optional: In the Add model documentation (optional) section, upload a documentation file for the model.
- Click Next.
- If you selected the Compare the models field in the earlier optional step, add a validation data set and select the outcome column to compare the two models, and then click Next. The outcome column contains the response labels for the outcome. For a minimal example of such a data set, see the sketch after this procedure.
- In the Summary section, in the Model name field, enter a name for the new model, and then click Add model.
Result: You can view the new model by clicking the twist arrow to the left of the current model. The new model name includes the number of the branch that contains the model change, for example, Validate Churn (M-7012).
Figure: A candidate model in a prediction
- Wait a few seconds for the analysis of the new model to complete. If necessary, refresh the page. The analysis ends in one of the following statuses:
  - CONFIGURATION FAILED
  - VALIDATION FAILED
  - READY FOR REVIEW
  If the configuration of artifacts or the validation of predictors fails, open the model by clicking its name, and read the error message in the Model comparison window. Resolve the error and resume the analysis, or reject the model. For more information, see Model update statuses and notifications.
  When the configuration and validation are successful, the model is ready for review.
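If you replace the model with a cloud-based model from Amazon SageMaker, you might want to confirm that the endpoint returns scores before selecting it in Prediction Studio. The following Python sketch calls an endpoint through boto3; the endpoint name, region, and CSV payload layout are illustrative assumptions, not values from this procedure.

```python
import boto3

# The endpoint name, region, and payload layout are illustrative assumptions;
# use the endpoint that backs the model you select in Prediction Studio.
runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

# One scoring row, in the same predictor order that the model was trained on.
payload = "34,2,89.90,1"

response = runtime.invoke_endpoint(
    EndpointName="churn-model-endpoint",
    ContentType="text/csv",
    Body=payload,
)
print(response["Body"].read().decode("utf-8"))  # for example, a churn propensity score
```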
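If you selected the Compare the models field, the validation data set that you upload needs the predictor columns plus an outcome column that holds the response labels. A minimal sketch, assuming pandas and illustrative column names and labels:

```python
import pandas as pd

# Historical interactions with known outcomes; column names and labels are illustrative.
history = pd.DataFrame(
    {
        "Age": [34, 51, 28],
        "TenureMonths": [2, 40, 7],
        "MonthlyCharges": [89.90, 45.10, 70.35],
        "Churn": ["Churned", "Loyal", "Churned"],  # outcome column with response labels
    }
)

# Write the validation data set that you upload in the Replace model dialog box.
history.to_csv("churn_validation.csv", index=False)
```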