Ensure that Prediction Studio and your Business Change pipeline are configured to support the model update feature. For a list of prerequisites, see Updating active models in predictions with MLOps.
Plan your changes by checking which model types you can use to replace your current models, and which mode you can use to deploy the new models (shadow mode, replace, direct replace). For more information, see Active and candidate model type combinations.
- Log in to your application:
  - Systems with Pega Customer Decision Hub:
    - Log in as a data scientist to the Business Operations Environment (BOE).
    - In the header of Dev Studio, click the name of your current application, and then click Switch Application.
    - From the list, select the overlay application.
  - Other systems:
    - Log in as a data scientist to a non-production instance of your application.
- Switch the portal to Prediction Studio.
- In the navigation pane of Prediction Studio, click Predictions.
- From the list of predictions, open a prediction that you want to change, and then click the Models tab.
- To the right of the model that you want to change, click the More icon, and then click Replace model.
- In the Replace model dialog box, choose the entity with which you want to replace the active model:
  - Replace with a machine learning model:
    - Click Model, and then go to step 7.
  - Replace with a scorecard:
    - Select Scorecard, and then click Next.
    - Select the scorecard to directly replace the current model, and then click Replace.
    Result: The scorecard replaces the current model. This is the end of the procedure.
  - Replace with a field in the data model:
    - Select Field, and then click Next.
    - Select a field that you want to use to predict outcomes, and then click Replace.
    Result: The field replaces the current model. This is the end of the procedure.
- Optional: If you want to use a validation data set to compare the active model and the candidate model, in the Replace model dialog box, select the Compare the models field. If you clear the Compare the models field, the Replace model window does not prompt you to select a data set for the comparison; you can add a validation data set later, when you evaluate the model.
- Select the source of the new model to replace the current model:
  - Use a file-based model:
    - Click Upload.
    - In the Select a PMML, H2O MOJO or Pega OXL file section, add a model to replace the current model.
  - Use a cloud-based model:
    - Click Machine learning service.
    - In the Machine learning service field, select the machine learning service from which you want to run the model. Pega Platform currently supports Google AI Platform and Amazon SageMaker models.
    - In the Model field, select a model to replace the current model.
    - Optional: In the Upload model metadata file section, add a metadata file with input mapping and outcome categories for the model. You can create a metadata file by clicking Download template and mapping the JSON input fields to Pega Platform fields in the template. For information about the metadata file properties and the available values, see Metadata file specification for predictive models.
  - Choose a model from Pega Platform:
    - Click Model list.
    - Select a model to replace the current model.
    - Optional: In the Add model documentation (optional) section, upload a documentation file for the model.
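For the file-based option, the uploaded file must be a complete, self-describing model. As a point of reference, the following is a minimal sketch of a PMML file, the most common of the supported formats. It defines a single-predictor regression model; all field names, coefficients, and the model itself are hypothetical, and your file's structure depends on the model type you export from your data science tool.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<PMML version="4.3" xmlns="http://www.dmg.org/PMML-4_3">
  <!-- Hypothetical example: a one-predictor linear regression -->
  <Header description="Example churn model (illustrative only)"/>
  <DataDictionary numberOfFields="2">
    <DataField name="Age" optype="continuous" dataType="double"/>
    <DataField name="Churn" optype="continuous" dataType="double"/>
  </DataDictionary>
  <RegressionModel modelName="ChurnExample" functionName="regression">
    <MiningSchema>
      <MiningField name="Age"/>
      <MiningField name="Churn" usageType="target"/>
    </MiningSchema>
    <RegressionTable intercept="0.1">
      <NumericPredictor name="Age" coefficient="0.004"/>
    </RegressionTable>
  </RegressionModel>
</PMML>
```

The predictor names in the DataDictionary are what the validation step later maps against the fields of your prediction, so they must match the inputs that the active model uses.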
- Click Next.
- If you selected the Compare the models field in step 7, add a validation data set and the outcome column to compare the two models, and then click Next. The outcome column contains the response labels for the outcome.
- In the Summary section, in the Model name field, enter a name for the new model, and then save your changes.
  Result: You can view the new model by expanding the twist arrow to the left of the current model. The new model name includes the number of the branch that contains the model change, for example, Validate Churn (M-7012).
- Wait a few seconds for the analysis of the new model to complete. If necessary, refresh the page. The analysis ends in one of the following statuses:
  - CONFIGURATION FAILED
  - VALIDATION FAILED
  - READY FOR REVIEW
  If the configuration of artifacts or the validation of predictors fails, open the model by clicking its name, and read the error message in the Model comparison window. Resolve the error and resume the analysis, or reject the model. For more information, see Model update statuses and notifications.
  When the configuration and validation are successful, the model is ready for review.