Connecting to external predictive models

Updated on July 5, 2022

You can run your custom artificial intelligence (AI) and machine learning (ML) models externally in third-party machine learning services. By connecting to models in Google AI Platform and Amazon SageMaker, you can use these custom predictive models in your decision strategies.

For the list of models that are supported in Pega Platform and for more information, see Using machine learning services.

Before you begin: Define your model and the machine learning service connection:
  1. In a third-party cloud ML service of your choice, create an ML model. (A connectivity-check sketch follows this list.)
  2. In Dev Studio, connect to your cloud service instance by creating an authentication profile.
    • For a Google AI Platform service connection, create an OAuth 2.0 authentication profile.
    • For an Amazon SageMaker service connection, create an Amazon Web Services (AWS) authentication profile.

    For more information, see Creating an authentication profile.

  3. In Prediction Studio, define your ML service.

    For more information, see Configuring a machine learning service connection.
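
Before starting the procedure below, you may want to confirm that the external model is reachable from outside Pega Platform. The following is a minimal sketch for an Amazon SageMaker endpoint, assuming the AWS SDK for Python (boto3); the region, endpoint name, and CSV payload are placeholders that you would replace with the details of your own model.

    import boto3

    # Placeholder region and endpoint name; substitute your own values.
    runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

    response = runtime.invoke_endpoint(
        EndpointName="churn-model-endpoint",  # hypothetical endpoint name
        ContentType="text/csv",
        Body="F,42",                          # one record: GENDER, AGE
    )
    print(response["Body"].read().decode("utf-8"))

If the call returns a prediction, the model is reachable and ready to be registered in Prediction Studio.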

  1. In the navigation pane of Prediction Studio, click Models.
  2. In the header of the Models work area, click New > Predictive model.
  3. In the New predictive model dialog box, enter a Name for your model.
  4. In the Create model section, click Select external model.
  5. In the Machine learning service list, select the ML service from which you want to run the model.
    Pega Platform currently supports Google AI Platform and Amazon SageMaker models.
  6. In the Model list, select the model that you want to run.
    The list contains all the models that are accessible through the authentication profile that is mapped to the selected service.
  7. In the Upload model metadata section, upload the model metadata file with input mapping and outcome categories for the model:
    1. Download the template for the model metadata file in JSON format by clicking Download template.
    2. On your device, open the template model metadata file that you downloaded and define the mapping of input fields to Pega Platform.
      For example: To predict if a customer is likely to churn, define the mapping of input fields as follows:
      {
          "predictMethodUsesNameValuePair": false,
          "predictorList": [
              {
                  "name": "GENDER",
                  "type": "CATEGORICAL"
              },
              {
                  "name": "AGE",
                  "type": "NUMERIC"
              }
          ],
          "model": {
              "objective": "Churn",
              "outcomeType": "BINARY",
              "expectedPerformance": 70,
              "framework": "SCIKIT_LEARN",
              "modelingTechnique": "Tree model",
              "outcomes": {
                  "range": [],
                  "values": ["yes", "no"]
              }
          }
      }
      For information about the JSON file fields and the available values, see Metadata file specification for predictive models.
    3. Save the model metadata file. (A validation sketch follows this procedure.)
    4. In Prediction Studio, click Choose file, and then double-click the model metadata file.
  8. In the Context section, specify where you want to save the model:
    1. In the Apply to field, press the Down arrow key, and then click the class in which you want to save the model.
    2. Define the class context by selecting the appropriate values in the Development branch, Add to ruleset, and Ruleset version lists.
  9. Verify the settings, and then click Next.
  10. In the Outcome definition section, define what you want the model to predict.
    Enter a meaningful value, for example, Customer Churn.
    Note: To capture responses for the model, the model objective label that you specify should match the value of the .pyPrediction parameter in the response strategy (applies to all model types).
  11. In the Predicting list, select the model type:
    • For binary outcome models, select Two categories, and then specify the categories that you want to predict.

      Binary outcome models are models for which the predicted outcome is one of two possible outcome categories, for example, Churn or Loyal.

    • For categorical outcome models, select More than two categories, and then specify the categories that you want to predict.

      Categorical outcome models are models for which the predicted outcome is one of more than two possible outcome categories, for example, Red, Green, or Blue.

    • For continuous outcome models, select A continuous value, and then enter the value range that you want to predict.

      Continuous outcome models are models for which the predicted outcome is a value between a minimum and maximum value, for example, between 1 and 99.

  12. In the Expected performance field, enter a value that represents the expected predictive performance of the model (a measurement sketch follows this procedure):
    • For binary models, enter an expected area under the curve (AUC) value between 50 and 100.
    • For categorical models, enter an expected F-score performance value between 0 and 100.
    • For continuous models, enter an expected root-mean-square error (RMSE) value between 0 and 100.
    For more information about performance measurement metrics, see Metrics for measuring predictive performance.
  13. Review the model settings, and then click Create.
    Result: Your custom model is now available in Prediction Studio.
  14. Click Save.
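
The metadata file from step 7 must follow the structure shown in the example above. The following Python sketch is one way to sanity-check the file before uploading it; the file name is a placeholder, and the predictor and outcome type names are taken from the churn example in this procedure, so consult Metadata file specification for predictive models for the authoritative list of values.

    import json

    # "model_metadata.json" is a placeholder path for the file from step 7.
    with open("model_metadata.json") as f:
        meta = json.load(f)

    # Every predictor needs a name and a type such as CATEGORICAL or NUMERIC.
    for predictor in meta["predictorList"]:
        assert predictor.get("name"), "each predictor needs a name"
        assert predictor.get("type") in ("CATEGORICAL", "NUMERIC"), \
            "unexpected predictor type: %s" % predictor

    # For a binary model such as the churn example, exactly two outcome
    # values are expected.
    model = meta["model"]
    if model.get("outcomeType") == "BINARY":
        assert len(model["outcomes"]["values"]) == 2, \
            "a binary model needs exactly two outcome values"

    print("Metadata file looks structurally sound.")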
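For the Expected performance value in step 12, you can measure the model on a labeled holdout set before entering the number. The following is a minimal sketch for a binary model, assuming scikit-learn and placeholder holdout data; categorical models would use an F-score calculation and continuous models a root-mean-square error calculation instead.

    from sklearn.metrics import roc_auc_score

    # Placeholder holdout data: true churn labels and the model's scores.
    y_true = [1, 0, 1, 1, 0, 0, 1, 0]
    y_score = [0.81, 0.24, 0.66, 0.92, 0.38, 0.11, 0.57, 0.45]

    # The Expected performance field for binary models takes an AUC on a
    # 50-100 scale, so multiply the 0-1 score by 100.
    print(round(100 * roc_auc_score(y_true, y_score), 1))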
What to do next: Include your model in a strategy. For more information about strategies, see About Strategy rules.
