Integrating topic models FAQ
To learn more about integrating custom models for topic detection with Pega Platform, refer to the following frequently asked questions.
Is an authentication profile mandatory?
No. Configuration of an authentication profile is not enforced, but it is strongly recommended that you secure the connection to your custom model with an authentication mechanism.
Which authentication mechanisms are supported?
Pega Platform supports OAuth 2.0 for custom model integration.
Which grant types are supported for OAuth 2.0?
Client credentials and password credentials are supported.
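As an illustration only, the following Python sketch shows how an external client could obtain an access token with the client credentials grant before calling a custom model endpoint. The token URL, client ID, and client secret are hypothetical placeholders, not values supplied by Pega Platform.

```python
# Minimal sketch of an OAuth 2.0 client credentials token request.
# The URL and credentials below are placeholders for illustration only.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"   # hypothetical authorization server
CLIENT_ID = "my-client-id"                            # placeholder
CLIENT_SECRET = "my-client-secret"                    # placeholder

def get_access_token() -> str:
    """Request a bearer token using the client credentials grant."""
    response = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(CLIENT_ID, CLIENT_SECRET),
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["access_token"]
```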
What is the service discovery (Open API) connection type?
Service discovery is an option for configuring a machine learning service connection in Prediction Studio by using an exposed Swagger endpoint that conforms to the OpenAPI specification. Pega Platform supports the OpenAPI specification versions 2.x.x and 3.x.x.
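For illustration, the sketch below shows the kind of information that service discovery relies on: fetching an exposed OpenAPI (Swagger) document and reading its version field and operation paths. The discovery URL is a hypothetical assumption, and the check is a simplification of what Prediction Studio does when it reads the endpoint.

```python
# Sketch: fetch an OpenAPI (Swagger) document and report its specification version.
# The discovery URL is a placeholder; this only illustrates what a conforming
# endpoint exposes, not how Prediction Studio consumes it.
import requests

DISCOVERY_URL = "https://ml.example.com/openapi.json"  # hypothetical Swagger endpoint

spec = requests.get(DISCOVERY_URL, timeout=10).json()
# OpenAPI 3.x documents carry an "openapi" field; 2.x (Swagger) documents carry "swagger".
version = spec.get("openapi") or spec.get("swagger")
print(f"Specification version: {version}")
for path in spec.get("paths", {}):
    print(f"Available operation path: {path}")
```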
What is the standalone API connection type?
Standalone API is an option for configuring a machine learning service connection in Prediction Studio. By selecting this option, you can provide the details of the prediction endpoint manually if a service discovery endpoint is not available.
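The following Python sketch illustrates what a direct call to a manually configured prediction endpoint might look like from an external client. The endpoint URL, request payload, and response shape are assumptions for illustration, not the actual contract your model must follow.

```python
# Sketch: call a prediction endpoint directly when no discovery document is available.
# The endpoint URL, request payload, and response shape are hypothetical.
import requests

PREDICTION_URL = "https://ml.example.com/v1/topics/predict"  # hypothetical endpoint

def predict_topics(text: str, access_token: str) -> dict:
    """Send text to the custom model and return its raw JSON response."""
    response = requests.post(
        PREDICTION_URL,
        json={"text": text},
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

# Example usage: raw_output = predict_topics("I want to close my account", get_access_token())
```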
What is output mapping?
Output mapping converts the JSON output from a prediction endpoint into an output format that Pega Platform supports. The mapping gets its value from a data transform that performs this conversion. For more information, see Configuring a data transform for a JSON output mapping.
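In Pega Platform this conversion is performed by a data transform, but the Python sketch below illustrates the same idea: reshaping a hypothetical model response into a flat list of topic names and confidence scores. Both the input and output structures here are assumptions chosen for illustration, not the actual formats required by the platform.

```python
# Sketch of the kind of mapping a data transform performs: reshaping a
# hypothetical model response into a flat list of topics with confidence scores.
def map_model_output(raw_output: dict) -> list[dict]:
    """Convert a hypothetical prediction response into topic/confidence pairs."""
    return [
        {"topic": item["label"], "confidence": float(item["score"])}
        for item in raw_output.get("predictions", [])
    ]

# Example input (assumed shape):
#   {"predictions": [{"label": "Account closure", "score": 0.92}]}
# Example output:
#   [{"topic": "Account closure", "confidence": 0.92}]
```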