Real-time data can be critical to delivering the most accurate and targeted offers or next best actions to customers. Pega provides multiple real-time data integration options, giving you flexibility in your design and implementation.
Data streams are the recommended solution for implementing real-time data. Pega supports both Kafka and Kinesis data streams. For details on how to implement a Kafka or Kinesis data stream in Pega Customer Decision Hub for Financial Services, see Creating a Kafka data set and Creating a Kinesis data set.
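To make the pattern concrete, the sketch below builds a JSON event record of the kind you might publish to the topic backing a Kafka data set. The field names, topic, and broker address are assumptions for illustration, not a Pega schema.

```python
import json
import time

def make_event(customer_id, event_type, payload):
    """Build a JSON-encoded event for a real-time data stream.
    Field names are illustrative, not a prescribed Pega schema."""
    return json.dumps({
        "customerId": customer_id,
        "eventType": event_type,
        "timestamp": int(time.time() * 1000),
        **payload,
    }).encode("utf-8")

# With the kafka-python package installed and a broker available, the
# event could be published to the topic behind the Kafka data set, e.g.:
#   from kafka import KafkaProducer
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   producer.send("cdh-realtime-events", make_event("C-1001", "CardSwipe", {"amount": 42.50}))
event = make_event("C-1001", "CardSwipe", {"amount": 42.50})
```

The same serialized record shape works for a Kinesis data set; only the producer client changes.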
Connecting directly through the database
If you have an existing data repository or decision management data store in a relational database, you can connect to it directly by defining a database rule in Pega. Create one class for each table that you need to pull data from, and map each class to its table with a database table rule. You can then pull and load the data through report definitions or data flows. While data pages also support this functionality, they are not recommended here due to performance concerns.
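The class-per-table convention can be illustrated outside Pega with a minimal sketch, here using an in-memory SQLite table and a matching class. In Pega itself the mapping is declarative (a database table rule), not code; the table and field names below are assumptions.

```python
import sqlite3
from dataclasses import dataclass

# Illustrative stand-in for an external repository table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (account_id TEXT, balance REAL)")
conn.executemany("INSERT INTO account VALUES (?, ?)",
                 [("A-1", 2500.0), ("A-2", 90.5)])

@dataclass
class Account:  # one class per table, mirroring the Pega convention
    account_id: str
    balance: float

# Analogous to pulling records through a report definition or data flow.
accounts = [Account(*row) for row in
            conn.execute("SELECT account_id, balance FROM account")]
```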
Connecting through services
Pega Customer Decision Hub for Financial Services can read real-time data directly through services. The recommended approach is to push your data to a regular data stream through a REST service or a WebSocket connection.
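A push through a REST service might be assembled as below. The endpoint URL and payload fields are hypothetical; the actual contract depends on how the REST service for your data stream is configured.

```python
import json
import urllib.request

# Hypothetical endpoint for illustration only.
STREAM_URL = "https://pega.example.com/prweb/api/stream/v1/events"

def build_request(event: dict) -> urllib.request.Request:
    """Prepare an HTTP POST that pushes one event to the data stream."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        STREAM_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request({"customerId": "C-1001", "channel": "Web",
                     "action": "PageView"})
# urllib.request.urlopen(req) would send it; omitted here because the
# endpoint above is illustrative.
```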
Alternatively, you can extend the real-time containers to pass additional real-time data every time the REST service for the container is invoked.
Avoid invoking REST or SOAP services to fetch data during decision processing, as this can lead to performance challenges. Instead, update your decision records each time a change occurs, so that your decision records are always up to date.
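The push-on-change pattern can be sketched as follows: upstream change events update a local record store, and the decision logic reads that already-fresh data instead of making a remote call at decision time. All names and the decision rule are illustrative assumptions.

```python
# Local store kept current by change events (illustrative).
decision_records: dict = {}

def on_change_event(customer_id: str, changes: dict) -> None:
    """Apply an upstream change so the record is fresh before any decision."""
    decision_records.setdefault(customer_id, {}).update(changes)

def decide(customer_id: str) -> str:
    # Reads local data only; no REST/SOAP call in the decision path.
    record = decision_records.get(customer_id, {})
    return "PremiumOffer" if record.get("balance", 0) > 10000 else "StandardOffer"

on_change_event("C-1001", {"balance": 15000})
offer = decide("C-1001")
```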
Real-time data flows
Real-time streaming services and connections feed real-time data flows, which you can use to process behavioral data. For more information about how to configure a real-time data flow, see Configuring the Data Flow service.
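Conceptually, a real-time data flow applies a series of shapes (filter, enrich, emit) to each streamed event as it arrives. The generator below is a minimal stand-in for that pipeline; the event fields and stages are assumptions for illustration.

```python
def data_flow(events):
    """Toy stand-in for a real-time data flow over behavioral events."""
    for event in events:
        if event.get("eventType") != "PageView":  # filter shape
            continue
        # enrich shape: derive a segment from dwell time (illustrative rule)
        event["segment"] = ("engaged" if event.get("dwellSeconds", 0) > 30
                            else "browsing")
        yield event  # emit to the next destination

stream = [
    {"customerId": "C-1", "eventType": "PageView", "dwellSeconds": 45},
    {"customerId": "C-2", "eventType": "Heartbeat"},
]
results = list(data_flow(stream))
```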