Creating a Kafka configuration instance

Updated on April 6, 2022

To manage connections to your Apache Kafka server or cluster of servers that is the source of your application stream data, configure a Kafka configuration instance in the Pega Platform Data-Admin-Kafka class.

With this configuration instance, you can then create one or more Kafka data sets in your application to stream data in real time. This instance ensures that you have stable, reliable connections to your Kafka server or cluster of servers and the specific topics that are part of your Kafka cluster.

Pega Platform supports Apache Kafka cluster version 0.10.0.1 or later.

Depending on the type of your deployment, the following authentication methods are supported:

Authentication method              | Pega Cloud services deployments | Deployments on premises
SSL                                | Available                       | Available
SASL using a username and password | Available                       | Available
SASL using Kerberos                | Not available                   | Available
Before you begin:

In systems deployed on premises, Pega supports configuring SASL authentication between Pega Platform and the Kafka cluster using a JAAS configuration file. To configure SASL authentication, perform the following steps:

  1. In the Kafka cluster, configure the Kafka Client credentials in the JAAS configuration file to enable either simple authentication (using a username and password) or Kerberos authentication.
  2. Pass the location of the JAAS configuration file as a JVM parameter in the Kafka cluster, for example: -Djava.security.auth.login.config=<path_to_JAAS_file>

For more information about configuring the JAAS configuration file, see the Apache Kafka documentation.
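For reference, a minimal KafkaClient entry in such a JAAS configuration file might look like one of the following sketches. The file name, credentials, keytab path, and principal are placeholders rather than values from this documentation, so substitute the values for your environment and keep only the entry that matches your authentication type.

Username and password (PLAIN) authentication, for example in a file named kafka_client_jaas.conf:

    KafkaClient {
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="kafka-user"
        password="kafka-password";
    };

Kerberos authentication (hypothetical keytab path and principal):

    KafkaClient {
        com.sun.security.auth.module.Krb5LoginModule required
        useKeyTab=true
        storeKey=true
        keyTab="/etc/security/keytabs/kafka_client.keytab"
        principal="kafka-client@EXAMPLE.COM";
    };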

To create your Kafka configuration instance, perform the following steps:

  1. In the header of Dev Studio, click Create > SysAdmin > Kafka.
  2. On the New tab, enter identifying information for this rule:
    1. In the Short description field, enter a brief description of the purpose of this rule.
    2. In the Kafka field, enter an appropriate name for this Kafka service connection, for example, Kafka-service-1.
    3. Click Create and open.
  3. In the Details section, configure a host and port combination to connect to the Kafka cluster:
    1. In the Host field, enter the address of the Kafka cluster.
    2. In the Port field, enter the port number.
    3. Optional: Click Add host to configure additional host and port combinations.
    Note: Pega Platform discovers all nodes in the cluster during the first connection, so entering a single host and port combination is enough to connect to a Kafka cluster. However, we recommend providing all available host and port combinations.
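    For reference, these host and port combinations correspond to what Kafka clients call the bootstrap server list. For a hypothetical three-broker cluster you would add three entries, such as kafka-broker-1.example.com, kafka-broker-2.example.com, and kafka-broker-3.example.com, each with port 9092, which is equivalent to the following Kafka client setting:
        bootstrap.servers=kafka-broker-1.example.com:9092,kafka-broker-2.example.com:9092,kafka-broker-3.example.com:9092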
  4. Configure an authentication method for this Kafka server connection:
    To use SSL-based authentication:
    1. In the Security settings section, select the Use SSL configuration check box.
    2. In the Truststore field, press the Down Arrow key and select a truststore file that contains a Kafka certificate, or create a truststore record by clicking the Open icon. For one way to build such a truststore file, see the keytool sketch at the end of this step.
    3. Select Use client certificate, and then enter the Pega Platform private key and the private key password in the Keystore and Key password fields, respectively.
    To use SASL-based authentication:
    1. In the Authentication section, select Use authentication.
    2. Select the authentication type:
      • To enable authentication using login credentials, select Username and password, and then enter the login credentials.
      • To enable authentication using Kerberos, select Kerberos, and then enter the Kerberos authentication key.
      Note: Authentication using Kerberos is only supported in on-premises systems.
      Tip: If you see the message No JAAS configuration file set, SASL authentication between Pega Platform and the Kafka cluster is not configured. For configuration steps, see the Before you begin section of this procedure.
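    If you do not already have a truststore file for the SSL-based option above, one common way to create one (outside of Pega Platform) is with the JDK keytool utility. The following is only a sketch with placeholder file names and passwords; it assumes you have exported the Kafka broker's CA certificate as kafka-ca.crt:
        keytool -importcert -alias kafkaCA -file kafka-ca.crt -keystore kafka-client.truststore.jks -storepass changeit -noprompt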
  5. Optional: To upload a client properties file containing the properties that you want to use to establish the connection with the Kafka cluster, in the Advanced configuration section, click Upload client properties.
    Note: The client properties file can contain the following properties: metadata.max.age.ms, send.buffer.bytes, receive.buffer.bytes, client.id, reconnect.backoff.ms, reconnect.backoff.max.ms, retries, retry.backoff.ms, metrics.sample.window.ms, metrics.num.samples, metrics.recording.level, metric.reporters, security.protocol, connections.max.idle.ms, request.timeout.ms, ssl.protocol, ssl.provider, ssl.cipher.suites, ssl.enabled.protocols, ssl.keystore.type, ssl.keystore.location, ssl.keystore.password, ssl.key.password, ssl.truststore.type, ssl.truststore.location, ssl.truststore.password, ssl.keymanager.algorithm, ssl.trustmanager.algorithm, ssl.endpoint.identification.algorithm, ssl.secure.random.implementation, sasl.mechanism, sasl.jaas.config, sasl.kerberos.service.name, sasl.kerberos.kinit.cmd, sasl.kerberos.ticket.renew.window.factor, sasl.kerberos.ticket.renew.jitter, sasl.kerberos.min.time.before.relogin
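    For illustration, a small client properties file that uses only properties from the list above might look like the following sketch; every value here is a placeholder for your environment, not a recommended setting:
        # client.properties (hypothetical values)
        security.protocol=SASL_SSL
        sasl.mechanism=PLAIN
        client.id=pega-stream-client
        request.timeout.ms=30000
        ssl.truststore.location=/opt/certs/kafka-client.truststore.jks
        ssl.truststore.password=changeit
        ssl.endpoint.identification.algorithm=https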
  6. Click Test connectivity to test the connection between Pega Platform and the Kafka cluster.
  7. If the connectivity test succeeds, click Save.
