
Resolved Issues

View the resolved issues for a specific Platform release.


Please note: beginning with the Pega Platform 8.7.4 Patch, the Resolved Issues have moved to the Support Center.

INC-170423 · Issue 648982

Added catch for SAML WebSSO duplicate key exception

Resolved in Pega Version 8.4.5

After logging in from SSO, closing the Pega window and opening it again resulted in the error "Unable to process the SAML WebSSO request : Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object." This has been resolved by updating the session index handling and adding a catch for the duplicate key exception.
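
A minimal sketch of the pattern described above, purely for illustration (this is not Pega source code; the store and logger names are hypothetical):

    // Treat a duplicate session-index insert as benign instead of failing the WebSSO request.
    try {
        sessionIndexStore.insert(sessionIndex);                 // hypothetical persistence call
    } catch (SQLIntegrityConstraintViolationException dup) {    // java.sql duplicate-key case
        // The row already exists because the window was reopened after an earlier SSO login;
        // continue with the existing session index rather than surfacing the PRIMARY KEY violation.
        logger.debug("SAML session index already recorded; continuing", dup);
    }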

INC-171838 · Issue 651438

Added mail/telephone link to allowed CSP child frame

Resolved in Pega Version 8.4.5

After upgrading from v8.3 to v8.5, clicking the mail/telephone link in the out-of-the-box case participants gadget generated the Content Security Policy error "This content is blocked. Contact the site owner to fix the issue." The browser dev-tools console showed that the object was refused framing because it violated the Content Security Policy directive "frame-src *". This behavior was specific to the Google Chrome browser, and has been resolved by adding code so that mailto: and tel: are added to the frame-src directive when Data is selected under the Child Frame-Source option. Unchecking the Data checkbox for Child Frame-Source on the policy landing page removes these from the allowed actions under CSP.
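
For illustration only, the effective policy after the fix might include the additional schemes in its frame-src directive, along the lines of the servlet-style sketch below (the exact directive Pega generates depends on the options selected on the CSP landing page):

    // Hedged example: emitting a Content-Security-Policy header whose frame-src
    // directive allows data:, mailto:, and tel: child frame sources.
    // 'response' is a javax.servlet.http.HttpServletResponse.
    response.setHeader("Content-Security-Policy",
            "frame-src 'self' data: mailto: tel:");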

INC-171875 · Issue 653894

Skip restored for browser request CSRF token

Resolved in Pega Version 8.4.5

Many SECU0008 alerts were seen in the production logs. This was the result of a CSRF token check being performed on requests without pyActivity or pyStream, and has been resolved by restoring a conditional skip of the check, since such browser requests do not contain a CSRF token.
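
A rough sketch of the restored conditional skip, for illustration only (the parameter checks mirror the description above; the validation helper is hypothetical):

    // Only enforce the CSRF token check when the request actually targets an
    // activity or stream; other browser requests carry no CSRF token, and checking
    // them produced the spurious SECU0008 alerts.
    boolean targetsActivity = request.getParameter("pyActivity") != null;
    boolean targetsStream   = request.getParameter("pyStream") != null;
    if (targetsActivity || targetsStream) {
        validateCsrfToken(request);   // hypothetical helper representing the token check
    }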

INC-227878 · Issue 727855

Update impact for Pega Call

Resolved in Pega Version 8.7.3

Log4j-1.2.14.jar and Log4j-1.2.17.jar have been removed to address the security concerns with these versions, and the logger jars have been upgraded from version 12.7.1 to version 12.7.2 to keep Pega Call compatible. This change will impact Pega Call customer environments because the Avaya and Genesys components of Pega Call have an internal dependency on Log4j 1.x jars. As a result, SDK logging for Avaya or Genesys will not be available in the 8.7.3 release unless the Log4j 1.x jar files are reimported locally.

INC-202111 · Issue 710106

Logging extended for PRPCPropertyInfoProvider

Resolved in Pega Version 8.7.3

In order to assist with diagnosing issues with Kafka and JSON, additional logging has been added for PRPCPropertyInfoProvider.

INC-208976 · Issue 719165

Enhanced SSA metrics made available

Resolved in Pega Version 8.7.3

In order to better diagnose delays between the time a Campaign is scheduled to start and the time the Dataflow actually starts to run, an update has been made to generate detailed metrics covering some of the key performance-intensive areas of strategy execution. Additional lower-level internal metrics related to SSA engine execution have also been made available by way of a DSS to collect more runtime insight for diagnosis. To enable the collection of these Level 2 SSA internal metrics, set the dataflow/shape/strategy/detailed_metrics/level2 DSS in the Pega-DecisionEngine ruleset to 'true'. A comprehensive set of enhanced metrics will be available in Pega 8.8.
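
For example, the setting could be represented as follows (the helper below is hypothetical; in practice the value is configured as a Dynamic System Settings record in Dev Studio rather than in code):

    // Dynamic System Setting that enables Level 2 SSA internal metrics.
    setDynamicSystemSetting(
            "Pega-DecisionEngine",                               // owning ruleset
            "dataflow/shape/strategy/detailed_metrics/level2",   // setting purpose
            "true");                                             // collect detailed metrics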

INC-217290 · Issue 721375

Added support for creating predictive models in Production

Resolved in Pega Version 8.7.3

While creating a new predictive model rule in Prediction Studio, the case went into a broken process after the template was selected, with the error message "Error loading D_ProjectList , Reason : No databases defined in properties file:/databases.properties". This was an unexpected use case for creating models at the Production level, and has been resolved by updating the flows to turn off draft mode in this scenario.

INC-218145 · Issue 715678

DSS introduced to control DSM clipboard page serialization

Resolved in Pega Version 8.7.3

When using a Kafka data set to consume a message from an external topic that had an attribute name with a special character contained in a page list structure, using a JSON data transform for the mapping in a real-time dataflow resulted in the error "Exception in stage: KafkaDS; LegacyModelAspectInvokableRuleContainer.invoke-Exception encountered a :java.lang.UnsupportedOperationException." To resolve this, a new DSS, dataset/CLASS_NAME/DATASET_NAME/JSONDataTransform/deserialization/useDSMPage, has been introduced. When the value is set to true, the process follows the previous behavior of generating DSM clipboard pages when Kafka records are deserialized using the JSON data transform. When the value is set to false, the JSON data transform generates regular clipboard pages and converts them to DSM clipboard pages later. This avoids errors when a JSON data transform calls methods from the Clipboard API that are not implemented by DSM pages. This DSS is set per data set instance; CLASS_NAME and DATASET_NAME are placeholders that should be replaced with the data set's pyClassName and pyPurpose property values. In addition, a similar DSS, dataset/CLASS_NAME/DATASET_NAME/JSONDataTransform/serialization/useDSMPage, has been introduced for serialization.
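
As a concrete illustration of the placeholder substitution (the class name and purpose below are made-up example values, not rule names from the product):

    // Build the per-data-set DSS keys from the data set's pyClassName and pyPurpose.
    String className   = "MyOrg-Data-Customer";   // example pyClassName
    String datasetName = "CustomerStream";        // example pyPurpose
    String deserializationDss = "dataset/" + className + "/" + datasetName
            + "/JSONDataTransform/deserialization/useDSMPage";
    String serializationDss   = "dataset/" + className + "/" + datasetName
            + "/JSONDataTransform/serialization/useDSMPage";
    // Set the deserialization DSS to "false" to have the JSON data transform build
    // regular clipboard pages first and convert them to DSM pages afterwards;
    // "true" keeps the previous behavior of generating DSM pages directly.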

INC-218172 · Issue 716398

Text analytics character limit set to avoid memory issues

Resolved in Pega Version 8.7.3

Utility nodes were unstable with regard to search, and email listener threads became stuck during Rule-based Text Annotation (RUTA) and natural language processing (NLP) work on incoming emails. This happened when the system experienced high or excessive memory consumption while using text analytics. This has been resolved by setting the default maximum character limit for NLP analysis to 25,000 characters to avoid RUTA memory issues. If text longer than 25,000 characters is provided, the system will consider only the first 25,000 characters, and a flag will appear on NLPOutcome to indicate the text has been limited. This character limit is configurable, but if the configuration is set in excess of 25,000, a warning will be shown before the change is saved.
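
The limiting behavior can be pictured with the following sketch (illustrative only; the flag name placed on NLPOutcome is hypothetical, and the outcome is modeled here as a java.util.Map):

    // Cap NLP input at 25,000 characters and record that the text was limited.
    static final int MAX_NLP_CHARS = 25000;

    String limitForNlp(String text, Map<String, Object> nlpOutcome) {
        if (text.length() > MAX_NLP_CHARS) {
            nlpOutcome.put("textLimited", true);   // hypothetical flag indicating truncation
            return text.substring(0, MAX_NLP_CHARS);
        }
        return text;
    }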

INC-223376 · Issue 723575

JMX authentication enabled by default for embedded Kafka and Cassandra

Resolved in Pega Version 8.7.3

For on-premises clients, a potential vulnerability for a Remote Code Execution using the JMX interface on Cassandra and Kafka using exposed network ports has been mitigated by enabling JMX authentication by default for embedded Kafka and Cassandra.
