INC-199679 · Issue 688738
Handling added to email encoding for ISO-8859-8-i charset
Resolved in Pega Version 8.7.1
After update, cases were intermittently not created from inbound email and processing became stuck. This has been resolved by adding handling that replaces the ISO-8859-8-i charset with ISO-8859-8 when encoding the mail content.
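A minimal sketch of the kind of substitution involved, assuming a hypothetical helper that resolves the MIME charset name before the mail body is decoded (the class, method, and fallback are illustrative, not the Pega implementation):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

// Hypothetical helper: the logical "ISO-8859-8-i" charset (Hebrew with implicit
// bidi ordering) is not a registered alias in most JVMs, so Charset.forName()
// would fail. Mapping it to the byte-identical ISO-8859-8 lets the body decode.
public final class MailCharsetResolver {

    static Charset resolve(String mimeCharsetName) {
        String name = mimeCharsetName == null ? "" : mimeCharsetName.trim();
        if (name.equalsIgnoreCase("ISO-8859-8-i")) {
            name = "ISO-8859-8";              // same byte layout, different bidi semantics
        }
        try {
            return Charset.forName(name);
        } catch (Exception e) {
            return StandardCharsets.UTF_8;    // illustrative fallback so processing does not stall
        }
    }

    public static void main(String[] args) {
        byte[] raw = {(byte) 0xF9, (byte) 0xEC, (byte) 0xE5, (byte) 0xED}; // Hebrew "shalom"
        System.out.println(new String(raw, resolve("ISO-8859-8-i")));
    }
}
```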
INC-201502 · Issue 696085
Parser updated for value list
Resolved in Pega Version 8.7.1
When attempting to process a JSON list using a File Data set that had a list attribute, the clipboard looked correct while performing the browse operation on the data set, but executing the data flow that referenced this data set resulted in the error "Expecting PageList and got String List". If this was changed to a text property, the browse operation on the data set failed, but the data flow worked without any issues and the values were copied into the text properties. This has been resolved by changing the parser for value lists.
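The distinction behind the error can be illustrated with a small Jackson sketch (the property names and parsing code are illustrative only): a JSON array of scalar values maps to a value list, while an array of objects maps to a page list, so the parser has to branch on the element type.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.ArrayList;
import java.util.List;

public class ValueListParseSketch {
    public static void main(String[] args) throws Exception {
        String json = "{\"tags\":[\"red\",\"green\"],\"items\":[{\"id\":1},{\"id\":2}]}";
        JsonNode root = new ObjectMapper().readTree(json);

        // Scalar elements belong in a value list of strings...
        List<String> valueList = new ArrayList<>();
        for (JsonNode element : root.get("tags")) {
            if (element.isValueNode()) {
                valueList.add(element.asText());
            }
        }
        System.out.println("value list: " + valueList);

        // ...whereas object elements correspond to a page list.
        System.out.println("items holds objects: " + root.get("items").get(0).isObject());
    }
}
```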
INC-201648 · Issue 696964
Removed services check and added warnings for simulations
Resolved in Pega Version 8.7.1
Attempting to run an audience simulation resulted in the error "Running simulations is not possible, because the required services are not available. Contact your system administrator to enable the data flow and real-time data grid services". Investigation showed the @DsmServices.pxHasFunctionalNodes("DataFlow","Batch") function call contained in the 'when' rule pyUnavailableDecisionServices was returning false even though all the nodes were in the cluster and all the DSM services were in NORMAL status. To resolve this, the services check has been disabled, and the simulation run will show a warning or fail if a data flow run is queued for more than 30 seconds or if there is an issue with querying the underlying metrics storage.
INC-201991 · Issue 692860
Explicit connectivity close added to Queue Manager error handling
Resolved in Pega Version 8.7.1
Performing a connectivity test on the MQ Connector page with an invalid queue name aborted the test, but the MQ connection was not closed. This can become an issue when using IBM MQaaS (MQ as a service), where only a limited number of connections are allowed. This has been resolved by adding an explicit Queue Manager close in the error handling block.
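A sketch of the corrected pattern using the IBM MQ classes for Java (the queue manager and queue names are placeholders, and this is not the Pega connector code): the queue manager is disconnected explicitly even when opening the queue fails.

```java
import com.ibm.mq.MQException;
import com.ibm.mq.MQQueue;
import com.ibm.mq.MQQueueManager;
import com.ibm.mq.constants.CMQC;

public class MqConnectivityTestSketch {

    // Opens the queue for inquiry only; an invalid queue name makes accessQueue()
    // throw, and the finally block still releases the connection, which matters
    // when the MQ service caps the number of concurrent connections.
    public static void testConnectivity(String queueManagerName, String queueName) throws MQException {
        MQQueueManager queueManager = new MQQueueManager(queueManagerName);
        try {
            MQQueue queue = queueManager.accessQueue(queueName, CMQC.MQOO_INQUIRE);
            queue.close();
        } finally {
            queueManager.disconnect();   // explicit close in the error path as well
        }
    }
}
```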
INC-202510 · Issue 695889
SOAP connector supports OAuth2 profile
Resolved in Pega Version 8.7.1
Support has been added for using an OAuth2 profile as one of the allowed profiles for the SOAP connector.
INC-204897 · Issue 696163
Log4j file security vulnerability issue addressed
Resolved in Pega Version 8.7.1
A zero-day vulnerability was identified in the Apache Log4j logging software which could potentially allow malicious actors to take control of organizational networks. Pega has immediately and thoroughly addressed this issue. More information can be found at https://docs.pega.com/security-advisory/security-advisory-apache-log4j-zero-day-vulnerability .
INC-180246 · Issue 699700
Support for apostrophe added to keyword tokenization
Resolved in Pega Version 8.7.1
A keyword containing an apostrophe was not detected properly in a text extraction model. This has been resolved by updating the annotator used for tokenization.
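An illustration of the intended behavior rather than Pega's actual annotator: a tokenizer whose pattern keeps apostrophes inside tokens, so keywords such as "don't" or "O'Brien's" survive as single tokens.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ApostropheTokenizerSketch {

    // Letters optionally joined by internal apostrophes, e.g. O'Brien's, don't.
    private static final Pattern TOKEN = Pattern.compile("\\p{L}+(?:'\\p{L}+)*");

    static List<String> tokenize(String text) {
        List<String> tokens = new ArrayList<>();
        Matcher matcher = TOKEN.matcher(text);
        while (matcher.find()) {
            tokens.add(matcher.group());
        }
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(tokenize("Please route O'Brien's claim, don't delay."));
        // prints: [Please, route, O'Brien's, claim, don't, delay]
    }
}
```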
INC-193399 · Issue 688115
DSS added to handle merges with lower versions of Postgres
Resolved in Pega Version 8.7.1
After update, executing a batch campaign with a volume constraint resulted in the second data flow DF_Wait failing with the error "ERROR: number of columns (1844) exceeds limit (1664)". This was due to the database set's change (in 8.5) to use the database layer's merge statement; prior to that, the logic used deletes and inserts. Depending on the version of Postgres, the SQL generated for a merge differs: the "INSERT … ON CONFLICT … UPDATE" syntax is generated for Postgres 9.5+ when there is a PK constraint defined for the database table, otherwise the complex UPSERT statement (old syntax) is generated, as was the case in this issue. There is a known issue in the Postgres server software where it misinterprets the number of columns involved in that UPSERT, mistakenly counting the columns twice, so the actual maximum number of columns allowed is only half of the official limit of 1664; the same UPSERT statement does not cause the "exceeds limit" exception if there are 832 or fewer columns in the statement. To resolve this, an option has been provided to select between the original logic (deletes and inserts) and the merge statement logic by way of the DSS decision/datasets/db/useMergeStatementForUpdates. Setting it to "true" uses the merge statement logic, and setting it to "false" uses deletes and inserts. When the DSS is not defined, the default is "true" and the system will use merge statements in the form preferred by Postgres 9.5+.
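For illustration only, the two strategies look roughly like the following JDBC sketch against a hypothetical two-column table; this is not the SQL the database set generates.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class UpsertStrategySketch {

    // Merge-style write: relies on Postgres 9.5+ "ON CONFLICT" and a primary key
    // constraint on customer_id (used when useMergeStatementForUpdates is true).
    static void mergeStyle(Connection con, String id, double score) throws SQLException {
        String sql = "INSERT INTO customer_scores (customer_id, score) VALUES (?, ?) "
                   + "ON CONFLICT (customer_id) DO UPDATE SET score = EXCLUDED.score";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, id);
            ps.setDouble(2, score);
            ps.executeUpdate();
        }
    }

    // Original delete-then-insert logic (used when the DSS is set to false).
    static void deleteThenInsert(Connection con, String id, double score) throws SQLException {
        try (PreparedStatement del = con.prepareStatement(
                 "DELETE FROM customer_scores WHERE customer_id = ?");
             PreparedStatement ins = con.prepareStatement(
                 "INSERT INTO customer_scores (customer_id, score) VALUES (?, ?)")) {
            del.setString(1, id);
            del.executeUpdate();
            ins.setString(1, id);
            ins.setDouble(2, score);
            ins.executeUpdate();
        }
    }
}
```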
INC-193632 · Issue 679172
Cassandra driver metrics exposed for performance troubleshooting
Resolved in Pega Version 8.7.1
By default, Cassandra driver metrics are now enabled. They can be disabled by setting the dnode/disable_driver_metrics prconfig parameter.
INC-193847 · Issue 695974
DSS added to allow masking of subjectID in alerts
Resolved in Pega Version 8.7.1
To allow customizing whether a subjectID is included in alerts, a DSS has been added that conditionally masks the subjectID from being logged. To use this, set the "alerts/maskIHsubjectID" DSS in the Pega-DecisionEngine ruleset to true to hide the pySubjectID.
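Purely as an illustration of the toggle's effect (the method and flag handling here are hypothetical, not Pega APIs): when the DSS is true, the subject ID is masked before it reaches the alert line.

```java
public class AlertSubjectMaskSketch {

    // Returns the value to write into the alert for pySubjectID.
    static String subjectForAlert(String pySubjectID, boolean maskIHsubjectID) {
        return maskIHsubjectID ? "***" : pySubjectID;
    }

    public static void main(String[] args) {
        System.out.println("pySubjectID=" + subjectForAlert("CUST-42", true));   // masked
        System.out.println("pySubjectID=" + subjectForAlert("CUST-42", false));  // logged as-is
    }
}
```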