SR-B74689 · Issue 324632
Marketing data flow run made more robust
Resolved in Pega Version 7.3.1
After upgrade, Pega Marketing Campaigns were failing when the Audience configured on the Campaign contained no customers, generating the error message "The run failed, because it exceeds the maximum number of failed records, which is currently set to 0". The cause was executing a distributed data flow with a database as the primary source against an empty table: a table without any partitions was not handled, so the run failed. The database data set has been updated to differentiate the case where no partition is available from the case where there is a single partition containing every record, ensuring the DB data set now returns 'all' records when no partition key is defined and the data flow handles the absence of partition values more robustly.
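As a rough illustration of the fallback this describes (the class and method names below are invented for the sketch and are not Pega's internal API), a partition planner can return a single unbounded partition when the source table yields none, so the distributed run still has a work unit instead of failing:

    import java.util.Collections;
    import java.util.List;

    // Hypothetical sketch: when no partition key values are found (e.g. an
    // empty table), fall back to one "all records" partition rather than
    // returning zero partitions and failing the run.
    final class PartitionPlanner {

        static List<Partition> planPartitions(List<Partition> discovered) {
            if (discovered.isEmpty()) {
                return Collections.singletonList(Partition.fullRange());
            }
            return discovered;
        }

        // Minimal stand-in for a partition descriptor.
        static final class Partition {
            final String lowerBound;
            final String upperBound;

            private Partition(String lowerBound, String upperBound) {
                this.lowerBound = lowerBound;
                this.upperBound = upperBound;
            }

            static Partition fullRange() {
                return new Partition(null, null); // null bounds = scan everything
            }
        }
    }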
SR-B51204 · Issue 309685
Cassandra enhancement to retrieve multiple records and use run-in-parallel
Resolved in Pega Version 7.3.1
An enhancement has been added to allow the Cassandra Connector GET operation to return multiple rows by querying on clustering keys and/or a secondary index. In addition, an issue with Connect-Cassandra not supporting the run-in-parallel execution mode has been resolved, so the result list from a list operation can be accessed from pyOutputPage or the primary page.
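For context, this is the kind of read the enhancement enables, sketched directly against the DataStax Java driver rather than through Connect-Cassandra; the keyspace, table, and column names are invented for the example:

    import com.datastax.driver.core.Cluster;
    import com.datastax.driver.core.ResultSet;
    import com.datastax.driver.core.Row;
    import com.datastax.driver.core.Session;

    // Illustrative sketch: a query on the partition key plus a clustering-key
    // condition returns multiple rows in a single read, which is the
    // behaviour the Connector GET enhancement exposes.
    public class ClusteringKeyRead {
        public static void main(String[] args) {
            try (Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
                 Session session = cluster.connect("customer_ks")) {

                ResultSet rows = session.execute(
                    "SELECT customer_id, order_id, order_total "
                  + "FROM customer_orders "
                  + "WHERE customer_id = ? AND order_id >= ?",  // partition key + clustering key
                    "CUST-1001", "ORD-2017-01");

                for (Row row : rows) {
                    System.out.println(row.getString("order_id"));
                }
            }
        }
    }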
SR-B64066 · Issue 313310
Cassandra enhancement to retrieve multiple records and use run-in-parallel
Resolved in Pega Version 7.3.1
An enhancement has been added to allow the Cassandra Connector GET operation to return multiple rows by querying on clustering keys and/or a secondary index. In addition, an issue with Connect-Cassandra not supporting the run-in-parallel execution mode has been resolved, so the result list from a list operation can be accessed from pyOutputPage or the primary page.
SR-B56597 · Issue 311992
IBM AIX CRC32 result forced positive for stream data generation
Resolved in Pega Version 7.3.1
An issue with IBM AIX intermittently returning a negative CRC32 result caused files generated by a Stream data set to be corrupted. A check has been added that will force the CRC32 result to always be positive.
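A minimal sketch of the kind of guard this implies, assuming Java's java.util.zip.CRC32 is used to checksum stream records (the surrounding names are invented for the example):

    import java.util.zip.CRC32;

    // java.util.zip.CRC32 returns the checksum as a long, but if the 32-bit
    // value is ever handled as a signed int it can appear negative. Masking
    // with 0xFFFFFFFFL forces a non-negative value in [0, 2^32 - 1].
    public class Crc32Check {
        public static void main(String[] args) {
            byte[] payload = "stream record".getBytes();

            CRC32 crc = new CRC32();
            crc.update(payload, 0, payload.length);

            int signed = (int) crc.getValue();          // may be negative for some inputs
            long forcedPositive = signed & 0xFFFFFFFFL; // always non-negative

            System.out.println("signed=" + signed + " forcedPositive=" + forcedPositive);
        }
    }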
SR-B56597 · Issue 314377
IBM AIX CRC32 result forced positive for stream data generation
Resolved in Pega Version 7.3.1
An issue with IBM AIX intermittently returning a negative CRC32 result caused files generated by a Stream data set to be corrupted. A check has been added that will force the CRC32 result to always be positive.
SR-B56597 · Issue 314380
IBM AIX CRC32 result forced positive for stream data generation
Resolved in Pega Version 7.3.1
An issue with IBM AIX intermittently returning a negative CRC32 result caused files generated by a Stream data set to be corrupted. A check has been added that will force the CRC32 result to always be positive.
SR-B65908 · Issue 314693
Serialization fixed for multi-node pre-activity dataflow
Resolved in Pega Version 7.3.1
Attempting to trigger a dataflow run with a pre-activity configured to run on every node was failing with a serialization exception. This was caused by the task distributed across the cluster containing a member variable that was not serializable; this has been fixed.
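A hedged sketch of the general pattern behind such a fix (the class and its members are invented for illustration, not the actual Pega task): the non-serializable member is marked transient and rebuilt on the receiving node.

    import java.io.Serializable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    // Illustrative only: a task shipped across cluster nodes must be
    // serializable, so a member that cannot be serialized (here a thread
    // pool) is marked transient and re-created lazily after deserialization.
    public class PreActivityTask implements Serializable {
        private static final long serialVersionUID = 1L;

        private final String activityName;
        private transient ExecutorService executor; // excluded from the serialized stream

        public PreActivityTask(String activityName) {
            this.activityName = activityName;
        }

        public void run() {
            if (executor == null) {
                executor = Executors.newSingleThreadExecutor(); // rebuilt on the receiving node
            }
            executor.submit(() -> System.out.println("Running pre-activity " + activityName));
        }
    }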
SR-B64710 · Issue 317013
Revision management browse refined to handle multiple applications
Resolved in Pega Version 7.3.1
If an instance had two applications, each with its own overlay, rolling back a revision in one application used the revision from the other application instead of fetching the revision list from the active application. To support this structure, the browse condition in pyCreateRevisionVersion has been reworked to browse the activated revisions, pick the latest activated revision and change it from inactive to active, and then mark the current revision as rolled back.
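A minimal sketch of the reworked selection logic, using a hypothetical Revision type rather than the actual pyCreateRevisionVersion rule:

    import java.util.Comparator;
    import java.util.List;
    import java.util.Optional;

    // Illustrative only: restrict the browse to the active application before
    // choosing the latest activated revision, mirroring the reworked condition.
    final class RevisionRollbackSketch {

        static final class Revision {
            final String application;
            final int version;
            final String status;

            Revision(String application, int version, String status) {
                this.application = application;
                this.version = version;
                this.status = status;
            }
        }

        static Optional<Revision> latestActivated(List<Revision> revisions, String activeApplication) {
            return revisions.stream()
                    .filter(r -> r.application.equals(activeApplication)) // ignore the other application's revisions
                    .filter(r -> "Activated".equals(r.status))
                    .max(Comparator.comparingInt((Revision r) -> r.version));
        }
    }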
SR-B66424 · Issue 315782
Backwards compatibility added for Stream datasets
Resolved in Pega Version 7.3.1
In order to ensure backwards compatibility for the Data-Admin-DataSet-Stream property pyKeys (formerly pyPartitionKeys), logic has been added to handle stream datasets created in previous releases: if pyKeys (the new property) is empty and pyPartitionKeys (the deprecated property) is not empty, then pyPartitionKeys will be used.
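The documented fallback, written out as a plain Java sketch (the real logic reads these values from the Data-Admin-DataSet-Stream instance; the resolver method here is hypothetical):

    // Illustrative only: prefer the new pyKeys value and fall back to the
    // deprecated pyPartitionKeys for stream data sets created in earlier releases.
    final class StreamKeyResolver {

        static String resolveKeys(String pyKeys, String pyPartitionKeys) {
            boolean newKeysEmpty = pyKeys == null || pyKeys.trim().isEmpty();
            boolean oldKeysPresent = pyPartitionKeys != null && !pyPartitionKeys.trim().isEmpty();

            if (newKeysEmpty && oldKeysPresent) {
                return pyPartitionKeys; // data set created in a previous release
            }
            return pyKeys;
        }

        public static void main(String[] args) {
            System.out.println(resolveKeys("", ".CustomerID"));  // -> .CustomerID
            System.out.println(resolveKeys(".AccountID", null)); // -> .AccountID
        }
    }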
SR-B56514 · Issue 313700
Enhancement to support high number of Adaptive Model predictors
Resolved in Pega Version 7.3.1
It was not possible to save an Adaptive Model rule with more than approximately 650 predictors. This was a long-standing issue with the JVM per-method size limit and the way rule code for Adaptive Model predictors was generated. An enhancement has been added that modifies the Java rule generation so it no longer tries to resolve values for all predictors in a single method, supporting more complex use cases.
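A hedged sketch of the shape such chunked generation can take (the predictor names and the chunking itself are invented; this is not Pega's generated code): predictor resolution is split across several small methods so no single method approaches the JVM's 64 KB bytecode-per-method limit.

    import java.util.HashMap;
    import java.util.Map;

    // Illustrative shape of chunked generated code: predictor values are
    // collected by several small helper methods and a dispatcher, instead of
    // one method that handles every predictor.
    public class PredictorValues {

        public Map<String, Object> collectAll() {
            Map<String, Object> values = new HashMap<>();
            collectChunk1(values);
            collectChunk2(values);
            // ... one call per generated chunk, each well under the bytecode limit
            return values;
        }

        private void collectChunk1(Map<String, Object> values) {
            values.put("Age", 42);
            values.put("Income", 55000);
        }

        private void collectChunk2(Map<String, Object> values) {
            values.put("Tenure", 7);
            values.put("Churn_Score", 0.12);
        }

        public static void main(String[] args) {
            System.out.println(new PredictorValues().collectAll());
        }
    }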