INC-150809 · Issue 611856
Loading timing updated for OpenAPI content
Resolved in Pega Version 8.5.3
OpenAPI content loading was taking too much time and interfering with work on other REST rule configurations. To resolve this, OpenAPI content is no longer loaded when a REST rule is opened or saved; it is now loaded lazily. Clicking the OpenAPI tab loads the content, and any modification to the REST rule or service package rule updates the OpenAPI content when "Action->Refresh" is clicked.
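The change follows the general lazy-loading pattern sketched below in Java; the class and method names are illustrative and are not actual Pega internals.

```java
import java.util.function.Supplier;

// Minimal sketch of the lazy-loading pattern described above; names are
// illustrative, not actual Pega internals.
public class OpenApiTabSketch {
    private final Supplier<String> specGenerator; // expensive OpenAPI generation
    private String cachedSpec;                    // populated only on demand

    public OpenApiTabSketch(Supplier<String> specGenerator) {
        this.specGenerator = specGenerator;
    }

    // Invoked when the OpenAPI tab is clicked, not when the REST rule is
    // opened or saved.
    public String getContent() {
        if (cachedSpec == null) {
            cachedSpec = specGenerator.get();
        }
        return cachedSpec;
    }

    // Invoked via "Action->Refresh" after the REST rule or service package
    // rule has been modified.
    public void refresh() {
        cachedSpec = specGenerator.get();
    }
}
```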
INC-151043 · Issue 614872
Optimizing helper class enhanced to handle external databases
Resolved in Pega Version 8.5.3
Running a BIX extract that included a manifest for a target database resulted in a null pointer exception during manifest extraction, and attempting to generate the DDL for the manifest table also failed. This was traced to the helper class using a hardcoded default database when forming its queries, causing it to ignore the database config/DADN/prconfig for the Oracle database and build the query using the PegaRules database credentials. The issue only occurred when performing external database operations against a different database platform: Oracle PegaRules worked as expected with an Oracle external database and Postgres PegaRules worked with a Postgres external database, but mixing Postgres PegaRules with an Oracle external database produced the null pointer exception. To resolve this, the helper class has been enhanced to work with external databases by passing the database name as a parameter so the query is calculated correctly based on the type of the target database. An error in the name of the class has also been corrected: it is now available as PerformanceHelper rather than the previous "PerformaneHelper".
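The sketch below illustrates the general approach described above, where the query syntax is chosen from the named target database rather than a hardcoded default. The corrected Pega class is PerformanceHelper, but the class, methods, and SQL shown here are illustrative only.

```java
// Hedged sketch: the database name is passed in, so query syntax follows the
// target platform instead of the PegaRules database. Not the actual
// PerformanceHelper API.
public class ManifestQueryHelperSketch {

    public String buildManifestQuery(String databaseName, String tableName) {
        String platform = lookupPlatform(databaseName);
        if ("oracle".equals(platform)) {
            return "SELECT * FROM " + tableName + " WHERE ROWNUM <= 100";
        } else if ("postgres".equals(platform)) {
            return "SELECT * FROM " + tableName + " LIMIT 100";
        }
        throw new IllegalArgumentException("Unsupported platform: " + platform);
    }

    // Placeholder for resolving the platform from the named database's
    // configuration (prconfig / DADN); hardcoded here for illustration only.
    private String lookupPlatform(String databaseName) {
        return databaseName.toLowerCase().contains("oracle") ? "oracle" : "postgres";
    }
}
```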
INC-151228 · Issue 620602
Kafka updated
Resolved in Pega Version 8.5.3
Stream node errors were seen in the log file indicating "Invalid configuration. Undefined stream provider end point." This has been resolved by updating Kafka to v1.1.0.5, which was released to address this issue in Windows environments.
INC-151426 · Issue 613089
JobScheduler initialization timing adjusted
Resolved in Pega Version 8.5.3
DDS nodes were ending up in a deadlock condition on restart, preventing them from joining the cluster. This has been resolved by ensuring the JobScheduler initialization task waits for Search to start up.
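A minimal sketch of this kind of startup ordering, assuming a simple latch-based wait; the class and method names are illustrative and are not actual Pega internals.

```java
import java.util.concurrent.CountDownLatch;

// Illustrative sketch of the startup-ordering fix described above.
public class JobSchedulerInitSketch {
    private final CountDownLatch searchStarted = new CountDownLatch(1);

    // Signalled by the Search subsystem once it has fully started.
    public void onSearchStarted() {
        searchStarted.countDown();
    }

    // The scheduler initialization task blocks until Search is up, avoiding
    // the deadlock seen when DDS nodes tried to rejoin the cluster on restart.
    public void initialize() throws InterruptedException {
        searchStarted.await();
        // ... proceed with job scheduler startup ...
    }
}
```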
INC-151708 · Issue 622067
React-based UI app alias supports space or dash
Resolved in Pega Version 8.5.3
When using the "React-based UI" (Beta test version) in App Studio, portal creation was successful, but previewing the portal showed only a blank page and a 404 error code was generated. Switching to "Server rendered UI" rendered the portal as expected. Investigation showed that the alias had a space in its name which was handled as a dash (-), and this issue has been resolved by adding support for using a dash in the app alias.
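A minimal sketch of the kind of alias handling implied above, assuming spaces map to dashes and dashes are accepted when resolving the alias; the names are illustrative only.

```java
// Illustrative alias normalization: "My Portal" becomes "my-portal", and the
// dashed form is accepted on lookup. Not actual Pega internals.
public final class AppAliasSketch {
    private AppAliasSketch() {}

    // Convert a display name such as "My Portal" into the alias "my-portal".
    public static String normalize(String name) {
        return name.trim().toLowerCase().replaceAll("\\s+", "-");
    }

    // Accept either form when matching an incoming request path segment.
    public static boolean matches(String alias, String requested) {
        return normalize(alias).equals(normalize(requested));
    }
}
```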
INC-152057 · Issue 621210
S3 attachment migration handles multiple Link-Attachment instances
Resolved in Pega Version 8.5.3
After S3 attachment migration, some attachments intermittently failed to open, displaying an error that the file could not be loaded. Investigation showed that the attachments that failed to open did not have a pxStorageType tag in the XML of the work item. After the migration completes, the corresponding Data-WorkAttach-File and Link-Attachment instances are updated to point to the repository. In this case, multiple Link-Attachment instances were pointing to the same Data-WorkAttach-File instance, so only one Link-Attachment was updated and all of the other instances pointing to that Data-WorkAttach-File instance were rendered unusable. While there was a workaround of manually updating the storage type in the database, this has been resolved by updating "pzgetattachmentcontent", "pzupdatesourcereferences", and "pzuploadcontenttoexternalstorage" to check whether the attachment has already been migrated, using the boolean parameter "isAttachmentAlreadyMigrated", so that all of the other Link-Attachment instances for a particular attachment are updated. If a migration was done and all Data-WorkAttach-File instances point to a repository but some Link-Attachment instances were not updated, those will be updated by running the migration again.
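A hedged sketch of the re-run behavior described above: the upload is skipped for already-migrated content, but every Link-Attachment instance is still checked and updated. The rule names quoted above are real, but the interfaces and method names below are illustrative only.

```java
import java.util.List;

// Illustrative only; not the actual Pega attachment migration classes.
public class AttachmentMigrationSketch {

    interface WorkAttachFile {
        boolean isInRepository();                 // content already migrated to S3?
        void uploadToRepository();                // stand-in for the upload step
        List<LinkAttachment> linkedInstances();   // all Link-Attachment rows
    }

    interface LinkAttachment {
        boolean pointsToRepository();             // pxStorageType already set?
        void updateSourceReference();             // stand-in for the reference update
    }

    public void migrate(WorkAttachFile file) {
        // Equivalent of the "isAttachmentAlreadyMigrated" check: skip the
        // upload if the content is already in the repository, but still walk
        // every Link-Attachment so none are left pointing at the old storage.
        if (!file.isInRepository()) {
            file.uploadToRepository();
        }
        for (LinkAttachment link : file.linkedInstances()) {
            if (!link.pointsToRepository()) {
                link.updateSourceReference();
            }
        }
    }
}
```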
INC-152435 · Issue 612997
Hardcoded cluster password deprecated
Resolved in Pega Version 8.5.3
An unneeded hardcoded cluster group password has been removed, as this usage is deprecated in Hazelcast 3.8.
INC-152440 · Issue 614335
Compiler jars load as expected
Resolved in Pega Version 8.5.3
The system was not able to pick up the Default Paths and Default Classes arguments on the Configure->System->Settings->Compiler tab when attempting to use a third-party custom jar file. This was an unintended consequence of work done to address performance issues when a requested DASS instance was missing, and has been resolved by ensuring that null values are not cached when there are lookup failures. In addition, enhanced logging has been added to SystemSettingsImpl.
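The caching fix can be illustrated with the sketch below, in which a failed lookup is not cached as null so a later lookup can still succeed once a value exists; the class and method names are illustrative and are not actual Pega internals.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative only: only successful lookups are cached, so a missing setting
// does not get "stuck" as a cached null.
public class SettingCacheSketch {
    private final Map<String, String> cache = new ConcurrentHashMap<>();

    public String getSetting(String key) {
        String value = cache.get(key);
        if (value == null) {
            value = lookupFromDatabase(key);
            if (value != null) {
                cache.put(key, value);   // lookup failures are not cached
            }
        }
        return value;
    }

    // Placeholder for the real settings lookup.
    private String lookupFromDatabase(String key) {
        return System.getProperty(key);
    }
}
```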
INC-152647 · Issue 609605
Email Listener auto-reply evaluation updated
Resolved in Pega Version 8.5.3
After upgrade, messages were being read but not processed for a specific email listener (RCEmailListerner). The error "Email flagged as an autoreply email and will not be processed" appeared in the logs. Previously, an email was considered not to be an auto-reply only when the 'auto-submitted' header was absent or had the value 'no'. This caused issues with auto-forwarded or auto-redirected emails, where 'auto-submitted: auto-generated' could be present in the header: the email was marked as an auto-reply and the email listener stopped processing it. To resolve this, the system has been modified to mark a message as an auto-reply if it finds 'auto-submitted: auto-replied' in the header, but not 'auto-submitted: auto-generated'.
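The updated evaluation can be illustrated by the sketch below, assuming a simplified view of header handling; the class and method names are illustrative only.

```java
// Illustrative check: only an Auto-Submitted value of "auto-replied" marks the
// message as an auto-reply; "auto-generated" (auto-forwarded or auto-redirected
// mail) is still processed.
public final class AutoReplyCheckSketch {
    private AutoReplyCheckSketch() {}

    public static boolean isAutoReply(String autoSubmittedHeader) {
        return autoSubmittedHeader != null
                && autoSubmittedHeader.trim().equalsIgnoreCase("auto-replied");
    }

    public static void main(String[] args) {
        System.out.println(isAutoReply("auto-replied"));   // true  -> skipped
        System.out.println(isAutoReply("auto-generated")); // false -> processed
        System.out.println(isAutoReply("no"));             // false -> processed
        System.out.println(isAutoReply(null));             // false -> processed
    }
}
```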
INC-153014 · Issue 625697
Handling added for missing archival class definitions
Resolved in Pega Version 8.5.3
Handling has been added to avoid suspending the archival process when a class definition no longer exists in the system. If the system does not find a class corresponding to a configured case type, the exception generated will be logged and processing will continue with the next case type.
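A minimal sketch of the per-case-type handling described above, assuming a simple loop with per-item error handling; the class and method names are illustrative and are not the actual archival implementation.

```java
import java.util.List;

// Illustrative only: a missing class definition is logged and skipped rather
// than suspending the whole archival sweep.
public class ArchivalSweepSketch {

    public void run(List<String> configuredCaseTypes) {
        for (String caseType : configuredCaseTypes) {
            try {
                archive(caseType);
            } catch (IllegalStateException e) {
                // Log the exception and continue with the next case type.
                System.err.println("Skipping " + caseType + ": " + e.getMessage());
            }
        }
    }

    private void archive(String caseType) {
        if (!classExists(caseType)) {
            throw new IllegalStateException("Class definition not found for " + caseType);
        }
        // ... archival work for the case type ...
    }

    // Stand-in for a dictionary lookup of the class definition.
    private boolean classExists(String caseType) {
        return !caseType.isEmpty();
    }
}
```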