INC-154042 · Issue 621261
Pega Catalog custom upload control modified
Resolved in Pega Version 8.6
Attempting to upload a catalog.zip file caused the system to hang, and thread dumps appeared in the logs. Investigation traced the issue to the custom control used to upload the catalog, which was posting the entire content in form data rather than sending a multipart request. The control contained both legacy code, which used form.submit() and set the encoding, and newer code that used SafeURL and sent an asynchronous request; because the request was sent via AJAX, the encoding could not be set to multipart. To resolve this, the catalog upload control has been modified to use the appropriate legacy code path, which performs form.submit() and sets the encoding properly.
INC-155789 · Issue 622547
Third-party libraries upgraded
Resolved in Pega Version 8.6
The following third-party jar files have been updated to the most recent versions:
ant: v1.10.9
httpclient: v4.5.13
xercesImpl: v2.12.1
INC-157196 · Issue 629297
Deprecated service package features now require authentication
Resolved in Pega Version 8.6
Authentication has been added to deprecated features of the standard service package to improve security. If issues are encountered during product migration, please use the Deployment Manager.
INC-158519 · Issue 625079
Filter considers all instances pages during deployment
Resolved in Pega Version 8.6
During package deployment, attempting to use Filter to skip some of the instances displayed results only for the current active page instead of all pages. This was an unintended consequence of previous work, and has been resolved by adding logic to strip quotes from the value and enabling the "Pagination activity manages filtering" checkbox by default.
INC-159834 · Issue 632249
StackOverFlow logging improved
Resolved in Pega Version 8.6
Enhanced diagnostic logging has been added to help identify issues when StackOverflowError exceptions occur.
INC-160198 · Issue 632905
Enhancement added for attachment size handling with Kafka
Resolved in Pega Version 8.6
Attempting to send an email attachment larger than 2 MB resulted in the error "Caused by: org.apache.kafka.common.errors.RecordTooLargeException: The message is 8101592 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration." This has been resolved by updating the Stream SPI to version 2.0.5-14, which supports custom producer configurations. The settings can be passed as environment parameters, for example '-Dstream.producer.max.request.size=500990'.
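As a minimal sketch of how such an environment parameter override works (the class and fallback default here are illustrative assumptions, not Pega's actual implementation; 1048576 bytes is the standard Kafka producer default for max.request.size):

```java
// Hypothetical sketch: resolving a producer override passed as a JVM
// system property, e.g. -Dstream.producer.max.request.size=500990,
// falling back to the standard Kafka default when it is not set.
public class ProducerSizeConfig {
    // Kafka's default max.request.size is 1 MB.
    static final long KAFKA_DEFAULT_MAX_REQUEST_SIZE = 1_048_576L;

    static long maxRequestSize() {
        String value = System.getProperty("stream.producer.max.request.size");
        return value == null ? KAFKA_DEFAULT_MAX_REQUEST_SIZE
                             : Long.parseLong(value);
    }

    public static void main(String[] args) {
        System.setProperty("stream.producer.max.request.size", "500990");
        System.out.println(maxRequestSize());
    }
}
```

A value passed this way would then be handed to the producer as its max.request.size configuration entry.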
INC-160288 · Issue 626067
Kerberos handling updated for database remap
Resolved in Pega Version 8.6
After upgrade from Pega v7.2 to Pega v8.4, Kerberos authentication was failing during the remap task. Investigation showed that null username and password values were being passed to SchemaAssignmentUtility along with the flags as arguments, causing the utility to misinterpret the arguments. Because arguments should be populated only when both flags and values are available and not null, an update has been made so that the username and password flags are set only when their values are not null in the Remap database tables target.
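The fix described above amounts to appending a flag and its value only when the value is non-null, so a null value can never be mistaken for the next flag. A minimal sketch, with illustrative flag names and method signatures (not Pega's actual API):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: build the argument list for a schema remap
// utility, adding each flag/value pair only when the value is non-null.
public class RemapArgs {
    static List<String> buildArgs(String username, String password) {
        List<String> args = new ArrayList<>();
        if (username != null) {
            args.add("--username"); // flag names are illustrative
            args.add(username);
        }
        if (password != null) {
            args.add("--password");
            args.add(password);
        }
        return args;
    }

    public static void main(String[] argv) {
        // With Kerberos, username/password may legitimately be null,
        // and no flags should be emitted at all.
        System.out.println(buildArgs(null, null));
    }
}
```

With both values null (the Kerberos case), the argument list stays empty and the utility's remaining flags keep their intended positions.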
INC-160382 · Issue 630714
Added handling for double quotes when converting tables
Resolved in Pega Version 8.6
After importing some packages from development to production, DDL conversion failed for tables with 30-character names when the import file contained a "Create VIEW" statement. This was due to the addition of double quotes (" "). The table rename utility was not expecting to handle quoted table names, so the quotation mark characters were included in the length calculation, and renaming yielded syntactically invalid names (with a leading " in the name). To resolve this, an update has been made to strip quotes from table names being passed into TableRenameUtil.
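The quote-stripping step can be sketched as follows (class and method names are illustrative assumptions, not TableRenameUtil's actual code):

```java
// Hypothetical sketch: remove surrounding double quotes from a table
// identifier before length checks or renaming, so a quoted name like
// "MY_TABLE" is measured by its real 8-character length, not 10.
public class QuoteStripper {
    static String stripQuotes(String tableName) {
        if (tableName.length() >= 2
                && tableName.startsWith("\"")
                && tableName.endsWith("\"")) {
            return tableName.substring(1, tableName.length() - 1);
        }
        return tableName; // unquoted names pass through unchanged
    }

    public static void main(String[] args) {
        System.out.println(stripQuotes("\"MY_TABLE\""));
    }
}
```

Stripping the quotes before the length calculation keeps a 30-character quoted name within the identifier limit and prevents a leading " from surviving into the renamed identifier.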
INC-162360 · Issue 635817
Unneeded AUC error percentage calculation removed for better system performance
Resolved in Pega Version 8.6
When generating snapshots of the ADM models, an inefficiency in the metric calculation for the AUC error percentage caused an excessive amount of memory to be allocated which could cause slowdown or require a system reboot. This has been resolved by removing the problematic AUC error percentage calculation as it is not used in the UI.
INC-163940 · Issue 634491
CustomerData schema added to list of databases for optimization
Resolved in Pega Version 8.6
When using the Optimize Schema landing page for tables in the CustomerData schema, there was no "Create?" column. This column appeared for the PegaDATA database but not for CustomerData. To resolve this, the Optimize Schema landing page has been updated to treat the CustomerData schema as a shipped Pega database.