INC-218145 · Issue 715679
DSS introduced to control DSM clipboard page serialization
Resolved in Pega Version 8.8
When using a Kafka data set to consume a message from an external topic whose attribute name contained a special character inside a page list structure, using a JSON data transform for the mapping in a real-time data flow resulted in the error "Exception in stage: KafkaDS; LegacyModelAspectInvokableRuleContainer.invoke-Exception encountered a :java.lang.UnsupportedOperationException." To resolve this, a new DSS, dataset/CLASS_NAME/DATASET_NAME/JSONDataTransform/deserialization/useDSMPage, has been introduced. When the value is set to true, the process follows the previous behavior: DSM clipboard pages are generated when Kafka records are deserialized using the JSON data transform. When the value is set to false, the JSON data transform generates regular clipboard pages and converts them to DSM clipboard pages afterwards, which avoids errors when a JSON data transform calls methods from the Clipboard API that are not implemented by DSM pages. This DSS is set per data set instance; CLASS_NAME and DATASET_NAME are placeholders that should be replaced by the data set's pyClassName and pyPurpose property values. In addition, a similar DSS, dataset/CLASS_NAME/DATASET_NAME/JSONDataTransform/serialization/useDSMPage, has been introduced for serialization.
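For example (the class and purpose values here are hypothetical, for illustration only), a Kafka data set with pyClassName MyOrg-Data-Customer and pyPurpose CustomerEvents would use the setting dataset/MyOrg-Data-Customer/CustomerEvents/JSONDataTransform/deserialization/useDSMPage with the value false to switch deserialization to regular clipboard pages.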
INC-218757 · Issue 714552
AESRemote updated to be asynchronous
Resolved in Pega Version 8.8
The Autonomic Event Services (AES) agent PushCDHMetrics became stuck and was not pushing metrics to the console. This has been resolved by updating AESRemote to be asynchronous.
INC-218909 · Issue 715281
Override added to delete records for a stream dataset after processing
Resolved in Pega Version 8.8
Kafka data was accumulating for a Stream data set due to a huge volume of inbound calls. This has been resolved by adding support to override pyDeletedProcessed through a DASS in order to remove the records for a particular Stream data set (topic) as soon as they are processed by Pega.
INC-220622 · Issue 680790
Deprecated Libraries
Resolved in Pega Version 8.8
The following library dependencies have been deprecated, excluded, and/or removed:

- ant
- bsh
- commons-compress
- gson
- htmlunit
- io.netty
- jackson-mapper-asl
- jdom
- jdom2
- jdom-legacy
- jetty-http
- jetty-io
- jetty-server
- jetty-util
- junrar
- logback-core
- netty-handler
- plexus
- plexus-utils
- xercesImpl
- xstream
INC-220622 · Issue 695948
Updated Libraries
Resolved in Pega Version 8.8
The following libraries have been updated to the most recent version:

- commons-collections
- cxf-rt-rs-security-oauth2
- derby
- dom4j
- esapi
- google-oauth-client
- groovy
- h2database
- jackson-databind
- java-sdk-s3
- Jcommander
- json-smart
- nekohtml
- netty-handler
- postgres
- snakeyaml
- spring
- spring-core
- underscore
- woodstox
- xmlsec
INC-220642 · Issue 717032
Updated context handling for executing data transforms in dataflows
Resolved in Pega Version 8.8
When performing a sort operation on a page list in a data transform that was invoked through a data flow, a java.lang.UnsupportedOperationException was generated on the sort step even though the data was correct. Investigation showed that data flows were using a different execution context than the context used for regular activity execution. An update has been made to ensure the correct context is used while executing data transforms in data flows.
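As general background (a generic Java illustration, not Pega's internal code), an in-place sort against a read-only structure is the textbook source of this exception, which is why pages that do not implement the mutating parts of an API fail on a sort step:

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    public class SortOnReadOnlyList {
        public static void main(String[] args) {
            // A read-only view, analogous to a page structure that does not
            // implement the mutating methods of its API.
            List<String> readOnly = Collections.unmodifiableList(
                    Arrays.asList("charlie", "alpha", "bravo"));

            // List.sort mutates the list in place, so this line throws
            // java.lang.UnsupportedOperationException.
            readOnly.sort(String::compareTo);
        }
    }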
INC-220893 · Issue 728488
Errors persisted for single case runs using a custom error handler
Resolved in Pega Version 8.8
Single case processing (SingleAccountData) had failed records when opening the SubscriptionData data set, but it was not possible to see the failed record details for further investigation. Investigation showed that when running a single case run with a custom error handler, errors were persisted as part of the data flow metrics but were not written to the database. To resolve this, an update has been made to ensure errors are persisted for single case runs configured with a custom error handler.
INC-222561 · Issue 721041
Check added for destination type for distribution test reports
Resolved in Pega Version 8.8
When there were two output destinations in the system with the same name, one of type VBD and another of type Database table, an incorrect class was set for distribution test reports and an error was generated when trying to open the report. Investigation showed the system was checking only the name of the destination and not its type. This has been resolved by adding a pzSetSimulationOutputClass data transform that checks the destination type in addition to the destination name when setting the class for reports.
INC-225257 · Issue 730293
Pega DateTime supports pre-1970 timestamps
Resolved in Pega Version 8.8
The getTimeStampAsDateStamp function from the Pega DateTime class was not working correctly for dates before January 1, 1970. Support has now been added for this use case.
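For context (an illustrative sketch of a common pre-1970 pitfall, not the Pega implementation): dates before January 1, 1970 have negative epoch offsets, and truncating integer division rounds them toward zero instead of flooring, which shifts the derived date by one day:

    import java.time.Instant;
    import java.time.LocalDate;

    public class PreEpochDates {
        public static void main(String[] args) {
            // Noon on 1969-12-31 UTC is a negative epoch offset (-43,200,000 ms).
            long millis = Instant.parse("1969-12-31T12:00:00Z").toEpochMilli();

            long msPerDay = 86_400_000L;
            // Truncating division rounds toward zero: day 0 -> 1970-01-01 (wrong).
            long wrongDay = millis / msPerDay;
            // Floor division rounds toward negative infinity: day -1 -> 1969-12-31.
            long rightDay = Math.floorDiv(millis, msPerDay);

            System.out.println(LocalDate.ofEpochDay(wrongDay)); // 1970-01-01
            System.out.println(LocalDate.ofEpochDay(rightDay)); // 1969-12-31
        }
    }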
INC-228430 · Issue 738822
Enhanced diagnostics for NLP
Resolved in Pega Version 8.8
To enhance diagnostics for issues related to natural language processing (NLP) and Apache Rule-based Text Annotation (RUTA), support has been added for runtime metrics that allow benchmarking and debugging test scenarios over a set amount of time. NLP diagnostics have been introduced that capture the text analyzed for all incomplete executions in the Data-NLP-Report-Summary table. Every text analyzer execution inserts a record into the summary table with the text and text analyzer details, and the row is deleted at the end of the execution. On a system crash, texts with incomplete executions remain in the table to help identify any special patterns that could cause issues. This summary record insert/delete occurs only if the DSS PEGA-NLP!ENABLENLPDIAGNOSTIC is set to false; the DSS currently ships with the value true. To enable or disable it, change the DSS value, then re-validate and save for the change to take effect: go to Configure > System > Release > Upgrade > Validate, click "Re-validate and save", set Type to rule-NLP-PredictiveModel in the dropdown, set the ruleset to the application ruleset in which the text analyzers are created, click List, and then click Run.

Additional support:

- The activity "pygetdiagnosticnlpsummaryrecords" has been added to retrieve the records in the summary table whose NLP analysis has not been completed.
- The "pyDumpNLPProcessTimersInRepository" activity has been created to dump process timers to a repository. It must be called from a job scheduler configured to run on background processing nodes, and is run on demand to analyze the performance (time taken) of RUTA execution. Before running the job scheduler, configure the repository in Prediction Studio settings so the CSV is written to that repository; as soon as one CSV has been dumped, the job scheduler can be disabled.
- Debug logs have been enabled for com.pega.nlp.ner.command.RutaCommand (System > Operations > Log level settings), which print the start and end of execution for the RUTA entity types.
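For file-based logging configuration, the same effect can typically be achieved in prlog4j2.xml (an assumption; the Log level settings landing page noted above is the documented route) with a standard log4j2 logger entry such as <Logger name="com.pega.nlp.ner.command.RutaCommand" level="debug"/>.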