
Resolved Issues

View the resolved issues for a specific Platform release.

Please note: beginning with the Pega Platform 8.7.4 Patch, the Resolved Issues have moved to the Support Center.

INC-142084 · Issue 599876

Support added for expression in strategy scorecards

Resolved in Pega Version 8.2.8

When invoking REST against a dataflow that had a strategy containing a scorecard that used an expression, and the "Include model explanations" option was enabled in the strategy configuration, the system failed with the error "PropertyValueInvalid .pxMaxScore Cannot cast the value (unknown) to double". This was traced to the scorecard explanations failing during serialization when an expression was used, and has been corrected.
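A minimal sketch of the failure mode, assuming a hypothetical evaluator interface (none of these names are Pega internals): when a scorecard range is produced by an expression, the raw value must be evaluated to a number before it can back a double property such as pxMaxScore.

```java
public class ScorecardExplanations {
    interface ExpressionEvaluator {
        double evaluate(String expression);   // hypothetical expression evaluator
    }

    /**
     * When a scorecard range is defined by an expression, the raw value is
     * not a number yet; casting it directly is the kind of thing that yields
     * "Cannot cast the value (unknown) to double". Evaluate first, then use
     * the numeric result for the max score.
     */
    public static double maxScore(Object rawValue, ExpressionEvaluator evaluator) {
        if (rawValue instanceof Number) {
            return ((Number) rawValue).doubleValue();     // plain numeric range
        }
        if (rawValue instanceof String) {
            return evaluator.evaluate((String) rawValue); // expression-based range
        }
        throw new IllegalArgumentException("Unsupported score value: " + rawValue);
    }
}
```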

INC-143927 · Issue 599491

Oracle database performance improvements

Resolved in Pega Version 8.2.8

When the IH Summary was enabled and materialized on an ADM model, updating the ADM model was very slow on large sites. This has been resolved by adding several performance improvements for working with Oracle databases, including Oracle pre/post processing steps.

SR-D93777 · Issue 565692

Handling added for Oracle Aggregate IH Summaries

Resolved in Pega Version 8.2.8

When using (non-materialized) IH Summaries to aggregate IH data, the data returned by the IH summary did not include all the expected records. If the same criteria were executed on the database via SQL, or by using a strategy to process raw IH data, the results were as expected. This was due to a difference in handling between Oracle and PostgreSQL that caused an ORDER BY clause not to be generated in the query: the PostgreSQL column name is lower case, while in Oracle it is upper case. This has been resolved by updating the system to get the column name correctly from the property-to-column map so that IH records are returned in the correct order by the Browse By Keys operation.
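The fix amounts to resolving the physical column name through the dataset's property-to-column mapping rather than assuming the database's identifier case. A minimal illustrative sketch of that idea (the class and map names here are hypothetical, not Pega's actual internals):

```java
import java.util.Map;

public class OrderByBuilder {
    // Illustrative stand-in for the dataset's property-to-column mapping.
    private final Map<String, String> propertyToColumnMap;

    public OrderByBuilder(Map<String, String> propertyToColumnMap) {
        this.propertyToColumnMap = propertyToColumnMap;
    }

    /**
     * Builds an ORDER BY clause from logical property names. Looking each
     * column up in the map (rather than, say, lower-casing the property
     * name) keeps the clause valid on both Oracle (upper-case identifiers)
     * and PostgreSQL (lower-case identifiers).
     */
    public String orderByClause(String... properties) {
        StringBuilder sb = new StringBuilder("ORDER BY ");
        for (int i = 0; i < properties.length; i++) {
            String column = propertyToColumnMap.get(properties[i]);
            if (column == null) {
                throw new IllegalArgumentException("Unmapped property: " + properties[i]);
            }
            if (i > 0) sb.append(", ");
            sb.append(column);
        }
        return sb.toString();
    }
}
```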

SR-D95605 · Issue 565486

Data Flow correctly saved to Database table Dataset

Resolved in Pega Version 8.2.8

After setting up a dataflow with a report definition as the source and a Database Table dataset as the destination with the option "Insert new records and override existing", a data transform was used to modify a few of the values and write to the same database table. This table was mapped to the pegadata schema and did not have a pzPVStream column. Running the data transform generated an exception stating "DataStoreSaveStatementWithoutStream(PageDatabaseMapperImpl)". The error was not seen when the pzPVStream column was added, when pzInsKey was removed along with pzPVStream, or when "Only insert new records" was selected. This was traced to pzInsKey and pxInsName being null in the query formed while writing. There are two execution paths for the database dataset save operation: one for internal tables and one for external tables. For a table with a pzInsKey column but no BLOB column, the system was incorrectly using the external-table logic. This has been corrected.
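A rough sketch of the corrected routing decision, under the assumption that the internal/external choice keys off the table's columns (the class and method names are hypothetical, not Pega's save implementation):

```java
// Illustrates the decision that was corrected: a table with a pzInsKey
// column but no pzPVStream column is still an internal table and must
// not take the external-table path.
public final class DatasetSave {

    interface TableInfo {
        boolean hasInsKeyColumn();   // pzInsKey present?
        boolean hasStreamColumn();   // pzPVStream (BLOB) present?
    }

    static String chooseSavePath(TableInfo table) {
        if (table.hasInsKeyColumn()) {
            // Internal table: save by handle, with or without a BLOB column.
            return table.hasStreamColumn()
                    ? "internal save with pzPVStream"
                    : "internal save, exposed columns only";
        }
        // External table: no Pega handle column; map exposed columns directly.
        return "external save, exposed columns only";
    }
}
```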

INC-150395 · Issue 625068

Tokenizer updated to handle commas

Resolved in Pega Version 8.4.5

The Text Analyzer was not working as expected when a number was immediately followed by a comma (,), but worked when a space separated the number and the comma. This was traced to the tokenizer not correctly processing and splitting the input text when there was a special character directly before or after the token. This has been resolved by updating the tokenizer logic.
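A minimal sketch of the kind of tokenizer behavior described, peeling punctuation off the edges of whitespace-delimited chunks so "250," yields the same tokens as "250 ," (illustrative only; not the actual Text Analyzer code):

```java
import java.util.ArrayList;
import java.util.List;

public class SimpleTokenizer {
    /**
     * Splits on whitespace, then peels leading and trailing punctuation off
     * each chunk so that "250," yields ["250", ","] just as "250 ," does.
     */
    public static List<String> tokenize(String text) {
        List<String> tokens = new ArrayList<>();
        for (String chunk : text.trim().split("\\s+")) {
            int start = 0;
            int end = chunk.length();
            // Emit leading punctuation as separate tokens.
            while (start < end && isPunct(chunk.charAt(start))) {
                tokens.add(String.valueOf(chunk.charAt(start++)));
            }
            // Find where trailing punctuation begins.
            int tail = end;
            while (tail > start && isPunct(chunk.charAt(tail - 1))) tail--;
            if (tail > start) tokens.add(chunk.substring(start, tail));
            // Emit trailing punctuation as separate tokens.
            for (int i = tail; i < end; i++) {
                tokens.add(String.valueOf(chunk.charAt(i)));
            }
        }
        return tokens;
    }

    private static boolean isPunct(char c) {
        return ",.;:!?".indexOf(c) >= 0;
    }
}
```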

INC-150873 · Issue 612896

Performance improvement for saving ADM model rule

Resolved in Pega Version 8.4.5

Saving an ADM model rule generated a heap dump; the stack trace showed a single thread consuming the maximum amount of memory (4.7 GB). Configurations on all factories are updated when a model rule is saved, but at the time of development it was not expected that a Dev environment would contain many factories, so the system loaded all existing factories into memory simultaneously and updated their configurations. To improve performance, an update has been made so that factories are now loaded sequentially and their configurations updated one at a time.
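The shape of the fix is the classic load-one-at-a-time pattern. A hypothetical sketch, assuming a simple store interface (none of these types are Pega APIs):

```java
import java.util.List;

public class FactoryConfigUpdater {
    interface FactoryStore {
        List<String> factoryIds();     // lightweight: identifiers only
        ModelFactory load(String id);  // heavyweight: one factory at a time
    }

    interface ModelFactory {
        void updateConfiguration(Object newConfig);
    }

    /**
     * Instead of materializing every factory in memory at once (the
     * behavior that produced multi-gigabyte heap usage on rule save),
     * load, update, and release each factory sequentially.
     */
    public static void updateAll(FactoryStore store, Object newConfig) {
        for (String id : store.factoryIds()) {
            ModelFactory factory = store.load(id);   // one factory in memory
            factory.updateConfiguration(newConfig);
            // The factory becomes unreachable here and can be garbage-collected
            // before the next one is loaded.
        }
    }
}
```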

INC-156818 · Issue 628465

Materialization uses time limit boundary for query

Resolved in Pega Version 8.4.5

After turning on materialization for the pyIHSummary and OfferOutcomesForPast45Days datasets, an SQL query was taking an excessive amount of time and causing multiple alerts in the logs. Investigation traced the issue to database partitioning: running a query where the pyOutcomeTime range spanned multiple partitions caused the indexes for all partitions in the range to be opened. To resolve this, the query has been updated with a DSS to support a partition size of min(pxOutcomeTime), limiting the time range to querying day by day, hour by hour, or any other specified chronology unit. If there are no records within the current limit, the query moves on to the next partition. This should prevent the query from needing to open more than one or two partitions.
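A rough sketch of the bounded, window-by-window querying described above, assuming a hypothetical repository interface (this is an illustration of the idea, not the dataset implementation):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;

public class PartitionedBrowse {
    interface Repository {
        // Illustrative query bounded to [from, to); touches one partition at most.
        List<Object> fetchBetween(Instant from, Instant to);
    }

    /**
     * Walks the requested range one chronology unit (e.g. one day or one
     * hour) at a time, stopping as soon as a window returns rows. Bounding
     * each query this way keeps the database from opening the indexes of
     * every partition the full range spans.
     */
    public static List<Object> firstNonEmptyWindow(
            Repository repo, Instant start, Instant end, Duration unit) {
        for (Instant from = start; from.isBefore(end); from = from.plus(unit)) {
            Instant to = from.plus(unit).isBefore(end) ? from.plus(unit) : end;
            List<Object> rows = repo.fetchBetween(from, to);
            if (!rows.isEmpty()) {
                return rows;   // earliest populated window found
            }
        }
        return List.of();      // no records anywhere in the range
    }
}
```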

INC-157357 · Issue 636711

Hazelcast remote execution not called from synchronized context

Resolved in Pega Version 8.4.5

After navigating to the Admin Studio portal to view the nodes, the portal was temporarily freezing. Investigation of the thread dump revealed this was caused by a DDS pulse sending a remote execution call to all nodes to update logger settings even though the site was not using DDS. This has been resolved by updating the system to avoid calling Hazelcast remote execution from a synchronized context.
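The underlying pattern is general: mutate local state inside the lock, and issue the blocking remote call only after releasing it. A hypothetical sketch, where the executor interface stands in for a Hazelcast remote-execution call (none of these names are Pega internals):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LoggerSettingsBroadcaster {
    private final Object lock = new Object();
    private final Map<String, String> loggerLevels = new ConcurrentHashMap<>();

    interface ClusterExecutor {
        void executeOnAllNodes(Runnable task);   // stand-in for a Hazelcast executor call
    }

    /**
     * Update local state under the lock, but issue the cluster-wide remote
     * execution after releasing it. Blocking on a remote call while holding
     * a monitor is what can freeze other threads (such as the Admin Studio
     * node listing) behind a pulse.
     */
    public void setLevel(ClusterExecutor cluster, String logger, String level) {
        synchronized (lock) {
            loggerLevels.put(logger, level);   // fast, local mutation only
        }
        // The remote call happens outside the synchronized block.
        cluster.executeOnAllNodes(() -> applyLevel(logger, level));
    }

    private void applyLevel(String logger, String level) {
        // Apply the new level on the receiving node.
    }
}
```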

INC-157629 · Issue 626632

Duplicate key exception resolved for adaptive model

Resolved in Pega Version 8.4.5

During the model snapshot update, a DuplicateKeyException was generated while trying to insert a record into the predictor table. This did not affect the model's learning, but it did appear in the model monitoring report. This was traced to a local scenario in which the same outcome values were defined on the model with different cases ("Accept" and "accept"). All predictors used in an adaptive model are inserted into the model monitoring tables as part of the monitoring job; because the monitoring tables are not case sensitive, this led to a unique constraint exception when there were multiple IH predictors with the same name. To resolve this, validation has been added that skips duplicates when adding new responses.
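The validation described can be illustrated with a case-insensitive set that filters predictor rows before insertion. This is a sketch of the idea, not the actual monitoring-job code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.TreeSet;

public class PredictorSnapshot {
    /**
     * The monitoring tables compare names case-insensitively, so filter new
     * predictor rows the same way before inserting: "Accept" and "accept"
     * would otherwise collide on the unique constraint.
     */
    public static List<String> withoutCaseDuplicates(List<String> predictorNames) {
        TreeSet<String> seen = new TreeSet<>(String.CASE_INSENSITIVE_ORDER);
        List<String> toInsert = new ArrayList<>();
        for (String name : predictorNames) {
            if (seen.add(name)) {   // add() returns false for a case-insensitive repeat
                toInsert.add(name);
            }
        }
        return toInsert;
    }
}
```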

INC-158686 · Issue 628552

DSS added to create consistent handling of longform datetime

Resolved in Pega Version 8.4.5

After upgrade, a difference in datetime handling was seen. For example, EmailSchedRunEndDate is a date-type property holding the value "20201016T000000.000 GMT". In Pega 7.4, a substring function was used to remove the extra characters from the date field, e.g. EmailSchedRunStartDate = @substring(.EmailSchedRunStartDate,0,8), but in Pega 8.4 and higher the long datetime value ("20201016T000000.000 GMT") was still being used for the date field. This long value was then truncated to "2020101+" when saving to the database, causing errors in later steps. However, research found that if the @toDate function was called before this step for any other field, the correct date value was set for EmailSchedRunStartDate.

While ClipboardPages separate Dates and DateTimes, internally, in Java, both have a time component. The implementation of DSMClipboardPage made no distinction during serialization and appended the time component to pure Date properties. To create consistent handling, an update has been made that optionally enables the correct behavior via the Dynamic System Setting "Pega-DecisionEngine dsm/clipboard/correctDateFormat -> true". This setting takes effect only after a restart of Pega, and it defaults to false so as not to disrupt any application inadvertently relying on the previous behavior.
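A minimal sketch of the behavior the setting toggles, using a JVM system property as a stand-in for reading the DSS (the class and property names here are illustrative, not Pega's implementation):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateSerialization {
    // Stand-in for the dsm/clipboard/correctDateFormat setting, read once at startup.
    private static final boolean CORRECT_DATE_FORMAT =
            Boolean.getBoolean("dsm.clipboard.correctDateFormat");

    private static final DateTimeFormatter DATE_ONLY =
            DateTimeFormatter.ofPattern("yyyyMMdd");

    /**
     * With the flag off (the default), a pure Date property keeps its
     * internal time component, e.g. "20201016T000000.000 GMT", which a
     * narrow date column then truncates. With the flag on, the value is
     * serialized as the 8-character date, e.g. "20201016".
     */
    public static String serialize(LocalDate value) {
        if (CORRECT_DATE_FORMAT) {
            return value.format(DATE_ONLY);                  // "20201016"
        }
        // Legacy behavior: append the midnight time component.
        return value.format(DATE_ONLY) + "T000000.000 GMT";
    }
}
```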
