SR-A7401 · Issue 219542
Performance tune-ups in DSM
Resolved in Pega Version 7.2
Strategy Profiler was showing misleading timing information for the Switch, Champion Challenger, Exclusion, and Data Join components. This has been fixed. In addition, the Strategy Set component has been optimized to evaluate customer-specific expressions once per component execution (instead of once per SR page), and it no longer clones SR pages unnecessarily. The Prioritization shape has also been improved to prioritize only the Top N results when that is configured.
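The Top-N improvement can be illustrated with a generic sketch (not Pega internals, and the names here are invented): selecting only the N highest-priority results avoids sorting the full result set.

```python
# Illustrative sketch (not Pega code): prioritize only the Top N results
# when that limit is configured, instead of fully sorting every page.
import heapq

def prioritize(sr_pages, top_n=None):
    key = lambda p: p["Priority"]
    if top_n is not None:
        # Partial selection: O(n log k) instead of a full O(n log n) sort.
        return heapq.nlargest(top_n, sr_pages, key=key)
    return sorted(sr_pages, key=key, reverse=True)

pages = [{"Name": n, "Priority": p} for n, p in
         [("A", 3), ("B", 9), ("C", 1), ("D", 7)]]
top2 = prioritize(pages, top_n=2)  # → B (9) then D (7)
```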
SR-A12733 · Issue 227212
Fixed NPE for Tracer
Resolved in Pega Version 7.2
It was not possible to execute strategies if the tracer was turned on. This was failing because the tracer was trying to collect the primary page in XML format from a clipboard page that was missing the implementation to generate the XML. This has been corrected.
SR-A13237 · Issue 227844
Fixed NPE for Tracer
Resolved in Pega Version 7.2
It was not possible to execute strategies if the tracer was turned on. This was failing because the tracer was trying to collect the primary page in XML format from a clipboard page that was missing the implementation to generate the XML. This has been corrected.
SR-A12341 · Issue 226423
Corrected implementation for DataFlow key non-match
Resolved in Pega Version 7.2
If a DataFlow Dataset configured against a Report Definition had a property key name that was not identical to the column name in the database table, the data flow failed to execute with the error "Incorrect implementation for Report Definition as a data flow source. Property-to-database table column mapping was not applied correctly." The implementation has been fixed to resolve this issue.
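The underlying mapping problem can be sketched generically (this is not the Pega implementation, and the column and property names below are invented): when property names differ from column names, the mapping must be applied explicitly rather than assumed.

```python
# Hypothetical sketch: map database column names to clipboard property
# names explicitly instead of assuming they are identical.
COLUMN_TO_PROPERTY = {"cust_id": "CustomerID", "full_name": "Name"}

def map_row(row):
    # Apply the column-to-property mapping to each database row;
    # columns without a mapping keep their original name.
    return {COLUMN_TO_PROPERTY.get(col, col): val for col, val in row.items()}

page = map_row({"cust_id": "C-1", "full_name": "Ada"})
# → {"CustomerID": "C-1", "Name": "Ada"}
```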
SR-A14774 · Issue 233421
Number formatting made consistent for Propositions
Resolved in Pega Version 7.2.1
On the Proposition landing page, a decimal property was rounded to three digits when shown in read-only mode, but clicking the edit option for the Proposition displayed the entire number with all of its decimal digits. This difference was not intended, and the presentation setting for the Double type property has been modified to show all digits after the decimal point.
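The discrepancy amounts to two different renderings of the same value, sketched here generically (not the Pega presentation logic):

```python
# Illustrative sketch of the inconsistency: the read-only view rounded
# to three decimals while the edit view showed full precision.
value = 0.123456789
read_only = f"{value:.3f}"  # old read-only rendering: "0.123"
edit_mode = repr(value)     # edit view: full precision "0.123456789"
# After the fix, both views show all digits after the decimal point.
```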
SR-A16960 · Issue 233587
Predictive Analytics rulesets excluded from RSA
Resolved in Pega Version 7.2.1
The Pega-provided Predictive Analytics rulesets were incorrectly being checked and flagged by the Rule Security Analyzer (RSA). The PAD rulesets have now been properly excluded from the RSA check, and further analysis was done to find and fix other RSA flags that should have been excluded.
SR-A18333 · Issue 237135
Corrected JSON page group handling
Resolved in Pega Version 7.2.1
If the DataSet-Execute method was used to cache a CustomerPage from the Clipboard, using the same DataSet-Execute method to retrieve it from the cache with the "Browse by key" operation did not work correctly: if the page contained a page group property, the retrieved CustomerPage contained additional invalid entries that led to errors in the mapping logic. This was an issue with the Clipboard page JSON converter not properly handling page groups, and it has been corrected.
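Conceptually, a page group is a named map of embedded pages, and a correct converter must round-trip it through JSON without introducing extra entries. A minimal sketch of that expectation (generic JSON round-tripping, not the Pega converter; the property names are invented):

```python
# Generic sketch: a page group is a key -> embedded-page map, and a
# faithful JSON round trip must reproduce exactly the original keys.
import json

customer_page = {
    "Name": "Ada",
    "Accounts": {            # page group: key -> embedded page
        "CHK": {"Balance": 100},
        "SAV": {"Balance": 250},
    },
}

restored = json.loads(json.dumps(customer_page))
# A correct converter yields the same structure, with no invalid entries.
```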
SR-A18550 · Issue 234678
Strategy and data flow behavior more consistent
Resolved in Pega Version 7.2.1
Strategy execution via data flow was producing different results depending on how it was executed. This was caused by the use of different types of pages based on the execution method, and the code has been updated to produce more consistent behavior.
SR-A18909 · Issue 235146
Strategy and data flow behavior more consistent
Resolved in Pega Version 7.2.1
Strategy execution via data flow was producing different results depending on how it was executed. This was caused by the use of different types of pages based on the execution method, and the code has been updated to produce more consistent behavior.
SR-A19104 · Issue 237289
Ability added to set TTL on data sets
Resolved in Pega Version 7.2.1
An enhancement has been added to Data Flow datasets to allow setting the Time To Live (TTL) on DDS stores from the canvas. This allows setting an expiry date on cached data, which is essential when working with large streams of data.
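TTL semantics can be sketched with a minimal cache (this is not the DDS implementation, just an illustration of expiry behavior):

```python
# Minimal TTL sketch: each cached record carries an expiry time and is
# evicted once that time passes.
import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}

    def put(self, key, value):
        # Stamp the record with its absolute expiry time.
        self.store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.store[key]  # record expired: evict and report a miss
            return None
        return value

cache = TTLCache(ttl_seconds=0.05)
cache.put("cust-1", {"Segment": "Gold"})
assert cache.get("cust-1") is not None
time.sleep(0.06)
assert cache.get("cust-1") is None  # expired after the TTL elapsed
```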