Troubleshooting data flows
If your data flows do not work as expected, use the following tips to diagnose and resolve the most common issues.
- Investigating data flow run failures
From the Data Flow landing page, you can access detailed reports on any errors that occur while your application processes a data flow. By analyzing these error reports, you can quickly diagnose the root cause of an error.
- Data flows cannot parse JSON records
The following exception occurs when you start processing a data flow run: Cannot parse JSON, followed by specific information about the data flow shape that caused the error.
- Data flow fails to run from a job scheduler in the System Runtime Context
Resolve issues with running a data flow from a job scheduler that is configured to use the System Runtime Context (SRC).
- Data flow runs return errors while writing records to a database data set
The data flow run fails with an error while results are being written to a database data set.
- Exceeded threshold for error count in real-time data flow run
The processing of a real-time data flow run fails with the following exception: Failure threshold hit - too many errors occurred. A sketch of the failure-threshold pattern follows this list.
- Failure or performance issues when saving records to the database
By default, Pega Platform uses the SQL merge statement to write records to the database. In some cases, this logic can cause a data flow run to fail or take a long time to complete. A sketch of a merge-style write also follows this list.
- No data flow service nodes are available
The following exception occurs when you start processing a data flow run: No data flow service nodes available.
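As background for the failure-threshold error above, the following sketch illustrates the general pattern: individual record errors are tolerated and counted, and the run as a whole fails only once the count crosses a configured limit. This is a minimal illustration of the pattern, not Pega Platform's internal implementation; the class name, threshold value, and record handling are all hypothetical.

```java
import java.util.List;
import java.util.function.Consumer;

// Generic illustration of a failure threshold: per-record errors are
// counted, and the whole run fails once the count crosses a configured
// limit. The threshold value and record handling here are hypothetical.
public class FailureThresholdSketch {

    // Hypothetical limit; a production threshold would normally be higher.
    static final int FAILURE_THRESHOLD = 3;

    static <T> void process(List<T> records, Consumer<T> shape) {
        int errorCount = 0;
        for (T record : records) {
            try {
                shape.accept(record); // hand the record to the next shape
            } catch (RuntimeException e) {
                errorCount++;
                if (errorCount >= FAILURE_THRESHOLD) {
                    // The run as a whole fails, mirroring the documented
                    // message, even though earlier errors were tolerated.
                    throw new IllegalStateException(
                            "Failure threshold hit - too many errors occurred", e);
                }
            }
        }
    }

    public static void main(String[] args) {
        List<String> records = List.of("ok", "fail", "fail", "fail", "ok");
        try {
            process(records, r -> {
                if (r.equals("fail")) {
                    throw new RuntimeException("bad record: " + r);
                }
            });
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```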
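As background for the merge-related topic above, the following standalone JDBC sketch shows what a merge-style (upsert) write does: a single statement that updates a matching row or inserts a new one. Pega Platform issues these writes internally, so this is an illustration of the statement shape only, not code that you add to a data flow; the MergeWriteSketch class, the CUSTOMER_DATA table, and the connection details are hypothetical, and the exact MERGE syntax varies by database vendor.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Standalone JDBC sketch of a merge-style (upsert) write: one statement
// that updates a matching row or inserts a new one. The CUSTOMER_DATA
// table, its columns, and the connection details are hypothetical, and
// the exact MERGE syntax varies by database vendor (Oracle-style shown).
public class MergeWriteSketch {

    private static final String MERGE_SQL =
            "MERGE INTO CUSTOMER_DATA t "
          + "USING (SELECT ? AS CUSTOMER_ID, ? AS SCORE FROM DUAL) s "
          + "ON (t.CUSTOMER_ID = s.CUSTOMER_ID) "
          + "WHEN MATCHED THEN UPDATE SET t.SCORE = s.SCORE "
          + "WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, SCORE) "
          + "VALUES (s.CUSTOMER_ID, s.SCORE)";

    // Writes one record; the database decides whether to update or insert.
    static void upsert(Connection conn, String customerId, int score)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(MERGE_SQL)) {
            ps.setString(1, customerId);
            ps.setInt(2, score);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; replace with your own database.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/PEGADB", "user", "password")) {
            upsert(conn, "CUST-42", 97);
        }
    }
}
```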