Data flow runs do not progress beyond Pending-Start or Pending-Activation stage
During a data flow run, the status of that data flow run does not advance beyond the Pending-Start or Pending-Activation stage.
There was a problem during initialization, such as failure to create partitions for the primary source of the data flow.
- Depending on your installation, review the pr_assign table in the rules schema.
- Using an SQL tool, run a query that returns the detailed status information for the data flow run. For example: SELECT * FROM pr_assign WHERE pxinsname LIKE '%work_item_ID%', where work_item_ID is the ID of the data flow run, for example, DF-1.
- In the query results, review the pyassignmentstatus property. For an active data flow run, the correct value of that property is In Progress. An empty pyassignmentstatus value might indicate a data flow activation issue that is independent of Pega Platform.
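The status check above can be sketched as follows. This is a minimal illustration using an in-memory SQLite table that mimics the pr_assign columns; in a real environment you would run the query against the Pega database with your own SQL tool, and the table contents here are hypothetical sample data.

```python
import sqlite3

# Simulate a pr_assign table with the two columns the check relies on
# (hypothetical sample data, not real Pega records).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pr_assign (pxinsname TEXT, pyassignmentstatus TEXT)")
conn.execute("INSERT INTO pr_assign VALUES ('ASSIGN-INTERNAL DF-1', 'In Progress')")

work_item_id = "DF-1"  # ID of the data flow run

# Parameterized equivalent of: SELECT ... WHERE pxinsname LIKE '%DF-1%'
row = conn.execute(
    "SELECT pyassignmentstatus FROM pr_assign WHERE pxinsname LIKE ?",
    (f"%{work_item_id}%",),
).fetchone()

if row is None or not row[0]:
    # An empty or missing status may point to an activation issue.
    print("Possible activation issue: no assignment status found")
else:
    print(f"Assignment status: {row[0]}")
```

For an active run, the query returns In Progress; any other result is a signal to continue with the log review below.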
- Review the application log files for any issues with the activation of the data flow run. For more information, see Viewing logs.
Note: Before viewing logs, you can change the application logging level to Debug and restart the data flow run. For more information, see Investigating data flow run failures.