Data flow runs do not progress beyond Pending-Start or Pending-Activation stage

Updated on May 17, 2024

The status of a data flow run does not advance beyond the Pending-Start or Pending-Activation stage.

Cause

There was a problem during initialization, such as failure to create partitions for the primary source of the data flow.

Solution

  1. Depending on your installation, review the pr_assign table in the rules schema for any status issues:
    1. Using an SQL tool, run a query that returns the detailed status information for the data flow run.
      For example: select * from pr_assign where pxinsname like '%work_item_ID%', where work_item_ID is the ID of the data flow run, for example, DF-1.
    2. In the query results, review the pyassignmentstatus property.
      For an active data flow run, the correct value of that property is In Progress. An empty pyassignmentstatus value might point to a data flow activation issue that is independent of Pega Platform.
  2. Review the application log files for any issues with the activation of the data flow run.
    For more information, see Viewing logs.
    Note: Before viewing logs, you can change the application logging level to Debug and restart the data flow run. For more information, see Investigating data flow run failures.
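The check in step 1 can be sketched as follows. This is a minimal illustration using an in-memory SQLite database; the simplified pr_assign layout and the example pxinsname value are assumptions for demonstration only, not the actual Pega rules schema, which you would query through your database's SQL tool.

```python
import sqlite3

# Illustrative stand-in for the rules schema; the real pr_assign table
# has many more columns and lives in the Pega database, not SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pr_assign (pxinsname TEXT, pyassignmentstatus TEXT)")
conn.execute(
    "INSERT INTO pr_assign VALUES (?, ?)",
    ("ASSIGN-DATAFLOW DF-1", "In Progress"),  # hypothetical instance name
)

work_item_id = "DF-1"  # the ID of the data flow run
rows = conn.execute(
    "SELECT pxinsname, pyassignmentstatus FROM pr_assign WHERE pxinsname LIKE ?",
    (f"%{work_item_id}%",),
).fetchall()

for insname, status in rows:
    # For an active run the status should be "In Progress"; an empty value
    # may indicate an activation issue outside Pega Platform.
    healthy = status == "In Progress"
    print(insname, status, "OK" if healthy else "CHECK ACTIVATION")
```

Using a parameterized LIKE query, as above, avoids quoting mistakes when substituting the run ID into the pattern.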
