
This content has been archived and is no longer being updated.

Links may not function; however, this content may be relevant to outdated versions of the product.

Data flow public API

Updated on November 20, 2015

On the Pega 7 Platform, you can run data flows manually or through the API methods. The Data Flow public API lets you execute, monitor, and manage data flow instances programmatically, without going to the Data Flows landing page or the Data Flow rule form. The API is rule-based; all of its rules are available in the Data-Decision-DDF-RunOptions and Data-Decision-DDF-Progress classes.

Running data flows manually

When the primary source of a data flow is not abstract (that is, it originates from a data set or another data flow), you can run the data flow with the Run button on the rule form, or from the Batch processing or Real-time processing tab of the Data Flows landing page.

Running data flows with the API methods

With the API methods, you can execute data flows in single case mode or in batch mode.

To execute a data flow in single case mode, the data flow must have an abstract primary source. In this mode, the data flow is invoked once for each instance that the calling activity explicitly passes to it.

Batch mode is for data flows whose primary source originates from a data set or another data flow. In this mode, the data flow is invoked once and processes all the records in the main input.

Methods to run data flows in activities

  • Data-Decision-DDF-RunOptions.pxRunSingleCaseDDF

    This method triggers the execution of a data flow in single case mode and has three parameters:

    • RuleName – the data flow instance to run

    • InputPage – a page reference to be used as the single case source

    • ResultPage – a reference to a Code-Pega-List page that stores the results of the data flow execution. This parameter applies only to data flows with an abstract destination.

  • Data-Decision-DDF-RunOptions.pxRunDDFWithProgressPage

    This method triggers the execution of a data flow in batch mode and creates the progress page to monitor it.
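The two run methods above are typically invoked with the Call instruction from an activity. The following is a minimal sketch of an activity that runs a data flow in single case mode; the page names, data flow name, and property in it are illustrative, not part of the API:

```text
Step 1: Page-New       StepPage: InputRecord
        // Build the single input record on a clipboard page.
Step 2: Property-Set   StepPage: InputRecord
        // e.g. .CustomerID = "C-42" (illustrative property)
Step 3: Call Data-Decision-DDF-RunOptions.pxRunSingleCaseDDF
        RuleName:   MySingleCaseFlow   // assumed data flow with an abstract primary source
        InputPage:  InputRecord
        ResultPage: ResultList         // only needed when the destination is abstract
```

To run in batch mode instead, call Data-Decision-DDF-RunOptions.pxRunDDFWithProgressPage and omit the input page; the records then come from the data flow's own primary source.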

Monitoring data flows

You can monitor data flows that were executed in batch mode with the Data-Decision-DDF-RunOptions.pxRunDDFWithProgressPage method or submitted on the Data Flows landing page.

To monitor data flows with the API methods:

  1. Call the Data-Decision-DDF-RunOptions.pxInitializeProgressPage method with the ID of the data flow run and the name of the data flow.

    This method creates a top-level page named Progress of the Data-Decision-DDF-Progress data type.

  2. Call the Data-Decision-DDF-Progress.pxLoadProgress method on the progress page created in Step 1 to get progress updates.
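Put together, the two monitoring steps above look roughly like this in an activity; the parameter labels are illustrative placeholders, not documented parameter names:

```text
// Attach to an existing batch run (started earlier via
// pxRunDDFWithProgressPage or from the Data Flows landing page).
Step 1: Call Data-Decision-DDF-RunOptions.pxInitializeProgressPage
        <run ID>:        the ID of the data flow run
        <data flow name>: the name of the data flow
        // Creates a top-level page named Progress
        // of the Data-Decision-DDF-Progress data type.
Step 2: Call Data-Decision-DDF-Progress.pxLoadProgress
        StepPage: Progress
        // Call again periodically to refresh the progress data.
```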

You can use a default section and harness to display and control execution progress.

  • The Data-Decision-DDF-Progress.pyProgress section displays the most recent progress information. This section, which is also used on the Data Flows landing page, refreshes periodically to update the progress information.
  • The Data-Decision-DDF-RunOptions.pxDDFProgress harness, which is also used in the run dialog of the Data Flow rule, displays the complete view for a data flow run. It provides the progress section and action buttons that allow you to start, stop, and restart the run.
