
Data flow fails to run from a job scheduler in the System Runtime Context

Updated on March 11, 2021

Resolve issues with running a data flow from a job scheduler that is configured to use the System Runtime Context (SRC).

Condition

A job scheduler that is configured to run in the SRC fails to execute a data flow activity. The server node logs contain an error message indicating that the system could not start the data flow, as in the following example:

Unable to find dataflow rule [PegaCRM-Entity-Contact.LoadRelIntData] com.pega.pegarules.pub.generator.RuleNotFoundException: Failed to find a 'RULE-DECISION-DDF' with the name 'LOADRELINTDATA' that applies to 'PegaCRM-Entity-Contact'. There were 3 rules with this name in the rulebase, but none matched this request. The 3 rules named 'LOADRELINTDATA' defined in the rulebase are: 3 related to applies-to class 'PegaCRM-Entity-Contact', but were defined in rulesets which are not in your rulesetlist: {SA-Artifacts_Branch_RDchanges:01-01-01, SA-Artifacts_Branch_RelInt_Rev1:01-01-01, SA-Artifacts:08-05-01}

Caused by: com.pega.dsm.dnode.api.dataflow.service.DataFlowActivationException: Could not start run DataFlowRunConfig{serviceInstanceName=Batch, runId=, className=PegaCRM-Entity-Contact, ruleName=LoadRelIntData, accessGroup=PRPC:Agents} at com.pega.dsm.dnode.impl.dataflow.task.StartRunTask.startRun(StartRunTask.java:104) ~[d-node.jar:?] at com.pega.dsm.dnode.impl.dataflow.service.DataFlowRunManagerImpl.start(DataFlowRunManagerImpl.java:76) ~[d-node.jar:?]

For more information about viewing logs, see Log files tool.
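
The following sketch models, in plain Java, the failure mode that the log describes: rule resolution finds candidates by name and Applies To class, keeps only those whose ruleset is on the requestor's ruleset list, and fails when none remain. It is an illustration only, not Pega engine code; the class, method, and ruleset-list values are hypothetical except where they repeat values from the log above.

import java.util.List;

// Illustration only (not Pega engine code): rule resolution keeps the candidates
// whose ruleset is on the requestor's ruleset list and fails when none remain,
// which is the situation that the log above describes.
public class RuleResolutionSketch {

    record CandidateRule(String appliesTo, String name, String ruleset) {}

    static CandidateRule resolve(String appliesTo, String name,
                                 List<CandidateRule> rulebase,
                                 List<String> requestorRulesetList) {
        // Candidates with the requested name and Applies To class.
        List<CandidateRule> byName = rulebase.stream()
                .filter(r -> r.appliesTo().equals(appliesTo)
                          && r.name().equalsIgnoreCase(name))
                .toList();
        // Only rules whose ruleset is on the requestor's ruleset list are visible.
        List<CandidateRule> visible = byName.stream()
                .filter(r -> requestorRulesetList.contains(r.ruleset()))
                .toList();
        if (visible.isEmpty()) {
            throw new IllegalStateException("Failed to find '" + name
                    + "' that applies to '" + appliesTo + "'. There were "
                    + byName.size() + " rules with this name in the rulebase, "
                    + "but none matched this request.");
        }
        return visible.get(0);
    }

    public static void main(String[] args) {
        List<CandidateRule> rulebase = List.of(new CandidateRule(
                "PegaCRM-Entity-Contact", "LoadRelIntData", "SA-Artifacts:08-05-01"));
        try {
            // The SRC-based run resolves against a ruleset list that does not
            // include SA-Artifacts, so resolution fails as in the log.
            resolve("PegaCRM-Entity-Contact", "LoadRelIntData", rulebase,
                    List.of("Pega-RULES:08-05-01"));
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage()); // mirrors the RuleNotFoundException above
        }
    }
}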

Cause

Data flows can run only in the context of the access group that is specified in the pyAccessGroup property of the Data-Decision-DDF-RunOptions class or, if this property is not set, the access group of the current thread.

A job scheduler that is configured to run in the SRC uses a different mechanism to determine the context in which to execute an activity: the run-time list of rulesets that is maintained in the SRC. In this case, the job scheduler does not operate on a single access group and does not set the access group of the current thread. As a result, the job scheduler can fail to execute a data flow unless an access group is explicitly passed to the data flow. To pass the access group, add a step to the data flow activity that sets the pyAccessGroup property, as described in the procedure in the Solution section.
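
The sketch below restates that resolution order in plain Java: pyAccessGroup on the RunOptions page wins when it is set, the access group of the current thread is the fallback, and an SRC job scheduler provides neither, so activation fails. It is an illustration only, not the Pega engine API; the class, field, and access-group names are hypothetical.

import java.util.Optional;

// Illustration only (not the Pega engine API): how the access group for a
// data flow run is chosen according to the cause described above.
public class DataFlowContextSketch {

    // Value of pyAccessGroup on the Data-Decision-DDF-RunOptions page, if set.
    static Optional<String> runOptionsAccessGroup = Optional.empty();

    // Access group of the current requestor thread, if one was established.
    static Optional<String> currentThreadAccessGroup = Optional.empty();

    static String resolveAccessGroup() {
        // 1. pyAccessGroup on the RunOptions page wins when it is set.
        // 2. Otherwise the access group of the current thread is used.
        // 3. A job scheduler running in the SRC provides neither, so the run fails.
        return runOptionsAccessGroup
                .or(() -> currentThreadAccessGroup)
                .orElseThrow(() -> new IllegalStateException(
                        "Could not start run: no access group available for the data flow"));
    }

    public static void main(String[] args) {
        // With the fix from the Solution section, the activity populates
        // pyAccessGroup before the data flow starts ("MyApp:DataFlowUsers"
        // is a hypothetical access group).
        runOptionsAccessGroup = Optional.of("MyApp:DataFlowUsers");
        System.out.println("Data flow runs as: " + resolveAccessGroup());
    }
}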

Solution

  1. In the navigation pane of Dev Studio, click Records.
  2. Expand the SysAdmin category, and then click Job Scheduler.
  3. In the Context field, ensure that Use System Runtime Context is selected.
  4. Click the Open icon to the right of the Activity field.
  5. On the Pages & Classes tab, ensure that the list contains a page with the following parameters:
    • Page name: RunOptions
    • Class: Data-Decision-DDF-RunOptions
  6. On the Steps tab, add a step to the activity with the following settings (see the sketch after this procedure):
    • Method: Property-Set
    • Step page: RunOptions
    • PropertiesName: pyAccessGroup
    • PropertiesValue: the access group to use to run the data flow
    Figure: Data flow activity step that sets the access group for data flow resolution
  7. Click Save as, and then click Create and open.
  8. Click Save.
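
As a conceptual check of what step 6 accomplishes, the sketch below models the RunOptions page as a plain map and shows the Property-Set step writing pyAccessGroup onto it before the data flow starts. It is not Pega clipboard API code; the access group value is a hypothetical example.

import java.util.HashMap;
import java.util.Map;

// Conceptual sketch of the Property-Set step from step 6 (not Pega clipboard API code).
public class SetAccessGroupSketch {

    public static void main(String[] args) {
        // Step page "RunOptions" of class Data-Decision-DDF-RunOptions,
        // declared on the Pages & Classes tab of the activity.
        Map<String, String> runOptions = new HashMap<>();

        // Property-Set: PropertiesName = pyAccessGroup,
        // PropertiesValue = the access group to run the data flow with
        // ("MyApp:DataFlowUsers" is a hypothetical example).
        runOptions.put("pyAccessGroup", "MyApp:DataFlowUsers");

        // With pyAccessGroup populated, the data flow resolves its rules with
        // the ruleset list of this access group instead of relying on the
        // (unset) access group of the scheduler thread.
        System.out.println("RunOptions page: " + runOptions);
    }
}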