
How to unit test activities with the Automated Testing feature

Updated on September 10, 2021

Summary

You can unit test an individual activity rule before testing it in the context of the entire application you are building. With Automated Unit Testing, you can save the test data that you use as test case rules. Then, the next time you test that rule, you can run the test case rather than manually re-entering the test data.

Suggested Approach

Testing Activities

When running an activity to save the run as a test case, it is important to have the clipboard in a known, clear state before starting the run. Otherwise, the clipboard could contain pages from an earlier run that you might not want saved as part of the test case, or which present an inaccurate picture of what the activity's steps do.
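To see why a clear clipboard matters, consider a minimal record-and-replay sketch in Java. This is not Pega's API; the class and method names are invented for illustration. The recorder snapshots whatever is on the clipboard at the end of the run, so leftover pages from an earlier run would be captured as if the activity had produced them.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

/** Hypothetical sketch, not Pega's API: why recording must start from a clean clipboard. */
class TestCaseRecorder {
    private final Map<String, Object> clipboard = new HashMap<>();

    /** Clear leftover pages so the recording reflects only this run. */
    void reset() {
        clipboard.clear();
    }

    /** Run the activity against the clipboard, then snapshot the outcome. */
    Map<String, Object> record(Consumer<Map<String, Object>> activity) {
        if (!clipboard.isEmpty()) {
            // Stale pages from an earlier run would be captured as if this
            // activity had created them, misrepresenting what its steps do.
            throw new IllegalStateException("Clipboard is not clear before recording");
        }
        activity.accept(clipboard);
        return new HashMap<>(clipboard); // saved with the test case
    }
}
```

Calling reset() before record() models the "known, clear state" that the steps below assume.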

To unit test an activity and save the test as a test case rule:
  1. Open the activity rule you want to test.

  2. Click the Run toolbar icon. The Run Rule window appears.

     In V6.1, step 2 is different. To create a test case for an activity in V6.1:
       1. Go to the Test Cases tab of the opened rule.
       2. Click Record New Test Case. The Run Rule window opens.

     The remaining steps apply to V6.1 as well.

  3. In the Test Page section, choose the test page setting for the activity: no test page, a new test page, or an existing test page. (The Copy existing page option is available if your clipboard has pages that you can copy.)
  4. In the Enter Parameters section, enter values for the parameters that the activity needs to run.

    This section lists all of the parameters defined in the activity rule. Parameters shown in bold are those marked Required on the activity rule's Parameters tab.

  5. Click Execute to test the activity rule. The system displays the results from running the activity.

  6. Examine the results and determine whether the test data generated the expected results.
  7. When you are satisfied with the results from running the activity, click Save Test Case to save this run as a test case. The new rule dialog appears.

  8. Enter the name of the test case, a short description, and the appropriate RuleSet and version. Then click Create.
  9. Optional: To add this test case to your list of shortcuts, click Add to Shortcuts. The Add to Shortcuts dialog opens. Enter the name of the shortcut and click Save.
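As a rough mental model of what the saved rule captures, a test case bundles the parameter values you entered, the initial pages, and the results of the recorded run. The field names below are invented for illustration; this is not Pega's internal representation.

```java
import java.util.Map;

/**
 * Hypothetical model of what a saved activity test case captures;
 * the field names are invented, not Pega's internal representation.
 */
record ActivityTestCase(
        String name,                        // test case rule name
        Map<String, String> parameters,     // values from the Enter Parameters section
        Map<String, Object> initialPages,   // pages created, loaded, or copied before the run
        Map<String, Object> expectedResults // clipboard state captured after the run
) {}
```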

Running Activity Test Cases

After you create a test case for an activity, it appears in the list of saved test cases in the Run Rule window for the tested rule.

In V6.1, the steps for running a test case are different. After you create test cases for a rule, they appear on the Test Cases tab for that rule.
To run a test case for an activity in V6.1:
  1. Open the rule that you want to test.
  2. Go to the Test Cases tab of the opened rule.
  3. Click the name of the test case.

    The Run Rule window opens, and the system runs the test case and displays the results.

To run a test case:
  1. Open the activity you want to test.
  2. Click the Run toolbar icon. The Run Rule window appears.
  3. Select the Run against a saved test case option and choose a test case from the list.

Because the test case rule contains the initial pages that were created, loaded, or copied before the rule was run, you do not have to recreate the initial conditions before running the test case.

  4. Click Run Test Case. Process Commander runs the test case and displays the results in the Result section of the Run Rule window. If any differences are found between the current results and the saved test case, a message states that the results were unexpected.
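In record-and-replay terms, running a saved test case restores the captured initial pages, re-runs the activity with the saved parameters, and compares the new clipboard against the saved results. A hypothetical sketch reusing the ActivityTestCase model above (again, invented names, not Pega's API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;
import java.util.function.BiConsumer;

/** Hypothetical replay of a saved test case; names are invented. */
class TestCaseRunner {
    /** Returns the properties whose values differ from the saved run. */
    static Map<String, Object> run(ActivityTestCase testCase,
            BiConsumer<Map<String, String>, Map<String, Object>> activity) {
        // Restore the saved initial pages, so the original conditions
        // do not have to be recreated by hand.
        Map<String, Object> clipboard = new HashMap<>(testCase.initialPages());
        activity.accept(testCase.parameters(), clipboard);

        // Simplified comparison: check each saved result against the new run.
        Map<String, Object> differences = new HashMap<>();
        for (Map.Entry<String, Object> expected : testCase.expectedResults().entrySet()) {
            Object actual = clipboard.get(expected.getKey());
            if (!Objects.equals(expected.getValue(), actual)) {
                differences.put(expected.getKey(), actual); // unexpected result
            }
        }
        return differences; // empty means the run matched the saved test case
    }
}
```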

If the results are unexpected but the new results are valid, you can click Overwrite Test Case to update the test case with the new information.

You can choose to ignore a particular difference by selecting the check box in the Ignore? column. For example, if you have a Date property that the activity sets to the current date plus 5 days, playing back the test case on different days gives different values for that property. Instead of having this flagged as a difference every time, you can have differences in that property ignored. Once you have determined which differences to ignore, click Save Ignores to save your selections to the test case.

Starting with Version 6.1 SP2, you have two additional options for ignoring differences in future runs:
  • Ignore differences on a page: In the list of found differences, you can select a page to ignore all differences found on that page. The selection applies only to this specific test case (not across all test cases); each time this test case runs, all differences found on that page are ignored.
  • Ignore differences for all test cases: You can specify that a difference should be ignored for all test cases in the application.
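A sketch of how these three levels of ignore selections could be applied when filtering differences during playback (hypothetical structure, not how Process Commander stores them):

```java
import java.util.Set;

/**
 * Hypothetical sketch of how saved ignore selections could filter
 * differences during playback; the structure is invented.
 */
class IgnoreFilter {
    private final Set<String> ignoredProperties; // this test case only, e.g. ".DueDate"
    private final Set<String> ignoredPages;      // this test case only, whole pages
    private final Set<String> globalIgnores;     // applies to all test cases (6.1 SP2)

    IgnoreFilter(Set<String> properties, Set<String> pages, Set<String> global) {
        this.ignoredProperties = properties;
        this.ignoredPages = pages;
        this.globalIgnores = global;
    }

    /** A difference is reported only if no ignore selection covers it. */
    boolean shouldReport(String page, String property) {
        return !ignoredPages.contains(page)
                && !ignoredProperties.contains(property)
                && !globalIgnores.contains(property);
    }
}
```

Under this model, Save Ignores persists the two per-test-case sets, while the 6.1 SP2 options add page-level entries and application-wide entries.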

