Understanding unit test cases

Updated on August 16, 2021

A test case identifies one or more testable conditions (assertions) that are used to determine whether a rule returns an expected result. Reusable test cases support the continuous delivery model, providing a way to test rules on a recurring basis to identify the effects of new or modified rules.

You can run test cases whenever code changes are made that might affect existing functionality. For example, an account executive wants to ensure that a 10% discount is applied to all preferred customers. You create a test case that verifies that this discount is applied to all preferred customers in the database. The test case fails if there are any preferred customers to whom the 10% discount is not applied. You then add a new preferred customer to the database and run the test case to make sure that the customer is correctly configured to receive the discount and that the discount for other preferred customers is not affected.

Additionally, you can group related unit test cases into a test suite so that you can run multiple test cases and suites in a specified order. For example, you can run related test cases in a regression test suite when changes are made to application functionality. For more information about test suites, see Grouping test cases into suites.
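The idea of grouping related test cases into a suite that runs in a specified order can be sketched with Python's built-in unittest framework; this is an analogy for the concept, not the product's own suite mechanism, and the test classes are invented for the example.

```python
import unittest

PREFERRED_DISCOUNT = 0.90  # assumed business rule for this sketch

class DiscountTests(unittest.TestCase):
    def test_preferred_gets_discount(self):
        self.assertAlmostEqual(100.0 * PREFERRED_DISCOUNT, 90.0)

class RegressionTests(unittest.TestCase):
    def test_standard_price_unchanged(self):
        self.assertEqual(50.0 * 1.0, 50.0)

# Build a suite so related test cases run together, in a fixed order.
suite = unittest.TestSuite()
suite.addTest(DiscountTests("test_preferred_gets_discount"))
suite.addTest(RegressionTests("test_standard_price_unchanged"))

result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Running the suite after each change to application functionality gives the regression signal described above.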

After you create test cases and test suites, you can run them in a CI/CD pipeline for your application by using Deployment Manager or a third-party automation server such as Jenkins. For more information, see Using Deployment Manager for model-driven DevOps.

You can unit test the following types of rules:

  • Activities
  • Case types
  • Collections
  • Data pages
  • Data transforms
  • Decision tables
  • Decision trees
  • Declare expressions
  • Flows
  • Map values
  • Report definitions
  • Strategies
  • When

Typically, you unit test a rule, and then convert it to a test case. For flow and case type rules, you record the test case.

  • Viewing test coverage reports

    View a report that contains the results of test coverage sessions to determine which rules in your application are not covered with tests. You can improve the quality of your application by creating tests for all uncovered rules that are indicated in the reports.

  • Creating unit test cases for flows and case types

    When you create a unit test case for a flow or case type, you run the flow or case type and enter data for assignments and decisions. The system records the data that you enter in a data transform, which is created after you save the test form. You can start recording at any time.

  • Defining expected test results with assertions

    Use unit test cases to compare the expected output of a rule to the actual results returned by running the rule. To define the expected output, you configure assertions (test conditions) on the test case; when the test runs, these assertions are compared to the results returned by the rule.
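The expected-versus-actual comparison that an assertion performs can be sketched generically. This is a hedged Python illustration: the `shipping_rate` stand-in for a rule, the property names, and the `check_assertions` helper are all hypothetical.

```python
# Generic sketch of assertion checking: run the rule, then compare each
# expected property value to the actual value on the result.

def shipping_rate(region, weight_kg):
    """Stand-in for a rule under test (e.g., a decision table)."""
    if region == "domestic":
        return 5.0 if weight_kg <= 1 else 9.0
    return 20.0

def check_assertions(actual, assertions):
    """Return a list of (property, expected, actual) mismatches."""
    return [(prop, expected, actual[prop])
            for prop, expected in assertions.items()
            if actual[prop] != expected]

actual = {"Rate": shipping_rate("domestic", 0.5)}
mismatches = check_assertions(actual, {"Rate": 5.0})
assert not mismatches  # the test passes when every assertion holds
```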

  • Opening a unit test case

    You can view a list of the unit test cases that have been created for your application and select the one that you want to open.

  • Running a unit test case

    Run a unit test case to validate rule functionality.

  • Viewing test case results

    After you run a unit test case, you can view the results of the test run.

  • Exporting a list of test cases

    You can export a list of all the unit test cases that are in your application or configured on a rule form.

  • Managing unit tests and test suites

    On the Application: Unit testing landing page, you can run unit test cases and test suites that are not marked as disabled, and view reports that identify rules that did not pass unit testing, to check the quality of your application.
