Establish a robust approach to development at the start of the project and document it so that the whole project team can collaborate most effectively to produce the highest-quality outcome.
|Primary role|Lead System Architect|
Understanding testing in Pega
Pega provides many tools and processes to help you maximise your application’s quality throughout development. These are discussed in this Tech Talk, and include:
- Make use of Pega Guardrails
- Consult the Pega Diagnostic Cloud often
- Have a set of journey-centric test plans
- Have a well-defined definition of done
- Structure your testing around the Day One Live plan
- Automate tests early to reduce effort
- Consult the application quality dashboard
Making testing the core of your development practice from as early in the project as possible leads to a robust application of the highest quality.
Agree the overall approach to your testing process early in the project, between the project’s Lead System Architect (LSA), test lead, product owner, and any other stakeholders. Once the process is agreed, testing and quality are everyone’s responsibility.
Defining your testing procedures
Consider the steps below when you are defining your testing procedures as early in your project as you can.
- Consult your application’s guardrail compliance score regularly and aim for it to stay above a threshold that you define; 98% is a good target. Build a guardrail compliance check into your application’s deployment review.
- Consult Pega Diagnostic Cloud (PDC) regularly. PDC highlights areas of concern early, before they become major problems in production. It is always easiest and most cost-effective to resolve issues as early as possible, and regular PDC reviews help you do this. Build a PDC review into your sprint retrospective or similar end-of-sprint activity.
- Write your test plans from a journey-centric point of view. A journey-centric test plan focuses on individual journeys that can be tested in isolation and do not depend on each other. This lets you concentrate testing effort on the functionality that matters: the functions required for the minimum lovable product.
- Build testing into the heart of your Scrum team’s definition of done. You can start with the template definition of done on the Pega website, or create your own. It is best to maintain a single definition of done rather than changing it from project to project. For more information, see Complete to Definition of Done.
- Plan your testing effort around the project’s Day One Live plan so that you focus only on the elements that are essential for your minimum lovable product. Ensure the whole team is aware of the Day One Live plan and the test plan, and align your Scrum test team with both so that it can focus its testing on what is really needed.
- Automate unit testing early, and make it part of your definition of done. Many types of rule can be tested automatically with PegaUnit. Make unit tests mandatory for rules, and aim for 100% test coverage. Ensure that your Scrum test team records Pega Scenario Tests to build a library of repeatable tests based on the user interface of your application.
- As part of your release process, check the application quality dashboard for guardrail compliance and unit test coverage. Do not approve an application for release unless it falls within your team’s documented quality metrics.
- Plan which environments you are going to release some or all of your unit tests to. Configure your DevOps pipeline with a Test application, following the guidelines in Managing test cases separately in Deployment Manager.
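The release-gate step above can be sketched outside Pega as a simple script. This is only an illustration of the principle: the metric names, thresholds, and function are invented for the example and are not a Pega API; your pipeline would supply the real figures from the application quality dashboard.

```python
# Hypothetical release quality gate. The metric names and thresholds are
# illustrative assumptions agreed by the team, not a Pega API.

MIN_GUARDRAIL_SCORE = 98.0  # guardrail compliance threshold, percent
MIN_TEST_COVERAGE = 100.0   # unit test coverage target, percent

def release_approved(metrics):
    """Check release metrics against the team's documented thresholds.

    Returns (approved, reasons): approved is True only when every metric
    meets its threshold; reasons lists each failing check.
    """
    reasons = []
    score = metrics.get("guardrail_score", 0.0)
    if score < MIN_GUARDRAIL_SCORE:
        reasons.append(f"guardrail compliance {score}% is below {MIN_GUARDRAIL_SCORE}%")
    coverage = metrics.get("unit_test_coverage", 0.0)
    if coverage < MIN_TEST_COVERAGE:
        reasons.append(f"unit test coverage {coverage}% is below {MIN_TEST_COVERAGE}%")
    return (not reasons, reasons)

# A release with 98.5% compliance but only 92% coverage fails the gate,
# and the reasons list tells the team exactly which check failed.
approved, reasons = release_approved(
    {"guardrail_score": 98.5, "unit_test_coverage": 92.0}
)
```

The point of returning every failing reason, rather than stopping at the first, is that the release review can record all the gaps in one pass.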
Defects in software are simplest and cheapest to fix when they are detected early. By following the steps above to establish a thorough and robust testing procedure, you give yourself the best chance of detecting and resolving issues as soon as possible.
By building your testing into your definition of done, you are making testing a central part of your project team’s effort, and not just leaving it to your test team to figure out once development is complete.
Work to ensure your whole team understands the importance of testing and that it is not seen as a task that “other people do” – testing is a core part of the team’s development efforts and the whole team must buy into this concept.
Frequently asked questions about your testing process
How far down the environments should I release unit tests?
Unit tests should certainly run on your Development environment, but it can be beneficial to release them (or a subset of them) to downstream environments too. Doing so enables automated smoke-testing of a release, which is particularly useful if you have a large set of static or reference data that must be kept up to date. It is not usual practice to release unit tests to Production, but do consider releasing them to your Test or Staging environments.
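The reference-data case can be sketched as a post-deployment smoke check. Everything here is an invented illustration, not a Pega API: the expected set and the deployed snapshot stand in for whatever reference data your release actually depends on.

```python
# Hypothetical post-deployment smoke check for reference data.
# The currency codes and the function name are invented for illustration.

EXPECTED_CURRENCY_CODES = {"GBP", "USD", "EUR"}

def missing_reference_data(deployed_codes):
    """Return any expected reference-data entries absent after a release."""
    return EXPECTED_CURRENCY_CODES - set(deployed_codes)

# A deployment that dropped EUR would be flagged immediately,
# rather than surfacing later as a user-visible defect.
missing = missing_reference_data(["GBP", "USD"])
```

Running a check like this in Test or Staging catches data gaps at release time instead of in production.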
Is there an Academy course I can complete for unit testing?
Yes, there is a course that covers unit testing. It is based on Pega 8.1 and is accessible at Pega Academy – Unit Testing Rules.
Is there documentation covering unit testing?
Is there more information about all the different types of testing I can do in a Pega application?
How can I stop the personal opinions of my testers becoming “defects”?
It is common for testers to raise bugs that reflect their personal opinion rather than a defect against the Minimum Lovable Product (MLP). This does not mean that the issue is not genuine, but it may be outside the scope of the MLP, or of the current project phase. Two things can help.
Firstly, ensure the testers are aligned around the project’s Day One Live plan so that they know the scope of the application and the problems it is trying to address. Because the Day One Live plan concentrates on what will be different for users on day one, it gives clear scope to the testing.
Secondly, ensure you have an easy-to-access method for testers to raise new work and enhancement requests into the product backlog, so that any enhancements or out-of-scope issues can be captured and addressed in future.
When unit testing, do I test just the happy path, just edge cases, or both?
The project LSA will usually guide you on this, but in general every rule that can have a unit test should have one that tests the “happy path” – that is, the business-as-usual behaviour of the rule. In many cases, this is enough.
If your rule depends on boundary values – for example, a Decision Table that returns a result depending on ranges of financial figures – then add extra tests to cover the boundary conditions, testing as many combinations of values as necessary to catch any issues at the edges of each range.
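The idea can be illustrated outside Pega with plain Python. The function below mirrors the kind of range-based logic a Decision Table might hold; the name, bands, and figures are invented for the example. Happy-path tests cover one typical value per band, and boundary tests pin down behaviour at the exact values where the result changes:

```python
def fee_band(order_value):
    """Hypothetical decision-table logic: map an order value to a fee band."""
    if order_value < 0:
        raise ValueError("order value cannot be negative")
    if order_value < 1_000:
        return "standard"
    if order_value < 10_000:
        return "reduced"
    return "waived"

# Happy-path tests: one typical value per band.
assert fee_band(500) == "standard"
assert fee_band(5_000) == "reduced"
assert fee_band(50_000) == "waived"

# Boundary tests: the exact edges of each range, where off-by-one
# mistakes in the conditions would otherwise go unnoticed.
assert fee_band(999) == "standard"
assert fee_band(1_000) == "reduced"   # first value of the next band
assert fee_band(9_999) == "reduced"
assert fee_band(10_000) == "waived"
```

Note that the happy-path tests alone would pass even if a condition used `<=` instead of `<`; only the boundary tests catch that class of defect.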
Whose responsibility is it to resolve issues with a unit test suite that has errors?
It is your responsibility! The whole team is responsible for ensuring that the unit tests work. Any test that no longer passes indicates a potential issue with the development and needs to be addressed.
What do I do if a user interface change has stopped a scenario test from working?
The easiest way to address this problem is to re-record the scenario test.