Load-balancing

Load-balancing is a technique or facility that attempts to spread demand evenly across multiple processors or production facilities. The term is used in two unrelated contexts.

Balancing assignment workloads

If the operators in a team or work group are equally capable of performing a new assignment, a router task (Router) in the flow can send the assignment to the operator with the shortest worklist, in the interests of fairness and throughput.

The definition of "shortest" may differ from application to application, and may not be a simple count of assignments on each operator's worklist.

For example, the standard routing activity named Work-.ToLeveledGroup sends an assignment to the operator within a specific work group who has the least urgent total worklist, based on a computed score. It uses the standard function pickLoadBalancedWorkGroup (in the Routing library) to combine information about operator skills, operator availability, work object urgency, and effort. As output, this activity identifies the one member of the work group who receives the next assignment.
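The sketch below illustrates the underlying idea of scoring each operator's worklist and routing to the lightest one. The class and the scoring formula (urgency weighted by effort) are hypothetical simplifications for illustration; they are not the implementation of pickLoadBalancedWorkGroup, which also weighs operator skills and availability.

import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.Optional;

class WorklistBalancer {

    /** One open assignment on an operator's worklist. */
    record Assignment(double urgency, double effortHours) {}

    /**
     * Score an operator's backlog: here, the sum of urgency weighted by
     * remaining effort. A lower score means a lighter, less urgent worklist.
     */
    static double worklistScore(List<Assignment> worklist) {
        return worklist.stream()
                .mapToDouble(a -> a.urgency() * a.effortHours())
                .sum();
    }

    /** Pick the eligible operator with the lowest backlog score. */
    static Optional<String> pickOperator(Map<String, List<Assignment>> worklists) {
        return worklists.entrySet().stream()
                .min(Comparator.comparingDouble(
                        (Map.Entry<String, List<Assignment>> e) -> worklistScore(e.getValue())))
                .map(Map.Entry::getKey);
    }
}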

Balancing HTTP traffic across multiple server nodes

In a multinode Process Commander system, server demand from interactive user sessions is ideally balanced across nodes in proportion to their processing power. (Some processing workload arises from agents and services rather than from interactive users; that work can be confined to designated nodes.)

Typically, such load balancing is achieved with hardware routers that support "sticky" HTTP sessions, so that a user who happens to first log into node ALPHA remains with ALPHA for the duration of the session. For example, Cisco Systems Inc. and F5 Networks Inc. offer such hardware (among others).
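The following is a minimal sketch of the session-affinity ("sticky") idea, assuming the balancer keys on a session identifier. The class and its round-robin fallback are illustrative only and do not reflect how any particular vendor's balancer is implemented.

import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicInteger;

class StickyBalancer {
    private final List<String> nodes;                       // e.g. "ALPHA", "BETA"
    private final AtomicInteger nextNode = new AtomicInteger();
    private final ConcurrentMap<String, String> affinity = new ConcurrentHashMap<>();

    StickyBalancer(List<String> nodes) {
        this.nodes = nodes;
    }

    /**
     * Route a request: a session that has already been assigned a node keeps
     * that node for its duration; a new session is assigned the next node
     * round-robin.
     */
    String routeFor(String sessionId) {
        return affinity.computeIfAbsent(sessionId,
                id -> nodes.get(Math.floorMod(nextNode.getAndIncrement(), nodes.size())));
    }
}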

However, in some situations, software-based load balancing is appropriate. For example, IBM's WebSphere Edge Components includes a Load Balancer module. In some environments, reverse proxy servers are useful.

Related articles on the Pega Developer Network (PDN) may be helpful.

Tip: To support reverse proxy serving, add the following prconfig Dynamic System Settings:

<env name="Initialization/ContextRewriteEnabled" value="true" />
<env name="Initialization/SetBaseHTMLContext"
     value="http://revproxy/AcmeCorp/setup" />

Dynamic System Setting: prconfig/Initialization/ContextRewriteEnabled/default

The prconfig/Initialization/ContextRewriteEnabled/default setting enables Reverse Proxy Server functionality. To use either the SetContextURI HTTP header or the SetBaseHTMLContext setting, this setting must be set to true.

Caution: If a URL is specified for the GatewayURL setting, that URL overrides any entries for this setting and for SetBaseHTMLContext.

The following values are valid for this Dynamic System Setting:

Value    Description
true     Enables Reverse Proxy Server functionality.
false    Disables Reverse Proxy Server functionality. (Default)


Dynamic System Setting: prconfig/Initialization/SetBaseHTMLContext/default

The prconfig/Initialization/SetBaseHTMLContext/default setting specifies the base URL that the system uses when composing URLs in generated HTML, so that links resolve through the reverse proxy (for example, http://revproxy/AcmeCorp/setup in the Tip above) rather than directly to an application server node.

The following values are valid for this Dynamic System Setting:

Value Type    Description
String        A base URL, such as http://revproxy/AcmeCorp/setup in the Tip above.

Caution: Change this setting with care. Prconfig Dynamic System Settings can have broad impact on the operation of your system. See How to create or update a prconfig setting.

 

Testing under load

You can verify operation of a load-balancing scheme with open source toolsets such as OpenSTA (the Open System Testing Architecture, at www.opensta.org) or with commercial software products.
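As a rough illustration, the sketch below fires a batch of concurrent HTTP requests at a balanced URL and tallies which node answered each one, assuming each node identifies itself in a response header; the header name X-Node and the URL are hypothetical. Dedicated tools such as OpenSTA drive far heavier, scripted load.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.IntStream;

class LoadCheck {
    public static void main(String[] args) {
        HttpClient client = HttpClient.newHttpClient();
        Map<String, Integer> hitsPerNode = new ConcurrentHashMap<>();

        // Fire concurrent requests and tally which node served each one.
        IntStream.range(0, 200).parallel().forEach(i -> {
            try {
                HttpRequest request = HttpRequest.newBuilder(
                        URI.create("http://loadbalancer.example.com/prweb")).build();
                HttpResponse<Void> response =
                        client.send(request, HttpResponse.BodyHandlers.discarding());
                String node = response.headers()
                        .firstValue("X-Node").orElse("unknown");
                hitsPerNode.merge(node, 1, Integer::sum);
            } catch (Exception e) {
                hitsPerNode.merge("error", 1, Integer::sum);
            }
        });

        // A reasonably even spread across nodes suggests the balancer is working.
        System.out.println(hitsPerNode);
    }
}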

Definitions: router task, skills, urgency
Related topics: Clusters — Concepts and terms
