
This content has been archived and is no longer being updated.

Links may not function; however, this content may be relevant to outdated versions of the product.

Analyzing text in the NLP Sample application in Pega 7.1.9

Updated on August 29, 2018

The NLP Sample is a reference application that showcases the text analytics capability of the Pega 7 Platform. You can analyze text-based content including news feeds, emails, and posts on social media streams such as Facebook, Twitter, and YouTube. This type of information can provide strategic insights and influence enterprise decisions.

The application is distributed as a RAP file that you download and import through Designer Studio.

This tutorial explains how to access the NLP Sample application and analyze the text-based content of tweets, Facebook posts, or YouTube metadata.

You can use this version of the NLP Sample application only with Pega 7.1.9.

Importing the RAP file

In Designer Studio, import the RAP file that contains the NLP Sample application.

  1. Download NLPSample.jar.
  2. Click Designer Studio > Application > Distribution > Import.
  3. In the import wizard, click Browse and select NLPSample.jar.
  4. Click Next, and make sure that the NLPSample.jar file is selected.
  5. Click Next.
  6. Click Next, wait until the import is completed, and click Done.
  7. In the Operator menu, click Operator.
  8. In the Application Access section, add the NLPSample:Administrators access group.
  9. Click the radio button to the left of the NLPSample:Administrators access group. This selection makes NLP Sample your default application.
  10. Log off and log back in with the same credentials.
  11. In the Application menu, verify that your application switched to NLP Sample.

Preparing resources for text analysis

Before you can analyze the text-based content of tweets, Facebook posts, or YouTube metadata, you must complete the following procedures:

  • Create and configure an instance of the Data Set rule that allows you to connect with the Twitter API, Facebook API, or YouTube Data API.
    You can also use one of the sample Data Set instances (Facebook DS, Twitter DS, or YouTube DS) that are delivered in NLPSample.jar, but you must replace the sample access details with your own. The data sets do not work with the sample access details and cause errors.
  • Create and configure an instance of the Free Text Model rule.
  • Create an instance of the Data Flow rule to reference the data set and the Free Text Model rule from the data flow.
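The three rules above form a simple pipeline: the data set fetches text from an external API, the Free Text Model classifies it, and the data flow streams records from one to the other. Outside Pega, that pipeline can be sketched in plain Python. Everything below (the function names, the toy keyword classifier, the sample posts) is illustrative only and is not part of the Pega 7 Platform or of any social-media API:

```python
# Conceptual sketch (not Pega code) of the data set -> model -> data flow pipeline.
# All names here are hypothetical, chosen only for illustration.

def fetch_posts():
    """Stands in for the Data Set rule. In the real application this would
    call the Twitter API, Facebook API, or YouTube Data API with your own
    access details (the sample access details will not work)."""
    return [
        "Loving the new release, great work!",
        "This update is terrible and slow.",
        "Shipping announcement scheduled for Friday.",
    ]

def classify_sentiment(text):
    """Stands in for the Free Text Model rule: a toy keyword-based
    sentiment classifier in place of Pega's trained NLP models."""
    positive = {"loving", "great", "good", "excellent"}
    negative = {"terrible", "slow", "bad", "awful"}
    words = {w.strip(",.!?").lower() for w in text.split()}
    if words & positive:
        return "positive"
    if words & negative:
        return "negative"
    return "neutral"

def run_data_flow():
    """Stands in for the Data Flow rule: streams each record from the
    data set through the model and collects the results."""
    return [(post, classify_sentiment(post)) for post in fetch_posts()]

for post, sentiment in run_data_flow():
    print(f"{sentiment:>8}: {post}")
```

The point of the sketch is the wiring, not the analysis: in the NLP Sample application, each of the three functions corresponds to a rule instance that you create and configure, and the data flow references the other two.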

For more details, see the following tutorials:

Analyzing the text-based content posted on Facebook

Analyzing the text-based content posted on Twitter

Analyzing the metadata of YouTube videos

Viewing text analytics reports

Activate the data flow that you just created and keep it active until it processes some records.

  1. Click the Application menu in Designer Studio and switch to your application.
  2. Open the data flow that you created.
  3. Click Actions > Run.
  4. In the Data Flow Test Run dialog box, click Activate.
  5. Wait until the data flow processes some records.
  6. Click Designer Studio > Text Analytics Reports.
  7. In the Text Analytics Reports window, view the charts that summarize the results of the text analysis.
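Conceptually, the charts aggregate the per-record results that the data flow produced. A minimal sketch of that aggregation, assuming a hypothetical record structure (the real record structure is internal to the Pega 7 Platform):

```python
from collections import Counter

# Hypothetical per-record results, as a data flow might emit them.
records = [
    {"source": "Twitter", "sentiment": "positive"},
    {"source": "Twitter", "sentiment": "negative"},
    {"source": "Facebook", "sentiment": "positive"},
    {"source": "YouTube", "sentiment": "neutral"},
]

# The report charts are essentially counts over these results.
by_sentiment = Counter(r["sentiment"] for r in records)
by_source = Counter(r["source"] for r in records)

print(dict(by_sentiment))  # prints {'positive': 2, 'negative': 1, 'neutral': 1}
print(dict(by_source))     # prints {'Twitter': 2, 'Facebook': 1, 'YouTube': 1}
```

This is why the data flow must stay active until it processes some records: with no processed records, there is nothing for the reports to count.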

You have just completed a text analysis of content from one of the social media streams (Facebook, Twitter, or YouTube). You activated a data flow that references an instance of the Data Set rule and Free Text Model rule. The data set connected to an appropriate API and the Free Text Model rule conducted the text analysis. The results of the analysis are available in the text analytics reports.
