AppExchange App Analytics and AWS: Visualize Adoption Metrics Using TableauCRM Dashboards

Kam Patel
5 min read · Nov 3, 2021

--

Special thanks to my co-authors Jeremy Hay Draude and Anika Teppo.

In part one of this series, we walked you through the setup of PILabapp, which pulls the daily AppAnalytics log files of your AppExchange app into an S3 bucket. In part two, we'll cover how you can aggregate the S3 data and stage it for TableauCRM to consume.

Since AppAnalytics data accumulates over time and can grow into terabytes (unless you purge it periodically), it's a good idea to aggregate it so that it can be easily consumed by Business Intelligence (BI) tools.
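To give a feel for what such an aggregation looks like, here is a minimal boto3 sketch that runs an Athena query rolling up raw log rows into a small daily summary. The database, table, column names, and output bucket are placeholders for illustration only, not the names created by the PILabapp stack.

```python
import boto3

# Minimal sketch: aggregate raw AppAnalytics logs with Athena so BI tools
# read a small daily roll-up instead of terabytes of raw log files.
# NOTE: database, table, columns, and output bucket are placeholders.
athena = boto3.client("athena", region_name="us-west-2")

query = """
    SELECT date(event_timestamp) AS log_date,
           user_id               AS user_id,
           count(*)              AS event_count
    FROM appanalytics_db.raw_logs
    GROUP BY 1, 2
"""

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "appanalytics_db"},
    ResultConfiguration={"OutputLocation": "s3://my-summary-bucket/daily/"},
)
print("Started Athena query:", response["QueryExecutionId"])
```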

How about trying out these awesome dashboards yourself? You can! Go to the PILabapp listing on AppExchange, select the "Test Drive" option, and follow along with the In-App Guidance to check out sample adoption dashboards (for another labapp called "tagit").

Prerequisite:

  • Set up the process to pull the daily log files into the S3 bucket (you can review these steps in part one of this series)

AWS setup

This setup should take around 15 minutes.

Step 1: Manually test the step function (scheduled daily) that pulls the Athena views' data and stages the aggregated data in S3 so that TableauCRM can pull it using the S3 connector

  • Go to AWS Services → Step Functions → State machines
  • Select the state machine that starts with the name “piappSummaryStateMachine***”
  • Click “New Execution” and then “Start Execution”
  • Monitor the status. If everything goes well, it should be all green, and you will see aggregated data under the S3 bucket "piappjscdkstack-piapppisummarybucket***". Each folder under this S3 bucket holds the corresponding aggregated view data in CSV format. If you prefer to run this check from code, see the sketch after this list.
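If you'd rather script this manual test than click through the console, a boto3 sketch like the one below starts the same execution and polls its status. The state machine name prefix matches the name shown above; the bucket name is a placeholder for the full name in your account.

```python
import time
import boto3

sfn = boto3.client("stepfunctions")
s3 = boto3.client("s3")

# Find the summary state machine by its name prefix.
machines = sfn.list_state_machines()["stateMachines"]
machine = next(m for m in machines if m["name"].startswith("piappSummaryStateMachine"))

# Start a one-off execution -- the scripted equivalent of "Start Execution" in the console.
execution = sfn.start_execution(stateMachineArn=machine["stateMachineArn"])

# Poll until the execution finishes.
while True:
    status = sfn.describe_execution(executionArn=execution["executionArn"])["status"]
    if status != "RUNNING":
        break
    time.sleep(10)
print("Execution finished with status:", status)

# Spot-check that aggregated CSVs landed in the summary bucket.
# Replace the bucket name with the full name from your account.
objects = s3.list_objects_v2(Bucket="piappjscdkstack-piapppisummarybucket-example")
for obj in objects.get("Contents", []):
    print(obj["Key"])
```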

Step 2: Save AWS access keys

  • Go to AWS → IAM and create an access key to be used in the next steps (a scripted equivalent follows below).
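For reference, the same access key can be created programmatically. This is a minimal boto3 sketch, assuming you already have a dedicated IAM user for the TableauCRM connector (the user name here is a placeholder).

```python
import boto3

iam = boto3.client("iam")

# Create an access key for the IAM user that the TableauCRM S3 connector will use.
# "tableaucrm-connector" is a placeholder; use the IAM user from your own setup.
key = iam.create_access_key(UserName="tableaucrm-connector")["AccessKey"]

# The secret is returned only once, so store it securely (e.g., in a secrets manager).
print("Access key ID:", key["AccessKeyId"])
print("Secret access key:", key["SecretAccessKey"])
```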

Salesforce TableauCRM Setup

Let’s connect TableauCRM to Athena views. This should take around 20 minutes.

Step 1: Go to Data Manager.

  • Click “Connect to Data”
  • Click “Add Connection”
  • Select the connection type "AWS S3". For the Connection Name, Developer Name, and Description fields, use "piaws".
  • Enter the AWS access key and secret key (from AWS setup, Step 2). The Master Symmetric Key is not needed. In the Folder Path field, copy and paste the bucket name from AWS S3 starting with pilabapp-piapppisummarybucket*** (do not enter "S3://", just the bucket name).

Note: In the Region field, make sure to enter the "Region Name" (not the region code), as described in the documentation. The sketch after this step shows a quick way to look up your bucket's region.

  • Create the connection.
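Because the connection form expects the region's display name rather than its code, a quick boto3 sketch like the one below can confirm which region your bucket lives in and which top-level folders the connector will see. The bucket name is a placeholder.

```python
import boto3

s3 = boto3.client("s3")
bucket = "pilabapp-piapppisummarybucket-example"  # placeholder; use your bucket name

# get_bucket_location returns the region *code* (e.g. "us-west-2");
# the connection form expects the matching region *name* (e.g. "US West (Oregon)").
location = s3.get_bucket_location(Bucket=bucket)["LocationConstraint"] or "us-east-1"
print("Bucket region code:", location)

# List the top-level "folders" (one per Athena view) that will show up
# when you create the connection objects in the next step.
resp = s3.list_objects_v2(Bucket=bucket, Delimiter="/")
for prefix in resp.get("CommonPrefixes", []):
    print("Folder:", prefix["Prefix"])
```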

Step 2: Create the required objects pointing to S3 folders (one at a time).

  • Follow the previously mentioned steps by clicking "Connect to Data" and selecting the AWS S3 connection (i.e., "piaws") that was created in the previous step. This will show the folders (one folder for each Athena view). Pick one of the views and click "Continue".
  • Select all the fields and click "Continue".

Note: If you get the error "Can't retrieve field values, but you can still edit the object settings ….", just ignore it and continue.

Step 3: Repeat the previous step for the remaining views.

  • Once you are done creating all the required objects, go to the following step.

Step 4: On "piaws", click "Run now" using the drop-down arrow on the right.

  • This will sync the data from AWS S3 bucket.
  • Set up a schedule so this sync runs daily, as described in the documentation.

Step 5: Create an output connector.

Make sure you have enabled "Enable Salesforce output connection" and "Allow recipes to use direct data" in your org's Analytics settings.

For the Entity-FeatureMapping to work automatically, let’s set up an output connection.

  • Click “Connect to Data”.
  • Go to Output Connections and open the drop-down next to the "local" connection.
  • Click "Edit connection", enter the username/password of your PBO org, and for "Service URL" use: https://login.salesforce.com/services/Soap/u/52.0
  • Click "Save & Test". (If the test fails, the sketch after this list can help confirm the credentials themselves.)
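If "Save & Test" fails, it can help to first confirm that the credentials authenticate against the login endpoint at all. This is a rough sketch using the third-party simple_salesforce library (an assumption on my part, not something this setup requires); it is only a credential check, not what the output connector does internally.

```python
from simple_salesforce import Salesforce

# Sanity-check that the PBO org credentials authenticate against login.salesforce.com.
# All values below are placeholders; a security token may be required
# when connecting from outside your org's trusted IP ranges.
sf = Salesforce(
    username="admin@your-pbo-org.example",
    password="your-password",
    security_token="your-security-token",
    domain="login",  # matches https://login.salesforce.com
)
print("Authenticated against instance:", sf.sf_instance)
```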

Step 6: Install PILabapp extension package.

Now let's install an unmanaged extension package (which contains the TableauCRM dashboards and recipes) in your PBO org, where you already have PILabapp installed if you followed part one of this series. Make sure to select the option "Install for All Users".

Step 7: Run the recipe which should create datasets.

  • In the Data Manager, go to “Dataflows & Recipes” and select PI_Main recipe.
  • Click "Run now", which will create the datasets.
  • Go to Monitor and check the status. Note: it can take around five minutes to complete.
  • Set up a schedule so the recipe runs daily, as described in the documentation.

After the recipe run is complete, you will see the pilabapp__FeatureEntityMapping__c custom object populated with all of your entities. Go ahead and update the picklist values of the "pilabapp__Feature__c" field as appropriate for your app, and then modify the FeatureEntityMapping records with the correct feature name you want to associate with each entity. The next time you run the recipe, your feature adoption dashboards will show adoption by the configured features. (If you have many entities, the sketch below shows one way to update the mapping records in bulk.)
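If your app has many entities, updating the mapping records from a script may be faster than editing them one at a time. This is a rough sketch using the third-party simple_salesforce library; the field pilabapp__Entity__c is a hypothetical name for the entity column (only pilabapp__Feature__c is named above), so check the object's actual schema first.

```python
from simple_salesforce import Salesforce

# Bulk-assign feature names to FeatureEntityMapping records.
# simple_salesforce is a third-party library; credentials and the
# pilabapp__Entity__c field name below are placeholders/assumptions.
sf = Salesforce(username="admin@your-pbo-org.example",
                password="your-password",
                security_token="your-security-token")

records = sf.query(
    "SELECT Id, pilabapp__Entity__c FROM pilabapp__FeatureEntityMapping__c"
)["records"]

# Map entity API names to the feature (picklist value) they belong to.
feature_by_entity = {
    "tagit__Tag__c": "Tagging",       # example mapping; adjust for your app
    "tagit__TagItem__c": "Tagging",
}

for rec in records:
    feature = feature_by_entity.get(rec["pilabapp__Entity__c"])
    if feature:
        sf.pilabapp__FeatureEntityMapping__c.update(
            rec["Id"], {"pilabapp__Feature__c": feature}
        )
```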

Step 8: Validate the Dashboards and datasets.

  • Go to PI Labapp and check out the dashboards

Step 9 (Optional): Add the Dashboards in the new Lightning App Builder pages as shown in the test drive.

Summary

And there you have it! We connected Salesforce TableauCRM to the AWS S3 bucket and installed an extension package with recipes & dashboards. This solution is very flexible and easy to extend with additional Athena views (S3 aggregations) and TableauCRM dashboards.


Kam Patel

Hello! I work at Salesforce as a technical expert for ISV partners. I love going for long walks, socializing, public speaking and teaching kids in rural areas.