AppExchange App Analytics and AWS: How to Automate Pulling Your Daily Log Files

Kam Patel
AppExchange and the Salesforce Ecosystem
8 min read · Sep 3, 2021

Special thanks to my co-author Jeremy Hay Draude.

“Data beats opinions.” That’s a saying that never gets old, and the more data you have access to, the better. With AppExchange App Analytics, ISV partners can get usage data on how users interact with their AppExchange solutions. Access to this kind of data and the related analytics has real benefits for partners, such as identifying attrition risks and informing feature development decisions.

In this blog post, I will walk through the steps needed to deploy the App Analytics app and an AWS stack that fetches App Analytics daily log files and stores them in an AWS S3 bucket. In the next blog post, I will show you how to use Tableau CRM (TCRM) to visualize this data. We’re going to get technical, so grab your caffeinated beverage of choice and let’s dive in.

Check out the on-demand Dreamforce ’21 episode covering this topic! Stream on Salesforce+ today.

Also, I would like to thank my colleague Jeremy Hay Draude, the ISV Platform Expert responsible for Partner Intelligence. To request a consultation, simply log a case through the Partner Community and choose “Customer 360 Platform Expert Consultation” as the subtopic.

The PI-LabApp (App Analytics)

The app is available for free on the AppExchange via the Salesforce Labs program. It includes a “Log Pull Config” object that stores the log pull configuration (app name, package ID) and a “Log Pull Activity” object that tracks each log pull.

Pre-requisites:

  • A Salesforce Partner Business Org (PBO) username with API access enabled
  • An existing AWS account, or sign up for a new one at aws.amazon.com

Disclaimer: The AWS stack in this blog post is provided as-is and as general guidance. You are responsible for everything, including all expenses incurred.

Salesforce Setup

Let’s start with how you’ll need to set up Salesforce. This setup should take around 15 minutes.

Pre-requisite: Make sure your partner business org (PBO) is enabled for daily logs. Verify that you can use this free app and that you can retrieve a file for the data type Package Usage Log (a scripted check is sketched below). If it is not enabled, please open a support case as described in this doc.
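If you want to verify this from code rather than through the UI, below is a minimal sketch that submits an AppAnalyticsQueryRequest for one day of Package Usage Log data and polls for the download URL. It assumes the simple_salesforce Python library; the credentials and package ID are placeholders, and the field and state names follow the AppAnalyticsQueryRequest documentation, so double-check them against your org.

```python
# Minimal sketch: request one day of Package Usage Log data and poll for the file.
# Placeholder credentials/package ID; field and state names per Salesforce docs.
import time
from simple_salesforce import Salesforce

sf = Salesforce(username="me@pbo.example", password="...", security_token="...")

req = sf.AppAnalyticsQueryRequest.create({
    "DataType": "PackageUsageLog",       # the data type this whole setup relies on
    "PackageIds": "033xx0000000000",     # your 15-character package ID
    "StartTime": "2021-08-01T00:00:00Z",
    "EndTime": "2021-08-02T00:00:00Z",
})

# Poll until Salesforce finishes producing the file
while True:
    rec = sf.AppAnalyticsQueryRequest.get(req["id"])
    if rec["RequestState"] in ("Complete", "Failed", "NoData", "Expired"):
        break
    time.sleep(30)

print(rec["RequestState"], rec.get("DownloadUrl"))
```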

Step 1: Install the package in your PBO org

Step 2: Assign the permission set “PILabappConfigPSL” to the relevant users: those who will access the PI Labapp, as well as the API user that will call the AppAnalyticsRequest API from AWS.

Step 3: Configure the Log pull records

Go to the application “PI Labapp”. In the “Log Pull Config” tab, create a new record with the fields described below (a scripted example follows the list).

  • AppName: The name of your app. Note: no special characters are allowed.
  • AppPackageId: The main Package ID of your app, i.e., the first 15 characters of the 18-character Package ID associated with your AppExchange listing (strip the last three characters).
  • Packages: If you have only one package associated with the AppExchange listing, copy the same value as AppPackageId into this field. If you have multiple extension packages associated with the main package, enter them comma-delimited along with the AppPackageId, e.g. packageid1, packageid2, etc.
  • ExpectedVolume: Select ‘HIGH’ if the package log volume is high (the stack will pull each day’s log files via 24 hourly requests). Select ‘LOW’ if the log volume will be low (the stack will pull each day’s log files in a single request).
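If you prefer to script this step, here is a hypothetical sketch of creating the record with the simple_salesforce library. The object and field API names are assumptions (standard custom-object __c suffixes); confirm the actual names in your org’s Object Manager before running anything like this.

```python
# Hypothetical sketch: create the Log Pull Config record from code.
# Object/field API names below are assumptions -- verify them in Object Manager.
from simple_salesforce import Salesforce

sf = Salesforce(username="me@pbo.example", password="...", security_token="...")

package_id_18 = "033xx000000000AAAA"   # 18-character ID from your listing
package_id_15 = package_id_18[:15]     # strip the last three characters, as above

sf.Log_Pull_Config__c.create({         # assumed API name for "Log Pull Config"
    "AppName__c": "MyApp",             # no special characters
    "AppPackageId__c": package_id_15,
    "Packages__c": package_id_15,      # or "pkg1,pkg2" if you have extensions
    "ExpectedVolume__c": "LOW",        # 'HIGH' = 24 hourly pulls/day, 'LOW' = 1
})
```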

AWS Setup

Now, let’s get AWS in a good place. This setup should take around 20 minutes.

Pre-requisite: Use your existing AWS account or sign up for a new one.

Step 1: Install the stack

  • Stack currently supports only one region: us-east-1
  • In the AWS console, change the region to us-east-1 (the region selector is in the menu near the top right of the console).

Note: Other regions will be supported soon; we are working through the additional configuration.

  • Go to AWS Services → CloudFormation → Stacks
  • Click “Create stack” with the option “With new resources (standard)”
  • For the us-east-1 region, use this S3 URL as the template source
  • Name the stack, accept all remaining defaults, and create the stack

Step 2: Update secret keys with PBO login credentials

  • Go to AWS Services → Secrets Manager

Find the secret whose name starts with TemplatedSecret*** and update it with your PBO username and password. Note: Make sure to append your security token to your password. (A scripted version of this update is sketched below.)
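If you would rather do this from code, below is a minimal boto3 sketch. The JSON key names are assumptions; inspect the existing secret value first to confirm the exact structure the stack expects.

```python
# Minimal sketch: update the stack-generated secret via boto3.
# The "username"/"password" key names are assumptions -- check the existing value.
import json
import boto3

sm = boto3.client("secretsmanager", region_name="us-east-1")

# Find the secret the stack created (name starts with "TemplatedSecret")
secret = next(s for s in sm.list_secrets()["SecretList"]
              if s["Name"].startswith("TemplatedSecret"))

sm.put_secret_value(
    SecretId=secret["ARN"],
    SecretString=json.dumps({
        "username": "me@pbo.example",
        "password": "myPassword" + "mySecurityToken",  # token appended to password
    }),
)
```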

Below is a short video showing Steps 1 and 2 (open it in a new tab for best viewing).

Manual Testing

Another saying that will never go out of style: Test, test, test. It’s time to walk through some manual testing. This will take around 30 minutes.

Step 1: In Salesforce, create Log Request records manually. In the PBO org, go to the “PI Labapp-Config” app, open the “Bulk Log Request” tab, select an app, and create records requesting log pulls for a couple of days in the past (dates earlier than today).

This should create Log Pull request records under the “Log Pull Activity” tab. In Step 2, when we run the AWS Step Function, it will process these records.

Step 2: In AWS, run Step function

  • Go to AWS Services → Step Functions → State machines
  • Select the state machine that starts with the name “piappLogRequestStateMachine***”
  • Click “New Execution” and then “Start Execution” (or start it from code, as sketched after this list)
  • Monitor the status: if everything goes well, all steps except the last should turn green. Note: The last step will be red (failed) because it relates to Athena, which we have not yet set up, so ignore it for now.
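The execution can also be started from code. Below is a sketch assuming boto3 and that exactly one state machine name starts with “piappLogRequestStateMachine”.

```python
# Sketch: start the log-pull state machine via boto3 instead of the console.
import boto3

sfn = boto3.client("stepfunctions", region_name="us-east-1")

# Locate the state machine deployed by the stack
machine = next(m for m in sfn.list_state_machines()["stateMachines"]
               if m["name"].startswith("piappLogRequestStateMachine"))

execution = sfn.start_execution(stateMachineArn=machine["stateMachineArn"])
print("Started:", execution["executionArn"])
```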

Step 3: In Salesforce, validate the status. In the PBO org, go to the “PI Labapp-Config” app, open the “Log Pull Activity” tab, and review the logs.

Step 4: In AWS, validate S3 files

  • Go to AWS Services → S3
  • Find the bucket that starts with the name “piappjscdkstack-piapppidailylogbucket***”
  • You should see a dated folder structure with daily log files under the appropriate date folders (you can also verify this from code, as sketched below)
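For a scripted spot-check, this boto3 sketch lists the newest objects in the log bucket. The key paths are whatever the stack writes, so treat the layout as illustrative.

```python
# Sketch: confirm daily log files landed in the S3 bucket.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Find the log bucket created by the stack
bucket = next(b["Name"] for b in s3.list_buckets()["Buckets"]
              if b["Name"].startswith("piappjscdkstack-piapppidailylogbucket"))

for obj in s3.list_objects_v2(Bucket=bucket, MaxKeys=20).get("Contents", []):
    print(obj["LastModified"], obj["Key"])
```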

Automation: Enable Daily Log Pull

Now that manual testing is done, let’s enable the Daily Log Pull job. This will take around 10 minutes.

Step 1: In Salesforce PBO org:

  • In the “Log Pull Config” record for your app, set the “EnableDailyPull” flag to true

Step 2 (Optional): In AWS:

  • Go to AWS Services → CloudWatch → Rules and change the schedule of the run (a scripted sketch follows)
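Here is a sketch of the same change via boto3. The rule-name prefix is an assumption; confirm the actual rule name under CloudWatch → Rules first.

```python
# Sketch: change the daily schedule of the log-pull rule via boto3.
import boto3

events = boto3.client("events", region_name="us-east-1")

# Assumed prefix -- verify the real rule name in the CloudWatch console
rule = next(r for r in events.list_rules()["Rules"]
            if r["Name"].startswith("piapp"))

# Example: run at 06:00 UTC daily instead of the default time
events.put_rule(Name=rule["Name"], ScheduleExpression="cron(0 6 * * ? *)")
```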

AWS Athena Setup

We’re almost done! Let’s spend 15 minutes getting AWS Athena up and running.

Step 1: Run the Glue crawler to detect the Parquet table in the S3 bucket

In AWS, go to AWS Services → Glue. Click on Crawlers and run the crawler so that it detects the table created from the S3 bucket where the daily log files are stored (name starting with piappjscdkstack-piapppidailylogbucket***). Wait for the status to return to “Ready”; you should see one new table created. (A scripted version is sketched below.)
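To run the crawler from code instead, here is a sketch assuming boto3 and that the stack created a single crawler whose name contains “piapp”; verify the name in the Glue console.

```python
# Sketch: run the Glue crawler via boto3 and check its state.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Assumed name fragment -- verify in the Glue console
crawler = next(c for c in glue.list_crawlers()["CrawlerNames"] if "piapp" in c)

glue.start_crawler(Name=crawler)
print(glue.get_crawler(Name=crawler)["Crawler"]["State"])  # RUNNING, then READY
```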

Step 2: Run the Athena queries

Note: Before you run your very first Athena query, you need to set up a query result location in Amazon S3. Select the existing bucket whose name starts with “pilabapp-piapppiathenaoutbucket***”.

  • In AWS, go to AWS Services → Athena → Athena query editor
  • Select the “piappdb” database
  • Run the query MSCK REPAIR TABLE apps to update the log partitions
  • Validate the table: SELECT * FROM "piappdb"."apps" limit 10; (a scripted version of both queries follows)
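Both queries can also be issued through boto3, as in the sketch below. The database and table names come from the steps above; the output bucket name is a placeholder built from the prefix mentioned earlier, so substitute your actual bucket.

```python
# Sketch: run the partition-repair and validation queries via boto3.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

def run(sql):
    return athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "piappdb"},
        # Placeholder -- use your bucket starting with "pilabapp-piapppiathenaoutbucket"
        ResultConfiguration={"OutputLocation": "s3://pilabapp-piapppiathenaoutbucket-example/"},
    )["QueryExecutionId"]

run("MSCK REPAIR TABLE apps")                      # refresh the date partitions
print(run('SELECT * FROM "piappdb"."apps" LIMIT 10'))
```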

Step 3: Create Views

In AWS, go to AWS Services → Athena → Athena query editor:

  • Select the “piappdb” database
  • Open the gist which has the View definitions
  • Copy and run each View creation query (e.g. logins, entity) one by one (a hypothetical illustration follows this list)
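For illustration only, here is a hypothetical example of running one such view-creation query through boto3. The column names in the SQL are placeholders, not the gist’s actual definitions; use the real SQL from the gist.

```python
# Hypothetical illustration: create one of the views via boto3.
# The SQL below is NOT the gist's definition -- its column names are placeholders.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

logins_view = """
CREATE OR REPLACE VIEW logins AS
SELECT organization_id, user_id_token, timestamp_derived
FROM "piappdb"."apps"
WHERE log_record_type = 'Login'
"""

athena.start_query_execution(
    QueryString=logins_view,
    QueryExecutionContext={"Database": "piappdb"},
    # Placeholder output bucket -- substitute yours
    ResultConfiguration={"OutputLocation": "s3://pilabapp-piapppiathenaoutbucket-example/"},
)
```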

Tableau (Optional)

This last step is completely optional, but we still wanted to include it. At this point, you are ready to point Tableau at the Athena table and views and start creating dashboards on the App Analytics data.

Note: Tableau Desktop or online is required. Don’t have this yet and are interested? You can get a 14-day free trial here.

Frequently Asked Questions

Here are a few of the questions we get most often.

Q: How can I monitor AWS step function failures?

A: Go to the Step Function execution history, select an execution, and hover over the different steps in the workflow to check the Input, Output, Exceptions, etc. for each step. You can even check the input/output of steps inside the inner loops (change the index value in the dropdown). A scripted sketch follows.
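A scripted equivalent of that console inspection, assuming boto3: pull the failure events from the most recent execution.

```python
# Sketch: list failure events from the latest state machine execution.
import boto3

sfn = boto3.client("stepfunctions", region_name="us-east-1")

machine = next(m for m in sfn.list_state_machines()["stateMachines"]
               if m["name"].startswith("piappLogRequestStateMachine"))

# Executions are returned most recent first
latest = sfn.list_executions(stateMachineArn=machine["stateMachineArn"],
                             maxResults=1)["executions"][0]

for event in sfn.get_execution_history(executionArn=latest["executionArn"])["events"]:
    if "Failed" in event["type"]:      # e.g. TaskFailed, ExecutionFailed
        print(event["type"], event.get("taskFailedEventDetails", {}))
```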

Q: Can I change the deployed AWS step function and Lambda code directly in AWS console?

A: Absolutely yes.

Q: How do I update my existing AWS stack with the new version?

A: Go to CloudFormation, select the existing stack, click the Update button, choose “Replace current template”, and use the same S3 URL mentioned in Step 1 of the AWS setup (for the us-east-1 region). Then click Next. This should update your AWS stack. (A scripted version is sketched below.)
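The same update can be done from code. Below is a sketch assuming boto3; the stack name is a placeholder, and the template URL is the same S3 URL from Step 1 (elided here).

```python
# Sketch: update the existing stack with the new template version via boto3.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

cfn.update_stack(
    StackName="my-piapp-stack",                  # placeholder -- your stack's name
    TemplateURL="https://s3.amazonaws.com/...",  # same S3 URL as in AWS setup Step 1
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
)
```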

Summary

Having a lens into how users interact with your solutions on AppExchange can prove vital to your strategy and planning. I hope this blog post has helped you configure your Salesforce PBO org and AWS stack so they can start pulling the daily log files and you can start getting these insights. Please check out the second part of this blog series to learn how to visualize this data using TCRM.
