
How To Publicly Display Google Analytics Data

Intro

If you want to publicly display some of your Google Analytics data (like I do on the homepage of this website), you can follow these steps to accomplish it.

The general strategy is that we create an AWS Lambda function that queries the Google Analytics API once a day and saves the data we want as a JSON file in an AWS S3 bucket. We make the bucket publicly readable, and on the front end we fetch the file and display the data.

Authentication Prep

  1. In the Google Developer Console, create a project
  2. Go into the APIs and Services section and enable the Google Analytics Data API for this project
  3. You will be provided with a JSON file that contains your authentication credentials for this API. Save the file.
  4. Inside the authentication JSON file is a client_email field. Note this email address.
  5. Go into Google Analytics Dashboard Admin > Property Settings > Property Access Management
  6. Add the client_email as an authorized user with the Viewer role.
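For reference, the downloaded credentials file is a standard service-account key. It looks roughly like the sketch below (field values are placeholders, and the exact set of fields may vary slightly); client_email is the address you add in Property Access Management:

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "...",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```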

AWS Lambda

We are going to use AWS Lambda to query the data that we want, and then save that data to S3.

To interact with the Analytics API you will use the Google Analytics Data Node.js Client.

tip

You will not be able to package the Google Analytics Data Node.js Client directly into your Lambda zip, as I show how to do in this post: 📘 ncoughlin > aws-lambda-external-libraries, because it exceeds the maximum size that Lambda allows while still letting you edit the function in the online editor.

You will need to create a custom Lambda Layer and attach it to the function.

If you are set up to use AWS SAM you can probably get around this, as you aren't editing in the online editor anyway.
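If you do go the layer route: for Node.js layers, Lambda expects dependencies under a nodejs/node_modules path inside the layer zip, so the archive should look roughly like this (a sketch, with the package name from this post):

```
layer.zip
└── nodejs
    └── node_modules
        └── @google-analytics
            └── data
```

You then attach the published layer to the function, and `require('@google-analytics/data')` resolves as usual.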

Take the authentication JSON file that we got earlier and place it in the main directory of the Lambda function.

Here is the code for the Lambda function:

```js
// The objective of this function is to fetch Google Analytics data.
// It is triggered once daily and stores the data to be fetched by the
// front end, so that it does not have to be fetched on every page visit.

const {BetaAnalyticsDataClient} = require('@google-analytics/data');
const {S3Client, PutObjectCommand} = require('@aws-sdk/client-s3');

exports.handler = async (event, context) => {
  // log event for debugging
  console.log('🌎 EVENT', event);

  // standardized error handler
  const handleError = (error) => {
    console.error('⚠ Full Error Code', JSON.stringify(error));

    const errorResponse = {
      statusCode: 400,
      message: error.message,
      requestId: context.awsRequestId,
      function_name: process.env.AWS_LAMBDA_FUNCTION_NAME,
      function_version: process.env.AWS_LAMBDA_FUNCTION_VERSION,
    };

    console.log('🚧 Custom Error Response', errorResponse);

    throw new Error(JSON.stringify(errorResponse));
  };

  const propertyId = 'YOUR_PROPERTY_ID';
  const analyticsDataClient = new BetaAnalyticsDataClient();
  const s3Client = new S3Client();

  // Returns a YYYY-MM-DD string for the date `daysAgo` days before today
  function getDateDaysAgo(daysAgo) {
    const targetDate = new Date();
    targetDate.setDate(targetDate.getDate() - daysAgo);

    const year = targetDate.getFullYear();
    // Months are 0-based in JavaScript; pad month and day to two digits
    const month = String(targetDate.getMonth() + 1).padStart(2, '0');
    const day = String(targetDate.getDate()).padStart(2, '0');

    return `${year}-${month}-${day}`;
  }

  try {
    // Runs a simple report. Note: no .catch() on the request — if it
    // fails we want the error to propagate to the outer try/catch
    // instead of silently returning undefined.
    const runReport = async () => {
      const [response] = await analyticsDataClient.runReport({
        property: `properties/${propertyId}`,
        dateRanges: [
          {
            startDate: getDateDaysAgo(60),
            endDate: getDateDaysAgo(1),
          },
        ],
        dimensions: [{name: 'date'}],
        metrics: [{name: 'totalUsers'}, {name: 'screenPageViews'}],
        orderBys: [
          {
            desc: false,
            dimension: {
              dimensionName: 'date',
              orderType: 'NUMERIC',
            },
          },
        ],
      });

      return response;
    };

    const report = await runReport();

    // Convert response to JSON string
    const jsonData = JSON.stringify(report);

    // Parameters for S3
    const s3Params = {
      Bucket: 'YOUR_BUCKET_NAME',
      Key: 'analytics.json',
      Body: jsonData,
      ContentType: 'application/json',
    };

    // Upload to S3
    await s3Client.send(new PutObjectCommand(s3Params));

    // Return a success message
    return {
      statusCode: 200,
      body: JSON.stringify({
        message: 'Report generated and uploaded to S3 successfully',
      }),
    };
  } catch (error) {
    handleError(error);
  }
};
```

Make sure to update the analytics property and S3 bucket variables with the correct values.

Additionally you need to do three things for this Lambda function to work:

  1. In the Lambda configuration, set the timeout to 1+ minutes. That's more than is necessary, but the request takes longer than the three-second default, and if you don't increase it the function will always time out.
  2. Update the IAM role for this Lambda function to have permission to modify the S3 bucket.
  3. Add the environment variable Key: GOOGLE_APPLICATION_CREDENTIALS Value: /var/task/service_account.json, where service_account.json is the name of the credentials file you placed in the directory of the Lambda function.
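For the IAM permission in step 2, a minimal policy statement attached to the function's execution role might look like the sketch below (bucket name and key are the placeholders from this post; broaden the Resource if you write other objects):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/analytics.json"
    }
  ]
}
```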

This last one is a bit confusing, but essentially when the analytics client is initialized it automatically looks for an environment variable GOOGLE_APPLICATION_CREDENTIALS whose value is the path to the JSON file that contains the credentials.

Configure S3 Bucket

Your analytics.json file should now be landing in your S3 bucket, but it won't be publicly available.

Go to bucket permissions and set the Bucket Policy to:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "allowPublicAccess",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```

and set the CORS policy to this:

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]
```

The contents of this bucket will now be publicly available.

Schedule Lambda Function with EventBridge

At this point we have a Lambda function that fetches the last 60 days of analytics data and saves it in a public S3 bucket. Lastly, we need to trigger this function automatically once a day, indefinitely.

The easiest way to do that is to use AWS EventBridge Schedules, where you simply select the Lambda function you want to trigger, and when you want to trigger it.

The cron expression to trigger the function every day at 3 AM (UTC) is:

0 3 * * ? *

EventBridge cron fields are minute, hour, day-of-month, month, day-of-week, and year; the ? is required because you cannot specify both day-of-month and day-of-week.

Fetch Data on Front End

To fetch the data on the front end you simply make a GET request to the file in the bucket using whatever method you prefer, for example:

```js
const response = await fetch(
  'https://ncoughlin-analytics.s3.amazonaws.com/analytics.json',
);
```

Once you have the data in the front-end, you can manipulate it and display it however you want 👍
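The fetched report still has the GA4 Data API shape: an array of rows, each holding dimensionValues and metricValues. A minimal sketch of flattening it into chart-ready objects — parseAnalyticsReport and the sample stub are my own names, and the response shape is assumed from the report configured earlier (one date dimension, then totalUsers and screenPageViews metrics):

```javascript
// Flatten a GA4 runReport response into simple objects for charting.
// Assumes dimensions = [date] and metrics = [totalUsers, screenPageViews].
function parseAnalyticsReport(report) {
  return (report.rows || []).map((row) => ({
    // GA4 returns dates as YYYYMMDD strings; reformat to YYYY-MM-DD
    date: row.dimensionValues[0].value.replace(
      /^(\d{4})(\d{2})(\d{2})$/,
      '$1-$2-$3',
    ),
    // metric values arrive as strings; convert to numbers
    totalUsers: Number(row.metricValues[0].value),
    screenPageViews: Number(row.metricValues[1].value),
  }));
}

// Example with a stubbed response in the GA4 Data API shape
const sample = {
  rows: [
    {
      dimensionValues: [{value: '20240301'}],
      metricValues: [{value: '12'}, {value: '34'}],
    },
  ],
};

console.log(parseAnalyticsReport(sample));
// → [ { date: '2024-03-01', totalUsers: 12, screenPageViews: 34 } ]
```

From here the flattened array drops straight into whatever charting library you are using.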
