
AppTweak API Automation

Use Google Cloud Scheduler, Cloud Functions, and Google BigQuery to automate your data pull from our API and connect it to Looker Studio.

Written by Olivier Verdin
Updated over a year ago

Welcome

Hey! If you are reading this, chances are you are curious about gaining deeper insights into the dynamic app store ecosystem. Well, you are in luck! In this tutorial we are going to cover how to create a custom dashboard by tapping into the power of AppTweak's API. If at any point you feel confused or get stuck, do not hesitate to contact us.

The use case we are going to be setting up

Suppose that while tinkering with the AppTweak tool, you detect some download spikes for one of your apps, but despite your best efforts you are unable to map this phenomenon to the reason that caused it. After hours of thinking, you realize it might be useful to infer the cause from keyword ranking history: maybe a competitor fell back on a keyword you were advertising for, or maybe some seasonality kicked in. Custom dashboards like this are only one of the many things you can build with the AppTweak API, and in this tutorial you are going to learn how to harness that power for the aforementioned case.

What's the architecture of what we are going to be building?

Imagine the good old pneumatic messaging system, powered by pipes, little cylinders with messages in them, and two people: the receiver and the sender. Back before the internet existed, this was peak instant messaging across an organization: the sender would write a message on a piece of paper, encapsulate it in the cylinder, and place it in the correct pipe. The system would then carry the message to the receiver, who would do something based on this input. This is pretty much what we are going to build to get automated dashboards on the Google Cloud platform.

We are going to need:

  • A Google Cloud Pub/Sub channel with a single topic; this is the pneumatic tubing system of our analogy

  • A Google Cloud Scheduler job that sends messages periodically; this is the sender in our analogy

  • A Google Cloud serverless function that fetches the data from AppTweak's API; this is the receiver, who does a task based on the message they get.

On top of this, we are going to need a couple more things to make the receiver's work possible:

  • A Google BigQuery database, where we are going to store the information we fetch from AppTweak.

  • A Google Data Studio dashboard to visualize this data

When you are done with the tutorial, you will have a dashboard that compares the app's download estimates with its featuring events, and that refreshes automatically.
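Before diving into the setup, here is the whole flow sketched in plain Python, with an in-memory queue standing in for Pub/Sub (purely illustrative; the real pieces are the Google Cloud services we configure below):

```python
from queue import Queue

# An in-memory stand-in for the Pub/Sub topic (the "pneumatic tube").
topic = Queue()

def scheduler_job():
    """The sender: Cloud Scheduler publishing a trigger message."""
    topic.put(b"start_fetching")

def cloud_function():
    """The receiver: a function that reacts to the message."""
    message = topic.get()
    # In the real setup, this is where we would call AppTweak's API
    # and write the results to BigQuery.
    return f"received {message.decode()}"

scheduler_job()
print(cloud_function())  # -> received start_fetching
```

The rest of the tutorial replaces each of these stand-ins with its managed counterpart: the queue becomes a Pub/Sub topic, the sender becomes a Cloud Scheduler job, and the receiver becomes a Cloud Function.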

Setting up Google Cloud

The first step in this tutorial is creating your Google Cloud project. In this service we'll store the code that fetches the data for our use case, and we'll set up the database. These words may seem complex, but don't despair: we are going to guide you every step of the way, and you'll realize it's super simple.

  1. Go to Google Cloud's home page and sign in

  2. Click on "Get Started for free". A new window will open; sign up for the service by completing your account details. You will be redirected to your Google Cloud Billing console.

  3. Click on Google Cloud in the top left corner

  4. Click on "My First Project" on the top-left corner, and in the dialog window, click on "New Project"

Name the project "AppTweak-KeywordRankings". If you want to, or if your organization already has a team in Google Cloud, fill out the "Organization" field too.

Wait for the project to be created, and once it's done, select it from the notifications menu or from the dropdown in the top left corner.

There you go! Your Google Cloud Account is all set up!

Creating a Pub/Sub Channel

Great, so after creating our Google Cloud account, we need to create the Pub/Sub channel. Remember our little analogy? Well, the Pub/Sub is the pneumatic tubing system itself; here we are going to configure the pipes themselves.

Please, follow these steps:

  1. Click on the Hamburger Icon

  2. Browse for "Pub/Sub" under More Products > Analytics, and select Topics


    💡Tip: when accessing menu items in Google Cloud menus, you can click on the pushpin icon 📌 to pin options to the top of the menu, so you don't have to find them again.

  3. Click on Create Topic


  4. Use fetch.downloads.trigger as the ID and create

Awesome! If everything is okay, you should be seeing this screen.

❓Not what you see? Contact us and we can help you.

Now you have the messaging system in place! We do need to tweak how it works a bit, to ensure it's resilient to any problems we might encounter in the future. Let's follow these steps to set that up.

  1. Select Subscriptions in the left menu; you should see a subscription named fetch.downloads.trigger-sub


  2. Click on the three dots, and click on Edit

  3. Under Expiration period, change the value to Never expire. This way you don't have to recreate all this configuration if you pause the functions that fetch data from AppTweak for any period of time.

  4. Under Retry policy, change it from Retry immediately to Retry after exponential backoff. Set the minimum backoff to 60 seconds and the maximum to 600 seconds.

  5. Click on Update at the bottom

❓Had any issues? Contact us and we can help you.

Awesome, so now we have the piping system in place, and if any messages get stuck in the pipes, the system will retry. Messages that are not consumed are also set to never expire, which comes in handy if our code breaks or if we pause it.
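With these settings, each failed delivery is retried after a delay that doubles from the minimum up to the maximum. A quick sketch of the resulting wait times (illustrative only; Pub/Sub also adds random jitter to each delay):

```python
MIN_BACKOFF = 60   # seconds, as configured above
MAX_BACKOFF = 600

def backoff_delays(attempts):
    """Doubling delays, capped at the maximum (jitter omitted)."""
    return [min(MAX_BACKOFF, MIN_BACKOFF * 2 ** n) for n in range(attempts)]

print(backoff_delays(6))  # -> [60, 120, 240, 480, 600, 600]
```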

Creating a Cloud Scheduler Job

We now need to create the agent responsible for pushing messages down the pipes. We can configure that with a Cloud Scheduler Job. Follow these steps:

  1. Click on the Hamburger icon (☰) on the top-left corner

  2. Select Cloud Scheduler under More Products > Integration Services

  3. Click on Create Job


    👀 You might have to setup your billing information for this step

  4. In the name section put fetch-data-message-producer

  5. For region choose europe-west1

  6. Under description you can put something that you like, or leave it empty, as this field is optional.

  7. Under frequency put 0 23 * * 7; this will trigger the data fetch every Sunday at 23:00.

  8. For the Timezone select Belgium (CET), since AppTweak offices are located there.

Your settings should look like this:
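If cron syntax is unfamiliar: `0 23 * * 7` reads as minute 0, hour 23, any day of month, any month, day-of-week 7 (Sunday). A small sketch that computes what that schedule means, i.e. the next Sunday 23:00 after a given moment:

```python
from datetime import datetime, timedelta

def next_sunday_2300(now):
    """Next occurrence of Sunday 23:00, which is what `0 23 * * 7` schedules."""
    run = now.replace(hour=23, minute=0, second=0, microsecond=0)
    # In Python, Monday is weekday() == 0 and Sunday is 6.
    days_ahead = (6 - now.weekday()) % 7
    run += timedelta(days=days_ahead)
    if run <= now:
        run += timedelta(days=7)  # already past this week's slot
    return run

print(next_sunday_2300(datetime(2023, 3, 15, 12, 0)))  # a Wednesday -> 2023-03-19 23:00:00
```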

Click on Continue and do the following:

  1. Select Pub/Sub from the dropdown menu under Target type

  2. Select the Pub/Sub channel we just created under topic

  3. For the body, you can put any string you like as the content is irrelevant for this. As an example we are using start_fetching

If everything is okay, you should have the following config:

Awesome! Now we are just missing one final piece of the puzzle. Click on continue and do the last two steps:

  1. Set the max retry attempts to 5

  2. Change the min backoff duration to 60 secs, and the max backoff to 1h

Your settings should look like this:

If that's the case, click on Create, and you are done: the scheduler is set up.

❓Had any problems? Contact us and we can help you.

Setting up BigQuery

Remember how in the analogy we've been using, we had a worker who would do something when a message came through the piping system? Well, we know that worker is going to fetch data from AppTweak's API, but they have to store that information somewhere. For this we are going to use the BigQuery service.

  1. In your Google Cloud console, click on the Hamburger icon (☰) on the top-left corner

  2. Browse for "BigQuery", under More Products > Analytics.

    ℹ️ Even though BigQuery has a submenu, we want to click on the menu item itself

  3. In the Explorer section, you should see an item with the name of the project ID; if we followed all the steps, it should be apptweak-keywordrankings. You can double-check by going to the main screen and copying the value there

  4. Click on the three dots, then click on Create Dataset

  5. Under Dataset ID, set keyword_rankings

  6. Leave everything else as is. If everything looks good, it should be:

That should be all for this step!
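For reference, the function you will deploy in the next section writes into a table inside this dataset. A small sketch of the identifiers involved (names taken from this guide; the table itself is created automatically on the first write):

```python
PROJECT_ID = "apptweak-keywordrankings"  # your project id
DATASET = "keyword_rankings"
TABLE = "apptweak_keyword_rankings_data"

destination = f"{DATASET}.{TABLE}"             # what pandas_gbq's to_gbq() receives
full_table_id = f"{PROJECT_ID}.{destination}"  # how BigQuery displays it

# Columns the function writes, taken from the code later in this tutorial.
EXPECTED_COLUMNS = [
    "date", "keyword", "app_id", "app_name",
    "country", "installs", "rank", "fetch_performed_rank",
]

print(full_table_id)  # -> apptweak-keywordrankings.keyword_rankings.apptweak_keyword_rankings_data
```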

Adding the code

So, we have our worker that sends messages, and we have the pipeline to transport said messages. We are just missing the final piece of the puzzle: the actual code that integrates with AppTweak.

During this step, you may see some popups asking you to enable some Google APIs; just click on Enable each time you are asked.

For this we are going to set up a Cloud Function, so without further ado, the steps:

  1. Click on the Hamburger icon (☰) on the top-left corner

  2. Browse for Cloud Functions it should be under More Products > Serverless

  3. Click on "Create Function"

  4. Select 2nd Gen as the Environment

  5. Use fetch-downloads-data as the name

  6. For region, choose europe-west1.

Your configuration so far should look like this

Good? Great, now, under Trigger, click on ADD EVENTARC TRIGGER.

  1. Enable the API.

  2. In the dialog, for the event, leave google.cloud.pubsub.topic.v1.messagePublished

  3. Select the Pub/Sub topic we created in the previous steps

  4. For the Region, keep europe-west1, as before

  5. Grant the permissions necessary

  6. Leave Service account as is

  7. Check the Retry on failure option

If everything went well, this should be your configuration:

If everything looks good, go ahead and save the trigger. Now we need some final touches: expand Runtime, build, connections and security settings.

  1. Under memory allocated, change it to be 512MiB

  2. Change the timeout to be 300s

At the bottom you will see a button for environment variables; let's create two of them.

  1. APPTWEAK-API-KEY as the name, and your AppTweak API key as the value

  2. GOOGLE-CLOUD-PROJECT-ID as the name, and your project ID as the value. It should be apptweak-keywordrankings

You may see the following warning in this step:

Feel free to ignore it.
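These two variables mirror how the function's code reads its configuration: the in-file constants win if set, and otherwise the code falls back to `os.environ.get(...)`. A minimal sketch of that lookup (the environment values here are placeholders):

```python
import os

# Simulate the variables set in the Cloud Function configuration.
os.environ["APPTWEAK-API-KEY"] = "your-api-key-here"  # placeholder value
os.environ["GOOGLE-CLOUD-PROJECT-ID"] = "apptweak-keywordrankings"

API_KEY = ""  # left empty in main.py, so the environment value wins
if API_KEY == "":
    API_KEY = os.environ.get("APPTWEAK-API-KEY", "")

print(API_KEY)  # -> your-api-key-here
```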

Click on Next and let's follow the next steps:

  1. In the left side panel click on the Runtime dropdown and select Python 3.10

  2. Under entry point, change it to be consume_event

  3. Replace the requirements.txt file with the following:

    functions-framework==3.*
    typing
    datetime
    python-dateutil==2.8.2
    requests
    pandas==1.4.3
    pandas_gbq==0.17.8


    ⚠️ Make sure each of these statements is on its own line as shown in the image

  4. Replace the main.py file with the following:

import base64
import functions_framework
import os
import numpy as np

# Triggered from a message on a Cloud Pub/Sub topic.
@functions_framework.cloud_event
def consume_event(_cloud_event):
    # try:
    fill_default_parameters()
    parameters_validation()
    recipe_parser = AppKeywordsRankingEvolution()
    recipe_parser.fetch_and_parse_data()
    recipe_parser.export_to_big_query()
    # except ApiRuntimeError as e:
    #     recipe_parser.export_to_big_query()
    #     print_need_help()
    #     raise e
    # except Exception as e:
    #     print_need_help()
    #     print_exception(e)
    #     raise e

"""
Recipe : App Keyword Ranking Evolution

The way this recipe works:
1. You provide the list of apps & keywords for which you would like to have rank & installs
2. Get the app installs & rank for each keyword for the past week:
https://developers.apptweak.com/reference/app-keyword-ranking-history
3. Output a csv breakdown of the data.

This recipe is useful to plot the relative strength of keywords based on the app rank and the number
of installs it provides
"""

### ------------------ Your job ------------------ ###

# Fill the different parameters between the double quotes

# Set your API token (mandatory if you want to run the recipe :D)
API_KEY = ""

# Set the device (device options : "iphone", "ipad" or "android")
# Default : "iphone"
DEVICE = ""

# Give the list of app ids here separated by a comma
# Warning: do not mix Android and iOS ids as that
# will cause the request to fail.
# Default for ios (iphone/ipad) : "388947468,409128287"
# Default for android : "uk.co.telegraph.kindlefire,com.guardian"
# (The Telegraph: UK & World News and The Guardian: Breaking News)
APPS = "1001501844,1058959277,719972451"

# Give the list of keywords here separated by a comma
# Default : "newspaper,news,info,uk info,live news"
KEYWORDS = "food delivery,food,delivery"

# Give the list of countries code here separated by a comma
# You can find the list of all the country codes supported
# here : https://developers.apptweak.com/reference/country-codes
# Default : "gb"
COUNTRIES = "be"

# Give the start date to get the downloads
# Default : One week ago on ISO 8601 format
# (Eg 2022-01-21)
START_DATE = "2021-11-2"

# Give the end_date to get the downloads
# Default : Today on ISO 8601 format
# (Eg 2022-01-28)
END_DATE = "2021-12-2"

# Give here the list of metrics you want separated by a comma
# Available metrics : "installs", "rank"
# Default : "installs,rank"
METRICS = ""

# ONLY FOR GOOGLE CLOUD INTEGRATION
GOOGLE_CLOUD_PROJECT_NAME = ""

# For more information about the endpoint parameters and options visit:
# https://developers.apptweak.com/reference/app-keyword-ranking-history

### ---- End of your job, just run and enjoy ! ---- ###

### -------------- ENTER MAGIC AREA --------------- ###
# Keep scrolling only if you are already an experienced wizard !

from typing import Dict
from datetime import datetime
from dateutil import relativedelta
import requests
import pandas as pd
import time

# ------ Exceptions creation -------

class ParameterValidationError(Exception):
    BASE_MESSAGE = "Parameter validation error"

    def __init__(self, parameter: str, value: str, message: str):
        self.parameter = parameter
        self.value = value
        self.message = f"{self.BASE_MESSAGE} :\n{self.parameter} : {self.value}\n{message}"
        super().__init__(self.message)

class ApiParameterValidationError(ParameterValidationError):
    def __init__(self, api_value: str):
        super().__init__("API", api_value, "Missing API token")

class DeviceParameterValidationError(ParameterValidationError):
    def __init__(self, device_value: str, allowed_devices: str):
        super().__init__("Device", device_value, f"Device not in allowed devices list {allowed_devices}")

class AppsFormatParameterValidationError(ParameterValidationError):
    def __init__(self, device: str, value: str, message: str):
        self.device = device
        super().__init__(f"Apps format for {self.device}", value, message)

class AppsFormatAndroidParameterValidationError(AppsFormatParameterValidationError):
    def __init__(self, value: str, wrong_app: str, wrong_character: str, idx_wrong_character: int):
        super().__init__("android", value, f"Android ids can't contain numbers (found app \"{wrong_app}\" with \"{wrong_character}\" at index {idx_wrong_character})")

class AppsFormatIosParameterValidationError(AppsFormatParameterValidationError):
    def __init__(self, value: str, wrong_app: str, wrong_character: str, idx_wrong_character: int):
        super().__init__("ios", value, f"iOS ids can only contain numbers (found for app \"{wrong_app}\" with \"{wrong_character}\" at index {idx_wrong_character})")

class StartDateParameterValidationError(ParameterValidationError):
    def __init__(self, start_date: str):
        super().__init__("start_date", start_date, "Format of start date not following YYYY-MM-DD")

class EndDateParameterValidationError(ParameterValidationError):
    def __init__(self, end_date: str):
        super().__init__("end_date", end_date, "Format of end date not following YYYY-MM-DD")

class EndDateBeforeStartDateParameterValidationError(ParameterValidationError):
    def __init__(self, start_date: str, end_date: str):
        super().__init__("end_date < start_date", f"Start date : {start_date}\n End date : {end_date}", "Start date needs to be before end date")

class MetricsParameterValidationError(ParameterValidationError):
    def __init__(self, metrics: str, allowed_metrics: str):
        super().__init__("metrics", metrics, f"Metric not in allowed metrics list {allowed_metrics}")

class GoogleCloudParameterValidationError(ParameterValidationError):
    def __init__(self):
        super().__init__("google cloud project name", "", "Google cloud project name is empty!")

class ApiRuntimeError(Exception):
    BASE_MESSAGE = "Error calling PublicAPI"

    def __init__(self, reason: str):
        self.reason = reason
        self.message = f"{self.BASE_MESSAGE} :\n{self.reason}"
        super().__init__(self.message)

class NotEnoughCreditsError(ApiRuntimeError):
    def __init__(self):
        super().__init__("Account with the associated API Token has run out of credits")

class WrongAPIToken(ApiRuntimeError):
    def __init__(self):
        super().__init__("The API token provided is not correct")

# If a param validation fails on the API side.
# An example could be a wrong country/language combination
class RuntimeParameterValidationError(ApiRuntimeError):
    def __init__(self, reason: str, parameter: str):
        super().__init__(f"Parameter {parameter} validation failed: {reason}")

# 500s will be caught here
class InternalError(ApiRuntimeError):
    def __init__(self):
        super().__init__("Something went terribly wrong in the internals of the API")

# ------ End exceptions ------

class AppKeywordsRankingEvolution:
    def __init__(self) -> None:
        self.API_KEY = {"x-apptweak-key": API_KEY}
        self.KW_RANKINGS_ENDPOINT = "https://public-api.apptweak.com/api/public/store/apps/keywords-rankings/history.json"
        self.APP_METADATA_ENDPOINT = "https://public-api.apptweak.com/api/public/store/apps/metadata.json"

        self.DEVICE = DEVICE

        self.APPS = APPS.split(",")
        self.APPS_STRING = APPS

        self.KEYWORDS = KEYWORDS.split(",")
        self.KEYWORDS_STRING = KEYWORDS

        self.COUNTRIES = COUNTRIES.split(",")
        self.COUNTRIES_STRING = COUNTRIES

        self.START_DATE = START_DATE
        self.END_DATE = END_DATE

        self.KW_METRICS = METRICS

        self.GOOGLE_CLOUD_PROJECT_NAME = GOOGLE_CLOUD_PROJECT_NAME

        self.credit_cost = 0
        self.data = pd.DataFrame()

        self.print_config()

    # Prints the current configuration
    def print_config(self) -> None:
        print()
        print("#----- Configuration of the recipe ----#")
        print("API_KEY : **************** (FILTERED)")
        print(f"DEVICE : {self.DEVICE}")
        print(f"APPS : {self.APPS_STRING}")
        print(f"KEYWORDS : {self.KEYWORDS_STRING}")
        print(f"COUNTRIES : {self.COUNTRIES_STRING}")
        print(f"START_DATE : {self.START_DATE}")
        print(f"END_DATE : {self.END_DATE}")
        print(f"METRICS : {self.KW_METRICS}")
        print("#---------- End configuration ---------#")
        print()

    # Gets data from the API, mapping HTTP error codes to exceptions
    def get_url(_self, url: str, at_key: Dict) -> Dict:
        response = requests.get(url, headers=at_key)

        # API error checking
        if response.status_code == 500:
            raise InternalError()
        elif response.status_code == 422:
            parameter = response.json()["error"]["parameter"]
            reason = response.json()["error"]["reason"]
            raise RuntimeParameterValidationError(reason, parameter)
        elif response.status_code == 403:
            raise NotEnoughCreditsError()
        elif response.status_code == 401:
            raise WrongAPIToken()

        resp_dict = response.json()
        return resp_dict

    # Gets installs and rankings for the given keyword/app/country
    def get_keyword_installs_and_ranking_per_app(self, keyword: str, app: str, country: str):
        query = f"?apps={app}&keywords={keyword}&metrics={self.KW_METRICS}&country={country}&device={self.DEVICE}&start_date={self.START_DATE}&end_date={self.END_DATE}"
        url = self.KW_RANKINGS_ENDPOINT + query

        resp_dict = self.get_url(url, self.API_KEY)

        self.credit_cost += resp_dict["metadata"]["request"]["cost"]
        print(f"Keyword Installs - The current credit cost is: {self.credit_cost}")

        return resp_dict

    # Gets the app name for an app id
    def get_app_name(self, app: str, country: str):
        query = f"?apps={app}&country={country}&device={self.DEVICE}"
        url = self.APP_METADATA_ENDPOINT + query

        resp_dict = self.get_url(url, self.API_KEY)

        self.credit_cost += resp_dict["metadata"]["request"]["cost"]
        print(f"App name - The current credit cost is: {self.credit_cost}")

        return resp_dict

    # The name of the CSV is app_keyword_ranking_report_XXXXXX
    # where XXXXXX is the current timestamp
    def export_to_csv(self):
        """Writes the data to a csv"""
        if self.data.empty:
            return

        current_time = datetime.now().strftime("%Y-%m-%d-%H.%M.%S")
        self.data.reset_index(drop=True, inplace=True)
        self.data.index.name = "index"
        self.data.to_csv(
            f"app_keyword_ranking_report_{current_time}.csv"
        )

    def export_to_big_query(self):
        """Writes the data to BigQuery"""
        if self.data.empty:
            return

        dataset_name = "keyword_rankings"
        table_name = "apptweak_keyword_rankings_data"

        # Work around int64 not being able to be saved
        blank_index = [""] * len(self.data)
        self.data.index = blank_index
        self.data["rank"] = self.data["rank"].replace({None: np.nan})
        self.data.to_gbq(destination_table=f"{dataset_name}.{table_name}", project_id=self.GOOGLE_CLOUD_PROJECT_NAME, if_exists="replace")

    # Yields (country, app, keyword, metrics, app_name) tuples from the API
    def fetch_data(self):
        for country in self.COUNTRIES:
            for app in self.APPS:
                try:
                    app_name = (
                        self.get_app_name(app, country)
                        .get("result")
                        .get(app)
                        .get("metadata")
                        .get("title")
                    )
                except ApiRuntimeError as e:
                    print("An error occurred while getting the data from the api")
                    print(f"Error: {e.reason}")
                    raise e
                except Exception:
                    print(f"No data for app {app}")
                    continue
                print(f" ------ Starting fetch for {country} - {app} ------- ")
                for keyword in self.KEYWORDS:
                    try:
                        kw_metrics = (
                            self.get_keyword_installs_and_ranking_per_app(keyword, app, country)
                            .get("result")
                            .get(app)
                            .get(keyword)
                        )
                        yield (country, app, keyword, kw_metrics, app_name)
                    except Exception:
                        print(f"No data for keyword {keyword}")
                        continue

    def fetch_and_parse_data(self):
        for _i, (country, app, keyword, kw_metrics, app_name) in enumerate(self.fetch_data()):
            kw_ranks = pd.DataFrame(kw_metrics["rank"])
            kw_ranks.rename(columns={"value": "rank", "fetch_performed": "fetch_performed_rank"}, inplace=True)

            kw_installs = pd.DataFrame(kw_metrics["installs"])
            kw_installs.rename(columns={"value": "installs"}, inplace=True)

            kw_metrics_df = kw_ranks.join(kw_installs.set_index("date"), on="date")

            kw_metrics_df["app_id"] = app
            kw_metrics_df["app_name"] = app_name
            kw_metrics_df["keyword"] = keyword
            kw_metrics_df["country"] = country
            self.data = pd.concat([self.data, kw_metrics_df], ignore_index=True)

        if not self.data.empty:
            self.data = self.data[["date", "keyword", "app_id", "app_name", "country", "installs", "rank", "fetch_performed_rank"]]

def fill_default_parameters():
    global API_KEY, DEVICE, APPS, KEYWORDS, COUNTRIES, START_DATE, END_DATE, METRICS, GOOGLE_CLOUD_PROJECT_NAME
    # API_KEY
    if API_KEY == "":
        API_KEY = os.environ.get("APPTWEAK-API-KEY", "")

    # DEVICE
    if DEVICE == "":
        DEVICE = "iphone"

    # APPS
    if APPS == "":
        if DEVICE == "android":
            APPS = "uk.co.telegraph.kindlefire,com.guardian"
        else:
            APPS = "388947468,409128287"

    # KEYWORDS
    if KEYWORDS == "":
        KEYWORDS = "newspaper,news,info,uk info,live news"

    # COUNTRIES
    if COUNTRIES == "":
        COUNTRIES = "gb"

    # START_DATE
    if START_DATE == "":
        START_DATE = (datetime.now() - relativedelta.relativedelta(days=8)).strftime("%Y-%m-%d")

    # END_DATE
    if END_DATE == "":
        END_DATE = (datetime.now() - relativedelta.relativedelta(days=1)).strftime("%Y-%m-%d")

    # METRICS
    if METRICS == "":
        METRICS = "installs,rank"

    # GOOGLE_CLOUD_PROJECT_NAME
    if GOOGLE_CLOUD_PROJECT_NAME == "":
        # Default name in the guide
        GOOGLE_CLOUD_PROJECT_NAME = os.environ.get("GOOGLE-CLOUD-PROJECT-ID", "")

# --- Parameters validation ---

def parameters_validation():
    # API_KEY
    if API_KEY == "":
        raise ApiParameterValidationError(API_KEY)

    # DEVICE
    ALLOWED_DEVICES = ["iphone", "ipad", "android"]
    ALLOWED_DEVICES_STRING = ",".join(ALLOWED_DEVICES)
    if DEVICE not in ALLOWED_DEVICES:
        raise DeviceParameterValidationError(DEVICE, ALLOWED_DEVICES_STRING)

    # APPS
    apps_list = APPS.split(",")
    if DEVICE == "android":
        for app in apps_list:
            for idx, c in enumerate(app):
                if c.isdigit():
                    raise AppsFormatAndroidParameterValidationError(APPS, app, c, idx)
    else:
        for app in apps_list:
            for idx, c in enumerate(app):
                if not c.isdigit():
                    raise AppsFormatIosParameterValidationError(APPS, app, c, idx)

    # START_DATE
    try:
        datetime.strptime(START_DATE, "%Y-%m-%d")
    except ValueError:
        raise StartDateParameterValidationError(START_DATE)

    # END_DATE
    try:
        datetime.strptime(END_DATE, "%Y-%m-%d")
    except ValueError:
        raise EndDateParameterValidationError(END_DATE)

    # START_DATE < END_DATE
    START_DATE_DATE = datetime.strptime(START_DATE, "%Y-%m-%d")
    END_DATE_DATE = datetime.strptime(END_DATE, "%Y-%m-%d")
    if END_DATE_DATE < START_DATE_DATE:
        raise EndDateBeforeStartDateParameterValidationError(START_DATE, END_DATE)

    # METRICS
    ALLOWED_METRICS = ["installs", "rank"]
    ALLOWED_METRICS_STRING = ",".join(ALLOWED_METRICS)
    for metric in METRICS.split(","):
        if metric not in ALLOWED_METRICS:
            raise MetricsParameterValidationError(METRICS, ALLOWED_METRICS_STRING)

    # GOOGLE PROJECT ID
    if GOOGLE_CLOUD_PROJECT_NAME == "":
        raise GoogleCloudParameterValidationError()

def print_need_help():
    print()
    print("######################################")
    print("Any problem? Contact us : link")
    print("--------------------------------------")

def print_exception(e: Exception):
    print("For more experienced wizards, here is the error:")
    print()
    message_attr = getattr(e, "message", None)
    if message_attr:
        print(message_attr)
    else:
        print(repr(e))
    print("######################################")


⚠️ Make sure the code keeps its indentation when you paste it; Python is whitespace-sensitive, and badly formatted code will fail to deploy. This can be the hardest step in the tutorial, so if you have any issues, contact us.

Now click on Deploy (this step may take a few seconds or minutes). You'll know it's ready when there is (hopefully) a green check next to the function name

Notes:

  • If you are not using the default project name, you will have to configure it in the GOOGLE_CLOUD_PROJECT_NAME variable

  • For the purposes of this tutorial, we recommend running the code without touching the variables, but if you wish to change the data being fetched, please do so
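Under the hood, the message body you typed into the Scheduler job arrives at the function base64-encoded inside the event payload, which is why main.py imports `base64`. A sketch of what decoding it looks like, using a hand-built stand-in for the delivered payload:

```python
import base64

# A hand-built stand-in for the event data Pub/Sub delivers to the function.
event_data = {"message": {"data": base64.b64encode(b"start_fetching").decode()}}

payload = base64.b64decode(event_data["message"]["data"]).decode()
print(payload)  # -> start_fetching
```

Our function ignores the payload entirely (any message triggers a fetch), which is why the tutorial says the body content is irrelevant.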

Testing that everything went well

To test that everything went right, we'll manually trigger the scheduler to produce a message so that our code can consume it.

  1. Click on the Hamburger icon (☰) on the top-left corner

  2. Go to Cloud Scheduler under More Products > Integrations Services

  3. On the job we created earlier, click on the three dots and click on Force run

  4. Wait a few moments; it shouldn't take long. If everything is correct, the function should be running.

  5. Wait a few minutes for the function to finish

  6. Click on the Hamburger icon (☰) on the top-left corner

  7. Navigate to BigQuery

  8. Under the item in the explorer page you should see the dataset you created earlier. Click on the arrow

  9. Click the arrow to view the contents, and you should see a table named apptweak_keyword_rankings_data

  10. Click on it, and under the tab preview you should see the data you just fetched

Congratulations! You have automated fetching the data!

Didn't Work?

Do not despair: contact us and we can help you figure out what's wrong.


Creating a Google Data Studio Dashboard

Now you can create visualizations on top of this data with any tool that you prefer. You can do it with Google Data Studio or Tableau, as both of them connect to BigQuery seamlessly. In this tutorial, we are going to cover how to create a simple Google Data Studio dashboard.

  1. Go to the Google Data Studio homepage

  2. Click on Create

  3. Click on BigQuery under the Google Connectors when choosing your data source

  4. Authorize to connect

  5. Click on the project id, dataset and table, and click on Add

When prompted, add the data to the dashboard:

If everything went well, you should have a table in the view, and some options in the sidebar, as seen on this image:

For this case, we want a dashboard that shows the performance of some keywords and the app rankings for each. Not only does this allow direct comparison between competitors, but we can also identify keyword strengths and weaknesses for certain apps. What do we do with this? Well, we can plan our advertising strategies accordingly, or change our app metadata to improve in certain areas.

We will then create two graphs.

  1. Start by selecting the table and click on Chart

  2. Select then Time Series Chart

Great! Now you should have a simple graph; let's make sure the setup is correct. Select the graph and click on the Setup tab

Then:

  1. Select date as the Date Range Dimension and as the Dimension

  2. Select Keyword as the Breakdown Dimension

  3. Select rank as the Metric and as the Breakdown Dimension Sort, and make sure to select "Average"; if you are unsure where, it's the button highlighted in purple in the following image:

If you followed the steps, you should have a graph that looks like the following:

Right now, that graph shows the average rank per keyword across the three apps. This is not very interesting, however, so let's make some changes to be able to filter by app.

  1. Click on Add Control, and then click on Drop-down list

  2. Drag the menu item so that it snaps to the graph. You will notice it's snapping because the background of the graph will turn light blue and a subtle red line will appear

  3. Under Setup, click on the Control field and select app_name

And that's it! You may not see the graph change yet; that's because we need to de-select some options from the drop-down we just created. Go ahead and click on the arrow next to the drop-down and play around with it

Let's now create a competitor comparison graph.

  1. Make a new page by clicking on Add page

  2. Copy the graph we created and paste it in the new page

  3. Select it and under setup, change Breakdown dimension to app_name

  4. Create another Control Option, and snap it to this chart

  5. Select keyword under Control Field

And that should be all! Do not hesitate to contact us if you get stuck at any step.
