May 6, 2019

BigQuery for marketers: getting started with Google’s data warehouse


[ Updated Mar 7, 2023 ]

8-MINUTE READ · By Dino Mahi

Spreadsheets have long been the staple for marketing data analysis. However, the ever-evolving needs of customer and marketing data analysis are compelling an increasing number of marketers to explore more specialized tools.

Spreadsheets work great in that, once raw data has been captured, they can be used to quickly build basic calculations, pivot tables, and visualizations with very little technical knowledge. There is almost no reliance on IT and usually no additional investment in procuring spreadsheet software. Even the most challenging part, data procurement, can be greatly simplified with third-party add-ons that automatically fetch data from source systems without any programming.

Yet, as data analysis needs become more complex, spreadsheets quickly run out of utility. Ad-hoc data analysis requires a whole new set of underlying technical capabilities, and until recently, obtaining these was a complex undertaking that required extensive IT involvement and major capital investment, not to mention long implementation timelines. New tools such as Google BigQuery have largely removed that complexity and those cost- and time-related barriers. Using BigQuery, marketers can quickly migrate from spreadsheets to fully-fledged business intelligence tools such as Tableau, Looker, QlikView, and Google Data Studio to turn raw data into actionable insights.

In this article, we provide a brief introduction to Google BigQuery, including some key use cases for marketing data analysis. Before doing that, however, here is a quick review of some of the key scenarios where a data warehouse works better than spreadsheets.

Google BigQuery: what can marketers do with it?

Crunch big data: A marketing data warehouse like BigQuery shines when working with datasets that have a large number of rows with multiple columns, and where the data is updated very frequently. The highly optimized technical architecture of BigQuery allows for quick analysis of even terabyte-scale datasets.

Combine data from multiple sources: With the proliferation of martech tools, marketers now have vast amounts of data stored across multiple platforms. However, it’s only when that data is brought and blended together in one centralized place that you’ll have a single source of truth. From ad data to web analytics data and pre-conversion data to post-conversion data, getting all that into a data warehouse is quick, simple, and robust—and that’s when you can start getting your hands dirty with in-depth analytics and reporting work.

Access all your historical data: Several marketing platforms limit the amount of historical data you can access. For example, Google Search Console offers six months of historical data within its native interface, but with applications built on top of its API, like Supermetrics for Google Sheets and Supermetrics for Data Studio, you can extend that to up to 16 months. With a data warehouse, you can store all your historical data from all your marketing platforms so you never lose it, giving you a huge repository of data to analyze and gather insights from.

Automate data refreshes: Reports and analyses created using spreadsheets are typically based on data snapshots that need to be manually refreshed. With a data warehouse such as BigQuery, these snapshots can be automatically refreshed at regular intervals using API connections to backend systems. This is an important requirement when working with dashboards that need to automatically show the most recent data.

Conduct ad-hoc data analysis: Truly meaningful insights almost always require arbitrary slicing and dicing of data. It is a common requirement to create custom metrics and dimensions that are not present in the original data but are derived from primary fields. Running reports with parameterized inputs, showing visualizations based on complex filters, and creating hierarchical groups of dimensions are all routine analysis tasks that can be easily implemented by connecting BigQuery to a data analysis/visualization tool.
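To make the derived-metric idea concrete, here is a minimal sketch of an ad-hoc query run through BigQuery's official Python client. The project, dataset, table, and column names (`my_project.marketing.ad_performance`, `impressions`, `clicks`, `cost`) are hypothetical placeholders, not part of any specific connector.

```python
# Minimal sketch: an ad-hoc query with derived metrics (CTR, CPC) using the
# official google-cloud-bigquery client. All table and column names below
# are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses default project and credentials

sql = """
SELECT
  campaign,
  SUM(clicks) / NULLIF(SUM(impressions), 0) AS ctr,  -- derived metric: click-through rate
  SUM(cost)   / NULLIF(SUM(clicks), 0)      AS cpc   -- derived metric: cost per click
FROM `my_project.marketing.ad_performance`
WHERE date BETWEEN '2023-01-01' AND '2023-03-31'
GROUP BY campaign
ORDER BY ctr DESC
"""

for row in client.query(sql).result():
    print(row.campaign, row.ctr, row.cpc)
```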

These are just a few of the scenarios where something like BigQuery works better than spreadsheets. A sound approach to implementing such complex analysis tasks is to follow a conceptual architecture along the lines described below.

BigQuery-based data analysis: a marketer’s perspective

At a high level, marketing data analysis and reporting involves the following 4 components:

  1. Data storage: A place to store and organize the data
  2. Connectivity: A way of pushing data from source systems into the data store
  3. Report builder: An interactive data analysis tool that can connect to the data store and create analysis reports
  4. Online dashboards: A dashboard tool where reports created in Step 3 can be published and shared

BigQuery is Google's solution for the data storage layer above. It is a highly scalable, cost-effective data warehouse that is purpose-built for storing and querying large datasets. The connectivity layer comes in the form of pre-built connectors for Google BigQuery, and an increasing number of vendors are creating adapters that automatically feed data into BigQuery. The report builder and dashboard capabilities can be implemented with a number of tools, including Tableau, Looker, Power BI, and Google Data Studio.
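For readers curious what the connectivity layer does under the hood, here is a minimal sketch of pushing rows from a source system into BigQuery with the official Python client. A dedicated connector handles this automatically; the table name and row fields below are hypothetical placeholders.

```python
# Minimal sketch of the connectivity layer: loading rows fetched from a
# source system into a BigQuery table. A scheduled job could run this after
# each API pull. The table name and fields are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my_project.marketing.facebook_ads_daily"  # hypothetical table

rows = [
    {"date": "2023-03-01", "campaign": "spring_sale", "impressions": 12000, "clicks": 340, "cost": 151.20},
    {"date": "2023-03-01", "campaign": "retargeting", "impressions": 8000, "clicks": 410, "cost": 98.75},
]

# Streaming insert; returns a list of row-level errors (empty on success).
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Encountered errors while inserting rows:", errors)
```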

Notice that the components above are mostly plug-and-play and cloud-based, requiring little to no IT involvement for initial setup and little, if any, capital expense.


Putting Google BigQuery to use: 4 examples

Now that we have a basic understanding of what Google BigQuery is and the general challenges it addresses, let us dive into a few specific use cases to better understand its value proposition for digital marketers compared to spreadsheet-based data analysis.

1. Deep-dive analysis of Facebook Campaigns

Consider a large Facebook ads agency looking to optimize ad spend for a large multinational client. The agency runs multiple prospecting and retargeting campaigns on the Facebook platform and the wider Facebook network and has the following analysis requirements:

  • Identify the audience group that consistently performs best on CTR, CPC, CPA, and CPM metrics over a period of time
  • Determine at what lookalike audience percentage (1%, 2%, etc.) an audience group's performance plateaus, making it sensible to switch from prospecting to retargeting
  • Predict creative fatigue: constantly producing new video creatives for Facebook Ads is expensive, yet a tangible performance degradation sets in when the same creative runs for too long

These are just some of the complex analysis tasks that analysts routinely perform at large agencies. Without near real-time access to insights (something almost impossible to achieve with spreadsheets), large ad budgets are likely to be spent on under-optimized campaigns, diminishing value for the client. With Google BigQuery, data from Facebook campaigns can be pulled in automatically at regular intervals, and plugging in a competent tool like Tableau makes it easy to answer the questions above.
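As an illustration of the first question, here is a minimal sketch of a query that ranks audience groups on CTR, CPC, and CPA over a 90-day window, assuming Facebook Ads data has already landed in a hypothetical `my_project.marketing.fb_ads` table with one row per ad set per day.

```python
# Sketch: ranking audience groups by CTR, CPC, and CPA over the last 90 days.
# The table and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  audience_group,
  SUM(clicks) / NULLIF(SUM(impressions), 0) AS ctr,
  SUM(spend)  / NULLIF(SUM(clicks), 0)      AS cpc,
  SUM(spend)  / NULLIF(SUM(conversions), 0) AS cpa
FROM `my_project.marketing.fb_ads`
WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
GROUP BY audience_group
ORDER BY cpa ASC
"""

for row in client.query(sql).result():
    print(row.audience_group, row.ctr, row.cpc, row.cpa)
```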

2. Attribution analysis for multi-channel PPC campaigns

Large ecommerce brands regularly run integrated PPC campaigns for acquisition. These cover multiple channels, including Facebook, AdWords, and Bing Ads, to name a few. The ability to de-duplicate conversions across channels is critical to successfully managing ad budgets for such campaigns.

Consider, for example, a user who first arrives from an AdWords campaign but does not convert. AdWords will set a cookie for this user, and if they convert at any time in the future, AdWords will report a conversion. Now suppose the same user also arrives through a Facebook ad and again does not convert. The user finally visits the site directly some time later and completes a purchase. In this case, both Facebook and AdWords will claim credit for the conversion, which may not be accurate depending on whether you use first-touch or last-touch attribution.

Using Google BigQuery, you can capture touches and conversions from both channels along with their timestamps and tie them together under a single user ID. You can then attribute each conversion correctly according to the attribution model you've chosen. When working with large budgets, such granular insight goes a long way toward ensuring budgets are allocated optimally.
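Here is a minimal sketch of what that de-duplication might look like once touches and conversions from all channels sit in BigQuery under a shared user ID. The `touchpoints` and `conversions` tables and their columns are hypothetical placeholders.

```python
# Sketch: attributing each conversion to the first and last paid touch,
# assuming touches from all channels have been unified in a hypothetical
# touchpoints table keyed by a shared user_id.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
WITH ranked AS (
  SELECT
    user_id,
    channel,
    touch_time,
    ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY touch_time ASC)  AS first_rank,
    ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY touch_time DESC) AS last_rank
  FROM `my_project.marketing.touchpoints`
)
SELECT
  c.conversion_id,
  f.channel AS first_touch_channel,
  l.channel AS last_touch_channel
FROM `my_project.marketing.conversions` AS c
JOIN ranked AS f ON f.user_id = c.user_id AND f.first_rank = 1
JOIN ranked AS l ON l.user_id = c.user_id AND l.last_rank = 1
"""

for row in client.query(sql).result():
    print(row.conversion_id, row.first_touch_channel, row.last_touch_channel)
```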

3. Combining web analytics with CRM data

Understanding visitor behavior before and after conversion is a critical requirement for all companies. Consider a prospect who makes multiple visits to a website and interacts with various marketing collateral before submitting a form to view a product demo. Before submission, all of the user's behavioral data is tied to a unique, anonymous cookie ID and can be sent to Google BigQuery using the Google Analytics API (or automatically if using Google Analytics Premium). This might include data points like the number of visits, pages viewed per visit, link clicks, and items downloaded. Once the form is submitted, the user also has a CRM record ID, which can be associated with their anonymous cookie ID by sending the cookie ID to the CRM along with the form submission. If the user provides a company name, it might also be possible to get other data points such as the number of employees, turnover, key competitors, and so on.

Such CRM data can now be exported into BigQuery under the same anonymous user ID in order to create high-quality prospect records that provide a wealth of contextual information to Sales and Marketing teams.
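For illustration, here is a minimal sketch of joining pre-conversion web analytics behavior to CRM records via that shared cookie ID. The `ga_sessions` and `crm_contacts` tables and their columns are hypothetical placeholders.

```python
# Sketch: joining pre-conversion web analytics behavior to CRM records via
# the anonymous cookie ID. Table and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  crm.contact_id,
  crm.company_name,
  COUNT(DISTINCT ga.session_id) AS visits_before_conversion,
  SUM(ga.pageviews)             AS total_pageviews
FROM `my_project.marketing.crm_contacts` AS crm
JOIN `my_project.marketing.ga_sessions`  AS ga
  ON ga.cookie_id = crm.cookie_id
 AND ga.session_start < crm.form_submitted_at
GROUP BY crm.contact_id, crm.company_name
"""

for row in client.query(sql).result():
    print(row.contact_id, row.visits_before_conversion, row.total_pageviews)
```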

4. Analyzing content marketing performance

Imagine trying to analyze the content performance of a popular travel booking site which produces a large number of posts generating millions of hits per month. Each content item has multiple tags such as primary associated service offering (hotel bookings, flight bookings, package holidays, vacation rentals), primary content topic (tourist attractions, nightlife, food, local culture), author, content theme (vlog, peer review, themed holidays, family vacations), and format (long form article, video, eBook). The requirement is to be able to generate content performance reports for an arbitrary grouping of these tags, for example:

  • How many weekly page views were generated by each author, for each topic, and within a particular service offering?
  • Are there any cyclical variations in content consumption? For example, does a specific type of content perform better in certain months?
  • What is the ideal word count that generates maximum engagement? How does it vary by topic, format and service offering?
  • Which topics take the least time to produce content for, yet consistently generate the most engagement?

As can be seen, these are complex queries that need to run on datasets containing a large number of rows and dimensions. A tool such as BigQuery is optimized for quickly generating these insights with almost no coding when connected to a competent data analysis/visualization tool.
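As one example, here is a minimal sketch of the first report, weekly page views by author, topic, and service offering, assuming a hypothetical `content_pageviews` table with one row per page view and the content tags denormalized onto each row.

```python
# Sketch: weekly page views grouped by author, topic, and service offering.
# The table and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  DATE_TRUNC(view_date, WEEK) AS week,
  author,
  topic,
  service_offering,
  COUNT(*) AS pageviews
FROM `my_project.marketing.content_pageviews`
GROUP BY week, author, topic, service_offering
ORDER BY week, pageviews DESC
"""

for row in client.query(sql).result():
    print(row.week, row.author, row.topic, row.service_offering, row.pageviews)
```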

Conclusion

As the number of customer touchpoints increases and marketing budgets scale, marketers need greater capabilities for quickly aggregating multi-touchpoint data and turning it into insights. Google BigQuery is a big leap in this direction: it allows marketers to implement advanced data-driven marketing with minimal capital investment, minimal reliance on IT, and, most importantly, without the need to acquire extensive technical skills.

Start your free trial of Supermetrics for BigQuery today and unlock powerful analytics on demand.

About Dino Mahi

Dino Mahi is a Marketing Technology Specialist with over 15 years of experience working with major brands on creating high-level marketing technology landscapes and data strategies. A passionate Enterprise Architect with a background in marketing data analysis, Dino currently focuses on identifying innovative ways to blend online and offline data for building advanced customer and marketing intelligence solutions.
