Export selected dashboard data to BigQuery

  • 4 February 2021
  • 2 replies

Hi there,

We want to enable our users to write the selected data of a dashboard to a BigQuery table. The rationale is simple:

  1. The users select the scope of their analyses by filtering the data (see illustrative screenshot).
  2. They push the selected data to BigQuery.
  3. Our data science team runs ML models based on the selected inputs.


I found 2 topics discussing data export to BigQuery but I did not manage to adapt the code to our use case.

  • Updates a BigQuery table from a Look but based on a single field value.

  • Updates a BigQuery table based on all selected values but from an Explore.


In our scenario we want our users to push to BigQuery all the selected data, directly from the dashboard.

How can we enable it?

2 replies


Hi Arthur,


I’ve got three solutions for you to consider :)

  1. You could write a small server (I like to use Google Cloud Functions for this, they’re very convenient) that receives requests from Looker’s “send/schedule” modal via our Action API. Your action would declare a supported action type of “dashboard” and a supported format of “csv_zip”. One downside of this approach is that the data would actually have to be present on the dashboard as a tile, in the granular form the ML uses, which may or may not be the UX you want for your users.
  2. If you are embedding (or could embed) the dashboard into a parent application, you can listen for our iframe’s JS events to know the current state of the dashboard filters, and then offer the user a button that uses our REST API to run the equivalent queries and process the data from there.
  3. The coolest option, but also the untested bleeding edge: if you are training a supported ML model, you could have a tile on your dashboard built on top of a derived table that uses a BQML CREATE MODEL statement, along with templated filters to pass in the filtered data, so the model is trained inline with your queries to BigQuery. I would also probably add a yes/no “train?” filter to the top of the dashboard so that users only run the training tile once they’re satisfied with their other filters. I wrote an article about using BQML from Looker, but it doesn’t get into making the table dynamic to user input; that would require some experimentation.
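To make option 1 a bit more concrete, here is a minimal sketch of the “list” endpoint a Cloud Function could expose so Looker discovers the action. The hub label, action name, and URL below are placeholders I made up, and the exact response fields should be double-checked against the Action API reference:

```python
import json


def list_actions(request):  # request: the incoming HTTP request (unused here)
    """Action API "list" endpoint: advertise one dashboard-level action."""
    # All names, labels, and the URL below are placeholder assumptions.
    hub = {
        "label": "BigQuery Export Hub",
        "integrations": [{
            "name": "bq_dashboard_export",
            "label": "Export dashboard data to BigQuery",
            "supported_action_types": ["dashboard"],  # act on whole dashboards
            "supported_formats": ["csv_zip"],         # one zipped CSV per tile
            "url": "https://REGION-PROJECT.cloudfunctions.net/execute",
        }],
    }
    return json.dumps(hub)
```

Looker would call this endpoint when you register the hub, then POST to the declared `url` when a user runs the action from the send/schedule modal.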

Hi Fabio,

Thanks for your answer, it is very informative!


I think your first option should work perfectly well for our use case. I feel confident using Cloud Functions to act as an Action Hub for Looker. What is still unclear to me is:

  • Based on the documentation, I guess that I will get 1 CSV file per tile, and that the zipped files will be accessible through the “attachment” and then “data” fields of the request. Am I right?
  • My Execute endpoint function will do the work: read the files and save the content to BigQuery, plus probably some authentication checks. Where is the zip directory actually located? How do I delete it once the job is done?
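For experimenting with the Execute side, here is a self-contained sketch that decodes a csv_zip attachment entirely in memory, assuming the zip arrives base64-encoded under the request body’s “attachment” → “data” fields (verify the field names against the Action API reference). If that assumption holds, nothing is written to disk, so there is no directory to clean up afterwards; the BigQuery load step is only indicated in a comment:

```python
import base64
import csv
import io
import zipfile


def execute(request_json):
    """Decode a csv_zip attachment in memory: one CSV per dashboard tile."""
    zip_bytes = base64.b64decode(request_json["attachment"]["data"])
    rows_per_tile = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for name in archive.namelist():  # file names come from the archive itself
            text = archive.read(name).decode("utf-8")
            rows_per_tile[name] = list(csv.DictReader(io.StringIO(text)))
            # In production you would stream these rows to BigQuery here,
            # e.g. with the google-cloud-bigquery client (not shown).
    return rows_per_tile


# Fake payload standing in for a real Action API request body.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("tile_1.csv", "id,value\n1,10\n")
payload = {"attachment": {"data": base64.b64encode(buf.getvalue()).decode("ascii")}}
tiles = execute(payload)
```

Since the whole archive lives in a `BytesIO` buffer, it is garbage-collected when the function returns; there is no temp file to delete.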


FYI: your 3rd option is what we are heading toward so it is really cool to see what you have achieved using both Looker & BQML! 


Have a good day,