How we're automating Looker user feedback collection

  • 10 April 2019
  • 1 reply
  • 231 views


This post is adapted from a longer Medium article I wrote here.


I needed to automate a monthly process where we email our Looker users and ask them for their feedback. Generally, I send two emails. One email asks about likes and dislikes and goes to people who are frequent users. The other asks about barriers to using Looker and goes to people with little to no activity.


I thought I could improve this process by using the Looker API to get usage numbers and Airflow to schedule and send the emails on a monthly cadence.


To get started, I set up a Look based on the History explore in the i__looker model to show me how many days it had been since each user’s last query. The Look has these four columns:

  • First Name (Dimension)
  • Email (Dimension)
  • Most Recent Query (Measure)
  • Days of Inactivity (Table Calculation)


I added the First Name and Email dimensions to populate the email template later. I also added the Most Recent Query measure to calculate inactivity.


Last, I added a Table Calculation called Days of Inactivity to calculate the number of days since each user’s most recent query. It looks like this:


diff_days(${history.most_recent_query_date}, now())


I saved this setup as a Look with a table visualization and noted the Look ID for the API later (the number at the end of the Look’s URL).
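

For example, with a hypothetical instance URL, the Look ID is the trailing number:

https://mycompany.looker.com/looks/133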


I used the API to pull the results of the user activity Look in my Airflow DAG. The first step is to generate credentials (client_id and client_secret) for yourself, which you can do by editing your user in the Looker Admin section and adding an API key as described here.


I used the Python SDK for the Looker API, which you can install according to this Discourse post using the PyPI package lookerapi. I followed the steps in the Discourse post to authenticate using my previously obtained credentials.
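

For reference, this is roughly what the authentication step looks like with the swagger-generated lookerapi package. The instance URL and credential values are placeholders, so treat this as a sketch of the pattern from the Discourse post rather than the exact code I ran:

import lookerapi as looker

# Placeholder values; use your own instance URL, client_id, and client_secret.
base_url = 'https://mycompany.looker.com:19999/api/3.0/'
client_id = 'YOUR_CLIENT_ID'
client_secret = 'YOUR_CLIENT_SECRET'

# Log in with an unauthenticated client to obtain an access token,
# then build an authenticated client that sends the token on every request.
unauthenticated_client = looker.ApiClient(base_url)
auth_api = looker.ApiAuthApi(unauthenticated_client)
token = auth_api.login(client_id=client_id, client_secret=client_secret)
client = looker.ApiClient(base_url, 'Authorization', 'token ' + token.access_token)

# The LookApi instance used below to run the Look.
look_api = looker.LookApi(client)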


Looker’s Run Look endpoint unfortunately doesn’t seem to return valid JSON at the moment (it single-quotes the fields instead of double-quoting them), so I opted for a CSV-formatted result instead.


I used the following Python code to convert the CSV string into a list of dictionaries, which I thought would be the easiest format to work with later on.


# Run the Look (ID 133 in my case) and get the result as a CSV string.
result = look_api.run_look(133, 'csv')

# Split the CSV into rows, then normalize the header row into snake_case keys.
rows = [row.split(',') for row in result.strip().split('\n')]
header = [column.lower().replace(' ', '_') for column in rows.pop(0)]

# Zip each row with the header to build one dictionary per user.
users = []
for row in rows:
    user = {}
    for column, value in zip(header, row):
        user[column] = value
    users.append(user)
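
As a side note, splitting on ',' assumes no field values contain commas; if they can, the standard library’s csv module handles the quoting. A minimal sketch of the same conversion with csv.DictReader:

import csv
import io

# Parse the CSV string and normalize the keys to snake_case, as above.
reader = csv.DictReader(io.StringIO(result))
users = [
    {key.lower().replace(' ', '_'): value for key, value in row.items()}
    for row in reader
]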

Finally, I set up an Airflow DAG to run on a monthly cadence, extracting usage statistics via the Looker API and emailing the appropriate email template. You can see more details on how I implemented the Airflow DAG and an example email template in my longer article here.
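

For a rough idea of the shape of the DAG (the full version, with the email templates, is in the longer article), here is a minimal sketch assuming Airflow 1.10-style imports; get_looker_users and send_feedback_email are hypothetical placeholders for the API pull above and the templated email send, and the 30-day cutoff is just an example threshold:

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def collect_feedback(**context):
    # Pull the user activity Look via the Looker API (the run_look + CSV
    # parsing shown above) and send the matching template to each user.
    users = get_looker_users()                     # hypothetical helper
    for user in users:
        if int(user['days_of_inactivity']) >= 30:  # example inactivity cutoff
            send_feedback_email('barriers', user)       # hypothetical helper
        else:
            send_feedback_email('likes_dislikes', user)


dag = DAG(
    'looker_feedback_emails',
    start_date=datetime(2019, 4, 1),
    schedule_interval='@monthly',  # run once a month
    catchup=False,
)

send_emails = PythonOperator(
    task_id='collect_usage_and_email',
    python_callable=collect_feedback,
    provide_context=True,
    dag=dag,
)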


1 reply


From my understanding, a user who has no activity will not have any history and will therefore not have a most recent query date. You could merge the results of all users from the Users explore with their history to get the complete list.

I have been thinking about spamming new users with some links to documentation and tips, so I might steal some of this code!

The other way of doing this would be to do it directly from Looker by running the schedule as each recipient and making use of user attributes, though that is probably a big drain on the system if you have a lot of users.
