Is there a way to let users update data on DB right from looks?
I read about actions here but I found it very abstract.
Can someone please explain the process for this?
Thanks,
Krish.
Best answer by jesse.carah
Well, the situation is that GCP is not doing the authentication for you: your code is expected to handle it. The Action framework includes a secret, configured when you add the Action to Looker, which you should use to verify that each request actually came from your Looker instance.
This article talks about setting up the secret when deploying the official Action Hub code, but the process is similar for a custom-written Action.
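To make the verification step concrete, here is a minimal sketch of checking the shared secret inside a Cloud Function. It assumes the secret arrives in the Authorization header as "Token token=&lt;secret&gt;" and is configured via a LOOKER_ACTION_SECRET environment variable; both the header format and the variable name are assumptions for illustration, not the framework's exact contract.

```python
import hmac
import os

# Assumed environment variable holding the secret you configured in Looker.
EXPECTED_SECRET = os.environ.get("LOOKER_ACTION_SECRET", "example-secret")

def is_authorized(auth_header):
    """Return True only if the header carries the configured secret."""
    if not auth_header:
        return False
    # Assumed header shape: Token token="<secret>"
    token = auth_header.replace("Token token=", "").strip('"')
    # Constant-time comparison avoids leaking the secret via timing.
    return hmac.compare_digest(token, EXPECTED_SECRET)
```

Inside the function you would reject the request (e.g. return a 401) whenever is_authorized(request.headers.get("Authorization")) is False.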
Hi, I have a need which seems similar to the one Dimitri described 2 years ago. Unfortunately the link provided by
Is there any way to see Jesse’s post or should I start a new thread?
Hey Krishna,
I’ve been able to write data from Looker to BigQuery using both Data Actions as well as the Looker Action Hub. In either case, you’ll need to push data from Looker to some middleware that will interpret the webhook from Looker and perform the necessary operations to then stream the data to BigQuery.
Luckily, Google has a great service called Google Cloud Functions that makes this really easy. Like AWS's Lambda, Cloud Functions let you deploy code that runs in response to some event. With a data action, you can push JSON containing data from Looker, as well as user-defined form parameters, to a Cloud Function endpoint. The Cloud Function then parses the JSON, extracts the relevant values, and calls the BigQuery SDK to stream the results to BigQuery.
Here’s a quick overview of how to use Cloud Functions to stream data from Looker to BigQuery. In this example, we’ll create a data action and cloud function that lets an end user persist an annotation to BigQuery:
In this example, we’re going to attach a data action to a field and allow end users to mark whether or not a name is a cool name.
dimension: name {
  type: string
  sql: ${TABLE}.name ;;
  action: {
    label: "Cool Name?"
    url: ""
    param: {
      name: "name"
      value: "{{ value }}"
    }
    form_param: {
      name: "annotation"
      type: select
      label: "Cool name?"
      default: "No"
      description: "Do you think that this name is a cool name?"
      option: {
        name: "No"
      }
      option: {
        name: "Yes"
      }
    }
  }
}
Note: We’re going to leave the url blank for now. Once we’ve spun up the cloud function we’ll paste the endpoint in.
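For orientation, here is a rough sketch of what the data action's webhook body looks like for the LookML above, and how the two values are pulled out. The exact payload carries more fields than shown; this is an illustrative subset, not the full schema.

```python
# Illustrative subset of the JSON a data action posts to the endpoint.
sample_payload = {
    "data": {"name": "Ada"},               # filled from the param: block
    "form_params": {"annotation": "Yes"},  # filled from the form_param: block
}

# The Cloud Function only needs these two values.
name = sample_payload["data"]["name"]
annotation = sample_payload["form_params"]["annotation"]
print(name, annotation)  # → Ada Yes
```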
We’re now going to write a simple Python function that writes the user-selected annotation to BigQuery, and place it in main.py.
import google.cloud.bigquery as bigquery
import datetime
import time

def annotation(request):
    r = request.get_json()  # Fetch the data action JSON
    client = bigquery.Client()
    dataset_id = ''  # Replace with the name of your BQ dataset
    table_id = ''  # Replace with your table ID
    table_ref = client.dataset(dataset_id).table(table_id)
    table = client.get_table(table_ref)  # API request
    # Request variables
    name = r['data']['name']
    annotation = r['form_params']['annotation']
    # System variables
    sys_time = int(time.time())
    rows_to_insert = [
        (
            name,
            annotation,
            datetime.datetime.fromtimestamp(sys_time).strftime('%Y-%m-%d %H:%M:%S'),
        )
    ]
    errors = client.insert_rows(table, rows_to_insert)  # API request; returns a list of any insert errors
    return '{"looker": {"success": true, "refresh_query": true}}'  # Success response to Looker
One additional thing to configure: add google-cloud-bigquery==1.5.0 to requirements.txt.
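Before deploying, you can smoke-test the payload handling locally without touching BigQuery. This sketch assumes you factor the row-building logic out of the function; FakeRequest stands in for the Flask request object that Cloud Functions passes in, and both names are hypothetical helpers for this test, not part of the deployed code.

```python
import datetime
import time

class FakeRequest:
    """Hypothetical stand-in for the Flask request Cloud Functions provides."""
    def __init__(self, body):
        self._body = body

    def get_json(self):
        return self._body

def build_row(request):
    """Same extraction and timestamp logic as the deployed function."""
    r = request.get_json()
    sys_time = int(time.time())
    return (
        r['data']['name'],
        r['form_params']['annotation'],
        datetime.datetime.fromtimestamp(sys_time).strftime('%Y-%m-%d %H:%M:%S'),
    )

row = build_row(FakeRequest({"data": {"name": "Ada"},
                             "form_params": {"annotation": "Yes"}}))
print(row[:2])  # → ('Ada', 'Yes')
```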
Cheers!
Jesse
Hi Jesse,
Do you have any code snippet to move System Activity data into BigQuery using Cloud Functions?
Thanks
Srinivasarao