Question

Send All Results for Looker Scheduled Dashboard

  • 2 March 2020
  • 1 reply
  • 356 views

Hi, I am trying to use the `sendAllResults` option in the Looker scheduled plan API so that it sends all results and doesn't apply row limits to the underlying queries, but the responses I'm getting are confusing:


When `sendAllResults = true`, I get this exception:

    {
      "message": "Validation Failed",
      "errors": [
        {
          "field": "dashboard_id",
          "code": "invalid",
          "message": "Cannot send all results for dashboards",
          "documentation_url": "http://docs.looker.com/"
        },
        {
          "field": "format",
          "code": "invalid",
          "message": "format csv_zip invalid – valid formats for unlimited Dashboard sent as 'webhook' are [:txt, :csv, :inline_json, :json, :json_label, :json_detail, :json_detail_lite_stream, :xlsx]",
          "documentation_url": "http://docs.looker.com/"
        }
      ],
      "documentation_url": "http://docs.looker.com/"
    }


It clearly says: "valid formats for unlimited Dashboard sent as 'webhook' are…"


When I try to set one of the "valid" formats, like "csv", I get this exception:

    {
      "message": "Validation Failed",
      "errors": [
        {
          "field": "dashboard_id",
          "code": "invalid",
          "message": "Cannot send all results for dashboards",
          "documentation_url": "http://docs.looker.com/"
        }
      ],
      "documentation_url": "http://docs.looker.com/"
    }


So I set the format to "csv" to get an unlimited dashboard, as the validation error message suggests, but I still get an error. Is it possible to disable row limits when sending a scheduled dashboard to a webhook?
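For reference, here is roughly the request body I am sending (a simplified sketch; the plan name, dashboard id, and webhook URL are placeholders):

```python
# Simplified sketch of the scheduled-plan request body that produces the
# validation errors above. Plan name, dashboard id, and webhook URL are
# placeholders, not real values.
plan_body = {
    "name": "nightly-dashboard-export",   # placeholder
    "dashboard_id": 42,                   # placeholder
    "send_all_results": True,             # ask Looker to drop row limits
    "crontab": "0 6 * * *",               # run daily at 06:00
    "scheduled_plan_destination": [
        {
            "type": "webhook",
            "format": "csv_zip",          # also tried "csv" -- same error
            "address": "https://example.com/hook",  # placeholder
        }
    ],
}
```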


1 reply


This is a bit of an old topic, but I’m commenting in case someone comes across this down the road. 

The real error is the first one, `"Cannot send all results for dashboards"`; the second one, `"format csv_zip invalid ..."`, is misleading and can be disregarded.

In reality, the only valid formats for a dashboard export are `csv_zip` and `pdf`, and the `send_all_results: true` parameter only works for Looks. There is no built-in functionality for sending or downloading all results from a dashboard via the scheduler.
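For contrast, `send_all_results` is honored when the plan targets a Look rather than a dashboard. A minimal sketch (the Look id, plan name, and webhook URL are placeholders, and the SDK call is shown only as a comment):

```python
# Sketch of a scheduled-plan body where send_all_results does apply:
# the plan targets a Look (look_id), not a dashboard. IDs and the
# webhook URL are placeholders.
look_plan_body = {
    "name": "nightly-look-export",    # placeholder
    "look_id": 7,                     # a Look, not a dashboard
    "send_all_results": True,         # honored for Looks
    "crontab": "0 6 * * *",
    "scheduled_plan_destination": [
        {"type": "webhook", "format": "csv", "address": "https://example.com/hook"}
    ],
}

# With the Python SDK this body would be passed along the lines of:
#   sdk.create_scheduled_plan(models.WriteScheduledPlan(**look_plan_body))
```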

That said, this can be worked around by using the `search_dashboard_elements` and `run_inline_query` API methods: get the query object generated by each tile, remove the row limit, run a new query without the row limit, and finally download the results of each query. I wrote a script that uses our Python SDK to do this; see the full code here: https://github.com/NestleJsCrunch/LookerScripts/blob/main/DownloadDashboardUnlimited.ipynb

Walking through the key parts:

data = sdk.search_dashboard_elements(dashboard_id=dash_id)

Here we're getting a list with the JSON for each element on the dashboard.
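For illustration, the `result_maker.query` object on each element carries roughly these fields (a made-up example; the model, view, and field names are placeholders, and actual contents vary by tile):

```python
# Hypothetical shape of one element's result_maker.query; all names are
# placeholders, and real tiles may carry more fields than this.
example_query = {
    "model": "ecommerce",
    "view": "order_items",
    "fields": ["order_items.created_date", "order_items.total_sale_price"],
    "sorts": ["order_items.created_date desc"],
    "limit": "500",                      # the row limit we want to remove
    "query_timezone": "America/New_York",
}
```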

num_elements = len(data)
for x in range(num_elements):
    element_query_body = data[x].result_maker.query
    element_query_body['limit'] = -1  # -1 = no row limit

We then loop through each element, grab its `result_maker.query` object, and change the `limit` parameter to `-1` (i.e., all results).

elements_needed = ['model', 'view', 'fields', 'filters', 'sorts',
                   'limit', 'query_timezone']
final_body = {}

for key in elements_needed:
    try:
        new_el = element_query_body[key]
    except KeyError:
        continue  # not every element has e.g. filters
    else:
        final_body[key] = new_el

Since we only need a subset of the fields in the `result_maker.query` object to run a new query, we initialize a list with just those keys. We then use another loop with a try/except to build the final body that we can pass into a call to `run_inline_query`. The try/except is necessary because not all dashboard elements have `filters`.

resultset = sdk.run_inline_query('csv', final_body)

Finally, we run the new query with the body built in the steps above and can download that result set via whatever method we prefer. All of this happens in one loop iteration for a given tile, then repeats for the next tile.
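Putting the pieces together, the per-tile work can be sketched as a small helper. This is a sketch rather than the full script: the SDK calls are shown only as comments so the snippet stays self-contained, and `sdk` / `dash_id` are assumed to be set up as in the linked notebook.

```python
def build_unlimited_query_body(element_query_body):
    """Keep only the fields run_inline_query needs and remove the row limit."""
    elements_needed = ['model', 'view', 'fields', 'filters', 'sorts',
                       'limit', 'query_timezone']
    # Copy only the keys that exist on this tile (not all tiles have filters).
    final_body = {k: element_query_body[k]
                  for k in elements_needed if k in element_query_body}
    final_body['limit'] = -1  # -1 = all results
    return final_body

# Per-tile loop (SDK calls commented out so the sketch is self-contained):
# for element in sdk.search_dashboard_elements(dashboard_id=dash_id):
#     body = build_unlimited_query_body(element.result_maker.query)
#     resultset = sdk.run_inline_query('csv', body)
#     ...save resultset for this tile...

# Quick check with a stand-in query dict (note: no 'filters' key):
body = build_unlimited_query_body({'model': 'ecommerce', 'view': 'orders',
                                   'fields': ['orders.id'], 'limit': '500'})
print(body['limit'])  # -1
```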
