General Looker administration
All your administration & self-hosting questions and content
- 497 Topics
- 1,223 Replies
Running ./looker start fails: "Looker can not start because: Unable to unwrap key, invalid key provided." What is causing this? How do I fix it? Is my environment bricked? https://cloud.google.com/looker/docs/changing-encryption-keys My understanding is that only sensitive data is encrypted. If the key is lost, is there a way to purge the encrypted data and restore the non-sensitive settings?
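When Looker cannot unwrap the key at startup, the usual first check is whether the process is being started with the same master key that was used to encrypt the internal database. A minimal sketch, assuming a customer-hosted instance and a hypothetical key-file path:

```shell
# Point Looker at the master key the internal database was encrypted with.
# The path below is an assumption; use wherever your key file actually lives.
export LKR_MASTER_KEY_FILE=/home/looker/looker/lkr_master_key
./looker start
```

If the original key is genuinely lost, the encrypted columns cannot be decrypted; check with Looker support before purging anything.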
How to fix the MySQL syntax error showing in the 'Database Performance' dashboard of the System Activity reports
Looker is throwing a SQL syntax error in the pre-built 'Database Performance' and 'Instance Performance' dashboards on the System Activity page. For example, Looker uses the pre-built views 'PDT Builds' and 'PDT Event Log' to build the 'Database Performance' dashboard. 'PDT Event Log' is the direct log table that Looker creates in the underlying MySQL database, so there is no issue with that explore, but 'PDT Builds' is a derived view (explore) built with a CTE, and that is where the syntax error is thrown. The query generated by Looker is: WITH pdt_builds AS (SELECT pdt_event_log.tid AS `tid`, pdt_event_log.connection AS `connection`, pdt_event_log.model_name AS `model_name`, pdt_event_log.view_name AS `view_name`, MIN( pdt_event_log.AT ) AS `transaction_start`, MAX( pdt_event_log.AT ) AS `transaction_end`, SUBSTR(MAX(CASE WHEN pdt_event_log.action LIKE 'create begin' THEN '00-waiting dependencies' WHEN pdt_event_log.action LIKE 'create ready' THEN '
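One thing worth checking here: the `WITH ... AS` syntax (common table expression) that the 'PDT Builds' derived view generates is only supported in MySQL 8.0 and later, so an older MySQL server behind the connection would reject exactly this query with a syntax error. A quick version check (host and user are placeholders):

```shell
# CTEs require MySQL 8.0+; verify the server version behind the connection.
mysql -h mydb.example.com -u looker -p -e "SELECT VERSION();"
```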
We are trying to upgrade our Looker version to v22.8.89, which requires AES-256 GCM encryption. Our environment does not run on the internal Looker database but on an external MySQL instance. We created a CMK and ran the migrate_encryption command, which succeeded, but when we try to start Looker we see a GCM-encryption-related error (shown below). We have searched the Looker knowledge base but could not find anything specific to migrating encryption for a Looker instance that runs on an external database; we also noticed quite a number of support articles where others are experiencing this issue, but no solution is provided. Can you please provide any information that would help us resolve this issue and enable us to upgrade our Looker version? Thanks. "Error starting Looker: This Looker instance must be migrated to GCM encryption using 'migrate_encryption'."
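For what it's worth, the migration on an external-database instance is usually run with the CMK exposed to the same process that normally starts Looker, so the startup and the migration see the same key. A sketch, assuming a key-file path and that looker.jar sits in the working directory:

```shell
# Stop Looker, run the migration with the CMK available, then restart.
./looker stop
export LKR_MASTER_KEY_FILE=/home/looker/looker/lkr_master_key  # path is an assumption
java -jar looker.jar migrate_encryption
./looker start
```

If the key variable is only set for the migration run and not for the subsequent start, the "must be migrated to GCM encryption" error can reappear even though the migration itself succeeded.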
If you have an on-premises instance and you're migrating to AES-256 GCM encryption (docs), please remember to save your CMK (customer managed key) - be sure to store it in a safe and permanent location before continuing! Losing the CMK after encrypting the internal database can result in the loss of your instance.
Hi, we have been getting this error for a couple of months now: Could not send webhook to "https://actions-gcp.looker.com/actions/google_sheets/execute". Looker support has suggested recommending that developers re-authenticate, but even after that the error keeps occurring. Any suggestions other than re-authentication?
I’m trying to get our Looker instance to connect to Redshift via an SSH tunnel. Our Looker instance is managed by Looker (not self-hosted); the version is 22.14.44. The SSH server is an AWS EC2 instance with a public IP address. I’ve verified that the setup works, because our dbt Cloud projects can successfully read/write from/to the Redshift instance via SSH using the same server. At a high level, I’ve: created the user that Looker should use to SSH; added the public key to /home/username/.ssh/authorized_keys; whitelisted the public IPs for the EC2 security group, the same place where I whitelisted dbt Cloud’s IP addresses. The following IPs were whitelisted: 22.214.171.124, 126.96.36.199, 188.8.131.52. I retried this by following this document: https://cloud.google.com/looker/docs/using-an-ssh-tunnel#using_the_database_server but I am met with the same error: “Failed to connect”. A vague error. The Log section in the Admin UI doesn’t seem to provide more information. Has anyone encountered
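To rule out server-side configuration, it can help to reproduce by hand, from any whitelisted host, what Looker’s tunnel would do (key file, user, and host names below are placeholders):

```shell
# Forward a local port through the bastion to Redshift, then probe it.
ssh -i looker_tunnel_key -N -f -L 5439:my-cluster.redshift.amazonaws.com:5439 \
    looker@ec2-xx-xx-xx-xx.compute.amazonaws.com
nc -zv localhost 5439   # should report the port open if the tunnel works
```

If this works from a whitelisted host but Looker still fails, the mismatch is likely in the host key or the user/key pair registered on the Looker side rather than in your EC2 setup.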
Failed to send mail: An unknown error occurred with SMTP authentication. Please check your error logs. end of file reached
Hello, can anyone help me with this issue? I tried using SMTP to send mail from a custom mail server, and I have been following the instructions, but I get an error saying the email failed to send. I don't understand the issue and I don't know how to solve it. SMTP: Use custom mail settings
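"end of file reached" during SMTP authentication often means the mail server closed the connection, which is commonly a TLS/port mismatch. Testing the server directly from a shell can narrow it down (server name and port are placeholders):

```shell
# Port 587 usually expects STARTTLS; port 465 expects implicit TLS.
openssl s_client -starttls smtp -crlf -connect smtp.example.com:587
# For implicit TLS instead:
# openssl s_client -connect smtp.example.com:465
```

If the handshake fails on one port style but succeeds on the other, adjust the port/TLS setting in Looker's custom mail configuration to match.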
Wondering if my teammates and I can collaborate on schedules. We use these to send emails out to clients with filtered Looks, so it would be helpful to know which ones someone else has configured, and to edit them if necessary. I’ve been using this help article to set up my email schedules: Looker Documentation, “Scheduling Data Deliveries” (an overview of scheduling deliveries of your data to various destinations).
Hi folks, we are using Looker and getting a 503 error randomly, and we are unable to identify the root cause. Has anyone got this error before? I have not seen any post or discussion on it, so any insights would be really helpful. The 503 error states that the instance is under heavy load and unable to respond; after some time the application goes down, auto-restarts, and comes back live. I'm looking for how to identify and resolve the issue, and whether we can set up alerting for memory/CPU utilisation.
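For the last part of the question, a minimal host-level check that could run from cron on a customer-hosted Looker box is easy to sketch (the threshold and the alert mechanism are assumptions; a real setup would hand off to your monitoring stack):

```shell
#!/usr/bin/env bash
# Minimal cron-able memory check for the Looker host (Linux `free` assumed).
THRESHOLD=90
USED=$(free | awk '/Mem:/ {printf "%d", $3/$2*100}')
if [ "$USED" -ge "$THRESHOLD" ]; then
  # Replace echo with mail/PagerDuty/etc. in a real setup.
  echo "ALERT: memory at ${USED}% on $(hostname)"
fi
echo "memory used: ${USED}%"
```

A CPU check can be added the same way; for repeated 503s, correlating these samples with Looker's restart times is usually the fastest way to confirm memory pressure as the cause.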
Hi! I’m new-ish to Looker and have some security/governance questions. I was wondering if there is any way to export Looker audit logs (or just logs in general) to a GCS bucket or Cloud Logging. I’ve found the Logs documentation, but I can only view those from within Looker, and only the last 500 lines. What happens if I need to keep logs for, say...7 years? Or if I want to alert off of them? I found GCS Looker Actions, but that is more for exporting a Look or Explore to GCS periodically, which says to me that there is either going to be a lot of overlap in my logs/data or gaps in them. I also found this Looker blog archive describing a feature: In the event of needing to investigate who has accessed what data, Looker provides a robust audit trail. Administrators can provide transparency to internal and external stakeholders and reveal who has accessed what data and when. The ‘in-database’ architecture means every query and viewed report creates a database event, which Looker l
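On customer-hosted instances the log files live on disk, so a periodic copy into a bucket is straightforward to sketch (the log path and bucket name are assumptions):

```shell
# Sync Looker's on-disk logs to GCS; pair this with a bucket lifecycle/retention
# policy (e.g. 7 years) configured on the bucket itself.
gsutil rsync -r /home/looker/looker/log gs://my-looker-audit-logs/$(hostname)/
```

On a Looker-hosted instance there is no filesystem access, so long-term retention there generally means extracting the System Activity explores through the API instead.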
Step 1: Create a Google Cloud project, BigQuery project, and dataset. This document does not cover how to do that; if you need help, please refer to the documentation here (https://cloud.google.com/bigquery/docs). Once this is configured, you can see the BigQuery dataset as shown in the image below. Image 1: BigQuery project, dataset page. Note: the red box highlights the BigQuery project name and dataset name; the Copy button will also provide those names. Image 2: Clicking “Copy” in Image 1 shows the project name and dataset name. Step 2: Create a BigQuery service account with access to the BigQuery project. This creates an account, separate from the principal account, that can be used to establish the connection. Image 3: Create service accounts in “IAM & Admin → Service Accounts”; grant access to service accounts in “IAM & Admin → IAM”. Note: Go to Service Accounts in IAM & Admin to create a service account. To add or delete permissions and to create
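The service-account steps shown in Image 3 can also be done from the CLI; a sketch with placeholder project and account names (grant whichever roles your connection actually needs):

```shell
# Create the service account and grant it BigQuery roles (names are placeholders).
gcloud iam service-accounts create looker-bq --display-name="Looker BigQuery"
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="serviceAccount:looker-bq@MY_PROJECT.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="serviceAccount:looker-bq@MY_PROJECT.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"
```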
With Google Cloud Storage actions, users can send data from Looker (Looks, dashboards, Explores) to a specific Storage bucket in a GCP account. Customer-hosted instances: make sure the instance fulfills these requirements to be able to use actions from the Looker Action Hub; more considerations for actions in customer-hosted instances can be found here. Setup: In your Google Cloud console, enable the Google Cloud Storage API and the Google Cloud Storage JSON API (use the search bar in the console to find them). Generate a service account with JSON keys: go to IAM & Admin → Service Accounts → Create Service Account; set any name for the service account; grant the role "Storage Object Creator" and, from the basic roles, "Viewer". Once the service account is created, click its name to open the details page, then go to the Keys tab → Add Key → Create new key → JSON format. Open the downloaded JSON file in any text/code editor. Enable and set up Google Cloud Storage actions
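The console steps above can be condensed into a CLI sketch (service-account name, project, and key file name are placeholders):

```shell
# Service account, role grant, and JSON key for the GCS action.
gcloud iam service-accounts create looker-gcs --display-name="Looker GCS action"
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="serviceAccount:looker-gcs@MY_PROJECT.iam.gserviceaccount.com" \
  --role="roles/storage.objectCreator"
gcloud iam service-accounts keys create looker-gcs-key.json \
  --iam-account=looker-gcs@MY_PROJECT.iam.gserviceaccount.com
```

The contents of looker-gcs-key.json are what you paste into the action's configuration in the Looker Admin panel.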
I understand that we can see usage data, the History explore, Look explores, etc., as mentioned in this article. I wanted to know where this data is stored so I could perform further transformations or analysis that isn't possible within a Looker view. Is it possible to obtain this raw data or query it from somewhere?
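The System Activity data lives in Looker's internal database, which you cannot query directly on a hosted instance, but the same explores can be pulled out through the API and landed wherever you like. A sketch using the inline-query endpoint (instance URL and field list are placeholders; $TOKEN is an API access token):

```shell
# Extract System Activity history rows as CSV via the API.
curl -s -X POST "https://mycompany.looker.com:19999/api/4.0/queries/run/csv" \
  -H "Authorization: token $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model":"system__activity","view":"history",
       "fields":["history.created_date","history.query_run_count"],
       "limit":"5000"}' > history_extract.csv
```

Scheduling such an extract into your own warehouse gives you a raw copy you can transform freely.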
We have an internal application that uses the Looker API on a Looker instance hosted by Looker.com. The application creates and updates Looks. How can we back up these Looks and restore them individually? I found documentation about backing up self-managed Looker installations, and only one mention of backups of hosted Looker: "Starting in Looker 7.6, automatic backups of Looker’s i__looker internal database to an S3 bucket are no longer supported. Looker-hosted instances are still backed up daily through a separate process, but backup files cannot be sent to a custom S3 bucket." https://connect.looker.com/library/document/backup?version=22.6
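One workable approach on a hosted instance is to export each Look's definition through the API and keep the JSON yourself; a sketch (instance URL, Look id, and credential variables are placeholders):

```shell
# Authenticate and save one Look's definition as JSON.
TOKEN=$(curl -s -X POST "https://mycompany.looker.com:19999/api/4.0/login" \
  -d "client_id=${LOOKER_CLIENT_ID}&client_secret=${LOOKER_CLIENT_SECRET}" \
  | sed -n 's/.*"access_token": *"\([^"]*\)".*/\1/p')
curl -s -H "Authorization: token $TOKEN" \
  "https://mycompany.looker.com:19999/api/4.0/looks/42" > look_42.json
# Restoring means creating a new Look (or updating an existing one)
# from the saved JSON via the same API.
```

Since your application already talks to the API, it could export each Look it creates or updates as part of its normal workflow.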
Hi everyone, I’m Noura and I’m a TSE in Looker Support at Google. Today’s topic is what to do when your code in production is up to date, but when you switch to development mode you see a branch named HEAD out of nowhere. HEAD is a special pointer that points to the latest checkout; in this situation it looks like HEAD was detached from master. We would usually reattach HEAD to master by running git checkout master, but for a Looker-hosted instance that isn't really an option, so we will do it from the Looker UI. Here are the steps to reattach HEAD to master: create a new branch based on the branch that has the changes you want to push to master; add a simple change, like a comment (e.g. # hi); commit and push the changes to the remote; depending on whether you require pull requests or not, merge the branch to master; pull the changes; deploy to production. Now you can see that the two branches are in sync.
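The steps above can be demonstrated end to end in a throwaway local repo; everything below is a self-contained simulation (temporary directory, placeholder file names), not commands against your production project:

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q -b master
git config user.email "dev@example.com" && git config user.name "Dev"
echo "view: orders {}" > orders.view.lkml
git add . && git commit -qm "initial commit"
git checkout -q --detach              # simulate the detached HEAD state
git checkout -q -b fix-detached-head  # 1. branch from the detached commit
echo "# hi" >> orders.view.lkml       # 2. trivial change, e.g. a comment
git commit -qam "reattach HEAD"       # 3. commit (push omitted: local demo)
git checkout -q master                # 4. merge the branch back into master
git merge -q fix-detached-head
git log --oneline -1                  # master now includes the commit
```

In Looker's UI the same flow happens through the development-mode git controls, with the push/pull/deploy steps standing in for the local merge here.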
Looker Actions - Google Cloud Storage. Looker is launching a Google Cloud Storage action, allowing customers to send data to Google Cloud Storage from within Looker on a one-off or scheduled basis. Enable the Google Cloud Storage action. Note: your Looker instance must be on Looker 5.6+. Customer-hosted instances may be unable to enable actions from the Looker Action Hub, especially actions that support streamed results or that use OAuth, if the customer-hosted Looker instance does not fulfill these requirements; see the Sharing Data Through an Action Hub documentation page for suggested solutions to this potential issue. To enable the action in Looker, go to your Admin panel and the Actions tab under the Platform header [your-instance.looker.com/admin/actions] (Admin > Platform > Actions), and select "Enable" on the action you would like to enable. In your Google Cloud console [https://consol
As part of an upgrade from v22.6.54 to v22.10, I’ve followed the Migrating to AES-256 GCM encryption instructions to migrate from the legacy 128-bit encryption to AES-256 GCM encryption. However, while running java -jar looker.jar migrate_encryption I keep getting the following errors. The process eventually exits, at which point I restore Looker back to v22.6.54. Has anyone seen this before? Sorry for the part-text, part-image error messages below; something in the text was causing the forum post to fail when saving. Exception in thread "Looker Thread Pool 'MultiCache Lazy Fetch' trimmer " Exception in thread "Looker Thread Pool 'Cache' reaper [0082e]" java.lang.OutOfMemoryError: Java heap space java.lang.OutOfMemoryError: Java heap space Exception in thread "Looker Thread Pool 'Pinger' reaper " java.lang.OutOfMemoryError: Java heap space Exception in thread "Looker Thread Pool 'Availability Checker' reaper " java.lang.OutOfMemoryError: Java heap space Exce
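java.lang.OutOfMemoryError: Java heap space during migrate_encryption usually just means the JVM's default heap is too small for the re-encryption pass over the internal database. Giving the migration an explicit, larger heap often gets it through (4g is an assumption; size it to your host and database):

```shell
# Run the migration with an explicit JVM heap limit.
java -Xmx4g -Xms1g -jar looker.jar migrate_encryption
```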
Hi, I am building a staging/test cluster on AWS EKS. It looks like Looker only uses the AWS metadata service to get the token for the instance profile to authenticate with AWS. I have set up a service account to run the pod, and the service account is bound to an IAM role with permissions to use the CMK. AWS_ROLE_ARN and AWS_WEB_IDENTITY_TOKEN_FILE are set correctly, and I can verify that by running "aws sts get-caller-identity" on the pod. However, when starting Looker, it errors out saying that the role used for the node group does not have permission to get the CMK key. It works if I add the permissions to the node group role, so it is clear that Looker does not use the provided web identity to authenticate. Is there another variable or parameter to change this behavior? Thanks
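For anyone debugging the same thing, verifying what the pod itself can see before starting Looker helps separate an IRSA misconfiguration from Looker's own credential-provider behaviour (output values are environment-specific):

```shell
# Inside the Looker pod: the web-identity variables and the STS identity.
echo "$AWS_ROLE_ARN"
echo "$AWS_WEB_IDENTITY_TOKEN_FILE"
aws sts get-caller-identity   # should show the IRSA role, not the node role
```

If the CLI shows the IRSA role but Looker still uses the node role, the gap is in the credential chain of the JVM process rather than in the Kubernetes setup.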