Is anyone running an on-premise install of Looker using Docker? We currently run our on-premise Looker instances (prod & staging) directly on AWS EC2 instances, but for a variety of reasons we are considering switching to using Docker.
I imagine the main issues in doing this would be:
- Need to use external DB for Looker (we already use RDS MySQL for this)
- Cached results wouldn’t persist between container restarts if using ephemeral storage
- Possible licensing issues if successive restarts run on different container instances?
Surprisingly, I haven’t been able to find any reference to doing this when googling and searching these forums.
Is anyone doing this? For the Looker folks, any info or concerns about running this way?
Looker does not support running in Docker containers, but I do have a Docker setup on my local machine for testing our latest Looker releases. All I need is a fresh install with the looker-latest.jar file from our S3 bucket, with no saved config information. Docker is cool for this.
As you mentioned, there are a number of issues with running Looker in a container that negate many (most?) of the benefits of containerizing things, namely that a lot of Looker state needs to persist between container destruction and re-creation. That includes the things you listed, along with model* directories, ssh/cluster-shared directories, tunnel configurations, caches, and files added to the filesystem while you’re working (BigQuery certificates, for example).
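For anyone experimenting despite that, the usual mitigation is to mount the stateful paths from outside the container. A minimal docker-compose sketch, where the image name and paths are assumptions (Looker publishes no official image):

```yaml
# Hypothetical compose fragment; image name and mount paths are assumptions.
services:
  looker:
    image: my-registry/looker:latest   # no official Looker image exists
    ports:
      - "9999:9999"
    volumes:
      # Persist the Looker home dir: models, ssh keys, tunnel config,
      # caches, and certificates added while working.
      - looker-home:/home/looker
volumes:
  looker-home:
```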
One big benefit of Docker is that you can create ephemeral, lightweight services that you can automagically scale up/down. The Looker java application doesn’t really fit within this paradigm.
If you’re using Docker for Looker release testing or your own unit testing, that may be appropriate. Using Looker in Docker in production is a risky proposition at best, and is not supported.
Thanks so much for the quick response and for the info. That all makes perfect sense. We’ll avoid any use of Docker for production use of Looker, but I like your idea of using it for testing.
Just wanted to add that we would really love to be able to do this as well.
Our company has been moving a lot of our production services over to Kubernetes and I would love to be able to run Looker as a containerized service on Kubernetes. This could also be a benefit to new customers who want to run on premise (once you have predefined Docker+Kubernetes setup it becomes really easy to deploy Looker).
@Michael_Erasmus. We would love to be able to control this as just another service in our Kubernetes cluster. Our main concerns are on licensing as Kubernetes is an elastic infrastructure, but that’s not really a technical limitation for us. Seems like everything is here for a Docker deployment; I’d love to see a Dockerhub distribution of Looker in the near future.
I’m a little late to this party, but after a discussion with one of the team and Looker I thought I’d contribute!
We’re using Looker in a container very successfully. We’re in Amazon, we have it linked to RDS, and we’re running under Rancher. We’ve got it set up as a “Pet” rather than as cattle (e.g., we don’t run multiple Looker containers), but it’s very simple.
We’re currently on Looker 4.18, with plans to upgrade again shortly, but we’ve been through a few upgrade cycles and things have been very simple. We use a data container for the Looker home directory, which is backed by EBS, so there’s a bunch of state that gets retained. Plus, we don’t build the Looker .jar file into the container itself.
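A rough sketch of that arrangement in docker-compose form; the paths and image name are assumptions, since the actual Dockerfile hadn’t been shared at this point in the thread:

```yaml
# Assumed layout: looker.jar lives on an EBS-backed host path and is
# mounted in, rather than baked into the image, so an upgrade is just
# swapping the jar and restarting the container.
services:
  looker:
    image: my-registry/looker-base:latest   # hypothetical base image (JRE + deps)
    volumes:
      - /mnt/ebs/looker:/home/looker        # EBS-backed home dir retains state
      - /mnt/ebs/jars/looker.jar:/home/looker/looker.jar
```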
I’m planning to share the Dockerfile we use, as there’s nothing particularly exciting or confidential in it.
We would love to have this more officially supported as well, with any local dependencies moved out to external services: memcached, Redis, Elasticsearch, EFS, etc.
Could you share this Dockerfile, please?
I’m currently running our on-prem Looker instances in Docker on Google Kubernetes Engine (GKE), connected to BigQuery.
Looker could definitely use some improvements that would make this setup much easier and more straightforward to manage. However, it can be done. I recommend talking with your support engineer for assistance.
We haven’t yet rolled this into production, but it’s actively under development, and generally works just fine. The key is to mount the persistent parts as a persistent-volume.
In our case, we are processing PHI/HIPAA compliant workloads and want to keep our tenants physically isolated. So, being able to run Looker instances in containers that are bound to tenant-dedicated node-pools is a must. Kubernetes makes this pretty easy.
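As a sketch of those two pieces (a persistent volume for Looker’s state plus a tenant-dedicated node pool), with all names hypothetical:

```yaml
# Hypothetical manifests: claim a persistent disk for Looker's home
# directory and pin the pod to a tenant-dedicated GKE node pool.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: looker-home
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 20Gi
---
apiVersion: v1
kind: Pod
metadata:
  name: looker
spec:
  nodeSelector:
    cloud.google.com/gke-nodepool: tenant-a-pool  # tenant-dedicated pool
  containers:
    - name: looker
      image: my-registry/looker:latest            # no official image
      volumeMounts:
        - name: home
          mountPath: /home/looker
  volumes:
    - name: home
      persistentVolumeClaim:
        claimName: looker-home
```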
Hey all. I uploaded a copy of the build file to github:
We’re currently running Looker 5.2.9 on this setup, and it works pretty well.
Looker does not officially support Docker, but many customers have been able to make it work successfully. Looker has created the following Docker config that seems to work well. The information is on GitHub at…
Please read the README.md file associated with this for the important considerations. If you have improvements that you would be willing to share, please open a pull request.
Customer Success Architect
Sharing an update on our journey…
After a lengthy process, we’ve finally completed a move from a standalone EC2 node to a containerized service in our ECS cluster. I plan on writing a more detailed postmortem, which will include some of the infrastructure components that went into it, along with challenges, issues, and thoughts.
For now, we’re sharing our docker-compose and Dockerfile with the community in hopes it helps anyone else who wants to go in this direction.
Here’s a link to a github gist with the details:
I opened a PR in the looker/looker_docker repo with the setup we are using in production at Carta. Hopefully this is helpful to someone who visits and wants to run Looker in Docker.
The deployment step is very straightforward. We have an EBS volume in AWS mounted to /srv/data in the container.
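In docker-compose terms that mount looks roughly like this (the host mount point for the EBS volume and the image name are assumptions):

```yaml
# Sketch: the EBS volume is attached and mounted on the host
# (e.g. at /mnt/looker-data), then bind-mounted into the container.
services:
  looker:
    image: my-registry/looker:latest   # hypothetical image name
    volumes:
      - /mnt/looker-data:/srv/data     # EBS-backed state survives restarts
```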
Big thanks to @MikeD and @tfoley for sharing your code for us to build on.
OpenEBS looks promising; they have an NFS solution for Kubernetes.