Using data to create better gaming experiences at FACEIT

  • 28 March 2022

This content, written by Maria Laura Scuri, was originally posted on the Looker Blog on Dec 9, 2020. The content is subject to limited support.

From posting content and sharing match highlights to finding teams and playing games with friends, there’s a lot users can do on the FACEIT platform. Broken down, all that activity translates into approximately 18 million users, 60 million monthly sessions, 800,000 daily active users, and around 270,000 peak concurrent users. In other words, a lot of data.

We wanted a way to better leverage that and use it to continue driving towards our mission of furthering the esports ecosystem for both players and publishers. In this piece, I’ll share how we went from big numbers and unused data to data insights powered by a robust tech stack of Looker and BigQuery.

Data bottlenecks and silos slow business agility

Before modernizing our technology stack, we ran a third-party visualization platform on top of BigQuery for creating and viewing dashboards. However, because that data then had to flow into four different business intelligence platforms, my team faced a constant challenge keeping it from breaking or becoming inaccurate.

What resulted were data silos. With no standardized definition of our critical business metrics, the information people would go into meetings with would vary depending on which platform they pulled the data from. In addition to these data silos, we also had functional challenges that were becoming bottlenecks. My team couldn’t clear the huge backlog of requests for data reports, which made data-driven decision making slower and delayed feature rollouts.

Looker offered a solution for governed, self-service analytics

Knowing we couldn’t continue to deliver innovations to the esports ecosystem with these issues, we decided we needed a data platform that could quickly provide access to reliable, accurate data. After thorough research of the different platforms on the market, we selected Looker because it:

  • Could integrate directly with BigQuery along with our various microservices, pulling data in straight from the source
  • Allowed us to easily define our business-critical measures and dimensions
  • Empowered business users with the same data-exploration capabilities as technical users, making it possible for everyone to answer their own business questions

Achieving growth through technical and cultural change

To support our plans for growth, we rolled out Looker to the organization. We needed to promote data-driven decision making while removing the data bottleneck of everyone relying solely on my team for the latest numbers.

First, we built our data model with the input and collaboration of our business intelligence team. Once our model was solidified, we selected a small group of employees with individuals from each department — we called them our Looker champions — to try out Looker first before rolling it out to the whole company. By the time we rolled out company-wide, our BI team had established Looker advocates within each team. Our BI team continued to work with our Looker champions to keep learning about how the use of Looker was spreading and to better understand the unique challenges of each team.

Fast forward to today: over half the FACEIT organization uses Looker, spending an average of two hours each day running roughly 200 queries in Looker. And through enablement programs like our regular lunch and learn sessions, we aim to be at 100% active users by the end of June 2021.

Example: Saving time and automating insights with LookML auto-generation

One of the coolest ways we’ve used Looker so far is to automatically create a new view whenever a new table appears in BigQuery. To optimize this LookML auto-generation for size, cost, join accuracy, and scalability, we built it in a staging environment before pushing it to our Looker production environment.

Every ten minutes, this Python code checks for new tables in BigQuery. When a new table is found, this triggers the creation of a new LookML view that is then automatically added to the Looker model. And because the view auto-generates with measures already in it, developers and product managers have the basic data they need to start using it immediately. This not only saves time for our technical users, but also increases the availability and accessibility of accurate Looker dashboards for our business users.
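The view-generation step described above can be sketched in Python. This is an illustrative reconstruction, not FACEIT's actual code: the `generate_lookml_view` function, the type mapping, and the sample schema are all assumptions, and the real pipeline's ten-minute BigQuery polling (and the push into the Looker model) is omitted so the sketch stays self-contained.

```python
# Hypothetical sketch of LookML auto-generation: given a new table's
# schema as (column, BigQuery type) pairs, render a LookML view with a
# dimension per column and a count measure so the view is usable
# immediately. Names and the type mapping are assumptions.

LOOKML_TYPES = {
    "STRING": "string",
    "INTEGER": "number",
    "FLOAT": "number",
    "BOOLEAN": "yesno",
    "TIMESTAMP": "time",
}

def generate_lookml_view(table_name: str, schema: list[tuple[str, str]]) -> str:
    """Render LookML view text for a newly discovered BigQuery table."""
    lines = [f"view: {table_name} {{", f"  sql_table_name: `{table_name}` ;;", ""]
    for column, bq_type in schema:
        lookml_type = LOOKML_TYPES.get(bq_type, "string")
        if lookml_type == "time":
            # Timestamps become dimension groups with common timeframes.
            lines += [
                f"  dimension_group: {column} {{",
                "    type: time",
                "    timeframes: [date, week, month]",
                f"    sql: ${{TABLE}}.{column} ;;",
                "  }",
            ]
        else:
            lines += [
                f"  dimension: {column} {{",
                f"    type: {lookml_type}",
                f"    sql: ${{TABLE}}.{column} ;;",
                "  }",
            ]
    # Auto-generated measure, so consumers get basic numbers right away.
    lines += ["  measure: count {", "    type: count", "  }", "}"]
    return "\n".join(lines)

view = generate_lookml_view(
    "match_stats",
    [("player_id", "STRING"), ("kills", "INTEGER"), ("played_at", "TIMESTAMP")],
)
print(view)
```

In a real deployment, the generated text would be committed to the LookML project and the view added to the model, which is the part that makes the data immediately explorable by product managers.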

Using Looker plus machine learning to stop toxic messages in their tracks

We’ve also used Looker to help our teams track and label “toxic” messages, something we could previously only do manually.

We want to provide our community with a healthy and positive social experience. Given the number of matches played on our platform every day, we looked into using data and AI to build a scalable solution for detecting and fighting toxic behaviours in chat, voice, and beyond.

Using Looker along with machine learning, we’ve been able to automate the classification of toxic messages on the FACEIT platform while still enabling our team with the ability to go into Looker to re-label messages if/when necessary. These inputs are passed back to BigQuery to then train the model so that similar messages are more likely to be labeled correctly in the future.
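The feedback loop described above, where human corrections made in Looker flow back into the training data before the next model run, can be sketched as follows. This is a minimal illustration under assumed data shapes: the message ids, field names, and `apply_relabels` helper are hypothetical, and the actual classifier and BigQuery round trip are out of scope.

```python
# Illustrative sketch of the re-labeling feedback loop: merge human
# corrections (keyed by message id) into the model-labeled training
# rows, so the next training run learns from the fixes. Field names
# and data shapes are assumptions, not FACEIT's actual schema.

def apply_relabels(training_rows, corrections):
    """Overwrite model-assigned toxicity labels with human corrections.

    training_rows: list of dicts like {"id": ..., "text": ..., "toxic": bool}
    corrections:   dict mapping message id -> corrected toxic flag
    """
    updated = []
    for row in training_rows:
        # Human label wins when present; otherwise keep the model's label.
        label = corrections.get(row["id"], row["toxic"])
        updated.append({**row, "toxic": label})
    return updated

rows = [
    {"id": 1, "text": "gg wp", "toxic": True},      # model false positive
    {"id": 2, "text": "uninstall", "toxic": False}, # model false negative
]
fixed = apply_relabels(rows, {1: False, 2: True})
print(fixed)
```

The corrected rows would then be written back to BigQuery as the next training set, which is what makes similar messages more likely to be labeled correctly in the future.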

Freeing up internal teams to focus on the user experience

Leveraging Looker alongside BigQuery has not only allowed us to reallocate dollars saved on managing our infrastructure to other initiatives, it has also freed our BI team from acting as a bottleneck and working across data silos.

With the ability to leverage accurate and timely insights, we’re excited to continue creating new and innovative ways for users to have the best esports experiences on the FACEIT platform.

To continue learning about gaming analytics, check out the Looker . You’ll find in-depth webinars with customers, case studies, and relevant Blocks all in one place.
