
Top Gaming Company Strengthens Customer Support Capabilities with AI

Appen’s Data Labeling Platform Helps Gaming Industry Giant Scale to Meet Customer Needs
May 19, 2021

“Appen provided the integration of different tools and options in terms of using our internal labelers. Being able to monitor contributor performance and have a system that's already integrated with that capability was super helpful. Not to mention the scalability of it.” – Gaming company specialist

 

The Company

A large, US-based gaming company that has delivered games and online content to hundreds of millions of users around the world.

 

The Challenge

Gaming customers often seek out customer support when they face technical or other issues with gameplay. For quicker triage and greater support capacity, the company uses artificial intelligence (AI)-powered chatbots to manage support requests. As is often the case with natural language, player requests can be confusing or vague, making them difficult for the AI model to interpret.

The gaming company needed to enrich their machine learning model with more training data covering a broader range of support requests, the goal being to equip the chatbots to better understand and address customer needs. The team worked internally on annotating past chat logs between customers and agents to serve as training data for the model, but only one team member at a time had the bandwidth to work on the massive effort. Tooling was also limited: that team member collected labels across several Excel documents and other programs, making the work extremely difficult to scale. The project also needed more resources and a comprehensive database for data storage.
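
To make the setup concrete, the short sketch below shows one common way annotated chat logs can be turned into training data for a support-intent classifier. The schema, intent labels, and libraries here are illustrative assumptions, not the company's actual pipeline or the Appen platform's output format.

# Illustrative sketch only: hypothetical labeled chat logs feeding a simple
# intent classifier, not the company's actual model or data schema.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each row pairs a player's support message with a human-applied intent label.
labeled_chats = pd.DataFrame({
    "message": [
        "My game keeps crashing on launch",
        "I was charged twice for the same purchase",
        "I can't log in to my account",
        "The new patch will not download",
    ],
    "intent": ["technical_issue", "billing", "account_access", "technical_issue"],
})

# Train a basic text classifier on the labeled examples.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(labeled_chats["message"], labeled_chats["intent"])

# A chatbot could route a new request based on the predicted intent.
print(model.predict(["The update will not install on my console"]))

The more varied the labeled requests, the better a classifier like this can handle the vague or confusing phrasings players actually use.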

 

The Solution

In 2019, the gaming company received a recommendation from one of our partners to connect with Appen. Our platform offered a set of tools and functionality that integrated easily into their existing processes, replacing the various Excel documents and collating that information into a central data repository.

With our platform, they were able to onboard new internal labelers quickly, and the project team grew efficiently. Team leads used the platform to track labelers on valuable metrics, such as how long tasks took and how each labeler's performance compared to the others. Team leads could also pull reports to compare data on key measurements such as trust scores and label accuracy.
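
As a rough illustration of this kind of reporting, the sketch below computes average task time and label accuracy per labeler from a hypothetical export with known ("gold") answers; the column names and gold-answer setup are assumptions, not the platform's actual report format.

# Illustrative sketch only: per-labeler metrics computed from a hypothetical
# export of labeling results with known ("gold") answers.
import pandas as pd

labels = pd.DataFrame({
    "labeler":      ["ana", "ana", "ben", "ben", "ben"],
    "task_seconds": [42, 35, 60, 55, 48],
    "label":        ["billing", "technical_issue", "billing", "account_access", "billing"],
    "gold_label":   ["billing", "technical_issue", "billing", "billing", "billing"],
})

# Mark each judgment as correct or not against the gold answer.
labels["correct"] = labels["label"] == labels["gold_label"]

# Aggregate into a simple per-labeler report: average time per task and accuracy.
report = labels.groupby("labeler").agg(
    avg_task_seconds=("task_seconds", "mean"),
    accuracy=("correct", "mean"),
)
print(report)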

 

Results

Our platform enabled the gaming company to scale their project team, which led to a number of benefits. The larger team discovered new, more specific labels that brought greater granularity and nuance to how support issues were categorized. Customers in turn received a more personalized experience, with improved response times from chatbots.

The time required to review labeled data for completeness and accuracy was greatly reduced as more team members were brought into the labeling process. Through our platform, team leads could easily monitor contributors for performance and task fatigue, making adjustments as needed.

At the start of the project, the team had developed over 13,000 rows of labeled data; that number grew nearly 12-fold as the team expanded. As a result, interactions between customers and agents have improved, with chatbots able to target support needs more efficiently. The continued success of the customer support model has opened further pathways for the gaming company to explore in enhancing the player experience.