
Vince Darley

With over 318 million monthly unique users across web, social and mobile platforms as of the third quarter of 2015, King generates over 20 billion events a day. That’s an incredible amount of information overseen by King’s Data Analytics & Business Performance teams, who work on all aspects of analytics and business intelligence, including downstream and upstream data pipelines, data warehousing, real-time and batch reporting, segmentation and real-time analytics. We met up with Vince Darley, Chief Scientist (previously VP Data Analytics & Business Intelligence), to hear more about how that data is used across the business to improve the player experience.

How do you handle the tremendous amount of data?

What we’ve learnt over the years is that at our scale you need hybrid solutions. During our explosive growth, with hundreds of millions of players playing our Saga games on mobile, we set up a Hadoop system – effectively a “supertanker”: big and solid, but not very nimble – for what was then 2 petabytes of compressed data. To let us do our core processing much more quickly, we decided to add Exasol, a very fast in-memory analytics database.
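To make the supertanker-plus-speedboat idea concrete, here is a minimal sketch of how a hybrid setup like this could be queried from Python, assuming the pyexasol and PyHive client libraries and placeholder hosts, credentials and table names. It illustrates the general pattern, not King’s actual tooling.

```python
# Illustrative only: route quick, interactive aggregations to the in-memory
# database and heavy full-history scans to Hadoop/Hive. Hosts, credentials
# and table names are placeholders.
import pyexasol                # client for the in-memory analytics database
from pyhive import hive        # client for batch queries against Hadoop

def run_fast(sql):
    """Interactive query against the in-memory database."""
    conn = pyexasol.connect(dsn="exasol.internal:8563", user="analyst", password="***")
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()

def run_batch(sql):
    """Heavy scan over the full history on Hadoop (the 'supertanker')."""
    conn = hive.connect(host="hadoop.internal", port=10000, username="analyst")
    try:
        cursor = conn.cursor()
        cursor.execute(sql)
        return cursor.fetchall()
    finally:
        conn.close()

# Iterate quickly on a recent subset, then re-run the question over the full history.
recent = run_fast("SELECT level, AVG(attempts) FROM events_recent GROUP BY level")
full = run_batch("SELECT level, AVG(attempts) FROM events_all GROUP BY level")
```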

Where are you today?

We currently hold 5 petabytes of data, adding 1.2 terabytes each day from 20 billion daily events. Our data scientists can pull a subset of that and run a test on it within an hour, and when they’ve worked out what question to ask, they can test at a far larger scale on Hadoop. We currently process data on a daily cycle, we’re heading towards hourly, and the aim is obviously real-time – a challenge we call bridging the latency gap.
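As a back-of-envelope check on those figures (my arithmetic, not King’s), 1.2 terabytes spread over 20 billion events works out at roughly 60 bytes per compressed event:

```python
# Back-of-envelope arithmetic on the figures quoted above (illustrative only).
daily_events = 20e9       # 20 billion events per day
daily_bytes = 1.2e12      # 1.2 TB added per day
total_bytes = 5e15        # 5 PB held in total

print(f"~{daily_bytes / daily_events:.0f} bytes per event (compressed)")    # ~60
print(f"~{total_bytes / daily_bytes:.0f} days of history at today's rate")  # ~4167
```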

In general, what information do you look at?

One of the key things we want from our games is longevity. That sense of progression and challenge is why our players come back every day. So we do a lot of testing on features and experience, which includes the actions players take in our games’ levels – scores, stars, timestamps, whether they finished or failed a level and, if they failed, what made them fail. Other important aspects are how players help their friends – like sending lives or moves – and, finally, any financial transactions.
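As a concrete illustration, a single level-attempt event of the kind described might look like the record below. The field names are a hypothetical example, not King’s actual schema.

```python
# Hypothetical level-attempt event with the kinds of fields mentioned above
# (score, stars, timestamp, outcome, social actions, purchases). Invented schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LevelAttemptEvent:
    player_id: str              # pseudonymous player identifier
    game: str                   # e.g. "candy_crush_saga"
    level: int
    started_at: int             # unix timestamp of the attempt
    finished: bool              # did the player complete the level?
    fail_reason: Optional[str]  # e.g. "out_of_moves" when finished is False
    score: int
    stars: int                  # 0-3 stars awarded
    lives_sent: int             # social action: lives sent to friends
    purchase_usd: float         # value of any purchase made during the attempt

event = LevelAttemptEvent(
    player_id="p-123", game="candy_crush_saga", level=65, started_at=1444000000,
    finished=False, fail_reason="out_of_moves", score=48200, stars=0,
    lives_sent=1, purchase_usd=0.0,
)
```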

What are the major challenges in deriving actionable insights from this data?

With games like ours, which can be played anytime, anywhere and on any device, merging very varied data and identifiers can be particularly challenging. But getting this right is extremely important because it paints a clear picture of the individual player. Understanding the player’s motivation and state of mind is vital to making the right decision. To do that we need to observe behaviour closely and understand, for example, that someone might be playing on a new device (i.e. no longer using their old one), rather than concluding that their reduced game time is due to waning interest.
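One common way to merge varied identifiers (device IDs, platform accounts and so on) into a single player view is union-find-style identity stitching. The sketch below illustrates that general technique with made-up identifiers; it is not a description of King’s pipeline.

```python
# Illustrative identity stitching: group observed identifiers (device IDs,
# platform accounts, ...) into one logical player using union-find.
# The identifiers and linkage events below are made up.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path halving
        x = parent[x]
    return x

def union(a, b):
    ra, rb = find(a), find(b)
    if ra != rb:
        parent[ra] = rb

# Each pair says "these two identifiers were seen for the same login/session".
observed_links = [
    ("device:ios-abc", "fb:1001"),
    ("fb:1001", "device:android-xyz"),   # same player on a new device
    ("device:web-778", "fb:2002"),
]
for a, b in observed_links:
    union(a, b)

# The old and new devices now resolve to the same player cluster.
assert find("device:ios-abc") == find("device:android-xyz")
```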

Can you give an example where the data you’ve found has changed a game?

A great example would be level 65 of Candy Crush, which is rather infamous now because people found it too hard. It was taking an average of 120 attempts to pass. Inevitably there was a drop-off, and about 50% of players stopped playing the game right there. So we created an AB test where one half of the population got the existing version of the level and the other half got an easier version. We were concerned that, in exchange for more people getting through the level, revenue would go down. But actually the data told us a very different story. Looking across that level and into the future, we could see levels making more money – not because players needed boosters, but because they were happy to spend $1 when they were having a more enjoyable experience.
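A heavily simplified version of the analysis behind a test like that could look like the sketch below, comparing pass rates and revenue per player between the two variants. The numbers are invented and the tests are standard defaults (chi-squared for pass rate, Mann-Whitney for long-tailed revenue), not necessarily what King used.

```python
# Simplified A/B analysis for a "make level 65 easier" experiment.
# All numbers are invented; the tests are common defaults, not King's method.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 10_000  # players per variant (made up)

# Did the player get past the level? (control vs. easier variant)
passed = {"control": 5_100, "variant": 7_900}
table = [[passed["control"], n - passed["control"]],
         [passed["variant"], n - passed["variant"]]]
chi2, p_pass, dof, expected = stats.chi2_contingency(table)

# Revenue per player over the following weeks (heavily long-tailed).
rev_control = rng.exponential(4.0, n) * (rng.random(n) < 0.05)
rev_variant = rng.exponential(4.0, n) * (rng.random(n) < 0.06)
u_stat, p_rev = stats.mannwhitneyu(rev_variant, rev_control, alternative="greater")

print(f"pass rate: {passed['control']/n:.0%} vs {passed['variant']/n:.0%}, p = {p_pass:.2g}")
print(f"revenue/player: {rev_control.mean():.2f} vs {rev_variant.mean():.2f}, p = {p_rev:.2g}")
```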

What are the common problems with AB testing?

I could go on for hours about AB-testing, but for us in general it comes down to four major points:

  1. Metrics for free-to-play games are long-tailed, i.e. a small number of players can easily skew the analysis (see the sketch after this list).
  2. Keeping a consistent experience across multiple devices complicates testing.
  3. Effects and impacts are subtle, so sometimes we have to run experiments for a long time to get conclusive outcomes.
  4. Gaining insights that inform decisions in other games is hard – it’s easier to design AB-tests that tell you about a game than about a player.
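To illustrate the first point, the sketch below uses synthetic long-tailed revenue data and a bootstrap confidence interval for the difference in average revenue per player, which is one common way to stop a handful of big spenders from dominating a naive comparison of means.

```python
# Point 1 illustrated: long-tailed revenue and a bootstrap confidence interval
# for the difference in mean revenue per player. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(42)

def synthetic_revenue(n, spender_rate, scale):
    # Most players spend nothing; a few spend a lot (the long tail).
    return rng.exponential(scale, n) * (rng.random(n) < spender_rate)

group_a = synthetic_revenue(20_000, 0.04, 5.0)
group_b = synthetic_revenue(20_000, 0.04, 5.5)

def bootstrap_diff_ci(a, b, n_boot=1_000, alpha=0.05):
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        diffs[i] = rng.choice(b, b.size).mean() - rng.choice(a, a.size).mean()
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

lo, hi = bootstrap_diff_ci(group_a, group_b)
print(f"observed difference in means: {group_b.mean() - group_a.mean():.4f}")
print(f"95% bootstrap CI: [{lo:.4f}, {hi:.4f}]")
```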

How should one avoid those problems?

We employ great data scientists! People who combine stats wizardry and business insight, who can deal with the statistical subtleties and answer the right questions. People who have expertise in behavioural economics.

Then, of course, you need to look at the data. What kind of answer do you need: one that gives you an indication you can draw conclusions from, or one that’s exact? Once you’ve defined your question, you know whether you need to run the test for a short or a long time.
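One way to turn “how precise does the answer need to be” into a test duration is a standard power calculation. The sketch below uses statsmodels with made-up values for the baseline conversion rate, the smallest effect worth detecting, and daily traffic per variant.

```python
# Rough test-duration estimate from a standard power calculation.
# Baseline rate, minimum detectable effect and traffic figures are made up.
from statsmodels.stats.power import zt_ind_solve_power
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.020   # 2.0% of players convert on the current version
target = 0.021     # smallest lift worth detecting: 2.0% -> 2.1%
effect = proportion_effectsize(target, baseline)

players_per_arm = zt_ind_solve_power(effect_size=effect, alpha=0.05, power=0.8)
daily_players_per_arm = 10_000   # made-up traffic entering each variant per day

print(f"~{players_per_arm:,.0f} players per arm")
print(f"~{players_per_arm / daily_players_per_arm:.0f} days to reach that")
```

With these made-up settings the test needs on the order of 150,000 players per arm, i.e. a couple of weeks at 10,000 players per variant per day, which is why subtle effects can mean long-running experiments.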

Most importantly though, always have the player and their experience in mind rather than staring blindly at what gives you the best metrics.
