Something strange was happening in our testing of Candy Crush Soda Saga. Players of test versions of the game came back for the first few days but after a couple of weeks some of them were drifting away. What was going on?
We tested a few theories by releasing tweaked versions of the game to subsets of users and then analysing our results. The first experiments were inconclusive. The initial idea, that the game was too hard and these people were quitting when they reached more difficult levels, was slowly replaced by the counterintuitive idea that perhaps the game was too easy.
We tested a version of the game in which the early levels were harder. Immediately we saw more people dropping out in the first few days. Second-day retention is an important metric for us, so that was a bad sign, but we decided to let the experiment run. The decision paid off: at seven and 14 days the number of returning players was higher than it had been before.
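The retention metrics above can be sketched in code. This is a minimal, hypothetical illustration (not King's actual pipeline), assuming event logs of `(player_id, activity_date)` pairs: a player's install date is the first day they appear, and day-N retention is the share of players active exactly N days later.

```python
from datetime import date, timedelta

def day_n_retention(events, n):
    """Fraction of players active exactly n days after their first session.

    events: iterable of (player_id, activity_date) pairs.
    n=1 corresponds to "second-day" retention (day 0 is the install day).
    """
    first_seen = {}   # player -> earliest activity date
    active_days = {}  # player -> set of activity dates
    for player, day in events:
        if player not in first_seen or day < first_seen[player]:
            first_seen[player] = day
        active_days.setdefault(player, set()).add(day)
    retained = sum(
        1 for player, d0 in first_seen.items()
        if d0 + timedelta(days=n) in active_days[player]
    )
    return retained / len(first_seen)

# Toy cohort: "a" returns the next day, "b" churns, "c" returns a week later.
events = [
    ("a", date(2024, 1, 1)), ("a", date(2024, 1, 2)),
    ("b", date(2024, 1, 1)),
    ("c", date(2024, 1, 1)), ("c", date(2024, 1, 8)),
]
print(day_n_retention(events, 1))  # second-day retention: 1/3
print(day_n_retention(events, 7))  # day-7 retention: 1/3
```

With real data the same calculation would be run per experiment variant, which is how a drop in second-day retention can coexist with a rise at seven and 14 days.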
This evidence suggested that our old second-day retention numbers, which were really good, were being distorted by people who were never going to become long-term players. Meanwhile, we had been losing would-be fans of the game because we weren’t giving them enough of a challenge.
That experiment is typical of how we learn from data at King. We have about 150 people working in data roles, out of a total workforce of 2,000. They come from a range of backgrounds. Many are from the games industry, of course, but we also bring in lots of recruits straight from university.
These people will have just done their master's or PhD in a wide range of disciplines. Many of our team studied statistics, physics or computer science, but we also have people who came from theoretical biology, because work on DNA sequencing in that field has produced a lot of data-sophisticated people. Others are behavioural psychologists or behavioural economists.
Within the data team there are about 30 data engineers, responsible for organising our data. We also have a good number of business intelligence developers, who build a fantastic range of standardised reports used by hundreds of people across the business. Then we have a larger number of analytically savvy business performance managers and directors, who work very closely with our games and other parts of the business to make sure we get the most from our data – which means taking the right data-driven decisions.
Data scientists make up by far the largest proportion of the team. In some companies data scientists are heavily involved in artificial intelligence and machine learning but those areas are, with a few small exceptions, not a priority for us. Our data scientists are scientists first and foremost. Like all good scientists they need to really know their field, which in our case means understanding players and their motivations as well as understanding how games work and how they can affect a player’s motivations.
They need to be able to think like detectives to really understand what is happening with our players, and to design great experiments to test their theories. They are scientists in the world of players and games. At their disposal is a vast ocean of data created by more than 320 million players who play King games each month. Few companies in the world can boast that amount of data. The chance to learn from it, and to impact so many people, is one of the things that attracts top talent to King.
We learn not only about games but also about human nature. For example, some time ago we noticed a discrepancy between two versions of Candy Crush. The Facebook version, coded in Flash, and the mobile version, written in C++, were behaving slightly differently. The physics of how the candies drop is surprisingly intricate and small variations in the code meant that in one version players were more likely to get power-ups and other in-game bonuses.
The data showed that one version of the game was performing slightly better with players. We decided to run an experiment to properly measure differences in behaviour for those players who got more power-ups. In other words – what happens if you give players an easier game?
As you might expect, we saw those players progress through the game more quickly. Unlike the Soda Saga example that began this piece, the game wasn’t so easy that people were put off. Instead it was just easy enough to improve retention, which is obviously a good thing.
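Judging whether a retention difference like this is real or just noise typically comes down to a standard significance test. As a hedged sketch (the counts below are invented for illustration, not King's numbers), a two-proportion z-test compares retention between a control group and the easier variant:

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Z statistic and two-sided p-value for a difference in proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, Phi(x) = 0.5*(1+erf(x/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up example: control retains 4,000 of 10,000 players,
# the easier variant retains 4,300 of 10,000.
z, p = two_proportion_z(4000, 10_000, 4300, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

At these (illustrative) sample sizes a three-percentage-point lift is comfortably significant; with smaller cohorts the same lift could easily be noise, which is one reason experiments are left to run rather than called early.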
What we might have expected was that these players would spend less money. After all, if the game is easier then you wouldn’t feel the need to buy any extras, right? Actually, the reverse was true. These players were spending more. It seems that people are more likely to spend money when they are really enjoying themselves. In the end, it’s all about fun.
The point of collecting all this data is to keep improving our games so that they are as much fun as possible. And finding ways to do that – well, that’s fun too.