Improving Data Analytics by Avoiding Common Pitfalls

by Dan White

Casino operators enjoy access to an abundance of data. From gaming systems to marketing management tools, data is everywhere. This access creates tremendous opportunity to align decisions with accurate, timely and meaningful analytics. Without it, decision making would be much less reliable. But can a reliance on data have a downside?

The simple answer is yes: there are potential pitfalls when using data to manage casino operations and marketing programs. This is not to suggest that a data-centric approach is overrated. On the contrary, effectively harnessing the data at our disposal is a tremendous way to improve performance and increase profitability.

There are, however, some concepts worth becoming familiar with. These “data traps” can interfere with our ability to use data to make reliable decisions and hinder efforts to maximize an organization’s potential. Building awareness of these traps curbs their influence and improves the overall results derived from data analytics.
 
The Cobra Effect

According to the often-told story, British authorities in colonial India sought to reduce the number of deaths resulting from venomous cobra attacks. To do so, they offered to pay a bounty to citizens in exchange for dead cobras. But some enterprising citizens saw an opportunity and began breeding cobras for the purpose of killing them and collecting the bounty. When officials became aware of the scheme, they halted the program. With no incentive left to care for the venomous snakes, the breeders released them into the wild, dramatically increasing the cobra population in the process.

The cobra effect, as it has become known, is a form of perverse incentive: an incentive that produces an unintended result. This has implications for casino marketing and operations.

An example of this is when casinos measure the success of marketing programs using coin-in data. While coin-in is a terrific measure of total slot volume, it can create a cobra effect-like outcome. Operators who rely on coin-in to measure promotional success may not consider the expenses required to generate that coin-in. The result is a hyper-focus on the wrong metric: chasing high coin-in numbers can come at a price that is ultimately too high to pay, in the form of higher reinvestment and reduced margins.

One way to avoid the cobra effect, or any perverse incentive, is to align the results that matter most to your organization with the metrics that best represent those results. In the case of coin-in, consider focusing instead on EBITDA or net win (slot win minus promotional expenses) to determine program success.
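To make the distinction concrete, here is a minimal sketch in Python. The promotion figures and hold percentage are entirely hypothetical; the sketch simply applies the net win definition above (slot win minus promotional expenses) to two imaginary programs.

HOLD_PCT = 0.08  # assumed house hold percentage (hypothetical)

# Hypothetical promotion results used only for illustration
promotions = {
    "Promotion A": {"coin_in": 2_500_000, "promo_expense": 180_000},
    "Promotion B": {"coin_in": 1_600_000, "promo_expense": 60_000},
}

for name, p in promotions.items():
    slot_win = p["coin_in"] * HOLD_PCT          # estimated slot win
    net_win = slot_win - p["promo_expense"]     # net win per the definition above
    print(f"{name}: coin-in ${p['coin_in']:,.0f}, "
          f"slot win ${slot_win:,.0f}, net win ${net_win:,.0f}")

In this illustration, the program with the larger coin-in number returns less net win once its promotional expense is counted, which is exactly the trap described above.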
 
Insensitivity to Sample Size

A well-known study by psychologists Amos Tversky and Daniel Kahneman presented subjects with a hypothetical situation in which two hospitals served a town. One of the hospitals was much larger than the other. Participants were asked which of the hospitals was more likely to record days in which more than 60 percent of babies born were boys.

Only 22 percent of the participants correctly answered that the smaller hospital was more likely to record such days. This is an example of an insensitivity to sample size – the tendency to underestimate variance and volatility in small samples. In the case of the two hospitals, the smaller hospital is likely to see more extreme outcomes than the larger one.
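The reasoning can be checked with a quick simulation. The sketch below assumes illustrative daily birth counts of 45 for the large hospital and 15 for the small one, treats each birth as an even 50/50 draw, and counts how often each hospital records a day on which more than 60 percent of births are boys.

import random

random.seed(1)

def share_of_days_over_60(births_per_day, days=10_000):
    """Fraction of simulated days on which more than 60% of births are boys."""
    over = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > 0.60:
            over += 1
    return over / days

print(f"Large hospital (45 births/day): {share_of_days_over_60(45):.1%}")
print(f"Small hospital (15 births/day): {share_of_days_over_60(15):.1%}")

Run over many simulated days, the smaller hospital crosses the 60 percent threshold far more often, purely because small samples swing more.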

This tendency is common: we assume a sample is representative of the total population without considering the size of the sample. For casino operators, it can surface in many forms.

For example, comment cards and other forms of customer feedback can be important tools for understanding the guest experience and customer sentiment. But operators should be careful not to extend those sentiments to a larger population without further research; a handful of responses may not represent the entire customer base.

Focus groups are another example. Focus groups are a valuable part of an organization’s research efforts and a great source for collecting qualitative data, but be mindful of the sample size and avoid hasty generalizations.

To combat this effect, ensure that the sample size is known for any report or analysis being presented or reviewed. Additionally, reserve judgment and avoid drawing firm conclusions until additional data is available.
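One simple habit is to report a margin of error alongside any percentage drawn from a small sample. The sketch below uses hypothetical comment-card and survey counts with a standard normal-approximation confidence interval; the specific numbers are assumptions for illustration, not figures from the article.

import math

def proportion_ci(positives, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    p = positives / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical counts: 18 of 25 comment cards positive vs. 288 of 400 survey responses
for label, positives, n in [("25 comment cards", 18, 25), ("400 survey responses", 288, 400)]:
    p, lo, hi = proportion_ci(positives, n)
    print(f"{label}: {p:.0%} positive, 95% CI roughly {lo:.0%} to {hi:.0%}")

The same positive rate looks very different when it comes from 25 cards versus 400 responses, which is why keeping the sample size visible matters.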
 
Survivorship Bias

In his 2014 book, How Not to Be Wrong, author Jordan Ellenberg recounts the story of Abraham Wald, a mathematician who worked for the Statistical Research Group for the U.S. during World War II. Wald’s team was presented with a problem: too many Allied planes were being shot down on their missions over Germany. Military officials wanted to add reinforcing armor to the planes, but weren’t sure where to place it. Data was collected from the planes that returned safely, and officials decided to add armor to the places with the most bullet holes (the fuselage).

But Wald had a different idea. He suggested the armor should be added in the locations where there were no holes. His reasoning: the data accounted only for the planes that returned home safely and included nothing from the planes that failed to return. The bullet holes on the surviving planes therefore showed where an aircraft could tolerate a direct hit, while the untouched areas were likely the ones where a hit brought a plane down.

While there is no official count of how many planes Wald helped save, his strategy for reinforcing military planes was used through both the Korean and Vietnam wars.

This is an example of survivorship bias: the tendency to focus on the things that make it past some form of selection while overlooking the things that did not.

When reviewing reports or data dashboards, it is natural to narrow our focus to the information that is presented and available. It takes more effort and intent to consider the information that is not there. But doing so can improve how the data is analyzed and how decisions are made.

An example of survivorship bias in casino marketing is using database demographics to determine media spend. The typical method is to align media buying with the demographic attributes that most closely match the existing database. Much like the missing bullet holes, however, this approach ignores the people who are not visiting the casino, and with them a potential opportunity to attract a new demographic or appeal to new audiences altogether.

The effects of survivorship bias can be mitigated by being mindful of its presence and diligent in the analysis of data. Questions like “what information is missing?” or “what information is not included that could present a more complete picture?” can be helpful.

Data is a powerful tool. But like any tool, the results and benefits it yields depend on proper use. Effectively analyzing any form of data requires equal measures of diligence, curiosity and patience. Fostering these attributes can improve how data is used and the overall decision-making process.

Dan White is Principal of Dan White & Associates, a casino marketing and strategy consulting firm. He can be reached by calling (360) 890-1433 or email [email protected].