Have you ever won the National Lottery jackpot? Me neither. With the odds now stacked against you at 45 million to one, it’s highly unlikely you ever will.
Despite these lousy odds, you stand a much better chance of winning the National Lottery jackpot than of finding a pre-built business intelligence report that answers your specific data question at a given moment.
If that sounds outlandish, consider a mobile phone retailer that sells 100 phone models and offers 30 calling plans and 136 contract lengths to 200 customer segments through 2,300 stores. Against this relatively simple data set, the number of potential questions a business person could ask is a whopping 187,680,000,000. (That’s not a typo – we are talking billions here.)
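The headline figure is simple combinatorics: treating each question as one combination of values drawn from the retailer’s dimensions, the count is just the product of the dimension sizes. A minimal sketch (the dimension names mirror the example above):

```python
from math import prod

# Dimensions of the hypothetical phone retailer's data set
dimensions = {
    "phone models": 100,
    "calling plans": 30,
    "customer segments": 200,
    "stores": 2300,
    "contract lengths": 136,
}

# One "question" per combination of dimension values
combinations = prod(dimensions.values())
print(f"{combinations:,}")  # 187,680,000,000
```

Add a single new dimension – say, ten sales channels – and the question space grows tenfold, which is why pre-built reports can never keep pace.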
This means the business intelligence and analytics industry is having to confront a difficult home truth: it’s been solving the wrong problem. Let me explain.
Complexity isn’t the real problem
Ask any of your data analysts or business intelligence power users what they struggle with most. They won’t tell you that the complexity of incoming questions is the problem. In fact, most data questions put to them are pretty simple – like “How many Samsung handsets did we sell through our Liverpool stores last month?” The real problem the data experts face is responding fast enough to the unrelenting stream of questions coming from your executives, managers and frontline workers. Many are only slight incremental variations on an initial query. As far as the analytics system is concerned, however, each variation requires a new query to be constructed.
Despite this, for decades, analytics technology vendors and their customers have been obsessed with the problem of complexity. New systems continued to roll off the production lines with elaborate drag-and-drop user interfaces and hundreds of buttons to handle many different query scenarios. Most of these systems required a data expert to set up a multidimensional dataset, an “OLAP cube,” in order to isolate and analyse selected data sources.
The average business person doesn’t have the time, skill or attention span to learn or use software this complicated. Most just want to be able to ask simple questions themselves and get instant answers. If they can’t, people lose interest and abandon these systems. According to Kissmetrics, 47% of consumers expect a web page to load in two seconds or less. Google’s page speed industry benchmark is around 2-3 seconds.
This consumer expectation of ultra-fast responsiveness is a key reason why so-called “self-service” analytics technologies still wind up on the shelf. They might be self-service for data professionals, but not for business people. It explains why BI and analytics adoption levels are still very low – currently at 32%, having held at only 20-21% for the previous decade, according to Gartner.
Service industries are overwhelmed by data queries
British service industries, which now comprise roughly 80% of GDP, are full of knowledge workers who need access to specific data to do their jobs effectively. In healthcare for example, you have insurance brokers, hospital administrators, GPs and countless other roles that each have very distinct ways of working, measuring performance and compliance rules. This is why healthcare analytics teams are overwhelmed by the volume of queries in relation to providers, patients, admittance dates, patient conditions, and specific practitioner interactions.
Analytics teams in financial services companies, too, get bogged down with demand for data answers. They face an overwhelming number of potential questions due to the many variables they analyse: individual credit card transactions, products, income brackets and many other customer demographics. There’s no way IT-generated reports and dashboards can answer all the possible combinations of questions that arise from all these variables.
The odds are improving
Finally, the market is moving to a new model known as ‘augmented analytics’ that is dramatically simplifying the end user experience. Gartner’s 2019 Magic Quadrant for Analytics and BI Platforms predicts that by 2020, 50% of queries will be generated by search, natural language processing or voice, or will be generated automatically. Both data leaders and technology builders have recognised that they must give domain experts and business users free rein to explore data and answer their own questions.
Modern augmented analytics systems don’t just benefit your non-technical business people. Data teams, freed from the shackles of endless report generation, have the bandwidth to do the work they were hired for – data modelling, ETL, and advanced and predictive analysis – all critical differentiators for the long-term success of today’s enterprise.
We accept that giving everyone in a company access to fast data answers isn’t just about adopting new technology. It’s a major cultural shift that requires strong leadership, a level of transparency that might at first feel uncomfortable, and trust in people to act appropriately on data insights. You have to drive this from the top.
If you’re ready to sponsor the kind of change that will improve your people’s odds of winning the analytics lottery, the systems are out there. All you need to do is start thinking differently about how you and your team evaluate them. Let your non-technical business users take the system for a spin and see how it performs on simple, everyday questions.