These days every organisation seems to want to “do” predictive analytics. Maybe it’s the same in yours? There seem, however, to be two key barriers to actually doing it: a lack of understanding of what “it” is, and a lack of understanding of how it can be used to benefit a business. To better understand how these barriers might be overcome, I recently set myself a challenge. Here’s what I discovered.
“Build a predictive demo that takes the output of a complex predictive algorithm operating on a mass of data and automatically distils it down to essential, actionable insight that non-analytical experts can understand and use.”
This is the “Holy Grail” and future of predictive analytics. The use of artificial intelligence and machine learning will eventually mean organisations can simply pose business questions to the enterprise systems of tomorrow and get actionable insights in return. Before we get to that brave new world, the demo featured below will at least enable you to test the waters and see how it can accelerate accurate decision-making.
The subject of the demo was marketing analytics, specifically analysis of the impact of TV advertising across a range of brands over the course of a year. This type of analysis is often something businesses outsource to marketing agencies or large management consultancies. By the time the data has been passed from pillar to post, the resulting reports can be dated and therefore of questionable value.
Instead of spending money with vendors to gain marketing insights, why not utilise the mass of “big data” collected by your organisation, e.g. website clickstream information across brand microsites, to derive insight automatically and in real time? This type of data was used as the input to the demo: website visits for seven different brands and associated TV advertising for one selected brand, for each of the 365 days of 2015. Whilst this isn’t strictly “big data”, the use case could be scaled to a level where the analysis might need the likes of Hadoop, Spark or HANA Vora (but that’s a topic for another post).
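To make the shape of that input concrete, here is a sketch of how such a dataset might look. The column names and the use of synthetic random numbers are my own illustrative assumptions (the demo itself was fed real clickstream and advertising data); Python and pandas stand in for the actual source system.

```python
import numpy as np
import pandas as pd

# Hypothetical layout of the demo's input: one row per day of 2015,
# daily web visits for seven brands, plus TV spend for the selected brand.
rng = np.random.default_rng(0)
days = pd.date_range("2015-01-01", "2015-12-31", freq="D")  # 365 days

data = pd.DataFrame({"date": days})
for brand in "ABCDEFG":
    # Synthetic visit counts standing in for real clickstream data
    data[f"visits_{brand}"] = rng.poisson(1000, size=len(days))
# Synthetic daily TV advertising spend for the selected brand (brand A)
data["tv_spend_A"] = rng.gamma(2.0, 500.0, size=len(days))
```

In the demo this table arrived as a CSV, but any tabular source with one row per day would serve equally well.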
To build the demo, I used the latest version of “SAP Predictive Analytics”, specifically the “Expert Analytics” tool. This enabled me to create an entirely bespoke predictive app using the ‘R’ statistical programming language and quickly integrate it with the standard software.
If you are interested (here comes the maths bit!), I decided to tackle this challenge using cross-correlation analysis of each time series against all the others. I built the algorithm to let the user adjust the significance thresholds, so that the results could be categorised into those that showed a significant relationship and those that didn’t.
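The demo itself was written in R, but the core idea can be sketched in a few lines of Python. The function names, the lag window and the use of the common z/√n confidence bound (the bound R’s `ccf` plots by default) are my own illustrative choices, not the demo’s exact code; the adjustable `z` plays the role of the user-tunable significance threshold.

```python
import numpy as np

def cross_correlation(x, y, max_lag=14):
    """Normalised cross-correlation of two series at lags 0..max_lag.
    A peak at lag k suggests movements in x precede movements in y by k days."""
    x = (np.asarray(x, float) - np.mean(x)) / np.std(x)
    y = (np.asarray(y, float) - np.mean(y)) / np.std(y)
    n = len(x)
    return np.array([np.sum(x[: n - k] * y[k:]) / n for k in range(max_lag + 1)])

def significant_lags(x, y, max_lag=14, z=1.96):
    """Lags whose correlation exceeds the approximate confidence bound z/sqrt(n).
    z is the user-adjustable significance threshold (1.96 ~ 95%)."""
    r = cross_correlation(x, y, max_lag)
    bound = z / np.sqrt(len(x))
    return [(k, float(r[k])) for k in range(max_lag + 1) if abs(r[k]) > bound]
```

Running `significant_lags` for every pair of series, and keeping only the pairs with at least one significant lag, is essentially the categorisation step described above.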
The output of the predictive app was a simple visualisation showing the relationship between each brand’s web traffic and TV advertising for the selected brand. To simplify this information further, I also created a dynamic text box that interpreted the analysis and presented it to the user in plain English. You can see the result below. All the user needed to do was:
a) Load the data (in this case a CSV but could also be a HANA connection, for example).
b) Filter out the data columns that were not needed in the analysis.
c) Select the “driving factor”, which was of course the TV advertising.
d) Run the algorithm and view the output.
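The four steps above, together with the plain-English text box, can be sketched end to end. This is a simplified Python stand-in for the R app (which for brevity tests correlation at lag 0 only, where the real app scanned a range of lags); the column names are hypothetical, and the synthetic data deliberately builds in a “halo” effect on two brands so the summary has something to report.

```python
import numpy as np
import pandas as pd

def summarise_impact(df, driver, targets, z=1.96):
    """Correlate the driving factor with each target series and return
    plain-English statements for the significant relationships only."""
    bound = z / np.sqrt(len(df))  # approximate significance bound
    lines = []
    for col in targets:
        r = np.corrcoef(df[driver], df[col])[0, 1]
        if abs(r) > bound:
            direction = "uplift" if r > 0 else "drop"
            lines.append(f"TV advertising shows a significant {direction} in {col} (r = {r:.2f}).")
    return lines

# a) Load the data (a CSV in the demo; synthetic here for a runnable sketch)
rng = np.random.default_rng(42)
tv = rng.gamma(2.0, 500.0, size=365)
df = pd.DataFrame({"tv_spend_A": tv})
for brand in "ABCDEFG":
    halo = 0.2 * tv if brand in "BE" else 0.0  # built-in halo on brands B and E
    df[f"visits_{brand}"] = 1000 + halo + rng.normal(0, 50, size=365)

# b) Filter to the columns needed in the analysis
targets = [c for c in df.columns if c.startswith("visits_")]

# c) Select the driving factor, then d) run the algorithm
summary = summarise_impact(df, "tv_spend_A", targets)
```

Each string in `summary` corresponds to one line of the dynamic text box: statistical output reduced to a sentence a non-analyst can act on.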
Algorithm setup – drag and drop
The actionable insight was that, counterintuitively, the TV advertising for brand A in fact led to a positive uplift in web traffic to brands B and E, rather than to brand A itself. Possibly this was because the popularity of brand A meant interest was already saturated, but there was a measurable “halo” effect on two other brands! The practical implication is that demand forecasts for brands B and E should be adjusted upwards whenever brand A is advertised on TV, thus avoiding stock-outs.
The cost saving to you
If there were hundreds of brands rather than seven, a lot of time could be saved, avoiding the head-scratching process of plotting the data for each brand and manually interpreting it. Unnecessary spend could also be avoided by running the automated analysis instead of outsourcing it to marketing analytics vendors. Although this was a quick demo, it clearly illustrates how predictive analytics can be “tamed” and simplified, removing the need for an expert to interpret statistical summary results.
There are several options for realising this type of capability and they needn’t be expensive or take a long time to implement. Often a quick and inexpensive “proof of concept” can be put together over a few weeks to address a selected business challenge and assess the potential value of predictive analytics to your business – so what’s stopping you?