Five Tips for Implementing Marketing Analytics

Competing on Analytics

The book Competing on Analytics by Thomas Davenport and Jeanne Harris is a short but very interesting read about the need for organisations to significantly improve their analytical capabilities if they want to compete in the modern marketplace. The argument, quoting the authors directly, is that:

In today’s global and highly interconnected business environment, traditional competitive differentiators–like geography, protective regulation, even proprietary technology–are no longer enough. What’s left is the opportunity to execute a business with more efficiency and effectiveness than your competitors, and to make the smartest business decisions possible. Analytics can help do this. 

In other words, unless you are implementing and using advanced analytics, you’re going to be left behind, because the traditional differentiators no longer give you any sort of advantage. NB: I don’t discuss Big Data in this post at all, as it’s more about the why than the how. That’s probably a post for another time.

One of my favourite quotes from the end of the book is:

Analytical competitors will continue to find ways to outperform their competitors. They’ll get the best customers and charge them exactly the price that the customer is willing to pay for their product and service. They’ll have the most efficient and effective marketing campaigns and promotions. Their customer service will excel, and their customers will be loyal in return. Their supply chains will be ultraefficient, and they’ll have neither excess inventory nor stock-outs. They’ll have the best people or the best players in the industry, and the employees will be evaluated and compensated based on their specific contributions. They’ll understand what nonfinancial processes and factors drive their financial performance, and they’ll be able to predict and diagnose problems before they become too problematic. They will make a lot of money, win a lot of games, or solve the world’s most pressing problems. They will continue to lead us into the future.

What a great place to be! But of course it’s not as simple as that. So below are a number of tips I’ve picked up over the years from trying to implement this sort of thing myself, or from watching clients try to do the same.

1. Fit the implementation to your real needs. One of the first things to note is that a lot of the examples of success that they give in the book – Amazon, Netflix, Google and so on – are big companies. These organisations have millions of customers and, at that scale, the benefits of well-embedded customer analytics are obvious – Tesco Clubcard is another great example of significant profits generated through analytics.

So firstly, I think it’s a bit of a struggle implementing many of their ideas when you’re running at a much smaller scale – even if you thought it beneficial. One of the companies I worked at a few years ago had a total potential market of precisely 29 companies. If we wanted to know anything about these companies we didn’t look to data analysis to understand patterns of behaviour, we went to see them and asked!

2. Start Simple. A second, more subtle point is about making sure you pick off the low-hanging fruit before moving on to anything advanced. One of the questions I was asked at a job interview, a long time ago now, was to do with detecting fraudulent activity on bank accounts. The question was: “If we gave you the data showing the amounts going in and out of a bank account each day for the last 3 months, what’s the first thing you would do to try and spot fraudulent activity?” I’d finished a maths course just 2 months before, so I launched into a tirade about algorithms for picking out outliers, spotting complex patterns in the data, weekly seasonality calculations, de-trending the data and so on. After a few minutes of this, the interviewer interrupted and said, “Mm. I’d probably just draw a graph and see what was there.” I did get the job in the end, and found out that a reasonable proportion of the “advanced analytics” needed to detect banking fraud was really very simple algorithms for spotting outliers, pretty close to what you’d be doing by eye (think “3 standard deviations away…”).
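To make that concrete, here’s a minimal sketch of the “3 standard deviations” check described above, the sort of simple rule that catches a surprising amount of what you’d otherwise spot by drawing a graph. The account history and threshold are illustrative assumptions, not real banking data.

```python
# Flag daily account movements that sit far from the mean -- a crude
# stand-in for the simple outlier checks used in fraud detection.
from statistics import mean, stdev

def flag_outliers(daily_amounts, n_sigma=3.0):
    """Return (day_index, amount) pairs more than n_sigma standard
    deviations from the mean of the series."""
    mu = mean(daily_amounts)
    sigma = stdev(daily_amounts)
    return [(i, x) for i, x in enumerate(daily_amounts)
            if abs(x - mu) > n_sigma * sigma]

# 89 days of unremarkable activity, then one suspicious movement
history = [100.0] * 89 + [5000.0]
print(flag_outliers(history))  # only the 5000 is flagged
```

It’s deliberately naive (no seasonality, no de-trending), which is rather the point: start here, and only add the clever stuff once the simple rule stops being good enough.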

An example from the world of marketing – you might be trying to figure out “What type of customer in our CRM system is more likely to stay with us long term, continuing to buy products? (i.e. high LTV)”. There might be a lot of subtle factors here, but there could be some real no-brainers. For example, I’d suggest that customers who have spent a lot with you in the first 3 months are more likely to spend a lot in the future (because the initial spend is indicative of certain levels of budget and/or appetite for your products) compared to someone who spent very little. Of course you have to test this in the data (because it could be completely wrong – see the quote at the end about the blocks to implementing analytical thinking), but if you wanted a simple model for “Which accounts should we spend more time with?”, a simple binary model of “Spent more than $100k with us” vs. “Spent less than $100k” might be a good first step!
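That binary model really is as simple as it sounds. Here’s a sketch, where the $100k threshold, field names and account data are all illustrative assumptions rather than a real CRM schema:

```python
# Segment accounts on a single rule: did first-3-month spend cross a
# threshold? A crude first-pass proxy for likely long-term value.
HIGH_VALUE_THRESHOLD = 100_000

def segment_account(first_quarter_spend):
    """Binary 'which accounts should we spend more time with?' model."""
    if first_quarter_spend >= HIGH_VALUE_THRESHOLD:
        return "high_priority"
    return "standard"

# Hypothetical accounts with their first-quarter spend in dollars
accounts = {"Acme Ltd": 250_000, "Smallco": 4_500, "Bigcorp": 120_000}
priorities = {name: segment_account(spend) for name, spend in accounts.items()}
print(priorities)
```

The value isn’t in the code, obviously – it’s in testing against your data whether the threshold actually predicts anything before layering on subtler factors.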

3. Get some early wins. There is always a (not completely unreasonable) objection to analytical marketing along the lines of “This stuff is all well and good, but if we spend more time just putting together some great ads, some great messages, reaching out to the community, running some great promos and so on, then the money will flow. We’ll worry about analysing the detail later on”. Sometimes this comes from a certain mindset (again, mentioned in the quote below), about “the power of ideas over data”. Sometimes it comes from seeing failed implementations. I have a lot of sympathy with the latter – it is very easy to spend an enormous amount of time and money on this sort of project – time that could be spent elsewhere in the business – and see precisely no advantage at the end.

One way of combating this problem is to make sure you get some early wins, even if these aren’t the primary purpose of your project. For example, you may have a vision of an all-encompassing CRM system that knows exactly the right email to send at the right time to the right customer (“Dave, we know you’ve been enjoying our product for 19 days now and that your boss, Helen, is interested in how the product could help her with regulatory problems in the textiles industry – here’s a whitepaper answering her questions and a quote for a price that I know exactly fits her budget for this quarter.”). But if you wait for utopia, without showing some earlier, simpler wins, you’ll be working for years battling off disgruntled executives with other priorities. So better to pick an early problem where you know you can win. If you’ve never sent any segmented emails at all, then start with a simple opportunity (like sending different emails to new vs. existing customers based on some analysis), then prove that this has had a positive impact on either outputs (such as click rate) or, (much) better still, revenue. Once you’ve proven a positive return, move on to the next stage with something a bit more clever. I’d use the phrase “Build success on success”, but it sounds far too cheesy.
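A sketch of what that first segmented campaign might look like: split the list into new vs. existing customers, send each segment a different template, then compare click rates afterwards to prove the win. The 90-day window, template names and campaign numbers are all illustrative assumptions.

```python
# Simple new-vs-existing email segmentation, plus the output metric
# (click rate) you'd use to show the early win paid off.
from datetime import date, timedelta

TODAY = date(2024, 6, 1)
NEW_CUSTOMER_WINDOW = timedelta(days=90)

def choose_template(first_purchase_date):
    """New customers get an onboarding offer; existing ones get news."""
    if TODAY - first_purchase_date <= NEW_CUSTOMER_WINDOW:
        return "welcome_offer"
    return "loyalty_update"

def click_rate(clicks, sends):
    return clicks / sends if sends else 0.0

# After the campaign: did segmentation move the needle vs. last month?
print(f"new: {click_rate(42, 300):.1%}, existing: {click_rate(51, 700):.1%}")
```

Measuring against revenue rather than click rate is harder but, as above, far more persuasive to the sceptics.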

4. Analyse off-line first, on-line later. There’s a great model for step-wise implementation of CRM analytics, along the lines of:

a. No analytics – purely transactional CRM system.
b. Offline analytics (e.g. showing that you need to treat new and existing customers differently), implemented manually (sending emails by hand once a week).
c. Offline analytics implemented automatically (e.g. the above analysis embedded into the CRM/email system such that different customers automatically get the right emails. Or offline analysis shows that certain types of customers prefer particular products/prices, so these are hard-coded in the system).
d. On-line analytics, automatically adapting based on data coming in. For example, your initial model might say that “Customers who come in from Northern Europe are more likely to purchase product X” but, as data comes in, you find this changing over time. The models in the CRM automatically pick up that Southern Europe is now more likely to purchase product X, and adapt to offer this product to those customers instead.

The last option is, for most, a complete pipe-dream, and can actually be quite dangerous. Therefore I’d strongly recommend a model where analysis is carried out off-line, on sets of data, and the conclusions from that analysis are then proven online. E.g. your offline work shows that customers in China are more price-sensitive than those in India? Implement this by just hard-coding different prices for the two countries and seeing how you get on.
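Stage (c) of the model above – an offline conclusion hard-coded into the live system – can be as plain as a lookup table. A sketch, with country codes and prices as illustrative assumptions:

```python
# Offline analysis suggested different price sensitivity per country,
# so the live system just looks up a fixed price. Nothing adapts on its
# own -- the table only changes when the next offline analysis says so.
COUNTRY_PRICES = {"CN": 79.0, "IN": 99.0}
DEFAULT_PRICE = 99.0

def quote_price(country_code):
    """Return the hard-coded price for a country, or the default."""
    return COUNTRY_PRICES.get(country_code, DEFAULT_PRICE)

print(quote_price("CN"), quote_price("DE"))  # 79.0 99.0
```

The deliberate dumbness is the safety feature: if the offline conclusion turns out to be wrong, you edit one table, rather than debugging a self-adapting model that has drifted somewhere unexpected.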

5. Ignore all of the above if you’re working in a SaaS environment! A lot of the caution above stems from trying to run before you can walk. As I say, if you only need to sell 2 widgets this month to turn a profit, your time might be better spent phoning up all of your potential clients, rather than analysing their behaviour to the nth degree. But the SaaS world disrupts this approach. If you’re not analysing behaviour, interactions, price sensitivity, churn rates, click-through rates, conversion rates and so on from the start, then you’re putting yourself in a very precarious position. Customer service and sensitivity to customer needs are everything in the SaaS world, and if customers aren’t getting what they want, they’ll leave you and move elsewhere (the downside of easy adoption is easy rejection).
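Of the metrics listed above, churn is the one to watch from day one in SaaS. A minimal sketch of the simplest version, monthly customer churn, with the counts as illustrative assumptions:

```python
# Monthly customer churn: what fraction of the customers you started the
# month with cancelled during it. (Revenue churn is the obvious next step.)
def monthly_churn_rate(customers_at_start, customers_lost):
    """Fraction of the month's starting customers who cancelled."""
    if customers_at_start == 0:
        return 0.0
    return customers_lost / customers_at_start

print(f"{monthly_churn_rate(1200, 66):.1%}")  # 5.5%
```

Even tracked by hand in a spreadsheet, the trend in this one number tells you whether “easy rejection” is happening to you.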

Hopefully some of these tips are vaguely useful. With regard to the book – it describes some interesting principles, and is good at a certain (quite high) level, though it is a little short on detail. Nevertheless, I thought I’d end with my favourite quote, about the factors that hinder adoption of advanced analytics in organisations:

Gary cites four common factors that hinder analytical competition: deeply embedded conventional wisdom that has been around for so long, it’s hard to reverse; decision making–especially at high levels–that fails to demand rigor and analysis; employees themselves who are not willing or equipped to do analytic work; and the power of ideas over data.
