The Marketing Analytics Intersect
 

On some level you know this: Nothing in life is free.

It is true. Sadly, not even data is free.

It pains me deeply to see how companies large and small fund very large data collection/reporting projects. Decision makers (usually senior non-data experts) move forward with nary a consideration of the cost-benefit analysis.

These senior non-data experts often don't have the skills to conclude that the 58 metrics being proposed are full of... full of... I was going to say crap but that sounds rude... how about... full of questionable signal quality.

It is rare that a data expert evaluates an expansive and complex data project and then approves it even though the project will result in suspicious signal quality. When that does happen, it is usually because the data expert manager or director wants a promotion/bonus and works in an org where promotions are based on team size and number of projects.

The reason that ignoring cost-benefit of any data project pains me is that in the end, everyone loses.

1. The company loses because it invested a lot of money in that measurement project.

2. The decision makers lose because either they can't take any action based on the questionable signal quality, or they take action and terrible results follow.

3. The Analyst, or Agency Analytics Director, loses because at some level they knew going in that the project was a waste of people, time, and money, and that knowledge exacts an emotional cost on their heart and soul.

Lose-Lose-Lose.

#heartbreaking

Here are the costs you should consider before signing off on any analytics project.

***

1. The size of business expenditure.

How much money are you spending on the thing you are planning to measure?

For Marketing this is fairly easy to figure out. You want to measure the messaging vibrancy of your Facebook campaign. Okay. How much money are you spending on this campaign? $150,000. Excellent.

Don't measure messaging vibrancy. It turns out messaging vibrancy is very hard to measure - even if it is cleanly defined. The size of that spend does not justify investing in this analytics project. Just measure Conversation Rate and Amplification Rate and call it a day.

If you were going to spend $1,500,000, the decision changes.

Apply that thinking to your Finance team wanting to measure Customer Lifetime Value - what is the size of spending decision that project is expected to influence? (Even better, what is the size of the Profit that project is expected to influence?)

Consider the size of spend your Comms team is putting into a High Value Influencers project. The project may be great, but is their desire to measure change in perception justified by the $275k they are putting into the project?

The size of business spend is a proxy for: How important is this initiative in the grand scheme of things?

If the answer is the initiative will be 20% of the entire year's marketing (or whatever) spend, then certainly consider a commensurate investment in a complex analytics project.

If the answer is that the initiative will be a small part of the entire budget, use the highest quality of data that's already available and call it a day.

It might only be a Win-Small Loss-Win. But, it is the choice a smart Extremely Senior Leader (ESL) would make.
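The size-of-spend filter above can be sketched as a simple decision rule. A minimal sketch; the 20% threshold comes from the text, while the budget figures and the function name are illustrative assumptions:

```python
# A minimal sketch of the size-of-spend filter. The campaign and annual
# budget figures below are illustrative assumptions, not real numbers.

def analytics_recommendation(initiative_spend, total_annual_spend,
                             share_threshold=0.20):
    """Return the initiative's share of budget and a rough recommendation."""
    share = initiative_spend / total_annual_spend
    if share >= share_threshold:
        return share, "consider a commensurate, complex analytics project"
    return share, "use the best data already available and call it a day"

share, advice = analytics_recommendation(150_000, 5_000_000)
print(f"{share:.0%} of annual budget -> {advice}")
```

Run it with $1,500,000 instead of $150,000 and the recommendation flips, which is exactly the point: the same measurement question deserves a different analytics investment depending on the spend it sits on top of.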

***

2. The duration of the initiative.

Collecting good data takes time. Good data is also made up of a whole ton of high quality signals (which take time to accumulate!).

If your initiative (Marketing, CRM, ERP, Customer Love) runs for 15 days it will yield less data than if it runs for a month.

Less data translates into a lower chance of being able to separate signal from noise.

This is true even if you are spending $300,000 on the data analysis.

Some limits are God-created.

Hence, computing Lifetime Value or Data-Driven Attribution on an initiative with three months of data is a complete waste of analytics investment. It is better to throw a large party in your office with that money (employee satisfaction has a causal relationship with company revenue!).

In my experience few ESLs take into account the amount of time it takes to collect data that might yield a quality signal.

The duration of an initiative is a proxy for: Will the confidence interval be wide or narrow? If we segment this data, does it instantly lose all significance?

The smart ESL, and true Analysis Ninjas, will take this into consideration as she/he invests in measurement.
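
The wide-or-narrow question can be made concrete with the normal-approximation confidence interval for a conversion rate, whose width shrinks roughly with the square root of sample size. A minimal sketch; the daily-visit count and the 2% conversion rate are illustrative assumptions:

```python
# A minimal sketch of why duration matters: the 95% confidence interval
# for a proportion narrows with the square root of sample size.
import math

def ci_half_width(p, n, z=1.96):
    """Half-width of the normal-approximation 95% CI for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

daily_visits = 1_000   # assumed traffic
p = 0.02               # assumed true conversion rate
for days in (15, 30, 90):
    n = daily_visits * days
    print(f"{days:>3} days: n={n:>6}, CI = {p:.2%} +/- {ci_half_width(p, n):.3%}")
```

Doubling the duration does not halve the interval; it only shrinks it by a factor of about 1.4. Segment that data four ways and each segment's interval widens right back, which is why short initiatives rarely survive segmentation with any significance intact.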

***

3. The total cost of analytics for the project.

You should obviously compute the raw cost of any analytics project. Most people stop adding after identifying the tool they have to buy or the new database they might have to turn on in the Cloud or other such (usually minor) costs.

Here's what makes up the total cost of analytics:

Agency costs billed for producing proposals for measurement projects.
+
Per hour employee compensation costs for each employee evaluating aforementioned proposals.
+
Cost of collecting data (including tools, vendors, platforms).
+
Cost of tools for auditing various platforms/processes collecting data.
+
Cost of correcting mistakes in collecting data.
+
Agency costs billed and/or internal employee salaries for reporting the data.
+
Agency costs billed and/or internal employee salaries for the multiple revisions to the analysis and building the final PowerPoint as well as Excel models (or data pukes!).
+
Agency costs billed and/or per hour compensation of the internal employees who participate in the review of, and deeper Q&A on, findings delivered in the above PPT or XLS.
=
Total Cost of Analytics on Project.

For ROI, compare that to:

Incremental Revenue (or Profit!) from actions recommended by the Analytics Project.

Or

Net Cost Savings from actions recommended by the Analytics Project.
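
The tally above is just a sum, but writing it out forces you to put a number on every line. A minimal sketch; every dollar figure here is an illustrative assumption, not a benchmark:

```python
# A minimal sketch of the total-cost-of-analytics tally from the list
# above. All figures are illustrative assumptions; plug in your own.

cost_items = {
    "agency proposals": 15_000,
    "employee time evaluating proposals": 8_000,
    "data collection (tools, vendors, platforms)": 40_000,
    "auditing tools for platforms/processes": 10_000,
    "correcting data collection mistakes": 12_000,
    "reporting (agency and/or internal)": 25_000,
    "analysis revisions, PPT and XLS builds": 20_000,
    "review meetings and deeper Q&A": 10_000,
}

total_cost = sum(cost_items.values())
incremental_profit = 180_000  # assumed profit from recommended actions

roi = (incremental_profit - total_cost) / total_cost
print(f"Total cost of analytics: ${total_cost:,}")
print(f"ROI: {roi:.0%}")
```

Notice how small the "tool we have to buy" line is relative to the people costs around it; that is precisely why stopping the tally at the tool purchase understates the true cost so badly.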

The total cost of analytics is a proxy for: Are you sure after all this money we will learn something worth learning?

It is ok if, on occasion, the answer is no. It is not ok that the answer is often no, nor is it ok that you don't know what the answer is.

***

Bottom line: Some questions don't need expensive answers. Expensive answers have expansive time and signal requirements. Not all expensive answers are smart.

-Avinash.

PS: It might seem bizarre that the best-selling author of two books on analytics, and of the world's most popular blog on digital analytics, is asking you not to measure. (Sometimes not measuring is indeed a smart strategy.) Sadly, far too often I've seen bad data cause massive and often permanent harm to good data initiatives. I would rather you measure less, but everything you do measure - after the three filters above - will have a credible and glorious impact on your business.
