Advertising avoidance is the default for most consumers. So why do most copy-testing methods insist on forcing exposure to ads, and then measuring consumers' reactions?
Imagine the scenario. You sit down to watch a show, then turn it off halfway through. What did you think of it? It was boring, the characters were confusing, and the acting was weak. You just didn't enjoy it, so you exercised your right to turn it off and moved on to the next interesting-looking thing. That was awful too, so you skipped that and went to do something else, or turned to the PVR and watched the old episode of The Simpsons that you love. Sound familiar?
What about when the ads come on? You watch them in the same way, right? If you're in the room at all when they come on, the ads you watch are the ones that grab your attention, are funny, are likeable or say something to you, and if the brand is lucky, you might watch one all the way to the end. That's the real world, and that's how it works. The same goes for ads on your laptop, in the street or in your favourite magazine (hey, I still like a good piece of printed copy, so shoot me!). And what about ads on your phone? Ugh. I've got an ad blocker for that, thanks very much. Why should I let the marketing leeches suck up my data, spoil the article I'm reading or make me wait for the video I want to watch? "But the ads pay for the content," I hear some of you say (and I have some sympathy). But I'd argue that we in the media world created this screwed-up scenario by giving all our good stuff away years ago, and then expecting reciprocity by cramming our programmes and sites full of terrible marketing in a bid to keep the lights on. But that's a rant for another day.
So if that's the real world, why would you measure consumers' reactions to ads by forcing exposure and then asking people what they think? In an already artificial scenario, I can't think of a worse way to gauge reactions to creative. Making people watch a video all the way to the end and then asking them about it is not the real world. To make the artificial scenario as realistic as possible, copy tests need to stop forcing exposure on people and instead show the ads in a natural environment. So put your ad in a blind reel. Let people zap away from it when they want and move on to the next one. Do the same for banners: put them in amongst other ads across a range of websites and measure what gets noticed, and what people recall of the ads they saw, in their own words. Print, radio, outdoor or anything else, the same applies: build a natural environment and make your ad stand on its own feet amongst a range of ads. Measure what people see, when they see it, how much they see and what they remember, and hey presto, you have a whole set of vital new KPIs that can help you judge the chance of success, optimise what works and what doesn't, and tweak your media plan.
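As a rough illustration only (not the author's actual tooling or methodology), the kinds of KPIs described above, what gets noticed, for how long, and what is later recalled, could be aggregated from a natural-exposure test log along these lines. Every field and function name here is a hypothetical assumption:

```python
# Illustrative sketch: aggregating simple per-ad KPIs from a hypothetical
# natural-exposure log (blind reel / mixed-banner test). All names are
# assumptions for illustration, not a real research platform's schema.
from dataclasses import dataclass

@dataclass
class Exposure:
    ad_id: str             # which ad the respondent was exposed to
    noticed: bool          # did they register the ad at all?
    seconds_viewed: float  # how long before they zapped away
    recalled: bool         # did they later recall it in their own words?

def exposure_kpis(log, ad_id):
    """Return noticing rate, mean viewing time and recall rate for one ad."""
    rows = [e for e in log if e.ad_id == ad_id]
    n = len(rows)
    if n == 0:
        return {"noticing_rate": 0.0, "avg_seconds": 0.0, "recall_rate": 0.0}
    return {
        "noticing_rate": sum(e.noticed for e in rows) / n,
        "avg_seconds": sum(e.seconds_viewed for e in rows) / n,
        "recall_rate": sum(e.recalled for e in rows) / n,
    }

# Example: four respondents who saw ad "A" in a blind reel.
log = [
    Exposure("A", True, 12.0, True),
    Exposure("A", True, 30.0, True),
    Exposure("A", False, 2.0, False),
    Exposure("A", True, 8.0, False),
]
kpis = exposure_kpis(log, "A")
print(kpis)  # noticing_rate 0.75, avg_seconds 13.0, recall_rate 0.5
```

Comparing these numbers across ads in the same reel is what lets an unforced-exposure test rank creative on its own feet, rather than on answers to questions about an ad everyone was made to sit through.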
This stuff is important. Analysis of creative tests on our database shows that the top-performing TV ads require 300 fewer GRPs to reach the same level of awareness as the worst-performing ads. That's why we want our clients to aim for the top 20% of performance, to optimise their chances of being heard in the big bad landscape of marketing. We're not creative killers; we fully believe, and can prove, that "Effective Ads Work on the Heart, Not on the Head…Efforts Appealing to Emotions Are Profit-Boosters", as noted by the IPA.
Byron Sharp believes 80% of advertising doesn't get noticed, which Dave Trott equates to "£17 billion falling in the forest with no one around to hear it", because so much advertising fails to engage and to actually be consumed by real people. And that's what we're trying to stop.
So we'll keep on testing our clients' ads through natural exposure, replicating the ad avoidance that is the default for most consumers, and our clients will keep on delivering best-in-class advertising to grow their market share and optimise their media budgets.
Contact Simon McDonald (email@example.com) for more information on how we integrate new metrics into copy and concept testing.