With so many brands and options competing for attention in-store, it’s good for researchers to get a little help understanding what gets seen.

AI FILLS IN THE BLANKS

In grocery, shoppers are bombarded with visual information every time they enter a store or shop online. Cognitive overload is a well-documented phenomenon where the brain’s ability to process information is overwhelmed by excessive stimuli. The notion that a shopper can accurately recall all the details of what they saw on a busy shelf is more myth than reality.

By putting AI to work to predict visibility, we can now learn new things about the pack, the shelf and the aisle. Analysing vast amounts of data, AI provides insights into what people see when they shop and, just as importantly, what they miss. It is an enhancement that solves the recall problem without the complexity and cost of full eye-tracking.

DVJ, in collaboration with Neurons, now embeds AI into all of our methods. This includes Shopper, where AI contributes extra insight into aisle, shelf and brand visibility.

WALKING THE AISLE

Second-by-second visibility capture using AI provides critical insights into shopper behaviour in supermarket aisles. This approach enables retailers to accurately track which products shoppers notice and interact with in real time. By gathering detailed visibility data, retailers gain a deeper understanding of how product placement, packaging and shelf layout influence consumer attention and decision-making. Traditional methods of assessing product visibility, such as post-visit surveys or static shelf audits, fall short of capturing this dynamic interaction between customers and the products on the shelves.

The findings are intuitive. High-traffic areas and eye-level shelves typically garner the most attention, while products on lower or upper shelves are often overlooked. Items with distinctive packaging or those positioned near popular products also attract more views. AI enables retailers to make informed adjustments that enhance the shopping experience and improve product visibility, ensuring that key items are seen and considered by customers. This capability is essential for optimising store layouts and increasing the likelihood of product engagement and purchase.

QUANTIFYING THE IMPACT OF PLACEMENT

We’ve grown up knowing that some shelf positions are more visible than others. Eye level, for example, is known to be prime real estate, but the delta between positions has been hard to quantify, largely because of the cost of running multi-cell consumer testing.

AI technology means that mass experimentation on shelf visibility is now possible and practical.

In the simplified example below, we took two Swedish coffee brands, Arvid Nordqvist and Lavazza, and rearranged the shelf to position each brand across six zones: one set of layouts for Arvid Nordqvist and one for Lavazza. We then ran the analysis, in this case predicting visibility twelve times and comparing the results (a simple sketch of the comparison logic follows the list). We confirmed:

  • The Lavazza pack has a more striking design and therefore enjoys greater head-to-head visibility than Arvid Nordqvist, regardless of zone
  • Central positions give brands a big advantage
  • The further from the centre of the fixture, the weaker the visibility
  • Brand visibility is also a function of the brands immediately alongside; if they are dominant, that clearly impacts the others on the shelf
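
To make the mechanics concrete, here is a minimal sketch, in Python, of how the repeated predictions might be aggregated and compared. The scores, the 0-100 scale and the data layout are all assumptions made for illustration; in practice the numbers would come from the visibility model itself.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical predicted-visibility scores (0-100) from repeated AI runs.
    # Brands and zones follow the example above; the numbers are invented.
    runs = [
        # (brand, zone, predicted visibility score)
        ("Lavazza", "centre", 74), ("Lavazza", "centre", 71),
        ("Lavazza", "edge", 58), ("Lavazza", "edge", 55),
        ("Arvid Nordqvist", "centre", 66), ("Arvid Nordqvist", "centre", 63),
        ("Arvid Nordqvist", "edge", 49), ("Arvid Nordqvist", "edge", 47),
    ]

    # Average the repeated predictions per brand and zone.
    scores = defaultdict(list)
    for brand, zone, score in runs:
        scores[(brand, zone)].append(score)

    for (brand, zone), values in sorted(scores.items()):
        print(f"{brand:16s} {zone:7s} mean visibility: {mean(values):.1f}")

    # The head-to-head delta per zone separates the pack effect
    # (Lavazza vs Arvid Nordqvist) from the position effect (centre vs edge).
    for zone in ("centre", "edge"):
        delta = mean(scores[("Lavazza", zone)]) - mean(scores[("Arvid Nordqvist", zone)])
        print(f"{zone:7s} Lavazza advantage: {delta:+.1f}")

Averaging over repeated runs, as in the twelve predictions above, smooths out run-to-run variation in the model’s output before the zones are compared.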

This data-driven approach allows retailers to make informed decisions about product placement, optimising shelf layouts to enhance visibility and drive sales.

DESIGNING FOR IMPACT

The one thing the brand owner can consistently influence is the design of the pack. This becomes even more important given the retailer’s ability to configure the planogram to their own requirements. Pack testing serves as a crucial step in understanding how consumers will react to new packaging designs before they hit the shelves.

Again, AI enhances our understanding of pack design with additional metrics:

Heat map – a visual representation that uses colours to show the intensity of attention across different areas of a pack.

Fog map – a visual representation that highlights areas of confusion or difficulty in understanding.

Areas of interest – a method to measure the impact of different visual elements on a product’s packaging as a percentage of total attention.

Start attention/end attention – similar to a heat map, breaking the prediction down into the first two seconds and then seconds three and four.

AI offers multiple ways of predicting on-pack attention
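
As an illustration of the areas-of-interest metric, the following Python sketch turns a predicted attention map into a percentage share per pack element. The attention map, the pixel boxes and the element names are all invented for the example; in a real analysis the map would come from the AI prediction and the boxes would be drawn over the actual pack design.

    import numpy as np

    # Invented attention map: one predicted attention value per pixel.
    rng = np.random.default_rng(0)
    attention = rng.random((200, 120))           # 200 rows x 120 columns

    # Invented areas of interest as (top, left, bottom, right) pixel boxes.
    aois = {
        "logo":         (10, 20, 50, 100),
        "product name": (60, 15, 100, 105),
        "claim":        (110, 30, 150, 90),
    }

    # Share of total predicted attention falling inside each box.
    total = attention.sum()
    for name, (top, left, bottom, right) in aois.items():
        share = attention[top:bottom, left:right].sum() / total * 100
        print(f"{name:13s} {share:5.1f}% of predicted attention")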

TURNING INSIGHTS INTO ACTION

As we noted in the recent article ‘Unpacking Success’, the landscape of pack testing in the FMCG sector in 2024 is rapidly evolving. Cutting-edge technologies like visibility prediction open up a new dialogue around winning at the shelf. Successful companies will be those that address these challenges head-on.

AI helps researchers because it predicts visibility, which is something people find hard to articulate for themselves. It is important to say, however, that AI does not stand alone: it cannot provide all the answers needed to drive pack design decisions. As examples:

Measuring instant recognition. The T-Scope method is excellent for market research because it precisely measures how quickly and effectively visual stimuli are processed. It involves exposing individuals to a stimulus for no more than 80 milliseconds, too brief for conscious cognitive processing. This rapid exposure assesses recognition and instant appeal, and the responses are compared with other brands and designs on the shelf.

Measuring speed of finding. This measures how quickly a product is located compared to competitors. By presenting a product within a shelf context and timing how fast it is identified, you gauge its cognitive visibility. This is particularly insightful when comparing not just brands but also sub-brands.
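
For a feel of the mechanics, here is a minimal Python sketch of the core timing logic behind a speed-of-finding task. It is a stand-in, not the actual method: a real study would display the shelf image on screen and validate where the respondent clicked, and the file name coffee_shelf.png is purely hypothetical.

    import time

    # Illustrative stand-in for one speed-of-finding trial: a keypress
    # marks the moment the target is spotted on the (hypothetical) shelf image.
    def time_to_find(shelf_image: str, target_brand: str) -> float:
        print(f"Find {target_brand} in {shelf_image} and press Enter when spotted.")
        start = time.perf_counter()
        input()                                  # stands in for the respondent's response
        return time.perf_counter() - start

    elapsed = time_to_find("coffee_shelf.png", "Lavazza")
    print(f"Found in {elapsed:.2f} s")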

It’s good to know that AI is now part of today’s pack testing process, adding another valuable metric to enhance our understanding and optimise packaging designs effectively. At DVJ, we’re known for methods built on robust science and innovation. Now, that approach has a boost from AI.

MORE INFO?

Contact the author if you would like to know more about this (or related) case(s).

Adrian Sanger

LinkedIn