Liz Buchanan is NielsenIQ North America Commercial Lead for the Consumer Intelligence Business Unit, with accountability for the sales, customer success, pre-sales engineering and custom development organizations. The views expressed here are those of the author.
Today’s consumers have more touchpoints at their disposal than could have been imagined only a few years ago, and their purchase journeys now run the gamut from in-store to online, from curbside pickup and delivery to social buying and much more. Consumer behaviors have evolved rapidly over the last 12 months, creating blind spots that leave retail trading partners sprinting to keep up.
At the crux of the race is the need for access to trustworthy, reliable, useful data, which has emerged as one of the most pressing challenges related to decision making in the fast-moving consumer goods industry today.
The past year offered plain proof: the grocery sector confronted more transformation stemming from the global pandemic than it had throughout the entire decade prior. Consequently, many businesses were — and remain — ill-equipped to tackle the many disruptions that unfolded and that will continue to unfold over the coming months and years.
Lacking the capabilities for precision decision making, many retailers are stumbling badly with their omni strategies, largely because they rely on incomplete data that fails to measure the entirety of commerce and struggles to diagnose behaviors across consumers’ entire online/offline journeys. Today, omni measurement and shopper understanding are mandatory for any omni growth strategy, yet they are often missing or inadequate.
Many companies are taking a “good enough” approach with upstart data partners that bill themselves as being nimble and agile but fail to provide the quality required for precision decision making. Others are trying to cobble together views in-house and are quickly confronting the many challenges of doing so. At this still-early stage of omni retailing, making a “good enough” choice could be prohibitively costly, if not crippling, to organizations in the not-too-distant future.
Early computer science parlance introduced one of the most enduring business truisms of all time: garbage in, garbage out, or GIGO. The same premise — that flawed input data produces incomplete and thus unreliable output — holds true in omnichannel sales measurement and shopper understanding. Even the best algorithms will fail when applied to a less-than-reliable data source, leading to omni strategies that are flawed or, worse, just plain wrong.
By contrast, NielsenIQ’s legacy of delivering trust and transparency dates to 1923, when Arthur C. Nielsen created the first-ever index to measure drug and retail store sales. By 1935, he had invented the concept of market share — arguably the most prevalent KPI in any boardroom today. While measuring market share now seems simple, it was anything but in 1935, when companies required transaction and product coding that accurately attributed sales to the right categories and brands, with limited technology available to do so programmatically.
In the decades since, NielsenIQ has innovated data coding and measurement time and again as the marketplace has evolved, including through this latest digital boom. Today’s omni commerce acceleration is not the first time the industry has been disrupted — it’s just the first time in a while.
It’s interesting to think about not just where the industry will be at the end of 2021, more than a year into the massive disruption driven by COVID-19, but where the industry will be in 2035 — 100 years after A.C. Nielsen first invented market share. Today, we hold true to his credo that “the price of light is less than the cost of darkness,” or in other words, that the companies with the best data will win.