
Data First, Then Dinner: Why Data Quality is an Ingredient All F&B Organisations Must Embrace

By Gary Chua, Managing Director, Asia Pacific & Japan, Syniti

It's taco night and you want to make a fresh serving of creamy guacamole for dinner. You inspect the avocados at your local supermarket, picking out the ones with the right colour, shape, and size to do justice to your recipe.

For everyday consumers, these are the considerations that come to mind when selecting any produce. Rarely, however, do we consider the journey a piece of produce takes from source to store, which is in fact a highly complex one.

Food and beverage companies must navigate multiple steps to bring their products to market. From ingredient sourcing and manufacturing, to testing and quality control, to inventory management and distribution, the journey is far more involved than selecting the perfect avocados and plating the freshest guacamole.

But the difference between the produce that gets selected and the produce that doesn't often lies in how effectively food and beverage organisations harness their data.

Don’t cook up a data storm
Big data is steadily revolutionising the food and beverage market, igniting a fundamental change in the way companies conduct business. In fact, the big data market in the food and beverage industry is set to reach US$2.1 billion by 2026. However, big data initiatives will only succeed if the data is of the right quality and underpinned by strong data management. Essentially, high volumes of complex data are of little use if they are not accurate and relevant.

Take the case of perishable produce such as kiwifruit. With many varieties staying ripe for only five to seven days, precise and timely data across the value chain is crucial to getting fresh, zesty kiwifruit into the hands of consumers.

Data quality is so critical to the food and beverage value chain that any misstep can adversely affect quality, distribution, prices, and even the environment.

For example, if the data used for yield prediction is flawed or outdated, it can produce unrealistic crop-yield estimates with a ripple effect of consequences. Surplus produce from over-estimation undermines an organisation's sustainability efforts and leads to waste, whilst under-estimation can cause supply shortages and higher prices for consumers.

In another instance, gaps in quality control and assurance data may result in contaminated or unsafe products reaching our grocery stores, putting consumers' health at risk.

It is therefore imperative for food and beverage organisations to take a proactive approach to data quality. By putting data first and investing in reliable data management, validation, and governance processes, they can ensure that data is right from the start and that the right decisions are made in a timely manner.
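To make this concrete, the short Python sketch below illustrates the kind of automated validation rule such a process might apply to perishable-produce records before they feed downstream decisions. The field names (batch_id, quantity_kg, harvest_date) and the seven-day shelf-life threshold are illustrative assumptions, not details of any platform mentioned here.

```python
from datetime import date, timedelta

# Illustrative shelf life for a perishable product (many kiwifruit
# varieties, for instance, stay ripe for roughly five to seven days).
MAX_SHELF_DAYS = 7

def validate_record(record: dict) -> list:
    """Return a list of data-quality issues for one produce record.

    An empty list means the record passed every automated check.
    Field names are hypothetical, for illustration only.
    """
    issues = []

    if not record.get("batch_id"):
        issues.append("missing batch_id")

    qty = record.get("quantity_kg")
    if qty is None or qty < 0:
        issues.append("quantity_kg missing or negative")

    harvest = record.get("harvest_date")
    if harvest is None:
        issues.append("missing harvest_date")
    elif (date.today() - harvest) > timedelta(days=MAX_SHELF_DAYS):
        issues.append("stock older than assumed shelf life")

    return issues

# Example: a stale record is flagged before it can skew a yield or stock decision.
record = {"batch_id": "B-1042", "quantity_kg": 500,
          "harvest_date": date.today() - timedelta(days=10)}
print(validate_record(record))  # ['stock older than assumed shelf life']
```

Catching an incomplete or stale record at this point is what stops it from skewing the yield, stock, and safety decisions described above.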

Serve up success by cleaning your data first
Embracing a data-first approach, and the ability to proactively capture, manage, clean, and use data, not only helps avert severe consequences but, more importantly, drives efficiency and competitiveness across the organisation.

Organisations such as Zespri, the world's largest marketer and distributor of kiwifruit, are already seeing the fruits of a proactive approach to data management.

As an organisation that sells into 59 countries and manages 30% of global volume, the company replaced manual tracking and keying of data into spreadsheets with an automated data platform backed by strong governance processes. Beyond driving efficiencies in data management, this ultimately helped instil trust in the data among employees, who previously had to repeatedly re-run and validate data before they could make decisions with confidence. With greater visibility into their data quality, they are now strongly positioned to make better-informed decisions that drive business improvements.

Having the capability to make well-informed, timely decisions is a superpower for businesses, one they can tap into to achieve their desired outcomes and set themselves up for greater competitiveness.

For example, a leading global food and drink manufacturer implemented a robust data management platform across all of its business processes, from invoicing and procurement to go-to-market initiatives. With active governance and validation processes built into the platform, the company saw its "first-time-right data", or data that is correct without the need for human remediation, jump from 24.3% to 97.7%.
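As a rough, generic sketch of how such a metric can be read (this is not the manufacturer's actual method; the checks and field names below are assumptions), "first-time-right" boils down to the share of incoming records that pass every automated validation check without any human remediation:

```python
def first_time_right_rate(records, checks) -> float:
    """Percentage of records that pass every automated check on first entry.

    `checks` is a list of functions, each returning True when a record
    satisfies one validation rule. Purely illustrative.
    """
    if not records:
        return 0.0
    clean = sum(1 for r in records if all(check(r) for check in checks))
    return 100.0 * clean / len(records)

# Hypothetical checks on invoice records (field names are assumptions).
checks = [
    lambda r: bool(r.get("customer_id")),
    lambda r: r.get("amount", -1) >= 0,
]
records = [
    {"customer_id": "C1", "amount": 120.0},  # first-time right
    {"customer_id": "",   "amount": 80.0},   # needs manual remediation
]
print(f"{first_time_right_rate(records, checks):.1f}% first-time right")  # 50.0%
```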

Starting the data work early and focusing on high-quality data translated directly into the bottom line. New product development and launches, which previously took three months, now take a mere two weeks, owing to enhanced governance and automated validation of data. The organisation is also able to avoid incorrect deliveries and missed invoices, delivering both reputational and cash-flow benefits.

Nurture your secret ingredient
With many organisations in the industry turning to big data analytics to boost business, optimise costs, and improve customer experiences, among other goals, the critical question remains whether the data itself is accurate and relevant enough to provide reliable insights for effective decision-making. The right data management platform, one that takes care of underlying data quality, will turbocharge insights from any existing analytics platform, ultimately delivering improved outcomes from source to store.

So, if you're wondering about the secret ingredient behind your fresh-tasting guacamole: it's data quality.

