
Implementing innovation with impact

If global food waste were a country, it would be the third-largest emitter of greenhouse gases, behind the United States of America and China. The scale of the problem is well quantified, and solutions exist, but for food systems entrepreneur and founder Abi Ramanan, barriers to implementation remain, and caution is needed as we embrace technologies like artificial intelligence.

Human beings have three colour-receptive cones – red, green and blue – and everything we see in the world comes through a combination of these three.

Prawn killers, aka the mantis shrimp, have sixteen colour-receptive cones, so they can see both ultraviolet and infrared light and even detect transparent prey. A rainbow experienced by the mantis shrimp comes from a combination of sixteen different colours, which the human mind can’t even comprehend.

Human vision is severely limited, restricted to a very narrow band of the electromagnetic spectrum and incapable of accessing a huge amount of existing information about our surrounding environment.

Abi Ramanan, co-founder of London-based catering social enterprise Papi’s Pickles in 2014, and then of California-based ImpactVision in 2016 – a hyperspectral imaging and machine learning startup tackling food supply chain waste – has seen first-hand the power of technology in addressing agrifood challenges.

RELATED: No holds barred: Leading Aussie agritech founders’ frank assessment of industry challenges

And there is no shortage of them.

“Globally, a third of food produced is wasted,” Abi said.

“When you think of the energy, land and labour costs associated with producing those commodities, the problem magnifies. If food waste were a country, it would be the third-largest emitter of greenhouse gases after the USA and China.”

Additionally, food fraud – the intentional substitution or mislabelling of food products – is costing the Australian economy US$3 billion per year, with wine, veal, and fish the most at risk, according to a report by AgriFutures Australia.

Abi Ramanan on stage at evokeAG 2024.

Human vision is severely limited, restricted to a very narrow band of the electromagnetic spectrum and incapable of accessing a huge amount of existing information about our surrounding environment. Image | Darren Wigley.

Food fraud manifests in many ways – for example, adulterating milk powder with melamine, which has led to infant deaths. Or if you buy red snapper in the US, nine times out of 10 you’re actually getting cheaper tilapia.

But it is close to impossible for the human eye to tell the difference between two white powders or two fillets of fish. Ramanan wanted to use technologies like hyperspectral imaging to provide rapid, real-time, and accurate assessments of not just food quality but also product authenticity.

RELATED: Powering progress: Agrifood tech and innovation on show at the biggest evokeAG. yet

“In California, where I live, I’m aware of growers discarding thousands of tonnes of leafy greens because they lacked information about shelf life, and distribution was therefore too challenging,” she said.

“Decisions around growing, ripening, sorting, and distributing food all require a lot of information, and visual inspections or sample-based tests are a spectacularly inefficient way to manage an industrialised food system – especially considering that 50% of all the waste that occurs happens upstream.”

ImpactVision leverages two technologies – hyperspectral imaging and machine learning.

Hyperspectral images are multi-dimensional and can be up to 700MB in size (the size of a small movie). So while the technology isn’t new – NASA has used it for space applications for some time – simultaneous increases in computing power allowed ImpactVision to transmit and analyse these images in real time for insights about contamination, freshness or ripeness.
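The article doesn’t describe ImpactVision’s internal data formats, but the idea of a multi-dimensional hyperspectral image can be sketched as a 3-D array (height × width × spectral bands), with a simple per-pixel index computed from two bands. The array dimensions and band numbers below are placeholders chosen for illustration, not real calibration values.

```python
import numpy as np

# Hypothetical hyperspectral cube: 512 x 512 pixels, 224 spectral bands.
# At 4 bytes per float32 sample this is ~235 MB, on the order of the
# hundreds of megabytes mentioned in the article.
height, width, bands = 512, 512, 224
cube = np.random.rand(height, width, bands).astype(np.float32)
print(f"cube size: {cube.nbytes / 1e6:.0f} MB")  # prints "cube size: 235 MB"

# A simple per-pixel spectral index: the normalised difference between
# two bands (the band numbers here are invented for the sketch).
near_infrared = cube[:, :, 180]
visible_red = cube[:, :, 60]
index = (near_infrared - visible_red) / (near_infrared + visible_red + 1e-9)

# Thresholding the index yields a per-pixel mask, e.g. flagging regions
# whose spectral signature falls outside an expected range.
mask = index > 0.2
print(f"flagged pixels: {mask.mean():.0%}")
```

Whatever the real pipeline looks like, the per-pixel arithmetic is the cheap part; the 700MB figure shows why transmitting and processing such cubes in real time was the hard engineering problem.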

Today, the ImpactVision technology is being developed for use in food processing and distribution centres to undertake real time, efficient quality control of produce – reducing waste while increasing efficiencies and quality.

“A picture is worth a thousand words, but a hyperspectral image is worth almost a thousand pictures. Essentially, it lets you see beyond the borders of human vision,” Abi said.

Persistent barriers to implementation

So, with the food waste problem so well defined, and so significant, why hasn’t technology like ImpactVision’s been more broadly implemented?

Abi sees two major barriers to implementation.

The food industry is low-margin, high-volume, and relatively conservative, without the research and development budgets or investment incentives of an industry like pharmaceuticals.

RELATED: evokeAG. Brisbane-bound in 2025 

“It’s also hard to fundraise in the food industry, with debate currently raging about the best funding models,” Abi said.

“Also, while satellite imaging has been used pre-harvest for some time, it hasn’t, to date, been available at the resolution required for individual crop monitoring or forecasting.

Abi Ramanan

Abi Ramanan said in California she’s aware of growers discarding thousands of tonnes of leafy greens because they lacked information about shelf life and distribution. Image | Darren Wigley.

“There is also the problem of a lack of publicly available data sets. When starting ImpactVision, we had to build image and spectral libraries from scratch, in collaboration with partners, to train our models – as well as software that could perform real-time analysis of internal quality under high-throughput conditions, at conveyor belt speeds of 1 m/s and up.”
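The 1 m/s conveyor speed quoted above implies a hard real-time budget. A back-of-the-envelope calculation makes this concrete; the belt speed and the 700MB upper bound come from the article, while the camera field of view is an assumed figure for the sketch.

```python
# Back-of-the-envelope throughput for in-line hyperspectral inspection.
belt_speed_m_per_s = 1.0   # conveyor speed (from the article)
field_of_view_m = 0.25     # assumed length of belt captured per frame
image_size_mb = 700.0      # upper-bound image size cited in the article

# To avoid gaps in coverage, a new frame is needed every time the belt
# advances one field of view.
frames_per_second = belt_speed_m_per_s / field_of_view_m
data_rate_mb_per_s = frames_per_second * image_size_mb

print(f"frames/s needed: {frames_per_second:.1f}")             # 4.0
print(f"worst-case data rate: {data_rate_mb_per_s:.0f} MB/s")  # 2800
```

Even with these generous assumptions, a worst-case data rate in the gigabytes per second shows why on-line analysis software, not just the imaging hardware, had to be built.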

It’s a problem Abi says persists even today, despite the increasing use and prevalence of artificial intelligence.

Be careful what you wish for

While no doubt powerful, Abi cautions against unbridled adoption of AI technology in the agrifood supply chain.

“Certainly, in its most reductive form, technologies like AI can help us produce more food in a constrained environment,” she said.

“But we need to acknowledge the fact that algorithms are optimising for a specific objective function, like maximising crop yield, for instance.

“This means the technology might do that without any regard for the negative externalities or impacts of that pursuit, like excessive pesticide or chemical usage.

“We need to take a holistic approach when we design these algorithms and systems that avoids just optimising for one variable for short term success. This is especially the case in a system like agriculture that is so interdependent on other factors like the environment, weather, and climate. We also need to ask ourselves – what are the objectives we’re optimising for, over what timeframe, who benefits, and who might be harmed?”
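Abi’s point about objective functions can be made concrete with a toy optimisation: maximising yield alone pushes chemical input as high as the yield curve rewards, while folding the externality into the objective shifts the chosen input down. The yield model and penalty weight below are invented purely for illustration.

```python
# Toy single- vs. multi-objective optimisation over chemical input levels.
def yield_model(chemical_input):
    # Diminishing returns to input; peaks at input = 5 (illustrative only).
    return 10.0 * chemical_input - chemical_input ** 2

def externality_cost(chemical_input, weight=3.0):
    # Assumed linear environmental cost of chemical usage.
    return weight * chemical_input

candidate_inputs = [i * 0.5 for i in range(0, 21)]  # 0.0 .. 10.0

# Optimising yield alone ignores the externality entirely.
best_yield_only = max(candidate_inputs, key=yield_model)

# A more holistic objective penalises the externality, and the
# "optimal" input level drops accordingly.
best_holistic = max(
    candidate_inputs,
    key=lambda x: yield_model(x) - externality_cost(x),
)

print(best_yield_only)  # 5.0
print(best_holistic)    # 3.5
```

The interesting design choice is not the arithmetic but who picks the penalty weight and over what timeframe – exactly the questions Abi raises about who benefits and who might be harmed.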

Abi Ramanan was a keynote speaker at AgriFutures evokeAG. 2024, presenting ‘Innovating for good: How agrifood tech can digitise food supply chains and reduce food waste’. She was also part of a panel discussion, ‘The great balancing act: AI, the possibilities, and the responsibilities.’

Tap into more discussions here about the role of agrifood tech in driving sustainability across supply chains, news from agtech startups and updates ahead of evokeAG. 2025 in Brisbane, Queensland.

Read more news