Winnow, a company that provides a way for caterers to monitor food waste, has developed a new system which uses computer vision powered by Nvidia technology.
Around one-third of the food produced globally for human consumption is wasted every year. That amounts to 1.3 billion tonnes.
In the hospitality sector, nearly $100bn of food is thrown away every year, as kitchen staff often need to cater for an unknown number of guests. Since they cannot risk being unprepared, they often end up preparing too many meals, and the extra, unused ingredients end up in the bin.
Winnow has been providing a system to help kitchen staff track food waste since it was founded in 2012. It estimates it has already helped commercial kitchens save more than $30m in annualised food costs. That equates to preventing over 23 million meals from being binned.
Winnow co-founder and CEO Marc Zornes says the company originally wanted to use computer vision, but in 2012, when it was founded, the technology was not quite ready for use in busy catering environments, which often lack good network connectivity. “In 2012, there were limitations,” he says. “The algorithms were not there yet, kitchens tend to have patchy internet connections and are busy places.”
The company’s original food waste monitoring system involved a smart bin that weighed everything that was thrown away, combined with a touchscreen, where catering staff identified the food items being thrown away.
It has now updated the system. “About two years ago, we began looking at using computer vision to capture data on what is being thrown away,” says Zornes. “We have a camera that looks into the bin and takes a photo every time something is thrown away.”
The system, called Winnow Vision, involves a set of digital weighing scales which sits on top of a standard kitchen bin. Mounted above this is a camera and an embedded system containing an Nvidia Jetson TX2 supercomputer on a module. The module takes the images captured by the camera, as well as the weight recorded by the scales, and determines what is being thrown out and in what quantity.
“The embedded chips run computer vision in real time,” says Zornes. “When I throw food into the bin, the system takes a photo, then a pipeline of AI models is run against that photograph.”
The result is identification of what food ingredient has just been thrown away. “This is surfaced to customers via a touchscreen interface,” he says.
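The article does not describe Winnow's actual code, but the capture-and-classify step it outlines could be sketched roughly as follows. All names here are hypothetical, and the model is represented as a generic `predict` callable standing in for the neural network running on the Jetson module:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class WasteEvent:
    """One throw-away event: the bin-camera image plus the change on the scales."""
    image: bytes           # JPEG captured when something lands in the bin
    weight_delta_g: float  # grams added since the previous scale reading


def classify_waste(
    event: WasteEvent,
    predict: Callable[[bytes], List[float]],
    labels: List[str],
) -> Tuple[str, float]:
    """Run the vision model against one event.

    Returns the predicted food label and the model's confidence score,
    so the touchscreen UI can decide whether to accept the prediction.
    """
    scores = predict(event.image)  # e.g. softmax scores over known food classes
    best = max(range(len(scores)), key=scores.__getitem__)
    return labels[best], scores[best]
```

Pairing the image with the weight delta, as above, is what lets the system report not just *what* was thrown away but *how much* of it.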
To identify the wide variety of food the system may encounter, a huge amount of training data is needed – up to 1,000 images per food item.
The computer vision algorithm, running on the Nvidia Jetson TX2 embedded system, uses a neural network to match the food being thrown away with food waste it already knows about. If the system cannot correctly identify what is being thrown away, kitchen staff can use a touchscreen above the bin to enter the food waste type manually.
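One plausible way to implement this fallback, sketched under the assumption that the model exposes a confidence score (the threshold value and function names are illustrative, not Winnow's):

```python
from typing import Callable, Tuple

CONFIDENCE_THRESHOLD = 0.8  # hypothetical cut-off for trusting the model


def resolve_label(
    predicted_label: str,
    confidence: float,
    ask_user: Callable[[], str],
) -> Tuple[str, bool]:
    """Accept the model's guess if it is confident enough, otherwise
    fall back to manual entry on the touchscreen.

    Returns (label, was_manual); a manual entry doubles as a corrected
    training example for the feedback loop.
    """
    if confidence >= CONFIDENCE_THRESHOLD:
        return predicted_label, False
    return ask_user(), True
```

The manual path is what closes the loop: every correction a member of staff types in becomes a labelled example for the next round of training.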
This provides a machine learning feedback loop. The data entered is collated and sent asynchronously up to the AWS public cloud, where Nvidia V100 graphics processing units running TensorFlow in the cloud are used to update the machine learning model.
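Because kitchens tend to have patchy internet connections, the upload has to tolerate failure. A minimal sketch of such an asynchronous, buffer-and-retry uploader, with an injected `send` callable standing in for whatever cloud endpoint the real system posts to:

```python
from typing import Callable, List


class AsyncUploader:
    """Buffer labelled events locally and flush them in batches.

    If a send fails (patchy connectivity), the batch is kept and
    retried on the next flush. `send` and the batch size are
    assumptions for illustration, not Winnow's actual interface.
    """

    def __init__(self, send: Callable[[List[dict]], None], batch_size: int = 10):
        self.send = send
        self.batch_size = batch_size
        self.buffer: List[dict] = []

    def enqueue(self, record: dict) -> None:
        """Queue one labelled waste event; flush once a full batch accumulates."""
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        """Attempt to upload everything buffered so far."""
        if not self.buffer:
            return
        try:
            self.send(self.buffer)
            self.buffer = []
        except OSError:
            pass  # network unavailable: keep the batch and retry later
```

Decoupling local capture from the upload in this way is what lets the bin keep working offline and sync its data whenever the connection comes back.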
The data uploaded to the cloud is also used to provide customers with reports, which can be shared with kitchen staff. The reports detail quantities and types of food being tossed, as well as recommendations as to how the kitchen can reduce waste.
The company claims Winnow Vision has surpassed human levels of accuracy, with over 80% of food waste being correctly identified, and expects accuracy to improve as more data is collected.
Identifying food waste is a complex problem for machine learning algorithms, according to Zornes. The team at Winnow consulted Imperial College London to help it develop what Zornes describes as a “biologically inspired” machine learning algorithm.
The system is already installed in over 75 kitchens, and Winnow plans to roll out the technology to thousands more in the coming years. It said Ikea and Emaar are among the companies that have implemented Winnow Vision in their kitchens.
Monitoring these systems centrally is one of the issues Zornes says the company had to overcome. “The challenge we had is that when you run embedded systems, how do you know all of the systems are up to date? AWS didn’t fit into our needs, so we wrote our own custom code to get the right telemetry out of the Winnow Vision systems,” he says.
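The article does not detail Winnow's custom telemetry code, but the core of fleet monitoring it describes (are devices running the current software, and are they still reporting?) can be sketched as below. The heartbeat format, version strings, and threshold are all assumptions:

```python
from typing import Dict, List, Tuple


def fleet_status(
    heartbeats: Dict[str, Tuple[str, float]],
    latest_version: str,
    now: float,
    stale_after_s: float = 3600,
) -> Tuple[List[str], List[str]]:
    """Check device heartbeats {device_id: (software_version, last_seen_ts)}.

    Returns (outdated, stale): devices not on the latest software
    version, and devices that have not reported within the window.
    """
    outdated, stale = [], []
    for device_id, (version, last_seen) in heartbeats.items():
        if version != latest_version:
            outdated.append(device_id)
        if now - last_seen > stale_after_s:
            stale.append(device_id)
    return outdated, stale
```

A central dashboard evaluating these two lists on each reporting cycle would answer exactly the question Zornes raises: whether every deployed system is up to date and still alive.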