Internet of Trash? Recycling Food Waste with mF2C


Posted by: Jens Jensen, Senior Scientist at UKRI/Science and Technology Facilities Council
Food waste is a big problem. It is expensive, it is bad for the environment, it means more food must be produced than is strictly necessary, and food may be wasted while people elsewhere go hungry.

To eliminate food waste, we first need an overview of the size of the problem. In homes, some people are much better at using food than others, so there may be an opportunity to teach people not to waste food. On a larger scale, waste in places that handle lots of food, such as restaurants and hospitals, is potentially a big problem. Supermarkets, too, throw away food that has passed its best-before date, even when it is still safe to eat.
Food waste truck

To address the problem of food waste, STFC, one of the partners of the mF2C project, runs a Food Network, which brings N8 food researchers together with STFC’s scientists and technologists.

Could an IoT project like mF2C help eliminate food waste?  The first step is to get an honest measure of the scale of the waste.  People can weigh the food they throw away, but the measurements need to be recorded, aggregated, and finally anonymised, so that people are not put off submitting honest numbers.  The mF2C project might be a good fit: after all, two of its three use cases involve sensors (smart cities and smart boats).

Starting at the edge, the scales that weigh the waste need to be instrumented to send data to the fog – and only for waste, not for any other measurement.  So a bit of human collaboration is required: the person weighs the food on the scale and presses a button to say “measure this, and this is waste.”  For a proof of concept, an off-the-shelf digital scale will do: scales with RS-232 or USB interfaces (the former seems more common among scales on the market, the latter more common in computers) are readily available in capacities from 0.1 kg to 150 kg, which should cover all needs.  However, something simple and inexpensive still needs to be added to send the values onward in the right format.
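The reading side of that “something simple” could be little more than a line parser. As a sketch – the line format below is a hypothetical example, since RS-232 scales vary by vendor and the real regex would come from the instrument’s manual:

```python
import re

# Hypothetical line format for an RS-232 scale, e.g. b"ST,GS,+  1.234kg\r\n"
# (status, mode, signed weight with unit). This is an assumption; adapt it
# to the actual instrument's documented protocol.
LINE_RE = re.compile(rb"^(ST|US),GS,([+-]\s*\d+\.\d+)(kg|g)$")

def parse_scale_line(line: bytes):
    """Return (weight_kg, stable) for one line from the scale, or None."""
    m = LINE_RE.match(line.strip())
    if m is None:
        return None
    status, value, unit = m.groups()
    weight = float(value.replace(b" ", b"").decode())
    if unit == b"g":
        weight /= 1000.0                 # normalise grams to kilograms
    return weight, status == b"ST"       # "ST" = stable reading, "US" = unstable
```

With pyserial, lines would come from something like `serial.Serial("/dev/ttyUSB0", 9600).readline()`; the “this is waste” button press then decides which stable readings actually get submitted.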

In the current mF2C sensor model, sensors only respond to requests; they do not send anything on their own.  This makes sense for regular measurements (e.g. temperature), but is a bit daft when the measurement itself is an event, as in this proposal.  When the sensor receives a request, it sends a JSON-formatted structure which identifies the sensor, the type of sensor, the timestamp if it has one – and of course the measurement.  We did not find an existing standard, so we had to create a suitable format, both for the data itself and for registering the sensor as a resource in the project.
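Such a record could be built like this – the key names here are illustrative assumptions, since the post describes what the record carries (sensor identity, sensor type, optional timestamp, the value) but not its exact schema:

```python
import json
import time

def make_measurement(sensor_id: str, weight_kg: float, waste_type: str) -> str:
    """Serialise one waste-weighing event as JSON. Field names are
    illustrative, not the project's actual schema."""
    record = {
        "sensorId": sensor_id,          # identifies the sensor
        "sensorType": "waste-scale",    # type of sensor
        "timestamp": int(time.time()),  # omitted if the device has no clock
        "wasteType": waste_type,        # e.g. "bread", "fruit"
        "value": weight_kg,
        "unit": "kg",
    }
    return json.dumps(record)
```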

At the fog level, agents query sensors and aggregate data.  As initially planned in mF2C, once we have data from five or more different sensors, we consider the individual sensors sufficiently “anonymised,” even when the aggregation is done at a regional level.  The aggregation needs to record the minimum, maximum, sum, and number of data points.  Measurements also need to be binned (in the statistics sense, not the waste sense!) by time: we pick a suitable interval during which enough measurements accumulate, then gather and report data for each interval.  Binning lets us aggregate the data meaningfully, as in statistics, and at the same time study how it varies over time.  Regional aggregation gives an overview of geographic variation; alternatively, aggregation could be done by type of producer – restaurant, hospital, café, sandwich factory, bakery, etc. – or by type of waste (fruit, cat food, bread, etc.)
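A minimal sketch of that aggregation step, assuming one-hour bins (the post leaves the interval open) and the five-sensor anonymisation rule:

```python
from collections import defaultdict

BIN_SECONDS = 3600   # assumed bin width of one hour; the right interval is a tuning choice
K_SENSORS = 5        # release a bin only when >= 5 distinct sensors contributed

def aggregate(measurements):
    """measurements: iterable of (unix_timestamp, sensor_id, weight_kg).
    Returns {bin_start: {"min", "max", "sum", "count"}}, suppressing any bin
    with fewer than K_SENSORS distinct sensors, per the anonymisation rule."""
    bins = defaultdict(lambda: {"min": float("inf"), "max": float("-inf"),
                                "sum": 0.0, "count": 0, "sensors": set()})
    for ts, sensor_id, kg in measurements:
        b = bins[ts - ts % BIN_SECONDS]      # start of this measurement's time bin
        b["min"] = min(b["min"], kg)
        b["max"] = max(b["max"], kg)
        b["sum"] += kg
        b["count"] += 1
        b["sensors"].add(sensor_id)
    # Drop the sensor sets before reporting, and suppress under-populated bins.
    return {start: {k: v[k] for k in ("min", "max", "sum", "count")}
            for start, v in bins.items() if len(v["sensors"]) >= K_SENSORS}
```

The same shape works for binning by producer type or waste type instead of (or as well as) time: only the bin key changes.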

A service in the cloud aggregates data from the regional fog layer.  In STFC’s prototype, we did this by running a web service endpoint and management dashboard in the cloud (using Flask); fog devices submitted their data to the web service, but it could equally be done with MQTT, with the hub running in the cloud.  There are security challenges, too: how to ensure sensor data is read only by the authorised fog node, and how to ensure the data is not leaked along the way.  We also need to ensure that a sensor is not blocked: its network may be down, so it must be smart enough not to lose data, but if someone deliberately tries to prevent it from publishing, that should be detected.
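The “don’t lose data while offline” requirement is essentially a store-and-forward queue on the submitting side. A minimal in-memory sketch – here `send` stands in for whatever transport is used (an HTTPS POST to the Flask endpoint, or an MQTT publish), and a real device would persist the backlog to flash rather than keep it in RAM:

```python
import json
from collections import deque

class StoreAndForward:
    """Buffer measurements locally; flush them in order once the network
    (represented by the caller-supplied `send` callable) is reachable again."""

    def __init__(self, send):
        self.send = send          # callable(payload) -> bool (True = delivered)
        self.pending = deque()

    def submit(self, record: dict):
        self.pending.append(json.dumps(record))
        self.flush()              # opportunistically deliver right away

    def flush(self):
        while self.pending:
            if not self.send(self.pending[0]):
                return            # still offline; keep the backlog, preserve order
            self.pending.popleft()
```

A backlog that keeps growing even though the device believes it is online is also a useful signal for the blocked-sensor detection mentioned above.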

Once we have the data, what next? Initially, the goal is merely to build trust in the infrastructure, so that sources feel they can share data about waste without being penalised. The next step is to repurpose the waste, depending on its type: bread products can be repurposed, some waste can become biofuel, and some things might still be safe for human consumption or could be turned into animal feed.

And here’s the trick: obviously the sensor should send data when a measurement is made – if it’s online at the time – and interested parties could then be notified (here’s where MQTT comes in) that waste of a particular type is available.
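With MQTT this would be a broker plus topic subscriptions – say, a processor subscribing to a topic like `waste/bread` via a client library such as paho-mqtt (the topic naming is my assumption). As a broker-free, stdlib-only stand-in that shows the publish/subscribe pattern:

```python
from collections import defaultdict

class WasteNotifier:
    """Tiny in-process stand-in for an MQTT broker: interested parties
    subscribe to a waste-type 'topic' and are called back whenever a
    matching measurement is published."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for cb in self.subscribers[topic]:
            cb(topic, payload)   # deliver only to subscribers of this topic
```

A bakery-waste processor would subscribe once to `waste/bread` and then simply wait for publications, rather than polling every sensor.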

Even if the waste cannot be collected immediately, it makes sense to plan ahead for processing particular types of waste, so that capacity can be allocated for collecting and processing it – potentially saving time and money, and making more efficient use of the products.