Let’s save the bees with machine learning

Machine learning and all its related forms of “AI” are being used to work on just about every problem under the sun, but even so, stemming the alarming decline of the bee population still seems out of left field. In fact it’s a great application for the technology and may help both bees and beekeepers keep hives healthy.

The latest threat to our precious honeybees is the varroa mite, a parasite that infests hives and sucks the blood from both bees and their young. While it rarely kills a bee outright, it can weaken it and cause young to be born similarly weak or deformed. Over time this can lead to colony collapse.

The worst part is that unless you’re looking closely, you might not even see the mites — being mites, they’re tiny: a millimeter or so across. So infestations often go on for some time without being discovered.

Beekeepers, caring folk at heart obviously, want to avoid this. But the solution has been to put a flat surface beneath a hive and pull it out every few days, inspecting all the waste, dirt, and other hive junk for the tiny bodies of the mites. It’s painstaking and time-consuming work, and of course if you miss a few, you might think the infestation is getting better instead of worse.

Machine learning to the rescue!

As I’ve had occasion to mention about a billion times before this, one of the things machine learning models are really good at is sorting through noisy data, like a surface covered in random tiny shapes, and finding targets, like the shape of a dead varroa mite.

Students at the École Polytechnique Fédérale de Lausanne in Switzerland created an image recognition agent called ApiZoom trained on images of mites that can sort through a photo and identify any visible mite bodies in seconds. All the beekeeper needs to do is take a regular smartphone photo and upload it to the EPFL system.

The project started back in 2017, and since then the model has been trained on tens of thousands of images, achieving a detection success rate of about 90 percent, which the project’s Alain Bugnon told me is roughly at parity with human inspection. The plan now is to distribute the app as widely as possible.
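A detection success rate like that 90 percent figure is typically computed as recall against human-labeled ground truth: the fraction of real mites the model actually finds. Here is a minimal sketch of that bookkeeping; the function names, matching rule, and coordinates are illustrative, not ApiZoom’s actual code:

```python
# Toy evaluation of a mite detector against human-labeled ground truth.
# All names and data here are illustrative, not from the ApiZoom project.

def detection_rate(predicted, truth, match_fn):
    """Fraction of labeled mites that the detector found (recall)."""
    if not truth:
        return 1.0
    found = sum(1 for t in truth if any(match_fn(p, t) for p in predicted))
    return found / len(truth)

def close_enough(p, t, tol=5):
    """A mite counts as found if a detection lands within tol pixels."""
    return abs(p[0] - t[0]) <= tol and abs(p[1] - t[1]) <= tol

truth = [(10, 12), (40, 41), (80, 15)]    # labeled mite centers (px)
preds = [(11, 13), (79, 16), (200, 200)]  # detector output
print(detection_rate(preds, truth, close_enough))  # 2 of 3 found
```

Scoring only the labeled mites that were found is what makes a missed infestation visible in the metric; false alarms would be tracked separately as precision.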

“We envisage two phases: a web solution, then a smartphone solution. These two solutions allow to estimate the rate of infestation of a hive, but if the application is used on a large scale, of a region,” Bugnon said. “By collecting automatic and comprehensive data, it is not impossible to make new findings about a region or atypical practices of a beekeeper, and also possible mutations of the Varroa mites.”

That kind of systematic data collection would be a major help for coordinating infestation response at a national level. ApiZoom is being spun out as a separate company by Bugnon; hopefully this will help get the software to beekeepers as soon as possible. The bees will thank them later.

This drone shrinks to fit

Researchers at the University of Zurich and EPFL have created a robot that shrinks to fit through gaps, a feature that could make it perfect for search and rescue missions. The researchers initially created a drone that could assess man-made gaps and squeeze through in seconds using only one camera. This extra feature – a scissor-like system to shrink the drone in flight – makes it even more versatile and allows these drones to react to larger or smaller gaps in nature.

“The idea came up after we worked on quadrotor flight through narrow gaps,” said PhD candidate Davide Falanga. “The goal of our lab is to develop drones which can be in the future used in the aftermath of a disaster, as for example an earthquake, in order to enter building through small cracks or apertures in a collapsed building to look for survivors. Our previous approach required a very aggressive maneuver, therefore we looked into alternative solutions to accomplish a task as passing through a very narrow gap without having to fly at high speed. The solution we came up with is the foldable drone, a quadrotor which can change its shape to adapt to the task.”

The system measures the gap and changes its shape without outside processing, which is quite a feat. All of the processing is done on board, and it could be turned into a fully autonomous system if necessary. The team built the drone with off-the-shelf and 3D-printed parts.

“The main difference between conventional drones and our foldable drone is in the way the arms are connected to the body: each arm is connected through a servo motor, which can change the relative position between the main body and the arm. This allows the robot to literally fold the arms around the body, which means that potentially any morphology can be obtained. An adaptive controller is aware of the drone’s morphology and adapts to it in order to guarantee stable flight at all times, independently of the configuration,” said Falanga.
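The folding geometry Falanga describes can be sketched with a toy model: each servo angle rotates an arm in toward the body, shrinking the drone’s overall span so it can pass a narrower gap. Everything here (the `width` function, the body and arm dimensions, the mounting layout) is an illustrative assumption, not the Zurich/EPFL design:

```python
import math

# Toy model of a foldable quadrotor. Each arm pivots around the body via a
# servo: 0 degrees = arm straight out to the side, 90 degrees = folded
# alongside the body. Dimensions are made up for illustration.

BODY_HALF_WIDTH = 0.10  # meters (assumed)
ARM_LENGTH = 0.15       # meters (assumed)

def width(servo_angles_deg):
    """Overall horizontal span for four arms mounted on the left/right
    edges of the body, given each arm's servo angle in degrees."""
    xs = []
    for side, angle in zip((-1, 1, -1, 1), servo_angles_deg):
        reach = ARM_LENGTH * math.cos(math.radians(angle))
        xs.append(side * (BODY_HALF_WIDTH + reach))
    return max(xs) - min(xs)

print(width([0, 0, 0, 0]))      # fully extended: ~0.50 m span
print(width([90, 90, 90, 90]))  # fully folded: ~0.20 m span
```

The same idea underlies the adaptive controller: because arm position determines where each motor's thrust acts, the flight controller must know the current servo angles to keep the vehicle stable in every configuration.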

The team published a report on their findings in IEEE Robotics and Automation Letters. As IEEE Spectrum notes, this is no flying dragon drone, but it is a far simpler, cooler, and more effective design.

TWIICE One Exoskeleton furthers the promise of robotic mobility aids

Few things in the world of technology can really ever be said to be “done,” and certainly exoskeletons are not among their number. They exist, but they are all works in progress, expensive, heavy, and limited. So it’s great to see this team working continuously on their TWIICE robotic wearable, improving it immensely with the guidance of motivated users.

TWIICE made its debut in 2016, and like all exoskeletons it was more promise made than promise kept. It’s a lower-half exoskeleton that supports and moves the legs of someone with limited mobility, while they support themselves on crutches. It’s far from ideal, and the rigidity and weight of systems like this make them too risky to deploy at scale for now.

But two years of refinement have made a world of difference. The exoskeleton weighs the same (which doesn’t matter since it carries its own weight), but supports heavier users while imparting more force with its motors, which have been integrated into the body itself to make it far less bulky.

Perhaps most importantly, however, the whole apparatus can now be donned and activated by the user all by herself, as Swiss former acrobat and now handcycling champion Silke Pan demonstrated in a video. She levers herself from her wheelchair into the sitting exoskeleton, attaches the fasteners on her legs and trunk, then activates the device and stands right up.

She then proceeds to climb more stairs than I’d care to attempt myself. She is an athlete, after all.

That kind of independence is often crucially important for the physically disabled for a multitude of reasons, and clearly achieving the capability has been a focus for the TWIICE team.

Although the exoskeleton has been worked on as a research project within the École Polytechnique Fédérale de Lausanne (EPFL), the plan is to spin off a startup to commercialize the tech as it approaches viability. The more they make and the more people use these devices — despite their limitations — the better future versions will be.