New wave system for picking fruit


Friday, 06 June, 2025

Traditional manual harvesting is inefficient and labour intensive, and therefore costly, while fully automated robots often operate inaccurately and clumsily in complex environments. Now, a new robot has been designed to make harvesting more efficient by lowering the technical barrier to automation through a ‘human–robot collaboration’ model.

A research team led by Associate Professor Pei Wang from Southwest University has developed a gesture-controlled human–robot collaborative harvesting robot that can precisely locate and pick fruit with a simple wave. The technology is designed to not only improve efficiency but also provide a new approach for small-scale orchards to transition towards intelligent operations.

The core innovation of the robot lies in its human–machine division of labour. Researchers found that humans excel at identifying fruit locations and determining picking paths, while robotic arms outperform in repetitive motions and force control. Based on this insight, they designed a motion-sensing interaction system: the operator uses a Leap Motion sensor to capture hand movements in real time, directing the robotic arm to the target position, and then triggers the automated picking process with a double-tap gesture. This design combines human ‘eyes’ with machine ‘arms’, retaining human flexibility while leveraging mechanical stability.
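The interaction flow described above can be pictured as a simple polling loop: stream the tracked hand position to the arm until a double-tap hands control to the automated picking routine. The sketch below is illustrative only; tracker, arm and their methods (read_palm_position, detect_double_tap, move_to, run_pick_sequence) are hypothetical interfaces standing in for the Leap Motion SDK and the arm controller, not the team's actual software.

```python
import time

def control_loop(tracker, arm, workspace_map, poll_interval=0.02):
    """Stream hand positions to the arm; hand off to automation on a double-tap.

    tracker, arm and workspace_map are assumed interfaces for illustration only.
    """
    while True:
        palm = tracker.read_palm_position()       # (x, y, z) palm coordinates, or None
        if palm is not None:
            arm.move_to(workspace_map(palm))      # arm mirrors the operator's hand in real time
        if tracker.detect_double_tap():           # gesture that triggers the automated stage
            arm.run_pick_sequence()               # grasp-and-detach routine runs autonomously
        time.sleep(poll_interval)                 # poll at roughly 50 Hz
```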

To ensure precise execution by the robotic arm, the team overcame multiple technical hurdles. For instance, inverse kinematics calculations for robotic arms often yield multiple solutions, which can cause sudden jerks or freezes. To address this, researchers proposed a four-step screening method, evaluating mechanical interference, verifying solution correctness, assessing motion rationality, and optimising trajectory smoothness to select the safest joint angle combination. Simulation tests showed that the optimised robotic arm exhibited reduced movement paths and joint rotation ranges, resulting in smoother motions.
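As a rough illustration of how such screening might work, the sketch below filters a set of candidate inverse-kinematics solutions in approximately the four steps described: reject configurations that violate joint limits or collide, verify each candidate by forward kinematics, then prefer the smallest joint excursion from the current pose, which also tends to give the smoothest motion. The helper callables (forward_kinematics, in_collision) and the selection criterion are assumptions, not the authors' published method.

```python
def select_ik_solution(candidates, current_joints, target_pose,
                       joint_limits, forward_kinematics, in_collision,
                       tol=1e-3):
    """Pick one joint-angle solution from several IK candidates (illustrative)."""
    feasible = []
    for joints in candidates:
        # Step 1: mechanical feasibility - joint limits and interference checks
        if any(not (lo <= q <= hi) for q, (lo, hi) in zip(joints, joint_limits)):
            continue
        if in_collision(joints):
            continue
        # Step 2: verify the candidate actually reaches the commanded pose
        reached = forward_kinematics(joints)
        if max(abs(a - b) for a, b in zip(reached, target_pose)) > tol:
            continue
        feasible.append(joints)

    if not feasible:
        return None

    # Steps 3-4: favour the smallest total joint travel from the current pose,
    # which avoids sudden jerks and keeps the trajectory smooth
    def joint_travel(joints):
        return sum(abs(q - q0) for q, q0 in zip(joints, current_joints))

    return min(feasible, key=joint_travel)
```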

Unlike traditional robots reliant on camera recognition, this device achieves intuitive control through motion-sensing technology. The Leap Motion controller captures hand movements at a 0.01-millimetre resolution, maintaining stable performance even under uneven lighting or foliage occlusion. Researchers also implemented intelligent filtering algorithms to eliminate ‘jittery data’ caused by hand tremors or environmental interference, ensuring smooth robotic arm movement. The team dynamically mapped Leap Motion’s cubic interaction space to the robotic arm’s fan-shaped working area, allowing operators to move their hands within a virtual ‘box’ while the robotic arm responds synchronously in the real orchard — as intuitive as playing a motion-sensing game.
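Two of these ideas can be sketched in code: a simple exponential filter to damp jittery hand-tracking samples, and a mapping from the sensor's cubic interaction box onto a fan-shaped reach area. The box dimensions, radii and filter constant below are illustrative assumptions, not figures from the study.

```python
import math

class JitterFilter:
    """Exponential moving average to damp hand tremor and sensor noise."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha       # smoothing factor: lower = smoother, slower response
        self.state = None

    def update(self, sample):
        if self.state is None:
            self.state = sample
        else:
            self.state = tuple(self.alpha * s + (1 - self.alpha) * p
                               for s, p in zip(sample, self.state))
        return self.state

def map_box_to_fan(palm, box=(240.0, 240.0, 240.0),
                   r_min=200.0, r_max=600.0, angle_range=math.pi / 2):
    """Map a palm position inside a cubic interaction box (mm) to a point in a
    fan-shaped working area defined by reach radius and sweep angle (illustrative)."""
    x, y, z = palm
    # normalise the horizontal axes of the box to [0, 1], clamping out-of-box samples
    u = min(max(x / box[0] + 0.5, 0.0), 1.0)   # left-right  -> sweep angle
    v = min(max(z / box[2] + 0.5, 0.0), 1.0)   # forward-back -> reach radius
    theta = (u - 0.5) * angle_range            # angle centred on the arm's base
    r = r_min + v * (r_max - r_min)            # radial distance from the base
    return (r * math.cos(theta), r * math.sin(theta), y)
```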

Tests revealed an average system response time of 74.4 milliseconds and a 96.7% accuracy rate in gesture recognition. After brief training, operators reduced single-fruit picking time from 8.3 seconds to 6.5 seconds. The tests also confirmed the system excels in complex terrains and small-scale orchards, adapting well to challenges like foliage occlusion and uneven lighting.

The study has been published in Frontiers of Agricultural Science and Engineering.
