Researchers at the University of Zurich and the Delft University of Technology have developed a way to use onboard cameras carried by a drone to autonomously stabilize its flight if a rotor fails.
The autonomous stabilization system equips a quadcopter with a standard camera, which records images several times per second at a fixed rate, and an event camera, whose independent pixels activate only when they detect a change in the light that reaches them.
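The event-camera principle described above can be illustrated with a minimal sketch. This is not the researchers' code: it assumes a simple per-pixel model in which an event fires only when log-brightness changes by more than a contrast threshold (the threshold value and function names here are hypothetical).

```python
import numpy as np

# Assumed contrast sensitivity; real event cameras have hardware-specific thresholds.
CONTRAST_THRESHOLD = 0.2

def events_from_frames(prev_frame, new_frame, threshold=CONTRAST_THRESHOLD):
    """Return (row, col, polarity) events for pixels whose log-brightness
    changed by at least `threshold` between two frames."""
    eps = 1e-6  # avoid log(0) on dark pixels
    delta = np.log(new_frame + eps) - np.log(prev_frame + eps)
    rows, cols = np.nonzero(np.abs(delta) >= threshold)
    polarities = np.sign(delta[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarities.tolist()))

# A static scene produces no events; only the pixel that changed fires.
prev = np.full((4, 4), 0.5)
new = prev.copy()
new[2, 3] = 1.0  # one pixel brightens
print(events_from_frames(prev, new))  # -> [(2, 3, 1)]
```

Because unchanged pixels stay silent, an event camera is largely immune to the motion blur that degrades fixed-rate frames during fast rotation.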
Algorithms developed by the researchers combine information from the two cameras and use it to track the quadrotor’s position relative to its surroundings. This enables the onboard computer to control the drone as it flies and spins with only three rotors.
Davide Scaramuzza, head of the Robotics and Perception Group at the University of Zurich (UZH) and of the Rescue Robotics grand challenge at NCCR Robotics, which funded the research, said, “When one rotor fails, the drone begins to spin on itself like a ballerina. This high-speed rotational motion causes standard controllers to fail unless the drone has access to very accurate position measurements.”
The current method of solving this problem is to provide the drone with a reference position through GPS. However, the use of visual information from different onboard cameras could provide a safety backup in areas where GPS is unavailable or in case of GPS failure.
The research project also revealed that both types of cameras perform well in normal light conditions. “When illumination decreases, however, standard cameras begin to experience motion blur that ultimately disorients the drone and causes it to crash, whereas event cameras also work well in very low light,” said Sihao Sun, a UZH researcher on the project.
As quadcopters come into widespread use, the researchers hope the work will improve flight safety in areas where the GPS signal is weak or absent, preventing rotor failures from causing accidents.