Researchers at the University of Klagenfurt have developed an autonomous inspection drone that uses artificial intelligence to monitor critical infrastructure without human operators in hazardous locations.
The quadcopter incorporates a USB3 Vision industrial camera from Imaging Development Systems to capture real-time image data for navigation and inspection tasks. The system enables the drone to autonomously recognize power poles and insulators, then fly around insulators at a distance of 10ft (3m) to capture inspection images.
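In practice, circling an object at a fixed stand-off distance reduces to placing viewpoints on a circle around it and pointing the camera inward. The Python sketch below illustrates that geometry only; it is not the team's actual planner, and the function and parameter names are illustrative.

```python
import numpy as np

def orbit_waypoints(insulator_pos, radius=3.0, n_points=12, altitude_offset=0.0):
    """Generate evenly spaced viewpoints on a circle of the given radius (metres)
    around an insulator, each with a yaw that faces the insulator.
    Illustrative only -- not the project's planner."""
    waypoints = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False):
        # Position on the circle, at the insulator's height plus an optional offset.
        x = insulator_pos[0] + radius * np.cos(theta)
        y = insulator_pos[1] + radius * np.sin(theta)
        z = insulator_pos[2] + altitude_offset
        # Yaw angle that points the camera back at the insulator.
        yaw = np.arctan2(insulator_pos[1] - y, insulator_pos[0] - x)
        waypoints.append((x, y, z, yaw))
    return waypoints

# Example: 12 viewpoints, 3 m from an insulator detected at (10, 4, 12) m.
for wp in orbit_waypoints(np.array([10.0, 4.0, 12.0])):
    print(f"x={wp[0]:.2f} y={wp[1]:.2f} z={wp[2]:.2f} yaw={np.degrees(wp[3]):.1f} deg")
```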
“Precise localization is important such that the camera recordings can also be compared across multiple inspection flights,” said Thomas Georg Jantos, PhD student and member of the Control of Networked Systems research group. The project received funding from the Austrian Federal Ministry for Climate Action, Environment, Energy, Mobility, Innovation and Technology.
Unlike Global Navigation Satellite System approaches, the AI-powered system provides semantic information about objects. This enables the drone to understand not just spatial position but relative orientation to inspection targets, providing what Jantos described as “reproducible viewpoints” across multiple flights.
The hardware setup consists of a TWINs Science Copter platform equipped with a Pixhawk PX4 autopilot and an NVIDIA Jetson AGX Orin 64GB developer kit as the onboard computer. The team selected the U3-3276LE C-HQ camera model from Imaging Development Systems’ uEye LE family for navigation duties.
The camera features a Sony Pregius IMX265 sensor that delivers 3.19 megapixel resolution at 2,064 x 1,544 pixels with frame rates up to 58fps. The 1/1.8in sensor uses a global shutter, which avoids the motion distortion that rolling-shutter designs can introduce, even at short exposure times.
The camera connects to the onboard computer via a USB3 interface and integrates with the Robot Operating System through Imaging Development Systems’ peak SDK. The system captures raw images at 50fps with 1,280 x 960 pixel resolution, which Jantos said represents the maximum frame rate achievable with the AI model on the drone’s onboard computer.
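The article does not detail the software glue between camera and AI model, but a typical ROS-side consumer of such an image stream looks roughly like the sketch below, written here against ROS 2 (rclpy). The topic and node names are assumptions; the actual IDS peak driver and CNS Flight Stack integration are not shown.

```python
# Minimal ROS 2 sketch: subscribe to the camera stream and hand frames to a callback.
# Topic and node names are assumptions, not taken from the project.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class InspectionCameraNode(Node):
    def __init__(self):
        super().__init__('inspection_camera_listener')
        self.bridge = CvBridge()
        # Hypothetical topic published by the camera driver at 50 fps, 1280 x 960.
        self.sub = self.create_subscription(Image, '/camera/image_raw', self.on_image, 10)

    def on_image(self, msg: Image):
        # Convert the ROS image message to an OpenCV array.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        # A real system would run the pose-estimation network here; we just log the size.
        self.get_logger().info(f'Frame received: {frame.shape[1]}x{frame.shape[0]}')

def main():
    rclpy.init()
    node = InspectionCameraNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```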
The drone uses sensor fusion combining camera data with inertial measurement unit, lidar, and Global Navigation Satellite System inputs for real-time navigation and stabilization. An Extended Kalman Filter (EKF) processes sensor data at up to 200Hz to maintain stable flight position.
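To give a concrete picture of the filtering step, the toy estimator below shows the predict/update cycle of a Kalman filter on a simple constant-velocity model at 200Hz. It is a deliberately simplified stand-in: the real EKF in the flight stack also estimates orientation and sensor biases, and the noise values here are placeholders.

```python
import numpy as np

class SimpleEKF:
    """Toy constant-velocity filter, state = [x, y, z, vx, vy, vz].
    Illustrative only -- not the estimator used in the CNS Flight Stack."""
    def __init__(self):
        self.x = np.zeros(6)        # state estimate
        self.P = np.eye(6)          # state covariance
        self.Q = np.eye(6) * 1e-3   # process noise (assumed value)
        self.R = np.eye(3) * 1e-2   # measurement noise (assumed value)

    def predict(self, accel, dt=1.0 / 200.0):
        # Propagate position with the current velocity and integrate IMU
        # acceleration into velocity (dt = 5 ms at a 200 Hz filter rate).
        F = np.eye(6)
        F[0:3, 3:6] = np.eye(3) * dt
        self.x = F @ self.x
        self.x[3:6] += accel * dt
        self.P = F @ self.P @ F.T + self.Q

    def update(self, pos_meas):
        # Correct with an absolute position measurement (e.g. GNSS or lidar-derived).
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        y = pos_meas - H @ self.x
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```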
The AI processes semantic information to identify objects and calculate precise relative positioning. “To ensure a safe and robust inspection flight, high image quality and frame rates are essential,” Jantos said.
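Relative positioning of this kind is, at its core, a chain of coordinate-frame transforms: an object pose estimated in the camera frame is mapped through the camera-to-body calibration and the drone's own pose into the navigation frame. A minimal sketch, with all calibration values treated as placeholders:

```python
import numpy as np

def object_in_world(p_obj_cam, R_body_cam, p_cam_body, R_world_body, p_body_world):
    """Map an object position from the camera frame to the world/navigation frame.
    The rotation matrices and offsets are placeholders for the real calibration
    and the drone's estimated pose."""
    # Object position expressed in the drone body frame.
    p_obj_body = R_body_cam @ p_obj_cam + p_cam_body
    # Object position expressed in the world frame.
    return R_world_body @ p_obj_body + p_body_world
```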
According to the researchers, the approach avoids typical Global Navigation Satellite System problems including multipathing and signal shadowing caused by large infrastructure or terrain features. The team uses the CNS Flight Stack software for autonomous mission execution, which includes modules for navigation, sensor fusion and control algorithms.
“In contrast to Global Navigation Satellite System-based inspection approaches using drones, our AI with its semantic information enables the inspection of the infrastructure to be inspected from certain reproducible viewpoints,” Jantos noted.
Testing takes place in the university’s dedicated drone hall using scale models of power infrastructure. The researchers said the system could improve safety, efficiency and data quality compared to conventional inspection methods that require human operators in difficult-to-access or risky areas.
The camera’s capabilities enable the inspection system’s AI-based navigation algorithm to extract semantic information from raw sensory data. This allows image pixels to be understood not just as independent color values but as parts of specific objects such as insulators.
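Concretely, treating pixels as parts of objects is what a semantic segmentation network produces: a class label per pixel rather than an independent colour value. The sketch below shows how an insulator mask could be pulled from such a model's output; the model, class list and preprocessing are assumptions, not the project's actual network.

```python
import torch

# Hypothetical class list for a segmentation model; the project's real model
# and label set are not public in this article.
CLASS_NAMES = ['background', 'pole', 'insulator']

def insulator_mask(model, image_bgr):
    """Return a boolean mask marking which pixels belong to the 'insulator' class.
    `image_bgr` is an HxWx3 uint8 NumPy array; `model` is any network that maps a
    normalized NCHW tensor to per-pixel class scores of shape (1, C, H, W)."""
    # HWC uint8 image -> normalized NCHW float tensor.
    x = torch.from_numpy(image_bgr).float().permute(2, 0, 1).unsqueeze(0) / 255.0
    with torch.no_grad():
        logits = model(x)                 # shape: (1, num_classes, H, W)
    labels = logits.argmax(dim=1)[0]      # per-pixel class index
    return (labels == CLASS_NAMES.index('insulator')).cpu().numpy()
```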
According to Imaging Development Systems, the approach enables real-time navigation and interaction adapted to the conditions and requirements of each inspection task. The modularity of the CNS Flight Stack and its Robot Operating System interfaces enables seamless integration of sensors and the AI-based state estimator for position detection.