About me

Raul Tapia received the B.Sc. degree in industrial engineering from the University of Seville, Spain, and the M.Sc. degree in robotics from the Miguel Hernández University, Spain. He is currently pursuing the Ph.D. degree in robotics with the GRVC Robotics Laboratory at the University of Seville, Spain. His research interests include computer vision, event-based vision, robot perception, robot navigation, and machine learning.

CURRICULUM VITAE

RESEARCH EXPERIENCE

2020-pres. | PhD Candidate

GRVC Robotics Lab, Universidad de Sevilla, Spain

2019-2020 | Research Assistant

GRVC Robotics Lab, Universidad de Sevilla, Spain


EDUCATION

2020-pres. | Doctorado en Ingeniería Automática, Electrónica y de Telecomunicación (PhD)

Universidad de Sevilla, Spain

2019-2020 | Máster Universitario en Robótica (MSc)

Universidad Miguel Hernández, Spain

2015-2019 | Grado en Ingeniería de las Tecnologías Industriales (BSc)

Universidad de Sevilla, Spain


CONFERENCES

2024 | Rescaling of a Flapping-wing Aerial Vehicle for Flights in Confined Spaces

Jornadas de Automática 2024

🌎 Málaga, Spain

2023.10 | Leader-Follower Formation Control of a Large-Scale Swarm of Satellite System Using the State-Dependent Riccati Equation: Orbit-to-Orbit and In-Same-Orbit Regulation

IEEE/RSJ 2023 International Conference on Intelligent Robots and Systems

🌎 Detroit, USA

2023.10 | A Comparison between Frame-based and Event-based Cameras for Flapping-Wing Robot Perception

IEEE/RSJ 2023 International Conference on Intelligent Robots and Systems

🌎 Detroit, USA

2021.09 | Why Fly Blind? Event-Based Visual Guidance for Ornithopter Robot Flight

IEEE/RSJ 2021 International Conference on Intelligent Robots and Systems

🌍 Prague, Czech Republic


OTHERS

2023.10 | Event-based Vision for Ornithopter Perception and Autonomy

Workshop on Learning Robot Super Autonomy @ IEEE/RSJ IROS 2023

2023.09 | A Comparison between Frame-based and Event-based Cameras for Energy Efficient Flapping-Wing Robots

Workshop on Energy-efficient Flapping-wing Robots

2023 | euROBIN Week 2023 Organizing Committee

euROBIN. The European Excellence Network on AI-Powered Robotics

2022.11 | SSRR 2022 Local Arrangement Committee

IEEE International Symposium on Safety, Security, and Rescue Robotics 2022

2021.10 | Heterogeneous Robot Collaboration Hands-on

DroneDays 2021


PUBLICATIONS

2024  |  Rescaling of a Flapping-wing Aerial Vehicle for Flights in Confined Spaces

Jornadas de Automática 2024

PDF
POSTER
CITE

This paper presents the rescaling of a flapping-wing aerial robot. Our objective is to design a platform that enables autonomous flights in indoor and outdoor confined areas. A previous model has been rescaled using more lightweight parts. The aerodynamic design includes a new airfoil (S1221) that improves efficiency. In addition, significant modifications have been performed in the mechanical and electronic designs to reduce the weight by using more lightweight materials and smaller components. The preliminary results suggest our prototype fulfills the weight and wing-loading constraints, providing high maneuverability.

@inproceedings{coca2024rescaling,
  author={Coca, S. and Crassous, P. and Sanchez-Laulhe, E. and Tapia, R. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={Jornadas de Automática},
  title={Rescaling of a Flapping-wing Aerial Vehicle for Flights in Confined Spaces},
  year={2024},
  pages={1-4},
  doi={10.17979/ja-cea.2024.45.10914}
}

2024  |  eFFT: An Event-based Method for the Efficient Computation of Exact Fourier Transforms

IEEE Transactions on Pattern Analysis and Machine Intelligence

PDF
CODE
CITE

We introduce eFFT, an efficient method for the calculation of the exact Fourier transform of an asynchronous event stream. It is based on keeping the matrices involved in the Radix-2 FFT algorithm in a tree data structure and updating them with the new events, extensively reusing computations, and avoiding unnecessary calculations while preserving exactness. eFFT can operate event-by-event, requiring for each event only a partial recalculation of the tree since most of the stored data are reused. It can also operate with event packets, using the tree structure to detect and avoid unnecessary and repeated calculations when integrating the different events within each packet to further reduce the number of operations. eFFT has been extensively evaluated with public datasets and experiments, validating its exactness, low processing time, and feasibility for online execution on resource-constrained hardware. We release a C++ implementation of eFFT to the community.
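
To illustrate the event-by-event principle behind eFFT with a much simpler (non-tree) formulation: when a single pixel changes, the 2D DFT admits a closed-form correction instead of a full recomputation. The NumPy sketch below shows only this underlying idea and its O(N²)-per-event cost; eFFT avoids most of that work by reusing partial results stored in its Radix-2 FFT tree, and the function name here is illustrative, not taken from the released C++ code.

import numpy as np

def apply_event(F, x, y, delta, n):
    # Correct the n-by-n 2D DFT F after pixel (x, y) changes by delta.
    # Naive O(n^2) per-event update, shown for illustration only.
    k = np.arange(n).reshape(-1, 1)   # row frequencies
    l = np.arange(n).reshape(1, -1)   # column frequencies
    F += delta * np.exp(-2j * np.pi * (k * x + l * y) / n)
    return F

# Sanity check against a full recomputation.
n = 64
img = np.random.rand(n, n)
F = np.fft.fft2(img)
img[5, 7] += 1.0                      # one "event" at pixel (5, 7)
F = apply_event(F, 5, 7, 1.0, n)
assert np.allclose(F, np.fft.fft2(img))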

@article{tapia2024efft,
  author={Tapia, R. and Martínez-de Dios, J. R. and Ollero, A.},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  title={{eFFT}: An Event-based Method for the Efficient Computation of Exact {Fourier} Transforms},
  year={2024},
  volume={0},
  number={0},
  pages={0000-0000},
  doi={10.1109/TPAMI.2024.3422209}
}

2023  |  Leader-Follower Formation Control of a Large-Scale Swarm of Satellite System Using the State-Dependent Riccati Equation: Orbit-to-Orbit and In-Same-Orbit Regulation

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023)

PDF
CITE

The state-dependent Riccati equation (SDRE) is a nonlinear optimal controller whose flexible structure is one of its main advantages. In this work, that flexibility is used to present a novel design for handling a soft constraint on the state variables (trajectories). The concept is applied to a large-scale swarm control system with more than 1000 agents. The control of the swarm satellite system addresses two modes: orbit-to-orbit and in-same-orbit regulation. Keeping the satellites in one orbit during regulation (point-to-point motion) requires additional constraints while they move in Cartesian coordinates. For a small number of agents, a trajectory could be designed for each satellite individually; for a swarm with many agents, that is not practical. The constraint has been incorporated into the cost function of the optimal control problem, resulting in a modified SDRE control law. The proposed method successfully controlled a swarm of 1024 agents in leader-follower mode in orbit-to-orbit and in-same-orbit simulations. The soft constraint kept the satellites' error to 0.05% of the travel distance in in-same-orbit regulation. The presented approach is systematic and could be applied to larger swarm systems with different agents and dynamics.
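
For reference, the standard SDRE construction the paper builds on: the nonlinear dynamics are factored into state-dependent coefficient form, and an algebraic Riccati equation is solved pointwise along the trajectory. The soft constraint described above enters through the cost weighting; the formulation below is the textbook one, not the paper's specific constraint design.

\dot{\mathbf{x}} = A(\mathbf{x})\,\mathbf{x} + B(\mathbf{x})\,\mathbf{u}, \qquad
J = \tfrac{1}{2}\int_0^\infty \left( \mathbf{x}^\top Q(\mathbf{x})\,\mathbf{x} + \mathbf{u}^\top R(\mathbf{x})\,\mathbf{u} \right) \mathrm{d}t

A^\top(\mathbf{x})\,P + P\,A(\mathbf{x}) - P\,B(\mathbf{x})\,R^{-1}(\mathbf{x})\,B^\top(\mathbf{x})\,P + Q(\mathbf{x}) = 0, \qquad
\mathbf{u} = -R^{-1}(\mathbf{x})\,B^\top(\mathbf{x})\,P(\mathbf{x})\,\mathbf{x}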

@inproceedings{nekoo2023leader,
  author={Nekoo, S. R. and Yao, J. and Suarez, A. and Tapia, R. and Ollero, A.},
  booktitle={2023 IEEE/RSJ International Conference on Intelligent Robots and Systems},
  title={Leader-Follower Formation Control of a Large-Scale Swarm of Satellite System Using the State-Dependent Riccati Equation: Orbit-to-Orbit and In-Same-Orbit Regulation},
  year={2023},
  pages={10700-10707},
  doi={10.1109/IROS55552.2023.10342383}
}

2023  |  A Comparison between Frame-based and Event-based Cameras for Flapping-Wing Robot Perception

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023)

PDF
PREPRINT
POSTER
CITE

Perception systems for ornithopters face severe challenges. The harsh vibrations and abrupt movements caused by flapping are prone to produce motion blur and strong lighting condition changes. Strict restrictions in weight, size, and energy consumption also limit the type and number of sensors mounted onboard. Lightweight traditional cameras have become a standard off-the-shelf solution in many flapping-wing designs. However, bioinspired event cameras are a promising solution for ornithopter perception due to their microsecond temporal resolution, high dynamic range, and low power consumption. This paper presents an experimental comparison between frame-based and event-based cameras. Both technologies are analyzed considering the particular specifications of flapping-wing robots and by experimentally evaluating the performance of well-known vision algorithms on data recorded onboard a flapping-wing robot. Our results suggest event cameras are the most suitable sensors for ornithopters. Nevertheless, they also evidence the open challenges for event-based vision onboard flapping-wing robots.

@inproceedings{tapia2023comparison,
  author={Tapia, R. and Rodríguez-Gómez, J. P. and Sanchez-Diaz, J. A. and Gañán, F. J. and Rodríguez, I. G. and Luna-Santamaria, J. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2023 IEEE/RSJ International Conference on Intelligent Robots and Systems},
  title={A Comparison between Frame-based and Event-based Cameras for Flapping-Wing Robot Perception},
  year={2023},
  pages={3025-3032},
  doi={10.1109/IROS55552.2023.10342500}
}

2023  |  Experimental Energy Consumption Analysis of a Flapping-Wing Robot

IEEE International Conference on Robotics and Automation (ICRA 2023) - Workshop on Energy Efficient Aerial Robotic Systems

PREPRINT
CITE

One of the motivations for exploring flapping-wing aerial robotic systems is to seek energy reduction while maintaining manoeuvrability, compared to conventional unmanned aerial systems. A Flapping Wing Flying Robot (FWFR) can glide in favourable wind conditions, decreasing energy consumption significantly. In addition, it is necessary to investigate the power consumption of the components of the flapping-wing robot. In this work, two sets of FWFR components are analyzed in terms of power consumption: a) motor/electronics components and b) a vision system for monitoring the environment during flight. A measurement device is used to record the power utilization of the motors in the launching and ascending phases of the flight, and also in cruising flight around the desired height. Additionally, an analysis of event cameras and stereo vision systems in terms of energy consumption has been performed. The results provide a first step towards decreasing battery usage and, consequently, providing additional flight time.

@inproceedings{tapia2023experimental,
  author={Tapia, R. and Satue, A. C. and Nekoo, S. R. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2023 IEEE International Conference on Robotics and Automation. Workshop on Energy Efficient Aerial Robotic Systems},
  title={Experimental Energy Consumption Analysis of a Flapping-Wing Robot},
  year={2023},
  pages={1-4},
  doi={10.48550/arXiv.2306.00848}
}

2023  |  A 94.1 g Scissors-type Dual-arm Cooperative Manipulator for Plant Sampling by an Ornithopter using a Vision Detection System

Robotica

PDF
CITE

The sampling and monitoring of nature have become an important subject due to the rapid loss of green areas. This work proposes a method for sampling leaves using an ornithopter robot equipped with an onboard 94.1 g dual-arm cooperative manipulator. One arm of the robot carries scissors and the other a gripper, performing the collection roughly as human fingers would. In the move toward autonomy, a stereo camera has been added to the ornithopter to provide visual feedback on the stem, reporting the positions for cutting and grasping. The position of the stem is detected by a stereo vision processing system, and the inverse kinematics of the dual arm commands both gripper and scissors to the right positions. The trajectories are smooth and avoid any damage to the actuators. The real-time execution of the vision algorithm takes place on the lightweight main processor of the ornithopter, which sends the estimated stem localization to a microcontroller board that controls the arms. Experimental results both indoors and outdoors confirmed the feasibility of this sampling method. The dual-arm manipulator operates after the system has perched on a stem; perching has been presented in previous works, and here we focus on the sampling procedure and the vision/manipulator design. The flight experimentation also confirms that the weight of the dual-arm system is suitable for installation on the flapping-wing flying robot.

@article{nekoo2023scissors,
  author={Nekoo, S. R. and Feliu-Talegon, D. and Tapia, R. and Satue, A. C. and Martínez-de Dios, J. R. and Ollero, A.},
  journal={Robotica},
  title={A 94.1 g Scissors-type Dual-arm Cooperative Manipulator for Plant Sampling by an Ornithopter using a Vision Detection System},
  year={2023},
  volume={41},
  number={10},
  pages={3022-3039},
  doi={10.1017/S0263574723000851}
}

2022  |  Efficient Event-based Intrusion Monitoring using Probabilistic Distributions

IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR 2022)

PDF
CITE

Autonomous intrusion monitoring in unstructured complex scenarios using aerial robots requires perception systems capable of dealing with problems such as motion blur or changing lighting conditions, among others. Event cameras are neuromorphic sensors that capture per-pixel illumination changes, providing low latency and high dynamic range. This paper presents an efficient event-based processing scheme for intrusion detection and tracking onboard strictly resource-constrained robots. The method tracks moving objects using a probabilistic distribution that is updated event by event, but the processing of each event involves only a few low-cost operations, enabling online execution on resource-constrained onboard computers. The method has been experimentally validated in several real scenarios under different lighting conditions, evidencing its accurate performance.
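
The abstract does not spell out the distribution update, but the per-event idea can be sketched as an exponentially weighted 2D Gaussian over the intruder's image position, costing a handful of multiply-adds per event. All names and constants below are illustrative assumptions, not the authors' implementation.

import numpy as np

class EventGaussianTracker:
    # Track a moving object as a 2D Gaussian updated once per event.
    def __init__(self, mean, alpha=0.01):
        self.mean = np.asarray(mean, dtype=float)  # position estimate (px)
        self.cov = 25.0 * np.eye(2)                # position uncertainty (px^2)
        self.alpha = alpha                         # forgetting factor

    def update(self, x, y):
        d = np.array([x, y], dtype=float) - self.mean
        # Gate events far from the current estimate (likely background).
        if d @ np.linalg.solve(self.cov, d) > 9.0:  # ~3-sigma gate
            return
        self.mean += self.alpha * d
        self.cov = (1.0 - self.alpha) * self.cov + self.alpha * np.outer(d, d)

tracker = EventGaussianTracker(mean=[120.0, 80.0])
for ex, ey in [(122, 81), (119, 79), (121, 83)]:    # toy event stream
    tracker.update(ex, ey)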

@inproceedings{ganan2022autonomous,
  author={Gañán, F. J. and Sanchez-Diaz, J. A. and Tapia, R. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2022 IEEE International Symposium on Safety, Security, and Rescue Robotics},
  title={Efficient Event-based Intrusion Monitoring using Probabilistic Distributions},
  year={2022},
  pages={211-216},
  doi={10.1109/SSRR56537.2022.10018655}
}

2022  |  Scene Recognition for Urban Search and Rescue using Global Description and Semi-Supervised Labelling

IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR 2022)

PDF
CITE

Autonomous aerial robots for urban search and rescue (USAR) operations require robust perception systems for localization and mapping. Although local feature description is widely used for geometric map construction, global image descriptors leverage scene information to perform semantic localization, allowing topological maps to consider relations between places and elements in the scenario. This paper proposes a scene recognition method for USAR operations using a collaborative human-robot approach. The proposed method uses global image description to train an SVM-based classification model with semi-supervised labeled data. It has been experimentally validated in several indoor scenarios on board a multirotor robot.
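
As a rough illustration of the pipeline (global image description followed by SVM classification), the sketch below uses a joint color histogram as a stand-in global descriptor and scikit-learn's SVC on toy data; the paper's actual descriptor and its semi-supervised labelling procedure are not reproduced here, so every concrete choice is an assumption.

import numpy as np
from sklearn.svm import SVC

def global_descriptor(img, bins=8):
    # Stand-in global descriptor: a normalized joint color histogram.
    hist, _ = np.histogramdd(img.reshape(-1, 3), bins=(bins,) * 3,
                             range=((0, 256),) * 3)
    return hist.ravel() / hist.sum()

# Toy data: random "images" standing in for two scene classes.
rng = np.random.default_rng(0)
X = np.stack([global_descriptor(rng.integers(0, 256, (64, 64, 3)))
              for _ in range(40)])
y = np.array([0] * 20 + [1] * 20)    # e.g., corridor vs. room

clf = SVC(kernel="rbf", probability=True).fit(X, y)
print(clf.predict_proba(X[:1]))      # class posterior for one image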

@inproceedings{sanchez2022scene,
  author={Sanchez-Diaz, J. A. and Gañán, F. J. and Tapia, R. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2022 IEEE International Symposium on Safety, Security, and Rescue Robotics},
  title={Scene Recognition for Urban Search and Rescue using Global Description and Semi-Supervised Labelling},
  year={2022},
  pages={238-243},
  doi={10.1109/SSRR56537.2022.10018660}
}

2022  |  Aerial Manipulation System for Safe Human-Robot Handover in Power Line Maintenance

Robotics: Science and Systems (RSS 2022) - Workshop in Close Proximity Human-Robot Collaboration

PDF
CITE

Human workers conducting inspection and maintenance (I&M) operations on high-altitude infrastructure like power lines or industrial facilities face significant difficulties getting tools or devices once they are deployed in such workspaces. Aerial manipulation robots can be employed to quickly deliver objects to the operator, with long-reach configurations improving safety and the operator's feeling of comfort during the handover. This paper presents a dual-arm aerial manipulation robot in cable-suspended configuration intended to conduct fast and safe aerial delivery, following a human-centered approach that relies on an onboard perception system with which the aerial robot accommodates its pose to the worker. Preliminary experimental results in an indoor testbed validate the proposed system design.

@inproceedings{ganan2022aerial,
  author={Gañán, F. J. and Suarez, A. and Tapia, R. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2022 Robotics: Science and Systems. Workshop in Close Proximity Human-Robot Collaboration},
  title={Aerial Manipulation System for Safe Human-Robot Handover in Power Line Maintenance},
  year={2022},
  pages={1-4},
  doi={10.5281/zenodo.7153329}
}

2022  |  ASAP: Adaptive Transmission Scheme for Online Processing of Event-Based Algorithms

Autonomous Robots

PDF
PREPRINT
CODE
CITE

Online event-based perception techniques on board robots navigating complex, unstructured, and dynamic environments can suffer unpredictable changes in the incoming event rates and their processing times, which can cause computational overflow or loss of responsiveness. This paper presents ASAP: a novel event handling framework that dynamically adapts the transmission of events to the processing algorithm, maintaining system responsiveness and preventing overflows. ASAP is composed of two adaptive mechanisms. The first prevents event processing overflows by discarding an adaptive percentage of the incoming events. The second dynamically adapts the size of the event packages to reduce the delay between event generation and processing. ASAP has guaranteed convergence and is agnostic to the processing algorithm. It has been validated on board a quadrotor and an ornithopter robot in challenging conditions.
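
The two mechanisms can be caricatured in a few lines of feedback control: drop a varying fraction of events when the backlog grows, and resize packages to track a target processing delay. This is only a sketch under assumed names, gains, and thresholds; ASAP's actual adaptation laws (the ones with convergence guarantees) are in the paper and the released code.

import collections
import time

class AdaptiveFeeder:
    # Feed an event-processing callback with adaptively sized packages.
    def __init__(self, process, target_delay=0.005):
        self.process = process      # user-supplied event consumer
        self.target = target_delay  # desired per-package processing time (s)
        self.pkg_size = 1000        # events per package (adapted online)
        self.drop_frac = 0.0        # fraction of incoming events discarded
        self.buf = collections.deque()

    def push(self, events):
        keep = int(len(events) * (1.0 - self.drop_frac))
        self.buf.extend(events[:keep])
        # Raise the drop fraction when the backlog grows; relax it otherwise.
        backlog = len(self.buf) / max(self.pkg_size, 1)
        self.drop_frac = min(0.9, max(0.0, self.drop_frac + 0.05 * (backlog - 2.0)))

    def spin_once(self):
        pkg = [self.buf.popleft() for _ in range(min(self.pkg_size, len(self.buf)))]
        if not pkg:
            return
        t0 = time.perf_counter()
        self.process(pkg)
        dt = time.perf_counter() - t0
        # Shrink packages when processing is slower than the target; grow otherwise.
        self.pkg_size = max(1, min(100_000, int(self.pkg_size * self.target / max(dt, 1e-9))))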

@article{tapia2022asap,
  author={Tapia, R. and Martínez-de Dios, J. R. and Gómez Eguíluz, A. and Ollero, A.},
  journal={Autonomous Robots},
  title={{ASAP}: Adaptive Transmission Scheme for Online Processing of Event-Based Algorithms},
  year={2022},
  volume={46},
  number={8},
  pages={879-892},
  doi={10.1007/s10514-022-10051-y}
}

2022  |  Free as a Bird: Event-Based Dynamic Sense-and-Avoid for Ornithopter Robot Flight

IEEE Robotics and Automation Letters

PDF
CITE

Autonomous flight of flapping-wing robots is a major challenge for robot perception. Most previous sense-and-avoid works have studied the problem of obstacle avoidance for flapping-wing robots considering only static obstacles. This paper presents a fully onboard dynamic sense-and-avoid scheme for large-scale ornithopters using event cameras. These sensors trigger pixel information in response to changes of illumination in the scene, such as those produced by dynamic objects. The method performs event-by-event processing on low-cost hardware such as that onboard small aerial vehicles. The proposed scheme detects obstacles and evaluates possible collisions with the robot body. The onboard controller actuates over the horizontal and vertical tail deflections to execute the avoidance maneuver. The scheme is validated in both indoor and outdoor scenarios using obstacles of different shapes and sizes. To the best of the authors' knowledge, this is the first event-based method for dynamic obstacle avoidance in a flapping-wing robot.

@article{rodriguez2022free,
  author={Rodríguez-Gómez, J. P. and Tapia, R. and Guzmán Garcia, M. M. and Martínez-de Dios, J. R. and Ollero, A.},
  journal={IEEE Robotics and Automation Letters},
  title={Free as a Bird: Event-Based Dynamic Sense-and-Avoid for Ornithopter Robot Flight},
  year={2022},
  volume={7},
  number={2},
  pages={5413-5420},
  doi={10.1109/LRA.2022.3153904}
}

2021  |  UAV Human Teleoperation using Event-Based and Frame-Based Cameras

IEEE Aerial Robotic Systems Physically Interacting with the Environment (AIRPHARO 2021)

PDF
CITE

Teleoperation is a crucial aspect of human-robot interaction in unmanned aerial vehicle (UAV) applications. Fast perception processing is required to ensure robustness, precision, and safety. Event cameras are neuromorphic sensors that provide low-latency response, high dynamic range, and low power consumption. Although classical image-based methods have been extensively used for human-robot interaction tasks, their responsiveness is limited by their processing rates. This paper presents a human-robot teleoperation scheme for UAVs that exploits the advantages of both traditional and event cameras. The proposed scheme was tested in teleoperation missions where the pose of a multirotor robot is controlled in real time using human gestures detected from events.

@inproceedings{rodriguez2021uav,
  author={Rodríguez-Gómez, J. P. and Tapia, R. and Gómez Eguíluz, A. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2021 Aerial Robotic Systems Physically Interacting with the Environment},
  title={{UAV} Human Teleoperation using Event-Based and Frame-Based Cameras},
  year={2021},
  pages={1-5},
  doi={10.1109/AIRPHARO52252.2021.9571049}
}

2021  |  Why Fly Blind? Event-Based Visual Guidance for Ornithopter Robot Flight

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021)

PDF
CITE

The development of perception and control methods that allow bird-scale flapping-wing robots (a.k.a. ornithopters) to perform autonomously is an under-researched area. This paper presents a fully onboard event-based method for ornithopter robot visual guidance. The method uses event cameras to exploit their fast response and robustness against motion blur in order to feed the ornithopter control loop at high rates (100 Hz). The proposed scheme visually guides the robot using line features extracted in the event image plane and controls the flight by actuating over the horizontal and vertical tail deflections. It has been validated on board a real ornithopter robot with real-time computation in low-cost hardware. The experimental evaluation includes sets of experiments with different maneuvers indoors and outdoors.
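
A heavily simplified version of the perception step might look as follows: accumulate a short window of events into a binary image, extract line segments, and turn the dominant line's offset from the image center into a control error for the tail. The detector, resolution, and gain below are assumptions for illustration, not the method flown on the robot.

import numpy as np
import cv2

def guidance_error(events, shape=(260, 346)):
    # events: iterable of (x, y) pixel coordinates from a short time window.
    # Returns the signed horizontal offset (px) of the dominant line from
    # the image center, or None if no line is found.
    img = np.zeros(shape, dtype=np.uint8)
    for x, y in events:
        img[y, x] = 255
    lines = cv2.HoughLinesP(img, rho=1, theta=np.pi / 180, threshold=30,
                            minLineLength=40, maxLineGap=10)
    if lines is None:
        return None
    # Keep the longest segment as the dominant line feature.
    x1, y1, x2, y2 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    return 0.5 * (x1 + x2) - shape[1] / 2.0

# err = guidance_error(window); tail_cmd = -k_p * err  (proportional guidance)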

@inproceedings{gomez2021why,
  author={Gómez Eguíluz, A. and Rodríguez-Gómez, J. P. and Tapia, R. and Maldonado, F. J. and Acosta, J. A. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems},
  title={Why Fly Blind? {E}vent-Based Visual Guidance for Ornithopter Robot Flight},
  year={2021},
  pages={1958-1965},
  doi={10.1109/IROS51168.2021.9636315}
}

2021  |  The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception

IEEE Robotics and Automation Letters

PDF
DATASET
CITE

The development of automatic perception systems and techniques for bio-inspired flapping-wing robots is severely hampered by the high technical complexity of these platforms and the installation of onboard sensors and electronics. Besides, flapping-wing robot perception suffers from high vibration levels and abrupt movements during flight, which cause motion blur and strong changes in lighting conditions. This paper presents a perception dataset for bird-scale flapping-wing robots as a tool to help alleviate the aforementioned problems. The presented data include measurements from onboard sensors widely used in aerial robotics and suitable to deal with the perception challenges of flapping-wing robots, such as an event camera, a conventional camera, and two Inertial Measurement Units (IMUs), as well as ground truth measurements from a laser tracker or a motion capture system. A total of 21 datasets of different types of flights were collected in three different scenarios (one indoor and two outdoor). To the best of the authors' knowledge this is the first dataset for flapping-wing robot perception.
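
A minimal sketch of consuming the dataset, assuming the bags are standard ROS1 rosbag files using the RPG DVS driver message types; the file and topic names below are hypothetical, so check each bag with rosbag info first.

import rosbag  # ROS1 Python API

with rosbag.Bag("indoor_flight_01.bag") as bag:   # hypothetical file name
    for topic, msg, t in bag.read_messages(topics=["/dvs/events", "/dvs/imu"]):
        if topic == "/dvs/events":
            for e in msg.events:                  # dvs_msgs/EventArray
                x, y, ts, pol = e.x, e.y, e.ts, e.polarity
        else:                                     # sensor_msgs/Imu
            acc, gyr = msg.linear_acceleration, msg.angular_velocity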

@article{rodriguez2021griffin,
  author={Rodríguez-Gómez, J. P. and Tapia, R. and Paneque, J. L. and Grau, P. and Gómez Eguíluz, A. and Martínez-de Dios, J. R. and Ollero, A.},
  journal={IEEE Robotics and Automation Letters},
  title={The {GRIFFIN} Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception},
  year={2021},
  volume={6},
  number={2},
  pages={1066-1073},
  doi={10.1109/LRA.2021.3056348}
}

2020  |  Towards UAS Surveillance using Event Cameras

IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR 2020)

PDF
CITE

Aerial robot perception for surveillance and search and rescue in unstructured and complex environments poses challenging problems in which traditional sensors are severely constrained. This paper analyzes the use of event cameras onboard aerial robots for surveillance applications. Event cameras have high temporal resolution and dynamic range, which make them very robust against motion blur and lighting conditions. The paper analyzes the pros and cons of event cameras and presents an event-based processing scheme for target detection and tracking. The scheme is experimentally validated in challenging environments and different lighting conditions.

@inproceedings{martinez2020towards,
  author={Martínez-de Dios, J. R. and Gómez Eguíluz, A. and Rodríguez-Gómez, J. P. and Tapia, R. and Ollero, A.},
  booktitle={2020 IEEE International Symposium on Safety, Security, and Rescue Robotics},
  title={Towards {UAS} Surveillance using Event Cameras},
  year={2020},
  pages={71-76},
  doi={10.1109/SSRR50563.2020.9292606}
}

2020  |  ASAP: Adaptive Scheme for Asynchronous Processing of Event-Based Vision Algorithms

IEEE International Conference on Robotics and Automation (ICRA 2020) - Workshop on Unconventional Sensors in Robotics

PDF
PREPRINT
CODE
CITE

Event cameras can capture pixel-level illumination changes with very high temporal resolution and dynamic range. They have received increasing research interest due to their robustness to lighting conditions and motion blur. Two main approaches exist in the literature to feed event-based processing algorithms: packaging the triggered events into event packages, or sending them one by one as single events. These approaches suffer limitations from either processing overflow or lack of responsivity. Processing overflow is caused by high event generation rates when the algorithm cannot process all the events in real time. Conversely, lack of responsivity happens in cases of low event generation rates when the event packages are sent at too low frequencies. This paper presents ASAP, an adaptive scheme to manage the event stream through variable-size packages that accommodate the event package processing times. The experimental results show that ASAP is capable of feeding an asynchronous event-by-event clustering algorithm in a responsive and efficient manner while at the same time preventing overflow.

@inproceedings{tapia2020asap,
  author={Tapia, R. and Gómez Eguíluz, A. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={2020 IEEE International Conference on Robotics and Automation. Workshop on Unconventional Sensors in Robotics},
  title={{ASAP}: Adaptive Scheme for Asynchronous Processing of Event-Based Vision Algorithms},
  year={2020},
  pages={1-3},
  doi={10.5281/zenodo.3855412}
}

2019  |  Efficient Mosaicking for Linear Infrastructure Inspection using Aerial Robots

Jornadas de Automática 2019

PDF
POSTER
CITE

This paper presents a mosaic generation method using images captured by aerial robots for linear infrastructure inspection applications. The method has been designed around the problem hypotheses in order to reduce its computational cost while keeping precise and robust performance. In particular, it uses the rectilinear flight hypothesis and the estimation of the displacement between consecutive images to select regions of interest, avoiding detecting and matching features over the whole image; this also reduces outliers and therefore simplifies the optimization for calculating the transformation between images. The method has been validated experimentally on gas pipeline inspection missions with aerial robots.
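
The region-of-interest idea can be sketched with standard OpenCV building blocks: predict the overlap band from the estimated inter-frame displacement, detect and match features only there, and estimate the transform from the resulting (mostly inlier) matches. The ORB detector, the affine model, and all names below are assumptions for illustration, not the paper's exact design.

import numpy as np
import cv2

def roi_registration(img_prev, img_next, dx_est, margin=60):
    # Register consecutive frames by matching features only inside the
    # overlap band predicted by the rectilinear-flight hypothesis.
    w = img_prev.shape[1]
    x0 = max(0, int(dx_est) - margin)
    roi_prev = img_prev[:, x0:]          # right band of the previous image
    roi_next = img_next[:, :w - x0]      # left band of the next image

    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(roi_prev, None)
    kp2, des2 = orb.detectAndCompute(roi_next, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

    src = np.float32([kp1[m.queryIdx].pt for m in matches]) + [x0, 0]
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Few outliers are expected inside the band, simplifying this step.
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return M   # 2x3 transform mapping img_prev coordinates into img_next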

@inproceedings{tapia2019efficient,
  author={Tapia, R. and Martínez-de Dios, J. R. and Ollero, A.},
  booktitle={Jornadas de Automática},
  title={Efficient Mosaicking for Linear Infrastructure Inspection using Aerial Robots},
  year={2019},
  pages={802-809},
  doi={10.17979/spudc.9788497497169.802}
}


VIDEOS

2023 | A Comparison between Frame-based and Event-based Cameras for Flapping-Wing Robot Perception

2023 IEEE/RSJ International Conference on Intelligent Robots and Systems


2023 | A 94.1 g Scissors-type Dual-arm Cooperative Manipulator for Plant Sampling by an Ornithopter using a Vision Detection System

Robotica


2022 | Aerial Manipulation System for Safe Human-Robot Handover in Power Line Maintenance

2022 Robotics: Science and Systems - Workshop in Close Proximity Human-Robot Collaboration


2022 | Free as a Bird: Event-Based Dynamic Sense-and-Avoid for Ornithopter Robot Flight

IEEE Robotics and Automation Letters


2021 | Why Fly Blind? Event-Based Visual Guidance for Ornithopter Robot Flight

2021 IEEE/RSJ International Conference on Intelligent Robots and Systems


2021 | The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception

IEEE Robotics and Automation Letters