About me

Raul Tapia received the B.Sc. degree in industrial engineering from the University of Seville, Spain, and the M.Sc. degree in robotics from the Miguel Hernández University, Spain. He is currently pursuing the Ph.D. degree in robotics with the GRVC Robotics Laboratory at the University of Seville, Spain. His research interests include computer vision, event-based vision, robot perception, robot navigation, and machine learning.

CURRICULUM VITAE

EDUCATION

2020-pres. | Ph.D. in Automation, Electronics and Telecommunication Engineering

Universidad de Sevilla, Spain

2019-2020 | M.Sc. in Robotics

Universidad Miguel Hernández, Spain

2015-2019 | B.Sc. in Industrial Technology Engineering

Universidad de Sevilla, Spain


WORK EXPERIENCE

2020-pres. | PhD candidate

GRVC Robotics Lab, Universidad de Sevilla, Spain

2019-2020 | Research assistant

GRVC Robotics Lab, Universidad de Sevilla, Spain


PUBLICATIONS

2022 | Free as a Bird: Event-based Dynamic Sense-and-Avoid for Ornithopter Robot Flight

IEEE Robotics and Automation Letters (RA-L)

Autonomous flight of flapping-wing robots is a major challenge for robot perception. Most previous sense-and-avoid works have addressed obstacle avoidance for flapping-wing robots considering only static obstacles. This paper presents a fully onboard dynamic sense-and-avoid scheme for large-scale ornithopters using event cameras. These sensors trigger pixel-level information in response to changes of illumination in the scene, such as those produced by dynamic objects. The method performs event-by-event processing on low-cost hardware such as that onboard small aerial vehicles. The proposed scheme detects obstacles and evaluates possible collisions with the robot body. The onboard controller actuates over the horizontal and vertical tail deflections to execute the avoidance maneuver. The scheme is validated in both indoor and outdoor scenarios using obstacles of different shapes and sizes. To the best of the authors' knowledge, this is the first event-based method for dynamic obstacle avoidance in a flapping-wing robot.
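
As a rough illustration of the event-by-event processing described above, the sketch below keeps a short sliding window of events, takes the centroid of the recent activity as a crude obstacle estimate, and issues an evasive tail command when that activity concentrates near the image centre. The event format, thresholds, resolution, and command interface are assumptions for illustration, not the implementation used in the paper.

# Minimal event-by-event dynamic obstacle check (illustrative only).
# An "event" is assumed to be (x, y, t, polarity); thresholds are made up.
from collections import deque

WINDOW_S = 0.05          # only consider events from the last 50 ms
IMG_W, IMG_H = 346, 260  # assumed sensor resolution

recent = deque()         # sliding window of recent events

def on_event(x, y, t, polarity):
    """Process one event; return a tail command if a collision risk is found."""
    recent.append((x, y, t))
    while recent and t - recent[0][2] > WINDOW_S:
        recent.popleft()          # drop events older than the window

    if len(recent) < 200:         # not enough activity to be an obstacle
        return None

    # Centroid of recent activity as a crude obstacle position estimate.
    cx = sum(e[0] for e in recent) / len(recent)
    cy = sum(e[1] for e in recent) / len(recent)

    # If the activity is concentrated near the image centre, assume the
    # object lies on the flight path and command an evasive pitch-up.
    if abs(cx - IMG_W / 2) < 60 and abs(cy - IMG_H / 2) < 60:
        return {"elevator_deflection_deg": 15.0, "rudder_deflection_deg": 0.0}
    return None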

2021 | UAV Human Teleoperation using Event-based and Frame-based Cameras

IEEE Aerial Robotic Systems Physically Interacting with the Environment (AIRPHARO 2021)

Teleoperation is a crucial aspect of human-robot interaction in unmanned aerial vehicle (UAV) applications. Fast perception processing is required to ensure robustness, precision, and safety. Event cameras are neuromorphic sensors that provide low-latency response, high dynamic range, and low power consumption. Although classical image-based methods have been extensively used for human-robot interaction tasks, their responsiveness is limited by the image processing rates. This paper presents a human-robot teleoperation scheme for UAVs that exploits the advantages of both traditional and event cameras. The proposed scheme was tested in teleoperation missions where the pose of a multirotor robot is controlled in real time using human gestures detected from events.
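
One way to picture the frame/event combination is a slow path that updates the semantic gesture from each image and a fast path that updates the hand position from events, turning both into a velocity setpoint. The sketch below assumes placeholder classifier and tracker callables and made-up gains; it is not the scheme's actual implementation.

# Illustrative fusion of slow frame-based gesture recognition with fast
# event-based tracking; the classifier and tracker are placeholders.
class TeleopFusion:
    def __init__(self, gesture_classifier, event_tracker):
        self.classify = gesture_classifier   # runs at frame rate (e.g. 30 Hz)
        self.track = event_tracker           # runs at event rate (kHz and above)
        self.current_gesture = "hover"

    def on_frame(self, frame):
        # Slow path: update the semantic command from the latest image.
        self.current_gesture = self.classify(frame)

    def on_events(self, events):
        # Fast path: update the hand position estimate from events and
        # turn (gesture, position) into a velocity setpoint for the UAV.
        hand_xy = self.track(events)
        if self.current_gesture == "move" and hand_xy is not None:
            vx = 0.01 * (hand_xy[0] - 173)   # gains and offsets are assumed
            vy = 0.01 * (hand_xy[1] - 130)
            return {"vx": vx, "vy": vy, "vz": 0.0}
        return {"vx": 0.0, "vy": 0.0, "vz": 0.0}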

2021 | Why Fly Blind? Event-based Visual Guidance for Ornithopter Robot Flight

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2021)

The development of perception and control methods that allow bird-scale flapping-wing robots (a.k.a. ornithopters) to fly autonomously is an under-researched area. This paper presents a fully onboard event-based method for ornithopter robot visual guidance. The method uses event cameras to exploit their fast response and robustness against motion blur in order to feed the ornithopter control loop at high rates (100 Hz). The proposed scheme visually guides the robot using line features extracted in the event image plane and controls the flight by actuating over the horizontal and vertical tail deflections. It has been validated on board a real ornithopter robot with real-time computation on low-cost hardware. The experimental evaluation includes different maneuvers performed both indoors and outdoors.
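
The guidance idea can be sketched as a 100 Hz loop that fits a dominant line to the events accumulated over each control period and converts the line's offset and tilt into tail deflections with proportional gains. The least-squares fit and the gains below are illustrative assumptions, not the controller reported in the paper.

# Illustrative line-based guidance step: fit a dominant line to the events
# accumulated over one control period (10 ms for a 100 Hz loop) and convert
# its offset/angle into tail deflections with made-up proportional gains.
import numpy as np

IMG_W = 346
K_OFFSET, K_ANGLE = 0.05, 0.8   # assumed proportional gains

def guidance_step(events_xy):
    """events_xy: (N, 2) array of pixel coordinates from the last 10 ms."""
    x, y = events_xy[:, 0], events_xy[:, 1]
    # Least-squares fit x = a*y + b (the followed line is roughly vertical
    # in the image when the ornithopter is aligned with it).
    A = np.stack([y, np.ones_like(y)], axis=1)
    a, b = np.linalg.lstsq(A, x, rcond=None)[0]

    offset = (a * y.mean() + b) - IMG_W / 2   # pixels off-centre
    angle = np.arctan(a)                      # line tilt in radians

    rudder = -K_OFFSET * offset - K_ANGLE * angle   # lateral correction
    elevator = 0.0                                  # altitude handled elsewhere
    return rudder, elevator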

2021 | The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception

IEEE Robotics and Automation Letters (RA-L)

The development of automatic perception systems and techniques for bio-inspired flapping-wing robots is severely hampered by the high technical complexity of these platforms and of the installation of onboard sensors and electronics. In addition, flapping-wing robot perception suffers from high vibration levels and abrupt movements during flight, which cause motion blur and strong changes in lighting conditions. This paper presents a perception dataset for bird-scale flapping-wing robots as a tool to help alleviate the aforementioned problems. The presented data include measurements from onboard sensors widely used in aerial robotics and suited to the perception challenges of flapping-wing robots, such as an event camera, a conventional camera, and two Inertial Measurement Units (IMUs), as well as ground truth measurements from a laser tracker or a motion capture system. A total of 21 datasets of different types of flights were collected in three different scenarios (one indoor and two outdoor). To the best of the authors' knowledge, this is the first dataset for flapping-wing robot perception.
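
When working with multi-sensor recordings like these, a common first step is aligning the streams by timestamp. The sketch below pairs each camera frame with the nearest IMU sample and ground-truth pose; the CSV file names and layouts are hypothetical and do not reflect the dataset's actual format.

# Illustrative timestamp alignment of multi-sensor streams.
import csv
import bisect

def load_timestamps(path, t_col=0):
    with open(path) as f:
        return [float(row[t_col]) for row in csv.reader(f)]

def nearest(sorted_ts, t):
    """Index of the timestamp in sorted_ts closest to t."""
    i = bisect.bisect_left(sorted_ts, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(sorted_ts)]
    return min(candidates, key=lambda j: abs(sorted_ts[j] - t))

# For each camera frame, find the closest IMU sample and ground-truth pose.
frame_ts = load_timestamps("frames.csv")       # hypothetical file
imu_ts = load_timestamps("imu.csv")            # hypothetical file
gt_ts = load_timestamps("ground_truth.csv")    # hypothetical file
pairs = [(t, nearest(imu_ts, t), nearest(gt_ts, t)) for t in frame_ts]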

2020 | Towards UAS Surveillance using Event Cameras

IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR 2020)

Aerial robot perception for surveillance and search and rescue in unstructured and complex environments poses challenging problems in which traditional sensors are severely constrained. This paper analyzes the use of event cameras onboard aerial robots for surveillance applications. Event cameras have high temporal resolution and dynamic range, which make them very robust against motion blur and challenging lighting conditions. The paper analyzes the pros and cons of event cameras and presents an event-based processing scheme for target detection and tracking. The scheme is experimentally validated in challenging environments and under different lighting conditions.
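
Event-based target detection is often built on incremental clustering: each incoming event either joins a nearby active cluster or seeds a new one, and clusters that stop receiving events are dropped. The sketch below shows that general pattern with arbitrary distance and decay parameters; it is not the detector used in the paper.

# Illustrative event clustering for moving-target detection; distances and
# decay times are arbitrary placeholders.
import math

class Cluster:
    def __init__(self, x, y, t):
        self.x, self.y, self.t, self.n = x, y, t, 1

    def add(self, x, y, t, alpha=0.1):
        self.x += alpha * (x - self.x)   # exponential moving average position
        self.y += alpha * (y - self.y)
        self.t, self.n = t, self.n + 1

clusters = []

def on_event(x, y, t, radius=20.0, max_age=0.1):
    # Drop clusters that have not received events recently.
    clusters[:] = [c for c in clusters if t - c.t < max_age]
    # Assign the event to the nearest active cluster, or start a new one.
    for c in clusters:
        if math.hypot(x - c.x, y - c.y) < radius:
            c.add(x, y, t)
            return c
    c = Cluster(x, y, t)
    clusters.append(c)
    return c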

2020 | ASAP: Adaptive Scheme for Asynchronous Processing of Event-based Vision Algorithms

IEEE International Conference on Robotics and Automation (ICRA 2020) - Workshop on Unconventional Sensors in Robotics

Event cameras can capture pixel-level illumination changes with very high temporal resolution and dynamic range. They have received increasing research interest due to their robustness to lighting conditions and motion blur. Two main approaches exist in the literature to feed event-based processing algorithms: packaging the triggered events into event packages, or sending them one by one as single events. These approaches suffer limitations from either processing overflow or lack of responsiveness. Processing overflow is caused by high event generation rates, when the algorithm cannot process all the events in real time. Conversely, lack of responsiveness happens at low event generation rates, when the event packages are sent at too low frequencies. This paper presents ASAP, an adaptive scheme that manages the event stream through variable-size packages that adapt to the event package processing times. The experimental results show that ASAP is capable of feeding an asynchronous event-by-event clustering algorithm in a responsive and efficient manner while preventing overflow.
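
The core idea can be illustrated with a loop that measures how long each package takes to process and grows or shrinks the next package accordingly, so the algorithm neither falls behind the stream nor waits too long between updates. The update rule and constants below are illustrative, not the published ASAP scheme.

# Minimal sketch of an adaptive event-packaging loop: package size grows
# when processing falls behind the event stream and shrinks when there is
# slack, keeping the downstream algorithm responsive.
import time

def run_adaptive_packaging(event_source, process, init_size=500,
                           min_size=50, max_size=50_000):
    size = init_size
    while True:
        events = event_source(size)          # blocks until `size` events arrive
        if not events:
            break
        t0 = time.perf_counter()
        process(events)                      # event-based algorithm being fed
        proc_time = time.perf_counter() - t0
        span = events[-1][2] - events[0][2]  # time covered by the package (s)

        if proc_time > span:                 # falling behind: larger packages
            size = min(int(size * 1.5), max_size)
        else:                                # slack available: be more responsive
            size = max(int(size * 0.8), min_size)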

2019 | Efficient Mosaicking for Linear Infrastructure Inspection Using Aerial Robots

XL Jornadas de Automática

This paper presents a mosaic generation method using images captured by aerial robots for linear infrastructure inspection applications. The method has been designed around the hypotheses of the problem in order to reduce its computational cost while keeping precise and robust performance. In particular, it uses the rectilinear flight hypothesis and the estimated displacement between consecutive images to select regions of interest, avoiding the need to detect and match features over the whole image; this also reduces outliers and therefore simplifies the optimization that computes the transformation between images. The method has been validated experimentally on gas pipeline inspection missions with aerial robots.
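
The region-of-interest idea can be sketched as follows: given the displacement expected from the rectilinear flight, feature detection and matching are restricted to the overlapping strips of consecutive images before estimating the homography with RANSAC. The OpenCV pipeline and parameters below are generic placeholders, not the paper's implementation.

# Illustrative ROI-based registration of consecutive images: an assumed
# vertical displacement `dy_px` limits feature detection to the overlapping
# strips, and the matches feed a RANSAC homography.
import cv2
import numpy as np

orb = cv2.ORB_create(500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def register_pair(prev_img, curr_img, dy_px):
    h = prev_img.shape[0]
    roi_prev = prev_img[dy_px:, :]        # strip of the previous image still in view
    roi_curr = curr_img[:h - dy_px, :]    # corresponding strip of the current image

    kp1, d1 = orb.detectAndCompute(roi_prev, None)
    kp2, d2 = orb.detectAndCompute(roi_curr, None)
    matches = matcher.match(d1, d2)

    # Shift keypoints back to full-image coordinates before the homography.
    src = np.float32([(kp1[m.queryIdx].pt[0], kp1[m.queryIdx].pt[1] + dy_px)
                      for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H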