In late 2021, an autonomous drone delivered a defibrillator to the site of a cardiac arrest in Trollhättan, Sweden. Arriving within three minutes of emergency services being alerted, this rapid action saved the life of the 71-year-old man, who had collapsed while shovelling snow on his driveway.
The incident showed just how far drones have come in essential roles such as emergency medicine, offering the potential to reach a scene much faster than an ambulance.
That’s crucial in a sudden cardiac arrest, where speed is of the essence: for every minute after someone’s heart stops, their chance of survival is thought to drop by 10%. Cardiac arrest is a major global killer, and when it occurs outside hospital, survival rates are perhaps 10% or less.
And there’s significant room to develop autonomous drones further for such purposes. The HORYZN initiative, a student team founded in 2019 at the Technical University of Munich in Germany, is aiming to develop an uncrewed aerial vehicle (UAV) that can deliver defibrillators.
Johannes Werner, business lead on HORYZN, said a key difference from some other initiatives is that the team is designing its drones around the defibrillator to create the ideal vehicle for this particular use, rather than bolting defibrillators onto existing commercial drones.
Johannes Werner, responsible for business development in the HORYZN initiative: “Studies in recent years have shown it’s beneficial to deliver defibrillators and other medical supplies with drones. Once the regulations allow such drones, it’s exciting because you can use them for a lot of stuff.” – Read the interview with Johannes Werner
The idea is that, after the drone is given the coordinates of the patient’s location, it flies to the scene, lowers the defibrillator on a winch and flies back. Before an ambulance arrives, bystanders or someone with a medical background, potentially alerted by a smartphone-based system, can then administer treatment.
The HORYZN team believes drones flying at up to 125 kilometres per hour could slash the average arrival time at a scene to four or five minutes, compared with around 10 minutes by ambulance in Germany, tripling the survival rate. “This is theoretical and, of course, it’s hard to prove, but the numbers make sense,” said Werner.
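As a rough illustration of why faster arrival matters so much, the oft-cited 10%-per-minute decline can be applied as a compounding decay (a simplified model for illustration only, not HORYZN’s own calculation; the exact multiple depends heavily on which decay model is assumed):

```python
# Simplified illustration (not HORYZN's model): relative survival benefit of a
# faster defibrillator arrival, assuming survival chances fall by a fixed
# fraction for every minute the heart is stopped.

def survival_fraction(minutes: float, decline_per_min: float = 0.10) -> float:
    """Fraction of the at-collapse survival chance remaining after `minutes`."""
    return (1.0 - decline_per_min) ** minutes

ambulance = survival_fraction(10.0)   # ~10-minute average ambulance response
drone = survival_fraction(4.5)        # ~4-5-minute drone response

print(f"ambulance: {ambulance:.2f} of baseline survival chance")
print(f"drone:     {drone:.2f} of baseline survival chance")
print(f"relative improvement: {drone / ambulance:.1f}x")
```

Under this particular compounding model the 5-to-6-minute head start roughly doubles the survival chance; other published models of the per-minute decline yield larger multiples, which is consistent with Werner’s caveat that the tripling figure is theoretical.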
Having demonstrated a first prototype of the drone in December 2021, the team is now developing a second and hopes to fly a full mission in realistic conditions during the second half of 2023. This will involve the drone flying six kilometres in real airspace beyond visual line of sight. Werner believes such a system could be in use within as little as two or three years, although this also requires the relevant authorisations for flying above populated areas to be in place.
Balázs Nagy, founder and leader of the HORYZN initiative: “Of course, this is not the range and speed that is technically feasible, but rather what we have found reasonable to develop from the regulators’ point of view in the first step. Our vision is, of course, an even more efficient system in the future that has both better technical properties and an improved medical concept that also operates nationwide. However, our approach is always to develop step by step and to increase the complexity incrementally.”
Werner added that HORYZN’s identity as a student team brings certain advantages. “We have no commercial pressure,” he pointed out. “We need money for building it, of course, but we don’t have to finance our lives with it. We are also flexible and spontaneous.”
While some UAV projects focus on specific essential uses, others are trying to improve the performance of autonomous drone systems overall for more general application. Achieving this would boost their effectiveness not just for medical uses, but for all sorts of other applications, from search and rescue to precision agriculture, building inspection and package delivery.
One such project is the EU-funded AGILEFLIGHT initiative, which aims to bring drones closer to the navigation performance, agility and speed of human pilots in complex, cluttered environments such as cities.
Professor Davide Scaramuzza, director of the Robotics and Perception Group at the University of Zurich in Switzerland and project lead on AGILEFLIGHT, explains that many current autonomous drones move slowly and look “clumsy” in the way they take off and land vertically. They also often rely on GPS for navigation, whose signals can be blocked by buildings or lost indoors.
In contrast, Prof. Scaramuzza’s team is developing and testing vision-based algorithms on drones with four rotors – or quadrotors – employing a combination of both standard cameras and so-called ‘event cameras’ to function like human or animal eyes, meaning they don’t rely on external infrastructure to fly.
Unlike normal cameras, event cameras do not output full images. Instead, they detect motion through changes in light intensity, outputting a continuous stream of ‘events’. This means they can react far faster, with microsecond resolution.
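The difference can be sketched in a few lines of code (a toy model for illustration, not AGILEFLIGHT software; the threshold value and frame representation are assumptions). Each pixel independently emits an event, a tuple of position, timestamp and polarity, whenever its log-intensity changes by more than a contrast threshold:

```python
# Toy model of event-camera output: only pixels whose log-intensity changes
# beyond a threshold emit events; unchanged pixels stay silent.
import math

def events_from_frames(prev, curr, t_us, threshold=0.2):
    """Compare two intensity frames and emit (x, y, timestamp, polarity) tuples."""
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (ip, ic) in enumerate(zip(row_p, row_c)):
            delta = math.log(ic + 1e-6) - math.log(ip + 1e-6)
            if abs(delta) >= threshold:
                polarity = 1 if delta > 0 else -1   # brighter or darker
                events.append((x, y, t_us, polarity))
    return events

# A bright spot moves one pixel to the right between two instants:
frame_a = [[0.1, 0.9, 0.1],
           [0.1, 0.1, 0.1]]
frame_b = [[0.1, 0.1, 0.9],
           [0.1, 0.1, 0.1]]
print(events_from_frames(frame_a, frame_b, t_us=1))
# only the two changed pixels produce events; the rest of the scene is ignored
```

Because a real event camera reports each pixel change the instant it happens rather than waiting for the next full frame, this sparse stream is what gives it its microsecond-scale reaction time.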
Prof. Scaramuzza said that while algorithms for vision- and perception-based navigation are now mature, they often remain far from human pilot performance in speed and agility. That matters all the more because drones’ batteries often last only around 30 minutes, so performance must be optimised within that window.
Prof. Davide Scaramuzza, director of the Robotics and Perception Group at the University of Zurich in Switzerland: “At the moment, the only way to accomplish more within the limited battery time of a drone is to fly faster to cover longer distances.”
Read the interview with Davide Scaramuzza
To boost performance, the team is training algorithms using artificial intelligence and computer simulation in tools such as Unity and Unreal Engine, the engines used to make video games. This allows hundreds or thousands of scenarios to be played out, improving the machine-learning system far faster than testing in the field would. The resulting algorithms are then used on drones for different types of test, including acrobatics, navigating ‘in the wild’ through environments such as forests, and racing around an aerial track.
Prof. Davide Scaramuzza: “We’re using more and more artificial intelligence by training vision algorithms on computer simulations… The idea is to design neural networks that take as inputs images from the camera and measurements from other sensors, and then output the commands for the drone.”
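The input/output structure Prof. Scaramuzza describes can be sketched as follows (a minimal illustration with made-up layer sizes and untrained random weights, not the AGILEFLIGHT architecture): a network that takes a camera frame plus inertial measurements and outputs one command per rotor of a quadrotor.

```python
# Minimal sketch of a vision-based flight policy: image + sensor readings in,
# four rotor commands out. Weights are random here; in practice they are
# learned in simulation.
import numpy as np

rng = np.random.default_rng(0)

W1 = rng.normal(0, 0.1, size=(64 * 64 + 6, 32))  # image pixels + IMU -> hidden
W2 = rng.normal(0, 0.1, size=(32, 4))            # hidden -> 4 rotor commands

def policy(image: np.ndarray, imu: np.ndarray) -> np.ndarray:
    """Flatten the image, append the sensor readings, run two dense layers."""
    x = np.concatenate([image.ravel(), imu])
    h = np.tanh(x @ W1)
    return np.tanh(h @ W2)          # one normalised command per rotor

image = rng.random((64, 64))        # one grayscale camera frame
imu = rng.random(6)                 # e.g. 3-axis gyroscope + 3-axis accelerometer
commands = policy(image, imu)
print(commands.shape)               # four commands, one per rotor
```

Training such a network across thousands of simulated scenarios, rather than hand-coding the control logic, is what lets the behaviour improve far faster than field testing alone would allow.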
Already, these systems are enabling drones to perform almost as well as, or even better than, some state-of-the-art human-controlled machines in certain tests. However, Prof. Scaramuzza explained that humans still have the advantage in adapting to different conditions, such as changes in the environment, wind or brightness, so there is room for improvement here too.
Exploring disaster areas
In addition, during the course of AGILEFLIGHT, Prof. Scaramuzza wants to demonstrate autonomous exploration of a known building as fast as possible, as an entry point for using drones in search-and-rescue applications. To get there, he said it would also be necessary to collect better data sets of disaster situations to “realify” simulations in environments that “violate all the usual assumptions”.
Clearly, wider use of the technology again depends on the regulations being in place, but Prof. Scaramuzza says the rules are improving in areas such as search and rescue.
He predicts that within about five years we could have autonomous drones that reach human-level performance for certain applications. And he says the potential for the algorithms stretches beyond drones, in areas such as self-driving cars – where his team is working with a number of companies.
Professor Davide Scaramuzza: “My hope is that such technology can help save people’s lives in the aftermath of a disaster or on the roads. But in general, we are helping the market whenever anyone needs a drone or robot that’s able to see.”