
At Qonnections 2019, we introduced the Emergency Response Drone – a proof of concept leveraging Qlik Core, IoT, and AI. This innovation has the potential to positively disrupt humanitarian efforts: its initial trial flight and debut test scenario opened emergency workers' eyes to the possibilities for improving response efficiency in critical situations.

The Emergency Response Drone uses Qlik Core to analyze multiple data streams, including computer vision, and make smart recommendations at disaster sites so emergency workers can act in real time. Ultimately, the Qlik Associative and Cognitive Engines working together can make emergency responses safer and more effective.

What is it?

The Emergency Response Drone demonstrates how the Qlik platform and our approach to Augmented Intelligence can help emergency responders ensure units respond efficiently, based on observations from multiple data feeds including drones and other sensors. The drone detects objects and sends the detections to the Qlik Core web application. This application allows the 911 operator to make decisions based on data from the field and provide appropriate recommendations on which units to deploy. The drone's onboard camera recognizes and tracks the people, cars, and hazardous materials involved in an incident – in this case a multi-vehicle collision. This data is aggregated in Qlik Core, which also runs locally on the drone, to calculate a risk score. The risk score is used to automatically recommend which resources should be sent to the scene of the incident.
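
The scoring details aren't covered in this post, but to make the idea concrete, here is a minimal Python sketch of turning detections into a risk score and a unit recommendation. The weights, thresholds, and unit names are hypothetical, not the model used in the demo.

```python
# Hypothetical sketch: aggregate drone detections into a risk score and map
# that score to a unit recommendation. Weights, thresholds, and unit names
# are illustrative only.
DETECTION_WEIGHTS = {
    "person": 3.0,           # people on scene drive urgency
    "car": 1.0,              # each involved vehicle adds risk
    "hazmat_placard": 10.0,  # hazardous materials dominate the score
}

def risk_score(detections):
    """Sum class weights over the object labels detected in the scene."""
    return sum(DETECTION_WEIGHTS.get(label, 0.0) for label in detections)

def recommend_units(score, hazmat_detected):
    """Map a risk score to the resources suggested to the 911 operator."""
    units = ["patrol_car"]
    if score >= 5:
        units += ["ambulance", "fire_engine"]
    if score >= 15:
        units.append("heavy_rescue")
    if hazmat_detected:
        units.append("hazmat_team")
    return units

detections = ["car", "car", "person", "person", "person", "hazmat_placard"]
score = risk_score(detections)
print(score, recommend_units(score, "hazmat_placard" in detections))
```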

Qlik Core app providing response recommendations to 911 operator

The Emergency Response Drone is an industrial-grade DJI hexacopter with an AWS DeepLens camera as payload. The DeepLens is an Amazon edge device that leverages deep learning AI. We run Qlik Core inside a Docker container on the camera. Data is streamed to the Qlik Core web app via the cloud. Our Qlik Core web app leverages the same Qlik Associative Engine we all know and love to provide responders with fast and efficient recommendations.
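
For readers curious what the on-camera loop looks like, below is a rough Python sketch in the style of a standard AWS DeepLens Lambda function, using the awscam module for inference and greengrasssdk to publish detections to AWS IoT. The model path, topic name, and confidence threshold are assumptions; in our setup the detections feed Qlik Core running in the Docker container on the camera rather than a bare IoT publish.

```python
# Sketch of the on-camera detection loop, following the pattern of the AWS
# DeepLens sample Lambda functions. The model path, IoT topic, and threshold
# are assumptions; frame resizing to the network's input size is omitted.
import json
import awscam
import greengrasssdk

iot = greengrasssdk.client("iot-data")
TOPIC = "drone/detections"                              # hypothetical topic
MODEL_PATH = "/opt/awscam/artifacts/ssd_resnet50.xml"   # hypothetical path

# Load the optimized SSD model onto the DeepLens GPU.
model = awscam.Model(MODEL_PATH, {"GPU": 1})

while True:
    ret, frame = awscam.getLastFrame()
    if not ret:
        continue
    # Run inference and unpack SSD results: a list of dicts with a class
    # label, a probability, and bounding-box coordinates.
    raw = model.doInference(frame)
    detections = model.parseResult("ssd", raw)["ssd"]
    confident = [d for d in detections if d["prob"] > 0.5]
    # Stream the confident detections onward; in the actual project they are
    # aggregated by Qlik Core running in a Docker container on the camera.
    iot.publish(topic=TOPIC, payload=json.dumps(confident))
```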

Architecture diagram

How does the AI detection work?

The DeepLens runs a type of convolutional neural network called a Single Shot Detector (SSD), which can quickly identify multiple objects and return bounding boxes showing where each object appears in each frame of video. It uses a ResNet-50 architecture and was fine-tuned from an ImageNet-pretrained base model that can classify 1,000 different types of objects. For this project, we wanted to detect specific objects, such as hazardous material placards and paramedics, that aren't in the ImageNet model. We also wanted to improve accuracy, so we used annotated images from the video of the collision. We wrote a custom add-on for Adobe After Effects that leverages its motion tracking to produce the annotation files. Our model was trained on roughly 5,000 labeled images, and another 1,000 images were used to validate the model. Through hyperparameter tuning and brightness/contrast/color augmentation, we achieved over 80% mean average precision. The training was performed using AWS SageMaker, which facilitates deep learning through Jupyter Notebooks and S3 storage.
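
As a rough illustration of how such a fine-tuning job can be launched, the sketch below uses SageMaker's built-in object detection algorithm, which pairs an SSD head with a ResNet-50 base network initialized from ImageNet weights. The bucket paths, instance type, class count, and hyperparameter values are assumptions rather than our exact configuration.

```python
# Rough sketch: fine-tune SageMaker's built-in object detection algorithm
# (SSD with a ResNet-50 base network, starting from ImageNet-pretrained
# weights). Buckets, instance type, class count, and hyperparameters are
# illustrative assumptions.
import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = sagemaker.get_execution_role()
image = image_uris.retrieve("object-detection", session.boto_region_name)

estimator = Estimator(
    image_uri=image,
    role=role,
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    output_path="s3://my-bucket/drone-ssd/output",  # hypothetical bucket
    sagemaker_session=session,
)

estimator.set_hyperparameters(
    base_network="resnet-50",
    use_pretrained_model=1,     # fine-tune from ImageNet weights
    num_classes=5,              # e.g. person, car, paramedic, hazmat placard, debris
    num_training_samples=5000,  # ~5,000 annotated frames
    epochs=30,
    learning_rate=0.0005,
    mini_batch_size=16,
    image_shape=512,
)

# RecordIO channels holding the ~5,000 training and ~1,000 validation images.
estimator.fit({
    "train": TrainingInput("s3://my-bucket/drone-ssd/train",
                           content_type="application/x-recordio"),
    "validation": TrainingInput("s3://my-bucket/drone-ssd/validation",
                                content_type="application/x-recordio"),
})
```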

AI training still

How does Qlik Core work?

This information from the drone helps shape the response plan: specific types of resources can be sent to the emergency without over-prioritizing the response, ensuring responders retain capacity for other emergencies that might occur. Data from the drone and emergency response vehicles is later uploaded into a Qlik Sense application in the cloud for post-incident analysis.
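
Under the hood, Qlik Core exposes the Qlik Associative Engine through a JSON-RPC WebSocket API, usually driven from JavaScript with enigma.js. As a minimal illustration, assuming Qlik Core's default engineData endpoint, the Python sketch below creates a session app, loads a few detection rows inline, and reloads so the data can be analyzed; the script and field names are illustrative, and error handling is omitted.

```python
# Minimal sketch of pushing detection rows into the Qlik Associative Engine
# over its JSON-RPC WebSocket API. Assumes Qlik Core's default endpoint
# (ws://localhost:9076/app/engineData); the inline script and field names are
# illustrative, and error handling is omitted.
import asyncio
import itertools
import json
import websockets

ENGINE_URL = "ws://localhost:9076/app/engineData"
request_ids = itertools.count(1)

async def call(ws, handle, method, params):
    """Send one JSON-RPC request to the engine and wait for its response."""
    rid = next(request_ids)
    await ws.send(json.dumps({
        "jsonrpc": "2.0", "id": rid, "handle": handle,
        "method": method, "params": params,
    }))
    while True:
        msg = json.loads(await ws.recv())
        if msg.get("id") == rid:           # skip engine notifications
            return msg["result"]

async def main():
    async with websockets.connect(ENGINE_URL) as ws:
        # Create an in-memory session app, set an inline load script with a
        # few detections, and reload so the data is available for analysis.
        app = await call(ws, -1, "CreateSessionApp", [])
        handle = app["qReturn"]["qHandle"]
        script = (
            "Detections:\n"
            "LOAD * INLINE [\n"
            "object, confidence\n"
            "person, 0.91\n"
            "car, 0.88\n"
            "hazmat_placard, 0.73\n"
            "];"
        )
        await call(ws, handle, "SetScript", {"qScript": script})
        await call(ws, handle, "DoReload", [])

asyncio.run(main())
```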

Qlik Core app providing the Incident Commander improved situational awareness

What else can this be used for?

In addition to this multi-vehicle collision scenario, this system could be applied to many other use cases. We've already begun discussing the potential of using it to assist responders during massive floods. We have also explored how it could be useful in several humanitarian efforts, such as assisting women and children foraging for firewood and monitoring water holes near refugee camps.

Ottawa Flood application by GINQO

This project took a lot of effort from a large team. We are very grateful for all the work from the following people and agencies: Qlik (Todd Margolis, Aiham Azmeh, Elif Tutuk, Chuck Bannon, John Trigg, Anthony Alteraic, Robin Muren, Johan Persson); City of Ottawa (Deputy Chief Greg Furlong, Project Manager Rebecca Anderson, Ottawa Paramedic Service, Ottawa Fire Services, Ottawa Police Service); GINQO (Martin Curtis, Mike Ellis).

This is only the beginning for the Emergency Response Drone and the impact it will have. More to come on this innovation and on how Qlik continues to make a difference through our commitment to social responsibility and global humanitarian efforts.

2 Comments
joe_warbington

Incredible work team! Love these life-saving applications of data and Qlik.

diegoberriel

Great work!!! Thanks for sharing!
