Mayday Mayday! Answering emergency calls with help from AI
As of 21 June, Sweden’s air and sea rescue leaders have been using artificial intelligence to assist with the monitoring of mayday calls around the Swedish coast and Baltic Sea. Frankie Youd speaks to Maranics AB, one of the companies involved with the technology, to discuss how this technology works and what the main benefits are over a human response to a mayday call.
Working at sea is a dangerous, potentially life-threatening role that sees coast guards and rescue leaders receiving mayday calls from seafarers who have found themselves in emergency situations.
Because a mayday call signals an emergency in progress, it is imperative that rescue leaders establish where the caller is located, what the situation is, and what is happening, so they can assist as quickly and effectively as possible.
Although ‘mayday’ is a universal signal for distress – allowing operators to identify that an emergency situation is taking place – the process of logging details of the scenario can be time-consuming, and time is of the essence in these situations.
To assist in these situations, a new artificial intelligence (AI) system is being trialled by Sweden’s air and sea rescue leaders to monitor mayday calls, helping the sea and air rescue teams identify the information relayed during each call.
Tobias Nicander, rescue leader at the Swedish Maritime Administration’s air and sea traffic control centre, wanted increased technical support to assist the operators. In a press release, Nicander said: “It feels tremendously satisfying that we can now conduct live tests using real emergency calls; I see great potential for the application. In air and sea rescue, it is a major advantage to gain technical support as a complement to the human ear.”
Incoming mayday call detected
The technology is based on the framework of the Heimdall Innovation Project – a multi-hazard cooperative management tool for data exchange, response planning, and scenario building. Presented in an operator-friendly interface, the captured data gives the response team ship information, position, speech-to-text transcription, and weather data.
Built by Maranics AB – a company that has created a platform designed for the automation of human processes – the technology provides real-time data for those answering the mayday call.
Mattias Larsson, chief innovation officer at Maranics, explains how the technology works: “Maranics has been responsible for the backend application architecture, human user interface, and contextualising the data.
"Building such real-time applications requires modern cloud-native architecture and event-driven/data streaming design. The real-time data application's design contains updates of the interfaces, without the humans needing to refresh the application manually.
“Once the operator inputs information via an easy-to-use interface, the system produces contextualised events with high data quality that other systems can use. In this case, this data is to train the AI model.”
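The event-driven, push-based design Larsson describes can be illustrated with a minimal sketch. The class and field names below are purely illustrative, not the actual Heimdall schema: subscribed displays are pushed each contextualised event the moment it is published, rather than polling or refreshing manually.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class DistressEvent:
    """A contextualised event built from operator input (field names are illustrative)."""
    call_type: str      # e.g. "mayday" or "pan-pan"
    vessel: str
    position: tuple     # (latitude, longitude)
    received_at: str    # ISO 8601 timestamp

class EventBus:
    """Minimal publish/subscribe hub: interfaces subscribe once and are then
    pushed every new event, so no manual refresh is needed."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[DistressEvent], None]] = []

    def subscribe(self, handler: Callable[[DistressEvent], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, event: DistressEvent) -> None:
        # Push the event to every subscribed interface immediately.
        for handler in self._subscribers:
            handler(event)

# A display that updates the moment an event arrives:
bus = EventBus()
bus.subscribe(lambda e: print(f"{e.call_type.upper()} from {e.vessel} at {e.position}"))
bus.publish(DistressEvent(
    call_type="mayday",
    vessel="example vessel",
    position=(57.7, 11.9),
    received_at=datetime.now(timezone.utc).isoformat(),
))
```

Because downstream consumers (here, a print handler; in Heimdall, operator displays and the AI training pipeline) all receive the same structured event, the data stays consistent across systems.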
The user interface constantly receives sound recordings from the various receivers located around the Swedish coastline. Paired with the AI model – developed in collaboration with AI service provider Tenfifty – this allows the system to listen and learn.
Larsson says: “The system is constantly receiving sound recordings and the AI model is listening to those recordings. If recordings contain specific information patterns, they will be sent to the Heimdall interface as events. Heimdall will display the recordings for the operator with a highlight about a possible type of call, for example, mayday or pan-pan.
“The operator can then replay the sound file, when time allows, and provide feedback about the analysis's correctness with just a few simple clicks. This feedback is then used to train the AI model further.”
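The detect-and-confirm loop Larsson describes can be sketched as follows. This is a toy stand-in under stated assumptions: a keyword matcher flags likely call types where the real system uses a trained neural model, and the function and variable names are illustrative, not Heimdall's actual API. The key point is the feedback loop: the operator's confirmed label is stored as training data for the next round.

```python
from typing import Optional

# Confirmed (transcript, label) pairs, collected for the next training round.
TRAINING_DATA = []

def suggest_call_type(transcript: str) -> Optional[str]:
    """Flag a recording if it contains a known distress pattern.
    A keyword match stands in for the real neural model here."""
    text = transcript.lower()
    if "mayday" in text:
        return "mayday"
    if "pan-pan" in text or "pan pan" in text:
        return "pan-pan"
    return None  # nothing worth surfacing to the operator

def record_feedback(transcript: str, confirmed_label: str) -> None:
    """Store the operator's verdict with a few clicks; these confirmed pairs
    are what trains the model further."""
    TRAINING_DATA.append((transcript, confirmed_label))

# A flagged recording is shown to the operator, who confirms the analysis:
suggestion = suggest_call_type("Mayday, mayday, this is fishing vessel ...")
if suggestion is not None:
    record_feedback("Mayday, mayday, this is fishing vessel ...", suggestion)
```

The design choice worth noting is that the model only highlights candidate transmissions; the human remains the authority, and every human decision becomes new training signal.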
The AI and machine learning technology, developed by Tenfifty, converts speech to text using neural networks. Speaking on this technology in a press release, David Fendrich, CTO at Tenfifty, said: “This is a perfect example of how to create a reliable AI service where man and machine work together.
"Technology designed to convert speech to text using neural networks has made immense strides in recent years and it is extremely pleasing to be able to use technology for social benefit.”
Technology lending a helping hand
The inclusion of this technology brings many benefits for those working within the maritime rescue centre. Prior to its introduction, the operators of Sweden’s air and sea rescue worked closely together in a single room, where they needed to listen not only to the mayday call but also to the several other conversations taking place around them.
This busy environment, paired with a heavy workload, leaves room for human error given the radio traffic and noise levels in the room. Larsson explains: “The operators are working in a room where they need to listen to several speakers simultaneously; those different speakers are playing the radio traffic from various radio receivers located around the Swedish coast.
“As the workload can be heavy sometimes and the sound is sometimes of relatively poor quality, there is room for error. Implementing technology to assist the operator in selecting the most likely transmissions makes his job easier and safer.”
At present, the technology is still being tested by Sweden’s air and sea rescue leaders, in the hope that in the not-too-distant future it will become a staple of the rescue system they have in place.
Main image credit: Cameron Venti / Unsplash