Jukka Peltola, Product Manager – ABB Ability™ Marine Pilot Vision, ABB Marine & Ports
Situational awareness is a term used in the maritime industry to describe the ability to “identify existing threats as early and as far as possible”. For a bridge team, this traditionally translates into using their own senses to establish their ship’s place in the world. That task becomes complex in high-traffic areas, during coastal navigation, and when poor visibility forces the team to rely on sensors accessed through different user interfaces. Any means of consolidating these sensors must present information in a way that supports human sensing in the decision-making loop.
Data from the sensors must be presented holistically and logically to maintain the integrity of the bridge team’s decision making and to support the safe operation of the ship under all circumstances. This can manifest as increased safety during berthing operations, appropriate augmented-reality views in fog, or better estimation of distances to ships and other obstacles during coastal passages.
ABB Ability™ Marine Pilot Vision was launched in 2017 and has undergone significant testing on board Suomenlinna II. The solution combines sensor technology and computer vision to create visualizations of a vessel and its surroundings. In the virtual-world user interface (UI), a ship model is superimposed in the context of its real surroundings. By detaching the view from the traditional north-up or heading-up presentation, a third-person perspective can be shown, giving visibility into areas that fall outside the normal visibility requirements set out by the International Maritime Organization’s (IMO) International Convention for the Safety of Life at Sea (SOLAS).
The platform allows for the integration of multiple sensors, both established and new, to ensure the officer of the watch (OOW) can assess threats in real time. Key to this is the deployment of the platform on board without the need for external connections. Utilizing onboard infrastructure increases security and allows for better integration with a customer’s own networks. Additionally, removing the need for external connections nearly eliminates latency, keeping the ABB Ability™ Marine Pilot Vision platform responsive in real time. Blind spots can be eliminated with camera views that are also available on demand should they be required.
The tech
Rather than dictating to the OOW which sensors are needed in all situations, ABB Ability™ Marine Pilot Vision accesses the data from radar, light detection and ranging (LiDAR), automatic identification system (AIS), global navigation satellite system (GNSS) and the fitted cameras. The data is combined within a server on board to provide visualization in the OOW’s client view, with the watchkeeper determining what data is presented on demand. This may be through the installed screens within the bridge, or even via a tablet should the bridge team require this.
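The idea of a server-side model that aggregates several feeds and serves only the data the watchkeeper asks for can be sketched in a few lines of Python. This is a minimal illustration only; the class and field names are assumptions for the sketch, not the product’s software interfaces.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SensorReading:
    source: str          # e.g. "radar", "lidar", "ais", "gnss", "camera"
    timestamp: float     # seconds, ship time
    payload: dict        # decoded sensor-specific data

@dataclass
class OnboardFusionServer:
    """Hypothetical onboard server that keeps a rolling window per sensor feed."""
    latest: Dict[str, List[SensorReading]] = field(default_factory=dict)

    def ingest(self, reading: SensorReading) -> None:
        # Keep only a short rolling window per source for the live view.
        window = self.latest.setdefault(reading.source, [])
        window.append(reading)
        del window[:-100]

    def client_view(self, requested_sources: List[str]) -> Dict[str, List[SensorReading]]:
        # The watchkeeper chooses which feeds are rendered; the rest stays
        # available on the server but is not pushed to the display.
        return {s: self.latest.get(s, []) for s in requested_sources}

# Example: the OOW asks for radar and AIS only on the bridge screen or tablet.
server = OnboardFusionServer()
server.ingest(SensorReading("radar", 0.0, {"contacts": []}))
view = server.client_view(["radar", "ais"])
```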
By deploying functionality within the system in a modular manner, operators are able to define the requirements of the platform against their own specific needs and those of the vessel segment. By design, the sensor arrangement is completely flexible.
Depending on the requirements and the sensors selected for the specific need, the product enables the creation of an accurate 3D picture of the ship’s surroundings, using LiDARs with maximum ranges anywhere from 200 m up to 2,000 m as well as for high-resolution scanning close to the ship’s hull. Navigation radar technology (such as X- and S-band) provides visibility at longer range, while close-range microwave radar technology provides accurate all-weather sensing across the range where navigational radar alone is not reliable enough to deliver the necessary situational awareness. These sensors are complemented with both regular and infrared cameras to provide additional context or verification should the OOW require it.
ABB Ability™ Marine Pilot Vision overall concept for a ferry
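One way to picture how sensor classes cover different distance bands around the hull is a simple coverage check. The LiDAR figures below follow the ranges quoted above; the microwave and navigation radar bands are illustrative assumptions, and none of the names reflect actual product configuration.

```python
# Assumed distance bands (metres) per sensor class; only the LiDAR figures
# come from the text above, the radar bands are placeholders for the sketch.
RANGE_BANDS_M = {
    "lidar_close":      (0, 200),      # high-resolution scanning near the hull
    "lidar_long":       (0, 2000),     # longer-range LiDAR option
    "microwave_radar":  (0, 500),      # assumed close-range all-weather band
    "navigation_radar": (500, 50000),  # X-/S-band for long-range visibility
}

def covered(distance_m: float, fitted: list) -> bool:
    """Return True if at least one fitted sensor class covers the given distance."""
    return any(lo <= distance_m <= hi
               for name, (lo, hi) in RANGE_BANDS_M.items()
               if name in fitted)

fitted_sensors = ["lidar_close", "microwave_radar", "navigation_radar"]
print(covered(150, fitted_sensors))    # True - close-range LiDAR near the hull
print(covered(10000, fitted_sensors))  # True - navigation radar band
```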
The high-resolution sensors are typically placed within nodes to create the desired field of view and attain the required visibility range. Whilst placement at the bow, stern and on the sides of the vessel may be the norm, this is by no means a limit to the configuration. Sampling and preprocessing are handled locally at each node before data is transferred via a single bus. From there, all nodes feed into a central unit for integration. At this point, NMEA data is also fed into the model to take account of GPS, gyrocompass, AIS and automatic radar plotting aid (ARPA) feeds. At the central unit, an inertial measurement unit is fitted to understand the motions of the ship in relation to the sensor data and to allow plotting within the global coordinate system. As the sensor node data is collated and synchronized, the ABB Ability™ Marine Pilot Vision system renders it within a 3D graphics engine, allowing the ship model to be scaled 1:1 within the sensor data.
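A minimal sketch of the last step, placing a ship-frame sensor return into a global coordinate frame using the vessel position and heading, is shown below. It is an assumption-laden simplification (a flat 2D rotation, made-up names, no roll or pitch from the IMU) intended only to illustrate the coordinate transformation, not the product’s implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class ShipState:
    east_m: float       # vessel reference position, east (e.g. from GNSS)
    north_m: float      # vessel reference position, north (e.g. from GNSS)
    heading_deg: float  # true heading (e.g. from gyrocompass)

def ship_to_global(x_fwd_m: float, y_stbd_m: float, state: ShipState):
    """Rotate a ship-frame point (forward, starboard) by the heading and
    translate it by the vessel position to obtain global east/north coordinates."""
    h = math.radians(state.heading_deg)
    east  = state.east_m  + x_fwd_m * math.sin(h) + y_stbd_m * math.cos(h)
    north = state.north_m + x_fwd_m * math.cos(h) - y_stbd_m * math.sin(h)
    return east, north

# Example: a LiDAR return 50 m ahead and 10 m to starboard of a node,
# with the ship heading 090 (due east).
print(ship_to_global(50.0, 10.0, ShipState(east_m=1000.0, north_m=2000.0, heading_deg=90.0)))
# -> (1050.0, 1990.0): the return plots east of the ship and slightly to the south.
```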
Should the customer so require, and should the interface exist, electronic navigational chart (ENC) data can also be overlaid in the UI, fusing the chart information with that of the sensors to provide familiarity with current electronic chart display and information system (ECDIS) interfaces.
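Conceptually, the chart then becomes one more layer in the rendered view, drawn beneath the sensor overlays when enabled. The short sketch below illustrates that layering idea only; the layer names and ordering are assumptions, not the product’s UI model.

```python
class ViewLayers:
    """Hypothetical layer registry for the client view."""

    def __init__(self) -> None:
        self.enabled = {"ship_model": True, "lidar": True, "radar": True, "enc_chart": False}

    def toggle(self, layer: str) -> None:
        self.enabled[layer] = not self.enabled.get(layer, False)

    def draw_order(self) -> list:
        # The ENC chart, when enabled, is rendered first so sensor data sits on top.
        order = ["enc_chart", "ship_model", "lidar", "radar"]
        return [name for name in order if self.enabled.get(name)]

layers = ViewLayers()
layers.toggle("enc_chart")
print(layers.draw_order())  # ['enc_chart', 'ship_model', 'lidar', 'radar']
```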