The Naval Architect: March 2017
The increasing presence of sensor technology is changing the way crew, and indeed vessels themselves, perceive their local environment. Technology now has the power to see and relay a vessel's surroundings, further enhancing the sensory resources at a bridge team's disposal.
Such developments have the potential to improve the safety of vessels already in operation, but also to re-open the discussion of an autonomous future – a future that seems increasingly feasible (at least technologically).
Rolls-Royce has made a name for itself within the maritime industry for innovations in the field of ship autonomy. Its Advanced Autonomous Waterborne Applications Initiative (AAWA) project set out with the aim of having a remote-controlled ship in commercial use by the end of this decade, and products based on the project’s developments are percolating to market.
Last year an automatic crossing system was delivered to Norwegian ferry company Fjord1, and now a new commercial product emerges from the project.
Following more than six months of testing on AAWA’s test vessel Stella (a 65m ferry owned by Finferries), an Intelligent Awareness System (IAS) for vessels of any type is set to be launched; the first of its kind according to the company.
The system functions as an advisory tool that empowers decision-makers with a greater understanding of a ship’s surrounding environment. Its application is particularly aimed at the safe navigation of busy ports, for example a large cruiseship navigating the port of Shanghai, or operations in challenging environments, such as the dense fog experienced in Houston’s shipping channels.
Fundamentally, the system opens the eyes of operators to a wider range of perspectives based on a ship's sensor technologies. Using the information it perceives, the system then informs operators of risks and can offer a unique bird's-eye view of the environment – something that could be particularly useful in docking operations, and Rolls-Royce reveals that work on an automated docking system is ongoing.
Iiro Lindborg, general manager of remote and autonomous operations at Rolls-Royce, explains that the system is based on multiple kinds of sensor, including high-definition cameras, night vision, radar, LIDARs and AIS data, and functions through the efficient integration of these different data sets, what he calls “data fusion”. IAS also has the unique benefit of fog horn detection – a recent innovation that can map the distance between two vessels based on acoustic information.
This fusion allows for interchangeable layers of insight: an operator can view a 3D map rendered from LIDAR data, a radar overlay, or a topography that shows the ocean bed.
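As a rough illustration of this layered approach (a minimal sketch, not Rolls-Royce code; the class names, layer names and data fields are assumptions for illustration), a fusion component might keep the latest frame from each sensor and let the operator switch between display layers:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class SensorFrame:
    """One timestamped snapshot from a single sensor."""
    sensor: str          # e.g. "lidar", "radar", "ais", "camera"
    timestamp: float     # seconds
    payload: dict        # raw or pre-processed sensor data

@dataclass
class FusedPicture:
    """Keeps the latest frame per sensor and renders switchable display layers."""
    frames: Dict[str, SensorFrame] = field(default_factory=dict)
    layers: Dict[str, Callable[[Dict[str, SensorFrame]], str]] = field(default_factory=dict)

    def ingest(self, frame: SensorFrame) -> None:
        # Keep only the most recent frame from each sensor.
        current = self.frames.get(frame.sensor)
        if current is None or frame.timestamp > current.timestamp:
            self.frames[frame.sensor] = frame

    def render(self, layer: str) -> str:
        # The operator picks a layer; its renderer draws on whichever frames it needs.
        return self.layers[layer](self.frames)

picture = FusedPicture(layers={
    "lidar_3d": lambda frames: "3D LIDAR map" if "lidar" in frames else "no LIDAR data yet",
    "radar_overlay": lambda frames: "radar overlay" if "radar" in frames else "no radar data yet",
})
picture.ingest(SensorFrame("lidar", timestamp=1.0, payload={"points": []}))
print(picture.render("lidar_3d"))   # -> 3D LIDAR map
```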
Already deployed in the first autonomous cars, LIDARs are making their way into the maritime industry, creating 3D environments that can be mapped with GPS points and used to perceive what the human eye cannot when operating a ship. LIDARs are a particularly useful component, offering a view that is in principle four times better than the human eye in foggy conditions, according to Lindborg. The technology is based on the creation of a point cloud: a laser fires beams of light and the sensor measures how long each beam takes to be reflected back. Rolls-Royce says approximately 300,000 beams are pulsed to render a 3D map of the world around a vessel. In this way a vessel is able to map its environment and provide new insight for those on the bridge.
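The geometry behind such a point cloud can be sketched in a few lines (an illustrative sketch only; the beam angles and return times below are hypothetical, and a real sensor applies calibration and motion compensation not shown here):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def beam_to_point(azimuth_deg: float, elevation_deg: float, round_trip_s: float):
    """Convert one laser return into an (x, y, z) point relative to the sensor."""
    distance = SPEED_OF_LIGHT * round_trip_s / 2.0   # halved: the pulse travels out and back
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)

# A full sweep repeats this for every pulse (on the order of the 300,000 beams
# quoted above) to build the point cloud around the vessel.
returns = [(10.0, -2.0, 6.67e-7), (11.0, -2.0, 6.70e-7)]   # (azimuth, elevation, round-trip time)
cloud = [beam_to_point(az, el, t) for az, el, t in returns]
```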
The way the system processes a ship's surrounding environment and any objects within it, how its 'brain' works for want of a better description, is particularly innovative. It uses an artificial intelligence (AI)-based object classification solution that can independently detect and track objects. Onboard cameras classify approaching vessels or objects and, after some time, will be able to determine vessel characteristics, such as how fast a vessel can travel or how quickly it can stop.
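A detect-and-track loop of this kind can be sketched as follows (purely illustrative assumptions: the classifier itself is not shown, and the simple nearest-neighbour association used here stands in for whatever tracking logic the real system employs):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    x: float            # metres east of own ship
    y: float            # metres north of own ship
    label: str          # classifier output, e.g. "ferry", "tug", "buoy"

@dataclass
class Track:
    track_id: int
    x: float
    y: float
    label: str

def update_tracks(tracks: List[Track], detections: List[Detection], gate_m: float = 50.0) -> List[Track]:
    """Match each new detection to the nearest existing track, or open a new track."""
    next_id = max((t.track_id for t in tracks), default=0) + 1
    for det in detections:
        nearest: Optional[Track] = min(
            tracks, key=lambda t: (t.x - det.x) ** 2 + (t.y - det.y) ** 2, default=None
        )
        if nearest is not None and ((nearest.x - det.x) ** 2 + (nearest.y - det.y) ** 2) ** 0.5 <= gate_m:
            nearest.x, nearest.y, nearest.label = det.x, det.y, det.label   # update existing track
        else:
            tracks.append(Track(next_id, det.x, det.y, det.label))          # start a new track
            next_id += 1
    return tracks

tracks = update_tracks([], [Detection(120.0, 300.0, "ferry")])
tracks = update_tracks(tracks, [Detection(125.0, 305.0, "ferry")])   # same vessel, same track retained
```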
A process known as 'labelling' establishes what a vessel's cameras 'see'. This involves drawing boundaries around the vessels in a picture (one containing two vessels, for example) and teaching the AI the difference between each boundary field, as well as any characteristics associated with each field/vessel. Rolls-Royce has amassed two million pictures for 'labelling' so far; the system on offer contains 100 different categories and can detect objects including lighthouses.
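The kind of record such labelling produces might look like the following (a hypothetical structure for illustration only; the field names, categories and file path are assumptions, not Rolls-Royce's actual format):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BoundingBox:
    x_min: int
    y_min: int
    x_max: int
    y_max: int

@dataclass
class LabelledObject:
    category: str                                          # one of the ~100 categories, e.g. "lighthouse"
    box: BoundingBox
    characteristics: dict = field(default_factory=dict)    # e.g. {"max_speed_kn": 20}

@dataclass
class LabelledImage:
    image_path: str
    objects: List[LabelledObject]

# One picture containing two vessels, each with its own boundary field:
example = LabelledImage(
    image_path="frames/000123.jpg",
    objects=[
        LabelledObject("ferry", BoundingBox(120, 80, 410, 260)),
        LabelledObject("tug", BoundingBox(520, 140, 640, 230)),
    ],
)
```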
The potential is obviously expansive and improves with the amount of data the AI is able to leverage. At its most basic, it detects a vessel and whether the vessel is moving. Rolls-Royce controls the templates the system functions on and will improve the service offered based on its constantly growing data pool.
Tomorrow’s solutions today
Lindborg stresses that while the system will help to facilitate an autonomous future, the project aims to help existing crew. It is therefore increasing the usability of readily available data, not necessarily creating more data, he says.
With this in mind, there is not much extra equipment to install onboard. The thrust is to improve the use of existing technology, integrating LIDARs and cameras. Asked about the cost, Lindborg says any solution can be different and will be the product of a co-development process with the owner. In other words, a system, or a version of the system, is tailored to each ship's needs based on the requirements of the shipowner. The process, Lindborg explains, will begin by looking at a vessel's current capabilities before moving on to discussions of future improvements and establishing a proof of concept.
Initial interest has been received from cruiseship owners, but owners of other ship types, including cargo vessels, have also discussed the new system with Rolls-Royce. To this end, Lindborg is keen to point out that studies have been broad enough to represent the requirements of different vessel types.