Mapping the Ocean Floor

By Jon Fairall on 13 May, 2010

It’s a truism that the surfaces of Mars and the Moon are better mapped than the surface of the Earth. Three-quarters of our globe is covered by a layer of water several kilometres thick. Mapping through the murk is difficult and expensive.

However, the news is not all bad. New technologies are making it easier, if not less expensive, to generate high-resolution maps of the deep sea bottom. New drivers have emerged that make the data more valuable. As always, GIS is making it easier to marshal the data in a way that makes sense for end users. New space technologies are also helping to map the planet at high resolution.

The first point is that the depth of the water makes a difference. The dividing line occurs at about 50 metres. Less than that, and the mapping problem is amenable to traditional sounding methods: simple echo sounders or physically touching the bottom with weighted lines and measuring their length.

Over the past decade, the field has been revolutionised with the arrival of airborne laser depth sounding. This has made it possible to comprehensively probe huge areas of coastline and areas such as the Great Barrier Reef, where it can be hazardous to send ships.

Deep-water work is more difficult. Lasers run out of steam at about 70 metres, even in clear water. The best modern method of probing the depths is the multi-beam sonar, a technology that originated in the late 1950s. It was spurred by the US military because, potentially, it could stop their submarines running into undersea mountains. As late as 8 January 2005, however, it was still possible for the USS San Francisco to collide with an uncharted undersea mountain about 560 kilometres south of Guam, while on passage to Brisbane.

The submarine was operating at about 170 metres depth and travelling at almost 40 knots (about 74 km/h).

The first commercial multi-beam, the SeaBeam Classic, was put into service in May 1977 on the survey vessel HMAS Cook. This system produced up to 16 beams across a 45 degree swathe. While enormous improvements have been made in computer processing for sonars in the last 20 years, the fact remains that they are limited by the fundamental properties of seawater. Sound travels through water far better than light or radio waves, but the problem for sonar engineers is that, to a first approximation at least, high frequency sounds don’t propagate through sea water as well as sounds at low frequency. This means that if you want to survey the sea bottom in 50 metres of water, you can do so at 300 kHz. But if you want to use sonar 7000 metres down, you are restricted to 30 kHz or even less. This matters because the resolution of the subsequent return is also related to frequency – the lower the frequency, the worse the resolution.
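To put rough numbers on that trade-off, the short sketch below computes the acoustic wavelength at the two frequencies mentioned above, assuming a nominal sound speed in seawater of about 1,500 m/s (an illustrative figure; the real value varies with temperature, salinity and depth). Resolution scales with wavelength, so the 30 kHz deep-water system resolves detail roughly ten times coarser than the 300 kHz shallow-water system.

```python
# A rough illustration of the sonar frequency/resolution trade-off.
# Assumes a nominal sound speed in seawater of ~1,500 m/s; real values vary
# with temperature, salinity and depth.

SOUND_SPEED_SEAWATER = 1500.0  # metres per second (nominal)

def acoustic_wavelength(frequency_hz: float) -> float:
    """Return the acoustic wavelength in metres for a given frequency."""
    return SOUND_SPEED_SEAWATER / frequency_hz

for label, freq_hz in [("300 kHz (shallow-water survey)", 300e3),
                       ("30 kHz (deep-ocean survey)", 30e3)]:
    # Achievable resolution scales with wavelength, so the lower-frequency
    # system resolves roughly ten times coarser detail.
    print(f"{label}: wavelength = {acoustic_wavelength(freq_hz) * 100:.1f} cm")
```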

There has been no end of subtle techniques used in attempts to improve the performance of sonar. Many are akin to the synthetic apertures found in processing radar data. Some have been helpful, but the reality is that closer is better. Hence the relatively recent moves to put instruments on unmanned vehicles that can be sent to the bottom for lengthy periods of time.

There are two types: remotely operated vehicles (ROVs) and autonomous underwater vehicles (AUVs). An ROV is secured to its mother ship via an umbilical, and operated from the ship. The operator usually has some kind of tele-presence system.

AUVs are programmed to undertake their mission without operator intervention, although telemetry can flow between the AUV and its operator via an acoustic link.

The oil and gas industry is a heavy user of the technology, employing it to make detailed maps of the seafloor before building subsea infrastructure. This helps save costs and minimise disruption to the environment. AUVs allow precise surveys to be carried out in areas where traditional bathymetric surveys would be less effective or too costly. Post-lay pipe surveys are also now possible.

Hundreds of different AUVs have been designed over the past 50 or so years, but only a few companies sell vehicles in any significant numbers. Currently about 10 companies sell AUVs on the international market, including Kongsberg Maritime, Bluefin Robotics, International Submarine Engineering Ltd and Hafmynd.

Typical equipment carried includes compasses, depth sensors, sidescan and other sonars, magnetometers, thermistors and conductivity probes. AUVs can navigate using an underwater acoustic positioning system in which a net of transponders is deployed on the sea floor. When a surface reference such as a support ship is available, acoustics are used to calculate the position of the subsea vehicle in relation to the surface craft, whose own position is known from GPS.
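As a toy illustration of the transponder-net idea (not any manufacturer's actual algorithm), the sketch below fits a two-dimensional position to acoustic ranges measured to three seafloor beacons at known locations. The coordinates and ranges are invented for the example.

```python
# Toy acoustic trilateration: fit a vehicle position to ranges measured to
# transponders at known seafloor locations. All numbers are invented; real
# systems also solve for clock offsets and sound-speed variation.
import math

def trilaterate(beacons, ranges, guess, iterations=200, step=0.3):
    """Gradient-descent fit of an (x, y) position to the measured ranges."""
    x, y = guess
    for _ in range(iterations):
        gx = gy = 0.0
        for (bx, by), r in zip(beacons, ranges):
            d = math.hypot(x - bx, y - by) or 1e-9
            err = d - r                      # range residual
            gx += err * (x - bx) / d
            gy += err * (y - by) / d
        x -= step * gx
        y -= step * gy
    return x, y

beacons = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]  # transponder net (m)
ranges = [707.0, 707.0, 707.0]                        # measured ranges (m)
print(trilaterate(beacons, ranges, guess=(400.0, 400.0)))  # ≈ (500, 500)
```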

When it is operating completely autonomously, the AUV will surface to take its own GPS fix. Between position fixes, and for precise manoeuvring, an inertial navigation system measures acceleration. Doppler technology measures the rate of travel, and a pressure sensor calculates the vertical position. These observations are filtered to determine a final navigation solution.
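A minimal sketch of that kind of filtering is shown below. It is illustrative only, with invented noise figures, and fuses a dead-reckoned position (from the inertial and Doppler measurements) with an occasional surface GPS fix using a one-dimensional Kalman-style update.

```python
# Illustrative one-dimensional fusion of dead reckoning with surface GPS fixes.
# All noise values are invented for the example.

def predict(pos, var, velocity, dt, process_var=0.5):
    """Dead-reckoning step: advance position using the measured velocity."""
    return pos + velocity * dt, var + process_var * dt

def update(pos, var, gps_pos, gps_var=4.0):
    """Correct the dead-reckoned estimate with a GPS fix taken at the surface."""
    gain = var / (var + gps_var)
    return pos + gain * (gps_pos - pos), (1.0 - gain) * var

pos, var = 0.0, 1.0                          # along-track position (m), variance
for _ in range(10):                          # ten minutes of dead reckoning
    pos, var = predict(pos, var, velocity=1.5, dt=60.0)
pos, var = update(pos, var, gps_pos=905.0)   # vehicle surfaces for a fix
print(f"position ≈ {pos:.1f} m, variance ≈ {var:.1f} m²")
```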

The shallow/deep divide is not just a technology issue. There are different drivers for deep and shallow water activity. Some are obvious. Sailors don’t much care how deep the ocean is – provided it’s deep enough to float their boats. Thus the first preoccupation of shallow water surveyors is charting, and the production of navigational aids.

This work continues, and is a staple that allows dozens of small consultancy companies to survive. In October, for instance, Acoustic Imaging conducted a three-day course for Sydney Ports staff involved in processing multi-beam bathymetry. One spur for the work is the recent development and expansion of facilities at Port Botany, which involved massive dredging operations and considerable changes to the sea bottom around the port.

Still, as more and more of the world is surveyed accurately, this type of work perhaps carries less impetus. The most recent driver of activity has been climate change, and in particular the vulnerability of coastal areas to inundation, either from rising sea levels or from extreme flood and tide events. This work is tailor-made for airborne laser surveys.

The biggest operator in Australia is Fugro, which operates the LADS facility previously owned by Tenix. This unit flies a Fokker F27 with both a laser unit and a hyper-spectral camera. The other major player is the Royal Australian Navy, which operates a LADS unit from a purpose-built de Havilland Canada Dash-8 aircraft.

Over the past year, Fugro has surveyed almost the entire coastline of Victoria for the Department of Sustainability and Environment, and the coast of Western Australia for the Department of Primary Industries. Both organisations required a common view of the coastal zone, the beaches and the seabed down to about 20 metres. This data will be used to create digital terrain models that will inform inundation models along each coastline. Importantly, it will also be used as the basis for further surveys, enabling the state of beaches and dunes to be monitored over time.

In the deep ocean, the driver for the work is more purely economic. Numerous contractors work to provide high-resolution maps of the ocean bottom for the oil and gas industry. Typically, the work consists of pipeline or site surveys for oil rigs.

Of course, not all deep sea work is tied to particular companies or projects.

Geoscience Australia is a major source of deep ocean surveys. Its concerns are either pre-competitive research for exploration companies or a closer study of deep ocean environments. These could relate to the fishing industry or focus on issues such as climate change.

The centrepiece of this work is Australia’s Marine National Facility, a blue-water research capability funded by the Australian government and operated by the CSIRO, which runs the research vessel RV Southern Surveyor for scientific voyages.

The facility has been a key element of Australia’s national research infrastructure since 1984. It is available to all Australian scientists and their international collaborators. Access is granted on the basis of proposals that are internationally peer-reviewed and independently assessed for science quality and contribution to the national interest.

The greatest problem facing those who seek to map the sea bottom is still the sheer vastness of our oceans. One study suggested that if every multibeam sonar in the world was kept at sea permanently for the next 125 years, we could just about map the Earth at 100 metre resolution. Even the best systems create data swathes that are only around 10 kilometres wide. There is still a lot of work to do and any system that can undertake broad-scale mapping of the oceans will attract interest.

Consider the case of satellite-based altimeters. The European Space Agency runs an instrument called Radar Altimeter 2 (RA-2) on the Envisat spacecraft. RA-2 is designed to measure the exact distance to the sea surface by emitting radar pulses, and it achieves an accuracy of a few centimetres from 800 kilometres away. To further improve reliability, Doppler tracking and laser ranging from Earth help to keep precise tabs on Envisat’s position and speed.
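The ranging principle itself is simple two-way timing: the range is half the pulse’s round-trip time multiplied by the speed of light, before the atmospheric, tidal and instrument corrections that give the quoted centimetre-level accuracy. A minimal worked example follows; the round-trip time is invented for illustration.

```python
# Two-way radar ranging, as used by satellite altimeters such as RA-2.
# The round-trip time here is invented; a real altimeter applies atmospheric,
# tidal and instrument corrections before quoting centimetre-level heights.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def altimeter_range(round_trip_time_s: float) -> float:
    """Range from satellite to sea surface for a given pulse round-trip time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse returning after about 5.34 milliseconds corresponds to roughly 800 km.
print(f"range ≈ {altimeter_range(5.34e-3) / 1000:.0f} km")
```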

This is useful for mapping because, while the theoretical ellipsoidal shape of the globe fits the oceans remarkably well, the actual ocean surface deviates by up to 100 metres from this ideal ellipsoid. These bumps and dips in the ocean surface are caused by minute variations in the Earth’s gravitational field.

For example, the extra mass in a mountain on the ocean floor attracts water toward it, causing a local bump in the ocean surface. A typical undersea volcano is 2000 metres high and has a radius of about 20 kilometres. The water over it may be a metre or two higher than it would otherwise be. A trench operates in the other direction, and the local sea level will be slightly below the mean.

Given large amounts of coverage and almost unlimited data processing, it has been possible to reveal the sea floor in some detail. Two years’ worth of ERS-1 altimeter data was used to create a global sea floor map. Its accuracy was so high that the US Department of Defence subsequently declassified its own maps, which had been created over about five years by the Geosat satellite.

Scientists at the Scripps Institution of Oceanography in the US used these two datasets to produce the first complete map of the world’s ocean floor, which was published last year. The satellite-derived gravity grids reveal all the major structures of the ocean floor having widths greater than about 10 kilometres. This resolution roughly matches the total swathe width of the multi-beam mapping systems carried on ships (which themselves resolve detail at around 100 metres), so the gravity maps are the perfect reconnaissance tool for planning more detailed shipboard surveys.

The upshot is a global map that surveys every structure on the ocean floor more than 1000 metres high. About half were unknown before this work. It is probably too soon to know where the work will lead, but it would be surprising if geologists did not soon begin to use it in their quest for new discoveries.

Bear in mind that the first hard evidence for plate tectonics – now the fundamental theory of Earth science – came from the first great age of ocean exploration: the discovery of the Mid-Atlantic Ridge by HMS Challenger in 1872. There will be other such discoveries.

Jon Fairall is the editor of Position Magazine.

Issue 46; April – May 2010
 
 
