Cleaning up: is enough being done to progress oil spill technologies?
Six years on, the damage caused by the BP Deepwater Horizon spill to the nearby shoreline has been revealed in a new study conducted by the US Geological Survey and NASA. So was enough done following the spill to clean it up, and is the industry investing enough in new technology to ensure safe and efficient clean-up operations in the future? Molly Lempriere finds out.
The BP Deepwater Horizon oil spill in 2010 devastated an ecosystem, killed eleven people and highlighted the dangers of offshore drilling. The spill in the Gulf of Mexico - the biggest accidental marine spill in history - occurred when natural gas broke through a cement core on the Macondo exploration well, causing a fire that ravaged the platform and led to its collapse. In the process the riser broke and began to discharge oil into the gulf.
It took BP 87 days to stop the flow of oil into the gulf, and five months to seal the well permanently. During that time 4.9 million barrels of oil were lost into the water - on average more than 50,000 barrels a day. The effect on the surrounding ecosystem was disastrous, and despite efforts by the oil giant to control the damage, the effects are still clearly visible, including – as a new study points out – on the surrounding shoreline.
The study, conducted by the U.S. Geological Survey (USGS) with the California Institute of Technology, utilised NASA’s radar technology to reveal erosion on a huge scale. High-frequency radar monitoring was used to map the erosion, allowing the scientists to compare it with the damage caused by Hurricane Isaac.
What shoreline damage can we see?
“We showed that the spatially limited erosion in the year before the oil spill changed dramatically to widespread erosion in the two years following the 2010 oiling,” says USGS geophysicist Amina Rangoonwala.
The damaging effect of the oil on the marshlands has resulted in the shoreline receding by as much as 12m since 2009. “We measured erosion both before and after oiling. It’s hard to say exactly what the oil caused; however, what we saw is that the areas that were most heavily oiled or had medium oiling generally fell into the category of around four to twelve metres of recession,” says Cathleen Jones, a scientist at the California Institute of Technology’s Jet Propulsion Laboratory and co-author of the study.
The shoreline of the Mississippi River Delta has been eroding slowly for years, a process occasionally accelerated by storms. As USGS’s study explains, “petroleum has been observed to decrease belowground biomass and weaken soil in salt marshes. Heavily oiled marshes in upper Barataria Bay showed reduced biomass, lower soil shear strength, and higher erosion rates 3.5 years after exposure.”
The hydrocarbons attack plant root systems, destabilising the sediment that makes up the marsh and shorelines. This erosion is also more detrimental than that caused by natural events such as Hurricane Isaac: although the hurricane displaced sediment, it did not destroy the marsh itself.
What was done to prevent this erosion?
Following the spill, BP received 92,000 clean-up suggestions on its dedicated hotline between May and June 2010. The clean-up operation was huge, with as many as 7,500 BP employees, 170 U.S. Coast Guard vessels and 2,000 volunteers involved in the grassroots clean-up activity alone. On top of money donated by the company to the surrounding areas, BP spent more than $14bn on the clean-up operation.
Part of the initial problem with the spill was how to stem the flow; this persisted until mid-July, 87 days after the fire broke out, when BP finally stopped oil entering the Gulf. Several strategies were attempted, including a lower marine riser package cap that stemmed although did not stop the flow, before a capping stack shut in the well. In early August, a static kill took place, in which mud was pumped directly into the well, and a relief well permanently sealed it in September.
The clean-up took a range of different forms, from beach sweeping to the skimming and in-situ burning of the oil. Booms were used to control the flow of oil, containing it in specific areas and stopping oil slicks from reaching the shore. Much of this oil was skimmed from the top of the water in both deep and shallow areas. However, the efficiency of this technique is debated, with scientists at MIT claiming it is only 50% effective. Where appropriate, the Unified Command trapped oil within a fireproof boom and then set it alight.
One of the key techniques was the use of some 7 million litres of chemical dispersants. These are designed to break the oil down into droplets, helping the naturally occurring bacteria in the water to consume it. The dispersant used, Corexit, has been criticised for the damage it causes to marine life, with some suggesting it does more harm than good.
A study published in the Proceedings of the National Academy of Sciences in 2015 went further, suggesting that the dispersants were not only dangerous but also ineffective. In its experiments, “dispersants did not enhance heterotrophic microbial activity or hydrocarbon oxidation rates.” With 24–55% of the oil lost in the spill still unaccounted for according to the study, the success of the dispersants is questionable. Samantha Joye, a co-author of the paper, suggested that the lost oil was sitting on the sea floor instead of having been biodegraded by the dispersants.
Could more have been done?
A spill on the scale of the Deepwater Horizon is unprecedented and the clean-up was never going to be easy. But could it have been done better? “There are many technologies for collecting oil in the open water,” Jones says. “There are different ways to boom the oil, collect the oil, and to remediate the oil when it’s in the open water… it's a fairly common technology and it's one that is being advanced all the time.”
In 2010, BP spent no money on research into accident recovery and response, although it did state that it supported independent organisations specialising in the area. Following the spill, the company set up the Gulf of Mexico Research Initiative (GoMRI), providing long-term grants to scientists at gulf-state universities. Launched in 2011, the initiative will distribute $500m through 2020 to develop better responses as well as track the environmental impact of spills both at sea and on land.
There has been a surge in technologies designed for cleaning up oil spills, including magnetised oil and absorbent polymers. These technologies predominantly focus on capturing the oil while it is at sea, but as Jones points out, “you would want to prevent the oil from washing ashore, at least in large amounts, during a major spill”.
Two years after the spill, MIT developed water-repellent ferrous nanoparticles that allow oil to be removed from water using magnets. The technique relies on ferrofluids: magnetic liquids created by suspending magnetic nanoparticles in oil, which then respond strongly to magnetic fields. During a spill, the nanoparticles would be mixed into the spilled oil, and the oil and water mixture filtered past a magnet on board a vessel; the now-magnetic oil is attracted out of the stream, leaving just the water.
Even better, these nanoparticles can then be removed from the oil and reused, turning the recovered oil into a commercial resource. The technology would work extremely quickly, is effective on both oil slicks and sunken oil, and could be appealing to oil companies, which could retrieve their commodity and offset the cost of the clean-up.
Alternatively, scientists presenting to the American Chemical Society have created a technology they claim could be “a complete solution to combating oil spills”: a superabsorbent polymer that can soak up as much as 45 times its own weight in oil. It is buoyant, stable and easily recoverable, and once removed from the water the oil-swollen gel is suitable for regular oil refining techniques.
Meanwhile, the USGS believes that surveys such as its own, which utilise scientific cooperation and high-frequency mapping, will play their part. “We think that this particular method, and many others as well, will allow us to do higher frequency monitoring of the changes in the marshes, and changes in the shoreline. That could help mitigate loss, or at least measure irreversible loss,” study co-author Elijah Ramsey says.
Governments step up their regulations
Following the Deepwater Horizon spill, regulations around the world have been tightened to prevent another such disaster occurring. As a result of the 1989 Exxon Valdez oil spill, the Oil Pollution Act was brought into force in 1990. It ensures that the oil company foots the bill for any clean-up operations following a spill, meaning BP was responsible for paying for clean-up efforts in the Gulf of Mexico. However, the act does not require any investment in recovery technologies.
The Obama administration timed its most recent regulations to coincide with the anniversary of the BP spill. These impose stricter requirements on blow-out preventers, along with tighter rules for the design of offshore wells, specifically their lining. The regulations call for annual external inspections as well as constant underwater monitoring.
With a view towards growing interest in the Arctic region, oil companies will also be required to submit detailed Arctic-specific safety plans before any exploration is allowed within Arctic waters, including proving their ability to drill a relief well. However, there is still no requirement to invest in clean-up technology.
The Deepwater Horizon oil spill has caused a vast amount of environmental damage. Ramsey notes that “in the year following the oil spill there was an anomalous decrease in biomass of the marsh behind the heavily oiled shorelines”. The clean-up methods used were inadequate given the sheer expanse of the damage, and newer technologies could improve efforts if a similar spill happened today. However, investment in such research remains limited, and regulators still do not require it.