Today, Science Daily posted a press release by Spanish news service Plataforma SINC about a new population study with a surprising prediction: By 2100, the world will have about 800 million … fewer people than it does now. According to the researchers (at the Universidad Autónoma de Madrid), their model forecasts fit well with the UN’s lowest population projections, a scenario under which population rises to ~8 billion by 2050, then stops growing and gradually declines. Says co-author Félix F. Muñoz:
“Overpopulation was a spectre in the 1960s and 70s but historically the UN’s low fertility variant forecasts have been fulfilled”
What. WHAT?? What about everything everyone’s been telling us for decades — about how the population will keep on growing until we’re like sardines in a can (if sardines still exist)? What about Mega-City One? What about Soylent Green? What about [insert dystopian future of choice here]? Monsanto’s main marketing argument’s been based on this for years: “we need to feed more people on less land,” etc. In the Millennium Ecosystem Assessment, all four future scenarios were based on projections of population growth to at least 8 billion by 2050 — and the MA’s outlook for ecosystems was quite grim because of this.
So what gives? Can we stop worrying now? Can we start dreaming again about a future that doesn’t suck?
I hate to dwell on the negative (okay, I love to dwell on the negative), but even the UN doesn’t think its lowball population scenario is likely. Their Population Division is betting on the “medium variant” scenario, where we have 9.3 billion people by 2050 and level off around 10 billion by the end of the century. Their “high projection variant” predicts 15.3 billion people by 2100. So let’s not get cocky, kids. Even under the medium scenario, we’re adding more than 2 billion people by 2050 — 3 billion by century’s end. You might want to buy real estate before all the good cans are gone.
On the other hand, don’t give up hope.
It’s hard to write a good executive summary, especially for people like me, who prefer to ramble on in a stream-of-consciousness fashion for at least 20 pages. But the intro to this policy brief is one of the best summaries I have ever encountered — not only of ecosystem services (or “environmental capital”), but also of just about every environmental problem ever. For example:
The root causes of the degradation of environmental capital are the combined pressures of population growth, rising affluence, and frequent reliance on environmentally disruptive technologies to meet the associated material demands. All of these factors are compounded by bad management, traceable in part to under-appreciation of the importance of environmental capital for human well-being and to the exclusion of the value of its services from the economic balance sheets of producers and consumers. …
In the absence of [government] intervention, individuals and firms are able to capture the benefits of activities that produce … ecosystem disruption but are able to avoid most of the attendant damages, which are spread across society. … Private firms and individuals have little incentive, absent requirements imposed by government, to invest in maintaining or growing capital of this kind.
It is now much clearer than before that the historic drivers of degradation of environmental capital—replacement of complex natural ecosystems with simpler man-made ones, invasive species, overexploitation of commercially valuable plants and animals, chemical pollution—are being compounded and amplified to a rapidly growing degree by global climate change.
Anyway, it’d better be good — it’s for the executive-est of executives.
More evidence that global warming and deforestation are linked: peat swamps in Indonesia that have been clear-cut for agriculture are releasing 50 percent more carbon than swamps that are still forested. And no, that isn’t because of decayed or burning waste from the clear-cutting; the carbon measured by Open University researchers in deforested swamps was older than carbon that came from swamps where the forest ecosystem was still intact, and appeared to come from much deeper in the sediment.
This would appear to contradict the conclusions of a 2011 study that suggested peat bogs were unlikely to release a great deal of carbon; however, that study specifically examined conditions in the Northern Hemisphere, rather than the humid tropics. If there is a difference with latitude, though, the high rate of tropical deforestation should raise even more concern in light of the OU findings. A lot of Indonesia’s forests are being cleared right now to make way for agriculture, including palm oil and paper plantations (see Asia Pulp & Paper and Girl Scout Cookies for two of many reasons why, though we’re making some progress there). This is not only bad news for carbon storage, but also a culprit in the rapid disappearance of iconic species like Sumatran tigers and orangutans.
Now here is a situation where carbon markets could make a huge difference. Destroying rainforest and releasing peat gases should be hurting somebody’s pocketbooks, not making them more money. As with so many environmental issues, the global/public costs (increased global warming, loss of beloved species and their less-well-known supporters, and probably pollution/runoff issues too) don’t count for much against private revenues. Not that we can chide Indonesia (or its businesses and small farmers) for this when we’re the ones consuming so much palm oil and paper. Carbon trading would be nice, but let’s face it: we’re already trading whenever we buy a candy bar.
In 1997, a paper published in the journal Nature tried to estimate the financial value of all the services performed by Earth’s ecosystems (for instance, pollinators contribute to agricultural yields, beaches to tourism, wetlands to water quality improvement). The figure its authors (Robert Costanza of the University of Maryland in the lead) came up with was $33 trillion — roughly twice the value of the entire global economy at the time. Since its debut, the paper, “The value of the world’s ecosystem services and natural capital,” has been criticized at almost every level. Even its authors acknowledged that their analysis was crude and their methods questionable. But the paper was not published as an example of analytical precision. Rather, the point was to call attention to something that is too often overlooked: the economic importance of a healthy biosphere.
According to this relatively new school of thinking, we may put commodity prices on goods such as food products, game, and timber, and services such as water purification, but we fail to take into account (economically) the complex and interrelated natural systems that make the provision of these goods and services possible. For instance, we pay money for a fruit crop, but place no monetary value on the animals that pollinate the plants that produce this fruit, or the soil microbes that provide the plants with nutrients, etc. As Pavan Sukhdev says in his TED talk, “When was the last time a bee gave you an invoice?” Sukhdev believes that we treat natural systems so cavalierly because we are not confronted with their true value in terms we would understand — a problem he refers to as “the economic invisibility of nature.” He argues that we must therefore integrate the value of ecosystems into world markets. If people start to view ecosystem destruction as a bad idea financially, similar to walking into a Swarovski store and smashing all the crystal-ware with a hammer, they may be more motivated to preserve these systems.
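For the curious, here’s roughly how economists put a number on a service no bee ever invoiced — a minimal sketch of the “dependence ratio” approach sometimes used to value pollination. Every crop value and dependence fraction below is hypothetical, purely for illustration; none of it comes from the studies mentioned here.

```python
# Illustrative sketch of the "dependence ratio" method for valuing pollination.
# All numbers are hypothetical, not from Costanza et al. or Sukhdev's work.

def pollination_value(crop_value, dependence):
    """Portion of a crop's market value attributable to animal pollinators.

    crop_value: annual market value of the crop (dollars)
    dependence: fraction of yield lost without animal pollination (0 to 1)
    """
    return crop_value * dependence

# Hypothetical regional crops: (annual value in dollars, pollinator dependence)
crops = {
    "almonds": (1_000_000, 0.90),  # almost entirely pollinator-dependent
    "apples":  (  500_000, 0.65),
    "wheat":   (2_000_000, 0.00),  # wind-pollinated: no pollinator value
}

total = sum(pollination_value(value, dep) for value, dep in crops.values())
print(f"Invisible pollination subsidy: ${total:,.0f}")  # the bee's unsent invoice
```

The point of the exercise is the one Sukhdev makes: the wheat line contributes nothing, but almonds and apples quietly depend on a service that appears on nobody’s balance sheet.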
Some environmentalists have embraced this idea. Personally, I think commoditization and consumerism are two of the basic reasons why the balance of the global ecosystem has gotten so screwed up lately, and I’m skeptical about the market’s ability to solve a problem it helped create. But I’m learning a great deal more about the “ecosystem services” school of thought now… We’ll see if pragmatism can win me over.
Stay tuned. Or whatever.
Tonight’s theme: Whodunit?(!)
Some foul villain has been contaminating the air in coastal California with that wonderfully toxic element, mercury! Water vapor in summer fogs in 2011 and 2012 was found to contain elevated mercury levels, and brave researchers from UC Santa Cruz set out with only their uh … fog-sampling equipment and other measurement tools … to find the culprit. From the temperate forests to the seaside, they tracked the toxin across the region until, after a terrifying* hunt, our heroes came face to face with the vicious polluter, who snarled defiance at them thusly: “Ka-shwoosh.”
No rly. The researchers measured mercury levels in different strata of ocean water and concluded that the increased mercury in coastal fog was caused by an upwelling of deep-ocean water: mercury carried to the surface evaporates from the upper water layers and gets incorporated into the fog.
THE FOG OF DEATH.
No! Don’t panic! It’s okay! Peter Weiss-Penzias, the UC Santa Cruz toxicologist who led the investigation, was careful to note that even at these elevated levels, the amount of mercury in fog poses no health risk to humans — the increase his team found was at the scale of parts per trillion. However, a National Geographic article notes that increased deposition of mercury in fog could lead to bioaccumulation in dangerous concentrations in animal and plant tissues.
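If you’re wondering how parts-per-trillion could ever hurt anything, here’s a back-of-the-envelope sketch of biomagnification, the mechanism behind that National Geographic caveat. The fog concentration and the per-trophic-level magnification factor below are both hypothetical, chosen only to show how the multiplication compounds up a food chain.

```python
# Rough illustration (hypothetical numbers) of why parts-per-trillion mercury
# in fog can still matter: methylmercury biomagnifies, multiplying in
# concentration at each step up the food chain.

FOG_HG_PPT = 10.0       # hypothetical fog concentration, parts per trillion
BIOMAG_FACTOR = 100.0   # hypothetical per-trophic-level magnification

conc = FOG_HG_PPT
for level in ["lichen and plants", "herbivores", "predators"]:
    conc *= BIOMAG_FACTOR
    print(f"{level}: {conc:,.0f} ppt")

# Three trophic steps turn 10 ppt into 10,000,000 ppt — i.e., 10 ppm.
```

With these made-up factors, a vanishingly small concentration in fog drip becomes a seven-orders-of-magnitude larger one in top predators, which is why “no direct risk to humans breathing fog” and “possible risk via bioaccumulation” aren’t contradictory.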
Now, before you go blaming the ocean for all your problems (“the ocean stole my prom date!” etc.), the Science Daily article mentions that this mercury might originally have come from … humans! Excess mercury has been accumulating in the environment for more than a century (since the start of the Industrial Revolution), and it may have built up in ocean sediments and then been slowly transported back to the surface via the global conveyor belt. In other words, this may be evidence of environmental impacts playing out over a much longer time span than we’re used to thinking about.
Or maybe it was the butler.
*by which I mean “probably not that terrifying”
Here’s your environmental LOLZ for the day: Gulf of Mexico clean-up makes 2010 spill 52 times more toxic — mixing oil with dispersant increased toxicity to ecosystems
That’s right. It turns out that, according to a study by Georgia Tech and the Universidad Autónoma de Aguascalientes (published in the journal Environmental Pollution), the dispersant chemicals BP used to deal with the crude oil may have ended up making the Deepwater Horizon spill 52 times more toxic than it would have been otherwise.
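For the curious: “X times more toxic” figures like this typically come from comparing LC50s — the concentration lethal to half the test animals — with and without the dispersant. A minimal sketch of that arithmetic, using made-up concentrations rather than the study’s actual values:

```python
# Hypothetical sketch of how a fold-increase in toxicity is derived from
# LC50 data. The concentrations below are invented for illustration; see
# the Environmental Pollution study for the real rotifer numbers.

def fold_increase(lc50_alone, lc50_mixture):
    """A LOWER LC50 means MORE toxic, so the fold-increase is alone/mixture."""
    return lc50_alone / lc50_mixture

lc50_oil_alone   = 26.0  # ppm, hypothetical: crude oil by itself
lc50_oil_corexit = 0.5   # ppm, hypothetical: oil + Corexit mixture

print(f"{fold_increase(lc50_oil_alone, lc50_oil_corexit):.0f}x more toxic")
```

The counterintuitive bit is the inversion: the mixture is worse precisely because it takes *less* of it to kill half the rotifers, so the smaller LC50 goes in the denominator.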
It’s not like nobody was concerned about this at the time. After all, if you’re trying to protect important marine ecosystems, the last thing you would normally do is spray detergent all over them. But you weigh your choices, and both BP and EPA felt it was better to try breaking up the oil slick into smaller globlets that, well, dispersed more easily and might be carried away from vulnerable coastlines.
Little blobs of oil also have way more surface area than one contiguous slick, which means that the blobs might be more quickly broken down by plankton and bacteria. Many types of small ocean critters will eat crude oil — another recent study by several Alabama institutions indicates that plankton populations increase when exposed to crude — and the authorities wanted to make the oil as bite-size-snackable as possible for the little guys.
Unfortunately, though, it’s hard for our planktonic pals to eat up your crude oil when you have killed them all off, and the GT-UAA study shows that the dispersant BP used, Corexit, is extremely toxic to planktonic rotifers when combined with oil. The Alabama study found that many types of plankton died when exposed to mixtures of Corexit and crude oil. Since these animals are mostly producers and primary consumers, a mass die-off could have reverberations throughout the Gulf food web. We haven’t seen evidence that this is happening on a large scale, but given that BP used more than 1.8 million gallons of dispersant in the Gulf, and given that effects of oil spills may not show up until years later, I wouldn’t count my baby rotifers before they hatch.
IMPORTANT FACT: It was EPA regulations that required BP to use dispersants. Obviously they need to review such policies in light of these studies. And in light of being sued by several environmental groups. But hey — at least there was ONE THING that wasn’t BP’s fault, right? It’s not like they knew this would happen, and they did their best to ensure that they harmed the environment as little as possible … right?
Weeeeeell, kind of. Except that Corexit was known to have major toxicity issues — more so than available alternatives. In fact, according to a Popular Science article and a more recent AP article, EPA tried to tell them to use a less-toxic dispersant, but BP refused and said, basically, “You don’t know it’s that bad.”
I’m sorry, aren’t you the guys who said the Deepwater Horizon rig was safe?