Archive for January, 2016

Jan 29 2016

U.S. Fisheries Management Clears High Bar for Sustainability Based on New Assessment


January 28, 2016 — The following was released by NOAA Fisheries:

Today, NOAA Fisheries announced the publication of a peer-reviewed self-assessment showing that the standards of the United States fishery management system under the Magnuson-Stevens Act more than meet the criteria of the United Nations Food and Agriculture Organization's ecolabelling guidelines. These same guidelines serve as the basis for many consumer seafood certification and ranking schemes. The assessment demonstrates that the U.S. fisheries management system is particularly strong on responsiveness and science-based criteria. Beyond the biological and ecosystem criteria, the assessment also pointed out that the U.S. system incorporates the social and economic components of fisheries essential for effective long-term stewardship.

This assessment was authored by Dr. Michelle Walsh, a former NOAA Fisheries Knauss Fellow and current member of the Marine Science Faculty at Florida Keys Community College. Walsh evaluated the sustainability of how U.S. federal fisheries are managed using the FAO’s Guidelines for the Ecolabelling of Fish and Fishery Products from Marine Capture Fisheries. These guidelines are a set of internationally recognized criteria used to evaluate the sustainability of fisheries around the world.

“While the performance of U.S. fisheries clearly illustrates that the U.S. management system is effective, my colleagues and I wanted to evaluate the U.S. approach to fisheries management as a whole against these international guidelines for ecolabelling seafood,” said Walsh.

Walsh found that the U.S. federal fisheries management system meets all of the FAO guidelines for sustainability. In particular, the assessment highlighted some key strengths of the U.S. system (represented by white/green dots on the infographic), including:

  • Complying with national and international laws
  • Developing and abiding by documented management approaches with frameworks at national or regional levels
  • Incorporating uncertainty into stock reference points and catch limits while taking actions if those limits are exceeded
  • Taking into account the best scientific evidence in determining suitable conservation and management measures with the goal of long-term sustainability
  • Restoring stocks within reasonable timeframes

Evaluating Sustainability

“Sustainability” is about meeting the needs and wants of current generations without compromising those of future generations (WCED, 1987; United Nations, 1987). However, evaluating sustainability can become considerably more complex in the context of wild-caught fisheries in the dynamic ocean environment, where population trends and environmental conditions are often unclear or unknown.

Due to this complexity, many certification schemes assess sustainability on a fishery-by-fishery basis by evaluating discrete management approaches (such as gear type) and current stock status at a snapshot in time. This assessment, on the other hand, evaluates the U.S. management system as a whole against the FAO guidelines for ecolabels. It evaluates the capacity of the management system to respond to changes in stock levels and adapt to changing conditions via management measures that maintain sustainability over the long-term.

Jan 27 2016


— Posted with permission of SEAFOODNEWS.COM. Please do not republish without their permission. —

Copyright © 2016

Seafood News

SEAFOODNEWS.COM by John Sackton – January 25, 2016 — Last week the media was full of a new round of global fishery disaster stories, prompted by an article in Nature Communications by Daniel Pauly and Dirk Zeller, affiliated with the Sea Around Us project.

Pauly and Zeller state that the FAO global fisheries data have underestimated prior catch, and that once this is taken into account, the decline in fish catch from the peak in the late 1990s is not 400,000 tons per year but 1.2 million tons per year.

“Our results indicate that the decline is very strong and is not due to countries fishing less. It is due to countries having fished too much and having exhausted one fishery after another,” Pauly told the Guardian newspaper. As a result, a new round of handwringing about global overfishing ensued.

But the facts don’t support Pauly’s interpretation. Catch rates are simply not a suitable measure of fisheries abundance. In fact, declines in catch rates are often due to improvements in fisheries management, not declines in abundance.

Over at cfood, a number of scientists specifically rebutted the premise of Pauly’s article.

Ray Hilborn of the University of Washington says:

This paper tells us nothing fundamentally new about world catch, and absolutely nothing new about the status of fish stocks.

It has long been recognized that by-catch, illegal catch and artisanal catch were underrepresented in the FAO catch database, and that by-catch has declined dramatically.

What the authors claim, and the numerous media have taken up, is the cry that their results show that world fish stocks are in worse shape than we thought. This is absolutely wrong. We know that fish stocks are stable in some places, increasing in others and declining in yet others.

Most of the major fish stocks of the world, constituting 40% of the total catch, are scientifically assessed using a mixture of data sources, including data on trends in the abundance of the fish stocks, size and age data of the fish caught, and other information as available. This paper really adds nothing to our understanding of these major fish stocks.

Another group of stocks, constituting about 20% of global catch, are assessed using expert knowledge by the FAO. These experts use their personal knowledge of these fish stocks to provide an assessment of their status. Estimating the historical unreported catch for these stocks adds nothing to our understanding of these stocks.

For many of the most important stocks that are not assessed by scientific organizations or by expert opinion, we often know a lot about their status. For example, abundance of fish throughout almost all of South and Southeast Asia has declined significantly. This is based on the catch per unit of fishing effort and the size of the individuals being caught. Estimating the amount of other unreported catches does not change our perspective on the status of these stocks.

In the remaining fisheries where we know little about their status, does the fact that catches have declined at a faster rate than reported in the FAO catch data tell us that global fisheries are in worse shape than we thought? The answer is not really. We would have to believe that the catch is a good index of the abundance.

Figure 1 of the Pauly and Zeller paper shows that a number of major fishing regions have not seen declines in catch in the last 10 years. These areas include the Mediterranean and Black Sea, the Eastern Central Atlantic, the Eastern Indian Ocean, the Northwest Pacific and the Western Indian Ocean. Does this mean that the stocks in these areas are in good shape, while areas that have seen significant declines in catch like the Northeast Atlantic, and the Northeast Pacific are in worse shape?

We know from scientific assessments that stocks in the Mediterranean and Eastern Central Atlantic are often heavily overfished – yet catches have not declined.

We know that stocks in the Northeast Pacific are abundant, stable and not overfished, and in the Northeast Atlantic are increasing in abundance. Yet their catch has declined.

Total catch, and declines in catch, are not a good index of the trends in fish stock abundance.

Michel J. Kaiser of Bangor University commented:

Catch and stock status are two distinct measurement tools for evaluating a fishery, and suggesting inconsistent catch data is a definitive gauge of fishery health is an unreasonable indictment of the stock assessment process. Pauly and Zeller surmise that declining catches since 1996 could be a sign of fishery collapse. While they do acknowledge management changes as another possible factor, the context is misleading and important management efforts are not represented. The moratorium on cod landings is a good example – zero cod landings in the Northwest Atlantic does not mean there are zero cod in the water. Such distinctions are not apparent in the analysis.

Also David Agnew, director of standards for the Marine Stewardship Council, said:

It is noteworthy that the peak of the industrial catches – in the late 1990s/early 2000s – coincidentally aligns with the start of the recovery of many well managed stocks. This point of recovery has been documented previously and particularly relates to the recovery of large numbers of stocks in the north Pacific, the north Atlantic and around Australia and New Zealand, and mostly to stocks that are assessed by analytical models. For stocks that need to begin recovery plans to achieve sustainability, this most often entails an overall reduction in fishing effort, which would be reflected in the reductions in catches seen here. So, one could attribute some of the decline in industrial catch in these regions to a correct management response to rebuild stocks to a sustainable status, although I have not directly analyzed the evidence for this. This is therefore a positive outcome worth reporting.

This opinion piece originally appeared on SEAFOODNEWS.COM, a subscription site. It has been reprinted with permission.


Jan 23 2016



January 22, 2016—The following is commentary from Michel J. Kaiser of Bangor University and David Agnew of the Marine Stewardship Council concerning the recently published article, “Catch Reconstructions Reveal that Global Marine Fisheries Catches are Higher than Reported and Declining” by Daniel Pauly and Dirk Zeller in Nature Communications.

A new paper led by Daniel Pauly of the University of British Columbia found that global catch data, as reported to the FAO, is significantly lower than the true catch numbers. “Global fish catches are falling three times faster than official UN figures suggest, according to a landmark new study, with overfishing to blame.”

Four hundred researchers spent the last decade accumulating missing global catch data from small-scale fisheries, sport fisheries, illegal fishing activity, and fish discarded at sea, which FAO statistics “rarely include.”

“Our results indicate that the decline is very strong and is not due to countries fishing less. It is due to countries having fished too much and having exhausted one fishery after another,” Pauly says.

Despite these findings, Pauly doesn’t expect countries to realize the need to rebuild stocks, primarily because the pressures to continue current fishing effort are too strong in the developing world. But this study will allow researchers to see the true problems more clearly and hopefully inform policy makers accordingly.

Comment by Michel J. Kaiser, Bangor University, @MicheljKaiser

Catch and stock status are two distinct measurement tools for evaluating a fishery, and suggesting inconsistent catch data is a definitive gauge of fishery health is an unreasonable indictment of the stock assessment process. Pauly and Zeller surmise that declining catches since 1996 could be a sign of fishery collapse. While they do acknowledge management changes as another possible factor, the context is misleading and important management efforts are not represented. The moratorium on cod landings is a good example – zero cod landings in the Northwest Atlantic does not mean there are zero cod in the water. Such distinctions are not apparent in the analysis.

Another key consideration missing from this paper is varying management capacity. European fisheries are managed more effectively and provide more complete data than Indian Ocean fisheries, for example. A study that aggregates global landings data is suspect because indeed landings data from loosely managed fisheries are suspect.

Finally, the authors’ estimated catch seems to mirror that of the official FAO catch data, ironically proving its legitimacy. “Official” FAO data is not considered to be completely accurate, but rather a proportionate depiction of global trends. Pauly’s trend line is almost identical, just shifted up the y-axis, and thus fails to significantly alter our perception of global fisheries.

Michel J. Kaiser is a Professor of Marine Conservation Ecology at Bangor University. Find him on Twitter: @MicheljKaiser.

Comment by David Agnew, Director of Standards, Marine Stewardship Council

The analysis of such a massive amount of data is a monumental task, and I suspect that the broad conclusions are correct. However, as is usual with these sorts of analyses, when one gets to a level of detail where the actual assumptions can be examined, in an area in which one is knowledgeable, it is difficult to follow all the arguments. The Antarctic catch “reconstruction” apparently is based on one Fisheries Centre report (2015, Volume 23, Number 1) and a paper on fishing down ecosystems (Ainley and Pauly 2014, Polar Record). The only “reconstruction” appears to be the addition of IUU and discard data, all of which are scrupulously reported by CCAMLR anyway, so they are not unknown. But there is an apparent 100,000 t “unreported” catch in the reconstruction in Figure 3, Atlantic, Antarctic (48). This cannot include the Falklands (part of the Fisheries Centre paper), and it is of a size that could only be an alleged misreporting of krill catch in 2009. This is perhaps an oblique reference to concerns that CCAMLR has had in the past about conversion factors applied to krill products, or perhaps unseen (net-impact) mortality, but neither of these elements has been substantiated, nor referenced in the supporting documentation that I have seen (although I could not access the Polar Record paper).

The paper does not go into much detail on the reasons for the observed declines in catches and discards, except to attribute them both to reductions in fishing mortality attendant on management action to reduce mortality and generate sustainability, and to declines in areas that are not managed. It is noteworthy that the peak of the industrial catches – in the late 1990s/early 2000s – coincidentally aligns with the start of the recovery of many well managed stocks. This point of recovery has been documented previously (Costello et al. 2012; Rosenberg et al. 2006; Gutierrez et al. 2012) and particularly relates to the recovery of large numbers of stocks in the north Pacific, the north Atlantic and around Australia and New Zealand, and mostly to stocks that are assessed by analytical models. For stocks that need to begin recovery plans to achieve sustainability, this most often entails an overall reduction in fishing effort, which would be reflected in the reductions in catches seen here. So, one could attribute some of the decline in industrial catch in these regions to a correct management response to rebuild stocks to a sustainable status, although I have not directly analyzed the evidence for this. This is therefore a positive outcome worth reporting.

The above-reported inflection point is also coincident with the launch of the MSC’s sustainability standard. These standards have now been used to assess almost 300 fisheries, and have generated environmental improvements in most of them (MSC 2015). Stock sustainability is part of the requirements of the standard, and previous analyses (Gutierrez et al. 2012; Agnew et al. 2012) have shown that certified fisheries have improved their stock status and achieved sustainability at a higher rate than uncertified fisheries. The MSC program does not claim responsibility for the turn-around in global stocks, but along with other actions – such as those taken by global bodies such as FAO, by national administrations, and by industry and non-Governmental Organisations – it can claim to have provided a significant incentive for fisheries to become, and then remain, certified.

David Agnew is the Director of Standards at the Marine Stewardship Council, the largest fishery sustainability ecolabel in the world. You can follow MSC on twitter.

Read the commentary at CFOOD


Jan 22 2016



January 20, 2016—The following is a commentary from Dr. Ray Hilborn, Professor in the School of Aquatic and Fishery Sciences at the University of Washington, concerning the recently published article, “Eating Right Can Save the World” by Tim Zimmerman in Outside Magazine.

“The endless cascade of nutritional information—about localism, vegetarianism, veganism, organic food, the environmental impact of eating meat, poultry, or fish, and more—makes the simple goal of a healthy, sustainable diet seem hopelessly complex. We talked to scientists, chefs, and farmers to get the ultimate rundown on how you should fuel up.”

Author Tim Zimmerman’s discussion of this topic focuses primarily on the carbon footprint of different foods. When it comes to seafood, he cites Dalhousie University professor Peter Tyedmers, who argues, “when it comes to nitrogen and phosphorous, greenhouse gases, and other global-scale phenomena, absolutely most seafood is much better than most terrestrial animal production.”

But seafood sustainability certifications like Monterey Bay Aquarium’s Seafood Watch do not factor emissions into their ratings. For example, some pot-caught species considered a “Best Choice” or “Good Alternative” by Seafood Watch standards might actually have greater greenhouse gas emissions than beef because of the energy-intensive extraction practices of this gear type.

Zimmerman recommends mussels, clams, forage fish that “aren’t caught by a trawler,” and pollock from the “reasonably managed” Alaskan pollock fishery as the best choices for consumers looking to maximize Seafood Watch ratings and minimize carbon footprint. Aquaculture is more complicated because the more “sustainable” choices are typically from closed recirculating systems that require more energy and water than open net pens.

Comment by Ray Hilborn, University of Washington, @hilbornr

At last – a great article on the environmental impacts of our food choices. The material on fish is particularly good and relies on the acknowledged world expert Peter Tyedmers. In the past decade, the environmental impacts of capture fisheries have been put under the microscope. A tour through high-end grocery stores will show you labels about which fish are “sustainable,” but step over to the meat or vegetable counter and there is no sustainability labeling, just information such as “organic” or “GMO free.” The implication is that meat and vegetables are obviously sustainable, as farming has been practiced for thousands of years.

In his book The Perfect Protein, Andy Sharpless, CEO of the environmental non-profit Oceana, points out that fish are caught without fertilizer, pesticides, antibiotics or freshwater. Combine that with the generally low carbon footprint of most fisheries compared to the protein alternatives, and you have “The Perfect Protein.”

Another issue that has had great publicity in fisheries is biodiversity impacts, often through by-catch of non-target species. Species like sharks, turtles, marine birds and mammals are caught in some kinds of fishing gear. While such by-catch can often be largely eliminated by good fishing practices, there is no denying that fishing impacts biodiversity. But again we see agriculture and livestock getting an almost clean bill of health. This ignores the fact that agriculture transforms land dramatically: in most farmed areas the native large animals are essentially gone. While it is hard to compare the oceans directly, they have seen far less loss of biodiversity than farmed areas.

Zimmerman and Outside Magazine provide an excellent and much-needed perspective on most of these issues; let’s hope that it is the start of a rational conversation on the environmental impacts of what we eat.

Ray Hilborn is a Professor in the School of Aquatic and Fishery Sciences at the University of Washington. Find him on Twitter: @hilbornr


Read the commentary at CFOOD


Jan 21 2016

2015 Was Hottest Year in Historical Record, Scientists Say

Clockwise from top left: A family sleeping on the roof of a house in New Delhi last May; people navigating a flooded street in a canoe in Arnold, Mo., on Dec. 31; tourists in a haze-shrouded Singapore last September; the drought-stricken Molatedi Dam in South Africa in November. Credit, clockwise from top left: Tsering Topgyal/Associated Press; Jeff Roberson/Associated Press; Edgar Su/Reuters; Stuart Graham/Associated Press

Scientists reported Wednesday that 2015 was the hottest year in the historical record by far, breaking a mark set only the year before — a burst of heat that has continued into the new year and is roiling weather patterns all over the world.

In the contiguous United States, the year was the second-warmest on record, punctuated by a December that was both the hottest and the wettest since record-keeping began. One result has been a wave of unusual winter floods coursing down the Mississippi River watershed.

Scientists started predicting a global temperature record months ago, in part because an El Niño weather pattern, one of the largest in a century, is releasing an immense amount of heat from the Pacific Ocean into the atmosphere. But the bulk of the record-setting heat, they say, is a consequence of the long-term planetary warming caused by human emissions of greenhouse gases.

“The whole system is warming up, relentlessly,” said Gerald A. Meehl, a scientist at the National Center for Atmospheric Research in Boulder, Colo.

It will take a few more years to know for certain, but the back-to-back records of 2014 and 2015 may have put the world back onto a trajectory of rapid global warming, after a period of relatively slow warming dating to the last powerful El Niño, in 1998.

Politicians attempting to claim that greenhouse gases are not a problem seized on that slow period to argue that “global warming stopped in 1998,” with these claims and similar statements reappearing recently on the Republican presidential campaign trail.

Statistical analysis suggested all along that the claims were false, and that the slowdown was, at most, a minor blip in an inexorable trend, perhaps caused by a temporary increase in the absorption of heat by the Pacific Ocean.

“Is there any evidence for a pause in the long-term global warming rate?” said Gavin A. Schmidt, head of NASA’s climate-science unit, the Goddard Institute for Space Studies, in Manhattan. “The answer is no. That was true before last year, but it’s much more obvious now.”


The Hottest Year on Record

Globally, 2015 was the warmest year in recorded history.

(Charts: how far above or below average temperatures were in 2015, and average global surface air temperatures over time, each compared with the average from 1901 to 2000.)

Michael E. Mann, a climate scientist at Pennsylvania State University, calculated that if the global climate were not warming, the odds of setting two back-to-back record years would be remote, about one chance in every 1,500 pairs of years. Given the reality that the planet is warming, the odds become far higher, about one chance in 10, according to Dr. Mann’s calculations.
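Dr. Mann's published analysis works from the actual statistics of the climate record; purely as an illustration of the logic, the gap between those two odds can be sketched with a simple Monte Carlo simulation. All the numbers below (a 0.1 degree C white-noise year-to-year variability, a 1 degree C per century trend, a 135-year record) are illustrative assumptions, not Dr. Mann's inputs:

```python
import numpy as np

rng = np.random.default_rng(42)

def back_to_back_record_prob(trend_per_year, n_years=135,
                             n_trials=20000, noise_sd=0.1):
    """Estimate the chance that the last two years of a simulated
    temperature series are both all-time records."""
    years = np.arange(n_years)
    # Each row is one simulated history: an optional linear warming
    # trend plus white-noise interannual variability (degrees C).
    temps = trend_per_year * years + rng.normal(
        0.0, noise_sd, (n_trials, n_years))
    penultimate_is_record = temps[:, -2] > temps[:, :-2].max(axis=1)
    last_is_record = temps[:, -1] > temps[:, :-1].max(axis=1)
    return (penultimate_is_record & last_is_record).mean()

p_no_warming = back_to_back_record_prob(trend_per_year=0.0)
p_warming = back_to_back_record_prob(trend_per_year=0.01)  # ~1 C/century
print(f"no trend: {p_no_warming:.5f}, warming trend: {p_warming:.3f}")
```

With pure white noise the no-trend odds come out even longer than Dr. Mann's one-in-1,500 figure, since real climate variability is autocorrelated in a way that makes record runs somewhat more likely; adding even a modest warming trend raises the probability by orders of magnitude, consistent with the direction of his result.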

Two American government agencies — NASA, the National Aeronautics and Space Administration, and NOAA, the National Oceanic and Atmospheric Administration — compile separate analyses of the global temperature, based upon thousands of measurements from weather stations, ships and ocean buoys scattered around the world. Meteorological agencies in Britain and Japan do so, as well. The agencies follow slightly different methods to cope with problems in the data, but obtain similar results.

The American agencies released figures on Wednesday showing that 2015 was the warmest year in a global record that began, in their data, in 1880. British scientists released figures showing 2015 as the warmest in a record dating to 1850. The Japan Meteorological Agency had already released preliminary results showing 2015 as the warmest year in a record beginning in 1891.

On Jan. 7, NOAA reported that 2015 was the second-warmest year on record, after 2012, for the lower 48 United States. That land mass covers less than 2 percent of the surface of the Earth, so it is not unusual to have a slight divergence between United States temperatures and those of the planet as a whole.

The end of the year was especially remarkable in the United States, with virtually every state east of the Mississippi River having a record warm December, often accompanied by heavy rains.

A warmer atmosphere can hold more water vapor, and an intensification of rainstorms was one of the fundamental predictions made by climate scientists decades ago as a consequence of human emissions. That prediction has come to pass, with the rains growing more intense across every region of the United States, but especially so in the East.

The term global warming is generally taken to refer to the temperature trend at the surface of the planet, and those are the figures reported by the agencies on Wednesday.

Some additional measurements, of shorter duration, are available for the ocean depths and the atmosphere above the surface, both generally showing an inexorable long-term warming trend.

Most satellite measurements of the lower and middle layers of the atmosphere show 2015 to have been the third- or fourth-warmest year in a 37-year record, and scientists said it was slightly surprising that the huge El Niño had not produced a greater warming there. They added that this could yet happen in 2016.

When temperatures are averaged at a global scale, the differences between years are usually measured in fractions of a degree. In the NOAA data set, 2015 was 0.29 degrees Fahrenheit warmer than 2014, the largest jump ever over a previous record. NASA calculated a slightly smaller figure, but still described it as an unusual one-year increase.

The intense warmth of 2015 contributed to a heat wave in India last spring that turns out to have been the second-worst in that country’s history, killing an estimated 2,500 people. The long-term global warming trend has exacted a severe toll from extreme heat, with eight of the world’s 10 deadliest heat waves occurring since 1997.

Only rough estimates of heat deaths are available, but according to figures from the Center for Research on the Epidemiology of Disasters, in Brussels, the toll over the past two decades is approaching 140,000 people, with most of those deaths occurring during a European heat wave in 2003 and a Russian heat wave in 2010.

The strong El Niño has continued into 2016, raising the possibility that this year will, yet again, set a global temperature record. The El Niño pattern is also disturbing the circulation of the atmosphere, contributing to worldwide weather extremes that include a drought in southern Africa, threatening the food supply of millions.



Jan 19 2016

El Niño is here, so where’s SoCal’s non-stop rain?

So far in terms of rainfall, this winter’s El Niño weather pattern has been more of a gecko than a Godzilla.

Southern California, which usually sees the bulk of the state’s El Niño-related storms, only experienced a few wet days in the first half of January. Overall, precipitation in Los Angeles and the rest of California is several inches behind where it was at this time during the last big El Niño in 1998.

(Recent rainfall totals for California as of January 15th (in black) compared with rainfall from the five strongest El Niño systems on record. Image: California-Nevada Climate Applications Program / NOAA.)


Still, scientists are telling us this El Niño is one of the strongest ever recorded. So what gives?

Well, first off, it’s still early in the game, said Anthony Barnston, Chief Forecaster for Columbia University’s International Research Institute for Climate and Society.

“California typically shows its greatest responses to El Niño during January-March, rather than the earlier part of the winter,” he noted.

In short, there is still plenty of time for a good soaking.

That’s welcome news since much of the state is still below where it typically would be for an average water year.

Nate Mantua with the National Oceanic and Atmospheric Administration said there may be another factor worth considering.

Sure, this El Niño is strong when it comes to some key indicators like record warm surface temperatures in a swath of the Pacific associated with the weather pattern.

But Mantua noted that it is weaker in other climate signals, like the strength of the trade winds or the temperature of the ocean below the surface.

“It has a lot of the same characteristics as big El Niños of the past, but it also has some differences that may end up leading to different outcomes for what it does to weather in California and along the whole Pacific coast,” he explained.

For example, he says the Pacific Northwest is getting a lot of heavy rain this winter, which isn’t typical for strong El Niño years.

So, expect surprises from this climate pattern.

(This map shows the amount of rain in California for this water year, which starts on October 1st and ends on September 30th. Yellow and orange areas are below average precipitation; blues and purples are above average. Image via NOAA.)


Recent observations of the El Niño signal have noted that it seems to be weakening, as is often the case by this point in the winter.

That shouldn’t stop it from sending storms our way through the spring, though, Mantua said.

By summer, it’s likely the El Niño pattern will have completely disappeared, and scientists will start watching the signals again to see if it will return or if the world will see a neutral or La Niña pattern instead.


Jan 14 2016

Baby Fish May Get Lost in Silent Oceans as CO2 Rises


Future oceans will be much quieter places, making it harder for young marine animals that navigate using sound to find their way back home, new research has found.

Under acidification levels predicted for the end of the century, fish larvae will cease to respond to the auditory cues that present-day species use to orient themselves, scientists reported in the journal Biology Letters.

While ocean acidification is known to affect a wide range of marine organisms and sensory processes such as smell, until now its effect on marine soundscapes, and the impact of that on the larvae of marine animals, was unknown.

The ocean is filled with sounds that carry information about location and habitat quality, study co-author Ivan Nagelkerken said.

“Along with chemical and other cues, because of sound’s ability to travel long distances underwater, it is used as a navigational beacon by marine animals, particularly larvae,” Dr Nagelkerken said.

“More than 95 per cent of marine animals have a dispersive larval stage, where larvae drift with the currents for anywhere from a few days to a year, before returning to settle in their adult habitat near where they were spawned.”

To understand how acidification affects these marine animals, the team led by PhD student Tullio Rossi travelled to a naturally occurring carbon dioxide vent near White Island in New Zealand, where ocean acidification levels are similar to those predicted for the end of the century under business-as-usual conditions.

“This natural laboratory gave us a peek into the future,” Dr Nagelkerken said.

“We recorded the soundscape around the vent, then compared the loudness and composition of sounds with control sites a few hundred metres away.”

The area around the vent was much quieter, the team found.

“There could be a number of explanations for the decrease in sound,” Dr Nagelkerken said.

“For example, as acidification increases, kelp forests may be replaced by turf algae. This results in changing abundance of the animals that produce sounds, such as snapping shrimp whose ubiquitous crackle forms the backdrop to present-day ocean soundscapes.”

To understand how acidification affects marine animals’ auditory preferences, the researchers studied the impact of increased carbon dioxide levels on settlement-stage mulloway (Argyrosomus japonicus), a common temperate fish species.

They found that the 25- to 28-day-old larvae that had been exposed to higher carbon dioxide concentrations deliberately avoided present-day acoustic habitat cues recorded near White Island, while fish reared in present-day carbon dioxide levels responded positively.

Neither group of fish responded to the “future” soundscape recorded around the vent, despite the hearing of the normal fish being unimpaired.

Ocean acidification is known to increase the size of otoliths — fish ear bones — used for hearing, orientation and balance.

It has been hypothesised that bigger ear bones would increase the hearing range of larval fish, but the hearing in fish reared in future carbon dioxide levels was negatively impacted by ocean acidification, even though they had larger ear bones.

Dr Nagelkerken said the findings suggested that in the future, affected species would have to use other, potentially less reliable cues to help them navigate, even though other senses such as vision and smell are also negatively impacted by ocean acidification.

“Finding a home is the key to population sustainability,” Dr Nagelkerken said.

“Those that rely on sound as an orientation cue will be heavily impacted, limiting their ability to survive and contribute to the population.”


Article first appeared on ABC Science.

Read the Discovery post:


Jan 14 2016

La Jolla considering new way to deal with sea lions


SAN DIEGO (CBS 8) – La Jolla may have finally found a solution for dealing with the strong stench coming from the poop of sea lions.

On Tuesday night, the town council heard a new idea for keeping sea lions off the bluffs at La Jolla Cove.

Sea lions make their way up to the guardrail each night, so the city hired a company to spray germ-killing foam to get rid of the poop, but critics say the treatment only lasts a week before the odor returns.

There are strict coastal regulations on how to take care of the sea lions and now there could be a solution.

The high surf has pushed the sea lions to higher ground at La Jolla Cove, and residents and visitors can smell the strong stench their poop is giving off.

“The smell here in La Jolla makes it very difficult for anyone of us who live here to put up with it and it makes it very difficult for tourists to come here. It hurts the business, it hurts the community and it hurts the individuals,” said La Jolla resident Barry Jagoda.

After months of exploring options of what is legal, humane and efficient, the La Jolla Town Council Coastal Committee gave a first look at the marine mammal safety barriers.

“These are large cylinders that are inflatable and when the sea lions try to go over them they spin so the sea lions can’t get any leverage on them,” said La Jolla Town Council President Steve Haskins.

The safety barriers have been supported by the National Oceanic and Atmospheric Administration.

“When the sea lions attempt to pass over them, the rollers spin, so no matter how much they try to get traction, they can’t,” said Haskins.

The rollers will be placed on the east and west end of the cove to control where the sea lions do their business.

“I like this idea. I’m actually pleasantly surprised to have come down here to see it,” said Claude-Anthony Marengo, La Jolla Merchants Association President.

Still, how and who should scoop the poop has been raising a stink for several years.

“How do you interface with the city? How do you get them off their ass and how are we going to move forward on this because obviously we appreciate your leadership,” said Barry Jagoda.

The La Jolla Town Council is expected to approve the barrier plan on Thursday and share it with other organizations, in hopes that the City Council will approve it without requiring Coastal Commission approval.

Council President Lightner supports the plan. Separately, a judge has rejected a complaint arguing that it is the city’s responsibility to scoop the poop; that ruling is on appeal.

Read the original post:

Jan 12 2016

Dual Impact of Ocean Acidification and Low-Oxygen on West Coast Foretells Future for World Oceans

— Posted with permission of SEAFOODNEWS.COM. Please do not republish without their permission. —

Copyright © 2016

Seafood News

SEAFOODNEWS.COM [UW Today] By Michelle Ma – January 12, 2016

The Pacific Ocean along the West Coast serves as a model for how other areas of the ocean could respond in coming decades as the climate warms and emission of greenhouse gases like carbon dioxide increases. This region — the coastal ocean stretching from British Columbia to Mexico — provides an early warning signal of what to expect as ocean acidification continues and as low-oxygen zones expand.

Now, a panel of scientists from California, Oregon and Washington has examined the dual impacts of ocean acidification and low-oxygen conditions, or hypoxia, on the physiology of fish and invertebrates. The study, published in the January edition of the journal BioScience, takes an in-depth look at how the effects of these stressors can impact organisms such as shellfish and their larvae, as well as organisms that have received less attention so far, including commercially valuable fish and squid.

The results show that ocean acidification and hypoxia combine with other factors, such as rising ocean temperatures, to create serious challenges for marine life. These multiple-stressor effects will likely only increase as ocean conditions worldwide begin resembling those off the West Coast, which naturally expose marine life to stronger low-oxygen and acidification stressors than most other regions of the seas.

“Our research recognizes that these climate change stressors will co-occur, essentially piling on top of one another,” said co-author Terrie Klinger, professor and director of the University of Washington’s School of Marine and Environmental Affairs.

“We know that along the West Coast temperature and acidity are increasing, and at the same time, hypoxia is spreading. Many organisms will be challenged to tolerate these simultaneous stressors, even though they might be able to tolerate individual stressors when they occur on their own.”

Oceans around the world are increasing in acidity as they absorb about a quarter of the carbon dioxide released into the atmosphere each year. This changes the chemistry of the seawater and causes physiological stress to organisms, especially those with calcium carbonate shells or skeletons, such as oysters, mussels and corals.

Hypoxia, on the other hand, is a condition in which ocean waters have very low oxygen levels. At the extreme, hypoxia can result in “dead zones” where mass die-offs of fish and shellfish occur. The waters along the West Coast sometimes experience both ocean acidification and hypoxia simultaneously.

“Along this coast, we have relatively intensified conditions of ocean acidification compared with other places. And at the same time we have hypoxic events that can further stress marine organisms,” Klinger said. “Conditions observed along our coast now are forecast for the global ocean decades in the future. Along the West Coast, it’s as if the future is here now.”

Klinger is co-director of the Washington Ocean Acidification Center based at the UW and served on the West Coast Ocean Acidification and Hypoxia Science Panel, which was convened two years ago to promote coast-wide collaboration and cooperation on science and policy related to these issues.

For this paper, the authors examined dozens of scientific publications that reported physiological responses among marine animals exposed to lower oxygen levels, elevated acidity and other stressors. The studies revealed how physiological changes in marine organisms can lead to changes in animal behavior, biogeography and ecosystem structure, all of which can contribute to broader-scale effects on the marine environment.

The tri-state panel has completed this phase of its work and will wrap things up in the coming months. Among the products already published or planned are a number of scientific publications — including this synthesis piece — as well as resources for policymakers and the general public describing ocean research priorities, monitoring needs and management strategies to sustain marine ecosystems in the face of ocean acidification and hypoxia.

The group’s other papers and findings related to ocean acidification and hypoxia will soon be available on its website.

Co-authors of this paper include George Somero, Jody Beers and Steve Litvin at Stanford University’s Hopkins Marine Station; Francis Chan of Oregon State University; and Tessa Hill of the University of California, Davis.

The research was funded by the California Ocean Protection Council, the California Ocean Science Trust, the Institute for Natural Resources at Oregon State University and the National Science Foundation.

Subscribe to SEAFOODNEWS.COM Read the original post:

Jan 5 2016

A steady conveyor belt of El Niño storms is what has officials concerned

Robert Gauthier / Los Angeles Times: San Dimas Public Works Supervisor Terry Gregory clears a clogged drain on North San Dimas Canyon Road as heavy rains cause clogged drains and mud flows in San Dimas, Glendora and Azusa.


To understand the power and potential dangers of El Niño, look at satellite images of the Pacific Ocean on Sunday.

At least four storms were brewing — the farthest still getting going in Asia — and all aimed at California.

It’s this pattern, a series of back-to-back-to-back storms seemingly arriving on a conveyor belt, that concerns officials bracing for potential damage from the predicted winter of heavy rains.

“El Niño storms: it’s steady, not spectacular. But it’s relentless,” said Bill Patzert, climatologist at NASA’s Jet Propulsion Laboratory in La Cañada Flintridge. “It’s not 10 inches in 24 hours and nothing afterward. It’s a 1-inch storm, a 2-inch storm, followed by a 1-inch storm, followed by a 2-inch storm.

“As this goes on for many weeks, then you start to soak the hillsides — then you get more instability. And then, instead of having 6 inches of mud running down your street or off the hillside behind your house, then you can get serious mudflows — 2 to 3 feet in height.”

This week was the first that the weather pattern associated with El Niño has formed over California this season. A first system Monday didn’t amount to much after it ran into dry air out of the mountains, but three more storms are targeting California on Tuesday, Wednesday and Thursday, Patzert said.

“The next systems seem primed to deliver at least a couple good punches Tuesday and Wednesday, followed by plenty of showers Thursday,” the National Weather Service in Oxnard said in its forecast.

The riskiest areas for this week are areas recently burned by wildfires, such as the Camarillo Springs community in Ventura County, Silverado Canyon in Orange County, and the communities near the Christmas weekend brush fire that burned north of Ventura. Officials are concerned about flash floods in those areas, and a voluntary evacuation advisory is planned for Silverado Canyon, which is recovering from a fire in 2014.

But the worst problems will probably come later in the winter. “This is the first major line of storms. The ground isn’t quite saturated yet,” said meteorologist James Thomas of the National Weather Service in San Diego.

It’s later in the winter that the risk heightens; in Southern California, that’s particularly in neighborhoods and roads below arroyos and canyons and along the beach.

“That’s called, ‘The price you pay with the view,'” Patzert said.

Still, Patzert said, Southern California isn’t expected to encounter the same kind of widespread regional flooding that has hit the South in recent weeks. Although such devastating flooding occurred earlier in the 20th century, the transformation of the Los Angeles, San Gabriel and Santa Ana rivers into concrete-lined flood control channels has protected the region for generations.

Besides this El Niño, there are only two similarly strong El Niños in the record books over the last half-century.

The 1982-83 El Niño caused more than $500 million in property damage in California (equivalent to more than $1 billion in today’s dollars), unleashed flooding that sent mud and rock raining over canyon and coastal roads, destroyed the Seal Beach Pier and severely damaged the Santa Monica Pier.

The El Niño of 1997-98 also caused more than $500 million in damage, and 17 people died during those storms. In February 1998, 13.68 inches of rain poured down on Los Angeles — almost a year’s worth of precipitation. That month, two California Highway Patrol officers died in San Luis Obispo County after their car fell into a massive sinkhole as a river eroded a highway; two Pomona College students were killed when a tree slammed into their SUV; and mud pummeled homes in Laguna Beach, crushing homes and killing two men.

The arrival of the El Niño-influenced weather pattern in California comes just as expected, when El Niño’s influence on California weather peaks in January, February and March. A subtropical jet stream that’s normally not well-defined has emerged as a strong force over California. And “when the jet stream is stronger and closer, the storms can maintain their strength or get stronger as they approach California,” Daniel Swain, a climate scientist at Stanford University, said in an interview.

The back-to-back storms mean a week not seen since December 2010, the last time a weeklong series of weather systems had a significant effect on Southern California, said Thomas, the San Diego meteorologist. The National Weather Service estimates as much as 2 to 3 inches of rain will fall along the coast of Los Angeles and Orange counties through Thursday — a decent amount, given that the average rainfall for all of January in downtown Los Angeles is about 3 inches.

Through Thursday night, there could be 2 to 4 feet of fresh snow in the San Bernardino Mountains at elevations above 7,000 feet, where Big Bear Lake is. “So that’s significant,” Thomas said. He warned of areas of near zero visibility because of blowing snow from Tuesday afternoon through Thursday night, and gusts of up to 50 mph. “So it’ll be a mess up there.”

Tuesday is expected to be the heaviest storm day for officials monitoring the Solimar fire burn area north of Ventura, which charred more than 1,200 acres over Christmas weekend. Vegetation, once burned, can no longer hold back loose sediment, and officials are worried about mud and debris crashing onto Solimar Beach communities, Pacific Coast Highway and sections of the 101 Freeway all the way up to the Sea Cliff area.

Even half an inch of rain in an hour could create a debris flow in these burn areas, said Gil Zavlodaver of the Ventura County Sheriff’s Office of Emergency Services.

Steven Frasher, a spokesman for the Los Angeles County Department of Public Works, cautioned residents and the homeless to stay out of flood control channels such as the L.A. River and Sepulveda Dam that, in dry times, are popular recreational areas.

“They’re incredibly dangerous,” Frasher said.

“It seems that every time something like this comes through, someone underestimates the power of how much water goes through there,” Frasher said. “This is not the time to use the recreational trails. Certainly don’t go anywhere near rushing water. If you see water on the roadways, don’t go through it. It’s often faster or deeper than you think it is. Not to mention running into debris and stuff like that.”

Officials encourage residents to sign up for emergency alerts and learn more about storm preparation by visiting the emergency preparedness websites for Los Angeles, Ventura, Orange, San Diego and San Bernardino counties.

Read the original post: