Tag Archives: environmental

The Solution to America’s Most Unexpectedly Dangerous Mammal

By ThinkReliability Staff

It’s hard to imagine that the mammal responsible for over 200 human deaths in America each year is the cute, cuddly… deer.  These beautiful and seemingly harmless animals are hardly malicious.  Instead, they are simply in the wrong place at the wrong time, resulting in more than one million deer/vehicle collisions each year.  While drivers bear partial responsibility in these collisions, it seems that changes in the food chain have also contributed to this situation.

In the 1800s, cougars (also called pumas or mountain lions) roamed across the United States and Canada.  However, beginning in the early 1900s, states began implementing bounty programs enticing hunters to kill cougars.  The goal was to protect livestock and humans from these seemingly dangerous animals.  By the 1950s, the cougar population was largely limited to areas west of the Rocky Mountains.  As the food chain predicts, the absence of a predator resulted in the overpopulation of its prey.  As the deer population increased, so did the probability of deer/vehicle collisions.

Expensive solutions have been considered to help decrease the collision rate, including deer culling, contraception and highway crossings.  However, it seems that nature may now be working toward its own solution.  Since the bounty programs were removed in the 1960s and 1970s, cougars have slowly begun migrating back east.  A recent study published in Conservation Letters suggests that repopulation of cougars in the eastern portion of the US could prevent 708,600 deer/vehicle collisions and 155 deaths over the next 30 years.  (The original fear of cougars attacking humans seems unfounded.  According to The Cougar Network, “Cougars are a retreating animal and very wary of people. Within the United States and Canada since 1890, there have been less than 100 attacks on humans, with about 20 fatalities. Encountering a cougar, let alone being attacked, is incredibly rare.”)

A Cause Map is a helpful tool to dissect the cause-and-effect relationships contributing to a problem or situation.   Starting with the goals that were impacted, the causes and effects can be linked to create a chain.   For this situation, we begin with the safety goal that is impacted by the many fatalities each year.  Asking ‘Why’ questions, we can dig deeper to understand what causes are behind the impacted goal.   

In this case, the fatalities are a result of car collisions with deer.  The collisions are due to two factors: the deer unexpectedly crossing the road and the driver not seeing the deer in time to stop.  We can trace each of these causes one at a time, revealing more causes.  The deer unexpectedly crosses the road because deer are moving to new areas.  This is because deer are overcrowded and need to expand their habitat.  The overcrowding is due to the growing deer population, which is due to the decrease in natural deer predators, namely the decline in the cougar population.  That decline resulted from the bounty programs implemented in the early 1900s, which were motivated by fear that cougars would endanger humans or livestock.
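
To make the structure of this chain concrete, here is a minimal sketch in Python of how such a cause-and-effect chain could be represented, with evidence and possible solutions attached to individual causes (a convention discussed further below).  This is purely an illustration of the idea, not ThinkReliability’s actual software or data format.

```python
# Minimal sketch of a Cause Map chain: each cause records the evidence that
# supports it, possible solutions that act on it, and the upstream causes
# revealed by asking "why?". Illustrative only; not ThinkReliability software.

class Cause:
    def __init__(self, description, evidence=None, solutions=None):
        self.description = description
        self.evidence = evidence or []
        self.solutions = solutions or []
        self.caused_by = []

    def why(self, upstream):
        """Attach an upstream cause and return it so chains read naturally."""
        self.caused_by.append(upstream)
        return upstream

# Build the deer/vehicle collision chain described above.
fatalities = Cause("~200 fatalities per year")
collision = fatalities.why(Cause("vehicle collides with deer"))

crossing = collision.why(Cause("deer unexpectedly crosses road",
                               solutions=["special highway crossings"]))
overcrowded = crossing.why(Cause("deer overcrowded, expanding habitat",
                                 solutions=["culling", "contraception"]))
fewer_cougars = overcrowded.why(Cause("decline in cougar population",
                                      solutions=["cougar repopulation"]))
fewer_cougars.why(Cause("bounty programs of the early 1900s"))

driver = collision.why(Cause("driver didn't see deer in time"))
driver.why(Cause("poor lighting (deer travel at dawn/dusk)"))

def print_chain(cause, depth=0):
    """Print the chain, indenting each level of 'why'."""
    print("  " * depth + cause.description)
    for upstream in cause.caused_by:
        print_chain(upstream, depth + 1)

print_chain(fatalities)
```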

Going back to the driver’s role in the situation, we see two possible causes: the driver may not have seen the deer in time due to poor lighting, because deer often travel at dawn or dusk, or the driver may not have been paying close enough attention, perhaps because of a distraction.  A second goal, property, was impacted because vehicles are damaged or destroyed in these collisions.

The Cause Map is also helpful in that it allows us to document evidence and potential solutions directly on the causes they affect.  For example, the statistics about the number of collisions each year, fatalities each year, and cougar population changes are included right below the causes they support.  Similarly, possible solutions are added right above the causes they can impact.  In this case, deer culling and contraception could help control the deer overcrowding, and special deer highway crossings could help mitigate deer crossing the road unexpectedly.  Nature’s solution, however, acts further back in the chain, on the decrease in the cougar population itself.  Time will tell if this solution will, in fact, reduce the number of collisions and injuries as predicted.

To view the initial Cause Map of this issue, click on “Download PDF” above.

Oil leaked from shipwreck near Newfoundland

By Kim Smiley

On March 31, 2013, oil was reported in Notre Dame Bay, Newfoundland.  Officials traced the source of the oil to the Manolis L, a ship that sank in 1985 after running aground.  The Manolis L is estimated to have contained up to 462 tons of fuel and 60 tons of diesel when it sank, and much of that oil is believed to still be contained within the vessel.  Officials are working to ensure the oil remains contained, but residents of nearby communities who rely on tourism and fishing are concerned about the potential for more oil to be released into the environment.

A Cause Map, a visual format for performing root cause analysis, can be built to better understand this issue.  There are three steps in the Cause Mapping process. The first step is to fill out an Outline with the basic background information, along with a list of how the problem impacts the goals.  There is also space on the Outline to note the frequency of the issue.  For this example, 2013 was the first time oil was reported to be leaking from this particular sunken ship, but 700 at-risk sunken vessels have been identified in Canadian waters alone.  This fact is worth noting because the amount of resources a group is willing to use to address a problem may well depend on how often it is expected to occur.  One leaking sunken ship is a different problem than potentially having hundreds that may require action.

The second step is to perform the analysis by building the Cause Map.  A Cause Map is built by asking “why” questions and laying out the answers to visually show the cause-and-effect relationships.  Once the causes have been identified, the final step is to develop and implement solutions to reduce the risk of similar problems occurring in the future.  Click on “Download PDF” to view an Outline and an intermediate-level Cause Map for this problem.

In this case, the environmental goal is clearly impacted because oil was released into the environment.  Why? Oil leaked out of a sunken ship because a ship containing a large quantity of oil had sunk and there were cracks in its hull.  The hull of this particular ship is thin by modern standards (only a half-inch) and it has been sitting in sea water for nearly 30 years.  A large storm hit the region right before oil was first reported, and it is believed that the hull (already potentially weakened by corrosion) was damaged during the storm.  The Coast Guard identified two large cracks in the ship that were leaking oil during its investigation.

Once the causes of the issue have been identified, the final step is to implement solutions to reduce the risk of future problems.  This is where a lot of investigations get tricky.  It is often easier to identify the problem than to actually solve it. It can be difficult to determine what level of risk is acceptable and how many resources should be allotted to an issue.  The cracks in the hull of the Manolis L have been patched using weighted neoprene sealants, and a cofferdam has been installed to catch any oil that leaks out.  The vessel is being monitored by the Canadian Coast Guard via regular site visits and aerial surveillance flights. But the oil remains in the vessel, so there is the potential that it could still be released into the environment.

Many local residents are fighting for the oil to be removed from the sunken ship, rather than just contained, to further reduce the risk of oil being released into the environment. But removing oil from a sunken ship is very expensive.  In 2013, it cost the Canadian Coast Guard about $50 million to remove oil from a sunken ship off the coast of British Columbia. So far, officials feel that the measures in place are adequate and that the risk doesn’t justify the cost of removing the oil from the vessel. If they are right, the oil will stay safely contained at a fraction of the cost of removing it, but if they are wrong there could be lasting damage to local communities and wildlife.

In situations like this, there are no easy answers.  Anybody who works to reduce risk faces similar tradeoffs and generally the best you can do is to understand a problem as thoroughly as possible to make an informed decision about the best use of resources.

Heavy metal detected in moss in Portland

By Kim Smiley

Residents and officials are struggling to find a path forward after toxic heavy metals were unexpectedly found in samples of moss in Portland, Oregon. According to the U.S. Forest Service, the moss was sampled as part of an exploratory study to measure air pollution in Portland.  The objective of the study was to determine if moss could be used as a “bio-indicator” of hydrocarbons and heavy metals in air in an urban environment.  Researchers were caught off guard when the samples showed hot spots of relatively high heavy metal levels, including chromium, arsenic, and cadmium (which can cause cancer and kidney malfunction).  Portland officials and residents are working to determine the full extent of the problem and how it should be addressed.

So where did the heavy metals come from?  And how is it that officials weren’t already aware of the potential issue of heavy metals in the environment? The investigation into this issue is still ongoing, but an initial Cause Map can be built to document what is known at this time.  A Cause Map is built by asking “why” questions and visually laying out all the causes that contributed to the problem.  (Click on “Download PDF” to view the initial Cause Map.)

Officials are still working to verify where the heavy metals are coming from, but early speculation is that nearby stained-glass manufacturers are the likely source.  Heavy metals are used during the glass manufacturing process to create colors. For example, cadmium is used to make red, yellow and orange glass, and chromium is used to make green and blue glass. The hot spots where heavy metals were detected surround two stained-glass manufacturers, but there are other industrial facilities nearby that may have played a role as well.  There are still a lot of unknowns about the actual emissions from the glass factories because no testing has been done up to this point.  Testing was not required by federal regulations because of the relatively small size of the factories.  If the heavy metals did in fact originate from the glass factories, many hard questions about the adequacy of current emissions regulations and testing requirements will need to be answered.

Part of the difficulty of this issue is understanding exactly what the impacts from the potential exposure to heavy metals might be.  Since the levels of heavy metals detected so far are considered below the “acute” threshold, investigators are still working to determine what the potential long-term health impacts might be.

A long-term benefit of this mess is the validation that moss can be used as an indicator of urban air quality.  Moss has been used as a “bio-indicator” for air quality in rural environments since the 1960s, but this is the first attempt to sample moss to learn about air quality in an urban setting.  As moss is plentiful and testing it is relatively inexpensive, this technique may dramatically improve testing methods used in urban environments.

Both glass companies have voluntarily suspended working with chromium, cadmium and arsenic in response to a request by the Oregon Department of Environmental Quality.  The DEQ has also begun additional air monitoring and soil sampling in the impacted areas to determine the scope of the contamination. As officials gain a better understanding of what is causing the issue and what the long-term impacts are, they will be able to develop solutions to reduce the risk of similar problems occurring in the future.

Avoiding Procedure Horrors in Your Little Shop

By ThinkReliability Staff

Are you singing “Suddenly Seymour” yet?  In this blog, we take a look at the ever-so-interesting example of the Venus Flytrap.  These fascinating plants have captured imaginations and inspired many science fiction books, movies and even a musical (Little Shop of Horrors).  When thinking about a Venus Flytrap, the “problem” really depends on the point of view.  From the point of view of the fly, the problem is getting eaten for lunch.  From the point of view of the Venus Flytrap, the problem is how to catch its lunch.  Since it’s really only a problem for one of the parties, we will focus on the question of how, and examine the Process Map as a best practice for documenting the how in your shop.

Process Maps are very useful tools.  Converting a written job procedure or word-of-mouth instructions into a picture or map can illuminate a complicated process and make it seem quite simple.  Asking how something happens, or how something gets done, can provide valuable detail for anyone attempting that task now and in the future.  The benefits include preventing or minimizing incidents that often recur from lack of clarity in a procedure.

To start with, a very simple map can be created that shows the process of a Venus Flytrap eating a fly in four steps: the fly lands in the trap, the trap closes, the plant eats the fly, and the trap opens again.  However, this ‘simple’ process is actually extremely complex.  In his recent article titled “Venus Flytraps Are Even Creepier Than We Thought” (The Atlantic, January 21, 2016), Ed Yong outlines the process and intricacies of how the carnivorous plant works.  When the fly lands on the Flytrap’s bright red and enticing leaves, a complicated process of chemicals, electrical impulses and physics is kicked off, all with very delicate timing.  The Flytrap’s leaves are covered with sensitive hairs.  If the fly touches those hairs more than once in 20 seconds, it begins a process ensuring its own demise.  A well-timed increase in calcium ions and electrical impulses results in water flowing to the Flytrap’s leaves, causing them to change shape and trapping the fly inside.

At this point, the more the fly struggles, the more problems it creates for itself.  Further stimulating the hairs results in more calcium ions and more electrical impulses, this time triggering the flow of hormones and digestive enzymes.  Over time, the leaves create a hermetic seal and fill up with liquid, causing the fly to asphyxiate and die.  Next, the pH level of the fluid inside the trap drops to 2, and the digestive process begins in earnest.  Recent research suggests that chemical sensors on the Flytrap’s leaves can detect the level of digestion of the fly, stimulating the release of more digestive enzymes if needed, or causing the trap leaves to open back up.  The Flytrap is then ready to begin the process again.  As Charles Darwin said, “THIS plant, commonly called Venus’ fly-trap, from the rapidity and force of its movements, is one of the most wonderful in the world.”  (1875, Insectivorous Plants)
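
For readers who like to see a process captured precisely, here is a toy sketch in Python of just the trigger step: the double-touch-within-20-seconds rule from Yong’s article.  The timing window comes from the article; everything else is a simplification for illustration.

```python
# Toy model of the Venus Flytrap's trigger logic: the trap closes only if its
# sensitive hairs are touched twice within 20 seconds. The window comes from
# Ed Yong's article; the rest is a deliberate simplification.

class FlytrapTrigger:
    WINDOW_SECONDS = 20.0

    def __init__(self):
        self.last_touch_time = None
        self.closed = False

    def touch(self, time_seconds):
        """Register a hair touch; close on a second touch within the window."""
        if self.closed:
            return
        if (self.last_touch_time is not None
                and time_seconds - self.last_touch_time <= self.WINDOW_SECONDS):
            self.closed = True
            print(f"Trap closes at t={time_seconds}s (second touch in window)")
        else:
            # Either the first touch ever, or the window expired: re-arm.
            self.last_touch_time = time_seconds

trap = FlytrapTrigger()
trap.touch(0.0)    # first touch: arms the trap
trap.touch(30.0)   # 30s later: window expired, trap re-arms instead
trap.touch(35.0)   # second touch within 20s of the re-arm: trap closes
```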

This Process Map, while detailed, could surely be broken down into further detail by a botanist who deeply understands the intricate workings of a Venus Flytrap.  Fortunately for a baby Venus Flytrap, this process map is coded directly into its DNA, so it doesn’t have to rely on anything external to know what to do.  Unfortunately for us, work-related tasks are rarely so instinctual.  We rely on job procedures, process maps and word of mouth to learn the best, safest way to get the job done. Ensuring consistency in that transfer of information is key to making sure that incidents and problems are avoided.  Problems that result from poorly defined procedures or work processes go by many names: procedure not followed, human error, etc.  At the end of the day, the roots (pun intended) of many of these problems are poorly articulated or poorly communicated work processes.  The simple tool of a process map can help minimize these problems by making the steps of the process clear and easy to understand.

Landslide of construction debris buries town, kills dozens

By ThinkReliability Staff

Shenzhen, China has been growing fast. After a dump site closed in 2013, construction debris from the rapid expansion was being dumped everywhere. In an effort to contain the waste, a former rock quarry was converted to a dump site. Waste at the site reached 100 meters high, despite environmental assessments warning about the potential for erosion. On December 20, 2015, the worries of residents, construction workers and truckers came true when the debris slipped from the quarry, covering 380,000 square meters (or about 60 football fields) with thick soil as much as 4 stories high.

A Cause Map can be built to analyze this issue. One of the steps in the Cause Mapping process is to determine how the issue impacted the overall goals. In this case, the landslide severely impacted multiple goals. Primarily, the safety goal was impacted due to a significant number of deaths: 58 people have been confirmed dead, and at least 25 are missing. The environmental goal and customer service goal were impacted due to the significant area covered by construction waste. The regulatory goal is impacted because 11 people have been detained as part of an ongoing criminal investigation. The property goal is impacted by the 33 buildings that were destroyed. The labor goal is also impacted, with more than 10,600 people participating in the rescue effort.

The Cause Map is built by visually laying out the cause-and-effect relationships that contributed to the landslide. Beginning with the impacted goals and asking “Why” questions develops the cause-and-effect relationships. The deaths and missing persons resulted from people being buried in construction waste. The confusion over the number of missing results from the many unregistered migrants in the rapidly growing area. The area was buried when the landslide spread construction waste over a significant area.

The landslide resulted from soil and debris piled 100 meters high, combined with unstable ground in the quarry. The quarry was repurposed as a waste dump in order to corral waste, which had previously been dumped anywhere after the closure of another dump. Waste and debris were piled so high because of the significant construction debris in the area. There was heavy construction in the area because of the rapid growth, resulting in a lot of debris. Incentives (dumpsite operators make money on each load dumped) encourage a high amount of waste dumping, and illegal dumping adds to the total.

While an environmental impact report warned of potential erosion, and the workers and truck drivers at the dump registered concerns about the volume of waste, these warnings weren’t heeded. Experts point to multiple recent industrial accidents in China (such as the warehouse fire/explosion in Tianjin in August, the subject of a previous blog) as evidence of generally lax enforcement of regulations. Heavy rains contributed to the ground instability, as did the height of the debris and the site’s prior use as a quarry.

Actions taken in other cities in similar circumstances include charging more for dumping debris in an effort to encourage the reuse of materials and monitoring dump trucks with GPS to minimize illegal dumping. These actions weren’t implemented in Shenzhen prior to the landslide, but this accident may prompt their implementation in the future. Before any of that can happen, Shenzhen has a long way to go in cleaning up the construction debris covering the city.

Neurotoxin makes California crabs unsafe to eat

By Kim Smiley

California officials have indefinitely delayed both recreational and commercial fishing for Dungeness and rock crab from the coast north of Santa Barbara all the way to the Oregon border because the crabs have been determined to be a threat to public safety.  Testing has shown that many of the crabs in this region contain potentially unsafe levels of domoic acid, a powerful neurotoxin that can cause illness in humans who consume the crabs. Domoic acid poisoning causes vomiting, diarrhea and cramping, and can even lead to brain damage and death in severe cases.  Scientists are continuing to test crabs caught off the California coast, and the hope is to open crabbing season if and when the crabs are found to be safe for consumption.

A Cause Map, a visual root cause analysis, can be built to help understand the causes that contribute to this issue.  The first step in building a Cause Map is to understand the impacts of the issue being considered.  This issue clearly has the potential to impact public safety because the crabs can cause illness, although no cases of domoic acid poisoning in humans have been reported this year. The economic impact to the fishing industry from the delay in the start of crabbing season is also significant.  California’s crabbers typically gross about $60 million a year, and many families depend on the money made during crab season to live on throughout the year.  This issue also impacts the environment because humans aren’t the only animals that can suffer from domoic acid poisoning; other creatures are continuing to eat the contaminated crabs.  Sea lions in particular have been affected by the neurotoxin, and many have died.  Removing large predators has the potential to significantly impact the entire ecosystem.

The Cause Map itself is built by asking “why” questions and laying out the answers to intuitively show the cause-and-effect relationships. So why do the crabs have high levels of domoic acid in their bodies?  This year, off the coast of California, warmer than typical ocean temperatures have led to an unusually large and long-lasting algae bloom of Pseudo-nitzschia. Domoic acid is naturally produced by Pseudo-nitzschia, and it can be concentrated to dangerous levels as it moves up the food chain.  Small fish and shellfish such as krill, anchovies and sardines consume the domoic acid along with the algae.  Crabs then eat the smaller creatures that have been contaminated with domoic acid.  Crabs can eventually excrete the domoic acid, but the process is slow enough that the toxin can build up to high levels in their bodies.  If bigger creatures such as humans and sea lions eat the contaminated crabs, they can be poisoned by the domoic acid that was initially produced by the algal bloom.  Nothing can make the contaminated crabs safe for consumption: neither cooking nor cleaning can eliminate the risk of poisoning from the neurotoxin, so the only safe option is to wait until the domoic acid returns to safe levels in the crabs.
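
A simple toy model shows why concentration up the food chain matters.  The multipliers below are invented purely for illustration; they are not measured accumulation factors.

```python
# Toy model of biomagnification: domoic acid concentration multiplying as it
# moves up the food chain. The factors below are invented for illustration;
# they are NOT measured values.

algae_concentration = 1.0  # arbitrary baseline units of domoic acid in algae

# Hypothetical accumulation factor at each trophic step (e.g., an anchovy
# eats many times its body weight in algae and excretes the toxin slowly).
food_chain = [
    ("krill/anchovies/sardines", 10.0),
    ("crabs",                    10.0),
]

concentration = algae_concentration
print(f"algae: {concentration:g}")
for species, factor in food_chain:
    concentration *= factor
    print(f"{species}: {concentration:g}")

# With these made-up factors, crabs carry 100x the algae's concentration,
# which is why a bloom in the water can make crab meat unsafe to eat.
```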

To view an Outline that lists the impacted goals and see a high level Cause Map of this issue, click on “Download PDF” above.

Interim Recommendations After Fatal Chemical Release

By ThinkReliability Staff

After a fatal chemical release on November 15, 2014 (see our previous blog for an initial analysis), the Chemical Safety Board (CSB) immediately sent an investigative team. The team spent seven months on-site. Prior to the release of the final report, the CSB has approved and released interim recommendations that will be addressed by the site as part of its restart.

Additional detail related to the causes of the incident was also released. As more information is obtained, the root cause analysis can be updated. The Cause Map, or visual root cause analysis, begins with the impacts to the organization’s goals. While multiple goals were impacted, in this update we’ll focus on the safety goal, which was impacted due to four fatalities.

Four workers died due to chemical asphyxiation. This occurred when methyl mercaptan was released and concentrated within a building. Two workers were in the building and were unable to get out. One of these workers made a distress call, to which four other workers responded. Two of the responding workers were also killed. (Details on the attempted rescue process, including personal protective equipment used, have not yet been released.) Although multiple gas detectors alarmed in the days prior to the incident, the building was not evacuated. The investigation found that the alarms were set above permissible exposure limits and did not provide effective warning to workers.
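
The alarm-setpoint finding is worth pausing on: a detector that alarms only well above the permissible exposure limit (PEL) cannot, by construction, warn workers before they are overexposed.  Here is a sketch of that logic with hypothetical numbers; the actual setpoints are not given in the material summarized here.

```python
# Why an alarm set above the permissible exposure limit (PEL) fails as a
# warning. All numbers are hypothetical, chosen only to illustrate the logic;
# they are not the actual setpoints from the CSB investigation.

PEL_PPM = 0.5             # hypothetical permissible exposure limit
ALARM_SETPOINT_PPM = 5.0  # hypothetical alarm setpoint, above the PEL

def check_reading(reading_ppm):
    over_pel = reading_ppm > PEL_PPM
    alarmed = reading_ppm > ALARM_SETPOINT_PPM
    if over_pel and not alarmed:
        return "workers overexposed, but no alarm"  # the failure mode
    if alarmed:
        return "alarm"
    return "ok"

for reading in (0.2, 2.0, 8.0):
    print(f"{reading} ppm -> {check_reading(reading)}")
```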

Methyl mercaptan was used at the facility to manufacture pesticide. Prior to the incident, water entered the piping system. In the cold weather, the water and methyl mercaptan formed a solid, blocking the pipes. Just prior to the release, the blockage had been cleared. However, different workers, who were unaware the blockage had been cleared, opened valves in the system as previously instructed to deal with a pressure problem. Investigators found that the pressure relief system did not vent to a “safe” location, but rather into the enclosed building. The CSB has recommended performing a site-wide pressure relief study to ensure compliance with codes and standards.

The building, which contained the methyl mercaptan piping, was enclosed and inadequately ventilated. The building had two ventilation fans, neither of which was operating.  Even though these fans were designated PSM-critical equipment (meaning their failure could result in a high-consequence event), an urgent work order written the month prior had not been fulfilled. Even with both fans operating, preliminary calculations performed as part of the investigation determined the ventilation would still not have been adequate. The CSB has recommended an evaluation of the building design and ventilation system.

Although the designs for processes involving methyl isocyanate were updated after the Bhopal incident, the processes involving methyl mercaptan were not. The investigation has found that there was a general issue with control of hazards, specifically because non-routine operations were not considered as part of hazard analyses. The CSB has recommended conducting and implementing a “comprehensive, inherently safer design review” as well as developing an expedited schedule for other “robust, more detailed” process hazard analyses (PHAs).

Other recommendations may follow in the CSB’s final report, but these interim recommendations are expected to be implemented prior to the site’s restart, in order to ensure that workers are protected from future similar events.

To view an updated Cause Map of the event, including the CSB’s interim recommendations, click “Download PDF” above. Click here to view information on the CSB’s ongoing investigation.

Invasive Pythons Decimating Native Species in the Everglades

By Kim Smiley

Have you ever dreamed of hunting pythons?  If so, Florida is hosting the month-long 2016 Python Challenge and all you need to do to join in is to pay a $25 application fee and pass an online test to prove that you can distinguish between invasive pythons and native snake species.

The idea behind the python hunt is to reduce the population of Burmese pythons in the Florida Everglades.  As the number of pythons has increased, there has been a pronounced decline in native species’ populations, including several endangered species.  Researchers have found that 99% of raccoons and opossums and 88% of bobcats have vanished, with declines in nearly every other species.  Pythons are indiscriminate eaters and consume everything from small birds to full-grown deer.  The sheer number of these invasive snakes in the Florida Everglades is having a huge environmental impact.

The exact details of how pythons were released into the Everglades aren’t known, but genetic testing has confirmed that the population originated from pet snakes that either were released or escaped into the wild. Once introduced into the Everglades, the python population quickly grew and thrived.  The first Burmese python was found in the Florida Everglades in 1979, and there are now estimated to be as many as 100,000 of the snakes in the area.

There are many factors that have led to the rapid growth in the python population.  The snakes are able to live in the temperate Florida climate, have plentiful food available, and are successfully reproducing.  Pythons produce a relatively large number of eggs (an average of 40 about every two years) and the large female python protects them.  Hatchling pythons are also larger than most hatchling snakes, which increases their chance of surviving into adulthood.  Very few animals prey on adult pythons.  Researchers have found that alligators occasionally eat pythons, but the relationship between these two top predators goes both ways: pythons have occasionally eaten alligators up to 6 feet in length.  The only other real predators capable of taking down a python are humans, and even that is a challenge.
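
A back-of-the-envelope model hints at how those reproduction numbers translate into explosive growth.  The survival fraction below is invented for illustration; the only figure from the article is the roughly 40 eggs about every two years.

```python
# Back-of-the-envelope python population growth: ~40 eggs per female about
# every 2 years. The hatchling survival rate is invented for illustration,
# and adult mortality is ignored; this is a sketch, not a population study.

females = 10              # hypothetical founding female population
survival_to_adult = 0.05  # hypothetical fraction of hatchlings reaching adulthood

for generation in range(1, 8):                        # each step is ~2 years
    hatchlings = females * 40
    new_females = hatchlings * survival_to_adult / 2  # assume half are female
    females += new_females
    print(f"after ~{generation * 2} years: ~{females:.0f} adult females")

# Even with these modest made-up numbers, the population doubles every
# generation, which is consistent with the rapid spread described above.
```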

Before a python can be hunted, it has to be found, and that is often much easier said than done. Pythons have excellent camouflage and are ambush predators that naturally spend a large percentage of the day hiding.  They are also semi-aquatic and excellent climbers, so they can be found both in the water and in trees.  Despite their massive size (they can grow as long as 20 feet and weigh up to 200 pounds), they blend in so well with the environment that researchers have difficulty finding even snakes fitted with radio transmitters showing their locations.

The last python challenge was held about three years ago and 68 snakes were caught.  While that number may not sound large, it is more snakes than have been caught in any other single month.  The contest also helped increase public awareness of the issue and hopefully discouraged any additional release of pets, of any variety, into the wild.  For the 2016 contest, officials are hoping to improve the outcome by offering prospective hunters on-site training with a guide who will educate them on the swamps and show them areas where snakes are most likely to be found.

To view a Cause Map, a visual root cause analysis format, of this issue click on “Download PDF” above.  A Cause Map intuitively lays out the cause-and-effect relationships that contributed to the problem.

You can check out some of our previous blogs to view more Cause Maps for invasive species if you want to learn more:

Small goldfish can grow into a large problem in the wild

Plan to Control Invasive Snakes with Drop of Dead Mice

Waste Released from Gold King Mine

By Renata Martinez

On August 5, 2015, over 3 million gallons of waste was released from the Gold King Mine into Cement Creek, which then flowed into the Animas River. The orange-colored plume moved over 100 miles downstream from Silverton, Colorado through Durango, reaching the San Juan River in New Mexico and eventually making its way to Lake Powell in Utah (although the EPA stated that the leading edge of the plume was no longer visible by the time it reached Lake Powell a week after the release occurred).

Some of the impacts were immediate.  No workers at the mine site were hurt in the incident, but the collapse of the mine opening and release of water can be considered a near miss because there was potential for injuries. After the release, there were also potential health risks associated with the waste itself, since it contained heavy metals.

Water sources along the river were impacted, and there’s potential that local wells could be contaminated with the waste.  To mitigate the impacts, irrigation ditches that fed crops and livestock were shut down.  The short-term impacts also include closure of the Animas River for recreation from August 5-14, impacting tourism in Southwest Colorado.

The long-term environmental impacts will be evaluated over time, but it appears that the waste may damage ecosystems in and along the plume’s path. There are ongoing investigations to assess the impact to wildlife and aquatic organisms, but so far the health effects from skin contact or incidental ingestion of contaminated river water are not considered significant.

“Based on the data we have seen so far, EPA and the Agency for Toxic Substances and Disease Registry (ATSDR) do not anticipate adverse health effects from exposure to the metals detected in the river water samples from skin contact or incidental (unintentional) ingestion. Similarly, the risk of adverse effects to livestock that may have been exposed to metals detected in river water samples from ingestion or skin contact is low. We continue to evaluate water quality at locations impacted by the release.”

The release occurred while the EPA was working to stabilize the existing adit (a horizontal passage into a mine used for access or drainage). The pressure from the pool of waste that had built up in the mine overcame the strength of the adit, releasing the water into the environment.  The EPA’s scope of work at Gold King Mine also included assessing the ongoing leaks from the mine to determine if the discharge could be diverted to retention ponds at the Red and Bonita sites.

The wastewater had been building up since the adit collapsed in 1995.  Networks of tunnels allow water to flow easily among the estimated 22,000 mine sites in Colorado.  As water flows through the sites, it reacts with pyrite and oxygen to form sulfuric acid.  When this untreated acidic water contacts naturally occurring minerals, it leaches heavy metals such as zinc, lead, cadmium, copper and aluminum into the water.  The mines involved in this incident were known to have been leaking waste for years.  In the 1990s, the EPA agreed to postpone adding the site to the Superfund National Priorities List (NPL), so long as progress was made to improve the water quality of the Animas River.  Water quality improved until about 2005, at which point it was re-assessed.  Again in 2008, the EPA postponed efforts to include this area on the NPL.  From the available information, it’s unclear if this area and the waste pool would have been treated had the site been on the NPL.
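
For reference, the commonly cited simplified form of the pyrite oxidation reaction that drives acid mine drainage is shown below.  This is the textbook first step, not site-specific chemistry from this investigation:

```latex
% Simplified pyrite oxidation producing dissolved iron, sulfate, and acidity
% (sulfuric acid in solution): the textbook first step of acid mine drainage.
\[
\mathrm{FeS_2} + \tfrac{7}{2}\,\mathrm{O_2} + \mathrm{H_2O}
  \;\longrightarrow\; \mathrm{Fe^{2+}} + 2\,\mathrm{SO_4^{2-}} + 2\,\mathrm{H^+}
\]
```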

In response, the “EPA is working closely with first responders and local and state officials to ensure the safety of citizens to water contaminated by the spill.” Additionally, retention ponds have been built below the mine site to treat the water, and continued sampling is taking place to monitor the water.

So how do we prevent this from happening again?  Mitigation efforts to prevent the release were unsuccessful, possibly because the amount of water contained in the mine was underestimated.  Alternatively, if the amount of water in the mine had been anticipated (and the risk more obvious), perhaps the excavation work could have been planned differently to prevent the collapse of the tunnel.  As a local resident, I’m especially curious to learn more facts about the specific incident (how and why it occurred) and how we are going to prevent it from recurring.

The EPA has additional information available (photos, sampling data, historic mine information) for reference: http://www2.epa.gov/goldkingmine

Spider in air monitoring equipment causes erroneously high readings

By Kim Smiley

Smoke drifting north from wildfires in Washington state has raised concerns about air quality in Calgary, but staff decided to check an air monitoring station after it reported an alarming rating of 28 on a 1-10 scale.  What they found was a bug, or rather a spider, in the system that was causing erroneously high readings.

The air monitoring station measures the amount of particulate matter in air by shining a beam of light through a sample of air.  The less light that makes it through the sample, the higher the number of particulates in the sample and the worse the air quality.  You can see the problem that would arise if the beam of light was blocked by a spider.
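
Conceptually this is a light-attenuation calculation: the reported index grows with how much of the beam is lost.  Here is a toy sketch in Python; the calibration constant is invented (chosen so a mostly blocked beam lands near the 28 reported in this story), and real instruments are considerably more sophisticated.

```python
import math

# Toy light-attenuation particulate monitor. The calibration constant is
# invented for illustration; real instruments are far more sophisticated.

CALIBRATION = 10.0  # hypothetical factor mapping attenuation to an index

def air_quality_index(light_in, light_out):
    """More attenuation (less light through the sample) -> higher index."""
    attenuation = math.log(light_in / light_out)
    return CALIBRATION * attenuation

print(air_quality_index(100.0, 60.0))  # moderately smoky air: ~5.1
print(air_quality_index(100.0, 6.0))   # beam mostly blocked by a spider: ~28.1
```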

This example is a great reminder not to rely solely on instrument readings.  Instruments are obviously useful tools, but their output should always be run through a common-sense check.  Does it make sense that the air quality would be so far off the scale?  If there is any question about the accuracy of readings, the instrument should probably be checked, because the unexpected sometimes happens.

In this case, inaccurate readings of 10+ were reported by both Environment Canada and Alberta Environment before the issue was discovered and the air quality rating was adjusted down to a 4.  Ideally, the inaccurate readings would have been identified prior to posting potentially alarming information on public websites.  The timing of the spider’s visit was unfortunate because it coincided with smoky conditions that made the problem more difficult to identify, but extremely high readings should be verified before being made public whenever possible.

Adding a verification step for very high readings prior to publicly posting the information could be one solution to reduce the risk of a similar problem recurring.  A second air monitoring station could also be added to create a built-in double check, because an error would be more obvious if the two stations didn’t report similar readings.
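
Both safeguards are easy to express as a simple pre-publication check.  A sketch, with hypothetical thresholds:

```python
# Sketch of the two suggested safeguards: flag readings beyond the published
# scale, and cross-check paired stations. Both thresholds are hypothetical.

SCALE_MAX = 10.0                # top of the published 1-10 scale
MAX_STATION_DISAGREEMENT = 2.0  # hypothetical allowed gap between stations

def needs_verification(reading_a, reading_b=None):
    """Return True if a reading should be manually verified before posting."""
    if reading_a > SCALE_MAX:
        return True  # off the published scale: verify before posting
    if reading_b is not None and abs(reading_a - reading_b) > MAX_STATION_DISAGREEMENT:
        return True  # paired stations disagree: possible instrument fault
    return False

print(needs_verification(28.0))      # True: the spider scenario
print(needs_verification(4.0, 4.5))  # False: stations agree
print(needs_verification(9.0, 3.0))  # True: stations disagree
```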

Depending on how often insects and spiders crawl into the air monitoring equipment, the equipment itself could be modified to reduce the risk of a similar problem recurring in the future.

To view a Cause Map, a visual root cause analysis, of this issue, click on “Download PDF” above.