Staff Scientist, Silent Spring Institute
The history of flame retardants stretches back at least as far as 450 B.C., when, as Herodotus noted, the Egyptians soaked wood in alum. But it wasn’t until World War II, and the subsequent flood of highly flammable petroleum-based products into the market, that the flame retardants so popular today came into widespread use. The addition of these chemicals to our couches, TVs, and computers has soared in recent decades in response to flammability standards developed in the 1970s. Of course, we all want to protect ourselves and our families from fires. But the very regulations intended to protect us have unintentionally exposed us to chemicals that may be doing more harm than good.
Mounting research suggests that flame retardants may cause neurological and reproductive harm, thyroid disruption, and cancer. What is the latest evidence from animal and human studies? Are some people disproportionately exposed? Do less toxic alternatives exist? How can the emerging research inform chemicals policy reform? We explored these questions on an April 15 teleconference hosted by the CHE-Fertility Working Group and the Women’s Health and Environment Initiative (WHEI).