Animal studies find that a replacement compound for the estrogen-mimicking chemical bisphenol A may also be harmful to human health
Aug 11, 2014 |By Jenna Bilbrey
In 2012 the U.S. Food and Drug Administration banned the sale of baby bottles that contain bisphenol A (BPA), a compound frequently found in plastics. The ban came after manufacturers responded to consumer concerns about BPA's safety, raised by several studies that found the chemical mimics estrogen and could harm brain and reproductive development in fetuses, infants and children.* Since then store shelves have been lined with BPA-free bottles for babies and adults alike. Yet recent research reveals that a common BPA replacement, bisphenol S (BPS), may be just as harmful.
BPA is the starting material for making polycarbonate plastics. Any leftover BPA that is not consumed in the reaction used to make a plastic container can leach into its contents. From there it can enter the body. BPS was a favored replacement because it was thought to be more resistant to leaching. If people consumed less of the chemical, the idea went, it would not cause any or only minimal harm.
Yet BPS is getting out. Nearly 81 percent of Americans have detectable levels of BPS in their urine. And once it enters the body it can affect cells in ways that parallel BPA. A 2013 study by Cheryl Watson at The University of Texas Medical Branch at Galveston found that even picomolar concentrations (less than one part per trillion) of BPS can disrupt a cell’s normal functioning, which could potentially lead to metabolic disorders such as diabetes and obesity, asthma, birth defects or even cancer. “[Manufacturers] put ‘BPA-free’ on the label, which is true. The thing they neglected to tell you is that what they’ve substituted for BPA has not been tested for the same kinds of problems that BPA has been shown to cause. That’s a little bit sneaky,” Watson says.
A 2011 study published in Environmental Health Perspectives found that almost all of the 455 commercially available plastics that were tested leached estrogenic chemicals. This study led to a bitter legal battle between Eastman Chemical Co. and the study’s author, George Bittner, professor of neurobiology at The University of Texas at Austin and founder of CertiChem and PlastiPure, two companies designed to test and discover nonestrogenic plastics.
Bittner claimed in the peer-reviewed report that Eastman’s product Tritan, marketed as completely free of estrogenic leaching, showed such activity. Eastman claimed otherwise and filed a suit. A federal jury ruled in favor of Eastman, saying Bittner’s testing methods were inadequate because the tests were done in vitro (in a petri dish) rather than in vivo (in a live animal).
Since this episode, independent scientists have focused their efforts on in vivo testing. Deborah Kurrasch, from the University of Calgary, turned to zebra fish to study the effects of BPS on embryo development. Brain development in zebra fish is similar to that in humans but much easier to track. When the fish were dosed with the chemicals at concentrations similar to those found in a nearby river, neuronal growth exploded, rising 170 percent in fish exposed to BPA and a whopping 240 percent in those exposed to BPS. As the exposed fish aged, they began zipping around their tank much faster and more erratically than unexposed fish. The researchers concluded that the increased neural growth likely led to hyperactivity. “Part of the problem with endocrine disruptors is they usually have a U-shaped dose response profile,” Kurrasch says. “At very low doses they have activity and then as you increase the dose it drops in activity. Then at higher doses it has activity again.” She found that a very low dose, 1,000-fold lower than the daily recommended amount for humans, can affect neural growth in zebra fish.
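The U-shaped dose-response profile Kurrasch describes can be sketched as a toy curve: activity at very low doses, a dip at intermediate doses, and activity again at high doses. All the parameter values below (peak locations, lobe width, molar doses) are invented for illustration and do not come from the study.

```python
import math

def u_shaped_response(dose_molar, low_peak=1e-12, high_peak=1e-6, width=1.0):
    """Toy U-shaped (non-monotonic) dose-response curve.

    Two Gaussian activity lobes on a log-dose axis: one centered at a
    very low dose, one at a high dose, with a dip in between. Parameter
    values are illustrative assumptions, not measured data.
    """
    x = math.log10(dose_molar)

    def lobe(center):
        # Gaussian bump centered at log10(center) on the log-dose axis
        return math.exp(-((x - math.log10(center)) ** 2) / (2 * width ** 2))

    return lobe(low_peak) + lobe(high_peak)

# Activity is high at the low and high peaks, low at intermediate doses.
for dose in (1e-13, 1e-12, 1e-9, 1e-6, 1e-5):
    print(f"{dose:.0e} M -> relative activity {u_shaped_response(dose):.2f}")
```

The practical consequence, which standard high-dose toxicity screens can miss, is that "safe because the dose is tiny" does not hold for a curve shaped like this: the very low doses are one of the active regions.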
In another study, Hong-Sheng Wang, an associate professor at the University of Cincinnati, found that both BPA and BPS cause heart arrhythmia in rats. He tested almost 50 rats, giving them the chemicals in doses akin to concentrations found in humans. Even at such low concentrations the rats’ hearts began to race, but curiously only those of the females. The researchers found that BPS blocked an estrogen receptor found only in female rats, which led to the disruption of calcium channels—a common cause of heart arrhythmia in humans.
These in vivo studies agree with in vitro studies indicating that BPS is a hazard. But the solution is not as simple as removing bisphenol S from the market, as was done with bisphenol A in baby bottles. The problem, according to Kurrasch, lies in the lack of industry regulation. Currently, no federal agency tests the toxicity of new materials before they are allowed on the market. “We’re paying a premium for a ‘safer’ product that isn’t even safer,” Kurrasch says. There are many types of bisphenols out there, so part of the public’s responsibility “is making sure [manufacturers] don’t just go from BPA to BPS to BPF or whatever the next one is.”
*Clarification (8/12/14): This sentence was edited after posting to more precisely explain how the BPA ban came about.
University of Michigan computer scientist reviews frontier technologies to determine fundamental limits of computer scaling
Algorithms help optimize the placement of parts on an integrated circuit--a way to continue scaling.
August 13, 2014
From their origins in the 1940s as sequestered, room-sized machines designed for military and scientific use, computers have made a rapid march into the mainstream, radically transforming industry, commerce, entertainment and governance while shrinking to become ubiquitous handheld portals to the world.
This progress has been driven by the industry's ability to continually innovate techniques for packing increasing amounts of computational circuitry into smaller and denser microchips. But with miniature computer processors now containing millions of closely packed transistor components of near-atomic size, chip designers are facing both engineering and fundamental limits that have become barriers to the continued improvement of computer performance.
Have we reached the limits to computation?
In a review article in this week's issue of the journal Nature, Igor Markov of the University of Michigan reviews limiting factors in the development of computing systems to help determine what is achievable, identifying "loose" limits and viable opportunities for advancements through the use of emerging technologies. His research for this project was funded in part by the National Science Foundation (NSF).
"Just as the second law of thermodynamics was inspired by the discovery of heat engines during the industrial revolution, we are poised to identify fundamental laws that could enunciate the limits of computation in the present information age," says Sankar Basu, a program director in NSF's Computer and Information Science and Engineering Directorate. "Markov's paper revolves around this important intellectual question of our time and briefly touches upon most threads of scientific work leading up to it."
The article summarizes and examines limitations in the areas of manufacturing and engineering, design and validation, power and heat, time and space, as well as information and computational complexity.
"What are these limits, and are some of them negotiable? On which assumptions are they based? How can they be overcome?" asks Markov. "Given the wealth of knowledge about limits to computation and complicated relations between such limits, it is important to measure both dominant and emerging technologies against them."
Limits related to materials and manufacturing are immediately perceptible. In a material layer ten atoms thick, missing one atom due to imprecise manufacturing changes electrical parameters by ten percent or more. Shrinking designs of this scale further inevitably leads to quantum physics and associated limits.
Limits related to engineering are dependent upon design decisions, technical abilities and the ability to validate designs. While very real, these limits are difficult to quantify. However, once the premises of a limit are understood, obstacles to improvement can potentially be eliminated. One such breakthrough has been in writing software to automatically find, diagnose and fix bugs in hardware designs.
Limits related to power and energy have been studied for many years, but only recently have chip designers found ways to improve the energy consumption of processors by temporarily turning off parts of the chip. There are many other clever tricks for saving energy during computation. But moving forward, silicon chips will not maintain the pace of improvement without radical changes. Atomic physics suggests intriguing possibilities but these are far beyond modern engineering capabilities.
Limits related to time and space can be felt in practice. The speed of light, while a very large number, limits how fast data can travel. Today a signal traveling through copper wires and silicon transistors can no longer traverse a chip within a single clock cycle. A formula limiting parallel computation in terms of device size, communication speed and the number of available dimensions has been known for more than 20 years, but it has only recently become important, now that transistors switch faster than signals cross the interconnections between them. This is why alternatives to conventional wires are being developed; in the meantime, mathematical optimization can be used to reduce the length of wires by rearranging transistors and other components.
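The speed-of-light point lends itself to a back-of-envelope check. The clock rate and the effective on-chip signal speed below are illustrative assumptions (the article gives no specific figures); real on-chip delay is dominated by wire resistance and capacitance, not by light-speed propagation, which only makes the budget tighter.

```python
# Rough check: how far can a signal get in one clock cycle?
C = 3.0e8          # speed of light in vacuum, m/s
CLOCK_HZ = 3.0e9   # assumed ~3 GHz clock rate

cycle_s = 1.0 / CLOCK_HZ
light_per_cycle_mm = C * cycle_s * 1000.0   # distance light covers per cycle

# On-chip signals are far slower than c; assume an effective 10% of c
# as a crude stand-in for RC-limited wire delay.
signal_per_cycle_mm = 0.1 * light_per_cycle_mm

print(f"light: {light_per_cycle_mm:.0f} mm per cycle")
print(f"slow on-chip signal: ~{signal_per_cycle_mm:.0f} mm per cycle")
```

Light itself covers about 100 mm per cycle at 3 GHz, but at a tenth of that speed a signal manages only about 10 mm, which is less than the width of a large die once routing detours are counted. This is why a signal can no longer cross a chip in one cycle, and why shortening wires through placement optimization matters.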
Several key limits related to information and computational complexity have been reached by modern computers. Some categories of computational tasks are conjectured to be so difficult to solve that no proposed technology, not even quantum computing, promises consistent advantage. But studying each task individually often helps reformulate it for more efficient computation.
When a specific limit is approached and obstructs progress, understanding the assumptions made is key to circumventing it. Chip scaling will continue for the next few years, but each step forward will meet serious obstacles, some too powerful to circumvent.
What about breakthrough technologies? New techniques and materials can be helpful in several ways and can potentially be "game changers" with respect to traditional limits. For example, carbon nanotube transistors provide greater drive strength and can potentially reduce delay, decrease energy consumption and shrink the footprint of an overall circuit. On the other hand, fundamental limits--sometimes not initially anticipated--tend to obstruct new and emerging technologies, so it is important to understand them before promising a new revolution in power, performance and other factors.
"Understanding these important limits," says Markov, "will help us to bet on the right new techniques and technologies."