Thursday, November 20, 2014

Everything You Need to Know about the U.S.–China Climate Change Agreement

 

A turning point has been reached in the world's bid to curb global warming

November 12, 2014 By David Biello


HISTORIC AGREEMENT: President Obama's visit to Beijing has yielded a pact to cut greenhouse gas pollution from the world's two biggest emitting nations. Official White House Photo by Chuck Kennedy

The presidents of the world's two most polluting nations agree: something should be done about climate change. And they're just the leaders to do it, per the terms of what President Barack Obama called a "historic agreement" announced November 12 between the U.S. and China. Although neither country has plans to stop burning coal or oil in the near future, both countries now have commitments to reduce the greenhouse gases that result.
"As the world's two largest economies, energy consumers and emitters of greenhouse gases, we have a special responsibility to lead the global effort against climate change," said Obama in a joint press conference with President Xi Jinping, wrapping up a visit to Beijing that included the
joint effort on climate change.
The U.S. will double the speed of its current pollution reduction trajectory, which has seen carbon dioxide emissions fall roughly 10 percent below 2005 levels to date. The country will now aim to reduce greenhouse gas emissions 26 to 28 percent below 2005 levels by 2025. That's in addition to the 17 percent reduction below 2005 levels due by 2020 and shows the kind of five-year planning the U.S. would like to see adopted in international plans to combat climate change. In other words, ever-increasing ambition in reduction targets delivered every five years. "This is an ambitious goal, but it is an achievable goal," Obama said. "It puts us on a path to achieving the deep emissions reductions by advanced economies that the scientific community says is necessary to prevent the most catastrophic effects of climate change."
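That "double the speed" framing can be checked with back-of-the-envelope arithmetic (ours, not the article's): the stated targets imply an average decline of roughly 1.2 percent per year through 2020 and about 2.3 to 2.8 percent per year from 2020 to 2025. A minimal sketch in Python:

def annual_decline(start_frac, end_frac, years):
    # Average annual rate of decline taking emissions from start_frac to
    # end_frac of the 2005 baseline over the given number of years.
    return 1 - (end_frac / start_frac) ** (1 / years)

# Targets from the article: 17% below 2005 by 2020; 26-28% below 2005 by 2025.
print(annual_decline(1.00, 0.83, 15))  # 2005-2020: ~0.012, i.e. ~1.2% per year
print(annual_decline(0.83, 0.74, 5))   # 2020-2025 (26% target): ~2.3% per year
print(annual_decline(0.83, 0.72, 5))   # 2020-2025 (28% target): ~2.8% per year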
Although Chinese leaders are quite fond of five-year plans, their new climate version would not begin until "around 2030," under the terms of the new agreement. That is when the country's CO2 pollution will peak, advancing the Chinese war on pollution onto the invisible front. The nation will also "strive" to reach that peak even sooner. Just as Vice Premier Zhang Gaoli pledged at the United Nations in September "to peak total CO2 pollution as soon as possible," now Xi has followed through in November with the first agreed-on date to cap its global warming pollution by 2030.
The problem is coal, which currently provides more than 70 percent of the energy the fast-developing nation uses. Hundreds of new coal-burning power plants account for how China surpassed the U.S. in the past decade as the world's largest emitter of greenhouse gases. But already several Chinese cities and provinces are experimenting with the kind of capitalist solutions favored by U.S. free marketeers—cap-and-trade programs that in some cases even extend to cover public transportation and buildings themselves. These programs are pilots and may be scaled up for a national program expected in coming years or scrapped in favor of some new national plan or carbon tax. "China has plans for a national market and one that is the most ambitious in the world," says Barbara Finamore, Asia director at the environmental group Natural Resources Defense Council and longtime Beijing resident. "It would dwarf any other carbon market in the world."
More importantly perhaps, the Chinese central government has begun to talk about a cap on coal-burning itself. Statistics show that this year coal use in China slowed for the first time this century, dropping by around 1 percent, according to Greenpeace International. The hopeful sign suggests that peak coal use could come within the next decade or so, although this dip could also be a result of slowing economic growth rather than proactive efforts to slow climate change. But peak coal use is exactly what the Chinese have agreed to ensure now, along with cuts in CO2 intensity by 2020. Already China's National Development and Reform Commission has laid out a plan to cope with climate change through the end of the decade.
That means building even more nuclear power plants, wind farms and hydroelectric dams, and employing even more solar power, of which the country installed 12 gigawatts-worth in 2013. In fact, in 2013 more new clean energy sources were added to the grid in China than fossil fuel-fired power—for the first time ever. China has added several hundred gigawatts-worth of such clean energy—the Three Gorges Dam alone pumps out 22 gigawatts—but hopes to add as much as 1,000 gigawatts of these low-carbon emitting sources by 2030. That would constitute 20 percent of its energy—and roughly the total amount of all electricity produced in the U.S., or all the coal-fired power plants China has built in the last few decades. It's also double what the Chinese have committed to achieve by 2015 in their current Five-Year Plan.
China is already the world leader in new nuclear and new renewable energy sources, and the energy intensity of its economy dropped by more than 19 percent between 2006 and 2010. But this week's commitment will require an acceleration in these already fast-paced transition efforts.
At the same time, the U.S. and China will continue to collaborate on developing the kind of CO2 capture and storage (CCS) that could help clean up coal burning, not just at power plants but also in industry, such as steel- and cement-making. That change will come through increased funding to the U.S.-China Clean Energy Research Center—one of the fruits of the last deal between the two countries in 2009—and at least one "large-scale pilot project." This project will not rely on flushing more oil out of the ground with the CO2 to be buried, as U.S. CCS projects have done, but rather serve as a bid to help solve China's water crisis by using CO2 to produce fresh water from an underground saltwater aquifer. The project is expected to "inject one million tons of CO2 and create approximately 1.4 million cubic meters of freshwater per year," a major technological advance if achieved.
Trade will also play a role: The tariff agreement signed this week by the two countries may also extend to green goods in the future, such as more energy-efficient and resilient building materials. After all, how China builds out its cities in the next few decades will lock in either highly polluting energy for decades—or not.
Already, conventional air pollution like smog is driving change in China, including the mandated shutdown of factories and driving restrictions around Beijing in recent days in a bid to clear the air for world leaders. "Air pollution has become one of the most important issues facing China today, both for social stability and also international reputation," Finamore notes. "They are beginning to decouple coal use and economic growth."
Cleaner air could also come in the form of converting coal to gas or liquids at giant chemical plants before burning it, which would help reduce smog but exacerbate CO2 pollution. Or it could come in the form of CCS, nuclear and renewables. "There is real potential for shifting of coal use in China from most polluted regions inland, which is why a national cap on coal consumption that's mandatory is so important," Finamore says, adding that it's a "real possibility."
The agreement between the two countries that together emit more than 40 percent of global CO2 pollution suggests a strong deal will be signed by the world's nations in Paris in 2015, under the terms of the United Nations Framework Convention on Climate Change, unlike Copenhagen in 2009. Prior to that meeting China and the U.S. pledged to cooperate but made no firm commitments to reduce pollution, resulting in the last-minute hullabaloo to salvage international efforts known as the Copenhagen Accord.
This week's agreement does not mean, however, that the problem of climate change is solved. The U.S. and China are still on pace to add billions of metric tons of CO2 pollution each year into the atmosphere. The Intergovernmental Panel on Climate Change suggests that the world has already put into the atmosphere about half the carbon it can afford to emit to avoid more than 2 degrees Celsius of global warming, and time is running out. Already, global average temperatures are up nearly 1 degree C and atmospheric concentrations of CO2 have touched 400 parts per million for the first time since Homo sapiens sapiens walked on Earth.
The 28 nations of the European Union have pledged to cut greenhouse gas pollution by 40 percent below 1990 levels by 2030, which means, including this new agreement, the countries responsible for more than half of the world's emissions now have plans to cut it. But other major polluters such as Australia, Canada and Japan have all fallen back in their efforts to curb global warming pollution. Australia's pollution jumped after it repealed its carbon tax this year; Canada has repudiated its prior commitment to reduce pollution under the current international global warming agreement; and Japan's pollution has swelled as it burns more natural gas and coal to produce electricity to replace that generated by the nuclear reactors shuttered in the wake of the accident at Fukushima Daiichi. There is no guarantee that the U.S. and China will avoid similar setbacks on the road to less CO2, particularly given the climate denial politics rampant in the U.S. Congress and, in China's case, the imperative to deliver continued economic growth as well as a history of gaming international carbon markets.
And a new major polluter—India—has arrived on the scene. In hoping to repeat China's development success it will burn more and more coal. Just as China has accounted for the bulk of rapidly rising climate-changing pollution in the first decades of the 21st century, India could soon take over driving that growth. But there is hope in the form of cleaner energy sources. New Prime Minister Narendra Modi is "very engaged on renewables and solar power to electrify and provide energy access in India to hundreds of millions who don't have it," says Jennifer Morgan, global director of the climate program at the World Resources Institute, an environmental group.
Plus, the new agreement is nowhere near ambitious enough to meet the reduction targets laid out in the most recent report of the IPCC. No country or league of countries—not even the E.U.—is on track to reduce pollution enough. Policy modeler Chris Hope of the University of Cambridge fed the new commitments plus the E.U. effort into a computer model under the assumption that other countries would continue to allow pollution to grow. He came out with "less than a 1 percent chance of keeping the rise in global mean temperatures below the iconic 2 [degree C] level in 2100. Most likely the rise will be about 3.8 [degrees C]." In other words, more needs to be done and China's level of striving to reach peak pollution before 2030 will prove crucial.
The world still has a long way to go to combat climate change and even this new inadequate agreement will require some tough, perhaps impossible, efforts from the U.S. and China. "We're nowhere near the world we need to be in to achieve our most ambitious climate goals," says Valerie Karplus, director of the Tsinghua-M.I.T. China Energy and Climate Project. "We need to recognize that reality and think where do we go from here."
But just as John Kerry stressed the importance of this issue when he was the lone U.S. senator to hold a press conference back in Copenhagen in 2009, so as Secretary of State in 2013 and 2014 he has pressed the issue of combating climate change with his Chinese counterparts. As a result, the U.S. and China have now begun to show that it just might be possible for the nations of the world to stop global warming for the first time since the Kyoto Protocol was signed in the 1990s. As Obama also said in his joint press conference with Xi: "When we work together, it's good for the United States, it's good for China and it is good for the world."

 


Smartphone Screens Correct for Your Vision Flaws

 

Self-correcting screens on smartphones and iPads tailor themselves to a viewer's vision—no glasses necessary

Nov 18, 2014 By Rachel Nuwer


In the U.S., more than 40 percent of 40-year-olds need eyeglasses for reading, and that figure jumps to nearly 70 percent for people aged 80 and older. “As we get older, refractive errors play more significant roles in our lives,” says Gordon Wetzstein, an assistant professor of electrical engineering at Stanford University.

But glasses and contact lenses are not always ideal. If you are farsighted, for example, you do not need glasses to see traffic while driving, but you do need them to read your speedometer or GPS. The best solution in such cases, Wetzstein says, would be vision-correcting displays—screens that wear the glasses for you.

Wetzstein and his colleagues at M.I.T. (where he was formerly based) and the University of California, Berkeley, have developed just such a screen. The vision-correcting display makes two modifications to a standard high-resolution smartphone or tablet screen. The first is a low-cost, pinhole-covered printed transparency that covers the screen. The second: algorithms coded into the smartphone or tablet that determine the viewer's position relative to the screen and distort the image that is projected, according to his or her prescription. As the distorted image passes through the matrix of pinholes in the transparent screen cover, the hardware-software combination creates errors on the screen that cancel errors in the eye, thus delivering what appears to be a crisp image. The screen can correct for myopia, hyperopia, astigmatism and more complicated vision problems. The team presented the work at the SIGGRAPH conference in August in Vancouver.
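As a rough illustration of the core idea, prefiltering the image so that the eye's own optical blur undoes the distortion, here is a minimal sketch. It is not the team's actual light-field algorithm: it assumes the viewer's defocus can be approximated by a simple Gaussian blur and ignores the pinhole mask that makes the real system well conditioned.

import numpy as np

def gaussian_psf(size, sigma):
    # Toy point-spread function standing in for a defocused eye.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

def prefilter(image, psf, eps=1e-2):
    # Wiener-style inverse filter: compute a screen image that, after being
    # blurred by the eye's PSF, approximates the intended image.
    padded = np.zeros_like(image, dtype=float)
    padded[:psf.shape[0], :psf.shape[1]] = psf
    # Center the kernel at the origin so the filter introduces no shift.
    padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    H = np.fft.fft2(padded)
    G = np.fft.fft2(image)
    F = G * np.conj(H) / (np.abs(H) ** 2 + eps)  # regularized inverse
    return np.real(np.fft.ifft2(F))

image = np.random.rand(128, 128)   # stand-in for screen content
psf = gaussian_psf(15, sigma=2.5)  # stand-in for one viewer's defocus blur
screen = prefilter(image, psf)     # what the display would actually show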

Informal tests on a handful of users have shown that the technology works, Wetzstein says, but large-scale studies are needed to further refine it. In the process, the researchers also plan on developing a slider that can be used to manually adjust the focus of the screen. Wetzstein says that the technology could be a boon for people in developing countries who have easier access to mobile devices than prescription eyewear.


Acid Maps Reveal Worst of Climate Change

 

By David Biello | November 20, 2014


Much of the change in climate change is happening to the ocean. It's not just the extra heat hiding within the waves. The seven seas also absorb a big share of the carbon dioxide released by burning the fossilized sunshine known as coal, natural gas and oil. All those billions and billions of CO2 molecules interact with the brine, making it ever so slightly more acidic over time, and as more and more CO2 gets absorbed, the acidity keeps climbing.


Now scientists have delivered the most comprehensive maps yet of this acid phenomenon, a global picture of the oceans in 2005 against which future scientists can track just how much more acidic the oceans have become.

Global ocean pH map, February 2005. Courtesy of Columbia University's Lamont-Doherty Earth Observatory

The maps are an attempt to bring to visual life a problem that is just as invisible as the excess CO2 piling up in the atmosphere for the past couple of centuries. People cannot see, taste or feel the subtle shift in the seawater and it has taken years of measurement around the world to gather enough data for this new global picture. Calls for such measurements had been made since at least 1956. Charles David Keeling heeded the call back then, producing the Keeling Curve, which tracks rising CO2 levels to this day. But a similar set of measurements for the oceans has been lacking—until now.

Geochemist Taro Takahashi of Columbia University has spent four decades measuring these changes, which amount to a generally basic ocean growing 30 percent more acidic—a change in pH from 8.2 to 8.1. That’s the result of absorbing roughly 25 percent of annual CO2 pollution, which now amounts to 36 billion metric tons in total.

In other words, the oceans currently take in roughly 9 billion metric tons of CO2 each year—a number that is growing. Cumulatively the oceans have absorbed at least 150 billion metric tons of CO2. Despite the size of the increase, it is still just one part in 1,000 of the CO2 already in the ocean.
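Because pH is a logarithmic measure of hydrogen ion concentration, these figures can be checked directly (our arithmetic, not Takahashi's). A quick sketch in Python:

# A drop from pH 8.2 to 8.1 multiplies the hydrogen ion concentration by
# 10**(8.2 - 8.1). The rounded pH values quoted above give ~26%; the widely
# cited ~30% figure comes from unrounded measurements.
print(10 ** (8.2 - 8.1) - 1)   # ~0.26, i.e. about 26% more acidic

# Annual ocean uptake implied by the article's figures:
print(0.25 * 36)               # 25% of 36 billion tons: ~9 billion tons/year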

The new maps highlight the fact that the world ocean is not uniform. The vast belts of seawater that lie in the tropics and temperate zones vary the least in pH while Arctic and Antarctic waters seesaw as plankton blooms, sucking CO2 out of the water, before the cold waters of winter again absorb yet more CO2 and become more acidic. In fact, in the northern winter the Bering Sea becomes the most acidic ocean on Earth, reaching a pH of as low as 7.7. That won’t remain an anomalously low number for long, however, as that is the pH the entire ocean may experience if present trends continue through 2100.

Another anomaly is the Indian Ocean, which is about 10 percent more acidic than the Atlantic or Pacific. The exact reason is a scientific mystery, but perhaps it is because the Indian Ocean is the most isolated of the major basins, dominated by monsoon rains and river drainage rather than by ocean currents mixing its waters, Takahashi suggests.

Globally, if current trends continue, corals and other shell-building sea animals and plants, many of them microscopic, may find it harder to build and sustain their shells. Already, the shells of foraminifera (microscopic amoeba-like animals) are shrinking as a result of more acidic waters in the Southern Ocean around Antarctica. And shell-building life in Arctic oceans may become chemically impossible, forcing shell-building sea snails and sea slugs that thrive there now to migrate to warmer waters, if that's even possible.

Fossil fuel burning is now making oceans more acidic at a rate unseen in at least 300 million years so it will be important to continue to follow up on these measurements. Perhaps there can soon be a “Takahashi Map” to complement the Keeling Curve. Because as marine chemist Scott Doney of Woods Hole Oceanographic Institution wrote in Scientific American in March 2006: “Although the effects may be hidden from people’s view, dramatic alterations in the marine environment appear to be inevitable.”


7 Solutions to Climate Change Happening Now

 

Even as the world continues to spew more carbon pollution, change has begun—and is accelerating

November 17, 2014 - By David Biello


Clean Energy Boom: Big dams and little solar panels like these in China are helping produce electricity with less greenhouse pollution, one of several solutions to climate change advancing around the world.

A man who once flew all the way to Copenhagen from Washington, D.C., just to tell journalists that climate change wasn't that big a deal is likely now to return to lead (or at least strongly influence) the environment committee of the U.S. Senate. As Sen. James Inhofe (R–Okla.) said at that time, in December 2009, he came to Copenhagen to "make sure that nobody is laboring under the misconception that the U.S. Senate is going to do something" about climate change. His thinking likely will not change by 2015; in fact, Inhofe has already decried the new U.S.–China climate agreement as a "nonbinding charade."
Even though the U.S. is responsible for the largest share of carbon dioxide and other greenhouse gases in the atmosphere, the country will not be able to take national legislative action on climate change anytime soon. Despite a president who avers that "those who are already feeling the effects of climate change don't have time to deny it—they're busy dealing with it," the U.S. Congress seems content to let climate change languish as a priority. National climate action is already devolving into a fight over approval of the Keystone XL Pipeline that would enable more oil to flow from Canada's tar sands and implementation of the Clean Power Plan, known to some as the "war on coal." In fact, the likely new Senate Majority Leader Mitch McConnell (R–Ky.) was reelected in part based on a platform reduced to a bumper sticker: "Coal. Guns. Freedom."
As a result of similar complacency or intransigence around the globe, greenhouse gas pollution continues to rise and atmospheric concentrations have now touched 400 parts per million. Australia elected a climate skeptic as prime minister who promptly repealed its carbon tax—and pollution has soared this year. In fact, the world has dawdled long enough that the United Nations Intergovernmental Panel on Climate Change now suggests that technologies to remove CO2 from the atmosphere will be required to prevent too much global warming this century.
But, believe it or not, action on climate change is taking place in the U.S. "We don't have time for a meeting of the Flat Earth Society," Pres. Barack Obama noted back in June 2013. So his administration has moved forward without Congress, through the U.S. Environmental Protection Agency's Clean Power Plan and the new agreement to reduce pollution with China.
Here are seven solutions to global warming that are advancing and gathering steam in the U.S.—and around the world.
1. Clean Power Plants—More than 20 percent of new, large power plants built in the U.S. in 2013 employ sunlight to generate electricity. And that does not include solar panels on people's rooftops, which alone added nearly two gigawatts of capacity last year. In addition, natural gas replaced coal as the largest source of new electricity generation. The shale revolution enabled the U.S. to reduce its carbon dioxide emissions "while maintaining economic growth by switching from coal to gas," notes Nobuo Tanaka, former head of the International Energy Agency (IEA) and now a visiting fellow at Columbia University.
Of the two coal-fired power plants that were completed last year, one was the coal gasification facility in Edwardsport, Ind., which could one day incorporate technology to capture its CO2 emissions. Per megawatt-hour, the new power plant also spews half as much CO2 as the old coal-fired power plant it replaced.
Although the U.S. is currently retiring more nuclear reactors than it is building, the same is not true in China. In addition to its 22 operating reactors, it has 26 under construction and plans for even more. That's because the Chinese leadership sees nuclear power as one of the key ways—besides solar, wind and hydropower—to cut down on the country's coal burning, which is responsible both for its choking air pollution and swelling greenhouse gas emissions.
Without nuclear power, the IEA and other energy experts suggest, curtailing climate change may prove impossible. Regardless, nuclear power today, along with hydropower, provides the bulk of low-CO2 electricity production. "We need everything," Tanaka says. "We need CCS [carbon capture and storage]. We need nuclear. We need renewables."
2. Local Action—In 2008 six states in the northeastern U.S. launched a regional effort to cap CO2 pollution from power plants in the region via the use of alternative sources and energy-efficiency programs. By 2014 the effort had grown to include nine states and had helped cut such greenhouse gas emissions by nearly half. In fact, the program has been so successful that the states agreed to lower the overall cap by an additional 45 percent starting this year.
The Regional Greenhouse Gas Initiative, aka RGGI, or "Reggie" for short, is not the only climate action in the U.S. in the absence of a federal effort. California has launched its own cap-and-trade market and, along with 34 other states, has some form of mandate for electricity generated from less polluting sources, such as the wind, sun and Earth's geothermal heat. Cities large (New York) and small (Dubuque, Iowa) have set targets to reduce greenhouse gas emissions. In fact, cities around the world are reducing CO2 in a bid for self-preservation, and some, such as Copenhagen and Melbourne, are aiming to cancel out any CO2 emitted with CO2 absorbed or otherwise avoided.
As a result of these actions—and the surge in generation from cleaner power sources—U.S. greenhouse gas pollution has dropped by 10 percent since 2005. And the recent election holds out the prospect that RGGI may gain another member state—Pennsylvania.
3. Control of Methane Leaks—Colorado has become the first state in the nation to try to address methane—a greenhouse gas even more potent than CO2. Leaks in pipelines, storage tanks and other infrastructure will have to be fixed within weeks after their discovery, among other new regulations administered by the Colorado Air Quality Control Commission. The Rocky Mountain state is unlikely to remain the only place to act to control such pollution.
In this case even the federal government is taking action. The Obama administration has implemented new rules to capture the methane leaking from garbage dumps, coal mines and the manure piles created by large animal farms. And both the EPA and U.S. Bureau of Land Management are considering rules around methane for the oil and gas industry. At the local level Google has partnered with environmental groups and cities to sniff out leaks in aging gas infrastructure.
Methane currently accounts for 9 percent of U.S. greenhouse gas pollution—a share that is growing because of fracking for oil and for the gas itself. In fact, that number does not include all the methane that is flared—that is, burned off as a precaution to prevent explosions, a process that releases CO2. So action against methane can buy more time to address the problem of rising CO2 pollution.
4. Tougher Emissions and Efficiency Standards—New standards for cars are becoming a global phenomenon, whether it's more stringent rules in China, the European Union or the U.S. As the world heads toward two billion vehicles, making sure that these cars emit less pollution—both smog-forming nitrogen oxides and heat-trapping carbon dioxide—will be crucial, whether it is accomplished via more efficient internal combustion engines or better hybrids. Electric cars, including the Chevy Volt, Nissan LEAF and Tesla Model S, can help, too, provided the electricity does not come from burning fossil fuels.
At the same time, power plants face ever-tighter requirements for air pollution. As a result, U.S. pollution is already 10 percent below 2005 levels and will drop lower. "You can't build a coal plant in the U.S. unless it complies with high emission reductions," says Ethan Zindler, head of policy analysis at Bloomberg New Energy Finance. "People do the low-cost option for compliance, which is to build a new gas plant." One day even natural gas–fired power plants may have to employ CO2 capture and storage, much like new coal-fired power plants in Mississippi and Texas do today.
And new efficiency standards for appliances, from air conditioners to washing machines, have helped keep electricity demand from growing, paired with improvements in basic technology, such as light-emitting diodes, or LEDs, replacing less efficient incandescent bulbs.
5. Greener Farming—The U.S. Department of Agriculture is helping farmers adapt to climate change, even if they don't believe in global warming. Regional climate hubs now provide extension agents with technical support on best practices to deal with a changing climate, including preserving buffer wetlands to cut down on both erosion and flooding. National forests and grasslands will now be managed with the goal of storing CO2, among other aims. Already farmers have begun adopting various techniques to reduce greenhouse gas emissions, including precision agriculture to grow crops efficiently, cover crops to reduce soil erosion and biodigesters to reduce animal waste.
Ultimately, farms may even help address climate change by providing energy crops for biofuels. Such fuel from plants, paired with the kind of CCS used on coal-fired power plants, could even begin to draw down the amount of CO2 already in the atmosphere. And the regrowth of U.S. forests as less land is devoted to farming is already helping restrain global warming.
6. Private Sector Action—It's not just that tech giants like Apple and Google are powering data centers with wind and solar. It's not just big food brands like General Mills and PepsiCo preparing their businesses for the crop and water scarcity predicted by climate change. It's not even the internal carbon prices employed at oil companies like Shell and even ExxonMobil. It's the fact that—outside of coal companies, a few coal-burning utilities and the U.S. Chamber of Commerce—it's hard to find businesses that do not accept the science on global warming or that lack plans to deal with it.
The simple fact is that climate change also means changes to business models, whether the company is an insurance giant like Swiss Re facing extreme weather risk or a utility, like NRG Energy, facing new regulations as well as customers desirous of a new relationship with power production and producers. And for those in the energy business like Alstom, General Electric, Siemens, Toshiba or Westinghouse, fortunes will be made—or lost—as China and other fast-developing countries build out cleaner energy options.
7. New Kinds of Geopolitical Consensus—On November 12 China and the U.S., the world's two most polluting countries, signed an agreement to combat climate change. China will cap its CO2 emissions by 2030 and plans to get 20 percent of its electricity from the wind, sun, dams and fission. The U.S. will ratchet up its pollution reduction trajectory, aiming to double the annual rate of decline from just over 1 percent per year to more than 2 percent per year. Combined with the pledge of the 28 countries of the European Union to reduce CO2 as well in the same time frame, more than half of global pollution is now set to decline in coming decades. "The politics in both countries are easier if both countries move forward together," notes Jennifer Morgan, global director of the climate program at the World Resources Institute (WRI). "If both are acting together, others cannot hide behind them and they can inspire greater ambition."
Nor is the U.S.–China deal the only bright spot internationally. Brazil and Indonesia—the two countries where deforestation is fastest—are both attempting to reverse course.
At the same time, developing countries are aiming to skip fossil fuels in their drive for prosperity. Kenya has become the world's hotspot for geothermal power, thanks to the Rift Valley, where geology brings hot rocks close to the surface. Both Brazil and India have made big pushes for more solar power. And a recent survey by Bloomberg New Energy Finance found that 55 developing countries installed nearly twice as much renewable power as developed countries between 2008 and 2013. "If you're burning diesel to light your home, that isn't great," says Bloomberg's Zindler. "You can now displace that with PV," or photovoltaics, thanks to the price drop in these cleaner technologies in recent years.
That's one thing individuals can do: join the growing ranks of solar homeowners. But, at least in the U.S., there is another action that is probably more important to the fight against climate change in the long run: voting for politicians and policies that promise climate action. As WRI's Morgan notes, to keep the global average temperature from rising more than 2 degrees Celsius by 2100, "the pace and scale of change just needs to increase dramatically."


Subaru revs up TV screens with Viziv GT Vision Gran Turismo

 

The VIZIV GT gets an extreme body kit



The grand parade of Vision Gran Turismos continues. Subaru took the cloth off a Vision Gran Turismo design based on the VIZIV 2 concept it showed earlier this year, hanging a dramatic body kit on an already eye-catching concept car. The car packs the most aggressive styling and specs in Subaru's history, including 591 hp of turbo hybrid power.

Showing no sign of fear in the face of redundancy, Subaru calls its video game car the VIZIV GT Vision Gran Turismo. We'll stick with VIZIV GT. The automaker envisions the car as the ultimate sports car in its Vision for Innovation (VIZIV) concept line, which includes the VIZIV concept from the 2013 Geneva Motor Show and the VIZIV 2 from the 2014 Geneva show.

Starting with the design ethic of the 2014 VIZIV 2, Subaru added bulging, solid shapes, aiming for a design that looks carved from a single block of metal. The blistered fenders, scalped roof line, stretched rear wing and rugged rear diffuser give it the look of a powerful track demon that's ready to devour the air in front of its face before excreting it milliseconds later.

The VIZIV GT draws power from a combination of Subaru's classic horizontally opposed 2.0-liter turbo, backed up with a single high-power motor in front and a pair in back. All told, the power plants combine for 591 hp and 593 lb-ft of torque. The outputs of the drive units can be controlled separately, ramping up torque vectoring for sharpened cornering.

We haven't seen a mock-up of the VIZIV GT at Subaru's corner of the LA Auto Show, so it looks like the car debuts in the virtual world only. It will become available for use in Gran Turismo 6 as part of update 1.14. Take a look at its every exaggerated corner and curve in our gallery.

Source: Gran Turismo

 

Improving memory by suppressing a molecule that links aging to Alzheimer's disease

 


 

In a new study conducted by the Sagol Department of Neurobiology at the University of Haifa and published recently in the Journal of Neuroscience, researchers report that they've found a way to improve memory by manipulating a specific molecule that is known to function poorly in old age and is closely linked to Alzheimer's disease.

The researchers even succeeded, for the first time, in manipulating the molecule's operations without creating any cognitive impairment.

"We know that in Alzheimer's, this protein, known as PERK, doesn't function properly. Our success in manipulating its expression without causing any harm to the proper functioning of the brain paves the way for improving memory and perhaps even slowing the pathological development of diseases like Alzheimer's," said Prof. Kobi Rosenblum, who heads the lab in which the research was done.

Previous studies at the University of Haifa and other labs throughout the world had shown that the brain's process of formulating memory is linked to the synthesis of proteins; a high rate of protein production leads to a strong memory that is retained over the long term, while a slow rate of protein production leads to weak memories that are less likely to be impressed on a person's long-term memory and are thus forgotten.

In the current study, the researchers, Dr. Hadile Ounallah-Saad and Dr. Vijendra Sharma, both of whom work in Prof. Rosenblum's lab at the Sagol Department of Neurobiology, sought to examine the activity of a protein called eIF2 alpha, which is known as the "spigot," or regulator, that determines the pace of protein synthesis in the brain during memory formation.

From earlier studies the researchers knew that there are three main molecules that act on the protein and either make it work or stop it from working. During the first stage they sought to determine the relative importance and role of each of the molecules that control the activity of eIF2 alpha and, as a result, the ability to create memories. After doing tests at the tissue and cell levels, the researchers discovered that the main molecule controlling eIF2 alpha's activity was PERK.

"The fact that we identified the PERK as the primary controller had particular significance," said Dr. Ounallah-Saad. "Firstly, of course, we had identified the dominant component. Secondly, from previous studies we already knew that in generative diseases like Alzheimer's, PERK performs deficiently. Third, PERK acts on various cells, including neurons, as a monitor and controller of metabolic stress. In other words, we found a molecule that has a major impact on the process of creating and formulating memory, and which we know performs deficiently in people with Alzheimer's disease."

During the second stage of the study, the researchers sought to examine whether they could manipulate this molecule in rats in a way that would improve memory. To do this they used two accepted methods, one using a drug called a small-molecule inhibitor and the other making a genetic change to the brain cells using a type of virus also used in gene therapy.

After paralyzing PERK's activity or reducing its expression through gene therapy (which was done with the help of Dr. Efrat Edry, of the University's Center for Gene Manipulation in the Brain), the researchers measured a 30% increase in the memory of either positive or negative experiences. The rats also demonstrated improved long-term memory and enhanced behavioral plasticity, becoming better able to "forget" a bad experience. In other words, on a behavioral level it was clear that manipulating PERK by either of two methods improved memory and cognitive abilities.

When the researchers examined the tissues on a cell and molecular level, they discovered that the steps they'd taken had indeed stopped the expression of PERK, which allowed the "spigot" -- the eIF2 alpha protein -- to perform better and increase the pace of protein synthesis. What is more, there was a clear correlation between memory function and the degree to which PERK was suppressed; the more efficiently PERK was suppressed, the better the memory function.

But the researchers faced another problem. Previous studies that had manipulated PERK in general in genetically engineered animals led to fixated behavior. "The brain operates in a most sophisticated fashion, with each action closely linked to many other actions," said Dr. Ounallah-Saad. "In our study we succeeded in maintaining such control of PERK that it didn't influence the retrieval of existing memories or do any other cognitive damage."

"With this study we proved that we are capable of strengthening the process of protein synthesis in the brain and of creating stronger memories that last a long time," said Prof. Rosenblum. "The moment we did this by manipulating a molecule that we know performs deficiently in people with Alzheimer's and is linked to the aging process, we have paved the way for the possible development of drugs that can slow the progress of incurable diseases like degenerative brain conditions, Alzheimer's chief among them."

Taking antibiotics during pregnancy increases risk for child becoming obese

 

November 18, 2014

Columbia University's Mailman School of Public Health



A study just released by Columbia University's Mailman School of Public Health found that children who were exposed to antibiotics in the second or third trimester of pregnancy had a higher risk of childhood obesity at age 7. The research also showed that for mothers who delivered their babies by a Caesarean section, whether elective or non-elective, there was a higher risk for obesity in their offspring. Study findings are published online in the International Journal of Obesity.

Although previous studies have shown that antibiotics administered early in life may be associated with increased risk of obesity in childhood, this is the first study reporting that maternal antibiotic use in the second or third trimester of pregnancy increases the risk of offspring obesity. Antibiotics affect microbes in the mother and may enter fetal circulation via the placenta. Researchers are beginning to understand that the bacteria that normally inhabit our colon have important roles in maintaining our health, and that imbalances in these bacterial populations can cause a variety of illnesses. Disturbances in the normal transmission of bacteria from the mother to the child are thought to place the child at risk for several health conditions, including obesity.

The study is based on data from healthy, non-smoking, pregnant women who were recruited for the Northern Manhattan Mothers and Children Study from prenatal clinics at New York-Presbyterian Hospital and Harlem Hospital Center between 1998 and 2006. Of 727 mothers enrolled in the study, 436 mothers and their children were followed until 7 years of age. Of these 436 children, 16 percent had mothers who used antibiotics in the second or third trimester. This work is part of the Columbia Center for Children's Environmental Health's efforts to understand how to promote healthy growth and development throughout childhood and adolescence.

The children exposed to antibiotics in this timeframe had an 84-percent higher risk of obesity, compared with children who were not exposed.

"Our findings on prenatal antibiotics and risk for offspring obesity are novel, and thus warrant replication in other prospective cohort studies," said Noel Mueller, PhD, postdoctoral research fellow at Columbia University's Mailman School of Public Health and Institute of Human Nutrition. "If these findings hold up, they suggest new mechanisms through which childhood growth trajectories are influenced at the earliest stages of development. Our findings should not discourage antibiotic use when they are medically needed, but it is important to recognize that antibiotics are currently overprescribed."

Independent of prenatal antibiotic usage, delivery by Caesarean section was also associated with a 46-percent higher risk of childhood obesity. The researchers controlled for maternal age, ethnicity, birth weight, sex, breastfeeding in the first year, and gestational antibiotics or delivery mode.

"Our findings are consistent with a series of papers that looked at data on Caesarean section. While earlier studies suggested that childhood outcomes differ by whether the Caesarean section was elective or non-elective, we did not observe such evidence," said Andrew Rundle, DrPH, associate professor of Epidemiology at the Mailman School of Public Health. "Thus, our findings provide new evidence in support of the hypothesis that Caesarean section independently contributes to the risk of childhood obesity."

Similar to antibiotic use during pregnancy, Caesarean section birth is thought to reduce the normal transmission of bacteria from the mother to the child and to disturb the balance of bacteria in the child. "Strategies to reduce medically unnecessary C-sections and to provide the infant with health promoting bacteria after C-section need to be researched," noted Dr. Mueller.

"Further research is needed on how mode of delivery, antibiotic use during pregnancy and other factors influence the establishment of the ecosystem of bacteria that inhabit each of us," said Dr. Rundle. "This research will help us understand how to create an early platform to support the healthy growth and development of children."


Story Source:

The above story is based on materials provided by Columbia University's Mailman School of Public Health. Note: Materials may be edited for content and length.


Journal Reference:

  1. N T Mueller, R Whyatt, L Hoepner, S Oberfield, M G Dominguez-Bello, E M Widen, A Hassoun, F Perera, A Rundle. Prenatal exposure to antibiotics, cesarean section and risk of childhood obesity. International Journal of Obesity, 2014; DOI: 10.1038/ijo.2014.180

 

Unique sense of 'touch' gives a prolific bacterium its ability to infect anything

 

Pseudomonas is the first pathogen found to initiate infection after merely attaching to the surface of a host, Princeton University and Dartmouth College researchers report in the journal Proceedings of the National Academy of Sciences. This mechanism means that the bacteria, unlike most pathogens, do not rely on a chemical signal specific to any one host; they just have to make contact with any organism that's ripe for infection.

The researchers found, however, that the bacteria could not infect another organism when a protein on their surface known as PilY1 was disabled. This suggests a possible treatment that, instead of attempting to kill the pathogen, targets the bacteria's own mechanisms for infection.

Corresponding author Zemer Gitai, a Princeton associate professor of molecular biology, explained that the majority of bacteria, viruses and other disease-causing agents depend on "taste," as in they respond to chemical signals unique to the hosts with which they typically co-evolved. Pseudomonas, however, through their sense of touch, are able to thrive on humans, plants, animals, numerous human-made surfaces, and in water and soil. They can cause potentially fatal organ infections in humans, and are the culprit in many hospital-acquired illnesses such as sepsis. The bacteria are largely unfazed by antibiotics.

"Pseudomonas' ability to infect anything was known before. What was not known was how it's able to detect so many types of hosts," Gitai said. "That's the key piece of this research -- by using this sense of touch, as opposed to taste, Pseudomonas can equally identify any kind of suitable host and initiate infection in an attempt to kill it."

The researchers found that only two conditions must be satisfied for Pseudomonas to launch an infection: surface attachment and "quorum sensing," a common bacterial mechanism wherein the organisms can detect that a large concentration of their kind is present. The researchers focused on the surface-attachment cue because it truly sets Pseudomonas apart, said Gitai, who worked with first author Albert Siryaporn, a postdoctoral researcher in Gitai's group; George O'Toole, a professor of microbiology and immunology at Dartmouth; and Sherry Kuchma, a senior scientist in O'Toole's laboratory.

To demonstrate the bacteria's wide-ranging lethality, Siryaporn infected ivy cells with the bacteria, then introduced amoebas to the same sample; Pseudomonas immediately detected and quickly overwhelmed the single-celled animals. "The bacteria don't know what kind of host it's sitting on," Siryaporn said. "All they know is that they're on something, so they're on the offensive. It doesn't draw a distinction between one host or another."

When Siryaporn deleted the protein PilY1 from the bacteria's surface, however, the bacteria lost their ability to infect and thus kill the test host, an amoeba. "We believe that this protein is the sensor of surfaces," Siryaporn said. "When we deleted the protein, the bacteria were still on a surface, but they didn't know they were on a surface, so they never initiated virulence."

Because PilY1 is on a Pseudomonas bacterium's surface and required for virulence, it presents a comprehensive and easily accessible target for developing drugs to treat Pseudomonas infection, Gitai said. Many drugs are developed to target components in a pathogen's more protected interior, he said.

Kerwyn Huang, a Stanford University assistant professor of bioengineering, said that the research is an important demonstration of an emerging approach to treating pathogens -- by disabling rather than killing them.

"This work indicates that the PilY1 sensor is a sort of lynchpin for the entire virulence response, opening the door to therapeutic design that specifically disrupts the mechanical cues for activating virulence," said Huang, who is familiar with the research but had no role in it.

"This is a key example of what I think will become the paradigm in antivirals and antimicrobials in the future -- that trying to kill the microbes is not necessarily the best strategy for dealing with an infection," Huang said. "[The researchers'] discovery of the molecular factor that detects the mechanical cues is critical for designing such compounds."

Targeting proteins such as PilY1 offers an avenue for combating the growing problem of antibiotic resistance among bacteria, Gitai said. Disabling the protein in Pseudomonas did not hinder the bacteria's ability to multiply, only to infect.

Antibiotic resistance results when a drug kills all of its target organisms, but leaves behind bacteria that developed a resistance to the drug. These mutants, previously in the minority, multiply at an astounding rate -- doubling their numbers roughly every 30 minutes -- and become the dominant strain of pathogen, Gitai said. If bacteria had their ability to infect disabled, but were not killed, the mutant organisms would be unlikely to take over, he said.
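The arithmetic behind that takeover is stark (a toy calculation of ours, ignoring resource limits):

# One surviving resistant cell, doubling every 30 minutes, undergoes
# 48 doublings in a day.
print(2 ** 48)   # 281,474,976,710,656 -- roughly 2.8e14 descendants in 24 hours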

"I'm very optimistic that we can use drugs that target PilY1 to inhibit the whole virulence process instead of killing off bacteria piecemeal," Gitai said. "This could be a whole new strategy. Really what people should be doing is screening drugs that inhibit virulence but preserve growth. This protein presents a possible route by which to do that."

PilY1 also is found in other bacteria with a range of hosts, Gitai said, including Neisseria gonorrhoeae and the large bacterial genus Burkholderia, which, respectively, cause gonorrhea in humans and are, along with Pseudomonas, a leading cause of lung infection in people with cystic fibrosis. It is possible that PilY1 has a similar role in detecting surfaces and initiating infection for these other bacteria, and thus could be a treatment target.

Frederick Ausubel, a professor of genetics at Harvard Medical School, said that the research could help explain how opportunistic pathogens are able to infect multiple types of hosts. Recent research has revealed a lot about how bacteria initiate an infection, particularly via quorum sensing and chemical signals, but the question about how that's done across a spectrum of unrelated hosts has remained unanswered, said Ausubel, who is familiar with the research but had no role in it.

"A broad host-range pathogen such as Pseudomonas cannot rely solely on chemical cues to alert it to the presence of a suitable host," Ausubel said.

"It makes sense that Pseudomonas would use surface attachment as one of the major inputs to activating virulence, especially if attachment to surfaces in general rather than to a particular surface is the signal," he said. "There is probably an advantage to activating virulence only when attached to a host cell, and it is certainly possible that other broad host-range opportunistic pathogens utilize a similar strategy."

New type of silicon could find use in solar cells and LEDs

 

A view through the channels of the new zeolite-type allotrope of silicon (Image: Timothy Strobel)

You probably wouldn't be reading this if it weren't for silicon. It's the second most-abundant element in the Earth's crust as well as the key to modern technology – used in the integrated circuits that power such electronics as computers, mobile phones, and even some toasters and refrigerators. It's also used in compound form in building, ceramics, breast implants, and many other areas. And now the ubiquitous element may have a plethora of new applications, thanks to a team of Carnegie scientists who synthesized an allotrope (new/different physical form) with the chemical formula Si24.

The diamond-structured form of silicon normally used in technology applications has a semiconducting property called an indirect band gap, which differs from a direct band gap in that it requires an extra step to excite bound electrons into a free state so that they can participate in electrical conduction. In a direct band gap semiconductor only two entities need to interact: a photon excites an electron straight across the gap. But an indirect band gap semiconductor requires a third entity – a lattice vibration called a phonon, which supplies the missing momentum – because the minimum energy state of the conduction band and the maximum energy state of the valence band occur at different values of momentum.
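In textbook terms (our summary, not from the article), both energy and crystal momentum must be conserved in the transition, and a visible photon carries negligible momentum on the scale of the Brillouin zone, so an indirect transition must borrow momentum from a phonon:

\begin{align*}
\text{direct gap:}   &\quad \hbar\omega_{\text{photon}} = E_g, &\mathbf{k}_c &\approx \mathbf{k}_v \\
\text{indirect gap:} &\quad \hbar\omega_{\text{photon}} \pm \hbar\Omega_{\text{phonon}} = E_g, &\mathbf{k}_c - \mathbf{k}_v &= \pm\,\mathbf{q}_{\text{phonon}}
\end{align*}

This three-body requirement is why ordinary silicon is such a poor light emitter, and why a quasi-direct gap allotrope is notable.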

This new form of silicon is a quasi-direct band gap material, which means not only that it can conduct electricity more efficiently than diamond-structured silicon but also that it can absorb and emit light – a property never before achieved in silicon. (I say quasi-direct because it is technically a very small and almost flat indirect band gap.) These properties make it ripe for use in next-generation solar cells, LEDs and other semiconductor technologies.

To create Si24, the researchers first formed a polycrystalline compound of silicon and sodium (Na4Si24) with help from a tantalum capsule, very high temperature, and a 1,500 ton multi-anvil press that gradually reached a pressure of 10 gigapascals (1,450,377 pounds per square inch). This compound was then "degassed" in a vacuum at 400 Kelvin (260 F) for eight days, after which they had pure Si24 in an open framework called a zeolite-type structure.


The structure comprises five-, six- and eight-membered silicon rings through which small atoms, such as sodium and lithium, and molecules, such as water, can diffuse, with potential applications in electrical energy storage and molecular-scale filtering, among other things.

Si24 could be just the tip of the iceberg for desirable new materials formed at high pressure, the researchers suggest. Lead researcher Timothy Strobel has gone so far as to call high-pressure precursor synthesis "an entirely new frontier in novel energy materials" that goes above and beyond silicon. And the stability of the new structures at atmospheric pressure means that low-pressure methods such as chemical vapor deposition could potentially allow large-scale production.

A paper describing the research was published in the journal Nature Materials.

Source: Carnegie Institution for Science


Portable Indoor Air Quality Monitor

 

Mon, 09/29/2014 - 1:14pm

 

The new AQ Expert from E Instruments International provides monitoring and real-time data logging for indoor air quality testing in a variety of settings, including laboratories and cleanrooms.

The AQ Expert includes up to 11 parameters and detects a wide variety of gases and atmospheric characteristics: carbon dioxide, carbon monoxide, relative humidity, temperature, volatile organic compounds, oxygen, nitrogen dioxide, ozone, hydrogen sulfide, sulfur dioxide, and barometric air pressure.

The monitor features a large internal memory, a lithium-ion rechargeable battery, dual Type K thermocouple (TcK) temperature inputs, Bluetooth, and an active internal sampling pump. A handheld probe is optional.

E Instruments International

 

 

Photoelectric Sensor

 

Fri, 10/31/2014 - 12:19pm

 

SICK, one of the world’s leading manufacturers of sensors, safety systems, machine vision, encoders and automatic identification solutions for factory and logistics automation, has announced the launch of the DeltaPac MultiTask photoelectric sensor. This sensor precisely counts, detects and differentiates between successive packaging items on conveyor belts.

The IP 67-rated DeltaPac, which doesn’t require any backing up, buffering or mechanical product separation, optimizes product flow and reduces the amount of hardware in packaging applications. This small, energy-efficient solution improves quality and reduces downtime and product damage caused by collisions. The DeltaPac features patented Delta-S-Technology, which uses four PinPoint 2.0 LEDs and two receivers with SICK-specific SIRIC ASIC technology. This technology enables the DeltaPac to seamlessly detect corners, folds and grooves regardless of object color, size, surface or background.

The DeltaPac is a pre-configured sensor that uses SICK’s flexible SOPAS-ET configuration software for easy installation, operation and customization. With a sensing range from 30 to 40 mm to the front edge of the object, the DeltaPac is able to detect up to 200,000 packages per hour. It can be used for controlling packaging, triggering downstream processes and ensuring package quality.

SICK

 

Fluorescent Light Tester

 

Tue, 11/18/2014 - 8:55am

 


Fluke Corp. has introduced the Fluke 1000FLT Fluorescent Light Tester. The tester eliminates the guesswork of maintaining fluorescent lamps by performing the essential checks in less than 30 seconds: a lamp test, a ballast test, non-contact voltage detection, a pin continuity test and ballast discrimination. The point-and-shoot ballast discriminator in the 1000FLT speeds the replacement of old magnetic ballasts with new energy-efficient electronic models by quickly identifying exactly what type of ballast is in the fixture before the technician climbs the ladder. There's no need to remove the bulbs or to make contact with live circuitry: simply point the 1000FLT at the glowing bulb and determine the ballast type. The tester comes with a rugged, metal test rod that extends up to 31 in (79 cm) and a belt-style holster so the tester is always within easy reach.

Fluke Corp. -  www.fluke.com

 

Nondestructive testing – What is it?

 

Nondestructive testing or Non-destructive testing (NDT) is a wide group of analysis techniques used in science and industry to evaluate the properties of a material, component or system without causing damage.

The terms Nondestructive examination (NDE), Nondestructive inspection (NDI), and Nondestructive evaluation (NDE) are also commonly used to describe this technology. Because NDT does not permanently alter the article being inspected, it is a highly valuable technique that can save both money and time in product evaluation, troubleshooting, and research. Common NDT methods include ultrasonic, magnetic-particle, liquid penetrant, radiographic, remote visual inspection (RVI), eddy-current testing, and low coherence interferometry. NDT is commonly used in forensic engineering, mechanical engineering, electrical engineering, civil engineering, systems engineering, aeronautical engineering, medicine, and art.

Methods

NDT methods may rely upon use of electromagnetic radiation, sound, and inherent properties of materials to examine samples. This includes some kinds of microscopy to examine external surfaces in detail, although sample preparation techniques for metallography, optical microscopy and electron microscopy are generally destructive, as the surfaces must be made smooth through polishing or the sample must be electron transparent in thickness. The inside of a sample can be examined with penetrating radiation, such as X-rays or neutrons. Sound waves are utilized in the case of ultrasonic testing.

Contrast between a defect and the bulk of the sample may be enhanced for visual examination by the unaided eye by using liquids to penetrate fatigue cracks. One method (liquid penetrant testing) involves using dyes, fluorescent or non-fluorescent, in fluids for non-magnetic materials, usually metals. Another commonly used NDT method, used on ferrous materials, involves applying fine iron particles (either suspended in liquid or as dry dust) to a part while it is in an externally magnetized state (magnetic-particle testing). The particles are attracted to leakage fields within the test object and form on the object's surface. Magnetic-particle testing can reveal surface and some sub-surface defects within the part.

The thermoelectric method (based on the Seebeck effect) uses the thermal properties of an alloy to quickly and easily characterize many alloys. The chemical test, or chemical spot test method, utilizes the application of sensitive chemicals that can indicate the presence of individual alloying elements. Electrochemical methods, such as electrochemical fatigue crack sensors, utilize the tendency of metal structural material to oxidize readily in order to detect progressive damage.
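As a concrete instance of one of these methods (our illustration, not from the text above): ultrasonic pulse-echo thickness gauging times an echo's round trip through the part and converts it to a depth using the material's known sound velocity.

# Ultrasonic pulse-echo thickness measurement (illustrative numbers only):
# a pulse travels to the back wall and returns, so thickness = v * t / 2.
v_steel = 5900.0      # typical longitudinal sound velocity in steel, m/s
round_trip = 6.8e-6   # hypothetical measured echo time, seconds
thickness_mm = v_steel * round_trip / 2 * 1000
print(f"{thickness_mm:.1f} mm")   # ~20.1 mm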

Analyzing and documenting a nondestructive failure mode can also be accomplished using a high-speed camera recording continuously (movie loop) until the failure is detected. Detecting the failure can be accomplished using a sound detector or stress gauge that produces a signal to trigger the high-speed camera. These high-speed cameras have advanced recording modes to capture some nondestructive failures. After the failure, the high-speed camera stops recording. The captured images can be played back in slow motion, showing precisely what happened before, during and after the nondestructive event, image by image.

 

Source: Wikipedia