Sunday, September 14, 2014

Nicotine withdrawal reduces response to rewards across species

 

September 12, 2014

Florida Atlantic University

While more than half of U.S. smokers try to quit every year, less than 10 percent manage to remain smoke-free, and relapse commonly occurs within 48 hours of smoking cessation. In a first-of-its-kind study on nicotine addiction, scientists measured a behavior that can be quantified the same way across species such as humans and rats: response to rewards during nicotine withdrawal. Learning more about withdrawal, and why quitting is so difficult, could lead to more effective treatments to help smokers quit.


Cigarette smoking is a leading cause of preventable death worldwide and is associated with approximately 440,000 deaths in the United States each year, according to the U.S. Centers for Disease Control and Prevention, yet nearly 20 percent of the U.S. population continues to smoke cigarettes. While more than half of U.S. smokers try to quit every year, less than 10 percent manage to remain smoke-free, and relapse commonly occurs within 48 hours of smoking cessation. Learning more about withdrawal, and why quitting is so difficult, could lead to more effective treatments to help smokers quit.

In a first-of-its-kind study on nicotine addiction, scientists measured a behavior that can be quantified the same way across species such as humans and rats: response to rewards during nicotine withdrawal. Findings from the study were published online on Sept. 10, 2014 in JAMA Psychiatry.

Response to reward is the brain's ability to derive and recognize pleasure from natural things such as food, money and sex. The reduced ability to respond to rewards is a behavioral process associated with depression in humans. In prior studies of nicotine withdrawal, investigators used very different behavioral measurements across humans and rats, limiting our understanding of this important brain reward system.

Using a translational behavioral approach, Michele Pergadia, Ph.D., associate professor of clinical biomedical science in the Charles E. Schmidt College of Medicine at Florida Atlantic University, who completed the human study while at Washington University School of Medicine, Andre Der-Avakian, Ph.D., who completed the rat study at the University of California San Diego (UCSD), and colleagues, including senior collaborators Athina Markou, Ph.D. at UCSD and Diego Pizzagalli, Ph.D. at Harvard Medical School, found that nicotine withdrawal similarly reduced reward responsiveness in human smokers -- particularly those with a history of depression -- as well as in nicotine-treated rats.

Pergadia, one of the lead authors, notes that replication of experimental results across species is a major step forward, because it allows for greater generalizability and a more reliable means for identifying behavioral and neurobiological mechanisms that explain the complicated behavior of nicotine withdrawal in humans addicted to tobacco.

"The fact that the effect was similar across species using this translational task not only provides us with a ready framework to proceed with additional research to better understand the mechanisms underlying withdrawal of nicotine, and potentially new treatment development, but it also makes us feel more confident that we are actually studying the same behavior in humans and rats as the studies move forward," said Pergadia.

Pergadia and colleagues plan to pursue future studies that will include a systematic study of depression vulnerability as it relates to reward sensitivity, the course of withdrawal-related reward deficits, including effects on relapse to smoking, and identification of processes in the brain that lead to these behaviors.

Pergadia emphasizes that the ultimate goal of this line of research is to improve treatments that manage nicotine withdrawal-related symptoms and thereby increase success during efforts to quit.

"Many smokers are struggling to quit, and there is a real need to develop new strategies to aid them in this process. Therapies targeting this reward dysfunction during withdrawal may prove to be useful," said Pergadia.

 



Story Source:

The above story is based on materials provided by Florida Atlantic University. Note: Materials may be edited for content and length.


Journal Reference:

  1. Michele L. Pergadia, Andre Der-Avakian, Manoranjan S. D’Souza, Pamela A. F. Madden, Andrew C. Heath, Saul Shiffman, Athina Markou, Diego A. Pizzagalli. Association Between Nicotine Withdrawal and Reward Responsiveness in Humans and Rats. JAMA Psychiatry, 2014; DOI: 10.1001/jamapsychiatry.2014.1016

Liposome research meets nanotechnology to improve cancer treatment

 

 

September 11, 2014

Boise State University

In treating cancer, chemotherapy and radiotherapy are two of the best weapons in a doctor's arsenal. Reports have shown that, ideally, both methods would be employed at the same time, but doing so produces levels of toxicity that often are deadly. To reduce the systemic toxicity inherent to chemotherapy, the drugs can be delivered into solid tumors using liposomes, which are nanoscale vesicles made from fats and loaded with anti-cancer drugs, researchers report.


Liposome diagram.

In the race to find more effective ways to treat cancer, Boise State University biophysicist Daniel Fologea is working outside the rules of general mathematics that say one plus one equals two. In his world, one plus one adds up to a whole lot more.

While radiotherapy can precisely target just the tumor site, systemic chemotherapy casts a wide net, sending drugs speeding through the entire body in an attempt to kill cancer cells -- while also killing many healthy cells. Neither method is highly effective when applied alone, so separate sessions of chemotherapy and radiotherapy are required when fighting solid tumors.

Reports have shown that, ideally, both methods would be employed at the same time. But doing so produces levels of toxicity that often are deadly. To reduce the systemic toxicity inherent to chemotherapy, the drugs can be delivered into solid tumors using liposomes, which are nanoscale vesicles made from fats and loaded with anti-cancer drugs. Liposomes accumulate within the tumor on their own, but the loaded drugs escape their casing only very slowly.

A patent awarded in August 2014 to Fologea, a professor in the Department of Physics, and co-researchers from the University of Arkansas holds the promise of combining the oomph of chemotherapy with the precision of radiotherapy, without harm to healthy cells.

In the new approach, said Fologea, "The liposomes are designed to release their precious cargo upon exposure to x-rays. Not only does this target where the medication goes, it also allows for a huge concentration of the drug to be released at once at the tumor site, thus increasing its efficacy. In addition, this combined modality of treatment employing concomitant radio and chemotherapy is supra-additive, which means it is several times more efficient than each therapy applied alone."

Here's how it works: the liposomes have small scintillating nanoparticles embedded within them. When hit with x-rays, the nanoparticles emit ultraviolet (UV) light. The UV light triggers the release of Ca2+ trapped in a photolabile cage inside the liposomes. The free Ca2+ activates an enzyme called phospholipase A2 that starts chewing up the fats in the wall of the liposomes, triggering the fast release of the drug.
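Because the cascade is a chain of sequential steps, its overall timing can be sketched as simple first-order kinetics. The Python sketch below is purely illustrative: the rate constants and the linear kinetics are assumptions made for the sake of the example, not parameters from the patent.

```python
# Minimal sketch (not the patented design): model the x-ray-triggered
# release cascade as three chained first-order steps. All rate constants
# are hypothetical illustration values, not measured parameters.
def simulate_release(t_end=60.0, dt=0.01,
                     k_uncage=0.5,    # UV-driven Ca2+ uncaging rate (1/s), assumed
                     k_activate=0.3,  # Ca2+ activation of phospholipase A2 (1/s), assumed
                     k_release=0.8):  # enzyme-driven drug release rate (1/s), assumed
    """Track caged Ca2+, active enzyme, and released drug fraction over time."""
    caged_ca = 1.0       # fraction of Ca2+ still caged
    enzyme = 0.0         # fraction of phospholipase A2 activated
    drug_inside = 1.0    # fraction of drug still encapsulated
    steps = int(t_end / dt)
    for _ in range(steps):
        caged_ca -= k_uncage * caged_ca * dt          # UV light frees caged Ca2+
        free_ca = 1.0 - caged_ca
        enzyme += k_activate * (free_ca - enzyme) * dt  # free Ca2+ activates the enzyme
        drug_inside -= k_release * enzyme * drug_inside * dt  # enzyme degrades the wall
    return 1.0 - drug_inside

print(f"fraction of drug released after 60 s: {simulate_release():.2f}")
```

The point of the toy model is the shape of the curve: release stays near zero until the upstream steps (uncaging, enzyme activation) have run, then rises quickly -- the burst-release behavior described above.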

Now that the technique is patented, the researchers still expect several years of testing before the method is approved and available for cancer patients.

In the meantime, Fologea has completed the initial phase of another method that should provide similar results using only materials previously approved by the FDA for treatment of cancer and other diseases. This approach will pave the way for earlier translational studies.

Working with Boise State biology professor Cheryl Jorcyk, he is looking for ways to put antibodies on the surface of the liposomes, allowing them to recognize and attack cancer cells circulating in the body. A separate effort with Boise State biology professor Denise Wingett is developing liposomes for the treatment of diabetes.

 



Story Source:

The above story is based on materials provided by Boise State University. The original article was written by Kathleen Tuck. Note: Materials may be edited for content and length.

New species of electrons can lead to better computing

 


In a research paper published this week in Science, a collaboration led by MIT theory professor Leonid Levitov and Manchester Nobel laureate Sir Andre Geim reports a material in which electrons move at a controllable angle to applied fields, much like sailboats driven diagonally to the wind.

The material is graphene -- one-atom-thick chicken wire made from carbon -- but with a difference. It is transformed into a new, so-called superlattice state by placing it on top of boron nitride, also known as 'white graphite,' and then aligning the crystal lattices of the two materials. In contrast to metallic graphene, a graphene superlattice behaves as a semiconductor.

In ordinary graphene, charge carriers behave like massless neutrinos moving at the speed of light while carrying the electron's charge. Although an excellent conductor, graphene does not allow current to be easily switched on and off, which is at the heart of what a transistor does.

Electrons in graphene superlattices are different: they behave as neutrinos that have acquired a notable mass. The result is new relativistic behaviour in which electrons can skew at large angles to applied fields. The effect is huge, as the Manchester-MIT experiments found.

The reported relativistic effect has no known analogue in particle physics and extends our understanding of how the universe works.

Beyond the discovery, the observed phenomenon may also help enhance the performance of graphene electronics, making it a worthy companion to silicon.

The research suggests that transistors made from graphene superlattices should consume less energy than conventional semiconductor transistors because charge carriers drift perpendicular to the electric field, which results in little energy dissipation.
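The energy argument can be made concrete with one line of standard electromagnetism: the power dissipated per unit volume is p = J · E = |J||E| cos(θ), where θ is the angle between the current density J and the applied field E. As the drift approaches perpendicular (θ → 90°), cos(θ) → 0 and the Joule heating vanishes; a real device still dissipates a little, since the angle is never exactly 90°.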

The Manchester-MIT researchers demonstrate the first such transistor, which opens an avenue toward less power-hungry computers.

Professor Geim comments: 'It is quite a fascinating effect, and it hits a very soft spot in our understanding of complex, so-called topological materials. It is extremely rare to come across a phenomenon that bridges materials science, particle physics, relativity and topology.'

Professor Levitov adds: 'It is widely believed that unconventional approaches to information processing are key for the future of IT hardware. This belief has been the driving force behind a number of important recent developments, in particular spintronics. The demonstrated transistor highlights the promise of graphene-based systems for alternative ways of information processing.'

 



Story Source:

The above story is based on materials provided by University of Manchester. Note: Materials may be edited for content and length.


Journal Reference:

  1. R. V. Gorbachev, J. C. W. Song, G. L. Yu, A. V. Kretinin, F. Withers, Y. Cao, A. Mishchenko, I. V. Grigorieva, K. S. Novoselov, L. S. Levitov, A. K. Geim. Detecting topological currents in graphene superlattices. Science, 2014; DOI: 10.1126/science.1254966

Astrophysicists to probe how early universe made chemical elements

 


ASU astrophysicists have received an important research grant to study how massive stars, such as Eta Carinae, depicted here, evolve and eventually seed the universe with heavy elements created by nuclear reactions inside them.

In the beginning, all was hydrogen -- and helium, plus a bit of lithium. Three elements in all. Today's universe, however, has nearly a hundred naturally occurring elements, with thousands of variants (isotopes), and more likely to come.

Figuring out how the universe got from its starting batch of three elements to the menagerie found today is the focus of a new Physics Frontiers Center research grant to Arizona State University's School of Earth and Space Exploration (SESE). The grant is from the National Science Foundation's Joint Institute for Nuclear Astrophysics -- Center for the Evolution of the Elements. Of the full $11.4 million NSF grant, about $1 million will come to ASU over five years.

SESE astrophysicist Frank Timmes is the lead scientist for ASU's part of the Physics Frontiers Center research project. Timmes, ASU's director of advanced computing, focuses his astrophysical research on supernovae, cosmic chemical evolution and their impacts on astrobiology, as well as on high-performance computing. He is also a scientific editor of The Astrophysical Journal.

The evolution of elements project also includes Michigan State University in East Lansing (the lead institution), the University of Notre Dame in South Bend, Indiana, and the University of Washington in Seattle.

Joining Timmes on the project will be astrophysicists Patrick Young, Evan Scannapieco and Sumner Starrfield, also from the School of Earth and Space Exploration. In addition, the award will fund two postdoctoral researchers to collaborate on the effort.

Take it from the top

Time started 13.7 billion years ago with the Big Bang, which produced the basic three elements. Yet by the time the Bang was a billion years old, essentially all the other chemical elements we know had formed. How did this happen?

"It takes place inside stars," says Timmes. "They're the element-factories of the universe. They take light stuff, such as hydrogen and helium, process it in nuclear reactions, and then crank out carbon, nitrogen, oxygen and all those good things that make you and me."

While the broad outline is clear, details are a lot murkier, he says, and that's where ASU's researchers enter the picture.

"ASU's contribution is to provide the glue between experimental low-energy nuclear astrophysics measurements and astronomical observations of stars," Timmes says.

Ancient stars were fundamentally different from those today, he notes, because they started off with a different collection of initial ingredients -- no heavy elements. But those first-generation stars are gone.

As Timmes explains, "The stars that began back then went through their life cycles and died, so we naturally don't directly see them today. But when they died, they exploded and threw out little bits of carbon, oxygen and nitrogen, which ended up in the next generation of stars."

Round and round in cycles

In a process that continues today, massive stars create more and more complex elements, then explode as supernovas and scatter the newly created elements into space for another generation of stars to use. Cycle after stellar cycle, stars become steadily richer in heavier and more complex elements.

The sun, its planets and moons all formed about 4.5 billion years ago. Most of the elements they contain didn't exist when the universe was young, so what generation does the sun belong to?

Timmes explains, "A typical massive star, in round numbers, lives about a million years. The Big Bang occurred about 7 billion years before the sun formed. I need a thousand generations of massive stars to get us to a billion years, so I need on the order of 10,000 generations of massive stars to get one with the sun's composition.

"We are the product of many, many, many previous generations of stars."

The researchers at the School of Earth and Space Exploration plan to develop computer models of stars of all sizes, masses and chemical compositions, then set them on their life courses. In essence, it's building stars in computers and comparing them with observations of real stars to see how the universe builds them.

"The toughest theoretical problem we have to work on is how stars explode," says Timmes. "In a loose, hand-waving sense, we know that stars explode, of course, but exactly how it happens isn't well-known or understood."

The new research project fits well with the expertise of the school's astrophysicists. And there's another plus as well. With this project, ASU is joining the small group of institutions that host NSF Physics Frontiers Centers. The entire country has only about ten such centers, Timmes explains. Highly competitive and highly sought-after, they cover subjects such as biological physics and theoretical physics.

 


Favoritism linked to drug use in 'disengaged' families

 

September 12, 2014

Brigham Young University

In families, the perception that parents have a favorite is linked with the less-favored children being twice as likely to use alcohol, cigarettes or drugs. For parents worrying about keeping score and managing perceptions of fairness, one expert has some very simple advice: "Show your love to your kids to a greater extent than you currently are. As simple as it sounds, more warmth and less conflict is probably the best answer."


Professor Alex Jensen with his two daughters.

Before you revive the debate about which sibling in your family is the favorite, you'll want to know what the latest research shows.

Brigham Young University professor Alex Jensen analyzed 282 families with teenage siblings for a study that appears in the Journal of Family Psychology. Favoritism in parenting is a complex topic for sure, but here are some important take-aways.

Does it really matter?

Yes, at least for some families. Jensen looked at perceived preferential treatment in different types of family dynamics. For families that aren't very close to each other -- so-called "disengaged" families -- favoritism was strongly associated with alcohol, cigarette and drug use by the less-favored children.

In these disengaged families, children who view themselves as slightly less favored were almost twice as likely to use alcohol, cigarettes or drugs. If the preferential treatment was perceived to be dramatic, the less favored child was 3.5 times more likely to use any of these substances.

In other words, favoritism appears to be the most problematic when love and support are generally scarce.

"With favoritism in disengaged families, it wasn't just that they were more likely to use any substances, it also escalated," Jensen said. "If they were already smoking then they were more likely to drink also. Or if they were smoking and drinking, they were more likely to also use drugs."

How can you tell who is the favorite?

When you ask people which sibling gets preferential treatment, their perception often doesn't match reality. But that's where things get tricky: Perceptions matter more.

"It's not just how you treat them differently, but how your kids perceive it," Jensen said. "Even in the case where the parents treated them differently, those actual differences weren't linked to substance use -- it was the perception."

What should parents do?

For parents worrying about keeping score and managing perceptions of fairness, Jensen has some very simple advice.

"Show your love to your kids at a greater extent than you currently are," Jensen said. "As simple as it sounds, more warmth and less conflict is probably the best answer."

That's based on what the researchers saw in the data -- the link between substance use and favoritism didn't exist among families that take a strong interest in each other.

Jensen also recommends that parents look for the unique things each of their children is trying to build into their identity.

"Every kid as they get older develops their own interests and start to have their own identity," Jensen said. "If you value that and respect that, and as a parent support what they see as their identity, that would help them feel loved."

 



Story Source:

The above story is based on materials provided by Brigham Young University. Note: Materials may be edited for content and length.


Journal Reference:

  1. Alexander C. Jensen, Shawn D. Whiteman. Parents’ differential treatment and adolescents’ delinquent behaviors: Direct and indirect effects of difference-score and perception-based measures. Journal of Family Psychology, 2014; 28 (4): 549 DOI: 10.1037/a0036888

Corn spots: Study finds important genes in defense response

 

September 12, 2014

North Carolina State University

When corn plants come under attack from a pathogen, they sometimes respond by killing their own cells near the site of the attack, committing "cell suicide" to thwart further damage from the attacker. This cell sacrifice can cause very small, often microscopic, spots or lesions on the plant. Researchers have now scoured the corn genome to find candidate genes that control this important defense response.


The hypersensitive defense response in corn protects the plant by killing a few of its own cells in a rapid and localized response. NC State and USDA researchers have now identified the top candidate genes involved in this response.

When corn plants come under attack from a pathogen, they sometimes respond by killing their own cells near the site of the attack, committing "cell suicide" to thwart further damage from the attacker. This cell sacrifice can cause very small, often microscopic, spots or lesions on the plant.

But up until now it's been difficult to understand how the plant regulates this "spotty" defense mechanism because the response is so quick and localized.

Researchers at North Carolina State University have identified a number of candidate genes and cellular processes that appear to control this so-called hypersensitive defense response (HR) in corn. The findings, which appear in PLOS Genetics, could help researchers build better defense responses in corn and other plants; HR is thought to occur in all higher-order plants, including all trees and crop plants, and is normally a tightly regulated response.

The 44 candidate genes appear to be involved in defense response, programmed cell death, cell wall modification and a few other responses linked to resisting attack, says Dr. Peter Balint-Kurti, the paper's corresponding author and a U.S. Department of Agriculture (USDA) professor who works in NC State's plant pathology and crop science departments.

To arrive at the finding, the NC State researchers joined researchers from Purdue University in examining more than 3,300 maize plants that contained a similar mutation: They all had exaggerated HR because one particular resistance gene, Rp1-D21, doesn't turn off.

"It's similar to a human having an auto-immune response that never stops," Balint-Kurti says. "This mutation causes a corn plant to inappropriately trigger this hypersensitive defense response, causing spots on the corn plant as well as stunted growth."

The researchers examined the entire corn gene blueprint -- some 26.5 million points in the 2 to 3 billion base pair genome -- to find the genes most closely associated with HR. Balint-Kurti said the top candidates made sense, as they mostly appear to be linked to defense or disease resistance.
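Conceptually, a scan like this asks, for each genotyped site, whether genotype predicts the HR phenotype. The Python sketch below illustrates that per-marker test on toy data; the marker count, the synthetic phenotype, and the simple regression test are assumptions for illustration, not the study's actual analysis pipeline.

```python
# Minimal sketch of a genome-wide association scan: regress a quantitative
# phenotype (e.g., lesion severity) on each marker's genotype and keep the
# markers with the smallest p-values. Synthetic data; illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_plants, n_markers = 300, 5_000   # toy sizes; the real scan covered ~26.5 million sites

# 0/1/2 minor-allele counts per plant and marker
genotypes = rng.integers(0, 3, size=(n_plants, n_markers)).astype(float)
phenotype = rng.normal(size=n_plants)      # baseline variation in the trait
phenotype += 0.8 * genotypes[:, 42]        # plant one true association at marker 42

p_values = np.empty(n_markers)
for j in range(n_markers):
    # per-marker test: does genotype predict phenotype? (slope t-test)
    result = stats.linregress(genotypes[:, j], phenotype)
    p_values[j] = result.pvalue

top = np.argsort(p_values)[:5]
print("top candidate markers:", top.tolist())
```

The real analysis must also correct for the millions of tests performed and for relatedness among the plants, which is what makes narrowing 26.5 million sites down to 44 credible candidate genes a substantial statistical exercise.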

"All of the processes associated with the top candidate genes have been previously associated with HR," Balint-Kurti said. "Hopefully this work provides an opening to really characterize this important defense response and learn more about it in other plants."

USDA plant geneticist and breeder Jim Holland co-authored the paper along with first authors Bode Olukolu and Guan Feng Wang, who are post-doctoral researchers at NC State. Vijay Vontimitta, a post-doctoral researcher at Purdue working in a group headed by Guri Johal, is also a first author.



Story Source:

The above story is based on materials provided by North Carolina State University. Note: Materials may be edited for content and length.


Journal Reference:

  1. Bode A. Olukolu, Guan-Feng Wang, Vijay Vontimitta, Bala P. Venkata, Sandeep Marla, Jiabing Ji, Emma Gachomo, Kevin Chu, Adisu Negeri, Jacqueline Benson, Rebecca Nelson, Peter Bradbury, Dahlia Nielsen, James B. Holland, Peter J. Balint-Kurti, Gurmukh Johal. A Genome-Wide Association Study of the Maize Hypersensitive Defense Response Identifies Genes That Cluster in Related Pathways. PLoS Genetics, 2014; 10 (8): e1004562 DOI: 10.1371/journal.pgen.1004562

Ahoy, offshore wind: Advanced buoys bring vital data to untapped energy resource

 


Research on the High Seas: Pacific Northwest National Laboratory staff conduct tests in Sequim Bay, Washington, while aboard one of two new research buoys being commissioned to more accurately predict offshore wind’s power-producing potential.

Two massive, 20,000-pound buoys decked out with the latest in meteorological and oceanographic equipment will enable more accurate predictions of the power-producing potential of winds that blow off U.S. shores.

The bright yellow buoys -- each worth about $1.2 million -- are being commissioned by the Department of Energy's Pacific Northwest National Laboratory in Washington state's Sequim Bay. Starting in November, they will be deployed for up to a year at two offshore wind demonstration projects: one near Coos Bay, Oregon, and another near Virginia Beach, Virginia.

"We know offshore winds are powerful, but these buoys will allow us to better understand exactly how strong they really are at the heights of wind turbines," said PNNL atmospheric scientist Will Shaw. "Data provided by the buoys will give us a much clearer picture of how much power can be generated at specific sites along the American coastline -- and enable us to generate that clean, renewable power sooner."

Offshore wind is a new frontier for U.S. renewable energy developers. There's tremendous power-producing potential, but limited information is available about ocean-based wind resources. DOE's Office of Energy Efficiency and Renewable Energy purchased the buoys to improve offshore turbine performance in the near term and reduce barriers to private-sector investment in large-scale offshore wind energy development in the long term. The buoys were manufactured by AXYS Technologies, Inc., in Sidney, British Columbia.

A recent report estimated the U.S. could power nearly 17 million homes by generating more than 54 gigawatts of offshore wind energy, but more information is needed. Instruments have long been sent out to sea to measure winds on the ocean's surface, but the blade tips of offshore wind turbines can reach up to 600 feet above the surface, where winds can behave very differently.
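One reason surface measurements are not enough: wind speed generally increases with height, and a standard first approximation is a power law, v(h) = v_ref (h/h_ref)^α. The Python sketch below extrapolates a surface measurement to rotor heights under that approximation; the shear exponent and the example speeds are assumed illustration values, not numbers from the buoy program.

```python
# Minimal sketch: power-law extrapolation of wind speed with height,
# v(h) = v_ref * (h / h_ref) ** alpha. The shear exponent alpha varies
# with atmospheric stability; 0.11 is a common open-water assumption.
def wind_speed_at_height(v_ref, h_ref, h, alpha=0.11):
    """Estimate wind speed at height h from a reference measurement at h_ref."""
    return v_ref * (h / h_ref) ** alpha

v_surface = 7.0   # m/s measured at 10 m above the sea surface (illustrative)
for h in (10, 90, 180):   # 180 m is roughly the 600-foot blade-tip height
    v = wind_speed_at_height(v_surface, h_ref=10, h=h)
    # available wind power scales with the cube of wind speed
    print(f"h = {h:4d} m: v = {v:4.1f} m/s, relative power = {(v / v_surface) ** 3:.2f}")
```

Because power scales with the cube of speed, even a modest increase at hub height translates into substantially more energy -- which is why measuring directly at turbine heights, as the buoys' lidar does, matters so much for site assessment.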

The buoys carry a bevy of advanced instruments, including devices called lidar, which is short for light detection and ranging, to measure wind speed and direction at multiple heights above the ocean. Other onboard instruments will record air and sea surface temperature, barometric pressure, relative humidity, wave height and period, and water conductivity. Subsurface ocean currents will also be measured with acoustic Doppler sensors.

All of these measurements will help scientists and developers better understand air-sea interactions and their impact on how much wind energy a turbine could capture at particular offshore sites. The data will also help validate the wind predictions derived from computer models, which have thus far relied on extremely limited real-world information.


Story Source:

The above story is based on materials provided by Pacific Northwest National Laboratory. The original article was written by Frances White. Note: Materials may be edited for content and length.