Wednesday, September 17, 2014

Treating insomnia in elderly reduces inflammation, lowers risk for chronic diseases

 


Lack of sleep can make you sick. And while everybody has the occasional restless night, for those who suffer from chronic insomnia -- some 15 percent of older adults in the United States -- that sleep loss can increase the risk for cardiovascular disease, hypertension, weight gain, type 2 diabetes, and even lead to an earlier death.

The increased risk of health problems is thought to stem from an association between insomnia and an increase in inflammation throughout the body that becomes chronic. Though inflammation can be a good thing -- part of a robust immune response that heals injury and fights infection -- chronic inflammation can damage and kill healthy cells, leading to disease.

What hasn't been known is whether treating insomnia could reduce inflammation, thereby lowering the risk for chronic disease in older adults. Nor has it been known what the most effective therapy is to treat insomnia.

Now UCLA researchers have answered both these questions. In a new study, they demonstrate, first, that reducing insomnia can indeed lead to decreases in inflammation and, second, that a form of psychotherapy called cognitive behavioral therapy proved superior to other forms of treatment.

The study appears in the September issue of the journal Sleep.

The results were obtained from a randomized clinical trial of 123 adults older than 55, and showed that treating insomnia led to decreases in a known marker of inflammation called C-reactive protein (CRP). The protein is found in blood plasma, and its levels rise in response to an acute inflammatory stimulus. The CRP levels were measured at the beginning of the study, again after treatment, and again in a follow-up 16 months later.

"What we found particularly intriguing was that the levels of the CRP inflammatory marker remained low even 16 months after treating the insomnia," said Michael Irwin, first author, and a professor of psychiatry and director of the Cousins Center for Psychoneuroimmunology at the UCLA Semel Institute for Neuroscience and Human Behavior.

The researchers also compared three treatments for insomnia:

• Cognitive behavioral treatment (CBT), a form of therapy that helps people learn how to identify and change destructive or disturbing thought patterns that have a negative influence on behavior;

• Tai chi chih (TCC), the westernized version of the Chinese martial art characterized by slow movement and meditation;

• And sleep seminar education, which provided educational information related to the physical, medical, and psychosocial factors of aging and their contribution to sleep problems.

They found that, "by far," noted Irwin, cognitive behavioral therapy performed better than tai chi chih and sleep seminar education in reducing insomnia; it also produced greater and more sustained improvements in sleep quality, the ability to maintain continuous sleep throughout the night, and participants' reports of fatigue and depressive symptoms.

"This is the first randomized, controlled trial that has evaluated the comparative efficacy of TCC versus CBT, a standardized behavioral intervention for insomnia," Irwin said. The research team found that those who got cognitive behavioral treatment showed a reduced rate of diagnostic insomnia that was nearly double either of the other two treatments.

The benefit of treating insomnia to reduce inflammation is comparable to the benefit reported with vigorous physical activity or weight loss, he noted. "To advance public health, these findings prominently emphasize the position of sleep among the three pillars of health -- diet, exercise and sleep.

"Finally, if insomnia is untreated and sleep disturbance persists, we found that CRP levels progressively increase," Irwin said. "Together, these findings indicate that it is even more critical to treat insomnia in this population who are already at elevated risk for aging-related inflammatory disease."



Story Source:

The above story is based on materials provided by University of California, Los Angeles (UCLA), Health Sciences. The original article was written by Mark Wheeler. Note: Materials may be edited for content and length.


Journal Reference:

  1. Michael R. Irwin, Richard Olmstead, Carmen Carrillo, Nina Sadeghi, Elizabeth C. Breen, Tuff Witarama, Megumi Yokomizo, Helen Lavretsky, Judith E. Carroll, Sarosh J. Motivala, Richard Bootzin, Perry Nicassio. Cognitive Behavioral Therapy vs. Tai Chi for Late Life Insomnia and Inflammatory Risk: A Randomized Controlled Comparative Efficacy Trial. SLEEP, 2014; DOI: 10.5665/sleep.4008

More cheese, please! New study shows dairy is good for your metabolic health

 

September 16, 2014

Canadian Science Publishing (NRC Research Press)

Researchers studied the dairy-eating habits of healthy French Canadians and monitored how dairy consumption may affect their overall metabolic health. It's well known that dairy products contain calcium and minerals good for bones, but new research has shown that dairy consumption may also have beneficial effects on metabolic health and can reduce the risk of metabolic diseases such as obesity and type 2 diabetes.


New research has shown that dairy consumption may have beneficial effects on metabolic health.

Dairy is considered part of a healthy diet and dietary guidelines recommend the daily consumption of 2-4 portions of milk-based products such as milk, yogurt, cheese, cream and butter.


Curious about these impacts, researchers from the CHU de Québec Research Center and Laval University studied the dairy-eating habits of healthy French Canadians and monitored how dairy consumption may affect their overall metabolic health. They published their findings today in the journal Applied Physiology, Nutrition, and Metabolism.

The aim of this study was to determine associations between dairy intake and specific metabolic risk factors, including anthropometric status, plasma glucose, plasma lipid profile, inflammatory markers and blood pressure, in a healthy population.

A total of 254 participants from the greater Quebec City metropolitan area were recruited; 233 participants (105 men and 128 women) met all the eligibility criteria for the study ‒ meaning subjects had healthy metabolic profiles.

The study showed that the average individual consumed 2.5 ± 1.4 portions of dairy per day. However, nearly 45% of the population in this study did not meet Canada's Food Guide recommendations of at least 2 portions of dairy products a day. These findings are supported by recent Canadian surveys that highlighted an underconsumption of dairy products by Canadians.

Data suggest that trans-palmitoleic acid found in plasma could potentially be used as a biomarker of dairy consumption. Trans-palmitoleic acid is naturally present in milk, cheese, yogurt, butter, and meat fat but cannot be synthesized by the body. This fatty acid has recently been shown to have health-promoting effects. In this study, trans-palmitoleic acid levels were related to lower blood pressure in men and women, and to lower body weight in men.

Dairy intake is associated with lower blood glucose and blood pressure in the population studied, though no causal relationships can be made due to the cross-sectional design. This study adds to a growing body of literature demonstrating a lack of detrimental health effects with higher dairy intake.

Dr. Iwona Rudkowska, a research scientist in the Endocrinology and Nephrology Department at the CHU de Québec Research Center and assistant professor at Laval University, says "additional well-designed intervention studies are needed to ascertain the effects of increased dairy consumption on metabolic health in healthy and in metabolically deteriorated populations."



Story Source:

The above story is based on materials provided by Canadian Science Publishing (NRC Research Press). Note: Materials may be edited for content and length.


Journal Reference:

  1. Marine S. Da Silva, Pierre Julien, Patrick Couture, Simone Lemieux, Marie-Claude Vohl, Iwona Rudkowska. Associations between dairy intake and metabolic risk parameters in a healthy French-Canadian population. Applied Physiology, Nutrition, and Metabolism, 2014; 1 DOI: 10.1139/apnm-2014-0154

Pseudoephedrine #3: Outsmarting Methamphetamine Producers

 


Life Sciences, Chemistry, Scientific American, Engineering/Design, RTP

AWARD: $100,000 USD  |  DEADLINE: 11/20/14  |  ACTIVE SOLVERS: 115  |  POSTED: 8/20/14

The Seeker desires a method for formulating pseudoephedrine products in such a way that it will be extremely difficult for clandestine chemists to extract or use the pseudoephedrine in the production of the illegal drug methamphetamine.  The Seeker may be interested in pursuing further collaboration with Solvers whose solutions they find promising.  The Seeker may also allow Solvers additional time after the submission deadline to finalize experiments required for the full reduction-to-practice award.

This is a Reduction-to-Practice Challenge that requires a written proposal and experimental proof-of-concept data (and/or sample delivery).  The use of one or more surrogate compounds (e.g. substituted benzylic alcohols, structural mimics, etc.) to demonstrate proof-of-concept is required.  Solvers should NOT conduct reduction-to-practice experiments with illegal or federally controlled substances.

Source: InnoCentive      Challenge ID: 9933615

Challenge Overview


The submission to the Challenge should include the following:

  1. The detailed description of the proposed Solution addressing specific Solution Requirements presented in the Detailed Description of the Challenge.  This description should be accompanied by a well-articulated rationale supported by literature/patent precedents.
  2. Experimental proof-of-concept data obtained as outlined in the Detailed Description of the Challenge.   

The award is contingent upon theoretical evaluation and review of the experimental proof-of-concept of the submitted Solutions by the Seeker.

To receive an award, the Solvers will have to transfer to the Seeker their exclusive Intellectual Property (IP) rights to the solution.

What is InnoCentive?

InnoCentive is the global innovation marketplace where creative minds solve some of the world's most important problems for cash awards up to $1 million. Commercial, governmental and humanitarian organizations engage with InnoCentive to solve problems that can impact humankind in areas ranging from the environment to medical advancements.

What is an RTP Challenge?

An InnoCentive RTP (Reduction to Practice) Challenge calls for a prototype that proves an idea, and is similar to an InnoCentive Theoretical Challenge in its high level of detail. However, an RTP Challenge requires the Solver to submit a validated solution, either in the form of original data or a physical sample. The Seeker is also allowed to test the proposed solution.


https://www.innocentive.com/ar/challenge/9933615?cc=SciAm9933615&utm_source=SciAm&utm_medium=pavilion&utm_campaign=challenges

How Diversity Makes Us Smarter

 

Being around people who are different from us makes us more creative, more diligent and harder-working

Sep 16, 2014 By Katherine W. Phillips

Edel Rodriguez

In Brief

The first thing to acknowledge about diversity is that it can be difficult. In the U.S., where the dialogue of inclusion is relatively advanced, even the mention of the word “diversity” can lead to anxiety and conflict. Supreme Court justices disagree on the virtues of diversity and the means for achieving it. Corporations spend billions of dollars to attract and manage diversity both internally and externally, yet they still face discrimination lawsuits, and the leadership ranks of the business world remain predominantly white and male.

It is reasonable to ask what good diversity does us. Diversity of expertise confers benefits that are obvious—you would not think of building a new car without engineers, designers and quality-control experts—but what about social diversity? What good comes from diversity of race, ethnicity, gender and sexual orientation? Research has shown that social diversity in a group can cause discomfort, rougher interactions, a lack of trust, greater perceived interpersonal conflict, lower communication, less cohesion, more concern about disrespect, and other problems. So what is the upside?

The fact is that if you want to build teams or organizations capable of innovating, you need diversity. Diversity enhances creativity. It encourages the search for novel information and perspectives, leading to better decision making and problem solving. Diversity can improve the bottom line of companies and lead to unfettered discoveries and breakthrough innovations. Even simply being exposed to diversity can change the way you think. This is not just wishful thinking: it is the conclusion I draw from decades of research from organizational scientists, psychologists, sociologists, economists and demographers.

Information and Innovation
The key to understanding the positive influence of diversity is the concept of informational diversity. When people are brought together to solve problems in groups, they bring different information, opinions and perspectives. This makes obvious sense when we talk about diversity of disciplinary backgrounds—think again of the interdisciplinary team building a car. The same logic applies to social diversity. People who are different from one another in race, gender and other dimensions bring unique information and experiences to bear on the task at hand. A male and a female engineer might have perspectives as different from one another as an engineer and a physicist—and that is a good thing.

Research on large, innovative organizations has shown repeatedly that this is the case. For example, business professors Cristian Deszö of the University of Maryland and David Ross of Columbia University studied the effect of gender diversity on the top firms in Standard & Poor's Composite 1500 list, a group designed to reflect the overall U.S. equity market. First, they examined the size and gender composition of firms' top management teams from 1992 through 2006. Then they looked at the financial performance of the firms. In their words, they found that, on average, “female representation in top management leads to an increase of $42 million in firm value.” They also measured the firms' “innovation intensity” through the ratio of research and development expenses to assets. They found that companies that prioritized innovation saw greater financial gains when women were part of the top leadership ranks.

Racial diversity can deliver the same kinds of benefits. In a study conducted in 2003, Orlando Richard, a professor of management at the University of Texas at Dallas, and his colleagues surveyed executives at 177 national banks in the U.S., then put together a database comparing financial performance, racial diversity and the emphasis the bank presidents put on innovation. For innovation-focused banks, increases in racial diversity were clearly related to enhanced financial performance.

Evidence for the benefits of diversity can be found well beyond the U.S. In August 2012 a team of researchers at the Credit Suisse Research Institute issued a report in which they examined 2,360 companies globally from 2005 to 2011, looking for a relationship between gender diversity on corporate management boards and financial performance. Sure enough, the researchers found that companies with one or more women on the board delivered higher average returns on equity, lower gearing (that is, net debt to equity) and better average growth.

How Diversity Provokes Thought
Large data-set studies have an obvious limitation: they only show that diversity is correlated with better performance, not that it causes better performance. Research on racial diversity in small groups, however, makes it possible to draw some causal conclusions. Again, the findings are clear: for groups that value innovation and new ideas, diversity helps.

In 2006 Margaret Neale of Stanford University, Gregory Northcraft of the University of Illinois at Urbana-Champaign and I set out to examine the impact of racial diversity on small decision-making groups in an experiment where sharing information was a requirement for success. Our subjects were undergraduate students taking business courses at the University of Illinois. We put together three-person groups—some consisting of all white members, others with two whites and one nonwhite member—and had them perform a murder mystery exercise. We made sure that all group members shared a common set of information, but we also gave each member important clues that only he or she knew. To find out who committed the murder, the group members would have to share all the information they collectively possessed during discussion. The groups with racial diversity significantly outperformed the groups with no racial diversity. Being with similar others leads us to think we all hold the same information and share the same perspective. This perspective, which stopped the all-white groups from effectively processing the information, is what hinders creativity and innovation.

Other researchers have found similar results. In 2004 Anthony Lising Antonio, a professor at the Stanford Graduate School of Education, collaborated with five colleagues from the University of California, Los Angeles, and other institutions to examine the influence of racial and opinion composition in small group discussions. More than 350 students from three universities participated in the study. Group members were asked to discuss a prevailing social issue (either child labor practices or the death penalty) for 15 minutes. The researchers wrote dissenting opinions and had both black and white members deliver them to their groups. When a black person presented a dissenting perspective to a group of whites, the perspective was perceived as more novel and led to broader thinking and consideration of alternatives than when a white person introduced that same dissenting perspective. The lesson: when we hear dissent from someone who is different from us, it provokes more thought than when it comes from someone who looks like us.

This effect is not limited to race. For example, last year professors of management Denise Lewin Loyd of the University of Illinois, Cynthia Wang of Oklahoma State University, Robert B. Lount, Jr., of Ohio State University and I asked 186 people whether they identified as a Democrat or a Republican, then had them read a murder mystery and decide who they thought committed the crime. Next, we asked the subjects to prepare for a meeting with another group member by writing an essay communicating their perspective. More important, in all cases, we told the participants that their partner disagreed with their opinion but that they would need to come to an agreement with the other person. Everyone was told to prepare to convince their meeting partner to come around to their side; half of the subjects, however, were told to prepare to make their case to a member of the opposing political party, and half were told to make their case to a member of their own party.

The result: Democrats who were told that a fellow Democrat disagreed with them prepared less well for the discussion than Democrats who were told that a Republican disagreed with them. Republicans showed the same pattern. When disagreement comes from a socially different person, we are prompted to work harder. Diversity jolts us into cognitive action in ways that homogeneity simply does not.

For this reason, diversity appears to lead to higher-quality scientific research. This year Richard Freeman, an economics professor at Harvard University and director of the Science and Engineering Workforce Project at the National Bureau of Economic Research, along with Wei Huang, a Harvard economics Ph.D. candidate, examined the ethnic identity of the authors of 1.5 million scientific papers written between 1985 and 2008 using Thomson Reuters's Web of Science, a comprehensive database of published research. They found that papers written by diverse groups receive more citations and have higher impact factors than papers written by people from the same ethnic group. Moreover, they found that stronger papers were associated with a greater number of author addresses; geographical diversity, along with a larger number of references, is a reflection of greater intellectual diversity.

The Power of Anticipation
Diversity is not only about bringing different perspectives to the table. Simply adding social diversity to a group makes people believe that differences of perspective might exist among them, and that belief makes people change their behavior.

Members of a homogeneous group rest somewhat assured that they will agree with one another; that they will understand one another's perspectives and beliefs; that they will be able to easily come to a consensus. But when members of a group notice that they are socially different from one another, they change their expectations. They anticipate differences of opinion and perspective. They assume they will need to work harder to come to a consensus. This logic helps to explain both the upside and the downside of social diversity: people work harder in diverse environments both cognitively and socially. They might not like it, but the hard work can lead to better outcomes.

In a 2006 study of jury decision making, social psychologist Samuel Sommers of Tufts University found that racially diverse groups exchanged a wider range of information during deliberation about a sexual assault case than all-white groups did. In collaboration with judges and jury administrators in a Michigan courtroom, Sommers conducted mock jury trials with a group of real selected jurors. Although the participants knew the mock jury was a court-sponsored experiment, they did not know that the true purpose of the research was to study the impact of racial diversity on jury decision making.

Sommers composed the six-person juries with either all white jurors or four white and two black jurors. As you might expect, the diverse juries were better at considering case facts, made fewer errors recalling relevant information and displayed a greater openness to discussing the role of race in the case. These improvements did not necessarily happen because the black jurors brought new information to the group—they happened because white jurors changed their behavior in the presence of the black jurors. In the presence of diversity, they were more diligent and open-minded.

Group Exercise
Consider the following scenario: You are writing up a section of a paper for presentation at an upcoming conference. You are anticipating some disagreement and potential difficulty communicating because your collaborator is American and you are Chinese. Because of one social distinction, you may focus on other differences between yourself and that person, such as her or his culture, upbringing and experiences—differences that you would not expect from another Chinese collaborator. How do you prepare for the meeting? In all likelihood, you will work harder on explaining your rationale and anticipating alternatives than you would have otherwise.

This is how diversity works: by promoting hard work and creativity; by encouraging the consideration of alternatives even before any interpersonal interaction takes place. The pain associated with diversity can be thought of as the pain of exercise. You have to push yourself to grow your muscles. The pain, as the old saw goes, produces the gain. In just the same way, we need diversity—in teams, organizations and society as a whole—if we are to change, grow and innovate.


Patient's question triggers important study about blood thinners

 


Jamie Dossett-Mercer (left) asked how all the new oral blood thinners compare to one another. It was a question hematologist and scientist Dr. Marc Carrier (right) of The Ottawa Hospital couldn't answer, so he took it upon himself to find out. The result is a network meta-analysis published in JAMA on Sept. 16, 2014.

Physicians around the world now have guidance that can help them determine the best oral blood thinners to use for their patients suffering from blood clots in their veins, thanks to a patient of The Ottawa Hospital who asked his physician a question he couldn't answer. This new guidance is found in a study published today by JAMA, the Journal of the American Medical Association.

"Right there in the clinic, he identified an important knowledge gap for clinicians. We decided to act on it and find the answer," says hematologist Dr. Marc Carrier, who also a scientist at The Ottawa Hospital and associate professor at the University of Ottawa.

Dr. Carrier was treating Jamie Dossett-Mercer for major blood clotting in his leg veins, called deep vein thrombosis, that reached from his ankle to his groin. If one of these clots were to break off, it could travel to the lung and cause a pulmonary embolism, which is often fatal. These two common medical conditions are known together as venous thromboembolism and form the third leading cause of cardiovascular death.

In recent years, a number of new oral anticoagulants have been approved for use. Faced with eight possible therapies, Dossett-Mercer asked, "How do all these different blood thinners compare head to head?"

Dr. Carrier went looking for the answer. Although he found dozens of trials that studied the effect of different agents separately, none had analyzed all the results together.

His team reviewed 45 randomized trials (involving nearly 45,000 patients) using a process called network meta-analysis, which allows them to set a baseline treatment and compare all the other treatments to that. All the clinical trials they found compared the newer treatments to the standard of care, which is low-molecular-weight heparin (LMWH) with vitamin K antagonists.

Using the LMWH-vitamin K antagonist combination as the central node of the network, they compared safety and effectiveness with seven other anticoagulant therapies for venous thromboembolism: unfractionated heparin (UFH) with vitamin K antagonists; fondaparinux with vitamin K antagonists; LMWH with dabigatran; LMWH with edoxaban; rivaroxaban; apixaban; and LMWH alone.
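
To give a sense of the logic behind such a network (this is a generic, hypothetical Python sketch, not the code or the numbers from the JAMA analysis), two treatments that were each trialled against the common LMWH-vitamin K antagonist comparator can be compared indirectly by differencing their effects relative to that shared node:

    import math

    def indirect_comparison(logor_a_vs_ref, se_a, logor_b_vs_ref, se_b):
        """Bucher-style adjusted indirect comparison of treatment A versus B
        through a common reference treatment (the shared node of the network).
        Inputs are log odds ratios versus the reference and their standard
        errors, taken from separate randomized trials."""
        log_or = logor_a_vs_ref - logor_b_vs_ref         # A vs B on the log scale
        se = math.sqrt(se_a ** 2 + se_b ** 2)            # variances add across independent trials
        lo, hi = log_or - 1.96 * se, log_or + 1.96 * se  # 95% confidence interval
        return math.exp(log_or), math.exp(lo), math.exp(hi)

    # Invented effect estimates, for illustration only:
    # treatment A vs LMWH-VKA: odds ratio 0.80 -> log OR -0.223, SE 0.15
    # treatment B vs LMWH-VKA: odds ratio 1.10 -> log OR  0.095, SE 0.20
    or_ab, ci_lo, ci_hi = indirect_comparison(-0.223, 0.15, 0.095, 0.20)
    print(f"A vs B odds ratio {or_ab:.2f} (95% CI {ci_lo:.2f} to {ci_hi:.2f})")

A full network meta-analysis pools many such links at once and propagates their uncertainties, but this indirect step is the core idea behind comparing every treatment through a single baseline.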

While they found no major differences in effectiveness and safety, there were some notable variations.

  • A higher percentage of patients taking the UFH-vitamin K antagonist combination experienced a recurrent blood clot within three months.
  • A lower percentage of patients taking rivaroxaban or apixaban experienced a major bleeding event within three months.

"This will help physicians tailor their care according to patient characteristics," says Dr. Carrier. "For example, if I am worried about recurrent clotting, but I'm not too worried about the risk of bleeding, then I can select the drug with the best safety profile."

"I was already impressed with Dr. Carrier's exceptional care," says Dossett-Mercer. "But that he would do this research based on a patient question is just astounding."



Story Source:

The above story is based on materials provided by Ottawa Hospital Research Institute. Note: Materials may be edited for content and length.


Journal Reference:

  1. Lana A. Castellucci, Chris Cameron, Grégoire Le Gal, Marc A. Rodger, Doug Coyle, Philip S. Wells, Tammy Clifford, Esteban Gandara, George Wells, Marc Carrier. Clinical and Safety Outcomes Associated With Treatment of Acute Venous Thromboembolism. JAMA, 2014; 312 (11): 1122 DOI: 10.1001/jama.2014.10538

New drug formulations to boost fight against respiratory illnesses, antibiotic-resistant superbugs

 

September 16, 2014

The Agency for Science, Technology and Research (A*STAR)

Antibiotic resistance is a challenge in the treatment of diseases today as bacteria continuously mutate and develop resistance against multiple drugs designed to kill them, turning them into superbugs. New ways to enhance the efficacy of drugs used to treat respiratory system infections and antibiotic-resistant superbugs have now been uncovered by researchers.


The drug cocktails are formulated into powder forms and loaded into powder containment units (e.g. capsules) for delivery using a dry powder inhaler.

A team from A*STAR's Institute of Chemical and Engineering Sciences (ICES) and the National University Hospital (NUH) has discovered new ways to enhance the efficacy of drugs used to treat respiratory system infections and antibiotic-resistant superbugs.

A team of five researchers and clinicians in Singapore, led by Dr Desmond Heng of ICES, has developed a new combination of drugs to effectively combat bacteria in the lungs that lead to common respiratory system infections, or bacteria-linked pulmonary diseases such as pneumonia, bronchiectasis and cystic fibrosis.

An acute upper respiratory tract infection, which includes the common flu, was reported to be among the top four conditions diagnosed at polyclinics for eight consecutive years, from 2006 to 2013.

Pneumonia, on the other hand, was the second leading cause of death in 2012, contributing 16.8 per cent of the total number of deaths from illness, behind only cancer.

The team has developed four new drug formulations of antibiotics and muco-actives which have proven to be extremely effective in laboratory trials in treating these diseases, as well as in reducing the antibiotic resistance of so-called "superbugs."

Breaking down the bacterium's 'shield'


To fight these superbugs, the research team has developed and patented three other drug formulations that are each made up of three different antibiotics. These antibiotics complement each other by fighting bacteria in different ways and they can potentially be used interchangeably to prevent bacteria from developing drug resistance.

The team's findings show that all three mixtures are effective against multi-drug resistant strains which include bacterial pathogens such as Pseudomonas aeruginosa and Klebsiella pneumoniae. These formulations kill more multi-drug resistant bacteria than a single drug, and are up to five times more effective than antibiotics used for treating respiratory system infections today. This will allow doctors to prescribe smaller, more effective drug doses to treat patients.

In addition, their formulations can be inhaled by the patient directly, thereby allowing a higher concentration of medicine to reach the lungs compared to injections or orally-administered drugs.

These drug formulations are a result of an on-going collaboration between A*STAR and NUH which started in 2010. Buoyed by the results from the laboratory tests, the team is looking to move into clinical trials to test the stability and efficiency of their new drug formulations.

Dr Keith Carpenter, Executive Director of ICES, said: "I am delighted that the work stems from the results of our expertise in inhaled novel formulations. This is an excellent example of how our collaboration with the local medical community is helping to translate our research from bench to bedside, and further developing innovative therapies for patients."



Story Source:

The above story is based on materials provided by The Agency for Science, Technology and Research (A*STAR). Note: Materials may be edited for content and length.

Human faces are so variable because we evolved to look unique

 

Our highly visual social interactions are almost certainly the driver of this evolutionary trend, said behavioral ecologist Michael J. Sheehan, a postdoctoral fellow in UC Berkeley's Museum of Vertebrate Zoology. Many animals use smell or vocalization to identify individuals, making distinctive facial features unimportant, especially for animals that roam after dark, he said. But humans are different.

"Humans are phenomenally good at recognizing faces; there is a part of the brain specialized for that," Sheehan said. "Our study now shows that humans have been selected to be unique and easily recognizable. It is clearly beneficial for me to recognize others, but also beneficial for me to be recognizable. Otherwise, we would all look more similar."

"The idea that social interaction may have facilitated or led to selection for us to be individually recognizable implies that human social structure has driven the evolution of how we look," said coauthor Michael Nachman, a population geneticist, professor of integrative biology and director of the UC Berkeley Museum of Vertebrate Zoology.

The study will appear Sept. 16 in the online journal Nature Communications.

In the study, Sheehan said, "we asked, 'Are traits such as distance between the eyes or width of the nose variable just by chance, or has there been evolutionary selection to be more variable than they would be otherwise; more distinctive and more unique?'"

As predicted, the researchers found that facial traits are much more variable than other bodily traits, such as the length of the hand, and that facial traits are independent of other facial traits, unlike most body measures. People with longer arms, for example, typically have longer legs, while people with wider noses or widely spaced eyes don't have longer noses. Both findings suggest that facial variation has been enhanced through evolution.

Finally, they compared the genomes of people from around the world and found more genetic variation in the genomic regions that control facial characteristics than in other areas of the genome, a sign that variation is evolutionarily advantageous.

"All three predictions were met: facial traits are more variable and less correlated than other traits, and the genes that underlie them show higher levels of variation," Nachman said. "Lots of regions of the genome contribute to facial features, so you would expect the genetic variation to be subtle, and it is. But it is consistent and statistically significant."

Using Army data

Sheehan was able to assess human facial variability thanks to a U.S. Army database of body measurements compiled from male and female personnel in 1988. The Army Anthropometric Survey (ANSUR) data are used to design and size everything from uniforms and protective clothing to vehicles and workstations.

A statistical comparison of facial traits of European Americans and African Americans -- forehead-chin distance, ear height, nose width and distance between pupils, for example -- with other body traits -- forearm length, height at waist, etc. -- showed that facial traits are, on average, more varied than the others. The most variable traits are situated within the triangle of the eyes, mouth and nose.
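
As a rough illustration of that kind of comparison (the measurements below are invented, not the ANSUR data, and the code is not the authors' analysis), one can contrast the relative variability of facial versus body traits and how strongly traits move together:

    import statistics as stats

    def coefficient_of_variation(values):
        # Standard deviation relative to the mean, so traits measured on
        # different scales can be compared on variability.
        return stats.stdev(values) / stats.mean(values)

    def pearson_r(xs, ys):
        # Pearson correlation between two lists of measurements.
        mx, my = stats.mean(xs), stats.mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        varx = sum((x - mx) ** 2 for x in xs)
        vary = sum((y - my) ** 2 for y in ys)
        return cov / (varx ** 0.5 * vary ** 0.5)

    # Hypothetical measurements (mm) for five people, for illustration only
    nose_width     = [31, 38, 27, 41, 33]
    pupil_distance = [65, 64, 60, 59, 57]
    forearm_length = [265, 270, 262, 268, 266]
    hand_length    = [188, 192, 185, 190, 189]

    print(coefficient_of_variation(nose_width))      # higher relative spread (facial trait)
    print(coefficient_of_variation(forearm_length))  # lower relative spread (body trait)
    print(pearson_r(nose_width, pupil_distance))     # near zero: facial traits vary independently
    print(pearson_r(forearm_length, hand_length))    # close to 1: body traits track each other

The pattern the researchers report corresponds to the first and third cases: facial measurements show the larger relative spread and the weaker mutual correlations.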

Sheehan and Nachman also had access to data collected by the 1000 Genome project, which has sequenced more than 1,000 human genomes since 2008 and catalogued nearly 40 million genetic variations among humans worldwide. Looking at regions of the human genome that have been identified as determining the shape of the face, they found a much higher number of variants than for traits, such as height, not involving the face.

Prehistoric origins

"Genetic variation tends to be weeded out by natural selection in the case of traits that are essential to survival," Nachman said. "Here it is the opposite; selection is maintaining variation. All of this is consistent with the idea that there has been selection for variation to facilitate recognition of individuals."

They also compared the human genomes with recently sequenced genomes of Neanderthals and Denisovans and found similar genetic variation, which indicates that the facial variation in modern humans must have originated prior to the split between these different lineages.

"Clearly, we recognize people by many traits -- for example their height or their gait -- but our findings argue that the face is the predominant way we recognize people," Sheehan said.


Fact or Fiction?: Video Games Are the Future of Education

 

Some educators swear by them as valuable high-tech teaching tools but little is known about their impact on learning

Sep 12, 2014 By Elena Malykhina

A student plays MinecraftEdu.

Courtesy of TeacherGaming LLC

As kids all across the U.S. head back to school, they’re being forced to spend less time in front of their favorite digital distractions. Or are they?
Video games are playing an increasing role in school curricula as teachers seek to deliver core lessons such as math and reading—not to mention new skills such as computer programming—in a format that holds their students’ interests. Some herald this gamification of education as the way of the future and a tool that allows students to take a more active role in learning as they develop the technology skills they need to succeed throughout their academic and professional careers.
Few would argue that video games can do it all in terms of education, says Scot Osterweil, a research director in Massachusetts Institute of Technology’s Comparative Media Studies program and creative director of the school’s Education Arcade initiative to explore how games can be used to promote learning. But games are a powerful learning tool when combined with other exploratory, hands-on activities and ongoing instruction from a teacher acting more as a coach than a lecturer, he adds.
Others, however, question whether a greater reliance on video games is in students’ best interests, indicating there is little proof that skillful game play translates into better test scores or broader cognitive development.
In the past decade schools have become preoccupied with meeting national Common Core standards, which dictate what students should be able to accomplish in English and mathematics at the end of each grade and use standardized testing as a way of tracking a student’s progress. Such demands are not conducive to creative teaching methods that incorporate video games, Osterweil acknowledges. He adds, however, that a growing backlash against the perceived overuse of standardized tests is starting to encourage creativity once again.
Gamestars
Testing fatigue, combined with more pervasive computer use in and out of the classroom and continued experimentation with games as learning tools, suggests that such video games will play a significant role in the future of education. The Quest to Learn public school in New York City offers a glimpse of how gaming is already transforming not just how students learn, but also what they learn. The teachers there have been using the principles of video game design to write their curriculum since the school opened its doors in 2009. This curriculum—organized into missions and quests—focuses on multifaceted challenges that may have more than one correct answer, letting students explore different solutions by making choices along the way, says Ross Flatt, assistant principal at the school.
More than simply playing video games, Quest to Learn students also study game design using Gamestar Mechanic and other computer programs. After students successfully complete Gamestar missions, they are awarded avatars and other tools they can use to build their own games.
If educational video games are well executed, they can provide a strong framework for inquiry and project-based learning, says Alan Gershenfeld, co-founder and president of E-Line Media, a publisher of computer and video games and a Founding Industry Fellow at Arizona State University’s Center for Games and Impact. “Games are also uniquely suited to fostering the skills necessary for navigating a complex, interconnected, rapidly changing 21st century,” he adds.
Digital literacy and understanding how systems (computer and otherwise) work will become increasingly important in a world where many of today’s students will pursue jobs that do not currently exist, says Gershenfeld, who wrote about video games’ potential to transform education in the February Scientific American. Tomorrow’s workers will also likely change jobs many times throughout their careers and “will almost certainly have jobs that require some level of mastery of digital media and technology,” he adds.
Gaming the system
Parents of school-age children are likely familiar with Minecraft, a digital game that promotes imagination as players build various structures out of cubes. MinecraftEdu, a version of the game that teachers created for educational purposes, teaches students mathematical concepts including perimeter, area and probabilities as well as foreign languages. SimCityEDU, a version of the popular city-building game, is likewise a learning and assessment tool for middle school students that covers the English, math and other lessons they need to master to meet Common Core State and Next Generation Science standards.
Beyond teaching, video games can also offer useful information about how well a child is learning and can even provide helpful visual displays of that information, says Brian Waniewski, social entrepreneur and former managing director of the Institute of Play, a nonprofit that promotes the problem-solving nature of game play and game design as a model for learning in secondary schools. Video games can also provide instantaneous feedback—typically via scores—that teachers and students can use to determine how well students understand what the games are trying to teach them.
Limitations
For all of the enthusiasm around games and learning, very few studies have examined whether video games improve classroom performance and academic achievement, says Emma Blakey, PhD researcher in developmental psychology at the University of Sheffield in England. “Because we know memory is a crucial cognitive skill for school learning, practice at playing games that challenge memory should, in theory, lead to improvements in classroom behavior and academic skills,” she says. But only additional research can say if that notion is correct.
A 2013 University of Cambridge study, joined by Darren Dunning of the University of York, found that the improvements in game scores for children with low levels of working memory did not extend to broader skills. Working memory is the cognitive system responsible for the temporary storage of information we need to support ongoing everyday activities, such as a locker combination or a friend’s Twitter handle. The study gave seven- to nine-year-olds up to 25 sessions of either video games set to challenge their working memory—the so-called “CogMed” approach—or the same video games set at an easy level. The researchers then examined whether playing the more difficult games improved performance on additional measures of working memory as well as enhanced other skills, including math, reading, writing and following instructions in a classroom. The study concluded that brain-training video games improve children’s performance only on very similar games, an effect that likely results from practice.
Digital games cannot be treated like the latest quick fix to the education system, Waniewski says. “They can seem like a godsend, a next-generation digital textbook that further reduces the need for human resources,” he notes. Yet games alone will not make schools more efficient, replace teachers or serve as an educational resource that can reach an infinite number of students, he adds.
Video games are not necessarily the most cost-effective option for schools with tight budgets and crowded classrooms, either. They require computers, tablets or other specialized technology as well as dedicated Internet servers and other communications systems. There may also be a need for additional infrastructure, personnel and teacher training. A full, game-infused curriculum could cost millions of dollars and require ongoing support, Gershenfeld says.
The extent to which video games are the future of education remains to be seen. But if the present is any indication, teachers are embracing the medium and are likely to continue to do so. In fact, of those teachers who use video games in the classroom, more than half have kids play them as part of the curriculum at least once a week, according to a national survey released by education researchers at Joan Ganz Cooney Center in June.
Perhaps the biggest impact of video games will be on students who have not responded as well to traditional teaching methods. Nearly half of the teachers surveyed say it is the low-performing students who generally benefit from the use of games, and more than half believe games have the ability to motivate struggling and special education students.

Ebola outbreak 'out of all proportion' and severity cannot be predicted, expert says


This image was captured in Monrovia, Liberia’s capital city, during the 2014 West African Ebola hemorrhagic fever (Ebola HF) outbreak that affected not only Liberia, but Sierra Leone, Guinea, and Nigeria as well.

A mathematical model that replicates Ebola outbreaks can no longer be used to ascertain the eventual scale of the current epidemic, finds research conducted by the University of Warwick.

Dr Thomas House, of the University’s Warwick Mathematics Institute, developed a model that incorporated data from past outbreaks that successfully replicated their eventual scale.

The research, titled "Epidemiological Dynamics of Ebola Outbreaks" and published by eLife, shows that when the available data from the ongoing 2014 outbreak are applied to the model, the outbreak is, according to Dr House, “out of all proportion and on an unprecedented scale when compared to previous outbreaks”.

Dr House commented: “If we analyse the data from past outbreaks we are able to design a model that works for the recorded cases of the virus spreading and can successfully replicate their eventual size. The current outbreak does not fit this previous pattern and, as a result, we are not in a position to provide an accurate prediction of the current outbreak”.   

Chance events, Dr House argues, are an essential factor in the spread of Ebola and many other contagious diseases. “If we look at past Ebola outbreaks there is an identifiable way of predicting their overall size based on modelling chance events that are known to be important when the numbers of cases of infection are small and the spread is close to being controlled”.

Chance events can include a person’s location when they are most infectious, whether they are alone when ill, the travel patterns of those with whom they come into contact or whether they are close to adequate medical assistance.

The Warwick model successfully replicated the eventual scale of past outbreaks by analysing two key chance events: the initial number of people infected and the level of infectiousness once an epidemic is underway.
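
One minimal way to picture how such chance events drive eventual outbreak size (a generic branching-process sketch with arbitrary parameters, not Dr House's model) is to let every case infect a random number of new cases and watch identical settings produce very different totals:

    import math
    import random

    def poisson(lam):
        # Knuth's method for drawing a Poisson-distributed random number.
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            k += 1
            p *= random.random()
            if p <= threshold:
                return k - 1

    def simulate_outbreak(initial_cases, mean_infections, max_cases=10000):
        # Simple branching process: each case infects a Poisson-distributed
        # number of new cases. Chance dominates while case counts are small.
        current = total = initial_cases
        while current > 0 and total < max_cases:
            new_cases = sum(poisson(mean_infections) for _ in range(current))
            total += new_cases
            current = new_cases
        return total

    random.seed(1)
    # Same starting conditions every run; only chance differs between runs.
    sizes = [simulate_outbreak(initial_cases=3, mean_infections=0.9) for _ in range(10)]
    print(sizes)  # totals vary widely from run to run even with identical parameters

Fitting a model of this general kind to the recorded sizes of past outbreaks is what allows earlier epidemics to be replicated as the outcome of chance events; the Warwick finding is that the 2014 data no longer look like a plausible draw from such a chance-driven process.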

“With the current situation we are seeing something that defies this previous pattern of outbreak severity. As the current outbreak becomes  more severe, it is less and less likely that it is a chance event and more likely that something more fundamental has changed”, says Dr House.

Discussing possible causes for the unprecedented nature of the current outbreak, Dr House argues that there could be a range of factors that lead it to be on a different scale to previous cases; “This could be as a result of a number of different factors: mutation of virus, changes in social contact patterns or some combination of these with other factors. It is implausible to explain the current situation solely through a particularly severe outbreak within the previously observed pattern”.

In light of the research findings and the United Nations calling for a further $1bn USD to tackle the current outbreak, Dr House says that “Since we are not in a position to quantify the eventual scale of this unprecedented outbreak, the conclusion from this study is not to be complacent but to mobilise resources to combat the disease.”



Story Source:

The above story is based on materials provided by University of Warwick. Note: Materials may be edited for content and length.


Journal Reference:

  1. Thomas House. Epidemiological Dynamics of Ebola Outbreaks. eLife, 2014; 3 DOI: 10.7554/eLife.03908

DARPA's guided sniper bullet changes path mid-flight

 

DARPA has conducted live-fire testing of its .50 caliber guided bullet



With an ability to strike from great distances, snipers present a unique threat in the field of battle. This long-range lethality is not without its complications, however, with accuracy often dictated by wind, rain and dust, not to mention targets that are constantly on the move. Over the last few months, DARPA has been conducting live-fire tests of guided .50 caliber bullets and today unveiled footage demonstrating the project's success.

With the aim of improving accuracy and safety for military snipers, DARPA's Extreme Accuracy Tasked Ordnance (EXACTO) project is tasked with developing more accurate ammunition that will enable greater firing range, minimize the time required to engage targets and help reduce the missed shots that can give away a sniper's location.

The EXACTO 50-caliber round is claimed to be the first ever guided small-caliber bullet. The maneuverable projectile uses a real-time optical guidance system to change its path mid-flight and home in on a target, potentially overcoming adverse weather and hostile conditions to improve sniper accuracy.

DARPA isn't giving too much away in terms of technical detail. However, if the illustration above is any indication, the steering mechanism used by DARPA appears different to the method used by a team at the Sandia National Laboratories back in 2012.

In that case, researchers developed a small-caliber guided bullet prototype capable of steering toward a laser-marked target 2 km (1.2 mi) away. This was accomplished by way of an optical sensor on the bullet's nose that gathers flight path information, while onboard electronics controlled tiny fins on its side to direct it toward the target. No such fins can be seen on the EXACTO round.

The DARPA footage, which can be seen below, demonstrates two rounds of live-fire testing. With the rifle intentionally aimed to the right of the marked target, the bullet can be seen veering in trajectory, altering its path to strike accurately over an undisclosed distance. DARPA claims the technology is likely to markedly extend the day and night-time range of current sniper systems.

Following the successful demonstration of the round's guidance systems and sensor, DARPA will now work to refine the technology to improve performance and conduct system-level live fire testing.

Source: DARPA



Turret flight tests to pave the way for laser weapons on military aircraft

 


The laser turret prototype being tested on the University of Notre Dame’s Airborne Aero Optical Laboratory Transonic Aircraft in Michigan (Photo: Air Force Research Laboratory)


High energy laser (HEL) systems have been the subject of military research for decades, but it is only in recent years that the technology has advanced to the point where it is feasible for such systems to be mounted on military ground vehicles and sea vessels. Initial flight tests have now been conducted on a new aircraft laser turret that will help pave the way for HEL systems to be integrated into military aircraft.

Lockheed Martin, working in partnership with the Air Force Research Laboratory (AFRL) and the University of Notre Dame, recently conducted eight flight tests in Michigan with a prototype Aero-adaptive Aero-optic Beam Control (ABC) turret fitted to the University of Notre Dame's Airborne Aero Optical Laboratory Transonic Aircraft.

The tests were intended to demonstrate the airworthiness of the new beam control turret being developed for DARPA and AFRL that is designed to provide 360-degree coverage for HEL weapons on an aircraft, thereby allowing it to engage enemy aircraft and missiles above, below and behind the aircraft.

Close up of the prototype Aero-adaptive Aero-optic Beam Control (ABC) turret (Photo: Air Force Research Laboratory)

The turret features flow control and optical compensation technologies developed by Lockheed that are designed to counteract the effects of turbulence resulting from the turret protruding from the aircraft's fuselage.

"These initial flight tests validate the performance of our ABC turret design, which is an enabler for integrating high energy lasers on military aircraft," said Doug Graham, vice president of advanced programs, Strategic and Missile Defense Systems, Lockheed Martin Space Systems.

Lockheed says subsequent flight tests will be conducted over the course of the next year, which will see the complexity of operations steadily ramping up.

At the 2014 Air Force Association's Air & Space and Technology Expo this week in Washington, D.C., Maj. Gen. Tom Masiello, commander of the AFRL, revealed that he has on occasion been leery of the HEL program's progress, saying, "[The high-energy laser program] has been over-promised, but under-delivered."

But it appears that perseverance is finally beginning to pay off, with Masiello going on to say that the solid-state laser has emerged as a breakthrough program in recent years. "Now you can actually package it and fit it on an aircraft. I can’t overemphasize the progress we have made in solid-state lasers. Initially, we are looking at more self-defense. Eventually [we will] deploy against soft targets ... getting to hard target kills."

Sources: Lockheed Martin, US Air Force


15 Minute Cleanups for Every Room of Your Home

 

Clean kitchen - Glow Decor/Glow/Getty Images


What a 15 Minute Cleanup Is:

A 15 Minute Cleanup is a brief step-by-step guide to cleaning a room in 15 minutes or less. While the room may not be white glove clean, it will be presentable to guests. 15 Minute Cleanups are a great way to maintain a home in between more thorough cleanings. These cleanups include step-by-step instructions, a list of supplies, tips, and links for more thorough information.

What a 15 Minute Cleanup Isn’t:

A 15 Minute Cleanup is not the only cleaning regimen you’ll need to keep your home in tip top shape. It’s not an intense cleaning, but only a brief pickup of a room. To really keep a home clean, more thorough work will need to be done according to your cleaning schedule. Before intensely cleaning a room, it’s a good idea to start with that room's quick 15 Minute Cleanup.

The Cleanups:

15 Minute Living Area Cleanup

This cleanup can work for formal and informal living areas and sitting areas. It also works for formal dining rooms, and other rooms where entertaining is done. The idea with this cleanup is to get the room ready for guests, or events without spending a lot of time on the details.

15 Minute Kitchen Cleanup

The kitchen cleanup aims to remove trash, dirty dishes, and unwanted items from a kitchen, making it ready to be used or presentable when guests are around. While you won’t scrub out the refrigerator or clean under the sink in this cleanup, you will remove any offending trash and dishes, leaving your kitchen clean and clutter-free.

15 Minute Bathroom Cleanup

The bathroom cleanup helps you quickly organize and wipe down bathroom surfaces. This cleanup will work to maintain your weekly bathroom scrub down, or get a bathroom ready in a hurry when unexpected guests show up.

15 Minute Kid’s Room Cleanup

This kid’s room cleanup is great for parents at a loss for where to begin when faced with a disaster area of a child’s room. It’s also perfect for older kids who can follow the list themselves to get their rooms clean. Print it off for them and watch as an all-day chore turns into a 15-minute miracle.

15 Minute Bedroom Cleanup

Like many people, I usually leave cleaning my own bedroom for last. After a busy day or week, a bedroom can be in terrible shape. This cleanup will help maintain a bedroom or give it a quick pickup just in time for a restful night.

15 Minute Dorm Room Cleanup

The dorm room cleanup is for those who live in a very small space like an efficiency apartment or campus housing. These small places can quickly fill up with junk and trash, but this quick cleanup will help organize even the worst-case scenario.

Why A 15 Minute Cleanup Works:

A 15 Minute Cleanup works because it recognizes that there are three types of items we need to clean up in a room. Everything falls into one of three categories...

  • Trash
  • Things that don't belong in the room
  • Things that belong in the room, but in a different place

Having only a few categories to sort items into makes a pickup run more smoothly. The 15 Minute Cleanups also work because they are designed to maintain a room. Doing a 15 Minute Cleanup in a room several times a week will keep your home looking great between more intense cleanings.

More Intense Cleanups:

If you need a slightly more intense cleanup, you can try the following. These cleanups will take between one and two hours per room, depending on how recently the room has been cleaned. When you make your cleaning schedule, you can use the 15 Minute Cleanups and the Intense Cleanups together to create a perfectly clean home.

Intense Living Area Cleanup
Intense Kitchen Cleanup
Intense Bathroom Cleanup
Intense Bedroom Cleanup
Intense Dorm Room Cleanup


Staring at a Screen for Hours Changes Your Tears

 

Office workers exhibit symptoms of dry eye

Aug 19, 2014 By Dina Fine Maron


Most Americans sit for at least six hours a day—an act that has been linked to obesity and heart disease, among other ailments. Mounting evidence suggests long hours staring at computer monitors may also be taking a toll on the eyes. People peering for hours into a screen tend to blink less often and have tears that evaporate more quickly, which dries out the eye and can cause blurred vision or pain. Left untreated, dry eye can lead to corneal ulcers and scarring.

Tears keep the eye moist and wash away dust or debris that could cause damage. But the tears of people with dry eye have less of a protein called mucin 5AC, which normally helps to keep tears sticky and spread evenly across the eye. A new study, based on 96 Japanese office workers, found that staring at a screen for eight hours or more is associated with lower mucin levels. The results were published online in June in JAMA Ophthalmology.

The good news is that the damage from staring at a screen is not likely to be permanent. Levels of certain molecules that help to produce mucin were roughly equal among test subjects with and without dry eye, regardless of their eyestrain status. Although larger studies need to be done, the findings point to mucin as a possible target for future diagnosis and treatment of dry eye disease. Doctors already suggest taking regular breaks from the screen. Take a walk—or you may have a reason to tear up.

This article was originally published with the title "Dried Up."


World War I: Naval Technology, 1914

 

 

A look at the science of naval warfare in the first year of the Great War

Sep 16, 2014 By Daniel C. Schlenoff

The battleship, while impressive looking, is an older and smaller “pre-dreadnought” design of the Braunschweig class. (Image: Scientific American, October 3, 1914)


When the First World War engulfed Europe in August 1914 it was widely believed that sea power would be a decisive factor. In the decade before the war, Germany had begun to expand its fleet of battleships to challenge the dominance of British naval power. This arms race has been cited as one of the causes of the war.

Sea power was indeed decisive in the war. But it wasn’t the large fleets of giant battleships that were significant. In fact, the war's only large naval battle, Jutland in 1916, had no clear winner, and it remains controversial even today. The real power of the sea came from the fleets of merchant ships bringing vast amounts of weapons, ammunition and raw materials to France, Britain, Russia and Italy—and after the U.S. entered the war, fresh troops by the million.

Through this same sea power a blockade denied food and weaponry to Germany and Austria. Do not underestimate the effect of this blockade: by the end of the war up to half a million people in the Central Powers had died of starvation and malnutrition because of it. As we noted in an editorial shortly after the war began, “food is as essential to success as guns.” [November 7, 1914]

The decisive fight at sea became a battle between fleets of cargo vessels escorted by naval surface ships against fleets of submarines. Those tasked with winning the war were under no illusion as to the importance of winning this battle.

Here is a slideshow of images from our archives on the science of naval warfare from 1914, the first year of the Great War.

For a more comprehensive look at all aspects of World War I (military, economic, social, and technological), see the World War I archive package at www.ScientificAmerican.com/wwi.


Crime Ring Revelation Reveals Cybersecurity Conflict of Interest

 

Hold Security’s nebulous report on the “CyberVor” online hacker gang exposed the cybersecurity world’s troubling practice of uncovering online threats and then selling proposed solutions

Sep 15, 2014 By Erik Schechter

A small cybersecurity firm claimed this summer to have uncovered a scam by Russian Internet thieves to amass a mountain of stolen information from 420,000 Web and FTP sites. The hacker network, dubbed “CyberVor,” possessed 1.2 billion unique credentials—a user name and matching password—belonging to 500 million e-mail addresses, asserted Hold Security, LLC.

Those numbers made Internet security watchers and even some consumers sit up and take notice—people use such credentials to access banking, investment and social media accounts, after all. If true, the CyberVor haul would dwarf last December’s data breach of retailer Target, in which 40 million customer credit cards were compromised. Although a New York Times story lent credibility to Hold Security’s claims, some observers question whether the cybersecurity vendor’s big reveal was more of a publicity stunt than a public service. The firm’s decision to charge potential victims a $120 fee for its Breach Notification Service did not help matters.
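
The relationship between those two figures can be confusing at first glance: a single e-mail address can appear with several different passwords, so the number of unique credentials can far exceed the number of addresses. Here is a minimal, purely illustrative sketch (the data below is invented, not drawn from the CyberVor haul or Hold Security's report) of that distinction:

```python
# Illustrative only: invented data showing how the count of unique
# credentials (user name + password pairs) can exceed the count of
# distinct e-mail addresses they belong to.

def summarize_credentials(records):
    """Return (unique credential pairs, distinct e-mail addresses)."""
    unique_pairs = {(email, password) for email, password in records}
    addresses = {email for email, _ in records}
    return len(unique_pairs), len(addresses)

# Toy sample: three addresses, five distinct credential pairs.
sample = [
    ("alice@example.com", "hunter2"),
    ("alice@example.com", "hunter2"),     # duplicate record, counted once
    ("alice@example.com", "Summer2014"),  # same address, different password
    ("bob@example.com", "letmein"),
    ("carol@example.com", "qwerty"),
    ("carol@example.com", "qwerty1"),
]

pairs, emails = summarize_credentials(sample)
print(f"{pairs} unique credentials across {emails} e-mail addresses")
# Prints: 5 unique credentials across 3 e-mail addresses
```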

Panic and publicity certainly play a role in cybersecurity efforts, as companies that make antivirus and other protective software try to provide computer users with a sense of the unseen threats facing their devices and data on a daily basis. But questions arise when these companies yoke together the part of their businesses that finds and analyzes security threats with the part that sells software and services to mitigate those threats.

Even large, established firms such as Symantec Corp. have been accused of exaggerating the gravity of security threats to boost sales. A decade ago U.S. regulators cracked down on financial services firms for the questionable practice of having their equity research and investment banking divisions work together to endorse and then sell certain investments. No such oversight exists for cybersecurity companies. That may not be surprising, given how relatively new cyber threats are, but the resulting conflict of interest means these companies walk a thin line between defending computers and other Internet-connected devices and profiting from people’s fear that their personal data is vulnerable at any time to online attackers.

The people behind CyberVor started off buying stolen credentials on the black market, according to Hold Security. At some point the gang changed tactics and purchased information about certain Web sites’ security weaknesses. This data had been gleaned by botnets, networks of virus-infected computers secretly controlled by a hacker. CyberVor then used these vulnerabilities to steal more credentials from the sites. It is unclear, however, how long this process took or whether those credentials were ever used to commit fraud or steal from the credentials’ owners.

Hold Security did not respond to a media inquiry from Scientific American. But a FAQ page on the company’s Web site insisted that private e-mail users would not have to pay $120 to find out if they were on the CyberVor list. Only Web site owners and Internet service providers would be charged for the Breach Notification Service, which “also provides a full year of service for other breach notifications.”

Hold Security’s actions exposed key flaws in the way certain cybersecurity companies operate but were not emblematic of the industry as a whole, some cybersecurity executives argue. "There’s always some skepticism about the latest cyber crime of the day," acknowledges Michael Malloy, executive vice president of products and strategy at Webroot, a provider of antivirus and antispyware services as well as research. Still, cyber crime is a real threat—large antivirus software makers must defend their customers’ computers from hundreds of thousands of new threats every day, he says.

One red flag concerning the CyberVor warning was the lack of specifics about how much of a threat the hacker network posed and whom it affected. Hold Security begged off naming targeted Web sites, citing nondisclosure agreements, says Jasper Graham, a former U.S. National Security Agency technical director and now senior vice president of cyber technologies and analytics at network security provider Darktrace. But security companies usually offer breakdowns "that'll say something like 30 percent of the data was financial…60 percent of the data was social media," he adds. These figures help other cybersecurity researchers and their customers assess the seriousness of the attack for themselves, but Hold Security withheld even that level of detail.
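
To illustrate the kind of breakdown Graham describes, here is a minimal, hypothetical sketch; the categories and record counts are invented for illustration and are not figures from Hold Security or Darktrace:

```python
# Hypothetical sketch: given breached records labeled by category,
# report each category's share of the haul as a percentage.

from collections import Counter

def breach_breakdown(record_categories):
    """Return each category's share of the breached records, in percent."""
    counts = Counter(record_categories)
    total = sum(counts.values())
    return {category: 100 * n / total for category, n in counts.items()}

# Invented example: ten records spread across three categories.
records = ["financial"] * 3 + ["social media"] * 6 + ["e-mail"] * 1

for category, share in breach_breakdown(records).items():
    print(f"{share:.0f} percent of the data was {category}")
# Prints: 30 percent of the data was financial
#         60 percent of the data was social media
#         10 percent of the data was e-mail
```

Even without naming specific sites, a breakdown like this gives outside researchers something concrete to assess.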

It is not uncommon for cybersecurity companies to research and hype threats that could drive sales, says Dan Guido, chief executive of information security company Trail of Bits and hacker-in-residence at New York University’s Polytechnic School of Engineering. The cybersecurity market and its customers would benefit from government regulation and research to help buyers separate good and bad security products and get objective reports on emerging cyber threats, he says.

And in some cases the companies selling security have themselves been accused of outright fraud. In March 2013 Symantec agreed to pay a settlement of $11 million to end a class-action lawsuit. The Norton antivirus program developer had been accused of tricking users into buying an unnecessary “registry cleaner” for $29.95 after running free scans with Symantec tools that identified nonexistent problems on their computers. Similarly, tiny companies such as MediaFire, Alpha Red and Branch Software have been sued for using pop-up advertisements to scare unsuspecting computer users into buying antivirus products, some of which actually damaged consumers’ computers once installed.

Cybersecurity firms ideally should perform their work of finding and stopping malware and cyber attacks in a way that enables measurement of their success or failure, says Stefan Savage, a computer scientist in the Systems and Networking Group at the University of California, San Diego. Establishing a method by which people could determine and assess the quality and value of a product or service, he says, would go a long way toward helping the industry shake concerns that companies are simply resorting to scare tactics.
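
Savage does not prescribe a specific metric, but one way to make such measurement concrete is to score a product against a labeled test set and report both how many real threats it catches and how often it raises false alarms. The sketch below is purely illustrative, under that assumption, and is not any vendor's or researcher's actual methodology:

```python
# Illustrative sketch: scoring a security product against a labeled test set.
# Each result is (is_malicious, was_flagged); the test data here is invented.

def score_product(results):
    """Return (detection rate, false-alarm rate) from labeled test results."""
    true_pos = sum(1 for malicious, flagged in results if malicious and flagged)
    false_neg = sum(1 for malicious, flagged in results if malicious and not flagged)
    false_pos = sum(1 for malicious, flagged in results if not malicious and flagged)
    true_neg = sum(1 for malicious, flagged in results if not malicious and not flagged)
    detection_rate = true_pos / (true_pos + false_neg)
    false_alarm_rate = false_pos / (false_pos + true_neg)
    return detection_rate, false_alarm_rate

# Invented test run: four malicious samples, four benign samples.
test_run = [
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, False), (False, True), (False, False),
]

detection, false_alarm = score_product(test_run)
print(f"detection rate {detection:.0%}, false-alarm rate {false_alarm:.0%}")
# Prints: detection rate 75%, false-alarm rate 25%
```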
