Friday, February 20, 2015

Olive oil ingredient leads cancer cells to their death

 


A compound in extra-virgin olive oil has been found to destroy cancer cells without harming healthy cells (Photo: Shutterstock)

Oleocanthal, an ingredient found in extra-virgin olive oil, has long been known to kill a variety of human cancer cells, but how this process actually played out was not understood. Now, a team of researchers has uncovered not only how oleocanthal destroys cancer cells, but also that it does so while leaving healthy cells unharmed.

Paul Breslin, a professor of nutritional sciences at Rutgers University, had suspected that oleocanthal killed cancer cells by targeting a key protein that triggers apoptosis, a process that causes dangerous or damaged cells to self-destruct by upsetting the balance of ions in their membranes. To investigate this theory, he teamed up with David Foster and Onica LeGendre, two cancer biologists from New York City's Hunter College, to examine the process more closely.

"We needed to determine if oleocanthal was targeting that protein and causing the cells to die," says Breslin.

What first surprised the scientists was how quickly oleocanthal destroyed the cancer cells. While apoptosis takes between 16 and 24 hours to unfold, oleocanthal was killing the cancer cells within 30 minutes to an hour. This led the team to believe that other factors were at play.

What they discovered was that oleocanthal was piercing the cancer cells' vesicles, the containers that store the cell's waste. By puncturing these "dumpsters," as Breslin describes them, oleocanthal releases an outpouring of enzymes that cause the cell to die.

"Once you open one of those things, all hell breaks loose," says Breslin.

As for the healthy cells, the researchers found that they remained completely unharmed. While the application of oleocanthal caused a temporary halt in their life cycles, they returned to normal after 24 hours.

With the testing thus far carried out in the lab, the researchers say that they will now look to establish the effects of oleocanthal on cancer cells in living animals.

The findings were published in the journal Molecular and Cellular Oncology.

Source: Rutgers University

 

Petrobrás threatens to push Brazil into recession, says NYT


According to The New York Times, the state-controlled company's crisis is triggering a wave of corporate defaults that could worsen the outlook for Brazil's already stagnant economy

Corruption allegations at Petrobrás, Brazil's government-controlled oil giant, have already produced a political scandal and a shake-up of the company's top management. Now the problems are threatening other Brazilian companies and could even push the country into recession.

It would be hard to overstate Petrobrás's importance to the country. The company produces more than 90% of Brazil's oil, owns all of the country's refineries, operates more than 34,000 km of pipelines, dominates wholesale gasoline and diesel distribution, and even owns the largest chain of gas stations.

"The government's plan was to make Petrobrás as big as possible," said Samuel Pessoa, an economist at the Fundação Getúlio Vargas in Rio de Janeiro. He estimated that the company accounts for a tenth of Brazil's economic output, through its own operations and those of its contractors.

In the wake of the police investigation known as Operação Lava Jato (Operation Car Wash), which revealed that Petrobrás suppliers and service providers had allegedly bribed executives in exchange for inflated contracts, the company suspended payments on many projects. Petrobrás also barred new contracts with some of the country's largest engineering and petrochemical firms.

The drop in the company's spending is expected to shave 0.75 percentage points off Brazil's expected economic growth this year, Pessoa said: enough to push a lethargic economy into a mild recession.

The company's moves also threaten the revenues of its contractors and service providers, who are hit twice over: their cash flow has plummeted, and the crisis means they cannot borrow money to ease the squeeze.

Petrobrás's problems are also spreading to Brazil's capital markets.

Because of uncertainty over how much the company will have to write down the value of its assets as a result of the corruption, Petrobrás's auditor, PricewaterhouseCoopers, has refused to sign off on its quarterly earnings statements.

Without audited quarterly earnings, Petrobrás, which has net debt of US$ 110 billion, cannot tap the global bond market.

Because Petrobrás was seen as one of the best companies to invest in, its bonds used to serve as a benchmark for all Brazilian companies. Without that benchmark, other Brazilian companies are simply avoiding the bond market.

Brazilian companies sold US$ 37 billion in global bonds last year, according to Dealogic. Since November, when Petrobrás failed to present audited quarterly earnings, no Brazilian company has issued bonds.

January is usually a strong month for Brazilian corporate bond sales. In January 2014, such sales totaled nearly US$ 6.5 billion.

"Some companies with solid balance sheets can still sell debt, but they would pay more than before, so they are avoiding that option. Other companies really need to raise money now, and they can't," said Marcel Kussaba, head of equity and debt research at Quantitas, a Brazilian asset management firm.

One Petrobrás contractor, Alumini, has already filed for court-supervised reorganization, claiming that Petrobrás owes it R$ 1.2 billion, or US$ 420 million.

OAS, the country's fifth-largest engineering firm and one of Petrobrás's main service providers, has defaulted on bond payments and is negotiating with creditors to avoid bankruptcy. OAS has debt of R$ 7.9 billion, or US$ 2.8 billion, including nearly US$ 1.8 billion in bonds, many of them held by foreign investors.

The drilling company Sete Brasil says it is negotiating with state-owned banks to raise US$ 4.5 billion and stay in operation. Like OAS and Alumini, Sete Brasil is under investigation, accused of funneling money from inflated contracts to politicians and Petrobrás executives, which may make the loan impossible.

Sete Brasil owes US$ 4.3 billion to banks. Its owners include three banks, among them BTG Pactual, which holds the largest stake, 27%. The company's owners have invested another US$ 3 billion, but since such investments are typically made through instruments involving outside investors, the banks' exposure may be smaller.

If Sete Brasil files for bankruptcy protection, the companies contracted to build its facilities will also be affected.

Similar problems are expected across the construction and energy sector as Petrobrás's service providers, in many cases huge companies, cut spending, even if they avoid bankruptcy.

"We will see many companies in the sector close," said Adriano Pires, director of the Centro Brasileiro para a Infraestrutura. Others will have to sell assets to survive.

"Because of Operação Lava Jato, there will be specific sectors, particularly in infrastructure, where we will see more mergers and acquisitions this year," said Antonio Pereira, head of investment banking at Goldman Sachs in Brazil.

According to Pereira, airports and highways are likely to be among the assets put up for sale.

While investment banks look for ways to profit by advising on mergers and acquisitions, other banks could be hurt.

Banco do Brasil, the country's largest bank, which is state-controlled, has 11% of its loan portfolio in the energy, construction, and related sectors, according to a study by the investment bank Brasil Plural. Like Brazil's other big banks, Banco do Brasil is not seen as at risk of failure, because it has substantial reserves and a diversified income stream, as well as the government's implicit backing. But some smaller banks are in a vulnerable position.

Citing Banco Pine's disproportionate exposure to construction companies, Moody's downgraded the bank's debt, now considered junk, in January and warned of further downgrades.

"I don't expect any bank to be forced into court-supervised reorganization because of Operação Lava Jato," said João Augusto Salles, a financial-sector analyst at the Rio de Janeiro consultancy Lopes Filho, "but some banking institutions may have to be sold to larger ones."

Petrobrás's first challenge is to calculate the write-down it must apply to the value of its assets as a result of the corruption, so that it can publish audited financial statements. If the company fails to do so by June, holders of its US$ 54.5 billion in bonds can demand immediate repayment.

Most analysts said such a scenario is unlikely, and that even if it happened, it would not mean the company would default on its bonds. If any company in Brazil is considered too big to fail, it is Petrobrás.

"We think the government would step in or pressure local banks to provide the necessary financing," said Brigitte Posch, head of emerging-market corporate debt at Babson Capital, which holds Petrobrás bonds. "The company is too important to Brazil."

Source: Estadão.com


 

 

Reduced flow of the São Francisco River worries producers


Fruit growers who depend on the São Francisco are worried about the river's shrinking flow.

Outflow from the region's main reservoir is now three times the volume flowing in.

The water crisis has reached one of the Brazilian regions most accustomed to living with drought. In the sertão (backlands) of Pernambuco and Bahia, farmers are worried about the declining flow of the São Francisco River, the main source of water for cities and agriculture in both states. The local productive sector is joining forces with public and private entities to organize a manifesto to be delivered in Brasília.

According to the Companhia de Desenvolvimento dos Vales do São Francisco e Parnaíba (Codevasf), the volume being released from the "velho Chico" is now triple what flows each day into Lake Sobradinho, where all the water available for power generation and local consumption is stored.

"Currently 600 cubic meters per second flow in and 1,800 flow out. As a result, the lake is now at 17% of capacity, and the trend is for it to reach 14% in March," says João Bosco, superintendent of Codevasf in Petrolina (Pernambuco). He attributes the falling water level to the lack of rain across the São Francisco basin, especially in Minas Gerais, where the river's headwaters lie.
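
As a rough consistency check, the quoted flows do broadly match the projected drop from 17% to 14% of capacity within a few weeks. The sketch below works through the arithmetic in Python; the reservoir's total capacity is an assumed round figure, not a Codevasf number:

```python
# Back-of-the-envelope check of the Sobradinho figures quoted above:
# a sketch, not official Codevasf numbers. The reservoir's total
# capacity is an assumption (~34 billion m^3 is a commonly cited figure).

INFLOW_M3_S = 600        # quoted inflow, m^3/s
OUTFLOW_M3_S = 1_800     # quoted outflow, m^3/s
CAPACITY_M3 = 34e9       # assumed total capacity, m^3
LEVEL_NOW = 0.17         # 17% of capacity
LEVEL_MARCH = 0.14       # projected 14%

net_loss_per_day = (OUTFLOW_M3_S - INFLOW_M3_S) * 86_400   # m^3/day
volume_to_lose = (LEVEL_NOW - LEVEL_MARCH) * CAPACITY_M3   # m^3

days = volume_to_lose / net_loss_per_day
print(f"Net loss: {net_loss_per_day / 1e6:.0f} million m^3/day")
print(f"~{days:.0f} days to fall from 17% to 14%")   # ~10 days
```

Under these assumptions the lake loses roughly 100 million cubic meters a day and would take on the order of ten days to fall the projected three percentage points, consistent with the March timeline.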

In January of this year, rainfall was just 9% of the expected amount for the month, according to data from local weather stations. Average annual rainfall for the Petrolina (Pernambuco) and Juazeiro (Bahia) region is 400 millimeters.

The situation sets off alarm bells for the region's agriculture, which is dominated by fruit and vegetable production on small plots. Practically all crops depend on irrigation, so depending on a crop's stage of development, each day without water means losses.

The sertão of Bahia and Pernambuco is Brazil's main fruit-producing hub. Nearly all the mangoes the country exports come from the region, which is also home to one of its largest and most sophisticated operations for fine grapes and wine.

 

By Cassiano Ribeiro. Source: Globo Rural.


Sensor technology may help improve accuracy of clinical breast exams

Sensor technology has the potential to significantly improve the teaching of proper technique for clinical breast exams (CBE), according to a new study by researchers at the University of Wisconsin School of Medicine and Public Health.

The results of the study were published today as correspondence in the New England Journal of Medicine.

Carla Pugh, director of patient safety and education at the University of Wisconsin Hospital and Clinics and principal investigator of the study, says the use of sensors allows a level of critical analysis unavailable to clinicians until recently.

"Variations in palpable force used during a CBE cannot be reliably measured by human observation alone," Pugh says. "Our findings revealed that 15 percent of the physicians we tested were using a technique that put them at significant risk of missing deep tissue lesions near the chest wall. This research underscores the potential for sensor technology to be used not only to improve clinical performance, but to also allow for objective evidence-based training, assessment and credentialing."

For the study, Pugh and her team asked 553 practicing physicians attending annual clinical meetings of the American Society of Breast Surgeons, the American Academy of Family Physicians, and the American College of Obstetricians and Gynecologists to perform a simulated CBE under conditions that mimic an office visit by a symptomatic patient. Participants completed a demographic survey, reviewed a clinical scenario, performed the CBE on a sensor-enabled breast model, and then documented their findings. The goal was to capture CBE technique while clinicians were purposefully seeking a mass.

The sensor data revealed that physicians who applied less than 10 newtons of force (the standard unit of force) were able to find the two superficial masses in the breast model but missed the two deeper ones. Physicians who applied greater palpation pressure improved their probability of identifying the deeper lesions. The study suggests that the optimal palpation force for detecting deeper lesions is between 12 and 17 newtons.
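
To make the reported thresholds concrete, here is a minimal sketch of how peak force readings from a sensor-enabled model might be screened against them. The function name and example readings are illustrative, not part of the published study:

```python
# A minimal sketch of screening sensor force readings against the
# thresholds reported in the study. The function and the example
# values are illustrative assumptions, not the published protocol.

def classify_palpation_force(newtons: float) -> str:
    """Classify a peak palpation force reading (in newtons)."""
    if newtons < 10:
        return "at risk of missing deep tissue lesions"
    if 12 <= newtons <= 17:
        return "within the reported optimal range for deep lesions"
    return "outside the reported optimal range"

# Example peak forces, as might be logged by a sensor-enabled model
for reading in [8.5, 14.0, 20.3]:
    print(f"{reading:>5.1f} N -> {classify_palpation_force(reading)}")
```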

"I want to spark a serious conversation about the potential for high-end, mastery training in the health care profession," Pugh says. "Health care is at a critical juncture where there are huge opportunities for major information exchanges that can empower physicians and patients. Both patients and physicians will benefit from clinical-skills performance data."


Story Source:

The above story is based on materials provided by University of Wisconsin-Madison. Note: Materials may be edited for content and length.


Journal Reference:

  1. Shlomi Laufer, Elaine R. Cohen, Calvin Kwan, Anne-Lise D. D'Angelo, Carla M. Pugh, Rachel Yudkowsky, John R. Boulet, William C. McGaghie. Sensor Technology in Assessments of Clinical Skill. New England Journal of Medicine, 2015; 372 (8): 784 DOI: 10.1056/NEJMc1414210

 


Graphene offers X-ray photoelectron spectroscopy a window of opportunity

 


X-ray photoelectron spectroscopy (XPS) is among the most sensitive and informative surface analysis techniques. However, it requires high-vacuum equipment to operate, because electrons can only penetrate a short distance in dense media such as air. An international team led by CNST researchers, working with collaborators from Elettra Sincrotrone Trieste, Italy, and the Technical University of Munich, Germany, has overcome this limitation by exploiting the fact that graphene is transparent to electrons (they pass through it largely unimpeded) but impermeable to gases and liquids.

The researchers used graphene covering a small opening to separate a liquid sample cell from the high-vacuum conditions of the electron spectrometer. They demonstrated that good-quality XPS data can be recorded from a liquid using this approach, evaluated the electron transparency of the graphene membranes quantitatively, and compared it with theoretical predictions.
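
The electron transparency they evaluated follows the familiar exponential-attenuation estimate. Below is a sketch of that calculation; the inelastic mean free path is an assumed value (it depends strongly on electron kinetic energy), not a number taken from this work:

```python
import math

# Sketch of the standard exponential-attenuation estimate for electron
# transmission through graphene. The inelastic mean free path (IMFP)
# is an illustrative assumption; real IMFPs vary strongly with the
# electron's kinetic energy.

GRAPHENE_THICKNESS_NM = 0.335   # thickness of a single graphene layer
IMFP_NM = 1.0                   # assumed electron IMFP at ~500 eV

def transmission(n_layers: int) -> float:
    """Fraction of electrons passing n graphene layers unscattered."""
    return math.exp(-n_layers * GRAPHENE_THICKNESS_NM / IMFP_NM)

for n in (1, 2, 3):
    print(f"{n} layer(s): T = {transmission(n):.2f}")
# 1 layer: ~0.72 under these assumptions -- consistent with graphene
# being largely transparent to photoelectrons while remaining gas-tight.
```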

In addition, the researchers were able to measure spectroscopically, in situ, the chemistry of bubble formation caused by radiation-induced splitting of water into oxygen and hydrogen. Because bubble formation is usually an unwanted process, the measured onset of bubble formation sets an upper limit on the X-ray (or electron) intensities that can be used with this approach.

The researchers’ work fills a pressing gap. Assessing the chemical state of surfaces and interfaces immersed in liquids or in atmospheric-pressure gases is needed for many applications, including biomedical research, electrochemical energy devices, and catalysis. The current state of the art in high-pressure XPS is to use sophisticated, expensive, and bulky differentially pumped stages in front of the electron energy analyzer. Only a few instruments of this kind are currently available worldwide, yet there is great need for these experimental capabilities. The researchers’ design is far simpler and could reduce costs to the point where many more labs could afford this type of measurement.

As often happens with new technologies, there remain some challenges and limitations. The graphene adhesion to the surface of the apertures needs to be improved, as do the radiation and electrochemical stability of the atomically thin graphene. The researchers plan to work on these challenges in order to develop better techniques for clean and non-disruptive transfer of graphene windows to the supporting openings.

 

A new technique for subsurface characterization of thin-film photovoltaic materials

 

 
Although the performance of many materials used in devices is determined by microscale and nanoscale structures and buried interfaces hidden up to a few micrometers below the material surface, most commonly used nanoscale characterization tools are only sensitive to phenomena either at the surface or just a few nanometers below it. As a step towards addressing the increasing need for subsurface characterization with nanoscale spatial resolution, researchers from the CNST and the University of Maryland have demonstrated a new approach for probing the interior of photovoltaic devices.

The technique uses a scanning probe microscope to control a tapered optical fiber with a nanoscale aperture that transmits laser light in order to locally excite a subsurface volume of a photovoltaic device. The photocurrent generated by the cell is then measured using a low-noise amplifier. By varying the wavelength of the laser light in the optical fiber, the penetration depth of the light can be readily changed from approximately 100 nm to more than 3 μm, allowing different volumes of the device to be excited. However, deeper penetration results in lower spatial resolution. Therefore, for comparison, the researchers used a focused ion beam available to users in the CNST NanoFab to prepare a wedge-shaped device sample by carefully thinning down the absorber thickness at a small angle while ensuring that the device still functioned as a solar cell. This shape effectively brings some of the buried interfaces closer to the exposed surface so that these interfaces can be interrogated with higher spatial resolution using light with shallower penetration.
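
The depth control rests on the fact that the optical penetration depth is roughly the reciprocal of the material's absorption coefficient, which for CdTe falls sharply near the band edge. The sketch below uses illustrative orders of magnitude, not measured values from this work:

```python
# Sketch of the relation the technique exploits: penetration depth is
# approximately 1/alpha, where alpha is the absorption coefficient.
# The coefficients below are illustrative orders of magnitude for a
# CdTe-like absorber, not measured values.

absorption_cm = {      # wavelength (nm) -> alpha (1/cm), illustrative
    500: 1e5,          # strongly absorbed -> shallow excitation
    750: 1e4,
    840: 3e3,          # near the CdTe band edge -> deep excitation
}

for wavelength_nm, alpha in absorption_cm.items():
    depth_um = (1.0 / alpha) * 1e4   # convert cm to micrometers
    print(f"{wavelength_nm} nm: penetration depth ~ {depth_um:.2f} um")
# Spans roughly 0.1 um to ~3 um, matching the tuning range described.
```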

The characterization technique was demonstrated on cadmium telluride (CdTe) solar cells, a common photovoltaic technology. These solar cells are based on 1 μm to 3 μm thick multilayer polycrystalline films with the size of the crystal grains comparable to or smaller than the device thickness. The cells have both intentional and unintentional variation in composition throughout the absorber layer and device interfaces, making them ideal for testing this technique. Currently, solar cells of this type operate at efficiencies well below the theoretical limit, and their inefficiency is believed to be due to the uncontrolled and often unknown effects of the microscale structure buried beneath the surfaces of the devices.

According to CNST researcher Nikolai Zhitenev, the new technique is just an early step towards the development of a full set of quantitative nanoscale sub-surface measurement tools. While patterning the photovoltaic device into a wedge shape inevitably modifies its structure and performance, the measurements are highly reproducible for a variety of processing conditions. Through the development of new theoretical models explicitly incorporating the effects of the device modification, the new measurement approach can be developed into a powerful quantitative technique.

Measuring absorption maps and spectra of plasmonic resonators with nanoscale resolution

 


Researchers from the CNST and the University of Maryland have for the first time used photothermal induced resonance (PTIR) to characterize individual plasmonic nanomaterials, obtaining absorption maps and spectra with nanometer-scale resolution. Nanostructuring plasmonic materials makes it possible to engineer their resonant optical response and creates new opportunities for applications that benefit from enhanced light-matter interactions, including sensing, photovoltaics, photocatalysis, and therapeutics.

Progress in nanotechnology is often enabled by the availability of measurement methods for characterizing materials at appropriately small length scales. By measuring infrared absorption at the nanoscale, PTIR provides information that is not otherwise available for characterizing and engineering plasmonic materials. PTIR measures light absorption using a wavelength-tunable laser and a sharp tip in contact with the sample as a local detector. Unlike many other methods that use nanoscale tips to probe materials, in PTIR the tip is passive and does not interfere with the measurement. Consequently, light absorption in the sample can be measured directly, without requiring either a model of the tip or prior knowledge of the sample.
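
A toy estimate shows why a passive tip can detect absorption at all: the absorbed pulse energy heats a tiny volume, whose thermal expansion deflects the AFM cantilever. All values below are illustrative assumptions for a polymer-like sample, not numbers from this work:

```python
# Toy estimate of the photothermal expansion that PTIR detects.
# Every number here is an illustrative assumption for a polymer-like
# sample; none comes from the study described above.

E_ABS = 1e-14          # absorbed energy per laser pulse, J (assumed)
DENSITY = 1_200        # sample density, kg/m^3 (assumed)
HEAT_CAP = 1_500       # specific heat, J/(kg*K) (assumed)
ALPHA_EXP = 5e-5       # linear thermal expansion coeff., 1/K (assumed)
SIZE = 100e-9          # heated region size, m (100 nm cube, assumed)

volume = SIZE**3
delta_T = E_ABS / (DENSITY * HEAT_CAP * volume)   # temperature rise, K
delta_z = ALPHA_EXP * SIZE * delta_T              # surface expansion, m

print(f"dT ~ {delta_T:.1f} K, expansion ~ {delta_z * 1e12:.0f} pm")
# Tens of picometers per pulse: tiny, but within reach of an AFM
# cantilever, which is why the tip can serve as a local detector.
```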

The researchers collected nanoscale absorption information in two ways: first, by mapping infrared absorption while scanning the tip across a sample under constant-wavelength illumination; and second, by measuring location-specific absorption spectra while sweeping the laser across a range of infrared wavelengths. Using tunable lasers that let CNST facility users vary the wavelength from 1.55 μm to 16.00 μm, the researchers acquired nanoscale infrared absorption spectra of gold resonators, the first such measurement of any plasmonic nanomaterial. Although absorption images allow immediate visualization and can be measured with other techniques, the PTIR spectra provide the information needed to interpret the images and guide experiments.

Figure caption: Schematic of the photothermal induced resonance (PTIR) technique, which combines the lateral resolution of atomic force microscopy (AFM) with the chemical specificity of IR spectroscopy. A wavelength-tunable, pulsed IR laser (purple) illuminates a sample consisting of plasmonic gold resonators from below. The resulting thermal expansion of the sample is detected locally by the AFM cantilever tip, which is monitored by reflecting a laser (blue) off the back of the cantilever.

Plasmonic materials like gold, which have large thermal conductivity and relatively small thermal expansion coefficients, were previously thought to be challenging to measure using PTIR because the technique relies on the sample’s thermal expansion for measuring light absorption. According to Andrea Centrone, a Project Leader in the Energy Research Group, “we showed that PTIR characterization is not just applicable to insulators and semiconductors, as demonstrated previously, but that metals are also amenable to it. This is an important step forward for applying the PTIR technique to a wider variety of functional devices.”

Nanoscale imaging and spectroscopy of plasmonic modes with the PTIR technique, A. M. Katzenmeyer, J. Chae, R. Kasica, G. Holland, B. Lahiri, and A. Centrone, Advanced Optical Materials 2, 718–722 (2014).

Flip out! Noncommittal material could make for a hypersensitive magnetic field direction detector

 


While the mysterious, unseen forces magnets project are now (mostly) well-understood, they can still occasionally surprise us. For instance, thin films of cobalt have been observed to spontaneously switch their poles—something that typically doesn’t happen in the absence of an external magnetic field. Researchers from the CNST and the University of Maryland have measured this phenomenon on the largest scale yet.
Most magnets are “permanent,” meaning that a magnetic field of some strength must be applied to reverse their north and south poles. This permanence enables the billions of tiny magnets in computer hard drives to reliably store data. It also enables nanomagnetic sensor technology, for example in the magnetometers that detect the Earth’s magnetic field in smartphone compasses. Making these devices more energy efficient will require magnets that are increasingly sensitive to external influences, such as small magnetic fields. But as magnets become more sensitive, they also become more unstable, flipping from north to south and back even with no applied field. The researchers mapped out this instability in a cobalt film only a few atoms thick and determined the conditions under which it arises.

They hypothesize that the development of magnetic technology will benefit from their continuously flipping cobalt films, which can function as extremely sensitive magnetic test beds. Many proposed devices incorporate layers of ferromagnetic material that, to be useful, must be controllable by an external influence. According to CNST/UMD Postdoctoral Researcher Andy Balk, however, most magnetic materials are too stable to be influenced at all by small interactions, and researchers have no way of knowing whether their proposed devices are even close to working. “As an alternative,” Balk says, “we could make a proposed magnetic device from our unstable film. This way, even if the film were influenced only a very small amount we would see, for example, slightly more north flips than south flips, and we would know we are on the right track.”
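
The detection idea Balk describes can be illustrated with a toy two-state ("telegraph") simulation: a small applied field biases the flip probabilities, so the film spends slightly more time magnetized along the field, and the imbalance reveals the field's direction. The rate model and all numbers below are illustrative assumptions, not the researchers' analysis:

```python
import random

# Toy two-state "telegraph" model of an unstable magnetic film: a
# minimal sketch, not the researchers' analysis. An applied field is
# assumed to bias the per-step flip probability by a small amount.

random.seed(1)
P0 = 0.5      # flip probability per step with no field (assumed)
BIAS = 0.02   # field-induced asymmetry in flip probability (assumed)

state = 1                  # +1 = north, -1 = south
time_in = {1: 0, -1: 0}    # time steps spent in each state
for _ in range(200_000):
    time_in[state] += 1
    # flipping out of the field-favored (+1) state is less likely
    if random.random() < P0 - BIAS * state:
        state = -state

imbalance = (time_in[1] - time_in[-1]) / sum(time_in.values())
print(f"fraction of time north minus south: {imbalance:+.3f}")
# A positive imbalance (about 2 * BIAS here) reveals the field
# direction, even though the film has no stable magnetization.
```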

The measurements were done with video-rate Kerr microscopy, a form of polarized light microscopy that can image the fine-grained details of a material’s magnetic state. The scientists found that the magnetic fluctuations in the thin cobalt film interact with each other; a fluctuation from north to south will always have a corresponding nearby fluctuation from south to north. Interestingly, these fluctuations exhibit scale invariance—meaning that their behavior is the same regardless of the length scale on which they are observed—a property they share with otherwise unrelated phenomena such as earthquakes and crumpling paper.

Snooping on self-organizing molecules

A few short years ago, the idea of a practical manufacturing process based on getting molecules to organize themselves into useful nanoscale shapes seemed … well, cool, sure, but also a little fantastic. Now the day isn’t far off when your cell phone may depend on it. Two recent papers emphasize the point by demonstrating complementary approaches to fine-tuning the key step: depositing thin films of a uniquely designed polymer on a template so that it self-assembles into neat, precise, even rows of alternating composition just 10 or so nanometers wide.

The work by researchers at the Massachusetts Institute of Technology, the IBM Almaden Research Center, the NIST Material Measurement Laboratory (MML), and the CNST focuses on block copolymers, a special class of polymers that, under the proper conditions, will segregate on a microscopic scale into regularly spaced “domains” of different chemical composition. The two groups demonstrated ways to observe and measure the shape and dimensions of the polymer rows in three dimensions. The experimental techniques can prove essential in verifying and tuning the computational models used to guide the fabrication process development.

It’s old news that the semiconductor industry is starting to run up against physical limits to the decades-long trend of ever-denser integrated chips with smaller and smaller feature sizes, but it hasn’t reached bottom yet. Just recently, Intel Corp. announced that it had in production a new generation of chips with a 14-nanometer minimum feature size. That’s a little over five times the width of human DNA.
At those dimensions, the problem is creating the multiple masking layers, in effect tiny stencils, needed to define the microscopic patterns on the production wafer. The optical lithography techniques used to create the masks, in a process akin to old-school wet photography, are simply not capable of reliably reproducing the extremely small, extremely dense patterns. There are tricks you can use, such as creating multiple overlapping masks, but they are very expensive.

Hence the polymers. “The issue in semiconductor lithography is not really making small features—you can do that—but you can’t pack them close together,” explains CNST Nanofabrication Research Group Leader Alexander Liddle. “Block copolymers take advantage of the fact that if I make small features relatively far apart, I can put the block copolymer on those guiding patterns and sort of fill in the small details.” The strategy is called “density multiplication” and the technique, “directed self-assembly.”

Block copolymers (BCPs) are a class of materials made by connecting two or more different polymers that, as they anneal, will form predictable, repeating shapes and patterns. With the proper lithographed template, the BCPs in question will form a thin film in a pattern of narrow, alternating stripes of the two polymer compositions. Alternatively, they can be designed so one polymer forms a pattern of posts embedded in the other. Remove one polymer, and in theory, you have a near-perfect pattern for lines spaced 10 to 20 nanometers apart to become, perhaps, part of a transistor array.
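
The arithmetic of density multiplication is simple, as the sketch below shows under assumed, illustrative dimensions (the pitch and period values are examples consistent with the 10 to 20 nm lines described, not figures from the papers):

```python
# Sketch of the "density multiplication" arithmetic Liddle describes:
# a coarse lithographed guide pattern directs a block copolymer whose
# natural period L0 subdivides it into finer lines. The dimensions
# below are illustrative assumptions.

GUIDE_PITCH_NM = 80     # pitch of the lithographed guiding lines (assumed)
L0_NM = 20              # natural repeat period of the BCP (assumed)

multiplication = GUIDE_PITCH_NM // L0_NM
print(f"Density multiplication: {multiplication}x")
print(f"Final line pitch: {L0_NM} nm "
      f"({multiplication} BCP periods per guide period)")
# The lithography only needs to resolve the 80 nm guide pattern;
# the polymer fills in the 20 nm detail on its own.
```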

If it works. “The biggest problem for the industry is the patterning has to be perfect. There can’t be any defects,” says MML researcher Joseph Kline. “In both of our projects we’re trying to measure the full structure of the pattern. Normally, it’s only easy to see the top surface, and what the industry is worried about is that they make a pattern, and it looks okay on the top, but down inside the film, it isn’t.”
Kline’s group, working with IBM, demonstrated a new measurement technique, called “resonant critical dimension small angle X-ray scattering” (res-CDSAXS), that uses low-energy or “soft” X-rays produced by the Advanced Light Source at Lawrence Berkeley National Laboratory to probe the structure of the BCP film from multiple angles. Because the film has a regular, repeating structure, the scattering pattern can be interpreted, much as crystallographers do, to reveal the average shapes of the stripes in the film. If a poor match between the materials causes one set of stripes to broaden out at the base, for example, it will show up in the scattering pattern.

Their major innovation was to note that although the basic technique was developed using short-wavelength “hard” X-rays that have difficulty distinguishing two closely related polymers, much better results can be obtained using longer wavelength X-rays that are more sensitive to differences in the molecular structure.
While X-ray scattering can measure average properties of the films, Liddle’s group, working with MIT, developed a method to look, in detail, at individual sections of a film by doing three-dimensional tomography with a transmission electron microscope (TEM). Unlike the scattering technique, the TEM tomography can actually image defects in the polymer structure—but only for a small area. The technique can image an area about 500 nanometers across.
Between them, the two techniques can yield detailed data on the performance of a given BCP patterning system. The data, the researchers say, are most valuable for testing and refining computer models. “Our measurements are both fairly time-consuming, so they’re not something industry can use on the fab floor,” says Kline. “But as they’re developing the process, they can use our measurements to get the models right, then they can do a lot of simulations and let the computers figure it out.”
“It’s just so expensive and time-consuming to test out a new process,” agrees Liddle. “But if my model is well validated and I know the model is going to give me accurate results, then I can crank through the simulations quickly. That’s a huge factor in the electronics industry.”


Perfect colors, captured with one ultra-thin lens

 

This completely flat, ultrathin lens can focus different wavelengths of light at the same point, achieving instant color correction in one extremely thin, miniaturized device.

Most lenses are, by definition, curved. After all, they are named for their resemblance to lentils, and a glass lens made flat is just a window with no special powers.

But a new type of lens created at the Harvard School of Engineering and Applied Sciences (SEAS) turns conventional optics on its head.

A major leap forward from a prototype device demonstrated in 2012, it is an ultra-thin, completely flat optical component made of a glass substrate and tiny, light-concentrating silicon antennas. Light shining on it bends instantaneously, rather than gradually, while passing through. The bending effects can be designed in advance, by an algorithm, and fine-tuned to fit almost any purpose.

With this new invention, described today in Science, the Harvard research team has overcome an inherent drawback of a wafer-thin lens: light at different wavelengths (i.e., colors) responds to the surface very differently. Until now, this phenomenon has prevented planar optics from being used with broadband light. Now, instead of treating all wavelengths equally, the researchers have devised a flat lens with antennas that compensate for the wavelength differences, producing a consistent effect: for example, deflecting three beams of different colors by the same angle, or focusing those colors on a single spot.
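
The design problem can be stated compactly: to focus light of wavelength λ at focal length f, a flat lens must impose a hyperboloidal phase profile that itself depends on λ, and the antennas must supply the wavelength-dependent difference. The sketch below evaluates that textbook phase profile with assumed, illustrative parameters, not the paper's design values:

```python
import math

# Sketch of the design constraint an achromatic flat lens must meet:
# the required phase profile depends on wavelength, so a single fixed
# profile focuses only one color. The focal length and wavelengths
# below are illustrative assumptions.

F_UM = 100.0   # focal length, micrometers (assumed)

def required_phase(r_um: float, wavelength_um: float) -> float:
    """Phase (radians) a flat lens must impart at radius r to focus
    wavelength lambda at focal length F_UM."""
    return (2 * math.pi / wavelength_um) * (
        F_UM - math.sqrt(r_um**2 + F_UM**2))

for lam in (0.45, 0.55, 0.65):   # blue, green, red (micrometers)
    phi = required_phase(30.0, lam)
    print(f"lambda = {lam} um: required phase at r = 30 um is {phi:.1f} rad")
# The spread between these values is the chromatic dispersion the
# dielectric antennas must compensate so all three colors focus at
# the same point.
```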

"What this now means is that complicated effects like color correction, which in a conventional optical system would require light to pass through several thick lenses in sequence, can be achieved in one extremely thin, miniaturized device," said principal investigator Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at Harvard SEAS.

Bernard Kress, Principal Optical Architect at Google[X], who was not involved in the research, hailed the advance: "Google [X], and especially the Google Glass group, is relying heavily on state-of-the-art optical technologies to develop products that have higher functionalities, are easier to mass produce, have a smaller footprint, and are lighter, without compromising efficiency," he said. "Last year, we challenged Professor Capasso's group to work towards a goal which was until now unreachable by flat optics. While there are many ways to design achromatic optics, there was until now no solution to implement a dispersionless flat optical element which at the same time had uniform efficiency and the same diffraction angle for three separate wavelengths. We are very happy that Professor Capasso did accept the challenge, and also were very surprised to learn that his group actually solved that challenge within one year."

The team of researchers, led by Capasso and postdoctoral fellow Francesco Aieta, has developed a design that rivals the bulky equipment currently used in photography, astronomy, and microscopy. It could also enable the creation of new miniature optical communications devices and find application in compact cameras and imaging devices.

The new lens, dubbed an "achromatic metasurface," dramatically improves on the flat lens Capasso's research group demonstrated in 2012. That prototype, the first of its kind, corrected for some of the aberrations of conventional lenses but suffered from the limitation of only focusing light of a single wavelength, and its focusing efficiency was small. The new model uses a dielectric material rather than a metal for the nanoantennas, a change which greatly improves its efficiency and, combined with a new design approach, enables operation over a broad range of wavelengths.

Most significantly, the new design enables the creation of two different flat optical devices. The first, instead of sending different colors in different directions like a conventional grating, deflects three wavelengths of light by exactly the same angle. In the second device, the three wavelengths can all be focused at the same point. A flat lens can thus create a color image, focusing, for example, red, green, and blue, the primary colors used in most digital displays. The team's computational simulations also suggest that a similar architecture can be used to create a lens that collimates many different wavelengths, not just three.

"This is a major step forward in establishing a planar optical technology with a small footprint which overcomes the limitations of standard flat optics, known as diffractive optics," said Capasso. "It also opens the door to new functionalities because of the enormous design space made possible by metasurfaces."

"This is an elegant and groundbreaking accomplishment," said Nader Engheta, H. Nedwill Ramsey Professor at the University of Pennsylvania, who was not involved in the research. "The planar optical structures designed and demonstrated by Professor Capasso's group have much less volume than their conventional bulky counterparts and at the same time their chromatic aberration has been suppressed. This is an important development that will undoubtedly lead to other exciting innovations in the field of flat photonics."

Harvard's Office of Technology Development has filed for a provisional patent on the new optical technology and is actively pursuing commercial opportunities.

"Our previous work on the metallic flat lens produced a great excitement in regard to the possibility of achieving high numerical aperture and spherical aberration-free focusing with a very compact design. By demonstrating achromatic lenses we have now made a major step forward towards widespread future application of flat optics that will certainly attract the interest of the industry," said lead author Francesco Aieta, now employed by Hewlett Packard, who conducted the research at Harvard SEAS.