Thursday, February 27, 2014

Einstein's Lost Theory Uncovered


The famous physicist explored the idea of a steady-state universe in 1931
 
A manuscript that lay unnoticed by scientists for decades has revealed that Albert Einstein once dabbled with an alternative to the Big Bang theory, proposing instead that the Universe expanded steadily and eternally. The recently uncovered work, written in 1931, is reminiscent of a theory championed by British astrophysicist Fred Hoyle nearly 20 years later. Einstein soon abandoned the idea, but the manuscript reveals his continued hesitance to accept that the Universe was created during a single explosive event.
The Big Bang theory had found observational support in the 1920s, when US astronomer Edwin Hubble and others discovered that distant galaxies are moving away and that space itself is expanding. This seemed to imply that, in the past, the contents of the observable Universe had been a very dense and hot ‘primordial broth’.
But, from the late 1940s, Hoyle argued that space could be expanding eternally and keeping a roughly constant density. It could do this by continually adding new matter, with elementary particles spontaneously popping up from space, Hoyle said. Particles would then coalesce to form galaxies and stars, and these would appear at just the right rate to take up the extra room created by the expansion of space. Hoyle’s Universe was always infinite, so its size did not change as it expanded. It was in a ‘steady state’.
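The rate of creation such a model needs is minuscule. As a rough illustration with modern values for the Hubble constant H and the cosmic matter density ρ (numbers not taken from either manuscript), keeping the density constant requires a source term that exactly balances the dilution caused by expansion:

```latex
% constant density despite expansion: \dot{\rho} = \Gamma - 3H\rho = 0
\Gamma = 3H\rho \approx 3 \times \left(2.3\times10^{-18}\,\mathrm{s^{-1}}\right)
                \times \left(9.5\times10^{-27}\,\mathrm{kg\,m^{-3}}\right)
        \approx 6.5\times10^{-44}\,\mathrm{kg\,m^{-3}\,s^{-1}}
```

That works out to roughly one hydrogen atom appearing per cubic meter per billion years.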
The newly uncovered document shows that Einstein had described essentially the same idea much earlier. “For the density to remain constant new particles of matter must be continually formed,” he writes. The manuscript is thought to have been produced during a trip to California in 1931 — in part because it was written on American note paper.
It had been stored in plain sight at the Albert Einstein Archives in Jerusalem — and is freely available to view on its website — but had been mistakenly classified as a first draft of another Einstein paper. Cormac O’Raifeartaigh, a physicist at the Waterford Institute of Technology in Ireland, says that he “almost fell out of his chair” when he realized what the manuscript was about. He and his collaborators have posted their findings, together with an English translation of Einstein’s original German manuscript, on the arXiv preprint server (C. O’Raifeartaigh et al. Preprint at http://arxiv.org/abs/1402.0132; 2014) and have submitted their paper to the European Physical Journal.
“This finding confirms that Hoyle was not a crank,” says study co-author Simon Mitton, a science historian at the University of Cambridge, UK, who wrote the 2005 biography Fred Hoyle: A Life in Science. The mere fact that Einstein had toyed with a steady-state model could have lent Hoyle more credibility as he engaged the physics community in a debate on the subject. “If only Hoyle had known, he would certainly have used it to punch his opponents,” O’Raifeartaigh says.
Although Hoyle’s model was eventually ruled out by astronomical observations, it was at least mathematically consistent, tweaking the equations of Einstein’s general theory of relativity to provide a possible mechanism for the spontaneous generation of matter. Einstein’s unpublished manuscript suggests that, at first, he believed that such a mechanism could arise from his original theory without modification. But then he realized that he had made a mistake in his calculations, O’Raifeartaigh and his team suggest. When he corrected it — crossing out a number with a pen of a different color — he probably decided that the idea would not work and set it aside.
The manuscript was probably “a rough draft commenced with excitement over a neat idea and soon abandoned as the author realized he was fooling himself”, says cosmologist James Peebles of Princeton University in New Jersey. There seems to be no record of Einstein ever mentioning these calculations again.
But the fact that Einstein experimented with the steady-state concept demonstrates his continued resistance to the idea of a Big Bang, which he at first found “abominable”, even though other theoreticians had shown it to be a natural consequence of his general theory of relativity. (Other leading researchers, such as the eminent Cambridge astronomer Arthur Eddington, were also suspicious of the Big Bang theory, because it suggested a mystical moment of creation.) When astronomers found evidence for cosmic expansion, Einstein had to abandon his bias towards a static Universe, and a steady-state Universe was the next best thing, O’Raifeartaigh and his collaborators say.
Helge Kragh, a science historian at Aarhus University in Denmark, agrees. “What the manuscript shows is that although by then he accepted the expansion of space, [Einstein] was unhappy with a Universe changing in time,” he says.
This article is reproduced with permission from the magazine Nature. The article was first published on February 24, 2014.

New Space Station Photos Show North Korea at Night, Cloaked in Darkness

 

North Korea's isolation is visible in new satellite photos that show the energy-bankrupt country at night.


This night image of the Korean Peninsula shows that North Korea is almost completely dark compared to neighboring South Korea and China.

PHOTOGRAPH BY EARTH SCIENCE & REMOTE SENSING UNIT, NASA JOHNSON SPACE CENTER

Daniel Stone

National Geographic

Published February 26, 2014

Since the mid-1990s, when fuel stopped flowing from the defunct Soviet Union to North Korea, the famously hermetic country has descended into darkness.

Newly released photos taken from the International Space Station last month reveal just how energy bankrupt North Korea has become. The photos, and a time-lapse video of the region, show the country as almost completely black, in contrast to the bright lights of neighbors like South Korea and Japan. (See related, "North Korea: Nuclear Ambition, Power Shortage.")

In South Korea, each person consumes 10,162 kilowatt hours of power a year. North Koreans each use just 739. Other than several small spots of light, including the brightly illuminated capital of Pyongyang, the country just about blends in with the surrounding black ocean.
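A quick sanity check on those figures (a minimal sketch; the consumption numbers are the ones quoted above, and 8,760 is simply the number of hours in a year):

```python
# Per-capita electricity use, kWh per year (figures quoted above)
south_kwh, north_kwh = 10_162, 739
hours_per_year = 24 * 365  # 8,760

print(f"South/North ratio: {south_kwh / north_kwh:.1f}x")                 # ~13.8x
print(f"Average draw, South: {south_kwh / hours_per_year * 1000:.0f} W")  # ~1160 W
print(f"Average draw, North: {north_kwh / hours_per_year * 1000:.0f} W")  # ~84 W
```

In other words, the average North Korean's around-the-clock power draw is about that of a single bright incandescent bulb.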

Satellites have traditionally been the best tools for observing North Korea; they capture detailed views from far beyond sealed borders. Starting in 1948 with Kim Il Sung, the grandfather of the current North Korean supreme leader Kim Jong Un, the country has shunned most of the world.

The North Korean government has refused offers of food and energy aid in exchange for a commitment to curtailing its nuclear energy ambitions. International inspectors have been denied entry, which has resulted in increasingly harsh sanctions led in large part by the United States and South Korea. China remains the staunchest of the north's few allies.

In her 2009 book Nothing to Envy: Ordinary Lives in North Korea, Barbara Demick described the effect darkness has on culture.

Streets become too dark for walking, limiting social interactions outside of daytime work hours. No one can watch TV or consume what little media the government allows.

Still, some parts of North Korea never go dark. Several government buildings, as well as Kim Jong Un's personal palace, stay lit at all times. Also illuminated around the clock: the famous 560-foot (171-meter) Juche Tower at the center of Pyongyang. It stands as a lonely symbol of nationalism and self-reliance, whatever the cost.

Big Bang Secrets Swirling in a Fluid Universe

 

With a new approach that treats the universe as a fluid, cosmologists plan to tease out the fine details of the Big Bang from the cosmos's behavior and evolution.

Deep Space

A new model that treats the matter in the universe as a fluid could enable researchers to retrace the flow of the cosmos back to the Big Bang. In this image, fluidlike wisps are created as ejected gas from a supernova collides with gas and dust in the surrounding interstellar medium. Credit: Canada-France-Hawaii Telescope/Coelum

From Quanta Magazine (find original story here).

To a sound wave, the cosmos has the consistency of chocolate syrup.

That’s one discovery that scientists investigating the Big Bang have made using a new approach that treats the matter in the universe as a peculiar kind of fluid. They have calculated properties that characterize the universe’s behavior and evolution, including its viscosity, or resistance to deformation by sound waves and other disturbances.

“Twenty pascal-seconds is the viscosity of the universe,” said Leonardo Senatore, an assistant professor of physics at Stanford University — just as it is for the ice cream topping.

The viscosity calculation could help cosmologists sleuth out the details of the Big Bang, and possibly someday identify its trigger, by enabling them to track the fluidlike flow of the cosmos back 13.8 billion years to its initial state.

As other techniques for probing the Big Bang reach their limits of sensitivity, cosmologists are co-opting the fluid approach, called “effective field theory,” from particle physics and condensed matter physics, fields in which it has been used for decades. By modeling the matter swirling throughout space as a viscous fluid, the cosmologists say they can precisely calculate how the fluid has evolved under the force of gravity — and then rewind this cosmic evolution back to the beginning. “With this approach, you can really zoom in on the initial conditions of the universe and start asking more and more precise questions,” said Enrico Pajer, a postdoctoral research fellow at Princeton University with a recent paper on the technique that has been accepted by the Journal of Cosmology and Astroparticle Physics.

The more information that astronomers gather about the distribution of galaxies throughout space — known as the “large-scale structure” of the universe — the more accurate the fluid model becomes. And the data are pouring in. The sketchy scatter plot of several thousand nearby galaxies that existed in the 1980s has given way to a far richer map of millions of galaxies, and planned telescopes will soon push the count into the billions. Proponents believe that, tuned with these data points, the fluid model may grow precise enough within 10 or 15 years to prove or refute a promising Big Bang theory called “slow-roll inflation,” which says the universe ballooned into existence when an entity called the inflaton field slowly slid from one state to another. “There has been a big community trying to do this type of calculation for a long time,” said Matias Zaldarriaga, a professor of cosmology at the Institute for Advanced Study in Princeton, N.J. Further in the future, the researchers say, applying effective field theory to even bigger datasets could reveal properties of the inflaton field, which would help physicists build a theory to explain it.

“It’s obviously the right tool to be using,” said John Joseph Carrasco, a theoretical physicist at Stanford. “And it’s the right time.”

Senatore, Carrasco and their Stanford collaborator Mark Hertzberg first proposed the fluid approach to modeling the universe’s large-scale structure in a 2012 paper in the Journal of High Energy Physics, motivated by the Big Bang details it could help them glean from the increasingly enormous data sets. Other researchers have since jumped on board, helping to hone the method in a slew of papers, talks and an upcoming workshop. “We’re a small, plucky band of people who are convinced this is the way forward,” said Sean Carroll, a theoretical cosmologist at the California Institute of Technology.

A Fluid Cosmos
In water, chocolate syrup and other fluids, matter is smoothly distributed on large scales and partitioned into chunks, such as atoms or molecules, on small scales. To calculate the behavior of water on the human scale, where it is a fluid, it isn’t necessary to take into account every collision between H₂O molecules on the atomic scale. In fact, having to do so would render the calculation impossible. Instead, the collective effects of all the molecular interactions at the atomic scale can be averaged and represented in the fluid equations as “bulk” properties. (Viscosity, for example, is a measure of the friction between particles and depends on their size and shape as well as the forces between them.)
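A minimal numerical sketch of that averaging trick (illustrative only; the field, noise level, and smoothing window are invented for the example): a "microscopic" signal is split into a smooth bulk component plus small-scale residuals, whose collective effect a fluid description would fold into bulk parameters such as viscosity.

```python
import numpy as np

# Toy field: a smooth large-scale wave plus small-scale "molecular" noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 10_000)
micro = np.sin(2 * np.pi * x) + 0.5 * rng.standard_normal(x.size)

# Coarse-grain with a running average over 200 grid points: the bulk field.
window = 200
bulk = np.convolve(micro, np.ones(window) / window, mode="same")

# The residual is what gets summarized as bulk properties, not tracked in detail.
residual = micro - bulk
print(f"std(micro)    = {micro.std():.3f}")
print(f"std(bulk)     = {bulk.std():.3f}")
print(f"std(residual) = {residual.std():.3f}")
```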

A similar trick works for modeling the evolution of the universe’s large-scale structure.

Just like water, the universe is smooth on large scales: The same amount of matter exists in one billion-light-year-wide region as the next. Slight variations in the matter distribution, such as more- and less-dense patches of galaxies, appear when you zoom in. At short distances, the variation becomes extreme: Individual galaxies are surrounded by voids, and within the galaxies, stars pinprick empty space. The matter distribution is constantly changing at every scale as gravity causes stars, galaxies and galaxy clusters alike to clump together and dark energy stretches the space between them. By modeling these changes, cosmologists can use the output — galaxy distribution data — to deduce the input — the initial conditions of the universe.
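The scale dependence described above is easy to demonstrate with a toy clumpy field (a hypothetical lognormal mock, not real survey data): the mean density averaged over large regions scatters far less than the mean over small ones.

```python
import numpy as np

rng = np.random.default_rng(1)
density = rng.lognormal(sigma=1.0, size=2**20)  # clumpy mock density field

# Average the field over boxes of increasing size; the scatter of the
# box-averaged density shrinks roughly as 1/sqrt(box size).
for box in (16, 1_024, 65_536):
    box_means = density.reshape(-1, box).mean(axis=1)
    print(f"box of {box:6d} cells -> scatter of mean density {box_means.std():.4f}")
```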

To a first approximation, the matter distribution at each distance scale (from large to small) can be treated as if it evolves independently. However, just as small ripples in the surface of water can affect the evolution of bigger waves, smaller clumps of matter in the universe (such as galaxy clusters) gravitationally influence the larger clumps that encompass them (such as superclusters). Accounting for this interplay in models of cosmic evolution is problematic because the gravitational effects at the shortest distance scales — at which the universe is not smooth like a fluid but rather condensed into isolated, particlelike objects — sabotage the calculation.

Effective field theory fixes the problem by accounting for the interplay between scales only down to a few times the distance between galaxies. “Everything smaller than that length scale, we treat as complicated and hard to understand, and whatever goes on at those small scales can be bundled up into one big effect,” Carroll explained. The average gravitational effect of matter on small scales is represented as a fluid’s viscosity; hence, the connection between the cosmos and chocolate syrup.

Although the former is sparse and cold while the latter is thick and usually served warm, their viscosities are calculated from data and simulations to be almost exactly equal. The number means both fluids immediately damp out an incident sound wave. “It just goes ‘dum,’ and then it disappears,” Pajer said.
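A hedged toy calculation of that damping (arbitrary units, not the cosmological computation): in a viscous fluid, a perturbation of wavenumber k decays roughly as exp(-nu * k^2 * t), so a high enough viscosity kills an incident wave almost at once.

```python
import numpy as np

nu = 0.02        # kinematic viscosity, arbitrary units (assumed for the demo)
k = 2 * np.pi    # wavenumber of the incident perturbation

for t in np.linspace(0.0, 10.0, 6):
    amplitude = np.exp(-nu * k**2 * t)  # diffusive damping of the mode
    print(f"t = {t:4.1f}  relative amplitude = {amplitude:.4f}")
```

By t = 10 the mode is down to a few parts in ten thousand: it just goes “dum,” and then it disappears.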

The Ultimate Probe
“It’s still early days for the effective field theory of large-scale structure,” said Marc Kamionkowski, a professor of physics and astronomy at Johns Hopkins University who is not involved in developing the approach. While “it certainly does present some advantages,” he said, much work is needed before the tool can be used to extract new discoveries from astronomical data.

For example, so far, cosmologists have only developed an effective field theory model of the evolution of dark matter, an invisible substance that makes up roughly six-sevenths of the matter in the universe. Visible matter is slightly more complicated, and researchers say its behavior on short distance scales might be more difficult to represent as bulk properties of a fluid. “That is the next challenge,” said Zaldarriaga, who co-authored a November 2013 paper on the effective field theory approach. “We are doing one thing at a time.”

The researchers’ ultimate goal is to measure so-called “non-Gaussianities” in the initial conditions of the universe. If inflation theory is correct and an inflaton field briefly transitioned to an unstable state, causing space to balloon 10^78 times in volume, random ripples of energy called quantum fluctuations would have surfaced in the field and later grown into the large-scale structure that exists today. These ripples would be expected to follow a “Gaussian” distribution, in which energy is evenly distributed on both sides of a bell curve. Cosmologists look for non-Gaussianities, or subtle biases in the energy distribution, as signs of other, more meaningful events during inflation, such as interactions between multiple inflaton fields. The recently released Planck satellite image of the cosmic microwave background indicated that energy fluctuations in the primordial universe followed a Gaussian curve to at least one part in 100,000, compatible with the slow-roll model in which the universe arose from a single inflaton field. But alternative models that would have produced even smaller amounts of non-Gaussianity have not yet been ruled out.
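As a concrete, much-simplified illustration (skewness is only a crude stand-in for the bispectrum-based estimators cosmologists actually use, and the size of the distortion is invented), here is how a tiny quadratic distortion of a Gaussian field shows up as non-Gaussianity:

```python
import numpy as np

rng = np.random.default_rng(2)
g = rng.standard_normal(1_000_000)   # Gaussian fluctuations
ng = g + 5e-3 * (g**2 - 1.0)         # tiny quadratic distortion (assumed size)

def skewness(x):
    """Third standardized moment: zero for a perfectly Gaussian field."""
    z = (x - x.mean()) / x.std()
    return (z**3).mean()

print(f"skewness, Gaussian field:  {skewness(g):+.4f}")   # ~ 0
print(f"skewness, distorted field: {skewness(ng):+.4f}")  # ~ +0.03
```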

By tuning the effective field theory model with galaxy distribution data from imminent sky surveys such as the Large Synoptic Survey Telescope project and Euclid mission, cosmologists estimate that it may be possible to improve detection of non-Gaussianities by a factor of 10 or 20. If none is detected at that sensitivity level, “we can be sure it is standard slow-roll inflation,” Senatore said. “This is extremely exciting.”

If it can be proved that the Big Bang began with slow-roll inflation, the next task would be to probe the properties of the “inflaton” — the particle associated with the inflaton field, and a component of an all-encompassing theory of nature. During inflation, the inflaton must at least have interacted with itself and gravity, and both interactions would nudge the inflaton field’s energy distribution ever so slightly to one side or another. Planned sky surveys will not be sensitive enough to detect such subtle non-Gaussianities, but researchers expect them to be imprinted on a signal emitted by hydrogen gas in the early universe. “This is the ultimate probe,” Pajer said.

Telescopes should detect this hydrogen signal, called the 21-centimeter line, in approximately 30 or 40 years, and effective field theory will be used to try to tease out the non-Gaussianities. “While we’re old,” said Senatore, who is 35, “we will for sure detect something.”

Reprinted with permission from Quanta Magazine, an editorially independent division of SimonsFoundation.org whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

 


How Tampering with the Immune System Could Prevent Damage After a Heart Attack

 

 

Microparticles that block the body's immune response to damaged tissue could help prevent further harm.

By Mike Orcutt

Using tiny biodegradable particles to interrupt the body's normal immune response after a heart attack could help save patients from tissue damage and from some of the long-term health complications that can follow.
Researchers have shown that injecting the particles into mice within 24 hours of a heart attack not only significantly reduces tissue damage, but also leaves those mice with stronger cardiac function 30 days later. The inventors of the new technology now plan to move on to human trials.
Much of the tissue damage that follows a heart attack results from inflammation, the body's immune response to a harmful stimulus such as injured muscle. After a heart attack, though, these immune cells do more harm than good, explains Daniel Getts, inventor of the new therapy and chief scientific officer of Cour Pharmaceutical Development.
The immune system's arsenal is "fairly generic," he explains. The toxic compounds that immune cells secrete can be useful in defending the body against infection, but they can also cause damage. This happens not only after a heart attack but also in a range of other diseases, including West Nile virus, inflammatory bowel disease, and multiple sclerosis.
The 500-nanometer particles must be negatively charged and can be made of various materials, including the one used in biodegradable sutures. According to the new research, once the particles are in the bloodstream, their negative charge attracts a specific receptor on the surface of inflammatory monocytes; the particles bind to that receptor and divert the immune cells away from the heart and toward the spleen, where they die.
Keeping these cells from reaching the heart allows the damaged muscle to regenerate "through more controlled processes," Getts explains. If the therapy translates to people, he adds, it could substantially reduce the long-term health problems that heart attack patients face, which include shortness of breath and a limited capacity for exercise.
The goal is to begin human trials early next year. The company is counting on the therapy's relatively simple mechanism, and on the fact that the particles are made of polyglycolic acid, a material already approved by the Food and Drug Administration, to speed up the development process.
"There is still a lot of work to do," in particular resolving any potential side effects the particles might produce, says Matthias Nahrendorf, a professor of systems biology at Harvard. For example, the particles may activate the immune system in ways that are still unknown, he explains. It will also be important to determine how to administer the therapy so as not to compromise these cells' ability to heal the body and defend it against infections and other outside invaders, Nahrendorf says.

 


Technology Scenarios 2025

 

 

The articles published in the US edition of MIT Technology Review at the end of the year, reviewing the best work of 2013 in computing, the web, energy, and the life sciences, have a double value. Not only do they offer a snapshot of the most important and topical developments of the past twelve months; they also provide a solid basis for a ten-year leap forward, toward building a technology scenario for 2025.
Submitting the lists of new technologies described in 2013 to experts from our editorial staff and our scientific committee produced interesting, and not always unanimous, assessments.
A typical case is the differing evaluation of nuclear energy, in the form of both fission and fusion, as a tool for solving the planet's energy problem.
The final picture, or rather the final pictures, are presented here as tables that list the emerging technologies and rank them by their likelihood of being translated into innovation within about ten years. The term "translated," now very fashionable, derives from the American expression "translational research," which denotes precisely the ability to move from the scientist's laboratory to the market and to widespread adoption.
The next step in building our 2025 scenario was a conversation with Roberto Cingolani, scientific director of the Istituto Italiano di Tecnologia in Genoa, about the difficulty of making predictions in a world that is changing so rapidly and in which apparently very different sectors are so deeply interdependent.
"Let's take just three examples," Cingolani says, "following your magazine's scheme: biotechnology, portable energy sources, and the pervasiveness of the Internet."
In biotechnology, the feature common to all the most advanced research is that it involves molecular-level interventions on single cells. That demands deep interdisciplinarity among molecular biology, nanoscale physics, the imaging technologies that let us look inside the molecular dimension and, finally, medicine.
When he describes the possibility of packing into a single vector, a virus for example, both the diagnostic capability to identify a miscoded segment of DNA and the ability to replace it with a correct one directly in situ, I find myself asking with a smile: "By 2025?" Cingolani replies: "The year is hard to say, but you will see that it won't be long. The time from laboratory to market keeps getting shorter."
"In energy," Cingolani continues, "I would not consider the nuclear story closed. There is still a long way to go on fusion, but the effort remains intense and, even if by 2025 it will certainly not have produced specific impacts in terms of innovation, the field is still open to possible results. Certainly not cold fusion, but plasma-confinement machines could end up bringing the Sun down to Earth. In fission I see only more or less conventional reactors of smaller size.
"The most serious problem, that of long-lived waste, would not be solved. Now, if a new fuel cycle were launched, producing fission elements different from today's (thorium, for example), the risk profile could change. But the investment a new fuel cycle requires is so large that I doubt anyone will be willing to make it in the coming decade.
"Among the alternative energies, solar cells still have good margins for improvement and, as batteries for storing surplus energy keep getting better, solar should continue to increase its contribution to clean power production.
"But the real revolution could come, and I believe it will come, from the fascinating field of so-called portable energy harvesters. 'Energy harvesting' is hard to render in Italian; the expression means to reap, to gather.
"Each of us produces and dissipates a good few watts of power while walking, exercising, or standing in the wind. The metabolism of sugars produces energy continuously, in a way we ought to learn to imitate. If all these individually generated watts were multiplied across hundreds of millions, if not billions, of individuals, enough energy could be collected to do without a good many megawatts of installed capacity.
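An illustrative back-of-envelope check of that claim, with assumed figures (a few watts per person, a billion people; the numbers are ours, not Cingolani's):

```python
# Back-of-envelope: harvestable waste power, illustrative assumed figures.
watts_per_person = 5            # a few watts dissipated per person (assumed)
people = 1_000_000_000          # a billion individuals (assumed)

total_megawatts = watts_per_person * people / 1e6
print(f"Harvestable power: {total_megawatts:,.0f} MW")  # 5,000 MW
```

That is the output of several large power plants, which is the sense in which one could "do without a good many megawatts of installed capacity."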
"There would be non-negligible gains from adopting new fabrics for clothes that harvest the energy of air friction as we move, or shoe soles that harvest energy at every step. There is a great deal to do in energy harvesting, and many have already begun doing it. Relative to its potential, the subject receives far too little attention.
"A subject that, on the contrary, strikes me as receiving too much attention is the gigantic growth in connectivity that, somewhat fancifully and appealingly, has been dubbed the Internet of Things. Here too, multidisciplinarity reigns, from server technologies to microelectronics, from sensors to semantic software. If any one piece lags behind, everything stops.
"Today roughly 2 percent of what surrounds us is networked. Thinking, as some do, that we could reach 100 percent strikes me as frankly impossible, and inadvisable besides. Connecting everything to everyone would require so much redundancy, to keep life running normally whenever something failed, that in the end the cost-benefit ratio might well prove unsatisfactory. Let's settle for another 10 percent."
Finally, I ask him about driverless cars. He answers: "I can't see myself in one. I love driving. Granted, swarms of trucks, or even of aircraft, offer significant economic advantages. But if I bought myself a beautiful car, I certainly wouldn't want to hand the driving over to a computer."
OK, Roberto. I'll keep my all-manual red 1966 Alfa Duetto until 2025!
Table 1
BIO Milestones (2014-2025)
(in order of impact)
1. Immunotherapy
2. Regenerative biology (to create organ samples for pharmacological rather than replacement purposes)
3. Gene editing
4. Personalized stem cells
5. Miniaturization of imaging equipment
6. BRAIN (Brain Research Advanced Neurotech)
(of equal impact, almost certainly achievable by 2025)
* New DNA-sequencing techniques (Illumina)
* HIV therapies for newborns
* Malaria therapies
* Thalassemia therapy

Table 2
ENERGY Milestones (2014-2025)
1. CO2 capture (despite renewables, fossil fuels also continue to grow)
2. A new generation of flexible, low-cost solar panels
3. New batteries for storing electricity (greater capacity, faster recharging)
4. Integration of different portable energy harvesters
5. Advances in LED lighting
6. Home insulation and energy recovery
7. New lines of small nuclear reactors
8. New molten-salt nuclear reactors with very low levels of long-lived waste
9. Graphene
10. Advances in nuclear fusion (from Ignitor to ITER to laser concentration)

Table 3
COMPUTING AND AUTOMATION Milestones (2014-2025)
1. Technologies for wearable devices
- Smart watches (Qualcomm, Samsung)
- Google Glass (a system for augmenting the person)
2. The Internet of Things, expanded
3. Helium-filled drives to reduce friction
4. Quantum computing
5. Robotics
- Artificial intelligence based on neuromorphic research
- Optical sensors based on a model of the human retina

 
