
Scientific Indian

Trapping atoms in a laser beam offers a new way to measure gravity

The technique can measure slight gravitational variations, which could help in mapping terrain

By Maria Temming

By watching how atoms behave when they’re suspended in midair, rather than in free fall, physicists have come up with a new way to measure Earth’s gravity. 

A new type of experiment to measure the strength of gravity uses atoms suspended in laser light (with the machinery pictured above), rather than free-falling atoms.

Traditionally, scientists have measured gravity’s influence on atoms by tracking how fast atoms tumble down tall chutes. Such experiments can help test Einstein’s theory of gravity and precisely measure fundamental constants (SN: 4/12/18). But the meters-long tubes used in free-fall experiments can be unwieldy and difficult to shield from environmental interference such as stray magnetic fields. With a new tabletop setup, physicists can gauge the strength of Earth’s gravity by monitoring atoms suspended a couple millimeters in the air by laser light. 

This redesign, described in the Nov. 8 Science, could better probe the gravitational forces exerted by small objects. The technique also could be used to measure slight gravitational variations at different places in the world, which may help in mapping the seafloor or finding oil and minerals underground (SN: 2/12/08). 

Physicist Victoria Xu and colleagues at the University of California, Berkeley began by launching a cloud of cesium atoms into the air and using flashes of light to split each atom into a superposition state. In this weird quantum limbo, each atom exists in two places at once: one version of the atom hovering a few micrometers higher than the other. Xu’s team then trapped these split cesium atoms in midair with light from a laser. 

Got you, atom

To measure gravity, physicists split atoms into a weird quantum state called superposition — where one version of the atom is slightly higher than the other (blue dots connected by vertical yellow bands in this illustration). The researchers trap these atoms in midair using laser light (horizontal blue bands). While held in the light, each version of a single atom behaves slightly differently, due to their different positions in Earth’s gravitational field. Measuring those differences allows physicists to determine the strength of Earth’s gravity at that location.

Measuring the strength of gravity with atoms that are held in place, rather than being tugged downward by a gravitational field, requires tapping into the atoms’ wave-particle duality (SN: 11/5/10). That quantum effect means that, much as light waves can act like particles called photons, atoms can act like waves. And for each cesium atom caught in superposition, the higher version of the atom wave undulates a little faster than its lower counterpart, due to the atoms’ slightly different positions in Earth’s gravitational field. By tracking how fast the waviness of the two versions of an atom gets out of sync, physicists can calculate the strength of Earth’s gravity at that spot. 
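As a rough back-of-envelope sketch of that idea (our gloss on standard atom-interferometry physics, not a formula quoted from the paper): the phase difference between the two versions of an atom held a height Δz apart grows with hold time T roughly as

\Delta\phi \approx \frac{m\,g\,\Delta z}{\hbar}\,T

where m is the cesium atom's mass, g is the local gravitational acceleration and ħ is the reduced Planck constant. With m ≈ 2.2 × 10⁻²⁵ kg and Δz of a few micrometers, the two versions drift out of sync at a rate of order 10⁵ radians per second, so even modest hold times yield a sensitive measurement of g.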

“Very impressive,” says physicist Alan Jamison of MIT. To him, one big promise of the new technique is more controlled measurements. “It’s quite a challenge to work on these drop experiments, where you have a 10-meter-long tower,” he says. “Magnetic fields are hard to shield, and the environment produces them all over the place — all the electrical systems in your building, and so forth. Working in a smaller volume makes it easier to avoid those environmental noises.” 

More compact equipment can also measure shorter-range gravity effects, says study coauthor Holger Müller. “Let’s say you don’t want to measure the gravity of the entire Earth, but you want to measure the gravity of a small thing, such as a marble,” he says. “We just need to put the marble close to our atoms [and hold it there]. In a traditional free-fall setup, the atoms would spend a very short time close to our marble — milliseconds — and we would get much less signal.” 

Physicist Kai Bongs of the University of Birmingham in England imagines using the new kind of atomic gravimeter to investigate the nature of dark matter or test a fundamental facet of Einstein’s theory of gravity called the equivalence principle (SN: 4/28/17). Many unified theories of physics proposed to reconcile quantum mechanics and Einstein’s theory of gravity — which are incompatible — violate the equivalence principle in some way. “So looking for violations might guide us to the grand unified theory,” he says. “That’s one of the Holy Grails in physics.”

 

NASA's NICER Catches Record-setting X-ray Burst
 

NASA’s Neutron star Interior Composition Explorer (NICER) telescope on the International Space Station detected a sudden spike of X-rays at about 10:04 p.m. EDT on Aug. 20. The burst was caused by a massive thermonuclear flash on the surface of a pulsar, the crushed remains of a star that long ago exploded as a supernova.

The X-ray burst, the brightest seen by NICER so far, came from an object named SAX J1808.4-3658, or J1808 for short. The observations reveal many phenomena that have never been seen together in a single burst. In addition, the subsiding fireball briefly brightened again for reasons astronomers cannot yet explain.

“This burst was outstanding,” said lead researcher Peter Bult, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and the University of Maryland, College Park. “We see a two-step change in brightness, which we think is caused by the ejection of separate layers from the pulsar surface, and other features that will help us decode the physics of these powerful events.”

The explosion, which astronomers classify as a Type I X-ray burst, released as much energy in 20 seconds as the Sun does in nearly 10 days. The detail NICER captured on this record-setting eruption will help astronomers fine-tune their understanding of the physical processes driving the thermonuclear flare-ups of J1808 and other bursting pulsars.
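For a rough sense of scale (our arithmetic based on the figures quoted above, not numbers from the paper): the Sun's luminosity is about 3.8 × 10²⁶ W, so over 10 days it radiates

E \approx 3.8\times10^{26}\,\mathrm{W} \times 8.6\times10^{5}\,\mathrm{s} \approx 3\times10^{32}\,\mathrm{J}.

Releasing that much energy in 20 seconds implies an average burst luminosity of roughly 1.6 × 10³¹ W, or some 40,000 times the Sun's output.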

A thermonuclear blast on a pulsar called J1808 resulted in the brightest burst of X-rays seen to date by NASA’s Neutron star Interior Composition Explorer (NICER) telescope. The explosion occurred on Aug. 20, 2019, and released as much energy in 20 seconds as our Sun does in almost 10 days. 

A pulsar is a kind of neutron star, the compact core left behind when a massive star runs out of fuel, collapses under its own weight, and explodes. Pulsars can spin rapidly and host X-ray-emitting hot spots at their magnetic poles. As the object spins, it sweeps the hot spots across our line of sight, producing regular pulses of high-energy radiation.

J1808 is located about 11,000 light-years away in the constellation Sagittarius. It spins at a dizzying 401 rotations each second, and is one member of a binary system. Its companion is a brown dwarf, an object larger than a giant planet yet too small to be a star. A steady stream of hydrogen gas flows from the companion toward the neutron star, and it accumulates in a vast storage structure called an accretion disk.

Gas in accretion disks doesn’t move inward easily. But every few years, the disks around pulsars like J1808 become so dense that a large amount of the gas becomes ionized, or stripped of its electrons. This makes it more difficult for light to move through the disk. The trapped energy starts a runaway process of heating and ionization that traps yet more energy. The gas becomes more resistant to flow and starts spiraling inward, ultimately falling onto the pulsar.

Hydrogen raining onto the surface forms a hot, ever-deepening global “sea.” At the base of this layer, temperatures and pressures increase until hydrogen nuclei fuse to form helium nuclei, which produces energy — a process at work in the core of our Sun.     

“The helium settles out and builds up a layer of its own,” said Goddard’s Zaven Arzoumanian, the deputy principal investigator for NICER and a co-author of the paper. “Once the helium layer is a few meters deep, the conditions allow helium nuclei to fuse into carbon. Then the helium erupts explosively and unleashes a thermonuclear fireball across the entire pulsar surface.”

Astronomers employ a concept called the Eddington limit — named for English astrophysicist Sir Arthur Eddington — to describe the maximum radiation intensity a star can have before that radiation causes the star to expand. This point depends strongly on the composition of the material lying above the emission source.  
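For reference, the textbook form of this limit (our gloss, not an equation from the study) for fully ionized gas with electron-scattering opacity is

L_{\mathrm{Edd}} \approx \frac{4\pi G M m_{\mathrm{H}} c}{\sigma_{\mathrm{T}}}\,\frac{2}{1+X},

where M is the star's mass, m_H the hydrogen atom mass, σ_T the Thomson cross section and X the hydrogen mass fraction of the overlying material. Because hydrogen-free (helium) material has X = 0, its Eddington limit is roughly twice that of hydrogen-rich material, which is why a burst that sheds first a hydrogen layer and then a helium layer can show two distinct brightness plateaus.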

“Our study exploits this longstanding concept in a new way,” said co-author Deepto Chakrabarty, a professor of physics at the Massachusetts Institute of Technology in Cambridge. “We are apparently seeing the Eddington limit for two different compositions in the same X-ray burst. This is a very powerful and direct way of following the nuclear burning reactions that underlie the event.”

As the burst started, NICER data show that its X-ray brightness leveled off for almost a second before increasing again at a slower pace. The researchers interpret this “stall” as the moment when the energy of the blast built up enough to blow the pulsar’s hydrogen layer into space.

The fireball continued to build for another two seconds and then reached its peak, blowing off the more massive helium layer. The helium expanded faster, overtook the hydrogen layer before it could dissipate, and then slowed, stopped and settled back down onto the pulsar’s surface. Following this phase, the pulsar briefly brightened again by roughly 20 percent for reasons the team does not yet understand.

During J1808’s recent round of activity, NICER detected another, much fainter X-ray burst that displayed none of the key features observed in the Aug. 20 event.

In addition to detecting the expansion of different layers, NICER observations of the blast reveal X-rays reflecting off of the accretion disk and record the flickering of “burst oscillations” — X-ray signals that rise and fall at the pulsar’s spin frequency but that occur at different surface locations than the hot spots responsible for its normal X-ray pulses.

A paper describing the findings has been published by The Astrophysical Journal Letters and is available online.

NICER is an Astrophysics Mission of Opportunity within NASA's Explorer program, which provides frequent flight opportunities for world-class scientific investigations from space utilizing innovative, streamlined, and efficient management approaches within the heliophysics and astrophysics science areas. NASA's Space Technology Mission Directorate supports the SEXTANT component of the mission, demonstrating pulsar-based spacecraft navigation.



 

New research synthesizes different aspects of causality in quantum field theory

by Ingrid Fadelli
 

In current quantum field theory, causality is typically defined by the vanishing of field commutators for spacelike separations. Two researchers at the University of Massachusetts and the Universidade Federal Rural do Rio de Janeiro have recently carried out a study discussing and synthesizing some of the key aspects of causality in quantum field theory. Their paper, published in Physical Review Letters, is the result of their investigation of a theory of quantum gravity commonly referred to as "quadratic gravity."
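In standard notation, the microcausality condition mentioned above (a textbook statement, not a result of the new paper) reads, for a scalar field φ and a mostly-minus metric,

[\hat{\phi}(x), \hat{\phi}(y)] = 0 \quad \text{whenever} \quad (x-y)^2 < 0,

i.e. field operators at spacelike-separated points commute, so a measurement at one point cannot influence another outside its light cone.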

"Like the ingredients of the standard model, quadratic gravity is a renormalizable quantum field theory, but it has some peculiar properties," John Donoghue, one of the researchers who carried out the study, told Phys.org. "The small violation of causality is the most important of these and our goal was to understand this better. In the process, we realized that some of the insights are of more general interest and we decided to write our understanding as a Physical Review Letter, to share these insights more widely."

The paper written by Donoghue and his colleague Gabriel Menezes synthesizes many different aspects of causality that have been part of quantum field theory for several decades now. The realization that there can be microscopic violations of causality in certain theories dates back to the 1960s, specifically to the work of physicists T.D. Lee and G.C. Wick. In their study, however, Donoghue and Menezes also drew inspiration from a more recent study carried out by Donal O'Connell, Benjamin Grinstein and Mark B. Wise.

So far, most theoretical discussions about causality, specifically the "arrow of time," have asserted that the laws of physics do not have any preference for the flow of time. However, this particular assumption is not applicable to quantum physics, where a direction for causal effects is present.

"The various factors of i in the quantization procedures are related to the direction of causal action, which leads to the 'arrow of causality' in quantum physics," Donoghue explained. "This connection is not discussed very often."

Donoghue and Menezes were intrigued by the fact that the macroscopic experience of causality, which also governs classical physics, arises from the underlying structure of quantum theory. In their recent paper, they therefore examined this particular aspect of causality further, in order to gather insight about its meaning and implications.

"The idea that there can be dueling arrows of causality within the same theory is even more obscure," Donoghue said. "However, it happens in a very simple setting where the Lagrangian for the theory has more powers of derivatives than usual. This is what happens in quadratic gravity, but it could also happen in other theories too."

Even though the direction of causal influence is a convention tied to how one chooses to describe the measurement of time, its existence is a necessary requirement of the laws of quantum physics. In this context, Donoghue and Menezes observed that the arrow of causality can potentially be violated when conflicting conventions coexist within the same theory.

"Perhaps the most important implication of our study is that we collected evidence of causal uncertainty due to spacetime fluctuations that can arise in a quantum theory of gravity," Menezes said. "This would provide us with a deep intuitive understanding of the origins of causality."

About a decade ago, O'Connell, Grinstein and Wise carried out a study that was partially based on a series of lectures by Sidney Coleman. They specifically suggested that, in a wavepacket description of a scattering process with mixed causal arrows, the decay products can be detected earlier than would be expected from the time of production, with the associated probability of detection decreasing exponentially backward in time. In their study, Donoghue and Menezes examined this idea further.

"An implication of our study is that while the ideas put forward by O'Connell and his colleagues, as well as other research teams, could in principle be observed, there is no conflict with experiments in the case of gravity as the phenomenon occurs at energies of order of the Planck scale, which is 15 orders of magnitude greater than the energy range accessible to the LHC," Menezes said.

The recent study by Donoghue and Menezes offers a general and valuable discussion of causality and the arrow of causality, specifically focusing on how a given theory may have both forward and backward arrows. This discussion touches on the topic of time reversal in field theory, so it could inform a variety of physics studies. It could also help to clarify the quantum theory of quadratic gravity, which still has many unanswered questions.

Overall, Donoghue and Menezes suggest that mixed conventions in individual physics theories could in fact be possible and that future studies should explore this topic further. The researchers are now working on a project aimed at fully exploring the phenomenon of causality uncertainty due to quantum fluctuations of the gravitational field.

"There are some other technical considerations that we need to address concerning this description of quantum gravity as a renormalizable quantum field theory," Menezes said. "One of them concerns the stability of quadratic gravity in curved backgrounds, which has already been studied by other authors. Hopefully these will also be part of this future work. In any case, the most intriguing investigation we hope to conduct will be the study of the effect of the causal uncertainty in the early Universe."



 
Origins of life: new evidence first cells could have formed at the bottom of the ocean

By Sean Jordan
 

Where did life come from? In recent years, many scientists have shifted from favouring a “primordial soup” in pools of water to hydrothermal vents deep in the ocean as the original source of life on Earth. But one of the biggest problems with this idea is that researchers have been unable to recreate in the lab one of the key processes that would have been involved if this theory was true.

Specifically, they haven’t been able to form simple cell membranes in seawater-like conditions, which most agree would have been required to create the first living organisms. But my colleagues and I have recently shown, in a paper in Nature Ecology and Evolution, that the combination of molecules scientists have used to recreate these membranes doesn’t reflect the components that would have been available at the time. In fact, we found that, with the right ingredients, ocean vent conditions are actually necessary to form some cell membranes.

Imagine the Earth 4.5 billion years ago. The period of geological history we call the Hadean was not as hellish as we once believed. It was not a sea of lava fuelled by countless volcanoes, although they certainly existed. It was probably more like small areas of rocky surface surrounded by a substantial global water ocean.

But it was not the ocean we know today. It was warmer, more acidic and rich in iron. The atmosphere was mostly nitrogen and carbon dioxide, with no oxygen. There was also no life. Yet deep down at the bottom of the ocean, something was beginning to happen.

Hot fluids rising through the sea floor drove a chemical reaction between hydrogen and carbon dioxide, producing simple organic compounds. These organic molecules reacted to form increasingly complex compounds. These became encapsulated in simple cell membranes and grew further in complexity, producing molecules that could carry information and eventually DNA. These were the first living cells that could grow, divide and evolve.
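One frequently cited example of such vent chemistry (given here as an illustration; the article does not single out a specific reaction) is the reduction of carbon dioxide by hydrogen, which in its simplest form yields methane and, via intermediates such as formate, other small organic molecules:

\mathrm{CO_2 + 4\,H_2 \rightarrow CH_4 + 2\,H_2O}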

At least that may be how life on Earth began. There are still many different theories of how and where life started, including those proposing hydrothermal ponds, ice sheets or even outer space. To try to understand which of these settings is the most plausible, scientists take the different components essential for life and see if we can find a way to reproduce them in different conditions in the lab.

Cell membrane, a prerequisite for life. 

We are not trying to create life, merely the different bits and pieces it requires. Not everyone can agree what life actually is, but one thing that many scientists in this field agree on is that the first living organism would have had a cell membrane.

Cell membranes are mostly made up of phospholipids, which are composed of simple molecules including fatty acids, isoprenoids and sugars. Phospholipids are only produced by living organisms. But fatty acids can be formed in the environment through reactions between rocks and water, and isoprenoids or similar molecules may also be produced this way.

These simple molecules form membrane structures called vesicles, which are cell-like in that they form a double-layer membrane surrounding a space full of water. It turns out vesicles can perform many of the same functions as cell membranes. This has led origin-of-life researchers to investigate the possibility that vesicles may have been the first cell membranes, earning them the title of “protocells”.

A lot of experiments have been performed on protocells to see if they have the properties the first living cells would have needed. For example, they form readily in water, can encapsulate other organic molecules, and the DNA-like molecule RNA can be made inside them. One big problem is that these protocells do not like salt. In fact, they hate it.

Researchers have shown protocells just will not form in the presence of the concentrations of sodium chloride, magnesium and calcium found in seawater. This has led some to claim that life couldn’t possibly have started in the ocean.

 

Necessary conditions

But my colleagues and I noticed something about all of this previous research. The artificial protocells were made from between one and three types of molecule, even though experiments have shown there would be many more molecules available in the primordial ocean.

When we used a combination of 14 molecules, we found we could form protocells that could encapsulate organic molecules, even in mixtures of sodium chloride, magnesium and calcium at seawater concentrations. The solutions had to be around 70°C and alkaline, around pH 12.

Our work shows not only that these protocells can form in the conditions created by hydrothermal vents, but that they actually need these conditions to survive. This doesn’t prove life started in the vents, but it does renew the possibility that it did. It’s also relevant to the search for life on other planets. It is possible that alkaline hydrothermal vents exist at the bottom of the oceans on the icy moons of Jupiter and Saturn.

However, the problem of the origin of life is not solved yet, with ongoing promising research from several different theories. It is a very exciting time for the field and we are slowly getting closer to the answer to one of life’s most fundamental questions. We have shown that membranes can form where it was previously thought impossible. Who knows what will be proven possible in the future? As evidence builds for each of these theories, eventually it will become clear which environment was the most likely site of the origin of life.

 



Solving a Riddle That Would Provide the World With Entirely Clean, Renewable Energy

By Trinity College Dublin
 

Scientists from Trinity College Dublin have taken a giant stride towards solving a riddle that would provide the world with entirely renewable, clean energy from which water would be the only waste product.

Reducing humanity’s carbon dioxide (CO2) emissions is arguably the greatest challenge facing 21st-century civilization – especially given the ever-increasing global population and the heightened energy demands that come with it.


One beacon of hope is the idea that we could use renewable electricity to split water (H2O) to produce energy-rich hydrogen (H2), which could then be stored and used in fuel cells. This is an especially interesting prospect in a situation where wind and solar energy sources produce electricity to split water, as this would allow us to store energy for use when those renewable sources are not available.

The essential problem, however, is that water is very stable and requires a great deal of energy to break up. A particularly major hurdle to clear is the energy or “overpotential” associated with the production of oxygen, which is the bottleneck reaction in splitting water to produce H2.
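For context, the standard electrochemistry behind that statement (textbook values, not results from the Trinity paper): splitting water requires driving the oxygen-evolution half-reaction at the anode,

\mathrm{2\,H_2O \rightarrow O_2 + 4\,H^+ + 4\,e^-}, \qquad E^{\circ} \approx +1.23\ \mathrm{V},

alongside hydrogen evolution at the cathode (E° = 0 V by definition), so a cell needs at least 1.23 V thermodynamically. The overpotential is the extra voltage a real catalyst demands on top of that minimum, η = E_applied − 1.23 V, and most of it is lost at the oxygen-producing electrode; a better oxygen catalyst therefore means a smaller η and less wasted electricity.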

Although certain elements are effective at splitting water, such as ruthenium or iridium (two of the so-called noble metals of the periodic table), these are prohibitively expensive for commercialization. Other, cheaper options tend to suffer in terms of their efficiency and/or their robustness. In fact, at present, nobody has discovered catalysts that are cost-effective, highly active and robust for significant periods of time.

So, how do you solve such a riddle? Stop before you imagine lab coats, glasses, beakers, and funny smells; this work was done entirely through a computer.

By bringing together chemists and theoretical physicists, the Trinity team behind the latest breakthrough combined chemistry smarts with very powerful computers to find one of the “holy grails” of catalysis.

The team, led by Professor Max García-Melchor, made a crucial discovery when investigating molecules that produce oxygen: Science had been underestimating the activity of some of the more reactive catalysts and, as a result, the dreaded “overpotential” hurdle now seems easier to clear. Furthermore, in refining a long-accepted theoretical model used to predict the efficiency of water splitting catalysts, they have made it immeasurably easier for people (or super-computers) to search for the elusive “green bullet” catalyst.

Lead author Michael Craig, of Trinity, is excited to put this insight to use. He said: "We know what we need to optimize now, so it is just a case of finding the right combinations."

The team aims to now use artificial intelligence to put a large number of earth-abundant metals and ligands (which glue them together to generate the catalysts) in a melting pot before assessing which of the near-infinite combinations yield the greatest promise.

In combination, what once looked like an empty canvas now looks more like a paint-by-numbers as the team has established fundamental principles for the design of ideal catalysts.

Professor Max García-Melchor added: “Given the increasingly pressing need to find green energy solutions it is no surprise that scientists have, for some time, been hunting for a magical catalyst that would allow us to split water electrochemically in a cost-effective, reliable way. However, it is no exaggeration to say that before now such a hunt was akin to looking for a needle in a haystack. We are not over the finishing line yet, but we have significantly reduced the size of the haystack and we are convinced that artificial intelligence will help us hoover up plenty of the remaining hay.”

He also stressed: "This research is hugely exciting for a number of reasons and it would be incredible to play a role in making the world a more sustainable place. Additionally, this shows what can happen when researchers from different disciplines come together to apply their expertise to try to solve a problem that affects each and every one of us."

 

Photosynthesis seen in a new light by rapid X-ray pulses

Arizona State University

The ability to transform sunlight into energy is one of Nature's more remarkable feats. Scientists understand the basic process of photosynthesis, but many crucial details remain elusive, occurring at dimensions and fleeting time scales long deemed too minuscule to probe.

Now, that is changing.

In a new study, led by Petra Fromme and Nadia Zatsepin at the Biodesign Center for Applied Structural Discovery, the School of Molecular Sciences and the Department of Physics at ASU, researchers investigated the structure of Photosystem I (PSI) with ultrashort X-ray pulses at the European X-ray Free Electron Laser (EuXFEL), located in Hamburg, Germany.

PSI is a large biomolecular complex that acts as a solar energy converter, transforming sunlight into chemical energy. Photosynthesis provides energy for all complex life on Earth and supplies the oxygen we breathe. Advances in unraveling the secrets of photosynthesis promise to improve agriculture and aid in the development of next-generation solar energy storage systems that combine the efficiency of Nature with the stability of human-engineered systems.

"This work is so important, as it shows the first proof of concept of megahertz serial crystallography with one of the largest and most complex membrane proteins in photosynthesis: Photosystem I" says Fromme. "The work paves the way towards time-resolved studies at the EuXFEL to determine molecular movies of the light-driven path of the electrons in photosynthesis or visualize how cancer drugs attack malfunctioning proteins."

The EuXFEL, which recently began operation, is the first to employ a superconducting linear accelerator, which yields exciting new capabilities including very fast megahertz repetition rates of its X-ray pulses -- over 9000 times faster than any other XFEL -- with pulses separated by less than 1 millionth of a second. With these incredibly brief bursts of X-ray light, researchers will be able to record molecular movies of fundamental biological processes much more quickly, work that will likely impact diverse fields including medicine and pharmacology, chemistry, physics, materials science, energy research, environmental studies, electronics, nanotechnology, and photonics. Petra Fromme and Nadia Zatsepin are co-corresponding authors of the paper, published in the current issue of the journal Nature Communications.

Strength in numbers

Fromme is the director of the Biodesign Center for Applied Structural Discovery (CASD) and leads the experimental team efforts of the project, while Zatsepin led the XFEL data analysis team.

"This is a significant milestone in the development of serial femtosecond crystallography, building on the well-coordinated effort of a large, cross-disciplinary, international team and years of developments in disparate fields" emphasizes Zatsepin, former Research Assistant Professor in the ASU Department of Physics and Biodesign CASD, and now Senior Research Fellow at La Trobe University in Australia.

Christopher Gisriel, the paper's co-first author, worked on the project while a Postdoctoral Researcher in the Fromme laboratory and is excited about the project. "Fast data collection in serial femtosecond crystallography experiments makes this revolutionary technique more accessible to those interested in the structure-function relationship for enzymes. This is exemplified by our new publication in Nature Communications showing that even the most difficult and complex protein structures can be solved by serial femtosecond crystallography while collecting data at megahertz repetition rate."

"It is very exciting to see the hard work from the many folks that drove this project to materialize," says Jesse Coe, co-first author who graduated last year with a Ph.D. in Biochemistry from ASU. "This is a huge step in the right direction toward better understanding Nature's process of electron transfer that has been refined over billions of years. "

Extreme science

An XFEL (X-ray free-electron laser) delivers X-ray light that is a billion times brighter than conventional X-ray sources. The brilliant, laser-like X-ray pulses are produced by electrons accelerated to near light speed and fed through the gap between two rows of alternating magnets, a device known as an undulator. The undulator forces the electrons to jiggle and bunch up into discrete packages. Each of the perfectly synchronized wiggling electron bunches emits a powerful, brief X-ray pulse along the electron flight path.

In serial femtosecond crystallography, a jet of protein crystals is injected into the path of the pulsed XFEL beam at room temperature, yielding structural information in the form of diffraction patterns. From these patterns, scientists can determine atomic scale images of proteins in close-to-native conditions, paving the way toward accurate molecular movies of molecules at work.

X-rays damage biomolecules, a problem that has plagued structure determination efforts for decades, requiring the biomolecules to be frozen to limit the damage. But the X-ray bursts produced by an XFEL are so short -- lasting mere femtoseconds -- that X-ray scattering from a molecule can be recorded before destruction takes place, akin to using a fast camera shutter. As a point of reference, a femtosecond is a millionth of a billionth of a second, the same ratio as a second is to 32 million years.
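That comparison holds up to rough arithmetic (ours, not the authors'): one second contains 10¹⁵ femtoseconds, and

32\ \text{million years} \approx 3.2\times10^{7}\,\mathrm{yr} \times 3.15\times10^{7}\,\mathrm{s/yr} \approx 1.0\times10^{15}\ \mathrm{s},

so a femtosecond is to a second roughly as a second is to 32 million years.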

Due to the sophistication, size and cost of XFEL facilities, only five are currently available for such experiments worldwide -- a severe bottleneck for researchers since each XFEL can typically only host one experiment at a time. Most XFELs generate X-ray pulses between 30 and 120 times per second and it can take several hours to days to collect the data required to determine a single structure, let alone a series of frames in a molecular movie. The EuXFEL is the first to employ a superconducting linear accelerator in its design, enabling the fastest succession of X-ray pulses of any XFEL, which can significantly reduce the time it takes to determine each structure or frame of the movie.
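The same back-of-envelope logic explains the "over 9000 times faster" figure quoted earlier: reading that claim against the roughly 120 pulses per second of earlier machines implies an intra-train pulse rate of about 1.1 MHz (our inference, consistent with pulses arriving less than a microsecond apart, not a number stated in this article):

\frac{1.1\times10^{6}\ \mathrm{pulses/s}}{120\ \mathrm{pulses/s}} \approx 9{,}200.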

High risk, high reward

Because the sample is obliterated by the intense X-ray pulses, it must be replenished in time for the next X-ray pulse, which required PSI crystals to be delivered 9000 times faster at the EuXFEL than at earlier XFELs -- at a jet speed of about 50 meters per second (160 feet per second), like a microfluidic fire hose. This was challenging, as it requires large amounts of the precious protein in uniform crystals to reach these high jet speeds without blocking the sample delivery system. Large membrane proteins are so difficult to isolate, crystallize and deliver to the beam that it wasn't known if this important class of proteins could be studied at the EuXFEL.

The team developed new methods that allowed the structure of PSI to be determined at room temperature to a remarkable 2.9 angstrom resolution -- a significant milestone. PSI is a large complex consisting of 36 proteins and 381 cofactors, including the 288 chlorophylls (the green pigments that absorb the light); it contains over 150,000 atoms and is over 20 times larger than any protein previously studied at the EuXFEL.

Billions of microcrystals of the PSI membrane protein, derived from cyanobacteria, had to be grown for the new study. Rapid crystal growth from nanocrystal seeds was required to guarantee the essential uniformity of crystal size and shape. PSI is a membrane protein, a class of proteins of high importance that are notoriously tricky to characterize. Their elaborate structures are embedded in the cell membrane's lipid bilayer. Typically, they must be carefully isolated in fully active form from their native environment and transformed into a crystalline state, where the molecules pack into crystals but maintain all their native function.

In the case of PSI, this is achieved by extracting it with very mild detergents that replace the membrane and surround the protein like a pool inner tube, which mimics the native membrane environment and keeps PSI fully functional once it's packed within the crystals. So when researchers shine light on the green pigments (chlorophylls) of PSI's light-harvesting antenna system, the captured energy is used to shoot an electron across the membrane.

To keep PSI fully functional, the crystals are only weakly packed and contain 78% water, which makes them soft like a piece of butter in the sun and makes handling these fragile crystals difficult. "To isolate, characterize and crystallize one gram of PSI, or one billion billion PSI molecules, for the experiments in their fully active form was a huge effort of the students and researchers in my team," says Fromme. "In the future, with even higher repetition rates and novel sample delivery systems, the sample consumption will be dramatically reduced."

The recording and analysis of the diffraction data was another challenge. A unique X-ray detector was developed by the EuXFEL and DESY to handle the demands of structural biology studies at the EuXFEL: the adaptive-gain integrating pixel detector, or AGIPD. Each of AGIPD's 1 million pixels is less than a hundredth of an inch across and contains 352 analog memory cells, which enable the AGIPD to collect data at megahertz rates over a large dynamic range. However, collecting accurate crystallographic data from microcrystals of large membrane proteins required a compromise between spatial resolution and sampling of the data.

"Pushing for higher resolution data collection with the current detector size could preclude useful processing of the crystallographic data because the diffraction spots are insufficiently resolved by the X-ray detector pixels" warns Zatsepin, "yet in terms of data rates and dynamic range, what the AGIPD is capable of is incredible."

The novel data reduction and crystallographic analysis software, designed specifically to deal with the challenges unique to the massive datasets in XFEL crystallography and developed under the lead of collaborators at CFEL, DESY, and ASU, has come a long way since the first high-resolution XFEL experiment in 2011.

"Our software and DESY's high-performance computing capabilities are really being put to the test with the unprecedented data volumes generated at the EuXFEL. It is always exciting to push the limits of state-of-the-art technology," adds Zatsepin.

Membrane proteins: floppy, yet formidable

Membrane proteins like PSI -- named because they are embedded into cell membranes -- are vital to all life processes including respiration, nerve function, nutrition uptake, and cell-cell signaling. As they are at the surface of each cell they are also the most important pharmaceutical drug targets. More than 60% of all current drugs are targeted to membrane proteins. The design of more effective drugs with fewer side effects is therefore contingent on understanding how particular drugs bind with their target proteins and their highly detailed structural conformations and dynamic activities.

Despite their enormous importance in biology, membrane protein structures make up less than 1% of all protein structures solved to date because they are notoriously tricky to isolate, characterize and crystallize. This is why major advances in crystallographic methods, such as the advent of membrane protein megahertz serial femtosecond crystallography, are undoubtedly going to have a significant impact on the scientific community.

It takes a village

These recent achievements would not be possible without the tireless effort from a dedicated team of nearly 80 researchers from 15 institutions, including ASU, the European XFEL, DESY, the Center for Ultrafast X-ray Science, Hauptman-Woodward Institute, SUNY Buffalo, SLAC, University of Hamburg, University of Goettingen, Hungarian Academy of Sciences, University of Tennessee, Lawrence Livermore National Laboratory, University of Southampton, Hamburg University of Technology, University of Wisconsin. The research group included US collaborators in the NSF BioXFEL Science and Technology Center and a group of international collaborators, including Adrian P. Mancuso and Romain Letrun, lead scientists at the EuXFEL beamline and Oleksandr Yefanov and Anton Barty from CFEL/DESY who worked closely with the ASU team on the complex data analysis.

 


 

 

 
