NASA's Goddard Space Flight Center in Greenbelt, Maryland, will monitor the landing of NASA astronaut Scott Kelly and Russian cosmonaut Mikhail Kornienko as they return from their Year in Space mission. Goddard's Network Integration Center, pictured above, leads coordination of space-to-ground communications support for the International Space Station and provides contingency support for the Soyuz TMA-18M (44S) spacecraft, ensuring complete communications coverage through NASA's Space Network. The Soyuz 44S spacecraft will undock from the International Space Station at 8:02 p.m. EST this evening and land in Kazakhstan approximately three and a half hours later, at 11:25 p.m. EST. Kelly and Kornienko have spent 340 days aboard the International Space Station, preparing humanity for long-duration missions and exploration into deep space. Credits: NASA/Goddard/Rebecca Roth

Spending nearly a year in space, 249 miles from Earth, could be a lonely prospect, but an office at NASA's Goddard Space Flight Center in Greenbelt, Maryland, made sure astronaut Scott Kelly could reach home for the entire 340-day duration of his mission. Not only could Kelly communicate with mission control in Houston, but Goddard's Network Integration Center connected him with reporters and even family. 

Reliable space-to-ground communication is critical to every mission, whether astronauts are venturing outside the International Space Station to install new equipment and perform important maintenance or handling any other on-orbit need.

Data collected in space, such as video of a spacewalk, travel as radio signals from antennas on spacecraft to much larger antennas on Earth, some with diameters up to 230 feet. From there, the signals travel via underground, or even undersea, cables to data centers around the world, where scientists collect and analyze the data. 

With hundreds of satellites operating in orbit around Earth and elsewhere in the solar system, it's easy to imagine that communication channels might become overwhelmed with data. To prevent this, NASA manages and maintains three large communications networks, and a spacecraft's distance from Earth determines which network it will use. Spacecraft in the far reaches of our solar system, such as New Horizons, just past Pluto, communicate via the Deep Space Network, while spacecraft closer to home, such as the ISS, use the Space Network or the Near Earth Network. Spacecraft on the Space Network communicate through a constellation of geosynchronous Tracking and Data Relay Satellites known as TDRS. The Near Earth Network consists of ground stations located around the Earth. While the Space Network generally serves spacecraft in low-Earth orbit, the Near Earth Network can serve spacecraft in low-Earth orbit, in geosynchronous orbit and even in orbit around the moon.
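As a rough illustration of that distance-based rule, here is a minimal Python sketch. The function name, distance thresholds and network assignments are our own simplification for illustration, not NASA routing logic:

```python
# Hypothetical sketch: picking a NASA communications network by a
# spacecraft's distance from Earth. Thresholds are illustrative only.

def choose_network(distance_km: float) -> str:
    """Return the network a spacecraft would typically use."""
    GEO_KM = 35_786      # geosynchronous altitude
    MOON_KM = 384_400    # mean Earth-moon distance

    if distance_km <= GEO_KM:
        # Low-Earth orbit up through GEO: the TDRS relays of the
        # Space Network or Near Earth Network ground stations apply.
        return "Space Network / Near Earth Network"
    if distance_km <= MOON_KM:
        # Out to lunar distance, Near Earth Network stations can serve.
        return "Near Earth Network"
    # Deep-space missions such as New Horizons use the Deep Space Network.
    return "Deep Space Network"

print(choose_network(400))        # ISS-like altitude
print(choose_network(384_400))    # lunar orbit
print(choose_network(5e9))        # Pluto-ish distance
```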

NASA's Space Communications and Navigation program office at NASA Headquarters in Washington oversees these networks. Engineers and technicians at Goddard are primarily responsible for the management and operation of the Space Network and the Near Earth Network, while the Deep Space Network is managed at NASA's Jet Propulsion Laboratory in Pasadena, California.

Goddard's Network Integration Center (NIC) is the primary operations center for coordinating communications for missions using the Near Earth Network and Space Network. It supports robotic satellite missions as well as all human spaceflight missions. Its service typically begins with preflight testing of a spacecraft's communications systems and culminates with the launch and initial in-orbit activities of the spacecraft. 

Human spaceflight missions are the NIC's specialty. The center has been operational in one form or another since Project Mercury, NASA's first human spaceflight program. Maintaining communications with a human-occupied spacecraft is essential for mission success, whether it is in low-Earth orbit or beyond. Today the NIC is involved in all human space missions and regularly supports the ISS and the visiting cargo and crew transport vehicles that service the station. The NIC will provide similar communication and navigation services for the new commercial crew spacecraft being built by Boeing and SpaceX. 

Communication and navigation for most spacecraft in low-Earth orbit is relatively straightforward, said Human Spaceflight Network Director Mark Severance, who manages communications services from all the networks during human spaceflight missions. Most low-Earth-orbit spacecraft establish and maintain communications with one or two NASA networks. Future exploration missions will be more complicated.

"Typically when you fly a mission beyond Earth orbit, you launch and go around Earth a couple times, and you communicate through the Near Earth Network and the Space Network," Severance said. "Then you do a big rocket firing, you depart from Earth orbit and you're not going to return. You're then on the Deep Space Network forever. However, the return trips of human missions will require not only network handovers as the spacecraft leaves Earth, but return handovers between networks as well."

Because of this, future exploration missions will use all three of NASA's space communications networks at various times during the mission. Not only must the NIC team ensure that all networks are functioning correctly, but also that the handovers between networks are orchestrated to maintain communications between the spacecraft and mission control as the spacecraft leaves Earth or approaches on its return journey. These plans can change rapidly due to in-flight complications, leaving the team to coordinate a new handover plan between the networks.
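A hypothetical Python sketch of that handover sequence, including a simple replan when a complication takes one network out of play. The phase names and the replan helper are our illustration, not NIC planning software:

```python
# Illustrative handover plan for a round-trip exploration mission,
# following the outbound/return pattern Severance describes.
nominal_plan = [
    ("launch / early Earth orbits", "Near Earth Network"),
    ("parking orbits around Earth", "Space Network"),
    ("departure burn and cruise", "Deep Space Network"),
    ("return approach to Earth", "Space Network"),
    ("reentry and recovery", "Near Earth Network"),
]

def replan(plan, unavailable, backup):
    """Swap in a backup network so coverage is never dropped."""
    return [(phase, backup if network == unavailable else network)
            for phase, network in plan]

for phase, network in replan(nominal_plan, "Space Network", "Near Earth Network"):
    print(f"{phase:30s} -> {network}")
```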

A preview of this type of mission capability came during Orion Exploration Flight Test-1 (EFT-1) in December 2014. The flight orbited Earth twice to test NASA's new Orion spacecraft, designed to carry astronauts to destinations in deep space, including an asteroid and Mars. EFT-1 flew the Orion capsule about 15 times farther from Earth than the International Space Station, to roughly 3,600 miles above the planet's surface. Data collected during the flight will help finalize Orion's design and show how the capsule performs during deep-space journeys and on the return from them. This included testing Orion's communications capabilities with the Space Network, overseen by Severance's team in the NIC.

The NIC Human Space Flight team at Goddard is already planning communications for Exploration Mission-1, the first flight of the agency's new Space Launch System rocket and Orion spacecraft, which will demonstrate the integrated system's performance before the first crewed flight. Severance said this mission will be the biggest communications challenge of the next several years.

As NASA soars into space beyond Earth orbit once more, a legacy of space communications that began at Goddard more than 50 years ago continues.

VTT Technical Research Centre of Finland and Aalto University, together with a group of contributing local companies, are starting a new Tekes-funded project on optical switching and transmission technologies to improve the scalability and energy efficiency of data centres and 5G networks, where data-transfer volumes are growing exponentially.

The way we use and share information and entertainment content is changing, from local media hardware to distributed content with online and mobile access. In entertainment, DVDs and CDs have already been replaced by streaming and on-demand movie services. Data storage and bookkeeping are moving into the cloud with online mobile access, and the Internet of Things will soon connect everyday devices to local and global networks.

Even before this transition began, the volume of data transfer was increasing exponentially, with data-centre capacity doubling every 18 months. In 2014, data centres in the EU alone consumed about 120 TWh of energy, roughly the annual output of fourteen 1 GW nuclear reactors.
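As a quick back-of-the-envelope check of that comparison (our arithmetic, not from the project announcement), a 1 GW reactor running continuously for a year produces 8.76 TWh, so

\[
14 \times 1\,\mathrm{GW} \times 8760\,\mathrm{h/yr} \approx 123\,\mathrm{TWh/yr} \approx 120\,\mathrm{TWh}.
\]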

With current data-centre networking technologies, meeting this exponential increase in data volume would multiply costs enormously.

The new Tekes-funded project, Optical Information Processing for Energy-Efficient Data Centres (OPEC), focuses on developing novel optical components and technologies on VTT's proprietary silicon photonics platform, along with new silicon-wafer production and precision assembly concepts. The work is carried out in close collaboration with Nokia, Rockley Photonics and other technology companies, aiming to meet the industrial demands of data centres and 5G networks.

To address future challenges, VTT and Aalto University are collaborating on active photonic components based on graphene and other layered 2D materials, aiming at performance beyond the theoretical limits of traditional materials. The project also explores the feasibility of integrated photonics for analog signal transfer and manipulation, such as radio-over-fiber links and microwave beam steering in mobile link stations.

The project is supported financially and technologically by Nokia, Rockley Photonics, Okmetic, nLight, Ginolis and Picosun. It is part of Tekes' 5thGear programme, which launched several new projects in early 2016 in connection with the Business from Digitalization call.

Ashfaq Adnan, UTA assistant professor in the Mechanical and Aerospace Engineering Department, received an Office of Naval Research grant to build a computational model that measures the impact of blast shockwaves on neurons in the brain.

A University of Texas at Arlington engineering researcher supported by the Office of Naval Research is developing a computational model to measure how and when battlefield blasts can cause devastating damage to neurons in the brain.

Ashfaq Adnan, an assistant professor of Mechanical and Aerospace Engineering, will model the blasts computationally, then forward that information to partners at the Naval Research Laboratory in Washington, D.C., who will compare Adnan's data with existing data from soldiers. The work is supported by a three-year, $386,586 grant through the ONR's Warfighter Performance Department and UTA.

"Since neuron damage can be directly connected to brain damage, we're finding out that even mild exposure to blasts can eventually cause serious brain injury to these veterans. Sometimes these soldiers don't even feel anything at all initially," Adnan said. "In developing the computer model, we're directly quantifying what happens at what magnitude and force, and deciding at which measurements neuron damage occurs."

Existing scans and medical technology cannot detect how these blasts affect a person's neurons, the brain cells responsible for processing and transmitting information by electrochemical signaling. Adnan's research will focus on studying the structural damage in neurons and the surrounding area in the brain. He will then determine the point at which mechanical forces injure the neurons.

"We do know there is a mechanical force at play in these shockwaves, no matter how small they are," Adnan said. "We are studying the tiny, nano-sized part of the brain. We're studying the mechanical and physiological behaviors of a single axon or neuron subjected to blast-like loading conditions and how that may induce cavitational damage."

Adnan's work is related to another Naval Research project led by Michael Cho, professor and chair of UTA's Bioengineering Department. Cho's research is supported by a $1.24 million Warfighter Performance Department grant and examines how shockwaves on the battlefield cause brain tissue injuries and compromise the blood-brain barrier. Duane Dimos, UTA vice president for research, said both projects will provide significant advancements in the way physicians understand and diagnose brain injuries. The work is representative of UTA's increasing research expertise focused on advancing health and the human condition under the University's Strategic Plan 2020: Bold Solutions | Global Impact.

"Dr. Adnan's work will certainly provide new, important insight into how the brain is affected in combat scenarios, and this research will provide knowledge that will benefit anyone suffering from brain injury, Dimos said. "It is heartening to see how UTA faculty are working to benefit our military servicemen and women, and their work will have lasting value for society overall."

Several other UTA research efforts are focused on improving the lives of veterans, among them:

  • Manfred Huber, associate professor of computer science and engineering, is principal investigator on a project with Robotic Research LLC, which is leading a driverless vehicle project for veterans on military bases.
  • Muthu Wijesundara, UTA Research Institute principal scientist, is developing a smart seat cushion that would provide relief for wheelchair-bound people whose impairments were caused by spinal cord injuries or other neurological complications. The work is supported by a Congressionally Directed Medical Research Program Department of Defense grant.
  • Bioengineering professor Hanli Liu and associate professor of social work Alexa Smith-Osborne are studying the use of functional near-infrared spectroscopy to map brain activity responses during cognitive activities related to digit learning and memory retrieval among veterans.

Adnan has attracted more than $1.4 million in research support since he joined UTA in 2010 following postdoctoral work at Northwestern University in Illinois. He earned his doctoral degree in aerospace engineering from Purdue University in Indiana.

In 2015, he won a highly competitive $120,000 Early-concept Grant for Exploratory Research, or EAGER, award from the National Science Foundation to advance his work to modify molecular structures and blend ceramics, creating a new material that would be less brittle but retain the strength of the original ceramic.

He also has received summer faculty fellowships with the Air Force Office of Scientific Research and the Office of Naval Research.


In MIDI-STEM (right), developed at Berkeley Lab, an electron beam travels through a ringed “phase plate,” producing a high-resolution image (bottom right) that provides details about a sample containing a heavy element (gold) and light element (carbon). Details about the carbon are missing in an image (bottom left) of the sample using a conventional electron imaging technique (ADF-STEM). (Colin Ophus/Berkeley Lab, Nature Communications: 10.1038/ncomms10719)

Electrons can extend our view of microscopic objects well beyond what’s possible with visible light—all the way to the atomic scale. A popular method in electron microscopy for looking at tough, resilient materials in atomic detail is called STEM, or scanning transmission electron microscopy, but the highly focused beam of electrons used in STEM can also easily destroy delicate samples.

This is why using electrons to image biological or other organic compounds, such as chemical mixes that include lithium—a light metal that is a popular element in next-generation battery research—requires a very low electron dose.

Scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a new imaging technique, tested on samples of nanoscale gold and carbon, that greatly improves images of light elements using fewer electrons.

The newly demonstrated technique, dubbed MIDI-STEM, for matched illumination and detector interferometry STEM, combines STEM with an optical device called a phase plate that modifies the alternating peak-to-trough, wave-like properties (called the phase) of the electron beam.

This phase plate modifies the electron beam in a way that allows subtle changes in a material to be measured, even revealing materials that would be invisible in traditional STEM imaging.

Another electron-based method, which researchers use to determine the detailed structure of delicate, frozen biological samples, is called cryo-electron microscopy, or cryo-EM. While single-particle cryo-EM is a powerful tool—it was named as science journal Nature’s 2015 Method of the Year—it typically requires taking an average over many identical samples to be effective. Cryo-EM is generally not useful for studying samples with a mixture of heavy elements (for example, most types of metals) and light elements like oxygen and carbon.

“The MIDI-STEM method provides hope for seeing structures with a mixture of heavy and light elements, even when they are bunched closely together,” said Colin Ophus, a project scientist at Berkeley Lab’s Molecular Foundry and lead author of a study, published Feb. 29 in Nature Communications, that details this method. 

If you take a heavy-element nanoparticle and add molecules to give it a specific function, conventional techniques don’t provide an easy, clear way to see the areas where the nanoparticle and added molecules meet.

“How are they aligned? How are they oriented?” Ophus asked. “There are so many questions about these systems, and because there wasn’t a way to see them, we couldn’t directly answer them.”

While traditional STEM is effective for “hard” samples that can stand up to intense electron beams, and cryo-EM can image biological samples, “We can do both at once” with the MIDI-STEM technique, said Peter Ercius, a Berkeley Lab staff scientist at the Molecular Foundry and co-author of the study.

The phase plate in the MIDI-STEM technique allows a direct measurement of the phase of electrons that are weakly scattered as they interact with light elements in the sample. These measurements are then used to construct so-called phase-contrast images of the elements. Without this phase information, high-resolution images of these elements would not be possible.
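To see why that phase information matters, here is a toy Python sketch of generic Zernike-style phase contrast (our simplification, not the actual MIDI-STEM reconstruction): a weak phase object is invisible in plain intensity, but interfering it with a phase-shifted reference wave makes it show up at first order.

```python
import numpy as np

# Toy 1-D model: a weak "light element" object shifts the electron
# wave's phase slightly but barely changes its amplitude.
x = np.linspace(-3, 3, 200)
phi = 0.05 * np.exp(-x**2)      # weak phase shift from the object
exit_wave = np.exp(1j * phi)    # unit-amplitude wave after the sample

# Conventional intensity: |e^{i*phi}|^2 == 1 everywhere, so the
# object produces no contrast at all.
plain_image = np.abs(exit_wave) ** 2

# Zernike-style trick: add a reference wave shifted by pi/2. The
# intensity becomes ~2 + 2*phi to first order, so phase turns into
# visible contrast.
reference = np.exp(1j * np.pi / 2)
contrast_image = np.abs(exit_wave + reference) ** 2

print(np.ptp(plain_image))     # ~0: invisible without phase information
print(np.ptp(contrast_image))  # ~0.1: the phase object now shows up
```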

In this study, the researchers combined phase plate technology with one of the world’s highest-resolution STEMs, at Berkeley Lab’s Molecular Foundry, and a high-speed electron detector.

An animated representation shows the Berkeley Lab-developed MIDI-STEM technique (at right) and conventional STEM (at left), which does not use a ringed phase plate. In MIDI-STEM, an interference pattern introduced by the phase plate (top right) interacts with the electron beam before it travels through a sample (the blue wave in the center). As the phase of the sample (the distance between the peaks and valleys of the blue wave) changes, the electrons passing through the sample are affected and can be measured as a pattern (bottom right). (Colin Ophus/Berkeley Lab)

They produced images of samples of crystalline gold nanoparticles, which measured several nanometers across, and the superthin film of amorphous carbon that the particles sat on. They also performed supercomputer simulations that validated what they saw in the experiment.

The phase plate technology was developed as part of a Berkeley Lab Laboratory Directed Research and Development grant in collaboration with Ben McMorran at the University of Oregon.

The MIDI-STEM technique could prove particularly useful for directly viewing nanoscale objects with a mixture of heavy and light materials, such as some battery and energy-harvesting materials, that are otherwise difficult to view together at atomic resolution.

It also might be useful in revealing new details about important two-dimensional proteins, called S-layer proteins, that could serve as foundations for engineered nanostructures but are challenging to study in atomic detail using other techniques.

In the future, a faster, more sensitive electron detector could allow researchers to study even more delicate samples at improved resolution by exposing them to fewer electrons per image.

“If you can lower the electron dose you can tilt beam-sensitive samples into many orientations and reconstruct the sample in 3-D, like a medical CT scan. There are also data issues that need to be addressed,” Ercius said, as faster detectors will generate huge amounts of data. Another goal is to make the technique more “plug-and-play,” so it is broadly accessible to other scientists.

Berkeley Lab’s Molecular Foundry is a DOE Office of Science User Facility. Researchers from the University of Oregon, Gatan Inc. and Ulm University in Germany also participated in the study.

A new NASA visualization shows the 2015 El Niño unfolding in the Pacific Ocean, as sea surface temperatures create different patterns than those seen in the 1997-1998 El Niño. Supercomputer models are just one tool NASA scientists are using to study this large El Niño event and compare it to events in the past.

 "The start of an El Niño is important," said Robin Kovach, a research scientist with the Global Modeling and Assimilation Office (GMAO) at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The visualization shows how the 1997 event started from colder-than-average sea surface temperatures – but the 2015 event started with warmer-than-average temperatures not only in the Pacific but also in in the Atlantic and Indian Oceans.

"The '97 El Niño was much stronger in the Eastern Pacific, with much warmer water up to the coast of South America," Kovach said. In 2015, the warmest waters are instead in the Central Pacific and extend west of the International Date Line.

The water temperature variations typical of El Niño are not only at the surface of the equatorial Pacific, but below the surface as well. And these variations were also different in 2015, compared to 1997. At the height of the El Niño in November, colder-than-average temperatures in the Western Pacific and warmer-than-average temperatures in the Eastern Pacific were stronger and extended deeper in 1997 than in 2015.
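For readers curious how such warmer- and colder-than-average values are defined, here is a minimal Python sketch of a standard anomaly calculation (with invented numbers, not NASA data): subtract each calendar month's long-term average from the observed temperature.

```python
import numpy as np

# Minimal sketch of a sea surface temperature (SST) anomaly
# calculation. All values here are invented for illustration.
rng = np.random.default_rng(42)
years, months = 30, 12
sst = 27.0 + rng.normal(0.0, 0.5, size=(years, months))  # region-mean SST, C
sst[-1, 9:] += 2.5   # pretend the last autumn ran unusually warm

climatology = sst[:-1].mean(axis=0)   # long-term mean for each month
anomaly = sst[-1] - climatology       # warmer (+) or colder (-) than average

# Indices like Nino 3.4 smooth the anomaly with a 3-month running mean;
# sustained values above about +0.5 C indicate El Nino conditions.
running = np.convolve(anomaly, np.ones(3) / 3, mode="valid")
print(np.round(running, 2))
```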

Goddard’s supercomputer models, with input from ocean buoys, atmospheric models, satellite data and other sources, can also simulate what ocean water temperatures could do in the coming months. The GMAO seasonal forecast, which takes 18 hours to complete and creates more than 9 terabytes of data, shows that this 2015 El Niño could be different until the end.

"In the past, very strong El Niño events typically transition to neutral conditions and then a La Niña event," said Kovach. February supercomputer model runs forecast a return to normal sea surface temperatures by June. The latest Feb 5, 2016 forecast does not yet predict below normal sea surface temperatures that would result in a large La Niña. As of Feb. 14, 2016, the latest ocean supercomputer model shows colder-than-average water temperatures off the South American coast from Ecuador to Panama. "This current El Niño has been different so it will be interesting to see what happens in the next forecast and the coming months."

For previous features on NASA studying the 2015 El Niño:

NASA Studying 2015 El Nino Event As Never Before

NASA Examines Global Impacts of the 2015 El Nino

How NASA Sees El Nino Effects from Space

For NASA's El Nino/La Nina Watch page, visit:

Latest El Nino/La Nina Watch Data
