Astronomers apply special data analysis techniques to Webb’s first images, spectra of Mars

The James Webb Space Telescope captured its first images and spectra of Mars on 5 September 2022. The telescope, an international collaboration between NASA, ESA, and the Canadian Space Agency, uses its infrared sensitivity to provide a unique perspective on our neighboring planet, complementing data being collected by orbiters, rovers, and other telescopes.

First Webb observations of Mars

Webb’s unique observation post nearly 1.5 million kilometers away at the Sun-Earth Lagrange point 2 (L2) provides a view of Mars’ observable disk (the portion of the sunlit side that is facing the telescope). As a result, Webb can capture images and spectra with the spectral resolution needed to study short-term phenomena like dust storms, weather patterns, seasonal changes, and, in a single observation, processes that occur at different times (daytime, sunset, and nighttime) of a Martian day.

Because it is so close, the Red Planet is one of the brightest objects in the night sky in terms of both visible light (which human eyes can see) and the infrared light that Webb is designed to detect. This poses special challenges to the observatory, which was built to detect the extremely faint light of the most distant galaxies in the universe. Webb’s instruments are so sensitive that without special observing techniques, the bright infrared light from Mars is blinding, causing a phenomenon known as “detector saturation.” Astronomers adjusted for Mars’ extreme brightness by using very short exposures, measuring only some of the light that hit the detectors, and applying special data analysis techniques.
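To make the saturation workaround concrete, the sketch below shows one simple way such data could be combined: average a stack of very short exposures while masking out any pixel reads that exceed the detector's full-well limit. This is a minimal illustration in Python, not Webb's actual pipeline; the saturation level, frame sizes, and data values are invented for the example.

```python
import numpy as np

# Minimal illustration (not Webb's actual pipeline): combine a stack of very
# short exposures, masking pixel reads that exceed an assumed saturation level
# so only valid (unsaturated) measurements contribute to the final image.
SATURATION_LEVEL = 65000  # assumed full-well count, for illustration only

def combine_short_exposures(exposures):
    """Average a (n_exposures, height, width) stack, ignoring saturated reads."""
    valid = exposures < SATURATION_LEVEL             # unsaturated reads
    counts = valid.sum(axis=0)                       # valid reads per pixel
    total = np.where(valid, exposures, 0.0).sum(axis=0)
    # Pixels saturated in every exposure become NaN so later analysis can flag them.
    return np.where(counts > 0, total / np.maximum(counts, 1), np.nan)

# Example with synthetic frames of a bright target
frames = np.random.default_rng(0).uniform(0, 70000, size=(10, 64, 64))
image = combine_short_exposures(frames)
```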

Webb’s first images of Mars, captured by the Near-Infrared Camera (NIRCam), show a region of the planet’s eastern hemisphere at two different wavelengths, or colors of infrared light. This image shows a surface reference map from NASA and the Mars Orbiter Laser Altimeter (MOLA) on the left, with the two Webb NIRCam instrument fields of view overlaid. The near-infrared images from Webb are shown on the right.

Webb’s first near-infrared spectrum of Mars, captured by the Near-Infrared Spectrograph (NIRSpec), demonstrates Webb’s power to study the Red Planet with spectroscopy.

First Webb infrared spectrum of Mars

Whereas the Mars images show differences in brightness integrated over a large number of wavelengths from place to place across the planet at a particular day and time, the spectrum shows the subtle variations in brightness between hundreds of different wavelengths representative of the planet as a whole. Astronomers will analyze the features of the spectrum to gather additional information about the surface and atmosphere of the planet.
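The distinction can be illustrated with a toy spectral data cube: summing over the wavelength axis yields an image, while averaging over the spatial axes yields a disk-averaged spectrum. The cube and wavelength range below are invented purely for illustration.

```python
import numpy as np

# Toy data cube: flux as a function of (wavelength, y, x); values invented.
rng = np.random.default_rng(1)
cube = rng.uniform(size=(200, 32, 32))       # 200 wavelength channels
wavelengths = np.linspace(1.0, 5.0, 200)     # assumed microns, for illustration

# An image integrates brightness over many wavelengths at each position...
image = cube.sum(axis=0)                     # shape (32, 32)

# ...while a disk-averaged spectrum averages over position at each wavelength.
spectrum = cube.mean(axis=(1, 2))            # shape (200,)
```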

In the future, Webb will use this imaging and spectroscopic data to explore regional differences across the planet and to search for trace gases in the atmosphere, including methane and hydrogen chloride.

These observations of Mars were conducted as part of Webb’s Cycle 1 Guaranteed Time Observation (GTO) Solar System program led by Heidi Hammel of the Association of Universities for Research in Astronomy (AURA).

ESA operates two Mars orbiters, Mars Express and the ExoMars Trace Gas Orbiter, that have brought a treasury of insight into the Red Planet’s atmosphere and surface. Furthermore, ESA collaborates with the Japanese Aerospace Exploration Agency (JAXA) on the Martian Moons eXploration (MMX) mission, soon to launch for Mars’ moon Phobos.

NIRSpec was built for the European Space Agency (ESA) by a consortium of European companies led by Airbus Defence and Space (ADS), with NASA’s Goddard Space Flight Center providing its detector and micro-shutter subsystems.

Naval Postgraduate School explores use of persistent augmented reality to advance decision cycle

Decision advantage, one of six “Force Design Imperatives” in Chief of Naval Operations Adm. Mike Gilday’s NAVPLAN 2022, places a high priority on the naval forces’ ability to “out-sense, out-decide, and out-fight any adversary by accelerating our decision cycles with secure, survivable, and cyber-resilient networks.”

U.S. Navy Lt. JaMerra Turner and Lt. Joanna Cruz used their NPS thesis to explore the use of persistent augmented reality to enhance visual representations of a carrier strike group’s communications and cyber network operations, with the goal of advancing the decision cycle in this complex operational environment.

Two students at the Naval Postgraduate School (NPS), U.S. Navy Lt. JaMerra Turner and Lt. Joanna Cruz, are exploring the application of emerging technologies in augmented reality and how Sailors can use them to advance the decision cycle. Their NPS master’s thesis, titled “Supporting Mission Planning with a Persistent Augmented Environment,” provides insights into the feasibility and effectiveness of this novel form of data representation and user interaction, and its capability to support faster and improved situational awareness and decision-making in a complex operational technology environment.

Turner, now a graduate of the NPS Cyber Systems and Operations program with a master’s in computer science, was looking for a partner for her thesis. While combing through other students' research, she found Cruz, a student in the computer science program through the Modeling Virtual Environments and Simulations (MOVES) Institute, who wanted to focus her research on augmented reality. 

“Joanna had this brilliant idea of combining her expertise in virtual environments with my proficiency as a professional information officer,” said Turner. “Using my knowledge of communications networks, and her ability to create augmented realities, we could develop something that would drastically improve the decision-making capabilities of the senior decision-makers in the field.”

A ‘persistent augmented environment’ is a concept the students’ thesis advisor, Dr. Amela Sadagic, has been working on for some time, and it consists of two major components. ‘Persistent’ means that a diverse set of data is being collected and visualized in real time. The data sets are updated and corrected as quickly as possible, ensuring the most accurate information is available to operators who need it to support their decision-making. An ‘augmented environment’ refers to computer-generated visual elements superimposed over visual information that comes from the real-world environment. Cruz described it as “Pokémon Go on ships” or “tactical Pokémon Go.”

With their thesis, the two students wanted to use simulated data of communications systems and cyber network operations, reporting, and resource management decision-making to demonstrate their product. Using augmented reality displays allowed the operators to see each other and discern key non-verbal cues typically used in group discussions. At the same time, they could also see the elements of the simulated operational environment and interact with them. 

When a carrier strike group goes on deployment, the ships’ lines of communication are of critical importance. Between several forms of radio frequencies, types of satellites, and the litany of other ways ships need to talk to one another, there is bound to be confusion. One ship might have a certain system down, and the other vessels need to be aware of that failure promptly. 

“Imagine every ship has a system where they could input data automatically in real-time, and every ship in the strike group would know immediately [through an augmented reality interface],” said Cruz. “With our program, the communications officer on every ship could have a headset with that data constantly updating, and all they would have to do to determine which systems are down and which systems are functional would be to look at the ship they are trying to reach.” 
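A minimal sketch of the kind of shared status model Cruz describes might look like the Python below. The ship names, system names, and data structure are hypothetical, invented only to illustrate the idea of real-time publishing and per-ship lookup, not taken from the students' prototype.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a shared status model: each ship publishes timestamped
# system states, and every headset in the strike group renders the latest
# snapshot when its wearer looks at a given ship.

@dataclass
class StatusBoard:
    # state[ship][system] -> (is_up, last_update)
    state: dict = field(default_factory=dict)

    def report(self, ship: str, system: str, is_up: bool) -> None:
        """A ship publishes the current state of one of its systems."""
        self.state.setdefault(ship, {})[system] = (is_up, datetime.now(timezone.utc))

    def overlay_for(self, ship: str) -> dict:
        """What a headset would render when the wearer looks at `ship`."""
        return self.state.get(ship, {})

board = StatusBoard()
board.report("USS Example", "UHF SATCOM", False)   # hypothetical ship/system names
board.report("USS Example", "HF voice", True)
print(board.overlay_for("USS Example"))
```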

Turner and Cruz designed and developed a straightforward 3D visualization prototype of how the augmented reality interface would look and then tested its usability with 27 Naval officers. They focused on users’ performance and comprehension of the prototype, which depicted an augmented reality-enhanced Naval cyber battlespace onboard multiple ships. An additional objective was to acquire a better understanding of how that type of system could be used to assist in effective mission planning at the tactical level. The results were very positive and encouraging, they said.

“[Users] found the interface easy to understand and operate, and the prototype was characterized as a valuable alternative to their current practices,” said Turner.

Although the research focused only on communications, Cruz and Turner suggest the concept could be expanded to cover several systems onboard Navy vessels. From damage control conditions to weapons systems, key personnel on every ship could simply put on a headset and be instantly updated on the rest of the strike group.

“It can be used by different warfare areas, and it can go towards different readiness conditions,” said Cruz. “We can input data regarding air warfare, surface warfare, submarine warfare, ballistic warfare, strike warfare, and the list goes on and on.”

Both students were very satisfied with their final thesis and its potential to support a key initiative in NAVPLAN 2022. The interdisciplinary approach, capitalizing on each student’s unique area of expertise to advance a Navy-relevant topic of study, is an opportunity rarely available outside NPS.

“The opportunity for two people in two different communities to combine efforts on a MOVES-related thesis and to actually come up with a brand-new idea and start implementing it was a very rewarding and satisfying experience,” said Cruz. “The world is changing, technology is improving at an incredible rate, and NPS is doing all it can to stay at the very tip of innovation.”

Mizzou researchers use AI to advance anatomical research from scalpels, scissors to modeling

There was once a time, not so long ago, when scientists like Casey Holliday needed scalpels, scissors, and even their own hands to conduct anatomical research. But now, with recent advances in technology, Holliday and his colleagues at the University of Missouri are using artificial intelligence (AI) to see inside an animal or a person, down to a single muscle fiber, without ever cutting.

Contrast imaging data and machine learning approaches can now model the 3D architecture of jaw musculature.

Holliday, an associate professor of pathology and anatomical sciences, said his lab in the MU School of Medicine is one of only a handful of labs in the world currently using this high-tech approach.

AI can teach computer programs to identify a muscle fiber in an image, such as a CAT scan. Then, researchers can use that data to develop detailed 3D computer models of muscles to better understand how they work together in the body for motor control, Holliday said.
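As a rough illustration of that workflow (not the lab's actual AI pipeline), the Python sketch below labels candidate muscle voxels in a synthetic CT volume using a simple intensity threshold as a stand-in for a trained model, then extracts a 3D surface mesh from the labeled voxels with scikit-image. The volume and threshold are invented for the example.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import marching_cubes

# Synthetic stand-in for CT data; a real pipeline would load scan volumes.
rng = np.random.default_rng(2)
ct_volume = rng.normal(loc=100, scale=30, size=(64, 64, 64))

# Label muscle voxels. Here a global Otsu threshold stands in for the
# trained model that would classify each voxel in a real AI pipeline.
muscle_mask = ct_volume > threshold_otsu(ct_volume)

# Build a triangle mesh of the segmented tissue for 3D modeling.
verts, faces, normals, values = marching_cubes(muscle_mask.astype(float), level=0.5)
print(f"mesh: {len(verts)} vertices, {len(faces)} faces")
```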

Holliday, along with some of his current and former students, did that recently when they began to study the bite force of a crocodile.

“The unique thing about crocodile heads is that they are flat, and most animals that have evolved to bite really hard, like hyenas, lions, T. rexes, and even humans have really tall skulls because all those jaw muscles are oriented vertically,” Holliday said. “They’re designed that way so they put a big vertical bite force into whatever they're eating. But a crocodile’s muscles are oriented more horizontally.”

The 3D models of muscle architecture could help the team determine how muscles are oriented in crocodile heads to help increase their bite force. Helping to lead this effort is one of Holliday’s former students, Kaleb Sellers, who is now a postdoctoral researcher at the University of Chicago.
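The geometric intuition reduces to a toy calculation: only the component of a muscle's force aligned with the bite direction contributes to bite force, so a muscle pulling at an angle θ from vertical contributes F·cos θ vertically. The force value and angles below are arbitrary examples, not measurements from the study.

```python
import numpy as np

# Toy decomposition (not the study's model): only the vertical component of a
# jaw muscle's force contributes to a vertical bite force.
def vertical_bite_component(muscle_force, angle_from_vertical_deg):
    """Vertical component of a muscle force pulling at a given angle."""
    return muscle_force * np.cos(np.radians(angle_from_vertical_deg))

force = 1000.0  # newtons, arbitrary example value
print(vertical_bite_component(force, 10.0))  # near-vertical muscle: ~985 N
print(vertical_bite_component(force, 70.0))  # near-horizontal muscle: ~342 N
```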

“Jaw muscles have long been studied in mammals with the assumption that relatively simple descriptors of muscle anatomy can tell you a great deal about skull function,” Sellers said. “This study shows how complex jaw muscle anatomy is in a reptile group.”

Holliday’s lab first began experimenting with 3D imaging several years ago. Some of their early findings were published in 2019 in a study in Integrative Organismal Biology that presented a 3D model of the skeletal muscles of a European starling.

Transitioning into a digital world

Historically, Holliday said, anatomical research (and much of what he did growing up) involved dissecting animals with a scalpel or scissors, or what he calls an “analog” approach. He was first introduced to the benefits of using digital imaging to study anatomy when he joined the “Sue the T. rex” project in the late 1990s. To date, Sue remains one of the largest and most well-preserved Tyrannosaurus rex specimens ever discovered.

Holliday recalls the moment when the T. rex’s giant skull was transported to Boeing’s Santa Susana Field Laboratory in California to be imaged in one of the aerospace company’s massive CAT scanners normally used to scan jet engines on commercial airplanes.

“At the time, it was the only CAT scanner in the world big enough to fit a T. rex skull, and also had the power needed to push X-rays through rocks,” Holliday said. “Coming out of college I had looked at becoming a radiology technician, but with the Sue project I was learning all about how they CAT scanned this thing, and that really caught my fancy.”

Nowadays, Holliday said many of his current and former students at MU are learning to understand anatomy by using the “cutting edge” imaging and modeling methods that he and his colleagues are creating. One of those students is Emily Lessner, a recent MU alumna who developed her passion for “long-dead animals” by working in Holliday’s lab.

“The digitization process is not only useful to our lab and research,” Lessner said. “It makes our work shareable with other researchers to help hasten scientific advancement, and we can also share them with the public as educational and conservation tools. Specifically, my work looking at the soft tissues and bony correlates in these animals has not only created hundreds of future questions to answer but also revealed many unknowns. In that way, not only did I gain imaging skills to help with my future work, but I now have more than a career’s worth of avenues to explore.”

Holliday said plans are also in the works to take their 3D anatomical models a step further by studying how human hands have evolved from their evolutionary ancestors. The project, which is still in its early stages, recently received a grant from the Leakey Foundation. Joining Holliday on the project will be two of his colleagues at MU, Carol Ward, a Curators Distinguished Professor of pathology and anatomical sciences, and Kevin Middleton, an associate professor of biological sciences.

While about 90% of the research done in Holliday’s lab involves studying things that exist in the modern world, he said the data they collect can also inform the fossil record, offering additional knowledge about how the T. rex moved and functioned.

“With better knowledge of actual muscle anatomy, we can really figure out how the T. rex could really do fine motor controls, and more nuanced behaviors, such as bite force and feeding behavior,” Holliday said.