Danish prof shatters data transmission record with a single optical chip

Their data transmission method uses significantly less power and can help reduce the Internet’s climate footprint.

An international group of researchers from the Technical University of Denmark (DTU) and the Chalmers University of Technology in Gothenburg, Sweden, has achieved dizzying data transmission speeds and is the first in the world to transmit more than 1 petabit per second (Pbit/s) using only a single laser and a single optical chip. 1 petabit corresponds to 1 million gigabits.

In the experiment, the researchers succeeded in transmitting 1.8 Pbit/s, which corresponds to twice the total global Internet traffic, all carried by the light from a single optical source. The light source is a custom-designed optical chip that can use the light from a single infrared laser to create a rainbow spectrum of many colors, i.e. many frequencies. Thus, the single frequency (color) of one laser can be multiplied into hundreds of frequencies (colors) on a single chip.

All the colors are fixed at a specific frequency distance from each other, just like the teeth on a comb, which is why it is called a frequency comb. Each color (or frequency) can then be isolated and used to imprint data. The frequencies can then be reassembled and sent over optical fiber, thus transmitting data, even a huge volume of it, as the researchers have demonstrated.
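The comb structure lends itself to a simple numerical illustration. The sketch below uses invented parameters (the line spacing, channel count, and per-channel rate are assumptions, not figures from the experiment) to show how evenly spaced comb lines act as parallel data channels whose rates add up:

```python
# Toy model of a frequency comb used for data transmission.
# All numbers are illustrative assumptions, not the experiment's values.

center_thz = 193.4    # assumed center frequency (telecom C-band)
spacing_ghz = 100.0   # assumed fixed spacing between comb "teeth"
num_lines = 200       # assumed number of usable comb lines

# Comb line frequencies: evenly spaced, like teeth on a comb.
lines_thz = [center_thz + (n - num_lines // 2) * spacing_ghz * 1e-3
             for n in range(num_lines)]

# Each line is isolated, modulated with data, and recombined; the
# aggregate rate is simply the sum over all channels.
per_channel_gbit_s = 500.0  # assumed data rate imprinted on each line
aggregate_tbit_s = num_lines * per_channel_gbit_s / 1000.0
print(f"{num_lines} channels -> {aggregate_tbit_s:.0f} Tbit/s total")
```

The key property is the linear scaling: every additional comb line is another independent channel, which is why one chip can stand in for hundreds of individual lasers.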

One single laser can replace thousands

The experimental demonstration showed that a single chip could easily carry 1.8 Pbit/s, which—with modern state-of-the-art commercial equipment—would otherwise require more than 1,000 lasers.

Victor Torres Company, associate professor at the Chalmers University of Technology, is head of the research group that has developed and manufactured the chip.

“What is special about this chip is that it produces a frequency comb with ideal characteristics for fiber-optical communications – it has high optical power and covers a broad bandwidth within the spectral region that is interesting for advanced optical communications,” says Victor Torres Company.

Interestingly enough, the chip was not optimized for this particular application.

“In fact, some of the characteristic parameters were achieved by coincidence and not by design,” says Victor Torres Company. “However, with efforts in my team, we are now capable of reverse engineering the process and achieving, with high reproducibility, micro combs for target applications in telecommunications.”

Enormous potential for scaling

In addition, the researchers created a computational model to theoretically examine the fundamental potential for data transmission with a single chip identical to the one used in the experiment. The calculations showed enormous potential for scaling up the solution.

Professor Leif Katsuo Oxenløwe, Head of the Centre of Excellence for Silicon Photonics for Optical Communications (SPOC) at DTU, says: “Our calculations show that—with the single chip made by the Chalmers University of Technology, and a single laser—we will be able to transmit up to 100 Pbit/s. The reason for this is that our solution is scalable—both in terms of creating many frequencies and in terms of splitting the frequency comb into many spatial copies and then optically amplifying them, and using them as parallel sources with which we can transmit data. Although the comb copies must be amplified, we do not lose the qualities of the comb, which we utilize for spectrally efficient data transmission.”
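The scaling claim can be checked with simple arithmetic: if each amplified spatial copy of the comb carried the demonstrated 1.8 Pbit/s, the number of parallel copies needed to reach 100 Pbit/s follows directly. (This is a simplification; the researchers' projection also involves generating more frequencies, not only more copies.)

```python
import math

demonstrated_pbit_s = 1.8  # rate achieved with one comb in the experiment
projected_pbit_s = 100.0   # scaling potential from the researchers' model

# Under the simplifying assumption that each spatial copy of the comb
# carries the full demonstrated rate:
copies_needed = math.ceil(projected_pbit_s / demonstrated_pbit_s)
print(copies_needed)  # 56
```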

Astronomers apply special data analysis techniques to Webb’s first images, spectra of Mars

The James Webb Space Telescope captured its first images and spectra of Mars on 5 September 2022. The telescope, an international collaboration between NASA, ESA, and the Canadian Space Agency, provides a unique perspective with its infrared sensitivity on our neighboring planet, complementing data being collected by orbiters, rovers, and other telescopes.

First Webb observations of Mars

Webb’s unique observation post nearly 1.5 million kilometers away at the Sun-Earth Lagrange point 2 (L2) provides a view of Mars’ observable disk (the portion of the sunlit side that is facing the telescope). As a result, Webb can capture images and spectra with the spectral resolution needed to study short-term phenomena like dust storms, weather patterns, seasonal changes, and, in a single observation, processes that occur at different times (daytime, sunset, and nighttime) of a Martian day.

Because it is so close, the Red Planet is one of the brightest objects in the night sky in terms of both visible light (which human eyes can see) and the infrared light that Webb is designed to detect. This poses special challenges to the observatory, which was built to detect the extremely faint light of the most distant galaxies in the universe. Webb’s instruments are so sensitive that without special observing techniques, the bright infrared light from Mars is blinding, causing a phenomenon known as “detector saturation.” Astronomers adjusted for Mars’ extreme brightness by using very short exposures, measuring only some of the light that hit the detectors, and applying special data analysis techniques.
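A rough sketch of why brightness forces short exposures: a pixel saturates once its accumulated charge reaches the full-well capacity, so the maximum unsaturated exposure time is inversely proportional to the photoelectron rate. The numbers below are invented for illustration and are not Webb's actual detector parameters:

```python
# Illustrative only: made-up detector numbers, not NIRCam's real values.
full_well_e = 100_000        # assumed full-well capacity per pixel (electrons)
faint_rate_e_s = 10.0        # assumed rate from a faint, distant galaxy (e-/s)
bright_rate_e_s = 2_000_000  # assumed rate from a target as bright as Mars

# Longest exposure before a pixel saturates, for each source:
t_max_faint = full_well_e / faint_rate_e_s    # long exposures are fine
t_max_bright = full_well_e / bright_rate_e_s  # only a fraction of a second
print(f"faint: {t_max_faint:.0f} s, bright: {t_max_bright:.3f} s")
```

With these assumed rates the bright target saturates in 0.05 s, which is why the astronomers combined very short exposures with reading out only some of the detected light.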

Webb’s first images of Mars, captured by the Near-Infrared Camera (NIRCam), show a region of the planet’s eastern hemisphere at two different wavelengths, or colors of infrared light. This image shows a surface reference map from NASA and the Mars Orbiter Laser Altimeter (MOLA) on the left, with the two Webb NIRCam instrument fields of view overlaid. The near-infrared images from Webb are shown on the right.

Webb’s first near-infrared spectrum of Mars, captured by the Near-Infrared Spectrograph (NIRSpec), demonstrates Webb’s power to study the Red Planet with spectroscopy.

First Webb infrared spectrum of Mars

Whereas the Mars images show differences in brightness integrated over a large number of wavelengths from place to place across the planet at a particular day and time, the spectrum shows the subtle variations in brightness between hundreds of different wavelengths representative of the planet as a whole. Astronomers will analyze the features of the spectrum to gather additional information about the surface and atmosphere of the planet.

In the future, Webb will use this imaging and spectroscopic data to explore regional differences across the planet and to search for trace species in the atmosphere, including methane and hydrogen chloride.

These observations of Mars were conducted as part of Webb’s Cycle 1 Guaranteed Time Observation (GTO) Solar System program led by Heidi Hammel of the Association of Universities for Research in Astronomy (AURA).

ESA operates two Mars orbiters, Mars Express and the ExoMars Trace Gas Orbiter, that have brought a treasury of insight into the Red Planet’s atmosphere and surface. Furthermore, ESA collaborates with the Japanese Aerospace Exploration Agency (JAXA) on the Martian Moons eXploration (MMX) mission, soon to launch for Mars’ moon Phobos.

NIRSpec was built for the European Space Agency (ESA) by a consortium of European companies led by Airbus Defence and Space (ADS), with NASA’s Goddard Space Flight Center providing its detector and micro-shutter subsystems.

Naval Postgraduate School explores use of persistent augmented reality to advance decision cycle

Decision advantage, one of six “Force Design Imperatives” in Chief of Naval Operations Adm. Mike Gilday’s NAVPLAN 2022, places a high priority on the naval forces’ ability to “out-sense, out-decide, and out-fight any adversary by accelerating our decision cycles with secure, survivable, and cyber-resilient networks.”

Two students at the Naval Postgraduate School (NPS), U.S. Navy Lt. JaMerra Turner and Lt. Joanna Cruz, are exploring the application of emerging technologies in augmented reality and how Sailors can use them to advance the decision cycle. Their master’s thesis at NPS, titled “Supporting Mission Planning with a Persistent Augmented Environment,” provides insights into the feasibility and effectiveness of this novel form of data representation and user interaction, and its capability to support faster and improved situational awareness and decision-making in a complex operational technology environment.

Turner, now a graduate of the NPS Cyber Systems and Operations program with a master’s in computer science, was looking for a partner for her thesis. While combing through other students' research, she found Cruz, a student in the computer science program through the Modeling Virtual Environments and Simulations (MOVES) Institute, who wanted to focus her research on augmented reality. 

“Joanna had this brilliant idea of combining her expertise in virtual environments with my proficiency as a professional information officer,” said Turner. “Using my knowledge of communications networks, and her ability to create augmented realities, we could develop something that would drastically improve the decision-making capabilities of the senior decision-makers in the field.”

A ‘persistent augmented environment’ is a concept the students’ thesis advisor Dr. Amela Sadagic has been working on for some time and consists of two major components. ‘Persistent’ means that a diverse set of data are being collected and visualized in real-time. The data sets are updated and corrected as quickly as possible, ensuring the most accurate information is available to operators who need it to support their decision-making. An ‘augmented environment’ refers to superimposed, computer-generated visual elements over visual information that comes from the real-world environment. Cruz described it as “Pokémon Go on ships” or “tactical Pokémon Go.” 

With their thesis, the two students wanted to use simulated data of communications systems and cyber network operations, reporting, and resource management decision-making to demonstrate their product. Using augmented reality displays allowed the operators to see each other and discern key non-verbal cues typically used in group discussions. At the same time, they could also see the elements of the simulated operational environment and interact with them. 

When a carrier strike group goes on deployment, the ships’ lines of communication are of critical importance. Between several forms of radio frequencies, types of satellites, and the litany of other ways ships need to talk to one another, there is bound to be confusion. One ship might have a certain system down, and the other vessels need to be aware of that failure promptly. 

“Imagine every ship has a system where they could input data automatically in real-time, and every ship in the strike group would know immediately [through an augmented reality interface],” said Cruz. “With our program, the communications officer on every ship could have a headset with that data constantly updating, and all they would have to do to determine which systems are down and which systems are functional would be to look at the ship they are trying to reach.” 
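As a hypothetical sketch of the data model Cruz describes (the ship names, system names, and statuses below are invented, not from the thesis), a shared status board could map each ship to its system states, with every report immediately visible to the whole strike group:

```python
# Hypothetical sketch; not the students' actual implementation.
from dataclasses import dataclass, field

@dataclass
class StrikeGroupStatus:
    # ship name -> {system name -> "up" | "down"}
    systems: dict = field(default_factory=dict)

    def report(self, ship: str, system: str, status: str) -> None:
        """A ship reports one system's status, overwriting any prior value."""
        self.systems.setdefault(ship, {})[system] = status

    def down_systems(self, ship: str) -> list:
        """What an AR headset would highlight when looking at `ship`."""
        return [s for s, st in self.systems.get(ship, {}).items()
                if st == "down"]

board = StrikeGroupStatus()
board.report("CVN-1", "UHF SATCOM", "up")
board.report("DDG-2", "HF radio", "down")
print(board.down_systems("DDG-2"))  # -> ['HF radio']
```

In the envisioned system, each `report` call would be driven by automated real-time inputs from the ship, and the AR overlay would render `down_systems` for whichever vessel the operator is looking at.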

Turner and Cruz designed and developed a straightforward 3D visualization prototype of how the augmented reality interface would look and then tested its usability with 27 Naval officers. They focused on users’ performance and comprehension of the prototype, which depicted an augmented reality-enhanced Naval cyber battlespace onboard multiple ships. An additional objective was to acquire a better understanding of how that type of system could assist in effective mission planning at the tactical level. The results, they said, were very positive and encouraging.

“[Users] found the interface easy to understand and operate, and the prototype was characterized as a valuable alternative to their current practices,” said Turner.

Although the research just focused on communications, Cruz and Turner suggest the concept could be expanded to cover several systems onboard Navy vessels. From damage control conditions to weapons systems, key personnel on every ship could simply put on a headset and be instantly updated on the rest of the strike group. 

“It can be used by different warfare areas, and it can go towards different readiness conditions,” said Cruz. “We can input data regarding air warfare, surface warfare, submarine warfare, ballistic warfare, strike warfare, and the list goes on and on.”

Both students were very satisfied with their final thesis and its potential to support a key initiative in NAVPLAN 2022. The interdisciplinary approach, capitalizing on each student’s unique area of expertise to advance a Navy-relevant topic of study, is a rare opportunity for most students, but not for those at NPS.

“The opportunity for two people in two different communities to combine efforts on a MOVES-related thesis and to actually come up with a brand-new idea and start implementing it was a very rewarding and satisfying experience,” said Cruz. “The world is changing, technology is improving at an incredible rate, and NPS is doing all it can to stay at the very tip of innovation.”