Harvard researcher develops program to read any genome sequence, decipher its genetic code

Yekaterina “Kate” Shulgina was a first-year student in the Graduate School of Arts and Sciences, looking for a short computational biology project so she could fulfill a requirement of her program in systems biology. She wondered how the genetic code, once thought to be universal, could evolve and change.

That was in 2016, and today Shulgina has come out the other end of that short-term project with a way to decipher this genetic mystery. She describes it in a new paper in the journal eLife with Harvard biologist Sean Eddy.

The report details a new computer program that can read the genome sequence of an organism and then determine its genetic code. The program, called Codetta, has the potential to help scientists expand their understanding of how the genetic code evolves and correctly interpret the genetic code of newly sequenced organisms.

“This in and of itself is a very fundamental biology question,” said Shulgina, who does her graduate research in Eddy’s lab.

The genetic code is the set of rules that tells cells how to translate three-letter combinations of nucleotides into the amino acids that form proteins, often referred to as the building blocks of life. Almost every organism, from E. coli to humans, uses the same genetic code. That is why the code was once thought to be set in stone. But scientists have discovered a handful of outliers: organisms that use alternative genetic codes, in which the set of instructions is different.
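As a simplified illustration, the genetic code can be treated as a lookup table from three-nucleotide codons to amino acids, and an alternative code as the same table with an entry changed. The sketch below uses only a few of the 64 codons, and the reassignment shown is invented for illustration, not one of the five codes reported in the paper:

```python
# A tiny slice of the standard genetic code: codon -> amino acid.
# Only 4 of the 64 codons are shown.
STANDARD_CODE = {
    "ATG": "Met",   # also the start codon
    "CGT": "Arg",
    "AAA": "Lys",
    "TAA": "Stop",
}

# A hypothetical alternative code: one arginine codon reassigned
# to a different amino acid (illustrative only).
ALTERNATIVE_CODE = dict(STANDARD_CODE, CGT="Trp")

def translate(dna, code):
    """Translate a DNA sequence codon by codon until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = code[dna[i:i + 3]]
        if amino_acid == "Stop":
            break
        protein.append(amino_acid)
    return "-".join(protein)

print(translate("ATGCGTAAATAA", STANDARD_CODE))     # Met-Arg-Lys
print(translate("ATGCGTAAATAA", ALTERNATIVE_CODE))  # Met-Trp-Lys
```

The same DNA yields a different protein under the two codes, which is why misidentifying an organism's code leads to mistranslated protein sequences.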

This is where Codetta can shine. The program can help identify more organisms that use these alternative genetic codes, shedding new light on how genetic codes can change in the first place.

“Understanding how this happened would help us reconcile why we originally thought this was impossible… and how these really fundamental processes actually work,” Shulgina said.

Already, Codetta has analyzed the genome sequences of more than 250,000 bacteria and other single-celled organisms called archaea, and has identified five alternative genetic codes that had never been seen before. In all five cases, codons for the amino acid arginine were reassigned to a different amino acid. It is believed to be the first time scientists have seen this swap in bacteria, and it could hint at the evolutionary forces that alter the genetic code.

The researchers say the study marks the largest screening yet for alternative genetic codes: Codetta analyzed essentially every available bacterial and archaeal genome. The program's name is a cross between codon, the sequence of three nucleotides that forms a unit of the genetic code, and the Rosetta Stone, a slab of rock inscribed with the same text in three languages.

The work marks a capstone moment for Shulgina, who spent the past five years developing the statistical theory behind Codetta, writing the program, testing it, and analyzing genomes. Codetta works by reading an organism's genome and comparing it against a database of known proteins to infer the most likely genetic code. It differs from similar methods in the scale at which it can analyze genomes.
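In spirit, though not in statistical detail (Codetta's actual model is described in the eLife paper), inferring a code from known-protein evidence can be sketched as tallying, for each codon, which amino acid homologous proteins most often place at the aligned positions. The `observations` data below are invented for illustration:

```python
from collections import Counter

# Hypothetical alignment evidence: for each codon in a genome, the
# amino acids that matches to known proteins place at aligned positions.
observations = {
    "CGT": ["Arg", "Arg", "Trp", "Trp", "Trp"],  # mostly reads as Trp here
    "AAA": ["Lys", "Lys", "Lys"],
}

def infer_code(observations):
    """Assign each codon the amino acid most often aligned to it."""
    return {
        codon: Counter(amino_acids).most_common(1)[0][0]
        for codon, amino_acids in observations.items()
    }

print(infer_code(observations))  # {'CGT': 'Trp', 'AAA': 'Lys'}
```

A majority vote like this would flag CGT as reassigned away from arginine in this hypothetical organism; the real method weighs such evidence probabilistically rather than by simple counting.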

Shulgina joined Eddy’s lab, which specializes in comparing genomes, in 2016 after coming to him for advice on the algorithm she designed to interpret genetic codes.

Until now, no one had done such a broad survey for alternative genetic codes.

“It was great to see new codes because for all we knew, Kate would do all this work and there wouldn’t turn out to be any new ones to find,” said Eddy, who is also a Howard Hughes Medical Institute investigator. He also noted the potential of the system to be used to ensure the accuracy of the many databases that house protein sequences.

“Many protein sequences in the databases these days are only conceptual translations of genomic DNA sequences,” Eddy said. “People mine these protein sequences for all sorts of useful stuff, like new enzymes or new gene-editing tools and whatnot. You’d like for those protein sequences to be accurate, but if the organism is using a nonstandard code, they’ll be erroneously translated.”

The researchers say the next step of the work is to use Codetta to search for alternative codes in viruses, eukaryotes, and organellar genomes like mitochondria and chloroplasts.

“There’s still a lot of diversity of life where we haven’t done this systematic screening yet,” Shulgina said.

Texas A&M simulations show how prolonged radiation exposure damages nuclear reactors

Supercomputer simulation reveals multiple factors that contribute to radiation damage

New research from Texas A&M University scientists could soon help boost the efficiency of nuclear power plants. Using a combination of physics-based modeling and advanced simulations, the researchers identified the key factors that cause radiation damage in nuclear reactors, insight that could guide the design of more radiation-tolerant, high-performance materials.

“Reactors need to run at either higher power or use fuels longer to increase their performance. But then, at these settings, the risk of wear and tear also increases,” said Dr. Karim Ahmed, assistant professor in the Department of Nuclear Engineering. “So, there is a pressing need to come up with better reactor designs, and a way to achieve this goal is by optimizing the materials used to build the nuclear reactors.”

The results of the study are published in the journal Frontiers in Materials.

According to the Department of Energy, nuclear energy surpasses all other natural resources in power output and accounts for 20% of the United States’ electricity generation. Nuclear energy comes from fission reactions, in which an isotope of uranium splits into daughter elements when struck by fast-moving neutrons. These reactions generate enormous heat, so nuclear reactor parts, particularly the pumps and pipes, are made of materials with exceptional strength and corrosion resistance.

However, fission reactions also produce intense radiation that degrades the nuclear reactor’s structural materials. At the atomic level, energetic radiation can knock atoms out of their lattice positions, leaving vacancies, or wedge displaced atoms into the spaces between lattice sites, forming interstitial defects. Both imperfections disrupt the regular arrangement of atoms within the metal's crystal structure. Over time, these tiny imperfections grow into voids and dislocation loops that compromise the material’s mechanical properties.
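The competition between defect production, recombination, and absorption can be sketched with a textbook point-defect rate-theory model. This is not the model used in the study, and all rate constants below are made-up illustrative values: radiation produces vacancies and interstitials at a rate K0, the two defect types annihilate each other when they meet, and both are absorbed at sinks such as grain boundaries.

```python
# Toy point-defect rate equations, integrated with forward Euler:
#   dc_v/dt = K0 - K_iv * c_i * c_v - K_s * c_v
#   dc_i/dt = K0 - K_iv * c_i * c_v - K_s * c_i
# All constants are illustrative, not fitted to any real material.
K0 = 1e-4    # defect production rate (fraction of sites per unit time)
K_iv = 10.0  # vacancy-interstitial recombination coefficient
K_s = 1e-2   # sink absorption coefficient

c_v = c_i = 0.0  # vacancy / interstitial site fractions
dt = 0.1
for _ in range(100_000):
    recombination = K_iv * c_i * c_v
    dc_v = K0 - recombination - K_s * c_v
    dc_i = K0 - recombination - K_s * c_i
    c_v += dt * dc_v
    c_i += dt * dc_i

print(f"steady-state vacancy fraction ~ {c_v:.4f}")  # ~ 0.0027
```

Even this toy version shows the nonlinearity the researchers describe: the recombination term couples the two defect populations, so production, sink strength, and microstructure cannot be tuned independently.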

While there is some understanding of the types of defects that radiation exposure creates in these materials, Ahmed said it has been arduous to model how radiation, together with other factors such as reactor temperature and the material's microstructure, contributes to the formation and growth of defects.

“The challenge is the computational cost,” he said. “In the past, simulations have been limited to specific materials and to regions spanning a few microns across, but if the domain size is increased to even tens of microns, the computational load drastically jumps.”

In particular, the researchers said that to accommodate larger domain sizes, previous studies compromised on the number of parameters in the simulation’s differential equations. But ignoring some parameters in favor of others yields an inaccurate description of the radiation damage.

To overcome these limitations, Ahmed and his team designed their simulation with all the parameters, making no assumption that any one of them was more pertinent than another. To handle the now computationally heavy task, they used resources provided by the Texas A&M High-Performance Research Computing group.

Upon running the simulation, their analysis revealed that only the full set of parameters, combined nonlinearly, yields an accurate description of radiation damage. In addition to the material’s microstructure, the radiation conditions within the reactor, the reactor design, and the temperature all matter for predicting radiation-induced instability in materials.

The researchers’ work also sheds light on why specialized nanomaterials are more tolerant of voids and dislocation loops. They found that instabilities are triggered only when grains, clusters of co-oriented atomic crystals enclosed by grain boundaries, exceed a critical size. Nanomaterials, with their extremely fine grains, therefore suppress these instabilities and become more radiation-tolerant.

“Although ours is a fundamental theoretical and modeling study, we think it will help the nuclear community to optimize materials for different types of nuclear energy applications, especially new materials for reactors that are safer, more efficient, and economical,” said Ahmed. “This progress will eventually increase our clean, carbon-free energy contribution.”

Dr. Abdurrahman Ozturk, a research assistant in the nuclear engineering department, is the lead author of this work. Merve Gencturk, a graduate student in the nuclear engineering department, also contributed to this research.

SwRI, UTSA win $1.5 million grant to study hypersonic separation events

Southwest Research Institute will advance hypersonics research in collaboration with The University of Texas at San Antonio (UTSA) under a three-year, $1.5 million grant through the University Consortium of Applied Hypersonics. As a subcontractor to UTSA, SwRI will design experiments to push the envelope on what is possible with hypersonic system designs and provide methods to better model complex system behavior during separation events.

Hypersonic speeds are those greater than five times the speed of sound, or Mach 5. At such speeds, the air around a flying object chemically decomposes, and some points behind the shockwave the vehicle creates are hotter than the surface of the Sun. This extreme chemical environment heats whatever travels through it, which can melt and chemically react with the air.
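For a sense of scale, Mach 5 can be put in everyday units, assuming sea-level conditions where the speed of sound is roughly 343 m/s:

```python
# Quick arithmetic: Mach 5 at assumed sea-level conditions (~20 °C),
# where the speed of sound is about 343 m/s.
speed_of_sound = 343.0       # m/s
mach_5 = 5 * speed_of_sound  # 1715 m/s
print(f"Mach 5 ~ {mach_5:.0f} m/s ~ {mach_5 * 3.6:.0f} km/h")
```

At that pace a vehicle covers a kilometer in well under a second, which is why separation events compress into milliseconds.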

SwRI engineers, led by Nicholas Mueschke, program manager of SwRI’s Computational Mechanics Section, are studying hypersonic separation events, in which two or more components intentionally come apart.

Separation events are commonplace in aerospace applications. Rocket boosters, for example, are ejected during space launches, and some now return to the launch pad after separation. Military aircraft must safely separate payloads carried underwing or within storage bays. And some rocket nose cones, designed to protect launch packages such as satellites, split open and separate from the vehicle in flight.

“Flying at hypersonic speeds within the atmosphere makes the aerodynamics and loads experienced by separating structures more difficult to predict and harder to safely design around because the time scales of these events are squeezed into milliseconds,” Mueschke said.

As next-generation hypersonic technology progresses, the ability to support separating components must also advance. A booster that separates from a vehicle, for example, allows for extended range and novel flight corridors. The challenge is designing components that can separate easily and avoid damaging or upsetting the primary vehicle, yet withstand the extreme aerodynamic and thermal environment of hypersonic flight.

SwRI is designing novel experiments to evaluate hypersonic system designs while also providing methods to better model complex system behavior during separation events. To accomplish this, the team is designing tests that can be conducted in the Institute’s two-stage light-gas gun, which simulates hypersonic flight conditions and allows researchers to image objects in hypersonic flight.

“The goal is to generate aerodynamic and kinematic data that will anchor high-fidelity simulation models,” Mueschke said. “We will also leverage some of our advanced simulation capabilities to both design these experiments and evaluate how simulation models can improve future vehicle designs. Ultimately, this work is part of the broader effort to leverage hypersonic technology to deliver operational capability and options to combatant commanders that otherwise don’t exist today.”

Mueschke and his colleagues began work under the new contract in October.

“It’s encouraging to see academia, government, and industry collaborating on multiyear efforts to advance hypersonics research,” Mueschke said. “I hope this effort will open new doors to operational capabilities we haven’t seen before.”