The Commerce Department's National Institute of Standards and Technology (NIST) presented 64 local staff members with the Departmental Bronze Medal and other NIST awards at ceremonies held today at NIST's Gaithersburg, Md., campus. In recognition of their awards, each of the recipients also received letters of congratulation from U.S. Senator Barbara Mikulski, U.S. Senator Benjamin Cardin and U.S. Representative Chris Van Hollen. The individual awards, their recipients and the citations describing the work leading to the honors are listed below.

Bronze Medal

The Bronze Medal Award is the highest honor presented by NIST. The award, initiated in 1966, recognizes work that has resulted in more effective and efficient management systems as well as the demonstration of unusual initiative or creative methods and procedures. It also is given for significant contributions affecting major programs, scientific accomplishment within the Institute, and superior performance of assigned tasks for at least five consecutive years.

    * Andrew J. Allen, Physicist, Materials Science and Engineering Laboratory

    For the development and application of X-ray and neutron scattering methods and instrumentation for characterizing complex, real-world materials.

    * Jacqueline K. DesChamps, Supervisory General Business Specialist, Baldrige National Quality Program

    For innovation in the design and delivery of key customer-focused and administrative processes required to operate the Baldrige National Quality Program.

    * Allen R. Hefner, Jr., Electronics Engineer, Electronics and Electrical Engineering Laboratory

    For leadership in the development of high megawatt power conversion technologies in collaboration with multiple federal agencies and their stakeholders.

    * R. Joseph Kline, Materials Research Engineer, Materials Science and Engineering Laboratory

    For outstanding contributions in grazing incidence X-ray diffraction of organic thin films to advance the development of organic electronic devices.

    * Juscelino B. Leao, Physicist, NIST Center for Neutron Research

    For the development and support of novel high pressure cells that have greatly expanded the research capabilities of the NIST Center for Neutron Research.

    * Warren D. Livengood, Electrical Engineer, Office of Chief Facilities Management Officer

    For leadership, engineering skill, and customer service in the installation of complex nano-fabrication researcher tools and laboratory renovations and system modifications.

    * Stephen E. Long, Research Chemist, Chemical Science and Technology Laboratory

    For leadership and expertise in the development and implementation of improved mercury metrology.

    * Egon Marx, Physicist, Manufacturing Engineering Laboratory

    For the development of advances in the fundamental theory of electromagnetic scattering and its application to accurate NIST metrology and standards.

    * Steven P. Mates, Mechanical Engineer, Materials Science and Engineering Laboratory

    For improving pipeline safety through the development of new methods to determine fatigue resistance in pipeline steels at near-actual explosion conditions.

    Three other members of this group are from the Materials Science and Engineering Laboratory in Boulder, Colo.

    * Alan L. Migdall, Physicist, Physics Laboratory

    For advancing photon-based metrology for applications in optical radiation measurement, fundamental physics, and quantum information.

    * Karen A. Scarfone, Computer Scientist, Information Technology Laboratory

    For leading the development of one of the world’s largest and most influential collections of computer security guidelines.

    * Ian B. Spielman, Physicist, Physics Laboratory

    For the development of methods to simulate condensed matter models by creating simple experimental realizations using ultra-cold atomic gases.

    * E. Ambler Thompson, Physical Scientist, Technology Services

    For leadership in developing documentary standards for international legal metrology, including quantities, units, and the International System of Units (SI).

    * R. Michael Verkouteren, Research Chemist, Chemical Science and Technology Laboratory

    For extraordinary innovation in developing piezoelectric inkjet printing for the calibration of vapor detection systems used for homeland security.

    * Christopher C. White, Research Chemist, Building and Fire Research Laboratory

    For outstanding scientific and engineering contributions to the Sealants Service Life Prediction Consortium.

    * Dale P. Bentz, Chemical Engineer; and Kenneth A. Snyder, Supervisory Physicist

    For outstanding research leading to the discovery of and patent application for VERDiCT, a technology for doubling the service life of concrete.

    The members of the group are from the Building and Fire Research Laboratory.

    * Nicholas G. Dagalakis, Mechanical Engineer; Daniel S. Sawyer, Mechanical Engineer; and Craig M. Shakarji, Mathematician

    For outstanding technical achievement in applying dimensional metrology in the invention of a computer assisted orthopedic surgery artifact.

    The members of the group are from the Manufacturing Engineering Laboratory.

    * Jeffrey A. Fagan, Chemical Engineer; Erik K. Hobbie, Physicist; and Frederick R. Phelan, Jr., Chemical Engineer

    For innovative methods to produce suspensions of high-purity single-wall carbon nanotubes with well-defined distributions of physical properties.

    The members of the group are from the Materials Science and Engineering Laboratory.

    * Joseph A. Falco, Mechanical Engineer; Richard J. Norcross, General Engineer; and Sandor Szabo, Computer Scientist

    For the development of performance test and measurement methods for automotive crash warning systems.

    The members of this group are from the Manufacturing Engineering Laboratory.

    * Muhammad Arif, Supervisory Physicist; Charles W. Clark, Chief, Electron and Optical Physics Division; Patrick P. Hughes, Physicist; Alan K. Thompson, Physicist; and Robert E. Vest, Physicist

    For the development of a new detector of neutrons based on the detection of extreme ultraviolet radiation from excited atoms produced by nuclear reaction.

    The members of the group are from the Physics Laboratory.

    * William I. MacGregor, Computer Scientist; Walter G. McDonough, Materials Engineer; Athanasios T. Karygiannis, Electronics Engineer; and Chad R. Snyder, Research Chemist

    For the analysis and certification of the U.S. Passport Card architecture resulting in a mitigation of security threats and privacy concerns.

    The members of the group are from the Materials Science and Engineering Laboratory and the Information Technology Laboratory. Four members not listed are from the Electronics and Electrical Engineering Laboratory in Boulder, Colo.

    * Sandra A. Nail, Human Resources Specialist; Louise G. Parrish, Supervisory Human Resources Specialist; Theresa W. Shuggars, Human Resources Assistant; and Marie G. Summerville, Human Resources Specialist

    For streamlining the NIST on-boarding process for new hires, resulting in productivity gains valued at more than $50,000 per year.

    The members of this group are from the Office of the Chief Human Capital Officer.

    * Mary P. Clague, General Business Specialist; Cynthia Huang, Computer Scientist; Jack E. Pevenstein, General Engineer; Sean K. Sell, Supervisory Computer Scientist; Cathy A. Smith, Administrative Specialist; Ronald B. Wilson, Computer Scientist; and Aiping L. Zhang, Information Technology Specialist

    For developing and implementing the web-based NIST Associate Information System (NAIS).

    The members of this group are from the Office of the Chief Information Officer and Technology Services.

Eugene Casson Crittenden Award

The Crittenden Award, established in 1967, recognizes superior achievement by permanent employees who perform supporting services that have a significant impact on technical programs beyond their own offices.

    * Tommy E. Armstrong, Engineering Technician, Office of the Chief Facilities Management Officer

    For improving the reliability of the communication network of the NIST site alarm system.

    * Carey E. Clark, Gardener, Office of the Chief Facilities Management Officer

    For outstanding contributions in maintaining the NIST campus and serving as the only certified arborist.

    * Bruce L. Connelly, Gardener, Office of the Chief Facilities Management Officer

    For outstanding contributions in maintaining the NIST campus and fulfilling a crucial role in the daily operations of the Grounds Shop.

    * David W. Easton, Motor Vehicle Operator, Office of the Chief Facilities Management Officer

    For developing and deploying new methods of communicating via the intranet with NIST transportation services customers.

    * Rose Estes Miller, Library Technician, Technology Services

    For providing outstanding interlibrary loan services to NIST Gaithersburg staff and associates.

    * Christopher F. Amigo, Engineering Technician, Manufacturing Engineering Laboratory

    For superior technical support to the Metallurgy and Ceramics divisions and for consistently fostering a culture of high quality customer service.

Judson C. French Award

The French award, first presented in 2000, is granted for significant improvement in products delivered directly to industry, including new or improved NIST calibration services, Standard Reference Materials and Standard Reference Databases.

    * Leonard M. Hanssen, Physicist, Physics Laboratory

    For developing and maintaining a critical Standard Reference Material (SRM 1921) which is used in the wavelength calibration of infrared spectrometers.

    * Aaron N. Johnson, Mechanical Engineer; and James R. Whetstone, Chief, Process Measurements Division

    For establishing a new calibration service for pipeline-scale flow metering of natural gas which enhances equity-in-trade for this critical energy sector.

    The members of this group are from the Chemical Science and Technology Laboratory.

Edward Bennett Rosa Award

The Rosa Award, established in 1964, is granted for outstanding achievement in or contributions to the development of meaningful and significant engineering, scientific or documentary standards either within NIST or in cooperation with other government agencies or private groups.

    * Steven T. Bushby, Supervisory Electronics Engineer, Building and Fire Research Laboratory

    For outstanding technical work and leadership in developing standards for computerized building energy management and control systems.

George A. Uriano Award

The Uriano Award, first presented in 1996, is granted for outstanding achievements by NIST staff in building or strengthening NIST extramural programs, with emphasis on fostering U.S. competitiveness and business excellence.

    * Dawn M. Bailey, Writer/Editor, Baldrige National Quality Program

    For outstanding leadership in developing Baldrige Case Studies, the premier annual training tool for using and learning the Baldrige Criteria.

    * Kari M. Reidy, General Business Specialist; J. Michael Simpson, Director, Systems Operations Office; Stephen J. Thompson, Supervisory Industrial Specialist; and Benjamin S. Vickery, Industrial Specialist

    For creating a national program focused on accelerating the growth of U.S. manufacturers in alliance with industry and NIST MEP’s manufacturing extension centers.

    The members of this group are from the Hollings Manufacturing Extension Partnership Program.

Jacob Rabinow Applied Research Award

The Jacob Rabinow Applied Research Award, first presented in 1975, is granted for outstanding achievements in the practical application of the results of scientific engineering research.

    * Jeffrey W. Gilman, Supervisory Research Chemist, Building and Fire Research Laboratory

    For a pioneering role in enabling a new class of fire retardant material—clay nanocomposites.

Equal Employment Opportunity/Diversity Award

The Equal Employment Opportunity/Diversity Award, first presented in 1977, is granted for exceptionally significant accomplishments and contributions to equal employment opportunity/diversity goals.

    * Angela R. Hight Walker, Chemist, Physics Laboratory

    For years of devotion to educational outreach, through compelling science demonstrations to students at NIST events and at local schools.

Colleagues’ Choice Award

The NIST Colleagues’ Choice Award, first presented in 2006, recognizes non-supervisory employees at NIST who, in the eyes of their colleagues, have made significant contributions that broadly advance the NIST mission and strategic goals or broadly contribute to the overall health and effectiveness of NIST.

    * William A. Kamitakahara, Physicist, NIST Center for Neutron Research

    For creating the NCNR Proposal System, the backbone of NCNR scientific operations and broadly hailed by the user community as among the world’s best.

As a non-regulatory agency of the U.S. Department of Commerce, NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life.

Silica is one of the most common minerals on Earth. Not only does it make up two-thirds of our planet's crust, it is also used to create a variety of materials, from glass and ceramics to computer chips and fiber optic cables. Yet new quantum mechanics results generated by a team of physicists from Ohio State University (OSU) show that this mineral only populates our planet superficially; in other words, silica is relatively uncommon deep within the Earth.

Using several of the largest supercomputers in the nation, including the National Energy Research Scientific Computing Center’s (NERSC) Cray XT4 "Franklin" system, the team simulated the behavior of silica in high-temperature, high-pressure environments that are particularly difficult to study in a lab. These details may one day help scientists predict complex geological processes like earthquakes and volcanic eruptions. Their results were published in the May 10 online early edition of the Proceedings of the National Academy of Sciences (PNAS).

"Silica is one of the simplest and most common minerals, but we still don't understand everything about it. A better understanding of silica on a quantum-mechanical level would be useful to earth science, and potentially to industry as well," says Kevin Driver, an OSU graduate student who was a lead author on the paper. "Silica adopts many different structural forms at different temperatures and pressures, not all of which are easy to study in the lab."

Over the past century, seismology and high-pressure laboratory experiments have revealed a great deal about the general structure and composition of the Earth. For example, such work has shown that the planet's interior structure exists in three layers called the crust, mantle, and core. The outer two layers—the mantle and the crust—are largely made up of silicates, minerals containing silicon and oxygen. Still, the detailed structure and composition of the deepest parts of the mantle remain unclear. These details are important for complex geodynamical modeling that may one day predict large-scale events, such as earthquakes and volcanic eruptions.

Driver notes that even the role that the simplest silicate—silica—plays in the Earth's mantle is not well understood. "Say you're standing on a beach, looking out over the ocean. The sand under your feet is made of quartz, a form of silica containing one silicon atom surrounded by four oxygen atoms. But in millions of years, as the oceanic plate below becomes subducted and sinks beneath the Earth's crust, the structure of the silica changes dramatically," he said.

As pressure increases with depth, the silica molecules crowd closer together, and the silicon atoms start coming into contact with more oxygen atoms from neighboring molecules. Several structural transitions occur: silicon is surrounded by four oxygen atoms in the low-pressure forms and by six in the higher-pressure forms. With even more pressure, the structure collapses into a very dense form of the mineral that takes on the structure of alpha-lead dioxide.

Driver notes that it is this form of silica that likely resides deep within the earth, in the lower part of the mantle, just above the planet's core. When scientists try to interpret seismic signals from that depth, they have no direct way of knowing what form of silica they are dealing with. They must rely on high-pressure experiments and computer simulations to constrain the possibilities. Driver and colleagues use a particularly high-accuracy, quantum mechanical simulation method to study the behavior of different silica forms, and then compare the results to the seismic data.

In PNAS, Driver, his advisor John Wilkins, and their coauthors describe how they used a quantum mechanical method to design computer algorithms that would simulate the silica structures. When they did, they found that the behavior of the dense, alpha-lead dioxide form of silica did not match up with any global seismic signal detected in the lower mantle. This result indicates that the lower mantle is relatively devoid of silica, except perhaps in localized areas where oceanic plates have subducted.

"As you might imagine, experiments performed at pressures near those of Earth's core can be very challenging. By using highly accurate quantum mechanical simulations, we can offer reliable insight that goes beyond the scope of the laboratory," said Driver.

Supercomputers Dissect Silica with Quantum Monte Carlo Calculations

Credit: Kevin Driver, Ohio State University
The structure of a six-fold coordinated silica phase called stishovite, a prototype for many more complex mantle minerals. Stishovite is commonly created in diamond anvil cells and often found naturally where meteorites have slammed into Earth. Driver and co-authors computed the shear elastic constant softening of stishovite under pressure with quantum Monte Carlo, using over 3 million CPU hours on Franklin.

The team's work was one of the first to show that quantum Monte Carlo (QMC) methods could be used to study complex minerals deep in the planet's interior. Although the algorithms have been around for over half a century, Driver notes that applying them to silica was simply too labor- and computer-intensive until recently.
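
To give a flavor of how a quantum Monte Carlo calculation works, the sketch below applies the same core idea, Metropolis sampling of a trial wave function, to a toy one-dimensional harmonic oscillator. It is only an illustrative sketch: the trial wave function exp(-alpha*x^2), the function names and the parameter values are assumptions chosen for this example, not part of the team's silica calculations, which use many-electron wave functions and vastly more computation.

    import math
    import random

    # Minimal variational Monte Carlo sketch for a toy 1-D harmonic oscillator
    # (V(x) = 0.5 * x^2, atomic units) with trial wave function psi(x) = exp(-alpha * x^2).
    # This only illustrates the Metropolis sampling behind QMC methods; it is not
    # the silica calculation described in the article.

    def local_energy(x, alpha):
        # E_L(x) = -0.5 * psi''/psi + V(x) for psi = exp(-alpha * x^2)
        return alpha + x * x * (0.5 - 2.0 * alpha * alpha)

    def vmc_energy(alpha, n_steps=200_000, step_size=1.0, seed=0):
        rng = random.Random(seed)
        x = 0.0
        total, accepted = 0.0, 0
        for _ in range(n_steps):
            x_new = x + rng.uniform(-step_size, step_size)
            # Metropolis acceptance ratio |psi(x_new)|^2 / |psi(x)|^2
            ratio = math.exp(-2.0 * alpha * (x_new * x_new - x * x))
            if rng.random() < ratio:
                x, accepted = x_new, accepted + 1
            total += local_energy(x, alpha)
        return total / n_steps, accepted / n_steps

    if __name__ == "__main__":
        for alpha in (0.3, 0.5, 0.7):
            energy, acceptance = vmc_energy(alpha)
            print(f"alpha={alpha:.1f}  <E>={energy:.4f}  acceptance={acceptance:.2f}")
        # alpha = 0.5 recovers the exact ground-state energy of 0.5 hartree.

Running the sketch shows the sampled energy reaching its minimum at alpha = 0.5, the exact ground state. Production QMC codes perform the same kind of stochastic averaging, but over thousands of electronic degrees of freedom, which is why millions of CPU hours were needed.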

"In total, we used the equivalent of six million CPU hours to model four different states of silica. Three million of those CPU hours involved using NERSC's Franklin system to calculate a shear elastic constant for silica with the QMC method. This is the first time it had ever been done," said Driver.

He notes that the QMC calculations on Franklin were completed during the system's pre-production phase, before the system was formally accepted by NERSC. During this phase, the center's 3,000 science users were encouraged to try out the Cray XT4 system to see if it could withstand the gamut of scientific demands from different disciplines. Driver notes that the Franklin results allow him to measure how silica deforms at different temperatures and pressures.

"From computing hardware to the consulting staff, the resources at NERSC are really excellent. The size and speed of the center’s machines is something that I don’t normally have access to at other places," says Driver.

"This work demonstrates both the superb contributions a single graduate student can make, and that the quantum Monte Carlo method can compute nearly every property of a mineral over a wide range of pressures and temperatures," said Wilkins. "The study will stimulate a broader use of quantum Monte Carlo worldwide to address vital problems."

He and his colleagues expect that quantum Monte Carlo will be used more often in materials science in the future, as the next generation of computers goes online. Coauthors on the paper included Ronald Cohen of the Carnegie Institution of Washington; Zhigang Wu of the Colorado School of Mines; Burkhard Militzer of the University of California, Berkeley; and Pablo López Ríos, Michael Towler, and Richard Needs of the University of Cambridge.

This research was funded by the National Science Foundation and the Department of Energy. In addition to NERSC, computing resources were also provided by the National Center for Atmospheric Research, the National Center for Supercomputing Applications, the Computational Center for Nanotechnology Innovations, the TeraGrid and the Ohio Supercomputer Center.

Horst Simon, an internationally recognized expert in computer science and applied mathematics, has been named Deputy Director of Lawrence Berkeley National Laboratory (Berkeley Lab).

“Horst is a strong leader who has helped to lead a tremendously productive program in high performance computing that is world-class,” said Berkeley Lab Director Paul Alivisatos. “As Deputy Director he’ll help me lead major scientific initiatives, oversee strategic research investments, and maintain the intellectual vitality of Berkeley Lab.”

Prior to this appointment, Simon served as Associate Lab Director for Computing Sciences. In his capacity as Associate Lab Director, Simon helped to establish Berkeley Lab as a world leader in providing supercomputing resources to support research in fields ranging from global climate modeling to astrophysics. He is also an adjunct professor in the College of Engineering at the University of California, Berkeley. In that role he worked to bring the Lab and the campus closer together, developing a designated graduate emphasis in computational science and engineering. In addition, he has worked with project managers from the Department of Energy, the National Institutes of Health, the Department of Defense and other agencies, helping researchers define their project requirements and solve technical challenges.

Simon joined Berkeley Lab in early 1996 as director of the newly formed National Energy Research Scientific Computing Center (NERSC), and was one of the key architects in establishing NERSC at its new location at Berkeley Lab. Under his leadership NERSC enabled important discoveries for research across a wide spectrum of scientific disciplines. Simon was also the founding director of Berkeley Lab’s Computational Research Division, which conducts applied research and development in computer science, computational science, and applied mathematics.

Simon's research interests are in the development of sparse matrix algorithms, algorithms for large-scale eigenvalue problems, and domain decomposition algorithms for unstructured domains for parallel processing. His algorithm research efforts were honored with the 1988 and the 2009 Gordon Bell Prize for parallel processing research. He was also a member of the NASA team that developed the NAS Parallel Benchmarks, a widely used standard for evaluating the performance of massively parallel systems. He is co-editor of the twice-yearly TOP500 list that tracks the most powerful supercomputers worldwide, as well as related architecture and technology trends.

He holds an undergraduate degree in mathematics from the Technische Universität in Berlin, Germany, and a Ph.D. in mathematics from the University of California at Berkeley. Simon succeeds Jay Keasling, who served as interim Deputy Director. Keasling will return to his duties as Chief Executive Officer of the Joint BioEnergy Institute.

As Deputy Director, Simon will receive an annual salary of $335,000. The Deputy Director’s salary, like that of all other UC employees at the laboratory, is paid from funds derived from the federal DOE contract. No general funds from the state are used to pay the Deputy Director’s salary. In accordance with university policy, he is eligible for participation in the UC Mortgage Origination Program. Simon also will receive standard pension and health and welfare benefits and standard senior management benefits, including senior manager life insurance and executive salary continuation for disability.

Lawrence Berkeley National Laboratory provides solutions to the world's most urgent scientific challenges including clean energy, climate change, human health, and a better understanding of matter and force in the universe. The Lab is a world leader in improving our lives and knowledge of the world around us through innovative science, advanced computing, and technology that makes a difference. Berkeley Lab is a U.S. Department of Energy (DOE) national laboratory. It conducts unclassified scientific research and is managed by the University of California for the DOE Office of Science.

Additional information

To read a Q & A with Horst Simon, visit
http://www.lbl.gov/Publications/Deputy-Director/about.html

Theoretical simulations of protein structures using better computational methods provide important information on the biological functions that make life possible.

Proteins are found everywhere in living bodies and are essential for life. Since the latter half of the twentieth century, scientists have conducted research in structural biology to determine the three-dimensional structures of proteins and to understand their functions on the basis of those structures. Research by the many scientists involved in the field has brought about a significant advance in our understanding of life phenomena. “However, the knowledge obtained solely by observing the structures of proteins is insufficient for a full understanding of the mechanism of how proteins function in the body,” says Yuji Sugita, associate chief scientist and leader of the Biomolecular Dynamics Simulation Research Team in the Theoretical Biochemistry Laboratory at the RIKEN Advanced Science Institute. With the aim of understanding protein functions at the atomic or molecular level, the laboratory simulates the behavior of proteins using various theoretical calculation methods, focusing in particular on molecular dynamics. The laboratory is also working on the development of new calculation methods in close collaboration with structural biologists from within RIKEN and from other institutions.

Structural biology and simulation

Sugita is inundated with requests for research collaboration from structural biologists within and outside RIKEN. “We use computers to simulate structural changes in biomacromolecules such as proteins and cellular membranes so as to clarify their functions. These people say, ‘I have just solved the three-dimensional structure of a new protein, so why don’t we conduct a joint simulation?’” he says. Proteins, responsible for many of life’s mechanisms, consist of chains of amino acids that fold into specific three-dimensional structures. The three-dimensional structure of a protein is closely related to its function, and so protein functions can be investigated by clarifying the three-dimensional structure. This is the essence of structural biology.

It was in 1958 that the three-dimensional structure of a protein was first elucidated. John Kendrew, a British biochemist, and others created crystals of myoglobin, which stores oxygen in muscle. They exposed the crystals to x-rays to clarify the three-dimensional structure at the atomic level, work for which they were awarded the 1962 Nobel Prize in Chemistry. This technique, x-ray crystallographic analysis, is still a mainstream technique in three-dimensional structural analysis. Nuclear magnetic resonance (NMR) is also widely used for three-dimensional structural analyses of proteins in solution.

As Sugita points out, “So far, more than 60,000 proteins have been clarified in terms of their three-dimensional structure, contributing significantly to our understanding of protein functions. However, the artificially arranged three-dimensional structure of protein crystals is not an exact match to the structure within the living organism. Proteins in a living organism either exist in solution in the cytoplasm or are embedded in biological membranes such as the cellular membrane and the endoplasmic reticulum membrane. Furthermore, a protein can change its structure dynamically by itself when it functions. However, when we want to observe these changes in greater detail, there is a limit to observation based only on x-ray analysis and NMR. Thus, much attention has been paid to computer-based molecular simulation techniques as a new approach that allows observation of the dynamics of proteins at the atomic level.”

How the three-dimensional structure of proteins is computed

To simulate the structural changes in proteins, atomic-level information on the three-dimensional structure of the target protein is required. The resolution of the information should be equivalent to that obtained by x-ray analysis or NMR. Because the structural data do not provide sufficient information on hydrogen atoms, the positions of the hydrogen atoms in and around the target protein are first predicted theoretically to create a full atomic model. When the target is a water-soluble protein, water molecules are arranged appropriately around the protein in the model. When the target is a membrane protein, it is embedded in a lipid bilayer membrane, and water molecules are arranged around it. Finally, the forces between the atoms are calculated and Newton’s classical equations of motion are solved to follow the dynamics of the protein, that is, how the positions of the atoms change with time. This procedure is known as molecular dynamics simulation.
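
As a rough illustration of what solving Newton's equations of motion looks like in practice, the sketch below integrates a toy two-atom system with the velocity Verlet scheme widely used in molecular dynamics codes. The force routine, constants and masses are assumptions made for this example; a real protein simulation evaluates a full force field (bond, angle, torsion, van der Waals and electrostatic terms) over very many atoms.

    import numpy as np

    # Minimal molecular dynamics sketch: velocity Verlet integration of Newton's
    # equations for two atoms joined by a single harmonic bond (illustrative units).

    K_BOND = 500.0   # harmonic force constant
    R0 = 1.0         # equilibrium bond length
    DT = 1.0e-3      # time step (stands in for the ~1 femtosecond step in real MD)

    def compute_forces(positions):
        """Forces from one harmonic bond between atoms 0 and 1."""
        r_vec = positions[1] - positions[0]
        r = np.linalg.norm(r_vec)
        f_on_1 = -K_BOND * (r - R0) * (r_vec / r)
        return np.array([-f_on_1, f_on_1])

    def velocity_verlet(positions, velocities, masses, n_steps):
        forces = compute_forces(positions)
        for _ in range(n_steps):
            velocities += 0.5 * DT * forces / masses[:, None]
            positions += DT * velocities
            forces = compute_forces(positions)
            velocities += 0.5 * DT * forces / masses[:, None]
        return positions, velocities

    if __name__ == "__main__":
        pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])  # slightly stretched bond
        vel = np.zeros_like(pos)
        masses = np.array([12.0, 16.0])                      # e.g. a C-O pair
        pos, vel = velocity_verlet(pos, vel, masses, n_steps=10_000)
        print("final bond length:", np.linalg.norm(pos[1] - pos[0]))

The two half-step velocity updates are what make velocity Verlet time-reversible and stable over long runs, which is why integrators of this family are the default in many MD packages.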

The molecular dynamics simulation of proteins started in 1977. “The first report was of results calculated for a small protein called BPTI, a chain of 58 amino acids, in a vacuum for a period of one picosecond. Accurate representation of the motion of atoms, however, requires calculation at intervals of one femtosecond, and the amount of calculation required grows between linearly and quadratically with the number of atoms. Computers at that time did not have sufficient computing power for such computations, so interactions with water molecules had to be neglected and the calculation time was limited to very short periods.”

Current-day computing power makes it easy to simulate a target protein and surrounding water molecules for periods of one microsecond, and a BPTI protein in water can even be simulated for a period of up to one millisecond. Molecular dynamics simulations are now being applied to more complex, large-scale models such as membrane proteins embedded in biological membranes and DNA-protein complexes.
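
To put those simulation lengths in perspective, here is a back-of-the-envelope count (not from the article) of the femtosecond integration steps they imply:

    # Rough step counts implied by a 1-femtosecond integration time step.
    FEMTOSECOND = 1.0e-15  # seconds

    durations = [
        ("1 picosecond (the 1977 BPTI run)", 1.0e-12),
        ("1 microsecond (routine today)", 1.0e-6),
        ("1 millisecond (BPTI in water)", 1.0e-3),
    ]
    for label, seconds in durations:
        steps = seconds / FEMTOSECOND
        print(f"{label}: about {steps:.0e} integration steps")

The jump from roughly a thousand steps in 1977 to a trillion steps for a millisecond trajectory is why the million-fold increase in computing power mentioned in the next paragraph matters so much.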

Sugita notes that there are three factors contributing to the rapid development of the molecular dynamics simulation of proteins. The first is the dramatic improvement in computing power, which at present is at least a million times what it was when molecular dynamics simulation first began. The second is the amount of information on three-dimensional structures accumulated through the development of structural biology, and the third is the development of new calculation methods. “The molecular dynamics simulation of proteins has developed rapidly, supported by greatly increased computing power, three-dimensional structure information, and various calculation methods. We have also made a contribution through the development of calculation methods.”

Development of an epoch-making replica-exchange molecular dynamics method

Sugita developed a new calculation technique called the ‘replica-exchange molecular dynamics method’ while he worked at the Institute for Molecular Science. “The three-dimensional structure of a protein is most stable when it is at the lowest energy level. However, the energy of a protein is constantly changing, and its energy landscape is rough, with many valleys and cliffs, so the protein has many metastable states. A calculation may become trapped in a valley and lost in the middle of this vast energy landscape. We could get out of the valley if we used an ultra-high-performance computer for an indefinite period of time, but that is not realistic. Under these circumstances, the replica-exchange molecular dynamics method, which Prof. Yuko Okamoto of the Institute for Molecular Science (now at Nagoya University) and I jointly developed, is one of the most effective calculation methods for solving this problem.”

The replica-exchange molecular dynamics method is based on a technique developed in theoretical solid-state physics, which was modified and applied to molecular dynamics calculations. In the replica-exchange molecular dynamics method, multiple replicas of a target protein are prepared. These individual replicas are simulated simultaneously at different temperatures, the temperatures are exchanged during the calculations, and the operation is repeated (Fig. 1). “We can obtain energetically stable structures when calculations are performed at low temperatures, but it is difficult to get out of the valleys. In contrast, we can easily get out of the valleys when calculations are performed at high temperatures, but we tend to obtain unstable structures. So we exchange the temperatures to take advantage of both kinds of calculation and to derive various structures. We use these results to reconstruct the structural changes of the protein at a constant temperature and to achieve a long effective simulation period. A normal molecular dynamics calculation would require several hundred to several thousand times the computing time needed for the replica-exchange molecular dynamics method.”
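
A minimal sketch of the temperature-exchange step described above follows. It uses the standard Metropolis criterion for swapping neighboring temperatures; the temperatures, energies and function names are illustrative assumptions, not Sugita and Okamoto's actual implementation.

    import math
    import random

    # Sketch of the exchange step in replica-exchange MD: neighboring temperatures
    # are swapped with a Metropolis criterion so that low-temperature replicas can
    # escape local minima by way of the high-temperature ones.

    K_B = 0.0019872  # Boltzmann constant in kcal/(mol*K)

    def attempt_exchanges(energies, temperatures, rng):
        """Try to swap each neighboring temperature pair; return the updated temperatures."""
        temps = list(temperatures)
        for i in range(len(temps) - 1):
            beta_i = 1.0 / (K_B * temps[i])
            beta_j = 1.0 / (K_B * temps[i + 1])
            # Acceptance probability: min(1, exp[(beta_i - beta_j) * (E_i - E_j)])
            delta = (beta_i - beta_j) * (energies[i] - energies[i + 1])
            if delta >= 0 or rng.random() < math.exp(delta):
                temps[i], temps[i + 1] = temps[i + 1], temps[i]
        return temps

    if __name__ == "__main__":
        rng = random.Random(42)
        temperatures = [300.0, 350.0, 410.0, 480.0]      # kelvin, one per replica
        energies = [-1250.0, -1238.0, -1221.0, -1199.0]  # illustrative potential energies
        # In a full implementation each replica runs a stretch of MD between
        # exchange attempts, and its potential energy is re-evaluated here.
        print(attempt_exchanges(energies, temperatures, rng))

Because each replica's MD run between exchange attempts is independent, the replicas map naturally onto separate processors, which is the source of the parallel efficiency noted in the next paragraph.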

The replica-exchange molecular dynamics method is computationally very efficient because the individual replicas are calculated in parallel. The method has been included in major molecular simulation software packages and is used widely around the world. The original article has been cited more than 600 times, and that number is still increasing. “Thanks to the replica-exchange molecular dynamics method, we are now able to simulate the folding of proteins in water, which had been considered impossible. The development of calculation methods for simulation can be compared to the development of equipment for experimental science. Just as advanced experimental results come from groups with new equipment, new simulations come from groups that actively develop new calculation methods.”

Moving proteins as if they were alive

“The advantage of simulation research is its versatility. Once calculation methods have been examined in detail, we can apply them to various life phenomena,” says Sugita. In his laboratory, researchers are working on simulations of phenomena such as structural changes in membrane proteins, structural prediction of amyloid proteins (considered to be a major cause of Alzheimer's disease), folding and degenerative processes of proteins in water, and the behavior of lipid molecules in biological membranes. “I have been amazed by the ingenious functions of many biomacromolecules. In particular, I was deeply impressed by the complex molecular mechanism by which the calcium ion pump changes its structure to enable active transport of Ca2+. This is a research project I have been continuing in collaboration with Prof. Chikashi Toyoshima of the Institute of Molecular and Cellular Biosciences at the University of Tokyo since my time working at that institute.”

The Ca2+ pump (Fig. 2) is a membrane protein embedded in the endoplasmic reticulum membrane in muscle cells that transports Ca2+ from the cytoplasm into the endoplasmic reticulum. The application of x-ray crystallographic analysis to membrane proteins is very difficult because they do not crystallize easily. Toyoshima and others, however, successfully determined multiple three-dimensional structures of the Ca2+ pump in different states. Sugita is using computer simulation to connect and arrange all of these ‘snapshots’ in order. “I can really see the true meaning of ‘understanding functions through three-dimensional structures’ when I discuss the structures of the Ca2+ pump with Prof. Toyoshima.”

However, they are only halfway toward meeting the challenge of understanding the functions of the Ca2+ pump through its three-dimensional structure. The driving force that the Ca2+ pump requires for ion transport is the chemical energy released when adenosine triphosphate (ATP) is hydrolyzed into adenosine diphosphate (ADP). However, molecular dynamics simulations based on classical mechanics are unable to deal with chemical reactions, so the effects of the chemical reactions cannot be taken into account. “We need to combine multiple theoretical calculation methods, including quantum-chemistry calculations.” Setting this challenge as one of his goals, Sugita started the RIKEN Theoretical Biochemistry Laboratory in 2007.

Translocon is a membrane protein that transports proteins from the cytoplasm of a cell across its biological membrane. When a translocon is isolated, it is in a closed form in which the hole of the translocon is plugged. The plug opens when a partner protein, such as SecA, binds to the translocon, allowing the transport of proteins (right). A recently found Thermus thermophilus-derived translocon, however, combines with antibody molecules (Fab) to assume a special form different from the closed form (upper left). Simulation of the molecular dynamics of the translocon without antibody molecules for a period of 100 ns has revealed that the translocon changes its structure and returns to the closed form when the antibody is removed. The simulation also showed that the structure with the bound antibody molecule is in a ‘pre-open’ form at the early protein transport stage.

One of the laboratory's recent research targets is the simulation of a membrane protein called translocon, which has the dual function of transporting proteins across a biological membrane and embedding other membrane proteins in biological membranes (Fig. 3). The three-dimensional structure of translocon had been clarified to have a closed form before a protein is transported, but Osamu Nureki of the University of Tokyo and others discovered another three-dimensional structure of translocon, one involving a bound antibody molecule. They wanted to prove that this three-dimensional structure represents the stage following the closed form, and asked Sugita for his cooperation. “Dr Takaharu Mori, a contract researcher in our laboratory, and his team successfully conducted a simulation for a period of 100 nanoseconds. We found that the new structure changed into the closed form when the antibody molecule was removed. The article, which included images generated by our simulations, was published in Nature in 2008.” The team is now moving forward with joint research toward elucidating the molecular mechanism by which translocon transports proteins.

Sugita’s laboratory is also making progress on the simulation of several membrane proteins with as yet undisclosed three-dimensional structures. These studies have been made possible by close collaboration between the theoretical biochemistry laboratory and structural biologists.

The field of protein simulation is very competitive, but Sugita has confidence in his laboratory. “In our laboratory, we work on both the development of calculation methods and computer simulation. We also have a close relationship with structural biologists. Furthermore, RIKEN’s computer environment is wonderful. We have no parallel in the world in our ability to make progress with our research because of these conditions.” In addition to their own computers, they can use MDGRAPE-3, a special-purpose computer system for molecular dynamics simulations incorporated into the RICC, the RIKEN Supercomputer System. The RICC supercomputer is very powerful, and MDGRAPE-3 is the largest system in Japan for molecular dynamics simulations of proteins.

Suyong Re, another contract researcher, and his team, who work on simulations based on quantum chemistry, are beginning to produce visible results. However, they need a faster computer because the amount of calculation required in quantum chemistry is proportional to the fourth power of the number of atoms. “We will get molecular dynamics and quantum chemistry simulations into full swing using the Next-Generation Supercomputer that RIKEN is now developing. We want to move large and complex proteins such as membrane proteins in the simulation as if they were alive. We have decided on a time length of one millisecond as a target because that time length allows us to observe the entire function cycle of a protein. Next, we would like to attempt the simulation of life phenomena involving multiple proteins,” says Sugita.
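
As a rough illustration of that fourth-power scaling (the numbers are illustrative, not taken from the article), doubling the number of atoms multiplies the cost of such a calculation by roughly sixteen:

    # Illustrative cost growth for a method whose cost scales as N^4.
    def relative_cost(n_atoms, reference=100):
        return (n_atoms / reference) ** 4

    for n in (100, 200, 400, 800):
        print(f"{n} atoms: ~{relative_cost(n):.0f}x the cost of a 100-atom calculation")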

Inspiration for theoretical biochemistry

In 1987, shortly before Sugita entered the Faculty of Science at Kyoto University, Susumu Tonegawa, director of the RIKEN Brain Science Institute, won the Nobel Prize for Physiology or Medicine. Sugita was drawn to biology, but he could not give up his interest in physics, so he enrolled in Nobuhiro Go's laboratory, which was conducting research applying theory and calculation to biology. “Go's laboratory belonged to the Department of Chemistry. At first, I was not interested in chemistry because I thought it was purely a matter of memorization. However, as I studied it further, I found it more and more interesting. In a broader sense, chemistry is the study of various phenomena driven by the interactions between atoms and molecules. We use the information obtained through theoretical calculations and experiments to understand the functions and structures of proteins at the atomic or molecular level. I once asked Dr Ryoji Noyori, president of RIKEN, whether our research activities were in the area of chemistry. I was very pleased when he told me that our research activities were exactly in the area of chemistry,” says Sugita. The Theoretical Biochemistry Laboratory symbolizes Sugita's research interests. “I would like to focus on understanding living organisms from the perspective of chemistry, based on molecules and atoms.”

ICHEC to Work with SGI to Provide Consultancy Services for GPGPU-Based Applications

SGI today announced its partnership with the Irish Centre for High-End Computing (ICHEC) to provide customers with complete consultation services in the growing field of General-Purpose Computing on Graphics Processing Units (GPGPU). ICHEC will work exclusively with SGI to enable institutions and businesses to accelerate the development of applications that utilize GPGPU systems.

Although graphics processors are traditionally associated with gaming, their use for other applications is a rapidly emerging trend. GPU technology has so far largely benefited intensive scientific computing, but more general-purpose use is expected to provide solutions to a much broader range of disciplines.

“More users are beginning to turn to GPGPU systems to gain a competitive advantage in a wider range of fields,” said Peter Luff, professional services director at SGI. “This partnership with ICHEC will offer a broad range of consultancy services – including proof-of-concept development, benchmarking, porting and training – enabling businesses to accelerate the development, implementation and roll-out of GPGPU systems.”

Founded in 2005, ICHEC was established as a national high performance computing (HPC) provider with offices in Dublin and Galway, Ireland. Its team of system administrators and computational scientists works with researchers to support the development of internationally competitive computational modeling and research across a variety of disciplines and institutions. In the past year, ICHEC has established a significant team of researchers focused primarily on porting applications to GPGPU architectures.

“Our focus has been placed firmly on developing new algorithms capable of exploiting the considerable power of GPGPUs. For example, recent results from our port of the DL_POLY MD application suggest that this technology indeed has the potential to make HPC accessible to a broader audience. We expect similar success with our ongoing port of Quantum Espresso,” said James Slevin, director of ICHEC. “By partnering with SGI, ICHEC will provide business, scientific and academic customers with a high level of consultancy to support the development of effective computing solutions.”

For more information on ICHEC’s GPU activities, please visit www.ichec.ie/research/gpgpu_projects.
