Headlines

SC Online News is the only newspaper covering the rapidly evolving supercomputer marketplace.
  1. For 2017, Cray reported total revenue of $392.5 million, which compares with $629.8 million in 2016. Net loss for 2017 was $133.8 million, or $3.33 per diluted share, compared to net income of $10.6 million, or $0.26 per diluted share in 2016.  Non-GAAP net loss, which adjusts for selected unusual and non-cash items, was $40.5 million, or $1.01 per diluted share for 2017, compared to non-GAAP net income of $19.9 million, or $0.49 per diluted share in 2016.

    Revenue for the fourth quarter of 2017 was $166.6 million, compared to $346.6 million in the fourth quarter of 2016.  Net loss for the fourth quarter of 2017 was $97.5 million, or $2.42 per diluted share, compared to net income of $51.8 million, or $1.27 per diluted share in the fourth quarter of 2016.  Non-GAAP net income was $9.2 million, or $0.22 per diluted share for the fourth quarter of 2017, compared to non-GAAP net income of $56.3 million, or $1.38 per diluted share for the same period in 2016.

    The Company’s GAAP Net Loss for the fourth quarter and year ended December 31, 2017 was significantly impacted by both the enactment of the Tax Cuts and Jobs Act of 2017 and by its decision to record a valuation allowance against all of its U.S. deferred tax assets.  The combined GAAP impact totaled $103 million.  These items have been excluded for non-GAAP purposes.

    For 2017, overall gross profit margin on a GAAP and non-GAAP basis was 33% and 34%, respectively, compared to 35% on a GAAP and non-GAAP basis for 2016.

    Operating expenses for 2017 were $196.4 million, compared to $211.1 million in 2016.  Non-GAAP operating expenses for 2017 were $176.5 million, compared to $199.7 million in 2016.  GAAP operating expenses in 2017 included $8.6 million in restructuring charges associated with the company's recent workforce reduction.

    As of December 31, 2017, cash, investments and restricted cash totaled $147 million.  Working capital at the end of the fourth quarter was $354 million, compared to $373 million at December 31, 2016.
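
    As a rough cross-check (purely illustrative arithmetic, not figures from Cray's filing), the per-share numbers can be reconciled with the reported totals by backing out the implied diluted share count:

    ```python
    # Back-of-the-envelope check of Cray's reported per-share figures.
    # The dollar amounts are from the release above; the implied share
    # counts are derived here for illustration, not reported by Cray.

    full_year_net_loss = 133.8e6      # 2017 GAAP net loss, USD
    full_year_loss_per_share = 3.33   # 2017 GAAP net loss per diluted share, USD

    q4_net_loss = 97.5e6              # Q4 2017 GAAP net loss, USD
    q4_loss_per_share = 2.42          # Q4 2017 GAAP net loss per diluted share, USD

    implied_fy_shares = full_year_net_loss / full_year_loss_per_share
    implied_q4_shares = q4_net_loss / q4_loss_per_share

    print(f"Implied diluted shares, full year: {implied_fy_shares / 1e6:.1f} million")
    print(f"Implied diluted shares, Q4:        {implied_q4_shares / 1e6:.1f} million")
    # Both come out at roughly 40 million diluted shares, so the totals and
    # per-share figures quoted above are mutually consistent.
    ```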

    “Despite difficult conditions in our core market we finished 2017 strong, highlighted by several large acceptances at multiple sites around the world, including completing the installation of what is now the largest supercomputing complex in India at the Ministry of Earth Sciences,” said Peter Ungaro, president and CEO of Cray.  “As we shift to 2018, we’re seeing signs of a rebound at the high-end of supercomputing as well as considerable growth opportunities in the coming years.  Supercomputing continues to expand in importance to both government and commercial customers, driving growth and competitiveness across many different disciplines and industries.  As the leader at the high-end of the market, we’re poised to play a key role in this growth and I’m excited about where we’re headed.”

  2. Quantum entanglement is a key feature of a quantum supercomputer. Yet, how can we verify that a quantum supercomputer indeed incorporates large-scale entanglement? Using conventional methods is hard since they require a large number of repeated measurements. Aleksandra Dimić from the University of Belgrade and Borivoje Dakić from the Austrian Academy of Sciences and the University of Vienna have developed a novel method where in many cases even a single experimental run suffices to prove the presence of entanglement. Their surprising results will be published in the online open access journal npj Quantum Information of the Nature Publishing Group.

    The ultimate goal of quantum information science is to develop a quantum computer, a fully-fledged controllable device which makes use of the quantum states of subatomic particles to store information. As with all quantum technologies, quantum supercomputing is based on a peculiar feature of quantum mechanics, quantum entanglement. The basic units of quantum information, the qubits, need to correlate in this particular way in order for the quantum supercomputer to achieve its full potential.

    One of the main challenges is to make sure that a fully functional quantum supercomputer works as anticipated. In particular, scientists need to show that its large number of qubits is reliably entangled. Conventional methods require many repeated measurements on the qubits for reliable verification: the more often a measurement run is repeated, the more certain one can be about the presence of entanglement. Benchmarking entanglement in large quantum systems therefore requires a great deal of time and resources, which is practically difficult or simply impossible. This raises the main question: can entanglement be proven with only a small number of measurement trials?

    Now researchers from the University of Belgrade, the University of Vienna and the Austrian Academy of Sciences have developed a novel verification method which requires significantly fewer resources and, in many cases, only a single measurement run to prove large-scale entanglement with high confidence. For Aleksandra Dimić from the University of Belgrade, the best way to understand this phenomenon is to use the following analogy: "Let us consider a machine which simultaneously tosses, say, ten coins. We manufactured the machine such that it should produce correlated coins. We now want to validate whether the machine produces the anticipated result. Imagine a single trial revealing all coins landing on tails. This is a clear signature of correlations, as ten independent coins have roughly a 0.1% chance to land on the same side simultaneously. From such an event, we certify the presence of correlations with more than 99.9% confidence. This situation is very similar to quantum correlations captured by entanglement." Borivoje Dakić says: "In contrast to classical coins, qubits can be measured in many, many different ways. The measurement result is still a sequence of zeros and ones, but its structure heavily depends on how we choose to measure individual qubits", he continues. "We realized that, if we pick these measurements in a peculiar way, entanglement will leave unique fingerprints in the measured pattern", he concludes.
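
    For a sense of the arithmetic behind the coin analogy (an illustrative sketch, not code from the study), a single toss of ten fair, independent coins landing all tails is already enough to back the quoted confidence figure:

    ```python
    # Sketch of the coin-toss analogy (illustrative only, not the authors' code).
    # Probability that ten fair, independent coins all land tails in one toss.

    n_coins = 10
    p_all_tails = 0.5 ** n_coins       # (1/2)**10, about 0.1%

    # If the coins really were independent, a single all-tails outcome would be
    # this unlikely, so observing it once supports the presence of correlations
    # with confidence 1 - p_all_tails.
    confidence = 1.0 - p_all_tails

    print(f"Chance of all tails with independent coins: {p_all_tails:.4%}")    # ~0.0977%
    print(f"Confidence in correlations after one such toss: {confidence:.2%}") # ~99.90%
    ```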

    The developed method promises a dramatic reduction in the time and resources needed for the reliable benchmarking of future quantum devices.

  3. The U.S. Army Research Laboratory (ARL) has selected ICF, a global consulting and digital services provider, as one of two awardees eligible to compete for scientific and engineering support services under a new indefinite delivery, indefinite quantity (IDIQ) contract.

    The new Command, Control, Communications, Computers, Combat Systems, Intelligence, Surveillance, and Reconnaissance (C5ISR) IDIQ was awarded by the Computational and Information Sciences Directorate, the principal Army organization for basic and applied research in information sciences, network sciences, battlefield environment and advanced computing and computational sciences. The agreement has a shared ceiling value of $175 million and a possible term of up to eight years.

    "ICF's selection for this work further distinguishes us as leaders in a broader spectrum of information sciences," said Randy James, senior vice president for ICF. "It allows ICF to promote the initiatives of ARL's research partners in academia, industry and across the federal government and to continue the lab's global scientific leadership and STEM outreach efforts. Through this effort, ICF will support ARL's goals in achieving global superiority to deliver the next generation of computer and security technologies."

    ICF was also awarded its first task order under the IDIQ: an estimated $20 million engagement to perform full-spectrum defensive cyber operations and research and development. The firm will apply its expertise as a Cybersecurity Service Provider to create opportunities for new concepts and technologies to be researched, tested and applied in operational environments and then transitioned to the joint warfighter in support of the nation's defense. The task order has a term of up to three years, comprising one base year and two one-year options.

    For more than two decades, ICF has partnered with ARL to develop new concepts and research, test and apply new technologies to our nation's defense. In 2017, ICF was re-engaged to support ARL's Cybersecurity Service Program where the firm is currently helping in the lab's efforts to develop cyber tools and techniques and advance state-of-the-art computer network defense.

    ICF's cybersecurity specialists help military, national security and commercial clients build and successfully defend the most aggressively attacked infrastructures on the planet.

  4. A hole at the heart of a stunning rose-like interstellar cloud has puzzled astronomers for decades. But new research, led by the University of Leeds, offers an explanation for the discrepancy between the size and age of the Rosette Nebula's central cavity and the age of its central stars.

    The Rosette Nebula is located in the Milky Way Galaxy roughly 5,000 light-years from Earth and is known for its rose-like shape and distinctive hole at its centre. The nebula is an interstellar cloud of dust, hydrogen, helium and other ionized gases with several massive stars found in a cluster at its heart.

    Stellar winds and ionising radiation from these massive stars affect the shape of the giant molecular cloud. But the size and age of the cavity observed in the centre of the Rosette Nebula are too small when compared to the age of its central stars.

    Through supercomputer simulations, astronomers at Leeds and at Keele University have found that the Nebula most likely formed in a thin, sheet-like molecular cloud rather than in the spherical or thick disc-like cloud that some photographs may suggest. A thin, disc-like cloud structure that focuses the stellar winds away from the cloud's centre would account for the comparatively small size of the central cavity.

    Study lead author, Dr Christopher Wareing, from the School of Physics and Astronomy said: "The massive stars that make up the Rosette Nebula's central cluster are a few million years old and halfway through their lifecycle. For the length of time their stellar winds would have been flowing, you would expect a central cavity up to ten times bigger.

    "We simulated the stellar wind feedback and formation of the nebula in various molecular cloud models including a clumpy sphere, a thick filamentary disc and a thin disc, all created from the same low density initial atomic cloud.

    "It was the thin disc that reproduced the physical appearance - cavity size, shape and magnetic field alignment -- of the Nebula, at an age compatible with the central stars and their wind strengths.

    "To have a model that so accurately reproduces the physical appearance in line with the observational data, without setting out to do this, is rather extraordinary.

    "We were also fortunate to be able to apply data to our models from the ongoing Gaia survey, as a number of the bright stars in the Rosette Nebula are part of the survey.

    "Applying this data to our models gave us new understanding of the roles individual stars play in the Rosette Nebula. Next we'll look at the many other similar objects in our Galaxy and see if we can figure out their shape as well."

    The simulations, published today in the Monthly Notices of the Royal Astronomical Society, were run using the Advanced Research Computing centre at Leeds. The nine simulations required roughly half a million CPU hours -- the equivalent of 57 years on a standard desktop computer.
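
    The quoted figure is simple arithmetic (assuming a desktop running a single core around the clock):

    ```python
    # Rough conversion of the quoted compute budget (illustrative arithmetic only).
    cpu_hours = 500_000            # roughly half a million CPU hours for the nine simulations
    hours_per_year = 24 * 365      # one core running around the clock

    desktop_years = cpu_hours / hours_per_year
    print(f"About {desktop_years:.0f} years on a single-core desktop computer")  # ~57 years
    ```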

    Martin Callaghan, a member of the Advanced Research Computing team, said: "The fact that the Rosette Nebula simulations would have taken more than five decades to complete on a standard desktop computer is one of the key reasons we provide powerful supercomputing research tools. These tools enabled the simulations of the Rosette Nebula to be done in a matter of a few weeks."

  5. Scientists from the Universities of Bristol and Parma, Italy, have used molecular simulations to understand resistance to osimertinib - an anticancer drug used to treat types of lung cancer.

    Osimertinib binds tightly to a protein, epidermal growth factor receptor (EGFR), which is overexpressed in many tumours.

    EGFR is involved in a pathway that signals for cell proliferation, and so is a target for drugs. Blocking the action of EGFR (inhibiting it) can switch it off, and so is a good way to treat the disease.

    Osimertinib is an effective anticancer drug that works in this way. It is used to treat non-small-cell lung cancer (NSCLC), in cases where the cancer cells have a particular (T790M) mutant form of EGFR.

    It is a so-called 'third-generation' EGFR inhibitor, which was approved as a cancer treatment in 2017. Osimertinib is a covalent inhibitor: as such, it binds irreversibly to EGFR by forming a chemical bond with it.

    Although patients generally respond well to osimertinib, most acquire drug resistance within one year of treatment, so the drug stops working.

    Drug resistance arises because the EGFR protein mutates, so that the drug binds less tightly.

    One such mutation, called L718Q, was recently discovered in patients in the clinic by the Medical Oncology Unit of the University Hospital of Parma.

    In this drug resistant mutant, a single amino acid is changed. Unlike other drug resistant mutants, it was not at all clear how this change stops the drug from binding effectively, information potentially crucial in developing new drugs to overcome resistance.

    Now, a collaboration between medicinal and computational chemists and clinical oncologists has revealed exactly how subtle changes in the protein target cause drug resistance.

    Using a range of advanced molecular simulation techniques, scientists from the Universities of Bristol and Parma, Italy, showed that the structure of the mutant protein changes in a way that stops the drug reacting and binding to it.

    Adrian Mulholland, Professor of Chemistry at the University of Bristol, said: "This work shows how molecular simulations can reveal mechanisms of drug resistance, which can be subtle and non-obvious.

    "In particular, here we've used combined quantum mechanics/molecular mechanics (QM/MM) methods, which allow us to study chemical reactions in proteins.

    "This is crucial in investigating covalent inhibitors, which react with their biological targets, and are the focus of growing interest in the pharmaceutical industry."

    His collaborators, Professor Alessio Lodola and Professor Marco Mor of the Drug Design and Discovery group at the University of Parma, added: "It was an exciting experience to work closely with clinical colleagues who identified the mutant, and to help analyse its effects.

    "Now the challenge is to exploit this discovery in the development of novel drugs targeting EGFR mutants for cancer treatment in future." 

  6. Data is vital in research that tackles diseases like cancer and heart disease. Now Swansea University's leading role in this field has been recognised again, with news today that it is to become one of six substantive sites of the newly-formed Health Data Research UK (HDR UK), in a strategic partnership with Queen's University Belfast.

    The six sites in the HDR UK project will share an initial investment of £30 million over the next five years. Swansea University's involvement stems from its world-leading expertise in health informatics. The project will also strengthen the UK's position at the forefront of population data science.

    The Wales and Northern Ireland HDR UK site is led by Professor Ronan Lyons at Swansea and Professor Mark Lawler at Queen's. The team will focus upon two major research initiatives: Modernising Public Health and Enabling Precision Medicine.

    Both partner sites share a vision: to scale up the quantity and impact of research in scientific discovery and its translation, patient and population health, and policy and economic development by addressing major health challenges, and to lead interdisciplinary research aligned to HDR UK's mission.

    On confirmation of the Award, Professor Lyons of Swansea University Medical School commented:

    "I am thrilled that Swansea is to become a founding substantive site of HDR UK. One of Swansea University's key strengths has always been to work collaboratively both with UK and global partners and also across disciplines based in Medicine, Computer Science, Mathematics and Engineering.

    "This will help strengthen our position at the forefront of rapidly advancing science to benefit the health, wellbeing and prosperity of the population. I look forward to working with colleagues at the other HDR UK substantive sites to apply cutting-edge science to address the most pressing health challenges."

    This Award sees Swansea at the core of a collaborative research community working together to deliver the priorities of Health Data Research UK. This initial funding is awarded following a rigorous application process, which included interviews with an international panel of experts.

    Professor Andrew Morris, Director of Health Data Research UK, said, "I am delighted to make today's announcement, which marks the start of a unique opportunity for scientists, researchers and clinicians to use their collective expertise to transform the health of the population.

    "The six HDR UK sites, comprising 21 universities and research institutes, have tremendous individual strengths and will form a solid foundation for our long-term ambition. By working together and with NHS and industry partners to the highest ethical standards, our vision is to harness data science on a national scale.

    "This will unleash the potential for data and technologies to drive breakthroughs in medical research, improving the way we are able to prevent, detect and diagnose diseases like cancer, heart disease and asthma.

    "I am grateful to our funders who recognise the importance of collaboration at scale, and the pivotal contribution of health data research to the UK's ambition to be a global leader in life sciences, for health and economic benefit."

    This is the first phase of investment to establish Health Data Research UK. A further £24 million will be invested in upcoming activities, including a Future Talent Programme and work to address targeted data research challenges through additional partnership sites.

    Health Data Research UK is committed to the highest ethical standards and will work with experts in public engagement to ensure the public voice is central to its activity. It will work at scale and forge national and international partnerships to deliver:

    • New scientific discovery
    • A vibrant training environment for the next generation of data scientists
    • The creation of a trustworthy UK-wide research and innovation ecosystem for health data research.

    Health Data Research UK is a joint investment co-ordinated by the Medical Research Council, working in partnership with the British Heart Foundation, the National Institute for Health Research, the Economic and Social Research Council, the Engineering and Physical Sciences Research Council, Health and Social Care Research and Development Division (Welsh Government), Health and Social Care Research and Development Division (Public Health Agency, Northern Ireland), Chief Scientist Office of the Scottish Government Health and Social Care Directorates, and Wellcome.

    Other substantive sites will be led by consortia from Cambridge, Midlands, Scotland, London and Oxford. For further details, please visit the Health Data Research UK website.

  7. Just one phenomenon may underlie all solar eruptions, according to researchers from the CNRS, École Polytechnique, CEA and INRIA in an article featured on the cover of the February 8 issue of the journal Nature. They have identified the presence of a confining 'cage' in which a magnetic rope forms, causing solar eruptions. It is the resistance of this cage to the attack of the rope that determines the power and type of the upcoming flare. This work has enabled the scientists to develop a model capable of predicting the maximum energy that can be released during a solar flare; such flares can have potentially devastating consequences for the Earth.

    Just as on Earth, storms and hurricanes sweep through the atmosphere of the Sun. These phenomena are caused by a sudden, violent reconfiguration of the solar magnetic field, and are characterized by an intense release of energy in the form of light and particle emissions and, sometimes, by the ejection of a bubble of plasma. Studying these phenomena, which take place in the corona (the outermost region of the Sun), will enable scientists to develop forecasting models, just as they do for the Earth's weather. This should limit our technological vulnerability to solar eruptions, which can impact a number of sectors such as electricity distribution, GPS and communications systems.

    In 2014, researchers showed that a characteristic structure, an entanglement of magnetic force lines twisted together like a hemp rope, gradually appears in the days preceding a solar flare. However, until now they had only observed this rope in eruptions that ejected bubbles of plasma. In this new study, the researchers examined other types of flare, whose models are still being debated, by undertaking a more thorough analysis of the solar corona, a region where the Sun's atmosphere is so thin and hot that the solar magnetic field is difficult to measure there. They did this by measuring the stronger magnetic field at the surface of the Sun and then using these data to reconstruct what was happening in the solar corona.

    They applied this method to a major flare that developed over a few hours on October 24, 2014. They showed that, in the hours before the eruption, the evolving rope was confined within a multilayer magnetic 'cage'. Using evolutionary models running on a supercomputer, they showed that the rope had insufficient energy to break through all the layers of the cage, making the ejection of a magnetic bubble impossible. Despite this, the high twist of the rope triggered an instability and the partial destruction of the cage, causing a powerful emission of radiation that led to disruptions on Earth.

    Thanks to their method, which makes it possible to monitor the processes taking place in the last few hours leading up to a flare, the researchers have developed a model able to predict the maximum energy that can be released from the region of the Sun concerned. The model showed that for the 2014 eruption, a huge ejection of plasma would have occurred if the cage had been less resistant. This work demonstrates the crucial role played by the magnetic 'cage-rope' duo in controlling solar eruptions, and it is a new step towards the early prediction of eruptions that can have significant societal impacts.

  8. Despite significant advances in medicine, an effective vaccine for the human immunodeficiency virus (HIV) is still not available, although recent hope has emerged through the discovery of antibodies capable of neutralizing diverse HIV strains. However, HIV can sometimes evade known broadly neutralizing antibody responses via mutational pathways, which makes it all the more difficult to design an effective solution.

    An ideal vaccine would elicit broadly neutralizing antibodies that target parts of the virus's spike proteins where mutations severely compromise the virus's fitness, that is, its ability to reproduce and replicate. This requires knowledge of the fitness landscape, a mapping from sequence to fitness. To achieve this goal, data scientists from the Hong Kong University of Science and Technology (HKUST) have employed a supercomputational approach to estimate the fitness landscape of gp160, the polyprotein that comprises HIV's spike. The inferred landscape was then validated through comparisons with diverse experimental measurements.

    Their findings were published in the journal PNAS in January 2018 (doi: 10.1073/pnas.1717765115).

    "Without big data machine learning methods, it is simply impossible to make such a prediction," said Raymond Louie, co-author, Junior Fellow of HKUST's Institute for Advanced Study and Research Assistant Professor in the Department of Electronic & Computer Engineering. "The number of parameters needed to be estimated came close to 4.4 million."

    The data processed by the team consisted of 815 residues and 20,043 sequences from 1,918 HIV-infected individuals.
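
    To give a feel for where a parameter count in the millions comes from (an illustrative sketch of a generic pairwise maximum-entropy model, not the exact reduced model inferred in the PNAS study), the number of field and coupling parameters grows with the square of both the number of residues and the number of states allowed per residue:

    ```python
    # Illustrative parameter count for a generic pairwise maximum-entropy
    # (Potts-like) model of a protein. The actual gp160 model uses a reduced
    # state space per residue, so these numbers only show the scaling.

    def pairwise_maxent_params(n_residues, n_states):
        """Fields: one per residue per state; couplings: one per residue pair per pair of states."""
        fields = n_residues * n_states
        couplings = n_residues * (n_residues - 1) // 2 * n_states ** 2
        return fields + couplings

    L = 815  # residues in gp160, as reported above

    # Even with only a handful of effective states per residue, the pair
    # couplings dominate and the count quickly reaches the millions.
    for q in (2, 3, 4):
        print(f"{q} states per residue: {pairwise_maxent_params(L, q):,} parameters")
    ```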

    "The computational method gave us fast and accurate results," said Matthew McKay, co-author and Hari Harilela Associate Professor in the Departments of Electronic & Computer Engineering and Chemical & Biological Engineering at HKUST. "The findings can assist biologists in proposing new immunogens and vaccination protocols that seek to force the virus to mutate to unfit states in order to evade immune responses, which is likely to thwart or limit viral infection."

    "While this method was developed to address the specific challenges posed by the gp160 protein, which we could not address using methods we developed to obtain the fitness landscapes of other HIV proteins, the approach is general and may be applied to other high-dimensional maximum-entropy inference problems," said Arup K. Chakraborty, co-author and Robert T. Haslam Professor in Chemical Engineering, Physics, and Chemistry at MIT's Institute for Medical Engineering & Science. "Specifically, our fitness landscape could be clinically useful in the future for the selection of combination bnAb therapy and immunogen design."

    "This is a multi-disciplinary study presenting an application of data science, and big data machine learning methods in particular, for addressing a challenging problem in biology", said McKay.

  9. An off-the-wall new study by Brown University researchers shows that terahertz frequency data links can bounce around a room without dropping too much data. The results are good news for the feasibility of future terahertz wireless data networks, which have the potential to carry many times more data than current networks.

    Today's cellular networks and Wi-Fi systems rely on microwave radiation to carry data, but the demand for more and more bandwidth is quickly becoming more than microwaves can handle. That has researchers thinking about transmitting data on higher-frequency terahertz waves, which have as much as 100 times the data-carrying capacity of microwaves. But terahertz communication technology is in its infancy. There's much basic research to be done and plenty of challenges to overcome.

    For example, it's been assumed that terahertz links would require a direct line of sight between transmitter and receiver. Unlike microwaves, terahertz waves are entirely blocked by most solid objects. And the assumption has been that it's not possible to bounce a terahertz beam around--say, off a wall or two--to find a clear path around an object.

    "I think it's fair to say that most people in the terahertz field would tell you that there would be too much power loss on those bounces, and so non-line-of-sight links are not going to be feasible in terahertz," said Daniel Mittleman, a professor in Brown University's School of Engineering and senior author of the new research published in APL Photonics. "But our work indicates that the loss is actually quite tolerable in some cases -- quite a bit less than many people would have thought."

    For the study, Mittleman and his colleagues bounced terahertz waves at four different frequencies off a variety of objects (mirrors, metal doors, cinderblock walls and others) and measured the bit-error-rate of the data carried by the wave after the bounces. They showed that acceptable bit-error-rates were achievable with modest increases in signal power.

    "The concern had been that in order to make those bounces and not lose your data, you'd need more power than was feasible to generate," Mittleman said. "We show that you don't need as much power as you might think because the loss on the bounce is not as much as you'd think."

    In one experiment, the researchers bounced a beam off two walls, enabling a successful link when transmitter and receiver were around a corner from each other, with no direct line-of-sight whatsoever. That's a promising finding to support the idea of terahertz local-area networks.

    "You can imagine a wireless network," Mittleman explained, "where someone's computer is connected to a terahertz router and there's direct line-of-sight between the two, but then someone walks in between and blocks the beam. If you can't find an alternative path, that link will be shut down. What we show is that you might still be able to maintain the link by searching for a new path that could involve bouncing off a wall somewhere. There are technologies today that can do that kind of path-finding for lower frequencies and there's no reason they can't be developed for terahertz."

    The researchers also performed several outdoor experiments on terahertz wireless links. An experimental license issued by the FCC makes Brown the only place in the country where outdoor research can be done legally at these frequencies. The work is important because scientists are just beginning to understand the details of how terahertz data links behave in the elements, Mittleman says.

    Their study focused on what's known as specular reflection. When a signal is transmitted over long distances, the waves fan out, forming an ever-widening cone. As a result of that fanning out, a portion of the waves will bounce off the ground before reaching the receiver. That reflected radiation can interfere with the main signal unless a decoder compensates for it. It's a well-understood phenomenon in microwave transmission. Mittleman and his colleagues wanted to characterize it in the terahertz range.

    They showed that this kind of interference indeed occurs with terahertz waves, but occurs to a lesser degree over grass compared to concrete. That's likely because grass has lots of water, which tends to absorb terahertz waves. So over grass, the reflected beam is absorbed to a greater degree than it is over concrete, leaving less of it to interfere with the main beam. That means that terahertz links over grass can be longer than those over concrete because there's less interference to deal with, Mittleman says.
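
    The interference described here is the classic two-ray picture: the ground-reflected path is slightly longer than the direct one, and the resulting phase offset, scaled by whatever amplitude survives the reflection, determines how strongly the bounce adds to or cancels the main signal. A minimal sketch with made-up geometry and reflection coefficients (not the paper's measurements):

    ```python
    # Two-ray ground-reflection sketch (illustrative geometry and reflection
    # coefficients; not measurements from the paper).
    import math

    def two_ray_relative_power(freq_hz, tx_height, rx_height, distance, refl_coeff):
        """Received power of direct + ground-reflected ray, relative to the direct ray alone."""
        wavelength = 3e8 / freq_hz
        d_direct = math.hypot(distance, tx_height - rx_height)
        d_reflect = math.hypot(distance, tx_height + rx_height)  # mirror-image path via the ground
        phase = 2 * math.pi * (d_reflect - d_direct) / wavelength
        # Add the attenuated, phase-shifted reflection to the direct ray (amplitude 1).
        field = 1 + refl_coeff * complex(math.cos(phase), math.sin(phase))
        return abs(field) ** 2

    f = 200e9  # a 200 GHz carrier, in the sub-terahertz/terahertz range
    # A weakly reflecting (absorbing) ground such as grass perturbs the direct
    # signal less than a strongly reflecting one such as concrete.
    for ground, r in [("grass (absorbing)", -0.2), ("concrete (reflective)", -0.6)]:
        p = two_ray_relative_power(f, tx_height=1.5, rx_height=1.5, distance=50.0, refl_coeff=r)
        print(f"{ground}: received power {p:.2f}x that of the direct ray alone")
    ```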

    But there's also an upside to that kind of interference with the ground.

    "The specular reflection represents another possible path for your signal," Mittleman said. "You can imagine that if your line-of-site path is blocked, you could think about bouncing it off the ground to get there."

    Mittleman says that these kinds of basic studies on the nature of terahertz data transmission are critical for understanding how to design the network architecture for future terahertz data systems.

  10. Researchers at the University of Illinois at Chicago describe a new technique for precisely measuring the temperature and behavior of new two-dimensional materials that will allow engineers to design smaller and faster microprocessors. Their findings are reported in the journal Physical Review Letters.

    Newly developed two-dimensional materials, such as graphene -- which consists of a single layer of carbon atoms -- have the potential to replace traditional microprocessing chips based on silicon, which have reached the limit of how small they can get. But engineers have been stymied by the inability to measure how temperature will affect these new materials, collectively known as transition metal dichalcogenides, or TMDs.

    Using scanning transmission electron microscopy combined with spectroscopy, researchers at UIC were able to measure the temperature of several two-dimensional materials at the atomic level, paving the way for much smaller and faster microprocessors. They were also able to use their technique to measure how the two-dimensional materials would expand when heated.

    "Microprocessing chips in computers and other electronics get very hot, and we need to be able to measure not only how hot they can get, but how much the material will expand when heated," said Robert Klie, professor of physics at UIC and corresponding author of the paper. "Knowing how a material will expand is important because if a material expands too much, connections with other materials, such as metal wires, can break and the chip is useless."

    Traditional ways to measure temperature don't work on tiny flakes of two-dimensional materials that would be used in microprocessors because they are just too small. Optical temperature measurements, which use a reflected laser light to measure temperature, can't be used on TMD chips because they don't have enough surface area to accommodate the laser beam.

    "We need to understand how heat builds up and how it is transmitted at the interface between two materials in order to build efficient microprocessors that work," said Klie.

    Klie and his colleagues devised a way to take temperature measurements of TMDs at the atomic level using scanning transmission electron microscopy, which uses a beam of electrons transmitted through a specimen to form an image.

    "Using this technique, we can zero in on and measure the vibration of atoms and electrons, which is essentially the temperature of a single atom in a two-dimensional material," said Klie. Temperature is a measure of the average kinetic energy of the random motions of the particles, or atoms that make up a material. As a material gets hotter, the frequency of the atomic vibration gets higher. At absolute zero, the lowest theoretical temperature, all atomic motion stops.

    Klie and his colleagues heated microscopic "flakes" of various TMDs inside the chamber of a scanning transmission electron microscope to different temperatures and then aimed the microscope's electron beam at the material. Using a technique called electron energy-loss spectroscopy, they were able to measure the scattering of electrons off the two-dimensional materials caused by the electron beam. The scattering patterns were entered into a computer model that translated them into measurements of the vibrations of the atoms in the material - in other words, the temperature of the material at the atomic level.
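
    One way such vibrational measurements can be turned into a temperature (a generic detailed-balance sketch, not necessarily the exact computer model used in the UIC work) is to compare how often the electron beam gains a quantum of vibrational energy with how often it loses one: the ratio of the energy-gain to energy-loss peaks follows a Boltzmann factor and therefore encodes the local temperature.

    ```python
    # Generic detailed-balance sketch for vibrational EELS thermometry
    # (illustrative; not necessarily the analysis used in the UIC study).
    import math

    K_B = 8.617333e-5  # Boltzmann constant, eV per kelvin

    def temperature_from_gain_loss(mode_energy_ev, gain_intensity, loss_intensity):
        """Detailed balance: I_gain / I_loss = exp(-E / (k_B * T)) for a vibrational
        mode of energy E, so the measured peak ratio gives the local temperature."""
        ratio = gain_intensity / loss_intensity
        return -mode_energy_ev / (K_B * math.log(ratio))

    # Hypothetical numbers: a 30 meV phonon mode whose energy-gain peak is 40%
    # as intense as its energy-loss peak.
    T = temperature_from_gain_loss(mode_energy_ev=0.030, gain_intensity=0.40, loss_intensity=1.0)
    print(f"Estimated local temperature: {T:.0f} K")  # roughly 380 K
    ```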

    "With this new technique, we can measure the temperature of a material with a resolution that is nearly 10 times better than conventional methods," said Klie. "With this new approach, we can design better electronic devices that will be less prone to overheating and consume less power."

    The technique can also be used to predict how much materials will expand when heated and contract when cooled, which will help engineers build chips that are less prone to breaking at points where one material touches another, such as when a two-dimensional material chip makes contact with a wire.

    "No other method can measure this effect at the spatial resolution we report," said Klie. "This will allow engineers to design devices that can manage temperature changes between two different materials at the nano-scale level."