The John Stewart Bell Prize for Research on Fundamental Issues in Quantum Mechanics and their Applications, awarded by the University of Toronto.

University of Toronto award celebrates significant recent achievements in quantum mechanics

A trio of scientists who defied Einstein by proving the nonlocal nature of quantum entanglement will be honoured with the John Stewart Bell Prize from the University of Toronto (U of T). The prize recognizes the most significant recent achievements in the world in quantum mechanics and is considered by many to be the top international award in the field.

The recipients each led separate experiments in 2015 showing that two particles, so distant from one another that no signal could connect them even at the speed of light, nevertheless possessed an invisible and instantaneous connection. They are:

  • Ronald Hanson, Delft University of Technology, Netherlands
  • Sae Woo Nam, National Institute of Standards & Technology, United States
  • Anton Zeilinger, University of Vienna, Austria

Quantum entanglement makes the world a very weird place: quantum particles become correlated in pairs that remain linked no matter how far apart they are. Measure the properties of one member of an entangled pair and you know the properties of the other. Einstein was not a believer: in the 1930s, he dismissed it as "spooky action at a distance."
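The strength of these correlations is exactly what Bell tests measure. As a rough numeric illustration (not from the article; the angles and correlation function are the standard textbook ones), the CHSH combination of quantum-predicted correlations exceeds the bound of 2 that any local, "non-spooky" theory must obey:

```python
import math

def E(a, b):
    """Quantum prediction for the correlation of spin measurements at
    analyzer angles a and b on an entangled (singlet) pair."""
    return -math.cos(a - b)

# CHSH combination: any local hidden-variable theory obeys |S| <= 2
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # 2.828..., violating the classical bound of 2
```

The loophole-free experiments honoured here measured this kind of quantity on real, widely separated particles.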

"While many experiments have come close to proving quantum entanglement, the scientists we are honouring have closed previous loopholes," says Professor Aephraim Steinberg, a quantum physicist at the U of T's Centre for Quantum Information & Quantum Control (CQIQC) and one of the founders of the Bell Prize. Earlier tests, for example, were plagued by the difficulties of ensuring that no signal could make it from one detector to the other as well as the fact that so many photons were being lost in the test process.

"Collectively, they have removed all reasonable doubt about the nonlocal nature of quantum entanglement. In so doing they are also opening the door to exciting new technologies including super-secure communications and the ability to perform certain computations exponentially faster than any classical computer," says Steinberg.

Created by the CQIQC at U of T in 2005, the John Stewart Bell Prize for Research on Fundamental Issues in Quantum Mechanics and their Applications is judged by an international panel of experts and awarded every two years for achievements in the previous six years.

"Advancing understanding of quantum mechanics, along with its technological applications, is something that deserves to be celebrated and recognized around the world. We expect that, in some cases, the Bell Prize will prove to be a precursor to the Nobel Prize in Physics," says Daniel James, director of the CQIQC.

The prize will be awarded on Thursday, August 31 at 1:25 pm at the Fields Institute on the U of T campus. Recipients will give short talks after the ceremony.

Ken Dill explains the supercomputational model that shows how certain molecules fold and bind together in the evolution of chemistry into biology, a key step to explain the origins of life.

Stony Brook scientists detail the “Foldamer” hypothesis in PNAS, which models the growth of prebiotic polymers in the evolution of chemistry into biology

Scientists have yet to understand and explain how life’s informational molecules – proteins, DNA and RNA – arose from simpler chemicals when life on earth emerged some four billion years ago. Now a research team from the Stony Brook University Laufer Center for Physical and Quantitative Biology and Lawrence Berkeley National Laboratory believes it has the answer. The team developed a supercomputational model explaining how certain molecules fold and bind together to grow longer and more complex, leading from simple chemicals to primitive biological molecules. The findings are reported early online in PNAS.

Previously scientists learned that the early earth likely contained the basic chemical building blocks, and sustained spontaneous chemical reactions that could string together short chains of chemical units. But it has remained a mystery what actions could then prompt short chemical polymer chains to develop into much longer chains that can encode useful protein information. The new supercomputational model may help explain that gap in the evolution of chemistry into biology.

“We created a computational model that illustrates a fold-and-catalyze mechanism that amplifies polymer sequences and leads to runaway improvements in the polymers,” said Ken Dill, lead author, Distinguished Professor and Director of the Laufer Center. “The theoretical study helps to understand a missing link in the evolution of chemistry into biology and how a population of molecular building blocks could, over time, result in the emergence of catalytic sequences essential to biological life.”

In the paper, titled “The Foldamer Hypothesis for the growth and sequence-differentiation of prebiotic polymers,” the researchers used computer simulations to study how random sequences of water-loving (polar) and water-averse (hydrophobic) monomers fold and bind together. They found that these random-sequence chains can collapse and fold into specific compact conformations that expose hydrophobic surfaces, thus serving as catalysts for elongating other polymers. These particular polymer chains, referred to as “foldamer” catalysts, can work together in pairs to grow longer and develop more informational sequences.
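Chains of polar and hydrophobic units are the ingredients of the classic HP lattice model. As a toy illustration (a minimal sketch, not the paper's actual simulation code), one can exhaustively fold a short chain on a 2-D lattice and score each conformation by its buried hydrophobic contacts:

```python
from itertools import product

MOVES = {'U': (0, 1), 'D': (0, -1), 'L': (-1, 0), 'R': (1, 0)}

def fold_energy(seq):
    """Lowest energy over all self-avoiding 2-D lattice conformations.
    seq is a string of 'H' (hydrophobic) and 'P' (polar) monomers;
    energy is -1 per non-bonded H-H contact (standard HP model)."""
    n = len(seq)
    best = 0
    for moves in product(MOVES, repeat=n - 1):
        pos = [(0, 0)]
        for m in moves:
            dx, dy = MOVES[m]
            x, y = pos[-1]
            pos.append((x + dx, y + dy))
        if len(set(pos)) < n:          # chain crosses itself: discard
            continue
        index = {p: i for i, p in enumerate(pos)}
        e = 0
        for i, (x, y) in enumerate(pos):
            if seq[i] != 'H':
                continue
            for dx, dy in MOVES.values():
                j = index.get((x + dx, y + dy))
                if j is not None and j > i + 1 and seq[j] == 'H':
                    e -= 1             # buried hydrophobic contact
        best = min(best, e)
    return best

print(fold_energy('HHPPHH'))  # compact hairpin folds bury the H monomers
```

Sequences whose low-energy folds expose a hydrophobic patch are the candidates for the "foldamer" catalysts described above.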

This process, according to the authors, provides a basis to explain how random chemical processes could have resulted in protein-like precursors to biological life.  It gives a testable hypothesis about early prebiotic polymers and their evolution.

“By showing how prebiotic polymers could have become informational ‘foldamers’, we hope to have revealed a key step to understanding just how life started to form on earth billions of years ago,” explained Professor Dill. 

Co-authors of the paper include Elizaveta Guseva of the Laufer Center and Departments of Chemistry and Physics & Astronomy at Stony Brook University, and Ronald N. Zuckermann of Lawrence Berkeley National Laboratory in Berkeley, Calif.

The research was supported in part by the National Science Foundation.

CAPTION Lung tissue histology.

Supercomputational approach could lead to better understanding of emphysema and other human lung diseases

Scientists have developed a new virtual model of mouse lung function that illuminates the relative importance of different factors that contribute to lung changes accompanying chronic inflammation. Christopher Massa and his colleagues at Rutgers University, New Jersey, present the work in PLOS Computational Biology.

Chronic inflammation, which is often found in people with chronic obstructive pulmonary disease (COPD), and aging both alter lung structure and function. Respiratory impedance--a measure of air flow and pressure in the throat--can be used to model and assess general lung function. However, this approach cannot distinguish the relative impact of specific changes, such as destruction of air sacs and altered airway size.
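Respiratory impedance is usually summarized with a small parametric model. Below is a minimal sketch using the constant-phase model, a standard form in forced-oscillation lung mechanics (the model choice and all parameter values here are illustrative assumptions, not the study's):

```python
import math

def impedance(f, Rn, I, G, H):
    """Constant-phase model of respiratory impedance at frequency f (Hz).
    Rn: Newtonian (airway) resistance, I: inertance,
    G: tissue damping, H: tissue elastance."""
    w = 2 * math.pi * f
    alpha = (2 / math.pi) * math.atan2(H, G)
    return Rn + 1j * w * I + (G - 1j * H) / w ** alpha

# Tissue terms dominate at low frequency, airway terms at high frequency
for f in (1, 5, 20):
    print(f, abs(impedance(f, Rn=0.3, I=0.01, G=4.0, H=20.0)))
```

Fitting such parameters to measured impedance is what lets changes in tissue properties be inferred from airflow and pressure alone.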

In pursuit of a more detailed approach, Massa and colleagues examined lung tissue and function in healthy mice, as well as in mice that had been genetically engineered to mimic conditions seen in people with emphysema, a type of COPD. Then, they used this experimental data to build a new model that simulates changes in mouse lung function, as indicated by respiratory impedance.

The researchers used the model to compare the relative influence on lung function of several factors associated with chronic inflammatory disease. These included lung tissue destruction, changes in density of elastic fiber structures that aid in stretching and contraction of lung walls during breathing, and changes in lung recruitment, in which elevated pressure keeps open lung regions that might otherwise collapse.

Modeling results suggested that changes in lung recruitment and elastic fiber density were mainly responsible for the observed decrease in lung function associated with chronic inflammatory disease in the mice. This contrasts with previous proposals that tissue damage and loss of structure have the greatest impact.

The new model provides a fresh approach for examining how pathological changes alter lung function. "The most exciting aspect of this research is that by conducting these analyses we can determine what has happened to the lung tissue simply by measuring air flow and pressure at the throat," says study co-author Andrew Gow.

Further work will be needed to refine the new model and determine how the results might apply to humans. Nonetheless, it opens up new pathways for research. "The next steps for this research are to directly test how well the model can predict the effects of a therapeutic intervention, such as budesonide, upon lung function," Gow says.

Students participate in Bridge 2017, a two-week summer immersion program that prepares incoming freshmen who might have less programming experience for their first computer science course at Purdue. Undergraduate applications to the computer science program have increased so much that the admission rate has declined even as the quality of applicants improves, said Sunil Prabhakar, head of the Department of Computer Science.

Undergraduate enrollment in Purdue University’s Department of Computer Science has more than doubled since 2012, growth that will help fill a national shortage of computing professionals.

In recent years, educators and employers in computer science have received a wake-up call: hundreds of thousands of jobs in the field sit vacant, and filling them will require recruiting more graduates, especially women and underrepresented minorities.

There are currently 530,000 open computing jobs nationwide, and the National Center for Women and Information Technology predicts that only 41 percent of jobs in the field will be filled by 2024. Purdue President Mitch Daniels recognized the school’s potential to help fill those gaps, and in 2013, he named expanding computer science one of his Purdue Moves. Purdue Moves is a range of initiatives at the West Lafayette campus designed to broaden Purdue's global impact and enhance educational opportunities for its students. All of the moves fall into four broad categories: science, technology, engineering and math (STEM) leadership; world-changing research; transformative education; and affordability and accessibility.

Purdue received more than 4,000 computer science undergraduate applications for fall semester 2017. Applications have increased so much that the admission rate has continued to decline even as the quality of applicants improves, said Sunil Prabhakar, head of the Department of Computer Science.

Growth of the field isn’t an isolated phenomenon. Across the nation, universities have struggled to keep up with the rising number of students who want to take classes or major in computer science. A report by the Computing Research Association found that the number of computer science majors at doctoral institutions in North America tripled between 2006 and 2017, and the trend is likely to continue.

Many universities are having a difficult time meeting the needs of all the students who want to take computer science courses, according to the report. Shortage of classroom space, insufficient numbers of faculty and instructors to teach courses, and increased faculty workloads are among the top challenges. 

Since 2012, Purdue’s Department of Computer Science has hired 15 new faculty members and added three new academic advisers. The university also created two new degree programs: an undergraduate major in data science and a master’s program in information security.

Purdue’s computer science students are doing well, during school and after graduation. Ninety-five percent of first-year students stay in the major, even as many American universities grapple with retention in STEM fields. Computer science graduates also receive the highest undergraduate starting salary at Purdue and a post-graduation job placement rate of nearly 100 percent.

Despite these achievements, diversity in the field is lacking. Women earn 57 percent of all undergraduate degrees in the United States, but only 18 percent of computer and information sciences degrees, according to the National Center for Women and Information Technology. 

As a result, educators across the nation have recognized the need to attract more women to the field and have been making a concerted effort to do so.

Twenty-one percent of first-year computer science majors at Purdue are now women, compared to 13 percent in 2015. Female faculty members have also doubled since 2012. Adrian Thomas, diversity specialist for the Department of Computer Science at Purdue, said targeted marketing to women is responsible for this increase.

“We really got the word out about what Purdue does and how important it is to have women here,” said Thomas. 

The department also split CS 180, a first-year course that all computer science students are required to take. The course was divided into sections for more and less experienced students, and the majority of women in the 2016 class enrolled in the less experienced section.  

“The effort is really making a difference because they’re starting the course with fewer assumptions. We’re meeting them where they are and getting them where they need to go,” Thomas said. “Both sections end up at the same place, but they start out differently.”

Although these efforts have been successful in attracting female students to the department, ethnic minority students are still underrepresented. This problem has only been recognized very recently at the national level, Thomas said, but she expects that programs like Black Girls CODE and CODE2040 will help solve it. 

Alexandra Boltasseva in her lab at Birck Nanotechnology Center. (Purdue University photo/Alex Kumar)

Interactions between light and matter are a cornerstone of modern physics, but recently researchers have started to look beyond the standard textbook interactions.

Alexandra Boltasseva is a professor of electrical and computer engineering at Purdue University. For years, she’s been working with optical metamaterials (artificially engineered materials containing nanostructures which give them unique visual properties) to create nanotweezers, metasurfaces and other tiny objects. Now, she’s exploring an entirely new chapter of physics.   

“If we look at a textbook and there’s a chapter on how light interacts with matter, it would go from how light interacts with transparent materials to how light interacts with reflective materials,” Boltasseva said. “What we’re going to study is this area between the two types.”

When light propagates through a transparent material, it changes very little; such materials have a positive dielectric permittivity (also called epsilon). The opposite is true for reflective materials, which expel light and have a negative permittivity (negative epsilon). Between positive and negative lies an exotic, largely unexplored region referred to as Epsilon-Near-Zero (ENZ).

When light enters a medium whose permittivity is zero, observers see the same light going in and coming out. It is almost as if the light tunnels from one side to the other without changing its properties -- a rare phenomenon in physics.
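A back-of-the-envelope way to see this (a sketch, not from the article): the phase a plane wave accumulates crossing a slab scales with the refractive index n = sqrt(epsilon), so as epsilon approaches zero the wave exits with almost exactly the phase it entered with:

```python
import cmath
import math

def phase_advance(eps, d_over_lambda):
    """Phase (radians) accumulated by a plane wave crossing a slab of
    relative permittivity eps, with thickness given in vacuum
    wavelengths (lossless, non-magnetic medium assumed)."""
    n = cmath.sqrt(eps).real          # refractive index
    return 2 * math.pi * n * d_over_lambda

print(phase_advance(2.25, 1.0))  # glass-like: ~9.42 rad per wavelength
print(phase_advance(1e-6, 1.0))  # ENZ: ~0.006 rad -- wave exits nearly unchanged
```

The wavelength inside an ENZ medium effectively stretches toward infinity, which is why the light appears not to "propagate" at all.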

“Since zero is so different from plus and minus one, we expect that many interesting things are happening there,” Boltasseva said. “It brings completely new physics and insights into play.”

Conventional ENZ media, such as metals, have naturally occurring zero crossings but often experience material loss (light absorption) at that point. Finding a material that has a zero crossing but doesn't absorb light will be difficult, Boltasseva said.

The research team plans to experiment with both natural materials and new metamaterials, though natural materials are more likely to experience absorption. Adding a light-amplifying medium could counteract absorption, but it would be challenging. The group believes transparent conducting oxides and transition metal nitrides (newly developed materials that have a naturally occurring ENZ point in the visible and near infrared wavelength ranges, as well as tailorable optical properties) could help solve this problem.  

Although Boltasseva believes this project’s biggest impact will be on fundamental science, she thinks it will lead to new device applications as well.

“Ultrafast modulation is one of the big problems in optics. There’s always a trade-off. You’re either changing things really slowly at a large amplitude, or very fast but in a small range. I hope we can break this cycle,” she said. “This could lead to a variety of ultrafast optical devices for communication and information technologies.”

This is a simulation of a laboratory projectile impacting ceramic backed by polycarbonate, showing damage in the ceramic.

"While various ceramic materials possess high hardness, they fail easily when pulled apart. That is, when subjected to tensile forces. The amount of tension that these materials can withstand before failure, is a small fraction of the compression they can withstand. As a result, high velocity impact of bullets and fragments causes extensive cracking and fragmentation of the material, reducing its ability to fully utilize its superior hardness to resist complex stress states generated by the impact event," explained Dr. Sikhanda Satapathy, of ARL's Soldier Protection Sciences Branch. 

Traditionally, the relationship between a granular material's ability to withstand compression and its ability to resist shearing deformation, which causes the material to change shape, has been described by the Mohr-Coulomb model. This model approximates the material's resistance to shearing deformation (shear strength) as a linear function of applied pressure. In reality, the shear strength does not increase linearly with pressure and saturates at high pressures.
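The distinction can be sketched numerically. The capped form below is a hypothetical illustration of saturation, not the specific model the UF team developed:

```python
import math

def mohr_coulomb(p, c, mu):
    """Linear Mohr-Coulomb shear strength: tau = c + mu * p
    (c: cohesion, mu: friction coefficient, p: pressure)."""
    return c + mu * p

def capped_strength(p, c, mu, tau_max):
    """Hypothetical smooth cap: tracks Mohr-Coulomb at low pressure but
    saturates at tau_max at high pressure, as real ceramics do."""
    return tau_max * (1 - math.exp(-mohr_coulomb(p, c, mu) / tau_max))

for p in (0.1, 1.0, 10.0):          # pressures in arbitrary units
    print(p, mohr_coulomb(p, 0.2, 1.0), capped_strength(p, 0.2, 1.0, 3.0))
```

At low pressure the two curves agree; at high pressure the linear model keeps growing while the capped one levels off, which is the behavior the improved model captures.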

The UF researchers developed a new model that describes the granular material response more accurately by studying the stress states at which a variety of ceramics fail, as reported in the literature by various research teams.

The ARL and UF teams collaborated to employ this improved granular response model in conjunction with a dynamic cavity expansion modeling framework to capture the response of ceramics to the complex impact-induced stress state that includes compression, tension and shear. The dynamic cavity expansion framework uses the pressure required to expand a cavity in an intact material to characterize its ability to resist deep penetration. This pressure, of course, depends on how the material responds to compression, tension and shear forces. Because the new model applies to a broad class of ceramics, it significantly reduces the need for expensive experiments to characterize penetration response. The new penetration model improves the understanding of how brittle ceramic responds to high impact stress by fracturing and comminuting into granular-like material, and improves the ability to model penetration events.

The improved model has been shown to better predict the resistance of a wide range of ceramic targets when shot by long-rod projectiles at velocities up to 3 km/s. The important material parameters for the penetration performance of a ceramic target have been identified through this collaborative effort, which will guide how failure processes in ceramics can be controlled through improved material design or through a multi-material systems approach.

"Understanding the mechanics of material response to projectile impact generated stress conditions is crucial in this research," Satapathy said. The research will appear in the International Journal of Solids and Structures, DOI: 10.1016/j.ijsolstr.2017.07.014


Treehouse Childhood Cancer Initiative Founder Olena Morozova, in a recent interview.

Genomics Institute project matches tumors to treatment options, helps find hope

When you hear the words "Santa Cruz," you likely think of surf, sand and other beachside fun. It's easy to forget there is a world class research institution, University of California, Santa Cruz, nestled in the redwoods right above town.

With the arresting beauty of the Monterey Bay and nostalgic fun of the Santa Cruz Boardwalk, tourists rarely ask about the serious research happening just up the hill. They seldom lay eyes on UC Santa Cruz, a university ranked 4th in the world for research influence, as measured by the number of times UC Santa Cruz faculty's published work is cited by scholars around the world.

So, it is understandable that even locals may not know that there is a top research team at UC Santa Cruz's Baskin School of Engineering that is taking on pediatric cancer using big data.

The City of Santa Cruz Economic Development Office recently sat down with Treehouse Childhood Cancer Initiative Founder Olena Morozova and UC Santa Cruz Genomics Institute Scientific Director David Haussler to learn more about how UC Santa Cruz is working to better understand and better treat cancer in children -- all without the benefit of a medical school.

"We are giving new hope to these families that otherwise would be out of treatment options," says Morozova. "And time will tell if some of those will actually translate to cures," she said.

CAPTION Left: Raw electron microscopy images of small pieces of brain tissues. Right: Color-coded brain maps generated by computers, where different colors represent different neurons.

A potentially faster way to understand brain circuitry

A WSU research team for the first time has developed a supercomputer algorithm that is nearly as accurate as people are at mapping brain neural networks -- a breakthrough that could speed up the image analysis that researchers use to understand brain circuitry.

A report on the WSU team's work appears in the journal Bioinformatics.

Like mapping 100 billion homes

For more than a generation, people have been trying to improve understanding of human brain circuitry, but are challenged by its vast complexity. It is similar to having a satellite image of the earth and trying to map out 100 billion homes, all of the connecting streets and everyone's destinations, said Shuiwang Ji, associate professor in the School of Electrical Engineering and Computer Science and lead researcher on the project.

Researchers, in fact, took more than a decade to fully map the circuitry of just one animal's brain -- a worm that has only 302 neurons. The human brain, meanwhile, has about 100 billion neurons, and the amount of data needed to fully understand its circuitry would require 1000 exabytes of data, or the equivalent of all the data that is currently available in the world.

Neuron by neuron

To map neurons, researchers currently use an electron microscope to take pictures -- with one image usually containing a small number of neurons. The researchers then study each neuron's shape and size as well as its thousands of connections with other nearby neurons to learn about its role in behavior or biology.

"We don't know much about how brains work," said Ji.

With such rudimentary understanding of our circuitry, researchers are limited in their ability to understand the causes of devastating brain diseases, such as Alzheimer's, schizophrenia, autism or Parkinson's disease, he said. Instead, they have to rely on trial and error experimentation to come up with treatments. The National Academy of Engineering has listed improving understanding of the human brain as one of its grand challenges for the 21st century.

Accurate as humans

In 2013, MIT organized a competition that called on researchers to develop automated computer algorithms that could speed up the analysis, decoding and understanding of images of brain circuitry. As part of the competition, the algorithms were compared to work done by a real team of neuroscientists. If computers can become as accurate as humans, they will be able to do the computations much faster and cheaper than humans, said Ji.

WSU's research team developed the first computational model that was able to reach a human level of performance in accuracy.

Just as a human eye takes in information and then analyzes it in multiple stages, the WSU team developed a computational model that takes the image as its input and then processes it in a many-layered network before coming to a decision. In their algorithm, the researchers developed an artificial neural network that imitates humans' complex biological neural networks.
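The basic building block of such a many-layered network is the convolution, which scans a small filter across the image. A minimal sketch (illustrative only; the WSU model is far deeper and its filters are learned from data):

```python
def conv2d(img, kernel):
    """Minimal 'valid' 2-D convolution over nested lists -- the core
    operation in layered networks that segment electron-microscopy
    images of brain tissue."""
    kh, kw = len(kernel), len(kernel[0])
    H, W = len(img), len(img[0])
    out = []
    for i in range(H - kh + 1):
        row = []
        for j in range(W - kw + 1):
            s = sum(img[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# An edge-detecting filter responds only at membrane-like boundaries
img = [[1.0, 1.0, 0.0, 0.0, 0.0] for _ in range(5)]  # vertical boundary
edge = conv2d(img, [[1.0, -1.0]])
print(edge[0])  # [0.0, 1.0, 0.0, 0.0] -- response at the boundary only
```

Stacking many such filtering layers, each feeding the next, is what lets the network turn raw pixel intensities into a neuron-by-neuron segmentation.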

While the WSU research team was able to approach human accuracy in the MIT challenge, they still have a lot of work to do in getting the computers to develop complete and accurate neural maps. The computers still make a large number of mistakes, and there is not yet a gold standard for comparing human and computational results, said Ji. Although it may not be realistic to expect that automated methods will completely replace humans soon, improvements in computational methods will certainly reduce manual proofreading, he added.

CAPTION Projective Awareness Model: a. The field of consciousness incorporates beliefs and preferences using projective geometry and minimizing free energy in order to motivate action. b. The subject imagines: option A (buying an expensive cake) and option B (making the cake him or herself) to reach the imagined situation 2 where the children at home are happy. Options A and B are very similar: both bring pleasure, but A demands an irreversible expense and B requires a one-off effort. The anticipated final free energy is minimal for B. The subject chooses B as the scenario to be realized.

An international team of experts led by the University of Geneva has developed a mathematical model of human psychology that can predict and analyze normal and pathological human behavior

A human being's psychological make-up depends on an array of emotional and motivational parameters, such as desire, suffering or the need for security. In addition, it includes spatial and temporal dimensions that also play a key role in rationalising the decisions we make and planning our actions. A team of researchers from the Universities of Geneva (UNIGE), Texas, Paris and University College London joined forces to create the first mathematical model of embodied consciousness. Their aim? To understand, study and predict human behaviour. The model, which is based on solid mathematical concepts and is demonstrated using simulations, makes it possible to anticipate and explain a host of cognitive phenomena and behavioural reactions. The research -- which you can read in full in the Journal of Theoretical Biology -- also paves the way for a wealth of industrial applications in robotics, artificial intelligence and the health sector.

An international, interdisciplinary team of researchers, headed by David Rudrauf, professor in UNIGE's faculty of psychology and educational science, was keen to produce a psychological theory that operated on the model developed by the hard sciences. The goal was to devise a mathematical model of human psychology for predicting and evaluating (normal and pathological) human behaviour. More than a decade of research -- combining maths, psychology, neuroscience, philosophy, computer science and engineering -- was required to construct this theoretical model of consciousness.

Free energy determines choices

We are all constantly faced with a range of choices, some of which are important, some not. But how do we make our decisions? There are many factors at work, conscious and unconscious, which are forever colliding whenever a decision is made. "We built a model to replicate decision-making based on the time, framework and perceptions (real and imaginary) that are linked to it," explains David Rudrauf. "The next step was to analyse the best solution that the mind would naturally opt for." Depending on an individual's personal preferences (such as security), and including different real and imaginary perspectives on the world, the mind calculates the probabilities of obtaining what it wants in the safest possible way. This probability calculation, which is derived from an individual's personal preferences and values, can be expressed as free energy. "Our consciousness uses free energy to actively explore the world and to satisfy its predilections by imagining the anticipated consequences of its actions," says Karl Friston from University College London. Depending on the free energy, the mathematical model can predict the states of consciousness and behaviour adopted by the individual and analyse the mechanisms.

This Projective Consciousness Model analyses possible forms of behaviour according to events: if you spot a cake in a shop window, will you buy it or carry on your way? Based on your preferences -- whether you have a sweet tooth, for example, or are a penny-pincher -- the model will determine what best suits your state of mind: it will then predict your psychological state and behaviour using a combination of projective geometry and free-energy calculation.
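A toy version of the cake choice (a loose illustration of the idea, not the paper's formalism) scores each imagined option by the surprise attached to reaching the preferred outcome plus the effort or expense it demands, then picks the option with the lowest score:

```python
import math

def free_energy(p_success, preference, cost):
    """Toy score for an imagined course of action: surprise about the
    preferred outcome (-log probability), weighted by how much the
    outcome matters, plus the effort or expense the action demands."""
    return -math.log(p_success) * preference + cost

# The cake scenario: A is near-certain but expensive, B needs a one-off effort
options = {
    "A: buy the cake": free_energy(p_success=0.95, preference=1.0, cost=1.0),
    "B: bake the cake": free_energy(p_success=0.80, preference=1.0, cost=0.2),
}
choice = min(options, key=options.get)
print(choice)  # B: bake the cake
```

As in the caption's example, the mild extra uncertainty of baking is outweighed by avoiding the irreversible expense, so the minimal-free-energy scenario is the one chosen.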

Understanding and making a mathematical model of the phenomenology of the mind: projective geometry

As Kenneth Williford, professor of philosophy at the University of Texas, explains: "The aim was to understand and model the essential structures of conscious experience." Daniel Bennequin, professor in the mathematics department at the University of Paris 7, adds: "Perception, imagination and action are supported by unconscious mechanisms; we discovered that consciousness integrates them through a specific geometry: projective geometry." The researchers started with a synthesis of psychological phenomena, including basic perceptual phenomena: the illusion, for instance, that train tracks converge in the distance when they are actually parallel. The scientists were able to select the mathematical template for modelling this perception and the imagination associated with it. "It was then a question of understanding how this field of consciousness is related to affect, emotions and motivation as well as memory and intentions," says Rudrauf.

Virtual reality: a space for experimenting and research

"Once the theoretical components were defined," continues Rudrauf, "we implemented them in computer programs. We are now working on connecting them to virtual reality in order to reproduce a spatial, temporal and emotional environment that is as close as possible to our own." The research team is then able to make predictions about behaviour by playing with the model's mechanisms, perfecting it and bringing it closer to human psychology. It was long-term work: "But our aim is also to gradually direct the research towards psychopathological models," points out Rudrauf. "We found, for example, that if we deprive the model of the faculty of imagination, it behaves like a person with autism. This suggests research pathways on the importance of the imagination and its specific mechanisms in managing the illness." The model works on a concept of reciprocity: humans are used to test and reinforce the effectiveness of the model; and the model is used to experiment with different cases and sources of psychological illnesses in humans.

The initial results show that this first mathematical model of embodied consciousness, incorporating temporality, spatialisation and emotions, can predict a vast array of known human behaviours and understand the mechanisms behind them. There is still much work to be done, however, to replicate human consciousness identically, since every possible type of behaviour must be implemented in the mathematical system. The researchers are now working on an extension of the algorithm that will produce machines that can adapt to the reactions of their interlocutors and act according to the principle of empathic reciprocity.

