We humans may be more aligned with the universe than we realize.

According to research published in the journal Physical Review C, neutron stars and cell cytoplasm have something in common: structures that resemble multistory parking garages.

In 2014, UC Santa Barbara soft condensed-matter physicist Greg Huber and colleagues explored the biophysics of such shapes -- helices that connect stacks of evenly spaced sheets -- in a cellular organelle called the endoplasmic reticulum (ER). Huber and his colleagues dubbed them Terasaki ramps after their discoverer, Mark Terasaki, a cell biologist at the University of Connecticut.

Huber thought these "parking garages" were unique to soft matter (like the interior of cells) until he happened upon the work of nuclear physicist Charles Horowitz at Indiana University. Using supercomputer simulations, Horowitz and his team had found the same shapes deep in the crust of neutron stars.

"I called Chuck and asked if he was aware that we had seen these structures in cells and had come up with a model for them," said Huber, the deputy director of UCSB's Kavli Institute for Theoretical Physics (KITP). "It was news to him, so I realized then that there could be some fruitful interaction."

The resulting collaboration, highlighted in Physical Review C, explored the relationship between two very different models of matter.

Nuclear physicists have an apt terminology for the entire class of shapes they see in their high-performance computer simulations of neutron stars: nuclear pasta. These include tubes (spaghetti) and parallel sheets (lasagna) connected by helical shapes that resemble Terasaki ramps.

"They see a variety of shapes that we see in the cell," Huber explained. "We see a tubular network; we see parallel sheets. We see sheets connected to each other through topological defects we call Terasaki ramps. So the parallels are pretty deep."

However, differences can be found in the underlying physics. Typically matter is characterized by its phase, which depends on thermodynamic variables: density (or volume), temperature and pressure -- factors that differ greatly at the nuclear level and in an intracellular context.

"For neutron stars, the strong nuclear force and the electromagnetic force create what is fundamentally a quantum-mechanical problem," Huber explained. "In the interior of cells, the forces that hold together membranes are fundamentally entropic and have to do with the minimization of the overall free energy of the system. At first glance, these couldn't be more different."

Another difference is scale. In the nuclear case, the structures are built from nucleons (protons and neutrons), and those building blocks are measured in femtometers (10⁻¹⁵ m). For intracellular membranes like the ER, the length scale is nanometers (10⁻⁹ m). The ratio between the two is a factor of a million (10⁶), yet these two vastly different regimes produce the same shapes.
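The scale gap is simple arithmetic; a short sketch (values taken from the paragraph above) makes it concrete:

```python
# Comparing the two length scales described above: nuclear-pasta structures
# are measured in femtometers, ER membrane structures in nanometers.
nuclear_scale_m = 1e-15   # femtometer: scale of nucleon-based structures
cellular_scale_m = 1e-9   # nanometer: scale of intracellular membranes

ratio = cellular_scale_m / nuclear_scale_m
print(f"scale ratio: {ratio:.0e}")  # a factor of a million
```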

"This means that there is some deep thing we don't understand about how to model the nuclear system," Huber said. "When you have a dense collection of protons and neutrons like you do on the surface of a neutron star, the strong nuclear force and the electromagnetic forces conspire to give you phases of matter you wouldn't be able to predict if you had just looked at those forces operating on small collections of neutrons and protons."

The similarity of the structures is riveting for theoretical and nuclear physicists alike. Nuclear physicist Martin Savage was at the KITP when he came across graphics from the new paper on arXiv, a preprint library that posts thousands of physics, mathematics and computer science articles. Immediately his interest was piqued.

"That similar phases of matter emerge in biological systems was very surprising to me," said Savage, a professor at the University of Washington. "There is clearly something interesting here."

Co-author Horowitz agreed. "Seeing very similar shapes in such strikingly different systems suggests that the energy of a system may depend on its shape in a simple and universal way," he said.

Huber noted that these similarities are still rather mysterious. "Our paper is not the end of something," he said. "It's really the beginning of looking at these two models."

Researchers at North Carolina State University have developed a new technique for creating NV-doped single-crystal nanodiamonds, only four to eight nanometers wide, which could serve as components in room-temperature quantum computing technologies. These doped nanodiamonds also hold promise for use in single-photon sensors and nontoxic, fluorescent biomarkers.

Currently, computers use binary logic, in which each binary unit - or bit - is in one of two states: 1 or 0. Quantum computing makes use of superposition and entanglement, allowing the creation of quantum bits - or qubits - which can have a vast number of possible states. Quantum computing has the potential to significantly increase computing power and speed.

A number of options have been explored for creating quantum computing systems, including the use of diamonds that have "nitrogen-vacancy" centers. That's where this research comes in.

Normally, diamond has a very specific crystalline structure, consisting of repeated diamond tetrahedrons, or cubes. Each cube contains five carbon atoms. The NC State research team has developed a new technique for creating diamond tetrahedrons that have two carbon atoms; one vacancy, where an atom is missing; one carbon-13 atom (a stable carbon isotope that has six protons and seven neutrons); and one nitrogen atom. This is called the NV center. Each NV-doped nanodiamond contains thousands of atoms, but has only one NV center; the remainder of the tetrahedrons in the nanodiamond are made solely of carbon.

It's an atomically small distinction, but it makes a big difference.

"That little dot, the NV center, turns the nanodiamond into a qubit," says Jay Narayan, the John C. Fan Distinguished Chair Professor of Materials Science and Engineering at NC State and lead author of a paper describing the work. "Each NV center has two transitions: NV0 and NV-. We can go back and forth between these two states using electric current or laser. These nanodiamonds could serve as the basic building blocks of a quantum computer."

To create these NV-doped nanodiamonds, the researchers start with a substrate, such as sapphire, glass or a plastic polymer. The substrate is then coated with amorphous carbon - elemental carbon that, unlike graphite or diamond, does not have a regular, well-defined crystalline structure. While depositing the film of amorphous carbon, the researchers bombard it with nitrogen ions and carbon-13 ions. The carbon is then hit with a laser pulse that raises its temperature to approximately 4,000 Kelvin (around 3,727 degrees Celsius), and it is then rapidly quenched. The operation is completed within a millionth of a second and takes place at one atmosphere - the same pressure as the surrounding air. By using different substrates and changing the duration of the laser pulse, the researchers can control how quickly the carbon cools, which allows them to create the nanodiamond structures.
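As a back-of-envelope illustration of the quench rate implied by the figures above (the article gives only the peak temperature and the microsecond timescale; the ambient temperature of 300 K is an assumption):

```python
# Rough quench rate implied by the process described above:
# carbon heated to ~4,000 K, then quenched within ~1 microsecond.
peak_temp_k = 4000.0     # laser-pulse peak temperature (from the article)
ambient_temp_k = 300.0   # assumed room temperature (not stated in the article)
quench_time_s = 1e-6     # "within a millionth of a second"

cooling_rate = (peak_temp_k - ambient_temp_k) / quench_time_s
print(f"approximate cooling rate: {cooling_rate:.1e} K/s")
```

The resulting rate, on the order of billions of kelvin per second, is what lets the carbon freeze into the diamond phase rather than relax back to graphite.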

"Our approach reduces impurities; controls the size of the NV-doped nanodiamond; allows us to place the nanodiamonds with a fair amount of precision; and directly incorporates carbon-13 into the material, which is necessary for creating the entanglement required in quantum computing," Narayan says. "All of the nanodiamonds are exactly aligned through the paradigm of domain matching epitaxy, which is a significant advance over existing techniques for creating NV-doped nanodiamonds."

"The new technique not only offers unprecedented control and uniformity in the NV-doped nanodiamonds, it is also less expensive than existing techniques," Narayan says. "Hopefully, this will enable significant advances in the field of quantum computing."

The researchers are currently talking with government and private sector groups about how to move forward. One area of interest is to develop a means of creating self-assembling systems that incorporate entangled NV-doped nanodiamonds for quantum computing.

Artificially intelligent computer software that can learn, adapt and rebuild itself in real-time could help combat climate change.

Researchers at Lancaster University’s Data Science Institute have developed a software system that can for the first time rapidly self-assemble into the most efficient form without needing humans to tell it what to do.

The system – called REx – is being developed with vast energy-hungry data centres in mind. By being able to rapidly adjust to optimally deal with a huge multitude of tasks, servers controlled by REx would need to do less processing, therefore consuming less energy.

REx works using ‘micro-variation’ – where a large library of building blocks of software components (such as memory caches, and different forms of search and sort algorithms) can be selected and assembled automatically in response to the task at hand.

“Everything is learned by the live system, assembling the required components and continually assessing their effectiveness in the situations to which the system is subjected,” said Dr Barry Porter, lecturer at Lancaster University’s School of Computing and Communications. “Each component is sufficiently small that it is easy to create natural behavioural variation. By autonomously assembling systems from these micro-variations we then see REx create software designs that are automatically formed to deal with their task.

“As we use connected devices on a more frequent basis, and as we move into the era of the Internet of Things, the volume of data that needs to be processed and distributed is rapidly growing. This is causing a significant demand for energy through millions of servers at data centres. An automated system like REx, able to find the best performance in any conditions, could offer a way to significantly reduce this energy demand,” Dr Porter added.

In addition, as modern software systems are increasingly complex – consisting of millions of lines of code – they need to be maintained by large teams of software developers at significant cost. It is broadly acknowledged that this level of complexity and management is unsustainable. As well as saving energy in data centres, self-assembling software models could also have significant advantages by improving our ability to develop and maintain increasingly complex software systems for a wide range of domains, including operating systems and Internet infrastructure.

REx is built using three complementary layers. At the base level a novel component-based programming language called Dana enables the system to find, select and rapidly adapt the building blocks of software. A perception, assembly and learning framework (PAL) then configures and perceives the behaviour of the selected components, and an online learning process learns the best software compositions in real-time by taking advantage of statistical learning methods known as ‘linear bandit models’.
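The article doesn't publish REx's code, but the general bandit idea it describes (trying interchangeable component variants and converging on the best performer) can be sketched with a simple epsilon-greedy learner. All component names and performance numbers below are invented for illustration; REx itself uses linear bandit models over the Dana/PAL stack rather than this basic scheme.

```python
import random

# Toy epsilon-greedy sketch of runtime component selection: mostly exploit
# the best-performing variant so far, occasionally explore alternatives.
VARIANTS = ["lru_cache", "fifo_cache", "no_cache"]  # invented building blocks

counts = {v: 0 for v in VARIANTS}
mean_reward = {v: 0.0 for v in VARIANTS}

def choose(epsilon=0.1):
    """Explore a random variant with probability epsilon, else exploit."""
    if random.random() < epsilon:
        return random.choice(VARIANTS)
    return max(VARIANTS, key=lambda v: mean_reward[v])

def update(variant, reward):
    """Incrementally update the running mean reward for a variant."""
    counts[variant] += 1
    mean_reward[variant] += (reward - mean_reward[variant]) / counts[variant]

# Simulated workload: pretend lru_cache genuinely performs best here.
random.seed(0)
true_perf = {"lru_cache": 0.9, "fifo_cache": 0.6, "no_cache": 0.3}
for _ in range(2000):
    v = choose()
    update(v, true_perf[v] + random.gauss(0, 0.05))

best = max(VARIANTS, key=lambda v: mean_reward[v])
print("selected composition:", best)
```

The live system keeps re-running this loop, so if the workload shifts and a different variant starts performing better, the selection shifts with it.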

The work is presented in the paper 'REx: A Development Platform and Online Learning Approach for Runtime Emergent Software Systems' at OSDI '16, the 12th USENIX Symposium on Operating Systems Design and Implementation. The research has been partially supported by the Engineering and Physical Sciences Research Council (EPSRC) and by a PhD scholarship from Brazil.

The next steps of this research will look at the automated creation of new software components for use by these systems and will also strive to increase automation even further to make software systems an active part of their own development teams, providing live feedback and suggestions to human programmers.

The paper’s authors are Dr Barry Porter, Matthew Grieves, Roberto Rodrigues Filho and Professor David Leslie.

University of Utah computer scientists analyze how Twitter users feel about presidential election

We know how Donald Trump feels about everyone through Twitter, but how do Twitter users feel about Donald Trump?

Computer scientists from the University of Utah's College of Engineering have developed what they call "sentiment analysis" software that can automatically determine how someone feels based on what they write or say. To test out the accuracy of this software's machine-learning model, the team used it to analyze the individual sentiments of more than 1.6 million (and counting) geo-tagged tweets about the U.S. presidential election over the last five months. A database of these tweets is then examined to determine whether states and their counties are leaning toward the Republicans or Democrats.

"With sentiment analysis, it will try to predict the emotions behind every human being when he or she is talking or writing something," says Debjyoti Paul, a doctoral student in the University of Utah's School of Computing and the project leader along with School of Computing associate professor Feifei Li. "With that in mind, we are not just trying to look at the information in the tweets. We are trying to incorporate the emotion with the information."

As a result of their work, the team has created an interactive website at http://www.estorm.org in which users can find out if the tweets coming out of their state and its counties are more positive or negative toward Republicans or Democrats during any defined period of time since June 5. Also, the data can tell you the percentage of both positive and negative tweets toward a political party and when there was a surge for a particular type of tweet in the last five months.

Some interesting facts about this year's U.S. presidential election based on a sample of what people are tweeting:

  • Based on the number of positive tweets posted since June toward each party, the computer model predicts that Hillary Clinton will win the presidential election.
  • Republicans sent out 17 percent more political tweets than Democrats.
  • Delaware was the only state in which a majority of tweets from all counties in the state were positive toward the same party -- in this case, the Democrats.
  • For the Republicans, South Dakota had the highest percentage of counties in which most of their tweets were positive toward the party (73 percent of the counties).
  • The biggest surges of positive tweets for Republicans came during the Republican National Convention the week of July 18 and after the video of Donald Trump boasting about groping women was leaked Oct. 7 (presumably defenders of Trump tweeting their support of him).
  • The largest surge of positive tweets for Democrats was after the last two presidential debates and after the New York Times published its story Oct. 1 that Donald Trump avoided paying federal taxes for nearly two decades.
  • Not only did the number of positive tweets for Democrats peak after the last two debates and the Trump federal taxes story, it's also when the most negative tweets about the Democratic Party were posted.

Analyzing the tweets

Paul and his team started with more than 250 million tweets posted around the world from June 5 to Oct. 30 and then weeded out all non-political tweets based on a system of keywords using advanced software. They were left with more than 1.6 million political tweets posted in the U.S.

Then those tweets were sifted through the team's "sentiment analysis" software where each tweet was analyzed and assigned a score from 0 to 1 where 0 is the most negative sentiment, 1 is the most positive sentiment, and 0.5 is neutral. The scores are then collected in a database that can calculate a state or county's political leanings in real time based on the tweets. The database is constantly updated with new tweets.
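The team's pipeline isn't public, but the aggregation step described above (per-tweet scores in [0, 1], rolled up by state and party) can be sketched roughly as follows. The data and schema below are invented for illustration:

```python
# Toy sketch of the aggregation described above: each tweet carries a
# sentiment score in [0, 1] (0 = most negative, 1 = most positive,
# 0.5 = neutral), a geo-tagged state, and a target party.
scored_tweets = [
    {"state": "UT", "party": "R", "score": 0.8},
    {"state": "UT", "party": "R", "score": 0.3},
    {"state": "UT", "party": "D", "score": 0.9},
    {"state": "DE", "party": "D", "score": 0.7},
]

def lean(tweets, state):
    """Mean sentiment per party for one state; higher = more positive."""
    sums, counts = {}, {}
    for t in tweets:
        if t["state"] != state:
            continue
        p = t["party"]
        sums[p] = sums.get(p, 0.0) + t["score"]
        counts[p] = counts.get(p, 0) + 1
    return {p: round(sums[p] / counts[p], 3) for p in sums}

print(lean(scored_tweets, "UT"))  # {'R': 0.55, 'D': 0.9}
```

Because new tweets simply append to the store and the means are cheap to recompute, a rollup like this can track a state's leaning in near real time, as the article describes.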

To measure the accuracy of the model, the team compared its results to the New York Times Upshot election forecast website and found the state-by-state analysis was very similar.

"I think it works really well. It matches up with the major events that happened during this election season. That's a good indicator that the results are accurate," says Li. "We're hoping to develop some more scientific measurements to confirm this observation for an upcoming paper, but the early results are very positive."

Paul believes that their sentiment analysis software could be used to more accurately reflect the feelings behind crowd-sourced opinions on the Internet, for example, product reviews on Amazon or restaurant reviews on Yelp, in which the software can "drill down to the individual sentences of the text" to determine a person's true feelings about something, he says.

He also said that voice-enabled assistants such as the iPhone's Siri could use such software to better determine what users want, based not just on what they say but on how they say it. 

This is a screenshot from estorm.org showing Twitter sentiment toward or against the major presidential candidates.

Figure 1: (left) Image of Saturn's rings taken by the Cassini spacecraft. Provided by NASA (http://photojournal.jpl.nasa.gov/catalog/PIA06077). (right) Image of Uranus' rings taken by the Hubble Space Telescope. Provided by NASA.

A team of researchers has presented a new model for the origin of Saturn's rings based on the results of computer simulations. The results are also applicable to the rings of other giant planets and explain the compositional differences between the rings of Saturn and Uranus. The findings were published on October 6 in the online version of Icarus.

The lead author of the paper is HYODO Ryuki (Kobe University, Graduate School of Science), and the co-authors are Professor Sébastien Charnoz (Institut de Physique du Globe/Université Paris Diderot), Professor OHTSUKI Keiji (Kobe University, Graduate School of Science), and Project Associate Professor GENDA Hidenori (Earth-Life Science Institute, Tokyo Institute of Technology).

The giant planets in our solar system have very diverse rings. Observations show that Saturn's rings are made of more than 95% icy particles, while the rings of Uranus and Neptune are darker and may have higher rock content. Since Saturn's rings were first observed in the 17th century, investigation of the rings has expanded from Earth-based telescopes to spacecraft such as the Voyagers and Cassini. However, the origin of the rings remained unclear, and the mechanisms that lead to such diverse ring systems were unknown.

The present study focused on the period called the Late Heavy Bombardment, believed to have occurred 4 billion years ago in our solar system, when the giant planets underwent orbital migration. It is thought that several thousand Pluto-sized objects (about one fifth of Earth's size) from the Kuiper belt existed in the outer solar system beyond Neptune. First, the researchers calculated the probability that these large objects passed close enough to the giant planets to be destroyed by their tidal forces during the Late Heavy Bombardment. The results showed that Saturn, Uranus and Neptune experienced close encounters with these large celestial objects multiple times.

Next, the group used computer simulations to investigate the disruption of these Kuiper belt objects by tidal forces when they passed through the vicinity of the giant planets (see Figure 2a). The results of the simulations varied depending on the initial conditions, such as the rotation of the passing objects and their minimum approach distance to the planet. However, the researchers discovered that in many cases fragments comprising 0.1-10% of the initial mass of the passing objects were captured into orbits around the planet (see Figures 2a, b). The combined mass of these captured fragments was found to be sufficient to explain the mass of the current rings around Saturn and Uranus. In other words, these planetary rings were formed when sufficiently large objects passed very close to the giant planets and were destroyed.

The researchers also simulated the long-term evolution of the captured fragments using supercomputers at the National Astronomical Observatory of Japan. From these simulations they found that captured fragments with an initial size of several kilometers are expected to undergo repeated high-speed collisions and be gradually shattered into small pieces. Such collisions between fragments are also expected to circularize their orbits and lead to the formation of the rings observed today (see Figures 2b, c).

This model can also explain the compositional difference between the rings of Saturn and Uranus. Compared to Saturn, Uranus (and also Neptune) has a higher mean density (1.27 g/cm³ for Uranus and 1.64 g/cm³ for Neptune, versus 0.69 g/cm³ for Saturn). This means that in the case of Uranus (and Neptune), objects can pass within close vicinity of the planet, where they experience extremely strong tidal forces. (Saturn has a lower density and a larger diameter-to-mass ratio, so objects that pass very close will collide with the planet itself.) As a result, if Kuiper belt objects with layered structures -- a rocky core beneath an icy mantle -- pass within close vicinity of Uranus or Neptune, not only the icy mantle but even the rocky core will be destroyed and captured, forming rings that include rocky material. If they pass by Saturn, however, only the icy mantle will be destroyed, forming icy rings. This explains the different ring compositions.
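One way to make the density argument concrete is the classical fluid-body Roche limit, d ≈ 2.46 R (ρ_planet/ρ_object)^(1/3): a denser planet disrupts a passing body of a given density at a larger distance in planet radii, leaving room to destroy even a dense rocky core before the object would strike the planet. This is only a textbook estimate, not the paper's simulation, and the rocky-core density of 3 g/cm³ is an assumption:

```python
# Illustrative fluid-body Roche limit: the distance (in planet radii)
# inside which a passing body of given density is tidally disrupted.
def roche_limit_radii(planet_density, object_density, coeff=2.46):
    """Classical fluid Roche limit in units of the planet's radius."""
    return coeff * (planet_density / object_density) ** (1.0 / 3.0)

planet_density = {"Saturn": 0.69, "Uranus": 1.27, "Neptune": 1.64}  # g/cm^3
rock_density = 3.0  # assumed density of a Kuiper belt object's rocky core

for planet, rho in planet_density.items():
    d = roche_limit_radii(rho, rock_density)
    print(f"{planet}: rocky core disrupted inside ~{d:.2f} planet radii")
```

For low-density Saturn, the rock-disruption zone hugs the planet, so passes deep enough to shatter a rocky core tend to end in collision; for denser Uranus and Neptune it sits farther out, consistent with the argument above.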

These findings illustrate that the rings of giant planets are natural by-products of the formation process of the planets in our solar system. This implies that giant planets discovered around other stars likely have rings formed by a similar process. Discovery of a ring system around an exoplanet has been recently reported, and further discoveries of rings and satellites around exoplanets will advance our understanding of their origin. 

Schematic illustration of the ring formation process. The dotted lines show the distance at which the giant planets' gravity is strong enough that tidal disruption occurs. (a) When Kuiper belt objects have close encounters with giant planets, they are destroyed by the giant planets' tidal forces. (b) As a result of tidal disruption some fragments are captured into orbits around the planet. (c) Repeated collisions between the fragments cause the captured fragments to break down, their orbit becomes gradually more circular, and the current rings are formed (partial alteration of figure from Hyodo, Charnoz, Ohtsuki, Genda 2016, Icarus).
