Split junction overview. Illustration of protein filaments (red) propelled by molecular motors (green) arriving at a junction where they perform a calculation (adding 5 or adding 0).

A study reports the realization of a parallel computer based on designed nanofabricated channels explored in a massively parallel fashion by protein filaments propelled by molecular motors.

A study published this week in Proceedings of the National Academy of Sciences reports a new parallel-supercomputing approach, based on a combination of nanotechnology and biology, that can solve combinatorial problems. The approach is scalable, error-tolerant and energy-efficient, and it can be implemented with existing technologies. The pioneering work was carried out by researchers from the Technische Universität Dresden and the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, in collaboration with international partners from Canada, England, Sweden, the US, and the Netherlands.

Conventional electronic computers have led to remarkable technological advances in recent decades, but their sequential nature (they process only one computational task at a time) prevents them from solving problems of a combinatorial nature, such as protein design and folding or optimal network routing. This is because the number of calculations required to solve such problems grows exponentially with the size of the problem, rendering them intractable with sequential computing. Parallel supercomputing approaches can in principle tackle such problems, but the approaches developed so far have suffered from drawbacks that have made scaling up and practical implementation very difficult. The recently reported parallel-computing approach aims to address these issues by combining well-established nanofabrication technology with molecular motors, which are highly energy efficient and inherently work in parallel.

In this approach, which the researchers demonstrate on a benchmark combinatorial problem that is notoriously hard to solve with sequential computers, the problem to be solved is ‘encoded’ into a network of nanoscale channels. This is done, on the one hand, by mathematically designing a geometrical network capable of representing the problem and, on the other hand, by fabricating a physical network based on this design using lithography, a standard chip-manufacturing technique.

The network is then explored in parallel by many protein filaments (here actin filaments or microtubules) that are self-propelled by a molecular layer of motor proteins (here myosin or kinesin) covering the bottom of the channels. Different types of junctions in the network automatically guide the filaments toward the correct solutions to the problem by making them behave in two distinct ways. Because the filaments are rather rigid structures, they can turn left or right only at certain angles of the crossing channels. By defining these options (‘split junctions’ and ‘pass junctions’), the scientists achieved an ‘intelligent’ network in which a filament either passes straight through a junction or chooses between two possible channels with 50/50 probability.
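As a purely illustrative aid (not code from the study), the following Python sketch mimics this exploration in software for a hypothetical subset-sum instance: at every split junction a simulated filament either picks up the corresponding set element or adds 0, the same 50/50 choice described above. The example set {2, 5, 9}, the filament count and the function names are assumptions made only for this sketch.

```python
import random
from collections import Counter

def explore_network(elements, n_filaments=10_000, seed=0):
    """Monte Carlo sketch of filaments exploring a subset-sum network.

    At each 'split junction' (one per set element) a filament either turns into
    the 'add the element' channel or stays in the 'add 0' channel, each with
    50/50 probability; 'pass junctions' only route filaments onward without
    changing the running sum. A filament's exit position equals the sum of the
    elements it picked up along the way.
    """
    rng = random.Random(seed)
    exits = Counter()
    for _ in range(n_filaments):
        position = 0
        for value in elements:          # one split junction per set element
            if rng.random() < 0.5:      # 50/50 decision at the split junction
                position += value       # 'add the element' channel
            # else: 'add 0' channel, running sum unchanged
        exits[position] += 1            # filament leaves the network at this exit
    return exits

if __name__ == "__main__":
    elements = [2, 5, 9]                # hypothetical instance, for illustration only
    exits = explore_network(elements)
    # Exits that received filaments correspond to achievable subset sums.
    print("achievable sums:", sorted(exits))
    print("filaments per exit:", dict(sorted(exits.items())))
```

In the physical device, the analogous enumeration happens in parallel: each filament traces one candidate combination, and the exits that end up populated correspond to the solutions.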

The time to solve combinatorial problems of size N using this parallel-computing approach scales approximately as N^2, which is a dramatic improvement over the exponential (2^N) time required by conventional, sequential computers. Importantly, the approach is fully scalable with existing technologies and uses orders of magnitude less energy than conventional computers, thus circumventing the heating issues that are currently limiting the performance of conventional computing.
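To make the claimed difference concrete, a tiny illustrative calculation (generic growth rates, not measurements from the study) shows how quickly the two counts diverge:

```python
# Illustrative only: compare polynomial (N^2) and exponential (2^N) growth
# in the number of required operations as the problem size N increases.
for N in (10, 20, 30, 40):
    print(f"N = {N:>2}:  N^2 = {N**2:>5,}    2^N = {2**N:>16,}")
```

Even at N = 40, the polynomial count stays around 1,600 while the exponential count already exceeds a trillion.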

The Purdue University Department of Computer Science will offer a new degree concentration in information security starting with courses in summer 2016.

The program will allow anyone with a computing or programming background to obtain a Master of Computer Science degree in one year. Unlike a traditional master's degree, this program is open to professionals whose undergraduate degree may not have been in computer science. The program is appropriate for professionals who gained programming experience during their careers, those who earned a computer science minor as undergraduates, or recent graduates with a computing major, said Randy Bond, assistant head of the Department of Computer Science.

Students will earn the degree with 30 credit hours, including two gateway courses (six credits), six computer science courses (18 credits) and two electives (six credits). Students are expected to graduate in 12 months. The first group of students will be admitted to the program in the summer, with the gateway course sequence starting in the June summer session. The gateway courses will serve as a refresher so that students are better prepared to succeed in the remaining courses. There is no online option for the program at this time.

Bond said the concentration will provide students with the skills to implement security features on traditional computer systems and networks, as well as on a variety of devices, including mobile phones, airplanes, cars and point-of-sale machines. Students would also learn how to write software that defends a system against phishing emails or viruses that probe a computer.

Prospective students can apply for the program at http://www.purdue.edu/gradschool and on the department's website. Visit https://www.cs.purdue.edu/ISCP for more information.

Sailing history is rife with tales of monster-sized rogue waves -- huge, towering walls of water that seemingly rise up from nothing to dwarf, then deluge, vessel and crew. Rogue waves can measure eight times higher than the surrounding seas and can strike in otherwise calm waters, with virtually no warning.

Now a prediction tool developed by MIT engineers may give sailors a 2-3 minute warning of an incoming rogue wave, providing them with enough time to shut down essential operations on a ship or offshore platform.

The tool, in the form of an algorithm, sifts through data from surrounding waves to spot clusters of waves that may develop into a rogue wave. Depending on a wave group's length and height, the algorithm computes a probability that the group will turn into a rogue wave within the next few minutes.

"It's precise in the sense that it's telling us very accurately the location and the time that this rare event will happen," says Themis Sapsis, the American Bureau of Shipping Career Development Assistant Professor of Mechanical Engineering at MIT. "We have a range of possibilities, and we can say that this will be a dangerous wave, and you'd better do something. That's really all you need."

Sapsis and former postdoc Will Cousins have published their results this week in the Journal of Fluid Mechanics.

"Not just bad luck"

Like many complex systems, the open ocean can be represented as a chaotic mix of constantly changing data points. To understand and predict rare events such as rogue waves, scientists have typically taken a leave-no-wave-behind approach, in which they try to simulate every individual wave in a given body of water to build a high-resolution picture of the sea state, as well as of any suspicious, rogue-like activity. This extremely detailed approach is computationally expensive, as it requires a cluster of computers to solve equations for each and every wave and its interactions with surrounding waves.

"It's accurate, but it's extremely slow -- you cannot run these computations on your laptop," Sapsis says. "There's no way to predict rogue waves practically. That's the gap we're trying to address."

Sapsis and Cousins devised a much simpler, faster way to predict rogue waves, given data on the surrounding wave field.

In previous work, the team identified one mechanism by which rogue waves form in unidirectional wave fields. They observed that, while the open ocean consists of many waves, most of which move independently of each other, some waves cluster together in a single wave group, rolling through the ocean together. Certain wave groups, they found, end up "focusing" or exchanging energy in a way that eventually leads to an extreme rogue wave.

"These waves really talk to each other," Sapsis says. "They interact and exchange energy. It's not just bad luck. It's the dynamics that create this phenomenon."

Going rogue

In their current work, the researchers sought to identify precursors: patterns in wave groups that ultimately end up as rogue waves. To do this, they combined ocean wave data from ocean-buoy measurements with nonlinear analysis of the underlying water wave equations.

Sapsis and Cousins used the statistical data to quantify the range of wave possibilities, for a given body of water. They then developed a novel approach to analyze the nonlinear dynamics of the system and predict which wave groups will evolve into extreme rogue waves.

They were able to predict which groups turned rogue, based on two parameters: a wave group's length and height. The combination of statistics and dynamics helped the team identify the length-scale of a critical wave group, which has the highest likelihood of evolving into a rogue wave. Using this, the team derived a simple algorithm to predict a rogue wave based on incoming data. By tracking the energy of the surrounding wave field over this length-scale, they could immediately calculate the probability of a rogue wave developing.
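The following minimal sketch illustrates this kind of calculation under stated assumptions: a simple squared-elevation energy proxy, an averaging window set by the critical length scale, and an arbitrary threshold. It is not the published algorithm, and every name, value and the synthetic sea state are placeholders.

```python
import numpy as np

def rogue_wave_risk(surface_elevation, dx, critical_length, energy_threshold):
    """Sketch: flag wave groups whose energy, averaged over the critical
    length scale, exceeds a threshold and may indicate a developing rogue wave.

    surface_elevation : sampled sea-surface heights along a transect [m]
    dx                : spacing between samples [m]
    critical_length   : length scale of the critical wave group [m]
    energy_threshold  : energy level above which a group is flagged
    """
    window = max(1, int(round(critical_length / dx)))
    energy = surface_elevation ** 2                          # crude local-energy proxy
    kernel = np.ones(window) / window
    group_energy = np.convolve(energy, kernel, mode="same")  # energy over the length scale
    risk = group_energy / energy_threshold                   # > 1 means flagged as a precursor
    flagged = np.where(risk > 1.0)[0]
    return risk, flagged * dx                                # risk profile and flagged positions

if __name__ == "__main__":
    # Synthetic example: a modest background sea plus one energetic wave group.
    x = np.arange(0.0, 2000.0, 2.0)                          # 2 km transect, 2 m spacing
    background = 1.0 * np.sin(2 * np.pi * x / 100.0)
    group = 2.5 * np.exp(-((x - 1200.0) / 80.0) ** 2) * np.sin(2 * np.pi * x / 100.0)
    risk, positions = rogue_wave_risk(background + group, dx=2.0,
                                      critical_length=160.0, energy_threshold=1.5)
    print(f"flagged region: {positions.min():.0f} to {positions.max():.0f} m along the transect")
```

The design choice mirrored here is the one described above: rather than simulating every wave, the method only tracks how much energy sits inside wave groups of the critical length scale.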

"Using data and equations, we've determined for any given sea state the wave groups that can evolve into rogue waves," Sapsis says. "Of those, we only observe the ones with the highest probability of turning into a rare event. That's extremely efficient to do."

Sapsis says the team's algorithm is able to predict rogue waves several minutes before they fully develop. To put the algorithm into practice, he says ships and offshore platforms will have to utilize high-resolution scanning technologies such as LIDAR and radar to measure the surrounding waves.

"If we know the wave field, we can identify immediately what would be the critical length scale that one has to observe, and then identify spatial regions with high probability for a rare event," Sapsis says. "If you are performing operations on an aircraft carrier or offshore platform, this is extremely important."

A study in Iquitos, Peru, is focusing on how asymptomatic human carriers contribute to the spread of dengue fever, which is transmitted by the Aedes aegypti mosquito. "This information is important," says Emory University disease ecologist Gonzalo Vazquez-Prokopec, "because Latin America is more than 80 percent urban and the Aedes aegypti mosquito is in every town."

It's time for a new model of disease surveillance

Ebola. Chikungunya. Zika. Once-rare and exotic pathogens keep popping up and turning into household names. It's the new reality as the climate warms, humans expand further into wildlife habitats and air travel shrinks distances across the globe.

"Africa and other parts of the developing world are undergoing rapid urbanization, so we are going to keep seeing more of these explosive epidemics," says Gonzalo Vazquez-Prokopec, a disease ecologist focused on mosquito-borne diseases in Emory University's Department of Environmental Sciences.

The complex properties driving today's disease transmission -- and the speed at which an epidemic can travel -- call for new methods of surveillance, Vazquez-Prokopec says. He is the lead author of an opinion piece, recently published in Trends in Parasitology, proposing a novel way of developing mathematical models of infectious diseases to uncover hidden patterns of transmission.

For example, he says, disease surveillance tends to focus on people with symptoms, but with many mosquito-borne viruses -- such as dengue, chikungunya and Zika -- a large share of infected people have no symptoms. These asymptomatic carriers can still infect others. They may even act as super spreaders -- those who contribute the most to the transmission of the pathogen.

"There is a gradient in the manifestation of disease, from no symptoms at all to death," Vazquez-Prokopec says. "And during an epidemic of mosquito-borne disease, that spectrum of disease manifestation is coupled with variable factors such as the movement of people and mosquitoes and whether individual people are more attractive to the mosquitoes and get bitten more often."

The so-called 80-20 rule -- 80 percent of disease transmission events in an epidemic are caused by 20 percent of people -- is a well-established phenomenon. "We know this pattern is prevalent across disease systems," Vazquez-Prokopec says, "but we don't know the variations that combine to make someone a super spreader. We need to determine if each variable is just noise or is contributing to transmission in a predictable way, so that we can target interventions that have more impact."
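To see how this kind of heterogeneity is typically quantified, the sketch below simulates individual infectiousness with a negative-binomial distribution, the standard device for modeling super spreading; the reproduction number, dispersion parameter and sample size are assumptions for illustration, not numbers from this work.

```python
import numpy as np

def share_from_top(frac_top, r0=2.0, dispersion=0.2, n=100_000, seed=1):
    """Sketch: fraction of all secondary infections caused by the most
    infectious individuals, under a negative-binomial offspring model.

    r0         : mean number of secondary infections per case
    dispersion : negative-binomial k; small k = strong super spreading
    """
    rng = np.random.default_rng(seed)
    # NumPy parameterizes the negative binomial by (n, p); convert from (mean, k).
    k = dispersion
    p = k / (k + r0)
    offspring = rng.negative_binomial(k, p, size=n)   # secondary infections per case
    offspring = np.sort(offspring)[::-1]              # most infectious individuals first
    top = offspring[: int(frac_top * n)]
    return top.sum() / offspring.sum()

if __name__ == "__main__":
    share = share_from_top(0.20)
    print(f"top 20% of cases cause ~{share:.0%} of transmission (illustrative)")
```

With a small dispersion parameter, the simulated top 20 percent of cases account for roughly 80 percent of onward transmission, reproducing the pattern described above.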

The uneven contribution of certain individuals, locations or reservoir hosts to the spread of a disease is known as transmission heterogeneity.

Vazquez-Prokopec and his co-authors propose a framework that moves beyond investigations of single sources of heterogeneity and accounts for the complex couplings between conditions that have potential synergistic impacts on disease transmission. This framework aims to uncover whether there is a hidden, unified process underlying the significant levels of heterogeneity for any infectious disease.

"The time is right to embrace the full complexity of transmission dynamics," Vazquez-Prokopec says. "We now have enough baseline data, and the necessary [super]computer power, to develop more complex models of disease transmission to help contain outbreaks."

Vazquez-Prokopec specializes in spatial analysis of disease transmission patterns and has several research projects for dengue fever ongoing in Latin America. His work in the city of Iquitos, Peru, for instance, is focusing on how asymptomatic carriers contribute to the spread of an epidemic. Dengue is spread by the same mosquito species, Aedes aegypti, that spreads the Zika and chikungunya viruses, so the data his lab is gathering has the potential for broader applications.

"The wealth of data that we've collected for dengue, combining the components of humans, pathogens, mosquitos and the environment, is giving us a detailed picture of the complexity of disease transmission across an urban landscape in the developing world," Vazquez-Prokopec says. "This information is important because Latin America is more than 80 percent urban and the Aedes aegypti mosquito is in every town."

Woolpert has been awarded a five-year contract to provide on-call geographic information system (GIS) services for San Francisco International Airport (SFO), it was announced today.

Woolpert Project Director Mark Ricketson said the work could involve GIS data collection, compliance with Federal Aviation Administration (FAA) airport GIS standards, and strategic planning services. The project also includes GIS application development, integration with existing airport business systems, and updating data previously collected by Woolpert.

“A little over five years ago, we were awarded a full airport-wide data collection effort with airspace analysis at SFO,” Ricketson said. “Then about two-and-a-half years later, we updated the data to reflect the fairly significant improvements and changes to the airfield.”

The airport is in the midst of a $4.4 billion capital improvement plan, which includes a newly completed air traffic control tower, a new Terminal 1, a new 350-room hotel, and numerous cargo and service-related buildings.

Ricketson said SFO’s airfield, building facilities and supporting infrastructure will continue to change as new capital projects are constructed, and that the GIS contract will support the analysis of those changes and keep the airport’s GIS data current.

“We enjoy working for such a world-class airport, which cares so much about technology and sustainability,” said Ricketson, adding that Woolpert works extensively in sustainable design under its architecture and engineering disciplines. “The staff makes SFO one of my favorite clients.”

SFO is the seventh-largest airport in the country by total passengers, according to Airports Council International (ACI); in ACI’s most recent report, the airport handled more than 47 million passengers in 2014.

Woolpert is expected to begin the GIS project this month.
