Office of Naval Research (ONR) officials announced a new program Oct. 22 to optimize tactical handheld technology for quick decision-making in the field.
The Exchange of Actionable Information at the Tactical Edge (EAITE) program, designed to sift through data from multiple sources for faster analysis, is among more than a dozen Future Naval Capability (FNC) programs kicking off in fiscal year 2014.
ONR Director of Transition Dr. Thomas Killion explained the need for the program—and how it benefits both the U.S. Navy and industry—during an FNC overview at the 2012 ONR Naval Science & Technology Partnership Conference.
"EAITE gives the U.S. Navy a means to build on industry advances in mobile technology and cloud computing to develop and transition a cutting-edge product to acquisition managers and ultimately the warfighter," Killion said.
Currently, information feeds from sensors and other assets flow into command centers, where intelligence analysts make sense of the material before it is passed down to lower-echelon troops. This process can take hours or even days. Complicating the situation, access to many more data sources and advanced analytic technology could leave Marines facing information overload.
EAITE aims to cut that delivery time down to minutes or even seconds by using automation to sift through data and send only the most relevant information to Marines operating handheld devices, said John Moniz, C4 program officer in ONR's Expeditionary Maneuver Warfare and Combating Terrorism Department.
"We want to figure out what we can automate and how we go about automating it so we can take all of these various sources of data and create something that can immediately be used by the end-user," Moniz said.
Under EAITE, researchers will develop firmware and software to distill imagery and information from unmanned aircraft feeds and other sources, efficiently move it over a tactical network and present it in an immediately understandable form to decision-makers on the battlefield.
Trimming the so-called "data to decisions" timeline down to minutes would be a drastic improvement, but researchers know there are situations when Marines need critical information even sooner.
"We would like to have some sort of warning that tells them when something that could put them in danger is getting close, and we need to get that to them in seconds," Moniz said.
In addition to EAITE, the new crop of FY14 FNC initiatives includes: a gel-wound cover for managing blast injuries in forward locations; real-time detection and assessment of traumatic brain injury in theater; an undersea weapon system that can autonomously neutralize surface and subsurface threats in shallow and intermediate waters; and surveillance tools for unmanned aircraft that can autonomously detect improvised explosive device precursors and hidden targets.
In general, the FNC program's goal is to match solutions with acquisition requirements to close warfighting gaps within five years.
A research team from the University of Bristol’s Centre for Quantum Photonics (CQP) have brought the quantum computer one step closer to reality by experimentally demonstrating a technique for significantly reducing the physical resources required for quantum factoring.
The team have shown how it is possible to recycle the particles inside a quantum computer, so that quantum factoring can be achieved with only one third of the particles originally required. The research is published in the latest issue of Nature Photonics.
Using photons as the particles, the Bristol team constructed a quantum optical circuit that recycled one of the photons, setting a new record by factoring 21 with a quantum algorithm; all previous demonstrations had factored 15.
Dr Anthony Laing, who led the project, said: “Quantum computers promise to harness the counterintuitive laws of quantum mechanics to perform calculations that are forever out of reach of conventional classical computers. Realising such a device is one of the great technological challenges of the century.”
While scientists and mathematicians are still trying to understand the full range of capabilities of quantum computers, the current driving application is the hard problem of factoring large numbers. The best classical computers can run for the lifetime of the universe, searching for the factors of a large number, yet still be unsuccessful.
In fact, Internet cryptographic protocols are based on this exponential overhead in computational time: if a third party wants to spy on your emails, they will need to solve a hard factoring problem first. A quantum computer, on the other hand, is capable of efficiently factoring large numbers, but the physical resources required mean that constructing such a device is highly challenging.
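The quantum advantage in Shor's algorithm comes from the period-finding step; the rest of the procedure is classical number theory. A hedged sketch of that classical skeleton (the period is brute-forced here, which is exactly the step a quantum computer performs efficiently; function name and structure are illustrative, not from the paper):

```python
from math import gcd

def shor_classical_core(N, a):
    """Classically illustrate Shor's algorithm: find the period r of
    a^x mod N (brute force here; quantum hardware does this efficiently),
    then derive factors of N from a^(r/2) +/- 1."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky guess: a already shares a factor with N
    # find the order r: smallest r > 0 with a^r = 1 (mod N)
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2:
        return None  # odd period: this choice of a fails, try another
    y = pow(a, r // 2, N)
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    if 1 < p < N:
        return p, N // p
    if 1 < q < N:
        return q, N // q
    return None

# Factor 21, the record set in the Bristol experiment (a = 2 has period 6 mod 21)
print(shor_classical_core(21, 2))  # → (7, 3)
```

The brute-force period search above takes exponential time in the number of digits of N, which is why factoring 21 on real quantum hardware is a meaningful milestone despite the small number.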
CQP PhD student Enrique Martín-López, who performed the experiment, said: “While it will clearly be some time before emails can be hacked with a quantum computer, this proof of principle experiment paves the way for larger implementations of quantum algorithms by using particle recycling.”
Paper: Experimental realization of Shor’s quantum factoring algorithm using qubit recycling, Enrique Martín-López, Anthony Laing, Thomas Lawson, Roberto Alvarez, Xiao-Qi Zhou and Jeremy L. O’Brien, Nature Photonics, 21 October 2012.
Carnegie Mellon University's Department of Electrical and Computer Engineering and the University of Porto's Business School in Porto, Portugal, have embarked on a new double-degree graduate program in engineering and business this semester.
A special event celebrating the program launch was held Oct. 23 in Portugal at Porto Business School. CMU's Ed Schlesinger and James Hoe joined Nuno de Sousa Pereira, dean of the Porto Business School; Jorge Farinha, vice dean of the Porto Business School; Sebastião Feyo de Azevedo, director of the Faculty of Engineering of the University of Porto; Carlos Oliveira, Portugal's secretary of state for Entrepreneurship and Innovation; and Allan Katz, U.S. ambassador to Portugal, at a program that included a panel discussion on "Technology, Entrepreneurship and Management Education" with the participation of representatives from several national and international IT companies and entrepreneurs.
"We have our first students enrolled in the new program, and we expect the initiative to grow as more companies globally seek the problem-solving, managerial and technical skills acquired through engineering and business studies provided by this double MS MBA experience," said Schlesinger, the David Edward Schramm Memorial Professor and head of CMU's Electrical and Computer Engineering Department.
The two-year program gives participants the opportunity to study one year in Portugal at the University of Porto and another year in the U.S. at Carnegie Mellon. Students are required to meet academic requirements from both universities to receive a Master of Science degree in electrical and computer engineering from Carnegie Mellon and a Master of Business Administration ("The Magellan MBA") from Porto Business School.
Over the past decade, increasing numbers of engineering students have found career opportunities ranging from startup companies to venerable consulting firms and Wall Street. Industry analysts and recruiters report that engineering students find unique opportunities in business because of their sharply honed technology skills and team-building experience.
"This new program builds on the long-standing relationship developed with the successful ICTI program (Information and Communication Technology Institute) in which Carnegie Mellon has partnered with a number of leading Portuguese universities," Schlesinger said.
"Bringing together Engineering and Business in this double degree program will be a major step toward preparing the next generation of highly skilled individuals that will be able to handle the most advanced technologies and related products while keeping at all times a business and managerial perspective. We expect to observe here the formation of a new breed of top engineers that are able to understand the language and practices of business and to effectively use soft skills such as communication, team management, negotiation or leadership to help companies achieve success in the marketplace in very complex organizational settings," Farinha said.
Supercomputer model: Chimp lifespan evolves into human longevity
Supercomputer simulations provide new mathematical support for the "grandmother hypothesis" – a famous theory that humans evolved longer adult lifespans than apes because grandmothers helped feed their grandchildren.
"Grandmothering was the initial step toward making us who we are," says Kristen Hawkes, a distinguished professor of anthropology at the University of Utah and senior author of the new study published Oct. 24 by the British journal Proceedings of the Royal Society B.
The simulations indicate that with only a little bit of grandmothering – and without any assumptions about human brain size – animals with chimpanzee lifespans evolve human lifespans in less than 60,000 years. Female chimps rarely live past their child-bearing years, usually reaching their 30s and sometimes their 40s. Human females often live decades past their child-bearing years.
The findings showed that from the time adulthood is reached, the simulated creatures lived another 25 years like chimps, yet after 24,000 to 60,000 years of grandmothers caring for grandchildren, the creatures who reached adulthood lived another 49 years – as do human hunter-gatherers.
The grandmother hypothesis says that when grandmothers help feed their grandchildren after weaning, their daughters can produce more children at shorter intervals; the children become younger at weaning but older when they first can feed themselves and when they reach adulthood; and women end up with postmenopausal lifespans just like ours.
By allowing their daughters to have more children, a few ancestral females who lived long enough to become grandmothers passed their longevity genes to more descendants, who had longer adult lifespans as a result.
Hawkes conducted the new study with first author and mathematical biologist Peter Kim, a former University of Utah postdoctoral researcher now on the University of Sydney faculty, and James Coxworth, a University of Utah doctoral student in anthropology. The study was funded by the National Science Foundation and the Australian Research Council.
How Grandmothering Came to Be
Hawkes, University of Utah anthropologist James O'Connell and UCLA anthropologist Nicholas Blurton Jones formally proposed the grandmother hypothesis in 1997, and it has been debated ever since. One major criticism was that it lacked a mathematical underpinning – something the new study sought to provide.
The hypothesis stemmed from observations by Hawkes and O'Connell in the 1980s when they lived with Tanzania's Hadza hunter-gatherer people and watched older women spend their days collecting tubers and other foods for their grandchildren. In all other primates and mammals, the young collect their own food after weaning.
But as human ancestors evolved in Africa during the past 2 million years, the environment changed, growing drier with more open grasslands and fewer forests – forests where newly weaned infants could collect and eat fleshy fruits on their own.
"So moms had two choices," Hawkes says. "They could either follow the retreating forests, where foods were available that weaned infants could collect, or continue to feed the kids after the kids are weaned. That is a problem for mothers because it means you can't have the next kid while you are occupied with this one."
That opened a window for the few females whose childbearing years were ending – grandmothers – to step in and help, digging up potato-like tubers and cracking hard-shelled nuts in the increasingly arid environment. Those are tasks newly weaned apes and human ancestors couldn't handle as infants.
The primates who stayed near food sources that newly weaned offspring could collect "are our great ape cousins," says Hawkes. "The ones that began to exploit resources little kids couldn't handle, opened this window for grandmothering and eventually evolved into humans."
Evidence that grandmothering increases grandchildren's survival is seen in 19th and 20th century Europeans and Canadians, and in the Hadza and some other African peoples.
But it is possible that the benefits grandmothers provide to their grandchildren might be the result of long postmenopausal lifespans that evolved for other reasons, so the new study set out to determine if grandmothering alone could result in the evolution of ape-like life histories into long postmenopausal lifespans seen in humans.
Simulating the Evolution of Adult Lifespan
The new study isn't the first to attempt to model or simulate the grandmother effect. A 1998 study by Hawkes and colleagues took a simpler approach, showing that grandmothering accounts for differences between humans and modern apes in life-history events such as age at weaning, age at adulthood and longevity.
A recent simulation by other researchers said there were too few females living past their fertile years for grandmothering to affect lifespan in human ancestors. The new study grew from Hawkes' skepticism about that finding.
Unlike Hawkes' 1998 study, the new study simulated evolution over time, asking, "If you start with a life history like the one we see in great apes – and then you add grandmothering, what happens?" Hawkes says.
The simulations measured the change in adult longevity – the average lifespan from the time adulthood begins. Chimps that reach adulthood (age 13) live an average of another 15 or 16 years. People in developed nations who reach adulthood (at about age 19) live an average of another 60 years or so – to the late 70s or low 80s.
The extension of adult lifespan in the new study involves evolution in prehistoric time; increasing lifespans in recent centuries have been attributed largely to clean water, sewer systems and other public health measures.
The researchers were conservative, making the grandmother effect "weak" by assuming that a woman couldn't provide grandmothering before age 45 or after age 75, that she couldn't care for a child until that child reached age 2, and that she could care for only one child at a time – though it could be any child, not just her daughter's.
Based on earlier research, the simulation assumed that any newborn had a 5 percent chance of a gene mutation that could lead to either a shorter or a longer lifespan.
The simulation begins with only 1 percent of women living to grandmother age and able to care for grandchildren, but by the end of the 24,000 to 60,000 simulated years, the results are similar to those seen in human hunter-gatherer populations: about 43 percent of adult women are grandmothers.
The new study found that from adulthood, additional years of life doubled from 25 years to 49 years over the simulated 24,000 to 60,000 years.
The difference in how fast the doubling occurred depends on different assumptions about how much a longer lifespan costs males: Living longer means males must put more energy and metabolism into maintaining their bodies longer, so they put less vigor into competing with other males over females during young adulthood. The simulation tested three different degrees to which males are competitive in reproducing.
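The core mechanism – mutations that lengthen lifespan spreading because long-lived grandmothers boost their lineage's reproduction – can be illustrated with a deliberately simplified toy simulation. This is an assumption-laden sketch, not the authors' actual model: the trait values, the 0.1 grandmothering weight, and the selection scheme are all invented for illustration; only the starting point (roughly 25 extra adult years, chimp-like) and the 5 percent mutation chance echo figures from the article.

```python
import random

def simulate(generations=150, pop=400, bonus=0.1, seed=1):
    """Toy illustration (NOT the study's model): females carry an
    'extra adult years' trait starting at a chimp-like 25. With
    bonus > 0, longer-lived females can grandmother, raising their
    lineage's reproductive weight; each birth has a 5 percent chance
    of a mutation nudging lifespan up or down by one year."""
    random.seed(seed)
    spans = [25.0] * pop
    for _ in range(generations):
        # reproductive weight grows with grandmothering capacity
        weights = [1.0 + bonus * max(0.0, s - 25.0) for s in spans]
        # next generation sampled in proportion to those weights
        children = random.choices(spans, weights=weights, k=pop)
        # 5 percent mutation chance per birth, +/- 1 year
        spans = [s + random.choice((-1.0, 1.0)) if random.random() < 0.05 else s
                 for s in children]
    return sum(spans) / pop

print("mean extra adult years with grandmothering:   ",
      round(simulate(bonus=0.1), 1))
print("mean extra adult years without grandmothering:",
      round(simulate(bonus=0.0), 1))
```

Even this crude setup shows the qualitative effect the study quantifies rigorously: with the grandmothering bonus switched on, mean adult lifespan drifts upward over the generations, while without it the trait only wanders around its starting value.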
What Came First: Bigger Brains or Grandmothering?
The competing "hunting hypothesis" holds that as resources dried up for human ancestors in Africa, hunting became better than foraging for finding food, and that led to natural selection for bigger brains capable of learning better hunting methods and clever use of hunting weapons. Women formed "pair bonds" with men who brought home meat.
Many anthropologists argue that increasing brain size in our ape-like ancestors was the major factor in humans developing lifespans different from apes. But the new supercomputer simulation ignored brain size, hunting and pair bonding, and showed that even a weak grandmother effect can make the simulated creatures evolve from chimp-like longevity to human longevity.
So Hawkes believes the shift to longer adult lifespan caused by grandmothering "is what underlies subsequent important changes in human evolution, including increasing brain size."
"If you are a chimpanzee, gorilla or orangutan baby, your mom is thinking about nothing but you," she says. "But if you are a human baby, your mom has other kids she is worrying about, and that means now there is selection on you – which was not on any other apes – to much more actively engage her: 'Mom! Pay attention to me!'"
"Grandmothering gave us the kind of upbringing that made us more dependent on each other socially and prone to engage each other's attention," she adds.
That, says Hawkes, gave rise to "a whole array of social capacities that are then the foundation for the evolution of other distinctly human traits, including pair bonding, bigger brains, learning new skills and our tendency for cooperation."
A new study by Northwestern University researchers has revealed that public DNS services could actually slow down users' web-surfing experience. As a result, researchers have developed a solution to help avoid such an impact: a tool called namehelp that could speed web performance by 40 percent.
Through a large-scale study involving more than 10,000 hosts across nearly 100 countries, Fabián Bustamante, associate professor of electrical engineering and computer science at Northwestern's McCormick School of Engineering and Applied Science, and his team found that one cause of slow web performance is a growing trend toward public Domain Name Systems (DNS), a form of database that translates Internet domain and host names into Internet Protocol (IP) addresses.
DNS services play a vital role in the Internet: every time a user visits a website, chats with friends, or sends email, his computer performs DNS look-ups before setting up a connection. Complex web pages often require multiple DNS look-ups before they start loading, so users' computers may perform hundreds of DNS look-ups a day. Most users are unaware of DNS, since Internet Service Providers (ISPs) typically offer the service transparently.
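A single such look-up can be observed directly. The short sketch below times one resolution through the system's configured resolver using Python's standard library (the function name is illustrative; resolution may fail where there is no network, so the sketch returns None in that case):

```python
import socket
import time

def time_dns_lookup(hostname):
    """Time a single DNS resolution via the system's configured
    resolver. Returns elapsed seconds, or None if resolution fails
    (e.g. no network access)."""
    start = time.perf_counter()
    try:
        socket.getaddrinfo(hostname, 80)
    except OSError:
        return None
    return time.perf_counter() - start

# A complex web page may trigger dozens of look-ups like this one
# before any content starts loading.
print(time_dns_lookup("example.com"))
```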
Over the last few years, companies such as Google, OpenDNS, and Norton DNS have begun offering "public" DNS services. While "private" DNS services, such as those offered by ISPs, may be misconfigured, respond slowly to queries, and go down more often, public DNS services offer increased security and privacy, and quicker resolution time. The arrangement is also beneficial for public DNS providers, who gain access to information about users' web habits.
Bustamante and his team found that while using public DNS services may provide many benefits, users' web performance can suffer due to the hidden interaction of DNS with Content Delivery Networks (CDNs), another useful and equally transparent service in the web.
CDNs help performance by offering exact replicas of website content in hundreds or thousands of computer servers around the world; when a user types in a web address, he is directed to the copy geographically closest to him. Most popular websites – more than 70 percent of the top 1,000 most popular sites, according to the Northwestern study – rely on CDNs to deliver their content quickly to users around the world.
But researchers found that using public DNS services can result in bad redirections, sending users to content from CDN replicas that are three times farther away than necessary.
Public DNS and CDN services are working to address the problem, but current users are left with two mediocre options – bad web performance through public DNS services or bad security and privacy support through private DNS services.
Now Bustamante and his group have developed a tool called namehelp that may let users have their cake and eat it, too – by using public DNS services without compromising on web performance.
namehelp runs personalized benchmarks in the background, from within users' computers, to determine their optimal DNS configuration and improve their web experience by helping sites load faster. If it finds that a user is receiving less than optimal web performance, namehelp automatically fixes it by cleverly interacting with DNS services and CDNs to ensure the user gets his content from the nearest possible copy.
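The kind of background benchmarking namehelp performs can be sketched with a minimal raw DNS query sent to candidate resolvers over UDP. This is an assumption-laden illustration, not namehelp's actual code: the packet builder follows the RFC 1035 wire format, and the resolver addresses are the well-known public ones for Google DNS and OpenDNS.

```python
import socket
import struct
import time

def build_query(hostname, qid=0x1234):
    """Build a minimal DNS A-record query packet (RFC 1035 wire format)."""
    # header: id, flags (recursion desired), 1 question, 0 answer/auth/extra
    header = struct.pack(">HHHHHH", qid, 0x0100, 1, 0, 0, 0)
    # QNAME: length-prefixed labels terminated by a zero byte
    qname = b"".join(struct.pack("B", len(p)) + p.encode()
                     for p in hostname.split(".")) + b"\x00"
    return header + qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN

def benchmark_resolver(server, hostname, timeout=2.0):
    """Return the round-trip time in seconds for one query to `server`,
    or None on timeout/failure (e.g. no network in this environment)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        start = time.perf_counter()
        sock.sendto(build_query(hostname), (server, 53))
        sock.recv(512)
        return time.perf_counter() - start
    except OSError:
        return None
    finally:
        sock.close()

# Compare public resolvers, as namehelp's background benchmarks might
for server in ("8.8.8.8", "208.67.222.222"):  # Google DNS, OpenDNS
    print(server, benchmark_resolver(server, "example.com"))
```

A tool like namehelp would repeat such measurements across many hostnames and resolvers, and also check which CDN replica each resolver's answers point to, before picking the configuration that loads pages fastest.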