Reserved Bandwidth on ESnet Makes Possible Multi-Gigabit Streaming Between Argonne and SC Conference in Portland

As both an astrophysicist and director of the San Diego Supercomputer Center (SDSC), Mike Norman understands two common perspectives on archiving massive scientific datasets. During a live demonstration at the SC09 conference, in which data from a simulation of cosmic structures in the early universe was streamed, Norman said that some center directors view their data archives as "black holes," where a wealth of data accumulates and needs to be protected.

But as a leading expert in the field of astrophysics, he sees data as intellectual property that belongs to the researcher and his or her home institution — not the center where the data was computed. Some people, Norman says, claim that it's impossible to move those terabytes of data between computing centers and where the researcher sits. But in a live demo in which data was streamed over a reserved 10-gigabit-per-second connection provided by the Department of Energy's ESnet (Energy Sciences Network), Norman and his graduate assistant Rick Wagner showed it can be done.

While the scientific results of the project are important, the success in building reliable high-bandwidth connections linking key research facilities and institutions addresses a problem facing many science communities.

“A lot of researchers stand to benefit from this successful demonstration,” said Eli Dart, an ESnet engineer who helped the team achieve the necessary network performance.  “While the science itself is very important in its own right, the ability to link multiple institutions in this way really paves the way for other scientists to use these tools more easily in the future.”

"This couldn't have been done without ESnet," Wagner said. Two aspects of the network came into play. First, ESnet operates the circuit-oriented Science Data Network, which provides dedicated bandwidth for moving large datasets. However, with numerous projects filling the network much of the time for other demos and competitions at SC09, Norman and Wagner took advantage of OSCARS, ESnet's On-Demand Secure Circuit and Advance Reservation System.

"We gave them the bandwidth they needed, when they needed it," said ESnet engineer Evangelos Chaniotakis. The San Diego team was given two two-hour bandwidth reservations on both Tuesday, Nov. 17, and Thursday, Nov. 19. Chaniotakis set up the reservations, then the network automatically reconfigured itself once the window closed.

At the SDSC booth, the live streaming of the data drew a standing-room-only crowd as the data was first shown as a 4,096³ cube containing 64 billion particles and cells. But Norman pointed out that the milky white cube was far too complex to absorb, then added that it was only one of numerous time-steps. In all, the data required for rendering came to about 150 terabytes.

In real time, the data was rendered on the Eureka Linux cluster at the Argonne Leadership Computing Facility and reduced to one-sixty-fourth of its original size for a 1,024³ representation, making it more manageable and allowing it to be explored interactively. The milky mesh was shown to contain galaxies and clusters linked by sheets and filaments of cosmic gases.
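The "one-sixty-fourth" figure follows directly from the grid dimensions: shrinking each axis of a cubic grid by a factor of 4 cuts the total cell count by 4³ = 64.

```python
# Reducing each dimension of a cubic grid by a factor f
# reduces the total number of cells by f**3.
full = 4096 ** 3       # ~68.7 billion cells at full resolution
reduced = 1024 ** 3    # ~1.07 billion cells after reduction
print(full // reduced) # 64, i.e. one-sixty-fourth of the original size
```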

The project, Norman explained, is aimed at determining whether the signal of faint ripples in the universe known as baryon acoustic oscillations, or BAO, can actually be observed in the absorption of light by the intergalactic gas. It can, according to research led by Norman, who said they were the first to determine this. Such a finding is critical to the success of a dark energy survey known as BOSS, the Baryon Oscillation Spectroscopic Survey. The results of his proof-of-concept project, Norman said, "ensure that BOSS is not a waste of time."

Creating a simulation of this size, even using the petaflop Cray XT5 Kraken system at the University of Tennessee, can take three months to complete, as it is run in batches as time is allocated, Norman said. The data could then be moved in three nights to Argonne for rendering. The images were then streamed to the SDSC OptiPortal for display. Norman said the next step is to close the loop between the client side and the server side to allow interactive use. But the hard work — connecting the resources with adequate bandwidth — has been done, as evidenced by the demo, he noted.
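The "three nights" figure is consistent with back-of-the-envelope arithmetic: at a sustained 10 gigabits per second, 150 terabytes takes roughly 33 hours, or about 11 hours per night over three nights — assuming the link actually runs near line rate, which is exactly what the testing and tuning was for.

```python
# Back-of-the-envelope transfer time for 150 TB over a 10 Gbps link.
dataset_bits = 150e12 * 8          # 150 terabytes expressed in bits
rate_bps = 10e9                    # 10 gigabits per second
hours = dataset_bits / rate_bps / 3600
print(round(hours, 1))             # 33.3 hours of sustained transfer
print(round(hours / 3, 1))         # ~11.1 hours per night over three nights
```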

But it wasn't just an issue of bandwidth, according to ESnet's Dart. "We did a lot of testing and tuning," he said.

ESnet is managed by Lawrence Berkeley National Laboratory (LBNL). Other contributors to the demo were Joe Insley of Argonne National Laboratory (ANL), who generated the images from the data, and Eric Olson, also of Argonne, who was responsible for the composition and imaging software. Network engineers Linda Winkler and Loren Wilson of ANL and Thomas Hutton of SDSC worked to set up and tune the network and servers before moving the demonstration to SC09. The project was a collaboration between ANL, Calit2, ESnet, the National Institute for Computational Sciences, Oak Ridge National Laboratory and SDSC.

For more information about computing sciences at the Lawrence Berkeley National Laboratory, please visit: www.lbl.gov/cs

  • Spectra Logic continues its proven record of success supporting federal, state and local government agencies, ranking in the top 10 percent of government GSA contractors for the third year in a row.

  • Exemplifying this success, Spectra’s Federal sales comprised more than 20 percent of overall company revenue in 2009.

Spectra Logic today announced that it ranked in the top ten percent of U.S. General Services Administration (GSA) information technology (IT) Schedule 70 contractors for 2009. This is the third consecutive year Spectra Logic has ranked as a top vendor based on annual revenues of pre-approved GSA Schedule 70 IT products and services purchased by federal, state and local government agencies. Spectra Logic’s Federal sales division has a proven record of success supporting government organizations, and its sales comprise more than 20 percent of overall company revenue.

"Federal, state and local government agencies want backup and archive solutions that can easily handle large, fast-growing data volumes and high data availability, while helping to deliver greener IT environments that use less energy and minimize floor space," said Brian Grainger, vice president of worldwide sales, Spectra Logic. "Spectra Logic’s solutions are ideally suited for the government market – from high density, energy-efficient tape libraries to disk-based deduplication appliances that reduce stored data volumes."

Spectra Logic added several new products and services to the GSA schedule in 2009, including the Spectra T-Finity enterprise tape library, the Spectra T680 mid-range tape library, Spectra’s disk-based nTier Deduplication product line, media, backup application software and TranScale upgrade service options. Spectra Logic’s archive and backup products have been listed on GSA Schedule 70 since 2003 under GSA contract number GS-35F-0563K.

“The high-capacity Spectra T-Finity tape library enables large enterprise-class organizations to protect, archive and quickly access petabytes of classified and unclassified data,” said Mark Weis, director of federal sales, Spectra Logic. “T-Finity’s inclusion on the GSA Schedule 70 simplifies the purchasing process for our federal, state and local government customers.”

GSA establishes long-term government-wide contracts with commercial firms to provide access to more than 11 million commercial products and services that can be ordered directly from GSA Schedule contractors. The Information Technology Schedule 70 (a Multiple Award Schedule) grants agencies direct access to commercial experts who can thoroughly address the needs of the government IT community through 20 Special Item Numbers (SINs). These SINs cover most general-purpose commercial IT hardware, software and services.

In addition to GSA, Spectra Logic’s products are also listed on several Government Acquisition Contracts including ITES, NETCENTS and SEWP.

The Department of Science and Technology (DST) and Council for Scientific and Industrial Research (CSIR) have announced that work on the national backbone network of the South African National Research Network (SANReN) has been completed ahead of schedule.

SANReN forms a crucial part of the national cyber infrastructure initiative funded by the DST. As part of this national cyber infrastructure, SANReN’s powerful network capabilities support projects of national importance.

The CSIR’s Meraka Institute is responsible for the implementation of the DST’s cyber infrastructure initiative which, in addition to SANReN, comprises the Centre for High Performance Computing (CHPC) and the proposed very large datasets data storage initiative.

The CSIR contracted Telkom for the installation of the national backbone network in July 2009.

The Minister of Science and Technology, Mrs Naledi Pandor, says, “The completion of the national backbone network is an important milestone. The network will greatly reduce the cost of bandwidth for all research and higher education institutions in the country. For the first time, South African researchers will have world-class networking enabling them to collaborate nationally and with their international peers. This positions South Africa internationally as a player in global science efforts. It also makes it possible to harness South Africa’s full research and development capacity to address national priority issues, including health, food security and understanding and mitigating the effect of climate change.”

“Bandwidth abundance resulting from SANReN’s networking of universities will shape the growth and development of a new generation of students whose knowledge and skills will contribute to the goal of creating an inclusive information society, enabling socio-economic benefits through information and communications technology and broadband specifically. In turn these advances on the scientific front will contribute to the competitiveness of local industry through the scientific breakthroughs achieved and through the establishment of a world-class national cyber infrastructure.”

The Director-General of the DST, Dr Phil Mjwara, says, “The broadband connectivity provided by SANReN will allow reciprocal participation between South Africa and international research institutions. It will give the global research community access to facilities such as the Southern African Large Telescope and the Karoo Array Telescope (also known as MeerKAT), and allow South Africa to participate in international projects with the European Organisation for Nuclear Research, among others. This milestone will further demonstrate South Africa’s readiness to host the Square Kilometre Array (SKA) radio telescope, for which the country is currently bidding.”

“We have entered a new era in research networking made possible by the vision and funding of the DST. Unlocking its potential will undoubtedly benefit South Africa’s research community as our researchers are now able to engage in meaningful online collaboration with peers locally and abroad,” says CSIR President and CEO, Dr Sibusiso Sibisi.

“We would like to commend Telkom for the work done on the SANReN national backbone and for delivering well within the agreed deadline.”

Says Godfrey Ntoele, Telkom's Managing Executive for Medium and Large Business Services, “Telkom, the CSIR and the Meraka Institute are satisfied with the pace at which we are jointly proceeding with the SANReN project. We are happy that all elements are on track and Telkom remains committed to delivering on all aspects of this initiative so that the national agenda of attaining high technology connectivity at our academic institutions is realised in order to promote skills development in our country.”

The national backbone now interconnects the metros of Tshwane, Johannesburg, Mangaung, Cape Town, Nelson Mandela Bay and eThekwini on a ten-gigabit-per-second fibre optic ring network.

The tertiary education network has acquired international bandwidth from Seacom, which can now be distributed via the SANReN national backbone network. Seacom is a 1.28 terabit-per-second, 17,000 km long submarine fibre optic cable system linking southern and East Africa to global networks via India and Europe. This development bodes well for South Africa’s ability to tackle bandwidth-hungry projects such as the Square Kilometre Array (SKA).

The first phase of SANReN will connect 50 higher education and research institutions to the network and in the longer term SANReN aims to connect all research and higher education institutions in the country.

Leading UML modelling tool vendor supports seminal workshop on interoperability best practice.

 

Sparx Systems (www.sparxsystems.com) will sponsor the CEN/TC 287 Workshop on Interoperability Best Practice, to be held on 14th September, 2010 in St. George's Bay, Malta. The workshop will be held in conjunction with the 27th plenary meeting of CEN/TC 287. Enterprise Architect, Sparx Systems' flagship modeling tool, is extensively deployed by the global geospatial community for the development of industry reference models.

CEN/TC 287 is concerned with standardization in the field of digital geographic information for Europe, a task that is undertaken through close co-operation with ISO TC/211 and the Open Geospatial Consortium Inc (OGC).

Commenting on the workshop, Ken Harkin, Business Development Manager for Sparx Systems, said, "The identification of interoperability reference material, which can be captured in a central repository and made available to future EU geospatial projects as best practices, is essential to ongoing results improvement. As a technology provider to the global standards community, Sparx Systems supports this initiative and is pleased to sponsor the workshop."

He added, "CEN/TC 287, through maintenance of key project deliverables and components, will build on existing geospatial standards and in doing so will play a pivotal role in the concerted adoption of best practices and in shaping the evolution of the industry."

Martin Ford, CEN/TC 287 Secretary, noted, "The significance of this workshop, which is open to the public, is highlighted by an impressive line-up of industry thought leaders. These speakers will share their ideas, experience and insight on strategies to use or build upon standards and specifications in the domain of geographic information. CEN/TC 287 is very appreciative of Sparx Systems' support in making this important discussion possible."

Further information about the CEN/TC 287 Interoperability Workshop is available from: www.gistandards.eu

Scientists are planning to use the largest supercomputers to simulate life on Earth, including the financial system, economies and whole societies. The project is called the "Living Earth Simulator" and is part of a huge EU research initiative named FuturIcT.

Supercomputers are already being used to explore complex social and economic problems that science can understand in no other way. For example, ETH Zurich's professor for transport engineering Kay Axhausen is simulating the travel activities of all 7.5 million inhabitants of Switzerland to forecast and mitigate traffic congestion. Other researchers at the ETH -- all working within its Competence Center for Coping with Crises in Complex Socio-Economic Systems (CCSS) -- are mining huge amounts of financial data to detect dangerous bubbles in stock and housing markets, potential bankruptcy cascades in networks of companies, or similar vulnerabilities in other complex networks such as communication networks or the Internet.

Coping with Crises

In the past, supercomputers have been used mainly in physics or biology, or for difficult engineering problems such as the construction of new aircraft. But now they are increasingly being used for social and economic analyses, even of the most fundamental human processes. At the CCSS, for example, Lars-Erik Cederman uses large-scale computer models to study the origin of international conflict, and is creating a large database documenting the geographic interdependencies of civil violence and wars in countries such as the former Yugoslavia or Iraq. In sociology, simulations at the CCSS have explored the conditions under which cooperation and solidarity can thrive in societies. They show that the crust of civilization is disturbingly vulnerable. These simulations reveal common patterns behind breakdowns of social order in events as diverse as the war in former Yugoslavia, lootings after earthquakes or other natural disasters, or the recent violent demonstrations in Greece.

Social Super-Computing

The CCSS, particularly the Financial Crisis Observatory led by Didier Sornette, is currently the biggest shareholder of ETH Zurich's Brutus supercomputing cluster, the 88th fastest computer in the world and ranked 10th in Europe. Social supercomputing is also a new focus of other renowned research centres such as the Los Alamos National Laboratory and the Brookings Institution in the United States. Such simulations, researchers now widely recognize, represent the best chance to gain insight into highly complex problems ranging from traffic flows to evacuation scenarios of entire cities or the spreading of epidemics. Independent projects in the United States and in Europe have already embarked on efforts to build simulations of the entire global economy.


The FuturIcT project aims to bring many efforts of this kind together in order to simulate the entire globe, including all the diverse interactions of social systems and of the economy with our environment. The concept for the project has already been deeply explored within several European research projects.

Large-Scale Data Mining

Complementary to large-scale computer simulations, the FuturIcT project also aims to gather and organise data on social, economic and environmental processes on an unprecedented scale, especially by augmenting the results of field studies and laboratory experiments with the overwhelming flood of data now resulting from the world wide web or massive multi-player online worlds such as Second Life. Furthermore, the rapid emergence of vast networks of distributed sensors will make data available on an almost unimaginable scale for direct use in computer simulations. At the same time, an ethics committee and targeted research will ensure that these data will be explored in privacy-respecting ways and not misused. The goal is to identify statistical interdependencies when many people interact, but not to track or predict individual behaviour.

Crises Observatories

In a practical sense, the scientists behind the FuturIcT project foresee the development of crises observatories and decision-support systems for politicians and business leaders. "Such observatories would detect advance warning signs of many different kinds of emerging problems," says Dirk Helbing, "including large-scale congestion, financial instabilities, the spreading of diseases, environmental change, resource shortages and social conflicts." The FuturIcT project led by him aims to put the power of today's information technology to work in creating the tools needed to address the challenges of humanity in the future, and to ensure social and economic well-being around the globe.

Economic and Policy Opportunities

George Soros, who has established the Institute of New Economic Thinking (INET) with an endowment of 50 million dollars, has welcomed the initiative and writes: "The team of scientists that Dr. Helbing has gathered together can, I believe, make a significant contribution to the understanding of the evolution and change in societies as they meet the formidable issues of governance, climate change, sustainable economic balance that we are all faced with in the coming decades."

http://www.futurict.eu
