Modern high-throughput screening methods for analysing genetic information, proteins and metabolic products offer new ways of obtaining large quantities of data on life processes. These OMICS technologies, as they are known, are fuelling hopes of major advances in medicine, pharmacy, biochemistry, the food sciences and related fields. However, the German National Academy of Sciences Leopoldina has expressed concerns that Germany is failing to keep abreast of these developments. The Report on Tomorrow's Science entitled "Life sciences in transition" sets out six recommendations on how existing deficiencies can be overcome and research and teaching better equipped for the challenges of modern life sciences.

OMICS technologies help today's life scientists collect very large quantities of data on the genetic material, proteins and metabolic products of organisms. The extensive data will pave the way for new research approaches in developing individualised therapies, more productive crops and tailored microorganisms for applications such as cosmetics, medicine and food production. So far, however, the full potential of these enormous data stocks has hardly been exploited. The "Life sciences in transition" report therefore recommends closer cooperation between the life sciences and disciplines such as mathematics, computer science and engineering in handling big data.

"The opportunities now open to the life sciences are redefining requirements for the training of young researchers, for technical and IT equipment, networks between university and non-university research institutes, and the sustainable development of infrastructure," says Prof. Jörg Hacker, President of the Leopoldina. According to the report, Germany is not sufficiently prepared for the challenges that are emerging in the life sciences. The recommendations include setting up a national OMICS and IT infrastructure and providing targeted support to young researchers in this field.

The Report on Tomorrow's Science issued by the German National Academy of Sciences Leopoldina addresses topics connected to the medium and long-term development of science that are particularly relevant to the relationship between science, politics and society. The paper was produced by one of the Leopoldina's standing committees. Leopoldina committees help shape current scientific discussion in their specific field, advise on topics that are important for the future, and propose topics for discussion with policymakers and society.

The Report on Tomorrow's Science "Life sciences in transition" is available online at http://www.leopoldina.org/en/report-on-tomorrow's-science

In 1997, IBM's Deep Blue computer beat chess wizard Garry Kasparov. This year, a computer system developed at the University of Wisconsin-Madison achieved something far more complex: it equaled or bested scientists at the painstaking task of extracting data from scientific publications and placing it in a database that catalogs the results of tens of thousands of individual studies.

"We demonstrated that the system was no worse than people on all the things we measured, and it was better in some categories," says Christopher Ré, who guided the software development for a project while a UW professor of computer science. "That's extremely exciting!"

The development, described in the current issue of the journal PLOS ONE, marks a milestone in the quest to rapidly and precisely summarize, collate and index the vast output of scientists around the globe, says first author Shanan Peters, a professor of geoscience at UW-Madison.

Chess, however complex, is built on rigid rules; in any given situation, only certain moves are legal. The rules for scientific publication are less exact, and so extracting structured information from publications is a challenge for both humans and machines.

Peters and colleagues set up the face-off between PaleoDeepDive, their new machine reading system, and data that scientists had manually entered into the Paleobiology Database. This repository, compiled by hundreds of researchers, is the destination for data from paleontology studies funded by the National Science Foundation and other agencies internationally.

The knowledge produced by paleontologists is fragmented into hundreds of thousands of publications. Yet many research questions require what Peters calls a "synthetic approach: for example, how many species were on the planet at any given time?"

Despite 16 years of effort, the Paleobiology Database remains incomplete and a large amount of hard-earned field data remains locked in publications. Was it possible to automate and accelerate the process?

Teaming up with Ré, who is now at Stanford University, and UW-Madison computer science professor Miron Livny, the group built on the DeepDive machine reading system and the HTCondor distributed job management system to create PaleoDeepDive. "We were lucky that Miron Livny brought the high throughput computing capabilities of the UW-Madison campus to bear," says Peters. "Getting started required a million hours of computer time."

Much like the people who assembled the Paleobiology Database, PaleoDeepDive inhales documents and extracts structured data, such as species names, time periods, and geographic locations. "We extracted the same data from the same documents and put it into the exact same structure as the human researchers, allowing us to rigorously evaluate the quality of our system, and the humans," Peters says.
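 
As a rough illustration of the kind of structured record both the human curators and the machine system aim to produce, the sketch below defines a hypothetical occurrence record in Python; the field names are assumptions made for illustration, not the actual Paleobiology Database schema or PaleoDeepDive code.

    from dataclasses import dataclass

    @dataclass
    class FossilOccurrence:
        """Illustrative record; the field names are assumptions, not the
        actual Paleobiology Database schema."""
        taxon: str            # e.g. "Tyrannosaurus rex"
        interval: str         # geologic time period, e.g. "Late Cretaceous"
        location: str         # geographic locality, e.g. "Alberta, Canada"
        source_document: str  # identifier of the publication the fact came from

    # The kind of tuple a reader, human or machine, would enter:
    occurrence = FossilOccurrence(
        taxon="Tyrannosaurus rex",
        interval="Late Cretaceous",
        location="Alberta, Canada",
        source_document="doc_00042",
    )
    print(occurrence)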

Many organizations, including IBM and Google, are trying to extract meaning from natural language, but Ré says, "The thing that is different here is that we decided to pivot and look at the scientific literature, where the language is cleaner."

Instead of trying to divine the single correct meaning from any body of copy, the tactic was "to look at the entire problem of extraction as a probabilistic problem," says Ré, who credits much of the heavy lifting to UW-Madison Ph.D. candidate Ce Zhang. "People had done pieces of that, but not the entire problem, end to end. This was the DeepDive advance."

Ré imagines a study containing the terms "Tyrannosaurus rex" and "Alberta, Canada." Is Alberta where the fossil was found, or where it is stored? Did the finder work there? Did the study actually focus on a fossil related to T. rex? Computers often have trouble deciphering even simple-sounding statements, Ré says. "We take a more relaxed approach: There is some chance that these two are related in this manner, and some chance they are related in that manner."
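 
To make that "relaxed," probabilistic approach concrete, here is a minimal Python sketch, not PaleoDeepDive's actual code, in which every candidate pairing of a taxon mention and a place mention is kept with a probability for each possible relation rather than forced into a single hard decision; the toy scoring function is a stand-in assumption for the statistical inference DeepDive performs.

    import itertools

    # Mentions found in one document (assumed output of an upstream tagger).
    taxa = ["Tyrannosaurus rex"]
    places = ["Alberta, Canada"]
    relations = ["found_in", "stored_in", "author_affiliation"]

    def score(taxon, place, relation, sentence):
        """Stand-in for DeepDive-style statistical inference: return a
        probability that (taxon, relation, place) holds given the text.
        Here it is a crude keyword heuristic, purely for illustration."""
        cues = {
            "found_in": ("collected", "recovered", "locality"),
            "stored_in": ("housed", "museum", "collection"),
            "author_affiliation": ("university", "department"),
        }
        hits = sum(1 for cue in cues[relation] if cue in sentence.lower())
        return min(0.9, 0.1 + 0.3 * hits)  # bounded pseudo-probability

    sentence = "Specimens were collected from the locality in Alberta, Canada."
    candidates = []
    for taxon, place, rel in itertools.product(taxa, places, relations):
        candidates.append((taxon, rel, place, score(taxon, place, rel, sentence)))

    # Every candidate is retained with its probability; nothing is decided early.
    for taxon, rel, place, p in sorted(candidates, key=lambda c: -c[3]):
        print(f"P({taxon} --{rel}--> {place}) = {p:.2f}")

In this framing, downstream steps can weigh all candidate facts together, which is what lets the system revise its conclusions as new documents and feedback arrive.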

In these large-data tasks, PaleoDeepDive has a major advantage, Peters says. "Information that was manually entered into the Paleobiology Database by humans cannot be assessed or enhanced without going back to the library and re-examining original documents. Our machine system, on the other hand, can extend and improve results essentially on the fly as new information is added. It can also extract related information that may not have been in the original database, but that is critical to tackling new science questions, and do so on a huge scale."

Further advantages can result from improvements in the computer tools. "As we get more feedback and data, it will do a better job across the board," Peters says. "There are, potentially, systematic and wholesale improvements to the quality of all of the data."

Jacquelyn Crinion, assistant director of licensing and acquisitions services at the UW-Madison General Library System, says the volume of scientific papers the project downloaded from publishers threatened to create logjams in document delivery. "Publishers are not going to complain about usage, but about how hard their system is getting hit." Eventually, Elsevier gave the UW-Madison team broad access, allowing 10,000 downloads per week.

As text- and data-mining takes off, Crinion says the library system and publishers will adapt. "Elsevier is very interested in this project; they see it as the future, and it might allow them to develop new products and ways to deliver service. The challenge for all of us is to provide specialized services for researchers while continuing to meet the core needs of the vast majority of our customers."

The Paleobiology Database has already generated hundreds of studies about the history of life, Peters says. "It's a very good example of the added scientific value provided by synthetic databases, where the whole truly is greater than the sum of its individual data parts."

Peters notes that many fields are being challenged to optimize usage of old findings and make streams of new data readily accessible.

Paleontology and geology are inseparably linked through the role that fossils have played in characterizing geologic sequences, Peters notes. "Ultimately, we hope to have the ability to create a computer system that can do almost immediately what many geologists and paleontologists try to do on a smaller scale over a lifetime: read a bunch of papers, arrange a bunch of facts, and relate them to one another in order to address big questions."

DataScience@SMU will address the growing need for data scientists by preparing professionals who can work across a range of industries.

Southern Methodist University (SMU) will offer a Master of Science in Data Science program designed to meet the growing demand for data scientists across many industries. DataScience@SMU is delivered in partnership with 2U, Inc., a leading provider of online education solutions for top-tier nonprofit colleges and universities.

DataScience@SMU features an interdisciplinary curriculum that draws from three SMU schools: Dedman College of Humanities and Sciences, Lyle School of Engineering, and Meadows School of the Arts. With classes and content designed and delivered by SMU faculty, DataScience@SMU includes coursework in computer science, statistics, strategic behavior, and data visualization.

The first classes for this online degree program are scheduled to begin in January 2015.

"We look forward to building an innovative online data science program that will prepare students to effectively manage and analyze data as well as communicate and visualize findings," said SMU Provost and Vice President for Academic Affairs Paul W. Ludden. "Our partnership with 2U will allow SMU to offer our top-tier academics to professionals working around the world."

DataScience@SMU is designed with working professionals in mind. Live classes will be held weekly in a virtual classroom and will be kept small to facilitate group work and collaborative discussion. Coursework and lectures will be available 24/7 and accessible to enrolled students no matter where they are located. As with all 2U-powered programs, DataScience@SMU will feature faculty-designed coursework, including hands-on, project-based assignments and highly produced recorded lectures.

"There is no question that the field of data science has emerged rapidly and that it continues to evolve. New technologies and vast amounts of data are shifting the way organizations operate, and there is more need than ever for professionals who can work with that data to solve problems," said Chip Paucek, 2U CEO and co-founder. "We are proud to support SMU's commitment to the field of data science and to serve their students through our technology and dedicated support wherever they may be located."

DataScience@SMU students will complete a total of 30 credits of core coursework in statistics, data mining and data visualization, and a 1-credit, hands-on, in-person immersion experience on the SMU campus in Dallas.

SMU's online Master of Science in Data Science program will begin accepting applications late summer 2014 for classes beginning in January 2015. For more information, visit http://2u.com/partners/smu.

Missouri University of Science and Technology is one of 15 schools selected by the Anita Borg Institute (ABI), a nonprofit organization focused on advancing women in computing, and Harvey Mudd College to participate in a new program designed to increase the percentage of undergraduate computer science majors who are female or students of color.

Through the Building Recruiting and Inclusion for Diversity (BRAID) initiative, Missouri S&T will receive $30,000 per year for three years to implement programs that will help attract women and underrepresented minorities to the computer science program at Missouri S&T.

The 15 colleges and universities selected for the BRAID initiative have committed to implementing a number of approaches that have demonstrated success at Harvey Mudd College and other institutions with diverse computer science programs. Those approaches include expanding outreach to high school teachers and students, modifying introductory computer sciences courses to make them more appealing and less intimidating to students from underrepresented groups, building community among underrepresented students, and developing joint majors in areas like computer science and biology to encourage interdisciplinary approaches.

Currently, 7 percent of Missouri S&T’s undergraduate computer science students are female, which is below the national average of 17 percent.

“We are very excited with the opportunity to partner with Anita Borg Institute to make a difference in the diversity of undergraduate computer science majors at Missouri S&T,” says Dr. Sajal Das, the Daniel C. St. Clair Endowed Chair and professor and chair of computer science at Missouri S&T. “We have an ambitious goal and a strategic plan to increase the female undergraduate enrollment from 7 percent to 20 percent in the next three to four years. We are committed to work hard to achieve this goal through a multitude of ways.”

This fall, Missouri S&T revamped its Introduction to Programming courses to feature assignments that focus on more exciting, contemporary, real-world problems that include domains in the natural sciences, humanities and social sciences. That model will be expanded to other courses in the department over the next two years.

“Our students will learn and practice firsthand the importance of using problems of an interdisciplinary nature that have societal relevance to teach computer science,” Das says.

He says the department also plans to build confidence and a sense of community among women in the department by hosting monthly events through the Association for Computing Machinery-Women’s chapter at Missouri S&T. Planned activities include social events, field trips and mentoring opportunities.

In addition, Dr. Jennifer Leopold, associate professor of computer science and associate chair for undergraduate studies and outreach, will visit high schools in urban and rural areas of Missouri and neighboring states to talk to students about computer science education, research and careers, and to encourage female students to consider Missouri S&T.

Missouri S&T and the other participating universities will provide data for a research study documenting the progress made across departments. That study will be led by faculty from the University of California, Los Angeles Graduate School of Education and Information Studies.

Each participating university is encouraged to send underrepresented students and computer science faculty to the annual Grace Hopper Celebration of Women in Computing. Known as the world’s largest gathering of women technologists, the conference was held Oct. 8-11 in Phoenix, Arizona. Missouri S&T is a silver-level sponsor of the conference. Leopold accompanied a group of female Missouri S&T computer science students to the conference.

Former Secretary of State Hillary Clinton announced the BRAID initiative during her address at the 2014 Clinton Global Initiative (CGI) Annual Meeting. BRAID is supported by three-year funding commitments from Facebook, Google, Intel and Microsoft. Maria Klawe, president of Harvey Mudd, and Telle Whitney, president and CEO of ABI, will lead the BRAID initiative.

In addition to Missouri S&T, universities participating in the BRAID initiative are Arizona State University, New Jersey Institute of Technology, University of California-Irvine, University of Illinois-Chicago, University of Maryland-Baltimore County, University of Maryland-College Park, University of Nebraska-Lincoln, University of North Texas, University of Rochester, University of South Carolina, University of Texas-El Paso, University of Vermont, University of Wisconsin-Milwaukee and Villanova University.

College will offer 12 degrees in computer, information sciences, and health care fields

Regis University is making a move to meet the growing demands of the expanding computer industry by opening a new College of Computer & Information Sciences in fall 2014.

Regis University offers programs that cater to busy professionals, including flexible options for obtaining bachelor's degrees, master's degrees, or academic certificates in these areas of study:

    --  Computer Science
    --  Computer Networking
    --  Computer Information Systems
    --  Health Care Informatics and Information Management
    --  Business Technology Management
    --  Database Technologies
    --  Information Assurance
    --  Software Engineering
    --  Systems Engineering
    --  Data Sciences

The College of Computer & Information Sciences is the first college in Colorado dedicated to the field. Currently, 1,950 students are enrolled in computer science-related fields at Regis. Regis also provides graduate certificates in information assurance; information assurance policy and management; agile technologies; cybersecurity; mobile software development; software engineering; systems engineering; health information management; and software engineering and database technologies.

The computer and information technology field is projected to add more than 800,000 jobs by 2020, according to the Bureau of Labor Statistics. By 2018, it is estimated there will be 2.4 million STEM (science, technology, engineering and math) job openings. Additionally, health care professionals will benefit from degrees at the intersection of health informatics and information management, a growing field of study and an increasing requirement for health care agencies.

"We are thrilled to announce this news and excited about the enhanced role we will play in positioning students for success in this fast-growing field. By creating a dedicated college of study, we can expand upon our already strong leadership and offerings of computer information and sciences courses and continue to meet the needs of this rapidly changing industry," said Father John P. Fitzgibbons, S.J., president of Regis University.

The new college opens fall 2014, and students can enroll now. It will be housed primarily at Regis' northwest Denver campus off 50th and Lowell. The university is in the process of hiring a founding academic dean.

Regis pioneered an innovative, dedicated online learning cloud platform that provides constant access to the latest software and technology, ensures continued educational evolution and enhances student learning outcomes. These efforts enable students to focus on a big-picture approach to the field while equipping them to effect real change at their organizations.

Regis is a National Center of Academic Excellence in Information Assurance Education as designated by the National Security Agency and the Department of Homeland Security.  The university engages industry experts, government and academia to prepare thousands of new students for careers in computer information and science industries.

Regis University, inspired by its Jesuit Catholic heritage, has been committed to academic excellence, personal development and community engagement since its founding in 1877. Based in Denver, Colo., Regis provides a values-based liberal arts education offering professional certifications and degree programs at the undergraduate, graduate and doctoral levels.

To learn more about Regis University, please visit www.regis.edu.
