Molecular Data Analysis Using R, published by the US publisher Wiley Blackwell, was written with two audiences in mind: researchers working in molecular biology laboratories who intend to analyze their experimental data in a modern statistical environment, and bioinformaticians who wish to understand the experimental methods behind the data they work with.

This book addresses the difficulties that wet-lab researchers experience with the statistical analysis of molecular biology data. The authors explain how to use R and Bioconductor for the analysis of experimental data in the field of molecular biology. The content is based on two university courses for bioinformatics and experimental biology students (Biological Data Analysis with R and High-throughput Data Analysis with R), and the material is divided into chapters according to the experimental methods used in the laboratories.

Key features include:
• Broad appeal--the authors target their material at researchers at several levels, ensuring that the basics are always covered.
• The first book to explain how to use R and Bioconductor for the analysis of several types of experimental data in the field of molecular biology.
• Focuses on R and Bioconductor, which are widely used for data analysis. One great benefit of R and Bioconductor is the vast user community, the very active discussion around them, and the practice of sharing code. Furthermore, R is the platform on which many new analysis approaches are first implemented, so novel methods become available to R users early.

Big data for the molecular biology lab: new educational resource from Finland

"In this work we used research data produced from molecular biology experiments conducted at the University of Tampere," says the author Chaba Ortutay. He continues: The content of the book uses our experiences with constant interactions with students at our courses delivered at different universities in Finland, Ireland, Hungary; and on our online platform, worldwide. The feedback from these students were indispensable for writing and finalizing our book."

Detailed information on Wiley’s website: http://eu.wiley.com/WileyCDA/WileyTitle/productCd-1119165024.html

Our Galaxy's gravitational field limits the accuracy of astrometric observations of distant objects. This is most clearly seen for objects that are visually located behind the central regions of the Galaxy and the Galactic plane, where the deviation can reach several dozen microarcseconds. More importantly, the effect of this gravitational "noise" cannot be removed. This means that at some point it will no longer be possible to improve the accuracy of determining the positions of reference objects, which are used to define the coordinates of all other sources.

The results of the study have been published in The Astrophysical Journal.

It is widely known that our planet Earth and the Solar System itself are in the depths of the Milky Way, and it is through this galaxy that we look out onto the Universe. As it turns out, this fact is no small matter in astrophysical studies.

How strong an effect can our Galaxy's gravitational field and its non-uniformity have on the accuracy of determining the coordinates of distant - extragalactic - objects? A group of Russian astrophysicists from the Astro Space Center (ASC) of P.N. Lebedev Physical Institute, the Space Research Institute of the RAS, MIPT, and the Max-Planck-Institut für Astrophysik (Germany) attempted to find an answer to this question.

Proper motions, angular sizes, and trigonometric parallaxes (apparent displacements) of celestial bodies, including stars, are the basic parameters for solving many astrophysical problems. These parameters are determined by astrometric techniques, and to calculate the position or radial velocity of a star, for example, a coordinate system is needed against which to measure them. All of the coordinate systems currently in use, including the International Celestial Reference Frame (ICRF), are based on the coordinates of several hundred "defining" extragalactic sources. Quasars and distant galaxies are ideal reference points for defining the celestial reference frame, as their angular motion is very small - around one-hundredth of a milliarcsecond (compared with the apparent diameter of the Moon, for example, which is a little more than 31 arcminutes).

An arcsecond is a unit used in astronomy to measure small angles; it is the second of a plane angle. In the same way that an hour of time is divided, a degree of angle is divided into 60 minutes, and each minute into 60 seconds.
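To put these angular scales side by side, here is a minimal unit-conversion sketch (purely illustrative, not from the study); the Moon and quasar figures are the ones quoted above, and the 3-microarcsecond jitter is the high-latitude estimate cited later in the text:

```python
# Minimal unit-conversion sketch (illustrative only, not from the study):
# a degree is 60 arcminutes, an arcminute is 60 arcseconds,
# and a microarcsecond (uas) is one millionth of an arcsecond.

UAS_PER_ARCSEC = 1_000_000
UAS_PER_ARCMIN = 60 * UAS_PER_ARCSEC
UAS_PER_MAS = 1_000                      # milliarcsecond -> microarcseconds

moon_diameter = 31 * UAS_PER_ARCMIN      # ~1.9e9 uas, the Moon's apparent diameter
quasar_motion = 0.01 * UAS_PER_MAS       # ~10 uas, typical quasar angular motion
galactic_jitter = 3                      # uas, high-latitude jitter quoted below

print(f"Moon's apparent diameter: {moon_diameter:.2e} uas")
print(f"Quasar angular motion:    {quasar_motion:.0f} uas")
print(f"Gravitational jitter:     {galactic_jitter} uas over ~10 years")
```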

Astrophysical instrumentation is developing rapidly, and it is expected that the accuracy of radio interferometric observations will soon reach 1 microarcsecond, and the accuracy of optical observations 10 microarcseconds per year. However, with this level of accuracy comes a new challenge: the general theory of relativity, and in particular the deflection of a light beam moving through a gravitational field, interferes with the observations.

When a light beam from a distant source passes close to any object, it is slightly deflected by the gravity of the latter. This deviation is typically very small, but if the beam encounters several such objects on its path, the cumulative deviation may be significant. In addition, because these objects are moving, the beam deflection angle changes over time and the source coordinates start to "jitter" around their true values. It is important to note that this coordinate "jittering" affects all distant sources, including those that are used as reference points for different coordinate systems.
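The deflection described here is the standard weak-field lensing angle, α = 4GM/(c²b), where M is the mass of the deflector and b the impact parameter of the ray. The short sketch below (textbook constants, not code from the study) shows how tiny a single stellar deflection is, and hence why it is the accumulation over many moving deflectors that matters:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
PC = 3.086e16        # parsec in metres
RAD_TO_UAS = 180 / math.pi * 3600 * 1e6   # radians -> microarcseconds

def deflection_uas(mass_kg, impact_parameter_m):
    """Weak-field light deflection angle, alpha = 4GM / (c^2 b), in microarcseconds."""
    return 4 * G * mass_kg / (C**2 * impact_parameter_m) * RAD_TO_UAS

# Deflection of a background ray by a single solar-mass star:
for b_pc in (0.1, 1.0, 10.0):
    print(f"b = {b_pc:5.1f} pc -> {deflection_uas(M_SUN, b_pc * PC):.3f} uas")

# Each individual deflection is far below a microarcsecond, but a ray crossing
# the Galactic plane passes many such stars, and the deflectors move, so the
# summed shift drifts with time - the "jitter" described in the text.
```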

"In attempting to improve the accuracy of implementing the coordinate reference system, we reach a limitation that cannot be bypassed by improving the accuracy of the detecting instruments. In fact, there is a gravitational noise, which makes it impossible to increase the accuracy of implementing a coordinate system above a certain level," says Alexander Lutovinov, a professor of the RAS, the head of laboratory of the Space Research Institute of the RAS, and a lecturer at MIPT.

The researchers set out to estimate how much of an effect gravitational noise can have on observations. The calculations were based on modern models of the Galactic matter distribution. Two-dimensional "maps" of the entire sky were built for each model, showing the standard deviation of the angular shifts in the positions of distant sources with respect to their true positions.

"Our calculations show that over a reasonable observational time of around ten years, the value of the standard deviation of shifts in positions of sources will be around 3 microarcseconds at high galactic latitudes, rising to several dozen microarcseconds toward the Galactic center," says Tatiana Larchenkova, a senior researcher at the ASC of P.N. Lebedev Physical Institute. "And this means that when the accuracy of measurements in absolute astrometry reaches microarcseconds, the "jittering" effect of reference source coordinates, which is caused by the Galaxy's non-stationary field, will need to be taken into account."

The scientists investigated the properties of this gravitational noise that, in the future, will enable the noise to be excluded from observational data. They also demonstrated that the "jittering" effect of the coordinates can be partially compensated by using mathematical methods. 

CAPTION A map showing the characteristic values of the "jittering" of source coordinates around their true position caused by the Galaxy's "gravitational noise", in microarcseconds (shown in contours), for a ten-year observation period. The crosses represent the positions of ICRF reference sources.

The Oscar-nominated film “Hidden Figures” has a “revolutionary” impact in promoting STEM to women and minorities, according to Virginia Booth Womack, director of the Minority Engineering Program at Purdue University.

Womack, previously a practicing engineer for several years, said the movie shows the importance of mathematics to our society, and it presents the discipline to young people in a way that makes it exciting.

“This movie was very influential in showing women and minorities that there is no limit to what is possible if you believe in yourself,” Womack said. “Belief and perseverance were the power of the movie, in spite of seemingly unsurmountable obstacles.”

“Hidden Figures,” nominated for an Oscar for Best Motion Picture, recounts the stories of three black female mathematicians and their work at NASA in launching John Glenn into orbit in 1962.

Womack said “Hidden Figures” shows the collaborative nature of the STEM fields - science, technology, engineering and mathematics.

“Math was not just presented as a lot of redundant problems to solve, but as the gateway to science, engineering and technology,” she said. “There were clear applications of mathematics in redesigning the capsule of the spacecraft to improve aerodynamics and structural integrity, operating the IBM systems, as well as bringing the spacecraft safely back into earth’s atmosphere.”

She noted the movie demonstrates the importance of overcoming issues of racism and discrimination.

“This movie is a timely and powerful portrayal of critical issues of racism and sexism in recent American history that, left unaddressed, would have severely limited America’s competitiveness in space,” Womack said. “It illustrates that when we move beyond the constraints of discriminatory mindsets, we achieve greatness as a nation.”

The Academy Awards are Sunday (Feb. 26). 

CAPTION This is an illustration of the new allosteric model developed in this study. CREDIT Matthieu Wyart/EPFL

EPFL scientists have created a new supercomputer model that could help improve the design of allosteric drugs, which control proteins "at a distance".

Enzymes are large proteins that are involved in virtually every biological process, facilitating a multitude of biochemical reactions in our cells. Because of this, one of the biggest efforts in drug design today aims to control enzymes without interfering with their so-called active sites -- the part of the enzyme where the biochemical reaction takes place. This "at a distance" approach is called "allosteric regulation", and predicting allosteric pathways for enzymes and other proteins has gathered considerable interest. Scientists from EPFL, with colleagues in the US and Brazil, have now developed a new mathematical tool that allows more efficient allosteric predictions. The work is published in PNAS.

Allosteric drugs

Allosteric regulation is a fundamental molecular mechanism that modulates numerous cell processes, fine-tuning them and making them more efficient. Most proteins contain parts in their structure away from their active site that can be targeted to influence their behavior "from a distance". When an allosteric modulator molecule -- whether natural or synthetic -- binds such a site, it changes the 3D structure of the protein, thereby affecting its function.

The main reason allosteric sites are of such interest to drug design is that they can be used to inhibit or enhance the activity of a protein, e.g. the binding strength of an enzyme or a receptor. For example, diazepam (Valium) acts on an allosteric site of the GABAA receptor in the brain, and increases its binding ability. Its antidote, flumazenil (Lanexat), acts on the same site, but instead inhibits the receptor.

Generally speaking, an allosteric drug can also be used at a lower dose than a drug acting directly on the protein's active site, thus providing more effective treatments with fewer side effects.

Developing an allosteric model

Despite the importance of allosteric processes, we still do not fully understand how a molecule binding on a distant and seemingly unimportant part of a large protein can change its function so dramatically. The key lies in the overall architecture of the protein, which determines what kinds of 3D changes an allosteric effect will have.

The lab of Matthieu Wyart at EPFL sought to address several questions regarding our current understanding of allosteric architectures. Scientists classify these into two types: hinges, which cause scissor-like 3D changes, and shear, which involves two planes moving side by side. Although mechanically clear, these two models do not capture all cases of allosteric effects: certain proteins cannot be classified as having either hinge or shear architectures.

The researchers explored alternative allosteric architectures. Specifically, they looked at the structure of proteins as randomly packed spheres that can evolve to accomplish a given function. When one sphere moves a certain way, this model can help scientists track its structural impact on the whole protein.
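As a rough illustration of that idea (a toy sketch under strong assumptions, not the model published in PNAS), the packed spheres can be stood in for by a two-dimensional harmonic spring network: push on one node as if a ligand had bound there, solve the linear response, and see which distant nodes move the most.

```python
# Toy "random spheres + springs" sketch (illustrative only, not the EPFL model):
# treat sphere centres as nodes, join nearby nodes with harmonic springs,
# push on one node and watch how the displacement propagates through the network.
import numpy as np

rng = np.random.default_rng(0)
n, dim, cutoff = 60, 2, 0.25
pos = rng.random((n, dim))                      # random packing stand-in

# Elastic-network stiffness matrix built from unit springs between close nodes.
K = np.zeros((n * dim, n * dim))
for i in range(n):
    for j in range(i + 1, n):
        d = pos[j] - pos[i]
        r = np.linalg.norm(d)
        if r < cutoff:
            nij = d / r
            block = np.outer(nij, nij)          # spring constant k = 1
            si, sj = slice(i*dim, (i+1)*dim), slice(j*dim, (j+1)*dim)
            K[si, si] += block; K[sj, sj] += block
            K[si, sj] -= block; K[sj, si] -= block

# Apply a small force at a "binding site" node and solve the linear response.
f = np.zeros(n * dim)
f[0:dim] = [1.0, 0.0]                           # poke node 0 along x
u = np.linalg.pinv(K) @ f                       # pseudo-inverse removes rigid-body modes
amplitude = np.linalg.norm(u.reshape(n, dim), axis=1)

# Nodes far from node 0 that still move a lot are candidate "distant" response sites.
far = np.linalg.norm(pos - pos[0], axis=1) > 0.5
print("largest distant response at node", int(np.argmax(amplitude * far)))
```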

Using this approach, the scientists addressed several questions that conventional models do not answer satisfactorily. Which types of 3D "architecture" are susceptible to allosteric effects? How many functional proteins with a similar architecture are there? How can these be modeled and evolved in a supercomputer to offer predictions for drug design?

Using theory and supercomputer power, the team developed a new model that can predict the number of solutions, their 3D architectures, and how the two relate to each other. Each solution can even be printed on a 3D printer to create a physical model.

The model proposes a new hypothesis for allosteric architectures, introducing the concept that certain regions in the protein can act as levers. These levers amplify the response induced by binding a ligand and allow for action at a distance. This architecture is an alternative to the hinge and shear designs recognized in the past. The supercomputational approach can also be used to study the relationship between co-evolution, mechanics, and function, while being open to many extensions in the future.

  • Foot-and-mouth epidemics could be controlled quickly and effectively by rapidly establishing how many animals can be vaccinated per day of an outbreak.

  • University of Warwick researchers simulated future UK outbreaks with a mathematical model, calculating the most successful approach to stopping the spread of infection

  • Strategies using the model could save up to £50 million and spare around 200,000 animals from culling - with the disease eradicated a week sooner than in previous outbreaks

Future outbreaks of foot-and-mouth disease (FMD) can be controlled effectively and quickly with vaccinations - saving millions of pounds and hundreds of thousands of livestock - according to research by the University of Warwick.

Dr Michael Tildesley and Naomi Bradbury from the School of Life Sciences have discovered that a key issue for successfully containing and eradicating an FMD outbreak is to establish how many animals can be vaccinated per day, and to tailor controls accordingly.

Using a mathematical model of the UK farming landscape, Dr Tildesley and colleagues simulated numerous scenarios of infection - at varying levels of severity and speed - calculating the most effective and efficient approaches to stem the spread of the disease.

Many dangerous uncertainties exist when dealing with epidemics like FMD, such as the efficacy of vaccinations, the time it takes for livestock to become immune after receiving vaccines, and the number of vaccine doses available. Uncertainty leads to huge potential losses of both money and livestock.

The Warwick FMD model demonstrates that the major uncertainty to be resolved is how many vaccine doses are available. If this is known, the infection can be contained efficiently - even when faced with all other unknown factors.
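The Warwick model itself is a spatial, farm-level simulation, but the role of daily vaccination capacity can be conveyed with a much cruder SIR-style toy (all rates below are invented for illustration, and immunity is assumed to be immediate and complete, unlike the real uncertainties noted above): the capacity caps how quickly susceptible animals can be protected, which in turn changes how long the outbreak lasts and how many animals are infected.

```python
# Toy SIR-with-vaccination sketch (purely illustrative; not the Warwick farm-level model).
# The daily vaccination capacity caps how fast susceptible animals can be protected;
# for simplicity, immunity is assumed immediate and the vaccine fully effective.

def outbreak(daily_vaccine_capacity, population=1_000_000,
             beta=0.4, gamma=0.15, max_days=730):
    """Return (days until fewer than one infectious animal remains, total infections)."""
    s, i = population - 10.0, 10.0
    total_infected = 10.0
    for day in range(1, max_days + 1):
        new_infections = beta * s * i / population
        recoveries = gamma * i
        vaccinated = min(daily_vaccine_capacity, max(s - new_infections, 0.0))
        s = max(s - new_infections - vaccinated, 0.0)
        i += new_infections - recoveries
        total_infected += new_infections
        if i < 1.0:                       # outbreak effectively over
            return day, total_infected
    return max_days, total_infected

for capacity in (0, 5_000, 20_000, 50_000):
    days, cases = outbreak(capacity)
    print(f"capacity {capacity:>6}/day -> over in {days:3d} days, ~{cases:,.0f} infections")
```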

The 2001 FMD outbreak cost the UK economy an estimated £8 billion and led to the culling of approximately seven million livestock.

By using the Warwick FMD model and establishing in advance what vaccination capacity exists, the UK could save up to £50 million, and around 200,000 animals could be spared from culling in any future epidemic.

Furthermore, with such tailored vaccination, any future outbreak could generally be eradicated almost a week sooner than previous outbreaks were.

Dr Michael Tildesley comments:

"There is always uncertainty in the likely effectiveness of any control strategy for an infectious disease outbreak. However in the case of FMD, if we can accurately determine the daily capacity to vaccinate animals, we can potentially save millions of pounds for the farming industry."

The paper, 'Quantifying the Value of Perfect Information in Emergency Vaccination Campaigns', is published in PLOS Computational Biology.
