Agriculture Data Shifts from Farm to Field Level, Helping Growers Make Decisions in Real Time

Farmers Edge has announced a collaboration with The Weather Company, an IBM Business, the world's largest private weather enterprise. Under the terms of the deal, Farmers Edge has integrated hyper-local forecasts from The Weather Company’s Forecasts on Demand (FoD) forecasting engine into its field-centric approach to predictive modeling. This enhancement of real-time meteorological data analytics is another step towards the digitization of agriculture, an industry in the midst of a radical transformation.

Established growing regions have traditionally relied on static climate data sets collected by an existing fleet of government, city or airport weather stations. Now, with a global network of distributed weather stations in the field, Farmers Edge is providing real-time data that can enable highly precise predictive models, informing growers’ decision-making on critical crop stages, the timing of field operations, pest and disease pressure, equipment deployment, soil needs and nutrient requirements. Through this collaboration, Farmers Edge continues to marry data science with agricultural science to provide the most accurate field-centric data in the industry.
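To make concrete how field-level readings can feed such models, here is a minimal sketch of a standard growing degree day (GDD) calculation of the kind crop-stage forecasts typically build on; the station readings, base temperature, and stage thresholds below are illustrative assumptions, not Farmers Edge data or models:

    # Minimal sketch: accumulate growing degree days (GDD) from daily field-station
    # temperatures and check which crop stages have been reached. All numbers are
    # hypothetical; this is not Farmers Edge's model.
    BASE_TEMP_C = 10.0  # common base temperature for corn; crop-specific in practice

    daily_readings = [(8.0, 21.0), (11.0, 24.0), (13.0, 27.0), (9.0, 19.0)]  # (min, max) in C
    stage_thresholds = {"emergence": 60, "six-leaf": 475, "silking": 1400}   # cumulative GDD

    def daily_gdd(t_min, t_max, base=BASE_TEMP_C):
        """Average-method GDD for one day, floored at zero."""
        return max((t_min + t_max) / 2.0 - base, 0.0)

    cumulative = sum(daily_gdd(t_min, t_max) for t_min, t_max in daily_readings)
    reached = [stage for stage, gdd in stage_thresholds.items() if cumulative >= gdd]
    print(f"Cumulative GDD: {cumulative:.1f}; stages reached: {reached or 'none yet'}")

In a real deployment the daily readings would stream from the in-field stations, and the thresholds would be specific to the crop and variety being grown.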

“Through our work with Farmers Edge, we hope to help the agriculture industry leverage precise weather data in order to optimize critical decisions,” said Mark Gildersleeve, President, Business Solutions, The Weather Company, an IBM Business. “Weather is the single biggest variable in business performance, and this deal brings together some of the most advanced weather forecasting science in the industry and a leader in ag-data to create a one-of-a-kind platform to support decision-making to improve yields for growers around the world.”

Farmers Edge has established itself as an industry leader in deploying weather stations on farms. By Spring 2016, the company expects to have the largest real-time weather-monitoring network in Canada, with over 1,000 automated field-centric weather stations, and will target similar deployments in other major agricultural markets. The enhanced weather offerings underscored by this collaboration are part of Farmers Edge’s larger vision to bring cutting-edge technologies to growers worldwide.

Through a field-centric technology approach, Farmers Edge enables more productive and sustainable farming, ultimately increasing crop yields. The Farmers Edge Precision Solutions package is a comprehensive turnkey system that includes: Variable Rate Technology, soil sampling and analysis, field-centric weather monitoring, in-field telematics and data transfer, high-resolution satellite imagery, field-centric data analytics, access to an integrated farm management platform and a network of highly experienced, trusted advisors on the ground.

“Though weather is a key factor in the decision-making of today’s farmers, growers have not had access to high-quality weather data that can enable more informed management and facilitate better growing, particularly in data-sparse environments,” said Wade Barnes, President and CEO of Farmers Edge. “We’ve created a scalable solution that works for our global network of growers and one that can be tailored to support the needs of individual fields, from grain production in Brazil to variant weather in Australia. Predictive forecasting models are the next step towards achieving higher global crop yields, sustainably.”

Itai Yanai, PhD – whose in-depth study of how embryos develop has led to internationally recognized breakthroughs in the analysis of gene composition and expression – has been named the inaugural director of the newly created Institute for Computational Medicine at NYU Langone Medical Center. He officially takes his new position on May 1, 2016.

The Institute for Computational Medicine will act as the hub for multidisciplinary efforts to reveal patterns in medical data that empower disease diagnosis and the design of new treatments.

"We are tremendously excited to welcome Dr. Yanai, who will help many research teams at NYU Langone to take advantage of the rapidly evolving field of computational biology," said Dafna Bar-Sagi, PhD, senior vice president and vice dean for Science and chief scientific officer at NYU Langone. "His recruitment to NYU Langone embodies our strategy to bring the very best experts in important fields into an environment where innovation and excellence thrive."

Dr. Yanai, who also will hold the academic title of professor in the Department of Biochemistry and Molecular Pharmacology at NYU School of Medicine, comes to NYU Langone from the Faculty of Biology at the Technion–Israel Institute of Technology, where he has served since 2008 as a research leader in the study of gene regulation through the lens of evolution and development. Combining experimental approaches in embryology, molecular biology, and computational biology, he has explored the principles by which developmental pathways evolve.

"It is a great honor to become part of the ambitious and world renowned research enterprise at NYU Langone," says Yanai. "Overall, the institute's mission is to promote the advancement of biomedical research through the development of novel data mining tools and the design of translational applications." 

Yanai's lab at the Technion has pioneered a powerful method for single-cell gene expression analysis that he will use to explore the progression of cancer and the process of infection. A prolific researcher, Yanai has also co-authored a recent science book, The Society of Genes, with Prof. Martin Lercher of the University of Düsseldorf.

Widely recognized for his contributions to science, Yanai is the recipient of many distinctions, including a 2014 Fellowship at the Radcliffe Institute for Advanced Study at Harvard University and the Krill Prize of the Wolf Foundation. He earned his PhD in bioinformatics from Boston University.

There is no question that data are big: 2.5 quintillion new bytes are added every day from our keyboards, sensors, entertainment, and medical scans, to name but a few. Data scientists have begun refining tools to encourage civilians to join them in extracting important insights from the aggregation and analysis of big data sets.

NYU Tandon Assistant Professor of Computer Science and Engineering Enrico Bertini and his graduate student Cristian Felix recently received a $35,000 Knight Foundation Prototype Fund grant to do just that for RevEx, which can perform faceted searches and analyze a combination of text and data across multiple domains. The Knight Foundation, which supports investigative journalism, will support refinements to RevEx that will enable reporters to elicit stories, but the appeal of the tool reaches well beyond the Fourth Estate.

Even before the Knight grant, RevEx, developed with graduate student Anshul Pandey, was embraced by journalists.

The nonprofit investigative journalism unit ProPublica used it to sort through millions of online reviews of medical services on Yelp. (Some findings: Thumbs down for low-cost dental clinics; medical doctors' staff were panned more often than the actual care; and simple transactions like hair removal scored individual highs.)
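As a rough illustration of the kind of faceted filtering RevEx is described as performing, the sketch below narrows a set of reviews by structured facets and free text; the records and field names are hypothetical and do not reflect RevEx's actual data model or interface:

    # Hypothetical review records; RevEx's real inputs and fields may differ.
    reviews = [
        {"category": "dental clinic", "rating": 2, "text": "Long wait and billing problems."},
        {"category": "dermatology",   "rating": 4, "text": "Hair removal was quick and painless."},
        {"category": "family doctor", "rating": 3, "text": "Great doctor, but the front-desk staff were rude."},
    ]

    def faceted_search(records, category=None, min_rating=None, keyword=None):
        """Keep only records that match every facet the caller supplied."""
        results = records
        if category is not None:
            results = [r for r in results if r["category"] == category]
        if min_rating is not None:
            results = [r for r in results if r["rating"] >= min_rating]
        if keyword is not None:
            results = [r for r in results if keyword.lower() in r["text"].lower()]
        return results

    print(faceted_search(reviews, keyword="staff"))   # reviews mentioning staff
    print(faceted_search(reviews, min_rating=4))      # only the highest-rated reviews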

The Economist employed RevEx to discover some fundamental unfairness in students' assessments on Rate My Professors. (Female professors consistently ranked lower, and "horrible" rankings seemed to correlate with the difficulty of the subject.)

But the most significant work so far for RevEx has been for the United Nations. Using RevEx as a starting point, Felix devised a tool that took top honors in the U.N.'s #VisualizeChange: World Humanitarian Summit Data Challenge. Designed to analyze and visualize a staggering amount of data from citizens on a country-by-country basis, it will allow the U.N. to attack the most pressing local humanitarian problems.

Why did the lizard cross the forest floor? It's an ecological conundrum that James Cook University researchers Mat Vickers and Professor Lin Schwarzkopf have answered with a novel approach.

Their problem was that scientists didn't know why lizards do what they do. If a lizard moves to a sunny spot, its body will heat up, but is it actually trying to warm up? Or was it chasing some tasty morsel of food and simply had to cross a sunny area to get it? How do you know that what an animal is doing is a deliberate strategy?

Lizards usually use the sun and shade to regulate their body temperature, a process known as thermoregulation. But taking a lizard's temperature is a tricky thing, and comparing that to temperatures across its habitat can be even trickier.

Dr Vickers said the original methodology was a bit grim: "Scientists used to shoot the lizards with a shotgun and compare the temperature of the meat with the air. If the temperature was different - Voila! Thermoregulation!"

Luckily for the lizards, and the scientists, the field has changed somewhat.

Dr Vickers and Professor Schwarzkopf devised a new supercomputerised method that reconstructs the real temperatures across a habitat and plots random lizard walks through it. If real lizards achieve body temperatures different from those achieved by the random-walking model, then they are walking non-randomly (deliberately) with respect to temperature.
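The logic of that comparison can be sketched in a few lines of code; the habitat grid, movement rule and "observed" body temperature below are invented for illustration and are not the study's actual reconstruction or data:

    # Simulate lizards walking randomly over a grid of habitat temperatures and
    # compare the temperatures they encounter with a (hypothetical) observed value.
    import random
    import statistics

    random.seed(1)
    GRID = 20
    habitat = [[random.uniform(24.0, 46.0) for _ in range(GRID)] for _ in range(GRID)]  # degrees C

    def random_walk_mean_temp(steps=200):
        """Mean temperature experienced by one randomly walking model lizard."""
        x = y = GRID // 2
        temps = []
        for _ in range(steps):
            dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            x = min(max(x + dx, 0), GRID - 1)
            y = min(max(y + dy, 0), GRID - 1)
            temps.append(habitat[y][x])
        return statistics.mean(temps)

    null_means = sorted(random_walk_mean_temp() for _ in range(500))
    lower, upper = null_means[12], null_means[487]  # rough 95% interval of the null model
    observed_mean = 33.5                            # hypothetical mean body temperature of real lizards

    print(f"Random-walk 95% range: {lower:.1f}-{upper:.1f} C; observed: {observed_mean} C")
    if not (lower <= observed_mean <= upper):
        print("Movement is non-random (deliberate) with respect to temperature")
    else:
        print("Movement is consistent with random walking")

In the study itself, the habitat temperatures come from a reconstruction of real field conditions rather than random numbers, and body temperatures are measured from living lizards.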

The model produced some interesting results and confirmed lizards are deliberately trying to manage their body temperatures. "All our random walking model lizards died of heat shock at about 11 am," Professor Schwarzkopf said. "This was the first indication we had that lizards are often working hard to achieve preferred body temperatures."

The study also suggested that real lizards carefully select habitat patches with particular temperatures (thermoregulating) when habitat temperatures are very warm or cold, while using the habitat much like the random-walking model lizard the rest of the time.

Working out how much of a lizard's behaviour is related to controlling its body temperature has been a goal of scientists for a long time. It's even more important now as climate change alters the temperatures in lizard habitats.

The study shows exactly why, when, and where lizards choose specific parts of their environment to either heat up or cool down. The new methodology also shows which parts of the environment, and which times of day, are too hot for the lizards.

This has important implications for how lizards and other cold-blooded animals might have to change their behaviour under future climate change scenarios.

Of all the fast and powerful computers in the world, our brain remains by far the most impressive. Now an interdisciplinary team of scientists, led by Baylor College of Medicine, aims to reveal the computational building blocks of our brain and use them to create smarter learning machines.

To enable this ambitious project, the U.S. government’s Intelligence Advanced Research Projects Activity (IARPA) has awarded a $21 million contract to an interdisciplinary team of neuroscientists, computer scientists, physicists and mathematicians, led by principal investigator Dr. Andreas Tolias, associate professor of neuroscience at Baylor. The research team includes scientists from Baylor, the California Institute of Technology, Columbia University, Cornell University, Rice University, the University of Toronto and the University of Tuebingen.

The program supporting this research is known as Machine Intelligence from Cortical Networks (MICrONS) and was envisioned and organized by Jacob Vogelstein, a neuromorphic engineer and program manager with IARPA. It is part of the broader BRAIN Initiative, launched in 2013 by President Obama with the goal of understanding devastating brain diseases and developing new technology, treatments and cures.

“Our goal is to discover the algorithms and learning rules that the brain implements and use these discoveries to create fundamentally smarter artificial neural networks,” said Tolias.

Artificial Intelligence

Artificial Intelligence (AI) has been a dream for a long time, but an elusive one. People have consistently underestimated how hard it is to construct intelligent systems, Tolias said. Famously, in 1955, a group of scientists proposed to crack AI in a single summer workshop at Dartmouth College. It didn’t turn out to be that easy.

But in the past few years, AI has been booming, and applications have blossomed everywhere. Handheld devices now use machine learning algorithms to recognize faces and speech, the first self-driving cars already are on the roads, and computer systems regularly wade through Big Data to find new medicines and anticipate geopolitical and financial trends.

Despite their new successes, “these artificial neural networks are still incredibly primitive compared to biological neural networks, and don’t learn the way real brains do,” said Dr. Xaq Pitkow, co-principal investigator of the MICrONS project, and an assistant professor of neuroscience and McNair Scholar at Baylor as well as an assistant professor of electrical and computer engineering at Rice University. “By modeling the brain’s computations and extracting their key features, we think we can give computers the ability to do much better.”

Bigger Data

Researchers have been trying to develop brain-like intelligence for years, so what has changed that makes this more achievable today? Tolias’s answer: bigger and better data.

Jacob Reimer, assistant professor of neuroscience at Baylor, a lead scientist on the team and its project manager, agrees.

“Technologies in both physics and molecular biology have advanced so much that we can now record from many hundreds of neurons at a time, with even more extensive recordings on the horizon,” he said. “This lets us analyze neural circuits in ways that we couldn’t dream of just a few years ago.”

In order to accomplish their ambitious goals, the team musters an impressive range of experts, each directing a research group focused on a different aspect of the project. Chris Xu, a physicist at Cornell University, has pioneered 3-photon imaging, a novel method to monitor neural activity non-invasively and more deeply in the brain than ever before. Thanos Siapas of Caltech will focus his team’s efforts on unraveling the learning rules in the brain. On the theoretical and mathematical modeling side, Mathias Bethge of the University of Tuebingen and Liam Paninski of Columbia will develop mathematical models and statistical methods to help interpret the complex data collected by the team’s experimentalists.

Even with the massive trove of data they will collect by recording neural activity, it can be hard to tell which neurons are connected to each other, directly exchanging information. That type of wiring information is critical to understanding the brain’s algorithms that are embodied in its biological wetware.

Accordingly, the Baylor-led team also is partnering with two other groups to reconstruct the complete wiring diagram for a cube of brain whose activity they measure.

Clay Reid and Nuno da Costa from the Allen Institute for Brain Science will use electron microscopy to image biological structures down to scales reaching just billionths of a meter. They will then hand off about 1,000 large hard disks full of these massive images to Sebastian Seung from Princeton University, who will extract the three-dimensional structure of the neurons, including their shapes and connections. Altogether, the team will create one of the most exhaustive neuroscience data sets in history.

Creating Networks

The ultimate key to succeeding in the goals of the MICrONS project is to implement neuroscience principles in computer algorithms. Working closely with the neuroscientists, machine learning experts Ankit Patel, assistant professor of neuroscience at Baylor College of Medicine and Rice University, Richard Baraniuk of Rice University, and Raquel Urtasun and Richard Zemel of the University of Toronto are tasked with integrating the computational building blocks the neuroscientists discover into new kinds of artificial neural networks.

This endeavor represents a monumental challenge and opportunity for both neuroscience and computer science. There is no better test of neuroscience principles than to see whether machines built from those principles actually solve real-world problems, Tolias said.

“The project demonstrates what is possible when we bring together world-class interdisciplinary researchers to solve fundamental problems. This bold endeavor could provide enormous benefits to society, including new computer technologies, new understanding of the brain, and new medical breakthroughs,” said Dr. Paul Klotman, Baylor College of Medicine president, CEO and executive dean.

The team’s grand aim is to understand one cubic millimeter of a mouse brain, and if they are successful, their new algorithms could revolutionize machine learning. But any such machine will still be the product of human ingenuity and creativity, and matching the brain itself will take more understanding than we can squeeze from a tiny cube of mouse brain.
