Brookhaven physicists develop algorithm that achieves significant improvement in spotting neutrinos in a cosmic haystack

Ground-breaking image reconstruction and analysis algorithms developed for surface-based MicroBooNE detector filter out cosmic ray tracks to pinpoint elusive neutrino interactions with unprecedented clarity

How do you spot a subatomic neutrino in a "haystack" of particles streaming from space? That's the daunting challenge facing physicists studying neutrinos with detectors near Earth's surface. With little to no shielding overhead, surface-based neutrino detectors, usually searching for neutrinos produced by particle accelerators, are bombarded by cosmic rays--relentless showers of subatomic and nuclear particles produced when particles streaming in from more-distant cosmic sources interact with Earth's atmosphere. These abundant travelers, mostly muons, create a web of crisscrossing particle tracks that can easily obscure a rare neutrino event.

Fortunately, physicists have developed tools to tone down the cosmic "noise."

Image caption: An example electron-neutrino event before and after applying the "charge-light" matching algorithm. A neutrino interaction is typically mixed with about 20 cosmic rays during the 4.8-millisecond event recording window. After matching the neutrino interaction's "charge" signal, recorded by the wires, with its "light" signal, recorded by the photomultiplier tubes, the interaction can be clearly singled out from the cosmic ray background. In the event display, the black points are from the electron-neutrino interaction and the colored points are the background cosmic rays; the size of each red circle shows the strength of the matched light signal at each photomultiplier tube.

A team including physicists from the U.S. Department of Energy's Brookhaven National Laboratory describes the approach in two papers recently accepted for publication in Physical Review Applied and the Journal of Instrumentation (JINST). These papers demonstrate the scientists' ability to extract clear neutrino signals from the MicroBooNE detector at DOE's Fermi National Accelerator Laboratory (Fermilab). The method combines CT-scanner-like image reconstruction with data-sifting techniques that make accelerator-produced neutrino signals stand out 5 to 1 against the cosmic ray background.

"We developed a set of algorithms that reduce the cosmic ray background by a factor of 100,000," said Chao Zhang, one of the Brookhaven Lab physicists who helped to develop the data-filtering techniques. Without the filtering, MicroBooNE would see 20,000 cosmic rays for every neutrino interaction, he said. "This paper demonstrates the crucial ability to eliminate the cosmic ray backgrounds."
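The quoted figures are mutually consistent, as a quick back-of-the-envelope check shows (the variable names below are illustrative, not from the analysis code):

```python
# Consistency check of the quoted background-rejection numbers:
# 20,000 cosmic rays per neutrino before filtering, reduced by a
# factor of 100,000, leaves neutrinos standing out 5 to 1.
raw_cosmic_per_neutrino = 20_000   # cosmic rays per neutrino interaction, unfiltered
rejection_factor = 100_000         # cosmic-ray reduction achieved by the algorithms

surviving_cosmic = raw_cosmic_per_neutrino / rejection_factor  # 0.2 cosmic rays per neutrino
neutrino_to_cosmic = 1 / surviving_cosmic                      # 5 neutrinos per cosmic ray
print(neutrino_to_cosmic)  # -> 5.0
```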

Bonnie Fleming, a professor at Yale University who is a co-spokesperson for MicroBooNE, said, "This work is critical both for MicroBooNE and for the future U.S. neutrino research program. Its impact will extend notably beyond the use of this 'Wire-Cell' analysis technique, even on MicroBooNE, where other reconstruction paradigms have adopted these data-sorting methods to dramatically reduce cosmic-ray backgrounds."

Tracking neutrinos

MicroBooNE is one of three detectors that form the international Short-Baseline Neutrino program at Fermilab, each located a different distance from a particle accelerator that generates a carefully controlled neutrino beam. The three detectors are designed to count up different types of neutrinos at increasing distances to look for discrepancies from what's expected based on the mix of neutrinos in the beam and what's known about neutrino "oscillation." Oscillation is a process by which neutrinos swap identities among three known types, or "flavors." Spotting discrepancies in neutrino counts could point to a new unknown oscillation mechanism--and possibly a fourth neutrino variety.

Brookhaven Lab scientists played a major role in designing the MicroBooNE detector, particularly the sensitive electronics that operate within the detector's super-cold liquid-argon-filled time projection chamber. As neutrinos from Fermilab's accelerator enter the chamber, every so often a neutrino will interact with an argon atom, kicking some particles out of its nucleus--a proton or a neutron--and generating other particles (muons, pions) and a flash of light. The charged particles that get kicked out ionize argon atoms in the detector, knocking some of their electrons out of orbit. The electrons that form along these ionization tracks get picked up by the detector's sensitive electronics.

"The whole trail of electrons drifts along an electric field and passes through three consecutive planes of wires with different orientations at one end of the detector," Zhang said. "As the electrons approach the wires, they induce a signal, so that each set of wires creates a 2D image of the track from a different angle."

Image caption: How the MicroBooNE detector works: The neutrino interaction creates charged particles and generates a flash of light. The charged particles ionize the argon atoms and create free electrons. The electrons drift toward the three wire planes under an external electric field and induce signals on the wires, which effectively record three images of the particle activity from different angles. The light flashes (photons) are detected by photomultiplier tubes behind the wire planes, which record when the interaction happened. Scientists use the images from the three wire planes and the timing of the interaction to reconstruct the tracks created by the neutrino interaction and determine where it occurred in the detector.

Meanwhile, the flashes of light created at the time of the neutrino interaction get picked up by photomultiplier tubes that lie beyond the wire arrays. Those light signals tell scientists when the neutrino interaction took place, and how long it took the tracks to arrive at the wire planes.

A supercomputer translates that timing into distance and pieces together the 2D track images to reconstruct a 3D image of the neutrino interaction in the detector. The shape of the track tells scientists which flavor of neutrino triggered the interaction.
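The timing-to-distance step is simple in principle: charge drifts at a known, roughly constant velocity, so the delay between the light flash and the charge's arrival at the wires gives the drift distance. A minimal sketch, using a round-number drift velocity of 1.1 mm/µs as a stand-in for liquid argon at a typical drift field (the function and values here are illustrative, not MicroBooNE's calibrated reconstruction):

```python
# Assumed round-number drift velocity for liquid argon at a drift field
# of a few hundred V/cm; the real detector uses a calibrated value.
DRIFT_VELOCITY_MM_PER_US = 1.1

def drift_distance_mm(t_flash_us: float, t_wire_us: float) -> float:
    """Distance of the ionization from the wire planes, inferred from the
    delay between the prompt light flash and the charge reaching the wires."""
    return DRIFT_VELOCITY_MM_PER_US * (t_wire_us - t_flash_us)

# Charge arriving 1000 microseconds after the flash was deposited
# roughly 1.1 meters from the wire planes.
d = drift_distance_mm(0.0, 1000.0)  # about 1100 mm
```

Repeating this for every wire hit, and combining the three 2D wire-plane views, yields the 3D image described above.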

"This 3D 'Wire-Cell' image reconstruction is similar to medical imaging with a computed tomography (CT) scanner," Zhang explained. In a CT scanner, sensors capture snapshots of the body's internal structures from different angles, and computers piece the images together. "Imagine the particle tracks going through the three-wire planes as a person going into the scanner," he said.

Untangling the cosmic web

It sounds almost simple--if you forget about the thousands of cosmic rays that stream through the detector at the same time. Their ionization trails also drift through the scanning wires, creating images that look like a tangled web. That's why MicroBooNE scientists have been working on sophisticated "triggers" and algorithms to sift through the data so they can extract the neutrino signals.

By 2017, they had made substantial progress reducing the cosmic ray noise. But even then, cosmic rays outnumbered neutrino tracks by about 200 to 1. The new papers describe further techniques to reduce this ratio and flip it to the point where neutrino signals in MicroBooNE now stand out 5 to 1 against the cosmic ray background.

The first step involves matching the signals revealed by particles generated in neutrino interactions with the exact flashes of light picked up by the photomultiplier tubes from that interaction.

"This is not easy!" said Brookhaven Lab physicist Xin Qian. "Because the time projection chamber and the photomultiplier tubes are two different systems, we don't know which flash corresponds to which event in the detector. We have to compare the light patterns for each photomultiplier tube with all the locations of these particles. If you've done all the matching correctly, you will find a single 3D object that corresponds to a single flash of light measured by the photomultiplier tubes."
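The matching Qian describes can be sketched as a pattern-comparison problem: predict the relative light yield each photomultiplier tube should see from a given 3D charge cluster, then pick the measured flash whose pattern agrees best. The toy model below uses a crude inverse-square light falloff and a hypothetical PMT layout purely for illustration; the real optical model and scoring are far more sophisticated:

```python
# Toy sketch of charge-light matching. PMT positions, the 1/r^2 light
# model, and the chi-square-like score are illustrative assumptions.
PMT_POSITIONS = [(0.0, y) for y in range(0, 250, 50)]  # hypothetical 2D layout

def predicted_pattern(cluster_points):
    """Expected relative light yield per PMT from (x, y, charge) points."""
    pattern = []
    for px, py in PMT_POSITIONS:
        light = sum(q / ((x - px) ** 2 + (y - py) ** 2 + 1.0)
                    for x, y, q in cluster_points)
        pattern.append(light)
    total = sum(pattern)
    return [p / total for p in pattern]  # normalize: compare shapes, not scale

def match_score(pattern, flash):
    """Squared distance between predicted and measured (normalized) patterns."""
    total = sum(flash)
    flash = [f / total for f in flash]
    return sum((p - f) ** 2 for p, f in zip(pattern, flash))

def best_flash(cluster_points, flashes):
    """Index of the measured flash that best matches the cluster's light."""
    pattern = predicted_pattern(cluster_points)
    return min(range(len(flashes)), key=lambda i: match_score(pattern, flashes[i]))

cluster = [(10.0, 10.0, 1.0)]                  # charge deposit near PMT 0
flashes = [[5, 1, 1, 1, 1], [1, 1, 1, 1, 5]]   # two candidate flashes
print(best_flash(cluster, flashes))  # -> 0 (the flash peaked near PMT 0)
```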

Brooke Russell, who worked on the analysis as a Yale graduate student and is now a postdoctoral fellow at DOE's Lawrence Berkeley National Laboratory, echoed these comments on the challenge of light-matching. "Given that the charge information is in some cases not fully complementary to the light information, there can be ambiguities in charge-light pairings on a single-readout basis. The algorithms developed by the team help to account for these nuances," she said.

Still, the scientists must then compare the timing of each track with the time accelerator neutrinos were emitted (a factor they know because they control the accelerator beam). "If the timing is consistent, then it is a possible neutrino interaction," Qian said.

The algorithm developed by the Brookhaven team brings the ratio down to one neutrino for every six cosmic ray events.

Image caption: MicroBooNE's time projection chamber--where the neutrino interactions take place--during assembly at Fermilab. The chamber measures ten meters long and two and a half meters high.

Rejecting additional cosmic rays gets a bit easier with an algorithm that eliminates tracks that completely traverse the detector.

"Most cosmic rays go through the detector from top to bottom or from one side to the other," said Xiangpan Ji, a Brookhaven Lab postdoc working on this algorithm. "If you can identify the point of entry and exit of the track, you know it's a cosmic ray. Particles formed by neutrino interactions have to start in the middle of the detector where that interaction takes place."
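The entry/exit test Ji describes reduces to a geometry check: if both endpoints of a reconstructed track lie on the detector boundary, the track passed all the way through and is tagged as a cosmic ray. A minimal sketch, with rough MicroBooNE-like dimensions and a tolerance chosen purely as placeholders:

```python
# Sketch of the through-going-track cut. Detector dimensions (mm) and
# the boundary tolerance are illustrative placeholders, not calibrated values.
DETECTOR = {"x": (0.0, 2560.0), "y": (-1165.0, 1165.0), "z": (0.0, 10365.0)}
TOLERANCE_MM = 30.0

def on_boundary(point):
    """True if an (x, y, z) point lies within TOLERANCE_MM of any detector face."""
    faces = (DETECTOR["x"], DETECTOR["y"], DETECTOR["z"])
    return any(abs(c - lo) < TOLERANCE_MM or abs(c - hi) < TOLERANCE_MM
               for c, (lo, hi) in zip(point, faces))

def is_through_going(track_start, track_end):
    """A track that both enters and exits is tagged as a cosmic ray;
    neutrino-induced tracks must start in the detector's interior."""
    return on_boundary(track_start) and on_boundary(track_end)

# A top-to-bottom muon is flagged; a track born mid-detector is kept.
print(is_through_going((1000.0, 1160.0, 5000.0), (1000.0, -1160.0, 5000.0)))  # -> True
print(is_through_going((1280.0, 0.0, 5000.0), (1280.0, 200.0, 6000.0)))       # -> False
```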

That brings the ratio of neutrino interactions to cosmic rays to 1:1.

An additional algorithm screens out events that start outside the detector and stop somewhere in the middle--tracks that look similar to neutrino events but move in the opposite direction. One final fine-tuning step rules out events whose light flashes don't match their charge signals well, bringing the detection of neutrino events to the remarkable level of 5 to 1 compared with cosmic rays.

"This is one of the most challenging analyses I have worked on," said Hanyu Wei, the Brookhaven Lab postdoctoral fellow leading the analysis effort. "The liquid-argon time projection chamber is a new detector technology with lots of surprising features. We had to invent many original methods. It was truly a team effort."

Zhang echoed that sentiment and said, "We expect this work to significantly boost the potential for the MicroBooNE experiment to explore the intriguing physics at short baselines. Indeed, we're looking forward to implementing these techniques in experiments at all three short-baseline neutrino detectors to see what we learn about neutrino oscillations and the possible existence of a fourth neutrino type."

Purdue professor wins Multidisciplinary University Research Initiative funding from Department of Defense

Brett Savoie, the Charles Davidson Assistant Professor of Chemical Engineering at Purdue University, will lead one of 25 teams receiving Multidisciplinary University Research Initiative (MURI) funding from the Department of Defense as part of the fiscal year 2021 competition.

As principal investigator, Savoie and his team of researchers from Purdue, the University of Pittsburgh and Carnegie Mellon University will conduct research on the topic of predicting organic molecular decomposition. Specifically, the team will establish an “Informatics Paradigm for Predicting Organic Chemical Stability.”

This project initiates a multi-year effort to develop the tools and data necessary to predict how organic molecules degrade under stress.

“This is a complex problem where there is a significant gap between where the science currently stands and where it needs to be in order to design novel molecules and materials in many applications,” Savoie said. “The investment by the Office of Naval Research to sponsor a MURI dedicated to this topic comes at a perfect time because recent developments in machine learning and high-throughput experiments will allow us to tackle this problem in a way that would have been impossible a decade ago.”

Savoie explained that winning this award is significant on many levels. “The MURI is always highly competitive, so our team is gratified that our proposal was awarded. In addition, MURIs are a serious long-term investment on the part of the DoD and taxpayers to address mission-critical problems, so we are honored to have the opportunity to execute our research plan. In particular, MURIs fund teams of researchers for the duration necessary to authentically advance the state of the art, which is what we intend to do.”

The team has put together an aggressive research plan focused on delivering data-driven approaches to predicting the stability of organic species. This will occur on several fronts: leveraging modern computational throughput to address data scarcity, using machine learning to accelerate the time-to-solution for characterizing how molecules break down, and developing software to put these tools in the hands of the larger research community.

“To carry out this plan, we’ve assembled a world-class team of researchers, including both computational and experimental members, who will work in tandem to validate and translate these methods to general use,” Savoie said.

“While we have gotten pretty creative at designing molecules with exotic properties, predicting how they will stand up in use environments still eludes us. The purpose of this MURI is to develop the computational and experimental tools to be able to predict in advance how stable new chemistry will be in a given application.”

The winning teams, which represent 57 academic institutions nationwide, will receive five-year grants, contingent upon satisfactory research progress and the availability of funds, to pursue basic research that spans multiple scientific disciplines.

Savoie joined Purdue’s Davidson School of Chemical Engineering in 2017. His research group focuses on accelerating the design and characterization of energy-related materials using theoretical methods. Burgeoning computational power and algorithm development have made theoretical characterization and screening essential steps in modern materials development. From first-principles predictions of electronic structure and catalytic activity to crystal structure, methods development continues to push the frontier of what material properties can be predicted in advance, economizing costly synthesis and optimization efforts.

Massive dataset reveals which governments have best responded to COVID-19 pandemic

Researchers study the ability of democracies to react to the global crisis

Are our political institutions up to the task of managing the COVID-19 pandemic and any possible future similar threats? A research team led by faculty at Binghamton University, State University of New York has compiled an extensive dataset tracking public health government responses to COVID-19 at national and subnational levels of government throughout the world.

The coronavirus pandemic provides a unique opportunity to evaluate the response of different types of government to a global crisis, according to Binghamton University Professor of Political Science Olga Shvetsova. Other types of catastrophic events, such as wars and natural disasters, affect select countries or regions and do not allow one to draw global comparisons.

Image caption: The PPI measures public health government responses to COVID-19 at all levels of government throughout the world. The PPI considers the extent of COVID-19 policy responses in the following categories: states of emergency, border closures, school closures, social gathering and social distancing limitations, home-bound policies, medical isolation policies, closure/restriction of businesses and services, and mandatory personal protective equipment. The coding of public health policies is based on government websites and reputable news sources reporting adoption of these policies. Total, national, and subnational indices are calculated from the standing public health policies adopted at various levels of government for each unit (state, province, etc.) on each day, by adding together the highest values across levels of government in each category on that day. The PPI's possible daily minimum is 0 and its maximum is 40. National and subnational PPIs were constructed from the values in each category from just national- or just subnational-level policies. The current version of the dataset contains public health protective policies at the national and subnational levels, with plans to expand to the municipal level in the future. The unit of analysis is the unit-day. Credit: Olga Shvetsova
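The index construction described in the caption can be sketched directly: for each unit-day, take the highest coded value in each policy category across levels of government, then sum across the eight categories. The category names and coded values below are illustrative placeholders, not the project's actual coding scheme:

```python
# Sketch of the PPI construction: per unit-day, sum the maximum policy
# value across levels of government in each of the eight categories.
# Category names and coded values are illustrative placeholders.
CATEGORIES = ["emergency", "borders", "schools", "gatherings",
              "home_bound", "isolation", "businesses", "ppe"]

def total_ppi(policies_by_level):
    """policies_by_level maps a level of government ('national',
    'subnational', ...) to {category: coded value} for one unit-day."""
    return sum(
        max(level.get(cat, 0) for level in policies_by_level.values())
        for cat in CATEGORIES
    )

day = {
    "national":    {"emergency": 5, "borders": 4, "schools": 0},
    "subnational": {"schools": 5, "gatherings": 3, "businesses": 2},
}
print(total_ppi(day))  # -> 19  (5 + 4 + 5 + 3 + 0 + 0 + 2 + 0)
```

Restricting the sum to a single level of government gives the national-only or subnational-only variants of the index mentioned above.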

"We are motivated by events to figure out what happened and is happening, and develop new understandings of how government works and politicians function and respond to crises," Shvetsova said of the collaborative lab.

As the pandemic unfolded over the spring and summer, Shvetsova's lab compiled a massive database comparing pandemic-related governmental policies in 64 countries on both the national and sub-national levels, as part of the COVID-19 Protective Policy Index (PPI) project. The data runs from January through May 2020, and is publicly available for researchers' use, while data collection is underway for the period between May and November.

The lab began collecting data on March 12. Policies tracked by the database fall into multiple categories, including international and domestic border closures, school closures, social gathering and social distancing restrictions, lockdowns and curfews, medical isolation and quarantine, restriction of nonessential businesses and services, states of emergency, and mandates requiring personal protective equipment.

In addition to political science professors and doctoral students with the department, the project has drawn colleagues from around the country and even around the world, including Canada, the United Kingdom, and Russia. Undergraduate students joined the effort, too, as research assistants. The lab is collaborative, with members pitching in on data collection, brainstorming, writing, and responding to requests during the peer review process.

"Pandemic policy-making is a truly global experiment in how different types of government work. It is a check on how resilient we are, and what the constitutional sources of that resilience are," Shvetsova said of the ongoing pandemic research.

The data has already sparked two papers, with more in the pipeline. "Institutional Origins of Protective COVID-19 Public Health Policy Responses" will appear in an upcoming issue of the Journal of Political Institutions and Political Economy, and takes a global look at the advent of pandemic-related policies. Published in September by Canadian Public Policy/Analyse de politiques, "COVID-19 Policy Response and the Rise of the Sub-National Governments" compares the advent of policies in Canada and the United States, on both the federal and state/province levels.

The lab will continue to collect data on the pandemic for as long as it remains feasible. The team hopes to make another round of data, from May through July, available by the end of the year. Additional variables as well as more countries will also be added to the database.

Currently, the lab is writing and publishing work on incentives and disincentives for pandemic response in democracies, looking at the impact of governmental structure, political parties, and the way governments are held accountable for the health of their populations. Other projects will likely emerge as data continues to accumulate.

Long-term, the coronavirus may offer a metric with which to judge the efficacy of different styles of government in responding to the crisis. That would require reliable statistics that other disciplines are gathering on case counts and deaths, along with strong mathematical epidemiological models of the factors determining spread and mortality.

"These are big questions. It's unprecedented to be in a moment of time when we can contemplate hypotheses and run regressions and actually glimpse the answers to those big questions," Shvetsova said.