University of Valencia's Todolí investigates the risks that artificial intelligence can pose to occupational health

A study by Adrián Todolí, professor at the Faculty of Law and co-director of the Chair of Collaborative Economy and Digital Transformation at the University of Valencia, highlights the health risks that algorithms and artificial intelligence can create through the progressive automation of work management. Published in the journal Labour & Law Issues, the article warns that machines can treat humans as if they were other machines, and proposes that algorithms be designed with existing occupational hazards in mind.

The use of new technologies in the workplace is constantly growing, and companies increasingly introduce algorithms or artificial intelligence systems as part of their work management. To optimize productivity, these programs analyze data and routines to evaluate performance and effectiveness; establish shifts and production times; design and assign tasks; and even analyze the information of applicants for a position, selecting the candidates that best fit the company's criteria.

In his research, Adrián Todolí, professor in the Department of Labour Law and Social Security, details the risks to which a company's personnel are exposed when the management of human resources and labor relations is automated. These include constant monitoring through technologies such as GPS or wearable devices, and the intensification of effort to meet the pace set by the algorithm, which can generate stress, anxiety, discouragement, and even depression.

“New technologies must be programmed to prevent and reduce these risks, taking into account specific factors such as the transparency of their operation, the adaptation to each staff member's abilities, or the margin of autonomy to make decisions and self-organize,” said the researcher at the Faculty of Law.

The expert also analyzes other risk factors, such as possible discrimination through the use of pre-existing biased data or statistical inferences that violate professional ethics and the right to privacy, as well as malfunctions and cyberattacks. In addition, he cites the depersonalization and lack of empathy of the machines, which carry out their activities without taking personal circumstances into account. Among other cases, Todolí cites the "big brother" effect, the feeling of being observed at all times, and burnout syndrome caused by lack of privacy or invasive technological control.

Algorithms and artificial intelligence in work management can therefore cause physical and mental health problems: increased stress, anxiety, and frustration; decreased self-esteem and, in more extreme cases, depression; reduced human contact; and a worsening work environment. They can also cause personality changes related to the dehumanization that the algorithm can exert.

Regulation

Todolí therefore emphasizes the importance of regulating algorithms, taking occupational hazards into account in a post-development phase. In this sense, it is important to respect privacy and non-discrimination, and to have a trained person supervise the actions of the algorithms and maintain contact and communication with employees.

The professor at the University of Valencia recalls that, in addition to efficient management of the company and its human resources, these new tools can offer positive aspects: better evaluation and prevention methods, the detection of risks through audiovisual and auditory sensors, and alerts that increase the protection of the workforce.

University of Miami researchers develop a new framework that shows the Deepwater Horizon oil spill was larger than previously thought

Toxic and invisible oil spread well beyond the known satellite footprint and fishing closures

Toxic and invisible oil spread well beyond the known satellite footprint of the Deepwater Horizon oil spill, also referred to as the BP oil disaster, according to a new study led by scientists at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science. These new findings have important implications for environmental health during future oil spills.

The UM Rosenstiel School-led research team combined oil-transport modeling techniques with remote sensing data and in-water sampling to provide a comprehensive look at the oil spill. The findings revealed that a fraction of the spill was invisible to satellites, yet toxic to marine wildlife.

"We found that there was a substantial fraction of oil invisible to satellites and aerial imaging," said the study's lead author Igal Berenshtein, a postdoctoral researcher at the UM Rosenstiel School. "The spill was only visible to satellites above a certain oil concentration at the surface leaving a portion unaccounted for." CAPTION On April 20, 2010, the Deepwater Horizon oil rig exploded, releasing 210 million gallons of crude oil into the Gulf of Mexico for a total of 87 days, making it the largest oil spill in U.S. history. Oil slicks from the blowout covered an estimated area of 57,000 square miles (149,000 square kilometers).{module INSIDE STORY}

On April 20, 2010, the Deepwater Horizon oil rig exploded, releasing 210 million gallons of crude oil into the Gulf of Mexico for a total of 87 days, making it the largest oil spill in U.S. history. Oil slicks from the blowout covered an estimated area of 57,000 square miles (149,000 square kilometers).

These new findings, published in Science Advances, showed a much wider extent of the spill beyond the satellite footprint, reaching the West Florida Shelf, the Texas shores, the Florida Keys and along the Gulf Stream towards the East Florida shelf.

"Our results change established perceptions about the consequences of oil spills by showing that toxic and invisible oil can extend beyond the satellite footprint at potentially lethal and sub-lethal concentrations to a wide range of wildlife in the Gulf of Mexico," said Claire Paris, senior author of the study and professor of ocean sciences the UM Rosenstiel School. "This work added a 3rd dimension to what was previously seen as just surface slicks. This additional dimension has been visualized with more realistic and accurate oil spill models developed with a team of chemical engineers and more efficient supercomputing resources."

The new framework developed by the researchers can assist emergency managers and decision-makers in better managing the impacts of future oil spills, the authors said.

University of Bonn simulates a universe in which Newton's laws are only valid to a limited extent

For the first time, researchers from the Universities of Bonn and Strasbourg have simulated the formation of galaxies in a universe without dark matter. To replicate this process on the computer, they have instead modified Newton's laws of gravity. The galaxies that were created in the supercomputer calculations are similar to those we actually see today. According to the scientists, their assumptions could solve many mysteries of modern cosmology. The results are published in the "Astrophysical Journal".

Cosmologists nowadays assume that matter was not distributed entirely evenly after the Big Bang. The denser places attracted more and more matter from their surroundings due to their stronger gravitational forces. Over the course of several billion years, these accumulations of gas eventually formed the galaxies we see today.

An important ingredient of this theory is the so-called dark matter. On the one hand, it is said to be responsible for the initial uneven distribution that led to the agglomeration of the gas clouds. On the other, it explains some puzzling observations. For instance, stars in rotating galaxies often move so fast that they should actually be ejected. It appears that there is an additional source of gravity in the galaxies that prevents this - a kind of "star putty" that cannot be seen with telescopes: dark matter.

However, there is still no direct proof of its existence. "Perhaps the gravitational forces themselves simply behave differently than previously thought," explains Prof. Dr. Pavel Kroupa from the Helmholtz Institute for Radiation and Nuclear Physics at the University of Bonn and the Astronomical Institute of Charles University in Prague. This theory, known by the abbreviation MOND (MOdified Newtonian Dynamics), was proposed by the Israeli physicist Prof. Dr. Mordehai Milgrom. According to the theory, the attraction between two masses obeys Newton's laws only up to a certain point. At very low accelerations, as is the case in galaxies, it becomes considerably stronger. This is why galaxies do not break apart as a result of their rotational speed.

Image: The distribution of matter 1.5 billion years after the start of the simulation. The lighter the color, the higher the density of the gas. The light blue dots show young stars. © AG Kroupa/Uni Bonn
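To make the low-acceleration regime concrete, here is a minimal sketch of Milgrom's idea for a single point mass. It assumes the commonly used "simple" interpolating function and a textbook value of the acceleration constant a0; the mass and radii are arbitrary illustrative choices, and none of this is the code behind the Bonn-Strasbourg simulations:

```python
import numpy as np

# Illustrative MOND sketch (assumptions: point mass, "simple" interpolating
# function mu(x) = x / (1 + x), a0 ~ 1.2e-10 m/s^2; not the study's code).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10         # Milgrom's acceleration constant, m/s^2
M = 1e41             # assumed galaxy-scale mass, kg (~5e10 solar masses)
kpc = 3.086e19       # one kiloparsec in meters

r = np.linspace(1, 50, 200) * kpc      # galactocentric radii
a_newton = G * M / r**2                # Newtonian acceleration

# Solving mu(a/A0) * a = a_newton with mu(x) = x/(1+x) gives a quadratic in a:
a_mond = 0.5 * (a_newton + np.sqrt(a_newton**2 + 4 * a_newton * A0))

v_newton = np.sqrt(a_newton * r) / 1e3   # circular speed, km/s
v_mond = np.sqrt(a_mond * r) / 1e3

# Newtonian speeds fall off as 1/sqrt(r); MOND speeds flatten toward the
# deep-MOND plateau (G * M * A0) ** 0.25, mimicking flat rotation curves.
print(f"at 50 kpc: Newtonian {v_newton[-1]:.0f} km/s, MOND {v_mond[-1]:.0f} km/s")
print(f"deep-MOND plateau:   {(G * M * A0) ** 0.25 / 1e3:.0f} km/s")
```

In this toy setup the Newtonian circular speed keeps dropping with distance, while the MOND speed settles near the plateau value - the qualitative behavior that keeps fast-rotating galaxies from flying apart without invoking dark matter.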

Results close to reality

"In cooperation with Dr. Benoit Famaey in Strasbourg, we have now simulated for the first time whether galaxies would form in a MOND universe and if so, which ones," says Kroupa's doctoral student Nils Wittenburg. To do this he used a computer program for complex gravitational calculations which was developed in Kroupa's group. Because with MOND, the attraction of a body depends not only on its own mass but also on whether other objects are in its vicinity.

The scientists then used this software to simulate the formation of stars and galaxies, starting from a gas cloud several hundred thousand years after the Big Bang. "In many aspects, our results are remarkably close to what we actually observe with telescopes," explains Kroupa. For instance, the distribution and velocity of the stars in the computer-generated galaxies follow the same pattern that can be seen in the night sky. "Furthermore, our simulation resulted mostly in the formation of rotating disk galaxies like the Milky Way and almost all other large galaxies we know," says the scientist. "Dark matter simulations, on the other hand, predominantly create galaxies without distinct matter disks - a discrepancy with the observations that is difficult to explain."

Calculations based on the existence of dark matter are also very sensitive to changes in certain parameters, such as the frequency of supernovae and their effect on the distribution of matter in galaxies. In the MOND simulation, however, these factors hardly played a role.

Yet the recently published results from Bonn, Prague, and Strasbourg do not correspond to reality in every respect. "Our simulation is only a first step," emphasizes Kroupa. For example, the scientists have so far only made very simple assumptions about the original distribution of matter and the conditions in the young universe. "We now have to repeat the calculations and include more complex influencing factors. Then we will see if the MOND theory actually explains reality."