
Biased algorithms can be fixed: can you change your prejudices?


Detail of the fight between Don Carnaval and Doña Cuaresma, by Pieter Brueghel the Elder (1559)

The saying goes that no camel sees its own hump. The same is true of decision systems based on machine learning, which are, of course, oblivious to their own defects. A set of algorithms trained to recognize faces has, in principle, no way of realizing that it classifies the faces of people from certain ethnic minorities less accurately. It is up to us to fix that, and we are working on it.

But the saying was not coined with algorithms in mind. That invisible hump is the portrait of a purely human flaw. What happens when the bias is in us? "Changing algorithms is easier than changing people: software on computers can be updated; the 'wetware' in our brains has so far proven much less pliable," says researcher Sendhil Mullainathan in a column he recently published in The New York Times.

Mullainathan, a professor of computational and behavioral science at the University of Chicago, knows both humps well. He analyzed ours more than fifteen years ago, and published a study on that of the algorithms last October. His conclusion is that black boxes are not exclusive to machine learning systems. "Humans are inscrutable in a way that algorithms are not. Our explanations for our behavior are shifting and constructed after the fact," he explains.

Algorithms with humps

Let's start at the end, by way of the technological revolution, with the University of Chicago professor's most recent work. In this study, Mullainathan evaluates the performance of a system designed to estimate how sick each patient is and to allocate care resources accordingly. The result? Among patients with an identical level of risk, the number of Black patients selected to receive additional care was more than fifty percent lower than the number of white patients.

The source of this imbalance, the professor explains, lies in the data used to measure that level of illness: the cost of health care. "Since society spends less on Black patients than on white ones, the algorithm underestimates Black patients' real needs." According to the research authors' estimates, this bias could have affected some one hundred million people in the United States alone.
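A toy simulation can make that mechanism concrete. The sketch below uses entirely invented numbers and distributions (nothing here comes from the study's data): when patients are ranked by a cost proxy, a group that incurs lower costs at the same level of sickness gets flagged less often, and those who are flagged turn out to be sicker than their counterparts.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
group = rng.integers(0, 2, n)                      # 0 = group A, 1 = group B
illness = rng.gamma(shape=2.0, scale=1.0, size=n)  # true (unobserved) need

# Assumed access gap: group B incurs 30% lower cost at the same illness level.
cost = illness * np.where(group == 1, 0.7, 1.0) + rng.normal(0, 0.1, n)

# A risk score trained to predict cost ranks patients much as cost itself does;
# using cost directly is the limiting case of a perfectly fitted model.
threshold = np.quantile(cost, 0.97)                # top 3% flagged for extra care
flagged = cost >= threshold

for g, name in [(0, "group A"), (1, "group B")]:
    in_g = group == g
    print(f"{name}: {flagged[in_g].mean():.2%} flagged; "
          f"mean true illness among flagged = {illness[flagged & in_g].mean():.2f}")
```

At the same score threshold, the flagged members of group B are sicker on average: the cost proxy understates their need, which is the shape of the bias the article describes.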

From the point of view of Miquel Seguró, a professor at the UOC and author of the book La vida también se piensa (Life Is Also Thought), the myth that calculation is neutral and just is what has brought us here. "Since reason comes from ratio in Latin, which means calculation, we believe that calculations are in themselves perfect, unalterable and compact," he notes. "The algorithm is a way of trying to get closer to reality, of having a photograph, a kind of control, over a disparity of situations and cases that will always escape total control."

The bias that lives in you

Are Emily and Greg more employable than Lakisha and Jamal? That is the title and the central question of the study Mullainathan published in the American Economic Review in September 2004. After sending fictitious résumés in response to various job ads, the authors found that white-sounding names received 50% more interview callbacks than African-American-sounding names. The phenomenon, moreover, held across industries, occupations and company sizes.
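As a back-of-the-envelope illustration of what "50% more callbacks" means, here is a small sketch with invented counts (the study's actual figures are in the 2004 paper); a two-proportion z-test is one standard way to check that a gap of this kind is not noise.

```python
from statistics import NormalDist

def callback_gap(calls_a, sent_a, calls_b, sent_b):
    """Relative callback gap between two name groups and a two-sided p-value."""
    p_a, p_b = calls_a / sent_a, calls_b / sent_b
    pooled = (calls_a + calls_b) / (sent_a + sent_b)
    se = (pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a / p_b - 1, p_value

# Invented counts: 240 callbacks from 2,500 résumés vs 160 from 2,500.
gap, p = callback_gap(240, 2500, 160, 2500)
print(f"{gap:.0%} more callbacks (p = {p:.2g})")   # -> 50% more callbacks
```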

There are no algorithms here, only inscrutable humans. "Trying to discover what interests we hold, and to photograph everything we may think or desire objectively, neutrally and aseptically, is fine as a program, so to speak, for generating knowledge. But I don't know whether it is attainable in itself," Seguró reasons.

The black box is you

Looking back, Mullainathan agrees that the capacity to make unfair decisions, and their potential to cause harm, is a feature we share with algorithms, but he stresses that the list of meaningful similarities ends there: "One difference between the two studies is the work it took to uncover the bias."

In 2004 it took him months of work to draw up the résumés, send them out and wait for the responses. This year's work he sums up as a simple "statistical exercise": "The work was technical and rote, requiring neither cunning nor great resources." And the same goes for the solutions. For the algorithm, there is already a prototype tool that should neutralize the bias detected in the system. In the case of humans, change takes longer. "None of this is meant to belittle the obstacles and safeguards needed to correct algorithmic bias, but compared with the intransigence of human bias, it looks quite simple."
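For a sense of what such a "statistical exercise" might look like, here is a minimal audit sketch: compare how sick members of each group actually are at the same risk-score level. The column names, the chronic_conditions proxy and the pandas pipeline are all illustrative assumptions, not the study's code.

```python
import pandas as pd

def audit_by_score_decile(df: pd.DataFrame) -> pd.DataFrame:
    """Mean number of chronic conditions per group within each score decile.

    If the score were an unbiased measure of need, the groups would look
    equally sick at the same decile; a persistent gap signals that the
    score understates one group's need.
    """
    df = df.assign(decile=pd.qcut(df["risk_score"], 10, labels=False))
    return (df.groupby(["decile", "group"])["chronic_conditions"]
              .mean()
              .unstack("group"))

# Usage with whatever audit data is available:
# table = audit_by_score_decile(pd.read_csv("audit_sample.csv"))
# print(table)  # one row per decile, one column per group
```

The point of the column is precisely that this kind of tabulation is rote once the data are in hand, whereas auditing human screeners took months of fieldwork.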
