Algorithms are irresponsible (and they can't help it)


Machines only see profiles, accounts, statistics. They do not see people; they don't even understand what a person is. Credit: SHUTTERSTOCK

The other day, before teaching a class, I had a revealing conversation with a university professor who, by chance, mentioned that she had not been able to access her Yahoo! Mail account. Just to help, I told her that the service had been down. She did not know. I also told her that Yahoo! no longer exists as an independent company; it was bought by Verizon. She did not know that either. And I suggested that it might be a good idea to change her password and even adopt an alternative service, because the credentials of Yahoo!'s 3 billion accounts were stolen long ago (the breach became public in October 2017). This nearly gave her a shock, and she complained, not without reason, that she had years of data stored in her account. She asked me if anything could be done. I said yes, and I explained the procedure (in a nutshell: synchronize the account with an email program on a PC and back up that file).
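As an aside, for readers curious about how that "synchronize and back up" procedure might look in practice, here is a minimal Python sketch, assuming IMAP access is enabled on the account; the server address, credentials and folder name are placeholder assumptions, not a recipe tied to any particular provider, and an ordinary desktop email client accomplishes the same thing.

import imaplib
from pathlib import Path

# Placeholder values; replace with the provider's real IMAP server and your credentials.
IMAP_SERVER = "imap.example.com"   # assumption: the provider's IMAP endpoint
USER = "someone@example.com"       # placeholder address
PASSWORD = "app-password-here"     # many providers require an app password for IMAP
BACKUP_DIR = Path("mail_backup")
BACKUP_DIR.mkdir(exist_ok=True)

# Connect over SSL, read the inbox without modifying it,
# and save each message as an individual .eml file.
with imaplib.IMAP4_SSL(IMAP_SERVER) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX", readonly=True)
    _, data = imap.search(None, "ALL")
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        raw_message = msg_data[0][1]
        (BACKUP_DIR / f"{num.decode()}.eml").write_bytes(raw_message)

The resulting .eml files are in a standard format that most email programs can open or import later.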

People, lives, users, profiles, statistics

Most people have been entrusting their individual lives to the most colossal engineering work in human history: computers and the Internet; and the Internet not only in the sense of connectivity, but also the cloud, where we store our things, where voice commands are processed and where the routes recommended by the GPS app are calculated. I realized that afternoon, after talking with this professor, that new technologies have tended more and more to impose an idea that is, from my point of view, deeply misguided: to the machines, we are only statistics. Users. Profiles. Accounts.

Casually (myself included), and because on the other hand it is a fact we cannot avoid reporting, we say that Facebook has 2.4 billion users. Or that Gmail hosts 1.5 billion accounts. In a sense, it is true. In the same way that it is true that lettuce is a vegetable. Now, go to the greengrocer and ask for half a kilo of vegetables. It won't work, right? It's the same thing.

So Facebook has 2.4 billion users, but above all it has 2.4 billion individuals (let's set aside for now those who register more than one account and so on; it is not relevant). Gmail accumulates 1.5 billion lives. And the same goes for Netflix (154 million), Twitter (321 million), Instagram (1 billion), and so on.

These are huge numbers, and so it is hard to imagine that each one, each account, each profile corresponds to an existence. It is a sin of these times, and especially of the digital revolution, to have lost that perspective. That is why every site I visit these days shows me pictures of Japanese knives for preparing sushi. Do I want to buy one? Yes and no. The thing is that I did a search a week ago to answer a friend's question. The algorithms, and this is the central point, have a basic prejudice: they only see statistics, users, accounts, profiles; they do not perceive people or lives. They cannot even know what a person is.

That is, they do not understand that they have already fed me up with so many sushi knives, not only because of the repetitiveness, but because they are unaware of a) the reason why I did that search and b) the fact that the ones I like are too expensive for my budget. Similarly, not long ago, just after losing my dog Vicky, Facebook's algorithms informed me that my most-liked photo from 10 years ago was one in which Vicky appeared. Let's just say I did not like that very much. And these are two fairly mild examples. Other people have told me about much more shocking experiences. Traumatic would be a more appropriate word.

Recently, a friend told me that the only useful thing about Facebook is that it reminds him of his acquaintances' birthdays. As long as the acquaintance is not deceased (it has happened to me a couple of times), a truly surreal circumstance (and a very painful one) that reveals something we need to hurry to understand: by definition, algorithms are irresponsible.

Partly cloudy

Apart from posthumous birthdays, lugubrious reminders and misplaced notices, there is a digital iceberg called "the cloud."

I will not say that it is a bad idea, because that would be a lie. What's more, it is not even a new idea. Nor would I say that it is not useful, because not only is it useful, it is a logical and predictable evolution of the digital revolution. But there is something irrefutable and, in a way, inevitable: the cloud depends on private companies. While they are doing well, everything goes well. It's not that I wish anyone ill, but no company is going to reign forever. Some companies enjoy decades of leadership. But everyone knows that nobody is guaranteed success. The long decline of Yahoo! is proof of that. So are the almost instantaneous bankruptcies seen when the dotcom bubble burst between 2000 and 2002.

If it's a drink, a car or even the undersea cables that make the Internet work, there's no problem. If one provider disappears, another arises. Unless the product no longer interests anyone. But if it's your personal data, everything you used to keep in a physical filing cabinet or in a folder on your computer (properly backed up), things are different. Because that data can leak, disappear or cease to be accessible; in the most benign of cases, you will have to migrate everything to another service, something that is not always easy. Sometimes, a service going down for only a few hours can be enough to cause losses in the millions. The companies that provide these services are not liable for those losses. I have been offered the argument that there is nothing to be liable for, because these services are free. There are two answers to that argument.

First: no. They are not free. We pay for them with our personal data. Wikipedia, on the other hand, is free. PlantNet is free (and is currently in the middle of a fundraising campaign). But not Google, Facebook or Twitter. And secondly, since so many people and businesses (small and medium-sized, in general) depend on these services, the responsibility exists de facto. It is a rule of capitalism, one that Europe understands more clearly than the United States, that with economic power comes responsibility.

Political-corporate labyrinths aside, my best advice is to stay attentive to news about the companies to which we have entrusted our digital lives. Today that includes even our music, movies and books. That is no small thing.

As for the conflict raised at the beginning, is it possible to assign some responsibility to the algorithms? Given the current state of these technologies, I fear the answer is no. As long as we cannot program, at least in part, the unfathomable human condition, then will, intention, motives and even birthdays will remain as opaque to the algorithms as the instincts that lead ants to do what they do. Big data knows us very well, even better than our spouses or our friends. But it still does not understand us.
