When algorithms decide for you: A warning about automation and the loss of freedom
Today, we often hear that artificial intelligence will be able to solve almost all of our problems: medicine will be personalized, government procedures will be streamlined, and recommendation systems will always pick the film or song that suits our tastes.
According to Andrea Rosales, a researcher in the Communication Networks and Social Change (CNSC) group at the Universitat Oberta de Catalunya (UOC) Internet Interdisciplinary Institute (IN3), our societies tend to see only the benefits of digitalization.
However, in an analysis of Marc-Uwe Kling’s novel “QualityLand,” a work of fiction depicting a world in which everything is governed by algorithms, published in The De Gruyter Handbook of Automated Futures, she and co-author Sara Suárez point out that there is a darker and less well-known side to these new technologies and to how they are being implemented.
The forced digitalization of society is excluding the most vulnerable groups, due to the in-built biases of algorithmic systems. Technologies may appear capable of solving our problems, but in practice they often cannot.
“Digital technologies generate large quantities of data and create the idea that it is possible to control and quantify most aspects of life,” explained Rosales, who is also a member of the Faculty of Information and Communication Sciences. “But many aspects of everyday life are not quantifiable, and many quantifications are crude approximations of reality.”
The algorithms that make decisions for us also make mistakes
This has a real impact on people’s everyday lives: often without being aware of it, they make decisions based on recommendations provided by algorithms, even though those algorithms may make errors or overlook important factors.
“For example, dating apps have changed the way we look for a partner, by emphasizing quantifiable aspects of our lives, which govern the system of priorities that displays the most popular users on these apps.”
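To see how quantifiable metrics can come to dominate, consider a minimal sketch in Python (a hypothetical illustration, not the apps’ actual code or anything described in the study) of a ranking that orders profiles purely by measurable engagement:

```python
# Hypothetical sketch: profiles ordered only by quantifiable engagement
# metrics. The field names and scoring formula are invented for illustration.
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    likes: int          # quantifiable
    reply_rate: float   # quantifiable
    # kindness, humor, shared values: never recorded, so never ranked

def popularity(p: Profile) -> float:
    # Invented scoring rule; real systems are opaque, which is the point.
    return p.likes * p.reply_rate

profiles = [
    Profile("A", likes=900, reply_rate=0.2),
    Profile("B", likes=120, reply_rate=0.9),
    Profile("C", likes=40, reply_rate=0.95),
]

# The most "popular" users are shown first; the rest fade from view.
for p in sorted(profiles, key=popularity, reverse=True):
    print(p.name, popularity(p))
```

Anything the data does not capture simply never influences who gets seen.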
What’s more, data is not neutral. One risk is that datasets which fail to represent disadvantaged groups can reproduce or even worsen existing prejudices in society. This problem is compounded by the lack of laws regulating the algorithms in use today, most of which are opaque: very few people know how they work.
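As a toy illustration of how under-representation can skew outcomes (again a hypothetical sketch, not taken from the study or any real system), consider a cutoff calibrated on data dominated by one group:

```python
# Hypothetical sketch: a score threshold tuned on an over-represented group
# systematically fails an under-represented one whose data looks different.
import random

random.seed(0)

# Toy data: (score, qualified) pairs. Everyone here is equally "qualified",
# but group B is scarce and scores lower on this particular metric.
group_a = [(random.gauss(70, 10), True) for _ in range(950)]
group_b = [(random.gauss(55, 10), True) for _ in range(50)]

# "Calibrate": pick a cutoff that rejects the bottom 10% of the pooled
# (and therefore mostly group A) data.
all_scores = sorted(s for s, _ in group_a + group_b)
cutoff = all_scores[len(all_scores) // 10]

def acceptance_rate(group):
    return sum(s >= cutoff for s, _ in group) / len(group)

print(f"cutoff: {cutoff:.1f}")
print(f"group A accepted: {acceptance_rate(group_a):.0%}")
print(f"group B accepted: {acceptance_rate(group_b):.0%}")
```

Both groups are equally qualified by construction, yet the under-represented group is rejected far more often, simply because the threshold was tuned to the majority’s data.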
Rosales’s analysis of the novel “QualityLand” shows how this work of fiction presents a world that is increasingly familiar to us. Its themes reflect the problems that our contemporary societies face: lack of freedom, frustrations over forced digitalization, predominant techno-optimism, the hyperdatafication of everyday life, and the threats that data-based systems pose to democracy.
‘Peter’s problem’
Peter Jobless is the central character in “QualityLand,” and he leads a life that many of us can relate to: the online store knows exactly what Peter will want and sends it to him before he even thinks he wants it; he uses a platform that finds the right partner for him; and he receives constant ratings for his actions in the digital world. TheShop, QualityPartner and RateMe, the names of these applications, could easily be replaced by Amazon, Tinder and Instagram.
The character’s opinions do not matter very much, because in this world the algorithms claim to know him better than he knows himself: they have access to every aspect of his life and can even reach his unconscious. Nevertheless, Peter senses that something is wrong. He feels detached from his friends, his partner and the objects that the algorithm has chosen for him.
Data can have blind spots. It may contain biases, introduce mistakes, or simply record some parts of reality while leaving others out. But when systems make mistakes, they often assume that the problem lies with the user (what the novel calls “Peter’s problem”), failing to recognize that such errors are an indicator of how marginalization operates when decision-making is automated.
Forced digitalization
According to Rosales, one of our biggest contemporary problems is “forced digitalization.” The characters in the novel are forced to do everything in the digital world because they have no alternative. And this lack of analog alternatives gives the government and tech companies even greater control over their lives. This is something that we see in our own societies every day. It is increasingly common to find public services where face-to-face appointments are limited, or where some information cannot be accessed other than by using the internet.
Rosales points out that the negative effect of having no alternative to the digital is that people whose technical knowledge is limited or who are not interested in these topics are excluded, or they become dependent on other people.
A critique of techno-optimism
Rosales uses “QualityLand” to debunk the idea that technology will always make our lives easier, for example by making better choices for us. Instead, it can create new problems, such as loss of autonomy and growing inequality, along with risks arising from the excessive use of data.
“Peter’s problem,” Rosales explained, “is his struggle to fit into a system governed by data that he doesn’t understand.” Opaque algorithms make decisions for him, although he does not know their processes or the basis for their reasoning.
“This opacity is often justified as a trade secret, a security measure, a fraud prevention policy, or due to the complexity of the algorithms. Laws are unable to cope with all the social and political challenges they pose. But in addition to biased data and inaccurate predictions, the datafication of everyday life impacts people’s everyday lives, how they interact and how they are part of society,” continued Rosales.
A new kind of democracy
Another interesting issue the novel raises is the future of democracy. In “QualityLand,” an android called John of Us runs for president of the country, because his party believes that citizens will trust a robot more than a human: a robot has more data and more memory, and can seek more objective arguments when making decisions. So is this something we should be considering?
Rosales points out that this idea, known as “augmented politics,” has found favor in some sectors of academia and among the big technology companies. Nevertheless, it poses new threats: it can exclude minorities in its pursuit of benefits for the majority and, more worryingly, it can become a decision-making system that is very difficult to hold to account.
“In ‘QualityLand,’ the citizens lose control over data and how it is used to make crucial decisions that affect their fundamental rights and freedoms,” Rosales explained.
Working to protect the most vulnerable people and seeking models in which the government maintains its independence from big tech companies are two steps that the researcher believes are essential for protecting democracies and basic rights.
More information:
Andrea Rosales et al., “Peter’s Problem: An Analysis of the Imaginaries About Automated Futures Portrayed in QualityLand,” chapter 3 in The De Gruyter Handbook of Automated Futures (2024). DOI: 10.1515/9783110792256-003
Provided by Open University of Catalonia
Citation: When algorithms decide for you: A warning about automation and the loss of freedom (2024, December 17), retrieved 17 December 2024 from https://techxplore.com/news/2024-12-algorithms-automation-loss-freedom.html