A few days ago I asked this question in class, and someone answered:
The answer is simple: everyone can do whatever they want, as long as it doesn’t land them in jail.
I replied that it’s not really like that. There are things I cannot do because they are impossible, no matter how hard I try and no matter how safe I am from imprisonment. I can’t create a circular square, not because anyone who attempts it is threatened with jail time, but because it’s impossible.
I can’t marry my robot, because marriage is a matter of two, and here there are no two: a robot is not a subject capable of taking on the commitment that marriage entails. Nor can a robot hire anyone, apply for a loan, or buy a house (obviously, when a robot does these things on behalf of a human, the action is attributed to the human, not to the robot, just as we do not attribute a killing to the gun or the knife, but to the person who wielded it).
You can’t marry your pet either, although the reasons in this case are somewhat different, because a pet is, in some sense, a subject. Your pet can adopt you (that is, accept you as its owner and companion), but it cannot take on a bond equivalent to the commitment of marriage.
And we can go on: you can fall in love with your Alexa, but Alexa cannot fall in love with you. It can simulate emotion, but it cannot experience it. That is far, far below even what a pet can do.
The robot is not a subject capable of taking on personal commitments. It cannot put itself on the line, simply because it is not a “someone.” A robot can be programmed to “do,” but not to “want.” A robot cannot be programmed to want to marry (although, of course, it can be programmed to say “I do”). In short, I can love my robot very much, but my robot cannot love me, neither much nor little.
I’ll start to believe that the much-touted AI is possible when someone explains to me how you program a machine to do what it wants. Alexa, LaMDA, and ChatGPT don’t do that, nor do they come close, and the proof is that they are subject to external quality control. They only do what we know how to order them to do, as Ada Lovelace observed some 180 years ago, and as Alan Turing was unable to refute.
Robot nationality
In October 2017 it was news that Saudi Arabia had granted citizenship to the robot Sophia. Among the many criticisms it received, I’ll keep this one by Hussein Abbass of the University of New South Wales, Australia (here, in Spanish). An unfortunate joke in very bad taste (and that’s putting it mildly), which offends the real citizens who are mistreated in that country and who now see their legal status placed below that of a machine recognized with more rights than their own. It’s hard to believe, but the media echoed Sophia’s alleged “statements” in which she declared that she wanted to have a robot baby.

I say “alleged” statements because a robot can no more make statements than a book or a gramophone can. Carissa Véliz says much the same about Blake Lemoine and LaMDA in her candid and biting article, Why Google’s algorithm is not a person:
Believing that LaMDA is a sentient being is like having a mirror and thinking that your reflection is your twin, living a life parallel to yours on the other side of the glass. The language this AI uses is the reflection in the mirror. Its use of language is more like a book, an audio recording, or speech-to-text software than like a conscious person. Would you feed a book if it said “I’m hungry”? The words the AI uses are the ones we have used, reflected back at us, arranged statistically in the patterns we most tend to use.
That is, the AI does nothing but give us back our own image, our own words, scrambled a million times, without contributing anything truly original. Sophia (or LaMDA, or ChatGPT) is not a subject, so it is wrong to attribute to it the actions of a someone, such as expressing an opinion, wishing, drawing a conclusion, trying to persuade… Sophia can want a child about as much as a Famosa doll can. Figuratively speaking, we can use action verbs with grammatical subjects that are not living subjects, as when we say “the printer says it has no paper.” But this is not a true “saying”; it is the product of a purely mechanical process.
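To see just how mechanical that “saying” is, consider a deliberately trivial sketch in Python (a made-up illustration, of course, not how Alexa or LaMDA actually work): a couple of lines are enough to make a machine “say” it has no paper, or even answer “I do,” with nothing remotely like wanting behind the words.

```python
# Deliberately trivial sketch: a machine can be programmed to "say"
# things with no subject behind the words, just as a printer "says"
# it is out of paper. Purely illustrative; not from the article.

def printer_status(paper_sheets: int) -> str:
    # A condition check mechanically produces the "utterance".
    return "I have no paper" if paper_sheets == 0 else "Ready to print"

def robot_reply(question: str) -> str:
    # Programmed to say "I do" -- which is not the same as wanting to marry.
    return "I do" if "marry" in question.lower() else "I don't understand"

print(printer_status(0))                  # -> I have no paper
print(robot_reply("Will you marry me?"))  # -> I do
```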
Are those Saudis crazy? Well, the same craze reached Europe. In February 2017 an initiative was introduced in the European Parliament to grant legal personality to robots (here, the news). Of course, at the same time they would have to be equipped with a “kill switch” in case they got out of control.
Parliamentarians propose that robots capable of making autonomous decisions be considered “electronic persons.” They ask engineers to include a “kill switch” in each of them, allowing them to be quickly deactivated if they get out of control, to ensure people’s safety.
How nice to find, in the same paragraph, a request for legal personality and a kill switch for robots. Fortunately, common sense prevailed among the European Parliament’s advisers, as shown in this report by Nathalie Nevejans of the University of Artois:
In fact, advocates of the legal personality option have a fanciful vision of the robot, inspired by science fiction novels and films. They consider a robot (especially if it is classified as intelligent and humanoid) to be a genuinely thinking artificial creation, humanity’s alter ego. We believe it would be improper and misplaced not only to acknowledge the existence of an electronic person, but even to create any such legal personality. Doing so not only risks assigning rights and obligations to what is nothing more than a tool, but also tears down the boundaries between man and machine, blurring the lines between the living and the inert, the human and the inhuman. Moreover, the creation of a new type of person, an electronic person, would send a strong signal that could not only reignite the fear of artificial beings, but also call into question the foundations of humanism in Europe. Thus, assigning the status of person to an inanimate, non-conscious entity would be a mistake because, in the end, humankind would likely be demoted to the rank of a machine. Robots serve humanity and have no other role, except in the world of science fiction.
Android sex
Sex is biologically and evolutionarily related to reproduction, although in humans, who are a mixture of biology and culture, sex goes beyond being a mere reproductive function. But we would not have sex if we were not animals, mammals, primates. Even if robots could conceivably build exact copies of themselves, since they are not alive one cannot properly speak of biological reproduction, much less of sex, but only of a manufacturing process.
On the other hand, it cannot be denied that robots can be given features resembling a male or female body; they are then called androids and gynoids. An artificial voice can have a masculine or feminine timbre, or even a neutral timbre that is hard to classify (as happens, in fact, with some people’s voices).

A few days ago I asked DALL·E 2 to generate images of female and male robots (forgive the anthropomorphism: DALL·E is not someone you can ask for anything). It struck me, as can be seen from the pictures, that the female sexual features were more pronounced than the male ones.
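For readers who want to try the experiment themselves, here is a minimal sketch of how such a request looks through the OpenAI Python SDK; the prompts, model name, and image size below are illustrative assumptions, not the exact settings I used.

```python
# Hypothetical sketch: requesting robot images from DALL-E 2 via the
# OpenAI Python SDK (v1.x). Prompts, model, and size are assumptions.
# Requires an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

for prompt in ("a female robot, full body", "a male robot, full body"):
    result = client.images.generate(
        model="dall-e-2",  # the model discussed in the text
        prompt=prompt,
        n=1,
        size="512x512",
    )
    # The API returns a URL to each generated image.
    print(prompt, "->", result.data[0].url)
```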
We can say, perhaps, that robots can have gender, but not sex. Obviously, this is nothing more than an imitation of stereotyped features. Giving a robot long hair or a mustache does not really make it feminine or masculine.
—oo—
This article was sent to us by Gonzalo Génova, professor at Carlos III University of Madrid: “Apart from my computer science classes, I also teach courses in the humanities where I cover topics in the philosophy of technology and critical thinking.”
You can read all of my articles on Naukas at this link. Besides commenting through the Naukas social networks, if you would like to discuss in more depth you can visit my blog, On machines and intentions (reflections on technology, science, and society), where this entry will be available in two days.
If you have an interesting article that you would like us to publish on Naukas as a guest contributor, you can contact us.