
    Kate Darling, robot expert: “We shouldn’t laugh at people who fall in love with a machine. It will happen to all of us”

    Kate Darling (Rhode Island, USA, 1982) investigates the legal, social, and ethical effects of robots at the MIT Media Lab. She has spent years observing how humans and robots relate to each other, and she has several robots in her own house. With the arrival of the artificial intelligence (AI) revolution, she answers questions about the future evasively: “It’s all so speculative,” she says, “that it’s hard to figure out.” Even so, there has never been a better time for her work, because we have never been so close to living with robots: “It is an exciting moment. I feel very lucky to be living it.”

    Darling, a 41-year-old American, is the author of the book The New Breed (still without a Spanish translation), in which she argues that the best way to understand what a robot is is to compare it with animals, not with humans. In mid-June she will visit Barcelona to take part in the activities of the Sonar + D festival, invited by the consulting firm Seidor. In this conversation with EL PAIS, conducted by video call from her home, she tries to explain the enormous novelty represented by the language models led by ChatGPT.

    Question. How has the success of ChatGPT changed the way you see the future of robots?

    Answer. It is a huge change. Many people did not anticipate it. If you had asked me a few years ago whether we would have this kind of sophistication, I would have said no, never. It is a game changer in many ways. What will happen now? Nobody knows. For me, one of the big questions is: will the capabilities we are seeing in generative AI translate into being able to control and program physical robots? That kind of intelligence and learning would be really amazing. I’m not sure what will happen.

    Q. There is no agreed definition of a robot. Why is it so difficult to define?

    A. There is no universal definition. Depending on the field, you will get one definition or another. Throughout history, we have called things robots when they were new technologies that people didn’t understand, things that seemed to have something magical about them. Then, once the technology becomes more common, people stop calling it a robot and start calling it a dishwasher or a vending machine.

    Q. There is a lot of debate now about a possible extinction caused by an AI capable of deciding for itself.

    A. I am a very practical person, and I don’t see how something like that could develop. There is not much we can do to predict whether it will happen, and nothing can protect us from it other than stopping AI research, which won’t happen. I’m more interested in whether people will believe that the AI is conscious, regardless of whether or not it actually is. That is something we must face as a society.

    Something new, a technology that people don’t understand, gets called a robot. Then they start calling it a dishwasher or a vending machine.

    Q. To understand what a robot is, you say it is better to compare it with an animal than with a human. Do you still hold that idea after ChatGPT?


    A. Yes. I know the comparison is harder now that AI uses human language. But the underlying reason matters even more: it is not so valuable or useful to create something that we already have, something we can already do. It is more valuable to have machines that can complement us, that can be partners in what we are trying to achieve. Many of the tasks generative AI will take on are done by humans today, but I think the real potential of the technology is to be a tool that combines with other human abilities, not just a replacement.

    Q. You see robots soon becoming members of our families. What will they be like?

    A. In much of the research on human-robot interaction, people already treat robots as living beings, even though they know they are just machines. And people love doing it. We anthropomorphize robots, project ourselves onto them, give them crazy human qualities and emotions. People also understand that they are interacting not with a person but with something else. Robots will be a new type of social relationship: it may be like a pet, or it may be something totally different, which is why my book is called The New Breed. But I don’t think it will necessarily replace human relationships. It will be something different, and it will definitely happen.

    The true potential of the technology is as a tool that combines with other human abilities, not just a replacement.

    Q. You have robots at home. What are they like? What do they do?

    A. I have a couple of different types. We have a baby seal, a dinosaur robot, a robot dog, and then other robots that are more for helping around the house, like an assistant or a vacuum cleaner. They all do different things, and my kids interact with them differently depending on whether they see them as a tool or a partner.

    Q. Can companion robots be turned off, or are they always on?

    A. We turn them off, although some are designed to be always on. The dog, for example, looks for its charging spot when the battery is low and lies down as if it were going to sleep while it charges.

    Q. Are these pet robots ready to enter millions of homes?

    A. We have already seen, even with this primitive and very expensive technology, that the people who have it develop meaningful connections. The technology is not going to get worse. The barrier to home robots is not the complexity of the robot, but that people do not yet know what social value owning one would give them. Once enough homes have one and people experience the positive effects, there will be a tipping point and more people will want them.

    ‘Her’ is about an application launched by a company. What is its business model? What are they trying to do?

    Q. What do you mean by “positive effects”?


    A. People didn’t use to see the value of having a pet. The animal had to fulfill a function: the dog guarded the house and the cat caught the mice. But then people realized that the relationship with the pet and the emotional connection were the true value, and now they have pets for that reason. The same will happen with robots. Right now they have a function: being assistants, vacuuming the floor. But once there are enough robots for people to interact with, they will see the value in the social connection and want them for that reason as well.

    Q. You have said that the movie Her, about a human who falls in love with a machine, worries and excites you in equal measure. What ethical issues do you see?

    A. Her is about an application launched by a company. There are many questions: what is the company’s business model? What are they trying to do? They are probably trying to maximize their profits. And these are people in a very vulnerable position, because they already have a very strong emotional connection with an application, a device, a robot. This is already happening: the Replika app, which has millions of users, has people emotionally attached to it. I am also concerned about privacy and data collection. You could emotionally manipulate people into buying products and services or into changing their behavior, not in their own interest but in the interest of a company.

    Still from the movie ‘Her’, with Joaquin Phoenix.

    Q. You have said you can imagine a sexual app exploiting a user’s weakness in the heat of climax.

    A. Yes.

    Q. Isn’t that bad marketing?

    A. It’s a little more subtle than that. But Replika has in-app purchases that people buy, so it’s easy to manipulate them into spending money or to show them advertising. These are consumer protection issues, because it is persuasive in an overly manipulative way.

    Q. Will there be a reasonable way to monetize these apps?

    A. Yes, when consumers recognize the value of an artificial companion and pay enough money for it. The company can sell it and that’s it. Do I think that will happen? No. But it would be the best way to protect privacy and not have to emotionally manipulate anyone.

    Q. Many people will be surprised that someone would humanize these machines. But we are programmed to do so.

    A. Yes, and it won’t go away. If something moves around us, our brain assumes it is alive. That is how our brains work, and this subconscious projection happens not only with moving objects but with a chatbot or anything that imitates human behavior, anything that gives off signals and sounds we recognize. Scientific evidence shows that we do it from an early age. It is deeply entrenched and it will continue to be there.

    [What worries me most are] the companies, the structure of incentives, and the political and economic problems. It’s a matter of governance, not technology

    Q. Robots will die. Could we end up divorcing a robot, or abandoning one in a ditch, because of a software update?


    A. Yes, probably. Relationships can end in any number of ways, and we will have real relationships with robots, whether they resemble human relationships, human-pet relationships, or something new. As such, they may end in different ways, whether through death or through someone deciding they no longer want to continue. All kinds of things will happen. It is easy to foresee, because people develop emotional relationships with artificial entities. But there are still many people who do not understand that.

    Q. They don’t understand that someone could fall for a machine?

    A. Yes. There are already stories of people falling in love with their chatbot. Most people think it could never happen to them, that the people who fall in love are sad and lonely and that they themselves are not. We are all susceptible to bonding with machines, especially when they become more interesting and more available. We need to take this more seriously instead of laughing at someone who falls in love with a robot, because it will happen to all of us.

    Q. Isn’t it amazing that the machine we fall in love with is just a screen?

    A. Not really. People opened up even to the most primitive chatbots. At MIT they created Eliza in the 1970s and people told it personal things. We are suckers for anything that gives us signals we recognize, even if it’s just a screen. The reason I love physical robots is that the physical presence adds a more visceral layer that makes them more compelling.

    Q. But you don’t like humanoid robots.

    A. No, they’re boring.

    Q. You prefer R2-D2, a “trash can on wheels.”

    A. I like robots that are designed to be cute and relatable, but they don’t have to look human. It’s much more interesting to create a new form, and sometimes it works even better, because if a robot looks too humanoid, expectations about how it should behave and what it should do end up being disappointed. With something that looks like an animated garbage can, you don’t have those expectations.

    Q. Are you more excited or concerned by these developments?

    A. Both.

    Q. What worries you the most?

    A. The companies, the structure of incentives, and the political and economic problems. It is a matter of governance, not technology.




    Source: EL PAIS
