
Beware of robot emotions: 'Simulated love is never love'

SAN FRANCISCO (AP) – When a robot "dies," does it make you sad? For many people, the answer is "yes" – and that tells us something important, and potentially worrying, about our emotional responses to the social machines that are starting to move into our lives.

For Christal White, a 42-year-old marketing and customer service director in Bedford, Texas, the moment came several months ago with the cute, friendly Jibo robot perched in her home office. After more than two years in the house, the foot-tall humanoid with its inviting, round screen "face" had begun to grate on her. Sure, it danced and played fun word games with her kids, but it also sometimes interrupted her during conference calls.

White and her husband Peter had already started talking about moving Jibo into the empty guest room upstairs. Then they heard about the "death sentence" Jibo's maker had placed on the product when its business collapsed. The news arrived via Jibo itself, which announced that its servers would be shut down, effectively lobotomizing it.

"My heart broke," she said. "It was like an annoying dog that you don't really like because it's your husband's dog. But then you realize you actually loved it all the time."

The Whites are far from the first to experience this feeling. People took to social media this year to say tearful goodbyes to the Mars Opportunity rover when NASA lost contact with the 15-year-old robot. A few years ago, scads of concerned commenters weighed in on a demonstration video from robotics firm Boston Dynamics in which employees kicked a dog-like robot to prove its stability.

Smart robots like Jibo are not alive, but that doesn't stop us from acting as though they were. Research has shown that people tend to project human traits onto robots, especially when they move or act in even vaguely human ways.

Designers acknowledge that such traits can be powerful tools for both connection and manipulation. That could be an especially acute problem as robots move into our homes – particularly if, like so many other home devices, they also become conduits for the data collected on their owners.

"When we interact with another human, dog or machine, how we treat it, is influenced by what kind of mind we think it has," says Jonathan Gratch, a professor at the University of Southern California studying virtual human interaction. When you feel that something has feelings, it now deserves protection from harm. "

The way robots are designed can influence people's tendency to project narratives and feelings onto mechanical objects, said Julie Carpenter, a researcher who studies people's interaction with new technologies. Especially if a robot has something resembling a face, has a body that resembles a human or animal, or just seems self-directed, like a Roomba robot vacuum.

"Even though you know a robot has very little autonomy when something moves in your room and it seems to have a sense of purpose, we connect it with something that has an inner consciousness or goal, "she said.

Such design decisions are also practical, she said. Our homes are built for people and pets, so robots that look and move like people or pets will fit in more easily.

However, some researchers are concerned that designers underestimate the dangers associated with attachments to increasingly life-like robots.

Longtime AI researcher and MIT professor Sherry Turkle, for instance, is concerned that design cues can trick us into believing some robots are expressing emotion back at us. Some AI systems are already presented as socially and emotionally aware, but those reactions are often scripted, making the machine seem smarter than it really is.

"The experience of empathy is not empathy," she said. "Simulated thinking can think, but simulated feeling never feels. Simulated love is never love."

Robot designers insist that humanizing elements are critical as robot use expands. "There's a need to appease the public, to show that you are not disruptive to public culture," said Gadi Amit, president of NewDealDesign in San Francisco.

His agency recently worked on designing a new delivery robot for Postmates – a four-wheeled, bucket-shaped object with a cute, abstract face; rounded edges; and lights that indicate which way it is going to turn.

It will take time for humans and robots to establish a common language as they move through the world together, Amit said. But he expects it to happen within the next few decades.

But what about robots that work with children? In 2016, Dallas-based startup RoboKind introduced a robot called Milo, designed specifically to help teach social skills to children with autism. The robot, which resembles a young boy, is now in about 400 schools and has worked with thousands of children.

It is meant to connect emotionally with children on a certain level, but RoboKind co-founder Richard Margolin says the company is sensitive to the concern that children could grow too attached to the robot, which features human-like speech and facial expressions.

So RoboKind builds boundaries into the curriculum, both to keep Milo interesting and to make sure children are able to transfer those skills to the real world. Children are recommended to meet with Milo only three to five times a week, for 30 minutes at a time.
