In the AI era, human empathy is still fundamental

People value expressions of empathy more when they believe they come from a human being, even when the response was actually generated by artificial intelligence (AI).

A study led by the Hebrew University of Jerusalem and published in Nature Human Behaviour recruited more than 6,000 participants across nine experiments to examine whether empathy is perceived differently depending on whether it is labeled as coming from a human being or a chatbot.

The result was that responses attributed to a human being were perceived as more supportive, more emotionally resonant and more caring than identical responses attributed to AI, the university said.

In all cases, the responses were produced by large language models (LLMs), and yet participants systematically described the ‘human’ responses as more empathic, more supportive and more emotionally satisfying than the identical ones attributed to AI.

“We are entering an era in which AI can produce responses that look and sound empathic,” but people still prefer “to feel that another human truly understands, feels with them and cares,” said Anat Perry of the Hebrew University of Jerusalem, an author of the study.

The preference was especially marked for responses that emphasized emotional sharing and genuine care (the affective and motivational components of empathy) rather than mere cognitive understanding.



Participants were even willing to wait days or weeks for a human response rather than receive an immediate one from a chatbot.

In addition, when participants believed that an AI might have helped generate or edit a response they thought came from a human, their positive feelings dropped significantly.

Nowadays, Perry added, running emails and messages through AI has become commonplace, but “our findings suggest a hidden cost: the more we rely on AI, the more we risk our words coming across as hollow,” she said.

As people begin to assume that every message is AI-generated, “perceived sincerity, and with it the emotional connection, may begin to disappear.”

Although AI shows promise for use in educational, healthcare and mental health settings, the study highlights its limitations.

AI can help scale up support systems, “but in moments that require a deep emotional connection, people still want the human touch,” said the researcher.

With information from EFE.
