What It Means To You


Deepfakes use Artificial Intelligence (AI)-powered deep learning software to edit images and videos so that people and entities appear to do and say things they never did.

After deepfakes were used to create videos of the likes of Mark Zuckerberg and of Simon Cowell on America’s Got Talent, the technology went a step further.

AI human partner in the media

The idea of an AI human partner was explored in the 2013 movie Her, in which a writer falls in love with an AI persona. Years later, during the pandemic, Realic, a Florida-based augmented reality company, announced plans to create the world’s first AI-based virtual partner.

This virtual partner, brought to smartphones through a combination of artificial intelligence, VR, and AR, promised to offer emotional support and create a feeling of companionship during periods of loneliness. But could an AI human partner really do that?

The Cyber Express spoke with industry experts to gain insight into the impact of an AI human partner on people.

The thoughts and observations they shared clarified the extent of that impact, and pointed to one conclusion: the impact depends on several factors.

What experts said about the use of an AI human partner

Speaking about the impact of an AI human partner, TEDx speaker and lecturer Dr. Dorothea Baur said, “The impact of an AI partner on individuals’ emotions can vary significantly depending on the context, design, and usage of the AI system.”

Dorothea said that having an AI human partner could be empowering and entertaining, while the negative impacts could include anxiety, over-attachment, and manipulation.

Addressing the perils of using such tools, Dorothea added, “Overall, companion bots are data brokers and surveillance machines relying on the pseudo-science of emotion recognition which is empirically invalid and normatively speaking not desirable.”

AI human partner – reality, perception and manipulation

Marisa Tschopp, a human-AI interaction researcher, addressed how having an AI partner raises questions about reality and perception. She said, “Reality can be enhanced by VR and AR and so on, for greater joy or for therapy (e.g., anxiety due to arachnophobia).”

“However, people also may get lost – just like the ‘AVATAR Depression’. The movie is so bright and powerful – people were unable to recognize and appreciate the beauty of the normal world.”

Can an AI human partner manipulate a human being who tries it? “Yes,” said Marisa, adding, “an AI human partner, particularly if designed with manipulative features, can potentially create a negative impact on the user.”

Since AI systems can be developed to analyze data and offer personalized responses, they can be leveraged to influence users’ behaviors, emotions, and even beliefs, Marisa added.

Citing emotional manipulation as an example, Marisa said, “AI human partners designed to simulate empathy and emotional understanding might exploit users’ emotional vulnerabilities to steer their decisions or responses in a specific direction, like buying stuff they don’t need or sharing data they should not.”

Other impacts of having an AI human partner

Interface of exercise with a virtual partner (Photo: MDPI)

A study that tested the impact of a virtual partner on individuals’ exercise found that enjoyment levels were high. However, the experience inhibited the exercise level and harmed exercise perception.


During the coronavirus isolation, having virtual support was considered a boon at a time when people could not safely travel and meet their near and dear ones.

However, now that there is no such barrier, would having an AI human partner affect the psychology of people in a negative way?

Some of the foreseeable impacts of a virtual partner could be as follows –

  1. Resistance to having a relationship with a human, who is unpredictable and cannot be programmed
  2. Dependency on a virtual partner that is a technology-driven tool and may not last
  3. Distancing from the physical support offered by a human being
  4. Limiting oneself to the trained and defined behaviors and reactions of the AI human partner
  5. Forming false expectations of humans based on the AI partner’s reactions

These impacts could surface when the person meets people outside the confines of the machine. For instance, at school, at work, or in a recreational space, where people express their feelings in ways that are not programmed through code, interactions can feel harder to deal with.

This can lead to disappointment and agony for someone who likes, or has become dependent on, an AI human partner.

Human empathy vs AI chatbot

Can a bot offer emotional support? Does designing a lookalike through deepfakes, or having a good-looking AI human partner, actually help reduce loneliness and emptiness among individuals?

AI chatbots have been tested for their ability to offer emotional support, disclose social cues, and provide emotional validation.

A study investigated how an individual looking for support during a stressful time responded to an AI chatbot in comparison to a human offering emotional support.

“The emotional support from a conversational partner was mediated through perceived supportiveness of the partner to reduce stress and worry among participants, and the link from emotional support to perceived supportiveness was stronger for a human than for a chatbot,” the study confirmed.

Reciprocal self-disclosure from a human being had a stronger positive effect on worry reduction through emotional support. The study’s other observations about offering support, or the lack of it, were as follows –

  1. In the absence of emotional support, a self-disclosing chatbot reduced stress less than a bot offering no response at all.
  2. Human partners were more likely to be perceived as real sources of support than AI bots.
  3. Human partners may be more beneficial than AI human partners.
  4. AI partners may need to depend on the data fed to them and on the social cues present in the conversation.

Humans can gauge and understand another person’s conversation, connect the information with their own experiences, and offer support that was not explicitly asked for.

They can think about the past and the present and estimate what the person might be asking for, without the person explicitly referring to it or saying it outright.

In such situations, an AI human partner will respond with a canned statement like, “I am sorry. I do not understand that. Perhaps we can discuss it in more detail.” Besides these emotional limitations, an AI human partner will stop seeming desirable once it becomes clear that it cannot innovate or think for the individual’s progress beyond a certain limit.
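To illustrate why such canned replies appear, here is a minimal sketch of the rule-based fallback pattern that simple companion bots rely on. The cue list and reply strings are hypothetical examples, not taken from any specific product:

```python
# Minimal, hypothetical sketch of a rule-based chatbot fallback.
# The cues and replies below are illustrative only.

CANNED_FALLBACK = (
    "I am sorry. I do not understand that. "
    "Perhaps we can discuss it in more detail."
)

# Hypothetical mapping from recognized cues to scripted responses.
SCRIPTED_REPLIES = {
    "lonely": "I'm here with you. Tell me more about how you feel.",
    "stressed": "That sounds hard. What has been weighing on you?",
}

def reply(user_message: str) -> str:
    """Return a scripted reply if a known cue appears; otherwise fall back."""
    text = user_message.lower()
    for cue, scripted in SCRIPTED_REPLIES.items():
        if cue in text:
            return scripted
    # Anything outside the trained and defined behaviors hits the fallback.
    return CANNED_FALLBACK

print(reply("I feel so lonely today"))   # scripted empathy
print(reply("My cat learned a trick"))   # canned fallback
```

Everything outside the programmed cues collapses into the same fallback, which is exactly the limitation described above.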

Marrying an AI human partner

36-year-old New York woman (right) with the AI human partner she married (Photo: AI Magazine)

Technophilia, the strong urge to try technological gadgets and devices, has driven widespread growth in the development of bots and AI-powered tools.

Humans have even forayed into marriage with virtual partners; however, the longevity, impact, and legal standing of such unions will only become known in time.

Another problem with an AI human partner is the risk of it being hacked. All the data the individual has fed into it would become accessible to the hacker and could end up on the dark web, creating more remorse than joy.

AI human partner, a mixed bag of emotions


While an AI human partner is no replacement for a real human being, with whom a relationship can be built on reality, having an online partner in dire situations can help calm someone in distress. Users may turn to an AI human partner to explore what it offers and how it can be trained to respond.

However, after a point, it is bound to lose value and create a feeling of emptiness, just like most passing fads, apps, and tools. It can still be used as a tool to build interpersonal skills by beginners and those with anxiety: an AI human partner will listen and offer to talk without actually judging a person.

Hence, it is essential to understand the boundaries of using AI support for honing and improving such skills.

Users looking to get the most out of AI-powered human partners need to understand that such tools cannot offer long-term support.

App stores offer several AI human partner options that perform a range of functions as programmed by their developers. It would be wise to understand the impact of using, and feeling dependent on, such applications.

Overusing such applications and bots may distort reality, giving the user a sense of power and an expectation of similar submission from human beings. People play video games, and children play with dolls and toys.

Having lifelong memories of either is only natural, as these offer support and entertainment at the same time and mean different things to different people.

However, just as with toys and games, individuals are expected to stop using AI human partners when doing so starts hampering their day-to-day activities and commitments.




