
Ethical Concerns Mount as AI Deadbots Raise Questions of Digital Dignity

In this post:

  • AI deadbots recreate the personas of deceased loved ones, raising ethical questions about emotional distress and digital dignity. 
  • Cambridge University researchers recommend clear safety protocols so AI deadbots can be developed without putting users at risk.
  • Sound ethical standards and regulation are needed to set boundaries for AI in the digital afterlife industry.

AI experts warn that deadbots, AI-powered digital recreations of the deceased, are about to become a reality. They argue the technology must be regulated to prevent the psychological harm of such systems “haunting” their creators and users.

Such services are already technologically possible and legally permissible. According to the University of Cambridge scientists, companies could build chatbots from saved conversations with a lost loved one to, in effect, “call grandma back.”

Some companies already offer such services, reminiscent of the Black Mirror episode “Be Right Back”: a chatbot imitates the language patterns and personality traits of a deceased person using the digital footprint they left behind, the research says.
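The paper does not describe any implementation, but conceptually such a service amounts to conditioning a general-purpose language model on a person’s archived messages. The sketch below is purely illustrative of that idea; every name in it (build_persona_prompt, generate_reply, PERSONA_NAME) is a hypothetical placeholder, not any real product’s API.

```python
# Hypothetical sketch: conditioning a generic chat model on a person's
# digital footprint so its replies mimic their tone and vocabulary.
# All identifiers here are illustrative placeholders, not a vendor API.

from typing import List

PERSONA_NAME = "Grandma"  # placeholder name for the recreated persona


def build_persona_prompt(saved_messages: List[str]) -> str:
    """Assemble a system prompt from archived chats so a generic model
    imitates the deceased person's language patterns."""
    examples = "\n".join(f"- {m}" for m in saved_messages[:50])  # cap excerpt size
    return (
        f"You are role-playing as {PERSONA_NAME}. "
        "Match the tone, vocabulary, and quirks shown in these real messages:\n"
        f"{examples}\n"
        "Stay in character and reply as this person would."
    )


def generate_reply(persona_prompt: str, user_message: str) -> str:
    """Stand-in for a call to any chat-completion model; a real service would
    send persona_prompt as the system message and user_message as the input."""
    return f"[model reply conditioned on persona prompt, to: {user_message!r}]"


if __name__ == "__main__":
    archive = ["Dinner's at six, don't be late love!", "Did you water my roses?"]
    prompt = build_persona_prompt(archive)
    print(generate_reply(prompt, "Hi Grandma, how are you?"))
```

The simplicity of this pattern is part of the researchers’ point: no special infrastructure is required beyond an archive of messages and access to an off-the-shelf model.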

Safety regulations urged to protect digital dignity

The study, published in the journal Philosophy and Technology, presents examples of how companies might use deadbots, such as advertising products to a person in the manner of a deceased loved one, or distressing children by insisting that a dead parent is still “with you.”

In any of these cases, the paper suggests, untrustworthy companies and reckless business ventures could cause long-term psychological damage and violate the rights of the deceased.

The researchers suggest that daily interactions with a deadbot can carry an overwhelming emotional weight. They also argue that this kind of emotional support can hinder grieving, a natural way of coping with loss.


An ethical minefield

 Dr Katarzyna Nowaczyk-Basińska, one of the study’s co-authors at Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI), stated, “Rapid advancements in generative AI means that nearly anyone with internet access and some basic knowhow can revive a deceased loved one.” 

This area of AI is an ethical minefield. The most essential task is to ensure that the dignity of the deceased is not violated by profit-driven services such as digital afterlife providers. A major risk comes from businesses that commercialize their online legacy infrastructure through advertising.

Involving children is particularly risky, since they stand to suffer the worst outcomes. ‘Deadbot’ companions may soon appeal to parents looking to console children who have recently lost a mother or father.

Impact on the grieving process

However, no studies have shown whether such efforts are appropriate. Their impact could be deeply unpleasant, and they could interfere with the normal mourning process.

According to the research paper, “No re-creation service can prove that allowing children to interact with ‘deadbots’ is beneficial or, at the very least, does not harm this vulnerable group.”

To safeguard the dignity of the dead and the psychological well-being of the living, the researchers propose a range of best practices, which could ultimately be enforced through regulation.


These platforms need protocols for “retiring” deadbots, limiting interactive features to adults only, openly acknowledging the limits of any artificial recreation, and being fully transparent with customers.

Global reach and varied applications

According to the researchers, platforms that recreate the dead with AI for a modest fee already exist. One example is Project December, which initially used GPT models before moving to its own systems; there are also apps such as Hereafter.

The study also points to Chinese counterparts offering the same services. In 2021, Joshua Barbeau drew public attention when he used GPT-3 to develop a chatbot that spoke in the voice of his late girlfriend. In 2015, Eugenia Kuyda turned the texts of a deceased friend into a chatbot, paving the way for Replika, the hugely popular AI companion app.

The technology is not limited to chatbots, either. In 2021, the genealogy site MyHeritage introduced Deep Nostalgia, a feature that generates animated videos from single photos of users’ forebears. Once the feature went viral, the company had to admit that many users found it creepy. As these technologies advance, development can only proceed responsibly if ethics stay front and center.
