In recent months, the parents of a Parkland, Fla., school shooting victim used artificial intelligence to re-create the voice of their son, Joaquin Oliver, to advocate for gun control. A father used recordings of his late daughter and AI to create a video of her singing a birthday greeting for her mother. And Hollywood has digitally brought back stars like Carrie Fisher to appear in films after they died.

New platforms and apps have ushered in this new era of animating the voices or visual images of those we’re grieving for. Beyond the static photos, voicemail messages and videos of loved ones we once might have treasured, we can now keep an interactive version of that person, one with whom we can potentially converse.

But should we? What are the implications of such deathless digital avatars? We wanted to hear what our readers think, as part of a series of ethical dilemmas posed by AI.

The question was: AI is already creating chat, audio and video representations that mimic deceased people. Will conversing with them comfort loved ones, or prolong the feeling of loss and prevent them from moving on? If the chatbots are flawed, could they warp our understanding of who the person really was?

Some of our readers had a visceral, gut reaction. Others wrestled with the potential, and the problems they anticipate. Here’s what they shared.

Just no

I would not support mimicking the dead—may they rest in peace—with chatbots.

Digital dementia

Having photos or videos of lost loved ones is a comfort. But the idea of an algorithm, which is as likely to generate nonsense as anything lucid, representing a deceased person’s thoughts or behaviors seems terrifying. It would be like generating digital dementia after a loved one’s passing.

I would very much hope people have the right to preclude their images being used in this fashion after death. Perhaps something else we need to consider in estate planning?

A positive for many

I think this aspect of AI will ultimately prove popular as people live longer and face personal loss. The mimicry will be significantly flawed at first, but since this sort of application is likely to have great appeal, improvements will probably be rapid.

In the long run, this will lead to significant cultural shifts in our feelings about the elderly and our ancestors. It will be interesting and strange to watch. Warped and flawed, but also unusually positive for many of us: something to look forward to.

Warping reality

A machine isn’t an embodied human being. AI chatbots will prolong the effects of loss if used this way. They will divide humans. They will muddy our understanding of reality. Humans are made in the image of God; a machine cannot replicate that.

An individual choice

If someone can’t move on, it’s their business to remain in their emotional state or to change it. The idea that public policy should “nanny” people it deems in need of emotional correction is revolting.

A modern-day seance

I think this would work much the way seances did in the 19th century, when they were so popular. For the most part, they were a comfort to the grieving. But several known “robber barons” benefited from the process.

We’ll learn from mistakes

At 72, I have a good deal of experience in matters of death. We know that memories aren’t movies in our heads; they change every time we access them. It seems to me that AI could assemble a character out of photos, recordings, writings by and about the person of interest and other data to eventually form a reasonable composite.

I know that I would get a kick out of a more complete history than simply staring at a photo and struggling to remember. Many times the stories we remember and laugh about are lies anyway.

So bring it on. Progress always entails costs and uncertainty. The faster this can be accomplished, the faster the benefits accrue and the costs diminish. Mistakes will be made, but we’ll learn from them, and there are some good outcomes here worth striving for.

Memories are enough

I loved my parents and grandparents deeply and think about them daily. That said, I have no desire to interact with a simulation of any of them. I’m no psychologist, but I find that entire notion troubling.

No replacement

I would find no comfort from chatting with a mimic of a lost family member. There’s no possible way an algorithm could capture my family member’s humor, spontaneity, quirkiness and spirit. No way. That’s an AI bridge too far.

Sinister outcomes

So-called technological progress is already producing so many lonely, excluded and marginalized people. Crutches like this won’t prevent mass depression, neuroses and a huge increase in suicide rates.

Creepy, but brilliant

This is creepy, flawed, and outrageous; yet probably brilliant. I would love to have a conversation with numerous historical figures and I am certain many others would as well. Right or wrong—these digital avatars are coming! Flawed or not—they are coming! Where can I find a subscription?

A poor rendering

AI should not mimic the deceased. As humans, we always grow and change. At its best, AI will create an unchanging snapshot of us and our thoughts, a poor rendering of the depth and complexity that is “us.” At its worst, AI will guess at how we might change and grow, warping the memories of those who love or respect us.

Tempting premise

Grieving is a basic human need. Chatting with a recently deceased relative or friend could distort that healing process with this new form of entertainment. As much as I would like to talk to my grandfather or Charlie Chaplin, the conversation would neither be valid nor offer any new truths.

After someone’s been gone for several years, an AI chat might be enjoyable. The downside, however, might cause some people to gradually believe they are really talking to “Mom,” even though she’s been gone for 20 years. It’s a fascinating and tempting premise, calling for a new interpretation of the line, “See you on the other side.”

Cold comfort

For me, this is the same as going to a psychic. I know how difficult mourning can be, but it is better to work through the pain and cherish one’s memories than to use artificial comfort that perpetuates denial of one’s loss.

Sweet dreams

My mother is mostly alone in a nursing home. As her dementia advances, she regresses to her past: in particular, her hometown, her husband and her sister. I know she would love to have conversations with my deceased father or at least have him wish her a good night.

Everyone grieves differently

All our understanding of who a person really was is subject to a host of subsequent external stimuli, any of which may change what is, after all, only our perception.

Do we care on a personal level how others use AI to seek comfort, prolong mourning, redefine their perception of their deceased or attain closure? Upon being injured, some will choose the assistance of a crutch to aid healing; others may take a different approach. No approach can accurately predict the timeline and ultimate success of the healing process.

AI-generated representations of the deceased may become just another piece of the memorabilia we retain to prevent the complete disappearance of those we cherished or admired.

Lost in fantasy

Well-adjusted people know death is final and they deal with it. They would neither need nor want the fantasy of an AI bot as it would only interfere with a healthy grieving process.

For others who need help coping, for whatever reason, AI bots may be useful as part of a mental-health recovery plan supervised by a professional.

But the better the chatbots become, the greater the danger for people to get lost in their own alternative-reality narratives.

My memories will suffice

My memories and recollections of those who have died satisfy me with no need to continue my relationships via AI.

No conversation

I think audio and video representations of deceased people could be awesome, but NOT chatting with them. Just revisiting them saying their favorite phrases and laughing could be comforting. Actual “interaction” could be harmful psychologically, I think.

A distorted reality

This is a really bad idea. It doesn’t distort our understanding of who the person was because it isn’t the person at all. Period. Pretending so not only dishonors the actual person, it encourages us to live in a distorted reality and to avoid the real world.

The person is gone; you can’t have a heart-to-heart with AI because there is no heart. It is not thinking about anything because it is not “thinking” at all. It is using a complex algorithm to decide what the next word (not thought, not idea, but word) should be, based on a training data set that by definition includes people beyond the person in question. You can learn nothing about the person in question beyond what you already know.

I’m not sure what is more depressing—the idea that young women are pulling back from the world to curl up with a screen containing an algorithmic boyfriend who never existed, or distraught children convincing themselves that grandma isn’t really gone because they can “talk with her” any time they like.

Life can be difficult, but withdrawing into a computer-generated delusion to try to escape those difficulties isn’t the answer. It’s madness.

Only one exception

We should not try to preserve someone in a chatbot or a hologram after they die. People need to grieve and find closure. The only exception would be to bring back Michael Jackson for that final epic world tour that never happened. That would be OK.

Demetria Gallegos is an editor for The Wall Street Journal in New York. Email her at demetria.gallegos@wsj.com.