An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time an AI avatar of a victim—in this case, a dead man—had ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
Yes, in this case it is completely unacceptable.
What I'm saying, beyond that, is that I would find it acceptable for victims to use an avatar to convey what they want to say, if speaking in court is too difficult because of the trauma. It shouldn't be an LLM, though; the victim should write the script.
It was written by the sister. It’s what she thought he would have said, and it was her victim statement.
The AI made it into a video.
I am talking about two different scenarios. My second comment exists solely to clarify that point.