An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and his killer: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time an AI avatar of a victim, in this case a dead man, had addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
Using it at a trial like this is completely unhinged. That being said, it might help victims of violent crimes give their testimony (I’m mainly thinking of SA). I could see this having a use in prisoner rehabilitation as well, with the explicit consent of the victim or their family, of course.
It wasn’t testimony; it was part of a victim impact statement. Vibes should never be used as testimony or evidence, because LLMs are infamously biased and can’t be sequestered.
Yes, in this case it is completely unacceptable.
I’m saying, after that, that I would find it acceptable for victims to use an avatar to convey what they want to say if speaking in court is too difficult because of the trauma. It shouldn’t be an LLM, though; the victim should write the script.
It was written by the sister. It’s what she thought he would have said, and it was her victim statement.
The AI made it into a video.
I am talking about two different scenarios. My second comment is solely to clarify this point.