An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time the AI avatar of a victim—in this case, a dead man—had ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
There is no end to what the robolovers think is normal or acceptable. Fucking weird, creepy, and disgusting.
what a shit show.
this doesn’t belong in a court of law.
Do I need to submit paperwork somewhere giving formal non-consent to this kind of thing?
Clickbait title. It wasn’t testimony, it was part of a victim impact statement. The defendant had already been found guilty.
Still weird, but not as procedurally questionable as it sounds. Victim impact statements are meant to literally sway the judge toward a harsher sentence, and basically everything’s fair game.
This still has no place in a court of law if it was supposed to influence anything about the case.
how disgusting
Yeah … that is not OK.
“And I would like to amend my will to add the prosecutor’s kid as the beneficiary. And thank you, Judge, and might I add that you make that gavel look very powerful!”
The Simpsons did it. The lawyer dubbed his voice onto a video will.
We have this interesting combination of high tech and low tech literacy; before you know it, praying to the machine spirits will be a mainstream religion.
Let’s hope. At least the machines are real.
Praying to a machine that isn’t listening isn’t really any different than praying to the Moon.
There’s a marked difference, though: the Moon can’t actually generate words that you can digest. Some people might think the Moon is speaking to them, but their brains are just spicy - it is not the same as a machine.
Why wouldn’t the A.I. listen? My phone listens to every word I say.
This seems like it opens the door to asking for a mistrial.
If these are not the victim’s actual words, then WTF? This is completely contrived.
Hey! It’s Black Mirror!
I genuinely used to think Black Mirror was way too unbelievable to be scary
Hurry submit an AI video of Brian Thompson admitting that he accidentally shot himself as evidence
Using it at a trial like this is completely unhinged. That being said, it might help victims of violent crimes give their testimony (I’m mainly thinking of SA). I could see this having a use in prisoner rehabilitation as well, with the explicit consent of the victim or family, of course.
It wasn’t testimony, it was part of a victim impact statement. Vibes should never be used as testimony or evidence because LLMs are infamously biased and can’t be sequestered.
Yes, in this case it is completely unacceptable.
I’m saying that, beyond this case, I would find it acceptable for victims to use an avatar to convey what they want to say, if speaking in court is too difficult because of the trauma. It shouldn’t be an LLM, though; the victim should write the script.
It was written by the sister. It’s what she thought he would have said, and it was her victim statement.
The AI made it into a video.