An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time the AI avatar of a victim—in this case, a dead man—had ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
What a fuckknob
This is fucking stupid. No one has the right to fake another person’s appearance in court. “We could have been friends” is lame and embarrassing. So stupid and lame.
There is no end to what the robolovers think is normal or acceptable. Fucking weird, creepy, and disgusting.
And dumb Christians will eat it up
what a shit show.
this doesn’t belong in a court of law.
Do I need to submit paperwork somewhere giving formal non-consent to this kind of thing?
Clickbait title. It wasn’t testimony, it was part of a victim impact statement. The defendant had already been found guilty.
Still weird, but not as procedurally questionable as it sounds. Victim impact statements are meant to literally sway the judge toward a harsher sentence, and basically everything’s fair game.
This still has no place in a court of law if it was supposed to influence anything about the case.
Yeah … that is not OK.
“And I would like to amend my will to add the prosecutor’s kid as the beneficiary. And thank you, Judge, and might I add that you make that gavel look very powerful!”
According to an article (dunno if it’s this one, didn’t click lol) the victim’s sister wrote it and they just used AI for his voice and likeness.
The Simpsons did it. The lawyer dubbed his voice onto a video will.
We have this interesting combination of high tech and low tech literacy; before you know it, praying to the machine spirits will be a mainstream religion.
Let’s hope. At least the machines are real
Praying to a machine that isn’t listening isn’t really any different than praying to the Moon.
Why wouldn’t the A.I. listen? My phone listens to every word I say.
There’s no one there! It’s mindless; listening is an active process that requires a person.
Sheep can hear; when a preacher talks to a crowd of sheep, the sheep hear him. A sheep is not a person.
Sheep can hear, they don’t listen.
But let’s say they do!
Should you worship sheep because they can hear your prayers?
You’re the one bringing up listening as a requirement for prayer. For me personally, sheep are for fuckin’
There’s a marked difference, though: the Moon can’t actually generate words that you can digest. Some people might think the Moon is speaking to them, but their brains are just spicy—it is not the same as a machine.
The machine isn’t really speaking to them either; it’s just generating responses through a pattern-recognition engine. There’s no “there” there.
Might as well pray to a parrot.
how disgusting
This seems like it opens the door to asking for a mistrial.
Hey! It’s Black Mirror!
I genuinely used to think Black Mirror was way too unbelievable to be scary
Welcome to earth / life. You must be new here.
If these are not the victim’s actual words, then WTF? This is completely contrived.
Hurry submit an AI video of Brian Thompson admitting that he accidentally shot himself as evidence
Using it at a trial like this is completely unhinged. That being said, it might help victims of violent crimes give their testimony (I’m mainly thinking of SA). I could see this having a use in prisoner rehabilitation as well, with the explicit consent of the victim or family of course.
It wasn’t testimony, it was part of a victim impact statement. Vibes should never be used as testimony or evidence because LLMs are infamously biased and can’t be sequestered.
Yes, in this case it is completely unacceptable.
I’m saying after that that I would find it acceptable for victims to use an avatar to convey what they want to say, if speaking in court is too difficult because of the trauma. It shouldn’t be an LLM though; the victim should write the script.
It was written by the sister. It’s what she thought he would have said, and it was her victim statement.
The AI made it into a video.
Using it at a trial like this is completely unhinged. That being said, it might help victims…
Yes, in this case it is completely unacceptable. I’m saying after that that I would find it acceptable for victims…
I am talking about two different scenarios. My second comment is solely to clarify this point.