- cross-posted to:
- fuck_ai@lemmy.world
An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time an AI avatar of a victim (in this case, a dead man) had ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
The avatar was made by Pelkey’s sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey’s brother-in-law, recoiled when she told him about the idea. “He told me, ‘Stacey, you’re asking a lot.’”
No, this is exactly why it shouldn’t be allowed. This isn’t akin to playing home movies, because this is a fake video of the victim. It is complete fiction, and people thinking it’s the same thing is what makes it wrong.
It is like a home movie in that it is an attempt to humanize the victim. There is no evidence in a home movie, no relevant facts, just an idea of the person who’s gone. You’re right that one is a memory of something that happened while the other is a fabrication of something that might have happened, but they are both equally (ir)relevant and emotionally manipulative. Many jurisdictions do prohibit victim statements beyond written or verbal testimony. Some countries and states require you to use a form and won’t admit statements that do not adhere to it.
Also remember that this is for the judge, not a jury.