Was this AI trained on an unbalanced data set? (Only black folks?) Or has it only been used to identify photos of black people? I have so many questions: some technical, some on media sensationalism
Not talking about systemic racism in general. I know there's a lot of that. I'm talking about whether systemic racism caused this particular issue. I bring this up because there have been cases of motion sensors failing to detect black hands due to technical deficiencies. I'm not apologizing for anyone, just pointing out that this has happened before for purely technical reasons.
@nieceandtows The fact that there have been issues with sensors (which is true) does not disprove systemic racism (which exists). That’s like saying that because I put vinegar in the dressing the lemon juice wasn’t sour. It doesn’t follow.
Putting the same thing the other way around: the fact that there have been issues with systemic racism (which is true) does not disprove technical malfunction (which also exists). That's like saying that because the lemon juice is sour, it must contain vinegar. It doesn't follow. Lemon juice can be sour simply because it contains lemons, with no vinegar at all.
@nieceandtows But we know that there is systemic racism in the police. There *is* vinegar in it.
Again, I'm not disagreeing about systemic racism in the police at all. That is a big issue that needs to be solved. I'm just saying this particular failure doesn't have to be related to it, because the technology itself has issues like this. Yes, the vinegar is in the dressing, but lemon is naturally sour: even with no vinegar, it's going to be sour. Attributing everything to the vinegar wouldn't make the dressing better. It would just make it harder to identify issues with individual ingredients.
@fishidwardrobe As far as the UK is concerned (re facial recognition), I recall the latest study found false-positive rates disproportionately higher for Black people, and that the difference was statistically significant.
The UK police considered this acceptable and have continued the rollout of this tech. It's a judgement call that bakes a little more systemic racism into UK policing, with little to no accountability.
https://science.police.uk/site/assets/files/3396/frt-equitability-study_mar2023.pdf
PS. I’m not academically qualified to comment on the paper, but take an interest in these things.
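For anyone curious what "statistically significant" means for a claim like this: one standard way to compare false-positive rates between two groups is a two-proportion z-test. Here's a minimal sketch. The counts below are entirely hypothetical, NOT figures from the linked study:

```python
import math

def two_proportion_z(fp1, n1, fp2, n2):
    """Z-statistic for the difference between two false-positive rates.

    fp1/n1 and fp2/n2 are (false positives / total probes) for each group.
    Uses the pooled-proportion standard error under the null hypothesis
    that both groups share the same underlying false-positive rate.
    """
    p1, p2 = fp1 / n1, fp2 / n2
    p_pool = (fp1 + fp2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical example: 30 false positives in 2,000 probes for group A,
# 8 false positives in 2,000 probes for group B.
z = two_proportion_z(30, 2000, 8, 2000)
print(round(z, 2))  # |z| > 1.96 means significant at the 5% level
```

With these made-up numbers z comes out well above 1.96, so a gap like that would not plausibly be random noise, which is the kind of finding the study is reporting.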