Computers are Racist too: AI predicts crime a week in advance with 90 per cent accuracy


[Everything factual is: (a) Racist (b) Antisemitic. This system seems to have been designed by an Asian, possibly an Indian. The bottom line is that CRIME IS RACIST. Go looking for the Blacks and you'll see Crime there!!! Duh! But then Jews cry that you're racist. Stop listening to Jews. Just go with the FACTS! Jan]

An artificial intelligence that scours crime data can predict the location of crimes in the coming week with up to 90 per cent accuracy, but there are concerns about how systems like this can perpetuate bias

Technology 30 June 2022

By Matthew Sparkes

Chicago, Illinois, where an AI has been predicting crimes

Artificial intelligence can now predict the location and rate of crime across a city a week in advance with up to 90 per cent accuracy. Similar systems have been shown to perpetuate racist bias in policing, and the same could be true in this case, but the researchers who created this AI claim that it can also be used to expose those biases.

Ishanu Chattopadhyay at the University of Chicago and his colleagues created an AI model that analysed historical crime data from Chicago, Illinois, from 2014 to the end of 2016, then predicted crime levels for the weeks that followed this training period.

The model predicted the likelihood of certain crimes occurring across the city, which was divided into squares about 300 metres across, a week in advance with up to 90 per cent accuracy. It was also trained and tested on data for seven other major US cities, with a similar level of performance.
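The researchers' own model is more sophisticated than anything shown here, but the basic setup the article describes, a city carved into roughly 300-metre tiles with weekly event counts forecast one week ahead, can be sketched in a few lines. Everything below is an assumption for illustration: the data is synthetic, and the grid size, the eight-week lag window and the ridge-style regression are stand-ins, not the study's actual method.

```python
# A minimal sketch, not the researchers' model: city tiles x weekly counts,
# with next week's counts predicted from the preceding weeks. All numbers,
# the grid size, the lag window and the regression are assumptions.

import numpy as np

rng = np.random.default_rng(0)

N_TILES = 400   # e.g. a 20 x 20 grid of roughly 300-metre squares (assumed)
N_WEEKS = 156   # about three years of weekly counts, as in the training data
LAGS = 8        # forecast week t from the preceding 8 weeks (assumed)

# Synthetic per-tile weekly counts: a tile-specific base rate plus Poisson noise.
base_rate = rng.gamma(shape=2.0, scale=3.0, size=N_TILES)
counts = rng.poisson(lam=base_rate, size=(N_WEEKS, N_TILES)).astype(float)

def make_supervised(series, lags):
    """Turn a (weeks x tiles) count matrix into lagged features X and targets y."""
    X, y = [], []
    for t in range(lags, series.shape[0]):
        X.append(series[t - lags:t].ravel())  # the previous `lags` weeks, flattened
        y.append(series[t])                   # next week's counts, one per tile
    return np.array(X), np.array(y)

X, y = make_supervised(counts, LAGS)

# Hold out the final week as the "forecast" target.
X_train, y_train = X[:-1], y[:-1]
x_last, y_true = X[-1:], y[-1]

# Ridge-style least squares in closed form: one linear map from lagged counts
# to next-week counts across all tiles.
lam = 1.0
A = X_train.T @ X_train + lam * np.eye(X_train.shape[1])
W = np.linalg.solve(A, X_train.T @ y_train)
y_pred = (x_last @ W).ravel()

# Crude hit rate: overlap between the tiles the model ranks in its top decile
# and the tiles that actually had the most events that week.
k = N_TILES // 10
top_true = set(np.argsort(y_true)[-k:])
top_pred = set(np.argsort(y_pred)[-k:])
print(f"top-decile overlap: {len(top_true & top_pred) / k:.2f}")
```

The printed "top-decile overlap" is only a toy stand-in for an accuracy figure; the 90 per cent quoted in the article comes from the study's own evaluation procedure, not from anything like this.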

Previous efforts to use AIs to predict crime have been controversial because they can perpetuate racial bias. In recent years, the Chicago Police Department has trialled an algorithm that created a list of people deemed most at risk of being involved in a shooting, either as a victim or as a perpetrator. Details of the algorithm and the list were initially kept secret, but when the list was finally released, it turned out that 56 per cent of Black men in the city aged between 20 and 29 featured on it.

Chattopadhyay concedes that the data used by his model will also be biased, but says that efforts have been made to reduce the effect of bias and that the AI doesn’t identify suspects, only potential sites of crime. “It’s not Minority Report,” he says.

“Law enforcement resources are not infinite. So you do want to use that optimally. It would be great if you could know where homicides are going to happen,” he says.

Chattopadhyay says the AI’s predictions could be more safely used to inform policy at a high level, rather than being used directly to allocate police resources. He has released the data and algorithm used in the study publicly so that other researchers can investigate the results.

The researchers also used the data to look for areas where human bias is affecting policing. They analysed the number of arrests following crimes in neighbourhoods in Chicago with different socioeconomic levels. This showed that crimes in wealthier areas resulted in more arrests than they did in poorer areas, suggesting bias in the police response.
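As a rough illustration, that kind of check amounts to computing arrests per recorded crime in wealthier and poorer neighbourhoods and comparing the two rates. The sketch below uses entirely hypothetical neighbourhood totals and made-up column names; real figures would have to come from the city's published crime and arrest records.

```python
import pandas as pd

# Hypothetical per-neighbourhood totals (all numbers invented for illustration;
# they are chosen so wealthier areas show more arrests per crime, mirroring the
# pattern the researchers report).
df = pd.DataFrame({
    "neighbourhood": ["A", "B", "C", "D", "E", "F"],
    "median_income": [95_000, 88_000, 72_000, 41_000, 33_000, 28_000],
    "crimes": [310, 420, 390, 880, 950, 1_020],
    "arrests": [86, 120, 98, 130, 140, 150],
})

# Split neighbourhoods into wealthier and poorer halves by median income.
cutoff = df["median_income"].median()
df["stratum"] = (df["median_income"] >= cutoff).map(
    {True: "wealthier", False: "poorer"}
)

# Arrests per recorded crime in each stratum.
totals = df.groupby("stratum")[["arrests", "crimes"]].sum()
totals["arrests_per_crime"] = totals["arrests"] / totals["crimes"]
print(totals)
```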

Lawrence Sherman at the Cambridge Centre for Evidence-Based Policing, UK, says he is concerned about the study’s inclusion of both reactive and proactive policing data: that is, crimes that tend to be recorded because people report them and crimes that tend to be recorded because police go out looking for them. The latter type of data is very susceptible to bias, he says. “It could be reflecting intentional discrimination by police in certain areas,” he says.

Journal reference: Nature Human Behaviour, DOI: 10.1038/s41562-022-01372-0

Source: https://www.newscientist.com/article/2326297-ai-predicts-crime-a-week-in-advance-with-90-per-cent-accuracy/


