Computers are Racist too: AI predicts crime a week in advance with 90 per cent accuracy

[Everything factual is: (a) Racist (b) Antisemitic. This system seems to have been designed by an Asian, possibly an Indian. The bottom line is that CRIME IS RACIST. Go looking for the Blacks and you'll see Crime there!!! Duh! But then Jews cry that you're racist. Stop listening to Jews. Just go with the FACTS! Jan]

An artificial intelligence that scours crime data can predict the location of crimes in the coming week with up to 90 per cent accuracy, but there are concerns about how systems like this can perpetuate bias

Technology 30 June 2022

By Matthew Sparkes

Chicago, Illinois, where an AI has been predicting crimes

Artificial intelligence can now predict the location and rate of crime across a city a week in advance with up to 90 per cent accuracy. Similar systems have been shown to perpetuate racist bias in policing, and the same could be true in this case, but the researchers who created this AI claim that it can also be used to expose those biases.

Ishanu Chattopadhyay at the University of Chicago and his colleagues created an AI model that analysed historical crime data from Chicago, Illinois, from 2014 to the end of 2016, then predicted crime levels for the weeks that followed this training period.
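As a rough illustration of that train-then-forecast setup, and not the authors' actual pipeline, the split might look something like the following minimal Python sketch (the file name "chicago_crimes.csv" and its "date" column are hypothetical placeholders):

# Minimal sketch of the temporal split described above: train on Chicago
# crime records from 2014 through 2016, then evaluate on the weeks that follow.
import pandas as pd

events = pd.read_csv("chicago_crimes.csv", parse_dates=["date"])  # hypothetical file and column

train = events[(events["date"] >= "2014-01-01") & (events["date"] <= "2016-12-31")]
test = events[events["date"] > "2016-12-31"]

# Aggregate to weekly event counts, the granularity at which forecasts are scored.
weekly_train = train.set_index("date").resample("W").size()
weekly_test = test.set_index("date").resample("W").size()

print(weekly_train.tail())
print(weekly_test.head())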

The model predicted the likelihood of certain crimes occurring across the city, which was divided into squares about 300 metres across, a week in advance with up to 90 per cent accuracy. It was also trained and tested on data for seven other major US cities, with a similar level of performance.
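One rough way to picture that spatial discretisation, again only a sketch under assumed column names rather than the study's code, is to bin each event into an approximately 300-metre grid cell and a weekly time slot:

# Rough sketch of gridding crime events into ~300 m cells and weekly bins,
# giving an event count per (cell, week). Column names are hypothetical.
import numpy as np
import pandas as pd

CELL_METRES = 300
METRES_PER_DEG_LAT = 111_320                      # approximate metres per degree of latitude
REF_LAT = 41.88                                   # roughly central Chicago
METRES_PER_DEG_LON = METRES_PER_DEG_LAT * np.cos(np.radians(REF_LAT))

events = pd.read_csv("chicago_crimes.csv", parse_dates=["date"])  # hypothetical file

events["row"] = (events["latitude"] * METRES_PER_DEG_LAT // CELL_METRES).astype(int)
events["col"] = (events["longitude"] * METRES_PER_DEG_LON // CELL_METRES).astype(int)
events["week"] = events["date"].dt.to_period("W")

counts = events.groupby(["row", "col", "week"]).size().rename("n_events")
print(counts.head())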

Previous efforts to use AIs to predict crime have been controversial because they can perpetuate racial bias. In recent years, the Chicago Police Department has trialled an algorithm that created a list of people deemed most at risk of being involved in a shooting, either as a victim or as a perpetrator. Details of the algorithm and the list were initially kept secret, but when the list was finally released, it turned out that 56 per cent of Black men in the city aged between 20 and 29 featured on it.

Chattopadhyay concedes that the data used by his model will also be biased, but says that efforts have been made to reduce the effect of bias and the AI doesn’t identify suspects, only potential sites of crime. “It’s not Minority Report,” he says.

“Law enforcement resources are not infinite. So you do want to use that optimally. It would be great if you could know where homicides are going to happen,” he says.

Chattopadhyay says the AI’s predictions could be more safely used to inform policy at a high level, rather than being used directly to allocate police resources. He has released the data and algorithm used in the study publicly so that other researchers can investigate the results.

The researchers also used the data to look for areas where human bias is affecting policing. They analysed the number of arrests following crimes in neighbourhoods in Chicago with different socioeconomic levels. This showed that crimes in wealthier areas resulted in more arrests than they did in poorer areas, suggesting bias in the police response.
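A minimal sketch of that kind of comparison, assuming each record flags whether an arrest followed and carries some neighbourhood income bracket (both column names are hypothetical, not the study's actual variables):

# Sketch of comparing arrest rates across neighbourhoods by socioeconomic level.
# "arrest" (True/False) and "income_bracket" are hypothetical column names.
import pandas as pd

crimes = pd.read_csv("chicago_crimes_with_demographics.csv")  # hypothetical file

arrest_rate = (
    crimes.groupby("income_bracket")["arrest"]
          .mean()                      # fraction of recorded crimes that led to an arrest
          .sort_values(ascending=False)
)
print(arrest_rate)   # consistently higher rates in wealthier brackets would point to biased response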

Lawrence Sherman at the Cambridge Centre for Evidence-Based Policing, UK, says he is concerned about the inclusion of reactive and proactive policing data in the study, that is, crimes that tend to be recorded because people report them and crimes that tend to be recorded because police go out looking for them. The latter type of data is very susceptible to bias, he says. “It could be reflecting intentional discrimination by police in certain areas,” he says.

Journal reference: Nature Human Behaviour, DOI: 10.1038/s41562-022-01372-0

Source: https://www.newscientist.com/article/2326297-ai-predicts-crime-a-week-in-advance-with-90-per-cent-accuracy/


