What is the use of AI in law enforcement?
The Seattle University Law Review (SULR, 2017) defines artificial intelligence (AI) as technology used to improve police work or replace it over time; such systems can predict where crime will take place and who is at high risk. According to the National Institute of Justice (2019), AI is the ability of a machine to perform intelligent prediction and decision-making tasks without human involvement, through self-learning algorithms that recognize patterns over time.
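In miniature, "pattern recognition over time" can mean nothing more than counting which patterns recur in historical records. The sketch below uses an entirely invented incident log; the area names and hours are illustrative, not real data, and a real system would use a trained model rather than raw counts.

```python
from collections import Counter

# Hypothetical incident log of (area, hour) pairs -- invented for illustration.
incidents = [
    ("downtown", 23), ("downtown", 22), ("harbor", 2),
    ("downtown", 23), ("suburb", 14), ("harbor", 1),
]

def risk_ranking(log):
    """Rank areas by historical incident frequency (a crude 'pattern')."""
    counts = Counter(area for area, _ in log)
    return counts.most_common()

print(risk_ranking(incidents))  # areas ordered from most to fewest incidents
```

Note that a ranking like this simply reflects whatever the historical log contains, which is exactly why biased input data is a concern (a point the setbacks section returns to).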
How can AI be used to hire police?
AI should be able to predict patterns in the personalities and behaviors of police officers who may abuse the poor and minorities in order to deny them justice. This should help prevent such individuals from being hired into the police force, identify officers who need close monitoring for harassment, and maintain a registry of officers that updates the machine learning of the AI systems to prevent police brutality. Deviant Behavior (2017) states that Black Americans are 37% of the prison population, roughly thrice their percentage of the American population. This is a miscarriage of justice!
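The "thrice" claim can be checked with simple arithmetic. The 37% figure comes from the cited source; the roughly 13% population share is a commonly cited approximation of the Black share of the U.S. population, used here only to verify the ratio.

```python
prison_share = 0.37      # from Deviant Behavior (2017), per the text
population_share = 0.13  # approximate Black share of the U.S. population

ratio = prison_share / population_share
print(round(ratio, 1))  # about 2.8, consistent with "roughly thrice"
```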
How can AI be used to select judges?
AI should also be able to find predictive patterns in the way judges sentence minorities and the poor. According to the Department of Justice (2018), the incarceration rate of Black males is 5.8 times that of white males, and the rate of Black females is 1.8 times that of white females. Judges whose sentencing is biased should be retrained and warned once, but removed from the bench if the bias continues or if the judge fits the AI's prediction of unfair personalities and behaviors. Judges are to enforce justice regardless of race, ethnicity, gender, or religion.
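A disparity figure like 5.8x is computed from per-capita incarceration rates. The rates below are illustrative round numbers chosen so the ratio reproduces the 5.8 cited from the DOJ (2018); they are not the actual published rates.

```python
# Illustrative incarceration rates per 100,000 residents, chosen so the
# ratio reproduces the 5.8x disparity cited from the DOJ (2018).
rate_black_male = 2320
rate_white_male = 400

disparity = rate_black_male / rate_white_male
print(round(disparity, 1))  # 5.8
```

The same calculation applied per judge, rather than nationwide, is what would surface the sentencing patterns the text describes.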
What are some setbacks of AI in law enforcement?
Some setbacks of AI systems are hidden biases built into their design through biased policing data, which can lead to flawed AI predictions (SULR, 2017). To avoid these biases, every AI decision should be checked against the machine-learning prediction for a white male in a similar situation. If the decision for a white male is to give the police his information and forfeit the counterfeit money before going home, that should have been the decision for George Floyd. Justice should and must always be blind!
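The check the text proposes is, in machine-learning terms, a counterfactual fairness test: run the same case through the model with the protected attribute swapped to a reference value, and flag any decision that changes. A minimal sketch, with an invented stand-in for a trained model (the rules and field names here are hypothetical):

```python
def decision(person):
    """Stand-in for a trained model; invented rule for illustration only."""
    return "cite and release" if person["offense"] == "counterfeit bill" else "arrest"

def counterfactual_check(model, person, protected_key, reference_value):
    """Return whether the decision survives swapping the protected
    attribute to the reference value, plus both decisions."""
    actual = model(person)
    counterfactual = model({**person, protected_key: reference_value})
    return actual == counterfactual, actual, counterfactual

person = {"race": "Black", "offense": "counterfeit bill"}
ok, actual, cf = counterfactual_check(decision, person, "race", "white")
print(ok, actual)  # True only if the decision matches the counterfactual
```

A model that fails this check for any input is making race-dependent decisions and, by the text's standard, should not be deployed.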