About the Post


Author Information

FLOW OF WISDOM is led by Sean Anthony, a 25+ year radio veteran, media executive, and creator of the long-running investigative platform Flow of Wisdom. For more than a decade, Sean has explored the intersections of artificial intelligence, emerging technology, psychology, deep-state geopolitics, UFOs, occult symbolism, and the hidden systems shaping our world—long before these topics hit mainstream news.

After a two-year hiatus, Flow of Wisdom is back, now evolving from a nationally syndicated radio show into a modern podcast and livestream experience. Sean brings the same broadcast instincts, but with a renewed focus on A.I., quantum physics, digital deception, transhumanism, information warfare, exposing the occult, the future of human consciousness, and more. With 25+ years in broadcasting across major U.S. markets, Sean has interviewed some of the most controversial thinkers, cultural icons, and emerging voices in science, media, and technology. His work blends investigation, intuition, and real-time analysis—making Flow of Wisdom a destination for curious minds who want context, not noise.

Sean is also the founder of Sean Anthony Media Group and The Sean Anthony Foundation (501(c)(3)), and a participant in the TechX Accelerator developing new tech innovations. He is the author of Conversations With Hip Hop, available on Amazon. Flow of Wisdom delivers the conversations shaping tomorrow—before the world realizes they matter.

This software that helps predict criminal behavior is under fire for having a ‘racist’ algorithm

Source: TechInsider

Software used across the US to help predict whether people who end up in courtrooms, jails, and prisons are likely to commit more crimes appears to be biased against black people, according to a sweeping, thorough ProPublica investigation.

The software, which is designed by several for-profit and nonprofit groups, looks at a series of risk factors in people’s lives and assigns them “risk scores” as to how likely it thinks they are to reoffend. Those scores get used in jails, prisons, and courtrooms.

In Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin, the scores are introduced as part of the sentencing process.

ProPublica studied data from 7,000 people arrested in Broward County, Florida and found three key things:

The scores are not very good at predicting who will reoffend.
White people were much more likely to get low scores and then reoffend.
Black people were much more likely to get high scores and not reoffend.

It seems likely that this is a result of the software taking into account factors like wealth and social marginalization that correlate heavily with race. It's not the first example we've seen lately of an algorithm not designed explicitly for racist purposes producing racist results, though it is the one we've seen with the highest stakes.
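The three findings above amount to comparing error rates across groups: among people who did not reoffend, how often each group was scored high risk (false positives), and among people who did reoffend, how often each group was scored low risk (false negatives). A minimal sketch of that kind of check, using made-up toy data (this is not ProPublica's dataset, methodology, or Northpointe's model):

```python
# Illustrative sketch only: computes per-group false positive and
# false negative rates from a list of (group, high_risk, reoffended)
# records. Group labels and numbers below are invented for the example.
from collections import defaultdict

def error_rates(records):
    """Return {group: (false_positive_rate, false_negative_rate)}.

    false_positive_rate: fraction of non-reoffenders scored high risk
    false_negative_rate: fraction of reoffenders scored low risk
    Assumes each group has at least one reoffender and one non-reoffender.
    """
    counts = defaultdict(lambda: {"fp": 0, "neg": 0, "fn": 0, "pos": 0})
    for group, high_risk, reoffended in records:
        c = counts[group]
        if reoffended:
            c["pos"] += 1           # actual reoffenders
            if not high_risk:
                c["fn"] += 1        # scored low, but reoffended
        else:
            c["neg"] += 1           # did not reoffend
            if high_risk:
                c["fp"] += 1        # scored high, but did not reoffend
    return {g: (c["fp"] / c["neg"], c["fn"] / c["pos"])
            for g, c in counts.items()}

# Hypothetical toy data showing the reported pattern: group A gets
# many high scores without reoffending; group B the reverse.
toy = (
    [("A", True, False)] * 4 + [("A", True, True)] * 4
    + [("A", False, False)] * 6 + [("A", False, True)] * 1
    + [("B", True, False)] * 1 + [("B", True, True)] * 4
    + [("B", False, False)] * 6 + [("B", False, True)] * 4
)
rates = error_rates(toy)
```

On this toy data, group A has a much higher false positive rate and group B a much higher false negative rate, which is the shape of the disparity the investigation described; only ProPublica's published analysis speaks to the real numbers.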

Northpointe, the company that evaluated risks in Broward County, disputes ProPublica’s analysis, saying these claims do not “accurately reflect the outcomes from the application of the model.” That said, the company would not disclose how it arrived at its risk scores, so it’s tough to know what’s driving this algorithm.

ProPublica’s report is full of jarring comparisons between the way the algorithm rates black people and white people. You can read the full report here.

Northpointe did not immediately respond to a request for comment.

Read the original article here.



One Comment on “This software that helps predict criminal behavior is under fire for having a ‘racist’ algorithm”

  1. RonMamita June 14, 2016 at 8:48 am #

    This should not be a surprise to anyone. How effective legal complaints will be should make for an interesting public service announcement follow-up.

    Freedom: Shift Consciousness
