Judges decide which defendants are released from jail and which stay behind bars. Traditionally, that multi-factored decision was more subjective than objective. However, some states are now using bail algorithms to bring scientific measures to the decision.
What is a bail algorithm?
Bail algorithms are formulas that use statistics to assess risk. The algorithm takes selected information about the defendant and produces an objective, scientifically based assessment. Before making a bail decision, the judge gets to consider the algorithm’s result.
How are they used?
While algorithms can be used to assess risk of any type, some states are using these mathematical formulas to inform legal decisions throughout the arrest-to-parole process. In addition to recommending whether a defendant should be granted bail, an algorithm can separate the risk that the defendant will break the law again (beyond the initial crime that landed the defendant in jail in the first place) from the risk that he or she will skip a court date. An algorithm, properly used, can also give the judge insight into the defendant’s likelihood of committing an additional crime while out on bail.
California, the state with the highest bail in America, passed a bill last year to abandon cash bail. Today, the state uses its Public Safety Assessment (PSA), an algorithm-based risk evaluation tool, to guide judges’ pretrial decisions. The PSA compares a defendant’s risk factors to a 750,000-case database gleaned from 300 jurisdictions. The defendant’s age, the alleged charge, other pending charges, any prior misdemeanor or felony convictions, the violence associated with those convictions, previous failures to appear in court and any prior prison sentences all factor into the defendant’s PSA score. However, the PSA does not take into account the defendant’s race, gender, education level, economic status or neighborhood. More than 40 other jurisdictions also use the PSA to assess the risk associated with defendants. Other jurisdictions use algorithms created locally, or leverage risk assessment systems from for-profit corporations and non-profit organizations, to determine defendants’ risk profiles.
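To make the idea concrete, a point-based risk score of this kind can be sketched in a few lines of code. The sketch below is purely illustrative: it uses the factor categories described above (age, pending charges, prior convictions, failures to appear, prior prison time), but the point values and caps are invented for illustration and are not the actual PSA weights, which this article does not specify.

```python
# Illustrative point-based pretrial risk score. The inputs mirror the factor
# categories described above, but every point value here is invented for
# illustration -- these are NOT the actual PSA weights.
# Note: race, gender, education, income, and neighborhood are deliberately
# not inputs, matching the exclusions described for the PSA.

def risk_score(age, pending_charge, prior_misdemeanor, prior_felony,
               prior_violent_convictions, prior_failures_to_appear,
               prior_prison_sentence):
    """Sum weighted points across risk factors; a higher score means higher risk."""
    score = 0
    if age < 23:                               # younger defendants score higher
        score += 2
    if pending_charge:                         # another charge pending at arrest
        score += 1
    if prior_misdemeanor:
        score += 1
    if prior_felony:
        score += 1
    score += min(prior_violent_convictions, 3)  # capped at 3 points
    score += min(prior_failures_to_appear, 2)   # capped at 2 points
    if prior_prison_sentence:
        score += 1
    return score

# Example: a 21-year-old with one prior failure to appear and no other history.
print(risk_score(21, False, False, False, 0, 1, False))  # 3
```

In a real tool, a score like this would be mapped to a release recommendation for the judge to consider, not applied automatically.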
Are algorithms the answer?
The use of algorithms is not without critics. More than 100 civil rights groups, including the ACLU, are advocating that algorithms be abolished. These groups base their criticism on the fact that the formulas take into account the number of times defendants have committed a crime, a figure that can be affected by racial bias. Additional criticism centers on the fact that judges can ignore the algorithm-driven recommendation, and that algorithms may prompt a complacent public to assume that bail-related inequities are a thing of the past. Critics also point out that the formulas do not take into account a defendant’s employment status or substance abuse history, and that algorithms treat categories of crime the same regardless of the circumstances. For example, a man who flees from security officers after shoplifting and a man who holds up a convenience store may both be charged with robbery despite the differences between the two incidents.
Proponents of algorithm-driven bail decisions cite the mathematical formulas’ “color blindness” and objectivity.
The debate about algorithms’ value or harm rages on, and it is likely that critics and proponents will continue to wage a state-by-state battle to reform the U.S. legal system.
Algorithms can offer statistics that help guide decision making; however, A 2nd Chance Bail Bonds believes that there’s no substitute for bail as a tool to ensure that the defendant actually makes his or her court appearance.