COMPAS Algorithm Bias Evaluation

Kevin Lige, Bess Yang, Jayden Li

My specific contributions are in the file COMPAS Project (Bess Yang).ipynb

  • Tools and Techniques Used

    • Python, Jupyter Notebook, Data Visualization (Box Plots, Bar Charts), Statistical Analysis, Stakeholder Presentation

  • Objective

    • The primary goal of this project was to assess whether the COMPAS algorithm, which is used to predict the likelihood of recidivism, exhibited racial bias. The project focused on analyzing the decile scores assigned to individuals of different ethnicities, particularly African-Americans and Caucasians, to determine whether the algorithm's predictions showed disparities between these groups.

  • Approach & Analysis

    • Using Python and Jupyter Notebook, our team performed data cleaning, transformation, and analysis on a dataset that included variables such as race, decile scores, priors, and charge degree. We created visualizations such as box plots and bar charts to compare the decile scores across racial groups. The analysis aimed to identify whether individuals with similar charges and priors were being scored differently based on their ethnicity; a minimal sketch of this workflow appears after this item.
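
A minimal sketch of the kind of cleaning and plotting described above, not the project's exact code: the file name compas-scores.csv is hypothetical, and the column names (race, decile_score, priors_count, c_charge_degree) are assumed to match the ProPublica release of the COMPAS data.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Load the COMPAS data (file and column names are assumptions,
# following the ProPublica release of the dataset).
df = pd.read_csv("compas-scores.csv")

# Basic cleaning: keep only the columns used in the analysis and
# drop rows with missing values in any of them.
cols = ["race", "decile_score", "priors_count", "c_charge_degree"]
df = df[cols].dropna()

# Restrict to the two groups compared in the analysis.
df = df[df["race"].isin(["African-American", "Caucasian"])]

# Box plot of decile scores by race.
df.boxplot(column="decile_score", by="race")
plt.suptitle("")  # drop pandas' automatic super-title
plt.title("COMPAS decile scores by race")
plt.ylabel("Decile score (1-10)")
plt.show()

# Bar chart of the decile-score distribution within each group.
shares = (df.groupby("race")["decile_score"]
            .value_counts(normalize=True)
            .unstack(level=0))
shares.plot(kind="bar")
plt.xlabel("Decile score")
plt.ylabel("Share of group")
plt.show()
```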

  • Key Findings

    • Our analysis found that African-Americans were generally assigned higher decile scores than their Caucasian counterparts, even when controlling for charge degree and number of prior offenses. This suggested a potential racial bias in the algorithm. However, we acknowledged limitations in our analysis, including the need for a deeper investigation into other potential confounding variables, such as age and time served. A sketch of this kind of controlled comparison appears after this item.
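
Continuing from the sketch above, a rough illustration of comparing scores while holding charge degree and priors fixed; the bin edges and the use of median scores are illustrative assumptions, not necessarily the project's choices.

```python
import pandas as pd

# Bin prior offenses so each stratum stays large enough to compare
# (bin edges are an assumption, not the project's exact choice).
df["priors_bin"] = pd.cut(df["priors_count"],
                          bins=[-1, 0, 2, 5, 100],
                          labels=["0", "1-2", "3-5", "6+"])

# Median decile score by race within each charge-degree / priors stratum.
controlled = (df.groupby(["c_charge_degree", "priors_bin", "race"],
                         observed=True)["decile_score"]
                .median()
                .unstack("race"))

# A positive gap in a stratum means African-Americans with the same
# charge degree and similar priors received a higher median score.
controlled["gap"] = (controlled["African-American"]
                     - controlled["Caucasian"])
print(controlled)
```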

  • Outcome

    • We presented our findings to stakeholders via a slideshow, highlighting the racial disparities in the COMPAS algorithm’s predictions and providing recommendations for further analysis to ensure fairness in its application. The insights from this project were geared toward understanding the social impact of algorithmic decision-making in the criminal justice system.
