BloombergGraphics/2024-openai-gpt-hiring-racial-discrimination

Data and materials to reproduce Bloomberg's investigation into racial and gender bias in OpenAI's GPT

Quality score: 40 / 100 (Emerging)

This project helps HR managers, recruiters, and hiring professionals evaluate potential bias in AI tools used for resume screening. It feeds resumes and job descriptions through OpenAI's GPT models to test whether demographic signals (such as a candidate's name, which can imply race or gender) influence the candidate rankings. The output is an analysis showing whether the AI tool exhibits racial or gender bias in its hiring recommendations.

No commits in the last 6 months.

Use this if you are a human resources professional or recruiter concerned about fairness and discrimination in AI-powered hiring tools.

Not ideal if you're looking for an off-the-shelf solution to integrate into your existing HR systems for real-time bias detection.

HR-Tech-Audit Recruitment-Bias AI-Ethics Candidate-Screening Hiring-Fairness
Flags: Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 17 / 25


Stars: 39
Forks: 10
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: Mar 07, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/BloombergGraphics/2024-openai-gpt-hiring-racial-discrimination"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
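The same endpoint can be queried from Python using only the standard library. This is a minimal sketch: the URL pattern matches the curl command above, but the field names in the returned JSON are not documented here, so inspect the response before relying on any particular key.

```python
import json
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_url(category: str, owner: str, repo: str) -> str:
    """Assemble the endpoint URL following the curl example above."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode one repo's quality record.

    The response schema is an assumption; print the full dict first
    to see which fields (score, stars, forks, ...) are actually present.
    """
    with urlopen(build_url(category, owner, repo), timeout=10) as resp:
        return json.load(resp)

# Usage (no API key needed up to 100 requests/day):
# record = fetch_quality("generative-ai", "BloombergGraphics",
#                        "2024-openai-gpt-hiring-racial-discrimination")
# print(record)
```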