Could the AI system reinforce historical inequalities embedded in the data?
Historical bias occurs when AI systems mirror or exacerbate past social and cultural inequalities, even when using accurate data. For example, an AI healthcare tool trained on historical patient data may reflect disparities in access to care. Minority groups, underrepresented in the data due to systemic inequities, may receive less accurate diagnoses, perpetuating racial bias even without explicit racial features.
If you answered Yes, then you are at risk.
If you are not sure, then you might be at risk too.
Recommendations
- Ensure datasets adequately represent minority groups, for example by oversampling underrepresented groups or undersampling overrepresented ones.
- Collaborate with domain experts to identify unjust patterns and address them effectively.
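The rebalancing recommendation above can be sketched in a few lines. This is a minimal, illustrative random-oversampling example, assuming records are dicts with a group field (the `oversample` helper and field names are hypothetical, not part of any specific library); in practice, dedicated tools such as imbalanced-learn offer more robust resampling strategies.

```python
import random

def oversample(records, group_key, seed=0):
    """Randomly duplicate members of smaller groups until every group
    matches the size of the largest one (a simple rebalancing sketch)."""
    rng = random.Random(seed)
    groups = {}
    for record in records:
        groups.setdefault(record[group_key], []).append(record)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Sample with replacement to fill the gap up to the target size.
        balanced.extend(rng.choices(members, k=target - len(members)))
    return balanced

# Toy dataset where group "B" is underrepresented (2 of 10 records).
data = [{"group": "A"}] * 8 + [{"group": "B"}] * 2
balanced = oversample(data, "group")
```

Note that duplicating records only equalizes counts; it cannot correct historically biased labels or measurements, which is why the domain-expert review above remains essential.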