Could certain groups be disproportionately affected by the outcomes of the AI system?

Category: Bias, Fairness & Discrimination
Phases: Design, Input, Model, Output, Monitor
  • Could the AI system discriminate against people on the basis of any of the following protected characteristics: sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation?
  • If your model learns from data tied to a specific cultural context, it may produce outputs that discriminate against individuals from other cultural backgrounds.

If you answered Yes, then you are at risk.

If you are not sure, then you might be at risk too.

Recommendations

  • Consider the different types of users and contexts where your product is going to be used.
  • Consider the impact of diverse backgrounds, cultures, and other relevant attributes when selecting your input data and features, and when testing the output.
  • Assess the risk of possible unfairness towards individuals or communities to avoid discriminating against minority groups.
  • The impact on individuals depends on the type, severity, and scale of the harm, for example how many people are disadvantaged relative to others. Statistical and causal analyses of group differences are essential tools for evaluating the potential unfairness and discriminatory impact of an AI system; a minimal example of such an analysis is sketched after this list.
  • Design with empathy, diversity and respect in mind.
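
The last recommendation about statistical analysis lends itself to a concrete check. The sketch below, under stated assumptions, computes per-group selection rates, the demographic parity difference, and the disparate impact ratio for binary model outputs. The data, group labels, favourable-outcome encoding (1 = favourable), and the 0.8 threshold (a widely used rule of thumb) are illustrative assumptions, not part of this card.

```python
# A minimal sketch of a group-fairness check on binary model outputs.
# All names and data below are hypothetical; 1 denotes the favourable
# outcome (e.g. a loan approval), and the 0.8 disparate-impact
# threshold is a common rule of thumb, not a universal legal standard.
from collections import defaultdict

def selection_rates(predictions, groups):
    """Fraction of favourable (1) predictions per group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_report(predictions, groups):
    rates = selection_rates(predictions, groups)
    best, worst = max(rates.values()), min(rates.values())
    return {
        "selection_rates": rates,
        # 0.0 means every group receives the favourable outcome equally often
        "parity_difference": best - worst,
        # ratios below ~0.8 are often treated as a signal to investigate
        "disparate_impact_ratio": worst / best if best > 0 else float("nan"),
    }

# Hypothetical usage with two groups of five individuals each
preds  = [1, 0, 1, 1, 0, 1, 0, 0, 0, 1]
groups = ["a", "a", "a", "a", "a", "b", "b", "b", "b", "b"]
print(demographic_parity_report(preds, groups))
# selection rates: a = 0.6, b = 0.4; parity difference = 0.2;
# disparate impact ratio = 0.67, below 0.8, so worth investigating
```

A check like this complements, rather than replaces, the qualitative review of users and contexts recommended above; explaining why the rates differ requires causal analysis grounded in knowledge of how the data was collected.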