Delivery: 1-2 business days
Extended return policy until 31 January 2025

Causal Fairness Analysis - Drago Plečko - Book


  • Language: English
  • ISBN: 9781638283300
  • Binding: Paperback
  • Pages: 304
  • Published: 31 January 2024
  • Dimensions: 156 x 16 x 234 mm
  • Weight: 465 g
In stock


Description of Causal Fairness Analysis

The recent surge of interest in AI systems has raised concerns in moral quarters about their ethical use and whether they can demonstrate fair decision-making processes. Issues of unfairness and discrimination are pervasive when decisions are made by humans, and are potentially amplified when decisions are made by machines with little transparency, accountability, and fairness. In this monograph, the authors introduce a framework for causal fairness analysis to understand, model, and possibly solve issues of fairness in AI decision-making settings. The authors link the quantification of the disparities present in the observed data with the underlying, often unobserved, collection of causal mechanisms that generate the disparity in the first place, a challenge they call the Fundamental Problem of Causal Fairness Analysis (FPCFA). To solve the FPCFA, they study the mapping of variations and empirical measures of fairness to structural mechanisms and to different units of the population, culminating in the Fairness Map.
This monograph presents the first systematic attempt to organize and explain the relationship between various fairness criteria, and it studies which causal assumptions are needed to perform causal fairness analysis. The resulting Fairness Cookbook allows anyone to assess the existence of disparate impact and disparate treatment. It is a timely and important introduction to developing future AI systems with inherent fairness, and as such it will be of wide interest not only to AI system designers but to all who are interested in the wider impact AI will have on society.

