Experimenting with unproven technology to determine whether a child should be granted protections they desperately need and are legally entitled to is cruel and unconscionable.

  • Basic Glitch@sh.itjust.works · 2 days ago

    Companies that tested their technology in a handful of supermarkets, pubs, and websites set it to predict whether a person looks under 25, not 18, allowing a wide error margin for algorithms that struggle to distinguish a 17-year-old from a 19-year-old.

    AI face scans were never designed for children seeking asylum, and risk producing disastrous, life-changing errors. Algorithms identify patterns in the distance between nostrils and the texture of skin; they cannot account for children who have aged prematurely from trauma and violence. They cannot grasp how malnutrition, dehydration, sleep deprivation, and exposure to salt water during a dangerous sea crossing might profoundly alter a child’s face.

    Goddamn, this is horrible. Imagine leaving shitty AI to determine the fate of this girl:

    ‘Psychologically broken,’ 8-year-old Sama loses her hair