Sunday December 15, 2024 17:00 - 18:00 GMT+03

While AI technologies promise significant benefits, the human rights risks they pose too often fall disproportionately on marginalized populations, such as women and girls in all their diversity, persons with disabilities, members of marginalized racial, ethnic, religious, or linguistic groups, Indigenous peoples, LGBTQI+ persons, children, and human rights defenders. For example, AI systems are often used to generate harassing and harmful “deepfakes” or to spread disinformation that specifically targets women and human rights defenders; AI systems can perpetuate biases found in their training data, reinforcing historical patterns of discrimination against groups defined by traits such as gender, geography, race, or caste; and AI tools enable advances in surveillance technologies that are too often used to interfere with the rights to peaceful assembly and freedom of association, particularly as exercised by marginalized populations, and that have been used by security forces for targeting, with harmful effects on civilians and on privacy rights.
This interactive workshop session aims to collaboratively develop feasible steps that can advance the identification, assessment, and mitigation of risks to marginalized populations that are created or exacerbated by AI. Framed by remarks from government, civil society, and industry stakeholders describing the challenges and constraints they face in this area, the workshop will explore 1) pressing issues related to AI’s impacts on marginalized populations; 2) success stories that should inform future actions; and 3) feasible steps that different groups of stakeholders can take to advance progress. The discussion will pay particular attention to how these issues and potential actions differ across diverse cultural, geographic, and economic contexts. After the event, the key issues and steps identified will be collated into an outcomes document, which could be published by the FOC.
Workshop Room 9