NHRCK Emphasizes the Importance of Implementing the Human Rights Impact Assessment to Protect Human Rights in the Development and Use of Artificial Intelligence
Date : 2024.07.09

- NHRCK Suggests the ‘Human Rights Impact Assessment Tool for AI’ as a Specific Standard for Human Rights Protection, Following the ‘Human Rights Guidelines for AI’ (2022) -


On May 23, 2024, the National Human Rights Commission of Korea (Chairman Song Doo-hwan, hereinafter ‘NHRCK’) expressed its opinion to the Minister of Science and ICT that the “Human Rights Impact Assessment Tool for AI” (hereinafter ‘HRIA Tool for AI’) should be utilized in the establishment and implementation of AI-related policies and project plans to prevent human rights violations and discrimination arising from the development and use of AI.

In addition, the NHRCK emphasized the need to disseminate this assessment tool so that all AI developed or procured by public institutions, as well as high-risk AI introduced in the private sector, is subject to voluntary human rights impact assessments until such assessments are legally mandated.

As the development and proliferation of AI technology continue, there are growing expectations for the positive changes and impacts it will bring to our society and human lives. However, concerns about negative effects such as human rights violations and discrimination caused by AI are also increasing.

Due to the inherent opacity and widespread effects of AI systems, it is often difficult to administer remedial measures or sanctions after the fact. It is therefore necessary to adopt and implement a proactive HRIA to prevent potential risks in the development and use of AI. In particular, the United Nations and various countries are focusing on the negative impacts that public-sector AI and high-risk private-sector AI can have on people, and, as a means of preventing and managing these effects in advance, there is a trend toward proposing and adopting various impact assessments that go beyond simple ethical standards. The NHRCK has also urged the Korean Government and the National Assembly to introduce an HRIA for AI through its ‘Recommendation on Human Rights Guidelines for AI’ (2022) and its ‘Expression of Opinion on the Bill of the AI Act’ (2023).

The NHRCK has prepared the “HRIA Tool for AI” as a specific and practical measure to identify, prevent, and mitigate, from a human rights perspective, the negative impacts on human rights that may arise in the development and use of AI.

The assessment tool, consisting of 72 questions across four stages, allows entities involved in the development and use of AI to comprehensively examine not only technical risks but also the impact on human rights and its severity. In addition, supplementary explanations are provided to reduce the burden of assessment as much as possible. The NHRCK hopes that the dissemination of the “HRIA Tool for AI” will lead to the development and use of AI technology in a human rights-friendly manner, and anticipates that this will pave the way for legally mandating the HRIA for AI.

※ The original text of the decision and other documents is in Korean. The English version is provided for convenience by translating the original text. Please note that the interpretation of the decision and other documents should be based on the Korean version.
