Avoiding the Same Mistakes: Understanding and Countering Bias in the Deployment of Artificial Intelligence for Humanitarian Assessments
Other Titles:Going Deeper: Establishing Rules for the Ethical Use of Artificial Intelligence in Humanitarian Emergencies
Series/Report no.:PREA Conference. Ethics and Humanitarian Research: Generating Evidence Ethically. The Fawcett Event Center, The Ohio State University, Columbus, Ohio, March 25-26, 2019. Presentation. Session 10. Oral Presentations 3. Paper A.
An effective response to humanitarian emergencies relies on detailed information about the needs of the affected population. In recent years, most primary data collection for this purpose has moved to handheld computer-assisted personal interviewing technologies and, to a smaller extent, to computer-assisted telephone interviews. Natural language processing (NLP), a type of artificial intelligence (AI), provides radical new opportunities to capture qualitative data from the voice responses of thousands of people per day and analyze it for relevant content, informing humanitarian emergency decisions more rapidly. But this innovation, currently in its pilot stages for deployment in Yemen, would rely heavily on opaque algorithms and training data to convert qualitative responses into data for operational planning purposes. Based on key informant interviews with engineers and humanitarian survey specialists, and a review of the latest proposals for countering bias in AI development, this paper provides an overview of the major ethical challenges related to deploying NLP in humanitarian emergencies. I demonstrate that previous quantitative data collection methods carry their own set of biases that have become entrenched in humanitarian assessments, and show how we may avoid similar mistakes in an age of increasing automation in data collection.
AUTHOR AFFILIATION: Tino Kreutzer, York University, Canada, email@example.com