
Artificial intelligence can help scale humanitarian interventions. But it can also have major disadvantages
NEW YORK – As the International Rescue Committee grapples with a dramatic increase in the number of displaced people in recent years, the refugee aid agency has sought efficiencies wherever possible, including using artificial intelligence.

Since 2015, the IRC has been investing in Signpost, a portfolio of mobile apps and social media channels that answer questions in different languages for people in dangerous situations. The Signpost project, which involves many other organizations, has reached 18 million people so far, but the IRC wants to significantly increase its reach using artificial intelligence tools.

Conflicts, climate emergencies and economic hardship have driven up demand for humanitarian aid: 117 million people were forcibly displaced in 2024, according to the United Nations High Commissioner for Refugees. As humanitarian organizations encounter more people in need, they also face enormous funding shortfalls. The turn toward artificial intelligence technologies is in part driven by this massive gap between needs and resources.

To meet its goal of reaching half of displaced people within three years, the IRC is building a network of AI chatbots that can increase the capacity of humanitarian officers and the local organizations that directly serve people through Signpost. For now, the project operates in El Salvador, Kenya, Greece and Italy and responds in 11 languages. It draws on a combination of large language models from some of the biggest technology companies, including OpenAI, Anthropic and Google.

The chatbot response system also uses customer service software from Zendesk and receives other support from Google and Cisco Systems.

Beyond developing these tools, the IRC wants to extend this infrastructure to other nonprofit humanitarian organizations at no cost. They hope to create shared technology resources that less technically focused organizations can use without having to negotiate directly with technology companies or manage deployment risks.

“We’re trying to be really clear about where the legitimate concerns are, but lean into the optimism of the opportunities and also not allow the populations we serve to be left behind in solutions that have the potential to scale in a way that person-to-person or other technology can’t,” said Jeannie Annan, the International Rescue Committee’s Chief Research and Innovation Officer.

The answers and information provided by Signpost chatbots are reviewed by local organizations to ensure they are up to date and sensitive to the precarious circumstances people may find themselves in. An example query the IRC shared came from a woman from El Salvador traveling through Mexico to the United States with her son, seeking shelter and services for her child. The bot provided a list of providers in the area she was in.

More complex or sensitive queries are passed along for people to respond to.

The most consequential potential downside of these tools is that they might not work. For example, what if the situation on the ground changes and the chatbot doesn’t know? It could provide information that is not just wrong but dangerous.

A second problem is that these tools can amass a valuable trove of data about vulnerable people that hostile actors could target. What happens if a hacker manages to access data containing personal information, or if that data is accidentally shared with an oppressive government?

The IRC said it has agreed with its technology providers that none of their AI models will be trained on data generated by the IRC, the local organizations or the people they serve. It has also worked to anonymize the data, including by removing personal information and location details.

As part of the Signpost.AI project, the IRC is also testing tools such as a digital automated tutor and maps that can integrate many different types of data to help prepare for and respond to crises.

Cathy Petrozzino, who works at the nonprofit research and development company MITRE, said AI tools carry high potential but also high risks. To use these tools responsibly, she said, organizations should ask themselves: Does the technology work? Is it fair? Are data and privacy protected?

She also emphasized that organizations should bring together a range of people to help manage and design the initiative: not just technical experts, but also people with deep knowledge of the context, legal experts, and representatives of the groups that will use the tools.

“There are a lot of good models sitting in the AI graveyard,” she said, “because they weren’t developed jointly and in collaboration with the user community.”

For any system with potentially life-changing impacts, Petrozzino said, groups should bring in outside experts to independently assess their methodologies. Designers of AI tools also need to consider the other systems their tool will interact with, she said, and plan to monitor the model over time.

Helen McElhinney, executive director of the CDAC Network, said consulting with displaced people or others that humanitarian organizations serve may increase the time and effort needed to design these tools, but not getting their input raises many safety and ethical problems. Such consultation can also unlock local knowledge.

People receiving services from humanitarian organizations should be told if an AI model will analyze any information they hand over, she said, even if the intention is to help the organization respond better. That requires meaningful and informed consent, she said. They should also know whether an AI model is making life-changing decisions about resource allocation, and where accountability for those decisions lies, she said.

Degan Ali, CEO of Adeso, a nonprofit working in Somalia and Kenya, has long been an advocate for changing the power dynamics in international development to give more money and control to local organizations. She asked how the IRC and others pursuing these technologies would overcome access issues, pointing to the weeks-long power outages caused by Hurricane Helene in the U.S. Chatbots won’t help, she said, when there are no devices, internet or electricity.

Ali also warned that few local organizations have the capacity to attend the major humanitarian conferences where the ethics of AI are debated. Few have staff who are both senior enough and knowledgeable enough to really engage in these discussions, she said, though they understand the potential power and impact these technologies can have.

“We must be extraordinarily careful not to replicate power imbalances and biases through technology,” Ali said. “The most complex questions will always require local, contextual and lived experience to answer in a meaningful way.”

___

The Associated Press and OpenAI have a licensing and technology agreement that allows OpenAI access to part of AP’s text archives.

___

Associated Press coverage of philanthropy and nonprofits receives support through the AP’s collaboration with The Conversation US and funding from Lilly Endowment Inc. AP is solely responsible for this content. For AP’s complete philanthropy coverage, visit: https://apnews.com/hub/philanthropy.

Copyright 2024 Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.