In Middle East, poor miss out as 'faulty' algorithms target aid

Reuters
Palestinian women receive condolences for relatives who were among the victims of floods, in Deir Al-Balah, central Gaza Strip, Sept 13. Photo: Reuters

Asma Ibrahim, an unemployed woman who lives in a cramped shack in northern Lebanon, has no idea why she was refused a welfare benefit for the country's poorest people - money that would prevent her five children from going to bed hungry.

"People in need are not receiving anything," she said by phone from the deprived Akkar region, adding that she had been calling a government hotline twice a week to ask why she was turned down after applying more than a year ago.

In the absence of an explanation, she blames corruption, and is angry that some of her neighbours are receiving the aid.

"How is that supposed to make me feel?" she said.

Lebanon is one of nine Arab nations using an algorithm-powered poverty assessment formula funded by the World Bank that ranks welfare applicants according to dozens of different data points.

Called Proxy Means Testing (PMT), the software considers factors such as family size, address, and ownership of livestock or cars, but the full details of the metrics have not been disclosed in the countries where it is being used.
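How such a formula might work can be sketched in a few lines of code. The example below is purely illustrative: the variables, weights and eligibility cutoff are invented, since the actual metrics used in Lebanon, Jordan and the other countries have not been disclosed. The common pattern is a regression-style score that weights observable proxies of household welfare and compares the total against a threshold.

# Hypothetical sketch of a proxy means test (PMT) score. All weights,
# variables and the cutoff are invented for illustration; the real
# formulas used in the region have not been published in full.

HYPOTHETICAL_WEIGHTS = {
    "household_size": -0.15,   # larger families predict lower welfare
    "owns_car": 0.40,          # asset ownership predicts higher welfare
    "owns_livestock": 0.20,
    "urban_address": 0.10,
}
BASE_SCORE = 5.0               # hypothetical intercept
ELIGIBILITY_CUTOFF = 4.0       # households scoring below this would qualify

def pmt_score(household: dict) -> float:
    """Predict household welfare from observable proxies."""
    score = BASE_SCORE
    for feature, weight in HYPOTHETICAL_WEIGHTS.items():
        score += weight * float(household.get(feature, 0))
    return score

def is_eligible(household: dict) -> bool:
    """Eligible if the predicted-welfare score falls below the cutoff."""
    return pmt_score(household) < ELIGIBILITY_CUTOFF

# A seven-member rural family with no car or livestock.
applicant = {"household_size": 7, "owns_car": 0, "owns_livestock": 0, "urban_address": 0}
print(pmt_score(applicant), is_eligible(applicant))  # roughly 3.95, True under these made-up weights

Because the score depends entirely on the proxies recorded at survey time, a household whose circumstances change afterwards keeps its old ranking until the data is refreshed - the "static picture" critics describe.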

The use of such methods aims to make payments fairer and more efficient by targeting them to those most in need, but activists and researchers say algorithmic tools often wrongly exclude people.

"These algorithms only capture a static picture of what people are going through at a single point in time - but this is not how people actually suffer hardship," said Amos Toh, a senior researcher on artificial intelligence (AI) and human rights at Human Rights Watch (HRW).

Research published by the group in June found that many people in Jordan were not receiving financial support under the country's Takaful (meaning Solidarity) programme because "their hardships fall outside an algorithm's faulty model of poverty".

'No justice'

Part of the problem is that the data fed into the systems is often out of date, leading to apparently arbitrary decisions, said Chad Anderson, an international consultant who works on social protection programmes.

Researchers have found PMT-based aid programmes in Lebanon and Yemen relying on census data that is more than a decade old. Fluctuating household income and consumption can also skew poverty assessments, experts say.

"Countries often don't update their targeting lists for months or even years," said Anderson, adding that such issues meant means-testing algorithms were akin to "a random lottery system."

At home in Akkar, Ibrahim said she could see no reason for her family's rejection.

Under Lebanon's National Poverty Targeting Programme, seven-member families living in extreme poverty like hers should be entitled to US$145 (about RM685) per month.

Ibrahim's husband is also unemployed - unable to work because of an eye condition that left him visually impaired after the family could not afford treatment.

"Our government has no justice," she said.

The Lebanese government did not respond to a request for comment.

The World Bank, which said it had worked with governments to tailor the PMT-based programmes to meet their needs, defended the systems deployed in the region for more than a decade, but acknowledged their limitations.

Around the world, 40 countries use PMT systems, among them Morocco, Algeria, Saudi Arabia and Tunisia in the Middle East and North Africa (MENA), the bank said in a 2020 report.

"Unfortunately, no targeting method is perfect, whether it's welfare, age-based or relying on other criteria," the World Bank told the Thomson Reuters Foundation in written comments, adding that it takes apparent cases of unfair decisions "very seriously and will follow up with concerned teams accordingly".

"Fortunately, there are robust standard procedures to handle such cases," the institution said, adding that countries had made significant investments in grievances and redress mechanisms.

Risk of errors

Welfare applicants face other hurdles, too.

In Jordan, applications can be made online, but many people said they preferred to go to local registration centres to get help filling out their form and to reduce the risk of errors that could lead to rejections.

Others said a lack of computer literacy deterred them from making electronic applications.

Even after taking such precautions, some people said they had been rejected despite apparently meeting key criteria.

Because 38-year-old Maysa's husband is Egyptian, the couple's five children do not have Jordanian citizenship, as women in Jordan cannot pass their nationality to their spouses and children.

Maysa, who asked not to give her full name, said she thinks her Takaful application was rejected because her children's status meant she was treated as an individual requesting aid rather than as a family.

The algorithm calculates aid based on the number of Jordanians in a household, the HRW report said, citing the National Aid Fund (NAF), the government agency charged with implementing Takaful.
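Based on that description, the effect on a family like Maysa's can be shown with a short hypothetical sketch; only the citizenship rule comes from the HRW report, while the data structure and names are invented.

# Hypothetical illustration of counting only Jordanian citizens in a household,
# as HRW describes. Everything else here is invented for the example.

def assessed_household_size(members):
    """Count only members who hold Jordanian citizenship."""
    return sum(1 for m in members if m.get("jordanian_citizen"))

family = [
    {"name": "mother", "jordanian_citizen": True},
    {"name": "father", "jordanian_citizen": False},  # Egyptian spouse
]
family += [{"name": f"child_{i}", "jordanian_citizen": False} for i in range(1, 6)]

print(assessed_household_size(family))  # 1 - the household of seven is assessed as a lone individual

Under such a rule, a mixed-nationality family's assessed size shrinks to the number of citizen members, which would be consistent with Maysa being treated as an individual applicant rather than as a family.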

As she waits to put in another application, Maysa, who is unemployed, said she had no choice but to run up debts.

"My kids need milk, they need diapers and to eat and drink," she said by phone.

Jordan's government did not respond to a request for comment.

In Egypt, Cairo pensioner Haggag Abu Yasser was turned down twice for aid from the Takaful and Karama (Solidarity and Dignity) programme before finally being accepted on his third attempt.

Applicants are meant to be visited by workers to assess their living conditions and gather data which feeds into the PMT software, but Abu Yasser said that never happened in his case.

"They didn't even come to see my living conditions and see how I'm living with my wife," he said, lamenting that it had taken him five years to be accepted.

The Egyptian government did not respond to a request for comment.

Costs and benefits

A better - and more cost-effective - approach than algorithmic targeting would be for governments to move towards universal social protection systems, said Aya Majzoub, Amnesty International's deputy director for the MENA region.

"These programmes are often very costly to implement, which means that less money ultimately goes into the hands of the beneficiaries," Majzoub said, referring to PMT-based aid systems.

But Sikandra Kurdi, a researcher at the International Food Policy Research Institute (IFPRI), which was hired by the World Bank to conduct technical assessments of its anti-poverty programmes, said algorithmic tools are a reasonable option for countries that cannot afford universal social protection.

"Once you agree that you cannot catch everybody and you have to make some decisions about targeting, then PMT is a fair way to do it," she said, adding that policymakers should be aware that it is not a "technocratic magic bullet".

Officials highlight the programmes' positive results.

Between 2019 and 2021, Jordan's Takaful programme reduced inequality and poverty by 0.7% and 1.4% respectively, the World Bank said in a statement.

The largest such programme in the region, Takaful reached 220,000 people in 2023, the bank said in a June report.

In the city of Karak, Salem Ali, 43, said he applied for Takaful aid in 2020 and has been benefiting from it ever since.

His family now receives US$120 per month - money he uses to meet essential household expenses such as food and utility bills.

"It's better than nothing," he said. "It helps with (my children who are) students, I have no other income at all."