Denmark: Amnesty criticizes use of algorithms by welfare authority
The Danish welfare authority is trying to detect social benefits fraud with the help of algorithms – and in doing so has created a system of mass surveillance, Amnesty International reports. The organization argues that the system “is eroding individual privacy and undermining human dignity” – and could even run afoul of a new EU ban.
For its new report, “Coded Injustice: Surveillance and Discrimination in Denmark’s Automated Welfare State,” Amnesty investigated the use of algorithms by the Danish welfare authority Udbetaling Danmark (UDK). The authority was established in 2012 to centralize payouts of state benefits – including childcare, sick pay, housing, and unemployment benefits. Working together with private companies, the authority developed algorithms that are meant to help identify potential benefits fraud.
Up to 60 different algorithms are deployed to flag individuals for further investigation by the authorities. In the course of its research, Amnesty was granted partial access to four of the algorithms in use.
Merging of data
In order to detect fraud, UDK accesses state databases that contain information about recipients of benefits, their family members, and other members of their households. According to Amnesty’s report, the personal data of millions of Danish residents is linked or merged in this way. The data processed by the authority includes information on individuals’ residency or change of residence, citizenship, birthplace and family circumstances. Tax and health data, as well as information on a person’s education level, employment and income, are also processed.
Algorithms analyze these and other data points to generate a list of people who supposedly pose a higher risk of claiming benefits fraudulently. Their cases are then examined further by specialists.
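Amnesty’s report does not reveal the internals of UDK’s models. But the kind of pipeline it describes – merging registry records and flagging cases whose combined attributes cross a risk threshold – can be illustrated with a deliberately simplified sketch. Every field name, weight, and threshold below is a hypothetical stand-in, not a detail from the actual system:

```python
from dataclasses import dataclass

# Hypothetical, simplified illustration of registry linkage and risk
# flagging. No field, weight, or threshold here comes from UDK's system.

@dataclass
class Resident:
    person_id: str
    address_changes_last_year: int  # e.g. from a residency register
    household_size: int             # e.g. from a civil register
    declared_income: float          # e.g. from tax data
    reported_income: float          # income reported to the benefits agency

def risk_score(r: Resident) -> float:
    """Combine merged registry data into a single risk number."""
    score = 0.0
    if r.address_changes_last_year > 2:   # frequent moves raise the score
        score += 1.0
    if r.household_size > 5:              # large households read as "atypical"
        score += 0.5
    if abs(r.declared_income - r.reported_income) > 10_000:
        score += 2.0                      # income mismatch weighs heaviest
    return score

def flag_for_review(residents: list[Resident], threshold: float = 2.0) -> list[str]:
    """Return the IDs of people whose score crosses the threshold;
    their cases go to human caseworkers for further examination."""
    return [r.person_id for r in residents if risk_score(r) >= threshold]
```

Even in this toy form, the rights problem Amnesty describes is visible: ordinary life circumstances – moving house often, living in a large household – mechanically push a person’s score toward the investigation threshold.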
Hellen Mukiri-Smith, researcher on artificial intelligence and human rights at Amnesty, said in a statement: “This expansive surveillance machine is used to document and build a panoramic view of a person’s life that is often disconnected from reality. It tracks and monitors where a social benefit claimant lives, works, their travel history, health records, and even their ties to foreign countries.”
“Pre-existing inequalities”
Amnesty points out that these algorithms are being deployed “in an environment of pre-existing inequalities – laws, rules, institutions, norms, and values – within Danish society. These discriminatory structures are embedded” in the algorithms. People are categorized based on purported differences in what amounts to “othering.”
In its investigation, the NGO determined that the fraud detection mechanism specifically and disproportionately targets already marginalized groups that are seen as “other” because their living patterns or family arrangements are “unusual.” Individuals belonging to these groups come under suspicion of fraud or are classified by the authorities as ineligible for benefits. The authorities’ method of categorization carries the risk of discriminating against low-income individuals, migrants and refugees, ethnic minorities and people with disabilities.
To identify fraud in childcare or pension programs, for example, Danish authorities use an algorithm to detect “unusual” or “atypical” living patterns. “Yet there is no clarity on what constitutes such situations,” Amnesty reports, “leaving the door open for arbitrary decision-making.”
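What counts as “atypical” is precisely what Amnesty says is left undefined. One common way such systems operationalize the idea – shown here purely as an assumed technique, not as UDK’s documented method – is to measure how far a household deviates statistically from the rest of the population:

```python
from statistics import mean, stdev

# Hypothetical illustration: flag households that deviate strongly from
# the population average. An assumed technique, not UDK's documented method.

def atypicality(value: float, population: list[float]) -> float:
    """Standard score: how many standard deviations from the 'norm'."""
    mu, sigma = mean(population), stdev(population)
    return abs(value - mu) / sigma if sigma else 0.0

household_sizes = [2, 3, 2, 4, 2, 3, 2, 8]  # toy comparison population

# A multi-generational household of eight is a strong statistical outlier ...
print(atypicality(8, household_sizes))  # ~2.3 standard deviations

# ... while the statistically common two-person household passes unnoticed.
print(atypicality(2, household_sizes))  # ~0.6
```

The toy numbers mirror Amnesty’s objection: the cutoff for “unusual” is an artifact of whoever happens to be in the comparison population, not an objective criterion.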
Said Mukiri-Smith: “People in non-traditional living arrangements [. . .] are all at risk of being targeted” by algorithms for further investigation. This could include people who are married but live apart as well as “those living in a multi-generational household, a common arrangement in migrant communities.”
An algorithm to determine “foreign affiliation”
In evaluating recipients of child benefits, the algorithms seek to identify whether a person has “strong ties” to a country outside the European Economic Area (EEA). Those categorized as such are prioritized for further investigation. According to Amnesty, the perceived need for such categorization stems from the UDK’s fears that beneficiaries are living abroad without informing the authority – and thus unjustly continuing to receive benefits.
Amnesty points out that the UDK’s method of investigation is not based on objective criteria. Rather, an individual’s “strength of ties” is determined in relation to other beneficiaries who are seen as the “norm.” Information on individuals’ foreign residence and travel in and out of the country is among the data processed – as are the number of children a person has and their citizenship.
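Amnesty’s description suggests a relative rather than absolute measure: each claimant is scored against the rest of the caseload, which serves as the “norm.” A minimal sketch of that logic, with entirely assumed features and weights, might look like this:

```python
# Hypothetical sketch of a relative "foreign affiliation" score. The
# features and weights are assumptions for illustration; Amnesty notes
# that UDK's actual criteria are neither objective nor publicly specified.

def ties_score(days_abroad: int, non_eea_citizenship: bool,
               children_with_foreign_citizenship: int) -> float:
    score = days_abroad / 365                    # travel and residence data
    if non_eea_citizenship:
        score += 1.0                             # citizenship as a direct input
    score += 0.5 * children_with_foreign_citizenship
    return score

def prioritized_for_investigation(claimants: list[dict]) -> list[dict]:
    """Rank each claimant against the caseload average -- the 'norm'
    is simply everyone else currently in the data."""
    scores = [ties_score(c["days_abroad"], c["non_eea"], c["foreign_children"])
              for c in claimants]
    average = sum(scores) / len(scores)
    return [c for c, s in zip(claimants, scores) if s > average]
```

Because the cutoff is simply the caseload average, the same person can be flagged or cleared depending on who else happens to be in the data – the arbitrariness, and the direct use of citizenship as an input, that Amnesty criticizes.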
Responding to Amnesty’s investigation, the UDK stated “that the use of ‘citizenship’ as a parameter in their algorithms does not constitute processing of sensitive personal information.” Amnesty disagrees, arguing that in some circumstances citizenship could be used as a proxy for a person’s ethnicity or migration status. The report states that the use of these criteria “explicitly targets people from countries outside the EEA and, therefore, directly discriminates on the basis of nationality, ethnicity and migration status.”
Denmark has passed laws to enable this comprehensive data processing. Amnesty argues, however, that the country has engaged in the processing of sensitive data – and that this intrusion into people’s private lives is neither necessary nor proportionate, as defined by international human rights treaties.
Heightened scrutiny “eating” away at recipients of benefits
Amnesty also spoke to organizations and individuals affected by this algorithm-driven scrutiny. One person interviewed by Amnesty said that they were “always afraid” of becoming the target of investigations into potential benefits fraud. Gitte Nielsen of the Dansk Handicap Foundation spoke of people with disabilities who were subject to constant interrogation and felt that the scrutiny was “eating” away at them.
According to the report, the Danish system not only facilitates the surveillance of those who apply for and receive benefits, it also acts as a barrier that prevents access to those benefits. Among those affected are women in crisis shelters, many of whom don’t have access to the internet-connected computers needed to apply for benefits. Digitization also limits access for people with disabilities: even an application for a new wheelchair can only be submitted online.
Danish authorities confirmed that in investigating suspected cases of fraud, they also monitor recipients’ social media profiles. This “may pose risks to a person’s rights to privacy, freedom of expression and social security,” Amnesty’s report states. This type of surveillance can also have a “chilling effect,” writes Amnesty, “as people are forced to censor themselves.”
The report also cites instances of authorities drawing false conclusions based on social media posts – posts that don’t always reflect a person’s actual living circumstances. Amnesty demands an end to the practice.
The lack of sufficiently independent oversight over the UDK’s processing of data encourages human rights abuses, Amnesty argues. The organization recommends the creation of an independent public authority with oversight powers. The Danish data protection authority must also investigate UDK’s data processing, says Amnesty.
The Danish system could fall under the prohibition on “social scoring” in the EU’s new AI Act, Amnesty believes. The NGO calls on Danish authorities to suspend use of the system, at least until its permissibility under EU law can be determined. The use of data relating to “foreign affiliation” must also be banned, Amnesty argues.
“Under various international human rights treaties, European Union laws, and national law, Denmark has a legal obligation to safeguard human rights like privacy, data protection, freedom of expression, equality, and non-discrimination,” Amnesty writes. Hellen Mukiri-Smith adds: “These rights aren’t just the driving force of a just society, they also enable fair access to social security, healthcare, and more.”
Together with other NGOs, Amnesty filed a complaint in France last month against the use of another social security algorithm. The groups allege discrimination and call for a ban. (js)