Miguel Afonso Caetano<p><a href="https://tldr.nettime.org/tags/FRANCE" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FRANCE</span></a> <a href="https://tldr.nettime.org/tags/CNAF" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>CNAF</span></a> <a href="https://tldr.nettime.org/tags/Algorithms" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Algorithms</span></a> <a href="https://tldr.nettime.org/tags/RiskScoring" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>RiskScoring</span></a> <a href="https://tldr.nettime.org/tags/AlgorithmicDiscrimination" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AlgorithmicDiscrimination</span></a>: "Fifteen French NGOs are suing the public body that distributes allowances for families, youth, housing, and inclusion (CNAF) at the French state council over the use of a risk-scoring algorithm, which affects almost half of France's population, according to a Wednesday (16 October) press release.</p><p>This legal action follows the Court of Justice of the EU (CJEU) ruling that decision-making based on scoring algorithms processing personal data is unlawful under the EU's data privacy regulation (GDPR).</p><p>The NGOs are calling on the state council to refer the case to the CJEU for a preliminary ruling. The case could take two to five years, depending on how the reference is handled.</p><p>"This algorithm mathematically reflects the discriminations already present in our society. It is neither neutral nor objective," said Marion Ogier, a lawyer at the Human Rights League, at a press conference in Paris on Wednesday.</p><p>Since 2010, the CNAF has been using an algorithm to select recipients for a review of their benefits. These checks focus on cases deemed 'higher risk' based on the recipient's profile and situation.</p><p>However, a number of local investigations published in December 2023 criticised these checks for not being truly random. Seventy per cent of the 128,000 checks conducted in 2021 originated from scoring algorithms, CNAF revealed in a 2022 report.</p><p>"The CNAF algorithm is just one part of the system. The public pension schemes, health insurance, and employment service all use similar algorithms," Ogier added."</p><p><a href="https://www.euractiv.com/section/tech/news/french-ngos-sue-public-body-over-scoring-algorithm/" rel="nofollow noopener noreferrer" translate="no" target="_blank">https://www.euractiv.com/section/tech/news/french-ngos-sue-public-body-over-scoring-algorithm/</a></p>
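<p>To make the criticism concrete, here is a minimal, purely hypothetical Python sketch of the general pattern the article describes: recipients are ranked by a weighted sum of profile features and only the top of the ranking is selected for a check. This is not CNAF's actual model; every feature name, weight, and the review budget below is an assumption for illustration. It simply shows why such checks are "not truly random" and how weights that track precarious situations can reproduce existing discrimination.</p>
<pre><code>
# Hypothetical sketch, NOT CNAF's actual system: score-based selection
# of benefit recipients for review. All features, weights, and the
# review budget are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Recipient:
    recipient_id: str
    months_on_benefits: int      # length of benefit history
    income_instability: float    # 0.0 (stable) .. 1.0 (highly variable)
    single_parent: bool
    recent_address_changes: int

# Illustrative linear weights: each feature nudges the score upward.
# If the weighted features correlate with precarity, the "risk" ranking
# mirrors existing social inequalities rather than measuring fraud.
WEIGHTS = {
    "months_on_benefits": 0.01,
    "income_instability": 0.5,
    "single_parent": 0.3,
    "recent_address_changes": 0.1,
}

def risk_score(r: Recipient) -> float:
    """Weighted sum of profile features, used as a 'risk' score."""
    return (
        WEIGHTS["months_on_benefits"] * r.months_on_benefits
        + WEIGHTS["income_instability"] * r.income_instability
        + WEIGHTS["single_parent"] * float(r.single_parent)
        + WEIGHTS["recent_address_changes"] * r.recent_address_changes
    )

def select_for_review(recipients: list[Recipient], budget: int) -> list[Recipient]:
    """Pick the `budget` highest-scoring recipients for a benefits check."""
    return sorted(recipients, key=risk_score, reverse=True)[:budget]

if __name__ == "__main__":
    population = [
        Recipient("a", 6, 0.1, False, 0),
        Recipient("b", 48, 0.8, True, 2),   # long history, unstable income
        Recipient("c", 24, 0.4, False, 1),
    ]
    for r in select_for_review(population, budget=1):
        print(r.recipient_id, round(risk_score(r), 2))
</code></pre>
<p>Running the sketch always flags the same kind of profile ("b" here), which is the deterministic, non-random behaviour the NGOs object to.</p>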