
Transparency complaint against secret EU surveillance research “iBorderCtrl”

The EU is funding the development of a supposed “video lie detector” to be used on travellers before they enter the EU. Member of the European Parliament and civil liberties activist Patrick Breyer (Pirate Party) filed a lawsuit on 15 March 2019 for the release of classified documents on the ethical justifiability and legality of the technology.

The General Court of the European Union delivered its judgment on 15 December 2021 (Case T-158/19). The Court ruled that the public may access those parts of the project documentation that discuss the reliability, ethics and legality of such technology in general terms. However, the Court considered that commercial interests rule out public access to information on the specific iBorderCtrl technology, including its legality, its reliability (false positives), the risk of discrimination and mitigation measures.

On 25 February 2022 Breyer filed an appeal (Case C-135/22 P), arguing that he has a right to full access and that the public interest in disclosure outweighs private commercial interests. According to the appeal, the public interest in access to information arises already at the beginning of the research phase and cannot legitimately be deferred until after the research project has been completed; the EU’s practice of publishing only summaries of results cannot satisfy the interest of academics, the media and the general public in the project. The European Court of Justice will deliver its judgement on 7 September 2023.

How is the “video lie detector” supposed to work?

The iBorderCtrl research project, funded by the EU with 4.5 million euros, aimed at developing a prototype. The idea was that people who want to travel to the EU should take a lie detector test at home in front of their webcam. Based on their facial expressions and behaviour when answering standard questions, special software would determine whether the person is telling the truth.

Whether such “deception detection” technology works at all is highly controversial. The only “scientific” assessments of the technology have been published by Manchester Metropolitan University (MMU), which was part of the iBorderCtrl consortium. The MMU scientists have patented the technology and sell it commercially through a company called Silent Talker Ltd. Because the technology is based on machine learning, the developers themselves say they do not know which signals the system treats as signs of deception.
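
The actual Silent Talker model has never been published. Purely for illustration, the following minimal Python sketch shows the general pattern described above; the feature names, training data and labels are invented and are not project data:

```python
# Purely illustrative sketch -- NOT the iBorderCtrl / Silent Talker model,
# whose internals have never been disclosed. It only demonstrates the general
# pattern described above: a machine-learning classifier trained on facial
# micro-gesture features that outputs a "deception" score for each answer.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per answered question (names and values invented),
# e.g. blink rate, gaze shifts, head turns, lip movements.
X_train = rng.random((200, 4))        # 200 labelled interview answers
y_train = rng.integers(0, 2, 200)     # 1 = "deceptive", 0 = "truthful"

model = LogisticRegression().fit(X_train, y_train)

# At screening time, the webcam pipeline would extract the same features from
# a traveller's answer and ask the model for a deception probability.
answer_features = rng.random((1, 4))
print("deception score:", model.predict_proba(answer_features)[0, 1])
```

Even in such a simple setup, the learned weights do not explain what behaviour the system actually treats as “deceptive” – which matches the developers’ own admission quoted above.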

The EU Commission claimed in 2019: “The project proposal has been scientifically assessed by independent experts and has undergone a technical review confirming the scientific assumptions, including the statistical significance of the automatic deception detection system, based on the scientific and technological research carried out to date.” However, the EU refuses to release this “scientific evaluation”. In court, the EU research agency explained that it is not a precondition for EU funding that a project’s methods are scientifically recognised.

Independent scientists fundamentally question whether the truth of a statement can be inferred from “micro-expressions”.

Which information is being withheld?

According to the ruling of 15 December 2021, the following information about the iBorderCtrl project shall not be disclosed in order to protect commercial interests:

  • the legal and ethical implications of the iBorderCtrl technology (documents D1.1, D1.2)
  • ethical and legal concerns (for example, human dignity, right to privacy, non-discrimination, risk of stigmatisation, function creep, etc.) and arrangements for compliance with ethical principles and fundamental rights, both for the research phase and for the operational phase
  • how the iBorderCtrl project deals with profiling and the risk of stigmatisation of both individuals and groups
  • project risks and related safeguards, such as the problem of false positives and false negatives of the iBorderCtrl technology
  • requirements of Union and national law applicable to iBorderCtrl technologies (document D2.3)
  • technical requirements for iBorderCtrl (document D2.1)
  • the concrete iBorderCtrl components: (1) Automatic Deception Detection System, (2) Document Authenticity Analysis Tool, (3) Biometric Module, (4) Face Matching Tool, (5) Hidden Human Detection Technology, (6) Risk Based Assessment Tool and (7) integrated Border Control Analytics Tool (document D2.2)
  • the iBorderCtrl process for collecting personal data (document D3.1)
  • assessment of the quality of the project results as well as risk management (document D8.1)
  • progress reports including information on project risks (foreseen and unforeseen, organisational, technical) and the respective safeguards (documents D8.3 and D8.5, D8.4 and D8.7)

What are the reasons given for refusing public access to the documents?

Breyer’s request for access to the ethics report, a legal assessment, much of the project’s public relations strategy and the project’s results was rejected on the grounds that these documents constitute “commercial information” of the companies involved and are of “commercial value”. The EU research agency’s lawyer explained in court: “Democratic control of research funding is not necessary”, arguing that research and development was not yet about the use of the technology. EU research funding deliberately does not follow an open access approach in order to protect the competitive advantages of the participating companies. Disclosure of the iBorderCtrl project would jeopardise the business interests and reputation of the participating companies and institutions, and statements taken out of context could put the responsible entities under public pressure and jeopardise the completion and marketing of the technology.

Breyer said: “The reasons given for the secrecy demonstrate that it is all about economic profit. Regarding this highly dangerous technology, the transparency interests of the scientific community and the public must take precedence over private profit interests.”

What does Breyer criticise about the technology?

Because of the technology’s lack of reliability, countless people risk being falsely accused of lying and exposed to disadvantages. Certain groups of people (for example persons of colour, women, older people, children, persons with disabilities) might be particularly likely to be falsely accused. After two parliamentary questions by Breyer (1, 2), the EU Commissioner for Home Affairs had to admit that the project did not evaluate what proportion of respondents the technology classified as potential “liars”, what its error rate is and whether the error rate is higher for certain groups of people.
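
To illustrate why even a seemingly modest error rate matters at the scale of EU border crossings, here is a back-of-the-envelope calculation in Python; every number in it is an assumption chosen for illustration only, since, as noted above, the project never evaluated its real error rate:

```python
# Back-of-the-envelope illustration with entirely hypothetical numbers --
# the project's real error rates were, as noted above, never evaluated.
truthful_travellers = 10_000_000   # assumed truthful border crossings per year
false_positive_rate = 0.10         # assumed share of truthful people flagged

falsely_flagged = truthful_travellers * false_positive_rate
print(f"{falsely_flagged:,.0f} truthful travellers falsely flagged as 'liars'")
# With these assumptions: 1,000,000 innocent people flagged per year.
```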

Breyer: “Systems for recognising conspicuous behaviour gradually create a uniform society of passive people who just don’t want to attract attention. Such a dead surveillance society is not worth living in. I am convinced that this pseudo-scientific security hocus-pocus will not detect any terrorists. For stressed, nervous or tired people, such a suspicion-generator can easily become a nightmare. In Germany lie detectors are not admissible as evidence in court precisely because they do not work. We need to put an end to the EU-funded development of technologies for monitoring and controlling law-abiding citizens ever more closely!”

The EU Commission defends the project by referring to an independent external ethics evaluation, but refuses to release it.

EU research funds are being diverted for lobbying purposes

In April 2021, it emerged that the “iBorderCtrl” project, which was entirely funded by the EU, used part of its funding to lobby legislators for fundamental rights restrictions which would allow the use of its controversial technology on travellers. The EU Commission tried to hide this in a partially redacted document that was reconstructed by technical means.

While the Commission publicly claimed that “iBorderCtrl was a research project and did not envisage the piloting or deployment of an actually working system”, the secret parts of the redacted “communications plan” revealed that the iBorderCtrl consortium collaborated with industry “so that [iBorderCtrl] can easily be the basis for many other applications for other target groups and even other application domains”.

The document goes on to acknowledge that “a statutory legal basis will be required” to use the “deception detection” and other technologies at borders. “To foster such legal reforms” the consortium envisaged “dissemination activities to … stakeholders” such as Members of Parliament, the Commission and border authorities.

In her answer to a written question by Patrick Breyer, the EU Commissioner for Home Affairs, Ylva Johansson, denied the verifiable lobbying.

EU has a history of funding illegal and unethical technology

Years ago, FRONTEX had video lie detection technology tested. Under a follow-up project to iBorderCtrl, “TRESSPASS”, the EU again funded the testing of unscientific technology to “assess the sincerity of the traveller and his statements”. In October, the European Parliament expressed its “concern” about the iBorderCtrl project.

The EU keeps funding illegal and unethical surveillance research. With projects like INDECT or CleanIT, EU security research has been criticised for years. The actual extent is difficult to grasp. We have had a long list of potentially relevant research projects compiled, but evaluating them would require extensive analyses.

In order to stop the funding of technology that violates fundamental rights, fundamental changes would be necessary:

  • Advisory bodies should include an equal number of representatives of all political groups, criminologists, victims’ associations and non-governmental organisations for the protection of civil liberties and privacy, in addition to representatives of national governments and industry.
  • A decision on the tendering of a project should be made only after an investigation by the European Fundamental Rights Agency on the impact of the respective research objective on our fundamental rights (impact assessment).
  • The development of technologies for increased surveillance and control of citizens should be excluded by law.
  • Instead, security research should be extended to all crime and accident prevention options, and should include independent research into the effectiveness, costs, harmful side effects and alternatives to each proposal.
  • Because perceived security is an important prerequisite for our well-being, research on how to increase public awareness of security and how to counteract distorted assessments and portrayals of the security situation should be funded as well.

These unresolved issues are exacerbated by the new EU Defence Fund, through which the EU is expanding its flow of money to the development of weapons, i.e. lethal technology.

What is the aim of the lawsuit?

Plaintiff Patrick Breyer: “The European Union is funding illegal technology that violates fundamental rights and is unethical. It labels the research a ‘trade secret’ of the corporations involved. With my transparency lawsuit, I want the court to rule once and for all that taxpayers, scientists, media and Members of Parliament have a right to information on publicly funded research – especially in the case of pseudoscientific and Orwellian technology such as the ‘iBorderCtrl video lie detector’.”

Breyer has successfully taken the EU Commission to the European Court of Justice before, when the Commission refused to hand over documents on indiscriminate data retention.

Relevant documents:

Official Summary of the proceedings (meeting report) (in German)

Pleading of the plaintiff’s lawyer in court (in German)

Documents issued following the first judgment

“Open Security Data”: Search engine on EU security research

EPRS Opinion: Mechanisms to prevent unethical research and funding – Horizon and EDF

Official iBorderCtrl homepage

iBorderCtrl documents:

“Silent talker” homepage (archived)

Press releases: