Facial Recognition – Austrian Regulations v European Approach?

Over the last couple of years, more and more countries have deployed technologies that allow them to match digital images of a person (e.g. from a surveillance camera) against a database of pictures.

While these technologies tend to facilitate certain tasks, be they identification of criminal suspects, ID verification for various computing platforms or other electronic devices, or deployment on CCTV surveillance, they are prone to misuse and to data and privacy infringement. For example, the government of Hong Kong used real-time facial recognition technology to identify protestors at the 2019 demonstrations against the Extradition Law Amendment Bill, causing them to destroy cameras and "smart" lampposts used for CCTV surveillance. But as these technologies are increasingly adopted in Europe and in Austria, it is important to learn more about the domestic use, regulations and plans for the future.
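Conceptually, the matching described above boils down to turning each picture into a numerical representation (an "embedding") and comparing the embedding of a probe image against the embeddings stored in a database. The following Python sketch is purely illustrative of that idea; the record names, vectors and threshold are hypothetical assumptions and do not reflect any vendor's or authority's actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(probe: np.ndarray, database: dict) -> list:
    """Rank database entries by their similarity to the probe embedding, best first."""
    scores = [(record_id, cosine_similarity(probe, emb)) for record_id, emb in database.items()]
    return sorted(scores, key=lambda item: item[1], reverse=True)

# Illustrative usage with random stand-in vectors; a real system would derive the
# embeddings from images using a trained face-embedding model.
rng = np.random.default_rng(0)
database = {f"record_{i}": rng.normal(size=128) for i in range(1000)}
probe = rng.normal(size=128)

THRESHOLD = 0.6  # hypothetical decision threshold
best_id, best_score = rank_candidates(probe, database)[0]
print(best_id, best_score, "match" if best_score >= THRESHOLD else "no match")
```

Whether the top-ranked candidate counts as a "match" depends entirely on the chosen threshold, which is one reason the accuracy and auditability of the algorithm feature so prominently in the regulatory debate below.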

Also, have coronavirus mask-wearing mandates set back the use or reliability of facial recognition software?

General

Biometric data, meaning "personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data", is a special category of personal data and is therefore subject to the heightened protection requirements of Art 9 of the General Data Protection Regulation ("GDPR"). The use of facial recognition software by Austrian/European law enforcement authorities is subject to the Directive "on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences" (the Law Enforcement Directive).

Guidelines of the Council of Europe on facial recognition

At the end of January 2021, the Council of Europe ("COE") issued guidelines on how to develop and use facial recognition without infringing rights to privacy and data protection. The COE has identified the following topics that need to be taken into account to ensure the rights and liberties of data subjects:

  • Lawfulness

The processing of biometric data must rely on an appropriate legal basis that assesses the necessity and proportionality of facial recognition and addresses the explanation of the specific use and purpose, the accuracy of the algorithm, the retention period of used pictures, the possibility of auditing these criteria, the traceability of the process and the enshrined safeguards.

In light of the above, the COE states that certain uses must be restricted by law: for example, the use of facial recognition for the purpose of determining a person's skin colour, health condition or religious belief is strictly limited, and affect recognition is prohibited altogether.

The legal basis will consider and address the different types of intrusiveness (e.g. real-time / not-real-time, sectors where it is used, etc.). It must also be ensured that images that are available in a digital format cannot be processed to extract biometric data when these images were initially captured for other purposes (e.g. social media).

Law enforcement authorities must distinguish between the use of facial recognition for verification or identification, strictly observing the necessity for and proportionality of the latter. The law should provide rules and criteria for the creation of databases. These prerequisites will be even stricter when it comes to real-time facial recognition.

In the private sector, the free, informed, explicit and specific consent as stipulated in the GDPR will be a prerequisite for use. Private entities will not be allowed to deploy facial recognition in an uncontrolled environment (e.g. a shopping mall) for marketing or private security purposes.

  • Involvement of the supervisory authorities

Authorities will be systematically involved in legislative and administrative matters prior to and during envisaged projects.

  • Certification

Lawmakers must guarantee the accountability of developers, manufacturers, service providers or entities using these technologies and set up qualified certification mechanisms.

  • General principles for entities

Entities using facial recognition must comply with data protection principles and guarantee the transparency and fairness of the processing. They must adhere to the principles of the GDPR (such as purpose limitation, data minimisation and limited duration of storage), meet the highest standards of data security, as breaches have particularly severe consequences for data subjects, and implement technical and organisational measures to ensure their accountability. Furthermore, entities must carry out impact assessments, as the processing of biometric data presents high risks to the fundamental rights of data subjects.

  • Rights of data subjects

Rights of data subjects, such as the right of information and access, the right to rectification and to obtain knowledge of the reasoning, and the right to object will be granted and will only be restricted where this is provided for by law, absolutely necessary and proportionate.

Austria – Regulations and use

With the issuance of the Safety Bundle (Sicherheitspaket) 2018, which amended the Security Police Act (Sicherheitspolizeigesetz, "SPG"), the Criminal Procedure Law (Strafprozessordnung) and the Federal Telecommunications Act (Telekommunikationsgesetz), the Austrian legislator enabled law enforcement authorities to access and process the data of public surveillance cameras (i.e. cameras in train and metro stations, public places, airports, schools, hospitals, etc.) in real time to "fulfil their tasks" in the case of ongoing or imminent danger. This right can be exercised without judicial/court permission. While this provision enables the authorities to access surveillance data, it does not expressly permit the use of facial recognition technology. According to the Austrian Ministry of the Interior (Innenministerium, "BMI"), however, other provisions (Section 64 (2) in conjunction with Section 75 SPG) provide for the use of facial recognition. Critics note that the open wording of these provisions cannot serve as a valid legal basis for deploying arbitrary forensic technologies such as facial recognition.

Nevertheless, the BMI confirmed that it had already bought facial recognition software in 2019 and used it in a test run in 581 cases between December 2019 and June 2020. During this test, the programme identified only 83 unknown criminal suspects by matching digital images against a forensic evidence database containing roughly 10 million pictures, a success rate of slightly more than 14 %. The Austrian Minister of the Interior, Karl Nehammer, emphasised that the technology is only intended to be used to identify unknown perpetrators suspected of intentionally committing a criminal offence. It is not used for real-time surveillance, because the Austrian legal system does not provide for such use.

After the test run, the facial recognition technology was incorporated into the normal operations of the Federal Criminal Police Office (Bundeskriminalamt) and had been used 931 times by 1 October 2020. Following criticism from various data protection entities, the BMI renamed the technology "digital picture matching" (Digitaler Bildabgleich), but continues to use it without an explicit legal provision.
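As a side note on the figures above: the reported success rate of roughly 14 % follows directly from the raw test numbers, as this minimal check shows.

```python
identified = 83   # suspects identified during the test run (per the BMI)
cases = 581       # total cases in the test run
print(f"{identified / cases:.1%}")  # prints 14.3%
```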

While Austria currently matches digital images of suspects only against its own forensic databases, it is leading the negotiations for an amendment of the Prüm Convention. This treaty allows signatory EU Member States to access each other's databases of DNA profiles, fingerprints and other biometric personal data (as well as vehicle registration databases) for law enforcement purposes. Austria and the other members of the working party wish to extend the convention and merge the databases into a single database that also includes digital pictures linked to the biometric data. A database of EU citizens on this scale would be unprecedented.

Conclusion

  • Austria

While Austria does comply with many points laid out in the COE's guidelines, an explicit legal basis still needs to be established. There is also room for improvement concerning the rights of data subjects. In addition, there are no suitable domestic safeguards and certifications guaranteeing the transparency and fairness of the processing when it comes to facial recognition. Austrian privacy experts worry that the use of facial recognition software may lead to a gradual extension of powers, such as real-time surveillance and processing without a valid legal basis.

  • General conclusion

While the use of facial recognition to identify criminal suspects is to be welcomed, this technology comes with various problems:

  1. Facial recognition technology is far from perfect. Studies show high failure rates, especially when it comes to the identification of people of colour or ethnic minorities, thereby amplifying discrimination. Additionally, facial recognition software can still be fooled by facial expressions (e.g. disgust), leading to false positives or negatives (see the illustrative sketch after this list).

  2. The underlying algorithms of facial recognition software are often business secrets, meaning the authorities using the software cannot understand the logic involved. Furthermore, human operators tend to accept computer decisions without (re-)checking them.

  3. Facial recognition could be used inappropriately (e.g. to identify climate change protesters, human rights activists, etc.).

  4. The "chilling effect", meaning the inhibition, deterrence or discouragement of the legitimate exercise of fundamental natural and legal rights (e.g. free speech) by real-time surveillance and facial recognition (hence the limitation of free will), could be amplified.

  5. The creation of databases with biometric data is problematic, as these databases could be hacked and the biometric data used for criminal purposes.
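To illustrate point 1: in a threshold-based system, false positives and false negatives trade off against each other. Raising the similarity threshold reduces wrong identifications but causes more genuine matches to be missed, and vice versa. The sketch below is a hypothetical illustration with made-up score distributions, not measurements of any deployed system.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical similarity scores: genuine pairs (same person) tend to score higher
# than impostor pairs (different people), but the two distributions overlap.
genuine_scores = rng.normal(loc=0.75, scale=0.10, size=10_000)
impostor_scores = rng.normal(loc=0.45, scale=0.10, size=10_000)

for threshold in (0.5, 0.6, 0.7):
    false_negative_rate = float(np.mean(genuine_scores < threshold))    # missed true matches
    false_positive_rate = float(np.mean(impostor_scores >= threshold))  # wrong identifications
    print(f"threshold={threshold:.1f}  FNR={false_negative_rate:.1%}  FPR={false_positive_rate:.1%}")
```

If, in addition, the score distributions differ across demographic groups, a single global threshold produces unequal error rates, which is the mechanism behind the discrimination concerns mentioned above.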

Facial recognition has many sensible applications, but undoubtedly also poses risks. In light of the COVID-19 pandemic, tracking and surveillance software was often hastily implemented in countries all over the world. Therefore, increased attention to fundamental civil and privacy rights and liberties seems to be required, and facial recognition should only be implemented in compliance with the COE's guidelines.

Oh, and while face masks confused facial recognition technologies at the beginning of the COVID-19 pandemic (causing failure rates of up to 50 %), the latest studies show that the algorithms have adapted and cut the failure rate down to only 5 %.

By Florian Terharen, Associate, Schoenherr