Thesis Topics

Iris Presentation Attack Detection (Iris PAD)

One of the simplest attacks on a biometric system consists of presenting a fake artefact, known as a Presentation Attack Instrument (PAI), to the capture device (e.g., a silicone face mask, a gummy fingerprint, or a patterned contact lens) in order to impersonate someone else or to avoid being recognised. This is known as a Presentation Attack (PA). Whereas considerable efforts have been devoted to detecting these attacks for other biometric characteristics, such as face or fingerprint, this is not yet the case for iris recognition.

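As a concrete starting point, many iris PAD baselines rest on hand-crafted texture descriptors. The sketch below implements the classic 8-neighbour Local Binary Pattern (LBP) histogram on a grayscale image represented as a list of lists; a real PAD system would feed such histograms to a trained classifier (e.g., an SVM), and the toy image here is illustrative only.

```python
def lbp_histogram(img):
    """256-bin Local Binary Pattern histogram for a 2D grayscale image."""
    h, w = len(img), len(img[0])
    hist = [0] * 256
    # 8 neighbours, clockwise from the top-left
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            centre = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                if img[y + dy][x + dx] >= centre:
                    code |= 1 << bit
            hist[code] += 1
    return hist

# Toy 4x4 "image": a flat patch yields only the all-ones LBP code (255).
flat = [[10] * 4 for _ in range(4)]
print(lbp_histogram(flat)[255])  # 4 interior pixels, all coded 255
```

The appeal of texture features for PAD is that printed irises and patterned contact lenses disturb the fine texture statistics of a live iris even when the overall appearance looks convincing.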

Iris Quality

In order to ensure the proper functioning of biometric recognition systems, inputs (i.e., face or iris images, fingerprint samples) need to fulfil high quality standards (remember the garbage-in, garbage-out principle). Toolkits for assessing the quality of biometric samples already exist for face (the BSI's OFIQ) and fingerprint (NIST's NFIQ 2.0), but not yet for iris. Only a handful of general-purpose metrics are mentioned in ISO/IEC 29794-6 and have been implemented in the literature.

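To illustrate the kind of general-purpose measure involved, sharpness is one metric such a toolkit would report. The sketch below estimates it as the variance of a 4-neighbour Laplacian response; note this is a common simplified proxy, not the normative filter defined in ISO/IEC 29794-6, and the tiny images are illustrative.

```python
def laplacian_variance(img):
    """Sharpness proxy: variance of the 4-neighbour Laplacian over a 2D image."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

# A sharp edge scores higher than a flat (defocused) region:
flat = [[50] * 5 for _ in range(5)]
edge = [[0, 0, 100, 100, 100] for _ in range(5)]
print(laplacian_variance(flat) < laplacian_variance(edge))  # True
```

A full iris quality toolkit would combine several such scalar measures (usable iris area, iris-sclera contrast, pupil dilation, etc.) into a unified quality score, as OFIQ and NFIQ 2.0 do for face and fingerprint.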

Biometric Template Protection (BTP)

Biometric data are classified as sensitive personal data by the EU General Data Protection Regulation (GDPR) and therefore need to be protected accordingly during use and storage. Due to their noisy nature (e.g., a change in illumination when acquiring a facial picture, or a temporary illness), dedicated methods need to be developed to ensure end-to-end protection. Multiple approaches have been presented in the literature, mostly focusing on particular biometric characteristics. The goal is to develop general approaches, valid for several characteristics, and possibly multi-biometric algorithms.

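One classical scheme from that literature is the fuzzy commitment, which handles the noise mentioned above with error-correcting codes: a random secret is encoded, XORed with the binary template, and only that XOR ("helper data") plus a hash of the secret are stored. The sketch below uses a toy 3-bit repetition code and illustrative bit-list templates; a deployed system would use stronger codes and longer templates.

```python
import hashlib
import secrets

def repeat_encode(bits, n=3):
    """Toy error-correcting code: repeat each bit n times."""
    return [b for b in bits for _ in range(n)]

def repeat_decode(bits, n=3):
    """Majority-vote decoding of the repetition code."""
    return [1 if sum(bits[i:i + n]) * 2 > n else 0
            for i in range(0, len(bits), n)]

def enroll(template):
    """Return (helper data, hash tag); the raw template is discarded."""
    secret = [secrets.randbelow(2) for _ in range(len(template) // 3)]
    helper = [t ^ c for t, c in zip(template, repeat_encode(secret))]
    tag = hashlib.sha256(bytes(secret)).hexdigest()
    return helper, tag

def verify(template, helper, tag):
    """Recover the secret from a (possibly noisy) probe and check the hash."""
    codeword = [t ^ h for t, h in zip(template, helper)]
    return hashlib.sha256(bytes(repeat_decode(codeword))).hexdigest() == tag

enrolled = [1, 0, 1, 1, 0, 1, 0, 0, 0]      # 9-bit toy template
helper, tag = enroll(enrolled)
noisy = enrolled[:]
noisy[4] ^= 1                                # one bit flips at probe time
print(verify(noisy, helper, tag))            # True: the code absorbs the error
```

The stored pair reveals neither the template nor the secret on its own, which is what makes such schemes candidates for GDPR-compliant storage; the open challenge noted above is making them work across several biometric characteristics at once.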

Synthetic Biometrics

Generative adversarial networks (GANs) can generate very realistic images, which could help train better biometric systems (e.g., by increasing the number of samples from minority populations to tackle bias, or by augmenting the number of attack samples for attack detection). However, the person's identity is frequently lost in this generation process, or artefacts reduce the images' usability for biometric recognition. This problem affects both facial and iris images, in the near-infrared and visible spectra.


Transparency & Fairness in Biometrics

Face recognition systems have frequently been labelled "biased", "racist", "sexist", or "unfair" by numerous media outlets, organisations, and researchers. Since this is an emerging challenge, further research in this area is required to ensure equal treatment across different demographic groups.


ECG-based biometrics

Biometric recognition based on electrocardiogram (ECG) signals has been explored for human identification for decades. Given that ECGs are not as easy to acquire as faces or fingerprints, they are also harder to replicate, making it more difficult to produce artefacts for launching attacks. On the other hand, ECG signals vary depending on physical activity or the environment, potentially affecting recognition accuracy.

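As a minimal illustration of a typical first processing step, most ECG-based pipelines begin by locating the R-peaks (the dominant spike of each heartbeat) so that fixed-length heartbeat segments can be extracted as features. The naive threshold-and-local-maximum detector below runs on a synthetic trace and merely stands in for real detectors such as Pan-Tompkins applied to filtered signals.

```python
def r_peaks(signal, threshold):
    """Indices of samples that exceed `threshold` and are local maxima."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]):
            peaks.append(i)
    return peaks

# Synthetic trace: baseline noise with two "R-peaks" at indices 3 and 10.
ecg = [0.1, 0.0, 0.2, 1.5, 0.1, 0.0, 0.1, 0.2, 0.0, 0.1, 1.4, 0.2, 0.1]
peaks = r_peaks(ecg, threshold=1.0)
print(peaks)                # [3, 10]
print(peaks[1] - peaks[0])  # R-R interval in samples: 7
```

The environment-dependent variability mentioned above shows up precisely in these R-R intervals and beat shapes, which is why robust segmentation and normalisation are central research questions for ECG-based recognition.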