Biometrics - Biometric Authentication in Security
Biometric authentication uses unique physical characteristics (fingerprints, face, iris, voice) or behavioral patterns (typing rhythm, gait) for identification. It is phishing-resistant and convenient—but it comes with specific risks: stolen biometric data cannot be changed, and liveness detection attacks and GDPR requirements (Art. 9) necessitate careful implementation.
Biometric authentication combines two authentication factors: "what you are" (fingerprint, face) and "what you have" (the device that stores the biometric template). When implemented using modern methods—such as FIDO2 or passkeys—it is resistant to phishing and significantly more convenient than passwords. The pitfalls lie in implementation and data protection.
Types of Biometric Recognition
Physiological Biometrics
Fingerprint Recognition
Technologies:
- Capacitive: measures capacitance differences (common in smartphones)
- Optical: camera under the display glass (newer smartphones, in-display)
- Ultrasonic: sonar-like, works with wet fingers (Qualcomm 3D Sonic)
Typical error rates:
- FAR (False Accept Rate): approx. 0.001% (1 in 100,000 false acceptances)
- FRR (False Reject Rate): approx. 1–2% (1–2% of legitimate users rejected)
- Risk: stolen fingerprints (the Chaos Computer Club recreated the German Defence Minister's fingerprint from press photos)
Facial Recognition (Face ID, Windows Hello)
| System | Technology | FAR |
|---|---|---|
| Apple Face ID | 3D infrared projector (structured light), liveness detection | 1:1,000,000 (best consumer product) |
| Windows Hello Face | IR camera + depth sensor | < 1:100,000 |
| 2D Cameras | No depth sensor | Insecure (photo bypass possible!) |
Attacks:
- 2D Photo: Possible with poor implementations
- 3D Mask: Very complex, hardly relevant in practice
- Deepfake Video: New threat to video-based KYC
Iris Recognition
- Accuracy: FAR < 1:1,000,000 (very high)
- Usage: High-security areas, border control (EU Entry/Exit System)
- Convenience: low (user must look directly into the camera)
- Falsification: Contact lenses with a fake pattern (demonstrated in research)
Voice Biometrics
- Usage: Call center authentication, voice assistants
- Attacks: Deepfake audio (increasingly easy with AI!)
- FAR: rises significantly against high-quality deepfakes
- Recommendation: NOT as the sole factor for critical systems
Behavior-Based Biometrics
Keystroke Dynamics
- Typing rhythm, key pressure, inter-key timing
- Transparent authentication (user is unaware)
- Application: Banking (abnormal typing rhythm → MFA challenge)
- Risk: Training phase required, recognition rate weaker than physiological methods
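The scoring idea behind keystroke dynamics can be sketched in a few lines: compare the inter-key timings of a login attempt against a stored per-user profile, and trigger a step-up MFA challenge when the deviation is too large. The timing values, the mean-absolute-deviation metric, and the threshold below are illustrative assumptions, not a production model.

```python
# Minimal keystroke-dynamics sketch (illustrative values, not a trained model):
# compare inter-key timings of a login attempt against a stored user profile
# and trigger a step-up MFA challenge on anomaly.

def anomaly_score(profile_ms, sample_ms):
    """Mean absolute deviation between stored and observed inter-key timings."""
    assert len(profile_ms) == len(sample_ms)
    return sum(abs(p - s) for p, s in zip(profile_ms, sample_ms)) / len(profile_ms)

def needs_mfa_challenge(profile_ms, sample_ms, threshold_ms=40.0):
    # threshold_ms is a hypothetical tuning knob set during the training phase
    return anomaly_score(profile_ms, sample_ms) > threshold_ms

profile = [120, 95, 180, 110, 140]   # user's trained inter-key intervals (ms)
normal  = [125, 90, 175, 115, 150]   # same user, typical session
suspect = [60, 220, 40, 300, 80]     # markedly different typing rhythm

print(needs_mfa_challenge(profile, normal))   # False: rhythm matches profile
print(needs_mfa_challenge(profile, suspect))  # True: abnormal -> MFA challenge
```

A real system would use many more features (key hold times, digraph latencies) and a statistical or ML model instead of a fixed threshold.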
Mouse Dynamics
- Mouse movement patterns, click precision
- Combination with keystroke: significantly better
- Use: Fraud detection, not as primary authentication
Gait Recognition
- Step sensor analysis (smartphone accelerometer)
- Scientific research, rarely used in production yet
- Privacy issue: constant monitoring
Biometrics and FIDO2/Passkeys
> IMPORTANT: With Passkeys/FIDO2, biometric data NEVER leaves the device!
Process
Registration:
- Device generates key pair: private (local) + public (server)
- Private key: stored encrypted, unlockable only via biometrics
- Public key: stored on server
Authentication:
- Server sends challenge
- Device: "Please use fingerprint/Face ID to unlock"
- Fingerprint unlocks local private key
- Device signs challenge with private key
- Server verifies signature with public key
- Biometric data: NEVER LEAVES THE DEVICE
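The challenge-response flow above can be sketched using Ed25519 as a stand-in signature scheme (an assumption for brevity: real WebAuthn assertions use CBOR-encoded structures and per-credential algorithm negotiation). This requires the third-party `cryptography` package.

```python
# Sketch of the FIDO2 challenge-response flow (requires 'cryptography').
# The biometric step is implicit: it only unlocks the local private key.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# --- Registration: key pair is generated on the device ---
device_private_key = Ed25519PrivateKey.generate()    # never leaves the device
server_public_key = device_private_key.public_key()  # sent to the server

# --- Authentication ---
challenge = os.urandom(32)             # 1. server sends a random challenge
# 2.-3. fingerprint/Face ID unlocks the private key; device signs the challenge
signature = device_private_key.sign(challenge)
# 4. server verifies the signature with the stored public key
try:
    server_public_key.verify(signature, challenge)
    print("login OK")        # biometric data itself was never transmitted
except InvalidSignature:
    print("login rejected")
```

Note that only the signature travels over the network: the server learns nothing about the biometric template, which is the core privacy property of the design.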
Biometric Storage: Secure vs. Insecure
Secure (local):
- Apple Secure Enclave (iPhone): biometric template cryptographically isolated
- Qualcomm SPU: similar concept for Android
- Windows Trusted Platform Module (TPM): Windows Hello template
- FIDO2 hardware key: Template in the Secure Element
- Attacker requires physical device access + security vulnerability
Insecure (centralized):
- Server-side facial recognition database
- Biometric data in the cloud backend
- Breach → all biometric templates stolen
- Stolen biometric data can never be changed
- Prominent example: Suprema BioStar 2 (2019): over 1 million fingerprints stored unencrypted
FAR and FRR: The Two Error Types
FAR (False Accept Rate):
- Definition: How often is a stranger falsely accepted?
- Example FAR 0.001%: 1 out of 100,000 random attempts is successful
- Security-relevant: low FAR = more secure
FRR (False Reject Rate):
- Definition: How often is a legitimate user rejected?
- Example FRR 1%: 1 out of 100 genuine logins fails
- Convenience-related: high FRR = frustrating
Equal Error Rate (EER):
- The operating point where FAR = FRR (where the two error curves intersect)
- Low EER = better system
| System | EER |
|---|---|
| Apple Face ID | < 0.001% |
| Smartphone fingerprint | ~0.5% |
| Basic fingerprint sensor | ~2-5% |
Adjusting the Thresholds
High security (e.g., nuclear power plant):
- FAR very low (0.0001%), FRR higher (5%)
- Better to reject a user than accept an attacker
High convenience (e.g., smartphone unlocking):
- FRR low (0.1%), FAR slightly higher (0.01%)
- Legitimate users are rarely rejected; occasional false accepts are tolerated
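The FAR/FRR trade-off and the EER can be made concrete with a toy matcher: FAR counts impostor scores above the decision threshold, FRR counts genuine scores below it, and the EER lies where the two curves cross. All scores below are invented for illustration.

```python
# Toy threshold sweep for a biometric matcher (made-up match scores):
# raising the threshold lowers FAR at the cost of FRR, and vice versa.

genuine  = [0.91, 0.88, 0.95, 0.70, 0.85, 0.93, 0.60, 0.89]  # same-user scores
impostor = [0.20, 0.35, 0.10, 0.55, 0.30, 0.65, 0.25, 0.15]  # stranger scores

def far(threshold):
    """Fraction of impostor attempts falsely accepted at this threshold."""
    return sum(s >= threshold for s in impostor) / len(impostor)

def frr(threshold):
    """Fraction of genuine attempts falsely rejected at this threshold."""
    return sum(s < threshold for s in genuine) / len(genuine)

# High-security tuning: high threshold -> FAR drops, FRR rises
print(far(0.8), frr(0.8))
# Convenience tuning: low threshold -> FRR drops, FAR rises
print(far(0.5), frr(0.5))

# Approximate EER: the threshold where |FAR - FRR| is smallest
thresholds = [t / 100 for t in range(0, 101)]
eer_t = min(thresholds, key=lambda t: abs(far(t) - frr(t)))
print(eer_t, far(eer_t), frr(eer_t))
```

Production systems evaluate this over millions of attempts and report the full DET curve rather than a single operating point.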
Liveness Detection
- Distinguishes a real person from a photo/video/mask
- Passive Liveness: Software analyzes image (depth, micro-movements)
- Active Liveness: User must blink, turn head, speak
- Important for: Face recognition without a 3D sensor
- Deepfake risk in 2025: Passive liveness detection often bypassable!
Data Protection and GDPR
Art. 9 GDPR: Special Categories of Personal Data
Biometric data for unique identification = subject to special protection
Prohibition with exceptions:
- Art. 9(2)(a): explicit consent of the data subject
- Art. 9(2)(g): substantial public interest (e.g., national security)
- Art. 9(2)(h): medical care
Without an applicable exception, processing is prohibited!
Practical Cases for Businesses
Case 1: Biometrics on the user’s own device (Passkeys, Windows Hello)
- Data does not leave the device
- No server-side template
- Art. 9 GDPR: generally considered unproblematic (EDPB view: purely local verification without an identification function is not "biometrics for unique identification")
- In practice: Passkeys with FaceID are acceptable under data protection law
Case 2: Time tracking via fingerprint terminal
- Templates stored centrally
- Art. 9 GDPR → DPIA (Data Protection Impact Assessment) usually required
- Employee consent: problematic (power imbalance, not truly voluntary)
- Recommendation: Offer an alternative (PIN/chip card)!
- Hamburg Data Protection Authority: Biometric time tracking without an alternative = unlawful
Case 3: Facial recognition in office buildings
- Art. 9 GDPR + other data protection laws
- DPIA mandatory
- In many EU countries: permitted only where there is a strong public interest
- EU AI Act: real-time remote biometric identification in publicly accessible spaces is generally prohibited (narrow exceptions: e.g., counter-terrorism, searching for missing children)
DPIA Checklist for Biometrics
- Does the purpose justify the use of biometric data?
- Is a less data-intensive alternative possible?
- Is freedom of consent guaranteed (alternative option offered!)?
- Technical safeguards: encryption, access controls
- Deletion policy: when are templates deleted?
- Is the Data Protection Officer involved?
- Is prior consultation with the supervisory authority required? (Art. 36 GDPR)