Facial Recognition Technology Under DPDPA: Legal, Ethical & Practical Guidance


Author: Advocate (Dr.) Prashant Mali | Published: February 01, 2026

Introduction: Facial Recognition in Modern India

Facial recognition technology (FRT) is being deployed rapidly across Indian infrastructure: airports use DigiYatra for contactless travel, offices track attendance with face scans, retail stores analyze customer demographics, and law enforcement matches faces against criminal databases. While FRT offers convenience and security benefits, it processes biometric data, among the most sensitive categories of personal data under the DPDPA. Organizations deploying facial recognition therefore face stringent compliance requirements and significant legal risk.

Critical Point: Facial recognition technology falls under "biometric data" in DPDPA Section 3(b), classified as sensitive personal data requiring heightened protection, explicit consent, and strict processing limitations. Many current FRT deployments operate without adequate legal basis or consent mechanisms.

Understanding Facial Recognition as Biometric Data Under DPDPA

What Constitutes Biometric Data?

Under DPDPA Section 3(b), biometric data includes information derived from physical, physiological, or behavioral characteristics that identify an individual, including:

  • Facial geometry and features (distance between eyes, jawline shape, etc.)
  • Iris and retina scans
  • Fingerprints and palmprints
  • Voice prints and speaker recognition
  • Gait analysis (how a person walks)
  • DNA data
  • Behavioral biometrics (typing pattern, mouse movement)

Facial images are themselves personal data; the biometric templates extracted from those images are even more sensitive biometric data.

Special DPDPA Protections for Sensitive Personal Data

DPDPA Section 6 requires that sensitive personal data (including biometric data) be processed only:

  • With explicit consent from the data subject (not implied or general consent)
  • For processing that is necessary to fulfill an obligation under law or court order
  • For processing necessary to protect vital interests
  • For processing by a government agency in public interest
  • In limited cases by organizations for their own purposes where data is collected directly from the subject

Processing must be restricted to purposes directly related to why consent was obtained.

Types of Facial Recognition Deployments and Legal Basis

1. Airport Facial Recognition: DigiYatra Program

Overview: The Bureau of Civil Aviation Security (BCAS) introduced DigiYatra, enabling passengers to use facial recognition instead of physical documents at major Indian airports (Delhi, Mumbai, Bangalore, Hyderabad, and others).

How It Works:
  1. Passenger registers with DigiYatra app, submitting passport or Aadhaar
  2. Facial image captured during registration
  3. At airport, passenger stands in front of facial recognition gate
  4. System matches live face to registered image and ID document
  5. Gates open without physical interaction
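The gate decision in steps 3-5 reduces to a 1:1 comparison between a live face embedding and the template enrolled at registration. The sketch below illustrates this with cosine similarity over toy vectors; the threshold, vectors, and function names are illustrative assumptions, not DigiYatra's actual pipeline:

```python
import math

MATCH_THRESHOLD = 0.90  # illustrative; real systems tune this against error-rate targets

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def gate_decision(live_embedding, enrolled_embedding):
    """1:1 verification: open the gate only if the live capture matches
    the template enrolled during registration."""
    return cosine_similarity(live_embedding, enrolled_embedding) >= MATCH_THRESHOLD

# Toy embeddings; real ones come from a face-recognition model
enrolled = [0.10, 0.80, 0.30]
live_same = [0.12, 0.79, 0.31]   # same traveller, slightly different capture
live_other = [0.90, 0.10, 0.20]  # a different person

print(gate_decision(live_same, enrolled))   # True
print(gate_decision(live_other, enrolled))  # False
```

A compliant deployment would also log the decision and discard the live capture immediately after verification rather than retaining it.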
DPDPA Compliance Framework:
  • Legal Basis: Necessary to fulfill obligation under law (Airport Authority Act, immigration requirements) - does NOT require explicit consent for each encounter
  • Data Minimization: Only facial biometric necessary for identity verification; should not capture and store additional facial data
  • Purpose Limitation: Data used only for airport security verification and immigration compliance, not for secondary uses like:
    • Profiling passengers for commercial marketing
    • Sharing with law enforcement for crime investigation (without separate legal basis)
    • Tracking passenger movements in airport beyond security check
    • Correlating with other databases for surveillance
  • Data Retention: DPDPA Section 7 requires data retention for limited period - ideally delete after successful security verification; maximum retention should be justified
  • Security Measures: Biometric data must have special technical/organizational measures (encryption, access controls, audit logs)
  • Transparency: Travelers must be clearly informed that facial recognition is used and what happens to their facial data
  • Rights: Passengers should have mechanism to request facial data deletion upon completion of journey
Potential DPDPA Risks:
  • If facial databases are shared with law enforcement or intelligence agencies without explicit legal provision and passenger notice
  • If facial data is retained indefinitely beyond airport security purpose
  • If system is not transparent - passengers not informed of facial recognition use
  • If technical security is inadequate and facial data could be breached

2. Workplace Facial Recognition for Attendance

Scenario: An IT company deploys a facial recognition system for employee attendance tracking. Employees face a camera, the system identifies them, and attendance is marked automatically without manual entry.

DPDPA Compliance Issues:
  • Employee Consent Problem: While employees consented to joining the company, they did not consent specifically to facial biometric processing. Fresh, explicit consent required for FRT use
  • Purpose Limitation Concerns: Attendance tracking is legitimate purpose, but can facial data be used for:
    • Identifying employees not on premises during working hours (movement tracking)?
    • Profiling which employees meet with whom (social network analysis)?
    • Detecting emotions or stress levels from facial expressions?
    • Determining bathroom/break room usage patterns?
    All of these exceed the attendance-tracking purpose for which consent was given.
  • Less Restrictive Alternatives: DPDPA favors data minimization. Attendance can be tracked via:
    • ID card access systems (no biometric data needed)
    • Fingerprint systems (still biometric but less invasive than facial data that captures identity + appearance + expression)
    • Manual attendance self-reporting
    Using the least invasive technology is required.
  • Data Retention: Facial images should not be stored permanently. Retain only biometric template for current employment period, delete after employee leaves
  • Rights of Employees: Mechanism for employees to:
    • Access facial data collected about them
    • Correct inaccurate facial templates (if system incorrectly identifies them)
    • Delete facial data
    • Opt out of FRT and use alternative attendance method
Compliance Path for Employers:
  1. Conduct Data Protection Impact Assessment (DPIA) specifically for facial recognition use
  2. Determine if less invasive alternatives could serve purpose
  3. Obtain explicit consent from employees with clear explanation of:
    • What facial data is collected
    • How long it's retained
    • Who has access to it
    • Purposes it will be used for
    • Right to object or request deletion
  4. Implement technical security (encryption, access controls)
  5. Document consent in writing
  6. Establish data deletion schedule
  7. Provide employees with access to their facial data and deletion rights
  8. Train security team handling facial data on DPDPA requirements
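Steps 5 and 6 above (document consent in writing, establish a deletion schedule) imply a structured consent record per employee. The dataclass below is a hypothetical sketch of what such a record might capture; the field names and the 365-day window are assumptions for illustration, not fields prescribed by the DPDPA:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class BiometricConsentRecord:
    """Hypothetical written-consent record for workplace FRT attendance."""
    employee_id: str
    purpose: str              # e.g. "attendance tracking only"
    granted_at: datetime
    retention_days: int       # deletion schedule agreed at consent time
    withdrawn_at: Optional[datetime] = None

    def is_valid(self, now: datetime) -> bool:
        """Consent holds only if not withdrawn and within the retention window."""
        if self.withdrawn_at is not None:
            return False
        return now <= self.granted_at + timedelta(days=self.retention_days)

record = BiometricConsentRecord("EMP-1042", "attendance tracking only",
                                granted_at=datetime(2026, 1, 5), retention_days=365)
print(record.is_valid(datetime(2026, 6, 1)))   # True: consent active
record.withdrawn_at = datetime(2026, 6, 2)     # employee opts out
print(record.is_valid(datetime(2026, 6, 3)))   # False: processing must stop
```

Keeping withdrawal as a first-class field makes the opt-out right auditable: any processing after `withdrawn_at` is immediately visible as a violation.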

3. Retail Facial Recognition for Customer Analytics

High-Risk Deployment: A retail chain installs facial recognition at store entrance to:
  • Identify frequent customers
  • Analyze demographic breakdown of store visitors (age, gender, apparent ethnicity)
  • Track how many times each customer visits
  • Correlate with purchase data to build customer profiles
  • Send targeted marketing based on facial analysis
DPDPA Violations:
  • Illegitimate Processing: Customers did NOT consent to facial recognition for analytics
  • Excessive Data Capture: Capturing facial data of ALL store visitors for behavioral analysis is disproportionate
  • Discriminatory Profiling: Using facial features (age, gender appearance) to infer demographics for marketing violates fairness and non-discrimination principles
  • Purpose Inflation: Even if customer consented to in-store cameras for security, consent doesn't extend to facial recognition for marketing analytics
  • Inadequate Notice: Customers would not know their facial data is being captured and analyzed
DPDPA Compliance Prohibition: This use case is largely non-compliant. Stores cannot process customers' facial biometric data for marketing without explicit prior consent, and such consent is difficult to obtain in an open retail environment where customers are never directed to consent forms.

Compliant Alternative:
  • Use WiFi/mobile analytics or in-store camera systems for movement tracking (less biometric-dependent)
  • Implement opted-in customer loyalty program where customers voluntarily provide data
  • Use non-facial sensors for crowd analytics (heat mapping of store zones)

4. Law Enforcement Facial Recognition

Use Case: Police departments use facial recognition systems to:
  • Identify suspects from CCTV footage at crime scenes
  • Match faces of arrested individuals against criminal database
  • Locate missing persons or wanted criminals
  • Verify identity at borders and checkpoints
DPDPA Section 6 Exception for Government: Law enforcement facial recognition may proceed without explicit consent if:
  • Processing is necessary to fulfill obligation under law (Criminal Procedure Code, passport verification requirements, etc.)
  • Processing is by government agency in legitimate public interest
  • Processing is ordered by court
DPDPA Compliance Requirements Even for Police:
  • Legal Basis Clarity: Specific law or court order must authorize facial recognition use - cannot be general authority
  • Purpose Limitation: Facial data collected for criminal suspect identification cannot be repurposed for immigration screening or marketing
  • Data Minimization: Only necessary criminal suspects' facial data should be in database, not all citizens
  • Accuracy and Fairness: System must have high accuracy (testing shows FRT systems have higher error rates for women and dark-skinned individuals); biased systems violate fairness principles
  • Transparency & Rights: Individuals have right to know if their facial data is in criminal database and to request correction if wrongly included
  • Security: Criminal biometric database must have highest security standards; unauthorized access is breach of trust and DPDPA violation
  • Retention Limits: Facial data of innocent individuals (wrongly arrested, acquitted) should be deleted promptly
Known Issues with Police FRT:
  • Facial recognition systems used by law enforcement globally show higher error rates for women and people of color
  • Risk of algorithmic bias leading to false arrest of marginalized groups
  • Lack of transparency about which agencies have access to facial data
  • Concern about mission creep (facial databases initially for criminals used for general population tracking)

The Puttaswamy Judgment: Constitutional Implications for Facial Recognition

K.S. Puttaswamy v. Union of India (2017) - Right to Privacy as Fundamental Right: In this landmark judgment, India's Supreme Court recognized that privacy is a fundamental right under the Indian Constitution (Part III). The judgment has direct implications for facial recognition.

Key Holdings Relevant to FRT:
  • Privacy is Fundamental: Citizens have constitutional right to be left alone and not be subject to unreasonable surveillance
  • Bodily Integrity: Bodily autonomy and protection against biometric exploitation are core privacy rights
  • State Accountability: Government surveillance systems must be justified by legitimate state purpose and subject to judicial oversight
  • Proportionality Test: Even legitimate government purposes (national security, crime prevention) must be balanced against privacy invasion
  • Not Absolute: While privacy is fundamental, it is not absolute and can be restricted if:
    • Justified by legitimate state aim
    • Proportionate and necessary (least restrictive means used)
    • Subject to safeguards and oversight
Application to Facial Recognition Systems:
  • Mass Surveillance FRT: Systems that indiscriminately capture facial data of all citizens (e.g., CCTV with FRT analyzing entire city populations) would face constitutional scrutiny. Legitimate law enforcement use limited to specific purposes (identifying suspects) would pass test; general surveillance would fail.
  • Consent and Choice: Citizens cannot meaningfully consent to inevitable mass FRT in public spaces, but can reasonably expect privacy in workplaces and private establishments. Puttaswamy suggests consent is relevant to privacy analysis.
  • Judicial Oversight: Government FRT programs should ideally have judicial approval and regular review, not pure executive discretion
  • Transparency: Citizens have right to know about government surveillance systems; secret facial recognition programs would violate Puttaswamy principles
DPDPA Alignment with Puttaswamy: DPDPA Section 6 (biometric data processing) and overall principles (purpose limitation, data minimization, transparency) directly implement Puttaswamy's privacy protections in statutory law.

The Panopticon in the Digital Age: Philosophical Dimensions of Surveillance

Jeremy Bentham's Panopticon and Michel Foucault's Surveillance Theory: Bentham's panopticon, an architectural design in which a central observer can watch all prisoners without being seen, became Foucault's metaphor for modern surveillance systems. Foucault argued that the mere existence of surveillance induces conformity even without constant observation: knowing you might be watched changes behavior.

Facial Recognition as Modern Panopticon:
  • Invisible Observation: FRT operates without subject awareness. Citizens don't know when they're being identified, creating perpetual feeling of potential observation
  • Behavioral Modification: Knowledge of FRT causes self-censorship and conformity. Protesters wear masks, journalists are chilled from investigating sensitive topics, activists avoid public assembly
  • Power Imbalance: State and corporations monitor citizens, but individuals cannot monitor monitors. Asymmetrical surveillance relationship
  • Chilling Effect on Rights: Right to freedom of expression, assembly, association are undermined when people know their movements are tracked and faces identified
DPDPA's Response to Panopticon Risk:
  • Consent Requirement: Explicit consent for biometric processing ensures citizens are aware and have choice
  • Purpose Limitation: Data can't be repurposed for surveillance beyond stated purpose
  • Transparency: Citizens must be informed of surveillance systems and data uses
  • Rights to Access and Delete: Individuals can see what's being tracked and remove themselves from systems
  • Regulator Review: Data Protection Board can audit surveillance systems and order cessation of non-compliant programs
The Balance: Facial recognition technology is not inherently harmful; it solves real problems (airport security, finding missing children). The question is whether a given deployment respects human dignity and privacy. The DPDPA provides the framework for striking that balance.

Facial Recognition: Challenges and Limitations Beyond DPDPA

Technical Accuracy Issues

  • False Match Rate: System incorrectly identifies person A as person B
  • False Non-Match Rate: System fails to recognize person (different lighting, age progression, facial hair, masks)
  • Bias in Accuracy: Systems trained primarily on Caucasian faces perform worse on Asian, African, and Indian faces; worse on women than men; worse on young and elderly than prime age adults
  • Adversarial Attacks: Sophisticated makeup, prosthetics, or glasses can fool systems
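The false match rate and false non-match rate above are computed from labelled comparison trials at a chosen decision threshold. A minimal sketch with invented scores (the numbers are illustrative, not benchmarks from any real system):

```python
def error_rates(trials, threshold):
    """Compute false match rate (FMR) and false non-match rate (FNMR).

    Each trial is (similarity_score, same_person). A score at or above
    the threshold counts as a claimed match."""
    genuine = [s for s, same in trials if same]
    impostor = [s for s, same in trials if not same]
    fnmr = sum(1 for s in genuine if s < threshold) / len(genuine)
    fmr = sum(1 for s in impostor if s >= threshold) / len(impostor)
    return fmr, fnmr

trials = [(0.95, True), (0.85, True), (0.97, True), (0.88, True),
          (0.40, False), (0.92, False), (0.30, False), (0.20, False)]
fmr, fnmr = error_rates(trials, threshold=0.90)
print(fmr)   # 0.25: one of four impostor pairs wrongly accepted
print(fnmr)  # 0.5: two of four genuine pairs wrongly rejected
```

Raising the threshold lowers the false match rate but raises the false non-match rate; operators must tune this trade-off, and a DPIA should document the chosen operating point.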

Privacy Risks Specific to Facial Recognition

  • No Anonymity in Public: Unlike biometrics that stay hidden (fingerprints are not visible at a distance), faces are exposed wherever a person goes, so facial data collection is difficult for individuals to control
  • Real-time Identification: Technology enables real-time tracking of individuals across space and time
  • Re-identification Risk: Anonymized facial data can be de-anonymized by comparing to identified faces in social media or elsewhere
  • Unauthorized Matching: Facial data can be surreptitiously matched against databases individuals don't know exist

Practical Compliance Framework for Organizations Deploying Facial Recognition

Step-by-Step Compliance Checklist:

Pre-Deployment Phase

  • ☐ Determine whether FRT is necessary or less-invasive alternative exists
  • ☐ Conduct Data Protection Impact Assessment (DPIA) specifically for facial recognition
  • ☐ Identify legal basis for processing (consent, law obligation, vital interest, public interest)
  • ☐ Assess technical accuracy of FRT system - test for bias across demographic groups
  • ☐ Plan data retention schedule - define when facial data will be deleted
  • ☐ Establish access controls - specify which personnel can access facial database
  • ☐ Plan security measures - encryption, audit logging, intrusion detection

Consent and Notice Phase

  • ☐ Draft clear, plain-language notice explaining facial recognition use
  • ☐ Obtain explicit written consent from individuals whose facial data will be collected
  • ☐ Document consent along with date, person, and specific authorization
  • ☐ Provide layered notice - initial notice plus in-situ notification at point of FRT deployment
  • ☐ Create mechanism for individuals to withdraw consent and request data deletion

Implementation Phase

  • ☐ Deploy FRT system with technical security in place
  • ☐ Implement logging of all facial recognition transactions
  • ☐ Set automated deletion timers for facial data (never store indefinitely)
  • ☐ Test system accuracy and audit for bias
  • ☐ Train staff on DPDPA requirements and facial data sensitivity
  • ☐ Establish incident response procedure for data breaches
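The "automated deletion timers" item above can be sketched as a periodic purge job that removes templates past their documented retention window and writes an audit entry for each deletion. The 90-day window, field names, and in-memory store are assumptions for illustration:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=90)  # illustrative; use your documented schedule

def purge_expired(templates, now):
    """Remove facial templates past retention and log each deletion."""
    kept, audit_log = [], []
    for t in templates:
        if now - t["captured_at"] > RETENTION:
            audit_log.append({"template_id": t["id"], "deleted_at": now})
        else:
            kept.append(t)
    return kept, audit_log

db = [{"id": "T1", "captured_at": datetime(2026, 1, 1)},
      {"id": "T2", "captured_at": datetime(2026, 5, 1)}]
kept, log = purge_expired(db, now=datetime(2026, 5, 15))
print([t["id"] for t in kept])   # ['T2']: T1 exceeded 90 days
print(log[0]["template_id"])     # 'T1'
```

Running a job like this on a schedule, with its audit log retained for inspection, gives the checklist items "never store indefinitely" and "maintain audit trail" a concrete, verifiable form.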

Ongoing Compliance

  • ☐ Quarterly review of FRT system use and accuracy
  • ☐ Annual bias audit to detect discriminatory performance
  • ☐ Monitor for unauthorized access or misuse of facial database
  • ☐ Review and honor data subject access and deletion requests
  • ☐ Maintain audit trail for regulatory inspection
  • ☐ Periodically assess whether FRT is still necessary or should be discontinued
  • ☐ Review facial data retention schedules and execute deletions on time
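The annual bias audit above amounts to comparing error rates across demographic groups rather than reporting a single aggregate figure. A minimal sketch of a per-group false-non-match check (the group labels and scores are invented for illustration):

```python
from collections import defaultdict

def per_group_fnmr(trials, threshold):
    """Per-group false non-match rate from genuine-pair trials.

    Each trial is (group_label, similarity_score) for a pair known to be
    the same person. Large gaps between groups are an audit red flag."""
    totals, misses = defaultdict(int), defaultdict(int)
    for group, score in trials:
        totals[group] += 1
        if score < threshold:
            misses[group] += 1
    return {g: misses[g] / totals[g] for g in totals}

trials = [("group_a", 0.96), ("group_a", 0.93), ("group_a", 0.88),
          ("group_b", 0.91), ("group_b", 0.84), ("group_b", 0.82)]
rates = per_group_fnmr(trials, threshold=0.90)
print(rates)  # group_a misses 1 of 3 pairs; group_b misses 2 of 3
```

A real audit would use far larger trial sets and also compare false match rates per group, but the structure is the same: disaggregate, compare, and investigate any gap.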

Conclusion: Balancing Technology and Privacy

Facial recognition technology will continue to advance and spread across Indian infrastructure. The DPDPA provides the legal framework for deploying it in a way that respects individual privacy and human dignity. Organizations that implement FRT transparently, with explicit consent, proportionate purpose limitation, robust security, and respect for individual rights, will navigate this landscape successfully. Those that deploy FRT covertly or without legal basis will face regulatory enforcement and potential constitutional challenges under the Puttaswamy precedent.

