📹 Video Surveillance / CCTV
Monitoring of common areas by video cameras is one of the most frequent subjects of a DPIA in organizations. A DPIA is mandatory under Article 35(3)(c) GDPR when the processing involves systematic monitoring of a publicly accessible area on a large scale. Video cameras capture images of identifiable individuals and enable tracking of behaviors, locations and movement patterns.
The EDPB (European Data Protection Board) classifies video surveillance as high-risk processing when there are no technical or time limitations. The proportionality principle is essential: installing cameras in transition zones is different from monitoring rest areas or sanitary facilities.
Image retention must be minimal and necessary (30–90 days in general), with clear policies on access, automatic deletion, and data subject rights. Access to recordings must be restricted to authorized personnel, with audited logs.
Legal Basis
- Article 35(3)(c) GDPR: Systematic monitoring of a publicly accessible area on a large scale
- Article 5 GDPR: Principles of lawfulness, transparency and data minimization
- EDPB Category: High risk without technical privacy measures
Risks to Data Subjects
- Loss of privacy and intimacy in shared spaces
- Identification and tracking of sensitive behaviors
- Unauthorized access to recordings by third parties
- Disclosure to authorities without informed consent
Recommended Mitigation Measures
- Data minimization: Install cameras only in common access zones (entrances, corridors), not in privacy areas
- Privacy-enhancing technology: Implement automatic face blurring or real-time anonymization
- Limited retention: Automatically delete images after 30–90 days; longer retention only with specific justification
- Access control: Limit viewing to authorized personnel with audited logs; multi-factor authentication
- Visual notice: Clear signage informing of camera and responsible controller
- Data subject rights: Transparent procedure for access, correction and complaints
- Processing contract: With cloud storage or CCTV service provider
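The limited-retention measure above can be automated as a scheduled job. A minimal sketch in Python, assuming recordings are stored as `.mp4` files whose modification time reflects the capture date (the file layout and the 30-day default are illustrative, not a prescribed implementation):

```python
import time
from pathlib import Path

RETENTION_DAYS = 30  # assumption: policy sets a 30-day retention period

def purge_old_recordings(directory: str, retention_days: int = RETENTION_DAYS) -> list[str]:
    """Delete recordings older than the retention period.

    Returns the deleted file names so the job can feed the audited
    deletion log required by the access-control measure above.
    """
    cutoff = time.time() - retention_days * 86400
    deleted = []
    for recording in Path(directory).glob("*.mp4"):
        if recording.stat().st_mtime < cutoff:
            recording.unlink()
            deleted.append(recording.name)
    return deleted
```

Running this daily (e.g., via cron) makes the deletion automatic and documentable, which is exactly what a supervisory authority will ask to see.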
Next Step
If your organization has or plans video surveillance, obtain a specialized impact assessment.
👤 Biometrics
Biometric systems (facial recognition, fingerprints, iris, voice recognition) are specifically regulated: biometric data processed to uniquely identify a natural person is a special category under Article 9 GDPR. An impact assessment is mandatory in most contexts because the processing involves special categories with significant risk to rights and freedoms.
Biometrics are commonly used in access control (corporate buildings, secure rooms), time attendance, or identification in HR systems. Even in corporate contexts, biometric template retention must be minimized and technical safeguards must be robust.
Biometric data is permanent (cannot be changed like a password) and highly sensitive. A compromise of a biometric database is practically irreversible for the data subject. Secure and verifiable deletion is critical.
Legal Basis
- Article 9 GDPR: Processing of special categories of data
- Article 35 GDPR: Mandatory DPIA
- Article 37 GDPR: Possible obligation to appoint a DPO
Risks to Data Subjects
- Inability to revoke or alter compromised data
- Cross-identification with other systems (mass surveillance)
- Unjust rejection in facial recognition systems (algorithmic discrimination)
- Identity theft or fake profiles using biometric data
Recommended Mitigation Measures
- Local storage: Store biometric templates on user device or encrypted token, not on central server
- End-to-end encryption: Data in transit and at rest protected with strong algorithms (AES-256)
- Minimal retention: Delete data after necessity (e.g., when employee leaves)
- Segregation: Biometric data separated from HR or other operational data
- Pseudonymization: Use random identifiers instead of names where possible
- Audit trails: Detailed logs of access and biometric comparisons
- Informed consent: Transparent information on processing and right to refuse
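The pseudonymization and segregation measures above can be combined in a mapping vault of random identifiers, kept apart from the operational systems that hold the biometric templates. A minimal sketch (the class and method names are hypothetical; note that real biometric matching needs dedicated template-protection schemes, this only covers the identifier link):

```python
import secrets

class PseudonymVault:
    """Maps real identities to random pseudonyms.

    The mapping is the only link between a person and their stored
    records, so it can be access-controlled and stored separately
    from HR and biometric systems (segregation measure above).
    """
    def __init__(self) -> None:
        self._forward: dict[str, str] = {}  # identity -> pseudonym

    def pseudonym_for(self, identity: str) -> str:
        # Random token, deliberately not derived from the name,
        # so the pseudonym alone reveals nothing about the person.
        if identity not in self._forward:
            self._forward[identity] = secrets.token_hex(16)
        return self._forward[identity]

    def erase(self, identity: str) -> None:
        """Deleting the mapping makes the person's records unlinkable,
        supporting deletion duties when an employee leaves."""
        self._forward.pop(identity, None)
```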
Next Step
🤖 Artificial Intelligence
AI systems that process personal data are subject to mandatory DPIA under Article 35(3)(a) GDPR. Assessment is particularly important in AI for automated decision-making (Article 22), classification, risk scoring, or predictive analysis. The opacity of machine learning models creates unique compliance challenges.
Simultaneously, the EU AI Act (Regulation (EU) 2024/1689, in force since August 2024, with obligations phasing in from 2025) introduces its own risk classification for AI. Applications such as recruitment, credit scoring, or surveillance systems classified as "high-risk" require dual compliance: GDPR + AI Act. Article 27 of the AI Act adds a fundamental rights impact assessment for deployers of high-risk systems, which can build on the DPIA.
Generative AI (LLMs) presents risks of training data leakage, bias, and lack of transparency about sources. Organizations must audit data used, define clear usage policies, and maintain control over which AI is used in which context.
Legal Basis
- Article 22 GDPR: Decisions based solely on automated processing
- Article 35(3)(a) GDPR: Mandatory DPIA for systematic and extensive automated evaluation (including profiling) with legal or similarly significant effects
- AI Act (Regulation (EU) 2024/1689), Article 27: Fundamental rights impact assessment for high-risk AI systems
- Cross-reference: aidf.pt – Impact Assessment for AI
Risks to Data Subjects
- Discriminatory or biased automated decisions
- Lack of transparency: not knowing how AI reached a decision
- Inability to contest automated decision without human intervention
- Use of sensitive data (gender, ethnicity, religion) without consent
- Training data leakage or model inversion
Recommended Mitigation Measures
- Training data audit: Ensure data is representative, free of historical bias, and legitimately processed
- Fairness testing: Verify model does not produce discriminatory outcomes by gender, ethnicity, or protected groups
- Explainability: Use techniques like SHAP or LIME to provide human-readable explanations of decisions
- Human intervention: Significant decisions (recruitment, credit) must have mandatory human review
- Right to contest: Clear procedure for data subjects to contest automated decisions
- AI Act compliance: For high-risk AI, implement registry, technical documentation, and independent audit
- Data governance: DPO and AI specialist collaborate on continuous assessment
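Fairness testing can start with something as simple as comparing selection rates per group. A minimal sketch of the "four-fifths" rule of thumb, a common screening heuristic rather than a legal threshold (the data shapes and function names are illustrative):

```python
def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: list of (group, approved). Returns approval rate per group."""
    totals: dict[str, int] = {}
    approved: dict[str, int] = {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates: dict[str, float]) -> float:
    """Four-fifths heuristic: a ratio below 0.8 flags the model
    for closer review of possible disparate impact."""
    return min(rates.values()) / max(rates.values())
```

A flagged ratio does not prove discrimination, but it is a cheap, repeatable check to run on every model release before the human-review and contestation procedures above ever come into play.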
Next Step
Specialized assessment of AI – integration of GDPR + AI Act:
🏥 Health Systems / Health Data
Processing of health data (electronic health records, medical history, test results, medical images) is protected by Article 9 GDPR and frequently subject to mandatory DPIA. Hospitals, health centers, telemedicine providers, and biobanks process special categories with very high risk because they involve sensitive information about health, capacity, or vulnerabilities.
Sectoral regulations such as the Cross-Border Healthcare Directive (2011/24/EU) or national legislation impose additional requirements beyond GDPR. Clinical research and biobanks involve ethical dimensions requiring approval by independent committees.
Access to health data must be strictly controlled by authorized professionals. Sharing with researchers, insurers, or public health entities must have specific justification and clear consent or legal basis.
Legal Basis
- Article 9 GDPR: Health data as special categories
- Article 35 GDPR: Mandatory DPIA in general
- Directive 2011/24/EU: Cross-border healthcare rights
- National legislation: Sectoral health and clinical confidentiality law
Risks to Data Subjects
- Discrimination by insurers, employers, or others with data access
- Involuntary revelation of sensitive diagnosis or treatment
- Identity theft using health data for medical fraud
- Psychological impact of clinical confidentiality breach
- Inability to retroactively revoke data shared in research
Recommended Mitigation Measures
- Strict access control: Only professionals with clinical need have access; detailed audit logs
- Encryption of sensitive data: Medical records encrypted at rest and in transit
- Data segregation: Health data separated from administrative systems
- Anonymization for research: Use random pseudonyms for clinical research; separate de-anonymization keys
- Sharing policies: Granular consent for different purposes (treatment, research, insurance)
- Limited retention: Documented bases for retention per sectoral legislation
- Data subject rights: Transparency on who accesses data, why, and right of access/correction
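The "separate de-anonymization keys" measure above can be sketched with a keyed hash: the research dataset carries only the pseudonym, while the key stays with an independent custodian, so researchers alone cannot re-identify patients. An illustrative sketch (the function name and key-handling arrangement are assumptions):

```python
import hashlib
import hmac

def research_pseudonym(patient_id: str, custodian_key: bytes) -> str:
    """Keyed-hash pseudonym for a clinical research dataset.

    Without custodian_key (held by a separate key custodian, not by
    the research team), the pseudonym cannot be linked back to the
    patient identifier, yet the same patient always maps to the same
    pseudonym, preserving longitudinal analysis.
    """
    return hmac.new(custodian_key, patient_id.encode(), hashlib.sha256).hexdigest()
```

Rotating or destroying the custodian key is then a controlled way to sever the link entirely when a study ends.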
Next Step
📍 Geolocation
Tracking employee location (via GPS, mobile devices, beacons, WiFi) is a common processing that raises significant privacy concerns. Commercial vehicle fleet monitoring is standard, but real-time employee tracking beyond work-related travel can be considered excessive surveillance. Location data enables inference of sensitive behaviors: places of worship, medical offices, political associations.
Workplace location tracking (WiFi, RFID badges, or real-time location apps) requires a DPIA and clear purpose justification. Proportionality must be respected: tracking general building location is different from minute-by-minute movement tracking.
Telecommunication data (cell tower triangulation) or historical location data obtained from mobile service providers requires clearly identified legal basis (consent or legitimate interest with safeguards).
Legal Basis
- Article 6 GDPR: Lawfulness of processing (consent, contract, legitimate interest)
- Article 35 GDPR: DPIA generally mandatory for large-scale tracking
- ePrivacy Directive (2002/58/EC): Location in telecom context
Risks to Data Subjects
- Continuous surveillance and loss of personal autonomy
- Revelation of movement patterns and personal activities
- Possibility of movement analysis outside work hours (if data persisted)
- Inference of sensitive information (medical, religious, political locations)
- Discrimination based on movement patterns (geographic bias)
Recommended Mitigation Measures
- Scope minimization: Track only during work hours; disable outside hours
- Reduced granularity: Record location by zone (e.g., "Building A") instead of exact GPS coordinates
- Minimal retention: Delete historical data regularly (e.g., 30 days)
- Privacy-enhancing technology: Use indoor beacons with limited range instead of global GPS
- Granular consent: Employees may opt into locations for specific purposes (fleet safety vs. movement surveillance)
- Limited access: Only project managers or security have access to location data
- Clear information: Documented policies on tracking; right to contest
Next Step
📣 Digital Marketing
Digital marketing campaigns involve multiple processing: cookies and online behavior tracking, programmatic advertising, lead scoring, predictive interest analysis, and automated decisions on audience segmentation. This is the context where DPIA is frequently neglected despite significant risk.
Tracking cookies (third-party) enable creation of detailed user browsing profiles. Predictive analysis and machine learning can infer sensitive attributes (health, sexual orientation, political beliefs) even if not explicitly processed. Credit scoring or risk scoring can lead to algorithmic discrimination.
Consent for cookies and marketing must be explicit, granular and easily revocable. Banners that offer only an "Accept All" option, with no equally easy way to refuse, do not meet the GDPR standard: consent must be freely given and informed. Excessive personalized advertising may also violate proportionality principles.
Legal Basis
- Article 6 GDPR: Legal basis for processing (consent for tracking cookies)
- Article 7 GDPR: Consent must be free, specific and informed
- Article 21 GDPR: Right to object to direct marketing
- ePrivacy Directive: Prior consent mandatory for cookies
Risks to Data Subjects
- Detailed profiling of behavior, preferences and vulnerabilities
- Manipulative or discriminatory advertising based on inferred attributes
- Profile sharing with third parties without explicit consent
- Practical impossibility to control or revoke distributed processing
- Psychological impact of commercial surveillance
Recommended Mitigation Measures
- Compliant cookie banners: Granular consent (analytics vs. marketing), not pre-selected, equal ease to accept/reject
- Tracking minimization: Only essential cookies by default; third-party tracking must be explicitly consented
- Vendor transparency: Clear list of marketing vendors (ad networks, analytics) and their data policies
- Limited retention: Tracking cookies should expire within about 13 months (the maximum recommended by several supervisory authorities); marketing profiles deleted regularly
- Simple opt-out: Obvious "Unsubscribe" button in all marketing emails
- Transparent scoring: If using lead scoring or risk qualification, inform of criteria and right to contest
- Vendor contracts: Processing clauses, prohibition of unauthorized subprocessing
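Granular, revocable consent also has to be demonstrable (Article 7 GDPR). A minimal sketch of a per-purpose consent record with default deny (the structure and names are illustrative, not a reference implementation):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Per-purpose consent with timestamps, so the controller can
    demonstrate when each consent was given or withdrawn."""
    user_id: str
    # purpose -> (granted, when it was last changed)
    purposes: dict[str, tuple[bool, datetime]] = field(default_factory=dict)

    def set(self, purpose: str, granted: bool) -> None:
        self.purposes[purpose] = (granted, datetime.now(timezone.utc))

    def allows(self, purpose: str) -> bool:
        entry = self.purposes.get(purpose)
        # Default deny: no recorded choice means no consent,
        # matching the "only essential cookies by default" measure.
        return bool(entry and entry[0])
```

Keeping withdrawal as just another timestamped `set(..., False)` makes revocation exactly as easy as granting, which is what the banner rules above require.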
Next Step
💻 Employee Monitoring
Monitoring of IT activity (email, browsing, keystroke logging), productivity (screenshots, app tracking), or wearables (smart badges with movement, posture) is frequent in organizations and raises significant labor rights and privacy questions. Monitoring is particularly sensitive because it occurs during work hours where the right to privacy is reduced, but not eliminated.
National labor legislation may impose restrictions beyond GDPR: right to rest, communication confidentiality, or protection of union representatives. Excessive monitoring may violate freedom of thought or of association (Articles 10 and 12 of the EU Charter of Fundamental Rights).
Monitoring technologies (RPA software, IoT posture sensors, activity tracking) have evolved rapidly; DPIA must be done prospectively on real surveillance capabilities, not just intended use.
Legal Basis
- Article 6 GDPR: Employer legitimate interest must be proportional
- Article 4(11) and Recital 43 GDPR: Consent is generally invalid given the power imbalance of the employment relationship
- National labor law: Employee rights to privacy, rest, freedom of thought
- Occupational Safety Directive (89/391/EEC): Employer duties of protection
Risks to Data Subjects
- Continuous surveillance reducing autonomy and psychological well-being
- Inability to communicate privately (including with unions or representatives)
- Disciplinary decisions based on incomplete or incorrect monitoring data
- Incidental revelation of sensitive data captured (health, religious beliefs)
- Stress and burnout caused by continuous surveillance
Recommended Mitigation Measures
- Proportionality: Monitoring only for high-risk roles (sensitive data access, physical security); not blanket monitoring
- Reduced granularity: Measure productivity by completed tasks, not keystroke/mouse movements
- Clear communication: Documented monitoring policy communicated to employees in advance (consent alone is generally not a valid basis in the employment context)
- Sensitive exclusions: Do not monitor union representative, medical, or legal communications
- Limited retention: Monitoring data deleted after 3–6 months if no violation evidenced
- Right of access: Employee has right to see monitoring data about themselves
- Management training: Training on legitimate monitoring use; transparency on real capabilities
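The "sensitive exclusions" measure can be applied as a filter before monitoring events are ever stored, so excluded communications never enter the system. A minimal sketch (the excluded domains are hypothetical placeholders an organization would replace with its own list):

```python
# Assumption: each monitoring event carries the contacted domain.
# The excluded domains below are hypothetical examples.
EXCLUDED_DOMAINS = {"union.example.org", "healthservice.example.org"}

def filter_events(events: list[dict]) -> list[dict]:
    """Discard monitoring events touching excluded contacts
    (union, medical, legal) before anything is written to storage."""
    return [e for e in events if e["domain"] not in EXCLUDED_DOMAINS]
```

Filtering at ingestion, rather than masking at display time, means the excluded communications cannot surface later in disciplinary proceedings or breach incidents.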
Next Step
🌍 International Data Transfers
Transfer of personal data outside the European Economic Area (EEA) – particularly to the United States – is processing that requires a DPIA in many contexts. Cloud services (AWS, Azure, Google Cloud, Salesforce) frequently involve transfers to servers outside the EU. The Schrems II decision (2020) invalidated the Privacy Shield and imposed strict requirements on Standard Contractual Clauses (SCCs).
The post-Schrems II challenge is that transfers to certain countries (USA particularly) may not have adequate data protection level even with SCCs, because foreign legislation may permit government access for national security surveillance. Organizations must implement "supplementary measures" (end-to-end encryption, tokenization) to compensate for this risk.
The ePrivacy Directive may also restrict transfers of telecommunications data. An EU data-center location is not a sufficient guarantee if remote third-party access is possible.
Legal Basis
- Article 45 GDPR: Adequacy decision (e.g., EU–Japan)
- Articles 46–47 GDPR: Appropriate safeguards (SCCs, Binding Corporate Rules)
- Schrems II decision (2020): SCCs upheld, but case-by-case assessment and supplementary measures required
- ePrivacy Directive (2002/58/EC): Additional restrictions for telecom data
Risks to Data Subjects
- Unauthorized access to data by foreign government agencies
- Lack of effective data subject rights to contest unauthorized access
- Inability to obtain compensation for violations in foreign jurisdictions
- Exposure to foreign legislation incompatible with EU fundamental rights
Recommended Mitigation Measures
- Transfer documentation: Clear inventory of all cloud services and third parties receiving data
- Standard Contractual Clauses: Vendor contract including valid SCCs; Schrems II-compliant risk assessment
- Technical supplementary measures: End-to-end encryption (keys retained by organization); tokenization; pseudonymization
- DPIA by destination country: Specific assessment per country; identify government access risks
- EU location: Prefer services with processing in EU data centers; verify location claims rigorously
- Data subject rights: Inform data subjects of transfers; ensure rights in case of foreign access
- Periodic review: Re-assess transfers when foreign legislation changes (e.g., surveillance law updates)
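Tokenization as a supplementary measure can be sketched as a vault that keeps the real values inside the EU and releases only random tokens to the foreign processor; without the vault, the tokens are meaningless. An illustrative sketch (the class and method names are assumptions):

```python
import secrets

class TokenVault:
    """Token vault operated inside the EU.

    Only random tokens cross the border, so the foreign processor
    never holds re-identifiable data; foreign government access to
    the tokens alone reveals nothing about the data subjects.
    """
    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        # Random token, not derived from the value, so it cannot
        # be reversed or brute-forced without the vault.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Re-identification happens only inside the EU vault."""
        return self._token_to_value[token]
```

In practice the vault itself must be encrypted, access-controlled, and logged, since it becomes the single point where re-identification is possible.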
Next Step
Comparison: Most Common DPIA Types
| Type | Base Risk | GDPR Art. | Mandatory DPIA? | Key Measure |
|---|---|---|---|---|
| Video Surveillance | High | 35(3)(c) | Yes | Limited retention; face blurring |
| Biometrics | Very High | 9, 35 | Yes | Local storage; encryption |
| AI | High to Very High | 22, 35(3)(a) | Yes | Bias audit; human intervention |
| Health | Very High | 9, 35 | Yes | Strict access control; anonymization |
| Geolocation | High | 6, 35 | Often | Reduced granularity; short retention |
| Digital Marketing | Medium to High | 6, 7, 21 | When at scale | Granular consent; easy revocation |
| Employee Monitoring | High | 6 | Often | Proportionality; sensitive exclusions |
| International Transfers | High to Very High | 45–47 | Yes (post-Schrems II) | SCCs + end-to-end encryption |
Do you need a DPIA?
Contact us for an initial assessment and recommendations specific to your organization.
Request Assessment

Avaliacaodeimpacto.pt is the reference hub for impact assessment compliance.