Complete legislative reference for conducting Data Protection Impact Assessments in Portugal and the EU.
General Data Protection Regulation becomes effective, introducing mandatory DPIA regime (Art. 35).
Law 58/2019, the Portuguese law implementing the GDPR and regulating its enforcement in the national context.
Definition of 22 processing categories requiring mandatory DPIA in Portugal.
Impact assessment guidelines establishing a common, methodologically robust framework endorsed across European supervisory authorities.
Regulation (EU) 2024/1689 (AI Act): Article 27(4) articulation with the DPIA, introducing new requirements for high-risk AI systems.
Complete integration of DPIA, FRIA and CSIA in structured compliance programs.
Establishes the general obligation to conduct a DPIA "where a type of processing, in particular using new technologies, is likely to result in a high risk to the rights and freedoms of natural persons".
Key provisions:
Obligation to consult the competent authority (in Portugal, CNPD) "before commencing the processing" when a DPIA indicates high risk that cannot be adequately mitigated.
Requirements: formal submission of DPIA to CNPD with complete documentation. CNPD has up to 8 weeks to respond.
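Under Art. 36(2) GDPR, the authority's eight-week window may be extended by a further six weeks for complex processing. The resulting deadlines can be sketched as follows (the helper name and dates are illustrative, not part of any official tooling):

```python
from datetime import date, timedelta

def consultation_deadline(submitted: date, extended: bool = False) -> date:
    """Deadline for the supervisory authority's response under Art. 36(2) GDPR:
    eight weeks from submission, extendable by six more weeks for complex processing."""
    weeks = 8 + (6 if extended else 0)
    return submitted + timedelta(weeks=weeks)

# Example: DPIA formally submitted to the CNPD on 1 March 2025.
print(consultation_deadline(date(2025, 3, 1)))                 # base 8-week window
print(consultation_deadline(date(2025, 3, 1), extended=True))  # extended window
```

Note that the clock presupposes a complete submission; the CNPD may request missing documentation before the period starts running.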
The DPO must "cooperate with the supervisory authority" and "serve as the point of contact", ensure the DPIA is conducted when mandatory, and oversee its methodological rigour.
Law 58/2019, implementing the GDPR in Portugal, reinforces its provisions in the national context:
The Portuguese CNPD defined the following 22 processing categories requiring mandatory DPIA:
Systematic monitoring of individuals, e.g. CCTV, location tracking.
Processing of genetic or biometric characteristics (Art. 9 GDPR).
Electronic health records, telemedicine, medical big data.
Automated evaluation of personal, behavioural or economic characteristics.
Purely automated decisions producing legal consequences (Art. 22 GDPR).
Systematic processing of minors' data (e.g. social media, e-learning).
Processing of special category data of persons with reduced capacity.
Use of novel technologies (AI, blockchain, IoT) without track record.
Combining data from multiple files, increasing re-identification risk.
Systems potentially excluding individuals from services based on profiling.
Transfers to third countries without adequacy decisions.
Continuous tracking of individual movement (GPS, mobile triangulation).
See P05 — Common Types for detailed analysis of each category.
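For screening purposes, a processing operation that matches any category on the CNPD list triggers a mandatory DPIA before processing begins. A minimal sketch of such a check (the tag names and helper are hypothetical, not CNPD terminology):

```python
# Illustrative subset of the CNPD categories, encoded as machine-readable tags.
CNPD_CATEGORIES = {
    "systematic_monitoring",
    "genetic_or_biometric_data",
    "electronic_health_records",
    "automated_evaluation",
    "automated_decisions_legal_effects",
    "minors_data",
    "novel_technologies",
    "data_matching",
    "service_exclusion_profiling",
    "third_country_transfers_no_adequacy",
    "location_tracking",
}

def dpia_required(processing_tags: set[str]) -> bool:
    """True if the processing matches at least one listed category."""
    return bool(processing_tags & CNPD_CATEGORIES)

print(dpia_required({"novel_technologies", "crm_marketing"}))  # True
```

A screening helper like this only flags the obvious cases; borderline processing still needs a documented assessment against Art. 35(3) GDPR and the EDPB criteria.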
Guidelines from the European Data Protection Board (EDPB) define the consensus methodology for all DPIAs in the EU:
| Element | Description |
|---|---|
| Systematic description | Clear documentation of processing: actors, data, purpose, duration, recipients. |
| Necessity and proportionality | Justification of why processing is necessary and proportionate to purpose. |
| Risk assessment | Systematic identification of risks using probability × severity matrix. |
| Mitigation measures | Technical (encryption, pseudonymisation) and organisational (training, audits). |
| Views of data subjects | Where appropriate, seeking the views of data subjects or their representatives (e.g. trade unions), without prejudice to consulting the supervisory authority. |
| Final approval | Signature by controller and DPO. Potential consultation with CNPD. |
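The risk assessment element above can be sketched as a probability × severity product. The 1–4 scales and thresholds below are illustrative assumptions, not values mandated by the guidelines:

```python
# Illustrative 4-point scales for the probability x severity matrix.
PROBABILITY = {"negligible": 1, "limited": 2, "significant": 3, "maximum": 4}
SEVERITY = {"negligible": 1, "limited": 2, "significant": 3, "maximum": 4}

def risk_score(probability: str, severity: str) -> int:
    """Raw risk score: probability level multiplied by severity level."""
    return PROBABILITY[probability] * SEVERITY[severity]

def risk_level(score: int) -> str:
    """Map a raw score to a qualitative level (thresholds are assumptions)."""
    if score >= 9:
        return "high"    # mitigation required; possible Art. 36 prior consultation
    if score >= 4:
        return "medium"  # mitigation recommended
    return "low"         # document and monitor

score = risk_score("significant", "maximum")
print(score, risk_level(score))  # 12 high
```

In practice each identified risk is scored before and after the planned mitigation measures; a residual "high" score is what triggers the prior consultation obligation described above.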
FRIA and DPIA articulation: Regulation (EU) 2024/1689 requires high-risk AI systems to be subject to FRIA (Fundamental Rights Impact Assessment). This must be articulated with DPIA under the GDPR.
Implication for DPIA: When deploying AI systems for personal data processing, the DPIA must include fundamental rights risk analysis components, including bias, discrimination and algorithmic opacity.
See aidf.pt — FRIA for complementary guidance.
| Standard | Scope |
|---|---|
| ISO/IEC 29134:2017 | Information technology — Security techniques — Guidelines for privacy impact assessment. Provides an internationally recognised complementary PIA framework. |
| ISO/IEC 27701:2019 | Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management. Guidance on integrating privacy into information security management systems. |
European supervisory authorities have imposed severe fines for DPIA violations:
Fines for lack of DPIA in high-risk processing (surveillance, profiling) up to EUR 250,000.
Google (2020): EUR 90M for lack of valid cookie consent. Violation included inadequate DPIA.
British Airways (2020): GBP 20M for data security failures. An inadequate DPIA was a contributing factor.
The information presented on this page is for informational and educational purposes only and does not constitute specific legal advice. Conducting a DPIA in compliance with the GDPR, Law 58/2019, and regulatory guidance should be supported by qualified data protection professionals and, where applicable, specialised legal counsel.