Welcome to the March 2026 edition of RegRally Insights: Personal Data Protection and ICT Regulation.
Our briefing summarises practical takeaways, upcoming coordinated enforcement actions, and evolving guidance to help organisations ensure compliance, mitigate risks, and maintain trust in an increasingly complex digital environment.
Lithuanian DPA Fines Hospital Over Unlawful Video and Audio Surveillance
VDAI found that while video surveillance in common areas (e.g., entrances, corridors, lobbies) was lawful to ensure safety, surveillance in operating theatres, emergency examination rooms, and the geriatric day-care unit was unlawful, as it captured patient examination areas and staff workplaces. In these cases, patients’ and employees’ privacy rights outweighed the hospital’s interests.
The authority also determined that audio recording inside the hospital, including in operating theatres, violated the GDPR. Audio recording was not considered necessary for safety or work organisation, particularly given the risk of capturing sensitive health data.
Additionally, the hospital set unclear and excessive retention periods for recordings, failed to ensure proper access controls, and did not fully cooperate with the authority during the investigation.
VDAI ordered corrective measures and, in February 2026, imposed an administrative fine of EUR 6,000. The decision may be appealed within one month.
Under Article 35 GDPR, a Data Protection Impact Assessment (DPIA) is required where processing is likely to result in a high risk to individuals’ rights and freedoms – including systematic monitoring, employee surveillance, or processing of health data.
A properly conducted DPIA should:
- assess the purpose, necessity and proportionality of the processing,
- evaluate risks to data subjects,
- analyse legal bases,
- assess technical and organisational security measures,
- define concrete mitigation actions.

Failure to carry out a DPIA where required is itself a GDPR infringement.
Our team assists clients with comprehensive DPIA compliance, including:
- Conducting Data Protection Impact Assessments (DPIAs), including an evaluation of personal data processing operations from an IT security perspective (technical security measures);
- Advising on when a DPIA is required and how to carry it out properly in line with GDPR requirements;
- Preparing and/or reviewing internal procedures related to Data Protection Impact Assessments.
The State Data Protection Inspectorate of Lithuania (VDAI) has announced its supervisory priorities for 2026
Planned Inspections and Monitoring Activities
In 2026, the VDAI plans to conduct:
- 15 scheduled inspections in selected public and private sector organisations;
- 10 monitoring activities focusing specifically on technical and organisational security measures;
- Follow-up reviews of 15 organisations previously inspected where deficiencies were identified or corrective measures were ordered.
Although inspections will target selected organisations, the findings are relevant to entire sectors. After completing inspections, the VDAI publishes summaries outlining identified risks and compliance gaps, enabling other organisations to assess their own exposure.
Focus Areas in 2026
The 2026 monitoring activities will concentrate on security measures, including:
- Access control management
- Backup procedures for information, software, and systems
- Event logging and audit trail records
These areas reflect ongoing regulatory focus on accountability, traceability, and operational resilience.
In addition, the authority will verify whether previously issued orders and corrective measures have been properly implemented.
In light of the 2026 supervisory plan announced by the State Data Protection Inspectorate of Lithuania (VDAI), organisations should treat this as a clear signal to reassess not only their documentation, but also the practical implementation of security measures.
Particular attention should be given to:
- Access management controls (role-based access, periodic access reviews, termination procedures);
- Backup and recovery mechanisms (testing of backups, documented recovery procedures, system resilience);
- Logging and monitoring practices (event logs, audit trails, incident traceability);
- Evidence of implementation, not merely internal policies.
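The logging and traceability expectations above can be illustrated with a minimal sketch of a tamper-evident audit trail. The class and field names below are our own illustration, not drawn from VDAI guidance; the idea is that each entry embeds a SHA-256 digest of the previous entry, so after-the-fact alterations break the chain and become detectable evidence gaps.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Minimal append-only audit trail with hash chaining (illustrative)."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for the chain

    def record(self, actor: str, action: str, resource: str) -> dict:
        # Each entry stores the digest of the previous entry.
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
            "resource": resource,
            "prev_hash": self._last_hash,
        }
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; False means some entry was altered."""
        prev = "0" * 64
        for entry in self.entries:
            if entry["prev_hash"] != prev:
                return False
            prev = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()
            ).hexdigest()
        return prev == self._last_hash

trail = AuditTrail()
trail.record("dr.jonas", "VIEW", "patient/1042/record")
trail.record("admin.ruta", "EXPORT", "backup/2026-03-01")
assert trail.verify()

# Tampering with a logged action is now detectable:
trail.entries[0]["action"] = "DELETE"
assert not trail.verify()
```

In practice such chaining is usually provided by dedicated logging infrastructure; the sketch only shows why an audit trail must be append-only to serve as evidence of implementation.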
Organisations that have previously undergone inspections should conduct an internal follow-up review to ensure that:
- All corrective measures have been fully implemented;
- Supporting documentation is up to date;
- Technical and organisational measures are demonstrably operational in practice.
Lithuanian DPA Orders Company to Provide Call Recording in Direct Marketing Case
On 10 October 2024, the State Data Protection Inspectorate of Lithuania (VDAI) upheld a complaint against DirectMarketing OU for refusing to provide a data subject with a copy of a call recording in which the company claimed he had consented to direct marketing.
After receiving a marketing call, the complainant asked how his personal data had been obtained and requested confirmation of the data processed, details of the alleged survey, and copies of his consent. The company stated that he had participated in a survey on 9 April 2024 and agreed to receive marketing calls but refused to provide the call recording, arguing that recordings were internal documents and that the individual’s identity could not be verified based on name and phone number alone.
During the investigation, the company submitted the recording to the VDAI. The authority found that the company could not, without justification, dismiss as insufficient for identity verification the very identifiers (name and phone number) it had used to process the data. It emphasised that under Article 12(6) GDPR, additional information may be requested where necessary, but such measures must be proportionate.
The VDAI concluded that the refusal to provide the recording breached Article 15(3) GDPR, issued a reprimand, and ordered the company to provide the recording to the complainant.
In light of the VDAI decision, companies should consider the following:
- Treat call recordings as personal data. Audio recordings containing identifiable individuals fall within the scope of Article 15 GDPR and must be provided upon request.
- Do not rely on “internal document” arguments. Internal classification does not exempt data from disclosure obligations.
- Apply proportionate identity verification. If there are reasonable doubts, request additional information under Article 12(6) of the GDPR, but ensure the measures are necessary and proportionate.
- Ensure recordings are retrievable and shareable. If consent is collected by phone, the relevant recording must be accessible and capable of being provided within the one-month deadline.
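Because the one-month response window runs to the corresponding calendar day of the following month (falling back to the last day where none exists, e.g. a request received on 31 January), deadline tracking has a small subtlety worth encoding. A minimal Python sketch, assuming that corresponding-day reading of the deadline (the helper name is our own):

```python
import calendar
from datetime import date

def sar_response_deadline(received: date) -> date:
    """Latest date to respond to a data subject access request.

    Reads the one-month period as ending on the corresponding day of
    the following month; if that month is shorter, the last day is
    used. Illustrative helper, not an official formula.
    """
    year = received.year + received.month // 12
    month = received.month % 12 + 1
    day = min(received.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(sar_response_deadline(date(2026, 1, 31)))   # 2026-02-28
print(sar_response_deadline(date(2026, 12, 15)))  # 2027-01-15
```

Any deadline tracker should also flag requests early enough to locate, review, and redact recordings before the final day.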
State Data Protection Inspectorate: Fewer Personal Data Breaches Reported in 2025, but Over 1.2 Million Data Subjects Affected
In 2025, the State Data Protection Inspectorate (VDAI) received 223 notifications of personal data breaches. The total number of affected data subjects in Lithuania reached 1,249,409.
Compared to the previous year, the number of reported breaches decreased. In 2024, VDAI received 273 breach notifications. The number of affected data subjects in Lithuania also decreased by nearly 200,000 (1,467,368 in 2024).
Nature of Breaches
By type, confidentiality breaches accounted for the large majority of cases in Lithuania in 2025 (83%). Integrity breaches accounted for 6%, availability breaches for 10%, and in 1% of cases the incident did not meet the legal definition of a personal data breach.
Main Causes
After analysing the 2025 notifications, VDAI found that:
- 58% of breaches occurred due to human error (carelessness, lack of awareness that certain actions may cause a breach, or actions that cannot normally be prevented by technical and organisational measures).
- 13% were caused by other factors, including IT system failures, programming errors, insufficient system testing before launch, or delayed data updates that prevented timely service provision and led to unauthorised access to personal data.
- 29% were caused by cybersecurity incidents.
Cybersecurity Incidents Breakdown
Among breaches resulting from cybersecurity incidents:
- 16% were ransomware (data encryption and ransom demand) attacks,
- 45% involved unauthorised access to IT systems,
- 26% resulted from social engineering attacks,
- 7% were credential stuffing attacks,
- 3% involved SQL injection attacks,
- 3% involved denial-of-service attacks.
Notably, cybersecurity incidents accounted for 57% of all affected data subjects in 2025 (713,644 individuals), while other causes accounted for the remaining 43% (535,765 individuals).
Reporting Obligations Under the GDPR
VDAI reminds data controllers that once they become aware of a personal data breach likely to result in a risk to the rights and freedoms of individuals, they must notify VDAI without undue delay, and no later than 72 hours after becoming aware of it, as required by the GDPR.
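Unlike the one-month response period for data subject requests, the 72-hour window is a fixed clock-time period, so tracking it is straightforward. A small illustrative helper (the function and constant names are our own):

```python
from datetime import datetime, timedelta, timezone

BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)  # Article 33(1) GDPR

def notification_deadline(became_aware: datetime) -> datetime:
    """Latest moment by which the supervisory authority must be notified."""
    return became_aware + BREACH_NOTIFICATION_WINDOW

def is_overdue(became_aware: datetime, now: datetime) -> bool:
    return now > notification_deadline(became_aware)

aware = datetime(2026, 3, 2, 9, 30, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2026-03-05 09:30:00+00:00
```

The clock starts when the controller becomes aware of the breach, which is why incident-response procedures should timestamp that moment explicitly.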
In 2025, 63% of data controllers notified VDAI within the 72-hour deadline, while 37% reported breaches late.
Enforcement Actions
In 2025, VDAI imposed 5 fines on public and private entities for GDPR violations, totalling EUR 27,529.
Additionally, after assessing breach notifications and identifying insufficient safeguards for personal data security, VDAI issued 9 corrective orders to data controllers or processors to align processing operations with GDPR requirements. The Inspectorate also provided 22 recommendations to help ensure compliance with GDPR obligations.
AI-generated imagery and protection of privacy: EDPB supports the joint Global Privacy Assembly’s statement
The EDPB has signed the Joint Statement on AI-Generated Imagery and the Protection of Privacy. The statement, coordinated by the Global Privacy Assembly’s (GPA) International Enforcement Cooperation Working Group (IEWG), represents the united position of 61 authorities worldwide. This reflects the Board’s commitment to contributing to the global dialogue on data protection, as outlined in the fourth pillar of its work programme 2026-2027.
The statement addresses serious concerns about AI systems that generate realistic images and videos depicting identifiable individuals without their knowledge or consent. Whilst AI has the potential to bring numerous benefits to individuals and society, recent developments – particularly the integration of AI image and video generation into widely accessible social media platforms – have enabled the creation of non-consensual intimate imagery, defamatory depictions, and other harmful content featuring real individuals. The co-signatories are especially concerned about potential harms to children and other vulnerable groups, such as cyber-bullying and/or exploitation.
Expectations for organisations
The co-signatories remind organisations developing and using AI content generation systems that these systems must be developed and used in compliance with applicable legal frameworks, including data protection and privacy rules.
Although specific legal requirements vary by jurisdiction, fundamental principles should guide all organisations developing and using AI content generation systems. These principles include:
- implementing robust safeguards,
- ensuring meaningful transparency,
- providing effective and accessible mechanisms to protect individuals, and
- addressing specific risks to children.
Joining forces to address a global risk
The harms arising from the non-consensual generation of intimate, defamatory, or otherwise harmful content depicting real individuals are significant and warrant urgent regulatory attention. The co-signatories are committed to addressing this global risk through joint action and, to that end, aim to share information on their respective approaches to these concerns.
Finally, the co-signatories call on organisations to engage proactively with regulators, implement robust safeguards from the outset, and ensure that technological advancements do not come at the expense of privacy, dignity, safety, and other fundamental rights – particularly for the most vulnerable members of our global society.
EU Council Rejects Changes to GDPR Personal Data Definition
EU member states have agreed to remove the European Commission’s proposal to change the definition of personal data from the Digital Omnibus package. This decision stems from a Council compromise text, reached on 20 February and obtained by Euractiv. The proposed amendment aimed to alter the definition of personal data under the General Data Protection Regulation (GDPR), but it faced significant opposition from several countries and data protection authorities.
The Commission had suggested that personal data should be defined based on the data holder’s ability to re-identify an individual, even if others receiving the data could not. This change would have allowed data processors to potentially avoid GDPR obligations by arguing that they cannot re-identify the data subjects. Critics, including the European Data Protection Board (EDPB), warned that this approach oversimplifies legal nuances and could weaken privacy protections.
The Council’s compromise also removes the Commission’s power to use secondary legislation to set criteria for when data is considered sufficiently pseudonymised. Instead, the text emphasises the EDPB’s role and its updated guidelines on pseudonymisation. Pseudonymisation is a key privacy technique in which personal data is processed in a way that prevents the individual from being identified without additional information, which remains protected under the GDPR.
This decision reflects the EDPB’s concerns raised in its negative opinion on the Digital Omnibus proposal. The Board highlighted risks that the Commission’s changes could undermine existing privacy standards. While some EU countries were uncertain about the best path forward, the EDPB’s opinion appears to have influenced the Council’s decision to reject the Commission’s proposed amendments to the definition of personal data.