30 January

DSA and GDPR: Operational guidance on integrated compliance

Authors: Irene Negri, Carmine Perri, Giulio Monga
Abstract
This article analyses the interaction between the Digital Services Act and the GDPR in light of the EDPB Guidelines 3/2025, providing operational guidance for the adoption of an integrated compliance approach by online platform providers. It examines content moderation activities, complaint handling, advertising communications, recommendation systems and the specific provisions for the protection of minors.
Background: Regulatory framework and objectives of integrated compliance between the DSA and the GDPR
In cases where providers of “intermediary services”[1] are required to comply both with the new obligations introduced by Regulation (EU) 2022/2065 on digital services (the “Digital Services Act”, “DSA”)[2] and by Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data (“General Data Protection Regulation”, “GDPR”),[3] it is essential to establish integrated compliance between the two Regulations. Doing so is necessary to reduce risks arising from potential non-compliance, to avoid duplication and operational inconsistencies, and to prevent harm to users.
The DSA and the GDPR represent two pillars of the new European regulatory infrastructure intended to redefine the responsibility of online intermediaries and the protection of personal data in the digital economy.
In particular, the DSA, which became fully applicable in February 2024, applies to providers of intermediary services such as cloud computing and web-hosting services, online marketplaces, app stores, social networks, search engines, content-sharing platforms, and online travel and accommodation platforms that target recipients located in the European Union. More recently, the European Data Protection Board (“EDPB”) published the “Guidelines 3/2025 on the interplay between the Digital Services Act and the GDPR” (the “Guidelines”)[4] for public consultation. The Guidelines introduced significant clarifications regarding the interaction between the two Regulations.
The EDPB Guidelines are therefore relevant in cases where a provider of intermediary services acts as a data controller and/or data processor in relation to the processing of personal data of individual service recipients located in the European Union.
In particular, the EDPB Guidelines analyse the DSA provisions concerning:
- investigations of illegal content;
- the handling of user complaints and notices;
- the prohibition of advertisements based on profiling using special categories of data;
- recommender systems; and
- specific safeguards for minors.
These specific issues constitute the core focus of the analysis set out in this article and are examined with the aim of providing practical guidance to intermediary service providers.
Investigations of illegal content
Content moderation activities regulated by the DSA require integrated compliance that balances enforcement needs with the safeguards provided by the GDPR, especially where moderation is based on automated tools or machine learning models.
The DSA does not impose general monitoring obligations on providers of intermediary services with respect to the information they transmit or store. However, where such providers undertake voluntary initiatives to identify and address illegal content, the related processing of personal data must comply with the GDPR.[5]
In this respect, the EDPB recalls the following compliance obligations:[6]
- implementation of data minimisation, particularly where machine learning models are used to identify characteristics of a given content item;
- identification of an appropriate legal basis for the processing, which may be the legitimate interest of the controller or compliance with a legal obligation, provided that specific and clearly defined legal obligations can be identified;[7]
- assessment of whether the foreseen activities may qualify as decisions based solely on automated processing, which are prohibited under Article 22(1) GDPR unless certain conditions are met;[8]
- ensuring an adequate level of transparency towards data subjects regarding the processing of personal data. To this end, it is advisable to align the information notices addressed to users that are required under the DSA and the GDPR, so as to avoid inconsistencies, while at the same time favouring a multi-layered communication approach; and
- verification of whether a data protection impact assessment (DPIA) pursuant to Article 35 GDPR is required.
Handling of user complaints and notices
Providers of hosting services that store, manage and make available online user-generated content are required, in order to comply with the DSA, to put in place “notice and action” mechanisms that allow users to notify the presence of illegal content.[9] Recital 50 of the DSA specifies that the mechanism set up by the provider “should allow, but not require, the identification of the individual or the entity submitting a notice”, unless this is “necessary to determine whether the information in question constitutes illegal content”.
The EDPB examines these “notice and action” mechanisms because they may involve additional processing of personal data by intermediaries acting in their capacity as data controllers. The EDPB also examines[10] the activities required by the DSA for the handling of complaints submitted by recipients of the service[11] and the measures providers take against potential abuse.[12]
In order to comply with the structured compliance obligations laid down by both the DSA and the GDPR, it is therefore advisable for providers to adopt procedures for managing “notice and action” mechanisms and complaints, specifying which personal data may be collected depending on the circumstances (for example, name and/or email address, or no data), when the identity of the notifier may be revealed to the affected recipient of the service,[13] and which information must be provided to all data subjects involved. To this end, the adoption of forms that predefine the information to be provided by notifiers or complainants may also be useful, in order to further limit the risk of collecting excessive data.
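The data-minimisation logic described above can be sketched in code. The following is a purely illustrative example (the `NoticeRecord` structure and `build_notice_record` function are hypothetical, not drawn from any real platform's implementation): notifier identity is retained only where it is actually needed to assess the notified content, in line with Recital 50 DSA.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class NoticeRecord:
    """Illustrative internal record of a DSA Article 16 notice."""
    content_url: str
    reason: str
    notifier_name: Optional[str] = None
    notifier_email: Optional[str] = None


def build_notice_record(form: dict, identification_needed: bool) -> NoticeRecord:
    """Keep the notifier's identity only where it is necessary to determine
    whether the notified information constitutes illegal content
    (cf. Recital 50 DSA); otherwise store no identifying data at all."""
    record = NoticeRecord(content_url=form["content_url"], reason=form["reason"])
    if identification_needed:
        record.notifier_name = form.get("name")
        record.notifier_email = form.get("email")
    return record
```

A predefined intake form feeding such a function limits, by design, the risk of collecting excessive data from notifiers.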
Prohibition of advertisements based on profiling using special categories of data
Certain categories of personal data, such as data relating to health, religious beliefs or political opinions, as exhaustively listed in Article 9 GDPR, are considered highly sensitive also for the purposes of the DSA. For this reason, the DSA[14] provides that online platform providers[15] may never present advertisements to recipients based on profiling using such categories of data. This prohibition applies to advertising disseminated through a platform subject to the DSA, regardless of whether profiling is carried out by the platform provider or by other parties.
In light of these provisions, platform providers must:
- adopt control mechanisms, at both a technical and contractual level, to prevent prohibited targeted advertising; and
- coordinate compliance with the transparency obligations relating to advertising communications under the DSA[16] with those laid down by the GDPR.
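A technical control of the kind mentioned in the first bullet could take the shape of a simple validation gate at campaign set-up. The sketch below is an assumption-laden illustration (the taxonomy of targeting labels and the `validate_ad_targeting` function are invented for this example); the special-category labels loosely mirror Article 9 GDPR.

```python
# Hypothetical taxonomy of profiling criteria; labels loosely mirror the
# special categories of data listed in Article 9 GDPR.
SPECIAL_CATEGORIES = {
    "health", "religion", "political_opinions", "ethnic_origin",
    "trade_union_membership", "sexual_orientation", "genetic_data",
    "biometric_data",
}


def validate_ad_targeting(criteria: set) -> None:
    """Reject ad campaigns whose profiling criteria rely on special
    categories of data, as prohibited by Article 26(3) DSA."""
    forbidden = criteria & SPECIAL_CATEGORIES
    if forbidden:
        raise ValueError(
            f"Targeting rejected, special categories used: {sorted(forbidden)}"
        )


validate_ad_targeting({"age_range", "language"})  # permissible criteria pass
```

In practice such a gate would need to catch proxies for special categories as well (e.g. interest segments that reveal health conditions), which is why the EDPB stresses both technical and contractual controls.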
Recommender systems
Online platforms and search engines frequently use recommender systems[17] to automatically show tailored content to users. These systems therefore require a coordinated approach that takes into account the transparency obligations under the DSA[18] together with the potential impacts under the GDPR.
Through recommender systems, content is presented according to a certain order or prominence based on various criteria that may include users’ activities, preferences or behaviours, such as purchases, clicks or ratings.
From an integrated compliance perspective, the EDPB clarifies that platform providers must be qualified as data controllers in relation to the personal data processed by the recommender systems they decide to integrate into their services and, as such, are required to comply with the following obligations under data protection law:[19]
- carry out and document an assessment of the risks for users connected with the use of recommender systems, with specific reference to the safeguards provided by the GDPR;[20]
- identify appropriate measures to mitigate such risks;
- assess whether the presentation of specific content to users through a recommender system significantly affects the data subject from a legal, economic or social perspective and, therefore, must comply with the conditions set out in Article 22(1) GDPR. In such cases, particular attention must be paid where the offering of content, services or products may have a prolonged or permanent impact on users or significantly affect their behaviour or choices;[21] and
- where specific functionalities are made available that allow users to select and modify the parameters of recommender systems,[22] process only the data strictly necessary for the purpose of complying with the obligations under the DSA and retain such data only for the time strictly necessary, avoiding the maintenance of a history of users’ choices.
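The last bullet, storage of user-selected recommender parameters without a history, can be sketched as follows. This is a minimal illustration under stated assumptions (the `RecommenderPreferences` class is hypothetical): each new selection overwrites the previous one, so no record of earlier choices accumulates.

```python
class RecommenderPreferences:
    """Stores only the recipient's *current* recommender parameters
    (cf. Article 27(3) DSA); earlier selections are overwritten,
    never archived."""

    def __init__(self) -> None:
        self._current = {}

    def set_parameters(self, user_id: str, params: dict) -> None:
        # Overwrite in place: no history of previous selections is kept.
        self._current[user_id] = dict(params)

    def get_parameters(self, user_id: str) -> dict:
        return self._current.get(user_id, {})
```

Deleting the stored entry once it is no longer needed (e.g. on account closure) would complete the storage-limitation picture.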
Specific safeguards for minors
Another fundamental objective of the DSA is the protection of minors online: online platform providers must ensure a high level of privacy, safety and security for minors.[23] The DSA further specifies that online platform providers are not required to process “additional personal data” to verify the age of the recipients of their services.[24] In this regard, the EDPB provides valuable guidance in the Guidelines, stating that:[25]
- age verification is not necessary where a platform is clearly directed at minors;
- platform providers must adopt a risk-based approach, assessing the risks for minors presented by their services and identifying accordingly technical and organisational measures that are adequate and effective;
- where it is necessary to verify the age of the recipients of the service, the related processing of personal data must be limited to the information that is strictly necessary and proportionate to the pursued purpose, while adopting solutions and techniques capable of minimising adverse effects for all the recipients of the service;[26]
- where all GDPR principles are complied with, Article 28 DSA may be considered a provision on which to base the processing of personal data for age verification purposes under Article 6(1)(c) GDPR;[27] and
- providers should not permanently store the age or age range of the recipient of the service, but rather merely record whether the recipient of the service fulfils the conditions to use the service.
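The last point, recording the outcome of the age check rather than the age itself, can be illustrated with a short sketch. The threshold and function name are assumptions invented for this example (the applicable age limit depends on the service and jurisdiction); the key design choice is that only a yes/no eligibility flag leaves the function, while the date of birth is discarded.

```python
from datetime import date

MIN_AGE = 16  # hypothetical threshold; the actual limit depends on the service


def check_eligibility(birth_date: date, today: date) -> bool:
    """Derive and return only the yes/no eligibility outcome; the birth
    date itself is not retained, in line with the EDPB's recommendation
    to record merely whether the recipient fulfils the conditions."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= MIN_AGE
```

Persisting only the returned boolean (or an age range, per the EDPB's suggestion in footnote 26) keeps the stored data proportionate to the purpose.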
Conclusion
The EDPB Guidelines confirm the need for integrated compliance that is capable of coherently regulating activities at the core of the business of companies operating in the digital context as providers of intermediary services, such as content moderation, complaint handling, advertising, recommender systems and the protection of minors, in compliance with both Regulations analysed. In particular, they emphasise the importance for data controllers and data processors of always ensuring compliance with the fundamental principles of the GDPR, such as data minimisation, purpose limitation, transparency and accountability, including when adapting to multiple regulatory frameworks.
Only a structured and coordinated approach, integrating appropriate controls, documented risk assessments and multi-layered communications, can effectively strengthen the protection of data subjects and ensure full regulatory compliance. Such an approach enables providers of intermediary services to turn regulatory obligations into a competitive advantage, while at the same time strengthening user trust and fostering a safe, predictable and reliable online environment for all the recipients of the services.
[1] “Intermediary services” are defined under Article 3(g) DSA as “one of the following information society services:
(i) a ‘mere conduit’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
(ii) a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request;
(iii) a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient of the service”.
[2] Regulation (EU) 2022/2065 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act): https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065&qid=1768910109013.
[3] Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation): https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679.
[4] EDPB, “Guidelines 3/2025 on the interplay between the Digital Services Act and the GDPR”, version 1.1 of 11 September 2025: https://www.edpb.europa.eu/our-work-tools/documents/public-consultations/2025/guidelines-32025-interplay-between-dsa-and-gdpr_en. The public consultation relating to these Guidelines was closed on 31 October 2025, and the publication of the final version is therefore expected shortly.
[5] Articles 7 and 8 DSA.
[6] EDPB Guidelines, paragraphs 12-24.
[7] In this respect, reference may be made, for example, to Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market, amending Directives 96/9/EC and 2001/29/EC (“Copyright Directive”): https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32019L0790, which requires the implementation of controls with regard to the publication of content protected by copyright.
[8] In particular, where decisions are based exclusively on automated processing, it will be necessary to verify whether a “meaningful” human involvement is nevertheless provided for (in accordance with the guidance of the Article 29 Working Party, “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (WP251rev.01)”, adopted on 6 February 2018, pp. 21–24: https://ec.europa.eu/newsroom/article29/items/612053/en), and that there is no mere reliance on the algorithmic recommendations generated by the system (CJEU, judgment in OQ v Land Hessen, C-634/21, paras. 62 and 73), or alternatively whether the processing is authorised by law.
[9] Articles 16 and 17 DSA.
[10] EDPB Guidelines, paragraphs 25-42.
[11] Article 20 DSA.
[12] Article 23 DSA.
[13] Pursuant to Article 17(3)(b) of the DSA, the identity of the notifier may be disclosed to the affected recipient only “where strictly necessary”.
[14] Article 26(3) DSA.
[15] “Online platforms” are a sub-category of “intermediary services” and are defined as “a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation” (Article 3(i) DSA). Examples include social networks or platforms that enable consumers to conclude distance contracts with traders.
[16] Article 26(1) and (2) DSA.
[17] A “recommender system” is defined under Article 3(s) DSA as “a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed”.
[18] Article 27 DSA.
[19] EDPB Guidelines, paragraphs 80-88.
[20] In the Guidelines, the EDPB identifies the main risks to be taken into account, including large-scale processing of personal data, the potential lack of accuracy and transparency in relation to inferences and the combination of personal data, assessment or scoring activities (such as profiling), as well as the processing of special categories of data or data that are in any event sensitive or relating to vulnerable individuals.
[21] In the Guidelines, the EDPB refers, by way of example, to platforms that present accommodation or job offers.
[22] Article 27(3) DSA.
[23] Article 28(1) DSA.
[24] Article 28(3) DSA.
[25] EDPB Guidelines, paragraphs 89-96.
[26] The EDPB recommends, for example, verifying only the age range, without requiring the specific date of birth, and avoiding the collection of identity documents or data that allow the unique identification of an individual.
[27] Cf. European Commission, “Guidelines on measures to ensure a high level of privacy, safety and security for minors online pursuant to Article 28(4) of Regulation (EU) 2022/2065”, C(2025) 4764 final, available at https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-protection-minors.