The DSC will only submit a reasoned request for data access to the VLOPs in accordance with Art. 40 (4), (8) of the Digital Services Act (DSA) if the researchers provide certain information and demonstrate certain circumstances in accordance with Art. 40 (8) DSA, including that they are able to comply with the specific data security and confidentiality requirements associated with each request and to protect personal data, and that they describe in their request the appropriate technical and organisational measures they have taken to this end.
Researchers must provide the DSC with sufficient information in their application to enable it to make a reasoned request. The documents generated by researchers through our web tool must be uploaded to the data access portal.
The Delegated Act (DA) (C(2025) 4340 final) lays down specific conditions under which researchers may be granted access in accordance with the General Data Protection Regulation (GDPR) if personal data is included. According to recitals 14 and 15 of the DA, researchers can demonstrate that they have assessed the risks and taken appropriate security measures, for example by means of a "commitment letter from the research organisation" or a data protection impact assessment (DPIA) in accordance with Art. 35 GDPR.
As a rule, the application documents are also reviewed and (pre)assessed by the competent data protection authority (DPA).
Consequences for the application:
If the requested data is personal data within the meaning of Art. 4 No. 1 GDPR, researchers must consider whether the processing within the scope of the research project is likely to result in a high risk to the rights and freedoms of natural persons, taking into account the nature, scope, context and purposes of the processing, in particular where new technologies are used. This is to be assumed, among other things, in the case of extensive processing of special categories of personal data in accordance with Art. 9 (1) GDPR (so-called sensitive data). This preliminary assessment is also referred to as a threshold analysis (an official list of processing activities for which a high risk is to be assumed in any case can be used for guidance: www.lda.bayern.de/media/dsfa_muss_liste_dsk_de.pdf).
If no personal data is required for a research project, explanations regarding risk assessment and protective measures under the GDPR are not necessary. This is conceivable if researchers only evaluate aggregated statistics. In this case, it must instead be demonstrated to the DSC that the data is truly anonymous in the sense of data protection law (not merely pseudonymous data) – a general statement that the researchers do not know which user the data belongs to is not sufficient here.
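Whether aggregated statistics are genuinely anonymous depends, among other things, on whether small groups still allow individuals to be singled out. The following sketch illustrates one common safeguard, small-cell suppression (the field names and the threshold of 10 are illustrative assumptions; suppression alone does not establish anonymity in the legal sense):

```python
def aggregate_with_suppression(records, group_field, min_cell_size=10):
    """Aggregate record-level data into per-group counts and drop
    groups below a minimum cell size. Small-cell suppression reduces
    the risk that individuals can be singled out from published
    statistics; the threshold of 10 is an illustrative choice, not a
    legal rule, and suppression alone does not guarantee anonymity.
    """
    counts = {}
    for record in records:
        key = record[group_field]
        counts[key] = counts.get(key, 0) + 1
    return {group: n for group, n in counts.items() if n >= min_cell_size}
```

An application would then argue why the remaining aggregates, in combination with any other available information, no longer relate to identifiable persons.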
If there is no high risk, the time-consuming risk assessment can be dispensed with; it is sufficient to demonstrate that appropriate technical and organisational security measures within the meaning of Art. 32 GDPR have been taken. In this case, however, researchers must also provide a well-founded explanation in their application as to why, despite the large amounts of data, etc., there is no high risk within the meaning of Art. 35 GDPR. A general reference is not sufficient here.
If a high risk is to be assumed for a specific research project, a so-called data protection impact assessment (DPIA) must be carried out in accordance with Art. 35 GDPR. This is a tool for systematically identifying, assessing and mitigating significant risks to the rights and freedoms of a natural person that may arise from a data processing operation. A DPIA must contain at least the following points:
a systematic description of the planned processing operations and the purposes of the processing, including, where applicable, the legitimate interests pursued by the controller;
an assessment of the necessity and proportionality of the processing operations in relation to the purpose;
a risk assessment; and
the remedial measures planned to address the risks.
At this point, it must be explained in detail why researchers are unable to identify the individuals behind the data. They should specifically state that they do not have access to additional information that could be used to identify an individual. They must also demonstrate that they themselves do not have the technical or organisational means to discover an individual's identity. They should also explain that they do not collaborate with individuals or institutions that may possess such information. It is advisable to mention the time and expense that would be required to enable identification using certain technologies: if this expense is unrealistically high, that can be a strong argument. Whether the VLOP itself can assign the data to specific individuals is irrelevant in this context.
In addition, the security measures taken with regard to possible trade and business secrets of the VLOP must be described, i.e. measures that ensure the confidentiality of this content. For this purpose, section "1. Measures to ensure the confidentiality of data" of the sample template for technical and organisational measures (TOMs) should be completed.
First, the research project, the data required for it and the environment in which the data is processed should be described. To ensure that all relevant data protection aspects are documented in a structured, separate and easily comprehensible manner, this first part is divided into several sub-questions.
There are no legal requirements as to how detailed the description must be, so it can also be presented in keywords, bullet points, etc. It is important that the questions are answered in such a way that a person without specialist knowledge can easily understand who needs which data for what purpose and for how long, and in which (technical) environment the data ends up after the VLOP/VLOSE has granted access. If the path taken by the data is complex, e.g. because many different research organisations, departments within an organisation or technical systems are involved, a data flow diagram should be attached.
Researchers must first describe the purposes for which they are processing the data. This is also important in other respects, as data processing is only permissible if there is a legal basis for it, and the legal basis depends on the purpose (see 2. below). In the case of data access claims under Art. 40 (4) DSA, the purposes are limited to:
"Research that contributes to the detection, identification and understanding of systemic risks in the Union as set out pursuant to Art. 34 (1) DSA, and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Art. 35 DSA".
Guidance: Have funding applications or similar been submitted for the project? If so, the description of the research purposes in such applications can be used, for example.
Goanta et al., The Great Data Standoff: Researchers vs. Platforms Under the Digital Services Act, 2 May 2025, https://arxiv.org/abs/2505.01122v1, offer guidance on how to narrow down a research question (and thus the purposes) and the selection of data.
Questions that should at least be addressed here and that will also be relevant elsewhere in the Data Access Portal:
What systemic risks should be investigated in the research project?
Should a systemic risk explicitly mentioned in Art. 34 (1) DSA be investigated (several may be possible at the same time)? These include:
the dissemination of illegal content through their services;
any actual or foreseeable disadvantageous effects on the exercise of fundamental rights, in particular the fundamental right to respect for human dignity enshrined in Art. 1 of the Charter, the fundamental right to respect for private and family life enshrined in Art. 7 of the Charter, the fundamental right to the protection of personal data enshrined in Art. 8 of the Charter, the fundamental right to freedom of expression and information, including media freedom and pluralism, enshrined in Art. 11 of the Charter, the fundamental right to non-discrimination enshrined in Art. 21 of the Charter, the rights of the child enshrined in Art. 24 of the Charter and the comprehensive consumer protection enshrined in Art. 38 of the Charter;
any actual or foreseeable disadvantageous effects on civic discourse, electoral processes and public security;
any actual or foreseeable disadvantageous effects in relation to gender-based violence, the protection of public health and minors, and serious adverse consequences for a person's physical and mental well-being.
If the risk to be investigated is not mentioned in Art. 34 (1) of the DSA: Why is it still a systemic risk?
If platform data outside the EU is requested: Why is data outside the EU necessary to research systemic risks in the EU?
To what extent could these risks arise from the design or operation of the platform/search engine services and related systems, including algorithmic systems, or from their use? (This does not mean anticipating the research results, but it must be ensured that the risk is related to platform operation, i.e. not a data source for research that is unrelated to it).
To what extent does the research project contribute to detecting, identifying and understanding this risk? In other words, what opportunities does the research project offer in this regard?
According to Art. 4 No. 7 GDPR, compliance with data protection requirements is the responsibility of the natural or legal person or entity that, alone or jointly with others, determines the purposes and means of data processing (known as the controller). It is possible that there may be several controllers for a data processing process (so-called joint controllership, Art. 26 GDPR), or that the controller may commission a service provider to process data for it exclusively on its instructions (so-called processor, Art. 28 GDPR).
When multi-step processes are interlinked, the responsibility for each step must be considered individually. In connection with Art. 40 (4) DSA, a distinction must therefore be made between the phase of compiling data within the sphere of the VLOP, the phase of transferring data to researchers, and finally the research activity itself. It is conceivable that there is joint controllership between the VLOP and the researchers in the data transfer phase, especially if access takes place within a data space operated by the VLOP. In this case, a corresponding agreement would have to be concluded between the joint controllers. Whether this is the case will depend crucially on the access modalities, which have yet to be determined by the DSC. This does not need to be clarified further here under 1.2., as the only relevant issue at this point is who is the controller in the phase of the subsequent research activity.
In cases where there is an employment relationship between the research organisation and individual researchers, it can be assumed that the organisation to which the researchers belong is the controller, and not the individual researcher. If the connection between the organisation and the researcher is rather loose or complicated, individual researchers should check whether they are personally considered controllers within the meaning of the GDPR (i.e. whether they decide on the purposes and means of processing independently of their institution). If individual researchers use personal data for their own purposes outside the limits and controls set by an institution, they are also responsible for the processing (see also EDMO, Report of the European Digital Media Observatory’s Working Group on Platform-to-Researcher Data Access, 31 May 2022, para. 26 ff., https://edmo.eu/wp-content/uploads/2022/02/Report-of-the-European-Digital-Media-Observatorys-Working-Group-on-Platform-to-Researcher-Data-Access-2022.pdf).
Questions that should at least be addressed here:
Who determines the purposes and means of data processing for the research activity phase?
Is the research integrated into a higher-level research organisation (university, etc.) that operates the infrastructure for data processing?
What is the organisational and legal relationship between the individual researchers and the higher-level research organisation (e.g. employment relationship)?
When several research organisations, possibly from different countries, collaborate: How are the individual organisations involved in the research and in determining the purposes and means? Do the various organisations act as joint controllers and has an agreement been concluded for this purpose in accordance with Art. 26 GDPR?
Who is appointed as the (internal or external) data protection officer within the meaning of Art. 37 GDPR and how are they involved in the project (e.g. in an advisory capacity, have they helped to draft documents, etc.)?
Which persons are involved in data processing, in what role and to what extent?
Have the researchers involved undergone data protection training or similar awareness-raising measures?
Are sponsors involved in such a way that they influence the purposes and means of data processing or have (partial) access to the personal data?
Under 1.3, indicate whether there are any legal frameworks that deviate from the GDPR and apply to the specific research project.
However, there are some special features in the national laws of the Member States regarding the processing of personal data for research purposes, as the GDPR leaves room for special regulations. An overview of the most relevant national special features is listed in Annex 4 (Compendium of EU Member State Laws) of the above-mentioned EDMO report.
When selecting the data they need access to, researchers must take the principle of data minimisation into account. In short, as little directly identifying data (e.g. names, addresses) and as little sensitive data within the meaning of Art. 9 GDPR as possible should be processed. Sensitive data includes all personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, as well as genetic data, biometric data processed for the purpose of uniquely identifying a natural person, health data, and data concerning a person's sex life or sexual orientation. This must also be taken into account if the raw data appears neutral but the researchers are specifically interested in drawing conclusions about sensitive circumstances within the meaning of Art. 9 GDPR (see the example in the EDMO Report p. 81: "to infer users' political ideology").
Researchers can present the data they wish to process in tabular form. The examples of potential data categories/types listed in the table below are based on the examples from Recital 13 of the DA and Goanta et al., The Great Data Standoff, 2 May 2025. VLOPs must maintain data catalogues for this purpose. Researchers must also specify elsewhere in the application portal which data they need access to. This information can be referred to at this point.
Under 1.5, describe the entire life cycle from data collection to deletion, i.e. through which hands and systems the data flows (for different constellations, see CESSDA Data Management Expert Guide (DMEG), as of January 2020, https://dmeg.cessda.eu/).
The explanations must refer to the requested access modality. The following options are available:
Transmission via an interface/data storage facility so that the data is processed within the research organisation's infrastructure.
Within a secure processing environment (virtual clean room) operated by the VLOP or its service provider. In this variant, the researchers do not process the data in their own infrastructure, so the description of the life cycle would be much shorter.
The following may be helpful in this regard:
Has the research team previously carried out projects in the same organisation for which data protection documentation was prepared?
Does the research organisation have a Record of Processing Activities (RoPA) for its general research operations, documentation of technical and organisational measures (TOM) for general research operations, or does it provide its researchers with sample documentation? Some of these documents can be found on the public website of the IT departments or on the organisation's intranet, or they can be provided on request. Such documents from the research organisation are also a suitable means of documenting, within the meaning of Recital 14 of the DA, that researchers have access to adequate protective measures.
If so, the contents of these documents can be used if it is ensured that the information is still up to date and applicable to the specific research project. As a rule, it will be necessary to consult the organisation's internal IT department and the internal/external data protection officer to ensure that the following description is accurate, up to date and complete.
Under 1.5.1, describe which devices and systems are used and which software, hosters and service providers are employed (a table may be used). In addition, the technical and organisational measures (TOM) taken to ensure security must be presented in an appendix.
Circumstances that should generally be addressed here: Where and how is the data stored? What security measures have been taken?
"On premises", i.e. in the research organisation, or "off premises",
location and spatial/technical conditions of the hardware (dedicated hardware/virtual hardware (VMware/Solaris LDOM)/partitioned hardware (logical domains, IBM tape library), etc.)
Details of external infrastructure (public cloud, private cloud, etc.), certification of service providers (ISO 27001, BSI IT-Grundschutz, ISO 27001 with ISO 27017, BSI C5, etc.)
Details on clients (laptops, smartphones, scanners, printers) and administration (centralised, local)
Operating systems used, specific research information systems, research data management systems, office suites (M365), etc.
If available: a schematic representation of information processing (networks, clients, servers, communication/data connections)
If available: a schematic representation of the flow of personal data between the IT systems
This section should only be completed if there is a processor; otherwise, the section should be deleted.
For each processor involved in the data processing lifecycle, indicate the purpose, duration and scope of their activities and how/where processing instructions are documented (processing agreement pursuant to Art. 28 GDPR, emails, etc.). If the research organisation has a RoPA for its research operations, the relevant information should be found therein. Otherwise, the research organisation's data protection officer should be consulted in this regard.
In the context of research data access pursuant to Art. 40 (4) DSA, there is a special feature compared to other research projects in that consent as a legal basis is ruled out per se, since, on the one hand, the data is not collected directly from the data subjects by the researchers and, on the other hand, the VLOPs did not pursue any research purposes when collecting the data. According to the DA, the following are possible legal bases:
For normal data categories
Art. 6 (1) (e) GDPR, if the processing is necessary for the performance of a task carried out in the public interest.
Art. 6 (1) (f) GDPR, if processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.
For sensitive data
Art. 9 (2) (g) GDPR, if processing is necessary on the basis of Union or Member State law, [...] for reasons of substantial public interest.
Art. 9 (2) (j) GDPR, if the processing is necessary for scientific research purposes in accordance with Art. 89 (1) on the basis of Union or Member State law.
In the case of Art. 6 (1) (f) GDPR, the result of the balancing of interests must be justified. For guidance: Legitimate Interest Assessment Template in the EDMO Report (p. 102 ff.).
In the case of Art. 6 (1) (e), Art. 9 (2) (g) and (j) GDPR, Art. 40 DSA justifies the public interest. It is sufficient to cite Art. 40 DSA as justification and to point out that the legislator has specifically recognised this research activity in the DSA itself as being in the public interest. Researchers may cite several of the above legal bases.
This is confirmed by Recital 30 of the Delegated Act, which states with regard to the data provider and its legal bases: "Where special categories of personal data within the meaning of Art. 9 GDPR are to be processed, [Art. 40 (4) DSA] meets the requirement of Art. 9 (2), point (g) GDPR".
If it was determined in 1.3 that there are additional requirements in national law that must be observed, it must be stated here in 2.1 that these additional conditions are also met.
According to Art. 5 (1) (c) GDPR, personal data must be adequate, relevant and limited to what is necessary for the purposes of processing (principle of data minimisation). The fact that this principle is not merely a nice-to-have is clarified in the DSA and DA, according to which the DSC must also take into account "information on the necessity and proportionality of access to the data" when deciding to send a reasoned request to the VLOP.
In particular, processing of personal data is not necessary within the meaning of the GDPR if a specific research project can also be implemented with anonymised data from the outset. Rather, data processing is only necessary to the specific extent if no equally suitable, less burdensome means of achieving the respective research purpose is available.
At this point, researchers must therefore describe why it is not sufficient for their research project to process the data that is freely available on the platforms anyway or for which access can also be requested via the existing Research API. They may also refer to the fact that scraping is often not possible (to the extent necessary for the purpose) due to the platforms' terms of use or technical security measures, and that not all of the information required to achieve the purpose is available via the Research API.
With regard to the scope of the requested data, it should be specified how much data is to be processed, how many people would be affected, and which geographical areas are affected.
Why can the research only be carried out by accessing the data?
Why can the research objective not be achieved with anonymous data?
Why can the research goal not be achieved with (completely) pseudonymous data?
Is all personal data listed under 1.4 required for the research project, or could some variables be removed without affecting the project?
When deciding to send a reasoned request to the VLOP, the DSC must also take into account the duration of the research project in accordance with Art. 8 (d) DA. In this regard, the GDPR stipulates that personal data must be stored in a form that allows the identification of the data subjects only for as long as is necessary for the purposes for which they are processed (principle of storage limitation); personal data may be stored for longer if, subject to the implementation of appropriate technical and organisational measures, the personal data is processed exclusively for scientific research purposes in accordance with Art. 89 (1) GDPR.
For guidance: In scientific practice, a general retention period of 10 years until deletion is regularly assumed, unless shorter or longer periods are required for legal or ethical reasons (see, for example, Max Planck Society, Rules for Ensuring Good Scientific Practice, as of 20 March 2009, p. 4; DMEG, p. 101).
It should also be noted that research data containing sensitive personal data pursuant to Art. 9 GDPR must always be anonymised as soon as this is possible for the research purpose. If the data is not anonymised directly but is first processed in pseudonymised form, the parts of the data set that enable re-identification must be stored in a separate system (depending on the type of organisation, this may be required by other standards, see e.g. Section 17 (2) BlnDSG). In this case, it must also be stated when the data that enables re-identification will be destroyed.
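The required separation of re-identifying material from the research data set can be sketched as follows (the field name and the tokenisation scheme are illustrative assumptions; in practice the lookup table would be kept in a separate, access-restricted system and destroyed at the stated point in time):

```python
import secrets

def pseudonymise(records, id_field="user_id"):
    """Replace direct identifiers with random, non-derivable tokens.

    Returns the pseudonymised records together with a lookup table
    (token -> original identifier). The lookup table is the part of
    the data set that enables re-identification: it must be stored in
    a separate, access-restricted system and destroyed as soon as
    re-identification is no longer needed for the research purpose.
    """
    forward = {}  # original identifier -> token (working map for this run)
    lookup = {}   # token -> original identifier (store separately!)
    pseudonymised = []
    for record in records:
        original = record[id_field]
        if original not in forward:
            token = "p_" + secrets.token_hex(8)
            forward[original] = token
            lookup[token] = original
        cleaned = dict(record)
        cleaned[id_field] = forward[original]
        pseudonymised.append(cleaned)
    return pseudonymised, lookup
```

Because the tokens are random rather than derived (e.g. hashed) from the identifier, the research data set alone does not allow identities to be recovered; linkage across records of the same user is nevertheless preserved.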
When is the project expected to start?
How long is the project expected to run?
How long should which data be retained afterwards?
What is the starting point of the retention period?
If it is not possible to specify a concrete deletion rule, criteria for determining the storage period must be specified, e.g. deletion X months/years after completion of the research work/last publication of results, etc.
If relevant: When will data for re-identification be deleted?
If data could in any way leave the EU (e.g. use of a cloud service, collaboration with research organisations in countries outside the EU), specific information must be provided below on the countries in which personal data is processed during the research work life cycle and whether these countries are recognised as offering an adequate level of data protection or what guarantees apply to the transfer (cf. Art. 44 et seq. GDPR).
Data subjects have a wide range of rights that serve to make data processing transparent for them and enable them to intervene to a certain extent. In principle, these rights must also be upheld in the context of research activities, and it must be explained how the rights are being fulfilled. However, Art. 89 (2) GDPR creates scope for certain data subject rights to be restricted in the national law of Member States in favour of research.
This section should specify the relevant legal provisions indicating that the rights of data subjects cannot or need not be fulfilled in the specific research project. Since the legal restrictions usually depend on additional characteristics being met beyond general research activity (e.g. disproportionate effort, impossibility, risk to the achievement of the purpose), a justification must generally be provided.
The table contains examples of standards from the GDPR. It must be examined on a case-by-case basis whether and which restrictions are regulated in the national law of the respective Member State.
In principle, documentation from previous research projects or documents from the organisation's general research activities can be used for this purpose (see 1.5).
According to the DA, the DSC should "verify that the data access application contains sufficient indication that the researchers have assessed risks to personal data protection." Against this background, it is to be expected that the DSC and, in particular, the competent DPA will pay close attention to the risk assessment.
Risk assessment and security measures are interrelated: the GDPR links the level of TOMs to the risk to the rights and freedoms of data subjects associated with the processing of personal data. The risk assessment may also take into account the basic security measures described in section 1.5, which the research organisation has already implemented.
If the threshold analysis shows that there is no high risk, the time-consuming risk assessment can be dispensed with; it is sufficient to demonstrate that appropriate technical and organisational security measures within the meaning of Art. 32 GDPR have been taken (4.2). In this case, however, researchers must additionally provide a well-founded explanation in their application as to why, despite the large amounts of data, etc., there is no high risk within the meaning of Art. 35 GDPR (4.1). A general reference is not sufficient here.
In order to assess a risk in an informed manner and take appropriate protective measures, potential sources of risk must first be identified, possible risk scenarios analysed, and the probability of occurrence and severity of damage determined. A risk index can then be calculated using a risk matrix. In accordance with the relevant risk, researchers must, pursuant to Art. 40 (8) DSA, be able to comply with the specific data security and confidentiality requirements associated with each request and protect personal data, and describe the appropriate TOMs they have taken to this end.
Only if the risk index is still too high despite the basic security measures already in place (see section 1.5) does the organisation need to take additional measures to reduce the probability of occurrence and thus the risk index. If a residual risk remains, it must be assessed whether this is acceptable.
There are various options for presenting the risk assessment, although it is advisable to follow the established methods of the supervisory authorities. Detailed results of the risk analysis should be presented in tabular form, for which the template for risk assessment can be used.
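The arithmetic behind such a risk matrix can be sketched as follows (the graded scale and the class thresholds are illustrative conventions, not prescribed by the GDPR; supervisory-authority templates define their own grading):

```python
def risk_index(probability, severity):
    """Risk index as the product of the graded probability of
    occurrence and the graded severity of damage (e.g. each on a
    scale from 1 to 4)."""
    return probability * severity

def risk_class(index, medium_threshold=4, high_threshold=8):
    """Map a risk index to a risk class. The thresholds are
    illustrative assumptions; supervisory-authority templates
    define their own cut-offs for low, medium and high risk."""
    if index >= high_threshold:
        return "high"
    if index >= medium_threshold:
        return "medium"
    return "low"

# Example: an unlikely event (probability 2) causing significant
# damage (severity 3) yields an index of 6, i.e. a medium risk
# under these illustrative thresholds.
```

Each risk scenario in the tabular risk analysis would receive such an index once before and once after the planned remedial measures, making the risk reduction visible.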
Researchers must provide a well-founded explanation in their application as to why there is no high risk to the rights and freedoms of natural persons. The decisive factors here are the nature, scope, context and purposes of the processing and the use of certain technologies. In particular, it should be explained that the requested data contains no or only a small amount of sensitive data.
Art. 40 (8) DSA and Art. 8 DA focus on the issues of confidentiality and data security. Weak points must therefore be identified that pose risks, in particular to confidentiality (= no unlawful access to data), availability (= no loss of data) and integrity of data (= no unwanted changes to data). The other data protection principles, including transparency, data minimisation and storage limitation, have already been taken into account in sections 2 and 3.
The identified and analysed risks must be mitigated by means of remedial measures. Art. 32 GDPR regulates this as follows: "Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk."
In order to assess what is "appropriate" in the research context – and thus necessary beyond the basic security measures – Art. 9 (4) and (5) DA and, where applicable, the relevant national specifics (see 1.3.) provide guidance. This means that it should be ensured that the additional TOMs cover all elements that the DSC may need to consider when determining access modalities, e.g. specific permanently assigned institutional devices, specific access restrictions.
According to Art. 9 (4) and (5) DA, these include:
Relevant network security measures
Encryption
Access control mechanisms
Backup policies
Mechanisms to ensure data integrity
Contingency plans
Planned retention periods and corresponding data destruction plans
Organisational measures, such as internal review processes, role and rights concepts, confidentiality obligations
Contractual agreements between VLOP and researchers
Training courses for researchers on data security and data protection.
If a secure processing environment is required, it must be demonstrated that the operator of the processing environment
has established specific access restrictions for this environment in order to minimise the risk of unauthorised access, copying, modification or removal,
ensures that researchers only have access to data covered by the application through individual and unique IDs and confidential access modes,
logs access to be able to check and verify all access,
ensures that the computing power available to researchers is sufficient and appropriate for the research project.
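To illustrate the logging requirement only, the following sketch records every access to data in the secure processing environment against an individual, unique researcher ID so that all access can later be checked and verified. Neither the DA nor this guidance prescribes any particular log format; all field and dataset names here are hypothetical.

```python
# Illustrative sketch of access logging in a secure processing environment.
# All names (fields, IDs, datasets) are hypothetical; no format is prescribed.
import json
from datetime import datetime, timezone

access_log: list[dict] = []  # in practice: append-only, tamper-evident storage

def log_access(researcher_id: str, dataset: str, action: str) -> None:
    """Record one access event with a UTC timestamp."""
    access_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "researcher_id": researcher_id,  # individual and unique per researcher
        "dataset": dataset,              # only data covered by the application
        "action": action,                # e.g. "read", "export-request"
    })

log_access("res-0042", "vlop-sample-2025", "read")
print(json.dumps(access_log[-1], indent=2))
```

The append-only list stands in for whatever tamper-evident store the operator actually uses; the point is that each entry ties a concrete action to one uniquely identified researcher.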
If the risk materialises, the significant effects on the data subject may be physical, material or immaterial in nature. Recital 75 of the GDPR cites, among others, the following examples: discrimination, identity theft or fraud, financial loss, damage to reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorised reversal of pseudonymisation, and other significant economic or social disadvantages.
For each identified category of potential damage, the events that could lead to its occurrence were determined. These consist in particular of:
In risk management, a risk generally has two dimensions: firstly, the severity of the damage and, secondly, the probability that the event and the consequential damage will occur. In the risk assessment in Appendix [please insert the number of the annex on the TOMs], the probability of occurrence and the severity of the damage are graded as follows:
Probability of occurrence:
Level 1 (rare/negligible): Damage cannot occur according to current expectations. Example: infection by malware on a stand-alone computer that is not connected to a network and to which no other media can be connected.
Level 2 (medium/limited): Damage may occur, but based on previous experience and the given circumstances, it seems unlikely. Example: malware infection on a computer that is kept up to date, equipped with the latest antivirus software and connected only to a BSI-certified company network.
Level 3 (frequent/significant): Based on previous experience and the given circumstances, damage appears possible but not very likely. Example: malware infection on a computer that is kept up to date, equipped with the latest antivirus software and directly connected to the internet.
Level 4 (very frequent): Based on previous experience and the given circumstances, damage appears possible and very likely. Example: infection by malware on an outdated Windows XP computer without antivirus software that is directly connected to the internet.
The severity of damage is classified into four intensity levels:
Minor: Those affected may experience minor inconveniences, but they can overcome these with a little effort. Immaterial: slight annoyance. Material: loss of time. Physical: temporary headaches.
Manageable: Those affected may suffer major inconveniences, but they can overcome these with some difficulty. Immaterial: minor but objectively verifiable psychological complaints. Material: significantly noticeable loss of personal comfort. Physical: minor physical damage (e.g. mild illness).
Substantial: Those affected may suffer significant consequences that they can only overcome with serious difficulty. Immaterial: severe psychological distress. Material: financial difficulties. Physical: severe physical complaints.
Severe: Those affected may suffer significant or even irreversible consequences that they are unable to overcome. Immaterial: permanent, severe psychological problems. Material: substantial debts. Physical: permanent, severe physical complaints.
For risk analysis, a risk index is calculated from the specified classifications for the probability of occurrence and the severity of the impact according to the following matrix:
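The matrix itself is set out in the appendix and is not reproduced here. Purely as an illustration of the mechanics, and not as this guidance's own matrix, a common convention multiplies the two four-point scales and bands the product into risk classes; the thresholds below are assumed for the sketch:

```python
# Illustrative only: the actual risk matrix belongs in the appendix.
# Assumption: risk index = probability level (1-4) x severity level (1-4),
# banded into three classes -- a common convention, not prescribed here.

def risk_index(probability: int, severity: int) -> int:
    """Combine the two four-point scales into a single index (1-16)."""
    if not (1 <= probability <= 4 and 1 <= severity <= 4):
        raise ValueError("both scales run from 1 to 4")
    return probability * severity

def risk_class(index: int) -> str:
    """Band the index into a risk class (thresholds are illustrative)."""
    if index <= 4:
        return "low"
    if index <= 8:
        return "medium"
    return "high"

# Example: probability level 2 (medium/limited) with severity "substantial" (3)
print(risk_class(risk_index(2, 3)))  # -> medium
```

Whatever concrete matrix is used, the point is the same: a high index can arise either from likely moderate damage or from unlikely but severe damage, and both must be mitigated.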
Detailed results of the risk analysis and assessment for the specific research project are provided in Appendix [please insert the number of the annex on the TOMs] under II.
The attachments must be listed in the table below. Not all of the examples listed are necessarily required.
The finished document will be sent to you by email.