AIDA Data Sharing Policy

This section discusses the ethical and legal framework surrounding the common practice of using clinical imaging data for research in Sweden, with references to official sources that may serve as starting points for readers who wish to understand the framework and become better equipped to evaluate their own engagements.

This discussion is based on European law, and on Swedish law where European law leaves room for national legislation. Researchers in countries with legal systems similar to Sweden's can likely use this document as a point of departure for their own discussions, substituting their own national legislation where this document references Swedish law.

Note: This document does not constitute legal advice.


The legal and ethical framework surrounding the use of personal data in research is complex. Much depends on the details and context of each individual case, which means that recommendations are better given case by case than for a hypothetical general case. This is therefore not an in-depth analysis of the framework, but a limited and superficial one. In the interest of readability, many potentially important considerations such as secrecy, law enforcement and national security have been left out of this discussion, although they may be very important to whatever case you as the reader may currently have in mind. Likewise, depending on circumstances, there may be avenues for legal and ethical research that are not covered in full here, such as consented data collection, clinical studies approved by the Medical Products Agency, medical examinations carried out as part of ethically approved research activities, or biobank or register based research.

In Sweden the use of clinical data in research is regulated chiefly by the laws discussed in the following sections: the EU General Data Protection Regulation (GDPR), the Swedish patient data law (PDL), the Swedish ethical review act (EPL), and the Swedish public access to information and secrecy act (OSL).

Notably, the Swedish biobank law does not regulate the use of clinical data in research; rather, it concerns the use and keeping of physical human biological materials, and defines rules that are much stricter than the data-processing laws above. One possible reason is that a tissue sample can be analysed in a myriad of ways, potentially giving rise to a huge amount of sensitive personal data whose consequences of disclosure could be very hard to assess.

The General Data Protection Regulation (GDPR) is the law that regulates all processing of personal data throughout the European Union, which according to GDPR Article 4 §2 includes a very broad range of activities such as collection, organisation, storage, alteration, retrieval, use, disclosure, alignment, restriction, and even erasure and destruction of data. GDPR Article 6 §1 requires all processing of personal information to have a legal basis and be for a specified purpose, and GDPR Article 5 §1 b states that further processing for research is not to be seen as incompatible with any initially defined purposes. However, most data from medical imaging concerns information on the health of a natural person, a special category of information given particular protection by GDPR Article 9 §1, which forbids processing of these categories of information except in a defined set of circumstances.

GDPR Article 9 §3 allows caregivers to process patient data according to national law, which in the case of Sweden is the Swedish patient data law (PDL). PDL Chapter 3 §1 requires caregivers to maintain patient health records (“patientjournal”), and PDL Chapter 2 §6 as well as GDPR Article 4 §7 identify the caregiver as data controller (“personuppgiftsansvarig”) for these records. These records may be electronic, and may be distributed over several systems (eg PACS, LIS, etc). PDL Chapter 3 §2 identifies these records as a source of information for research.

Swedish freedom of the press act (“tryckfrihetsförordningen”) TF Chapter 2 §4 states that documents at a public authority (“myndighet”, eg a caregiver) are public, and TF Chapter 2 §1 states that all shall have the right to access them, unless according to TF Chapter 2 §2 access must be restricted for example on account of individual privacy.

GDPR Article 9 §2 g allows research processing of such information according to national law, if it pursues a substantial public interest, is proportional to the aim, and has adequate protections in place according to GDPR Article 32 and GDPR Article 89. In Sweden this is established according to the Swedish ethical review act (“etikprövningslagen”) EPL §6 through an application for review to the Swedish Ethical Review Authority (“etikprövningsmyndigheten”, EPM). An approved ethical review application then gives the framework for the research activity in terms of what data processing is allowed, and what safeguards are to be employed (cf Protective measures below). This framework can be modified by submitting an amendment application (“ändringsansökan”) to EPM, if needed for example to get support for further kinds of processing.

The ethical review application specifies the legal entity under whose control the research activities will be carried out (“research institution”, cf EPL §2 “forskningshuvudman”). In Sweden this is nearly always an organization, such as a university or a company. EPL §11 allows research only if it is carried out by, or under the supervision of, a researcher (“ansvarig forskare”) who has the necessary scientific competencies, which include the awareness and ability to ensure that the activities are carried out as described in the ethical review application and according to institutional policies at the research institution.

According to Swedish public access to information and secrecy act (“offentlighets- och sekretesslag”) OSL Chapter 6 §4 caregivers shall on request disclose the documents (“lämna ut handlingarna”) to the research institution, unless for example according to OSL Chapter 21 §7 it can be assumed that the disclosed information will be processed in breach of EPL. This means that the caregiver must not disclose information outside of what is described in the approved ethical review application, nor if it can be assumed that the recipient will abuse the disclosed information for example by processing it by means or for purposes other than what is described in the approved ethical review application. Also, OSL Chapter 25 §1 and OSL Chapter 21 §1 require the caregiver to assess whether disclosure could harm an individual (“menprövning”). This should take into account that research institutions that are public authorities (such as universities) are also required by OSL to assess harm before any subsequent disclosure, whereas private companies are not. Such assessments are typically carried out with support from internal institutional policies and guidelines for information classification and management.

If the caregiver discloses personal information for use in research activities under a research institution’s control as described in the approved ethical review application, then the research institution becomes data controller to the disclosed information, and is from then on responsible for the legal and ethical processing and safeguarding of the disclosed information.

The research institution can according to GDPR Article 28 §3 also make use of third party services for processing personal data (cf Data processing services, and the cloud below).

The research institution also has the copyright (“upphovsrätt”) to the disclosed information according to Swedish copyright law (“upphovsrättslagen”) URL §49, because it has “produced a catalog, table or similar into which large quantities of information have been put together”, since the data extraction is carried out according to parameters specified in the approved ethical review application. If the approved ethical review application allows it, the research institution can also grant others use of the information, and can use licensing terms (“avtalslicens”) under URL Chapter 3a to further limit what use is to be considered allowed.

As of 2020-01-01, the research institution is required to store the information for 10 years after its last use, in order to enable research validation according to §8 of the Swedish act on responsibility for good research practice and review of research misconduct (“lag om ansvar för god forskningssed och prövning av oredlighet i forskning”).

Data processing services, and the cloud

The research institution can, according to GDPR Article 28 §3, also make use of third party services for processing personal data, provided a data processing agreement (“personuppgiftsbiträdesavtal”) is in place that describes what processing is allowed. According to GDPR Article 28 §1, such an agreement may only be made with a data processor (“personuppgiftsbiträde”) that can provide sufficient guarantees that appropriate technical and organisational measures are put in place (cf Protective measures below). The data processor may also, according to GDPR Article 28 §3 a, only process information according to documented instructions from the data controller. According to GDPR Article 82 §2 the data controller is liable for any damages caused by this processing; the data processor, however, is liable for damage caused by any processing it has carried out outside of these instructions.

Regarding sufficient guarantees, it may be easier for a research institution in the EU to obtain them from data processors that are themselves based in the EU (or even in the same country) and as such regulated by GDPR. Data processors based outside the EU may not be legally permitted to fulfil any guarantees given, depending on national legislation in the country where they are based, and a research institution in the EU may not have the same ability to successfully pursue legal action in countries outside the EU should these guarantees not be upheld.

Protective measures

Principles for protection shall according to GDPR Article 5 §1 include purpose limitation (only for specified legal purposes), data minimization (only use relevant and necessary data), storage minimization (do not keep longer than necessary), and integrity and confidentiality (technical and organizational measures).
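As a concrete illustration of the data minimization principle, a research pipeline might retain only the fields the approved ethical review application actually requires. The sketch below is hypothetical; the field names and the notion of a fixed allow-list are invented for the example and are not prescribed by GDPR or this policy.

```python
# Hypothetical sketch of GDPR Article 5 §1 data minimization:
# keep only the fields needed for the stated research purpose.
# All field names here are invented for illustration.

NEEDED_FIELDS = {"study_id", "modality", "finding"}

def minimize(record, needed=frozenset(NEEDED_FIELDS)):
    """Drop every field not needed for the stated research purpose."""
    return {k: v for k, v in record.items() if k in needed}

full_record = {
    "study_id": "S-0042",
    "modality": "CT",
    "finding": "nodule, 4 mm",
    "patient_name": "Test Person",   # not needed -> must not be kept
    "home_address": "Example St 1",  # not needed -> must not be kept
}

minimized = minimize(full_record)
assert minimized == {
    "study_id": "S-0042", "modality": "CT", "finding": "nodule, 4 mm"
}
```

The design point is that minimization is decided up front, from the approved application, rather than ad hoc during analysis.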

GDPR Article 32 §1 requires data controllers to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons. The preceding sentence contains many qualifiers that are hard to quantify objectively, like “appropriate”, “taking into account”, “state of the art”, “cost”, “nature”, “risk”, “likelihood”, and “severity”. Advice on their interpretation is provided by national supervisory authorities (“tillsynsmyndighet”, Integritetsskyddsmyndigheten in Sweden) according to GDPR Article 57 §1c, and by data protection officers (“dataskyddsombud”) at large organizations such as public authorities (“myndighet”) according to GDPR Article 39 §1, for example through institutional policies or on request.

GDPR Article 32 §1a suggests encryption and/or pseudonymization when suitable. Pseudonymization, according to GDPR Article 4 §5, entails processing of personal data in such a manner that it can no longer be attributed to a specific person without the use of additional information (eg “a key”), which is to be kept protected and separate. If this additional information is deleted, so that it is no longer possible to attribute the data to a specific person even indirectly, then the data is anonymous and ceases to be personal data regulated by GDPR, and thus may no longer require as high a level of protection. However, GDPR Article 32 §1d requires regular follow-up of the effectiveness of the protective measures, such as anonymization carried out in the past: as the “state of the art” progresses and more information becomes available, it may over time become possible or even trivial to again attribute the data to a specific person, making the data anonymous no longer. Depending on how the supposedly anonymous data has been disseminated, it may then require significant effort across many organizations to again restore it to an appropriate level of protection.
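The mechanism described above can be sketched in code: direct identifiers are replaced with random pseudonyms, and the mapping (“the key”) goes into a separate table to be stored and protected apart from the research data. This is a simplified, hypothetical illustration; the function and field names are invented, and real pseudonymization of imaging data involves far more than a single identifier field.

```python
import secrets

def pseudonymize(records, id_field="patient_id"):
    """Replace the identifier in each record with a random pseudonym.

    Returns the pseudonymized records and a key table mapping
    pseudonym -> original identifier. The key table must be stored
    separately and protected (GDPR Article 4 §5).
    """
    key_table = {}
    out = []
    for record in records:
        pseudonym = secrets.token_hex(8)  # unguessable random pseudonym
        key_table[pseudonym] = record[id_field]
        out.append(dict(record, **{id_field: pseudonym}))
    return out, key_table

records = [{"patient_id": "19121212-1212", "finding": "nodule, 4 mm"}]
pseudo_records, key = pseudonymize(records)

# The research data no longer carries the original identifier...
assert pseudo_records[0]["patient_id"] != "19121212-1212"
# ...but with the separately kept key it can still be re-attributed.
assert key[pseudo_records[0]["patient_id"]] == "19121212-1212"
```

Deleting the key table is what would move such data toward anonymity, subject to the caveat above that indirect re-identification must also remain impossible.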

As can be seen in this and the preceding section, despite detailed data protection guidelines and support functions, many of the implementational details are still left to the individual researcher to resolve. This is in no small part due to the nature of research itself, whose mission is to investigate the unclear and to learn the unknown, which makes it hard for anyone to usefully advise researchers on how best to process the data with which they themselves are the most intimately acquainted, and on which they are our foremost experts.

Researchers can to some degree find support from data protection officers, from policy documents such as this, and from the approval of ethical review applications that they themselves likely wrote, but detailed questions like “can this data be considered anonymous?” and “are these protective measures appropriate?” must in the end be answered by the researchers themselves.

There are, however, two encouraging consequences of this: while the ethical/legal framework surrounding these activities is complex, most research is in fact allowed, provided it can be properly motivated. And the obvious corollary: if it cannot, then it is not.