
Center for User Rights

The Center for User Rights focuses on strengthening the rights of users of online platforms and enforcing the Digital Services Act.

Users are central actors in the digital world. They are not only passive consumers, but also producers of content, active participants in discourse, and observers of what is happening online. However, as most online spaces are commercial online platforms, users find themselves in situations characterized by power asymmetries: their access to information and the boundaries of their rights to express themselves and seek redress are defined by commercial actors. Such power asymmetries have been linked to a host of negative effects on users’ fundamental rights, which are often most acutely felt by marginalized or vulnerable groups. Users and those representing them need new tools to hold tech companies to account.

Svea Windwehr

Head of the Center for User Rights

“For many users, online platforms are a key part of their daily lives – it’s where they get their news, where they connect with the world and where they go for entertainment. Online platforms have a huge influence on how we perceive the world and they often determine whether and how our rights are protected online. The aim of the Center for User Rights is to protect users and to fight for tech companies to respect fundamental rights – which is crucial to protect democratic discourse online.”

The goal of the Center for User Rights is to strengthen and enforce users’ rights against online platforms. Through lawsuits, complaints and policy work, we fight to ensure that platforms become more transparent and respect fundamental rights such as the freedom of expression, freedom of information and academic freedom.

Focus areas

The Center for User Rights consolidates GFF's work on platform regulation and aims to strengthen and enforce user rights in various areas.

Supporting the implementation and enforcement of the Digital Services Act is the focus of the Center's work. In addition, we pursue better protections against violence online and continue our work in the area of copyright law.

The Digital Services Act

With the Digital Services Act (DSA), the European Union establishes standardized rules to create safer digital spaces. The DSA sets out liability rules for online intermediaries and platforms, contains rules protecting users’ procedural rights with regard to the moderation of their content, and confirms that platforms’ terms of service must respect European fundamental rights. It also obliges online platforms to be more transparent and expands the rights of researchers to access platforms’ data. In addition, it introduces special due diligence obligations for very large online platforms and search engines, such as the duty to analyse and minimize systemic risks that their services may pose to society.

Unlike the General Data Protection Regulation, the DSA does not only rely on national regulatory authorities, but also gives the European Commission additional centralized powers to directly supervise large online platforms such as X, Meta and Google.

The DSA promotes cooperation between national enforcement bodies and also gives civil society organizations a comprehensive mandate to enforce users' rights. We want to use this mandate and defend the fundamental rights of users with strategic lawsuits. We will focus on four areas:

  • Transparency and protection of fundamental rights in (automated) content moderation: Platforms must work more transparently and respect the fundamental rights of their users when removing, restricting, or reinstating their content.
  • Access to research data: Most platforms are opaque when it comes to their own data, making research into the influence of platforms on issues such as elections, protection of minors or democratic discourse almost impossible. Platforms must give access to relevant data and thus respect academic freedom.
  • Platform design: Many provisions of the DSA aim to make online platforms and their practices and policies user-friendly and compliant with fundamental rights. For example, platforms must disclose how they recommend content to users, respect users’ privacy in the context of online ads, and offer protections for minors.
  • Fighting discrimination: Algorithmic decision-making systems and other AI systems have been shown to be prone to discriminate against minorities and vulnerable groups. We explore legal means to counteract these risks.

Protection against digital violence

Digital violence not only causes real pain to the people affected; it is also a threat to our democracy. A vibrant democracy needs communication spaces in which people can express their opinions without fear, including online. Otherwise, voices are silenced and the diversity of opinions is endangered.

Together with the Alfred Landecker Foundation, we have launched the Marie Munk Initiative, a project that defends fundamental rights in the digital space. As part of the Center for User Rights, the Marie Munk Initiative fights to improve protection against digital violence. To this end, we have published a draft Digital Violence Protection Act and are supporting the legislative process for a law to improve protection against digital violence.


User rights in the context of copyright law

Free communication is fundamental to a vibrant democracy and is closely linked to numerous civil liberties, in particular the freedom of science, information, opinion and art. Copyright law, which in the pre-digital age mainly concerned professional creatives and media companies, now often comes into conflict with freedom of communication.

This has consequences for science and teaching, for activists, and for creators themselves. As part of the Center for User Rights, the control © project defends user rights where freedom of communication and copyright law collide.

