
Blog

Impressions from the DSA and Platform Regulation Conference

By Weiwei YI · 3 April 2024

Just before the Digital Services Act (DSA) became fully applicable in Europe, the University of Amsterdam hosted the DSA and Platform Regulation Conference on 15–16 February. It provided a forum for researchers to discuss the latest insights into how the DSA can contribute meaningfully to platform regulation. The conference unfolded as a series of parallel work-in-progress sessions followed by a plenary mixing keynotes and high-level panels. This post reflects my experience of attending the conference; for more information on the full range of topics and the contributions of individual researchers, please consult the programme.

The conference programme, featuring a yellow illustration on a white background, in front of a banner of the Amsterdam Law School (photo by Weiwei Li)

On the morning of the 15th, I attended the pre-conference paper presentation sessions, where I encountered some intriguing work-in-progress research on the interpretation and enforcement of the DSA.

During the first session, researchers articulated their interpretations and expectations of how to better implement the data access right for researchers (Article 40 of the DSA). Discussions revolved around the clarity Digital Services Coordinators need to handle data access requests, given the legal protections and operational challenges associated with the information. Participants also discussed a possible framework for satisfying researchers’ data access needs, the structure for compliance monitoring, vetting processes, and the challenges of balancing data access with user privacy. A group of researchers shared their experience of using machine learning models to analyse the opinions of different stakeholders on DSA compliance. Further discussion focused on building a sustainable feedback mechanism, one in which research findings inform risk assessments, whose outcomes are in turn integrated into regulatory compliance and systemic risk mitigation efforts.

The second session centred on the relationship between the DSA and fundamental and constitutional rights. The first researcher framed the DSA as part of a regulatory paradigm shift in the EU towards risk-based regulation, a response to the complexities and uncertainties of the digital space, and discussed the need to operationalise and balance fundamental rights within the DSA framework. Another researcher addressed the vague notion of ‘illegal content’ in the DSA and its potential to complicate the applicability of national laws across EU member states, or to foster an over-reliance on platforms’ terms of service to define the term. They questioned whether the DSA should offer clear guidance on applying conflict-of-law rules effectively.

The third session offered perspectives on the challenges to meaningful transparency and accountability in platform governance. I took particular interest in the discussion of the interpretation and applicability of Article 25, which deals with online interface design and organisation. Some examined the definition of ‘platform’ for the purposes of Article 25, its practical applicability, and its impact on user protection. Risks associated with algorithmic content moderation were scrutinised, with a focus on how over-reliance on algorithms could diminish the quality and diversity of public discourse. Other talks covered the transparency of recommender systems, the effectiveness of the control tools platforms provide, and their compliance with Article 27(3), which requires that such functionality be directly and easily accessible.

During the last paper session, discussions continued on various DSA-related topics, with an emphasis on data access, algorithmic accountability, and the role of trusted flaggers. Researchers proposed collaboration between civil society and academia to enhance platform accountability through approaches such as adversarial analysis. One notable case study involved a tool developed for adversarial data collection, which could support data access under the DSA and verify the claims platforms make in their risk assessments. The conversation about trusted flaggers opened questions about how the DSA has altered the traditional relationship between these entities and social media platforms, and about the potential political consequences of the decisions trusted flaggers make. Concerns were raised about the transparency and efficacy of the trusted flagger system under the DSA, suggesting the need for safeguards to ensure accountability and proportionality in flagging.

As the conference officially began, the plenary session addressed content moderation themes. Some researchers delved into the infrastructure layer and presented a quantitative analysis of content moderation reporting via the DSA Transparency Database. Others adopted a more theoretical stance, examining online visibility through political economy and market-justice theories, or exploring how the DSA’s systemic risk assessment design could foster a virtuous cycle to counter the societal harms of content moderation. The potential for collaborative research and policy development around the DSA was acknowledged, as was the DSA’s role in establishing a research field and focus within the academic community. However, challenges in its implementation and its effectiveness in safeguarding EU citizens’ rights were also highlighted.

The conference’s second day broadened the scope, with a keynote speech illuminating the European Commission’s perspective on the DSA’s design. The keynote emphasised the critical role of the research community in supporting the DSA’s implementation through evidence-based studies of platform behaviour. During panel discussions, scholars considered the DSA’s global impact, with researchers from the US offering their views on the EU’s legislative strategy, which sparked deeper discussion of how effectively the DSA is being implemented. Subsequent sessions examined practical approaches to enforcing the DSA, especially from national regulators’ viewpoints.

As the DSA is a focal point of my PhD research on dark-pattern regulation in video games, the conference was particularly enlightening. It gave me access to cutting-edge academic discourse on the Digital Services Act, helped clarify the most contentious DSA provisions, and prompted me to consider additional aspects of my research question: whether the DSA can help address the issues at the core of my work. Most significantly, I had the opportunity to converse with leading academics in my research area and received many insightful comments.