What DSA Art. 17 requires
DSA Art. 17(3) prescribes a clearly defined minimum content for the statement of reasons: the facts and circumstances on which the decision is based, the specific ToS clause or legal ground relied on, information on whether automated means were used in taking the decision, and clear references to internal complaint-handling systems and out-of-court dispute settlement bodies.
These requirements are not soft law. The EU legislature formulated them deliberately strictly: a platform that speaks only generically of a 'Community Standards violation' does not satisfy them, and such a statement is open to challenge.
Importantly, the duty applies to every form of 'restriction': account suspension, demonetisation, reach throttling, removal of individual items. Shadow bans are covered too, provided they have a tangible effect on the user.
What does not suffice
'Community Standards violation' without naming the specific clause: not enough.
'Cheating suspicion' without facts or specific gameplay reference: not enough.
'Repeated past violations' without listing individual incidents and dates: not enough.
The standard is clear: the statement of reasons must be specific enough that you, as the affected user, can lodge a meaningful complaint. If you cannot tell what exactly you are alleged to have done, the required substance is missing.
Where we apply the lever
If the statement of reasons falls short of these requirements, the underlying measure is formally unlawful. That holds regardless of whether the substantive allegation is true: a cheater does not need to be exonerated, but the suspension must be lifted because of the procedural defect.
Practically: we reject the suspension for breach of DSA Art. 17, demand a proper statement with a deadline, and can escalate to an out-of-court dispute body (DSA Art. 21) or directly to court if the platform stalls.
Experience shows that even a precise lawyer's letter pointing out the procedural defect leads to reactivation in many cases; platforms know that procedural escalation costs them more than reinstatement does.
Special case: statement of reasons for automated decisions
If the suspension was taken by purely automated means (e.g. anti-cheat software or a trust-and-safety AI), the statement of reasons must disclose this explicitly. That is not a mere formality: it opens an additional lever under GDPR Art. 22.
GDPR Art. 22 gives data subjects the right not to be subject to decisions based solely on automated processing that produce legal or similarly significant effects. Following the ECJ's SCHUFA judgment (C-634/21), the threshold is low; a permanent account suspension regularly crosses it.
So where the decision was automated and no meaningful human oversight took place, both DSA Art. 17 and GDPR Art. 22 are breached: a double lever, and materially better damages prospects.
Author
Dr. Nikolas Hartmann
Managing Partner · Rechtsanwaltskammer Berlin
Focus: Digital Services Act, platform law, IT contract law, digital compliance.
Affected yourself?
We'll review your case within 24 hours - free of charge and without obligation.