The variety of online content available to users has grown significantly over the last few years as platforms have multiplied and diversified. This has created a need for genuine and robust content moderation, ensuring that users fully understand the content with which they are interacting.
In this context, the approval of the Digital Services Act (DSA) will significantly improve how EU users interact with online platforms, ensuring safer mechanisms and clear guarantees across the entire EU digital space.
The DSA was approved by the European Council in October 2022 and was published in the Official Journal of the European Union on 27 October 2022. The regulation entered into force on 16 November 2022 and will apply from 17 February 2024. However, Articles 24(2), (3) and (6), 33(3) to (6), 37(7), 40(13) and 43, as well as Sections 4, 5 and 6 of Chapter IV, apply from 16 November 2022.
According to the European Commission’s official website, DSA “regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content”.
The main aim of the DSA is to improve users' online experience and the protection of their fundamental rights, to establish a strong transparency and accountability framework for online platforms, and to provide a single, consistent framework across the EU.
In short, the DSA will have an impact on every natural and legal person who operates an online platform and/or interacts with e-commerce services in the EU.
The DSA draws up a safer framework in which citizens can freely express their ideas, communicate, and shop online, reducing their exposure to illegal activities and dangerous goods and ensuring the protection of fundamental rights.
Under the Digital Services Act, every European citizen and online platform will experience a set of important changes in their online presence:
➡ Improved online experience for consumers
The DSA sets new rules requiring online marketplaces to identify their business users and to make clear who is offering a product or service and where it can be purchased. This will make it easier to track down rogue traders and to safeguard online consumers from dangerous and illegal goods.
As soon as they become aware that a product or service offered on their platform is illegal, online marketplaces will be required to notify the consumers concerned of a) the illegality, b) the identity of the trader, and c) any relevant means of redress.
To ensure that fewer non-compliant goods reach European consumers, online marketplaces will need to carry out regular random checks on the documentation of products sold on their platforms.
➡ New user rights
Users will be able to report illegal content, including products, that they come across, and to challenge decisions made by online platforms when their content is removed. Platforms are required to inform users of any decisions made and the reasoning behind them, and to provide a way to challenge those decisions.
➡ Greater accountability for very large platforms
Given their systemic influence in facilitating public discourse, commercial transactions, and the dissemination of information, opinions, and ideas, very large online platforms and very large online search engines – those reaching more than 45 million users in the EU – will be subject to specific obligations.
When such platforms recommend content, users will be able to change the criteria used and to opt out of receiving personalised recommendations.
➡ Increased advertising transparency
Online users will also be given more information about the ads they see on online platforms, such as whether or not an ad specifically targets them.
Platforms will not be allowed to show children ads based on behavioural targeting, nor to profile users on the basis of sensitive characteristics such as their racial or ethnic background, political views, or sexual orientation in order to show them ads.
➡ Clearer repercussions
If an intermediary service provider violates the DSA, users may seek compensation from those providers for any losses or damages they incur.
Of course, these benefits for users come with increased responsibilities and accountability for service providers, from freelancers and entrepreneurs to large corporations.
First, digital service providers will have to implement a fast and effective procedure for removing illegal online content, including products and services.
Furthermore, providers are required to process notices in a non-arbitrary and non-discriminatory manner that respects fundamental rights, including freedom of expression and data protection.
Online platforms will have to ensure that the products and services they provide are safe for consumers, that information is reliable, and that diligent efforts are made to prevent illegal content.
Platforms (aside from online platforms that qualify as micro or small enterprises under the Annex to Commission Recommendation 2003/361/EC) will be subject to a further set of due-diligence obligations.
The DSA also imposes strict penalties on service providers that fail to comply with the new regulations.
In cases of serious non-compliance and damages, immediate action has to be taken: platforms have to commit to addressing the situation, assessing the damage, and finding prompt solutions to compensate affected service recipients.
Fines and sanctions provided for by the DSA should be effective, proportionate, and dissuasive. Very large platforms and search engines (those with more than 45 million users) that fail to comply with the DSA's provisions face fines of up to 6% of their global revenue.
Each Member State will implement sanctions in its national legislation in accordance with the DSA Regulation.
In the event that essential obligations are not met, an online service may be temporarily suspended as a last resort.
The above does not represent legal advice or assistance in relation to the subject discussed, and it is only meant to be an informative piece or article. For further details on the above, we kindly ask you to contact us at office@hmpartners.ro