Sat, Apr 20, 2024
Dublin

EU Taking Control Online

More choice for users over what they see online

When it comes to policing online content and behaviour, MEPs say voluntary action by platforms is not enough. They want clear, EU-wide rules for content moderation, built around the so-called notice-and-action mechanism.

The EU is working on the Digital Services Act, expected in December 2020, which aims to shape the rapidly developing digital economy and set standards for the rest of the world.

One of the fundamental issues that MEPs want it to address is protecting users against harmful or illegal content.

See key points of interest below:


- Removing illegal content while safeguarding rights and freedoms

MEPs want clear, EU-wide rules for content moderation, applying the so-called notice-and-action mechanism.

The rules should ensure that the mechanism:

Is effective - users should be able to easily notify online intermediaries about potentially illegal online content so that it can be swiftly removed

Is not abused - if content is flagged or taken down, affected users should be notified and have the possibility to appeal the decision to a national dispute settlement body

Respects users' rights and freedoms - such as freedom of expression and information, so that online intermediaries remove illegal content in a diligent, proportionate and non-discriminatory manner and do not remove content that is not illegal

MEPs want the final decision on the legality of user-generated content to be taken by an independent judiciary, not private commercial entities.


- More choice for users over what they see online

MEPs want to give users more control over the content they see and the possibility to opt out of content curation altogether.

They are calling for stricter regulation of targeted advertising in favour of less intrusive, contextualised advertising that is based on what a user is viewing at a given moment, not on their browsing history.

Going further, they want the Commission to look into more options for regulating targeted advertising, including a possible ban.


- Clear distinction between illegal and harmful content

Parliament wants a clear distinction to be made between illegal and harmful content. Some types of content, for example Holocaust denial, may be illegal in some member states, but not in others.

Harmful content, such as hate speech and disinformation, is not always illegal. A strict distinction is needed, as the two types of content require different approaches: illegal content should be removed, while harmful content could be tackled in other ways.


- Ways to tackle harmful content

MEPs propose increasing transparency obligations for platforms as well as raising media literacy among users.

Parliament noted that one reason disinformation spreads so fast is that some platforms' business models favour showing sensational and clickable content to users to increase profits.

To tackle the negative effects of this practice, members want transparency on the monetisation policies of online platforms.


