Michael O’Flaherty: Social media platforms ‘must not retreat from facts’
Social media platforms like Facebook and Instagram “must not retreat from facts”, the Council of Europe’s commissioner for human rights, Michael O’Flaherty, has said.
Meta, which owns both platforms, this week announced plans to scrap its independent fact-checking programme in favour of a ‘community notes’ system similar to that used on X, formerly Twitter.
The company has also relaxed its previous policies on hate speech and said it would start making political content more prominent on users’ feeds — moves which critics say are intended to curry favour with incoming US president Donald Trump.
Mr O’Flaherty said the measures taken by Meta, which he compared to those taken by X under Elon Musk’s ownership, “may have adverse implications for human rights”.
Platforms which “retreat from facts… create a vacuum where disinformation thrives unchecked and the harm to democracy is deep”, he said in a statement yesterday.
Both the Irish government and the European Commission have faced calls to regulate social media platforms in order to protect users and democratic principles alike.
“At the heart of this controversy lies a fundamental tension: how to curb the spread of harmful speech while safeguarding freedom of expression and protecting human rights for all,” Mr O’Flaherty said.
“This challenge is not new, but it has taken on greater urgency in current times, where harmful speech can spread faster than corrections, and content-shaping algorithms often amplify the most polarising messages. Sometimes such harmful speech comes from state actors or personalities close to them, making the risks to democracy even bigger.
“It is important to stress that combating falsehoods and preventing the spread of hateful or violent messages is not censorship. It is a commitment to protecting human rights.”
He added: “I urge Council of Europe member states to redouble their efforts and demonstrate principled leadership in enforcing these legal standards by ensuring that internet intermediaries mitigate the systemic risks of disinformation and unchecked speech.
“This includes requiring greater transparency in content moderation practices including in the deployment of algorithmic systems.
“At the same time, state measures must remain grounded in international human rights norms to prevent overreach that could stifle legitimate expression. Indeed, transparency and accountability are the antidotes to both disinformation and overreach.
“The goal is to protect human rights for all by striking a balance that upholds freedom of expression within its well-established limitations. As debates on content moderation continue, state actors, platforms and civil society should work genuinely together to uphold human rights and democratic principles.”