Social media algorithms need overhaul in wake of Southport riots, Ofcom says

Social media algorithms must be adjusted to prevent misinformation from spreading, the chief executive of Ofcom has warned, responding to the rioting that broke out after the killing of three girls in Southport this summer.

Misinformation about the Southport killings proliferated despite tech firms and social media platforms’ “uneven” attempts to stop it, wrote the Ofcom chief executive, Melanie Dawes, in a letter to the secretary of state for science, innovation and technology, Peter Kyle.

“Posts about the Southport incident and subsequent events from high-profile accounts reached millions of users, demonstrating the role that virality and algorithmic recommendations can play in driving divisive narratives in a crisis period,” Dawes wrote.

Dame Melanie Dawes, chief executive of Ofcom.

“Most online services took rapid action in response to the situation, but responses were uneven.”

Dawes was responding to Kyle’s request for information about whether Ofcom would be targeting online misinformation in the next update of its illegal harms codes of practice, which social media companies will be required to comply with when the Online Safety Act comes into force.

Dawes said that current draft Ofcom proposals would require apps to change their algorithms so that content which is illegal or harmful, including abuse and incitement to violence or hate speech, is down-ranked for children’s accounts.

Some platforms told Ofcom that misinformation seeking to whip up hatred appeared online “almost immediately” after the attacks, with the result that platforms were “dealing with high volumes, reaching the tens of thousands of posts in some cases”, some of which came from accounts with more than 100,000 followers.

Some of these accounts “falsely stated that the attacker was a Muslim asylum seeker and shared unverified claims about his political and ideological views”, Dawes said, adding: “There was a clear connection between online activity and violent disorder seen on UK streets.”

Dawes also highlighted the role of “major” messaging services that hosted closed groups with thousands of members. In one example seen by Ofcom, calls for demonstrations targeting a local mosque circulated in private groups within two hours of the vigil for the victims of the attack, while other groups identified targets for damage or arson, such as asylum accommodation.

Her letter also sets out how Ofcom responded to the riots by reminding tech firms of their duty to protect users and making clear that they did not need to wait for the Online Safety Act to come into force to do so.

Although Ofcom was unable to investigate whether social media platforms’ responses were fit for purpose because the Online Safety Act has not yet come into force, it plans to take tougher enforcement action in future.

“These events have clearly highlighted questions tech firms will need to address as the duties come into force. While some told us they took action to limit the spread of illegal content, we have seen evidence that it nonetheless proliferated,” she said.

“On some platforms, false information regarding the identity of the attacker continued to spread in the three days it took for his real identity to be made public, even when there was evidence of intent to stir up racial and religious hatred.”

This has been borne out in the convictions that have followed, including of people found guilty of making online posts threatening death or serious harm, stirring up racial hatred, or sending false information with intent to cause harm, Dawes noted.

Tech companies’ responses to Southport-related disinformation included setting up monitoring groups to look for spikes in harmful content, taking down harmful material (including URLs leading to illegal and harmful content), and suspending or closing accounts and channels.

Once the Online Safety Act comes into force, Ofcom will expect platforms to make explicit how they protect users from illegal hateful content, to have processes that can swiftly take such content down (especially when it goes viral), and to have robust complaints mechanisms.

Ofcom said it would use the findings from the Southport case to identify gaps in the current legislation and guidance. It has already established that there is a need to strengthen requirements for social media platforms’ crisis response protocols.

Dawes added that the event further “highlight[s] the importance of promoting media literacy, to heighten public awareness and understanding of how people can protect themselves and others online”.
