Father of 14-year-old schoolgirl who took her own life says social media firms treat removing harmful content as an ‘afterthought’ and campaigners have ‘frustratingly limited success’ when asking tech giants to act
- Ian Russell criticised social media giants over their response to harmful content
- Online safety campaigner said there is ‘limited success’ in removing content
- Mr Russell said that the ‘corporate culture at these platforms needs to change’
The father of a 14-year-old schoolgirl who took her own life has accused social media giants of treating the removal of harmful content from their platforms as an ‘afterthought’.
Ian Russell told MPs that campaigners had ‘frustratingly limited success’ when asking firms to take down content.
The online safety campaigner said tech companies only seemed to take action when ‘news stories break’ or when the Government changes regulations.
Mr Russell said the ‘corporate culture’ at the platforms must change so they respond to harmful content in a ‘proactive’ rather than a ‘reactive’ manner.
Mr Russell’s daughter, Molly, took her own life in 2017 after viewing thousands of online posts about suicide and self-harm.
The Government has faced intense pressure in recent years to bring forward legislation to tighten the regulation of social media companies.
It is now pushing ahead with its Online Safety Bill which would introduce a new duty of care for online platforms and place them within the scope of Ofcom in its new role as an online regulator.
Mr Russell today gave evidence to MPs on the Draft Online Safety Bill Joint Committee as they scrutinise whether the Government’s proposals go far enough.
Mr Russell was asked this afternoon to detail his assessment of how difficult it is to persuade social media firms to take harmful content down.
He said: ‘It is our experience that there is frustratingly limited success when harmful content is requested to be removed by the platforms, particularly in terms of self-harm and suicidal content, and this is particularly stressful for families and friends who are bereaved by suicide.
‘It seems only when either news stories break in a particularly public way or when perhaps regulations change that the platforms respond… so it has become our view, and increasingly so, that the corporate culture at these platforms needs to change.
‘They need to be proactive rather than reactive and they after all have the resources and the skills to do this.
‘But it is so often done as an afterthought and they should live up to their words about taking online safety seriously and wanting to make their platforms safer.’