The Group's Managing the Impact of Social Media on Young People’s Mental Health and Wellbeing report offers 'key findings' -
• Social media can have a range of positive effects: providing a platform for self-expression, enhancing social connections, and supporting learning.
• Young people using social media to find support for mental health conditions are at high risk of unintentional exposure to graphic content, and online discourse could unhelpfully “glamorise” mental illness and deter young people from accessing professional help.
• While 12% of children who spend no time on social networking websites have symptoms of mental ill health, the figure rises to 27% for those who are on the sites for three or more hours a day.
• Almost two-thirds (63%) of young people reported social media was a good source of health information.
• Pressure to conform to beauty standards perpetuated and praised online can encourage harmful behaviours to achieve “results”, including disordered eating and body shame.
• 46% of girls, compared to 38% of all young people, reported that social media had a negative impact on their self-esteem.
The consequent 'Calls To Action' are -
That the UK and Devolved Governments
• Establish a duty of care on all social media companies with registered UK users aged 24 and under in the form of a statutory code of conduct, with Ofcom to act as regulator.
• Create a Social Media Health Alliance, funded by a 0.5% levy on the profits of social media companies, to review the growing evidence base on the impact of social media on health and wellbeing and establish clearer guidance for the public.
• Publish evidence-based guidance for those aged 24 and younger on avoiding excessive social media use, that is, use of “websites and applications that enable users to create and share content or to participate in social networking”.
• Urgently commission robust longitudinal research into the extent to which the impact of social media on young people’s mental health and wellbeing is one of cause or correlation, and into whether the “addictive” nature of social media is sufficient for official disease classification.
The Group proposes a statutory duty of care, commenting
The APPG commends actions being taken by industry to help protect children and young people online. However, polling commissioned by RSPH in April 2018 on behalf of the APPG found that more than half of the public (52%) feel not enough is being done by social media companies to address their potential impact on mental health and wellbeing, while 80% of respondents advocated tighter regulation of social media companies.
In April 2018, in response to the Internet Safety Strategy consultation, the Government outlined that it would collaborate with industry, charities and the public on a White Paper which would provide legislation to “cover the full range of online harms, including harmful and illegal content”.
Throughout this Inquiry there has been considerable evidence provided supporting the case that social media companies have a duty of care to protect their users. A statutory duty of care would provide a robust, flexible legal framework within which the Government could require the implementation of a social media code of conduct for providers, which specifically includes measures to protect the mental health and wellbeing of users.
i) A statutory duty of care
The concept of a statutory duty of care to apply to people and companies has been defined by William Perrin and Professor Lorna Woods in their work for Carnegie UK Trust and summarised by the Science and Technology Committee in their report on the impact of social media and screen use on young people’s health, as a requirement to: “Take care in relation to a particular activity as it affects particular people or things. If that person does not take care, and someone comes to a harm identified in the relevant regime as a result, there are legal consequences, primarily through a regulatory scheme but also with the option of personal legal redress.”
William Perrin, Trustee of Carnegie UK Trust, and Professor Lorna Woods from the University of Essex, provided oral evidence to the Inquiry informed by their ongoing work on a proposal for Internet Harm Reduction, which advocates that social media networks should be seen as a public place. On this analogy, when people visit social networks owned by companies, they should be protected. In their proposal, they outline a statutory duty of care model which would require social media and other internet platforms to take reasonable steps to prevent foreseeable harm from arising to users, whilst allowing a certain level of flexibility for social media platforms to take action appropriate to their respective services and the risks those services create. Best practice could be agreed by industry and formulated into codes of conduct, with the emphasis on appropriate and proportionate responses allowing space for innovation. If not enough progress is made by industry, appropriate action would be taken by an independent regulator, which could also have the role of approving industry-agreed codes of conduct.
As William Perrin discusses in his blog post, Reducing harm in social media through a duty of care, “An industry code of conduct has the benefit of guiding companies on where to focus and makes sure that Parliament’s priorities are not lost”.
ii) A statutory code of conduct
The harm reduction principles behind the duty of care were also endorsed by a number of charities when recommending the introduction of a code of conduct for social media companies. For example, Barnardo’s called for a statutory code of conduct for all social media sites and an independent watchdog to hold them to account, with powers to issue fines. In written evidence, Barnardo’s told the Inquiry, “If playgrounds need health and safety in place before children can use them, the online world should have the equivalent safeguards.”
Parentzone agreed, stating, “We believe that a service that is aware of a child experiencing harm in this country should be required, by law, to report that harm. A child enjoying an online service should reasonably expect the same level of legal protection as a child enjoying a game in public or private play park.”
The Inquiry also heard that it was important that any regulation ensured the positives of social media were protected. Social media is an integral part of young people’s lives, and overly restricting its use risks denying them its benefits. In written evidence the Corsham Charity Institute advised that “Self-regulation as a stand-alone solution, without any regulation from a higher body, is not a workable solution for the long term”. They advised that instead of prescriptive measures, the Government should focus on creating guidelines and ethical frameworks to support platforms in making decisions that benefit young people’s wellbeing. Rather than limiting young people’s freedom to use social media, they advised that any industry code of conduct should take specific care to encourage innovation and cooperation between companies to best promote young people’s wellbeing, and focus on protecting users’ privacy and ensuring platforms show a duty of care and remain transparent.
On the basis of evidence received throughout this Inquiry and the Internet Harm Reduction Proposal, and building upon the recommendations set out in the Government’s Internet Safety Green Paper and by the Science and Technology Committee, the APPG recommends that the Government, in its forthcoming White Paper, should introduce a statutory duty of care. This should include the definition of key harms for an independent regulator to focus on, supported by a code of conduct for all relevant service providers to address the defined harms. As a baseline, key harms would be identified in line with those set out in the Government’s Internet Safety Green Paper.
The code of conduct would set out an expectation that service providers will prevent reasonably foreseeable harms from occurring, and this will therefore require social media platforms to take action before activity reaches the level at which it would become a criminal offence. In agreement with the Science and Technology Committee’s recommendations, the APPG understands that it is essential such legislation is flexible “so that it can straightforwardly adapt and evolve as trends change and new technologies emerge”.
The APPG suggests that, along with those harms outlined in the Internet Safety Strategy Green Paper, the following harms set out in the Carnegie UK Trust proposal are reflected in the code of conduct:
• Harmful threats, including a statement of an intention to cause pain, injury, damage or other hostile action such as intimidation.
• Psychological harassment, including threats of a sexual nature, threats to kill, and racial or religious threats (hate crime).
• Hostility or prejudice based on a person’s race, religion, sexual orientation, disability or gender identity, as well as misogyny.
• Economic harm, including financial misconduct and intellectual property abuse.
• Emotional harm, including preventing emotional harm suffered by users from building up to the criminal threshold of a recognised psychiatric injury.
• Harm to young people such as bullying, aggression, hate, sexual harassment and communications, exposure to harmful or disturbing content, grooming and child abuse.
The code of conduct should also include specific protection against harms specifically to the mental health and wellbeing of young people using social media platforms including, but not limited to:
• Disordered eating;
• A lack of sleep;
• Over-dependence on social media.
Adolescence and early adulthood are a critical and potentially vulnerable time for social and emotional development. This, coupled with 91% of 16-24 year olds using the internet for social media, has led the APPG to recommend that this duty of care should apply to any social media site with registered UK users aged 24 years and under, regardless of the size or newness of the platform, so that all social media platforms take an appropriate level of care.
Furthermore, it is important when developing a code of conduct that vulnerable young people are adequately protected, including those who have experienced abuse, those on child protection plans and in acute or hospital settings, children and young people with disabilities, young carers, minority ethnic groups, lesbian, gay, bi-sexual, transgender and questioning (LGBTQ+) young people, and those with poor mental health.
This duty of care should also be extended as deemed appropriate by the regulator and the Social Media Health Alliance to protect all vulnerable social media users.
5.4 Formation of a new body, the Social Media Health Alliance, to fund research, educational initiatives and establish clearer guidance for the public
Prior to the implementation of statutory legislation, a Social Media Health Alliance would be established to work under the direction of Ofcom to advise on what harms are set out in this code of conduct. The objective of this Alliance would be to fund research and educational initiatives to address the harms associated with social media. The Alliance would be independent of industry, and would be independently constituted with representatives who have a shared interest in reducing the damage caused to young people’s mental health and wellbeing from social media, across England, Scotland, Northern Ireland and Wales.
The APPG believes that a Social Media Health Alliance would be well placed to regularly review evidence of the impact of social media on young people’s mental health and wellbeing. Based on the “polluter pays” principle, the Social Media Health Alliance would be funded by a compulsory 0.5% levy on the profits of social media companies.
iii) Ofcom to assume responsibility for regulation
On the basis of evidence reviewed, the APPG recommends that the Government resources Ofcom to assume responsibility for regulatory duties.
As summarised by Maeve Walsh, Carnegie UK Trust Associate, “The regulator would set out a harm reduction cycle involving civil society as well as companies at each consultative step. Companies would be required to measure and survey harm, produce plans to address these harms for public consultation and agreement with the regulator, then implement the plans. If the cycle does not reduce harms or the companies do not cooperate then sanctions could be deployed.”
The APPG recommends that a code of conduct, regulated by Ofcom, should take effect by 31 October 2019.