16 March 2022

Social Media

Yesterday's report by the House of Representatives Select Committee on Social Media and Online Safety reflected the following terms of reference.

The Committee will inquire into: 

a) the range of online harms that may be faced by Australians on social media and other online platforms, including harmful content or harmful conduct; 

b) evidence of:
   i) the potential impacts of online harms on the mental health and wellbeing of Australians;
   ii) the extent to which algorithms used by social media platforms permit, increase or reduce online harms to Australians;
   iii) existing identity verification and age assurance policies and practices and the extent to which they are being enforced;

c) the effectiveness, take-up and impact of industry measures, including safety features, controls, protections and settings, to keep Australians, particularly children, safe online; 

d) the effectiveness and impact of industry measures to give parents the tools they need to make meaningful decisions to keep their children safe online; 

e) the transparency and accountability required of social media platforms and online technology companies regarding online harms experienced by their Australian users; 

f) the collection and use of relevant data by industry in a safe, private and secure manner; 

g) actions being pursued by the Government to keep Australians safe online; and 

h) any other related matter. 

The resultant Recommendations were:

R 1  The Committee recommends that the Australian Government propose the appointment of a House Standing Committee on Internet, Online Safety and Technological Matters, from the commencement of the next parliamentary term. 

R 2  The Committee recommends that, subject to Recommendation 1, the Australian Government propose an inquiry into the role of social media in relation to democratic health and social cohesion, to be referred to the aforementioned committee or a related parliamentary committee. 

R 3  The Committee recommends that the eSafety Commissioner undertakes research focusing on how broader cultural change can be achieved in online settings. 

R 4  Subject to the findings in Recommendation 3, the Committee recommends that the Australian Government establishes an educational and awareness campaign targeted at all Australians, focusing on digital citizenship, civics and respectful online interaction. 

R 5  The Committee recommends that the eSafety Commissioner examine the extent to which social media companies actively prevent:
§ recidivism of bad actors,
§ pile-ons or volumetric attacks, and
§ harms across multiple platforms.
The eSafety Commissioner should then provide the Australian Government with options for a regulatory framework, including penalties for repeated failures. 

R 6  The Committee recommends that the Office of the eSafety Commissioner be provided with adequate appropriations to establish and manage an online single point of entry service for victims of online abuse to report complaints and be directed to the most appropriate reporting venue, dependent on whether their complaints meet the requisite threshold, and in consideration of a variety of audiences such as children, parents/carers, women, people from culturally and linguistically diverse backgrounds, and other relevant vulnerable groups. 

R 7  The Committee recommends that the Australian Government refer to the proposed House Standing Committee on Internet, Online Safety and Technological Matters, or another committee with relevant focus and expertise, an inquiry into technology-facilitated abuse, with terms of reference including:
§ The nature and prevalence of technology-facilitated abuse;
§ Responses from digital platforms and online entities in addressing technology-facilitated abuse, including how platforms can increase the safety of their users; and
§ How technology-facilitated abuse is regulated at law, including potential models for reform. 

R 8  The Committee recommends that the Australian Government significantly increase funding to support victims of technology-facilitated abuse, through existing Australian Government-funded programs. This should include additional funding for specialised counselling and support services for victims; and be incorporated in the next National Action Plan to End Violence Against Women and Children 2022-2032. 

R 9  The Committee recommends that future reviews of the operation of the Online Safety Act 2021 take into consideration the implementation of the Safety by Design Principles on major digital platforms, including social media services and long-standing platforms which require retrospective application of the Safety by Design Principles. 

R 10  The Committee recommends that the Department of Infrastructure, Transport, Regional Development and Communications, in conjunction with the eSafety Commissioner and the Department of Home Affairs, examine the need for potential regulation of end-to-end encryption technology in the context of harm prevention. 

R 11  The Committee recommends that the eSafety Commissioner, as part of the drafting of new industry codes and implementation of the Basic Online Safety Expectations:
§ Examine the extent to which social media services adequately enforce their terms of service and community standards policies, including the efficacy and adequacy of actions against users who breach terms of service or community standards policies;
§ Examine the potential of implementing a requirement for social media services to effectively enforce their terms of service and community standards policies (including clear penalties or repercussions for breaches) as part of legislative frameworks governing social media platforms, with penalties for non-compliance; and
§ Examine whether volumetric attacks may be mitigated by requiring social media platforms to maintain policies that prevent this type of abuse and that require platforms to report to the eSafety Commissioner on their operation. 

R 12  The Committee recommends that the eSafety Commissioner examine the extent to which social media companies actively apply different standards to victims of abuse depending on whether the victim is a public figure or requires a social media presence in the course of their employment, and provide options for a regulatory solution that could include additions to the Basic Online Safety Expectations. 

R 13  The Committee recommends that the eSafety Commissioner, in conjunction with the Department of Infrastructure, Transport, Regional Development and Communications and the Department of Home Affairs and other technical experts as necessary, conduct a review of the use of algorithms in digital platforms, examining:
§ How algorithms operate on a variety of digital platforms and services;
§ The types of harm and scale of harm that can be caused as a result of algorithm use;
§ The transparency levels of platforms’ content algorithms;
§ The form that regulation should take (if any); and
§ A roadmap for Australian Government entities to build skills, expertise and methods for the next generation of technological regulation in order to develop a blueprint for the regulation of Artificial Intelligence and algorithms in relation to user and online safety, including an assessment of current capacities and resources. 

R 14  The Committee recommends that the eSafety Commissioner require social media and other digital platforms to report on the use of algorithms, detailing evidence of harm reduction tools and techniques to address online harm caused by algorithms. This could be achieved through the mechanisms provided by the Basic Online Safety Expectations framework and Safety By Design assessment tools, with the report being provided to the Australian Government to assist with further public policy formulation. 

R 15  The Committee recommends that, subject to Recommendation 19, the proposed Digital Safety Review make recommendations to the Australian Government on potential proposals for mandating platform transparency. 

R 16  The Committee recommends the implementation of a mandatory requirement for all digital services with a social networking component to set default privacy and safety settings at their highest form for all users under 18 (eighteen) years of age. 

R 17  The Committee recommends the implementation of a mandatory requirement for all technology manufacturers and providers to ensure all digital devices sold contain optional parental control functionalities. 

R 18  The Committee recommends that the Department of Infrastructure, Transport, Regional Development and Communications conduct a Digital Safety Review on the legislative framework and regulation in relation to the digital industry. The Digital Safety Review should commence no later than 18 months after the commencement of the Online Safety Act 2021, and provide its findings to Parliament within twelve (12) months. 

R 19  The Committee recommends that, subject to Recommendation 18, the Digital Review examine the need and possible models for a single regulatory framework under the Online Safety Act, to simplify regulatory arrangements. 

R 20  The Committee recommends that the Digital Review include in its terms of reference:
§ The need to strengthen the Basic Online Safety Expectations to incorporate and formalise a statutory duty of care towards users;
§ The scope and nature of such a duty of care framework, including potential models of implementation and operation;
§ Potential methods of enforcement to ensure compliance, including penalties for non-compliance; and
§ The incorporation of the best interests of the child principle as an enforceable obligation on social media and other digital platforms, including potential reporting mechanisms. 

R 21  The Committee recommends that the eSafety Commissioner:
§ Increase the reach of educational programs geared at young people regarding online harms, with a particular focus on reporting mechanisms and the nature of some online harms being a criminal offence;
§ Formalise a consultation and engagement model with young people through the Australian Government’s Youth Advisory Council in regards to educational themes and program delivery; and
§ Report to the Parliament on the operation and outcomes of the program, including research identifying whether this has resulted in a reduction in online harm for young people. 

R 22  The Committee recommends that the eSafety Commissioner work in consultation with the Department of Education, Skills and Employment to design and implement a national strategy on online safety education designed for early childhood and primary school-aged children, and secondary school-aged young people, including:
§ A proposed curriculum, informed by developmental stages and other relevant factors;
§ Potential methods of rollout, including consultation and engagement with children, young people, child development and psychology experts, digital education experts and other specialists in online harm; and
§ A roadmap provided to parents of these age groups detailing methods of addressing online harm. 

R 23  The Committee recommends that the eSafety Commissioner design and administer an education and awareness campaign aimed at adults, particularly in relation to vulnerable groups such as women, migrant and refugee groups, and people with disabilities, with a focus on the eSafety Commissioner’s powers to remove harmful content and the mechanisms through which people can report harmful content and online abuse. 

R 24  The Committee recommends that the Australian Government work with states and territories to ensure that relevant law enforcement agencies are appropriately trained on how to support victims of online harm. This should include trauma-informed approaches as well as a comprehensive understanding of police powers and other relevant avenues, such as the relevant powers of the eSafety Commissioner. 

R 25  The Committee recommends that the Australian Government review funding to the eSafety Commissioner within twelve (12) months to ensure that any of the Committee’s recommendations that are agreed to by the Government and implemented by the Office of the eSafety Commissioner are adequately and appropriately funded for any increased resource requirements. 

R 26  The Committee recommends that the Online Safety Youth Advisory Council, via the eSafety Commissioner, provide a response to this report and its recommendations within six (6) months of its establishment and full membership.