18 March 2022

Scams

The Australian Competition & Consumer Commission (ACCC) has instituted Federal Court proceedings against Meta Platforms, Inc. and Meta Platforms Ireland Limited, alleging that the companies behind Facebook engaged in false, misleading or deceptive conduct by publishing scam advertisements featuring prominent Australian public figures, in breach of the Australian Consumer Law (ACL) or the Australian Securities and Investments Commission Act (ASIC Act). Meta is also alleged to have aided and abetted, or been knowingly concerned in, false or misleading conduct and representations by the advertisers. 

The ACCC is seeking declarations, injunctions, penalties, costs and other orders.

The ACCC alleges that the ads, which promoted investment in cryptocurrency or money-making schemes, were likely to mislead Facebook users into believing the advertised schemes were associated with well-known people featured in the ads, such as businessman Dick Smith, TV presenter David Koch and former NSW Premier Mike Baird. 

The ACCC states that the schemes were scams, with the people featured in the ads having neither approved nor endorsed them. The ads contained links that took Facebook users to a fake media article including quotes, attributed to the public figure featured in the ad, endorsing a cryptocurrency or money-making scheme. Users were then invited to sign up and were subsequently contacted by scammers who used high-pressure tactics, such as repeated phone calls, to convince users to deposit funds into the fake schemes. 

The ACCC does not appear to have commented on the presence of the same ads on other digital platforms, and indeed in online 'mainstream media'. 

 ACCC Chair Rod Sims comments 

The essence of our case is that Meta is responsible for these ads that it publishes on its platform. It is a key part of Meta’s business to enable advertisers to target users who are most likely to click on the link in an ad to visit the ad’s landing page, using Facebook algorithms. Those visits to landing pages from ads generate substantial revenue for Facebook. 

We allege that the technology of Meta enabled these ads to be targeted to users most likely to engage with the ads, that Meta assured its users it would detect and prevent spam and promote safety on Facebook, but it failed to prevent the publication of other similar celebrity endorsement cryptocurrency scam ads on its pages or warn users. 

Meta should have been doing more to detect and then remove false or misleading ads on Facebook, to prevent consumers from falling victim to ruthless scammers.

The ACCC alleges that Meta was aware that the celebrity endorsement cryptocurrency scam ads were being displayed on Facebook but did not take sufficient steps to address the issue. The celebrity endorsement cryptocurrency scam ads were still being displayed on Facebook even after public figures around the world had complained that their names and images had been used in similar ads without their consent. 

The ACCC notes

Apart from resulting in untold losses to consumers, these ads also damage the reputation of the public figures falsely associated with the ads. Meta failed to take sufficient steps to stop fake ads featuring public figures, even after those public figures reported to Meta that their name and image were being featured in celebrity endorsement cryptocurrency scam ads. 

UK SLAPP consultation

The UK Ministry of Justice has released an urgent Call for Evidence on Strategic Lawsuits Against Public Participation, ie SLAPPs, characterised as 

an abuse of the legal process, where the primary objective is to harass, intimidate and financially and psychologically exhaust one’s opponent via improper means. These actions are typically initiated by reputation management firms and framed as defamation or privacy cases brought by individuals or corporations to evade scrutiny in the public interest. 

They are claims brought by extremely wealthy individuals and corporations. The invasion of Ukraine has heightened concerns about SLAPPs, as we have clearly seen that aggression is closely associated with clamping down on free speech and reporting of events. We need to isolate these cases in devising counter-measures, so that while we prevent our justice system being abused we do not curb access to justice in legitimate cases. In responding to SLAPPs, we need to fully understand the breadth of litigation and range of misconduct involved. A Call for Evidence will enable us to establish a number of things. 

Firstly, we want to hear at first hand from parties who have been involved in SLAPPs – their experiences and the impact on them personally and professionally. Secondly, we are conscious that high profile cases are likely to represent the tip of this iceberg, in two important respects. One is the number of pre-action letters that are issued in cases that never reach court as they result in a settlement or other form of agreement. The other is the chilling effect of SLAPPs – the perfectly appropriate news investigations that may be curtailed or not even started because of the fear of the risk of incurring the crippling expense of High Court litigation. 

 The Call states 

The term SLAPPs is commonly used to describe activity that aims to discourage public criticism through an improper use of the legal system. 

SLAPPs have two key features: 

• They target acts of public participation. Public participation can include academic research, journalism and whistle-blowing activity concerned with matters of societal importance, such as illicit finance or corruption. 

• They aim to prevent information in the public interest from being published. This can be by threatening or bringing proceedings which often feature excessive claims. 

Individuals or organisations wishing to prevent information reaching the public eye engage reputation management firms or legal professionals to help them do so. This will often result in communications to the targeted individuals or organisations which threaten litigation, though the desired outcome is to prevent further investigations from taking place. Occasionally SLAPPs serve to divert attention from legitimate enquiries, by commencing action on spurious points such that the target’s resources are consumed and taken away from their initial focus. SLAPPs are often framed as legal cases, but they represent an abuse of law and procedure as their principal objective is stifling public debate, rather than the pursuit of a legal remedy. SLAPPs are frequently threatened or brought in defamation law, though increasingly data protection and privacy law is being misused against free speech within the law. 

Why are we looking at this issue? 

The Government is concerned that SLAPPs threaten free speech within the law and the rule of law, which are fundamental parts of our democratic tradition. Public watchdogs, including the press and public officials, are vital in ensuring accountability and transparency in our legal system. We are aware that SLAPPs interfere with parliamentary affairs: reports suggest parliamentary clerks have been subject to SLAPPs such that their constitutional duties are impeded. 

SLAPPs are often brought by powerful entities whose resources vastly exceed those whom they seek to silence, resulting in public interest reporting being withdrawn pre-emptively to avoid expensive confrontation. This means a single successful SLAPP can have far-reaching consequences, in effect censoring others who fear similar tactics. 

Provisional data from the Coalition Against SLAPPs in Europe (CASE) estimates there were 14 SLAPPs cases in the UK in 2021, an increase on the two cases in both 2020 and 2019 and one case in 2018. Whilst this may appear to be a small number of cases, we are issuing this Call for Evidence to uncover information about cases which might have gone unrecorded. We believe there will be many, as well as cases which never reached court because the respondent was intimidated into settling, which are likely to far exceed the number of cases which reach court. The think tank Foreign Policy Centre found in its 2020 survey of 63 investigative journalists working globally on corruption that civil legal cases, including cease and desist letters, surveillance, interrogation by authorities and smear campaigns, were experienced by more than 50% of respondents. 73% of those receiving threats had been threatened with legal action. 61% of respondents also reported that their investigations had uncovered a link (directly or indirectly) with UK financial and legal jurisdictions.  

The Government is supportive of media freedom here and abroad. We have taken action to protect the press through the National Action Plan on the Safety of Journalists led by the Department for Digital, Culture, Media and Sport and the Home Office, which provides measures to counter threats to journalists’ physical safety. 

The Foreign, Commonwealth and Development Office lead on the Government’s participation in and support of the Media Freedom Coalition, a partnership of countries working together committed to media freedom and safety of journalists and to hold to account those who would harm journalists for doing their job. Members of the Coalition have signed the Global Pledge on Media Freedom, a written commitment to improving media freedom domestically and working together internationally. 

Whilst SLAPPs are typically designed to intimidate opponents psychologically, there is evidence suggesting that these threats can escalate into physical harm. Tragic cases overseas, such as the murder of Daphne Caruana Galizia who reportedly faced over forty SLAPPs cases at the time of her death, illustrate how public interest investigative reporting can attract intimidation by lawsuit and, separately, risk to physical safety. In the first instance this Call for Evidence focuses on establishing evidence about the use of SLAPPs in England and Wales, before focusing on reforms within defamation law, which to date has been the primary vehicle for SLAPPs cases. We welcome broader suggestions on how to address SLAPPs to inform Government action to curb this abuse of law.

The Call centres on a SLAPPs Questionnaire -

Impact on SLAPPs recipients 

Question 1: Have you been affected personally or in the conduct of your work by SLAPPs? If so, please provide details on your occupation and the impact SLAPPs had, if any, on your day to day activity including your work and wellbeing. 

Question 2: If you have been affected by SLAPPs, please provide details on who issued the SLAPP (for example, a legal or public relations professional), the form (for example, an email or letter) and the content. Was legal action mentioned? If yes, please provide details on the type of action. 

Question 3: If you have been subject to a SLAPP action how did it proceed? For example, a pre-action letter or a formal court claim resulting in a hearing. Did you settle the claim and what was the outcome of the matter? 

Question 4: If you are a member of the press affected by SLAPPs, has this affected your editorial or reporting focus? Please explain if it did or did not do so, including your reasons. 

Question 5: If you have been affected by SLAPPs, did you report this to anyone? Please explain if you did or did not do so, including your reasons. What was the outcome? 

Question 6: If you have been affected by SLAPPs, please provide details on the work you were undertaking at the time, including the subject matter referred to by SLAPPs. 

Legislative reforms

Statutory definition for SLAPPs 

Question 7: Do you agree that there needs to be a statutory definition of SLAPPs? 

Question 8: What approach do you think should be taken to defining SLAPPs? For example, should it be to establish a new right of public participation? What form should that take? 

Question 9: If a new right of public participation were introduced, should it form an amendment to the Defamation Act 2013, or should it be a free-standing measure, recognising that SLAPP cases are sometimes brought outside of defamation law? 

Question 10: Do you think the approach should be a definition based on various criteria associated with SLAPPs and the methods employed? 

Question 11: Are there any international models of SLAPP legislation which you consider we should draw on, or any you consider have failed to deal effectively with SLAPPs? Please give details. 

Question 12: Would you draw any distinction in the treatment of individuals and corporations as claimants in drawing up definitions for SLAPP type litigation? 

Reforms stemming from there being a defined cohort of SLAPPs cases 

Question 13: Which other reform options for tackling SLAPPs would you place on a statutory footing? Please give reasons. 

Question 14: Are there additional reforms you would pursue through legislation? Please give reasons. 

Defamation (libel) laws 

The Serious Harm Defence 

Question 15: Does the serious harm test in defamation cases have any effect on SLAPPs claims? 

Question 16: Are there any reforms to the serious harm test that could be considered in SLAPPs cases? 

The defence of Truth 

Question 17: Does the truth defence in defamation cases have any effect on SLAPPs claims? 

Question 18: Are there any reforms to the defence of truth that could be considered in SLAPPs cases? For example, should we reverse the burden of proof in SLAPPs cases, so that claimants have to demonstrate why a statement is not true? 

The defence of Honest Opinion 

Question 19: Does the honest opinion defence in defamation cases have any effect on SLAPPs claims? 

Question 20: Are there any reforms to the honest opinion defence that could be considered in SLAPPs cases? 

The defence of Public Interest 

Question 21: How far does the public interest defence in defamation cases provide a robust enough defence in SLAPPs claims? 

Question 22: Are there any reforms to the public interest defence that could be considered in SLAPPs cases? 

Reports protected by Privilege 

Question 23: Does the privilege defence in defamation cases have any effect on SLAPPs claims? 

Question 24: Are there any reforms to the privilege defence that could be considered in SLAPPs cases? 

Question 25: Do you have any views on whether qualified privilege should be extended in relation to reporting of Parliamentary debate of SLAPPs? 

Libel Tourism 

Question 26: To what extent does the appropriate jurisdiction test assist as a defence to defamation in SLAPPs claims? 

Question 27: Are there any reforms to the appropriate jurisdiction test that could be considered in SLAPPs cases? 

Other Possible Defamation reforms on SLAPPs 

Question 28: Do you consider that the Government should consider reforming the law on actual malice to raise the threshold for defamatory statements made against SLAPP claimants? Please give reasons. 

Question 29: If you agree the Government should pursue actual malice reforms, what form should these take? 

Other Possible Reforms 

Question 30: Are there any other areas of defamation law that you consider may be reformed to address the problems SLAPPs cases give rise to? 

Procedural reforms 

Pre-Action Protocols 

Question 31: Do you have any views or experience on how the Pre-Action Protocol for Media and Communications operates in SLAPPs cases? If so, to what extent does it help to regulate the conduct of SLAPPs claims? Please explain your response. 

Question 32: Do you have any views or suggestions on amendments to Pre-Action Protocols which would improve upon existing pre-action conduct in SLAPP cases? Please explain your response. 

Strike-Outs 

Question 33: To what extent do you consider that SLAPP type litigation represents an abuse of process, and should be considered by courts for strike-out action? 

Question 34: How would you propose to reform or strengthen the use of strike-out in addressing SLAPP type litigation? 

Civil Restraint Orders 

Question 35: Are Civil Restraint Orders currently an effective procedure against SLAPPs litigants? If not, what reforms do you propose? 

Question 36: Should the court consider anything beyond the current issues of number of applications and merits of a case when considering whether to issue a CRO? 

Other procedural reforms 

Question 37: Do you have any other suggestions for procedural reform to be pursued either by the Government or considered by the judiciary or Civil Procedure Rule Committee in relation to SLAPPs cases? Should a permission stage be applied to SLAPPs cases? 

Regulatory reforms 

Solicitors Regulation Authority Guidance on SLAPPs 

Question 38: If you are a solicitor, does the SRA guidance provided on SLAPPs help you understand your professional duties in conducting disputes? Please explain your answer. 

Reporting SLAPPs 

Question 39: If you have been affected by SLAPPs, did you report the issue to a professional regulator? Please explain and give reasons for your decision. If you did so, what was the outcome? 

Defamation costs reforms 

Question 40: How was your SLAPP funded (private funding, CFA, other (please specify))? 

Question 41: How were adverse costs addressed (private funding, ATE, other (please specify))? 

Question 42: Please give details of the costs of the case, broken down (i) by stage and (ii) by which party had to pay them. 

Question 43: Do you agree that a formal costs protection regime (based on the ECPR) should be introduced for (i) all defamation cases, or (ii) SLAPPs cases only – please give reasons? 

Question 44: If so, what should the default levels of costs caps be for (i) all defamation cases, or (ii) SLAPPs cases only – please give reasons? 

Question 45: Do you have any other suggestions as to how costs could be reformed in (i) all defamation cases, or (ii) SLAPPs cases only – please give reasons?

16 March 2022

Social Media

Yesterday's report by the House of Representatives Select Committee on Social Media and Online Safety reflected the following terms of reference.

The Committee will inquire into: 

a) the range of online harms that may be faced by Australians on social media and other online platforms, including harmful content or harmful conduct; 

b) evidence of: i) the potential impacts of online harms on the mental health and wellbeing of Australians; ii) the extent to which algorithms used by social media platforms permit, increase or reduce online harms to Australians; iii) existing identity verification and age assurance policies and practices and the extent to which they are being enforced; 

c) the effectiveness, take-up and impact of industry measures, including safety features, controls, protections and settings, to keep Australians, particularly children, safe online; 

d) the effectiveness and impact of industry measures to give parents the tools they need to make meaningful decisions to keep their children safe online; 

e) the transparency and accountability required of social media platforms and online technology companies regarding online harms experienced by their Australian users; 

f) the collection and use of relevant data by industry in a safe, private and secure manner; 

g) actions being pursued by the Government to keep Australians safe online; and 

h) any other related matter. 

The resultant Recommendations were

R 1  The Committee recommends that the Australian Government propose the appointment of a House Standing Committee on Internet, Online Safety and Technological Matters, from the commencement of the next parliamentary term. 

R 2  The Committee recommends that, subject to Recommendation 1, the Australian Government propose an inquiry into the role of social media in relation to democratic health and social cohesion, to be referred to the aforementioned committee or a related parliamentary committee. 

R 3  The Committee recommends that the eSafety Commissioner undertakes research focusing on how broader cultural change can be achieved in online settings. 

R 4  Subject to the findings in Recommendation 3, the Committee recommends that the Australian Government establishes an educational and awareness campaign targeted at all Australians, focusing on digital citizenship, civics and respectful online interaction. 

R 5   The Committee recommends that the eSafety Commissioner examine the extent to which social media companies actively prevent: § recidivism of bad actors, § pile-ons or volumetric attacks, and § harms across multiple platforms.  The eSafety Commissioner should then provide the Australian Government with options for a regulatory framework, including penalties for repeated failures. 

R 6  The Committee recommends that the Office of the eSafety Commissioner be provided with adequate appropriations to establish and manage an online single point of entry service for victims of online abuse to report complaints and be directed to the most appropriate reporting venue, dependent on whether their complaints meet the requisite threshold, and in consideration of a variety of audiences such as children, parents/carers, women, people from culturally and linguistically diverse backgrounds, and other relevant vulnerable groups. 

R 7  The Committee recommends that the Australian Government refer to the proposed House Standing Committee on Internet, Online Safety and Technological Matters, or another committee with relevant focus and expertise, an inquiry into technology-facilitated abuse, with terms of reference including: § The nature and prevalence of technology-facilitated abuse; § Responses from digital platforms and online entities in addressing technology-facilitated abuse, including how platforms can increase the safety of their users; and § How technology-facilitated abuse is regulated at law, including potential models for reform. 

R 8  The Committee recommends that the Australian Government significantly increase funding to support victims of technology-facilitated abuse, through existing Australian Government-funded programs. This should include additional funding for specialised counselling and support services for victims; and be incorporated in the next National Action Plan to End Violence Against Women and Children 2022-2032. 

R 9  The Committee recommends that future reviews of the operation of the Online Safety Act 2021 take into consideration the implementation of the Safety by Design Principles on major digital platforms, including social media services and long-standing platforms which require retrospective application of the Safety by Design Principles. 

R 10  The Committee recommends that the Department of Infrastructure, Transport, Regional Development and Communications, in conjunction with the eSafety Commissioner and the Department of Home Affairs, examine the need for potential regulation of end-to-end encryption technology in the context of harm prevention. 

R 11  The Committee recommends that the eSafety Commissioner, as part of the drafting of new industry codes and implementation of the Basic Online Safety Expectations: § Examine the extent to which social media services adequately enforce their terms of service and community standards policies, including the efficacy and adequacy of actions against users who breach terms of service or community standards policies; § Examine the potential of implementing a requirement for social media services to effectively enforce their terms of service and community standards policies (including clear penalties or repercussions for breaches) as part of legislative frameworks governing social media platforms, with penalties for non-compliance; and § Examine whether volumetric attacks may be mitigated by requiring social media platforms to maintain policies that prevent this type of abuse and that require platforms to report to the eSafety Commissioner on their operation. 

R 12  The Committee recommends that the eSafety Commissioner examine the extent to which social media companies actively apply different standards to victims of abuse depending on whether the victim is a public figure or requires a social media presence in the course of their employment, and provide options for a regulatory solution that could include additions to the Basic Online Safety Expectations. 

R 13  The Committee recommends that the eSafety Commissioner, in conjunction with the Department of Infrastructure, Transport, Regional Development and Communications and the Department of Home Affairs and other technical experts as necessary, conduct a review of the use of algorithms in digital platforms, examining: § How algorithms operate on a variety of digital platforms and services; § The types of harm and scale of harm that can be caused as a result of algorithm use; § The transparency levels of platforms’ content algorithms; § The form in which regulation should take (if any); and § A roadmap for Australian Government entities to build skills, expertise and methods for the next generation of technological regulation in order to develop a blueprint for the regulation of Artificial Intelligence and algorithms in relation to user and online safety, including an assessment of current capacities and resources. 

R 14  The Committee recommends that the eSafety Commissioner require social media and other digital platforms to report on the use of algorithms, detailing evidence of harm reduction tools and techniques to address online harm caused by algorithms. This could be achieved through the mechanisms provided by the Basic Online Safety Expectations framework and Safety By Design assessment tools, with the report being provided to the Australian Government to assist with further public policy formulation. 

R 15  The Committee recommends that, subject to Recommendation 19, the proposed Digital Safety Review make recommendations to the Australian Government on potential proposals for mandating platform transparency. 

R 16  The Committee recommends the implementation of a mandatory requirement for all digital services with a social networking component to set default privacy and safety settings at their highest form for all users under 18 (eighteen) years of age. 

R 17  The Committee recommends the implementation of a mandatory requirement for all technology manufacturers and providers to ensure all digital devices sold contain optional parental control functionalities. 

R 18  The Committee recommends that the Department of Infrastructure, Transport, Regional Development and Communications conduct a Digital Safety Review on the legislative framework and regulation in relation to the digital industry. The Digital Safety Review should commence no later than 18 months after the commencement of the Online Safety Act 2021, and provide its findings to Parliament within twelve (12) months. 

R 19  The Committee recommends that, subject to Recommendation 18, the Digital Review examine the need and possible models for a single regulatory framework under the Online Safety Act, to simplify regulatory arrangements. 

R 20  The Committee recommends that the Digital Review include in its terms of reference: § The need to strengthen the Basic Online Safety Expectations to incorporate and formalise a statutory duty of care towards users; § The scope and nature of such a duty of care framework, including potential models of implementation and operation; § Potential methods of enforcement to ensure compliance, including penalties for non-compliance; and § The incorporation of the best interests of the child principle as an enforceable obligation on social media and other digital platforms, including potential reporting mechanisms. 

R 21  The Committee recommends that the eSafety Commissioner: § Increase the reach of educational programs geared at young people regarding online harms, with a particular focus on reporting mechanisms and the nature of some online harms being a criminal offence; § Formalise a consultation and engagement model with young people through the Australian Government’s Youth Advisory Council in regards to educational themes and program delivery; and § Report to the Parliament on the operation and outcomes of the program, including research identifying whether this has resulted in a reduction in online harm for young people. 

R 22  The Committee recommends that the eSafety Commissioner work in consultation with the Department of Education, Skills and Employment to design and implement a national strategy on online safety education designed for early childhood and primary school-aged children, and secondary school-aged young people, including: § A proposed curriculum, informed by developmental stages and other relevant factors; § Potential methods of rollout, including consultation and engagement with children, young people, child development and psychology experts, digital education experts and other specialists in online harm; and § A roadmap provided to parents of these age groups detailing methods of addressing online harm. 

R 23  The Committee recommends that the eSafety Commissioner design and administer an education and awareness campaign aimed at adults, particularly in relation to vulnerable groups such as women, migrant and refugee groups, and people with disabilities, with a focus on the eSafety Commissioner’s powers to remove harmful content and the mechanisms through which people can report harmful content and online abuse. 

R 24  The Committee recommends that the Australian Government work with states and territories to ensure that relevant law enforcement agencies are appropriately trained on how to support victims of online harm. This should include trauma-informed approaches as well as a comprehensive understanding of police powers and other relevant avenues, such as the relevant powers of the eSafety Commissioner. 

R 25  The Committee recommends that the Australian Government review funding to the eSafety Commissioner within twelve (12) months to ensure that any of the Committee’s recommendations that are agreed to by the Government and implemented by the Office of the eSafety Commissioner are adequately and appropriately funded for any increased resource requirements. 

R 26  The Committee recommends that the Online Safety Youth Advisory Council, via the eSafety Commissioner, provide a response to this report and its recommendations within six (6) months of its establishment and full membership.