Financial Fraud and Regulating Google and Facebook


Major digital platforms such as Google and Facebook have come under greater scrutiny over the way their services, particularly targeted advertising and recommendation algorithms, can be used to spread misinformation.

Misinformation about investments, in the form of scam investment products, has been costing Australians millions of dollars of their hard-earned savings.

The ACCC estimates that in 2021, $140 million was lost to online investment scams, with scams related to social media amounting to $26.6 million – up 207% from 2020.

Reputable Australian investment funds have had their brands ‘cloned’, with scammers purporting to offer financial products under their brands and targeting people via social media advertisements, paid search engine results and via messenger applications.

Given the alarming rate at which investment scams are increasing on digital platforms, the FSC believes more needs to be done to protect consumers from significant financial detriment.

In February 2022, the ACCC released a consultation paper seeking feedback on potential new laws for large digital platforms such as Google and Facebook.

One potential solution that the FSC has put forward is that digital platforms should be subject to an industry code of conduct, allowing for industry input and flexibility. This code should impose a duty on digital platforms to only allow paid-for advertisements from financial services providers authorised by ASIC. This echoes a commitment that Google has made in the United Kingdom to only allow paid advertisements about financial products from entities authorised by the Financial Conduct Authority.

The code should also impose a duty on digital platform services to remove links and access to financial services advertisements as soon as they become aware that scam financial products are being promoted.

Another option would be the imposition of duties on digital platforms via legislation or regulation. These duties could follow the model of the UK’s draft Online Safety Bill 2021, which proposes to impose duties on ‘user-to-user services’ (such as Facebook or Instagram) and ‘search services’ (such as Google), including a duty to take steps to mitigate and effectively manage the risks of harm to individuals. This could involve taking steps to minimise the risk of members of the public encountering harmful material via search engine results or through ‘scrolling’ on social media.

While the Bill excludes paid-for advertisements from its scope, the UK Parliamentary Joint Committee on the Draft Online Safety Bill (2021–22) report recommends the inclusion of advertisements on digital platforms, noting that ‘The exclusion of paid-for adverts from the scope of the draft Bill leaves little incentive for operators to remove scam adverts. Regardless of their legitimacy, they generate revenue for platforms.’

Interestingly, the ACCC commenced proceedings in the Federal Court against Meta in March 2022, alleging that it engaged in false, misleading or deceptive conduct, in breach of the Australian Consumer Law or the ASIC Act 2001, by allowing the publication of scam cryptocurrency advertisements that purported to have the endorsement of prominent Australian figures such as Mike Baird and Andrew Forrest.

The outcome of ACCC v Meta could provide greater legal clarity, absent legislation, on the duties of digital platforms in relation to fraudulent advertisements published on their platforms. Even if the court finds against Meta, a code of conduct would remain desirable in order to clarify what platforms need to do to meet their obligation not to engage in misleading or deceptive conduct.

The inquiry is set to provide an interim report to the Treasurer by 30 September 2022.
