'Fundamental flaws' in anti-trolling laws

Maeve Bannister, AAP
Erin Molan said it was almost impossible to get help from law enforcement or social media platforms. Credit: AAP

Proposed laws targeting online abuse that would hold social media companies accountable for safety issues are "utterly inaccessible", a reform advocate says.

Noelle Martin, a lawyer and survivor of online abuse, told a parliamentary committee on Tuesday the planned laws were narrowly targeted.

She said the financial penalties proposed for tech companies that fail to keep people safe on their platforms were "chump change" and would not result in meaningful reform.

"If this committee is serious about online safety, unmasking trolls and improving responsibility by social media companies ... this bill contains fundamental flaws," she said.

As a teenager, Ms Martin discovered images of herself had been edited onto pornographic pictures and distributed online.

Almost a decade later, the perpetrators have not been punished and many of the fake images remain on pornographic websites.

Ms Martin said the government should focus on compelling the eSafety Commissioner to use its existing legislative powers to take action against abusive content online.

"Australia's online safety laws and reforms, and in particular the office of the eSafety Commissioner, are woefully inadequate," she said.

"The regulator continues to underutilise its existing statutory powers, misguides the public on its perceived successes ... and the whole regime is ineffective in providing meaningful support to survivors."

Anti-trolling campaigner and journalist Erin Molan shared her experience of being targeted repeatedly with abusive direct messages on social media.

Ms Molan became emotional as she recounted some of the abuse which made her fear for her life and her young daughter's safety.

She said it was almost impossible to get help from law enforcement or the social media platforms themselves.

"I reported some horrific messages from an account that kept being recreated no matter how many times I blocked it, but Facebook said the messages 'didn't meet the threshold' for inappropriate content," she said.

"The consequences should lie with big tech. They generate a huge amount of money and with that comes the responsibility to ensure users are safe."

Criminologist Michael Salter told the committee Ms Molan's experience of reporting abuse to social media companies was common among victims.

He said victims were often failed by a lack of action from social media giants.

"We're asking for transparency because far too often what we're provided from social media company reports on these issues ... is statistics that are most friendly to them," he said.

"Having basic safety expectations built into platforms from the get-go is not too much to expect from an online service provider."

Prior to the hearing, Prime Minister Scott Morrison said people like Ms Molan who shared their stories of online abuse would help to hold social media to account.

"(Tech companies) created these platforms and they have a responsibility to make them safe," he said in a statement.

Representatives from Facebook, Twitter and TikTok are expected to appear at an upcoming hearing.

The committee will hold its next hearing on Thursday.
