New Online Safety Act commences: haters hiding behind anonymity or page 'names' can now be prosecuted
The Australian Government’s Online Safety Act came into effect on Sunday, 23 January 2022, providing the eSafety Commissioner with even stronger powers to keep Australians safe online.
The Act brings some big changes, including a world-first cyber-abuse take-down scheme to protect adults, along with a strengthened cyber-bullying scheme to protect children.
The eSafety Commissioner now also has the authority to order platforms to remove the “worst of the worst” online content—including child sexual abuse material and terrorist content—no matter where it is hosted.
The Act also gives the eSafety Commissioner stronger information gathering and investigative powers to obtain identity information behind anonymous online accounts used to bully, abuse or exchange illegal content.
Minister for Communications, Urban Infrastructure, Cities and the Arts, the Hon Paul Fletcher MP, said the new Online Safety Act is another step forward in the Morrison Government’s world-leading approach to combating cyber-abuse.
“Online safety is a priority for the Government and our new Online Safety Act is the foundation on which our world-leading approach stands,” Minister Fletcher said.
“As more Australians work, learn and conduct business online, the Government will make sure that they can do so safely, and that perpetrators are being held accountable for abusive and threatening behaviour.”
The Act also puts the tech industry on notice, with Basic Online Safety Expectations setting a new benchmark for platforms to take responsibility in protecting Australians online.
“The internet has brought immense advantages, but also new risks, and Australians rightly expect the big tech companies to do more to make their products safer for users,” Minister Fletcher said.
“Australians deserve to be able to use online platforms in the knowledge that they will be safe from vile and unacceptable online abuse, along with other dangers.”
The reforms will ensure social media companies are considered publishers and can be held liable for defamatory comments posted on their platforms. They can avoid this liability if they provide information that enables a victim to identify the troll and commence defamation proceedings against them.
Prime Minister Scott Morrison said the rules that exist in the real world should exist online too.
"Social media can too often be a coward's palace, where the anonymous can bully, harass and ruin lives without consequence," the Prime Minister said.
"We would not accept these faceless attacks in a school, at home, in the office, or on the street. And we must not stand for it online, on our devices and in our homes.
"We cannot allow social media platforms to provide a shield for anonymous trolls to destroy reputations and lives. We cannot allow social media platforms to take no responsibility for the content on their platforms. They cannot enable it, disseminate it, and wash their hands of it. This has to stop.
"These will be some of the strongest powers to tackle online trolls in the world.
"Anonymous trolls are on notice, you will be named and held to account for what you say. Big tech companies are on notice, remove the shield of anonymity or be held to account for what you publish.
"In a free society with free speech, you can't be a coward and attack people and expect not to be held accountable for it."
The reforms will give victims of defamatory online comments two ways to unmask trolls and resolve disputes.
First, global social media platforms will be required to establish a quick, simple and standardised complaints system that ensures defamatory remarks can be removed and trolls identified with their consent. This recognises that Australians often just want harmful comments removed.
Second, a new Federal Court order will be established that requires social media giants to disclose identifying details of trolls to victims, without consent, which will then enable a defamation case to be lodged.
Importantly, the reforms will also ensure everyday Australians and Australian organisations with a social media page are not legally considered publishers and cannot be held liable for any defamatory comments posted on their page, providing them with certainty.
Attorney-General Michaelia Cash said this was in response to the Voller High Court case, which made clear that Australians who maintain social media pages can be 'publishers' of defamatory comments made by others on social media—even if the page owner does not know about the comments.
"Since the High Court's decision in the Voller case, it is clear that ordinary Australians are at risk of being held legally responsible for defamatory material posted by anonymous online trolls," the Attorney-General said.
"This is not fair and it is not right. Australians expect to be held accountable for their own actions, but shouldn't be made to pay for the actions of others that they cannot control.
"The reforms will make clear that, in defamation law, Australians who operate or maintain a social media page are not 'publishers' of comments made by others."
The Attorney-General said the package of reforms would complement the defamation reforms currently being progressed in partnership with states and territories, and sit alongside the Government's commitment to improving online safety.
"Social media providers should bear their fair share of responsibility for defamatory material published on their platforms," the Attorney-General said. "This reflects the current law."
"However, if defamatory comments are made in Australia, and social media providers help victims contact the individuals responsible, it is appropriate they have access to a defence."
The Online Safety Act:
- creates a world-first Adult Cyber Abuse Scheme for Australians 18 years and older, across a wide range of online services and platforms
- broadens the Cyberbullying Scheme for children to capture harms that occur on services other than social media
- updates the Image-Based Abuse Scheme to address the sharing and threatened sharing of intimate images without the consent of the person shown
- gives eSafety new powers to require internet service providers to block access to material showing abhorrent violent conduct such as terrorist acts
- gives the existing Online Content Scheme new powers to regulate illegal and restricted content no matter where it’s hosted
- brings app distribution services and search engines into the remit of the new Online Content Scheme
- introduces Basic Online Safety Expectations for online service providers
- halves the time that online service providers have to respond to an eSafety removal notice, though eSafety can extend the new 24-hour period.
The Act also provides for new industry codes. The codes will be mandatory and will apply to various sections of the online industry, including:
- social media platforms
- electronic messaging services, online games and other relevant electronic services
- websites and other designated internet services
- search engines
- app distribution services
- internet service providers
- hosting service providers
- manufacturers and suppliers of equipment used to access online services and people who install and maintain equipment.
The codes can require online platforms and service providers to detect and remove illegal content such as child sexual abuse material or terrorist content. They can also put greater onus on industry to shield children from age-inappropriate content such as pornography.
To report serious abuse or harmful content, please visit https://www.esafety.gov.au/report