The Australian government is going to force Big Tech to scan your photos and emails for illegal stuff

Developed by major service providers and submitted to eSafety for review in February, the problematic proposed codes for Designated Internet Services (DIS) and Relevant Electronic Services (RES) are just two of eight sectoral codes to be adopted following the passage of the Online Safety Act 2021.

DIS includes providers of apps, websites, and file and photo storage services such as Apple iCloud, Google Drive, and Microsoft OneDrive, while RES covers dating sites, online games, and instant messaging.

Despite “significant changes” made in response to her September demands and feedback on the February drafts, eSafety Commissioner Julie Inman Grant said the revised draft codes for DIS and RES – two of the eight codes awaiting finalisation – were “still not meeting our minimum expectations”.

Shortcomings, she explained, include the DIS code's failure to require providers to “detect and flag known child sexual abuse material” in file and photo storage services, as well as the RES code's failure to require providers to detect and flag such material in email and partially encrypted messaging services.

“We know there are proactive steps they can take to stop the already rampant sharing of illegal content,” said Inman Grant, adding that her office saw a 285 percent year-over-year increase in reports of child sexual abuse material (CSAM) in the first quarter of this year.

With the technology companies failing to meet the law's requirements, Inman Grant will exercise her powers under section 145(1)(a)(ii) of the Act – which gives her the authority to determine an industry standard where a draft code “does not contain appropriate community safeguards”.

Her agency will now develop standards for DIS and RES service providers that will be mandatory and enforceable – meaning Australians can lodge complaints about breaches with eSafety, which can investigate them and impose remedies including enforceable undertakings and financial penalties of nearly $700,000 per day.

Five other codes – covering social media services, internet carriage services, app distribution services, hosting services, and equipment – were accepted and will come into effect six months from the day they are formally registered.

Inman Grant also delayed a ruling on the eighth code – which covers search engine operators – giving companies an additional four weeks to address the implications of fast-growing generative AI services becoming increasingly entangled with search engines such as Google, Bing, Opera, and Brave.

Fighting the world

Inman Grant may be talking tough about filtering, but the new regulations – which will apply to Australian service providers as well as overseas vendors serving Australians – will put her on a collision course with tech companies that have been there and done that.

In mid-2021, Apple announced plans to automatically alert users if they send or receive nude images, and to scan user content stored in iCloud by comparing the hashes of stored files against those of known CSAM; users with too many flags would be referred to authorities.
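At its simplest, the hash-comparison idea Apple described works like a set-membership check: compute a fingerprint of each stored file and test it against a database of fingerprints of known CSAM. The sketch below illustrates the concept using exact cryptographic (SHA-256) hashing; this is a deliberate simplification, since real systems such as Apple's NeuralHash or Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding. All names and hash values here are hypothetical placeholders, not real data or any vendor's actual API.

```python
import hashlib

# Hypothetical database of hashes of known illegal files, as a
# clearinghouse might distribute. The value below is just the
# SHA-256 of the bytes b"test", used as a stand-in.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_fingerprint(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def flag_uploads(uploads: dict[str, bytes]) -> list[str]:
    """Return names of uploads whose fingerprint matches a known hash."""
    return [
        name
        for name, data in uploads.items()
        if file_fingerprint(data) in KNOWN_HASHES
    ]

uploads = {"holiday.jpg": b"harmless pixels", "match.bin": b"test"}
print(flag_uploads(uploads))  # → ['match.bin']
```

Because exact hashes change completely if even one byte differs, production systems trade them for perceptual hashing; that robustness is also what critics argue opens the door to false positives and scope creep.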

It’s an approach already used regularly by Google – sometimes with unintended consequences – but late last year Apple paused the feature after a massive backlash from researchers, civil liberties advocates, and others who claimed that scanning user content amounted to a gross invasion of privacy and a slippery slope to mass surveillance.

Whether the use of legal tools to enforce compliance will prove more effective remains to be seen. But by drawing a new line on CSAM filtering — and forcing Apple, Google, Microsoft, Meta, Twitter, and others to comply — eSafety risks sparking yet another vitriolic international debate over privacy and civil rights.

Increased government scrutiny of their activities may have pushed the tech giants to work more closely than ever with Australian regulators – but this latest mandate could, if past experience is any indication, see those companies restrict Australians’ access to core services in the name of user privacy.

Filtering “terrible” illegal content and other basic requirements are “non-negotiable,” said an undeterred Inman Grant.

“While we don’t take this decision lightly, we believe moving to industry standards is the right one to protect the Australian community.”