Kids who do not meet the age requirement will be booted off social media platforms from December 10, but there will be exceptions for health and education services including WhatsApp and Meta's Messenger Kids.
Tech executives grilled at a parliamentary inquiry hearing on Tuesday said that although they did not agree with the age restrictions, they would implement the ban.
TikTok public policy lead Ella Woods-Joyce said the company shared experts' concerns that the "blunt" age ban would not work or resolve the issues the laws aim to address.
"We support evidence-based sensible legislation that improves safety standards for all internet users ... a ban will push younger people into darker corners of the internet where rules, safety, tools and protections don't exist," she said.
Jennifer Stout, a representative from Snapchat's parent company Snap Inc, said the platform believed the ban had been applied unevenly and risked undermining community confidence in the new rules.
"For teens, connection and communication are strong drivers of happiness and well being, taking that away does not necessarily make them safer and it may instead push them towards other messaging services that lack Snapchat safety and privacy protections," she said.
Meta regional director of policy Mia Garlick said compliance presented challenges because 16 was a "globally novel age boundary" as existing technologies were built to identify the age milestones of 13 and 18 years.
"Distinguishing 13 from 16 is inherently less reliable and it also found greater challenges at the 16 age boundary with age estimation technologies," she said.
Platforms face fines of up to $50 million if they do not take reasonable steps to comply with the ban, but there won't be penalties for young people or their families if they gain access to the platforms.
Greens senator Sarah Hanson-Young previously threatened to force executives from TikTok, Snapchat and Meta - the parent company of Facebook and Instagram - to appear at the inquiry into online safety after they were no-shows at an earlier hearing.
The law puts the onus for compliance on the companies to "detect and deactivate or remove" accounts from underage users.
This will mean about 1.5 million accounts on Facebook, Instagram, YouTube, TikTok, Threads and X will be deactivated in less than two months.
Tech giants Apple and Google removed OmeTV from their app stores this week after being alerted to concerns predators were using it to groom and sexually exploit Australian children.
OmeTV instantly connects individuals with random strangers for a video chat.
The app's Portugal-based parent company Bad Kitty's Dad, LDA did not comply with requests sent by eSafety Commissioner Julie Inman Grant in August to introduce protections for Australian children.
The tech giants have since banned the app and are expected to review all others available in their Australian stores.
"This is an app that randomly pairs young children - with pedophiles," Ms Inman Grant told ABC News on Tuesday.
"This app will no longer be able to reach Australians and they will no longer be able to make money off children's misery."