Unplugged

Australia has decided to ban social media for under-16s - what does this mean for kids, their parents, and the social media companies that platform them?

Career Opportunities, directed by Bryan Gordon, produced by John Hughes, Universal Pictures, 1991. © All rights belong to their respective owners. No copyright infringement intended.

A few weeks ago, we discussed Shari Franke and the ethics of family channels, and we rounded off that article with an overview of legislation around children using and working on social media. About ten days later, the Australian government announced that from November 2025 there will be a legal age restriction on social media accounts, making them accessible only to those aged 16 and over. This is not the first law of its kind to pass through a government, but it is the strictest: where the French added caveats for parental consent, Australia offers no such grace, and, in keeping with its usual legislative style, the onus for enforcement falls on the platforms, who will face heavy fines if their processes fail to stop underage users from accessing their sites. The platforms cited are five of the big six: Snapchat, TikTok, Facebook, X, and Instagram; YouTube has been excluded from talks due to its use in schools. Age verification software, currently in the works, would be the means of enforcement, with trials beginning in the new year.

The responses to this have been the regular mix of opinions that appear whenever a government attempts to legislate an aspect of the digital space. The tech companies aren’t happy: Meta claims the legislation is being ‘rushed’ and argues that its safeguarding methods are sound, while the owners of Snapchat say they’ll work with the government and the eSafety Commissioner in a way that will ‘balance privacy, safety, and practicality’. Parents are welcoming the change as more and more become worried about online bullying and predation towards their children, with opinion polls claiming 77% support the restrictions. Child rights activists and youth activism groups counter that restricting social media represents a barrier to community for children from minority groups, and those concerned about user privacy worry about how the modes of verification will be handled by companies that don’t have the strongest track record of keeping user data safe. I would add, going off the issues raised in the family channels article, that the law has no stipulations for how children working on the internet in any capacity will be affected: if a child is regularly filming content in a family channel setting, does this breach the rules? If they’re not allowed on the internet of their own agency, are they required to be paid when their parents monetize their lives on their account?

Australian Prime Minister Anthony Albanese announcing the social media ban on November 6th 2024, via ABC News on YouTube. © All rights belong to their respective owners. No copyright infringement intended.

I approached this topic in general support of the restriction - now I’m not so sure. There is a plethora of anecdotal reasons to argue that children shouldn't have access to the biggest platforms in the online space: the want to keep children and young teens safe from predation, the impact of online bullying and harassment on the developing mind, the impact of heavy digital use on attention span and learning ability, or the risk of children seeing mature or dangerous content, which can manifest in everything from the promotion of self-harm to teen boys being sucked into the manosphere, with all that implies for their peers. On the flipside, connectivity with friends and with the like-minded in terms of hobbies or identity is cited by young people, both anecdotally and in studies, as the main reason for and greatest benefit of their access to social media. However, academic study of the impact of social media on children’s mental health suggests that the harm caused is not truly on the macro level, and a study from 2022 found that only 9% of teens say that social media has personally negatively affected them.

This hits home as someone who grew up queer on the internet. Finding community this way was a lifeline for many that I knew and met in Instagram group chats or who had active Tumblr accounts, but it was in these same spaces that they found the discourse and thinspo pages that fueled confusion and body image issues which have persisted into adulthood. For every anxious teen who was able to use the internet to find human connection when they could not face it in person, there is another scarred from being a preteen talking their much-older “boyfriend” down on Kik. Many are able to mediate their experiences and find that the good outweighed the harm; the nuances of being a teen in the real world mirror the nuances of being one online, but on social media you are offered a gallery of the extremity of every feeling or experience, positive or negative.

As touched on earlier, this ban doesn’t seem to address children as consumers on social media. Online is the only space with generally lax rules on advertising to children - the examples that come to mind are the overt methods of someone like MrBeast or the Paul brothers, who integrate advertising into their content, including advertising for shady crypto, knowing that their viewer base is mainly the very young. Apps like TikTok run both video advertising made by companies and a whole industry of affiliate marketing that works to make paid content look as natural as possible. The ban would exempt YouTube due to its use in schools - thus failing to tackle this issue on the platform that a majority of children report using - and it does not directly tackle the lax laws around advertising overall.

On the producer side, these laws have no stipulations so far about how this would work in the context of family vlogging or the posting of those under 16 online via a parent’s account. The lack of parental consent caveats indicates that there would be no child accounts managed by parents in the Australian digital space, but there is the worry that a law like this would cement families’ ability to exploit their children on social platforms so long as the children aren’t the focus of the content, since the kids technically aren’t allowed to take any ownership of their part in the business. Children may be put in a position where they are working on platforms they have no access to, removing their ability to see what makes the cut and removing their agency over what about them is shared.

Thinking about what young people consume online, the number of children who fell headfirst for podcasters like Andrew Tate, Fresh and Fit, and their copycats is concerning. The Pew Research Center found in 2022 that very few teens feel social media plays a critical role in how they engage with social and political issues, with those leaning Republican the least likely to see its role as critical. Meanwhile, a lot of these accounts focus on social and identity politics - reproductive rights, marriage and childcare policy, the role of sex work in society - and then conflate their views and approaches with the cars and watches they flash. This content is only just covertly political, and packaging it in a way where preteens do not recognize that they are being sold a viewpoint is dangerous. Though this might suggest we should take kids offline, the more pressing task is teaching them how to recognize and separate a viewpoint from the gains of running a successful podcast (and how to properly fact-check claims made by creators they enjoy), so that they can work out for themselves what they will believe and what they will take as entertainment.

The overall feeling I was left with while researching this issue is a reminder of the slow but now almost complete erosion of spaces made specifically for children in the digital landscape. If you were a child in the late 90s to the early (maybe even mid) 2000s, you likely had a Club Penguin, Moshi Monsters, or Bin Weevils account. These games were usually based around character creation, building a customizable environment, and playing minigames to fund these operations. There were friend-adding and chat features in-game, but these were heavily filtered environments with no photos, and you were actively discouraged from giving out any sort of personal information. For older children there were services such as Stardoll and MovieStarPlanet (though stereotypically geared more towards girls than boys), where kids could form online friendships without the dangers of any sort of identification, where the content was largely guaranteed to be age-appropriate, and where the old convention of needing a parent’s email address meant that any reports or violations of the terms of service would go straight to the parent. Some may argue that services like Roblox have filled this void - I would argue that the number of adults who play these games, with the ability to build mature games and joke builds and post them publicly, means that this is not the case.

The inside of a typical Moshi Monsters home from 2013, via gamesindustry.biz, February 2013. © All rights belong to their respective owners. No copyright infringement intended.

This is not to begrudge the twenty-somethings who finish a long day and turn to games such as Roblox or Minecraft as a way of decompressing, but the treatment of children’s online media as ‘universal’, shared with adults, is the core issue when it comes to children’s internet use. There are no platforms designed to be used almost exclusively by children or teens. Platforms that allow children to share and blog among themselves - avatar- or text-based, with stricter regulations on the posting of selfies and the language that can be used - could strike the balance between ensuring that children are not exposed to all of the issues that come along with social media and still giving them the connectivity and ability to engage with others on the basis of shared interest. It would also offer a more controlled environment for children to learn how to operate on social media before turning 16, instead of going in still blind but a little older; they would still have the experience of running an account and tailoring it to themselves, without the pressure of it being attached to their faces and/or names until their personhood has developed a little more and attaching themselves to an online profile can be done more responsibly.

This idea is not without its drawbacks. Many parents may recoil on principle at the idea of their children using platforms that they could not themselves make accounts on - though this could be negated by the old-school approach of signing up with a parent’s email, perhaps with some two-factor authentication to confirm it really is the parent’s email while still preserving some user privacy. The pool of young users would be attractive to those looking to victimize children, so strong safeguards would have to be in place to ensure that the accounts were all genuinely owned by young people, and that verification would involve a trade-off that may worry those who favour strict privacy rules. Finally, the companies may not be happy: the aforementioned ‘maybe we shouldn’t advertise to children’ argument would be much easier to make against a platform with such a clearly defined consumer base, which would push the platform to either collect some type of user data or require a subscription. All this said, the return of dedicated children’s spaces in the digital landscape, in principle at least, is a development I would love to see when we discuss the place of the young in the social media landscape.

The young people of Australia are about to be the guinea pigs in what is becoming a global conversation about where young people should sit in the digital landscape. The desire to bar everyone under 16 is a knee-jerk reaction that makes sense given all the safeguarding issues we’ve discussed above - but blanket bans have never worked on anything, let alone something as fluid and as hard to regulate as the digital space has become. After sitting with the issue, the conclusion I’d draw is the thing that adults are supposed to do for the generations raised after them: create a safe space and educate. Allow children a space where they can blog about their day and talk to their friends and peers, an incubator in which they can learn to spot when they are being misinformed or misled and how to dispel it; then, when they reach the age where they are ready to access social media in all its… vastness, they will already have the skills to navigate it. It may be what we all needed in the first place.

Julia Brunton

Describing herself as ‘professionally online’, Julia is a recent Media, Industry and Innovation graduate with a focus on digital culture and society. Her passion for research and digital culture is matched only by her love for alternative and metal music and fashion, with both pillars of interest forming the foundation of Julia’s written work. Hailing from England’s north east, she hopes to champion the local scene and grassroots cultural efforts whenever she can, and hopes her writing can encourage others to pop down to their local venue and keep the culture alive.
