Beyond banning: Our digital world must be made safer for young people

Published on: 15/01/2025

In a new article in the Australian Health Promotion Association’s Health Promotion Journal of Australia, Nicholas Carah, Dr Sandro Demaio, Louise Holly, Ilona Kickbusch, and Carmel Williams discuss the proposed social media ban for young people in Australia. They argue that the focus should shift from restricting access to making digital spaces safer for young people, drawing on public health approaches used in the control of tobacco, alcohol and road safety.

Social media offers young people opportunities for connection, self-expression and learning, but not without increasingly well-evidenced health risks. A new report from the WHO suggests that 11% of adolescents show signs of problematic social media behaviours and experience negative consequences such as disrupted sleep. As governments around the world grapple with balancing access and protection, the question arises: can we build a safer, more balanced digital space for children?

The regulation of children’s use of social media is a growing global public health priority. In the United States, legislation sets the age of 13 as the threshold for children creating new social media accounts. The European Union has considered raising the minimum age for social media access to 16, and in France, social media platforms must refuse access to children under 15 unless they have parental permission.

Australia has gone further by swiftly introducing a law that will ban children under 16 from accessing social media platforms. This controversial measure hinges on the outcomes of ongoing trials of two key technological solutions: age assurance and age verification. While discussions around banning children from social media are happening in several countries, Australia’s approach could reshape how social media companies manage user access for minors.

Age assurance includes a variety of techniques designed to estimate or determine a user’s age. Methods include self-reporting or parental verification, and even using technology such as facial recognition or analysing scrolling habits. These methods, however, can easily be circumvented by savvy teens, making enforcement of any ban difficult.

Meanwhile, age verification involves confirming a person’s age by matching their identity to a verified source of information, such as a government-issued document. However, concerns arise over privacy and security risks, whether personal data are managed by the social media companies themselves or by third parties.

Beyond the technical challenges of determining user age, there are social and cultural risks associated with age verification systems. Young people often use social media to explore their identities and seek information they may not feel comfortable seeking from parents, teachers or peers. Social media provides a level of anonymity, allowing them to ask questions about personal topics such as body image, sexuality and relationships. Age verification requirements could undermine this anonymity, stripping young users of the ability to remain anonymous online.

Another issue is the potential to exacerbate digital divides. Navigating age verification systems may prove more difficult for young people with lower digital access or literacy, or limited access to the necessary identification documents. These individuals could be excluded from mainstream social media platforms altogether, potentially pushing them into more dangerous, less regulated corners of the internet.

Despite the many challenges, some social media companies have begun to address public concerns over the safety of children on their platforms. Two common strategies are the creation of advisory groups focused on user safety and the development of specialised apps and accounts for children. For example, Meta has established a ‘Safety Advisory Council’ and an ‘Instagram Suicide and Self-Injury Advisory Group’ to guide its child safety policies. However, little is known about how these groups operate or what impact their advice has had.

Meanwhile, platforms like YouTube have introduced child-specific versions, such as YouTube Kids, which offers a commercially lucrative ‘walled garden’ of content specifically curated for an ever-younger audience.

In September, Meta announced the creation of ‘teen accounts’ for Instagram, a feature aimed at the safety of younger users. These accounts include several key features: accounts for users under 18 default to private, and users under 16 need parental permission to switch to a public profile. Teen accounts will limit interactions with strangers, restrict tagging and mentions, and apply content filters aimed at reducing exposure to harmful material. Young users will also receive notifications prompting them to log off after 60 minutes of use and to turn on sleep mode, which mutes notifications overnight.

While these features may provide enhanced child and teen safety, they also raise concerns. Many of these ‘innovations’ are simply repackaged versions of pre-existing features and fail to represent protections on a scale that matches the known public health risks. Moreover, these changes appear to further commercial interests. Instagram’s vision for teens now focuses on fostering a more intimate, chat-based experience akin to Snapchat, while also incorporating TikTok-style entertainment through Reels.

Whether we ultimately decide to limit young people’s use of social media or not, the more important task is to create online environments that enable these generations to flourish and be healthy in a digitalised world. Standalone bans are a potential quick-win for tech companies and policymakers but risk displacing important responsibilities that digital platforms have as commercial enterprises that make enormous profits from children’s culture and private lives.

Young people do not want to be excluded from the digital world but do want to be better protected from inaccurate information and digital harms. We must think critically about building social media spaces where young people are not targeted by advertisers looking to exploit their emotions or sell harmful products. Social media should prioritise users’ health and well-being rather than aim to maximise attention and engagement at any cost.

Ultimately, young people deserve online spaces where they can explore their identities, form social connections, and express themselves freely, without being inundated with harmful content or manipulated by advertisers. Rather than focusing on restricting access, policymakers and platforms should be expected to provide high-quality, educational and supportive content that fosters healthy online experiences.

There are opportunities to draw on lessons and experiences from public health and health promotion when tackling the risks and benefits of the digitalised world. These fields have delivered both safeguards that limit young people’s exposure to hazardous products and clear guidance on safe levels of access. Tobacco, alcohol and road safety are good examples of public health success, where the evidence is clear that unfettered access and/or exposure damages the health and well-being of young people. In Australia and other similar countries, a diverse range of public health actions have been put in place to protect young people from the harm caused by unlimited exposure to these products. These include regulation and legislation, such as setting age limits; fiscal measures, including taxes and levies on the products; reducing access through restrictions on where products can be sold and consumed; providing clear and accurate information and advice to the community; and guidelines on the safe use of these products.

Holding multinational corporations, including the owners of social media platforms, accountable for the harm their products cause young people is critical. But it is the responsibility of governments to act to protect young people’s rights, both to engage in the digital world and to health and well-being.

Read the full article here

Discover more about DTH-Lab:

Digital transformations are shaping all aspects of our lives, including health. Digital innovations can help to improve young people’s health and well-being and achieve Universal Health Coverage (UHC) through their application in health systems, public health promotion and prevention, and personal self-management of health status and behaviours. Without good governance, digital transformations can undermine health and exacerbate inequality. As levels of connectivity increase, concerted efforts are required to ensure that digital technologies are harnessed in support of better health and well-being for all and the attainment of the Sustainable Development Goals (SDGs) by 2030.

The DTH-Lab is a global consortium of partners working to drive implementation of the recommendations of The Lancet and Financial Times Commission on Governing Health Futures 2030 for value-based digital transformations for health, co-created with young people.

