Research and expert consultation play a major role in Meta’s product development process. Over the past few years, Meta has refined how our teams design products for young people based on external guidance from organizations like the UN, the OECD and children's rights groups. Through our global co-design program, we also regularly consult with third-party experts, young people, parents and guardians to make sure we build products that meet their needs.
The UN’s Convention on the Rights of the Child (UNCRC) is an international human rights treaty that establishes the civil, political, economic, social, health and cultural rights of youth. This global treaty reflects the most fundamental and commonly held beliefs of societies around the world with respect to how young people should be treated.
The UNCRC emphasizes that in all actions concerning people ages 0-17, including in digital environments, “the best interests of the child shall be a primary consideration.” This is widely known as the “best interests of the child” standard in the youth rights community and is an important principle in product development at Meta.
It's essential that we prioritize preserving the digital rights of young people alongside the work we do to keep them safe and protect their welfare. Many data privacy laws and regulations around the world apply the “best interests of the child” standard to how companies should handle young people’s data, requiring them to provide age-appropriate experiences.
Designing digital experiences that serve young people's best interests requires accounting for a number of different perspectives, including those of experts, regulators, parents and young people themselves. At times, these differing points of view require us to navigate complex tradeoffs. When building Instagram’s new parental supervision tools, parents and some lawmakers encouraged us to give parents detailed insights about what their teens were doing on the app. But we also heard from global experts, and teens themselves, that it was important to respect teens’ evolving expectations of privacy and to promote their autonomy as they grow by learning, playing and creating online.
After careful consideration of these perspectives, including through the TTC Labs co-design and consultation initiative with teens, parents, guardians and experts, Meta launched optional supervision tools on Instagram that require a teen’s consent. When supervision is turned on, parents can see who their teen follows, who's following them and how much time they spend on the app (with the option to set time limits). Today, we're expanding these tools to allow parents to set time windows when their teen can’t use Instagram and to see more information when their teen reports an account or post. Teens still have the ability to make decisions for themselves around who they follow and who follows them, and a parent can't see posts or messages from their teen's friends.
This approach makes it easier for parents and guardians to be involved in their teens’ online experiences, while still supporting teen autonomy. This balance is key to all of our current and upcoming features, including the parental supervision tools rolling out for all Meta Quest headsets today.
Developing Meta’s Best Interests of the Child Framework
We developed a process to help us apply the UN’s Convention on the Rights of the Child directly to the products and experiences we build at Meta. We complemented our own internal research with input from global data protection regulators to create Meta’s Best Interests of the Child Framework, which distills the “best interests of the child” standard into six key considerations that product teams can consult throughout the development process.
The framework is available as a resource for all employees at Meta and is designed to be applied at multiple points of the product development cycle. Each consideration has extensive guiding questions, resources and examples to help our teams and product builders make balanced decisions. Below is more information on each consideration, including a subset of questions, resources and case studies.
1: Recognize and engage global youth and families using our products
This consideration emphasizes that young people are core constituents of almost all of our products and features globally. Even if a product was not specifically developed for teens, we must still actively consider young people who are permitted to use the product and find meaningful ways to understand their needs. Guiding questions in this section include: “Have you referenced relevant external and internal youth research before setting your roadmap and deciding on projects?” and “Have you considered the range of potential youth consultation methods?” For example, we’ve developed a virtual co-design methodology with teens and parents in the US, the UK, Ireland, Brazil, Japan and India to inform how we develop parental supervision tools. While we’ve conducted user research with young people and parents for a long time, this new program offers an additional collaborative space to co-create with young people, caregivers and families, hear their voices and take their needs into account.
2: Create safe, age-appropriate environments for youth
We provide Meta teams with resources to help them understand how to make the content, controls and features in their products appropriate for the young people they reach. Youth are not a monolithic group—they vary in age, maturity, home situations, cultural norms and parental support. We have developed common terminology and resources to help teams build for the specific developmental needs of young people as they grow. We’ve included a visual below on the age bands we use to guide developmentally-appropriate design. The age bands span from kids to late teens; note that Meta does not allow people under 13 on its apps, with the exception of Messenger Kids, which is designed for kids and tweens.
We work closely with product teams to identify who might use their products and ensure that designs and tools are age-appropriate. For instance, we made a number of improvements to Instagram over the last year to bolster teen safety and privacy. Guiding questions in this section include: “Have you identified the age ranges and developmental stages of youth likely to use your product?” and “Do all designs follow youth best practices and are all education and tools presented in an age-appropriate way?”
Age bands based on how certain regulations define different age ranges
3: Promote youth autonomy while considering the rights and duties of parents and guardians
Parental controls and supervision features are helpful when they support a parent or guardian in protecting the best interests of their child, but they can also impact a young person's sense of privacy and autonomy in their online experiences. We strive for balance between these two realities and apply the “best interests of the child” standard to create tools that support families and encourage communication around healthy digital habits and digital citizenship. This consideration has been essential in creating supervision features for Instagram and Meta Quest. One of the guiding questions in this section is: “Do you give young people more autonomy over time with education and prompts to help them graduate to more mature product experiences when appropriate?” We provide product teams with resources on how to design these graduation moments as well as principles for balancing teen autonomy and parental control.
4: Prioritize youth well-being and safety over business goals and interests
This consideration gives teams tactics on how to embed youth well-being and safety into their products by default. Guiding questions in this section include: “Does your product encourage meaningful interactions for teens and prevent negative experiences?” and “When youth start using your product, do default settings promote their privacy?” We provide guidance on how to help prevent bullying and harassment in products as well as specific principles on how to build privacy settings for youth. A few features that apply this consideration:
Now when someone under 16 joins Instagram, their account is automatically set to private and remains so unless they change it to public. We also prompt teens who are already on Instagram to consider making their accounts private.
The Take a Break tool on Instagram is a new feature that helps support young people’s well-being, encouraging them to set reminders when they’ve spent over a certain amount of time on the app.
We also restrict adults not followed by a teen from sending direct messages to them on Instagram, and limit adults from messaging teens on Messenger if they are not friends or do not have mutual friends on Facebook.
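As a rough illustration of the privacy-by-default behavior described above, the logic can be thought of as a simple mapping from a user's age to their starting settings. This is a hypothetical sketch for clarity only, not Meta's actual implementation; the function and setting names are invented:

```python
# Hypothetical sketch of age-based default privacy settings, loosely
# modeled on the behavior described above. Names and structure are
# illustrative assumptions, not Meta's real code.

def default_account_settings(age: int) -> dict:
    """Return privacy-by-default settings for a new account holder."""
    if age < 13:
        # Under-13s are not permitted on the main apps.
        raise ValueError("under-13 users are not permitted")
    return {
        # Accounts for users under 16 start private (the user can
        # later change this to public).
        "private_account": age < 16,
        # Adults who aren't connected to a teen can't send them
        # unsolicited direct messages.
        "restrict_unsolicited_adult_dms": age < 18,
    }
```

The key design point is that the safest configuration is the starting point, so protection does not depend on a young person finding and enabling the right settings themselves.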
5: Support young people’s privacy in product decisions
Special considerations apply to youth data and how we use it. Guiding questions in this section include: “Do you only collect youth information when its use is clear and necessary?” and “Have you considered measures that do not rely on the ability or willingness of youth, parents or guardians to engage with privacy information?” Last year, we applied this consideration to change how ads work for teens across Meta technologies so that advertisers can only target ads to people under 18 (or older in some countries) based on their age, location or gender. Previously available targeting options, like those based on interests or activity on other apps and websites, are no longer available to advertisers when showing ads to young people.
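The ad-targeting restriction above amounts to an allow-list: for viewers under 18, any targeting option outside age, location and gender is dropped. The sketch below is a hypothetical illustration of that rule; the option names and function are assumptions, not Meta's advertising API:

```python
# Illustrative allow-list filter for ad-targeting options shown to
# minors, as described above. Option names are hypothetical.

ALLOWED_FOR_MINORS = {"age", "location", "gender"}

def allowed_targeting_options(viewer_age: int, requested: set) -> set:
    """Filter an advertiser's requested targeting options by viewer age."""
    if viewer_age < 18:
        # Interest- and activity-based options are stripped for under-18s.
        return requested & ALLOWED_FOR_MINORS
    return requested
```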
6: Empower youth, parents and guardians to understand and exercise their data rights
Youth have rights to autonomy, freedom of association and play, free expression and identity exploration, and we must support their rights through our products. Like all people who use our services, youth can exercise their data rights to delete their accounts, report concerns and download their account information. Guiding questions in this section include: “Does your product or feature support youth data rights?” and “Have you provided age-appropriate data education and tools before experiences, contextually and on demand?” We’ve also introduced guidance internally on how teams can integrate age-appropriate and globally accessible education into their products so youth can comprehend privacy information and their privacy choices. Today, we’re releasing a section in Meta’s Privacy Center with info for people under 18 about privacy settings, ad experiences and parental controls.
Adopting an approach that is grounded in the global “best interests of the child” standard helps us build products for young people that support their well-being and rights while promoting consistency across different jurisdictions and product teams. We’ll continue to evolve and improve the guiding questions and resources in Meta’s Best Interests of the Child Framework as we learn more through expert consultation, user research and co-design. We’re excited to see how the six considerations in the framework continue to improve youth experiences across Meta technologies.