Social Media Parental Notification Requirements: What the Law Now Requires
Published March 2026 · 8 min read
Medically reviewed by licensed healthcare professionals · Legally reviewed by mass tort litigation specialists
A wave of state legislation passed in 2024 and 2025 requires social media platforms to notify parents when minors create accounts, to share usage data with parents on request, and to obtain parental consent before certain features become available to users under 16 or 18. This guide explains what those requirements mean, where they apply, and how compliance failures affect ongoing litigation.
Why Parental Notification Laws Were Created
For most of social media's history, minors could create accounts on major platforms with no parental awareness, no parental consent, and no mechanism for parents to monitor use. Platforms set minimum age requirements — typically 13 under the Children's Online Privacy Protection Act (COPPA) — but enforcement of those minimums was left entirely to users' self-reported ages. A child who entered a false birthday at account creation faced no obstacle. Platforms did not verify, and in most cases did not attempt to verify, user age.
The legislative response to this gap intensified after 2021 and 2022, as internal platform research documenting youth harm became public through whistleblower disclosures and journalism. State legislators across the political spectrum found bipartisan agreement that parents should have some level of awareness and control over their minor children's platform access. The result has been a patchwork of state laws with different age thresholds, consent requirements, and enforcement mechanisms.
Key States with Parental Notification Requirements
Arkansas: The Social Media Safety Act, passed in 2023 and amended in response to subsequent litigation, requires platforms to verify user age and obtain parental consent for minor users. The law has faced First Amendment challenges that have shaped its implementation, but the core parental consent requirement survived judicial review in significant part.
Texas: The SCOPE Act requires covered digital service providers to block minors from creating accounts without parental consent, provide parents with access to their minor child's account activity upon request, and implement content filtering for material harmful to minors. Violations are subject to civil penalties enforced by the Texas Attorney General.
Florida: HB 3 prohibits children under 14 from maintaining social media accounts on covered platforms and requires parental consent for 14- and 15-year-old users. Platforms are required to delete accounts created by users who are found to be under 14. Florida's law also requires platforms to implement age verification — through third-party age estimation tools, government-issued ID, or other approved methods.
Utah: Among the earliest states to act, Utah passed a two-bill package in 2023 requiring parental consent for minor users and giving parents the right to access a minor's account content. Utah's law also bars platforms from using features specifically designed to increase addictive use among minors, a provision directly relevant to design-defect claims in the litigation.
Louisiana, Georgia, and Indiana have passed or advanced similar consent-and-notification frameworks with state-specific variations in age thresholds and enforcement mechanisms. At the federal level, the Children and Teens' Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA) have advanced in Congress with bipartisan support, though federal legislation has not yet been enacted as of this writing.
What Platforms Are Currently Required to Do
Compliance obligations vary by state and platform size, but the most common requirements across enacted laws include:

- age verification at account creation;
- parental consent for users below the applicable age threshold;
- parental access to account activity data on request;
- time-of-use restrictions (many laws prohibit platform use by minors between certain hours without parental override); and
- content filtering for age-inappropriate material.
Notification specifically — as opposed to consent — is a feature of some but not all state laws. Notification typically means that a parent must receive an alert when a child creates an account, and may receive periodic summaries of account activity. Some laws require that notification be affirmative (the parent receives a direct communication) rather than passive (the parent can request information if they know to ask). The distinction matters because passive notification systems do not meaningfully change the information asymmetry between platforms and parents that enabled years of unmonitored use.
How Compliance Failures Connect to Litigation
Violations of state parental notification and consent laws are relevant to ongoing civil litigation in two ways. First, regulatory violations can support a claim of negligence per se: violating a safety statute designed to protect a specific class (minors) is itself evidence of negligence when a plaintiff in that class was harmed. Second, violation records and enforcement actions by attorneys general generate discovery-accessible documents about platform safety practices and compliance histories.
Platforms that publicly represented their parental control features as effective while internal compliance teams documented failure rates face the same internal-knowledge-versus-public-statement gap that has been central to claims against Meta and ByteDance in MDL 3047. The parental notification legal framework created both protective obligations and a new evidentiary layer for plaintiffs who can show their minor children used platforms in violation of those obligations.
What Parents Should Know About Their Rights Under Current Law
If you are the parent of a minor who currently uses social media, most states with enacted legislation give you the right to request account activity data from covered platforms. Exercise that right now, before your child turns 18. After a minor turns 18, parental access rights under these laws generally terminate, and data preservation becomes the family's responsibility rather than the platform's obligation.
If you believe a platform allowed your minor child to create or maintain an account in violation of applicable state law, document the account creation date, your child's date of birth, and any communications you received or did not receive from the platform. A platform that failed to notify you of your minor child's account in a state that required such notification may have contributed to delayed parental intervention — a harm that is separate from but related to the underlying psychological harm claims.
What to Do If Your Family Has Been Affected
Document when each account was created and on what platform, your child's age at account creation, whether you received any notification from the platform, and the mental health outcomes your child experienced during the period of account use. Bring this documentation to a case review and ask specifically whether your state's parental notification laws are relevant to the claim timeline. Legal teams tracking this litigation incorporate state law compliance failures into case evaluations where applicable.
Understand Your Legal Options
A free case review will help you understand how parental notification failures may affect your family's claim.
Start Your Case Review →