
Overview and Impact of Sammy's Law in 2025
by Owen Borville, August 26, 2025
Sammy’s Law (2025): Content, Context, and Implications

Introduction
In 2025, Sammy’s Law, formally introduced in the 119th U.S. Congress as H.R. 2657, emerged as a landmark legislative proposal responding to persistent and alarming concerns about child safety on large social media platforms. The bill, named in memory of Sammy Chapman, a teenager who died of fentanyl poisoning after a drug transaction initiated over social media, proposes a sweeping set of requirements for large social media companies, with the explicit goal of empowering parents and legal guardians to protect their children from myriad online dangers. Its fundamental mechanism is to require those companies to provide standardized access, via APIs, to registered third-party safety software, allowing both real-time monitoring and intervention authorized by children or their guardians [2].

This report provides a comprehensive and current analysis (as of August 2025) of Sammy’s Law, including its legislative history and rationale, core legal provisions, definitions and scope, mechanisms for enforcement and compliance, and the broader political, social, legal, and economic debates enveloping the bill. Special attention is given to distinguishing this federal legislative effort from similarly named proposals at the state level and to situating it within the wider context of child online safety, both in the U.S. and internationally.

Background: The Origin and Context of Sammy’s Law
The Personal Tragedy and the Broader Problem
Sammy’s Law takes its name from Sammy Chapman, a 16-year-old boy described by friends and family as “sweet, funny, curious, and a top student.” Despite his parents’ efforts to shield him from harm, Sammy was contacted via Snapchat by a drug dealer who ultimately delivered drugs laced with lethal fentanyl directly to his home. Sammy’s death, on February 7, 2021, became one of many high-profile cases that highlighted the role of major social media platforms in facilitating lethal interactions and exposing minors to criminal activity, unsafe content, and predatory behaviors [3][5].

Parents, child safety advocates, and bipartisan members of Congress cite not only “Sammy’s story,” but also a litany of dangers faced by minors online: cyberbullying, trafficking, drug solicitation, harassment, sexual exploitation, self-harm content, and more. Each of these issues has been documented in scientific studies and governmental reports; for instance, around 46% of U.S. teens have reported experiencing cyberbullying, 43% have seen self-harm content on Instagram, and 24% of young people have seen illicit drugs advertised for sale on social media [4]. The dangerous intersection of adolescent impulsivity, opaque social media algorithms, and barriers to effective parental oversight provides fertile ground for tragedies like Sammy's.

Legislative Momentum and Parental Activism
The legislative push behind the bill reflects growing activism among affected families—particularly those who have lost children to drug-related incidents facilitated through platforms like Snapchat. Parent advocacy groups, such as the Organization for Social Media Safety, have played a major role, arguing that social media companies should open their platforms to proven third-party safety solutions that can flag risky behaviors, alert guardians, and potentially avert disaster by enabling prompt intervention [4].

To date, many large platforms have resisted sharing access to account content via external software, citing privacy, proprietary technology, and data protection considerations. Sammy’s Law was conceived as a direct response to this inflexibility, to rising public demand for parental tools, and to mounting evidence that in-platform moderation remains insufficient for meaningful harm prevention.

Federal and State-Level Activity
Sammy’s Law, H.R. 2657 of the 119th Congress (2025-2026), is the preeminent federal legislative effort on this issue [2]. It builds on an earlier version of the bill (H.R. 5778, from the 118th Congress) and on parallel, though less far-reaching, state-level statutes, such as Illinois’ Let Parents Choose Protection Act (nicknamed “Sammy’s Law”). These legislative moves are part of a larger national and international trend to regulate the interfaces between minors, platforms, and third-party technologies for safety and monitoring purposes.

Sponsor Background and Motivations
The primary sponsor of Sammy’s Law is Rep. Debbie Wasserman Schultz (D-FL), joined by a bipartisan coalition including Earl “Buddy” Carter (R-GA), Kim Schrier (D-WA), Mariannette Miller-Meeks (R-IA), Thomas Suozzi (D-NY), and Brian Fitzpatrick (R-PA). Rep. Wasserman Schultz, long an advocate on issues affecting families and children, has emphasized her personal experience as a parent, her concerns about the inadequacy of current parental controls, and her conviction—bolstered by constituent stories—that law must catch up with the rapidly evolving dangers kids face online [5].

According to the formal record, the bill was introduced on April 3, 2025, and referred to the House Committee on Energy and Commerce. Its legislative language and recorded actions are publicly available on Congress.gov and several legislative tracking platforms [9].

Legislative History and Status
Bill Trajectory in Congress
April 3, 2025: H.R. 2657 introduced by Rep. Wasserman Schultz and referred to the House Committee on Energy and Commerce.

As of August 2025, the bill’s status is “Introduced,” pending further action in committee. The bill has attracted notable bipartisan cosponsorship but remains several steps away from enactment, requiring committee debate, passage in the House of Representatives, approval in the Senate, and the president's signature [9].

The legislative process is marked by uncertainty; GovTrack estimates a modest chance of enactment, noting both the bill’s bipartisan support and the legislative hurdles typical for complex regulatory proposals.

Related and Predecessor Legislation
The core provisions in H.R. 2657 mirror those first introduced in H.R. 5778, “Sammy’s Law of 2023.” The 2023 version shared nearly identical language regarding the role of APIs, third-party providers, and the FTC’s oversight responsibilities. This iterative approach reflects persistent advocacy by its sponsors and evolving negotiations with stakeholders.

Sammy’s Law has also inspired state-level activity, notably in Illinois, where HB5380 (2023-2024) proposed very similar requirements (even referencing “Sammy’s Law” as an alias for the Let Parents Choose Protection Act).

Legal Details: Key Provisions and Definitions
Overview of Bill Structure: Sammy’s Law consists of several major sections, covering the sense of Congress, explicit definitions, core requirements for social media platforms, the regulatory framework for third-party safety software, compliance and enforcement mechanisms, and federal preemption designed to establish a national standard [11].

Table 1: Key Provisions of Sammy’s Law (H.R. 2657) and Expected Effects
Provision | Summary | Expected Effect
Mandatory API Access | Requires major social media platforms to create APIs accessible to FTC-registered third-party safety software providers. | Enables real-time, delegated parental or guardian monitoring and management of a child’s account; circumvents current platform-imposed barriers.
Delegation Rights | Allows a child (13-16) or their guardian to delegate permission for monitoring. | Empowers parental oversight without needing in-app cooperation from the child; standardizes delegated rights.
Coverage Thresholds | Applies to platforms with >100 million monthly users or >$1 billion annual revenue. | Focuses compliance on dominant platforms (e.g., Meta/Facebook, Instagram, TikTok, Snapchat, YouTube, X/Twitter).
Definitions of Harm | Details specific harms to be monitored, including suicide risk, substance abuse, bullying, sextortion, academic dishonesty, violence, and personal info sharing. | Directs monitoring and alerts to clearly defined categories of risk; harmonizes third-party software standards.
Third-Party Software Registration | Requires FTC registration, U.S. business base, non-foreign ownership, security audits, and exclusive in-country data storage. | Reduces foreign data risks, assures accountability, and sets vetting standards for software involved in monitoring children.
Data Use and Disclosure | Limits use/disclosure of user data to the sole purpose of protecting children; requires deletion after 14 days (with some exceptions). | Protects privacy, assures data minimization, and mandates transparency for children and parents.
Indemnification for Platforms | Shields platforms from damages if they transfer data in good-faith compliance. | Removes legal ambiguity and ensures platform cooperation with the new data-sharing regime.
Enforcement via FTC | Treats violations as unfair or deceptive acts or practices under the FTC Act. | Deploys the FTC’s extensive enforcement powers and complaint mechanisms for systemic oversight.
Federal Preemption | Establishes one national standard; preempts state laws requiring social media platforms to provide third-party API access, with limited exceptions. | Simplifies the compliance landscape for platforms operating nationwide; maintains limited state consumer protection and tort law.
Effective Date | Takes effect upon issuance of FTC guidance within 180 days of enactment. | Provides time for rulemaking, clarifies scope, and ensures regulated parties can prepare for compliance.

Definitions and Scope
Sammy’s Law employs precise definitions, critical to understanding its regulatory reach [11]:

Child: Any individual under age 17 with a registered account.

Large Social Media Platform: A platform provided via website or mobile app that does not prohibit use by children, enables users to share detectable media (images, text, video) with others met through the service, and either exceeds 100M monthly global active users or $1B in gross annual revenue.

Third-Party Safety Software Provider: A U.S.-based commercial entity, not foreign-controlled, registered with the FTC, authorized by a child (13+) or guardian to manage or monitor a child’s account for protective purposes only.

User Data: Any information or content created by or sent to a child, considered “user data” for 30 days following creation under a valid delegation.

Harm: Enumerated risks include (i) suicide, (ii) anxiety/depression, (iii) eating disorders, (iv) violence, (v) substance abuse, (vi) fraud, (vii) trafficking, (viii) sexual abuse or exploitation, (ix) academic dishonesty, and (x) exposure of sensitive personal information (address, SSN, banking data).

These definitions are intended to ensure the law focuses tightly on child welfare, robust supervision, and the specific online conduct most consistently identified by researchers as leading to harm.
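
As a minimal illustration, the sketch below encodes the coverage test implied by the “Large Social Media Platform” definition above. It is a hypothetical toy, not language from the bill: the function name, parameters, and the simplified treatment of the child-use and media-sharing conditions are assumptions made for clarity.

```python
# Hypothetical sketch of the coverage test in H.R. 2657's definitions.
# The thresholds restate the bill; everything else (names, simplifications)
# is an illustrative assumption, not statutory text.

MONTHLY_USER_THRESHOLD = 100_000_000      # >100M monthly global active users
ANNUAL_REVENUE_THRESHOLD = 1_000_000_000  # >$1B gross annual revenue

def is_large_social_media_platform(monthly_active_users: int,
                                   annual_gross_revenue: float,
                                   prohibits_child_use: bool,
                                   enables_media_sharing: bool) -> bool:
    """True if a platform plausibly falls within the bill's coverage definition."""
    if prohibits_child_use or not enables_media_sharing:
        # The definition requires that child use be permitted and that users
        # can share detectable media with others met through the service.
        return False
    return (monthly_active_users > MONTHLY_USER_THRESHOLD
            or annual_gross_revenue > ANNUAL_REVENUE_THRESHOLD)

# A platform with 150M users is covered regardless of revenue;
# one with 50M users and $0.5B revenue is not.
assert is_large_social_media_platform(150_000_000, 5e8, False, True)
assert not is_large_social_media_platform(50_000_000, 5e8, False, True)
```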

API and Data Requirements
Large social media platforms must, within 30 days of the Act’s effective date (or upon reaching the coverage threshold), make available a suite of real-time application programming interfaces (APIs) for registered third-party providers. This access enables:

Delegated account management: Parents or guardians (or, if aged 13-16, children themselves) can authorize software to change privacy, age, or marketing settings, and to review or configure account interactions.

Data transfer: Platforms must allow secure data transfers no less often than hourly.

Ongoing access: Once authorized, access continues unless delegation is revoked, the child account is closed, the provider rejects the delegation, or the provider fails continued FTC requirements.

Security and privacy: All transferred data must be managed and stored within the U.S.; deletion must occur within 14 days (or up to 30 days after account closure), and only the minimum necessary data can be shared.
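
The bill mandates outcomes rather than a concrete interface, so any code is necessarily speculative. Purely as a hypothetical sketch of how a registered provider might satisfy the transfer-frequency and retention requirements above, consider the loop below; the fetch_user_data stub, storage layout, and interval constant are illustrative assumptions, not part of H.R. 2657 or any real platform API.

```python
# Hypothetical provider-side loop. Nothing here is specified by the bill;
# it only mirrors two numeric requirements: poll at least hourly, and
# delete stored user data within 14 days.

import time
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=14)   # statutory deletion window
POLL_INTERVAL_SECONDS = 3600     # "no less often than hourly"

local_store: list[dict] = []     # records: {"fetched_at": datetime, "payload": dict}

def fetch_user_data(delegation_token: str) -> list[dict]:
    # Stub: a real provider would make an authenticated HTTPS request to
    # whatever endpoint the platform exposes under the Act. Illustrative only.
    return []

def purge_expired(now: datetime) -> None:
    # Drop anything older than the 14-day retention window.
    local_store[:] = [r for r in local_store if now - r["fetched_at"] < RETENTION]

def poll_loop(delegation_token: str) -> None:
    while True:
        now = datetime.now(timezone.utc)
        for payload in fetch_user_data(delegation_token):
            local_store.append({"fetched_at": now, "payload": payload})
        purge_expired(now)                 # enforce deletion on every cycle
        time.sleep(POLL_INTERVAL_SECONDS)  # at-least-hourly cadence
```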

Oversight, Registration, and Auditing

Third-party safety software providers must:
Register and demonstrate compliance with FTC requirements (U.S. base, no foreign control).
Pass security reviews by qualified independent auditors; submit annual audit summaries to the FTC and a redacted summary to the public.
Use data only for stated protective purposes; promptly inform guardians and affected children of relevant data disclosures, unless such notice would present risks to the child.

The FTC is tasked with:
Issuing rules and guidance within 180 days.
Overseeing registration and audit compliance.
Undertaking biannual compliance reviews for both platforms and safety software providers.
Operating a complaint portal for children, guardians, software providers, and platforms to allege violations.
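
As a concrete, and entirely hypothetical, illustration of the registration and audit requirements above, a provider might track its own compliance posture with a record like the following. The field names and the 365-day audit heuristic are assumptions for illustration only; the actual registration schema would be set by the FTC rulemaking noted above.

```python
# Hypothetical compliance record for a third-party safety software provider.
# Field names and the annual-audit heuristic are illustrative assumptions.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ProviderRegistration:
    name: str
    us_based: bool            # U.S. business base required
    foreign_controlled: bool  # non-foreign ownership required
    last_audit: date          # last independent security review

    def eligible(self) -> bool:
        """Meets the basic statutory preconditions for FTC registration."""
        return self.us_based and not self.foreign_controlled

    def audit_due(self, today: date) -> bool:
        """Annual audit summaries are required; flag once a year has passed."""
        return today - self.last_audit >= timedelta(days=365)

# Example: a U.S.-based provider audited 400 days ago is eligible but overdue.
p = ProviderRegistration("ExampleGuard", True, False,
                         date.today() - timedelta(days=400))
assert p.eligible() and p.audit_due(date.today())
```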

Enforcement Mechanisms
Violations are treated as unfair or deceptive acts under Section 18 of the FTC Act, carrying the full range of federal penalties: substantial civil fines, cease-and-desist orders, and other remedies as appropriate. The bill preserves FTC authority under other laws, but circumscribes private civil liability, especially for platforms acting in good-faith compliance with the law and FTC guidance [12].

Intended Impact and Policy Rationale
Safety as the Supreme Goal
The ultimate intent of Sammy’s Law is to provide parents and guardians with the choice to use third-party safety software to monitor and manage their children’s social media accounts, with the aim of averting suicides, mental health crises, trafficking, substance abuse, and abuse or exploitation that often originates online. Parental activists and advocates argue that third-party monitoring applications—already effective in many contexts—have been stymied by platform refusals to allow API-level access, effectively handicapping at-risk families and children [4].

Mandatory API access breaks this stalemate, setting a federal baseline to ensure that all major platforms must cooperate with parents who wish to deploy such tools. By incorporating robust security and privacy standards, the bill attempts to balance the protection of vulnerable children with respect for data security and the autonomy of teens age 13 or older.

Constraining the Scope
The law takes care not to subject every digital interaction to regulation. It excludes:

Platforms solely for professional or commercial transactions.

News and information sites without direct messaging features.

Small platforms (those below both the 100 million monthly user and $1 billion annual revenue thresholds), focusing instead on large-scale social media whose design and reach have been most associated with the harms in question.

Political, Social, and Legal Debates
Support and Advocacy
Congressional Support: The bill’s sponsors and cosponsors come from both parties and diverse districts. They are united by concern for child safety and testimonials from families who have lost children to dangers beginning on social media platforms [2].

Parent and Child Advocacy Groups: Organizations like the Organization for Social Media Safety have championed the law, emphasizing the ongoing toll of unsafe digital environments on young people and the need for interoperable, cross-platform tools for parents to intervene at critical moments—such as in cases of suicidal ideation, grooming, or drug solicitation [1].

Expert Community: Many pediatricians and mental health professionals advocate for greater parental involvement and technological solutions to the social media crisis. Reports and peer-reviewed studies cited by the bill’s proponents document a strong correlation between unmoderated exposure and increases in bullying, self-harm, and risky behaviors.

Industry and Civil Liberties Responses
Social Media Industry: Large platforms have long resisted external monitoring, citing the following concerns:

User privacy (especially of teens and their social contacts).

Proprietary nature of platform algorithms and data structures.

Security risks related to open API access.

Technical challenges in providing standardized, secure, and real-time data feeds.

While some platforms (notably Meta/Facebook and Instagram) offer partial or opt-in parental controls, others (such as Snapchat and TikTok) have until now categorically denied external access, drawing criticism from parents and legislators.

Privacy and Civil Liberties Advocates: Privacy groups express concern that third-party monitoring could invade the privacy of children and, critically, expose sensitive information (e.g. about sexual orientation, health issues, or family conflicts) to parents or guardians in potentially harmful ways. There are also fears around:

Potential “outing” of LGBT youth or those seeking reproductive or mental health resources in hostile environments.

Over-surveillance leading to breakdowns in trust between children and parents, or even legal or physical danger for some young people.

The “normalization” of third-party surveillance, which could erode digital rights more broadly.

Emma Llansó of the Center for Democracy and Technology and other privacy experts warn that the broad mandate for monitoring could create “a chilling effect,” risking increased family conflict or even criminalization where sensitive data is mishandled or misinterpreted.

Third-Party Safety Software Providers: Companies behind apps like Bark or Qustodio view the law as an opportunity, but also recognize the challenge of building systems compliant with the FTC requirements for registration, security, data localization, and deletion. They argue that in practice, millions of families are already successfully using these tools—outside the closed platforms—and that standardized, vetted access will only enhance safety and privacy controls.

Economic, Technical, and Administrative Implications
Cost and Technical Considerations: Social media companies face the complex and potentially expensive challenge of developing, maintaining, and supporting secure public APIs that offer real-time or near-real-time data streams compliant with both federal and FTC rules. There are also significant ongoing costs in ensuring these APIs are secure against abuse or breaches and that delegation, revocation, and notification mechanisms work as intended. Some platforms may pass these compliance costs on to users or use technical challenges as an argument for delay.

Budgetary Impact: As of August 2025, the Congressional Budget Office (CBO) has not released a public estimate of the bill’s projected fiscal impact. However, the administrative burden falls primarily on the FTC for registration, oversight, audit review, and complaint management, implying some increase in federal expenditures, potentially offset by registration fees and fines for non-compliance.

Legal and Constitutional Issues
Federal Preemption and States’ Rights: Sammy’s Law asserts a federal “one national standard” for provision of third-party API access, preempting state laws on the same subject, but not broader state consumer protection, trespass, contract, or fraud statutes. This is intended to prevent a patchwork of conflicting requirements for large platforms, but also provokes debate over federalism and local innovation.

Constitutional Concerns: Critics have raised possible First Amendment challenges, particularly as the bill could have the practical effect of chilling youth expression or facilitating compelled speech by platforms. Others argue that by enhancing parental choice, the law enhances—not diminishes—voluntary family decision-making. The law further attempts to thread the constitutional needle by not mandating content monitoring, but focusing on access for guardians and only for an enumerated list of harms.

Data Security: Required exclusive U.S. storage, rapid deletion, and third-party audit of safety software are meant to mitigate risks of data breaches, abuse, or unauthorized government access, but civil liberties groups remain vigilant about the risk of secondary use or spillover into non-child welfare domains.

Comparison with Other Legislation and International Context
U.S. Federal Efforts
Aside from Sammy’s Law, several bills in Congress seek to address overlapping problems in online child safety, including:

Kids Online Safety Act (KOSA): Focuses on a “duty of care” for platforms to prevent a wide range of harms to minors, mandates strong parental tools, audit mechanisms, and platform transparency for content recommendation algorithms [15]. It differs in its structure and does not mandate third-party API access, but is part of a broader bipartisan push.

Combating Harmful Actions with Transparency on Social (CHATS) Act: Focuses on tracking criminal incidents, such as drug-related crimes, committed via social media.

Sammy’s Law is unique in explicitly requiring universal, standardized, and secure APIs for delegated third-party management, codifying external monitoring as a legal right rather than as a permissible or voluntary platform feature.

State Laws
Illinois' Let Parents Choose Protection Act emulates the core provisions of the federal proposal, highlighting a growing willingness among states to legislate where Congress is slow or platforms have been unresponsive. State laws on social media age verification, data rights, and parental controls are proliferating, but risk creating inconsistent compliance challenges for companies operating nationwide or globally.

International Comparison
UK Online Safety Act (OSA): In effect since July 2025, the UK's OSA mandates wide-ranging content moderation, age verification, and “duty of care” obligations for platforms regarding children. Its implementation has been widely watched in the U.S., as it has led to significant operational changes, including mandatory age checks, content restrictions, and even market withdrawal by some platforms uncomfortable with compliance requirements [17].

The UK law is broader (addressing a wider array of content, including hate speech and adult material accessed by adults as well as minors), but does not mandate the technical mechanism of third-party API access for parental delegation, as found in Sammy’s Law.

Early reports from the UK show pushback from privacy groups, users, and some providers, but also heightened awareness of online child safety [17].

EU Digital Services Act (DSA): The DSA, effective February 2024, obliges major platforms to undertake systemic risk assessments, respond to “trusted flaggers,” and adopt certain transparency and reporting measures with respect to harmful content—even when originating outside the EU. Its enforcement has also raised “Brussels Effect” expectations, as global companies align worldwide policies to meet stringent EU standards.

Lessons for U.S. Lawmakers: International experience demonstrates that sweeping online safety laws can lead to both major changes in platform operation (including user access, data protection, and algorithmic transparency) and heightened debate over privacy, free expression, and the appropriate scope of state intervention. The international trend is unmistakably toward increased compliance obligations, but the structure and specific mechanisms—like the API access required by Sammy’s Law—vary among jurisdictions.

Social and Media Coverage
Sammy’s Law has engendered considerable and often emotional media coverage, fueled by the advocacy of parents like Sam Chapman and Dr. Laura Berman, whose high-profile interviews and lobbying have made the issue both personal and public [4]. Coverage in major outlets like NBC News and Fox situates the legislation within a broader conversation about parental rights, youth privacy, and the morality of platform design and revenue motives.

Proponents frame the law as simple common sense, emphasizing “a right to know” and intervention as life-saving, rather than invasive.

Critics foreground unintended consequences and the risk of “nightmare scenarios” should data fall into the wrong hands, especially for marginalized or at-risk youth.

Both tragic and successful interventions are cited, lending urgency and moral weight to the legislative debate.

Commentary from Experts and Stakeholders
Supporters emphasize that a negligible minority of parents or guardians abuse monitoring privileges, while millions of children remain exposed to unrestrained online risks. Proponents argue that existing parental control frameworks are inconsistent, proprietary, and fail to interoperate across platforms or utilize the most advanced machine learning detection tools for risk behaviors.

Skeptics from the privacy and technology policy fields repeatedly caution that the law, while noble in intent, may create substantial new risks—either by normalizing third-party surveillance or inadvertently targeting vulnerable populations.

Academia: Scholars in law, child development, and public policy continue to debate not merely the technical, but the developmental consequences of increased surveillance. Studies suggest that while targeted interventions are often beneficial, excessive parental oversight may reduce child autonomy or stoke conflict, particularly during adolescence. The law’s attempt to strike a balance—by making parental delegation optional, including for minors aged 13-16—is seen as a thoughtful, if imperfect, accommodation.

Industry: As with previous regulatory overhauls (such as the EU’s GDPR or California’s CCPA), large platforms are bracing for the technical and administrative strain of dual or divergent national standards. Smaller providers fear exclusion from the market due to the cost of compliance, while third-party software developers see a new avenue for competition—albeit laced with regulatory risk.

Conclusion: The Future of Child Online Safety in the U.S.
Sammy’s Law embodies the accelerating intersection between tech accountability, child protection, and digital civil liberties. Its passage would mark a turning point in U.S. regulatory strategy, shifting from platforms’ self-imposed measures toward robust, regulated, and standardized third-party safety interventions. Even at the legislative “introduced” stage, the bill has focused attention on the limitations of current approaches and the urgent need for parental empowerment in a tech landscape that has, too often, treated child safety as an afterthought [4].

If enacted, Sammy’s Law will:

Immediately compel major social media companies to enable third-party parental monitoring via standardized APIs.
Mandate robust vetting, auditing, and U.S.-only data storage for safety software.
Shift the locus of control from private platforms to parents and guardians—albeit with significant safeguards and limitations.

Major obstacles remain: legislative inertia, opposition from both privacy and industry advocates, technical implementation challenges, and the ever-present risk of regulatory overreach or underenforcement. The U.S. will need to reconcile these issues with global trends in online safety law—balancing the moral imperative to save lives against the foundational values of privacy, autonomy, and free expression.

In sum, Sammy’s Law is a watershed in the politics of child digital safety. Whether enacted as written or further amended, it offers a revealing case study in how legislatures, technology companies, activists, and families are renegotiating the limits of public and private intervention in the digital lives of children in 2025 and beyond.

Appendix: Key Features and Effects Comparison Table
Provision/Requirement | Description | Primary Intended Effect | Notable Stakeholder Concerns
Large Platform Coverage | 100M+ global users or $1B+ annual revenue | Focuses regulation on technologically relevant platforms | Smaller platforms exempt; possible future threshold shifts
API Mandate | Standardized, FTC-compliant, secure third-party API access | Enables cross-platform, real-time parental delegation and monitoring | Technical cost, risk of misuse or abuse
Delegation Authority | Parents or children (13+) can authorize external monitoring | Parental tools for intervention, child safety | Teen privacy, risk of family conflict
Data Use/Disclosure Limits | Data only for safety; deletion after 14 or 30 days; U.S. storage only | Limits abuse, aligns with privacy best practices | Compliance enforcement, risk of under-deletion
Enforcement by FTC | Treated as unfair/deceptive practice; penalties, audits | National enforcement, uniform rules | FTC resource constraints
Federal Preemption | Overrides conflicting state laws on platform API access | Legal certainty for large platforms | State resistance, limited local flexibility
Indemnification for Platforms | Shields platforms acting in good faith with delegation and data transfer | Removes legal ambiguity/hesitation for cooperation | Victim remedy limitations, perceived impunity
Public Audit Summaries | Annual, redacted audit summaries posted publicly | Accountability, transparency | Proprietary data risk, audit cost
Alert Categories | Enumerated types of harm (e.g., suicide, trafficking, academic dishonesty) | Focuses on established risk vectors | False positives, over-disclosure
This report is based on the legislative text of H.R. 2657, public statements by sponsors and advocates, media coverage, and current expert and institutional commentary as of August 2025.

