House Republicans Introduce Weak SECURE Data Act on Privacy

House Energy & Commerce Committee Republicans have introduced the SECURE Data Act, a draft federal privacy bill that critics say represents a step backward in protecting consumer privacy.

The legislation largely mirrors weak privacy laws in states such as Kentucky and Virginia, allowing companies to write much of their own rules for handling data. The bill relies mainly on user consent obtained through privacy policies, perpetuating the unrealistic expectation that consumers can fully understand and control complex data practices on their own.

Limited Privacy Protections and Broad Exemptions

The SECURE Data Act’s data minimization requirements only obligate companies to limit data collection to purposes disclosed in privacy policies—a standard that is already mandated under current federal and state laws. The bill does not prohibit manipulative design practices known as dark patterns, which can influence how users make privacy choices.

Several significant exemptions weaken the bill further. Data processing tied to consumer-requested products or services, and data used for internal research to develop or improve technology, are excluded from regulation entirely; the latter category often covers AI training. Industry advocates pushed for these carve-outs, which critics argue could render the bill ineffective.

Preemption of Stronger State Laws

The Act includes broad preemption clauses preventing states from enacting or enforcing privacy laws “related to the provisions” of the Act. This could block states like Maryland, Virginia, Oregon, California, Texas, and Illinois from maintaining tougher data privacy and civil rights protections.

Such sweeping preemption concerns privacy advocates, who point to ongoing state lawmaking on biometric privacy and AI transparency, areas that this federal proposal either fails to cover or would explicitly undermine.

Enforcement and Civil Rights Protections

The bill lacks a private right of action, removing an important enforcement mechanism that allows individuals to sue for privacy violations. Previous bipartisan privacy bills included limited private rights of action designed to balance enforcement with business concerns.

Moreover, the SECURE Data Act does not include meaningful civil rights protections related to data use. It simply reaffirms that illegal discrimination remains unlawful under existing laws but does not address how data and AI-driven decisions could perpetuate bias, a topic covered in earlier bipartisan proposals.

Why it matters

The SECURE Data Act’s approach risks entrenching inadequate privacy standards at the federal level and could stifle ongoing state-level innovations in data protection. Its broad exemptions, lack of private enforcement, and preemption clauses may undermine consumer privacy and allow continued discriminatory data practices amid increasing concerns about AI and data ethics.

About the author

Giorgio Kajaia

Giorgio Kajaia is a writer at Goka World News covering world news, U.S. news, politics, business, climate, science, technology, health, security, and public-interest stories. He focuses on clear, factual, reader-first journalism grounded in credible reporting, official statements, publicly available information, and relevant source material.