Minnesota Bans AI Nude-Image Tools, Fines Up to $500K
House File 1606 bars apps and websites from creating realistic AI nude images of identifiable people; penalties include up to $500,000 per use and up to triple damages.
The Minnesota Senate voted 65-0 to approve House File 1606 on Thursday and sent the bill to Governor Tim Walz. If he signs it, the law will take effect Aug. 1 and will apply to new cases from that date forward.
The measure prohibits apps, websites and other software from generating realistic nude images of identifiable people. Companies that operate or control online platforms or software may not allow users to access such tools, generate images on a user’s behalf, or advertise or promote the services.
The statute creates a private right of action for people depicted in AI-generated nude images. Courts may award compensatory damages, including for mental anguish, punitive damages, attorney fees, injunctive relief and up to three times actual damages.
The state attorney general may bring enforcement actions and seek civil penalties of up to $500,000 per use. The bill directs collected penalties to the state general fund, with appropriations earmarked for victim services, including support for survivors of sexual assault, domestic violence and child abuse.
Lawmakers designed the measure to target image-generation tools that require little technical skill and are widely accessible, including to minors. The law preserves online platforms’ legal protections under Section 230 of federal law.
Robert Weissman, co-president of Public Citizen, described the tools as “99% targeting women, over 90% of whom are under 18,” and called them a vehicle for intimidation and harassment with severe psychological consequences. He said state-level enforcement can work alongside federal rules and allow local authorities to act more quickly in individual cases.
The bill follows a wave of incidents and litigation involving AI-generated intimate imagery. Platforms and AI developers have faced federal class actions and consumer protection lawsuits alleging their systems produced nonconsensual sexualized content. At the federal level, the Take It Down Act, signed into law in May 2025, criminalizes distribution of nonconsensual intimate images and provides a civil damages avenue for victims, creating overlapping state and federal enforcement paths.
The governor’s office had not commented as of the Senate vote.