Sunday, May 3, 2026
NewsWave
  • Home
  • World
  • USA
  • Business
  • Sports
  • More
    • Entertainment
    • Technology
  • Pricing
  • Login

Court rules against Grok over AI-generated images

26 March 2026
in Australia

A Dutch court has issued a preliminary injunction against Elon Musk’s xAI and its chatbot Grok, prohibiting the generation and distribution of images that depict individuals, including children, in sexualized poses or undressing without their explicit consent. The ruling, handed down by the Amsterdam Court, may set a legal precedent in Europe on the responsibilities of AI companies for the creation of sexualized content, and comes amid a growing wave of complaints about Grok. If xAI fails to comply with the order, it faces fines of 100,000 euros (approximately AUD 166,500) per day.

The case was initiated by Offlimits, a Dutch nonprofit focused on combating online sexual abuse. xAI’s legal representatives argued that the company cannot prevent all misuse of its tools, but the court found its safeguards inadequate after Grok’s ability to create non-consensual images was demonstrated at a hearing in March.

Why It Matters

This ruling reflects increasing legal scrutiny of AI technologies and their potential to facilitate non-consensual sexual content. As European regulators tighten their grip under the EU’s Digital Services Act, the spotlight is on companies like xAI to ensure their tools do not contribute to illegal activities. The European Commission has launched a formal investigation into X regarding Grok’s deployment in the EU, primarily focusing on the risks associated with manipulated sexually explicit images. The court’s decision aligns with broader efforts, including a recent European Parliament initiative to ban AI tools that create or manipulate sexual imagery, highlighting the urgent need for accountability in digital content generation.

Tags: AI Generated, Court, crime, Grok, Images, News, rules
NewsWave

News Summarized. Time Saved. Bite-sized news briefs for busy people. No fluff, just facts.

Copyright © 2026 News Wave
News Wave is not responsible for the content of external sites.

