Style Theft via Generative AI: When Machines Learn You | Part 3/6

  • Writer: Marc Morgenstern
  • Dec 18, 2025
  • 3 min read

Article 3 of 6: Ways Creators Are Having Their Art Stolen Online


When AI Learns Your Style Without Asking

It’s bad enough that creators must accept some level of online risk — a repost here, an uncredited share there.


But now generative AI has introduced an entirely new threat:


Style theft.


Not imitation. Not inspiration. Generative AI models are absorbing your entire artistic voice — your palette, lines, textures, shapes, and emotional cues — and reproducing it on demand, all with a prompt.


Why This Series Exists

While trying to reach their audience, many creators are discovering they’ve already lost control of their own work.


This series explores six major ways creative work is stolen online, backed by real-world case studies — and what creators can do to protect themselves.


The Machines Are Learning Us

Generative AI tools like Midjourney and Stable Diffusion train on enormous datasets scraped from across the internet. If your creative work is online, it has probably already been used in an AI training set — not because you gave permission or were even credited. Simply because it was accessible.


A Quick Look Back: Art School vs. AI Scraping

Studying the masters has always been part of art education. Da Vinci trained apprentices, and he — with input from his patrons — controlled how many recreations were made, each carrying his original signature. Modern students still learn by copying techniques.

But claiming authorship of someone else’s style?

That would be art forgery — which is still illegal, no?


AI training is not education. It’s extraction.

AI models learn by scraping millions of images — including copyrighted work — without asking permission. They analyze your artistic fingerprint and generate pieces that look like your hand drew them… even when you didn’t touch a brush.

Once your style enters a dataset, it can be replicated endlessly. There is no opt-out. No consent screen. No undo button.


Quick Definition: AI Training

AI training means feeding huge datasets of images, text, or audio into a machine so it can learn patterns and generate new content. Many datasets include copyrighted artwork scraped without permission, enabling AI to imitate an artist’s style without credit or compensation. 


Case Study: Greg Rutkowski — The Artist Used 400,000 Times as a Prompt

Fantasy illustrator Greg Rutkowski is known worldwide for sweeping, atmospheric art used in media, games, and books. Then he started seeing prompts like: “In the style of Greg Rutkowski.”

At first, he was flattered. Then he saw the numbers: his name had appeared in prompts roughly 400,000 times. AI-generated renditions of his style flooded social feeds. Worse, clients couldn’t distinguish AI-Greg from the real Greg.

“It feels like losing your artistic identity overnight.”— Greg Rutkowski

He lost his style — and with it, real commissions.


Case Study: Hollie Mengert — When Your Style Becomes a Preset

Children’s book illustrator and Disney artist Hollie Mengert is known for soft, warm, storybook imagery. Then an AI model called “Hollie-Mix” appeared online, trained specifically to mimic her. She didn’t collaborate. She didn’t consent. She didn’t even know. The creators openly admitted they scraped her illustrations to build a generator capable of producing “Hollie-style” art instantly. Suddenly characters, “Disney-style” concepts, OC designs, and products were appearing online in her style — except she hadn’t drawn them.

“It was unsettling. People were generating art that looked like mine, faster than I could draw it.”— Hollie Mengert

Which raises the question: why hire the original artist when a machine can imitate her instantly? She didn’t lose artwork. She lost ownership of her look — the thing her career depends on.



Why This Keeps Happening

AI companies argue:

  • “The dataset was open.”

  • “Training is fair use.”

  • “Images online are public.”


Creators respond:

  • Public ≠ permission

  • Accessible ≠ ethical

  • Learnable ≠ licensable


Until regulation catches up, creators are unprotected — even as their work powers billion-dollar models.


Further Reading: The Verge — AI Style Replication & Lawsuits


Artist Armor Promise

Your style is your identity — not a prompt.


We can’t un-scrape the past.

But creators can protect future work:

  • Visible copyright notices and watermarks

  • Clear licensing language

  • Low-resolution preview files

  • Embedded metadata

  • Audit trails
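For the metadata and audit-trail points above, here is a minimal sketch — in Python, using only the standard library — of how a creator might fingerprint a file and record a timestamped entry tying it to their license terms. The file name and license wording are hypothetical examples; a real workflow would read the actual image bytes from disk.

```python
import hashlib
import json
import time

def fingerprint(data: bytes) -> str:
    """SHA-256 digest of the artwork bytes; any later copy can be compared against it."""
    return hashlib.sha256(data).hexdigest()

def audit_entry(filename: str, data: bytes, license_note: str) -> dict:
    """One timestamped record linking a file, its hash, and its license terms."""
    return {
        "file": filename,
        "sha256": fingerprint(data),
        "license": license_note,
        "recorded_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }

# Hypothetical example: register an artwork and serialize the trail entry
art_bytes = b"...raw image bytes would go here..."
entry = audit_entry("dragon_study.png", art_bytes,
                    "All rights reserved. No AI training permitted.")
print(json.dumps(entry, indent=2))
```

An append-only log of entries like this won’t stop scraping on its own, but it gives a creator dated, verifiable evidence of what they published and under what terms.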

In the meantime:

  • Track where your art appears

  • Join creator-driven ethical AI groups

  • Support platforms that prioritize creator rights


Protect your art. Reclaim your style.


ArtistArmor.com - Creativity Defended.

Have You Experienced AI Style Theft? You’re not alone. Tell your story in the comments, reach out via DM, or email us at info@artistarmor.com.

 
 
 
