Does give permission for AI to use artists' work

Just wondering about your policy regarding AI… A lot of work is shared here in this community… What policies do you have in place to protect artists? For example, do you have a NoAI tag one can put when sharing one’s work here etc.?

There’s nothing we or any other site can do about it. NoAI tags only apply if AI developers respect them, which they don’t.


It’s a public forum… and BlenderArtists itself doesn’t use… :thinking: … okay, i don’t know that… AFAIK it only reposts some art on some other social media (i don’t remember any except BlenderNation… because i don’t care about the others :stuck_out_tongue_winking_eye: )

And if you refer to ArtStation… wasn’t it Epic itself who… used it… i’m not sure…

Anyway: you may add a visible and/or invisible watermark, or even overlay one in such a small amount that image recognition fails…

There were articles about that back then… i wonder why nobody came up with that idea… tags are useless… make it unrecognizable for non-humans…
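To make the overlay idea concrete, here is a minimal, hypothetical sketch in Python/NumPy that adds a faint pseudorandom perturbation to an image. Note that a real adversarial image is computed against a specific recognition model, so random noise like this would not actually fool one; it only illustrates how subtle such an overlay can be.

```python
import numpy as np

def overlay_noise(image: np.ndarray, strength: int = 2, seed: int = 0) -> np.ndarray:
    """Add a faint pseudorandom perturbation to an 8-bit RGB image.

    Naive illustration only: an actual adversarial perturbation is
    optimized against a particular model, not drawn at random.
    """
    rng = np.random.default_rng(seed)
    # Per-pixel offsets in [-strength, strength]
    noise = rng.integers(-strength, strength + 1, size=image.shape, dtype=np.int16)
    perturbed = image.astype(np.int16) + noise
    # Clamp back to the valid 8-bit range
    return np.clip(perturbed, 0, 255).astype(np.uint8)

# A flat gray test "image": the perturbed copy stays visually near-identical
img = np.full((64, 64, 3), 128, dtype=np.uint8)
out = overlay_noise(img)
```

Each channel changes by at most `strength` levels out of 255, which is below what most viewers would notice on a typical display.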

Ah yes, for example here: Adversarial images can cause major problems:

in other cases they use an actually different image…

I’m happy to consider adding a no-ai tag, but as Joseph points out this creates a false sense of security as the app developers are not honoring such tags yet.

Also, if you read our Terms of Use, you’ll see that all work here is posted under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. This means that legally speaking, such parties are already not allowed to use your work (since they’re commercial entities, and they do not provide attribution). So technically a protection is already in place, but I doubt these parties honor the CC licenses either…

No, that’s not correct.


Okay, i remembered wrongly… (i was already unsure…)

+1 for referring to the CC license

The LAION dataset, which is being used by some visual AIs like Stable Diffusion, is built from data gathered by the bot “Common Crawl”.
Maybe it would help a little bit to disallow CCBot in the robots.txt? I know this doesn’t really solve the problem, since we would always be running after all the different bots that crawl for AI…
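For reference, blocking Common Crawl would be a two-line addition to robots.txt (“CCBot” is the user-agent string Common Crawl documents for its crawler; honoring robots.txt is, as noted, entirely voluntary on the crawler’s side):

```
User-agent: CCBot
Disallow: /
```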


It’s a good idea, but robots.txt is also voluntarily honored. Lots of bad actors ignore it. I don’t mean to throw around any accusations, but the people behind Laion aren’t famous for ethical behavior :grimacing:


Even if it does nothing in practice, I personally would appreciate having it. Provided it’s not a hassle to do so. Same with robots.txt.



Adding the tag to published artwork makes for a very clear and explicit statement, issued by the creator, regarding whether they are willing to grant the respective rights to AI developers.

Depending on what future legislation and court decisions might bring, I figure that might put a grain of additional pressure on them. It might be hard to explain in a courtroom why you consistently and intentionally ignored such explicitly stated denials of usage rights; I reckon that prospect might unfold some deterrent potential.


While there’s nothing I personally can do to implement it, I think @0451 and @Helmut_S are correct here :slight_smile:

(Using XY to not need any sensitive word detection :wink:)

The real problem wouldn’t be solved… and as @bartv mentioned, it would lead to a false sense of security…
Even my watermark suggestion can already be defeated… see

The problem which evolves from that problem is that artists have to think about countermeasures to secure their art, which does influence the art… ( :nauseated_face: ) so the freedom of art itself is in danger… (which i think most of the people who are so friendly to this technology as applied to art aren’t aware of… in other areas it might be reasonable… for example foreign-particle detection via cameras in food production ??)…

But: Nobody likes art pieces with no-xy all over the place ( not only as easter egg, but the artist spends, what, half the time adding no-xy text into the image along with his signature… ← which were already also reproduced in xy-generated images…)

Of course everybody could add a #no-xy tag to one’s post to maybe add an additional

I’d like to know if we should:

  • Add this as a site-wide meta tag (easiest to do, aligns with my personal view, but not possible to opt out for individual users - but why would they?)
  • Add this as an account-wide option (enabled by default, opt-out)
  • Add this as a tag you can add to individual posts
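For context, the site-wide variant would presumably use the informal “noai”/“noimageai” directives that some art platforms have popularized, either as a page-level meta tag or as an HTTP response header; note these directives are not part of any formal robots standard, and compliance is voluntary:

```html
<!-- In every page's <head>: -->
<meta name="robots" content="noai, noimageai">
```

The equivalent server-side form would be an `X-Robots-Tag: noai, noimageai` response header.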

Edit: I was made aware that such a tag may not be compatible with the CC license that covers BA. I’ll talk to some people to learn more about that.


Aside from that… is it even technically possible to set any tags like this… and what’s next ??

So maybe someone want’s to:

  • do-not-use-for google (at all) , apple, duckduckgo-image-search, A-and-I
  • allowed-for duckduckgo-text-search, ArtStation (to get me a job),

and then:

  • only-from-december-to-january

okay :stuck_out_tongue_closed_eyes: okay that’s too far… but just some thoughts…

I feel that’s a straw man argument, and not helpful to this specific conversation.

What i meant was: is it technically possible to add anything in any way to make any post “tagged” for anything other than BA… and then: what would this help, if any external web crawler just ignores it?

And now i even see, because of the edit regarding the CC license, that there are just more problems… so:

… nice idea… but by far too much work…

So for BA it’s just impossible to solve this problem in the first place… aside from any announcements, permissions or anything…

True, bad actors can still ignore it, but we can at least signal they’re not allowed to by updating our terms of use and by adding the industry-standard tag. We do not have the legal power to fight abuse anyway.


Lawyers are good at bending rules. I think explicitly stating in the TOS something along the lines of
“artwork is not to be used as training data for A.I., machine learning and deep learning algorithms without the explicit permission of the artist”
would be far better than the no A.I. tags. Leaves less wiggle room.

It would almost certainly be ignored by those companies but if a lawsuit was to happen at some point in the future, I think this would offer far more weight in a lawsuit.

Edit: Just seen the post above :sweat_smile:


IMO, for the sake of working towards a legal deterrent, tags should be added explicitly, by individual artists, to make sure it is understood as an intentional, explicit statement made by the individual artist, regarding denial of rights for that specific use, so the current policy of ignoring everything, in a courtroom, could ideally be seen both as a violation of BA’s terms of use and a breach of every single copyright statement made by individual artists.

IMO the best (only) option we as artists have right now is to instill fear in those AI data scavengers, fear of future lawsuit “nukes” built against them regarding direct and indirect copyright infringement in a gazillion cases, with the potential of destroying their lives, economically, once and for all, on this planet.

How so?

If true it would be reason to change that license rather than abandon the tag.


According to my understanding, using the images from this forum to train neural networks for commercial purposes isn’t allowed because of the license which only allows non-commercial usage (but also requires attribution).
Do you think explicitly stating that one use case is not allowed would change anything?
In my opinion, it would be a symbolic gesture at best.

Using it explicitly provides proof that the denial of rights is a conscious, willful act by the artist, similar to hitting that “I’ve read the terms of use” checkbox or the “place legally binding order” button.

I’m no lawyer, but I figure it might make a difference compared to less explicit forms of legal protection which might often go unnoticed by users, or are not understood properly in all their implications.

Yes, it’s a gesture. For now. Maybe forever, who knows. If nothing else, sowing a trace of doubt and unrest in some minds counts for something already …
