Just wondering about your policy regarding AI… A lot of work is shared here in this community… What policies do you have in place to protect artists? For example, do you have a NoAI tag one can put when sharing one’s work here etc.?
There’s nothing we or any other site can do about it. NoAI tags only apply if AI developers respect them… which they don’t.
It’s a public forum… and whether BlenderArtists itself uses AI… okay, I don’t know that… AFAIK it only reposts some art on some other social media (I don’t remember which except BlenderNation… because I don’t care about the others).
And if you’re referring to ArtStation… wasn’t it Epic itself who used it? I’m not sure…
Anyway: you may add a watermark, visible and/or invisible, or even add an overlay in such a small amount that image recognition fails…
There were articles about that back then… I wonder why nobody came up with that idea… tags are useless… make it unrecognizable for non-humans…
Ah yes, for example here: Adversarial images can cause major problems:
in other cases they use an actually different image…
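To make the overlay idea above concrete, here is a minimal sketch (my own illustration, not from any of the linked articles) of adding a low-amplitude noise overlay with NumPy. Note this only perturbs pixel statistics slightly; real adversarial attacks optimize the perturbation against a specific model, so treat this purely as a toy demonstration:

```python
import numpy as np

def add_noise_overlay(image, amplitude=2, seed=0):
    """Add a barely visible random overlay to an RGB image.

    Each pixel channel is shifted by at most `amplitude` levels,
    which a human viewer will not notice on a typical artwork.
    (Illustrative only: defeating an actual recognizer requires a
    model-specific adversarial perturbation, not plain noise.)
    """
    rng = np.random.default_rng(seed)
    noise = rng.integers(-amplitude, amplitude + 1, size=image.shape)
    # Work in int to avoid uint8 wrap-around, then clip back to [0, 255].
    return np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

# Tiny flat-gray test image instead of a real artwork.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
out = add_noise_overlay(img)
# The per-channel difference stays within +/- amplitude.
```

The same approach applied at amplitude 1–3 on a full-size image is effectively invisible to humans, which is exactly the trade-off the posts above are discussing.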
I’m happy to consider adding a no-ai tag, but as Joseph points out, this creates a false sense of security as the app developers are not honoring such tags yet.
No, that’s not correct.
Okay, I remembered wrongly… (I was already unsure…)
+1 for referring to the CC license
The LAION dataset, which is used by some visual AIs like Stable Diffusion (https://stability.ai/blog/stable-diffusion-v2-release), is built from data gathered by the “Common Crawl” bot: https://laion.ai/blog/laion-5b/ and https://commoncrawl.org/big-picture/frequently-asked-questions/.
Maybe it would help a little bit to disallow ccbot in robots.txt? I know this doesn’t really solve the problem, since we would always be chasing after all the different bots that crawl for AI…
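For reference, Common Crawl documents its crawler’s user agent as CCBot, so blocking it for the whole site would be just two lines in robots.txt (with the usual caveat from this thread that compliance is voluntary):

```
User-agent: CCBot
Disallow: /
```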
It’s a good idea, but robots.txt is also only voluntarily honored. Lots of bad actors ignore it. I don’t mean to throw around any accusations, but the people behind LAION aren’t famous for ethical behavior.
Even if it does nothing in practice, I personally would appreciate having it. Provided it’s not a hassle to do so. Same with robots.txt.
Adding the tag to published artwork makes for a very clear and explicit statement, issued by the creator, regarding his/her willingness to grant the respective rights to AI developers.
Depending on what future legislation and court decisions might bring, I figure that might put a grain of additional pressure on them. It might be hard to explain in a courtroom why you were consistently and intentionally ignoring such explicitly stated denials of usage rights; I reckon that prospect might have some deterrent potential.
(Using XY to avoid needing any sensitive-word detection)
The real problem wouldn’t be solved… and, as @bartv mentioned, it would lead to a false sense of security…
Even my watermark suggestion can already be defeated… see https://restb.ai/solutions/watermark-detection/ …
The problem which evolves from this is that artists have to think about countermeasures to secure their art, which in turn influences the art… so the freedom of art itself is in danger… (which I think most of the people who are so friendly toward this technology as applied to art aren’t aware of… in other areas it might be reasonable… for example foreign-particle detection via cameras in food production)…
But: nobody likes art pieces with no-xy all over the place (not only as an easter egg… the artist spends, what, half the time adding no-xy text into the image along with his signature… ← which have already also been found in xy-generated images…)
Of course everybody could add a #no-xy tag to one’s post to maybe add an additional…
I’d like to know if we should:
- Add this as a site-wide meta tag (easiest to do, aligns with my personal view, but not possible to opt out for individual users - but why would they?)
- Add this as an account-wide option (enabled by default, opt-out)
- Add this as a tag you can add to individual posts
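For the site-wide option, one emerging convention (introduced by DeviantArt, for example) is the `noai` / `noimageai` robots directive; applied site-wide it would be a single meta tag in every page’s head, though like the NoAI tags discussed above it only works if crawlers choose to honor it:

```html
<meta name="robots" content="noai, noimageai">
```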
Edit: I was made aware that such a tag may not be compatible with the CC license that covers BA. I’ll talk to some people to learn more about that.
Aside from that… is it even technically possible to set any tags like this… and what’s next??
So maybe someone wants to:
do-not-use-forgoogle (at all) , apple, duckduckgo-image-search, A-and-I
allowed-forduckduckgo-text-search, arstation (to get me a job), arrrt.io
okay okay that’s too far… but just some thoughts…
I feel that’s a straw man argument, and not helpful to this specific conversation.
What I meant was: is it technically possible to add anything in any way to make any post “tagged” for anything other than BA… and then: what would this help if any external web crawler just ignores it?
And now I even see, because of the edit regarding the CC license, that there are just more problems… so:
… nice idea… but by far too much work…
So for BA it’s just impossible to solve this problem in the first place… aside from any announcements, permissions or anything…
Lawyers are good at bending rules. I think explicitly stating in the TOS something along the lines of
“artwork is not to be used as training data for A.I., machine learning and deep learning algorithms without the explicit permission of the artist”
would be far better than the no A.I. tags. Leaves less wiggle room.
It would almost certainly be ignored by those companies, but if a lawsuit were to happen at some point in the future, I think this would carry far more weight.
Edit: Just seen the post above
IMO the best (only) option we as artists have right now is to instill fear in those AI data scavengers, fear of future lawsuit “nukes” built against them regarding direct and indirect copyright infringement in a gazillion cases, with the potential of destroying their lives, economically, once and for all, on this planet.
If true it would be reason to change that license rather than abandon the tag.
According to my understanding, using the images from this forum to train neural networks for commercial purposes isn’t allowed because of the license which only allows non-commercial usage (but also requires attribution).
Do you think explicitly stating that one use case is not allowed would change anything?
In my opinion, it would be a symbolic gesture at best.
I’m no lawyer, but I figure it might make a difference compared to less explicit forms of legal protection which might often go unnoticed by users, or are not understood properly in all their implications.
Yes, it’s a gesture. For now. Maybe forever, who knows. If nothing else, sowing a trace of doubt and unrest in some minds counts for something already …