General AI Discussion

I wrote about artists being inspired and inspiring others. It was a bit off-topic here, because I did not mean to write about AI in general.

Traditionally, there is a human who hard-codes a behavior into a machine. Now, with machine learning, a machine develops its own behavior using training data.
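That contrast can be sketched in a few lines of Python. The brightness task, the function names, and the thresholds below are all invented just to illustrate the difference:

```python
# Toy contrast: hard-coded behavior vs. behavior derived from training data.

def hardcoded_is_bright(pixel_value):
    # A human chose both the rule and the threshold by hand.
    return pixel_value > 128

def train_is_bright(examples):
    # "Learn" the threshold from labeled examples instead: take the
    # midpoint between the darkest bright example and the brightest
    # dark example, then return a classifier using that threshold.
    bright = [v for v, label in examples if label]
    dark = [v for v, label in examples if not label]
    threshold = (min(bright) + max(dark)) / 2
    return lambda pixel_value: pixel_value > threshold

data = [(10, False), (40, False), (200, True), (250, True)]
learned_is_bright = train_is_bright(data)  # threshold = (200 + 40) / 2 = 120
print(learned_is_bright(150))  # True
```

The behavior of the second classifier depends entirely on the data it was given, which is the distinction the paragraph above is drawing.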

Machine learning is still hard-coded behavior. It’s nothing like how an artist learns from or is inspired by other artists.
When AI defenders say that it “learns like a human”, they are lying. Algorithmic analysis and a human’s learning process are not comparable at all. A human’s process is extremely nuanced; a machine can’t replicate it at all.

That was not my intention.

When it comes to machine learning, it is capable of finding relevant patterns in the data. It does not just use patterns; it was trained to find them in the first place.

My intention is not to give computers or algorithms any kind of rights.

Literally as I am writing this, I am training a generative neural network on my computer. I know a lot of people who are doing that too. None of them is a greedy company, and none of them is baiting investors (as far as I know).

Let’s say I generate images in a way very similar to how artists might do it. Let’s say I am looking for reference images for the composition I have in mind. I am also looking for reference images for the style, and for the most prominent objects in the scene.
Now I am taking those, but instead of creating the content myself, I am training a neural network on those images. I am making sure that the resulting image does not closely resemble any of the references (and certainly I want it to look like the image I had in mind).
Am I allowed to use copyrighted images for that? Do I have to credit the people who created the original images? If yes, would artists also need to credit all the references they are using?

That way you do not learn yourself; you gather data for your computer. It surely has valid uses.

But artists’ concerns need to be heard too. Just look at Etsy: a growing number of people are starting to sell AI-generated image prints. How can an artist establish an identity if his works can be used as training data to mimic them?

There are also cases where artists mimic other artists who are more popular. Think about people who sell “fake Banksys” for money.

Somehow people like to make sure that their work and their cultural identity are protected. It appears to me that concerned people reach for a ‘legal’ way to stop AI from growing too strong.

What do you mean by hard coded?

If you are training your model using others’ art, then you are exploiting them. It’s not the same thing as referencing, because you are subjecting it to algorithmic analysis. It’s pretty much fancy theft.
No matter how hard you try, the software will just shamelessly copy arbitrary patterns, because it’s not capable of nuanced thinking the way a human is. You can’t compare this to a true artistic process. It’s no better than tracing or direct plagiarism.

I don’t really understand your argument. I could go on a website where I can hire freelancers and ask them to create an image for me. Or I could use a computer to generate it for me. I don’t think there is a big difference for the artist who created the references and doesn’t know I am using them.

To clarify, I am not talking about forgery kind of behavior.

Similar to the first point, I don’t understand this argument. If I spend enough money, I can hire someone to do it.
Sure, AI can be used to mass produce it.

I may not have explained it thoroughly enough. In the example I gave, I am making sure it doesn’t copy in a way that would let you figure out which references were used.

I am guiding the whole process: I picked the reference images, and I make sure that you can’t recognize the original references.
The only difference is, I don’t have the capability to pull it off on my own and create the image I am looking for. So, I could hire someone to do it, or use a computer.
Why it should count as plagiarism when a computer does it is quite baffling to me.

Edit:
Maybe I’ll understand you better if you explain what you mean by this:

How do you think those algorithms learn or how they are trained?

If you plug these images into a piece of software, it’s already exploitation, because the software performs analysis on them.
At which point do you deem an image “different enough”?

The algorithm doesn’t have any understanding of anatomy, perspective, lighting, etc. It will just sample arbitrary stuff based on keywords and modify the dataset based on it. It has nothing to do with the actual artistic process.

There are people who make a living from art. If an artist does not notice his work being used as training data, that does not protect him from losing opportunities because his art gets an AI bypass.
I am trying to point out that art is not only something you can consume or buy. Imo, it is important to empathize with the creators themselves, and with their concerns.

this …

… was my own interpretation. I try to condense a bunch of impressions into an overall picture. It could be wrong.

At the exact same point as a human artist.

This is objectively wrong.

I wonder if you can explain how training works then, WITHOUT comparing it to a human: no “like an artist”, no “like a brain”, etc., just programmer language.

Why don’t you try it yourself?

There is a playground.

You hardcode some neurons and their inputs. If you run the simulation, you can see how well your network covers the data.
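Outside the playground, the same idea can be sketched in plain Python: one “neuron” (a linear model y = w·x + b) trained by gradient descent, where a shrinking loss is the “how well your network covers the data” part. The data points and learning rate below are arbitrary choices for the sketch:

```python
# Fit y = w*x + b to a few points by gradient descent on mean squared error.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # exactly y = 2x + 1
w, b, lr = 0.0, 0.0, 0.1

def loss():
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

loss_before = loss()
for _ in range(500):
    # Gradients of the mean squared error with respect to w and b.
    dw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    db = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * dw
    b -= lr * db
loss_after = loss()

print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

Nothing here is hand-coded about *what* line to draw; only the update rule is fixed, and the data determines where w and b end up.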

That artist’s work may have been used thousands of times as a reference, by other artists, by companies, … without getting any recognition, possibly without getting any opportunity out of it.
In the case I gave, as far as the original artist is concerned, there is very little difference between me using it as a training example and others using it as a reference.

I have as much empathy for the original artist when I use their work in a training dataset as other artists do when they use it as a reference.

not with tensorflow it’s not objectively wrong. okay, Rytelier may have used “inputs” rather than “keywords” to be a bit more ‘accurate.’ but that is what happens. you change the inputs, the neural net modifies its connections (weights) based on what is viewed as negative/positive. the positive/negative data can either be external (human input … “i don’t like that”, or result based … “matches original image by a factor of .666 or more”) or internal (the orange/blue in your web version of tensorflow)
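The “positive/negative signal nudges the connections” idea in the post above can be reduced to a single connection weight. The feedback values and step size below are invented purely for illustration, not taken from any real training setup:

```python
# One connection weight, adjusted by an external positive/negative signal.
weight = 0.5

def adjust(weight, feedback, step=0.1):
    # feedback > 0 means "more of this" (e.g. a human saying "i like that");
    # feedback < 0 means "less" (e.g. "matches the original too closely").
    return weight + step * feedback

weight = adjust(weight, +1)   # positive signal nudges the weight up
weight = adjust(weight, -1)   # negative signal nudges it back down
print(weight)
```

Real networks apply this kind of update to millions of weights at once via a loss gradient, but the direction-of-nudge logic is the same.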

And here’s the point where we loop back around to “it’s like referencing”.
Anyway, I have stated my opinion on generative “AI”; I’m not interested in discussing it further, because I know it’s not fruitful. If you want answers to further points made here, just read my answers (again). (Don’t tag me.)

I just hope AI is gonna be heavily regulated, because it’s a dangerous new tech and like all new techs it needs to have serious limits. Damage is done already, but don’t allow even further damage.

A.I can be trained with synthetic data.


most certainly… the computer doesn’t care what’s in those inputs… it just needs to “know” (be coded to) parse that into data it can then apply to the dataset/node connections in an additive or subtractive way. but when trying to train a neural net to output something very subjective, like art, it helps to have a point of reference of what “art” is to begin with. you could input random color splotches, and then give it a positive result when a human determines the output has art-like qualities, and vice versa. by training it on actual artwork, however, that makes the training much faster, and requires less human intervention for the training portion.


It’s July 2022. The Another AI vs artists thread is arguing about how AI generation works, and if it should give credit to artists.

It’s December 2022. The Another AI vs artists thread is arguing about how AI generation works, and if it should give credit to artists.

It’s March 2023. The Another AI vs artists thread is arguing about how AI generation works, and if it should give credit to artists.

It’s July 2023.


is that ai generated? >grins<