Valve Is Banning Games with AI-Generated Art from Steam, Developers Say

Tsing

The FPS Review
Valve has begun blocking video games and other projects created with AI-generated art from appearing on Steam, according to recent complaints that have been shared online. "While we strive to ship most titles submitted to us, we cannot ship games for which the developer does not have all of the necessary rights," reads one message that Valve apparently sent to an r/aigamedev user, while another poster, from r/DefendingAIArt, says that they are now "depressed" because they can't use AI anymore due to "legal stuff." Anything with AI-created text will also be banned, it seems.

See full article...
 
Guess games can't use ChatGPT to generate NPC responses then, either.

This is a really backwards move from Valve; I hope they come to their senses.

Asking for proof of ownership of the training data is stupid. It's like expecting a human artist to own the rights to every piece of art he has ever seen and drawn on for inspiration.
 
I'm with Valve on this one; there's no way of knowing what the AI has stolen from other copyrighted works.
 
Guess games can't use ChatGPT to generate NPC responses then, either.

This is a really backwards move from Valve; I hope they come to their senses.

Asking for proof of ownership of the training data is stupid. It's like expecting a human artist to own the rights to every piece of art he has ever seen and drawn on for inspiration.
Which is funny because I think asset flips are a lot more problematic than AI algorithms at this point.
 
I'm with Valve on this one; there's no way of knowing what the AI has stolen from other copyrighted works.
So if a human artist likes a certain art style and makes something very similar, is that stealing or inspiration? Because I see absolutely no difference between a human artist being trained on existing art and a piece of software being trained on it. You don't need to own the copyright to every novel you have ever read to publish your own novel.
 
So if a human artist likes a certain art style and makes something very similar, is that stealing or inspiration? Because I see absolutely no difference between a human artist being trained on existing art and a piece of software being trained on it. You don't need to own the copyright to every novel you have ever read to publish your own novel.
But a human knows the difference between copyright infringement/stealing and being inspired by something; I'm not so sure an AI does.
 
But a human knows the difference between copyright infringement/stealing and being inspired by something; I'm not so sure an AI does.
Too true, but in turn, if Valve is doing this type of enforcement, shouldn't it also be using AI to verify whether the content it proposes to ban is copyrighted, provide the example, and allow the developer some kind of process to prove their case?
 
But a human knows the difference between copyright infringement/stealing and being inspired by something; I'm not so sure an AI does.
The only way an AI can produce something that could be considered stealing is if it was specifically made to do so by the human operator. If you just use it to create generic images, it will never just randomly pop out an exact replica of someone's copyrighted work.

Too true, but in turn, if Valve is doing this type of enforcement, shouldn't it also be using AI to verify whether the content it proposes to ban is copyrighted, provide the example, and allow the developer some kind of process to prove their case?
I think this whole issue is based on a misunderstanding of how AI models work. They do not contain any of the copyrighted works used in training, and as such they can't reproduce them exactly either.
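To put that in concrete terms, here is a toy sketch (assuming PyTorch; the tiny denoising model and image sizes are made up purely for illustration, not any real image generator): a training step nudges the weight tensors and then throws the batch away, so what ships in the finished model is parameters, not the training pictures.

```python
# Toy illustration (hypothetical model, assuming PyTorch): training updates
# weights; the images themselves are never stored inside the model.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(64 * 64 * 3, 128),
    nn.ReLU(),
    nn.Linear(128, 64 * 64 * 3),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def training_step(images: torch.Tensor) -> float:
    """One step on a batch of training images, e.g. shape (N, 3, 64, 64)."""
    noisy = images + 0.1 * torch.randn_like(images)  # corrupt the batch
    restored = model(noisy).view_as(images)          # model tries to undo the corruption
    loss = loss_fn(restored, images)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()    # only model.parameters() change here
    return loss.item()  # the batch is then discarded; only the weights remain
```

Real generators are vastly bigger, but the same rough argument applies: the weights are orders of magnitude smaller than the training set, so the images can't all be sitting inside the model verbatim.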

If someone uses AI to steal somebody else's work, it's not the AI's fault; it is the fault of the person doing it. If their goal is to steal, they can do it just as well with or without AI, as has happened numerous times.
 
If someone uses AI to steal somebody else's work, it's not the AI's fault; it is the fault of the person doing it. If their goal is to steal, they can do it just as well with or without AI, as has happened numerous times.
This is a good point. The software can't be liable for anything; it's not a responsible person. The person who turns it on and tells it what to do is responsible, including for unintentional outcomes.

Same issue for self-driving cars really. You can't blame the car if it runs over someone or gets into a wreck - you blame the people that programmed it.
 
I guess they need to ban The Callisto Protocol, then. There are environmental textures and models that were made by AI.
It depends. Did they (or whoever trained the model) own the rights to the content that was used to train it? If so, then it's fine.
 
It doesn't sound to me as if Steam is taking sides in this fight.

They are simply saying that right now the legal basis for how to treat the output of AI is unsettled, so until it is, they are erring on the side of caution.

Essentially, they don't want to get sued by some PO'ed artist.
 
The only way an AI can produce something that could be considered stealing is if it was specifically made to do so by the human operator. If you just use it to create generic images, it will never just randomly pop out an exact replica of someone's copyrighted work.

I would argue the opposite: that an AI is completely incapable of ever creating anything original. All of its output is a strict derivative of what it was trained on, and as such the rights to the AI's output are owned by the rights holders of the material it was trained on.

Of course the AI cannot be held responsible. It is not a living thing. AI will never be sentient. Sentience is and forever will be uniquely a feature of the biological brain. Alan Turing was completely wrong about this. Just because something can mimic sentience doesn't mean that it is sentient. The Turing Test is complete bullshit. It is the human who operates the AI and decides what to do with its output who is liable.

If the operator of the AI owns the material it was trained on, then this is no big deal. If they don't, however, then it is stealing, no different from if they themselves had saved a texture from another game, modified it a little, and then used it. Just because you insert an AI algorithm in the middle doesn't change anything. In this example, all the AI is is a really fancy and advanced Photoshop filter. If you can't steal a texture, apply a filter, and then use it as if it were your own, then you can't do that same thing with AI.

Honestly, I would love to see some accountability here. Force those working on AI models to destroy their AIs if they used any material at all that they didn't own in the training process, including any collected user data or the likeness of any user.
 
I would argue the opposite: that an AI is completely incapable of ever creating anything original. All of its output is a strict derivative of what it was trained on, and as such the rights to the AI's output are owned by the rights holders of the material it was trained on.
Humans are trained on existing art too. In order to produce art, you need to see art. Contrary to popular belief, humans are not capable of creating anything original either. The first art was created by copying reality. You may think some art is so original it cannot possibly be a copy of something, but if you put a human in a dark room for his entire life, he will never come up with any visual art. People go through life experiencing different things; that is why art can seem original to an observer.
Of course the AI cannot be held responsible. It is not a living thing. AI will never be sentient. Sentience is and forever will be uniquely a feature of the biological brain. Alan Turing was completely wrong about this. Just because something can mimic sentience doesn't mean that it is sentient. The Turing Test is complete bullshit. It is the human who operates the AI and decides what to do with its output who is liable.
Being biological has nothing to do with being sentient. The Turing test can be fooled, that's true, but it doesn't mean anything. A brain is just an infinitely complex network of neurons. If computer neural networks achieve the same complexity and are provided the same inputs as a living being, they will inevitably become sentient. The question is not if, but when this will happen.

That said, current AI has nothing to do with sentience.
If the operator of the AI owns the material it was trained on, then this is no big deal. If they don't, however, then it is stealing, no different from if they themselves had saved a texture from another game, modified it a little, and then used it.
Does a writer own the copyright on all the books he has ever read? Then how dare he produce his own books based on all that accumulated experience?
Just because you insert an AI algorithm in the middle doesn't change anything. In this example, all the AI is is a really fancy and advanced Photoshop filter.
As I've mentioned, this is based on a misconception. While AI can be used as a filter and run on an existing image to alter it, that is just one very specific application of it, and one that can infringe on copyright. The training data, however, is not contained in the model, and therefore the model itself does not infringe on copyright.
If you can't steal a texture, apply a filter, and then use it as if it were your own, then you can't do that same thing with AI.
No, this is not how AI works. You don't steal a texture; you train the AI on thousands of examples of how a texture should look, just as a human artist learns how textures should look by looking at other games' textures.
Honestly, I would love to see some accountability here. Force those working on AI models to destroy their AIs if they used any material at all that they didn't own in the training process, including any collected user data or the likeness of any user.
All I can say is wow, this is a luddite approach. Or worse, a "destroy what you don't understand" approach. While deepfakes can raise a personal rights issue, it is not the technology's fault if someone uses it for nefarious purposes, although I think creating deepfakes of public figures should be fair use as long as you don't try to represent them as reality. But this is straying extremely far from creating assets for games using AI, which I think is a perfectly reasonable use of AI image generators, and shame on Valve for cracking down on it.
 
I don't think a requirement that "you may only train AI models on things that you own the rights to" is unreasonable or luddite at all.

Your analogies to human beings learning things are irrelevant, because these are not human beings. They are tools being used by human beings to process other people's work, behavior, and likenesses into things that they can use for themselves. In a way, it is a kind of money laundering.

Take all of these things I can't use, stick them into an opaque process, and like magic they come out usable on the other end.

Calling for controls here is neither a lack of understanding nor opposing progress. It's putting a stop to the blind AI evangelism of the industry that does not take into account the practical effects of what they are doing. It reminds me a lot of those who blindly evangelized NFTs like they were the next coming of Christ, when those of us with any sense at all saw straight through it.

AI is not intelligence. It is just a bunch of compounded statistics used to process the data it has been trained on.
 
I do get what you are saying about AI not being that special, AI is but a tool, but I agree with the other person that there are no copyright issues; there can't be copyright issues if the training data is gone, and I think it is in fact better to make the training data as extensive as possible, so as to avoid closeness to any copyrighted work. The fact that it is derivative, well, it's an 'oh well' kind of thing.
I disagree completely with the reductive "humans are just complex biological machines and nothing more" argument, but that's a separate issue.
 
I don't think a requirement that "you may only train AI models on things that you own the rights to" is unreasonable or luddite at all.
What is luddite is wanting to destroy other people's tools because they are too efficient and make a job too easy. That's not just a little bit luddite; that is the exact description of the luddite movement.
Your analogies to human beings learning things are irrelevant, because these are not human beings. They are tools being used by human beings to process other people's work, behavior, and likenesses into things that they can use for themselves.
No, you still don't get it. AI can be used to process other people's work, but that's one option, and in that case I agree that you need to own the rights to the originals. But what you need to understand is that training an AI and using it to process someone's work are two completely separate processes. Owning the copyright to all the training data is practically impossible, and requiring it would make any AI infeasible.
In a way, it is a kind of money laundering.
That's the most nonsensical thing I have ever heard uttered about AI.
Take all of these things I can't use, stick them into an opaque process, and like magic they come out usable on the other end.
To you it may seem like magic; to the rest of us it is a computer program.

"Any sufficiently advanced technology is indistinguishable from magic"​

Calling for controls here is neither a lack of understanding nor opposing progress.
Calling for control seems to be just about wanting to maintain the status quo, where regular artists have a monopoly on visual art, when AI can do it much cheaper and faster. The irony is that most content creators who REEEEE against AI have already started using AI as an aid, which is exactly how it is supposed to be used: not as a replacement for human artists but as a tool that makes their work easier.
It's putting a stop to the blind AI evangelism of the industry that does not take into account the practical effects of what they are doing.
What are the practical effects? Making assets at much lower cost and much quicker. I don't see how that's bad.

AI is not magic; most of the "art" created with it is garbage, because fools with zero artistic sense won't suddenly get one just by using AI image generators. It's not the tool's fault that the barrier to entry is very low. Creating good work, however, still requires skill, even with AI; just a different set of skills compared to what regular artists have.
It reminds me a lot of those who blindly evangelized NFTs like they were the next coming of Christ, when those of us with any sense at all saw straight through it.
You are comparing apples to oranges; it's completely irrelevant whataboutism. I was against NFTs and crypto as a whole from day one, because they offer zero benefits; they don't make anything of value.

AI content generation is a wonderful tool that opens gates that were completely closed off before. It has the potential to revolutionize games, if only we'd allow it. For one, it can mitigate the runaway costs of developing a modern game. If that were the only benefit, it would already be worth using a thousand times over.
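To give a sense of how low the barrier to entry is, here is a minimal sketch of generating a placeholder asset with an off-the-shelf text-to-image pipeline (assuming the Hugging Face diffusers library and a CUDA GPU; the model ID, prompt, and file name are just illustrative placeholders):

```python
# Minimal sketch; assumed setup: pip install torch diffusers transformers accelerate
import torch
from diffusers import StableDiffusionPipeline

# Hypothetical checkpoint choice; any text-to-image model the developer has
# the rights to use would work the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# Generate one candidate texture from a plain-language prompt.
image = pipe("seamless rusty sci-fi metal floor texture, top-down, tileable").images[0]
image.save("floor_texture_candidate.png")
```

Whether the result is actually usable (tileable, consistent with the game's art direction, legally safe to ship) is exactly where the skill and judgment still come in.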
AI is not intelligence.
Where did I ever suggest that AI is intelligence? The first thing I declare when someone tries to argue about AI tools is that it has nothing to do with intelligence.
It is just a bunch of compounded statistics used to process the data it has been trained on.
Still this basic misconception. How many times do I need to drill it in that the data used for training is NOT part of the finished model? Therefore it is wrong to describe the process this way.
 
I do get what you are saying about AI not being that special, AI is but a tool, but I agree with the other person that there are no copyright issues; there can't be copyright issues if the training data is gone, and I think it is in fact better to make the training data as extensive as possible, so as to avoid closeness to any copyrighted work.
If the training data is gone, how can the AI know that what it is creating is not a 100% copy or a minor deviation of an existing piece of art?
 
If the training data is gone, how can the AI know that what it is creating is not a 100% copy or a minor deviation of an existing piece of art?
By being complex enough?
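For what it's worth, a developer worried about exactly that can spot-check generated assets against known works instead of just trusting the model. A minimal sketch (assuming the Pillow and imagehash libraries; the file names are placeholders): a small perceptual-hash distance between a generated image and a reference image is a red flag for a near-copy.

```python
# Minimal sketch; assumed setup: pip install Pillow imagehash
from PIL import Image
import imagehash

def is_near_copy(generated_path: str, reference_path: str, max_distance: int = 8) -> bool:
    """Flag a generated image that is perceptually close to a known reference."""
    gen_hash = imagehash.phash(Image.open(generated_path))  # perceptual hash of the output
    ref_hash = imagehash.phash(Image.open(reference_path))  # perceptual hash of the known art
    return (gen_hash - ref_hash) <= max_distance            # small Hamming distance = suspicious

# Example: compare a generated texture against a reference you know is copyrighted.
# print(is_near_copy("generated_texture.png", "known_art.png"))
```

It won't catch everything, but it turns "how would you know?" into something a developer can at least partially test before shipping.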
 
Has Valve issued an official statement to the effect that it's banning games with AI-generated content? Or are Reddit users the only source of the claim?
 