AI PBs
-
There’s no ethical consumption under (late stage) capitalism, we all vaguely know this. Nobody in this thread is anyone else’s moral superior, and anyone trying to be should be roundly mocked.
The use of AI feels worse for a lot of us because we’re creatives, or move in creative circles, and that’s what generative AI is directly impacting right now. And that’s a perfectly valid feeling, especially if it mitigates one’s own consumption.
ETA: That isn’t to discount anyone’s feelings; it’s simply an explanation for those who don’t understand the seeming hypocrisy and/or double standards.
-
I think it’s a better use of time to push for reining in the corporations, rather than dissuading the average joe twelvepack (the AI gives him more abs).
-
@Jumpscare said in AI PBs:
the AI gives him more abs
If an AI could improve my fitness and physique without me having to put in any labour, my ethics and morality would go out the window as fast as I could type “make me fit and sexy please” into FitGPT.
-
The use of AI feels worse for a lot of us because we’re creatives, or move in creative circles, and that’s what generative AI is directly impacting right now.
That is true, but also not the whole story. GenAI is causing widespread disruption in everything from the fundamental business model of the internet to critical thinking skills. It may be impacting entry-level jobs, hurting an entire generation because companies are too short-sighted to realize that today’s entry-level people are tomorrow’s senior people. It has profound implications for propaganda, which is increasingly dangerous considering the threat of authoritarianism. These impacts are not limited to the creative fields.
And that’s not even touching on the alignment issues that make generalized intelligence (which we do not yet have but these grifter companies are trying desperately to build) so dangerous. My favorite thought experiment is the rogue stamp collector AI because it’s pretty hilarious yet illustrates the problem very well.
I am not saying that all machine learning is bad, but I personally see GenAI specifically as a threat on par with climate change in its ability to really screw up society. Amazon is bad, but GenAI is way worse IMHO.
-
@Faraday One could certainly make all those arguments about the internet itself. It’s just happening all at once, like a rolling boil, rather than the heat being turned up slowly.
Which isn’t an excuse or a “so don’t worry about it.” We have to direct our energies outwards towards forcing our representatives into heavy regulation, etc. For everything else I’m just fatigued in my concern.
-
@MisterBoring The people stressed are the ones trying to maintain their ethics. But I respect and encourage that. What’s the alternative, spreading cheeks for our AI masters?
-
@Muscle-Car said in AI PBs:
The people stressed are the ones trying to maintain their ethics.
That’s what I was getting at: attempting to consume ethically largely fails to have any impact on the continuing economic system, which inevitably causes the people attempting ethical consumption more stress as they watch the system continue to grow and exploit.
-
@MisterBoring said in AI PBs:
- Use stock photos or other art published online for free under a Creative-Commons (or similar) license.
I actually did this once, way back in the Haunted Memories days, for a character. Though it was less about ethical reasoning and more that I just happened to find some stock photos so perfect for what I envisioned that I had to use them, so I paid a few dollars to remove the stock image watermark. It did feel kinda good, though.
I did the AI thing when Midjourney first came around, before I knew better (I didn’t really look into anything around it; I just went ‘oh cool, AI images’ like so many people). I wouldn’t touch that sort of thing now.
I’m kinda in the same boat as some others have echoed: I dislike PBs in general. Most of the time it’s the last step in the process, and I rarely find something that truly matches what I was thinking of.
-
Secondly, if the model is made for a for-profit system like Midjourney, then they already have the requisite rights and permissions. That’s part of what you’re paying for when you buy a license for Midjourney.
You cannot be serious.
Come on.
So Disney is suing. Okay, so what? Disney sues a lot of people. It’s a massive, litigious corporation. That doesn’t mean they’re correct. And even if they are, that doesn’t discount my statement. You are paying for everything being above board. If the corporation is pulling funny business, then that’s on them (and Disney’s lawyers will undoubtedly make them pay). Tarring all companies with the same brush is asinine.
But even if every single AI art corporation on the planet was shady, you still have the option of rolling your own. Train it only on art that you know is copyright free or allowed to be used. Now you’re sure.
It’s mostly Python anyway.
It’s not the source code that’s the problem, it’s the data that you’re training it on.
Agreed, which is why rolling your own is an option; hence my comment about the source code.
Nobody’s ever going to universally agree on ethics and morality; they’re always in the eye of the beholder. Personally I feel a lot less bad about using a screencap of an actor from a Hollywood movie (where both the film and the celebrity have put themselves “out there” into the public eye) than I do about generating some fake person from the work of unwilling artists and/or real everyday people whose face was scraped off the internet somewhere.
Again, that’s only if you actually do that. There’s no reason you have to. Use public domain art or artists that explicitly allow their work to be used and build your own model.
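If you did roll your own, the “only art you know is allowed” part boils down to filtering your candidate dataset by explicit license before any training happens. A minimal sketch in Python; the metadata layout and license strings here are assumptions for illustration, not any real dataset’s schema:

```python
# Hypothetical sketch: keep only explicitly permissive images from a
# candidate training set before training a model on it.
ALLOWED_LICENSES = {"CC0", "CC-BY", "public-domain"}

def filter_training_set(records):
    """Return only records whose license is explicitly permissive.

    Records with a missing or unknown license are excluded by default,
    since 'unknown' is not the same as 'allowed'.
    """
    return [r for r in records if r.get("license") in ALLOWED_LICENSES]

candidates = [
    {"path": "art/001.png", "license": "CC0"},
    {"path": "art/002.png", "license": "all-rights-reserved"},
    {"path": "art/003.png", "license": "CC-BY"},
    {"path": "art/004.png"},  # no license metadata: excluded
]

usable = filter_training_set(candidates)
print([r["path"] for r in usable])  # → ['art/001.png', 'art/003.png']
```

The design choice that matters is the default-deny: anything without a known, permissive license stays out of the training set, which is the only way to be “sure” in the sense described above.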
I think it’s a lot more shady to use someone’s real face than a small selection of pixels that will be blended to the point that it’s completely unrecognizable from the source material.
I mean, do you believe that fan artists are doing something immoral when they make something in the style of another artist? The process is very similar. If anything, AI generated art is far better from a moral standpoint because there is no consciousness there; the program isn’t choosing to make the art. An artist absolutely is. Willfully violating a moral stance is worse than a machine just doing what it was designed to do, yes?
Let’s not pretend that fan-casting is a thing unique to MUSHes.
I’m not sure why that’s relevant.
-
Secondly, if the model is made for a for-profit system like Midjourney, then they already have the requisite rights and permissions. That’s part of what you’re paying for when you buy a license for Midjourney.
You cannot be serious.
Come on.
So Disney is suing. Okay, so what? Disney sues a lot of people. It’s a massive, litigious corporation. That doesn’t mean they’re correct. And even if they are, that doesn’t discount my statement. You are paying for everything being above board. If the corporation is pulling funny business, then that’s on them (and Disney’s lawyers will undoubtedly make them pay). Tarring all companies with the same brush is asinine.
Disney is just the latest company to sue a generative AI company, they’re not the only one.
Good lord. I’m well aware that I personally would not be held legally responsible for the copyright infringement going on. I know that Midjourney or whoever else would be the ones held liable. That’s not relevant to my argument, though. I’m not talking about being worried about being held personally liable for infringing copyright; I’m talking about ethically and morally not wanting to be party to supporting companies that steal artists’ work and make a profit off of it.
In my opinion, Disney is correct. Anyone whose work is being used without permission or compensation to train generative AI would be correct in taking legal action.
I do not personally have the requisite rights and permissions to use the data of Midjourney’s training model, because I don’t believe Midjourney itself has the requisite rights and permissions to be using the data it’s using.
But even if every single AI art corporation on the planet was shady, you still have the option of rolling your own. Train it only on art that you know is copyright free or allowed to be used. Now you’re sure.
Sure, I guess I could do that? I’d be interested in seeing the results, but I’m still not interested in the sorts of visual results that come out of generative AI in general. But more to this argument: that’s not what people by and large are doing for PBs. So that’s not what we’re talking about here.
I think it’s a lot more shady to use someone’s real face than a small selection of pixels that will be blended to the point that it’s completely unrecognizable from the source material.
I think it’s more shady to use a system that is actively attempting to profit off of the work of others than to use some promotional pictures taken of an actor for a movie or TV show.
The point isn’t that the AI results are recognizable. The point is that they’re using a system that artists actively hate because it’s stealing their data to use. Actors actively speak out against AI. I have yet to hear actors speak out about pictures from their movies being used to represent an RP character in a tiny game.
I mean, do you believe that fan artists are doing something immoral when they make something in the style of another artist? The process is very similar. If anything, AI generated art is far better from a moral standpoint because there is no consciousness there; the program isn’t choosing to make the art. An artist absolutely is. Willfully violating a moral stance is worse than a machine just doing what it was designed to do, yes?
Jesus fucking Christ. No, I don’t have it out for the lines of source code. I have a problem with the larger companies and the way they practice business. By engaging with the product, I support the way they’re practicing business.
No, I don’t have any issue with fanart. Actual human artists learn from the work that came before them. An AI being fed every single visual of human artistic history and learning how to photocopy the right bits and pieces is not the same.
-
@MisterBoring said in AI PBs:
I often wonder how much of that has to do with news coverage. Is any reliable news source regularly covering our hobby?
LOL what? …no…
Anyway, if this were going to go anywhere in terms of litigation, it would’ve come up in the 2000s, when fan fiction and websites that did ‘dream casting’ for movies were just becoming things.
-
Image generation isn’t the only thing generative tech like this is being used for. It’s being used for animation, for voices, etc. Actors are fighting for their faces and voices to not be used as an AI simulacrum of themselves to put them out of work on a broad commercial scale. The creation of deepfakes and simulated representations of real people is another reason why I personally find generative learning algorithms deeply ethically concerning. That’s not necessarily a problem with Midjourney itself, and I can see why people might draw the line between a deepfake and a generated artistic rendering of an elf, or whatever. But there’s a reason I categorically dislike it.
-
It doesn’t come up because the impact is minimal. There’s no likelihood of confusion that the use is licensed or authorized, and no one is making money on it. People who make money from their likeness know - assuming they know or care that online RP exists - that it costs them nothing for their face to be used in this fashion, because it is not an area of business that would earn them any money. Essentially, is it morally correct to “steal” someone’s face for this purpose? Probably not, but also: who cares? What harm exists here?
Basically if you want to use AI PBs go ahead, but this idea that it is somehow morally superior because no real people are involved is disingenuous. The only choices that are free of contention are drawing your own shit, paying to commission an artist to draw your own shit, or not using an image at all and just writing descriptions in this writing hobby. Otherwise we are all making choices that have points of compromise.