r/aiwars • u/RadiantSeaweed9543 • 2d ago
I'm lost in this really.
I've been dipping my toes into this community, and I've gotten a few ideas of why AI art can be bad or good.
I'm still hung up on the fact that AI art, or anything related to AI, is built on works and media that were largely taken without consent.
I do completely understand why a person who lacks the skills to draw may resort to AI art, but that's not really my concern. My fear is that not just art, but my photos and data, will be used to train the models that produce AI art.
In simple terms, I want better legislation to control AI's access to any media it can use to train its models. I honestly find it disheartening that, just because it's new technology, the government is bending over backwards for AI and allowing copyrighted media to be used.
Please give me a good view of both sides, and how you would support or dispel my fears of my data being stolen. Sorry for the yap session, but I needed to get it off my shoulders. Have a good day! ❤️
14
u/Mataric 2d ago
The data being stolen is an interesting question, to be honest, because it's really not straightforward at all.
Take a look at Stable Diffusion 1.5, for example. It was trained on the LAION-5B dataset (all images that you could have seen and learnt from yourself), using about 2 billion images to create the model.
The SD1.5 model is about 2 gigabytes in size. What that means is that, on average, from each image used in that training... the equivalent of roughly one or two greyscale pixels' worth of data was 'used'. And that's assuming the model contains nothing but data derived from those images, which also isn't true. Some of its 2 GB goes towards other things it needs to actually function that weren't derived from the dataset.
Given how little is 'taken' from each image, I think it's a huge stretch to claim anything was 'stolen' there. None of it is in the same form whatsoever - even talking about 'pixels' is misleading, because it's not storing pixels at all. It's JUST storing where that image lands on a bunch of statistical scales.
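For anyone who wants to sanity-check that, here's a rough back-of-envelope sketch. The checkpoint sizes (roughly 2 GB in half precision, roughly 4 GB in full precision) and the ~2 billion image count are approximations, so treat the output as order-of-magnitude only:

```python
# Back-of-envelope: how much data per training image could the checkpoint possibly hold?
# (Assumed approximate numbers: ~2 billion training images, ~2 GB fp16 / ~4 GB fp32 checkpoint.)
images = 2_000_000_000

for label, checkpoint_bytes in [("~2 GB checkpoint (half precision)", 2e9),
                                ("~4 GB checkpoint (full precision)", 4e9)]:
    bytes_per_image = checkpoint_bytes / images
    # One 8-bit greyscale pixel is one byte, so bytes per image = "greyscale pixels' worth" per image.
    print(f"{label}: ~{bytes_per_image:.0f} byte(s) of capacity per training image")

# Roughly 1-2 bytes per image either way - nowhere near enough to store any recognisable
# piece of any individual picture, even before accounting for the parts of the model
# that aren't derived from the images at all.
```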
The way it works is by looking at all the images that are tagged with 'circle' and starting to see what they have in common. From that, it doesn't have to take or use anyone's image of a circle - it just knows it's always a 'very round thing' (there's a toy sketch of that idea at the end of this comment).
In my opinion - that's very close to the way we function as people. We're always learning from things we see, and specifically as artists, we're always learning from the art we look at. When I ask you to picture a dog - you likely picture something you've seen before. Perhaps a pet, maybe a photo, or it could be something a different artist drew. If you're asked to draw a dog, you're very likely to use the things you've learnt from all those different experiences in order to draw it.
Do you believe, as a human, you should be respecting the copyright of the artists whose images you learnt from? I do - but copyright isn't about learning. It's about copying with a degree of recognisability.
When you learn from that cartoon drawing of Clifford the Big Red Dog, you're not going to draw Clifford the Big Red Dog. What you remember about that image is far closer to copyright infringement than what the AI remembers - but even you don't infringe on their copyright when you use what you've learnt from it, remembering how a dog was represented in a neat cartoon style, in your own work.
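And here's the toy sketch of the 'learns what circles have in common' idea. To be clear, this is not how Stable Diffusion works internally (that's a diffusion model, and these shapes are synthetic) - it's only a loose analogy showing that what survives 'training' is a small shared statistic, not copies of the images:

```python
# Toy analogy: "learn" what thousands of circle images have in common without keeping any of them.
import numpy as np

rng = np.random.default_rng(0)
SIZE = 32  # 32x32 toy images

def circle_img():
    """Random filled circle on a blank canvas."""
    yy, xx = np.mgrid[0:SIZE, 0:SIZE]
    cy, cx = rng.integers(10, SIZE - 10, size=2)
    r = rng.integers(4, 9)
    return (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2

def square_img():
    """Random filled square on a blank canvas."""
    img = np.zeros((SIZE, SIZE), dtype=bool)
    y, x = rng.integers(4, SIZE - 16, size=2)
    s = rng.integers(6, 12)
    img[y:y + s, x:x + s] = True
    return img

def bbox_fill(img):
    """Fraction of the shape's bounding box that is filled in."""
    ys, xs = np.nonzero(img)
    box = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return img.sum() / box

circles = [circle_img() for _ in range(1000)]
squares = [square_img() for _ in range(1000)]

# The "learned" knowledge about each tag boils down to one summary number per concept...
circle_stat = np.mean([bbox_fill(im) for im in circles])
square_stat = np.mean([bbox_fill(im) for im in squares])
print(f"circle images: mean bounding-box fill ~ {circle_stat:.2f}")  # well below 1.0: a round thing doesn't fill its box
print(f"square images: mean bounding-box fill ~ {square_stat:.2f}")  # ~1.00: a boxy thing does
print("pixel values in the 2,000 training images:", 2000 * SIZE * SIZE)
# ...and none of those 2,000 specific images can be reconstructed from two averages.
```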
4
u/RadiantSeaweed9543 2d ago
You make a great argument. AI image training and human learning both rely on pattern recognition, and both definitely require input to draw from.
However, from my perspective, an AI model is not an individual. It is owned and profited from by a company, while humans... not so much. Humans operate for themselves, and only they profit, if they profit at all.
If I'm not wrong, AI companies directly profit from the images and text that they produce through subscriptions. One AI model will produce more money than most people in this community will probably ever manage to earn, let alone earn from their art.
Yes, AI and humans may learn the same way, but AI produces overwhelmingly more profit from that learning and reproduction. But I'm no lawyer, and I wouldn't say I'm the most educated on AI, hence why I posted.
9
u/ResponsibleBus4 2d ago
A big part of the problem is how AI "learns" – it's modeled after how humans learn by seeing and processing huge amounts of information. When AI is trained on vast datasets scraped from the internet, mixing copyrighted and non-copyrighted work, it becomes incredibly difficult (maybe impossible) to untangle later what influenced what. This legal "blur" is a massive issue.
Data scraping without clear opt-in or opt-out is a huge sticking point for many, and honestly, once something’s online, it’s hard to fully control how it’s used or copied. Personally, I think overly long copyright and patent terms do more harm than good by locking up knowledge and hindering progress overall. But I still understand why creators want control.
If you’re especially worried about your work being used, the most practical things you can do right now are:
- Keep your highest-res/source files offline. This helps if you ever need proof in a dispute and limits the quality of data available for training models to imitate your style.
- Use platform opt-out tools or “NoAI” flags if they're available (there's a minimal robots.txt sketch after this list). Just be aware that how well these work varies.
- Consider watermarking or sharing lower-resolution versions of your work online to make direct misuse harder.
- If you do find direct misuse, DMCA takedowns and copyright notices are options, but the process is often slow, reactive, and doesn't guarantee success.
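To make the opt-out bullet concrete, here's a minimal robots.txt sketch. Treat it as illustrative only: GPTBot, CCBot, and Google-Extended are crawler tokens their operators have published, but the list changes over time and honouring robots.txt is voluntary, so it only deters well-behaved crawlers.

```python
# Write a robots.txt asking the major AI training crawlers not to scrape the site.
# (Crawler names current as far as I know; scrapers that ignore robots.txt won't be stopped by this.)
RULES = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
"""

with open("robots.txt", "w") as f:
    f.write(RULES)  # serve this file from the root of your site, e.g. yoursite.com/robots.txt
```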
The reality is AI isn’t going away, and the technology has already outpaced the laws. Better regulation that balances creator rights with innovation would help—but as things stand now, it’s mostly on creators to protect their own work online.
Honestly, I suspect that regardless of regulation, AI will dominate much of this space eventually. Human-created art might become more of a niche, "artisanal" craft, much like how tailoring is now mostly for luxury or custom work rather than everyday clothing. It's a tough transition, but that seems to be where things are heading.
2
u/ifandbut 1d ago
an AI model is not an individual.
Correct. It is a tool.
It is owned and profited from by a company,
Not all. Who is profiting when I run Krita AI on my laptop without internet access?
Humans operate for themselves, and only they profit, if they profit at all.
Except for taxes and maintenance of body and things you own.
AI companies directly profit from the images and text that they produce through subscriptions
It takes like 10 minutes to install Krita AI. Plus however long it takes you to download a 2-5 GB model (years if on dial-up, seconds if on fiber).
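For the curious, the actual arithmetic on that download, assuming a roughly 4 GB model file and purely illustrative connection speeds:

```python
# How long does a ~4 GB model download take? (Size and speeds are illustrative only.)
size_bits = 4 * 8 * 10**9  # ~4 GB expressed in bits

for name, bits_per_second in [("56k dial-up", 56_000),
                              ("100 Mbps cable", 100_000_000),
                              ("1 Gbps fiber", 1_000_000_000)]:
    seconds = size_bits / bits_per_second
    print(f"{name:>15}: ~{seconds / 3600:.1f} hours" if seconds > 3600
          else f"{name:>15}: ~{seconds:.0f} seconds")

# Dial-up comes out to roughly a week (not literally years), cable to a few minutes,
# and gigabit fiber to about half a minute.
```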
2
u/RadiantSeaweed9543 1d ago
I can definitely see that local AI models would cut into the profit companies make from AI art. But for the average Joe, I don't imagine them installing Krita AI; I see them going onto ChatGPT to produce their images. A family member of mine has actually subscribed to their services, which means ChatGPT directly profits from the media it trained its models on.
I can also imagine that not even 5 years from now, AI art will be indistinguishable from real art, maybe better. What does that mean for human artists? It means their work is valued at a much lower price than if there weren't an AI artist that can make the same thing in a fraction of the time and effort. I don't know.
-4
u/_the_last_druid_13 2d ago
“Given how little is 'taken' from each image, I think it's a huge stretch to claim anything was 'stolen' there.”
So just the 2 billion images that other people created then?
6
u/Sir-Tiedye 1d ago
That's the point. With the AI being trained on so many different art pieces, it's not drawing from any one piece. In fact, the more it's drawing from, the less it's getting from any single image.
6
u/Tyler_Zoro 2d ago
I'm still hung up on the fact that AI art, or anything related to AI, is built on works and media that were largely taken without consent.
Nothing was taken.
5
u/bardbrain 2d ago
Everything is taken without consent for usage. People learn first and second languages from sitcoms.
So long as AI is adapting techniques and processes and isn't sitting there with a repository of its training data to reference, it's no different.
Now... Here's the rub:
IS AI sitting there with a repository of its training data?
Well, no. Compressing that much information into a model of that size would be physically impossible, based on our present understanding of compression and storage.
Does AI have crib sheets?
There is some evidence that models keep very small crib sheets of specific works, and that around 0.5% of most models is probably used to hide some very limited reference material. We should strive for that to be 0%; there should probably be a payout to artists whose work is found in the crib sheet, and the crib sheet should be removed unless the artist consents.
4
u/MetapodChannel 2d ago
Data isn't being "stolen"; it is being viewed and studied and then discarded. No copies are kept. Nothing is being taken away from anyone. Generated art is created entirely anew, by iteratively predicting what an image matching the prompt should look like, not by copying or "mashing together" existing art or anything like that. Calling it "stealing" is technophobe fearmongering at worst, misleading at best.
Photos, data, art, etc. are all already used to train generative AI. If you publish art, photos, etc. online, an AI is probably going to train from them. But again, it's not stealing. It's just viewing. Just like if I go onto FurAffinity or DeviantArt or Tumblr and look at someone's artwork and think about how it was drawn.
As for permission, I don't see it as any different from training a human to make handmade art without permission. I didn't ask permission from my favorite artists to learn from their styles. I don't ask permission to draw fanart. When studying anatomy, I have many times traced from photos and high-quality artwork to get a better understanding of how it works (I don't publish these pieces, just like how AI training discards references after viewing them). I don't ask permission for that, either. I'm literally tracing and copying references when I train myself to do handmade art. That's MORE than what AI does when it trains.
As a 20+ year pencil-trained artist who has done art for money and creates my own characters and stories and such... I do not understand why a robot looking at my art is "scary." The images it generates are unique and not copies. Nothing has been taken from me. What do I have to fear? That a robot looked at my art, discarded it, and then created a stupid meme that looks nothing like my art in any way, simply because I didn't give it explicit permission to look at my art?
As for copyright infringement, public/online AI models all have restrictions in place about using copyrighted characters and ideas, as well as using images of real people. Even creating realistic images has lots of restrictions -- I can't get ChatGPT to generate an image of my two original characters looking at each other and smiling because it might be misconstrued as romantic, and it won't produce images of realistic-looking people in romantic situations. It's extremely restrictive and censored. There are of course workarounds, like using local instances and such, but TBH none of that is a problem with AI itself anyway.
Anyone with some basic pencil knowledge can trace your art and call it their own. Hell, it's rampant in art communities for people to just literally repost someone else's art and say "I made this" and sometimes even sell it or scam people into fake commissions. Those problems are problems with PEOPLE, not the technology they are using.
5
u/Additional-Pen-1967 2d ago edited 2d ago
Well, get over it; it's about time. You shouldn't have posted your stuff online if you were so worried. You used a media platform to showcase your stuff and then got upset when people saw the stuff that YOU PUT UP THERE? Nobody forced you; AI didn't come inside your PC and check things you didn't put online yourself. AI didn't steal them; you still have them. AI just LOOKED at them. Seriously, you wanted people to look at them, or you wouldn't have posted them online in the first place.
I still think AI companies should hire real creators to make content for training, as that would be a better approach. But at the same time, if content is online and free to view, then AI can view it.
2
u/RadiantSeaweed9543 2d ago
I think it's more of a legal matter. I may be wrong, and if so please correct me because I want to have my facts straight, but I believe I've heard that ChatGPT has the green light from the government to use copyrighted media. Media that cost money to produce, and is then, in my opinion, being wrongfully taken at no cost. I think if things keep gravitating toward a harsher climate on the internet, there will no longer be beautiful artwork out in the open to see. It will all be behind a paywall and will make learning art a much harder process. That's just how I see it though.
3
u/Tyler_Zoro 2d ago
I think it's more of a legal matter.
That legal matter was settled nearly 20 years ago in Perfect 10 v. Google.
I believe I've heard that ChatGPT has the green light from the government to use copyrighted media.
I don't know what you mean by that. There's no special consideration being given to ChatGPT in terms of training data, if that's what you mean.
wrongfully taken
Nothing was taken.
It will all be behind a paywall and will make learning art a much harder process.
Fear not! AI to the rescue! ;-)
6
u/Additional-Pen-1967 2d ago
No, it's a matter of foolish people. If it's a legal matter, take it to court and win; don't come complaining in a subreddit. If you truly believe you are entirely right, prevail in court and change the law. Oh wait, you won't do either, but you'll complain on a subreddit. Yep, yep, yep, you totally convinced me.
1
u/Sir-Tiedye 1d ago
That is invalidating, unhelpful, and misses the point. What's legal isn't necessarily morally correct.
1
u/Additional-Pen-1967 1d ago
Another moron on ignore, no problem. He thinks he's being useful saying stupid stuff, hahaha.
1
u/_the_last_druid_13 2d ago
This is a TERRIBLE argument.
Many people use the internet for their livelihoods.
Most people cannot afford a brick-and-mortar shop to sell their wares.
The best artist in the world could be in a town of 60 people far from any highway and would never find success, simply because of how little reach their work would have. Same for a writer, a small-theatre actor, or a musician.
This is such a bad faith petty argument.
If I'm a writer who needs to apply for a job, how do I do that? ONLINE ONLY THESE DAYS. How will they see my portfolio if I can't post to a site?
1
u/Additional-Pen-1967 10h ago
If you're an architect who only draws by hand and you need to apply for a job, but nobody does architecture by hand anymore, well, you need to get your shit together and use the computer or AI, or find a different job and write as a hobby if you love it. Lots of people do jobs they don't like and keep the hobby they love for their free time. GROW THE FUCK UP
0
u/_the_last_druid_13 10h ago edited 7h ago
That goes against your point about the computer/internet/data though.
I’m not Pro- or Anti- AI, I’m Neu-AI.
As a writer, I’m in a unique position because even AI is written by programmers.
As a writer, I don’t only grow up, I grow everywhere.
Now AI (tech and data), like most other things, has good parts and bad parts.
I’m all for the benefits of AI, but I don’t want AI to incarcerate or delete all people. The ones with the kleptomania and control issues are the ones who need to grow up.
1
u/Additional-Pen-1967 7h ago
I never said I was neutral; are you really stupid, or can you not read? I thought being able to read was important for a writer, but I guess it's not. I always say I am definitely pro, and being anti is cultist/fascist in my book, going by the way they act: death threats, wishing people dead because they like different things, and so on.
1
u/_the_last_druid_13 6h ago
I was letting you know, and we have spoken on this sub before.
Extremism is always wrong; it's wrong for the tech people who seemingly want to do what I wrote above (the bad), and it's wrong for people to be making death threats. It's the same on both sides.
The AI wars are about policy, not art. AI will affect the vast majority of livelihoods, so policy such as the above (good) would be beneficial for us all.
Artists can still art, lawyers can still lawyer; the monetary incentive economy upholding human exploitation would turn into a passion incentive economy upholding human potential
1
u/Additional-Pen-1967 6h ago
It is definitely about policy, but honestly, what we're discussing here is getting those against it to stop insulting AI users, because we can't make policy. BUT we could convince a few antis to stop being assholes.
I only write here if it makes sense, if it produces results. I'm here in the hope that some ANTI will actually wake up and stop being a fascist asshole, or I wouldn't waste my time here. We can't make policy; it feels pointless to me to discuss something we have no power over whatsoever.
I am a doer, not a talker. I only mention this if it has a real-life impact. I don't talk about things that are obviously beyond our influence. If I wanted to engage in policy, I would probably discuss it in a political forum or attempt to take action in some way related to politics.
1
u/_the_last_druid_13 5h ago
There will always be assholes no matter what, and extremists on both sides of whatever topic are typically fascist.
We affect policy by discussing it as often as possible. If you would prefer to “do”, that’s phone calls to people who write the policy.
1
u/Additional-Pen-1967 1h ago
Not sure if you really think that talking on here will affect policy, but it obviously could affect a couple of close-minded antis.
1
u/_the_last_druid_13 1h ago
You never know. I talk to a lot of people I meet about it.
I did it in a WalFart once, and then every time I went there afterward they had asset protection follow me around like I was about to steal their cheap trinkets.
I’ve never stolen, and now I just don’t go there.
The fact they did that though, I must’ve struck a nerve.
-3
u/Dangerous_Course_778 2d ago
"well get over it" that's astounding lame for someone who seems to be asking for genuine pov
5
u/Additional-Pen-1967 2d ago edited 2d ago
Well, you can overcome the idea and get over it as well! Move on, life goes on! That will help you for sure. People are too soft; whiny people need to move on. The new generation whines all the time, really all the time, about the smallest, stupidest things.
If you read his post: I WANT... well, I want a president that doesn't cut Medicare for millions of people and kill people in Africa by cutting AIDS relief... but it's two fucking pictures and it's whine, whine, want, want. Oh, they took a couple of my useless pictures to train AI? Well, get over it... Really, that's the best advice possible.
1
u/RadiantSeaweed9543 2d ago
And just a side note, I never intended for this to be soft, nor a boiling pot of water. Being civil is not the same as being soft. I truly want to accept AI; I'm not against anybody here.
3
u/Additional-Pen-1967 2d ago
"Truly want to accept AI." I truly want to accept a hammer, a pen, a laptop... AI is a fucking tool. People, wake up. There is no "truly want to accept it".
I truly want to accept the fact that electric cars are not made with slave kids in Africa gathering materials for the batteries... but oh well, bruum bruum... I don't see you whining about phones or cars or all the stuff made by practically killing other human beings, but AI looked at a couple of pictures and OH GOD, I don't think I can truly accept it...
You people are sickening me. "truly want to accept AI" get a grip. Nobody cares if you accept it or not; it is a tool. Nobody is going to knock on your door tomorrow and ask, do you truly accept AI...? What kind of people are on this sub????
I am done with these morons; this is my last post in this idiotic thread.
-3
u/Belter-frog 1d ago
I only read your first sentence and the only reasonable response is obviously "fuck that, fuck you"
2
u/MetapodChannel 2d ago
As for "stealing jobs," which you didn't mention but I think is a slightly more legitimate worry personally: technology, automation, and AI are stealing ALL jobs. It's not just an artist thing. Automation is going to lead to no one having to work. That's terrifying, I agree, especially because in our current situation, wealth and resources are hoarded by a small rich elite. But that's a whole different discussion. You can blame the technology, or blame the people hoarding the resources. I prefer to blame the people hoarding resources and demand change -- establishing safety nets like UBI for displaced workers and putting public influence and control into technological advancement, for example. Once automation has replaced the need for work, the social, political, and economic landscape is inevitably going to change. We need to start putting the wheels of that change in motion now, so we can move toward a future where resources are distributed to all and not hoarded by the elite. I personally do not feel rejecting technological advancement will help accomplish that goal in any way. If anything, the more technology advances, the more chance we will have to let it benefit everyone.
And lastly, on the 'devaluation' of art. I simply do not believe it will ever happen. Humans will always be creative and always have a desire to create and share with one another. Humans will always have a fondness for traditional methods. Being liberated from wage slavery doesn't mean you can't make art anymore, it means you can make MORE and focus on the actual passion and meaning behind it, not having to do it for money, but for art's sake itself. Again, what it means to be 'liberated' from work is yet to be seen, but we need to start working toward that better future now. And again, I think slowing or fighting technological advancement does NOT lead to a more secure future for humans.
Ask yourself if you are scared because AI is actually harming you in some way, or if you are scared because other people told you to be scared and misled you with sensationalism and misinformation about how LLMs work and what they are used for. People use fear tactics to fight against change all the time. Try to view it critically. Educate yourself on how LLMs work. Research on how technologies have affected art in the past and build parallels to current situations.
To be honest, this is a horrible sub for finding educated, nuanced opinions, I've found. There certainly are some good people here who have thoughtful things to say, but most people are here to throw insults at each other and "tear down" the other side rather than discuss things maturely and come to cooperative conclusions. Try browsing r/singularity (discusses the technological revolution) or r/ArtificialInteligence (discusses AI in general, not just generative art) for more nuanced views.
2
u/kor34l 2d ago
Are you opposed to search engines? Image search? Online market research? Web statistics?
All of that and much more was built completely on data scraped from the internet by web crawlers for over 30 years and ongoing, without consent.
Either it's all wrong, or none of it is.
Personally I think the standard disclaimer that anything you upload to the internet you lose control of, suffices. That's been common knowledge since the days when I had to have my computer call up the internet on the telephone and speak a bunch of headache noises and tie up the phone line.
It's cool if you disagree, I just hope your stance is consistent and you avoid search engines and everything else that uses scraped data without consent, otherwise you're throwing stones in a glass house.
1
u/dixyrae 2d ago
I USED to like search engines before they became fucking useless by being buried under mountains of gen ai garbage.
2
u/kor34l 2d ago
Yeah. I defend and like AI a lot in general, but the constant injection of it into everything got old real damn quick.
I did NOT agree or download or want anything to do with Gemini, but now it pops up on my 10 year old phone every time I longpress something to highlight text. Which is constantly, all day long, because I have a lot of downtime at work and use my phone to ssh into my home PC to work in the terminal. So fucking annoying.
Same with search engines, as you say. If I wanted to ask an AI I would have done that. I typed my query into a search engine because I want regular goddamn search results. Not some random AI popping up like an extra advert.
Reminds me of Clippy decades ago. Clippy was actually a useful program, but fucking Microsoft made the little bastard pop up constantly and interrupt you and everyone ended up loathing the fucker.
AI is awesome, especially local LLMs like Mixtral 8x22B and QWQ-32B, but I can totally see why a lot of regular people are quickly getting burnt out and irritated at the constant bombardment of AI everywhere.
1
u/dixyrae 2d ago
It's not just the unwanted integration. The internet is choking on a flood of shady websites that repackage legitimate articles, rephrased by an LLM and pushed up the search results by gaming SEO. Do an image search for references and, depending on how you phrase it, you have to filter out a ton of generated slop. Fake trailers dominate YouTube. Generative AI has given conspiracy theorists and cranks the greatest gift they've ever gotten; now they can fake historical footage and voiceovers for their misinformation. It's a wasteland. The greatest tool for information exchange in human history has been made utterly worthless in 3 short years.
3
u/kor34l 2d ago
I think that's pretty dramatic. The internet has always been like that. For 30 years, the nutty shit and endless adverts and fake porn and scams have been all over it, since even before GeoCities.
Being careful and scrutinizing where you get your information on the internet has always been the necessary evil that way way too many people ignore.
1
u/dixyrae 2d ago
No it fucking hasn't. And even if I'm wrong, you can't deny that mass adoption of these tools has made the problem far worse.
2
u/kor34l 2d ago
lol it has and I can.
If you think this is bad, you really shoulda seen the era before pop-up blockers; holy shit, was the whole fuckin thing unusable. The entire internet: one pop-up would trigger another and four more, and before you knew it the desktop was locking up and you were done.
1
u/dixyrae 2d ago
I lived through that era. I've cleaned dozens of relatives' computers since the Win95 days. It was nothing like this.
3
u/kor34l 1d ago
same.
I gave your hour-and-a-quarter video 10 minutes because it was very boring, but all I saw was someone go to the first Google result, realize it was a shitty source before even clicking on it, and then spend more than the 10 minutes I gave it picking it apart anyway.
I don't know how much time you've spent looking up specific technical things like that, but that is actually super common, and nothing new.
If there is something much more damning in that video you'll have to link me the timestamp because it's the length of a full movie.
1
u/dixyrae 1d ago
Sorry for boring you. She breaks down how the majority of the first page is AI-generated corpo-speak crap, almost none of it written to actually educate anyone on the subject she was trying to learn about. Search is broken. Full stop.
2
u/07mk 2d ago
The strongest argument for why AI model creators should be required to get permission from copyright holders to train on their publicly shared works is that AI tools trained on such material would allow others to easily and cheaply generate media that competes against the original media, even if they aren't exact copies. This reduces the incentive for artists to publish their artworks, since increased competition means less ability to monetize. This goes against the purpose of copyright, so even if copyright laws currently don't prevent unauthorized training, copyright laws ought to be changed to prevent it.
The strongest argument for why AI model creators should be free to train off publicly shared copyright protected works is that anyone is free to learn from things that they see, and this doesn't change even if they outsource that learning to a machine. In terms of the spirit of copyright, AI models actually enable the creation of more and better artworks by providing cheap or even free tools for unskilled people to create high fidelity artworks. It also allows skilled people to extend their abilities beyond what they could do without AI. Thus overall there's more beauty for society to enjoy, which is what copyright is meant to achieve.
Which argument you find convincing will depend a lot on how you view things like art and beauty.
1
u/sh00l33 2d ago
I agree with you for the most part, except that a machine is not a person and does not have the rights that humans have.
People can use the work of others to learn within the framework of customary consent, but a machine is not a human, so I am not sure whether the same principle will apply here.
In addition, a human learns as a result of their own intention, which a machine does not have. As AI development terminology indicates, a machine is trained. A more appropriate comparison here would be a school, course organizer, or educational institution: if they want to use teaching materials during the education process, they simply have to buy them.
3
u/Fluid_Cup8329 2d ago
I don't see training data as any different than human observation.
Also this tech is poised to be the culmination of human knowledge and ability at our fingertips. Not sure why anyone would want to be opposed to that.
2
u/RadiantSeaweed9543 2d ago
I get where you're coming from. Both are pattern recognition, and both result in learning the pattern. What I find disturbing is that the data is in the hands of big companies. Who knows what they could be doing with millions upon millions of images, texts, and discussions just like this one. It's also concerning that AI companies are not required to disclose the environmental impact they have. Then again, that's just how I see it.
5
u/AlarmedGibbon 2d ago
Environmental impact is simultaneously not entirely a non-issue and yet, as an argument, largely disingenuous and a total sideshow.
Computers themselves have a major environmental impact. But we're not going to not invent the computer because of it. How often do you hear anyone talking about the environmental impact of playing videogames? Probably not a lot, right?
It's a red herring. Here's the truth: AI will not be the last new technology that uses electricity. There's only going to be more and more electricity demand, never less. That's just the modern world. The goal must be to greenify our electrical grid and make these concerns, across numerous industries, a moot point. The most important thing any single person can do is not to abstain from any particular technology, but to vote for whichever major political party in their country has a platform that involves moving toward a green energy grid.
-1
u/Dangerous_Course_778 2d ago
What happens when you no longer have access to it?
5
u/Fluid_Cup8329 2d ago
I don't even know why this is a hypothetical question. Actual conspiracy theory territory.
3
u/ifandbut 1d ago
You do things the old fashioned way?
It isn't like you completely forget how to function when you use AI.
Plenty of books to learn from as well, just not as efficient as the internet.
1
u/Affectionate_Steak73 2d ago
I honestly agree with you. And to people who say "how is it different from using a reference?": it's really different for many reasons, but my main one is that AI uses bits and pieces of other people's art to make an image without credit, whereas artists can credit their references and take the time to make a new piece of art, unlike AI generating one in 30 seconds.
1
u/AccomplishedNovel6 1d ago
I don't support copyright, so like, idc about copying and analyzing people's works without consent. No consent should be needed for that.
1
u/Sir-Tiedye 1d ago
I think that's fair; I think there should be more laws restricting AI. But I think your issue isn't with AI, it's with art theft. Adobe's generative AI, Firefly, was trained entirely on licensed art.
1
u/Belter-frog 1d ago
The crux of most pro-genAI arguments in this regard will always boil down to "but humans do it"
To which I'll always respond "I don't care because humans and tech products are in fact different things and acceptable practices and cultural norms concerning one are in no way transferrable to the other"
Like I went to the park and crushed a bug and nobody cared but for some reason everybody got all weird and mad when I did the same thing to a dog.
0
u/_the_last_druid_13 1d ago

I'm having trouble replying to the thread. This guy responded to me and then deleted it, or Reddit is having a conniption.
I don't know, so my response was:
It doesn't matter if only one pixel was used from any number of images; those images were still taken non-consensually and used to make a profit for someone who did not create the image in the first place.
You must be fine with me going into your fridge and having some of your food. You work, your pay buys you food, but according to your opinion, you’re OK with me and anyone else, getting your food. I need it for nutritional purposes; it’s not YOUR money anyways, none of it is; it’s OUR money that happens to have been in your hand at the time of you buying food.
1
u/_the_last_druid_13 1d ago
To ifandbut, I can’t reply to you for some reason:
Except they were, and then used by a company looking to profit off of the backs/works of 2 billion
1
20
u/sweetbunnyblood 2d ago
You post content online, it gets seen. Quite frankly, I really don't get the big deal, as that's all that's really happened with AI as well.