Views : 22,630
Genre: Film & Animation
Date of upload: Jul 24, 2023
Rating : 4.803 (82/1,579 LTDR)
RYD date created : 2024-05-13T00:21:51.00305Z
Top Comments of this video!! :3
This whole thing is maddening. We should never have to be the ones to opt out of something; they should be asking permission and paying us if we agree to have our work used. I know this is an obvious statement. Being an artist is a difficult enough career without these jerks just stealing what we've all created so they can make money off of us.
291 |
They are asking whether it's too complex to talk about licenses with all the content holders, or whether it's even possible. To me, this is complete madness. It's like saying: "We are stealing from too many people, which means it's not possible to ask permission, sorry." If you need to train a tool on millions of stolen images because you cannot ask permission, then you cannot use the tool. End of story. In what world can you justify a crime by saying it's inconvenient to do it the legal way? This is another example that not everything that can be done should be done just for the sake of technical progress. This way, tools become more important than the humans they should be useful to, no matter whether it's morally reasonable or not. How is this even something we have to discuss?? It's THEFT! This entire opt-out thing is a joke: "We are going to steal your stuff and sell it ourselves, unless you tell us not to." If I want something that belongs to another person so I can use it for myself, I have to ask nicely. We should keep it that way.
145 |
At 13:00, the default is opt out, as in your work will be used for training unless you find it and manually opt out every individual image. This also doesn't factor in other people uploading your works without attributing credit. Artists are pushing for opt in, as in if they want to train on an artist's body of work they would have to reach out to that artist and work out a licensing agreement with them. AI companies don't want this because it would be too costly to create their models, or they would have extremely limited and biased models.

The rabbit they are trying to pull out of the hat is this: they want to fully train their models now, off of billions of images, while it's the wild west. That way, once the models are functional and have a broad enough range of styles and competence, they can say they've moved to a licensed or "ethical" approach, and it would be near impossible to prove what their AI models were trained on in the past, since the models themselves do not contain actual images; they are trained on patterns. It will also inevitably get to the point where you won't need an artist's name in order to take from them. A prompt asking for a cool painted dragon illustration will already have learned what it needed from Greg Rutkowski, but these companies will be able to claim that he is not found in the dataset: they have made sure no names can be used as prompts, and there is no evidence of his work inside the models. The only real solution would be new models started from scratch that operate on an opt-in basis and are regularly audited.
24 |
They don't wanna pay for the millions of artworks that they stole? Can all the artists get together and sue them? Tank their entire shitty companies? Lol, something tells me if this goes badly then suing is the only option we'll have left. The way Ben Brooks kept giving the same answer to everything was so weird. Stop dodging the questions. He's so creepy
76 |
When an AI is trained on AI-generated data it begins to forget and produce gibberish. It's called model collapse. To combat this they're trying to create watermarking for AI, so that in turn they can identify AI data and remove it from the datasets to prevent model collapse. But the main reason they say watermarks are necessary is to combat misinformation and protect against dangerous uses of AI. Per a research paper, watermarking seems impossible, which would mean no protection from misuses of AI and an eventual model collapse, at least until they find a foolproof way to identify AI outputs. As our community has been saying, these things can't exist without human-sourced material, and model collapse just proves this even further.
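The feedback loop this comment describes can be sketched with a toy experiment: "train" a trivial model (here, just fitting a Gaussian's mean and spread) on a dataset, sample a new dataset from the fitted model, and repeat for many generations. The spread of the data tends to drift toward zero, a cartoon of the diversity loss behind model collapse. This is an illustrative sketch only, assuming a made-up one-parameter "model"; it is not any real training pipeline, and all names and parameters are invented.

```python
import random
import statistics

def train_and_resample(data):
    """'Train' a toy model (fit a Gaussian) on data, then 'generate' a new dataset from it."""
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return [random.gauss(mu, sigma) for _ in data]

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(20)]  # generation 0: "human-made" data
initial_spread = statistics.pstdev(data)

# Each generation is trained only on the previous generation's output.
for generation in range(300):
    data = train_and_resample(data)

final_spread = statistics.pstdev(data)
print(f"spread of the data: {initial_spread:.3f} -> {final_spread:.3f}")  # diversity shrinks
```

The collapse happens because each generation's estimated spread is a noisy multiple of the last one, and that multiplicative random walk drifts downward: tails that the fitted model under-represents are lost for good, which is why mixing in fresh human-sourced data is the usual remedy.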
21 |
I really appreciate you translating the jargon because I have a hard time with that as well lol
Morally the only thing that makes sense is to wipe the whole dataset and start from scratch. It's not fair to financially benefit from the exploitation of artists and THEN say "going forward we will ask for permission" while keeping the same data.
The core of the problem isn't even about what they do with the data or how similar AI training is to human learning (it's not), it's the act of taking in the first place.
18 |
Hearing these proceedings is super disheartening for sure, especially the AI representatives, because all I am getting out of their statements is "This beast has exponentially grown to the point that we really have no control over it, outside of pulling the plug. Sorry, not sorry." I mean, I wish they would just say that, rather than throwing a lot of words out there to confound and confuse.
13 |
It's just ridiculous how many single private persons have had to pay partly humongous fines for unknowingly using copyrighted images (rightfully so, although it took a while for people to understand copyright), and now a big company basically grabs it all at once and nobody thinks they should get sued at all lol. Would be funny if it wasn't so ridiculously scary
7 |
I'm not gonna lie, I still don't like the fact that Adobe is collaborating with that OpenAI guy, whatever his name is.
Mind you, I think Adobe knew way ahead of time that this technology was coming out and they've been collaborating, but now they're trying to come out like they're doing it the ethical way. It's shady.
It just feels icky to me, and it feels like Adobe wanted to tap into the billions and trillions of dollars in that sector and didn't think of the implications for the artist. That's why I don't know why so many people like Adobe; it feels weird, and I think Adobe is a lot more involved in the situation than you think.
Also, I think the cat is out of the bag already, and there's gonna be no turning back at this point. Even if they put more safeguards into effect to help people, it's probably mostly only going to be in the US, because the rest of the world is like the wild wild West, and the laws are unpredictable and out of control, especially in Asia. It's just a hot mess, and at the end of the day artists are going to suffer, and they're already starting to suffer. I feel like a lot of these companies like Adobe collaborated and are in cahoots with these companies, knowing that this technology was coming, because at the end of the day it helps them. They don't have to pay artists if they can generate the stuff themselves.
I would also say that, even if you don't physically go to the copyright office and register your stuff, you still hold the copyright to your intellectual property for the lifetime of the artist plus 70 years. You still have ownership over your intellectual property. What is this guy talking about? Again, I don't know about other countries, but that's how it is in the US. I'm telling you, that Brooks guy is a weasel, and yeah, I need to watch out for Adobe. And in the meantime, in between time, what is going to happen? Are they going to put safeguards into effect? What's gonna be the end result?
Shout out to Ms Ortiz for representing all of us big and small artists, because God knows who else is going to do it for us
29 |
@allensvissuals9903
9 months ago
I find it hilarious how the AI guy mentions "open-source datasets." Allow me to explain what this means, because I believe the term was overlooked and it can get him in more trouble than he believes. An "open-source dataset" is a conglomerate of data that's arranged and made public by the company that owns the data. If we go into technicalities, this would have to mean that ArtStation, Twitter, Google, and several other companies have gone against their own terms and conditions, which only allow them to "reproduce the works" you upload (which generally means displaying your work on the front page), and have arranged a big organized dataset that everyone can then access. In other words: this guy, talking about "open datasets" in front of the very artists whose works were taken, is admitting not only to having stolen those works, but also to having stolen images that (within technical reason) are "co-owned" by the sites you've posted them on, and therefore to having stolen from companies bigger than the artists. It's a legal mouthful, yes, but it's also a legal mouthful that can be taken advantage of.
201 |