Video id : SbCBaopgbqw

233,962 Views • Jan 22, 2024
In this video I discuss techniques artists are using to protect their artwork from being "stolen" by people training large language models to generate artwork.

My merch is available at
based.win/

Subscribe to me on Odysee.com
odysee.com/@AlphaNerd:8

₿💰💵💲Help Support the Channel by Donating Crypto💲💵💰₿

Monero
45F2bNHVcRzXVBsvZ5giyvKGAgm6LFhMsjUUVPTEtdgJJ5SNyxzSNUmFSBR5qCCWLpjiUjYMkmZoX9b3cChNjvxR7kvh436

Bitcoin
3MMKHXPQrGHEsmdHaAGD59FWhKFGeUsAxV

Ethereum
0xeA4DA3F9BAb091Eb86921CA6E41712438f4E5079

Litecoin
MBfrxLJMuw26hbVi2MjCVDFkkExz
Metadata And Engagement

Genre: Science & Technology
Date of upload: Jan 22, 2024
Rating: 4.905 (314/12,935 LTDR)
RYD date created: 2024-05-14T02:00:52.388199Z
YouTube Comments - 4,239 Comments

Top Comments

@rustymustard7798

3 months ago

I like how the solution to constantly shooting ourselves in the foot is more bulletproof boots.

4.8K likes

@v4n1ty92

3 months ago

Artists wouldn't need to put "malware" in their work if these companies weren't using their work without permission.

1.2K likes

@Bioniclethanok

3 months ago

I feel like treating Nightshade as an illegal piece of malware is like saying home security cameras should be illegal because you're taking away the house robbers' main source of income.

1.7K likes

@terig8974

3 months ago

If you eat someone else's lunch from the breakroom fridge, don't be surprised if it gives you explosive diarrhea.

1K likes

@theaudiocrat

3 months ago

What took them so long? As an "artist" whose drawings were only ever good enough to make the neural networks worse, I've been doing this for years.

3K likes

@arc5031

3 months ago

Well, if a company wants to use my artwork to train their network, I reckon they can always pay me for an un-poisoned copy.

621 likes

@TheZeroNeonix

3 months ago

AI as we were promised: "We're taking the hard labor jobs you don't want, freeing you up to do more art!" AI as we're getting it: "We're taking over the creation of art so people can have more time for manual labor."

1.6K likes

@stupidweasels1575

3 months ago

Nightshade seems like a dye packet in a wad of bills, and its loudest dissidents just sound like they're crying "you ruined what I stole".

594 likes

@CyberMutoh

3 months ago

I find it funny how companies get mad at people for pirating their software, yet those same companies get to steal artists' work, without even asking permission, with no consequences?

1.4K likes

@kaijuultimax9407

3 months ago

Everyone parroting "They'll just use AI to work around Nightshade" is missing the point. The point of glazing images is to make it JUST annoying enough that data scrapers don't bother to circumvent the cloaking. It's the same logic as getting a big, scary padlock for your property. It's not supposed to stop the expert thief who is after you specifically, it's just supposed to keep your common everyday thief out. Glazing your images is likely to stop data scrapers from going after you because circumventing a glaze takes more time and effort than it would be to just find an unglazed image elsewhere, much like how most thieves see an industrial padlock and just leave to look for unsecured goods somewhere else instead.

1.1K likes

@eeeguba432

3 months ago

In a year it'd be funny if the largest models got poisoned; then you'd need to hire a translator: "Hi, I'd like to make a dog driving a car." "OK, computer, generate cat plonking the cow."

197 likes

@jayvee3165

3 months ago

My biggest issue with neural networks is that it should be opt IN, not opt OUT. If they want to include an artist's works, then they should contact the artist and get their permission, not just use the works until the artist finds out and asks them to remove it.

520 likes

@holdenwinters68

3 months ago

The poisoning reminds me of the AdNauseam extension: not only does it block ads, but it also clicks them, so the advertiser has to pay, with the added bonus of ruining your advertising profile.

569 likes

@KazmirRunik

3 months ago

This reminds me of something people were doing years ago, adding subtle noise maps to images to make earlier AI misidentify those images. For instance, two seemingly identical images of a penguin might be identified as something totally wrong like a pizza or a country flag, based on the noise map that was added to the original image. That might even be exactly what evolved into image glazing.

1K likes
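The comment above describes classic adversarial examples: a small, structured noise map added to an image that leaves it looking unchanged to humans but flips a model's prediction. A minimal sketch of the idea, using the fast gradient sign method on a toy logistic-regression "classifier" (the weights, inputs, and epsilon here are made-up illustrative values, not anything from Glaze or Nightshade):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy "model": logistic regression over a flattened 3-pixel image.
w = np.array([1.0, -2.0, 3.0])
b = 0.0

def predict(x):
    """Probability that x belongs to class 1 (say, 'penguin')."""
    return sigmoid(w @ x + b)

x = np.array([0.5, 0.1, 0.2])  # clean image; the model says class 1

# Gradient of the cross-entropy loss (true label = 1) w.r.t. the input:
# dL/dx = (p - 1) * w  for logistic regression.
p = predict(x)
grad = (p - 1.0) * w

# FGSM step: move every pixel by epsilon in the direction that
# increases the loss. The perturbation is small and uniform in
# magnitude, but aligned with the model's weaknesses.
eps = 0.5
x_adv = x + eps * np.sign(grad)

print(predict(x) > 0.5)      # clean image classified as class 1
print(predict(x_adv) > 0.5)  # perturbed image flips to class 0
```

Glaze and Nightshade are far more sophisticated (they target feature extractors of generative models and aim to survive training rather than fool a single classifier), but this gradient-aligned perturbation is the family of techniques the commenter is recalling.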

@texanrattler9061

3 months ago

Considering how most of the images that an ai is trained on are stolen without the permission of the person who created or uploaded it, it’s really the company’s own fault if something they stole broke their program. This is a really cool tool to protect your work.

237 likes

@Furufoo

3 months ago

The issue with Glaze, unfortunately, is that it is really visible on more cartoony art styles, or ones with lots of flat colors. But we've seen AI start to inbreed as more AI-generated images make it to places typically scraped by AI. Artists have taken inspiration and iterated off each other since the dawn of human history; meanwhile AI can't make it past 2 years without it becoming glaringly apparent that it cannot create, only make shittier copies of human work.

299 likes

@6Saturn9

3 months ago

Eh... as someone once said, the cycle continues... Ad > Adblock > Adblock Blocker > Anti Adblock Blocker > ... DRM > Cracks > New DRM > New Cracks > ... Cloaked Image > Image Uncloaker > Anti DeCloak > ...

1.1K likes

@HoldTheHeathenHammerHigh

3 months ago

Auto-generated content everywhere. The Dead Internet Theory is REAL.

754 likes

@incineroar9933

3 months ago

Apparently using both makes it even more difficult for AI to rip off your work. If you use both, you're supposed to use Glaze first, and Nightshade afterwards.

324 likes

@markush8803

2 months ago

Calling Nightshade malware is like calling some software malware because your poor attempt to crack it caused your computer to shit itself.

83 likes
