Had to visit a hospital today. (Nothing scary, I promise!)
Anyway, the security ID photo they took of me was uh. Not classically soothing.
Original source: s_kinnaly
(Thanks @felinalain for finding the source!)
Outspan Orange Mini (1972)
It was never about Hamas. If Israel somehow manages to kill every member of Hamas, what then? Do you think Palestinians are just going to forgive and forget everything Israel has done?
Babies who are the only surviving members of their families? Fathers carrying the remains of their children in plastic bags? Palestinians who witnessed people blown to bits right in front of them? Who had Israeli forces shoot at them as they tried to escape the north? Palestinians in the West Bank who have been captured and tortured on camera? Palestinians in '48 who have been arrested just for sympathizing with their kin in Gaza? Palestinian schoolgirls assaulted by the IOF? Mothers for whom the blood on their hands is the only remaining piece of their children? The constant dehumanization that follows our every move - how, while Palestinians suffered, politicians called us “monsters”, “human animals”, “children of darkness”, “savages”, and “cockroaches”?
It’s been 75 years since my family was forced from their villages by Zionist militias. They have never forgotten what those militias did to their neighbors, or how they are still denied their right of return. None of us will.
Now, IOF forces in Gaza raise the Israeli flag over the beaches, take selfies with fleeing Palestinians in the background, cheer and celebrate a “return to their settlements in Gaza”, sing about leveling the land, and fantasize about building shopping malls on Palestinian mass graves. It was never about Hamas.
The darling Glaze “anti-AI” watermarking system is a grift that stole code in violation of the GPL (which its creator has admitted). It uses the exact same technology as Stable Diffusion, and it’s not going to protect you from LoRAs (smaller models that imitate a certain style, character, or concept).
An invisible watermark is never going to work. “De-glazing” training images is as easy as running them through a denoising upscaler. If someone really wanted to make a LoRA of your art, Glaze and Nightshade are not going to stop them.
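To illustrate the de-glazing point, here is a rough sketch. This is not Glaze’s actual perturbation or a real denoising upscaler - random noise stands in for the invisible cloak, and a crude mean filter stands in for the denoiser - but it shows why a tiny imperceptible perturbation tends not to survive a denoising pass:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "artwork": a smooth 64x64 gradient with values in [0, 1].
image = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)

# Stand-in for a Glaze-style cloak: tiny high-frequency noise that is
# nearly invisible to a viewer but intended to mislead a model.
cloaked = image + 0.02 * rng.standard_normal(image.shape)

def box_blur(a: np.ndarray) -> np.ndarray:
    """A crude 3x3 mean filter, standing in for a denoising upscaler."""
    p = np.pad(a, 1, mode="edge")
    h, w = a.shape
    return sum(
        p[i : i + h, j : j + w] for i in range(3) for j in range(3)
    ) / 9.0

denoised = box_blur(cloaked)

# Mean absolute deviation from the clean original, before and after.
before = np.abs(cloaked - image).mean()
after = np.abs(denoised - image).mean()
print(f"residual before denoising: {before:.4f}")
print(f"residual after denoising:  {after:.4f}")
```

The high-frequency “cloak” is mostly averaged away while the smooth underlying image barely changes - which is exactly what a learned denoiser exploits, only better.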
If you really want to protect your art from being used as positive training data, use a proper, obnoxious watermark, with your username/website, with “do not use” plastered everywhere. Then, at the very least, it’ll be used as a negative training image instead (telling the model “don’t imitate this”).
There is never a guarantee your art hasn’t been scraped and used to train a model. Training sets aren’t commonly public. Once you share your art online, you don’t know every person who has seen it, saved it, or drawn inspiration from it. Similarly, you can’t name every influence and inspiration that has affected your art.
I suggest that anti-AI art people get used to the fact that sharing art means letting go of the fear of being copied. Nothing is truly original. Artists have always copied each other, and now programmers copy artists.
Capitalists, meanwhile, are excited that they can pay less for “less labor”. Automation and technology are an excuse to undermine and cheapen human labor - if you work in the entertainment industry, it’s adopt AI and speed up your workflow, or lose your job because you’re less productive. This is not a new phenomenon.
You should be mad at management. You should unionize and demand that your labor be compensated fairly.