How it works:
Nightshade seeks to “poison” generative AI image models by altering artworks posted to the web, “shading” them at the pixel level so that a machine learning (ML) algorithm sees entirely different content (a purse instead of a cow, let’s say). Once even a few such “shaded” images scraped from the web end up in a model’s training data, the model can begin to generate erroneous imagery in response to a user’s prompts. A rough sketch of the underlying idea follows.
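
To make the concept concrete, here is a minimal, hypothetical sketch of a feature-space “shading” attack, not Nightshade’s actual algorithm. It assumes PyTorch and torchvision are installed, uses a pretrained ResNet-18 as a stand-in for whatever image encoder a text-to-image training pipeline might use, and the function name `shade`, the pixel budget `eps`, and the step counts are all illustrative choices.

```python
# Illustrative sketch only -- NOT Nightshade's published method.
# Idea: nudge an artwork's pixels within a small, hard-to-see budget so that
# an image encoder "sees" a different concept (a purse instead of a cow).
import torch
import torchvision.models as models

def shade(cow_art: torch.Tensor, purse_photo: torch.Tensor,
          eps: float = 8 / 255, steps: int = 200, lr: float = 1e-2) -> torch.Tensor:
    """Return cow_art plus a perturbation bounded by eps per pixel, optimized so
    the encoder's features of the result move toward purse_photo's features."""
    # Stand-in encoder; a real pipeline would use the target model's own encoder.
    encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
    encoder.fc = torch.nn.Identity()              # keep penultimate features
    for p in encoder.parameters():
        p.requires_grad_(False)

    with torch.no_grad():
        target_feat = encoder(purse_photo.unsqueeze(0))

    delta = torch.zeros_like(cow_art, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        poisoned = (cow_art + delta).clamp(0, 1)
        feat = encoder(poisoned.unsqueeze(0))
        # Pull the poisoned image's features toward the "purse" features.
        loss = torch.nn.functional.mse_loss(feat, target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)               # keep the change visually subtle

    return (cow_art + delta).clamp(0, 1).detach()

# Usage (hypothetical): load the artwork and a target-concept photo as
# [3, 224, 224] float tensors in [0, 1], then post shade(cow, purse) online.
```

To a human viewer the output looks like the original cow artwork, but to the encoder its features resemble a purse; a model trained on enough such images can start associating the wrong visuals with the prompt “cow.”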