Glaze: a new way to protect your art against inclusion in Deep Learning model sets
News / Software
28. February 2023
A team of scientists from the University of Chicago is working on an algorithm that will enable artists to protect their artwork against inclusion in training sets for Deep Learning systems such as Midjourney, DALL-E or Stable Diffusion. The new algorithm, which will be released for free in the near future, is called “Glaze”.
Basically, Glaze adds invisible information to images that “cloaks” them, making it difficult for Deep Learning algorithms to learn an artist’s style from them. From the project website:
Glaze is a tool to help artists to prevent their artistic styles from being learned and mimicked by new AI-art models such as MidJourney, Stable Diffusion and their variants. It is a collaboration between the University of Chicago SAND Lab and members of the professional artist community, most notably Karla Ortiz. Glaze has been evaluated via a user study involving over 1,100 professional artists. At a high level, here’s how Glaze works:
Suppose we want to protect artist Karla Ortiz’s artwork in her online portfolio from being taken by AI companies and used to train models that can imitate Karla’s style.
Our tool adds very small changes to Karla’s original artwork before it is posted online. These changes are barely visible to the human eye, meaning that the artwork still appears nearly identical to the original, while still preventing AI models from copying Karla’s style. We refer to these added changes as a “style cloak” and changed artwork as “cloaked artwork.”
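To make the idea of a “style cloak” more concrete, here is a minimal conceptual sketch in Python. Note the assumptions: Glaze actually computes its perturbation via optimization in a model’s style-feature space, which is not public in detail here; this sketch only illustrates the general principle of a small, per-pixel-bounded change that leaves the image visually near-identical. The `cloak` function and the `epsilon` budget are illustrative names, not Glaze’s API.

```python
import numpy as np

def cloak(image: np.ndarray, epsilon: float = 4.0, seed: int = 0) -> np.ndarray:
    """Return a copy of `image` with a small bounded perturbation added.

    `image` is a float array of pixel values in [0, 255]. Each pixel is
    changed by at most `epsilon`, so the result looks nearly identical
    to a human viewer. NOTE: this uses random noise purely for
    illustration; Glaze's real "style cloak" is computed by
    gradient-based optimization against a feature extractor.
    """
    rng = np.random.default_rng(seed)
    delta = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Keep the perturbed pixels inside the valid value range.
    return np.clip(image + delta, 0.0, 255.0)

# Stand-in for an artwork: a 64x64 RGB image of mid-gray pixels.
art = np.full((64, 64, 3), 128.0)
cloaked = cloak(art)

# The change stays within the epsilon budget, so it is barely visible.
assert np.max(np.abs(cloaked - art)) <= 4.0
```

The key design point this illustrates is the trade-off the Glaze team describes: the perturbation must be large enough to mislead a model’s style extraction, but small enough that the posted artwork still looks like the original.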
Caveat: this may not be a permanent solution. DL algorithms improve constantly, and future models may recognize the glazing and try to remove it, as the team writes on their website:
Unfortunately, Glaze is not a permanent solution against AI mimicry. AI evolves quickly, and systems like Glaze face an inherent challenge of being future-proof (Radiya et al). Techniques we use to cloak artworks today might be overcome by a future countermeasure, possibly rendering previously protected art vulnerable. It is important to note that Glaze is not panacea, but a necessary first step towards artist-centric protection tools to resist AI mimicry.
Nonetheless, this may be a good temporary solution for artists trying to protect their art. And if the DL platforms develop countermeasures, Glaze can presumably be improved in turn.
The Glaze team consists of:
There is no downloadable version as of yet, but the team plans to release Windows and Mac versions in the near future, within “the next weeks”.