Australian Artist Employs Innovative Tools to Protect Artwork from AI Scraping

As of November 12, 2024, Australian artist Stephen Cornwell has been using advanced tools to safeguard his artwork from unauthorized use by artificial intelligence (AI). His digital creations, often featuring gothic and horror-inspired themes, are altered in ways that are imperceptible to human viewers but confuse AI systems.

Cornwell is part of a growing movement among artists who are taking proactive measures against AI companies that utilize copyrighted material without consent. These companies, including OpenAI, Google, and Meta, have faced criticism for their lack of transparency regarding the data sources used to train their AI models.

Researchers from the University of Chicago have developed tools like Glaze, which modifies image pixels to prevent AI from accurately replicating an artist's style. Another tool, Nightshade, goes further: it alters how AI models perceive an image's content, effectively poisoning the data used for training.
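To make the idea concrete, here is a deliberately simplified sketch of the underlying principle: nudging pixel values by an amount too small for a person to notice. This is a toy illustration only; the real tools compute targeted adversarial perturbations against the feature extractors used by AI models, not random noise as below. The function name and parameters are hypothetical.

```python
import numpy as np

def cloak(image: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Toy illustration: shift each pixel by at most `epsilon` levels.

    NOT how Glaze or Nightshade actually work -- they optimize targeted
    perturbations against model feature extractors. This only shows that
    pixel changes can be bounded tightly enough to be invisible to people.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Clip back into the valid 8-bit range so the result is still an image.
    return np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)

# A flat gray test image: after cloaking, no pixel moves more than 2 levels.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
cloaked = cloak(img)
print(np.abs(cloaked.astype(int) - img.astype(int)).max() <= 2)  # True
```

A change of one or two levels out of 255 is far below the threshold of human perception, which is why such alterations can go unnoticed by viewers while still shifting what a model extracts from the image.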

Although his website carries a disclaimer against unauthorized scraping, Cornwell has also begun using Glaze to protect his distinctive style. He views these tools as essential for artists to reclaim control over their creative works.

Cornwell expressed concerns about the implications of AI for the arts, advocating for legislative measures in Australia similar to those in Europe that allow individuals to opt out of having their data used for AI training. He noted the ongoing campaign by the Media, Entertainment and Arts Alliance (MEAA) for stronger protections for artists.

While Cornwell acknowledges the challenges of using these tools, he remains committed to finding solutions to protect his craft from being exploited by AI.
