Are your social media streams currently packed with portraits of your friends and acquaintances rendered in various artistic styles? It seems like just about everyone is hopping on the Lensa bandwagon right now, using the photo editing app’s “Magic Avatar” add-on to produce fantasy portraits for a mere $3.99.
Users simply upload 10 to 20 photos of their faces, and the app’s AI-powered image-generation tool does the rest. It may seem like harmless fun, but some artists say the AI is stealing their work. Even worse, the app is generating sexualized images of minors, and users are unwittingly signing away the rights to their own images.
Fantasy Portraits Ignite Ethical Concerns
Artificial intelligence often learns by scraping the internet for content. The Lensa app’s Magic Avatar function draws on sites where real artists upload their work, like DeviantArt, Behance, and ArtStation. Lensa runs on Stable Diffusion, a text-to-image model trained to learn visual patterns from LAION-5B, a massive online dataset of images and their captions scraped from across the web.
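To see how accessible that technology is, here is a minimal sketch of the kind of text-to-image pipeline Lensa builds on, using the publicly released Stable Diffusion checkpoint through the Hugging Face diffusers library. The model name and prompt are illustrative assumptions; this is not Lensa’s actual code or configuration.

```python
# Minimal sketch of a Stable Diffusion text-to-image pipeline.
# Assumes the public "runwayml/stable-diffusion-v1-5" checkpoint,
# the Hugging Face diffusers library, and a CUDA-capable GPU.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# A hypothetical prompt asking the model to imitate an artist's style --
# exactly the kind of request that raises the consent questions above.
prompt = "fantasy portrait of a woman, in the style of a contemporary illustrator"
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("portrait.png")
```

Anyone with a consumer GPU can run something like this in minutes, which is part of why working artists feel the ground shifting under them.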
Some artists are recognizing their distinctive styles, and even the remnants of their signatures, in the AI portraits. Kim Leutwyler, a Sydney-based artist, is among those who say they have spotted traces of their own work in the app’s output.
Another artist, Stelladia, has voiced similar concerns in interviews.
For artists, the problem is twofold. First, the AI is flooding the market with cheap art that exploits and devalues human creativity. “They are meant to compete with our own work, using pieces and conscious decisions made by artists but purged from all that context and meaning,” Stelladia says. “It just feels wrong to use people’s life work without consent, to build something that can take work opportunities away.”
Second, AI photo apps like Lensa are making bank while the human artists they draw from barely scrape by. By some estimates, Lensa is pulling in $1 million per day, and the artists who see their work surface in its results aren’t even credited, let alone compensated.
Portrait Results Are More Than a Little Problematic
Some users have found that even when they submit source images that are fully clothed and show no skin at all, the results are often semi-nude and sexually suggestive.
Lensa’s Magic Avatar has also been shown to generate sexualized images even when the source photos are of minors, and the roots of the problem lie in the training data.
AI training data like LAION is full of racist stereotypes, pornography, and even explicit images of rape. Of course, AI training data reflects the biases and prejudices of human artists and other content producers, but there doesn’t appear to be any attempt on the app creators’ part to moderate that data. While Stable Diffusion offers a filter on its dataset to limit graphic results, Lensa doesn’t appear to use it.
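For a sense of how optional that moderation is, consider the open-source release of Stable Diffusion through the Hugging Face diffusers library. The snippet below is a sketch, not a claim about Lensa’s actual configuration: it shows the output-level NSFW safety checker that ships with the public pipeline (a separate mechanism from the dataset filter mentioned above) and how a single keyword argument removes it.

```python
import torch
from diffusers import StableDiffusionPipeline

# Default behavior: the public pipeline bundles an NSFW safety checker
# that blanks out images it flags as explicit.
filtered = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# One keyword argument is all it takes for an app developer to drop that
# check entirely -- moderation is an opt-in engineering decision.
unfiltered = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
    safety_checker=None,
    requires_safety_checker=False,
).to("cuda")
```

Whether Lensa enables anything comparable isn’t visible from the outside, which is exactly the transparency problem critics are pointing to.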
Lensa also doesn’t seem to enforce its policies prohibiting nudity and minors, and it doesn’t prevent users from uploading source images of people other than themselves. That makes it easy for anyone to exploit Magic Avatar and produce sexualized images of anyone they want, including children.
Signing Away the Rights to Your Own Image
When users upload their own images to Magic Avatar and similar apps, they’re essentially selling their personal information for a very low price. Lensa’s privacy policy allows the company to collect and store the photos users upload, along with data about their faces.
Though the images are supposedly automatically deleted within 24 hours after being processed by Lensa, they’re used to train the AI to produce more portraits. Users aren’t compensated when the company uses their images for that purpose. The app thus becomes just another way to give corporations more control over our identities — and that doesn’t seem worth the four bucks.