The dark side of Artificial Intelligence and its art
Posted: Mon Jan 06, 2025 5:46 am
Interesting, right? However, Lensa AI raises many questions about how it handles the data it collects from users' selfies. Many critics have raised the alarm over data protection and copyright issues.
Ok I've been seeing people posting about the privacy issues with Lensa AI, and I thought I'd share some helpful tidbits from the TOS and privacy notice:
1. You're helping them train their AI
2. They convert your photos to face data, which they claim to not retain BUT if you worry
— Chanda Prescod-Weinstein (@IBJIYONGI) December 4, 2022
This debate has been going on since the days of FaceApp. The problem experts warn about is that users give up sovereignty over their own faces: according to the app's privacy terms, Lensa AI can use the generated images as it wishes and even resell them.
For this reason, there is a risk that images may be generated against the will of the person portrayed.
Real time watching everyone I know give their data and likeness away (because #vanity and I'm with them) to a poorly trained ML model and finally starting to care about facial recognition + racist/sexist AI as they try out Lensa. #productinclusionandequity #mlfairness #technerd
— sydneycoleman (@sydneycolemanSF) December 5, 2022
TechCrunch states that “The Prisma Labs team has responded to our concerns. The company notes that if you specifically trigger the AI to generate NSFW images, it could do so, but it is implementing filters to prevent this from happening accidentally. The jury is still out on whether this will actually help people who fall victim to this sort of thing without their consent.”
In addition, users have noticed that these portraits draw on work taken from real artists: in many of the images, the remnants of what looks like an artist's signature can be seen, which has set off alarm bells on Twitter that art is being stolen.
If you've recently been playing around with the Lensa App to make AI art "magic avatars" please know that these images are created with stolen art through the Stable Diffusion model. pic.twitter.com/VGrrECYVn5
— meg rae