Deepfake Image Generation Causing Security Threats: A Deepfake Nude Generator Reveals a Chilling Look at Its Victims
Image generation tools are gaining popularity these days, but with that popularity comes a growing risk to privacy and safety, especially for female users. Recently, a deepfake nude generator site has become very popular among users who want to create nude images of people from their photos using the AI tool.
We are still unable to identify the names of the sites that allow users to "nudify" the subjects they upload. The deepfake site is a particular concern for teenagers. Content promoting the use of the site was recently linked in YouTube videos and some subreddit posts. However, those videos and posts are now being removed, and the users behind them are being banned from the platforms.
The site offers a simple user interface for modifying images. Users who want to create and save nude images are asked to log in and purchase paid credits; the pricing details are not yet clear. Once logged in, people can upload any of their images, including group photos, have the clothing removed, and then save the resulting deepfake images.
Advanced AI image generation tools launched for artistic and creative use are also being used to generate deepfake porn images. People are generating content for porn sites using prompts like "hands on the hips." Pornography has now become one of the most popular use cases for AI tools from OpenAI and other companies.
The site reportedly has heavy traffic and a large audience. People are creating nonconsensual images using these sites. Some images found on them appear to be of high school girls and other girls under 18. In a few photos, the faces are blurred while the rest of the body remains visible to users.
There are also many photos of social media influencers and ordinary people who post their pictures on Instagram daily. People take screenshots of those images and upload them to these sites to have the clothing removed. Images of people with large Instagram or TikTok followings appear among the "Most Viewed" images listed on the site.
As for legal action, there is currently no US federal law that specifically targets the distribution of fake, nonconsensual nude images.
"If it is indistinguishable from an image of a live victim, of a real child, then that is child sexual abuse material to us," says Jennifer Newman, executive director of the NCMEC's Exploited Children's Division. "And we will treat it as such as we're processing our reports, as we're getting these reports out to law enforcement."