One very present fear about AI on a personal level is the creation of deepfake photos, including those that strip you of the most basic of privacy rights: the right to control images of your own body.
Why would someone create an AI camera that removes clothing?
How does a camera that digitally strips away clothing work?
Doesn’t deepfake nude photo technology already exist?
Nuca’s deepfake dilemma: Artistic innovation or ethical invasion?
Bottom line: Anyone could use the technology behind Nuca to create a deepfake nude photo of almost anyone else in a matter of seconds. Nuca doesn’t ask for permission before removing your clothing in the photo.
It’s worth emphasizing again that the two artists have no plans to allow others to use Nuca for commercial gain. They will showcase its capabilities in late June at an art exhibition in Berlin, Germany, all in an effort to spark public debate.
However, the next people to develop a similar technology may choose to use it very differently, for example to blackmail victims by threatening to release fake nude photos that others have no way of recognizing as fake.
Kurt’s key takeaways
Are you concerned about AI-created deepfake photos and videos affecting you personally? What safeguards should exist around the use of AI? Let us know your thoughts in the comments below.