Last week, an undisclosed number of girls at a New Jersey high school learned that one or more students there had used an artificial intelligence tool to generate what appeared to be nude images of them. Worse, those images, built from at least a dozen photographs of girls on campus, were being shared among some boys at the school in group chats. There’s an ongoing investigation, the local police are involved, and counseling has been offered to affected students. If you think there ought to be a federal law against such harmful exploitation of underage victims, or adults for that matter, I agree with you. Sadly, no federal crime covers AI-generated nudes.
So-called deepfake photos and videos are proliferating almost as fast as people can download the software that generates them. Beyond producing wholly fictional images that resemble no one in particular, deepfakes can take the face, voice or partial likeness of a real person and meld it with other imagery to make it look or sound like a genuine depiction of that person. Last spring, a deepfake photo of Pope Francis wearing a stylish Balenciaga puffer coat went viral. The pope might be fashionable, but that wasn’t him in the image.