Who Is Responsible For All These NSFW Images?

Finding the responsible parties in lawsuits is key to getting justice for the aggrieved. But when it comes to stolen NSFW images posted across various portals by wayward employees plucking content off a customer’s mobile phone, the question becomes: how far does the responsibility for this crime travel?


In a recent case, a judge ruled that the company a thieving employee was working for at the time of the theft could “potentially be liable,” and allowed the case to proceed. What specifically happened was that a woman came to a Los Angeles cell phone store to get her iPhone updated, and the clerk who helped her with the upgrade stole graphic images off her phone, then posted them online.

This decidedly landmark decision allows responsibility for the crime to be placed not only on the employee but also on the employer, the wireless carrier store where the clerk was working.

What is revenge porn?

Revenge porn refers to the distribution, usually online, of sexually explicit images without the consent of the person or persons depicted. Even if the person in the image initially consented to having the pictures or video taken (capturing graphic images of someone who is not consenting is a whole other, even more heinous crime, whether or not those images are ever distributed), distributing or posting the images without that person’s consent is a legally prosecutable act of revenge porn.

All this legal mumbo jumbo aside, most of us have heard of this kind of revenge taken by jealous partners seeking to embarrass an ex, whether by distributing pictures sent to them in a sexting exchange or by posting a private video the couple made while having sex. And surely celebrities see nudes ‘leaked’ all the time, as the public seems rabid for them. The internet remains replete with revenge porn sites, even as these portals get shut down all the time.


And ‘good riddance’ as any of us would say.

Unfortunately, this crime has grown over the years to include instances like the one above, of employees taking liberties with their customers’ phone images and videos. In these cases there is usually not even a personal relationship between the thief and the victim, and no actual ‘revenge’ angle to the images being stolen and posted; it’s more a simple act of one person embarrassing another because they have the technical know-how to do so.

And yes, they should be prosecuted if found.

But should their employer?

Respondeat superior

According to the legal doctrine of respondeat superior (Latin for "let the superior answer"), an employer is indeed legally responsible for what their employee does while that employee is performing the job they were hired to do. This could be interpreted to mean, as the judge in the recent case did interpret it, that because the phone clerk stole graphic images off a customer’s phone while performing the job they were employed to do, the theft fell under respondeat superior and the employer could be legally responsible for the clerk’s actions.

The judge in the above case certainly seems to think the employer is.

Where does the law come down on AI deep fakes?

As we have all seen, there have been a number of recent cases where AI-generated or deepfake NSFW images have been posted online. What happens in many of these cases (as it did at a New Jersey high school last year) is that someone, or a group of adept computer miscreants, takes images from one source and, through an AI image generator, creates wholly new, fake graphic images.


Congress just passed the Disrupt Explicit Forged Images and Non-Consensual Edits Act, or DEFIANCE Act, to address this issue, among others. But the creation and posting of an AI-generated image is not, technically speaking, revenge porn.

That’s not to say it isn’t a crime.

The subjects of the pictures (in the NJ case, underage teenage girls) never posed for a graphic photo or video. Pictures they had posted across social media were downloaded and filtered through an AI program to create graphic images (nudes, in most cases) that did not exist before; technically, there was no consent to give, as the NSFW pictures posted did not exist before they were wholly created by AI.

Of course, there are invasion of privacy issues, and the representation of a graphic image of a minor, misrepresentation of an underage person, harassment, and the distribution of child sexual abuse materials could all apply. And just as the case of the images stolen by a clerk updating a woman's iPhone might prove the store where the clerk worked partly liable for the crime, if students use school computers or hack into school networks to perpetrate this AI fake crime, or display the images across a school social network, is the school liable as well?

Murky waters to be sure.

Ralph Greco

Ralph Greco, Jr. is an ASCAP licensed songwriter, professional playwright, the senior east coast correspondent/reviewer/interviewer for vintagerock.com, press liaison for The Erotic Heritage Museum, blogger for latex designer Dawnamatrix Designs, co-host of the podcast Licking Non-Vanilla and a professional copywriter for adult as well as mainstream clients around the world. Ralph is now the resident Staff Writer for Kinkly as well. Ralph’s short fiction (erotic and ‘straight’) poetry and essays have been published in eight...