Ethics in development

27th June 2019

This post has been updated – skip to the update below.

With every boundary that is pushed – there will always be people saying you’ve gone too far. Smartphones are ruining conversations, facial recognition is destroying privacy, and AI is going to take over the world.

While most of these fears never come to pass, many of these fledgling technologies can do a lot of harm in the wrong hands. AI has a relatively long history of bias against women, especially women of colour.

Facial recognition programs frequently have higher error rates on female faces than on male ones, and significantly higher error rates again for women of colour. This might not seem like a big deal, but these models are trained on the data we supply them: the AI is biased because it learns that bias from us.
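
To make the "higher error rates" point concrete, here's a minimal sketch of the kind of disaggregated audit that exposes this, in the spirit of studies like Gender Shades. The numbers below are invented for illustration, not measurements from any real system:

```python
# A toy audit: per-subgroup error rates vs the overall figure.
# The (correct, incorrect) prediction counts are invented.
counts = {
    "lighter-skinned men":   (50, 1),
    "lighter-skinned women": (46, 4),
    "darker-skinned men":    (44, 6),
    "darker-skinned women":  (33, 17),
}

total_wrong = sum(wrong for _, wrong in counts.values())
total = sum(correct + wrong for correct, wrong in counts.values())
print(f"overall error rate: {total_wrong / total:.1%}")  # looks tolerable

# Break the same results down by subgroup and the disparity appears.
for group, (correct, wrong) in counts.items():
    print(f"{group}: {wrong / (correct + wrong):.1%} error rate")
```

The single aggregate number hides exactly the disparity that matters: in this toy data the error rate for the last group is more than an order of magnitude worse than the first, and you only see it if you break the results down.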

Part of this stems from women not being represented. Without a seat at the table, it's no wonder these mistakes are made. Ideally, a room full of men would have the wherewithal to think "hmm, better pop some pictures of women in there", but it would be a lot better if women were actually in the room.

Making fake nudes

Speaking of rooms that don’t have female representation – whoever made this definitely made it without consulting anyone else. At all. Ever.

Some guy who probably thinks he's very smart has made a web app (also available on Windows and Linux) that can generate a fake nude of any woman. Any unsuspecting, unconsenting, unaware woman. You do have to pay $50 to remove the watermark from the image.

Now, for anyone who isn’t sure why this is a terrible thing, let me explain. This app gives anyone the power to create a fake nude picture of a woman (just women, it doesn’t work on men) and then distribute it. All without the woman’s knowledge or consent. It would be naive to think this won’t happen. Nudes are leaked every single day and revenge porn is everywhere. And on top of that, women are blamed. Blamed for taking the picture, for sending it to their partner, for not having 2FA enabled. And now blamed for… posting a fully clothed image?

“So if someone has bad intentions, having DeepNude doesn’t change much… If I don’t do it, someone else will do it in a year.”

‘Alberto’ – the creator of DeepNude

He’s not wrong; the technology exists, so someone would probably have made it eventually. The problem isn’t that it was made (although Jesus Christ), it’s that it was created without any thought about the harm it could do. ‘Alberto’ (as he wants to be known) didn’t even seem to consider that his app WHICH MAKES NUDES OF WOMEN WITHOUT THEIR CONSENT could ever be a bad thing.

“Is this right? Can it hurt someone?”

‘Alberto’

The utter and complete disregard for women that ‘Alberto’ shows should be horrifying. It should cause shockwaves and make people recoil. But it won’t. He’ll have backers and make money from the app and there will be hundreds of people willing to defend the app and its users.

We can only hope that revenge porn laws are updated to include deepfakes. Without that protection, women will always be vulnerable to revenge porn no matter what precautions we take.

**Update**

So, less than a day after DeepNude hit the press, its creator has taken it down.

Screenshot of a tweet by @deepnudeapp, taken from their Twitter account; the text is quoted in full below.

So there’s a lot to unpack here. Good thing I’ve got time on my hands.

Here is the brief history, and the end of DeepNude. We created this project for user’s entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner.

So they admit this was done for entertainment, rather than their original line of ‘oh, we’re trying to monetise the algorithm to improve it’. The assumption that an app like this wouldn’t at some point go viral is a bit naive, but I can let that slide.

Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic. We greatly underestimated the request.

Ok, so they’re now downplaying how good the app was. Yes, a maxi dress confused it and odd angles threw it for a loop, but it was still pretty good at making women appear naked. Without their consent. Also, note that they have not yet apologised for actually making the app. Or for not realising the impact that something like this could have on a woman’s life.

Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high. We don’t want to make money this way.

I would argue that if you don’t want to make money from an app that creates fake nudes of unconsenting women, you shouldn’t monetise an app that creates fake nudes of unconsenting women. Or, maybe, you shouldn’t have made it (and marketed it) in the first place. Also, saying ‘misuse’ implies there’s another use for the app other than ‘make any woman you want naked’. Which there isn’t.

Also – a watermark isn’t a ‘safety feature’. It’s the ploy they used to make people pay to have it removed. In reality, women could lose their jobs if a nude of them became public. Could you imagine a fake nude of a teacher being spread around their school? How many schools or local authorities would back the teacher, even if the image was watermarked? Or think of the women who could face abuse from their partners or families if a fake nude of them was created.

Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it. Downloading the software from other sources or sharing it by any other means would be against the terms of our website.

Oh well, wouldn’t want to break the terms of your website. I’m sure that will stop people.

From now on, DeepNude will not release other versions and does not grant anyone its use. Not even the licenses to activate the Premium version.

People who have not yet upgraded will receive a refund.

Back to just talking about their customers. Sorry, lads, you’ll have to get nudes the traditional way.

The world is not ready for DeepNude

Sorry, what? The world is not ready for an app that makes fake nude pictures of unconsenting women. What world is?! Gilead?! Where women have no rights or agency and are at the mercy of men who see them as nothing more than objects and accessories!

Let’s be real. The developers of this app don’t care about women. They describe the app as “The superpower you always wanted”. This wasn’t a fun way to experiment with AI or a way for them to drive research into the field. It was a blatant power play by the developers. They see no issue in creating these fake nudes. To them, it’s just a bit of fun. Grab a picture of your ex-girlfriend and make her naked. Then you’ve got that picture forever! And you can do whatever you want with it. That sure showed her, right?

The ‘deepnudeapp’ Twitter account header, with “The superpower you always wanted” as the profile tagline.

Responsibility

In their non-apology, the developers are trying to absolve themselves of any responsibility for the very real harm that their app does. It’s not their fault if someone makes a fake nude of a woman and shares it. That’s not the developers’ fault! They couldn’t possibly have known what would be done with an app that makes nudes of unconsenting women.

Bull crap. They are responsible. Not wholly, but partially. Yes, ultimately it’s the fault of the person creating and sharing the fake nude. But when you give a person the tools, you have a level of responsibility.

There will be copies of DeepNude. You can’t put the stopper back in the bottle. As AIs get smarter and frameworks get better, it’ll become easier and easier for these sorts of apps to be made, and there won’t be a shortage of people willing to make them. All we can do is keep rejecting these apps. Keep writing articles damning them and their creators. Keep educating people as to why these apps are terrible and the very real harm they can cause, and keep striving forwards towards equality.