
Send Facebook your porn… to stop your porn from spreading.

Facebook began a new initiative today to combat revenge porn. It’s simple: just send them naked photos of yourself so their proprietary recognition software can automatically take down future copies of those photos. Yeah, you read that right. To stop revenge porn, give your porn to Zuckerberg.

It’s hard for me to even begin writing this because of the massive absurdity you have to swallow along with it. Okay, so Facebook wants you to send them nudes to prevent nudes from being uploaded to their servers. First, Facebook doesn’t allow pornographic images on its sites in the first place; the people posting these images should be dealt with without victims having to take such a horribly intrusive preventive measure. Second, why on earth would I share nude photos of myself with Facebook, a company that aggregates more data, and far more personal data, than almost any other tech company on the planet, for it to store indefinitely on its servers? The scheme: upload your nudes to yourself in Messenger, then delete them. Facebook claims “They’re not storing the image; they’re storing the link and using artificial intelligence and other… tech.” That sounds like a load of shit to me, the same kind of purposefully verbose and unintelligible user-agreement shit Facebook is known to traffic in.
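For what it’s worth, the least creepy reading of that claim is some kind of perceptual hash: a short fingerprint derived from the image that survives resizing and re-encoding, so only the fingerprint has to be kept, not the photo. Facebook hasn’t published its pipeline, so here’s a back-of-the-envelope sketch using a simple average hash; everything in it is an illustrative assumption, not Facebook’s actual method:

```python
# Sketch of "store a fingerprint, not the image" via a perceptual
# (average) hash. Assumption: Facebook's real system is far more
# sophisticated; this only shows the general idea.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint; similar images hash alike."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: is it brighter than the average?
    bits = "".join("1" if p > avg else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1: int, h2: int) -> int:
    """Small distance = probably the same picture, even after re-encoding."""
    return bin(h1 ^ h2).count("1")
```

In that world, an uploaded photo gets hashed and compared against the stored fingerprints; a small Hamming distance triggers a takedown. Which brings us to why fingerprint matching is easier to defeat than it sounds.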

Keep in mind this is the same Facebook that couldn’t figure out it was allegedly being used by “hostile foreign agents” to manipulate our entire country during an election season. The same Facebook that has yet to make any concrete pledge that it won’t happen again, or even explain how it’s going to address the issue. And a shady internet company isn’t the only problem with this plan.

Most photo-recognition software relies on neural networks, which are easily fooled by minor, unnoticeable changes to a photo. This article on GitHub goes into technical detail about the limitations of image classifiers and how easy they are to manipulate. With a small amount of imperceptible noise added to the original photo, not only is it classified incorrectly; specific noise can be crafted to push the network toward an intentional misclassification with a high rate of success. This isn’t a problem with the images, or even with the manipulations themselves, but a flaw in the current methodology: a tension between the model’s ability to “learn,” taking in new information and incorporating it as a useful advance in its understanding, and its resistance to what it treats as irrelevant or even intentional perturbations. The author goes on to show that the flaw is inherent to this learn-versus-resist model, as even adjusting all the parameters of the existing methodology had little effect on this specific type of manipulation.
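To make the attack concrete, here’s a minimal sketch of the classic fast-gradient-sign perturbation (FGSM, Goodfellow et al.), one well-known instance of the technique the article describes. The model, epsilon, and inputs are all placeholder assumptions for illustration, not whatever Facebook actually runs:

```python
# Minimal FGSM sketch: add imperceptible noise that raises the
# classifier's loss, often enough to flip the predicted class.
# Assumptions: a stock torchvision classifier stands in for the system
# under attack; `image` is a [0,1] tensor of shape (1, 3, 224, 224)
# and `label` is its true class index.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm_perturb(image: torch.Tensor, label: torch.Tensor,
                 epsilon: float = 0.01) -> torch.Tensor:
    """Return a copy of `image` nudged in the direction that hurts the model."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Move each pixel by at most `epsilon` along the sign of the gradient;
    # to a human eye the photo looks unchanged.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```

A targeted variant simply descends the gradient toward a chosen wrong label instead of ascending away from the right one, which is the “intentional misclassification” case the author measures.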

So, are any of you going to be sending Mr. Zuckerberg your risqué photography sessions? I hear if you classify them as art you can pretty much get away with anything over there. Can’t wait for someone to publish The Art of Revenge Porn.