Having staff examine a photo is better than using algorithms
Revenge porn, the practice of posting intimate photos of a former partner on social media, has become terrifyingly frequent in recent years. The practice deserves fear and condemnation, because a social media platform reaches virtually everyone you are closest to. Having such photos displayed publicly for everyone you know and care about to see is humiliating and emotionally damaging. Beyond this, photos posted online can exist in the ether forever, harming a person’s quality of life and job prospects.
The question, then, is how far social media sites should go to prevent such acts. Facebook has proposed an answer, and so far it seems to be the best one available.
Facebook users in Australia will have access to a program specifically designed to pre-emptively stop the proliferation of revenge porn. If you’re worried about risqué photos being posted online, you can submit an application form and answer questions on the website of Australia’s eSafety Commissioner, then send yourself the picture on Facebook. Facebook will then essentially remember the photo, creating a digital fingerprint of it, and will prevent it from ever being uploaded to the site, whether on the news feed or in a private message.
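To make the mechanism concrete, the sketch below shows how hash-based blocking of this kind can work in principle. The function names are hypothetical and the plain SHA-256 digest is a stand-in for whatever fingerprinting Facebook actually uses; real systems reportedly rely on perceptual hashing, which also matches re-encoded or lightly edited copies, whereas this toy version only catches byte-identical files.

```python
import hashlib

# Fingerprints of reported images; only the digest is kept, never the photo itself.
blocked_hashes = set()

def fingerprint(image_bytes):
    """Return a hex digest that identifies the image without storing it."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_reported_image(image_bytes):
    """Called once a human reviewer has confirmed the image is explicit."""
    blocked_hashes.add(fingerprint(image_bytes))

def is_upload_allowed(image_bytes):
    """Check every new upload (news feed or private message) against the list."""
    return fingerprint(image_bytes) not in blocked_hashes

# The reported photo is blocked on any later upload attempt; other photos pass.
reported = b"raw bytes of the reported photo"
register_reported_image(reported)
assert not is_upload_allowed(reported)
assert is_upload_allowed(b"raw bytes of an unrelated photo")
```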
There is, however, a crucial step in the process that can cause anxiety. Before a photo can be pre-emptively excluded from the site, a Facebook employee has to view it to confirm that it is an explicit image. Facebook’s reasoning for requiring human confirmation is to guard against censorship: without it, a user could flag any photo they dislike as revenge porn and have it removed from the site.
The use of human confirmation in such an intimate scenario is unfortunate, but it is the best answer. A purely automated check would presumably require keeping the actual images on file for comparison, and the idea of a Facebook database filled exclusively with revenge porn seems like a disaster waiting to happen. As last week’s Paradise Papers have shown, nothing on the internet is entirely secure, and such a database could certainly fall into the wrong hands.
The new program must be assessed for its effectiveness. Facebook’s solution seems imperfect, because it is, but it’s the best one available. If you really are worried about a malicious ex posting intimate photos online, you face a stark choice: do you want one person you will never know or meet to see such photos, or everyone you know, and don’t know, to see them daily? The difference in humiliation between these two cases is enormous. The thought of a stranger seeing your photos will surely make you uncomfortable, but the alternative is enough to make you sick.
And hacking such a database won’t be straightforward either: Australia’s eSafety Commissioner confirmed that “they’re not storing the image” but rather “storing the link” to it. This means that illegally accessing such photos won’t be as simple as downloading a batch of image files; extra steps would be required to actually view them.
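To illustrate why a leaked fingerprint is far less damaging than a leaked photo, the short sketch below (again using SHA-256 as a hypothetical stand-in for the stored record) shows that what sits in the database is a short, fixed-length digest from which the original image cannot be reconstructed.

```python
import hashlib

# Toy stand-in for a multi-megabyte photo and the fingerprint kept on file.
photo_bytes = b"\xff\xd8\xff\xe0" + b"pixel data " * 100_000  # fake JPEG-like payload
digest = hashlib.sha256(photo_bytes).hexdigest()

print(len(photo_bytes))  # roughly 1.1 million bytes of image data
print(len(digest))       # 64 hex characters: all an attacker of the database would obtain
# The digest is a one-way function of the photo; it cannot be run backwards
# to recover the pixels, so a thief would still need the image itself.
```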
Facebook’s attempt at ending revenge porn is worthy of our admiration. The inconveniences the solution imposes are far outweighed by its utility. It’s easy to scorn the seemingly unnecessary human intervention, and to worry about the fragility of a revenge-porn-linked database. But for those who face this problem, such a service is perhaps a step in the right direction.