By Karley Sciortino / above artwork by Richard Prince /
Over the past week, people have been sounding off about how Instagram censored the below photo of a fully clothed woman on her period. Dark times, huh? The photo is from a series of artworks by the Toronto-based poet and artist Rupi Kaur. In her original artist statement, Kaur said the objective of the work was to demystify and destigmatize the female body — to make viewers “realize these are just regular, normal processes,” nothing to reject or shame. Well, sorry–Instagram disagrees with you! Your period is gross; remove this unsexy image at once.
After Instagram removed it, Kaur posted: “Thank you @instagram for providing me with the exact response my work was created to critique … when your pages are filled with countless photos/accounts where women (so many who are underage) are objectified. Pornified. And treated less than human.”
Yeah, well, I’ve known that the social media overlords hate periods for a while now. Last month, Facebook deleted the above photo–taken by Petra Collins of Sandy Kim, Alice Lancaster and me–within an hour of me making it my banner image. Like, what?! That is the coolest photo on earth! Facebook should be honored to host it!
In honor of these dark times, below is an essay I wrote for the current print issue of Purple magazine about censorship and social media.
The first image ever censored from my Instagram account was of a painting by the famous French realist Gustave Courbet. Titled “Le Sommeil” (Sleep), 1866, it depicts two nude women sleeping in a peaceful embrace. After its removal, I got one of those “you breached the terms of service” emails, which annoyingly never specifies which image was deleted and inevitably leaves you scrolling through your timeline, trying to pinpoint the missing link. It took me quite a while to realize that “Le Sommeil” was the culprit, as I never expected that a painting, let alone a classic piece of art, could be deemed inappropriate viewing, even by the most puritanical of judges. But apparently, we’ve reached such a level of unsophistication that there’s no longer a distinction between art and porn–all nudity is pornography, at least according to social media sites, which have inadvertently become the quotidian authorities on what we are permitted to say and see, and on the significance of the nude body.
We’ve come quite a way since the time of book burnings and whitewashing pieces of history from children’s textbooks. We congratulate ourselves for our freedom to speak. And yet somehow, we seem to overlook the fact that, on a daily basis, we are consuming and sharing information through the heavily patrolled, omnipresent censor-world of social media.
I’m not the only one who’s been scolded by Instagram for posting iconic works of art. Works by Courbet, Richard Prince and Robert Mapplethorpe have gotten countless magazines and galleries booted off Instagram entirely. Even Prince himself had his account deleted after posting his controversial artwork ‘Spiritual America,’ 1983–a punishment he later said felt “strange and confusing.” I relate. It feels so ridiculous to be reading an article about a new Ryan McGinley exhibition in the New York Times, with accompanying images, and then to open Instagram and see the same images, except with superimposed hearts over any visible nipple or bush, as if I’ve somehow teleported back to being a 7-year-old in Victorian England. (And let’s just acknowledge the irony that by covering certain body parts, we insist on sexualizing them, regardless of the artwork’s original intent–e.g. glittery stars over Marina Abramovic’s breasts during a piece of performance art.)
Instagram is square, in more ways than one. In its terms of service, Instagram states that users are banned from posting content that is “violent, nude, partially nude, discriminatory, unlawful, infringing, hateful, pornographic or sexually suggestive.” And not only does Instagram censor what we post, it also blocks users from searching certain hashtags. The 100-plus blocked hashtags predictably include words like #pornography and #blowjob, but also extend to seemingly innocuous words like #bra, #fetish and #lingerie. Yet for some reason, #faketits and #underboob are allowed. The inconsistency with which the policy is enforced makes it even more annoying. (And that’s only gotten worse since Facebook bought Instagram for a billion dollars in 2012.)
The apparently arbitrary enforcement of the no-nudity policy was brought to media attention in 2013, when Instagram famously deleted artist Petra Collins’ popular account after she posted an image of herself in a bikini with a visibly unshaved bikini line. Because there are roughly a zillion images of girls in bikinis on Instagram, the issue was clearly that Petra’s body didn’t meet society’s standard of “femininity.” As Petra said, “It’s an example of the pressure to succumb to society’s image of beauty literally turning into censorship.”
Clearly, it’s ridiculous to censor photographs of historical paintings, women breastfeeding and unshaven bikini lines, while for some reason allowing Kim Kardashian to post the photo of her nude, oiled butt that graced the cover of Paper magazine late last year. Soon after Kim’s butt pic, comedian Chelsea Handler posted a comedic photo of herself riding a horse shirtless, mocking the famous image of Vladimir Putin in the same scenario. Instagram censored the image. Afterward, Handler called out the double standard, posting a statement that said, “Just so I’m clear, Instagram…it’s ok to use nudity to sexualize yourself on your site, but not to make a joke? I’m just so confused.”
Apparently, Instagram has appointed itself the body police. But by censoring nudity, it is hugely hindering people’s ability to share and discuss art, film and photography, and to create a dialogue around sexual politics and sex ed. And by choosing to allow sexual images–for example Playboy’s Instagram, full of women in tiny bikinis–but to remove comic or artistic images of nude female bodies, social media sites are sending a clear message: women’s bodies exist solely to be sexually stimulating, and if they are not serving that purpose then they should be removed from sight.
Being a journalist, I am predictably against censorship of any kind. At the same time, I can understand the desire to prevent Instagram (which allows users as young as 13) from turning into a porn aggregator. But is a strict no-nudity policy the only way to achieve that?
I love porn, and I love art, but I understand that they are not the same thing. The definition of pornography extends far beyond nudity–porn is the explicit display of sexual organs or activity, with the specific aim of stimulating sexual excitement and climax. And I assume anyone with half a brain understands that this is not the ambition of art. But Instagram’s current policy leaves no room for distinction. Of course, there are always grey areas, and the argument over where to draw the line between art and pornography will probably never end. But lines are drawn all the time, and people with an advanced knowledge of art are better suited to draw those lines than people without it. If it’s someone’s job to decide that #faketits and #underboob are OK hashtags, but that #cleavage isn’t, then surely it’s possible for an artistically literate person to decide whether a specific image that contains breasts is pornographic, and therefore inappropriate for Instagram, or whether it’s art, and therefore of social value.
“Facebook,” said the influential legal commentator Jeffrey Rosen, “wields more power [today] in determining who can speak . . . than any Supreme Court justice, any king or any president.” That’s a scary thought. And while social media sites claim to support free speech, they all create “terms of service” that disregard our legally protected, basic human right of freedom of expression.
Instagram claims that its restrictions are there to protect its users. But protecting and censoring are two different things. It’s a slippery slope when the private tech companies that run social media sites proactively step in to decide what information is valid for the public to consume. The bottom line is that whether or not each of us looks at a photograph or watches a video should be our own decision. And while government censorship is unfortunately a familiar battle, mass censorship by a private tech company is something new, something different, and something we should not be passive about.
WikiLeaks founder Julian Assange, in his recent book Cypherpunks: Freedom and the Future of the Internet, states: “The internet, our greatest tool of emancipation, has been transformed into the most dangerous facilitator of totalitarianism we have ever seen.” Sure, that sounds slightly melodramatic, but Assange is not the first person to raise such a warning flag. From George Orwell’s Nineteen Eighty-Four (1949), to Guy Debord’s Society of the Spectacle (1967), to Michel Foucault’s lectures on biopolitics in the ’70s, many before him have warned of “societies of control” and of technocratic governments that restrict all aspects of human life.
Ideally, we would live in a world free of censorship. But in the absence of utopia, I suggest that rather than social media sites imposing filters, filters should be available on request, thereby allowing the acceptance of all content and the avoidance of censorship. Alternatively, users’ feeds could be rated, similar to how we rate films.
Realistically, people on the internet are fickle: if and when we become fed up with the censorship of Instagram, or any other social media site, we will happily drift on to the next. The fear, however, is that we’ve become so accustomed to being silenced and controlled that we will soon stop caring, or even noticing.