Facebook issued an apology after mistakenly deleting a post from free-speech group Article 19 about human rights abuses in Syria.

The Guardian reported that the social network said the post was mistakenly removed after being reported as containing offensive content. Facebook added that its moderation team receives “a high volume of take-down requests,” and it issued the following statement:

The link was reported to Facebook. We assess such reports manually and, because of the high volume, occasionally, content that shouldn’t be taken down is removed by mistake. We’re sorry about this. The organization concerned should try posting the link again.

The post from Article 19 linked to a Human Rights Watch report on alleged torture in Syria, and it read:

@hrw publishes a shocking report into #torture in #Syria including geo-tagged detention centres http://ow.ly/bZ6Yl ^OS

Article 19 Executive Director Agnes Callamard was not satisfied with Facebook’s response, telling The Guardian that the group received no explanation or warning beyond a pop-up message that read:

We removed the following content you posted or were the admin of because it violates Facebook’s statement of rights and responsibilities.

Callamard told The Guardian:

The deletion shows the looming threat of private censorship. We commend Facebook for creating tools to report abuse, but if your post was wrongly deleted for any reason, there is no way to appeal. Facebook doesn’t notify you before deleting a comment, and it doesn’t tell you why after it has. Facebook acts like judge, jury, and executioner.

Facebook is now widely recognized as a quasi-public space and, as such, it has responsibilities when it comes to respecting free speech. It can’t just delete content without some kind of transparent and accountable system. International law says that censorship is only acceptable when it is clearly prescribed, is for a legitimate aim — such as for public health — and is necessary in a democracy.

Readers: Was Facebook’s apology sufficient, or does the social network need to take an even closer look at its content-reporting policies?