The Balkan Troll of the Month is an individual, a group of individuals or a media outlet that spreads hate on the internet based on gender, ethnicity, religion, or other diversity categories. The Balkan Troll is selected based on hate speech incidents identified across the Western Balkans region.
Our October Troll of the Month is a private Facebook group that shared inappropriate photos of underage Roma girls without their consent – an act that is both illegal and contrary to all ethical guidelines, and one that promotes further gender-based violence.
It recently became known to the North Macedonian public that a private Facebook group with over 2,000 members was sharing nude photos of young Roma girls without their consent. Users were encouraged to share photos of these girls taken from private messages and public social media profiles, after which other users would post hate speech and inflammatory comments beneath them, including labelling the girls as ‘indecent’. The incident created a further dynamic of hatred and shaming directed at an already highly marginalised social group.
The group was allegedly created by a Roma citizen, and most of the users who joined it were from the Roma community. However, the exact profiles of the members are not fully known due to the private nature of the Facebook group.
This is not the first case of social media groups sharing explicit photos of girls, following what became known as the ‘Public Room’ scandal earlier this year. The Public Room affair involved some ‘7,000 users, who shared explicit pornographic content’ on the messaging platform Telegram. The group was first shut down in January 2020 following public outcry, but it re-emerged one year later before being closed down again on 29 January. In both instances, explicit content featuring often underage girls was shared illegally and without consent on social media platforms.
North Macedonian police have confirmed that ‘instances of online sexual abuse’ have increased since the start of the Covid-19 pandemic. This underlines the seriousness of this case and the need for mechanisms to prevent and respond to online sexual harassment and violence against women and girls.
The group sharing nude photos of Roma girls has since been deleted from Facebook, following the public outcry after its activities came to light. Furthermore, a number of human rights NGOs are pursuing legal action against the users who created the group.
Nevertheless, these legal actions provoked a wave of hate speech and personal attacks against the president of the most vocal NGO. Individuals presumed to be associated with the group shared photos of his family in a series of relentless attacks accompanied by hateful comments.
Despite the swift response of civil society, which immediately alerted the authorities and demanded that the group be removed from Facebook, a bigger issue remains.
As this is not the first instance of indecent, illegal photographs of underage girls being shared in private groups on social media platforms, this case raises important questions about the protection mechanisms and regulations in place to prevent such events. Such content is not only extremely harmful and illegal; its circulation exposes the need for rigorous social media regulation capable of stopping inappropriate content from being shared in the first place. It also highlights the importance of collective action in demanding that authorities and social media companies remove such content.
Social media platforms such as Facebook have a moral and legal duty to enforce protection mechanisms that prevent unlawful, inappropriate content from being shared on their platforms. This is an extremely sensitive issue and must be dealt with immediately to ensure that such incidents are not repeated in the future.