Last Thursday, during a committee hearing in New Zealand's Parliament, Green MP Golriz Ghahraman called for new content-moderation laws governing Facebook. Mia Garlick and Nick McDonnell, Facebook's two regional directors of Public Policy for Australia, New Zealand and the Pacific, made their first appearance before the Justice Select Committee to discuss the matter.

Ghahraman questioned Facebook's monopoly over the decision-making process since the Cambridge Analytica scandal. In 2018, contractors and employees of Cambridge Analytica were reported to have sold psychological profiles of American voters to political campaigns, profiles built from the private Facebook data of millions of users. As the largest data breach in Facebook's history, the Cambridge Analytica scandal brought to the fore the question that dominates policy and academic debate today: who should govern these platforms, and how?

Ghahraman called for an end to treating Facebook as a neutral intermediary that merely delivers content. "I think the time of us treating you as the postman rather than the publisher has passed," said Ghahraman in Parliament. "You are the publisher, and we hold publishers to account in a very different way in online spaces." Today, Facebook and other social media giants can moderate content as they see fit, thanks in large part to Section 230 of the Communications Decency Act in the United States, which states that "no provider…shall be treated as the publisher or speaker of any information provided".

This has proven increasingly problematic in recent years, as platforms have come to play a preponderant role in many democratic countries without the transparency or accountability required of other media in those societies. Ghahraman called for new moderation policies and for shared decision-making between governments and platform companies. "We've got a multinational, multibillion-dollar company making decisions about freedom of speech and our democracy, but we don't have any clear guidelines about how they're applying those standards," declared Ghahraman. "We live in these online spaces now, and we have all sorts of laws criminalising threatening speech, regulating speech that defames, and somehow we don't treat these online spaces to be accountable in the same way."

Garlick explained in Parliament that Facebook's Community Standards and its enforcement reports are publicly available and regularly updated on the company's transparency centre. "As we get better at detecting and removing fake accounts or other malicious activity on our services, people will try to game the system and seek new ways around that, and so it's a constantly evolving process for us to identify new ways in which people are attempting to misuse our services so we can get ahead of that both in terms of whether we need to update policies or invest more in technology," stated Garlick.

Although Facebook officials assured Parliament that they have been working on the matter, Ghahraman said she remained "dissatisfied" and compared New Zealand's treatment to Germany's. "I know that in the European system, particularly in Germany, they have a far more effective way of ensuring safety when users access Facebook, when it comes to extremism. That's because they have a 50 million euro fine for breaches that will apply to Facebook if their hate speech laws are breached," said Ghahraman, referring to Germany's strict hate speech legislation.

"The main difference with Germany is we're determined to require illegality of content, which I think is not a place we're comfortable with being. But there's certainly no difference in terms of how we develop our policies and enforce them. We have global standards and then if there's a law in a particular country that sets a different standard, we'll restrict access to content that complies with our standards but doesn't comply with local law," explained Garlick.

According to Ghahraman, Facebook should not be the sole moderator of content and privacy. "I don't think you should be left self-governing anymore," argued Ghahraman. Other parts of the New Zealand Government share this view: Kris Faafoi, the justice minister and minister for broadcasting and media, declared he was open to collaborating with the Greens on election-related legislation. "We've got a cooperation agreement with the Greens to look at electoral laws, so I'm happy to work with them on that," Faafoi told NewsroomNZ last week. "I do think that Facebook and other social media platforms could do a little better in terms of speed of takedown with some of their misinformation – not just in terms of electoral laws," he added, echoing Ghahraman's concerns.

This call for new platform-regulation legislation in New Zealand is likely to resonate in other Western countries, where political science and communication scholars have been advocating similar measures for years, hoping such laws will set an example for the rest of the world.

Eva Julia van Dam

Hi! I am a Franco-Dutch student living in Montréal. Having just graduated with a Bachelor's degree in Political Science from McGill University, I developed an interest in journalism throughout my studies. I've worked with the McGill Tribune and the Bull & Bear, and recently collaborated with La Gazette des Femmes in Quebec City. I have also been an associate editor with the McGill Historical Discourse. My areas of interest are international relations, cultural diplomacy, feminism and gender studies, environmental issues, and journalism. I love to write, learn and grow, and I'm excited to work with Global Green News.

