Helen Clark Foundation recommendations to monitor & penalise posters of ‘harmful content’

Transcript starts at 4:12.

Information

This is a partial transcript of Jack Tame's NZ Q&A interview with Helen Clark and Kathy Errington (Executive Director of the Helen Clark Foundation), pertinent to the Helen Clark Foundation's recommendations in its report "Anti-Social Media: Reducing the spread of harmful content on social networks".


Jack:

Let’s talk about some of the suggestions in your report – how is New Zealand’s current social media regulation serving us?

Kathy:

So, at the moment we have a patchwork of legislation and agencies, and there are big gaps in it. Most of the legislation predates the existence of social media, and the parts that do cover it provide exemptions for social media platforms. So, it's sort of like a patchwork with big holes cut in it at the moment.

Jack:

You want to see a regulatory body like the ones we have for broadcasting, like the ones we have for other printed forms of media, the Media Council and the BSA. How would that work for social media?

Kathy:

So, to start with, what we want is for the government to at least review and bring our laws up to standard, and one way we propose that could be done is by having an independent regulator, similar to the BSA, which is of course regulating our conversation now, and I don't feel particularly constrained, ah… by that body.

I don't think it unduly limits freedom of speech, but what the… what this regulator could do is agree a set of standards and have a process where public input could be gathered into where the lines are drawn. Because, like I said before, the current situation isn't a free-for-all of free speech on social media.

They take content off all the time, ah… for example, I've been in new mothers' groups, and I have my baby, and breastfeeding content was removed all the time because it violated rules on nudity and the… yeah… and I don't agree with that. But I had no forum where I had a right to be heard… to… to protest that, so what we are saying is we need democratic models of how this can be done.

Helen:

As I see the report recommendations, they are not dissimilar to what is being looked at in the United Kingdom. You have the statutory duty of care that is placed in law on the social media companies, and you have a regulator who has general oversight. Are they exercising that duty of care? And the regulator probably should be able to investigate of its own accord.

Jack:

And so, what do you do if they don’t?

Kathy:

There need to be penalties. So there need to be penalties for people…

Jack:

What (indistinct).

Kathy:

So, as you'd see internationally, there are three different kinds that are emerging.

The first is financial penalties, the second is liability on executives, and the third is business disruption – so that's things like internet service providers being ordered to take sites down, and that's a massive stick – particularly the latter one – and it's already happened in New Zealand. ISPs took sites offline during the Christchurch attack – but with no public input – and some did and some didn't, so it was uneven.

Jack:

Would you support… would you support… and if something like Christchurch were to happen again, would you support taking Facebook offline for 24 hours like the Broadcasting Standards Authority can do to this very TV channel?

Kathy:

What I would say is that we need a system that is robust enough to be able to do that. Because on my own I couldn’t say that. I wouldn’t want to have that much power but there should be a body that can make those kinds of decisions and at the moment there isn’t.

Jack:

Should social media companies be able to live-stream?

Kathy:

So, there's a… they need to develop better technological processes to detect this content. They're already doing that, and working on it. And the second part is, if in the short term they cannot reliably detect that content, there are other options that could work. For example, limiting live-streaming to verified accounts, because then, if you knew who that person was and they started an offensive live-stream, it would be much easier to detect it.

So, there… there are options to restrict live-stream without getting rid of it.

Jack:

What chance does Jacinda Ardern have to create some sort of effective multilateral international response to the problems we see in social media?

Helen:

The first point is that Jacinda has really scored a triumph in being invited, as the leader of a small country, to go to the G7 Summit and talk about these issues. That's incredibly important. So New Zealand is in a leadership role, ah… what she is proposing with others with the Christchurch Call is around terrorism and violent extremism. Our issues in the discussion paper obviously go a bit wider than that, but I think she's got a real opportunity to set the pace here.

Kathy:

Yeah, I reiterate that. From having worked at MFAT – I mean, the Group of Seven, that's the seven biggest economies in the world. They don't usually even mention NZ, let alone give us a platform to define a major policy issue. So… I have heard a lot of cynicism about that meeting, saying "Oh, it's just a talkfest", but I sincerely do not share that. I think it's amazing that she's created this opportunity, and what NZ says and the way we go about doing this is going to have a real impact.
