Internet filtering bill gives the Govt a ‘blank cheque’ to censor the internet, opponents say

Internet filtering legislation currently before Parliament is impractical, and gives the government “blank cheque” powers to censor the internet, opponents say.

The internet filtering bill – officially the Films, Videos, and Publications Classification (Urgent Interim Classification of Publications and Prevention of Online Harm) Amendment Bill – is being shepherded through Parliament by Internal Affairs Minister Jan Tinetti.

Inspired in part by Christchurch mosque massacre gunman Brenton Tarrant live-broadcasting his killing spree over Facebook, it makes the livestreaming of objectionable content a criminal offence.

It would create the legal parameters for a government-operated web filter that would block objectionable content. The Chief Censor would be able to move more quickly to rate content objectionable, and social media companies would be explicitly brought under laws about objectionable content and subject to new take-down provisions – with a $200,000 fine.

When the bill was introduced in February, it was opposed by ACT and National, who cited free speech concerns – and also, more surprisingly, by Te Paati Māori and the Greens, who called it an “overly invasive regime”.

The bill passed its first reading regardless, given Labour’s absolute majority, and is now before the Governance and Administration Committee, which heard oral submissions today – including opposition from two watchdogs.

Appearing before the committee by Zoom, NZ Council for Civil Liberties chairman Thomas Beagle said the legislation was too vague.

“It allows an inspector to arbitrarily decide to block or takedown material on ‘reasonable grounds that it’s objectionable’ which is hardly a high standard, especially when ‘objectionable’ is so subjective,” he told the committee.

The bill created an “expansive legal framework” that left too many details of how it would be enforced to unelected regulators, Beagle said.

“It’s a law change that seems to ignore what I see as the current problems on the internet – issues such as harassment, abuse, death threats, disinformation, algorithmic radicalisation and extremism – in favour of trying to set up a barrier to block certain sorts of material.”

Earlier, in a written submission signed by 237 supporters, InternetNZ – the non-profit that administers the .nz domain – said: “The approach proposed in this Bill would give the executive government a blank cheque to block material it considers objectionable based on subjective line-calls with no referee.”

InternetNZ said some provisions, such as the new take-down provisions, could be tweaked to be effective.

“We shared the horror and anger of people across New Zealand when we saw the Internet weaponised to harm more people and attack social cohesion in the Christchurch mosques terrorist attacks,” the group’s submission says.

“[But] the bill takes the extreme step of legislating to impose a filtered Internet for New Zealand with no set legal safeguards and no requirement for independent oversight, making the inherent problems of filtering even worse.”

Green Party digital economy spokeswoman Chlöe Swarbrick earlier said the bill was “90 per cent there”. But the 10 per cent her party objected to – the filter – made it a no-go.

“We’re in conversation with the Minister about our concerns with the filter. That remains the sticking point,” she told the Herald today.

Tinetti responds

“When I first came in as minister and took over this bill, I had the option of discharging the bill or seeing it through,” Tinetti told the Herald this afternoon.

“Looking at the urgency around livestreaming objectionable content – to address specific legislative and regulatory gaps in our current online content regulation that were highlighted in the wake of the Christchurch Mosque terrorist attacks on March 15 – I was quite convinced that it needed to progress through to select committee,” she said.

“I absolutely anticipated that establishing the legal parameters for a potential web filter to block objectionable content would be a big part of the discussions at select committee.

“I know that there’s a little bit of anxiety around this particular part of the bill, and I want to see what people have to say. I am very open to looking at further shaping the work around the filtering system and hearing what’s been said.”

Who's in charge?

Vodafone NZ did not want to make its submission available (the committee has yet to publicly post all submissions), but the Herald understands the telco shares concerns that the filter provision is too broad-brush, and adds the objection that it is not technically feasible.

In June last year, a bill introduced by then-Internal Affairs Minister Tracey Martin that sought to block pornography was dropped on the grounds it was impractical – mirroring a move to drop a similar proposed measure in the UK.

In the wake of the Christchurch shootings, Vodafone, Spark and 2degrees voluntarily agreed to block access to a number of sites – notably 4Chan and 8Chan – that were sharing Tarrant’s clip.

But at the same time, the trio stressed they did not want to be thrust into the role of sheriff. They wanted clarity over which government agency had the deciding say on such moves, and under what terms.

The filtering bill goes some way to meeting that request, by giving the Chief Censor more powers, and the ability to act with more speed – and teeth via $200,000 fines.

“We strongly support statutory amendment that gives service providers clarity about what is required of them, avoids the need for them to decide for themselves what is harmful online content, and protects service providers when they act against such content in future,” Vodafone NZ external affairs head Richard Llewellyn told the Herald this afternoon.

Filter already in place

Although it’s not shouted from the rooftops, New Zealand already has an internet filter, which has been operated by Internal Affairs since 2010.

The DIA’s Digital Child Exploitation Filtering System (DCEFS), which blocks websites that host child sexual abuse images, is made available to New Zealand internet service providers (ISPs) on a voluntary basis. It has been deployed by Spark, Vodafone NZ, 2degrees and others.
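In concept, an ISP-level filter of this kind checks each requested hostname against a blocklist. The toy sketch below is purely illustrative – it is not the DCEFS, whose actual mechanics and blocklist are not public at this level of detail, and the domain names shown are hypothetical placeholders:

```python
# Purely illustrative sketch of a hostname blocklist check.
# NOT the real DCEFS; the blocklist entries below are hypothetical.
BLOCKLIST = {"blocked.example", "harmful.example"}

def is_blocked(hostname: str) -> bool:
    """Return True if the hostname, or any parent domain of it,
    appears on the blocklist."""
    parts = hostname.lower().rstrip(".").split(".")
    # Check the hostname itself and each parent domain, e.g.
    # a.b.example -> a.b.example, b.example, example
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

print(is_blocked("sub.blocked.example"))  # True (parent domain is listed)
print(is_blocked("news.example"))         # False
```

Real deployments work at the network layer (for example by redirecting traffic for listed addresses to a filtering server) rather than as a simple lookup, but the principle – matching requests against a centrally maintained list – is the same.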

The DCEFS is overseen by an “independent reference group” that features a mix of private sector appointees and government officials – including Chief Censor David Shanks, and reps from the DIA itself and the Office of the Children’s Commissioner.

Additionally, Crown agency N4L (Network for Learning), which connects schools to the internet and, in many cases, manages their networks, offers a free content filter for schools and, now, for parents managing children’s devices at home.

If not an expanded filter, then what?

Opponents of an expanded filter, like the Council for Civil Liberties’ Beagle, fear mission-creep, especially with the ill-defined filtering bill.

“It’s not enough to look at the current government and say that they wouldn’t misuse these powers,” Beagle said.

“Our own history contains examples, such as the 1951 Waterfront Dispute, when our government did abuse its power to censor in order to suppress dissent.

“We have no guarantee that future New Zealand governments will not expand upon and misuse these powers.”

But if not an expanded filter, what mechanism should we use for dealing with harmful internet content?

In Beagle’s view, a pragmatic approach is called for.

He sees two groups of internet users – “regular” folk, who spend most of their time on mainstream sites, and a minority of “motivated” people who, for whatever reason, seek out dark content.

The Council for Civil Liberties chairman thinks there’s no practical way to police the “motivated” set – at least without undermining individual freedoms. The government should focus its energy on the “regular” users.

“Most ‘regular’ people access internet content through the big providers such as Facebook, Twitter, YouTube, etc, and responsible publishers such as NZ Herald and Stuff,” he says.

“These big companies and publishers generally have content standards that would prevent patently offensive material from being shared on their sites. The recent work done with the Christchurch Call and the revamp of the GIFCT [Global Internet Forum to Counter Terrorism] is targeted at improving their ability to work together and quickly eliminate material such as the Christchurch massacre video.

“We believe that government would be better off continuing with that work, and working with other like-minded governments to help set content and process standards for the big internet companies to implement, similar to the way that the Privacy Act sets requirements but individual companies work out how to honour them.”

The internet filtering bill

The bill amends the Films, Videos, and Publications Classification Act 1993 so that:

• The Chief Censor will be able to more quickly notify the public of illegal content that could cause high levels of harm;

• Livestreaming of objectionable content will be a criminal offence;

• Government will be able to issue take-down notices, requiring the removal of objectionable content online;

• Social media companies will come within the scope of current laws on objectionable content; and

• Legal parameters are established for a potential web filter to block objectionable content in the future, subject to further policy development and consultation.

Internet safety contacts

Four Crown agencies can assist if you’re having problems with harmful content.

• Netsafe

• CERT NZ (The Computer Emergency Response Team)

• The Office of the Privacy Commissioner

• N4L (Network for Learning)
