One of my priorities when starting as Netsafe’s CEO was to reconnect the organisation to our global online safety communities. It was therefore timely that I was invited by Internal Affairs Minister Jan Tinetti to join her on an eight-day study trip to Europe. The aim of the tour was to inform work underway here in response to some of the global harm challenges of the future.
It was a highly valuable visit that saw a small group of politicians, officials, NZ regulators and not-for-profit organisation representatives meet with counterparts in Dublin, Paris, London and Helsinki, in the context of the government's current review of content regulation.
I learnt from some countries’ successes and heeded advice from others whose attempts at online content regulation didn’t go so well. Alongside the other delegation members, I also came to appreciate the degree of fundamental change that lies ahead of us.
This global change will not only be to systems and processes but also to the way the tech sector operates around the world. The focus will be on how we protect children online, safeguard our democracy, balance competing human rights in content moderation, and work with platforms and civil society to customise tools for local settings.
As a not-for-profit with two government contracts, Netsafe, like others in the sector, juggles resourcing and delivery, and there is a constant risk that increasing complexity and rising expectations could overwhelm us. But a system designed without all relevant sectors and a whole-of-society lens wouldn’t be nearly as effective or in the public interest.
What most struck me from the tour was how well-resourced some agencies were to tackle online harms and how narrow some bodies were in the tools they used. Some had the luxury of deciding what harm they were working to solve and doing that successfully. It highlights the opportunity for us to work with the government and peers to improve our current system. It also opens up a world of possibilities for New Zealand online safety organisations when it comes to project and research work partnering with European counterparts.
This is why Protect Children in Finland (doing amazing things in the education space) and the Alan Turing Institute (using machine learning to disrupt patterns of harm) were such exciting places to visit – and neither disappointed!
What did I learn from the trip?
Heaps! Let’s start with the basics.
- Communicate any law changes simply – be clear on the rationale for why greater regulatory protection is needed, especially for the ‘lawful but awful’/‘lawful but harmful’ issues.
- There is a need to address the adverse human rights impacts of widely disseminated legal but harmful online content, such as disinformation, hate speech, incitement to violence, and self-harm imagery.
- Focus efforts where the consensus around tackling online harm is broadest. This consensus is most evident when it comes to children, vulnerable communities and seniors.
- Design the system of oversight with integrity – do the right thing, not the easy thing. That means ensuring there are sufficient monitoring, inspection and oversight functions.
- Ensure there are fast, efficient, free complaint pathways for the public and that they involve humans helping other humans – not just automated AI replies.
- Share the load by encouraging the private sector, tech industry and not-for-profit sector to identify the issues and solve them together.
- Make it borderless by adopting what the rest of the world is doing, while at the same time incentivising local nuances that deliver value to New Zealanders.
- Remember that Parliamentarians have a role to play, so consider a multi-disciplinary committee (around election times) to ensure the integrity of Parliamentary and democratic systems and processes, and deliver communications/social media training to Parliamentarians.
- Build in a review mechanism 24 months after making any changes to regulate tech, to see whether the anticipated benefits materialise.
I further learnt the importance of providing researchers with access to public data. This openness means citizens can be informed about a system and whether it is working as predicted or letting them down, and it delivers much better outcomes for everyone. I saw first-hand the difference that enabling programmatic access to platforms can make for real-time analysis – and quality analysis informs improvements in systems.
- Prevention and education are crucial to building a society’s resilience to online threats and harms. These two elements are necessary to prepare us for what comes next – be it a pandemic, ransomware attack, fraud or scam.
- Industry codes have real benefits: tech companies use them to demonstrate the value of their processes and policies to the public good.
- Making it clearer to people what is true and untrue, good and bad, permitted and not permitted, lawful and unlawful isn’t the easy thing to do, but it is certainly the right thing to do. Online spaces are where people’s expectations around online and offline behaviour meet – and brush up against – government and tech policies and processes.
Many of the people I met in Ireland, the UK, France and Finland were interested in our alternative dispute resolution approaches to combating harm, as distinct from adversarial, court-based approaches. For some, this was a novel approach, and it is an area where Aotearoa New Zealand can lead the way on civil remedies and restorative justice practices.
It was time well spent as Netsafe’s CEO, and I look forward to developing closer working relationships with the people I met as we navigate our evolving global environment together.
If you’d like to keep up with what’s going on at Netsafe, subscribe to our mailing list.