We know big tech has a responsibility, but where do we go next?
Written by Mille Vardheim
As part of a research project with Equality Now, I have been looking into how the tech environment (companies and platforms) plays a role in human trafficking and sexual exploitation. Slavery and abuse have existed throughout human history, so what is unique about what we’re seeing now? Technology, and in particular large platforms like Facebook, Snapchat, Instagram, and WhatsApp, has provided a medium through which the abuse and sale of human beings have reached a global scale unlike ever before. It’s impossible to shy away from the fact that large tech platforms have consistently been found to be the main point at which spotters, traffickers, or abusers approach their targets, and that these platforms give them unprecedented access to potential victims.
These large platforms have for a long time operated under self-regulation. Companies like Facebook, for example, have traditionally pointed to Section 230 of the US Communications Decency Act, claiming they are not responsible for illegal actions by users on their platforms; the same defence was used by Backpage.com in a court case brought against it for selling children online. But as Annie McAdams, a personal injury lawyer, puts it:
“If you sell a lawnmower and the blade flies off and chops someone in the leg, you have the responsibility to fix it and warn people … Nowhere else has an industry been afforded this luxury of protection from being held accountable for anything that they’ve caused.”
Although there is a denial of responsibility taking place, a mixture of tools is still available to try to make the internet a safer place. Freedom Signal, developed by Seattle Against Slavery, aims to combat sexual exploitation and trafficking through deterrence, data analytics, and victim outreach. AI and algorithms are also becoming increasingly popular; partly because of the sheer scale of sexual exploitation online, automation has been deemed key. Facebook and Instagram use tools such as Microsoft’s PhotoDNA alongside manual moderators, as well as recently released open-source tools such as Facebook’s PDQ and TMK+PDQF. Although AI and algorithms may be regarded as the most effective automated tools, they still have major flaws. In one test, the American Civil Liberties Union (ACLU) scanned the official photos of all 535 members of the US Congress and found that 28 of them were incorrectly matched to criminal mugshots.
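To give a sense of how hash-matching tools like PhotoDNA and PDQ work conceptually: they compute a perceptual hash of an image and compare it against a database of hashes of known abusive material, treating small bit-level differences as a match. The sketch below is an illustration of that matching idea only, using toy 16-bit hashes and hypothetical names (real PDQ hashes are 256 bits, and the actual algorithms are far more sophisticated):

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

def matches_known_content(candidate: int, known_hashes: set, threshold: int = 3) -> bool:
    """Flag a candidate hash if it is within `threshold` bits of any known hash.

    A small threshold lets near-duplicates (resized or re-encoded copies)
    still match, while unrelated images stay far apart in Hamming distance.
    """
    return any(hamming_distance(candidate, h) <= threshold for h in known_hashes)

known = {0b1011001110001111, 0b0000111100001111}
print(matches_known_content(0b1011001110001011, known))  # near-duplicate -> True
print(matches_known_content(0b0101010101010101, known))  # unrelated -> False
```

The flaws mentioned above follow directly from this design: the threshold trades false negatives against false positives, which is one reason automated matching still needs human moderators.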
Some organisations try to work collaboratively with governments and the private sector. One of those is Thorn, an NGO established to fight sex trafficking, which develops a range of products in partnership with survivors, academics, charities, other NGOs, private organisations, and law enforcement. Two of their tools are Spotlight and Safer. Spotlight is designed to protect minors by piecing together information and data points, helping law enforcement find trafficked minors faster and more efficiently, while Safer helps platforms detect and report child sexual abuse material.
The Clewer Initiative’s Safe Car Wash app is an example of a solution that aims to inform the public and encourage people to report suspected human trafficking. It attempts to build community intelligence gathering by getting people to complete a quick survey; if enough boxes are ticked, it is likely that human trafficking is going on, and the user is then encouraged to contact a helpline to report the incident. Although the app has been downloaded 8,000 times and produced 2,000 reports, only 18% (126 people) of those who were encouraged to call the helpline actually called it. It’s important to note that users may have called the helpline on a separate phone, or outside of the app. This shows the difficulty of measuring the impact of technological solutions.
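The survey logic described above is essentially a threshold score over a checklist of indicators. The following sketch illustrates that idea; the questions and threshold are invented for illustration and are not taken from the actual app:

```python
# Hypothetical indicators a passer-by might observe (not the app's real questions).
INDICATORS = [
    "Workers appear fearful or withdrawn",
    "Workers lack protective clothing or equipment",
    "Workers appear to live on site",
    "Payment is cash-only and taken by a single controller",
]

def should_report(ticked: list, threshold: int = 2) -> bool:
    """Encourage a helpline call once enough indicators are ticked.

    `ticked` is a list of booleans, one per indicator in INDICATORS.
    """
    return sum(ticked) >= threshold

print(should_report([True, True, False, False]))   # 2 indicators -> True
print(should_report([True, False, False, False]))  # 1 indicator -> False
```

The gap between flagged surveys and actual helpline calls is exactly the measurement problem the paragraph above describes: the app can count ticked boxes, but not what users do after closing it.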
Technological solutions have helped law enforcement find trafficked minors faster, spotted abusive content online, and hopefully saved lives. So where do we go next?
The solution to these complex issues will involve technology, but not technology on its own. As AI and deep machine learning develop, companies are looking to harness this power to stop and prevent human trafficking on a larger scale, but so too are the traffickers themselves. Truly understanding the impact of these tools, and how they sit within the greater context, is also a challenge. We will also need to see more collaboration and more transparency in supply chains, between organisations, and across countries. This will need to sit alongside awareness campaigns, law enforcement cooperation, and initiatives from governments, private companies, and NGOs. Above all, we need to ensure that laws and legislation can keep up with the pace of technological change.
At the end of the day, we should also strive to do these things collaboratively by working with survivors themselves. Survivors are not helpless victims; they are in fact experts with unique insights and innovative ideas.
And finally, you might ask: how can I help? This blogpost has been published as part of Safer Internet Day, and while we angrily talk about this on the podcast, Equality Now is an organisation that is actively trying to disrupt this space and make the internet a safer place for everyone. If you’d like to know more or get involved, head over to Equality Now.
Solutions must be global, multi-dimensional and supported by actors including governments, tech companies, civil society, and UN agencies. They must be informed by the experiences and perspectives of survivors.
Equality Now is exploring the role of technology in sexual exploitation in order to advocate for the best approach and most effective solutions for adult women and adolescent girls.
Note: This blogpost is part of a larger literature review and focuses on a subset of the findings from the report, which will be released soon. EXCITING!