How Do We Stop Bad Use of Good Software?

Posted on Sep 19, 2016 in Accelerator, Musings and Insights

Surya Remanan and Nidhyia Raj present their prototype at the Hackathon Against Gender Based Violence in Ranchi on 26 January 2016.
Photo by Lisha Sterling


When you are building a piece of technology intended to help people in crisis, it’s important to think about all the ways that technology can be used that will go against the intended benefit, but there are limits to what you can plan for or mitigate. This has been a major theme of the discussions around a project that is currently in our Accelerator Program and it’s an issue that has just been highlighted in the news today.

Nirbhaya is a wearable tracking device and panic button app intended to help rescue victims of human trafficking. The idea is that the device exists as a piece of jewelry most of the time, but can be activated to call for help when a person is in need of rescue. The project was prototyped at the Hackathon Against Gender Based Violence in Ranchi, Jharkhand, India, which was sponsored by the US State Department and the US Consulate at Kolkata back in January of this year, and it was based on NGOs' descriptions of their interactions with human trafficking victims.

Sometimes a person is identified as a victim of human trafficking, we were told by one NGO representative, but they are not ready or able to be rescued at that time. Perhaps it is too dangerous at that moment, maybe they are afraid or unwilling to go just then, or perhaps there are other reasons that we cannot help in the context where the person is met. After that initial chance meeting, however, the person may be lost forever. Some tool that would allow the person to take some control of their own situation would be helpful, we were told. Something that would allow the victim to become the architect of their own rescue by letting the NGO know where they are at a time and place when they are ready and able to get away.

This is not your rich person’s panic button that calls police and warns a mugger to go away. This is something much more delicate, and much more difficult. The human trafficking victim is unlikely to have a smartphone. They are unlikely to have much control over their possessions. They can’t have something that is obviously meant to help them escape. There are so many security challenges with this project that we found ourselves stuck, mired in the mud of the impossibility of controlling for all the ways this tool could be abused or even used against the person it was meant to help.

The two young women who designed that initial prototype, Surya Remanan and Nidhyia Raj, have gone back to the drawing board more times than we can count over these last eight months. Different security experts and experts in human trafficking have weighed in, often giving exactly opposite advice about how to proceed. But along the way, I’ve started to wonder: how much of the problem is that we are trying for perfection in security instead of simply trying for good enough? Is it possible to draw a line that says, “This will help people and will not put too many people in further danger?”

Image of alert: “WANTED: Ahmad Khan Rahami, 28-yr-old male. See media for pic. Call 9–1–1 if seen.” Tweet text: “Shoutout to my fellow brown persons who originally planned on taking the subway to the airport today with luggage”

Screencap of tweet by @Kenyatta on 19 September 2016

This morning I was reminded of this particular debate when I saw a post on BoingBoing about an alert that went out to mobile phone users in New York City today. The Amber Alert system is intended to help find missing people quickly and to convey important emergency information in times of crisis, but the text that went out today read more like an open invitation to suspect all brown people in the city. Instead of making people safer, this message may have put thousands of people at higher risk of violence today.

I don’t think there is any question about whether the mobile phone Emergency Alert system has helped save lives. It has been used to find children and adults. It has been used to communicate tornado warnings and evacuation orders. It has, for the most part, done what it was designed to do. But users, even highly trained government employees with access to this system, don’t always use the tools in the best way, and every tool can be used negligently or with intent to harm.

We absolutely have to work towards the best possible outcomes, but where do we draw the line? How much control over use has to be embedded in the tool itself? And how much do we leave up to users themselves? This isn’t a question that can be answered just once. It’s one that we’ll have to come back to again and again, not just with Nirbhaya but with many of the tools we have already come to rely on in our hyper-connected world.