Ask Citopians the one thing they would change about their beloved (though fictional) city, and you’ll get one answer: the endless delays and choking pollution caused by the city’s heavily congested roads. To address the issue, city officials unveiled an audacious technical solution: customized routes for each vehicle and destination, based on real-time, synchronised traffic data. The scheme promised to cut congestion by a third; all it required was live tracking of each driver’s location data.
The plans were met with wariness. A litany of data abuses meant citizens were sceptical of officials' ability to use the data responsibly. How could they be sure it wouldn't just end up being used to optimize road-side advertising, or enable even more sinister forms of surveillance and control?
Citopia's dilemma is typical of many we face in the age of data: how can we reap the shared societal benefits that massive aggregated data can unlock, without opening the door to the abuses that any large concentration of sensitive data makes possible?
One effective solution would be a data trust: a piece of infrastructure that separates those collecting and using data from those holding custody of the data. Instead of sharing location data directly with the government, drivers in Citopia would send it to a data trust, an independent, trusted non-profit entity. The data custodian governing the trust has a fiduciary responsibility to safeguard citizens' individual privacy as well as that of the general public. Some of the core responsibilities of the data custodian are:
To decide what data can be collected
Some data should never be collected. The data custodian decides what data can be collected, and when. Ideally, before cities or companies place new sensors in our built environment, they would first consult the data trust.
To decide what data can be shared
The data custodian decides what data can be shared, and under what circumstances. That decision is based primarily on the preferences and consent expressed by the individual data subjects, in this case Citopia's drivers. However, as Citopia creates more uses for data-driven optimizations, the number of privacy choices its citizenry needs to make can quickly become mind-boggling. To avoid decision fatigue, citizens can offload some of their decision-making to trusted third parties who help them navigate the individual and societal consequences of sharing their data.
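To make this concrete, here is a minimal sketch of how a custodian might check consent, including delegation to a trusted third party. All names here (ConsentStatement, DELEGATE_POLICIES, may_share, "privacy-coop") are hypothetical illustrations, not part of any real data-trust system.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical consent statement attached to a data subject's records.
@dataclass
class ConsentStatement:
    subject_id: str
    allowed_purposes: set            # purposes the subject consented to directly
    delegate: Optional[str] = None   # trusted third party deciding on their behalf

# Standing policies of delegates that subjects can opt in to,
# so they need not decide on every new data use themselves.
DELEGATE_POLICIES = {
    "privacy-coop": {"traffic-routing", "transit-planning"},
}

def may_share(consent: ConsentStatement, purpose: str) -> bool:
    """The custodian shares data only if the subject, or their delegate, allows the purpose."""
    if purpose in consent.allowed_purposes:
        return True
    if consent.delegate is not None:
        return purpose in DELEGATE_POLICIES.get(consent.delegate, set())
    return False

driver = ConsentStatement("driver-42", {"traffic-routing"}, delegate="privacy-coop")
may_share(driver, "traffic-routing")   # allowed: explicit consent
may_share(driver, "advertising")       # refused: neither subject nor delegate consented
```

The key design choice is that refusal is the default: a purpose is only honoured if the subject or their chosen delegate has affirmatively allowed it.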
To decide who can use the data
Companies, researchers, or policymakers wanting to make use of the data must request access from the custodian, making clear how they intend to use the data, and for how long. The custodian is obliged first and foremost to abide by the rules of consent for individual subjects mentioned above. An auditing system will also be needed, to verify that data users actually do what they said they would.
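The request-and-review flow above could be sketched as follows. This is an illustrative model under assumed names (AccessRequest, review, AUDIT_LOG); the essential point is that every decision, grant or refusal, leaves an audit record.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical request form a data user submits to the custodian.
@dataclass
class AccessRequest:
    requester: str      # who wants the data
    purpose: str        # how they intend to use it
    expires: date       # for how long access is needed

AUDIT_LOG = []  # every decision is recorded so auditors can verify the custodian's conduct

def review(request: AccessRequest, consented_purposes: set, today: date) -> bool:
    """Grant access only for purposes covered by subject consent, for a bounded period."""
    granted = request.purpose in consented_purposes and request.expires >= today
    AUDIT_LOG.append((today.isoformat(), request.requester, request.purpose, granted))
    return granted

routing = AccessRequest("Citopia Transport Dept", "traffic-routing", date(2030, 1, 1))
ads = AccessRequest("AdCo", "advertising", date(2030, 1, 1))
review(routing, {"traffic-routing"}, date(2025, 6, 1))   # granted
review(ads, {"traffic-routing"}, date(2025, 6, 1))       # refused, but still logged
```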
Trusting the trust
Now, how do we ensure data trusts do not devolve into just another unaccountable surveillance apparatus? How can we trust the trusts? Two simple measures can be built in to keep power vested in the users and guard against abuse:
Portability of the data
If at some point citizens no longer trust the data trust, they can transfer their data to another trust. The consent statements attached to the data travel with it.
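A minimal sketch of that portability guarantee, assuming trusts are modeled as simple keyed stores: the record and its consent statements move as one unit, and the source trust retains nothing.

```python
# Purely illustrative: each trust is a dict keyed by subject id, and a record
# bundles the data together with its attached consent statements.
trust_a = {"driver-42": {"locations": ["52.37,4.89"], "consent": {"traffic-routing"}}}
trust_b = {}

def transfer(subject_id: str, source: dict, target: dict) -> None:
    """Move a subject's record, consent statements included, to another trust."""
    target[subject_id] = source.pop(subject_id)  # pop: the source keeps no copy

transfer("driver-42", trust_a, trust_b)
# trust_a no longer holds the record; trust_b now holds data plus its consent
```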
Certification of the trust
The data trust itself is certified by an independent body and subject to regular auditing. Should it lose its certification, it is no longer allowed to share any of the data it holds.
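That certification rule can be enforced as a hard gate in front of every release. The class and method names below are hypothetical; the point is that sharing fails entirely, rather than degrading quietly, once certification is revoked.

```python
# Illustrative certification gate: an uncertified trust refuses all sharing.
class DataTrust:
    def __init__(self, certified: bool):
        self.certified = certified

    def share(self, subject_id: str, purpose: str) -> str:
        if not self.certified:
            raise PermissionError("trust lost certification; all sharing suspended")
        # Normal consent checks and data release would follow here.
        return f"releasing data of {subject_id} for {purpose}"
```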