Can Smart Cities Help Their Residents Without Hurting Their Privacy?

  • Data is driving the world’s most powerful tech companies – but it will also shape cities of the future
  • In Canada, the Sidewalk Labs smart city experiment has drawn criticism
  • So how do we establish data governance that meaningfully involves citizens for the good of all?

We live in a world where information has become one of the most precious commodities. Data is a significant driver of our economies and the backbone of the most powerful technology companies.

Tech giants like Facebook, Amazon, and Google have consolidated their data-driven empires and expanded into new markets, where data fuels everything from dating apps and cloud gaming to automated shopping and online payments.

The cities of the future are also being built on data. These aren’t Jetsons-esque leaps to flying cars and robot assistants, but subtly significant changes to the hidden infrastructure and advanced systems that make cities work for millions of people every day.

Street sensors that reduce collisions and congestion, heated sidewalks that melt snow, buildings that predict when to heat, cool, and illuminate rooms, emergency and social services that know when and where people need help—these are the kind of urban improvements made possible when data about the behavior of people and objects is collected.

The debate around the ethical use of personal data in civic development is affecting local communities and global markets, and it’s heating up. It has been at the heart of the Sidewalk Labs smart city experiment in Toronto, the city where I live and work. The project, led by Sidewalk Labs, a subsidiary of Google’s parent company Alphabet Inc., was announced 18 months ago with a proposal to build a $1.3 billion data-driven model smart city on a 12-acre waterfront land parcel known as Quayside. It has since been plagued by resignations, firings, civic resistance, legal action, and significant public concern. Waterfront Toronto, the tri-government body tasked with overseeing the project, stiffened its negotiating stance and on Oct. 31 pushed Sidewalk Labs to offer a deal with better terms for Torontonians on data governance and profit sharing, among other issues.

Public sector and tech community leaders around the globe have been watching. But while representatives of the city of Toronto have held their own and reps from Sidewalk Labs have listened to public pushback, the first-order question remains:

How can cities around the world benefit from data-enabled technologies without sacrificing privacy?

Privacy paranoia, or surveillance capitalism?

My job is to think about the future of smart cities. In Toronto, I’ve worked with city, provincial, and federal governments, startups, corporates, academics, and civic activists to find workable solutions to this wicked question. Together, we’ve been able to map some of the uncharted territory between the polarizing extremes of privacy paranoia and surveillance capitalism.

Here are some of the significant challenges, along with the seeds of possible solutions, that Sidewalk Labs’ provocation has elicited from Toronto’s data experts.

Meaningful consent in the public realm is currently impossible

Unlike an app, streets and parks can’t require their users to click through a dialog box consenting to how their personal information will be used before granting access. In public spaces where personal information is collected—take video footage that records people’s faces in a crowd—there is no easy way for people to opt out of giving their consent. Former Ontario privacy commissioner Ann Cavoukian has an answer to this: Privacy by Design, an international standard she pioneered.

When data collection is minimized and “de-identified” (the process of anonymizing data) at source, the risk of privacy violations can be dramatically reduced. And when no personal information is collected, consent is not required.
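As a rough sketch of what minimization and de-identification at source could look like in practice, consider the example below. The field names, rounding thresholds, and salted hashing scheme are illustrative assumptions on my part, not a description of any real deployment; the point is simply that identifying details are discarded or coarsened before a reading ever leaves the sensor.

```python
import hashlib
import os

# Assumption for this sketch: the salt is rotated per deployment and never
# stored alongside the data, so tokens can't be traced back to devices.
SALT = os.urandom(16)

def deidentify_at_source(record: dict) -> dict:
    """Strip direct identifiers and coarsen quasi-identifiers from a raw
    sensor reading. All field names here are hypothetical."""
    return {
        # Replace the device's MAC address with a salted one-way token so
        # repeat visits can be counted without identifying anyone.
        "device_token": hashlib.sha256(
            SALT + record["mac_address"].encode()
        ).hexdigest(),
        # Round location to roughly 100 m so individuals can't be singled out.
        "lat": round(record["lat"], 3),
        "lon": round(record["lon"], 3),
        # Keep only the aggregate measurement the city actually needs.
        "pedestrian_count": record["pedestrian_count"],
        # Truncate the timestamp to the hour.
        "timestamp_hour": record["timestamp"][:13],
    }

raw = {
    "mac_address": "a4:5e:60:e2:11:9b",
    "lat": 43.650732,
    "lon": -79.359441,
    "pedestrian_count": 27,
    "timestamp": "2019-11-01T08:42:17",
}
print(deidentify_at_source(raw))
```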

The most valuable insights require access to personal information

De-identifying data at source eases many privacy concerns, but it also strips the data of its most valuable details. It’s a bit like building a powerful telescope to see far into space, only to put frosted glass over the lens. University of Toronto professors David Lie and Lisa Austin have proposed a solution in what they call safe sharing sites, a type of technical and legal interface for privacy-protective sharing of personal information.

The trick they propose is to encrypt personal information in a way that preserves the ability to run queries on the encrypted data. Analysts can ask questions that link together personal data, but they only ever see anonymized, aggregated results. All the questions and answers are recorded, creating an audit trail that allows regulators and courts to inspect how the data has been used and to penalize misuse.
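The exact cryptographic machinery behind safe sharing sites is still being worked out, but a toy sketch can convey the interface idea: analysts submit questions to a gateway, only suppressed aggregate answers come back, and every query is logged. Everything below (the QueryGateway class, the minimum group size of 10, the in-memory audit log) is a simplifying assumption of mine rather than the authors’ actual design; a real system would add encryption and access controls.

```python
import datetime
from typing import Callable, Optional

class QueryGateway:
    """Toy privacy-protective query interface: analysts never see row-level
    records, only aggregates, and every question asked is recorded for audit."""

    MIN_GROUP_SIZE = 10  # suppress answers that describe too few people

    def __init__(self, records: list):
        self._records = records  # held inside the trusted boundary
        self.audit_log = []      # inspectable by regulators and courts

    def count_where(self, analyst_id: str,
                    predicate: Callable[[dict], bool]) -> Optional[int]:
        """Return how many records match, or None if the group is too small."""
        matches = sum(1 for r in self._records if predicate(r))
        result = matches if matches >= self.MIN_GROUP_SIZE else None
        self.audit_log.append({
            "analyst": analyst_id,
            "query": getattr(predicate, "__name__", "anonymous"),
            "returned": result,
            "at": datetime.datetime.utcnow().isoformat(),
        })
        return result

# Example: hypothetical linked health and transit attributes are queried in
# aggregate; the analyst never sees any individual row.
records = [{"age": 20 + i % 60, "uses_transit": i % 3 == 0} for i in range(500)]
gateway = QueryGateway(records)

def seniors_on_transit(r):
    return r["age"] >= 65 and r["uses_transit"]

print(gateway.count_where("analyst-42", seniors_on_transit))  # aggregate only
print(gateway.audit_log[-1])  # the question itself becomes part of the record
```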

Privacy laws are inadequate

Canada’s current privacy laws were passed between 1983 and 1990, before Google, Facebook, big data, and the Internet of Things existed. Our laws and policies are founded on strong principles, but they are in need of a significant refresh.

Digital strategies are under review in all three levels of Canada’s government, but policy change is slow, and smart city technologies change much faster. Kristina Verner, vice president at Waterfront Toronto, has suggested a solution in the form of Intelligent Community Guidelines. Modelled after Waterfront Toronto’s highly successful Minimum Green Building Requirements, the guidelines would set a new standard for responsible data use that conforms with and exceeds all existing privacy legislation. The guideline terms would be developed based on feedback from public consultation, and enforced through contract law.

Legitimate institutions capable of city-scale data governance do not yet exist

In the wake of Cambridge Analytica, citizens are far less trusting of big tech to control their data. Privacy advocates are equally concerned about concentrating citizen data under government control, warning that it could come to resemble China’s controversial social credit scoring system.

Who decides what data is collected, who it is shared with, and for what uses? My team at MaRS, the largest innovation hub in North America, has proposed a new kind of organization: a civic digital trust. Such a body would be characterized by its political and financial independence, democratic legitimacy, fiduciary duty to city residents, and technical capacity to securely share data. Residents would form citizen assemblies to deliberate on allowable public-benefit uses of city data. The civic digital trust would also ensure that the benefits of smart city data were equitably distributed to all.

There may be no silver-bullet solution, but each of these seeds of solutions has shown early promise. To build on these smaller successes, we may at least have some form of silver buckshot: a portfolio of small and mutually reinforcing experiments in data governance, each addressing a different aspect of the challenge.

But we can’t afford to shoot first and ask questions later. Experimenting in the cities where we live, work and play demands the highest ethical standards, careful experimental design, safe-to-fail experimental sandboxes, and independent monitoring and evaluation. Nor can these experiments be carried out in isolation if the bigger data governance challenge is to be solved.

Solutions need to be developed and tested together, and in combination with other solutions from around the world.

Just as important as what is tested is who is at the table. If Sidewalk Labs has taught us anything, it’s that citizens need a central role in setting the agenda and a meaningful role in evaluating data governance solutions. Governments need deeper tech expertise and greater capacity in data governance and regulatory innovation to act as leaders in shaping our future cities. Developers need to operate with unprecedented transparency, openness, and responsiveness to gain and maintain the social license to operate.

The Quayside initiative will continue to spin off lessons for cities around the world as it winds its way through evaluations and consultations toward a possible approval at the end of March 2020, and almost certainly for years afterward.

One of the most important takeaways may be the development of a playbook for trusted data sharing based on a coordinated campaign of civic-public-private experimentation.

Source: World Economic Forum | Agenda | Fourth Industrial Revolution
