In starting Numina, we wanted to solve a problem that urban planners often take as a given truth: Our cities have little sense of how and how many people actually use our streets. Without this knowledge, planners end up deciding how to allocate public space using a highly quantified automobile dataset, but only a fuzzy idea of bicycle, pedestrian, and other vehicle behavior. It’s no surprise then that cities can more easily justify infrastructure for some, and not other, modes.
The first challenge, technically, is that cars drive in well-defined lanes, while everything else does not. Tripwire-style counting will not capture most non-car traffic; instinctively, people will even step over or ride around a tube on a path, evading that sensor altogether. The best method of measuring non-car traffic became optical detection, i.e. seeing it. Our logical conclusion: Use computer vision algorithms to perform real-time image analysis and, essentially, harness the camera as a sensor rather than as a visual recording device.
It was 2014. Every day, more and more concerning headlines emerged about domestic spy programs like PRISM. These programs were possible because citizens were both unwittingly and willfully giving technology companies vast information about themselves. As we thought about building a data company, we realized that even storing data creates risk. We never wanted to be a tool that could be misused; we didn’t want to have people’s personal data at all. We saw how important it was to incorporate privacy considerations from the beginning. It wasn’t possible any longer to ignore the responsibility or “play dumb” about the consequences of misused data.
We committed to intelligence without surveillance. There are so many operators and use cases that benefit from the robust data that can be visually detected — but that don’t also need granular personally identifiable information (PII). In fact, AI applied to cameras can actually be more private and secure than a human watching the same place. We could automate new types of data creation, with purposeful end goals in urban planning, traffic safety, and other important fields — without storing video that would create an opportunity for further, unintended data extraction later.
We had, and still maintain, a few theses and tenets to guide Numina’s development:
- We can’t be naive about the fact that there are, and will be, cameras practically everywhere in cities; but as Numina, we don’t have to contribute additional surveillance.
- Because we measure the public realm, we have a greater responsibility to respect personal privacy and to ensure security, by design.
Respect for Privacy
True privacy protection requires more than not identifying people; it must also prevent re-identification. We do this by, for example, avoiding excessive data collection and aggregating data so that individual behaviors are not visible.
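As an illustration of the aggregation idea — not Numina’s actual pipeline — one common pattern is to bucket detection counts into coarse time windows and suppress any bucket whose count falls below a threshold, so a single person’s trip never surfaces as an individual data point. The timestamps, function name, and threshold below are all hypothetical:

```python
from collections import Counter
from datetime import datetime

# Hypothetical detection timestamps (counts only, no identities).
detections = [
    "2019-01-24T08:03", "2019-01-24T08:17", "2019-01-24T08:41",
    "2019-01-24T09:05", "2019-01-24T09:12", "2019-01-24T09:48",
    "2019-01-24T23:55",  # a lone late-night detection
]

def aggregate_hourly(timestamps, k=3):
    """Count detections per hour, suppressing buckets below k.

    Dropping sparse buckets keeps any individual behavior
    from being visible in the published aggregate.
    """
    buckets = Counter(
        datetime.fromisoformat(t).strftime("%Y-%m-%d %H:00")
        for t in timestamps
    )
    return {hour: n for hour, n in buckets.items() if n >= k}

print(aggregate_hourly(detections))
# The 23:00 bucket (a single detection) is suppressed entirely.
```

The threshold `k` plays the role of a k-anonymity-style safeguard: the coarser the buckets and the higher the threshold, the harder re-identification becomes.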
To ensure accuracy, our technology still requires training and validation processes. We do need to collect occasional images, but we do so under the minimum viable parameters, and we expect to revise and recalibrate those parameters as the actual scale and needs of the data become clear. We collect sample images:
- at low resolutions, at random times, and in minimal quantities, so that information cannot be monitored, extracted, or targeted for a surveillance purpose, whether planned or after the fact.
- with PII automatically redacted from images before transmission and before review by any human or software.
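The redaction step can be sketched as follows. This is a minimal illustration, not Numina’s actual implementation: it assumes an upstream detector supplies bounding boxes for people in the frame, and pixelates those regions so no PII ever leaves the device.

```python
import numpy as np

def redact_regions(frame: np.ndarray, boxes, block=8):
    """Pixelate detected-person regions in an image.

    frame:  H x W x 3 uint8 image.
    boxes:  list of (x1, y1, x2, y2) boxes from an upstream
            person detector (assumed here for illustration).
    block:  pixelation block size; larger = coarser.
    """
    out = frame.copy()
    for x1, y1, x2, y2 in boxes:
        region = out[y1:y2, x1:x2]
        h, w = region.shape[:2]
        for by in range(0, h, block):
            for bx in range(0, w, block):
                patch = region[by:by + block, bx:bx + block]
                # Replace each block with its mean color, destroying detail.
                patch[:] = patch.mean(axis=(0, 1), keepdims=True)
    return out
```

Run on-device, before any image is stored or transmitted, this kind of redaction means the raw pixels containing PII never exist outside the sensor.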
We maintain gold-standard secure-by-design practices like mutual authentication and strong encryption, and we follow the Principle of Least Privilege to manage access between systems, data, and users.
We started with a glossary of terms. Clear headers. Visual examples. You can read and understand it in less than 10 minutes.
We also seek to share additional details, context, and changes in public posts, talks, and other forums, so this information isn’t hidden in a tiny footer.
Please read our current policy here*, and share your feedback with us at email@example.com!
*An updated policy went into effect on March 15, 2022. The same philosophy applies, with explanation of what’s new at https://numina.co/an-update-to-numinas-privacy-policy-introducing-calibration-mode/.
The policy below was current from January 2019 until March 2022. See our current (updated) policy here.

Numina-Privacy-Policy-2019-01-24