Public anonymity is dead. That phrase, “public anonymity,” may sound like an oxymoron, so let me explain. You can’t walk along a street, visit a store, or attend an event without the possibility of someone knowing you’re there. A government entity, a store owner, or a tech giant can identify you, and can track everywhere else you’ve been, by your physical appearance alone.
In 2018, facial recognition technology spent much time in the news. Amazon licensed its Rekognition product to law enforcement despite documented gender and racial bias in some of the current technology. China used facial recognition to publicly shame jaywalkers. It’s clear that society faces moral and philosophical questions, chief among them: who owns, and who should have access to, your physical identity and information in the real world?
Truthfully, this conversation breaks into two discussions. First, what rights do law enforcement and government entities have to track us? Second, for what purposes do we allow companies to access and use our visual identities?
Much of the focus to date has been on government use of facial recognition. The ACLU concluded, “Face surveillance threatens to chill First Amendment-protected activity like engaging in protest or practicing religion, and it can be used to subject immigrants to further abuse from the government.” San Francisco has already proposed a ban on the city’s use of the technology.
Aaron Peskin, the member of the Board of Supervisors who proposed the ban, explained his reasoning: “I have yet to be persuaded that there is any beneficial use of this technology that outweighs the potential for government actors to use it for coercive and oppressive ends.”
As this discussion heats up, there will undoubtedly be those who cry, “If you’ve got nothing to hide, you’ve got nothing to fear!” Despite this shallow rationalization, I fully expect masks and other facial coverings to become increasingly popular in public spaces — potentially even stylish.
More significant, in my opinion, is how we allow companies to use facial recognition. Apple uses the technology to let you unlock your iPhone. Facebook uses it to let you tag your friends in photos. To date, these applications take place online and under our control; to our knowledge, the owners of this technology have not deployed it in the public sphere. However, public deployment is inevitable. For instance, sensors on Google’s Waymo vehicles will have the capability to act as a roaming camera network. These vehicles could identify pedestrians and even keep track of where they’ve been, just as Android does today.
Admittedly, there are many beneficial and convenient applications for consumer-facing biometric identification technology, as I’ve written about before. Biometric access control will eliminate physical keys and fobs for your home, office, and other institutions. Your physical identity may function as a non-transferable ticket to a concert or sporting event. The need to “ID” people with their driver’s licenses will disappear.
Since the wide-scale deployment of biometric identification is likely inevitable, it’s imperative that we think through all of the potential nefarious use cases and set some ground rules. Lauren A. Rhue, Assistant Professor of Information Systems and Analytics at the Wake Forest School of Business, commented on the potential misuse of facial recognition technology: “The risk in giving up any biometric data to a company is that there’s not enough transparency, not only about how the data is currently being used, but also the future uses for it.”
Companies looking to deploy biometric identification or facial recognition outside of homes need established standards and operating procedures.
It is clear to me that today’s tech giants are likely incapable of meeting such standards. They are too large, too diversely focused, and have historically made too many mistakes regarding data privacy and use. Instead, there is a distinct need for companies built from the ground up to focus on transparently managing people’s biometric identities — a business divorced from any other business lines or monetization streams.
To summarize, we have a short window of time to establish standards governing how companies may use our physical characteristics, in order to preserve our privacy. Any for-profit company that wishes to deploy biometric identification technology outside of the home or the internet should commit to the ethical, transparent, and responsible use of such technology — and be held accountable when it falls short of those standards.
This article was published in our dedicated privacy magazine, Privacy.Dev.