Public anonymity is dead. While the phrase “public anonymity” may sound like an oxymoron, let me explain. You can’t walk along a street, visit a store, or attend an event without the possibility of someone knowing you’re there. A government entity, a store owner, or a tech giant can identify you, and track everywhere else you’ve been, based solely on your physical appearance.
In 2018, facial recognition technology spent a great deal of time in the news. Amazon licensed its Rekognition product to law enforcement despite the presence of gender and racial bias in some of the current technology. China used facial recognition to publicly shame jaywalkers. Clearly, society faces moral and philosophical questions, such as: who owns, and who should have access to, your physical identity and information in the real world?
Truthfully, this conversation breaks into two discussions. First, what rights do law enforcement and government entities have to track us? Second, for what purposes do we allow companies to access and use our visual identities?
Much of the focus to date has been on government use of facial recognition. The ACLU concluded, “Face surveillance threatens to chill First Amendment-protected activity like engaging in protest or practicing religion, and it can be used to subject immigrants to further abuse from the government.” San Francisco has already proposed a ban on the city’s use of the technology.
Aaron Peskin, the member of the Board of Supervisors who proposed the ban, explained his reasoning: “I have yet to be persuaded that there is any beneficial use of this technology that outweighs the potential for government actors to use it for coercive and oppressive ends,” he stated.
As this discussion heats up, there will undoubtedly be those who cry, “If you’ve got nothing to hide, you’ve got nothing to fear!” Despite this shallow rationalization, I fully expect masks and other facial coverings to become increasingly popular in public spaces, and potentially even stylish.
More significant, in my opinion, is how we allow companies to use facial recognition. Apple uses the technology to let you unlock your iPhone. Facebook uses it to let you tag your friends in photos. To date, these applications take place online and under our control, and to our knowledge, the owners of this technology have not deployed it in the public sphere. However, public deployment is inevitable. For instance, the sensors on Google’s Waymo vehicles will have the capability to act as a roaming camera network. These vehicles could identify pedestrians and even keep track of where they’ve been, just as Android does today.
Admittedly, there are many beneficial and convenient applications for consumer-facing biometric identification technology, as I’ve written about before. Biometric access control will eliminate physical keys or fobs for your home, office, and other institutions. Your physical identity may function as a non-transferable ticket to a concert or sporting event. The need to “ID” people with their driver’s licenses will disappear.
Since the wide-scale deployment of biometric identification is likely inevitable, it’s imperative that we think through all of the potential nefarious use cases and set some ground rules. Lauren A. Rhue, Assistant Professor of Information Systems and Analytics at the Wake Forest School of Business, commented on the potential misuse of facial recognition technology: “The risk in giving up any biometric data to a company is that there’s not enough transparency, not only about how the data is currently being used, but also the future uses for it.”
Companies looking to deploy biometric identification or facial recognition outside the home need established standards and operating procedures.
It is clear to me that today’s tech giants are likely incapable of meeting such standards. They’re too large, too diversely focused, and have historically made too many mistakes regarding data privacy and use. Instead, there’s a distinct need for companies built from the ground up to focus on transparently managing people’s biometric identities, divorced from any other business lines or monetization streams.
To summarize, we have a short window of time to establish standards, and thereby preserve our privacy, governing how companies may use our physical characteristics. Any for-profit company that wishes to deploy biometric identification technology outside the home or the internet should agree to the ethical, transparent, and responsible use of such technology, and be held accountable if it falls short of these standards.
This article was originally published in our dedicated privacy magazine, Privacy.Dev.