by Morey Haber, CTO at BeyondTrust
When classifying data by sensitivity, few, if any, types rank higher than biometric data. Yet biometric data is increasingly collected with abandon, and the practice remains under-scrutinised, pushing us further toward a data dystopia.

What Data is “Sensitive”?

So, what do people today consider “sensitive” data? It may vary from person to person, company to company, and even regulation to regulation.

Personally Identifiable Information (PII) refers to a broad basket of data widely considered sensitive as it can be used to identify a person. PII includes your name, email addresses, usernames, passwords, birthdate, address, social security number, credit card information, medical history, biometric data, etc.

However, while the more traditional types of PII have long been under scrutiny, biometric data, especially with regard to the new ways it is being used and accessed today, is not receiving adequate attention—especially when considering just how sensitive and immutable it is. Biometric data is part of your identity and can never be changed. Specifically, the biometric data I’m focusing on here is fingerprint, voice, retinas or irises, identifying DNA sequences, and facial characteristics.

Today, our biometric data is increasingly being collected, stored, and transmitted by IoT devices and services in the cloud, often with very little thought around the implications. You may participate in this via an ancestry DNA kit, or perhaps by using a modern notebook, mobile device, or smartwatch that stores health or login data via fingerprints or facial recognition.

In this blog, we’ll take a look at some of the unique risks biometric data collection/storage poses, some of the emerging products and services that request your biometric data, and steps you can take to better protect your data and, by extension, your identity.

Biometric Data: Ripe for Abuse and Misuse

Exposure of biometric data poses unique data privacy risks and ramifications on multiple levels. For instance, once your biometric data has been leaked or compromised, it puts you at continual risk for identity-based attacks.
Typically, as a human being, you have a single identity. We could argue that, even if you are a spy or have a criminal alias, you still only have one identity, since, regardless of your various aliases, avatars, or impersonations, you only have one set of biometric data. You cannot alter your fingerprints, voice, face, eyes, EKG, or veins.

When information technology leverages biometric data for authorisation or authentication, it must compare the results against a banked, electronic profile of your biometric data. Stringent security protections, including encryption, can help keep biometric data at rest protected. However, to make use of the biometric data, it must be reassembled (at least in parts) to compare against the assessed input. Storage design flaws, vulnerabilities, and host system misconfigurations are just a few of the many ways biometric data can be left ripe for exposure. Such a fate befell the United States Office of Personnel Management (OPM) in 2015, when fingerprints belonging to 5.6 million individuals were stolen by cyber attackers.
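This comparison step is worth understanding: unlike a password, a live biometric sample never matches its enrolled template exactly, so systems score similarity against a decision threshold rather than checking an exact hash. Here is a minimal sketch of that idea; the feature vectors and threshold are purely illustrative, not any vendor's real matching pipeline:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two biometric feature vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Illustrative threshold; real systems tune this against false accept/reject rates.
MATCH_THRESHOLD = 0.95

def verify(enrolled_template, live_sample):
    # The enrolled template must be decrypted/reassembled in usable form to
    # score it -- exactly the exposure window described above.
    return cosine_similarity(enrolled_template, live_sample) >= MATCH_THRESHOLD

# A live scan close to (but not exactly matching) the enrolled template passes:
enrolled = [0.91, 0.33, 0.75, 0.48]
live = [0.90, 0.35, 0.74, 0.47]
print(verify(enrolled, live))  # True
```

Because the template must exist in matchable form at verification time, encryption at rest alone cannot eliminate the exposure risk.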

Also consider that even leading-edge biometric systems can be vulnerable to spoofing. For instance, security researchers recently hacked an arm vein biometric authentication device using photos and a replicated arm.
However, the most significant issue with biometric data is not the storage or authentication technology used, but rather the static nature of biometric data itself. When a password is compromised, you can defuse password re-use attacks simply by changing the password. However, you cannot change your biometric data, so once it is compromised, it persists as an identity-based threat. Your eyes, face, or fingerprints are forever linked to your identity (excluding bio-hacking—a topic for another day). Any system that relies solely on compromised biometric data becomes an easy target for threat actors.

Thus, biometrics alone should never be relied on as the sole method of authentication or authorisation. For such use cases, biometric data should always be paired either with a password or, better yet, a two-factor or multi-factor authentication solution.

Emerging Products and Technologies Requesting Access to Your Biometric Data

While some vendors prioritise security for biometric data (Apple's Secure Enclave), many others treat the protection of this data as an afterthought (often addressing it only after receiving unwanted publicity over a breach or data misuse). One notorious instance in recent years of a consumer product playing fast and loose with data privacy involved VTech's popular My Friend Cayla talking doll. My Friend Cayla uses voice recognition software and is embedded with a Bluetooth-enabled device to collect and receive data. The toy ran afoul of privacy standards in many countries, encapsulated by one writer who bluntly put it: “Cayla, the connected doll, is a spy and must be destroyed.”
Here’s a small sampling of some new and emerging technologies that may now possess your biometric data.

  • Mobile Devices and IoT: Consider modern cell phones, tablets, and even door cameras (designed to protect against package thieves, vandalism, or intruders)—each captures some form of biometric data and stores it on the device or in the cloud, even if it is not used for authentication or authorisation. Some door cameras capture photos or video based on movement, and may record you simply because you walked or drove past. Your likeness, unknown to you, now potentially resides on another end user's device, or in the cloud. And your mobile phone or tablet now has fingerprints and facial metrics stored within it, too. There are plenty of tools and documents on how to bypass these security models if you have the device in hand. You cannot trust these security models based on biometrics alone, and AI may make matters worse by performing the PII linkage for threat actors.
  • Virtual Assistants (Siri, Alexa, Cortana, etc.): Devices from Google, Amazon, Microsoft, Apple, and others process voice recognition commands and can be programmed to understand individual voices. Your unique vocal patterns are stored and processed in the cloud. While threat vectors for human voice patterns are still largely theoretical, be aware that this data is being stored. And the risks do exist, such as one reported incident in which a person's private interaction with an assistant was transmitted to other individuals.
  • DNA Kits: If you purchased or used AncestryDNA or 23andMe, your DNA is now on file. And, if you grant permission, your data can be used by law enforcement to help solve criminal cases. Your most private and sensitive data, your DNA, is now in the hands of a third party. You should be aware of everything they can do with it, and of the ramifications if those services are ever breached. For instance, consider the breach at the consumer genealogy website MyHeritage, which exposed data from 92 million accounts.
  • City Cameras: In many major cities, such as New York City, thousands of surveillance-based street cameras capture terabytes of video every day to help keep the city safe. How the data is stored and processed falls under the jurisdiction of law enforcement, and there has been a mainstream push to ensure that processing for facial recognition and other profiling is restricted by law. However, every time you see a surveillance camera in public, it raises the question: who is watching it, and what are they doing with my identity?

Exert Control Over How Your Data is Accessed and Used

When you purchase a device, use a new technology, or consider using a new service, ask yourself, and potentially the vendor (especially if the technology is used for work), the following:

  • How are you storing biometric data?
  • Where is it being stored? (Especially in which countries, since this may have additional legal and compliance ramifications.)
  • How is my biometric data secured?
  • Who has access to the data?
  • Is my biometric data being expunged over time?
  • Do you sell/give away my biometric data?
  • Does law enforcement have access to my biometric data or logs? Even with a warrant?

We need to candidly and rigorously discuss what we will allow to be stored about our identity and what is just too risky. And, most importantly, scrutinise who has the right to store, transmit, or access this data.

Each of us has a single, unique identity attached to us throughout our lives. It’s in our best interest to take control over how our biometric data and other sensitive data is captured, stored, processed, transmitted, and accessed.
Some of the troubling data privacy issues I've discussed are starting to be addressed by regulations. For instance, a new piece of legislation in the United States Congress, the Commercial Facial Recognition Privacy Act of 2019, has been introduced to inform millions of people about how their biometric data is processed and stored. If passed, this bill would obligate companies to obtain consent before tracking individuals using facial recognition technology and processing the data for any type of surveillance, profiling, or other analysis.

Additionally, many IT security solutions, like BeyondTrust's Endpoint Privilege Management solutions, have specific features embedded to obfuscate a user's identity during data collection—without losing fidelity in reporting, alerting, and analytics. The goal is to align these capabilities with government regulations worldwide that already legislate the protection of a user's identity—regardless of whether the source is biometric or user activity-based.
We are at the beginning of a biometric data storage and processing revolution. Everyone should be aware of the risks to their identity and how even at work, we can protect our basic anonymity.
