This year could be a big one for biometric privacy legislation. The topic is heating up and sits at the intersection of four trends: rising artificial intelligence (AI) threats, growing use of biometric data by businesses, anticipated new state-level privacy legislation, and a new executive order issued by President Biden this week that includes biometric privacy protections.
But increased scrutiny could backfire: These regulations could create conflicts and complex governance issues for companies trying to meet the new restrictions, especially when it comes to a tranche of new state laws that are about to take effect. This means companies need to stay updated as this legal landscape evolves.
Amy de La Lama, a lawyer with Bryan Cave Leighton Paisner who tracks state privacy laws, says companies need to be more forward-thinking, anticipating and understanding risks in order to build the right infrastructure to track and use biometric data.
“This means they should collaborate more closely between their business and legal functions to understand how to use biometrics in their products and services and to understand the legal requirements,” she says.
Biometrics regulation lags behind state privacy efforts
Various states have enacted data privacy laws in the past two years, including Delaware, Indiana, Iowa, Montana, New Jersey, Oregon, Tennessee, and Texas. This is in addition to privacy laws already enacted in California, Colorado, Connecticut, Utah and Virginia.
Yet despite this growing level of privacy protection, not all states have done much to regulate biometrics specifically. Colorado's privacy law, for example, does not explicitly define biometric data, though it does establish rules for how such data is processed.
So far, five states in particular have passed regulations related to biometrics: Illinois, Maryland, New York, Texas, and Washington. While that may look like a trend, many of these laws are narrow; New York's, for example, focuses exclusively on prohibiting fingerprinting as a condition of employment.
Of the five states with biometrics-related statutes, Illinois’ Biometric Information Privacy Act has been around the longest, since 2008, and is the most comprehensive, covering how biometric data is collected, stored, and used. Yet it took until this week to settle the damages in a lawsuit filed several years ago by a group of truckers against the BNSF railroad over a requirement that they scan their fingerprints before entering an Illinois rail yard.
That could change: New York is considering at least three bills this year that would expand protections to more comprehensive biometric coverage, and bills in committee in at least 14 other states also take a broader view of biometric issues.
A confusing patchwork of data compliance requirements
Subtle differences among the state laws can create compliance conflicts: they differ in how biometric privacy is regulated, when the laws take effect, and what reporting they require.
“Biometrics is clearly in the crosshairs right now,” says David Stauss, a leading privacy expert at law firm Husch Blackwell who tracks privacy laws across the country, “and it’s at the top of the list of issues related to sensitive data management. It’s incredibly difficult for companies to keep track of all these requirements. These regulations are a constantly moving target, and it’s akin to building a ship as we sail it.”
For example, Texas’ and Montana’s privacy laws take effect on July 1, but Indiana’s does not take effect until January 1, 2026. California’s law creates new requirements for sensitive personal information and allows consumers to limit certain data that businesses can use. Virginia’s law has a more restrictive definition of biometric data and limits how it can be processed.
Additionally, each state applies a different mix of thresholds for which companies are covered, based on the revenue generated in that state, the number of consumers affected, and whether the business is for-profit.
All of this makes matters complicated for companies operating nationally: they will have to review their data protection procedures, work out how to obtain consumer consent or allow consumers to limit the use of their data, and ensure they satisfy the different subtleties in each regulation.
Contributing to compliance issues: The executive order sets ambitious goals for various federal agencies on how to regulate biometric information, but there may be confusion over how companies should interpret these regulations. For example, does a hospital’s use of biometrics fall under the rules of the Food and Drug Administration, the Department of Health and Human Services, the Cybersecurity and Infrastructure Security Agency, or the Department of Justice? Probably all four.
And that is before considering the international implications, because Europe and other places are adding to this crazy quilt of privacy regulations.
Use of biometrics expands, despite trust issues
This complex legal landscape is driven by the growing use of biometrics to protect private and corporate data and the cybersecurity threats that come with it.
Vendors are doing a better job of incorporating these technologies into broader software packages; last fall, for example, Amazon announced it would expand its One palm-scanning service to enable better corporate access controls.
But while fingerprint, face, and palm scanning technologies have been around for years (the FBI has collected many millions of palm scans over the past decade), Amazon is storing its palm prints in the cloud, which could make leaks or potential abuse more likely, according to Mark Hurst, CEO of Creative Good.
“These handheld readers are intended to normalize the act of providing your biometric data anywhere, anytime,” Hurst says. “And what happens if the palm data, like so many other identification systems, is hacked? Good luck finding a new palm.”
Meanwhile, AI-generated deepfake video impersonations by criminals misusing biometric data such as facial scans are on the rise. Earlier this year, a deepfake attack in Hong Kong was used to steal more than $25 million, and others will surely follow as AI technology becomes better and easier to use to produce fake biometrics.
Conflicting regulations and criminal abuse could explain why consumer trust in biometrics has plummeted.
According to GetApp’s 2024 Biometric Technologies Survey of 1,000 consumers, the number of people who strongly trust tech companies to safeguard their biometric data fell from 28% in 2022 to just 5% in 2024. The company attributes the drop to the growing number of data breaches and reported identity theft cases.
“To mitigate legal, reputational, and financial risks, ensure biometric data is captured with consent and stored securely in accordance with privacy regulations,” says Zach Capers, senior security analyst at GetApp. But that may be easier said than done, especially as future biometric laws offer conflicting requirements.