Privacy is a controversial subject. What privacy means is strongly tied to a particular culture, but it almost always involves a personal choice about what is private, special, or sensitive to that person. When people enroll in an identity management system, they are sharing information about themselves. No two people are alike in their comfort with this sharing, and that comfort varies with each piece of information collected. Furthermore, as more information is collected, people grow more uneasy about their loss of privacy.
These feelings about privacy are inevitably bound to a perception of trust. In a government system, a citizen must trust that the government will use the information only for a particular purpose, and that it will keep the information safe so it is not shared more broadly without consent. Different countries have begun to address this natural trust/privacy interaction with privacy laws. In some countries, people have a right to privacy, and system administrators are accountable for maintaining a minimum level of trust. When a person provides personally identifiable information such as biographics and biometrics, we must work very hard to provide maintainable solutions:
- That store information responsibly
- That ensure information is used only for its intended purpose
- Whose policy of use is transparent to users
Privacy by Design
One of the most important considerations is limiting the user data that is required. If data is not absolutely necessary, the best thing to do is not acquire it. Customers and designers alike often want to establish protocols and interfaces capable of delivering use-case-independent information for the sake of reuse. Unchecked, this can lead to user interfaces that encourage users to enter more information than the purpose requires.
The second biggest consideration is the security of the system. If a system is responsible for keeping private information safe, then it must protect that data whether it is stored in a database (data at rest) or delivered across a channel in a modern distributed application (data in transit).
Interesting New Technologies
The traditional approach to protecting data is encryption. By using encryption, we convert the problems of data at rest and data in transit into a key management problem. To be sure, this is such a dominant requirement in IT systems that symmetric and asymmetric encryption will continue to be used to reduce the attack surface and help keep personal data safe. Nevertheless, key management systems can be vulnerable, with risks in key provisioning, key revocation, key storage, key sharing, and so on.
New technologies are being developed that promise to add a significant layer of protection beyond encryption alone. The basis of this technology is one-way transformations. The idea is that even if an encryption key is compromised, the decrypted data would leak very little personally identifiable information about the users of the system.
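To make the one-way idea concrete, here is a minimal sketch in Python using the standard hashlib library. The function name is illustrative, not from any particular product; it also shows why a plain hash alone is not enough for biometrics, since any bit-level noise in the input changes the digest completely.

```python
import hashlib
import secrets

def one_way_transform(data: bytes, salt: bytes) -> bytes:
    """Derive a fixed-size digest that cannot feasibly be reversed."""
    return hashlib.sha256(salt + data).digest()

# Enrollment: store only the salt and the digest, never the raw data.
salt = secrets.token_bytes(16)
stored = one_way_transform(b"exact-identifier", salt)

# Verification succeeds only on a bit-exact match...
assert one_way_transform(b"exact-identifier", salt) == stored
# ...so noisy inputs such as biometric samples need the error
# correction techniques discussed later in this article.
assert one_way_transform(b"exact-identifieR", salt) != stored
```

Even with the salt and digest exposed, an attacker learns essentially nothing about the original data beyond the ability to test exact guesses.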
One-Way Transformation for Biometrics
Currently, biometrics are typically collected from sensors and then processed into features. These features are much smaller than the original facial images or videos, fingerprint images, voice recordings, etc. Additionally, they are designed to make rapid matching feasible using software matching algorithms. While these features are often much smaller dimensionally, there is mounting evidence that a feature representation can be reversed into a usable likeness of the source biometric. For instance, the landmarks and texture information of a face can be used to regenerate a facial image that strongly resembles the original. Similarly, fingerprint minutiae can be used to construct a ridge flow pattern with relatively high similarity to the source fingerprint image. This reversibility means a breach could have greater consequences than anticipated: attackers could use the recovered information in innovative ways to determine whether a person is enrolled in other systems, or perhaps what transactions a user has performed, accelerating the loss of personal information.
Recent research considers these features in an information-theoretic way. Instead of storing the actual information, we can, by clever design, establish some invariant aspect of the biometric and then use an error correction scheme to achieve very strong privacy performance. Let's use fingerprint recognition to explain what we mean.
When algorithms compare fingerprints, they must deal with variations in acquisition between enrollment and comparison. The obvious issue is that the fingerprint will be placed at a different angle, and not precisely in the same location. Furthermore, differences in the pitch and roll of the finger further reduce the overlap between the reference and sample fingerprint images. Minutiae algorithms therefore test plausible hypotheses of rotation and translation, looking for a match. This level of detail is what makes minutiae representations highly useful for robust fingerprint matching, while simultaneously leaking a great deal of information about the user's source fingerprint.
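The hypothesis-testing step can be sketched as follows. This is a deliberately simplified Python illustration, not a production matcher: minutiae are reduced to (x, y) points (real minutiae also carry angle and type), and the candidate rotations and translations are searched exhaustively.

```python
import math

def transform(points, angle, dx, dy):
    """Rotate by angle (radians) then translate each (x, y) minutia."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y + dx, s * x + c * y + dy) for x, y in points]

def match_score(reference, sample, tol=2.0):
    """Count sample minutiae that land within tol of some reference minutia."""
    hits = 0
    for sx, sy in sample:
        if any(math.hypot(sx - rx, sy - ry) <= tol for rx, ry in reference):
            hits += 1
    return hits

def best_alignment(reference, sample, angles, shifts):
    """Exhaustively test rotation/translation hypotheses, keep the best."""
    best_score, best_params = -1, None
    for a in angles:
        for dx in shifts:
            for dy in shifts:
                score = match_score(reference, transform(sample, a, dx, dy))
                if score > best_score:
                    best_score, best_params = score, (a, dx, dy)
    return best_score, best_params
```

The point of the sketch is the privacy observation in the text: to support this search, the stored template must retain precise minutia geometry, which is exactly the information that makes the template reversible.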
Newer approaches orient the fingerprint image into a consistent "space" so that features are automatically aligned, or they use translation/rotation-invariant features that can still be compared for similarity. Now we can identify one-way hash functions for these geometric features. We can also perform a forward error correction process on those hashes and store only the parity information. When a sample is introduced, the same transformations are applied, and the parity information can be used to match the individual. If an attacker gains access to the parity information, an arbitrarily small amount of information is leaked about the real features of the user. Moreover, the one-way transformation functions can be created per user. These novel technologies can be employed to build very strong privacy protection capabilities, increasing the trust that users place in the system and improving privacy. There is a technology-level tradeoff for this stronger privacy protection: the mechanisms that automatically align features, or that use invariant features, come at a cost in accuracy. These tradeoff costs are an active area of research, and there are already commercial systems that deliver positive value for both users and system needs.
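One way to realize this parity-only storage is a fuzzy-commitment-style scheme, sketched below in Python. This is a toy illustration under stated assumptions: the biometric is already an aligned bitstring, and a simple repetition code stands in for the stronger error-correcting codes (e.g., BCH) a real system would use. All function names are illustrative.

```python
import hashlib
import secrets

REP = 5  # repetition factor: each key bit is encoded REP times

def encode(key_bits):
    """Repetition-code each key bit REP times (a toy error-correcting code)."""
    return [b for b in key_bits for _ in range(REP)]

def decode(code_bits):
    """Majority-vote each group of REP bits back to one key bit."""
    return [int(sum(code_bits[i:i + REP]) > REP // 2)
            for i in range(0, len(code_bits), REP)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def enroll(biometric_bits):
    """Store only helper data and a hash of the key -- never the biometric."""
    key = [secrets.randbelow(2) for _ in range(len(biometric_bits) // REP)]
    helper = xor(encode(key), biometric_bits)
    key_hash = hashlib.sha256(bytes(key)).digest()
    return helper, key_hash

def verify(biometric_bits, helper, key_hash):
    """A noisy sample still recovers the key if errors stay within bounds."""
    key = decode(xor(helper, biometric_bits))
    return hashlib.sha256(bytes(key)).digest() == key_hash
```

A sample with a few flipped bits still verifies, because the error correction absorbs the noise; a sample with too many errors, or a stolen helper string on its own, reveals essentially nothing about the enrolled biometric.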
One-Way Transformation for Biographics
Biographic information is used in a variety of ways in identity management systems. In many cases, it must be presented to a human security agent for verification. For instance, a driver's license might contain name, age, race, and gender so that a police officer can use it to improve identification accuracy for a person of interest. Alternatively, an electronic passport might need to provide biographic information to a border control agent for similar purposes. For these use cases, we cannot easily use transformation technologies to protect the personal information. However, as identification becomes more automated, there is less need for these "human matching" mechanisms. Perhaps a user might be challenged to provide his name to authenticate an identification credential. This model would take a vastly different approach to how such a comparison is performed. Instead of humans dealing with variations (such as perceived height, different hair color, or informal names), the algorithms would handle these comparison issues. Just as biometric algorithms can report a similarity score, so can biographic comparison algorithms. Thus, we can use similar one-way transformations and error correction approaches to further reduce leakage of personal information.
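The idea of a biographic similarity score can be illustrated with Python's standard difflib. This is only a sketch: the function names and the 0.8 threshold are illustrative, and a real comparator would also handle nicknames, transliteration, and phonetic variants.

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Score two name strings in [0, 1], tolerating spelling variation."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def names_match(claimed: str, reference: str, threshold: float = 0.8) -> bool:
    """Threshold the similarity score, much as a biometric matcher would."""
    return name_similarity(claimed, reference) >= threshold
```

Because the output is a score rather than an exact-match decision, the same fuzzy one-way transformation and error correction machinery used for biometrics becomes applicable to biographics as well.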
It is clear that personally identifiable information such as biographic and biometric data will, by necessity, be collected for identity management systems to function. However, we should recognize that just as the industry continues to innovate in usability, power, cost, accuracy, and speed, significant innovation is also happening for the benefit of users' privacy.
Greg Cannon is Vice President & Chief Technology Officer of Crossmatch responsible for the Company’s standards involvement, intellectual property, software architecture, biometric algorithm development, and continued innovation excellence.