
Best practice to reduce drawbacks of biometrics

Although biometrics is among the most accurate evidence of someone's identity, its implementation has raised concerns.

These range from false positives and false negatives, where access is granted (or denied) on the basis of an incorrect match; to spoofing, where systems are fooled by fake representations of an individual's biometric data; to data protection and privacy, where biometric information is not collected responsibly or stored securely.

While these concerns are being addressed, there are currently no biometric-specific laws to ensure that biometric technology is developed and deployed in an ethical, consistent, and responsible manner.

Therefore, standards bodies, institutions, and organizations are looking into creating a framework that will help biometric technology enhance and drive the digital economy rather than hold it back. 


A report from the Secure Identity Alliance (SIA), 'Building inclusive futures and protecting civil liberties,' says that although the introduction of GDPR in Europe sets a useful benchmark, it has not been globally adopted and does not eliminate the need for governments or private enterprise to address individual rights and freedoms when planning biometric-led projects.

Therefore, the SIA has developed a set of best practice guidelines to help policymakers make informed decisions about implementing biometric-led projects.

This involves a standards-based approach that includes professional training, shared norms, and consistent use of technical vocabulary.

Ethical principles

Earlier this year, the Biometrics Institute launched its Ethical Principles for Biometrics to guide its members – and the wider biometrics community – to act ethically in the absence of international law.

One of the principles, 'Recognizing the dignity of individuals and families', supports the "dignity and human rights of individuals and families," provided that it does not conflict with the criminal justice system's legitimate and lawful aims to protect the public. 

The Biometrics Institute has also updated its Privacy Guidelines, which cover citizens' right to have their biometric record amended or deleted and the right to redress and complaint by people who have suffered discrimination, humiliation, or damage as a result of biometric-related systems.

Day-to-day guidelines

In the UK, an independent panel has set out new guidelines on how the Metropolitan Police in London should use facial recognition technology. Ten trials of facial recognition were carried out across the capital with a view to incorporating the technology into day-to-day policing.

Following an extensive review, the independent Ethics Panel recommended that the police deploy live facial recognition software if several conditions were met.

These included ensuring the overall benefits to public safety were significant enough to outweigh any potential public distrust in the technology.

Strict guidelines were developed to ensure that deployments balance the technology's benefits against possible intrusion into the public's privacy.

With the Financial Times reporting that Brussels plans to legislate on facial recognition technology as part of an EU drive to create ethically based laws governing AI, the pressure is on for governments, regulatory bodies, and private organizations alike to give the general public the assurances they need about the deployment of biometric technology in everyday life.
