If Mission: Impossible, Mission: Impossible 2, Mission: Impossible III, Mission: Impossible – Ghost Protocol, Mission: Impossible – Rogue Nation and Mission: Impossible – Fallout have taught us anything, it’s that security, identity and authentication are an ever more complex and involved landscape.
At least, that’s what I took from them. Don’t go in for motorbike stunts much.
Tom Cruise seems to have something of a taste for the theory and practice of security, too. Another notable foray into the realm was in the guise of another kind of agent, playing PreCrime Captain John Anderton in Steven Spielberg’s 2002 film Minority Report. The film told of a dystopian world in which crimes could be predicted by ‘Precogs’ – beings blessed with the ‘gift’ of seeing into the future – and prevented before they occurred. It also foresaw a world in which populations are tracked and identified constantly by both government agencies and private companies using biometric recognition. Oh, and controlling computers through voice and gestures. To classical music.
All seems very familiar, right? Ok, maybe not Minority Report’s Precogs, which explore the tension between free will and fate, and the ethics of attempting to control either. It must be said, though, that advances in AI and Machine Learning are producing algorithms that attempt to predict the future from Big Data and probability calculations.
Biometric identification, however, is now commonplace. Smartphones use fingerprints and facial recognition for user authentication, and security applications now range from simple Access Control and Time and Attendance tracking to banking, law enforcement and border security.
All a bit scary, non? Combine the technology to recognise people wherever they go by a variety of metrics with software that logs key historical data about them to try to influence future behaviour, and we’re pretty much at the point of walking into a store and having tailored adverts beamed straight to us as cameras scan our faces. Targeted marketing is already happening whenever you’re online and signed in or your cookies are enabled.
And it’s potentially all the more invasive when it comes to facial recognition, because, as Jennifer Lynch, an attorney for privacy rights group Electronic Frontier Foundation, said in an interview with Bloomberg: “Face recognition data can be collected without a person’s knowledge. It’s very rare for a fingerprint to be collected without your knowledge.”
Fighting the tide
So is there any halting the inexorable advance of biometric data and its application throughout our lives? Well, yes, there is resistance, manifested in privacy concerns and the resultant regulation. Turns out people don’t like the idea of being tracked everywhere they go (despite research showing that most consumers trust the large online companies who hold their data), and the right to privacy has been encapsulated in various regulations which govern how data is collected, stored and used. The most notable of these in recent years is of course the General Data Protection Regulation, or GDPR, which came into force on 25 May 2018. Remember that mad flurry of emails from everyone and their dog asking if you’d like to continue to receive communications from them when you had zero memory of asking to be added to the mailing list in the first place? That.
GDPR boils down to a set of guidelines for the collection and storage of personal data, so any organisation that controls such data must put in place appropriate measures (both operational and technical) to work within the regulations defined. As a collector of such information, you need to gain permission and justify the continued storage and use of personal data – name, age, location and any other data that could be used to identify someone. That, of course, covers biometric data.
The good news is that GDPR and other such legislation has introduced enforcement techniques with a lot more teeth than were previously available. Companies can be fined potentially very damaging amounts, meaning that the incentive to abide by the rules is strong. However good the intentions, though, data will always be under attack, and attackers will find ways to breach defences. So the question becomes: what data is stored, and what happens if it falls into the wrong hands?
In your face
One of the main concerns consumers have about facial recognition and fingerprints as authentication methods is that the data has to be stored for them to work, and that this biometric data could then be disseminated, either deliberately or by nefarious means. The worry is that their fingerprints, or images of their face, could end up in the open.
But is this concern justified? How is biometric data stored and used, and would its loss be a serious security concern?
Fingerprint readers, iris and retina scanners and facial recognition all sound quite different in the kind of information they store, but they’re actually very similar in the processes and mechanisms they employ to collect, store and use data. When you ‘register’ your fingerprint or face on a device, the device does scan the graphical data. But computers are the ultimate in massive geeks. They don’t care about what things look like. They like numbers. Zero and one are particular favourites.
So to make biometric data usable as an essentially mathematical device, the scans of your face or fingerprint are translated into a set of ‘nodes’ – measurements of position and distance that, taken together, form a unique numerical template that identifies you as you. Once a device has this template, it has everything it needs and should discard the ‘original’ scans. Facial recognition devices used for authentication scan in 3D and work in very low light anyway, so in many cases the original ‘scans’ would be pointless to keep. The other key point is that the stored data cannot be used to recreate the original images.
This means that, to all intents and purposes, your biometric data is just a set of co-ordinates and numbers in the format defined by and useful to the device in question. It’s less useful than the rest of the data that could be held about you, so concerns about privacy in biometrics are usually no greater than any other type of data.
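To make that concrete, here’s a minimal, illustrative sketch of the idea: landmark measurements flattened into a numerical template at enrolment, with later scans compared against it by distance. All function names, coordinates and the threshold are invented for illustration – real systems derive far richer features and use specialised matching algorithms, but the principle of ‘store numbers, not images’ is the same:

```python
import math

def to_template(nodes):
    """Flatten a list of (x, y) landmark points into a plain feature vector.
    A real device would derive many more measurements than this."""
    return [coord for point in nodes for coord in point]

def match(template_a, template_b, threshold=5.0):
    """Compare two templates by Euclidean distance; below the (arbitrary)
    threshold, treat them as the same person."""
    distance = math.sqrt(sum((a - b) ** 2 for a, b in zip(template_a, template_b)))
    return distance < threshold

# Enrolment: the device keeps only the derived numbers, not the scan itself
enrolled = to_template([(10.0, 22.5), (31.2, 22.7), (20.4, 40.1)])

# A later scan of the same face produces near-identical measurements
probe = to_template([(10.2, 22.3), (31.0, 22.9), (20.5, 40.0)])

print(match(enrolled, probe))      # close enough to match
print(match(enrolled, [0.0] * 6))  # a different set of measurements won't
```

Note that the stored template is just a list of numbers – there is no way to run this backwards and reconstruct the face or fingerprint it was derived from.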
As long as companies follow guidance and put appropriate safeguards in place to protect access to data, encrypt wherever possible, and store only data they actually need to carry out their services to customers, data privacy concerns can be largely allayed and we can all rest a little more easily. If, as an organisation, you have any questions about how all this can be achieved, let us know. We know a thing or two about such things.
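One of those safeguards – storing only what you need, in a less identifying form – can be sketched in a few lines. This is a hypothetical example using Python’s standard library (the field names and values are invented): a direct identifier is replaced with a salted hash so records can still be linked internally, and location data is truncated. Worth remembering that GDPR still treats pseudonymised data as personal data, so this reduces risk rather than removing the obligation:

```python
import hashlib
import os

def pseudonymise(identifier: str, salt: bytes) -> str:
    """Replace a direct identifier with a salted SHA-256 hash so records
    can be linked internally without storing the raw value."""
    return hashlib.sha256(salt + identifier.encode("utf-8")).hexdigest()

# The salt should be kept secret and stored separately from the data
salt = os.urandom(16)

record = {
    "customer_ref": pseudonymise("jane.doe@example.com", salt),
    "postcode_area": "SW1",  # truncated: enough for analytics, less identifying
}
print(record["customer_ref"])  # a 64-character hex digest, not the address
```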
Of course it has to be said, when it comes to access using facial recognition, you can’t defend against old Tom as Ethan Hunt and his flawless disguises …