
Privacy and Authentication: Can they ever co-exist?

MIRACL

Ben Rapp, Founder of Securys Limited, joins MIRACL for this insightful discussion on privacy and authentication for Passwordless Day - 23rd June.

Watch (or read below) as we uncover the challenges of traditional login methods and how they impact cybersecurity. When it comes to Multi-Factor Authentication (MFA), relying on third-party solutions raises concerns about how user access and data are handled. But there’s a solution: Zero Knowledge Proof (ZKP). Unlike popular methods such as SMS codes or authenticator apps, a ZKP approach, exemplified by MIRACL, stores no security information that could identify the user, removing the credential database that attackers target in data breaches.

In this conversation, Ben Rapp and Margaret Sherer, Head of Marketing at MIRACL, and Rob Griffin, CEO at MIRACL, discuss the limitations of traditional MFA in guaranteeing privacy and share best practices for MFA privacy. Learn how prioritising privacy in authentication benefits both institutions and consumers. Discover how ZKP solutions enable privacy and authentication to coexist, creating a secure digital environment. Gain insights into safeguarding data and defending against cyber threats. Embrace the power of passwordless authentication and protect yourself in the ever-evolving world of cybersecurity.

Celebrate Passwordless Day with us and revolutionise your approach to login and authentication. Visit miracl.com/passwordless for more information.

Transcript:

Margaret Sherer  

Good afternoon and welcome to another LinkedIn Live brought to you by MIRACL, the login you love. I’m Margaret Sherer, the Head of Marketing here at MIRACL, and today we celebrate a particular day, Passwordless Day. Now, MIRACL created this day a few years ago after we realised that every May people raise awareness for passwords. We think that passwords are inherently insecure and not fit for purpose, and quite frankly, they drive your customers crazy. So with that, we started this Passwordless Day on Alan Turing’s birthday to help raise awareness of other solutions that can keep consumers safe while being inherently easy to use. Today I am joined by MIRACL CEO Rob Griffin and our special guest, privacy expert and founder and principal of Securys Limited, Ben Rapp. Thank you both for joining us today.

Okay, so today we will discuss privacy and authentication, and whether they can coexist. We will ask Ben a few questions about the authentication and privacy landscape, and then we will hear his top tips for businesses and consumers. Rob will jump in with any questions or comments as we go. And, as always, for anybody out there viewing, please put any questions or comments in the comment section on LinkedIn, and Ben or Rob will answer them as we go. Also, if you’re watching this on a replay, feel free to ask those questions, and we’ll be able to answer them afterwards as well. So, Rob, do you want to introduce why MIRACL feels so strongly about the importance of being passwordless?

Rob Griffin

Sure, thanks, Margaret. Boy, that’s a hard one to jam into an introduction, given our long list of issues. But security is really a game of good actors versus bad actors. And after 60 years (passwords have been around that long), the bad actors understand the many holes in them. At this juncture, many flaws and vulnerabilities are open and readily exploitable. So we now see that, from a security perspective, passwords are responsible for more data breaches than any other element within the security ecosystem. But it’s not just that; from a user experience point of view, I saw a statistic that 17 billion transactions in the US and Europe don’t happen because of passwords. Roughly between 2% and 7% of logins fail; there is, you know, a whole gamut of reasons why they are not fit for purpose from a user experience point of view. There are a raft of other issues, and we’ll touch on those during this chat; data privacy is on the list. But the fact of the matter is, they’ve been around for too long, and the vulnerabilities they manifestly show are now all too evident.

Margaret Sherer  

Thank you. So Ben, please do a brief introduction to your life in privacy.

Ben Rapp  

My life? That would be a long introduction; I’ll do a short one. I am the principal of Securys, which is a global privacy consultancy. We operate in about 60 countries and look after privacy in large and complex environments. So a lot of our work is with large enterprises and deals with the challenges of multiple regulatory environments, privacy regulation and others, and some of the issues with data moving around within a large organisation. Part of my interest in different authentication solutions comes from enterprise authentication as well as consumer authentication, where you often have multiple authentication events just to do your job each day, in the same way that we all have multiple authentication events in our lives.

Margaret Sherer  

Brilliant. With that in mind, can you tell us how the passwordless revolution affects privacy?

Ben Rapp  

Yeah, I think there’s, I mean, there’s lots of things you could talk about, but the obvious one is that most passwordless solutions that we see commonly, and that’s the Apple and Google products that are out there, are still dependent on a third party who knows all about you. So when you think about how social login works, if you log in with Facebook, or you log in with Google, or you log in with Apple, whether you’re using their new shiny passwordless solution or you’re simply carrying their login credential through, they know what you’re logging into. For every site you go to that you authenticate to using a third party operating on an identified basis, that third party can track what you were looking at, when you went there and, effectively, when you logged out and your session ended. And the question is, how confident are we in what they will do with that additional profiling information? You’re providing these very large organisations, who already know a lot about you and whose fundamental mission, if they’re Google, is essentially to sell that information to advertisers, with yet more data about your personal life. You’re also putting a lot of eggs in one basket: if you’re, you know, an Apple user, you live inside the Apple ecosystem, and now you’re saying, I’m going to delegate the security of other third-party applications to Apple; then you’ve got to hope that nothing happens that compromises that relationship. So not only do you expect that Apple won’t exploit it, but also that hackers won’t penetrate it, which is equally the argument against continuing to use passwords but keeping them in password vaults, because if that password vault is compromised, you face the same challenge. So that’s the core of it: you need to share identifying information with a third party to get the benefit of doing away with passwords.

Margaret Sherer  

And then, so, that’s a lot of eggs in one basket. We know that consumers are starting to care more about ownership of their privacy. Can you give some examples of where it is imperative to consumer safety that they understand how their data is being shared?

Ben Rapp  

Sure. I think it’s worth just picking up on that. We do a lot of surveys in a programme we call Privacy Made Positive, and we recently surveyed the US. One of the questions we ask is, what are you most concerned about? And what’s fascinating is that the second highest concern, after identity theft, which is always the first concern, was people using data for purposes other than those for which it was originally collected, which is quite a sophisticated concern, and one that is also the second highest in most European countries. So what people are worried about is that you do something with their data other than whatever it is they asked you to do in the first place. And probably the best and most depressing example of this comes from the dating app Grindr, who were fined for this. What happened was that they were selling allegedly anonymised data about their users, which included location data. And that allowed conservative charities to identify, I think it was Catholic priests in this case, who were using Grindr and therefore were assumed to be gay, identifying them from their movements, because of course you know where the churches are and where they live if you’re in that world, and then arranging for them to be sacked. So it was a poor outcome for all concerned; Grindr were fined. But the point was that there are things that we do which are entirely lawful but which can harm us if that data is misused. And of course that gets worse where there are things that we do that we, in the West, consider to be entirely lawful but about which other jurisdictions feel differently, attitudes to homosexuality elsewhere in the world, or to alternative sexual preferences, let’s call it that. There are countries where it’s punishable by death, and so the outcome of a similar hack or misuse of Grindr data would be significantly worse. But suppose you log into Grindr using an identified passwordless solution. In that case, the passwordless solution provider knows that you’ve logged into Grindr, and you can think about many other circumstances where there’s that social context to it. But then you also need to consider what people can do by understanding your financial or medical situation. If you’re logging into your medical insurance carrier using an identifiable passwordless solution, or you’re logging into your banking system or otherwise financially transacting, you are telling that third-party provider what it is you’re doing; that metadata is what has both value and risk.

Rob Griffin

And first of all, it might seem like an extreme example, but it isn’t, and there are plenty. I know that last year the whole issue around abortion legislation in the US saw several similar use cases or incidents arise. But I think the point also needs to be made that the onus sits very much on the online operator, because if you think of the repercussions and ramifications of an end user’s data being exploited in that way, or being subjected to a breach, those are very severe. And it’s worth understanding at this juncture, given the history of passwords, and I’m sure I mentioned the 60 years, that they have been around so long because of the issues around going passwordless. There have been technologies for some time, but they involve significant work, considering that you’ve got to trust a third-party provider. And how can you do that in a way that doesn’t leave you, the online operator, with a very insidious and, frankly, hard-to-quantify risk? And that’s something that we at MIRACL have sought to address by requiring no underlying PII for the end user, and apologies for the plug. But that is something that architecturally we feel is missing from, you know, other passwordless providers, and that becomes very relevant when you consider that something like Auth0 or Okta, you know, they’ve both been hacked twice in the last six months. So to what extent is there, you know, an open liability for operators that have essentially used an Auth0 and then exposed their underlying customers to that kind of, as I say, very hard-to-quantify risk?

Margaret Sherer  

I mean, building on that, um, you know, the architecture that you’re describing, and correct me if I’m wrong, but when you log in with Google or log in with Apple, it’s not built on what’s known as a zero trust protocol, correct? I’d like to see if you can elaborate on that and on why you think it should, or shouldn’t, be imperative for modern privacy systems.

Ben Rapp  

Sure. It’s more about zero knowledge, and the point is the distinction between somebody knowing who you are and then attesting to a third party that you are the person you say you are, so that now everybody knows who you are, versus somebody simply attesting that you possess the necessary identifying information, which you hold on to yourself, to authenticate a transaction. So it’s the difference between: I have, you know, a certificate or a key or something like that that allows me to demonstrate to a given third party that I am presently in control of this account, but they don’t know who I am, and they don’t necessarily need to know what I’m logging into. The three parties in this triangle, the authentication provider, me, and the service I’m trying to access, agree that there is a mechanism of exchange that will authenticate a particular instance. The service provider asks, is this okay, are you happy to log in? I say yes. And the party in the middle simply says, yes, this is okay, without knowing who I am or who’s asking. And that zero-knowledge situation means that nothing is logged at the third party, the authentication provider, that could be used to unmask me, because I do not have to provide identifying information to that third party. So I have the convenience of being passwordless, I do not have to give all my KYC information to a third party, and I’m not leaving a personalised trail with them. But I am getting the certainty of security, provided, of course, that the third-party authentication solution is intrinsically secure. The interesting thing about MIRACL is that it’s built on quite a sophisticated cryptographic underpinning, which means that this works without needing knowledge of the identities of the parties directly.
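To make the zero-knowledge idea concrete, here is a minimal, illustrative sketch of a Schnorr-style identification protocol in Python. This is not MIRACL’s own protocol, just a generic textbook construction with toy-sized parameters, but it shows the shape of the exchange Ben describes: the user proves possession of a secret that is never transmitted, and the verifier confirms the proof without learning the secret.

```python
# Toy Schnorr-style zero-knowledge identification, for illustration only.
# NOT MIRACL's protocol and NOT production parameters: real systems use
# 2048-bit-plus groups or elliptic curves, via a vetted library.
import secrets

# Tiny demonstration group: P is a safe prime (P = 2Q + 1) and G generates
# the subgroup of prime order Q.
P, Q, G = 23, 11, 2


def keygen():
    """The user picks a secret x; only y = G^x mod P is ever shared."""
    x = secrets.randbelow(Q - 1) + 1
    return x, pow(G, x, P)


def commit():
    """Step 1: the prover picks a random nonce r and sends t = G^r mod P."""
    r = secrets.randbelow(Q - 1) + 1
    return r, pow(G, r, P)


def respond(x, r, c):
    """Step 3: the prover answers the challenge; x itself is never revealed."""
    return (r + c * x) % Q


def verify(y, t, c, s):
    """The verifier checks G^s == t * y^c (mod P), learning nothing about x."""
    return pow(G, s, P) == (t * pow(y, c, P)) % P


if __name__ == "__main__":
    x, y = keygen()              # enrolment: the service stores only y
    r, t = commit()              # login attempt: commitment
    c = secrets.randbelow(Q)     # step 2: verifier's random challenge
    s = respond(x, r, c)         # response computed from the secret
    print("authenticated:", verify(y, t, c, s))
```

In practice only the commitment, challenge and response travel over the wire; the secret never does, which is why the party in the middle has nothing worth breaching.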

Rob Griffin

And just to make it explicitly clear, you know, this breakthrough within cryptography, which has now existed for over 20 years, has been somewhat stymied by precisely the business model that Ben set out at the beginning: namely, by knowing, as a third-party authentication provider, the various details of what it is the end user is doing, I can profit from that in all sorts of ways without the end user necessarily comprehending how I’m monetising that data. And what we’re saying is that the long-term result represents a privacy risk that needs to be more widely understood.

Ben Rapp  

It’s more understood than it was. And this is interesting if you look at social login; there’s been a lot of publicity around its implications. If we forget about more recent developments in passwordless, we can think about where it started: logging in with Facebook. And that was a convenience play. I’m already logged into Facebook, because of course people who use Facebook tend to be on Facebook all the time, so I will now use my Facebook login to authenticate myself somewhere else; that somewhere else is passing the buck to Facebook to handle the authentication. But it became evident to people that that hugely increased the value of their data to Facebook, because Facebook could now track them, including knowing when they visited somewhere but didn’t log in, as long as that somewhere had the Facebook login option. Because when you arrive, if you have a Facebook session cookie kicking around, Facebook can now know that you’ve been there. So there has been a rise in consumer awareness because, of course, Meta, as we should now call them, have been in the news so much for other privacy issues. So people’s consciousness is growing; people are also more and more concerned about Google’s behaviour. It’s slightly different with Apple, because Apple has made such a lot of noise about privacy. They have taken a strong stance, which is very interesting, because it says you can trust us not to share your information with anybody else, with a slight asterisk on “anybody else apart from everyone we share it with”. But you’re also trusting them to do all sorts of things with your data inside the walled garden. So Apple is a panopticon inside its walled garden; it knows everything you do. And the idea is that you’re supposed to like that and feel that you benefit from it. I wouldn’t want to claim that there’s quite such a backlash against that yet, but I suspect there will be, because of the level of insight that these businesses now have into every aspect of your behaviour.

Margaret Sherer  

Let’s turn to some solutions. Consumers and clients are concerned about privacy, whether they want to be inside the walled gardens or not. Are there three tips, or three things, consumers can look for to protect themselves regarding privacy within authentication?

Ben Rapp  

I’d suggest deferring to Rob on this, but fundamentally it’s not what consumers should be looking for; it’s what people providing services to consumers need to offer. Clearly, whatever login solution you offer wants to be as convenient as possible. One of the things that’s wrong with passwords is that they’re a massive pain. If you have a password that’s sufficiently secure that it’s difficult to crack, then, based on modern conventions, it’s also extremely difficult to remember, because we’ve done a very stupid thing, right? What makes passwords difficult to crack is the length. But instead of allowing people to write a sentence, we insisted that they introduce all sorts of random characters and have some capital letters. You’ll all have experienced this thing where you go somewhere and it says you have to reset your password, which is also a bad idea, and here are the 57 rules you must meet for this password to be valid. And you sit there swearing as you discover they don’t think a full stop is a special character, and you realise they’ve decided you can’t have more than three numbers in sequence, and this kind of thing. And you think, you know, how am I going to come up with a password that I’ll actually remember?
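A quick back-of-the-envelope calculation illustrates Ben’s point about length versus forced complexity. This sketch uses standard entropy estimates (a roughly 94-character keyboard pool and a 7,776-word Diceware-style list), not any particular site’s policy:

```python
import math

# Back-of-envelope comparison: an 8-character "complex" password versus a
# 4-word passphrase drawn from a 7,776-word Diceware-style list.

complex_pool = 26 + 26 + 10 + 32            # lower, upper, digits, symbols ~= 94
complex_bits = 8 * math.log2(complex_pool)  # ~= 52.4 bits

wordlist_size = 7776                        # Diceware word list
passphrase_bits = 4 * math.log2(wordlist_size)  # ~= 51.7 bits

print(f"8-char complex password: {complex_bits:.1f} bits")
print(f"4-word passphrase:       {passphrase_bits:.1f} bits")
# Roughly equal strength, but the passphrase is far easier to remember;
# add a fifth word (~64.6 bits) and length wins outright.
```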

In terms of what we’ve sought to achieve with passwords, we’ve sought passwords that are mathematically difficult to reverse. I won’t go into the maths in great detail and talk about bits of entropy, because everyone falls asleep. But in consequence, we’ve produced a situation where even passwords that are quite difficult to crack are tough to remember. We do stupid things like expecting people to change them and remember the password they changed it to. And that leads to people failing to log in, having to do password resets and getting their accounts locked. This is terrible from the service provider’s perspective, because whatever transaction the customer was going to conduct, they will not conduct it. Instead, they hop up and down with rage as they try to get access to the account to do whatever would take them two minutes and generate revenue for the service. So you need to move away from that. You want something that’s intrinsically more convenient. You also want something that is intrinsically more secure, meaning it is harder for a hacker to penetrate whatever the mechanism, whether that’s brute force, social engineering or a third-party compromise. And at the same time, it has to be reliable. It has to work every time, because some of the solutions people have adopted to avoid the password problem don’t work very well, as anybody who’s ever dealt with a push notification in an online banking app will know; some things that feel like solutions aren’t. And then, you’ll excuse me, the third thing is that you want that privacy element. You want to know that it’s not going to be used unwisely. And that leaves us with some challenges. We all leapt onto biometrics, which is to say fingerprint or face recognition. The problem with that is that it’s just a proxy, right? All it’s doing is providing a layer of validation that then, underneath, has to do something else to authenticate you with the remote end. We don’t pass your fingerprint across to your bank; your phone is saying to your bank that the fingerprint correctly unlocked whatever it was on the phone, and that is what authenticates you to the bank. And so if that underlying authentication is going through a third party like Okta, then Okta knows that you’ve just logged into your bank, and if Okta is penetrated, everybody knows you’ve just logged into your bank. There are other issues with biometrics: they are less secure than people think, and they’re becoming increasingly less secure as the spoofing technology improves, with people faking both faces and fingerprints. But the issue here is what’s happening in the architecture to make that authentication connection. What you want is quick, seamless, reliable and easy for the user on the surface, but underneath fantastically difficult to compromise. And that’s quite a big ask, at which point I’ll hand it over to Rob to explain why MIRACL does these things.
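What Ben describes, the fingerprint only unlocking a device-held key that then does the real authentication, can be sketched roughly as below. This is a simplified illustration in the spirit of FIDO/WebAuthn, not any vendor’s actual API; the function names are hypothetical, and it assumes the third-party `cryptography` package.

```python
# Simplified sketch of the "biometric as a proxy" pattern. Hypothetical
# names; requires the third-party 'cryptography' package.
import secrets
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrolment: a key pair is generated on the device. Only the public key is
# registered with the bank; the private key stays in secure hardware.
device_key = Ed25519PrivateKey.generate()
bank_registered_pubkey = device_key.public_key()


def biometric_unlock() -> bool:
    """Stand-in for the OS fingerprint/face prompt. The biometric itself is
    checked locally and never transmitted anywhere."""
    return True  # assume the user passed the local check


# Login: the bank sends a fresh random challenge...
challenge = secrets.token_bytes(32)

# ...and the device signs it, but only after the local biometric unlock.
if biometric_unlock():
    signature = device_key.sign(challenge)
    # The bank verifies against the registered public key; verify() raises
    # an exception if the signature is invalid.
    bank_registered_pubkey.verify(signature, challenge)
    print("bank accepts: this device holds the enrolled key")
else:
    print("local unlock failed; nothing was sent to the bank")
```

The privacy question Ben raises is about who sits between the device and the service in that exchange, and what they get to log about it.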

Rob Griffin

If you come back to the original question of, you know, what should end users do: there are a series of actions, whether it be the use of multi-factor authentication or the use of password managers. But it’s, in my opinion, the wrong question, because the fact is that if you look at 95% of users online, they consider that security is not their responsibility. So I could sit here and talk about what the options are for end users, but the truth is that 95% of people think it is down to the operator. They think: I’ve come to your shop, you make sure that my transaction, my business here, is safe, and I want a good experience; if I don’t like it, I’ll go elsewhere. And frankly, not least because of GDPR, that’s the way it should be. So we are firmly of the view that it’s now down to the online operators to move to a passwordless world, both to, you know, enable some of the savings that we talked about in terms of better monetisation, getting rid of all of those thwarted transactions and the cart abandonment, and all of the benefits in top-line traffic that would ensue. But to do that, they’ve got to run a thorough exercise across the different providers out there. And there’s no question that the elements of privacy are a leading agenda within that, and hopefully we can set out our stall as being, you know, a provider with a clear edge, let’s say that. You know, operators online care about profits, which means they care about getting more traffic and revenue with fewer costs. And time and time again, our case studies point to the fact that because of our login success rate of 99.9% for B2C operators, which is, you know, higher than anyone else out there, and because when a user gets what they want quickly, speedily arriving at their destination, they don’t then generate support costs, they don’t, you know, raise whatever queries with the back office, there is a manifestly substantial drop in the support costs that arise from that. So we set out our stall to make you more money and save you costs. But look, there are all these other considerations as well that you may not have considered: you’re going to have to suddenly do third-party data audits if you’re reliant on a provider that’s going to duplicate all of your Active Directory; you’re going to have to think about all of the architectural considerations of maintaining infrastructure if you’re going to run, you know, an SMS service where you’re going to store everybody’s mobile phone numbers, et cetera, et cetera. So these are the considerations for today.

Ben Rapp  

And that’s a good point you’ve made that is worth picking up on. In addition to this whole question about whether we trust this third party, what we are transferring to the third party is, without question, personal data as defined by the GDPR, the UK DPA and most of the other data protection regulations around the world. And all those regulations also contain constraints on exporting that data to other countries. As soon as you think about passing your email address, which is usually what you use as your username, or your mobile number, which is identified as personal data, or your IP address, which in Europe is also considered to be personal data, out to a third party, then the actual service that is using this third party for authentication has a duty to ensure that that transfer is legitimate. They have to have the correct paperwork in place. And it adds to the impact of using these kinds of services from an operational perspective, in terms of the privacy function, in addition to adding to the real risks to the data and the admin function of establishing that the provider has, you know, hosting centres in Europe and the appropriate data protection documentation.

The benefit of any zero-knowledge solution is that you’re not passing personal data. And from an operator’s perspective, from an online service provider’s perspective, that makes it much easier, because you go: I’m passing you some cryptographic numbers, and candidly I don’t know who they relate to. The other end is passing you some cryptographic numbers; they tie up to a correctly executed authentication transaction, but no personal data is moved. And from a privacy perspective, that makes it much easier for people like me to sign off on the solution without doing loads of due diligence and then having to come around and re-audit it every so often to make sure it’s still okay.

Rob Griffin

Can that be quantified from a sort of ISO maintenance point of view?

Ben Rapp  

Some of that will depend on scale, but it’s part of the cost. And I think there’s also the whole question, then, about the secondary costs in securing the infrastructure over which the data is moving, making sure that it doesn’t represent a supply chain risk, all of that. Again, if what you’re moving is purely cryptographic, you don’t even have to worry about secondary encryption for it, because it’s intrinsically encrypted and isn’t susceptible to decryption; there’s nothing but crypto out there in the first place. Anything that makes the lives of people running the privacy function easier will save costs. And you talked about SMS, right? SMS is entirely at the other end of that spectrum. It’s fantastically rubbish from a security point of view, it always involves moving personal data, and every single transaction has a significant cost. And there will always be a third party providing that SMS service who will know every time you’ve texted Person X. So it’s the worst of all possible worlds.

Rob Griffin

And we’re seeing now, with the gamut of brute-force attacks on SMS systems, where they’re essentially going for MFA fatigue so that ultimately people just let a new identity into the system, that those attacks are, to a large extent, coming from cross-border locations. So all of that brute-force traffic results in substantial SMS costs, at a far higher unit cost than the average domestic customer, and it doesn’t pertain to legitimate traffic, which complicates the audit process and the reconciliation. We saw Elon Musk raise this issue concerning Twitter and their reconciliation of the SMS bills outside the US, but it’s a substantial issue that people need to be aware of now.

Margaret Sherer  

We set out at the beginning to answer the question: can privacy and authentication coexist? And we’re concluding that they can, in a zero-knowledge environment. Not only does that environment make things relatively seamless for your end users, but as an operator you also make a lot more money as conversions happen, because more people get through your front gate, and you save heaps on support costs and potential breach costs. And, as Ben brought up, you’re not moving personal data from here to there, so you’re minimising your eggs-in-baskets and the places where you can potentially be hacked, which would otherwise create a lot more cost.

It’s been very, very interesting. Thank you both for your time and for this conversation. We will follow up with any comments and links to the passwordless toolkit for anyone wanting to get involved, so you can share your thoughts on social media. For any other questions about privacy and authentication, you can contact Ben at securys.co.uk or by email at info@securys.co.uk. And for any questions you might have for MIRACL, please get in touch at MIRACL.com or by email at login@MIRACL.com. Thank you so much. Thank you, Rob. Thank you, Ben. We will see you next time. Great. Cheers. Thank you.

Rob Griffin

And Happy Birthday, Alan Turing!
