The SSI Podcast

Self-sovereign Identity, Decentralization and Web3

Podcast Transcriptions

Dec 14, 2022

Digital ID: Trapped in Fake News? (with Imraan Bashir)

[Mathieu Glaude] Earlier this year, there were protests against the vaccine passports, and there was this whole rhetoric that the vaccine passports were a tool the Government was putting in place to control and surveil its citizens. These types of movements exist specifically within the space of Digital Identity too: just recently at the Quebec National Assembly, there have been a couple of different petitions, with tens of thousands of signatures, against Digital ID programs, or even against being forced to use a wallet for Digital ID. Anyone familiar with Canadian Digital ID, in the public sector at least, knows of the Saskatchewan Government's program falling through in early 2022, which further deepened negative public sentiment around these types of programs, or perhaps influenced it, perhaps didn't.

Then again, anyone familiar with the Digital Identity space knows the effort going into preserving privacy principles and making sure they're put into place, as well as other principles such as control and agency, equity, inclusion, security, transparency, and the list goes on. I think this is very well communicated within the digital identity community, but we seem to be having a tough time as a community explaining it more broadly to the public.

I hear many people say Governments need to do a better job at this, but I think it's everyone's role to do a better job, not just the Government's; it's the community's overall. If we do this correctly, we're doing all of it to protect the rights of people through these tools.

The conversation today is meant to discuss all of this and, hopefully, to generate some suggestions we can share with listeners on how to avoid the 'big brother' rhetoric moving forward. Let's start with the concept of misinformation as a growing societal topic, because I think it's blurring the messaging. In fact, I think misinformation is a big cybersecurity risk that gets in the way of protecting the rights of individuals and societies. If we're not able to express our choices without being manipulated, it's a problem.

It would be interesting just to start with misinformation, because it’s one of the things blurring Digital ID communication.

What is your thinking on misinformation, and how has it evolved over the past few years, maybe even more recently with Covid?

It would be interesting to start the conversation there.

[Imraan Bashir] I would say it's not just a Digital ID problem; it's a societal problem that transcends any single subject matter. We see it with elections and with public health, and Covid was a great example. In my opinion, a lot of it comes down to trust, or trusted sources.

When I was growing up, the only way to get your news was through outlets such as CBC News, delivered by Peter Mansbridge, who was the 'trusted source'. Now we're facing an information overload problem, which can be good in some ways, information is awesome, and the internet and social media that have come out in the last 20-odd years are good for our collective knowledge as humanity. But with information overload comes a lot of other stuff that isn't necessarily accurate. We have bots putting out automated content, and I think the challenge we have now is that there is simply so much content.

If I look at my own Twitter or LinkedIn feeds, the algorithms are starting to dictate what bubbles up, because you can't possibly see it all; what you get is a snippet of some of the things that are going on. And guess which snippets you're getting? The ones with the most reaction and interaction, the most hate or love, whatever the case may be. The extreme reactions are the ones that will bubble up.

As a human, when I see misinformation on the internet, my first instinct is to want to correct it. I want to say, 'no, that's not true'. But the way social media works, the sheer interaction with that misinformation is what allows it to bubble up to the top where others see it. So it almost becomes a self-fulfilling prophecy: stuff gets out there, we naturally want to correct it, that gives it more visibility, and so on. That's my general view of why it's out there.

In terms of Digital ID specifically, I won't go into Covid much. You know, maybe I will a little bit, without picking a side on the whole vaccine thing; I don't want to, that's another podcast by itself. I would say the mRNA vaccine, just the principle of it, is a perfect example. mRNA was not just invented last year; it has been in the works for many years, and there is science behind it and so on. But it only became well known to the general public during Covid, when many people saw those four letters put together for the first time.

Digital ID is no different, in that this has been researched, especially in Canada, going back decades, with things like the Pan-Canadian Trust Framework and privacy-respecting principles. But the general public, just as with the vaccines, doesn't know all the ins and outs, nuances, and minutiae that all of us have probably laboured through for years, sometimes to the point of analysis paralysis. They have snippets of information, and what happens with snippets is that people tend to fill in the blanks and start drawing parallels to other things they may have read or heard about. The one that comes to mind in Digital Identity, for me, is the social credit system in China. This parallel opens up a whole can of worms, and yes, that is one way to implement it. Is it the way that Canada is going to implement it? Absolutely not.

That is why we spent the last 10 or 15 years developing the principles to prevent that type of thing from happening. I think the rise of misinformation starts with the fact that we have so much of it, that some of the subject matter is so complex, and that our attention spans are so short that everything needs to be explained in 140 or 280 characters. These topics alone need way more time than that. It's like a perfect storm of things coming together to lead to the current state we're in.

[Mathieu Glaude] I think it's just simpler for people, and I guess it comes back to what you said about the algorithms pushing things one way or another. More and more, people have found it easier to take sides, good or bad, when in reality no subject in the world is purely one thing or the other. People have been pulled away from actually having discussions about topics. Saying that the Government is trying to control you with Digital Identity is such an easy thing to say and get behind, and people have these existing beliefs, whether it's about Covid, or central bank digital currencies, or whatever the subject is. Simple messaging just sticks a lot more, pushes people to one side or another, and you get these conspiracy theories. It's easy to say the government wants to control you, and it's easy to say that your telephone is listening to you and tracking your data.

I think there is a misunderstanding of how the architecture of the internet works today and what the economic incentives are for different actors. It's easy to make these claims without understanding the structure underneath. The challenge is how you explain that structure without it getting too complicated for people to listen.

[Imraan Bashir] As humans, it's easier to process simple (mis)information, like 'your phone is spying on you', without going further and understanding the role of the microphone, camera, GPS, etc.

When you apply the analytical side on top of the intuitive side, obviously a bit more comes through, and I think that's the other reason. There are two personality types, in the sense that some people lean on intuitive reasoning and others on analytical reasoning. Like me, and maybe it's because of the field I'm in, cybersecurity, I just don't trust anything by default. When I read something, I don't even feel comfortable re-tweeting, liking, or resharing it until I know it's legit. I need to debunk any information I consume before taking a stand, which is more my personality type.

Others will read and intuitively make some of these logical, or not-so-logical, leaps, in the sense of 'my phone is spying on me', which, I mean, makes sense on the surface, and then take it from there. The other thing is that we need to teach our society a bit more critical thinking. It's something I try to instill in my kid as well: if it looks or sounds funny, just ask a question or double-check.

A lot of the beauty of the information overload I referenced earlier is that there is good information you can go search for if you desire the truth. I think that's the bigger challenge: how we interpret information, and whether we take enough time to vet it to our satisfaction or not.

[Mathieu Glaude] Back to the Digital ID topic: having spent enough time around public sector Digital ID programs, there is a massive movement to protect the privacy of citizens, and the Government is committed to creating Digital IDs as a public good to protect the democratic rights of citizens. In the same way that the Government takes action to protect its sovereignty, it's also taking action to ensure that citizens have more sovereignty over their lives, whether in the day-to-day physical world or the digital world.

It is what it is, and we know why these conversations are happening. However, there are massive movements happening throughout the world to protect the privacy and democratic rights of people, and in my opinion, the way they are being communicated is not as effective as it should be.

[Imraan Bashir] Also, I think openness and transparency is always the solution when there is some sort of mistrust, whatever the reason. Let's be more open, even overly open, and transparent about what it is we're doing, why we're doing it, and what we're not doing, which I think is just as important as what we are doing. In these types of messaging there is certainly room for improvement. And not that I'm taking the side of the conspiracy theorists, but I do understand why people have mistrust when an institution or Government talks down to them with an attitude of 'trust us, we're looking at it' and 'trust us, we'll do it', rather than having a conversation about what we're doing. This goes beyond identity; it applies to vaccines and other things as well. I heard this concept during Covid, because of all the misinformation about vaccines: there is a psychology behind some of the stuff that gets announced. People had never heard terms like mRNA before, and in Digital Identity there are terms like SSI and all these others that get thrown around. By human nature, you're standoffish about terms you don't understand, so you're already defensive. If someone then gives you a reason to be more defensive, like 'oh, there's a microchip associated with the vaccine', it's easier to believe, because you've already got your back up.

The term I heard during Covid, I forget what article I read it in, but some doctor said something along the lines of 'pre-bunking': anticipating what type of false messaging might come out against the thing you're trying to release. You almost pre-inoculate people: 'here's what it is, here's what you might hear about it in the coming months or years, here's our answer to that, and here's where you can ask more questions'. That way you set a tone with people who might originally be defensive: okay, now I know what to look out for. You give them little trigger warnings that may help. Now, I don't know if that would have worked with Covid, I'm sure there would still be a subset of the population that believes a certain thing, but I do always enjoy the concept of proactive communication.

Bringing it back to identity, in terms of what we all can do better, or what Government can do better, I think some of this pre-bunking would help: 'here's what the country is trying to do, and here's what the country will not be doing', because of the privacy, consent, and all the other principles we just discussed. You're right, I think communication is a big part of it. The sooner the better, in my opinion, but I'm that way with all facets of life: the more information the better. Give people a chance to understand it, rather than throwing it at them or making it seem like it's being forced upon them.

[Mathieu Glaude] I feel that, coming from the space, we're so caught up in the standards, the tech, and the lingo we use in our day-to-day lives, which mean absolutely nothing to regular people. Maybe focusing on the end-user experience you're looking for is a better approach than talking about zero-knowledge proofs or concepts like that. Even talking about what could go wrong in these programs is a good conversation to have: 'this is what we think could go wrong, and this is what we're doing to combat it'. And I like your idea; the pre-bunking almost made me think of different open-source software projects that do open calls to their dev communities every week. That would be a nice thing to see with this pre-bunking approach, rather than just talking about how to scan a QR code.
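[Editor's note: for readers unfamiliar with the 'zero-knowledge proof' jargon mentioned above, here is a minimal, illustrative sketch of a Schnorr-style proof of knowledge in Python. The prover convinces a verifier that they know a secret `x` behind a public key without ever revealing `x`. The tiny primes are for readability only; real credential systems use large groups or elliptic curves, and the wallet-facing details differ.]

```python
import secrets

# Toy Schnorr zero-knowledge proof of knowledge (illustrative only).
p, q, g = 2039, 1019, 4   # p = 2q + 1; g generates the order-q subgroup

x = secrets.randbelow(q)  # prover's secret (e.g., a credential key)
y = pow(g, x, p)          # public key: y = g^x mod p

# 1. Prover commits to a random nonce
r = secrets.randbelow(q)
t = pow(g, r, p)

# 2. Verifier issues a random challenge
c = secrets.randbelow(q)

# 3. Prover responds; s reveals nothing about x on its own
s = (r + c * x) % q

# 4. Verifier checks g^s == t * y^c (mod p), learning only that
#    the prover knows x, not what x is
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof verified; x was never revealed")
```

The same three-move shape (commit, challenge, respond) underlies the selective-disclosure proofs used in SSI credential formats, which is why the term comes up so often in this space.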

[Imraan Bashir] That's right, because a lot of us in the field can already anticipate the types of questions or concerns that might arise. It's not rocket science, and we can see from implementations across the globe that there are pros and cons to different countries' approaches; we know this already. We should do this proactively: go out there and talk to people about what we believe in Canada, and why it's different. There are going to be some similarities and some differences, and the social credit thing is not happening in this country. For me to say that is one thing; for us to collectively prove it and demonstrate openly why it won't happen is another, and that's what I think we're lacking.

[Mathieu Glaude] I wonder if we could draw parallels to the whole misinformation topic. It's becoming more and more difficult to establish what is a fact versus what is an opinion, and I think our news channels have been bombarded with opinion rather than fact. Different tech companies have taken it upon themselves to separate fact from fiction; you'll even see them start putting messages on Tweets or podcasts saying, 'warning, go check this other source if you want information about this'. It feels like, at least among the information technology or social media companies, there is perhaps a lack of regulation, if that's the right word, or of an oversight committee; a lack of governance, where all of these siloed platforms are taking it upon themselves to say what's true and what's not.

Is there a role for Government in there as well, and then is it a mix between Government and private sector? 

I start to think that if that's missing in the digital world today, then for what we're talking about, having these pre-bunking forums, it could be a similar type of impartial governance organization.

[Imraan Bashir] That's a really interesting question. I think there's certainly a role for Government to help set some expectations on what we want in our country, and I think our country needs to get rid of misinformation as well. The challenge with having the Government involved, though, is that the mistrust is sometimes with Government itself. So if a Government entity were, hypothetically, created to debunk and demystify some of the things that are out there, I still think there would be that inherent mistrust among a subset of the population anyway. To fully answer your question, I do think there's a role for all entities to play, because there's mistrust everywhere in general. Some people don't trust their banks, some people don't trust their Government, some people don't trust, insert the name of a corner store here. I think the more people that are in the conversation, the better. I don't think it's any one party's role; everyone plays a part in it. It wouldn't be out of the realm of the possible to have some entity, even a non-profit for argument's sake, come in and start debunking some of this stuff.

But it would have to be collective, and don't forget that only a certain percentage of the population is on Twitter, or on Facebook, or on whatever platform, so the number of mediums that would need to be regulated or overseen is growing almost exponentially. There seems to be a new platform for everything, so I think we have to be careful about how broad we go, and about what the perception would be if Government were to step in and regulate or set rules around what can be communicated.

Then we start talking about free speech infringement and other things that spiral as well. So I think it has to be collective: rather than making it feel more 'big brothery', or even more perceived as 'big brothery' than it is now, focus it on a community conversation, including more people in the conversations as they happen.

[Mathieu Glaude] Let me give you a conspiracy theory of my own as to why a lot of this negative stuff is coming out about Digital Identity.

Are a lot of the big players influencing the fake news? Is that something you could buy into?

[Imraan Bashir] Oh, interesting! So what do you mean by some of the big identity players? 

[Mathieu Glaude] Yeah, identity players and big tech players that have silos and manage a lot of identity data. Anyway, is that something you could get behind?

[Imraan Bashir] I'm an optimist by nature, man. I would like to think everyone wants to do what's best, and there's a way for all these people who are currently managing these systems to interoperate in a bigger ecosystem, one that, from a capitalistic perspective, would allow them to maintain whatever market share they have while participating in something bigger. So I see more opportunity for them in an integrated ecosystem versus the standalone businesses they're in now. If they were involved in spreading fake news, I'd say that's pretty short-sighted, because strategically, if I were running these businesses, I'd want a bigger piece of the pie. The more participatory it can be, and the more inclusion there is from the people in the field, the better it is for them, and the better it is for us as citizens, because we get more direct knowledge of, access to, and consent over the sharing of our information.

[Mathieu Glaude] Switching gears a bit: for anyone getting into the Digital Identity space, there's a whole slew of standards bodies and different standards that get used, and it's sometimes tough; it creates a barrier of understanding for certain people because there are so many of them. They're not all technical standards, either, and there are so many different organizations, some with support from different levels of Government. For example, we're seeing this a lot in European nation-states and also in various provinces in Canada, where there is a big movement towards some of the W3C and Hyperledger standards, and on the other side, the governance of the Pan-Canadian Trust Framework is there as well. Then, talking about Canada specifically, we have multiple levels of Government: federal, provincial, and municipal. Picking a non-international standard is not acceptable for a federal Government for specific reasons, for example creating barriers to international trade, which you could get sued over.

How do we make it easier for people to understand all of the different standards that are out there, and why they’re important? 

Maybe you could talk a bit about some of the leading ones which you think are mature enough and need to be better understood.

[Imraan Bashir] That's a really good question. I think we have to start at the beginning: why do we even need standards in the first place, what is the role of standards, and how deep do standards need to go? I think there's a sweet spot there. Standards are meant to ensure consistency, reliability, and interoperability, holding everyone to the same level, whether that's technology or the electrical outlets in your wall; there's a certain safety requirement in some of these things. From an identity perspective, I think the challenge is that it's still early days. Even though I said we've worked on this for a few decades, it's early in the sense that technology is only now catching up with some of those preliminary conversations. Also, and I'm going to age myself here, I feel we're in a Betamax/VHS moment, or BlackBerry/iPhone if I want to be more recent, whatever the case may be; this goes back historically. There are always these fork-in-the-road moments. In identity, I think the key right now is to find the standards that give us a common baseline enabling the things I listed before. Honestly, I don't care about the technology itself right now. I care about whether the technologies the jurisdictions in Canada will use, and let's be honest, it's going to be multiple, will work well together, and whether they will provide the same level of security and reliability and keep our information private. I think that's the key level we have to get to.

My analogy is the payments industry: I don't care whether I have an Amex, a Mastercard, or a Visa in my wallet. The thing always works when I go to a restaurant, unless they don't accept one of them; what I mean is that the machines tend to accept all of them. I can tap a debit card, I can tap a credit card. As a user, I don't need to understand the ecosystems, and I trust that there is a standard they're all following that keeps my data and me protected to the same degree. If we can draw from that analogy, identity needs to reach the same level. I think the Pan-Canadian Trust Framework is a perfect example, not on the technical standards front, but at least on the framework, the principles, and the things that matter in the ecosystem. The CIO Strategy Council's Digital Trust and Identity standards also take it to that higher, principle level in this country. Those are two I'm well versed in, from past involvement. Then there are others on the technology side of things, W3C and others, that are still evolving, and I'm not saying any of them are perfect. The challenge we have right now is: do we wait for the standards to figure themselves out while we sit and do nothing, or do we experiment now, seeing what works best, with a willingness to pivot if something changes downstream?

I think that's the eternal debate that lives on, at least in the public sector, where it can feel wasteful to spend taxpayer dollars on experimentation; the principle seems backwards. But when you think about it, you need some of that experimentation to understand where you need to go next. I think we have to get past some of those humps. It doesn't have to be a 10-year, 100-million-dollar project; it could be a one- or two-year pilot with a small investment, and then a quick pivot at the end. It's a mentality thing: we may be waiting for perfection before starting anything, but there's a danger there too. Our citizens aren't getting any more patient; they say 'I don't like the in-person stuff, I want to go online and digital as fast as I can', and I know the next generation is like that as well. So it's an interesting balancing act.

[Mathieu Glaude] I think at the end of the day, the most successful capital allocators in the capitalist world, and it's easy to name some of them, have been able to best place their money into a mix of R&D and growth. You always need R&D going on; if you're looking to grow the economy and the prosperity of a nation, you need good investment in R&D. You named a couple of frameworks, the PCTF and the CIO Strategy Council's. I tend to look at these, at least one version of the PCTF, from a federal perspective.

Why should a citizen care about the PCTF and the work that’s happening within the CIO Strategy Council?

[Imraan Bashir] It's a good question. Do citizens care that some electrical standards body has approved the plugs in my wall? Probably not. What's important is for them to know that these conversations are happening, around some of the principles we talked about earlier, and the fact that this is not a social credit system. Well, how do you prove it? Look at some of the stuff being published on consent and privacy principles being first and top of mind. People don't have to read or memorize any of these frameworks; they just have to be aware that the conversations are happening, and that it's not, as I said earlier, 'trust us, we're good'. No, it's show and tell: here's what we're doing, and here are the principles. That's the important part of those discussions. Is my mom going to be part of any of the consultations? Probably not, but I think it's important for people just to know that they're happening, rather than hiding behind a veil of 'believe me, we know what's good for you', which is where I think the mistrust stems from.

[Mathieu Glaude] What would you say Canada is doing well, or better, that other countries could take learnings from, and what about the other side?

Are there certain areas of the world whose Digital ID programs, whether it's standards or anything else, you think Canada could learn from or do better on? So maybe both sides.

[Imraan Bashir] It's a good question. When I spent some time in Government, we were part of a group called the Digital Nations: a series of European and Asian countries, and South American as well, all leaders in the digital space. The feedback we got was that they were impressed with the work we were doing on the framework-type things, on the standards-type things, on making sure these principles were well thought out before we dove in too deeply. The challenge with our country, and I won't sugar-coat it, is that our Government system is more complex than some other countries'. We have three levels of Government; it's a Westminster model; accountabilities are different. In some countries in this Digital Nations group, everything is centralized under one authority. I'm not saying Digital ID is easy there, but it's certainly easier when the data sits with one authority or under one jurisdictional control, versus what we have, which is a bit more federated. That's not an excuse for why we're late; it just means more considerations we have to work through. So, to answer the first part of the question: a lot of good feedback on our willingness to break down some of these jurisdictional boundaries with cross-jurisdictional frameworks, and on the willingness to interoperate within the country.

The privacy-respecting principles, the consent, the revocation of consent, all the things we have been baking in, are well respected. The difference, on the other side, is that we are slower on implementation. Let's call a spade a spade: we have a couple of provinces out West that got off to good head starts, and since then it feels a bit stagnant, for the reasons we talked about, misinformation being one. Investment is another: if it's important, we dedicate the resources to it; if it's not, we don't, and I feel we're still on the fence about how much investment this needs across the country. In terms of models to follow, the one I look to the most is the European Union itself. I bring them up because they have an interoperability challenge: however many countries are in there, they work together in an interoperable scheme that, I guarantee you, does not use the same technology in each country.

If we can just take that model and consider ourselves a mini union of provinces and territories, there's no reason why we couldn't do something similar. So I think there are a lot of good lessons to learn from those implementations. Other countries are world-renowned for their regimes; Estonia, I'm sure, comes up a lot. Particular circumstances, an incident back in 2007, led to why they had to come together, and evolving from there and building it up from the ground up, they have done a fantastic job and set some really good principles for us to follow, including the notion of transparency, by the way. So I think there's a lot to learn from around the world.

Mistakes have been made around the world as well, and we alluded to that earlier, whether it's the over-collection of biometric data or the central storage of some of this biometric information, which personally, as a cyber guy, makes me cringe. So learning the good and the bad from around the world, there's no such thing as bad news, is a good approach. It's just a matter of balancing our willingness to innovate and drive a bit, rather than getting stuck in too much analysis paralysis at the beginning. That's my assessment of where we're at.

[Mathieu Glaude] There have been some brutal stories about biometrics and facial recognition, and they continue to happen. What's scary is that it's one thing if our passwords get breached; we can just change our passwords and hopefully recoup our assets. If our credit card number gets breached, we can cancel the card, and there's liability protection there. But you can't change your biometric information, like your facial characteristics, so it's a very touchy subject. What makes me uncomfortable is that it's hard to avoid: you go into certain places, maybe not in your day-to-day but throughout life, and it's there. The first time I was at an airport in the US and was required to do a facial scan, I felt a little uneasy about it. This is one aspect that gets used a lot for authentication or security, but there are so many examples of fraud, data breaches, and data simply being wrong. That's a bit of a tough area.

[Imraan Bashir] I agree, and you talked about roles for the Government; there's a role for Government right there, which is to regulate how and when this information can be collected, used, stored, and so on. We all know that pictures have been taken of us over the years in various contexts, whether walking by in a mall, at the airport, or wherever the case may be, and the lack of regulation or rules around what can be done with that information is certainly one of the reasons why I'm not a big proponent of using biometrics as the be-all and end-all. The role biometrics plays maybe needs to be discussed a bit more. It's one thing to use it as a one-time transaction, like unlocking your phone, without storing the image; it's another thing to store it, keep it for later, and upload it to a number of different places for various usages. And I certainly don't have clarity on when, where, and how my biometrics are used in all these instances. So I agree with you, and I think this is one of the areas where a lot of learning happens from around the world, whether from breaches or whatever the case.

That goes back to our other comment about information and how we can proactively communicate. Facial verification and facial recognition are two very different things, and I don’t think one-to-one versus one-to-many matching are concepts that people talk about openly enough or fully understand. Even in our own space we struggle with that, so how do we expect the average Canadian to understand it? As part of what I said earlier about pre-bunking, I feel this is really a good topic to dive into and explain: what it is, what it could be used for, what you have to watch out for, what can still be done by other players in the country, and so on.
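To make the one-to-one versus one-to-many distinction concrete, here is a deliberately simplified sketch. Real systems compare learned face embeddings with tuned thresholds; the vectors, threshold and function names below are invented purely for illustration:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (higher = more alike)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def verify(probe, enrolled, threshold=0.9):
    """One-to-one facial *verification*: is this person who they claim to be?
    Compares against a single enrolled template."""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.9):
    """One-to-many facial *recognition*: who is this person?
    Searches the probe against everyone in a gallery."""
    return [name for name, emb in gallery.items()
            if cosine_similarity(probe, emb) >= threshold]
```

The privacy implications differ sharply: verification answers a question the user initiated against one record, while recognition scans a whole population, which is why conflating the two terms muddies the public conversation.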

[Mathieu Glaude] It feels like one of those things that impacts people the most because it’s just so personal; it’s not using a fingerprint to get onto my phone, it’s really my picture being taken with no clue what’s happening to it. A few years ago, we were working with a customer that was building a product aimed at folks in the firearms industry in the US, and you can imagine – not all, but there’s a big privacy-focused movement there among hunters and gun collectors. The second we did some user testing with facial verification software, it was incredible to see the backlash that came from it. So it’s a bit of an interesting one. You said there’s been a lack of capital invested in some of the ID programs in Canada, but there are other areas in the world – the ID.me project in the US, for example – where so much money was thrown in, and then there was an incredible amount of backlash over its facial recognition technology.

[Imraan Bashir] Absolutely, and that’s just the privacy concerns; we’re not even touching on some of the inclusion and diversity concerns that come along with these things, which I don’t want to downplay either. I think the challenge with the identity space – not just in this country but around the world – is that you are dealing with security and privacy information, which obviously has to be inclusive and equitable for all to use. However, here we are creating a bunch of mini divides within a subject matter that is already divided enough. There are a whole bunch of anecdotes from around the world about darker skin colour, facial coverings or other things that impact someone’s ability to use these systems. So I agree, that topic alone can be broken down into several different subtopics, all of which go back to proactive messaging and good understanding, for sure.

[Mathieu Glaude] I think facial recognition technology has proven to have a lot of biases; it’s often wrong, and that takes away equal opportunities for people.

[Imraan Bashir] That’s right; again, going back to the word trust, all of these things undermine trust and confidence in a system, and that’s what it ultimately comes down to. If trust is undermined – whether from information or the lack thereof, or bias, or whatever the reason – you don’t get adoption. If you don’t get adoption, then what is this all for? We don’t do this just for the sake of doing it; it’s to provide a benefit, and so for me, that adoption piece is really critical.

I want to get down to the nitty-gritty of what will prevent people from adopting, and how we can address that aspect of it, because if we deliver it and have zero users, no one wins.

[Mathieu Glaude] I do think, though, that at the same time the word trust is overused by a lot of technology solution providers – ‘the trusted platform for this or that’. I feel the concept of trust is maybe misunderstood as well, and self-claims about being trusted are not going to help create trust.

[Imraan Bashir] Agreed, and that’s a really good point, because you can’t just self-assess or self-attest your trustworthiness; it has to be proven. It’s the same in real life: I can’t just tell you I’m honest and then lie to you all the time; you build trust over time. So I think the building of that trust is important, and you’re right, the word is overused by people that claim to be trusted and, as we find out later, are not. Going back to the standards conversation: having standards to help attest to that trust, or having third parties verify it, are all things that help build confidence in the ecosystem.

[Mathieu Glaude] The term zero trust is abused nowadays as well, and as a cybersecurity expert, I’m sure you see it getting abused too.

What are some misconceptions about zero trust, and how does the term get misused?

[Imraan Bashir] I want to be honest, I love the concept of zero trust. It’s really great; I just think the title is the worst part because it’s so negative. There are so many negative connotations to ‘zero trust’ – it almost makes you think, well, I should trust no one, or that you don’t trust your users. I mean, that is the fundamental principle it’s based on, but it’s not the right message to send. The real message is that we want to make sure you’re doing what you should be doing – nothing more, nothing less. So number one, I wish there was a more positive spin on the name. But it starts with the principle itself, and let’s be honest, it’s nothing new: it’s applying the oldest principle in security, the principle of least privilege. You get access to the things you should have access to, nothing more, nothing less. That’s item number one in the CISSP book; it’s the first principle you learn. What we’ve done over time is attach different buzzwords to different implementations – network access control (NAC) was a big thing when I was coming up the ranks.

Zero trust is just an evolution of that; now it’s more based on knowing who you are, knowing what device you’re coming from, and having the right access and monitoring. There are a lot of things that came with it, so I think the principle is sound. One of the misconceptions around it is that it’s something you can just buy, implement and be done with. It’s not a product, though you may see some product labels out there that say zero trust on them. To me, it’s more of a philosophy – that’s how I would characterize it. It’s a philosophy, an operating model, whatever you want to call it, built on the principle that we want to make sure the right people have the right access to the right data at the right time. The technology then enables that. People trying to treat it as a project or something you can buy have it wrong; it’s something you have to buy into and then work towards.
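The “right people, right access, right data, right time” principle can be sketched as a tiny per-request policy check. This is a hypothetical illustration only – the role names, data labels and fields are invented for the example and don’t reflect any real product’s API:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_roles: set        # roles attached to the authenticated identity
    device_trusted: bool   # e.g. managed, patched, attested device
    resource_label: str    # data classification of the target resource
    mfa_passed: bool       # whether a recent MFA challenge succeeded

# Illustrative policy table: which roles may touch which data classification.
POLICY = {
    "public":     {"any"},
    "internal":   {"employee", "contractor"},
    "classified": {"employee"},
}

def authorize(req: AccessRequest) -> bool:
    """Least-privilege check evaluated on every request; nothing is trusted
    by virtue of network location alone."""
    allowed_roles = POLICY.get(req.resource_label, set())
    role_ok = "any" in allowed_roles or bool(req.user_roles & allowed_roles)
    # Higher-sensitivity data demands a trusted device and MFA on top of role.
    if req.resource_label == "classified":
        return role_ok and req.device_trusted and req.mfa_passed
    return role_ok
```

The point of the sketch is the shape of the decision, not the specific rules: every access is decided from the identity, the device and the data’s sensitivity, which is the “philosophy, not a product” framing above.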

The second part – and this is typical security stuff – is that we make it sound so shiny, like a silver bullet that is going to solve all the problems, which also makes it sound really hard, expensive, enterprise-scale and multi-year. I wish that were gone as well, because I don’t think it needs to be as complicated as we make it out to be. Yes, it’s a multi-year program, but there are incremental things you can do to step towards it: you could start with individual identity, you could start with MFA, you could start with the device. You can move along the chain gradually, and characterizing it as the giant behemoth that some people do does it a disservice; it makes it sound more intimidating than it needs to be. Those are a couple of things that came to mind, other than the term itself.

The other thing I’ve heard is that it’s just going to be more secure and therefore harder for me as a user. Personally, I actually think it’s going to be easier as a user. People assume they’re ‘going to be bothered’ – I’m using air quotes here, sorry for those who can’t see – ‘by additional authentication prompts all the time’. The truth is you’ll be bothered only when you need to be; when the risk level is nil, you won’t be bothered by all the stuff you’re being bothered with right now. Our problem with security right now is this very one-size-fits-all approach to identity: if you’re enabling MFA, it’s for everything and all things. Data doesn’t work that way; some data is open, some is not, and some is classified. Identity rules need to be commensurate with the type of data you’re accessing, now that data resides in multiple locations – cloud, on-prem or elsewhere. A data-centric approach to accessing data is much more productive than a network-centric one. I think the difference is in the way you look at the field versus what gets sold. Sometimes the word just gets overused, and this is typical industry behaviour, to be honest.
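The “you’ll be bothered only when you need to be” idea is adaptive, risk-based authentication: the prompt is commensurate with the sensitivity of the data and the riskiness of the context. As a toy sketch, with signal names and thresholds invented purely for illustration:

```python
def required_auth(data_sensitivity: str, new_device: bool, unusual_location: bool) -> str:
    """Adaptive authentication sketch: step up the challenge only as risk rises,
    so low-risk access to open data isn't interrupted at all."""
    risk = {"open": 0, "internal": 1, "classified": 2}[data_sensitivity]
    risk += int(new_device) + int(unusual_location)   # contextual risk signals
    if risk == 0:
        return "none"            # don't bother the user at all
    if risk <= 2:
        return "password"
    return "password+mfa"        # step up only when the risk warrants it
```

A real deployment would draw on many more signals (device posture, behavioural history, session age), but the design choice is the same: replace one-size-fits-all prompting with a decision keyed to the data being accessed.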

[Mathieu Glaude] I was at a large cybersecurity conference in France a few weeks ago, and it’s just incredible the number of companies providing zero trust or cybersecurity solutions. The industry is growing like crazy; there’s a lot of money to be made in cybersecurity with these technology solution providers. I wonder, at a certain point – without fundamentally changing the architecture, and you alluded to data-driven rather than network-driven – if you’re just adding more and more solutions on top of bad architecture, how much better is it actually going to get? When we talk about self-sovereign identity, we talk about it from more of a data-driven approach. For me, the fact that your data resides with you is zero trust: I have control over what I’m doing with it, and nothing happens with it without my knowledge.

Do you think zero trust is the right way to categorize self-sovereign architecture?

[Imraan Bashir] Interesting, I never thought of it that way. I wouldn’t characterize anything as zero trust, to be honest with you, but I do think ‘zero trust enabled’ or ‘enabling a zero trust experience’, or something along those lines, works – where it’s part of a bigger series of components that lead to a zero trust ecosystem. If we all did a better job of saying what our role is in the zero trust paradigm, that might make it resonate a bit better. The more we label individual things as ‘zero trust’, the more we drive that problem of over-extending the word. It is, rather, the philosophy or philosophical view I mentioned earlier.

[Mathieu Glaude] You also mentioned these are often big implementations, but you could have logical steps or milestones you hit throughout the process. We could be talking about the public or the private sector here, but in any Digital ID conversation there seems to be way too much focus on the technical track, and I guess it differs between the public and private sector in how you would go about it. Maybe you could discuss that a little bit.

Q1: What are some of the different tracks you need to put in place to ensure all of these principles we’re talking about are respected, and that we’re doing this the right way so as not to impact privacy?

Q2: And if these other principles are being done correctly, what are some of the different tracks beyond the technical track that need to run simultaneously in the public sector, the private sector, or both?

[Imraan Bashir] I would over-simplify my answer to people, process, and technology. All three are equally important, but you’re right in that you can have the best tech in the world, but if your process – your entry and exit process in your organization – is garbage, and identities stay enabled for x number of years after someone leaves, then your technology is useless, because the process is the weakest link in that chain. Similarly, if your people aren’t trained properly on how to conduct and implement some of the processes I just mentioned, you’re in the same boat. So I always look at these things as a triangle: all three need to work together for any technology implementation to work properly. To me, the bigger risk isn’t the technology pieces. There are a lot of vendors that will sell you something tomorrow that will ‘enable zero trust’, but all of it is useless unless those other two legs of the stool, so to speak, are adequately addressed as well. That’s my bigger concern.

When I say start incrementally, it has nothing to do with technology, to be honest with you. It’s more about figuring out your current lay of the land: even if you had the best tech in the world, would it do what you want? Are you making sure people have the right access? How do you assign access today? How do you track when people are leaving, arriving or changing groups? The longer you stay in an organization, the more rights you accumulate – that’s just the nature of the beast. How are you making sure that doesn’t happen, before you start diving into some of the technological concerns? I’m a technology guy, and what I hate most about technology is that it’s always sold as the thing that solves all the problems, when really – and I’m going to generalize here, and I hate generalizing – it’s almost always the easiest part, because you can buy the thing and implement the thing. The other things – the people, the change management, the procedural updating, the business process transformation – that’s the sticky stuff that usually creates havoc in a technology implementation.

[Mathieu Glaude] So I guess, going into a Government and trying to build a Digital Identity strategy, you’re looking at a legislation track and a regulatory track. These things are significantly more important to any of these programs being successful than the technology side of things.

What is the current state of things? Maybe we could use the provincial Governments in Canada as an example with legislation.

I think they’re all on different playing fields, but what are some of the conversations being had about these processes and the people?

[Imraan Bashir] I think there are a few provincial ones and, going back to our Westminster model of Government, we have some federal ones. In fact, I think it was just last week, if I’m not mistaken, that there were announcements around bill C-27, which includes some more privacy implications as well. I don’t profess to be an expert on it – I just saw the news release – but I know the feds are working on things that will enhance at least some of our outdated national legislation that hasn’t been touched in several years. I know some of the provinces have done some work, with Quebec being the most recent, with bill 64 I believe, which is more in line with what I’d call the ‘godfather of privacy legislation’, the GDPR in Europe. The challenge we’ll see in our country is that we have provinces and the federal Government on different timelines, with maybe different levels of ambition on some of these bills. So we may inherently be creating some disparity – and it’s tough on the vendor side of things, where you now have four, five, six different pieces of legislation with different requirements you have to align to. What ends up happening, to be honest, is we just align to the highest watermark, because we know if it hits the high watermark it’s good enough for everyone else.

I think the landscape right now is just that: finding out what the ambition is in each of these different jurisdictions, and if – arguably, and I’m making this up here – Quebec’s bill 64 is the high watermark, we work toward it and we know we’re going to be good for the rest of the country as well. On the federal side, we do have to respect the jurisdictional boundaries and the respective mandates of the federal and provincial Governments, but I do think what federal legislation can do is set a baseline level for what we need to expect from all implementations in terms of privacy and security.

I saw that bill C-27 also addresses algorithmic bias, if I’m not mistaken – these are very important things that need to be addressed, just to keep those conversations going. The other thing, historically, is that legislation is not easy; you don’t just change it overnight, this stuff takes time. And it’s really important – easier said than done, I will say – that it’s future-proof, that it’s not so prescriptive that it locks us into something that doesn’t work in five years’ time. It needs to be more principle-based and future-proof, so that as technology evolves rapidly we’re not shooting ourselves in the foot and preventing something else from happening. That’s the challenge of being a regulator and legislator – they know it full well – but I think the right minds are thinking about it, and we just have to make sure we drive collective consistency across the country.

[Mathieu Glaude] To comment on shooting yourself in the foot: I feel there’s often a trend where it’s easier to pass regulations than to remove them, so you pass regulations and at a certain point you’ve created so much red tape that it’s tough to navigate, and actually innovating and creating value becomes hard because there’s so much in there. It’s a tricky balancing act, for sure.

[Imraan Bashir] There was a funny anecdote from when I was in Government – and I’ll misquote it, so don’t take this exactly – but it was something along the lines of the number of times the word fax is mentioned in legislation. When the thing was written, sure, it made sense to fax things, but in 2022 I don’t know when I last saw a fax machine, or whether people younger than me will even know what one is. That type of codification is what you have to avoid, because it limits you down the road. Figuring out the right principles you want to follow, the outcomes you want to generate and the behaviours you want to influence or encourage is what legislation needs to set out to do, and nothing more, to allow innovation to happen within those guardrails.

[Mathieu Glaude] If I could use another trifecta or triangle: I think one of the challenges is how you prioritize privacy, control, and convenience. You don’t want legislation to get in the way of convenience or innovation either, but it’s tricky. This is why it’s interesting to look at some of the programs happening elsewhere in the world, to see where the good and bad learnings could come from.

[Imraan Bashir] With any legislation – and any honest assessment – come unintended consequences as well, and as you mentioned earlier, we have to become – and I hate this word – more agile when it comes to legislative reform. We cannot wait 20 years if we see something that isn’t going well; there is an opportunity to nip these things in the bud sooner rather than later. I think that’s a stance our regulators need to take as well: being willing to pivot when the pivot is needed. Technology moves way too quickly for us to modernize these things as regularly as we want to, so we need to leave that room to pivot.

[Mathieu Glaude] I think that’s equally important. As I was preparing for this, one of the things I felt we could probably do better as a community is lower the use of buzzwords such as agile and zero trust – all these things just make me laugh.

[Imraan Bashir] That’s true – that’s why I prefaced it with ‘I hate this word’ – but I get you.

[Mathieu Glaude] We’ve talked about how we’re spending a lot of time designing these programs and making sure they don’t impact the freedoms and rights of citizens.

What are some things that we are or aren’t putting into place that citizens should know about, so that they don’t feel they’re getting pushed into something they don’t want to do?

[Imraan Bashir] The word mandatory comes up a lot; we heard it with vaccines, and when things get mandatory, people get a little antsy about it. The one thing I can say with 100% certainty about Digital Identity in Canada is that it will not be mandatory, in the sense that you will have options. We are not trying to track anyone – and again, that debunks some of the myths around tracking or whatever.

This is about empowering citizens: if you want better, faster, more secure, easier and more efficient access to services, you are welcome to use it. If you wish to go wait in line for an hour at Service Canada or wherever, fill your boots. It’s the same as the banking industry – I think my parents still do telephone banking, and all the more power to them. Personally, I do not want to do telephone banking; I’m more than happy to do it from the comfort of my desktop or phone, but I appreciate the optionality. That, I think, is a really good principle in all walks of life, to be honest with you. Specifically in Digital Identity, this is about options and about empowering you, and if you don’t feel empowered by it, don’t use it. What I think will eventually happen – just my own opinion – is that it will be framed the way credit cards were, and other things that have evolved from paper to plastic to online: when your life becomes easier, when you feel more in control, when you feel you have the transparency you need to trust it, you will use it. That’s human behaviour: to circumvent inefficiency.

Right now a lot of our inefficiency comes from waiting in line, and Covid certainly spoiled me into trying to avoid human contact – that sounds really bad, I mean the human contact of waiting in lines. I want things delivered to me for the rest of my life, and delivered when I want them: I get a movie when I want it, my food when I want it, my Uber when I want it. I do think services will go down that route, but in the way people want them to be delivered. Freedom is really important to maintain so that people have peace of mind and can go at their own pace, because when they do, I find there’s a lot more comfort, acceptance and willingness to adopt.

The other thing I want to add, and this goes back to the principle of inclusion, is that whatever is built in this country has to be inclusive – not just across gender or race, but inclusive in general. We have a digital divide in the country as it is; we take internet access for granted, and we still don’t have clean drinking water for our entire country, much less universal internet access. So we have some foundational, infrastructure-type things we need to address as a nation to make sure everyone feels included. The last thing I want is for the haves to have Digital Identity, for example, and the have-nots to feel even more like have-nots, creating a bigger divide. For me, a successful Digital Identity implementation allows some of the more underprivileged or underrepresented groups to get better, faster and more efficient access to things they may not have had before: maybe they can apply for a job faster, maybe they can get a loan or a rental property faster, and perhaps without being subject to as much discrimination. Those principles of optionality and inclusion are non-negotiable in any sort of implementation in this country.

[Mathieu Glaude] That’s a tremendous point. Just yesterday – and we’re recording this on June 30th, 2022, so on the 29th – I saw an article come out on CBC Saskatchewan, where the Saskatchewan privacy commissioner was calling for the development of an optional Digital ID. I think the words option and optionality have now been inserted into the conversation, and they need to be there, because this is not being pushed on people; it’s being put into place to make people’s lives better, with all of these privacy-preserving principles we’ve been talking about. However, if you don’t want to use it, don’t use it. If I want to keep receiving my credit card statements in the mail, I can still do that.

[Imraan Bashir] You bet, and I think that’s a very Canadian principle as well. Going back to the earlier question about where other countries are with respect to where we’re going, I think that’s a big one, and I think we do that part right. It will be one of those things that gathers snowball momentum over time, and that’s okay – none of this stuff happens overnight. I have no problem if we start off with, hypothetically, age verification and nothing more. That would be great: for those of us still fortunate enough to get carded at the liquor store, you do your age verification and you don’t have to show the piece of plastic that has your name, your address, your height, your weight, whether you wear glasses and all the other attributes that drive me nuts that we share so freely right now, because that is the way our ecosystem is built. Starting off with something small, a small win, and getting a little bit of adoption is the right way to go – again, keeping that principle of optionality so you can come on board when you want to.
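The liquor-store example is what the SSI community calls selective disclosure: answering only the yes/no question the verifier asked, rather than handing over every attribute on the card. Real SSI systems do this with cryptographic predicate proofs; the sketch below, with an entirely made-up credential, just illustrates the data-minimization idea:

```python
from datetime import date

# Hypothetical credential with many attributes, as on a physical driver's licence.
CREDENTIAL = {
    "name": "Jane Doe",
    "address": "123 Main St",
    "birth_date": date(1990, 6, 15),
    "height_cm": 170,
}

def prove_over_age(credential: dict, minimum_age: int, today: date) -> dict:
    """Selective-disclosure sketch: release only the answer to the verifier's
    question ('over 18?'), not the underlying attributes."""
    born = credential["birth_date"]
    # Subtract a year if this year's birthday hasn't happened yet.
    age = today.year - born.year - ((today.month, today.day) < (born.month, born.day))
    return {"over_minimum_age": age >= minimum_age}  # nothing else leaves the wallet
```

In a production verifiable-credential exchange, the verifier would receive a zero-knowledge proof of the predicate instead of a boolean computed client-side, but the privacy payoff is the same: name, address and birth date never leave the holder’s wallet.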

[Mathieu Glaude] I also think it’s worth focusing on the interactions you’ve described. If you’re purchasing alcohol at a liquor store, just talk about that interaction rather than talking about credentials and security; it’s an easier place to start, and it’s where the value for people is. When they need to prove something about themselves, that is an endpoint right there – where there’s someone that needs to verify, a relying party. Focusing on what that improved experience is going to look like is a better way to start than trying to explain Digital Identity or credentials to people.

[Imraan Bashir] Agreed. It’s that old-fashioned ‘what’s in it for me’ message: if you explain what’s in it for them, and it’s more convenient for them, they’ll use it. If it’s not, they won’t. Humans are very straightforward creatures that way. I never thought I’d see the day my parents would be using smartphones and logging in with their fingerprints, but what it came down to was: would you rather type a complex password with your thumbs, or use your fingerprint, which works and gets you to the thing faster? So you’re right, explaining what’s in it for them is a great way to start.

[Mathieu Glaude] Thanks for tuning in today; I hope you enjoyed the conversation as much as I did. To stay up to speed with future episodes, or to catch up on ones you may have missed, check out the SSI Orbit Podcast on your favourite podcast platform and make sure you subscribe.

If you have any questions or comments, or wish to see someone in particular on a future episode, you can find me by searching Mathieu Glaude on LinkedIn or Twitter. Feel free to reach out, and see you all next time!
