Inspiring Tech Leaders - Privacy Enhancing Technologies with Dr. Ellison Anne Williams, CEO of Enveil
[Music] Welcome to the Inspiring Tech Leaders podcast with me, Dave Roberts. Today, I have the pleasure of talking with Dr. Ellison Anne Williams, who is the CEO of Enveil, a pioneering privacy enhancing technology company. What a pleasure to have you here today, Ellison Anne.
Thank you for having me, it's a pleasure to be here. Great, so let's start off with learning a little bit more about you and how you got to where you are today. And, you know, was this the career path that you always wanted to take from the start? It is the place I always wanted to end up, but not the pathway I ever imagined, which is probably a pretty normal story. But today, I founded and I run a privacy enhancing technology company. And of course, we focus on protecting the usage of data, which really can change the entire paradigm around how and where any type of organisation can leverage data. So,
that was the end game and the end goal, for as long as I can remember: starting and running my own companies. But it was certainly a very windy and unexpected journey to get here. Now, part of that journey included working at the National Security Agency, where I believe you worked on homomorphic encryption. What exactly is homomorphic encryption? Oh my goodness, Dave, I'm proud of you for being able to pronounce it. That's awesome. Yeah, so homomorphic encryption is one of the key pillars of that family of technologies that I mentioned, privacy enhancing technologies.
So what homomorphic encryption allows you to do is perform computations in the encrypted domain, or as we would say in the technical world, the ciphertext space, as if they were in the unencrypted world, the plaintext space, or the real world, as I like to call it. So you may be thinking, all right, performing encrypted computations as if they're unencrypted. That sounds interesting, but why does it matter? To give you a more concrete example, suppose I am a global bank, and I am banking an entity in Singapore that I'm watching for questionable activity, maybe relating to financial crime or fraud. And I want to know, is this entity that I'm banking in Singapore being banked anywhere else within my own bank? And if so, are they also watching them for suspicious activity? That sounds like a very reasonable and simple thing to want to answer, but it turns out to be incredibly difficult to get the answer to that question in practice, just due to the heterogeneous regulatory and data residency landscape that the bank is operating in. Now, what privacy enhancing technologies and homomorphic encryption come in to do is uniquely enable that question to be answered very cleanly and very simply, in that they can take the question, are you banking this Singaporean entity, encrypt it homomorphically in the form of an encrypted search in Singapore, and send that
encrypted search out to the other operating jurisdictions of the bank, so out to the UK, Turkey, Germany, Switzerland, etc. It can be processed out there without ever being decrypted. None of that Singaporean information is ever revealed outside of Singapore. An encrypted result is produced and comes back to Singapore, where it can be decrypted. Then, in a matter of seconds,
you can see: wait a minute, this Singaporean entity that I'm banking in Singapore, that I'm watching for questionable activity, that I had no idea was being banked anywhere else in my own bank, is actually being banked in the UK, and perhaps they're watching them for questionable activity too. I need to check that out a little bit further and go about my workflow. And that's all possible because of homomorphic encryption. So, the real transformative "so what" around privacy enhancing technologies, or homomorphic encryption, is that it allows any type of organisation, in this case financial services, to securely and privately use data, where it is and as it is, across boundaries and across silos, in ways that were never possible before. You got a little taste of that in the financial services example, where those boundaries and silos were driven by jurisdictions of the same global organisation, but it allowed insights to be drawn from data in a secure and private way that would never have been possible otherwise. So that's a little bit about what homomorphic encryption is. Well, that is amazing. And how did you develop your knowledge around this topic? Where did that come from? I happen to be a mathematician by training, which, you know, occasionally comes in handy.
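The property Dr. Williams describes, computing on ciphertexts as if they were plaintexts, can be sketched with a toy example. The snippet below uses unpadded "textbook" RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields the encryption of the product of the plaintexts. This is an illustration of the homomorphic principle only, not the scheme discussed in the interview, and the tiny made-up key sizes are insecure by design for readability.

```python
# Toy demonstration of a homomorphic property using unpadded "textbook" RSA,
# which is multiplicatively homomorphic: E(a) * E(b) = E(a * b) mod n.
# Illustrative only; production PETs use lattice-based schemes (BGV, CKKS, ...)
# and vastly larger parameters. All values here are hypothetical.

p, q = 61, 53            # tiny toy primes
n = p * q                # RSA modulus (3233)
phi = (p - 1) * (q - 1)  # Euler's totient of n
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent (Python 3.8+ modular inverse)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
# Multiply the two ciphertexts: the work happens entirely in the encrypted domain.
product_of_ciphertexts = (encrypt(a) * encrypt(b)) % n
# Only the key holder, on decryption, sees the result of the computation.
assert decrypt(product_of_ciphertexts) == a * b  # 42
```

The point of the sketch is the shape of the workflow: a party without the private key performed a meaningful computation, and the plaintext answer surfaced only where decryption was permitted.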
So that was just the very roundabout pathway that I took to get to this company, which really wanted to change that paradigm. But math was always something that I thought was very interesting. I was very young going through school, so I just went on and got the PhD in math and then went from there. That's an incredible journey. So, what really inspired you to start Enveil, and what are the other areas that you've been focused on? For me, like I said before, running and starting my own companies is something that I had always dreamed of for as long as I can possibly remember. So,
you know, not many people get to say they live their dream on a daily basis. I'm very fortunate and very blessed to be able to do that. But then, why this company? Why privacy enhancing technologies? Why go on this really hard journey of creating a new market around this family of technologies? Because they just hadn't existed in a computationally practical way. So, you have to go back a little bit. When I finished the PhD in math, I got a knock on the door from the US intelligence community, in particular the National Security Agency. And basically, they said, we can't tell you what we do, but we do cool stuff. And I thought, oh, wow, I'm going to go do cool stuff. I was pretty young, and I went there.
And yes, I got to see and do some amazing things. I ended up staying there longer than I ever imagined. I thought I'd be there for a few years; I ended up being there 12. I was given incredible opportunities to see and do things that I could never see and do anywhere else. And one of those things happened to be an application of privacy enhancing technologies, or homomorphic encryption, in that we were faced with a mission problem: how do we use data that exists out in the world, that we have legal access to, in a way that respects our interest and intent in that data? That problem framing turns out to be a perfect use case for that special type of encryption called homomorphic encryption.
The problem was that at that time, and this was 10-ish years ago, it was possible to do homomorphic encryption, but it was not computationally practical. So, taking a search that originated in Singapore, like I described before, encrypting it, sending it across the globe, having it processed, and getting it back would take days. Now remember, in the example that I just gave you, it takes seconds. But back then, it took days. Why? Just because of where the math was at that point. And so, like I said before, sometimes being a mathematician, having those skills, is actually handy. We were able to step back, take a look at that problem and say, well, wait a minute, if we just approach it a little bit differently, if we flip it upside down, we could do this kind of computation in a way that became practical for the first time ever. And taking a look at that, I saw, wait a minute, these technologies really do have the power to change the entire paradigm around how and where any type of organisation can use data to extract insight and unlock value very uniquely.
However, because the technology itself had never been computationally practical, the market didn't exist. So, I wanted to take these technologies, now that they were available, and go create a market around privacy enhancing technologies with the goal of changing that paradigm. That's the premise upon which the company, Enveil, was founded about eight years ago, in 2016. And we've been on that new market creation journey since then. It's very exciting. We
now see that market coming into form, even though it's very much emerging. But that's, you know, why this, why privacy enhancing technologies, why now. That's really interesting. So, what other challenges did you experience when you moved from the National Security Agency to becoming a CEO and founder of a new business? I'm sure, like many entrepreneurs, it wasn't an easy and straightforward path, so I'd love to hear the insights that you've got in that area. No, it's not easy or straightforward at all. I like to tell people you're going to hear no exponentially more than you will hear yes, and you're going to be kicked in the head way more than you're ever going to get a high five. And you just keep going because you believe in what you're doing,
and that it should exist in the world and that it can really make a huge difference in that space. So, I think the entrepreneurial journey is inherently tough, and it's consistently hard because it's hard, right, which sounds a little trite, but it is the case. I think with creating a new market, it's extra hard. So, startups, and we're a VC-backed startup and a tech company, come in two forms.
One is the build-a-better-widget kind of company, where the technology exists and people understand what the widget is. You just need to convince them why yours is 10,000 times better than anything they have seen before in order for them to buy and for you to gain market traction. A new market is completely different. People don't understand what the widget is. They don't even know that thing is called a widget. You have to name it. You have to educate them on how this can help them solve their problems in ways that they could never have dreamed of before. You have to do that educational piece of it, and then they can adopt the capability.
So, it's very much the Henry Ford principle of horses and cars. People have always had the problem, from the beginning of time, of getting from point A to point B. And the way they did it at the time he came up with the automobile was to get on a horse and go from A to B.
And that was the best thing you had. So, if you had asked people, hey, what do we need for better transportation, how can you better get from point A to point B, the overwhelming majority would say, well, just give me a faster horse, right? And that's the give-me-a-better-widget kind of situation, whereas he was trying to introduce an entirely new thing to them, called an automobile, a car, that they had no concept of. And there was certainly a tremendous amount of resistance to that. So, building and creating a market is hard because of this educational piece. It's not that the problems haven't existed. It's that nobody has even thought about how you would solve them in this way.
So, there's just a tremendous amount of education, which is challenging. It's also why it's incredibly rewarding when you see people go, oh, wait a minute, I get it. That's awesome. Yes, let's go on that journey together. I believe in the power of these capabilities and what they can deliver uniquely to me, for my business, for my mission. That makes it all worth it at the end of the day. You mentioned earlier being VC-backed. So how did you go about securing the funding
for your business? And what advice would you give to other entrepreneurs on this journey? From a securing-funding perspective, clearly you've got to be able to paint the picture of how what you're trying to do really does shift a paradigm, which will result in a lot of value, right? Because VCs invest to get a return on that investment. And what I would say to other entrepreneurs is to remember that first principle: you're going to hear no way more than you're going to hear yes, exponentially more, not just a little bit more, and you're going to be kicked in the head way more than you're given a high five. And that principle is not only true when trying to get people to adopt the capabilities; it's going to be true in getting people to fund the capabilities. So, from a VC perspective and raising money, that's the gig. You just have to keep going, you have to be persistent, you have to find the people who also believe in that vision, who see the eventual market, and who also have the patience for new market creation. The cycle of a company that wants to go capture a piece of an existing market with a capability that's 10,000 times better than what's out there today, and the cycle of a company that's creating a new market in the ways that I
just described, are very different. Those new market creation motions take time. There's a lot of education that takes time to occur. And so that's going to require much more patient capital than if you had an existing-market, better-widget kind of company. So, look for patient capital. Great advice. So, what about data compliance and the landscape evolving around that? What role do you think government needs to play there? So, we're seeing some interesting things happen with privacy enhancing technologies and government globally in two ways. One is, and this is part of the kind
of new market education piece of it, we are now seeing, as one of the data points of the market starting to come together and emerge, government entities like the FCA, the Financial Conduct Authority in the UK, for example, or the ICO, the Information Commissioner's Office in the UK, give encouragement, I will say, from a regulatory standpoint about the adoption of privacy enhancing technologies. We saw this starting back in 2019, when the FCA put on a TechSprint around privacy enhancing technologies, because they saw the unique potential of these technologies to gain insights to fight things like financial crime and money laundering, and to address know-your-customer aspects. And then folks like the ICO, who look at policy around personal data, say, wait a minute, you could use data in this protected way. And one of
the ways you're going to be able to protect data, or one of the aspects of it, is going to be personal data, for example. So, it has that privacy component to it. So, you see them now writing different kinds of guidelines and guidance around privacy enhancing technologies. The other way it's manifested, in more recent, buzzier terms, is around secure AI. Privacy enhancing technologies have a really unique capability to protect models, machine learning models, which are the unit of work in AI.
And because of that, you're seeing it written in globally, either recommended as best practice or mandated, for example by the White House executive order on safe, secure, and trustworthy AI, around these model-centric security properties. So, what does that mean? Thank you to ChatGPT for raising global awareness, to a level we've never seen before, of a fact that's always been true: machine learning models encode the data over which they're trained. So, if that data has any sensitivity to it whatsoever, then using the model across a boundary or silo is essentially the same thing as taking the sensitive training data for that model and moving it across that boundary or silo. That's a huge problem in so many contexts, in every single vertical. So even if we just go back to that simple example of the global bank,
and instead of executing a search with Singaporean data, suppose that I've trained a model over data in Singapore. Now I want to use that model, run it in the UK, to get some insights that should be uniquely available to me, maybe another financial crime or fraud detection model going from Singapore to the UK. The problem is that if I've trained that model in Singapore over Singaporean data, then sending that model from Singapore over to the UK is essentially the same thing as sending the training data for that model out to the UK. Clearly, that's a huge,
huge problem from a regulatory and data residency perspective. However, if you encrypt the model with privacy enhancing technologies, I can send that encrypted model, just like the encrypted search, from Singapore out to the UK. It can be processed out in the UK without ever being decrypted. If I don't decrypt the model,
then I never have an opportunity to see it. If I can't see it, I can't pull any of that sensitive training data out of the model. I also cannot see what the outputs and decisions of the model are, which cuts off a whole class of other adversarial machine learning attacks that exploit a lack of protection around the model itself. So,
you can get the encrypted results from your model back in Singapore and decrypt them there, and that secures those machine learning workflows. You can do the same thing with training a model. I can now train my model from Singapore across all the other operating jurisdictions of the bank in a completely encrypted, completely secure and private way. And because the model is never decrypted during training, no one can do any of those things I talked about, pulling sensitive data out of the model as it learns and encodes data from all those training jurisdictions. So these are very, very powerful capabilities relating to model-centric security, with privacy enhancing technologies uniquely enabling that key component of secure AI. And that's, of course, like I said, being echoed globally. We're seeing a lot of movement and momentum around regulators and compliance entities giving guidance, or in some cases mandating, as in the White House AI executive order, the use of privacy enhancing technologies for these key capabilities around protecting the usage of data: searching, running analytics, and running or training machine learning models.
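The encrypted-evaluation workflow described above can be sketched with a toy additively homomorphic scheme. The snippet below implements a miniature Paillier cryptosystem, where multiplying ciphertexts adds the underlying plaintexts and raising a ciphertext to a power scales its plaintext, and uses it to score an encrypted feature vector against plaintext model weights without ever decrypting the features. All parameters, features, and weights are hypothetical, and for simplicity the features rather than the model are the encrypted side here; this is not Enveil's actual scheme.

```python
import random
from math import gcd

# Toy Paillier cryptosystem (additively homomorphic): E(a)*E(b) = E(a+b) and
# E(a)^k = E(k*a), all mod n^2. Tiny primes for illustration only; real
# deployments use moduli of 2048+ bits.
p, q = 61, 53
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
g = n + 1                                     # standard choice of generator

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)           # Python 3.8+ modular inverse

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:                     # r must be a unit mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Encrypted linear scoring: the data owner encrypts features x; the model
# holder computes E(w . x) from ciphertexts alone, never seeing x.
x = [3, 5, 2]                    # private features (hypothetical values)
w = [2, 1, 4]                    # plaintext model weights (hypothetical)
enc_x = [encrypt(v) for v in x]
enc_score = 1
for c, wi in zip(enc_x, w):
    enc_score = (enc_score * pow(c, wi, n2)) % n2  # adds wi * xi under encryption
assert decrypt(enc_score) == sum(wi * xi for wi, xi in zip(w, x))  # 19
```

In the transcript's scenario the roles are reversed, the model itself travels in encrypted form, but the underlying mechanism is the same: arithmetic is carried out directly on ciphertexts, and only the key holder, back in the originating jurisdiction, can decrypt the result.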
It really is a game changer. So, what about emerging technologies? What are you most excited about, and how do you see them shaping the future? Well, clearly, we're on the edge of emerging technologies ourselves, in terms of privacy enhancing technologies, building the market there and around homomorphic encryption. I think the way privacy enhancing technologies impact all the ways you can use data is very exciting. The buzziest form of that right now, as I talked about, is AI and machine learning.
So, I think the intersection points between privacy enhancing technologies and AI/ML are going to be incredibly exciting and impactful moving forward over the next few years. And what about your vision for Enveil over the next five to ten years? How do you see the company progressing and evolving over that period of time? For us, like I mentioned, we've been on this journey to change the paradigm around how data is used and to create that market. So, five or ten years from now, the goal is that privacy enhancing technologies are simply the way data gets used across boundaries and silos; they become the fabric of how you solve this problem, of how you extract insights.
And so, the education level goes way down, the adoption level goes way up. And then, of course, the market is created, and we've succeeded in changing that paradigm. So that's the goal, and that's what we're driving toward over the next years of the company. And finally, how do you maintain a work-life balance while running such a successful tech company? That is a great question. I think balance probably isn't the right word. I think eventual equilibrium is more accurate. There are ebbs and flows, right? I mean, if you run a startup and you're creating a new market, that is a 24-7 kind of endeavour, whether you're actively or passively working on it, as I like to put it. So, I think it's a tricky thing,
but it's important to take care of yourself and pay attention to that, because if you burn yourself out, you're no good to anybody. You definitely aren't going to create a market, and you're not going to be in it for the long haul. So, I think starting and running a startup from start to finish is a marathon with sprints interspersed, not an all-out sprint all the time, because then you won't make it to the end; you'll fall over.
So, I think keeping that in mind is an important thing for other entrepreneurs out there to do. And then, whatever works for you to make sure that you're sprinting and then running, and then sprinting and then running, do it. But running a startup from start to finish is not a case of, I'm going to run around the block and then sit down. It is a marathon all the way, start to end. You will run the whole way; you just can't sprint the whole way. Great advice. Well, thank you so much for taking the time to talk to me today,
Ellison Anne. It's been a pleasure to learn about your entrepreneurial journey and the fascinating technologies that you've been involved in. So, thank you once again for being part of the Inspiring Tech Leaders podcast. Yeah, thanks for having me, Dave. I really appreciate it.
Please remember to subscribe to the podcast and stay tuned for more inspiring tech leaders. [Music]
2024-11-30 09:10