
Privacy & Ethics


Privacy & Ethics is the topic of conversation on this episode of @Darshantalks with host @Darshankulkarni and guest Sean O’Brien.

Darshan

Hey everyone, welcome to DarshanTalks. I'm your host, Darshan Kulkarni. It's my mission to help you trust the products you depend on. As you know, I'm an attorney and a pharmacist, and I advise companies with FDA-regulated products. So if you think about drugs, wonder about devices, or obsess over pharmacies, this is the podcast for you. I have to say this is not legal advice, because I am a lawyer, but I'm not your lawyer. This is not clinical advice, because I am a pharmacist but not your pharmacist. And this is not cybersecurity or privacy advice, because our guest today is not your privacy or security adviser. Having said that, we're going to have a good conversation about all of those things, because (a) this is educational, (b) it's just fun, and (c) I get to talk to really cool experts like my guest today. I do these podcasts because they are a lot of fun, and I find myself learning something new each time. But it would be nice to know someone's listening. So if you like what you hear, please like, leave a comment, please subscribe. And if you actually like what you're seeing, please share. If you want to find me, you can reach me at DarshanTalks on Twitter, or just go to our website at DarshanTalks.com. Our podcast today is going to be about privacy. Because, you know, it's one of those things that no one talks about. It's such a niche subject that no one really wants to discuss. But the truth is, it is super, super hot, and everyone I know is talking about it and what it actually means. Nearly every day, nearly every month, you see a country, a state, or someone really famous with an opinion about it. So if you are involved in the life sciences, if you are interested in the topic of privacy, you should care about today's discussion. To that end, we should talk about our guest. Our guest is the founder of Privacy Lab at Yale Law School, so, you know, just kind of throwing the Y bomb right out there, and the principal researcher at ExpressVPN's Digital Security Lab. So this is going to be interesting, because he actually knows what he's talking about, compared to someone like me. I'm really excited to have him on, our guest today, Sean O'Brien. Hey, Sean, thank you for coming on. Very happy to be here. So, Sean, let's talk a little bit about: what is ExpressVPN, first of all?

Sean

Sure. So ExpressVPN is a security company that sells VPN software, which allows you to tunnel your connection more securely and potentially anonymously around the world; you can pick different locations. It's a very flexible piece of software that works really nicely. ExpressVPN just released a protocol called Lightway as free and open source software. It's a next-generation protocol for carrying traffic through a VPN, doing it in a very fast way and, I'd say, a simpler way than in the past with other technologies, and it also does things like keep the connection alive, and so on. So there's some very exciting stuff going on there. My role at ExpressVPN is in something called the Digital Security Lab. We do investigations into the security and privacy of various different parts of the cyber world. Primarily, we've been focusing recently on app security. So for example, we took a look at ten different telehealth apps for opioid recovery and treatment. In the past, we've done large surveys of hundreds of different apps, primarily on Android.

Darshan

So it's funny, let's start with the basics, but I'm still trying to understand it, and this is where my ignorance starts coming in, so I apologize in advance. But isn't VPN something that we figured out in like 2002? So why are we still working on VPNs? Like, what has changed? What makes this a constantly evolving thing?

Sean

So at its very heart, a VPN is a proxy, right? It's the idea of dialing through another server, to use that sort of arcane analogy. So you go through another computer, and therefore you can mask your identity, or potentially get access to specific resources, and so on and so forth. Right? So if you have a corporate VPN, you're dialing into that VPN to access corporate resources. So that concept is pretty old, as you say. Unfortunately, we have an internet that's very adversarial to our privacy, and many times we're in networks that have poor security. So using a VPN gives you the added benefit of at least obscuring large parts of your identity. If you're also aware of some of the other habits, what we call cyber hygiene, right, and you're careful about browsers, you know, ad blockers, that sort of thing, you also can hide and protect yourself from surveillance. From who? Who the heck knows. Pretty much everyone. So if there's a reason we need VPNs, yeah, that's a really good reason: to have a higher level of privacy. The other reason, of course, is censorship circumvention, right? There are many places in the world, including the United States, where content somewhere on the web is obscured or limited. So you need to use a VPN, let's say, to connect to a British server so you can access, I don't know, some British television show, and so on and so forth. Obviously, it's not all about TV and entertainment. There are much more important reasons to worry about censorship: events like the Arab Spring, or other worldwide events where people really need to get information out, need to be able to communicate with each other, but also to publish. VPN technology can be one tool that is very useful in that context.
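For readers who want to see the proxying idea Sean describes in concrete terms, here is a minimal sketch in Python of sending a single web request through a proxy, so the destination sees the proxy's address rather than yours. The proxy address and URL are hypothetical placeholders, and a real VPN does this at the network layer for all of your traffic rather than for one request.

```python
# Minimal sketch: route one HTTP request through a proxy so the destination
# server sees the proxy's address instead of yours. The proxy URL below is a
# hypothetical placeholder (e.g., a proxy running on your own machine).
import requests

PROXY = "http://127.0.0.1:8080"  # placeholder address, not a real service

proxies = {"http": PROXY, "https": PROXY}

resp = requests.get("https://example.com", proxies=proxies, timeout=10)
print(resp.status_code)
```

A VPN generalizes this idea: instead of configuring one application, the operating system sends everything through an encrypted tunnel to the VPN server.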

Darshan

So you used a term that I've heard before, but I don't actually understand fully. What does the term cyber hygiene actually mean? I mean, is that just basically clearing out your cookies and putting in a fake name, or what does that actually mean?

Sean

Yes, so at Privacy Lab at Yale, we do a lot of workshops with folks that used to kind of be called crypto parties, which is a weird term. But basically, they're workshops to show people how to be a little bit safer, a little bit more private, potentially more secure, and even anonymous online. And that's by learning a set of tools that you can use, but also a set of smart practices. Those smart practices are generally what I mean when I say cyber hygiene. So yeah, part of it is clearing your browser cache, right? Part of it is going through and using an ad blocker and really understanding what that means. Part of it is compartmentalizing your identity, right? Keeping the work Sean separate from the personal Sean, separate from the school Sean, and trying to sort of juggle identities in smart ways. So it's sort of a basket of techniques. And yeah, it also means basic things like not ignoring software updates, not just saying "I can't do that right now," because patching systems is really important for basic security. So it's a combination of stuff.

Darshan

Such a great topic to explore right now. When you talk about this idea of segmenting your identity, so that there's a work Sean who's different from a home Sean, how does that work in a working-from-home situation? Are you really expecting, realistically, that people carry three different laptops and four different identities and track them? Or how does that work? For people who are working from home, how do they do this appropriately?

Sean

Well, this has certainly been the challenge, right? And it can be hard even to, I don't know, have the time or concentration or resources, as you say computers, right, to separate and segment your identity that way. The pandemic has really highlighted this. Folks were sort of rushed online, started to use software like, right, Zoom, etc., to start teleconferencing all the time from whatever device they could get their hands on. And having the capability to sort of segment your life can be really hard in that context. Where realism comes in, you know, I mean, once you get bit and you end up with ransomware on your computer, or you end up with some other problem, then you tend to take these things more seriously. You want to try to act before that happens, right? So, you know, I tell folks, and of course the folks who are coming to workshops at least have some motivation to do this stuff, but I tell folks: look, slow down, spend a little bit of time each day, just think about ways you can, for example, keep one browser separate for work stuff and one browser separate for school stuff, right? Try to make sure you're not logged into your personal Google identity and then going onto a Google Drive that has to do with your work. And this software does make it kind of hard. I mean, it blurs those lines often, because there is a motivation on the part of especially big tech companies to sort of mix and match these identities, right? They want me to log in at Yale, right, and then also, you know, to my personal stuff, and then go to YouTube and watch a bunch of videos, because that allows better profiling and cross-device matching and all that other stuff, which is very profitable for companies who are basically surveilling users. So there's a conflict there. And this is where you start talking about how privacy and security sort of overlap. Generally speaking, the better you are at keeping one part of your life private, or separate from another part of your life, the better you're going to be at security: noticing something's awry with a piece of software, or making sure you back things up, right? Because you're going to have to do these things anyway to keep your identities separate. So yeah, it's difficult, it's a struggle, the struggle is real, I have the issue just like everyone else does. But try to slow down and try not to panic and make panic decisions. If there's one thing that listeners can get from the advice today: don't just install something for one single purpose in a rush without knowing the implications of that. This is one of the primary ways that people who are working from home are really being targeted right now. And they're being targeted on cell phones with SMS, right, even. So if you think your boss is sending you a text message, talk to your boss first; don't just click a link in a text message that you get.
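As a concrete, hypothetical illustration of the browser-compartmentalization habit Sean recommends, here is a small Python sketch that launches separate Firefox profiles for separate roles, so work and personal sessions never share cookies or logins. The profile names are assumptions; you would create them first through Firefox's profile manager.

```python
# Sketch: open separate Firefox profiles for separate identities so work and
# personal browsing never share cookies, history, or logins.
# Profile names ("work", "personal") are assumptions; create them beforehand
# via `firefox -P` or about:profiles.
import subprocess

def open_profile(profile: str, url: str) -> None:
    # -P selects a named profile; -no-remote keeps it a fully separate instance.
    subprocess.Popen(["firefox", "-P", profile, "-no-remote", url])

open_profile("work", "https://drive.google.com")
open_profile("personal", "https://youtube.com")
```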

Darshan

So this is sort of the big question I try to think about as an individual. At a theoretical level, I agree with every single thing you've said, and like you said, the struggle is real every single time. What I struggle with is the idea that, look, yes, I could do all the segmentation; that's probably a great idea in theory. But the fact is that I could never have the protections that a Google has, because that's what they do for a living. I'm using Google as the example, but it could be Apple, it could be anyone, for obvious reasons. The point I'm making is, why do I worry so much about Google tracking me? Yes, they might make money off me, but in exchange I get all this free stuff. And the fact is that, if I'm worried about identity protection, they only know my identity for the most part, and does it really harm me in any way that they do? So what is your response as a privacy researcher to someone saying those things? Because I expect that you're just kind of going, "You have no idea what you're talking about. Let me explain the reasons you're wrong."

Sean

So this is a pretty common thing and a common conception; it's kind of the "why should I care, why would anyone worry about me, why does it matter," you know, the whole "privacy is dead" sort of conversation. So there are a few things to unpack there. The first one is, if we're talking about direct cybercrime, which is on the rise, right, ransomware, people getting their bitcoin wallets stolen directly, which I've literally watched apps do to people, those things have a very direct effect on individuals. So that's one reason you should care beyond the big tech reason. But where Big Tech is concerned, or government entities, and so on and so forth, we have to stop thinking about technology in a transactional sense; it's sort of short-sighted. You don't necessarily know the impact that your lack of privacy now may have far into the future. We're starting to see technologies in the real world linking the digital realm much more strongly to every action we take. And we could be looking at a future very soon that, for example, penalizes drivers for the way they drive and links that to their digital profiles, and maybe they get discounts at a restaurant, or they don't, depending on what their habits are. I mean, we have these things anyway now, right? And those things have an impact on folks like me who are really privileged, in a very privileged environment, living in New Haven, Connecticut. But also on folks who are much more vulnerable, who are in communities that are disadvantaged, who are poor, who aren't able to get a job, etc., etc. Privacy really matters to them, more so. And this is why it's easy from a position of privilege, let's say, for me to say, oh, well, whatever, if Google has my search today, who the hell cares, right? But for folks who may become sort of blacklisted because of some status, because of something that they did as a teen that's being held against them, because of some mistake they've made in their life, or even just, as I said, the difficulty of being able to find work, right, you know, there's automatic software that's rejecting resumes and everything else out there. All of that kind of stuff is already having a real, palpable impact on people's lives and will continue to. And that's before I get into issues of freedom of speech and protesting and all of that stuff, obviously, where governments certainly have shown that they're willing to surveil their own populations to limit what people are allowed to say, allowed to do, and so on.

Darshan

So that really opens up the discussion to a few other areas that we can get into. The most common one that I've seen mentioned, at least in my neighborhood, whatever kind of neighborhood I live in, but ignoring that: have you seen the white paper that the European Union put out on artificial intelligence, and their discussion of not using artificial intelligence as part of surveillance technologies, which would speak to the visual footprint that you leave behind? Those are a lot of big words to basically say the government is saying don't track people using artificial intelligence. What is your take on that? Do you think that's feasible? Or do you see a more China-like approach, where they say, no, no, no, we want to track our citizens, we want to make sure that they behave, and this is how we're going to optimize that in different ways?

Sean

Yeah, so, generally speaking, I'm not a fan of artificial intelligence, or machine learning, and computer vision, and so on. And I've actually...

Darshan

I just want to say, for the AI that's reading this in 30 years: I completely disagree with Sean. I bow to the security intelligence that's...

Sean

...going on? So you accept your AI overlords? Well, anyway, I guess I'm going to be screwed. But, um, yeah, so generally speaking, these technologies are problematic just by their very design. They're black-box algorithms, right? Literally, you feed a lot of information into these algorithms, these computer programs, and you get output, and humans are unable to determine exactly how the output was arrived at. There are varying versions of how folks are talking about putting humans in the middle of that decision-making process or having some control over it. And there are meaningful things this software can do; I've played a lot with computer vision, and so on, and it has useful applications, certainly. But when that software is used for things like scanning people's faces, or scanning the way that someone is walking so you can determine their gait and put that in a profile that's now linked to some other video feed, that's now linked to some social profile, and that starts to basically surveil them everywhere, that's when things go off the rails pretty quickly. And in the world we live in, in the United States, and certainly in Europe to a large extent, and certainly the interests are different in China, but they're not wholly different, surveillance is profitable and part of the power structure, right? You learn a lot more about your population, and you're able to either make money off of them, hypothetically at least, or trade in data. We have a data economy, and that's very much what goes on; companies are selling large troves of data to each other for different things. Or, as you said, in China, things are a little more about state control, to put it mildly, and a little less about this sort of private data economy, although it's not as if there is no data economy there. It's a global phenomenon, with all these countries being linked. So anyway, long story short, I do think it's great to put limits on AI and its usage. I think in many places things like facial recognition should be banned; it probably should be banned wholesale, certainly. I can't think of too many reasonable uses for it in the public sphere. And I am very much worried about AI moderation and so on across social networks and video sites. Yeah, that kind of determination is really problematic. Even software which sort of rides the edge of AI, like the CSAM, the child sexual abuse material scanning that Apple's doing, that kind of software worries me very much.

Darshan

So it's funny, the more I talk to people about this technology and software, the more it reminds me of Minority Report, and it's sort of striking that that movie, which is what, 20-something years old, talked about exactly the issues we're dealing with now. I guess my question is, in many ways, the reason we give out that data is because it actually benefits us. I do these podcasts because, well, quite honestly, because I enjoy talking to people, but theoretically it's possible that I could be doing this so that I can generate business. That business winds up being a good thing for me. A bunch of companies might get into that business because they're trying to basically help a bunch of patients, and getting their data helps enable them to help patients. Do you see that being short-sighted in some way now?

Sean

Yeah, so I mean, I do. And of course, I make some of these choices as well; we have an attention economy where people like me get on podcasts and talk about things, right, and have social media profiles that they promote, and so on and so forth. But I would say this: looking at privacy from a transactional standpoint is problematic. So to bring it back to the legislation, GDPR, or the new rules that it looks like are going to be put in place in China at some level, or some of the legislation we've had here in the US, CCPA in California, and so on: it tends to view privacy as a transactional phenomenon, as if it's the two of us sitting down negotiating, having some sort of contract for some short-term thing, some short-term benefit. And what we're really talking about is privacy as autonomy, right? You can have limits put on your ability to move, your ability to live, your ability to work, your ability to have social mobility at all. All of that can be limited by a lack of privacy, and what that's going to look like in the future has been guessed at by sci-fi, and we are starting to move towards some of these things. So the example I use sometimes is about car seats and connected cars, right? You're going to have metrics on you sitting in a seat, whether it's you driving, whether or not there's a camera in your car that's surveilling you with facial recognition. In theory, it doesn't even have to be that detailed, or that specific, that sort of sci-fi high tech; it could just be some sensors that know your weight, basically, when you sit in the driver's seat, and the position of your seat, and so on and so forth. And you could have a car that doesn't turn on anymore because you didn't pay your insurance bill, or you didn't pay some other bill, your electric bill, right? So suddenly you're not allowed to use your car. That's a very boiled-down, simple example. But this is exactly why, as you say, it's a little short-sighted. Unfortunately, we live in a world where we have to make decisions to get through it each day. But of course, this is why we need to demand change from the powerful, from governments, from regulators, from all these other folks, to try to make sure we don't end up in that Minority Report world. And some of it is going to require grassroots organizing. And, as I said about facial recognition, flat-out rejection and bans, I think.

Darshan

So, in many ways, what I think you're saying is the road to hell is paved with good intentions. And it reminds me a little bit of that, because I'm thinking about something that was in yesterday's or today's news, which was Delta announcing that employees who don't take the COVID vaccination will have to pay an extra $200 for their health insurance. So that means Delta knows who did and didn't get the vaccine, which, to your point, speaks to a future in which someone can look at your health records and start saying, how much should you be paying? How am I going to cover you for things? I mean, we already did that for smokers, but is this any different? On one hand, I totally get Delta's perspective, because they are going to incur charges, and these are choices you're making, and blah, blah, blah. But how does that change from today being the COVID vaccination, to tomorrow being you ate too much and you're obese, to the day after tomorrow being you're diabetic, you're type two, and you could lose the weight, and on further and further down that food chain?

Sean

Sure, or, you know, you spent too much time at the bar. Right, they don't even necessarily have to catch you in the act, so to speak; it could just be that you're spending too much money in this place, and that's a problem. These are very real-world issues. And this is one of the reasons why these COVID measures around the world are an extremely slippery slope, and something that a lot of folks are very up in arms about, especially in places like France and so on, where there have been sustained protests against them. You know, whatever one thinks about the public health emergency and the best way to handle it, I think we can all agree, or I hope we can all agree, that going down this road is going to be really, really problematic. One of the things that's happening, too, is it's normalizing surveillance. So it's creating a situation where you're a bad citizen if you're not using this stuff, or you allow Bluetooth permissions on your phone that you otherwise wouldn't allow, things like that, or you're being forced to install an app without your consent; apps are just being pushed to phones in some places, right? And all of that creates dangerous precedents, which, I fear, as the world keeps turning and we face more crises, the climate crisis, etc., means it's going to get pretty rough.

Darshan

But it reminds me, in many ways, I mean, do we necessarily disagree with what Delta's position was? Or with where the health insurance company's position would be? Because there are additional costs. Like, you're right, 100%, that I disagree with the endpoint of, I don't want Big Brother looking over my shoulder. But do I disagree with each individual decision? How do you, as a privacy researcher, go about deciding, here's the end of the slippery slope we're willing to tolerate? Or do you just say, don't get onto this slippery slope at all? In which case, do you end up giving up more than you probably should? I'll let you answer that question, because I have a follow-up to it.

Sean

Yes, so in the world we're in, there are a lot of other reasons not to like these apps, or these implementations. Part of it is because the efficacy is very low, and there are other ways to go about doing this. For example, contact tracing can be done in more traditional ways and be much more effective than Bluetooth apps turned out to be. For these other things, we can have paper cards, and so on; people say, hey, you can forge those, but as it's turned out, you can also trick these apps pretty easily, so what they mean is very, very little. You can come up with hypothetical scenarios where I would say, okay, maybe it's a good idea to do this. But in the world we're in, probably not. And I do think it's too slippery of a slope and something that we shouldn't jump into. I actually also think there are a lot of other aspects of our lives that shouldn't be, quote unquote, app-ified. Apps are, generally speaking, designed to surveil all kinds of aspects of your life that go beyond the main functionality. And traditionally, you should be suspicious if you're suddenly forced to do something through an app; there's probably a surveillance reason for it.

Darshan

So that's a really interesting thought process, and I'm trying to decide if I agree or disagree. I'll give you an example. I just recently got diagnosed with potentially having diabetes, and I'm getting blood tests to see how my sugar is. There's one version of the same exact device that has Bluetooth in it, and one version that does not. And I'm kind of going, I love the Bluetooth version, because it's just going to connect and give me my readings in one spot; I love the convenience of it. On the other hand, this one is a lot more problematic. Yes, the data is probably on the device, and I could go back and look at it, but it's a pain in the butt. So my first thought when you said what you just said was, oh crap, is this a pharma company trying to monitor what I'm doing, and therefore target me, or whatever it is that's going to happen out of it? But my second thought that came out of that same exact thing was, number one, I've actually sat on a lot of these pharma-type committees, and the one big thing they care about is: I don't want your data, because I just don't want to be involved with GDPR; keep me as far away from it as possible. But I like the convenience. So how do you figure out who is the right actor to do this? I'm not sure the right actor is the government, for sure. But I'm not sure, necessarily, to your point, that Google's the right actor or Apple's the right actor. And it seems if I were a third party, I'd go, pharma is not the right actor either. So how do you decide that?

Sean

Yes, so, I mean, the question underlying all of this is: why can't we have useful functionality that is trustworthy? Why is the assumption that we have to give up something for convenience, and so on and so forth? That mindset is already something that's way too ingrained in these products. But even assuming that the company releasing an app is trying to do their best, it's extremely difficult. So one of the things we found, for example, in these opioid recovery apps: some were worse than others from a privacy perspective, right? Even with the ones that were trying really hard, there were still some flaws, and there were probably other flaws in other things that they do through their website, and so on and so forth. All of this business, all of these services, are being conducted in a very polluted ecosystem. Apps are bootstrapped very quickly. There's code that's shoved in there, maybe by some contractor, maybe by your in-house dev team, that may have implications because it's pulling from some other actor and it's talking to 50 different servers. So I've literally had to tell people, you know, a group of kids who are starting a startup and trying to do an app: you have to audit your own app. That's how bad things are. You build it yourself, and when you build it, during that build process, stuff may end up in there which you're unaware of. In that kind of world, even the best intentions may not be enough, is basically the problem.
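To make the "audit your own app" advice a bit more tangible, here is a rough, hypothetical sketch that lists URL strings embedded in an Android APK, one quick way to see which servers an app might contact. The APK filename is a placeholder, and this only catches hard-coded endpoints; a real audit would add dynamic traffic analysis and permission review.

```python
# Rough sketch: an APK is just a zip archive, so scan every file inside it for
# URL-looking strings to get a first pass at which servers the app might talk
# to. This finds only hard-coded endpoints; runtime traffic capture finds the rest.
import re
import zipfile

URL_RE = re.compile(rb"https?://[A-Za-z0-9._-]+")

def embedded_hosts(apk_path: str) -> set[str]:
    hosts = set()
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            data = apk.read(name)
            for match in URL_RE.findall(data):
                hosts.add(match.decode("ascii", "ignore"))
    return hosts

if __name__ == "__main__":
    # "app-release.apk" is a hypothetical path to your own build artifact.
    for url in sorted(embedded_hosts("app-release.apk")):
        print(url)
```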

Darshan

I mean, we could have this conversation forever, and the fact is, you're fascinating to talk to, so I do hope you'll consider coming back. But like I said, I aim for these conversations to be about 15 to 20 minutes long, and we are already at 31 minutes; it went by very quickly. So thank you for that. As you know, I'm going to ask you four questions. Sure. The first question is already on the screen, but how can people reach you?

Sean

Sure. Find me on Twitter at @seandiggity. And I'm pretty easy to find on the web in general.

Darshan

Diggity, is that the last name?

Sean

It's a nickname. I wish it were the last name. We can make that happen, I guess. But...

Darshan

But then again, I do like the Irish take on it, which would be the O'Diggity part. Yeah. So that's my first question. My second question: based on what we've discussed, what would you like to ask the audience?

Sean

Oh geez, I should be more ready for this. So, what would I like to ask the audience? I guess I would like to ask the audience: do you feel comfortable using an app for healthcare? And if so, what kind of things would you be willing to do via an app in telehealth? And if not, you know, maybe there are some things you would do and some things you wouldn't. I don't necessarily know where the line is for a lot of folks out there, and I know a lot of people are working through it. So that would be great to hear.

Darshan

So I usually answer the question first to see if I can at least give one answer; hopefully others will provide their own as well. So, to your question: do I use digital apps? Like I said, I do. How do I decide whether I'm comfortable using one? I think, unfortunately, and this is terrible considering I actually work in privacy, but I tend to weigh convenience a little bit more than I weigh privacy. And that is not the smartest thing in the world, but it is what I realistically do. It's sort of the same thing as, I think it's CNN or The Wall Street Journal, I forget which, but if you click here, you can see this article, and it's a one-time thing, no big deal, and you kind of go, I just want to read the article, I'm not going to go searching for it again. I click it that one time, and each time it pokes me just a little bit that I let that happen because it's faster and more convenient. To your next question, which was around EHRs and what you'd feel comfortable with: my general thought process would be, again, I work in compliance and I've worked with EHRs, so I would have expected that there is a significant amount of oversight that goes into choosing your EHR, and that it's being done by teams of people much more qualified than me to look at these EHRs and go, you know what, I think it's fine, it's compliant, and it meets all these different goals. What worries me is when privacy researchers like yourself come out and go, yeah, not so much; it's not what you think is actually happening out there. We actually have a comment from Tia Romero: "My son has a rare disease, and we do use the hospital app. It helps the doctors have a nationwide understanding of what the other kiddos are going through." So that's a fair point. If you have a rare disease, how else can you collect the information? Yes, you give up privacy, but in exchange you get aggregate data, which you simply wouldn't have otherwise, and which is critical in the case of rare diseases. So what is your response to a thought process like that?

Sean

Again, you know, these things are very useful for a lot of reasons, right? And as we said in the report about opioid recovery, we wouldn't want people to stop using these apps if they're really helping someone. But we need to demand more, and we need more researchers looking at this stuff. As far as hospitals are concerned, when they're sharing data with each other, you need to be extremely careful. And it's not even just on the user or client end; we also obviously have the issues of data breaches, ransomware, etc., hitting hospitals. It's a tough world out there. So I wouldn't say don't use it, if it's helping you and you feel like it is, but you may also want to look more into it. And if there are permissions it asks for that you know you don't need, don't give it those permissions.

Darshan

I love it. Um, as you know, I'm gonna ask you two other questions. My first question is, what is something you learned in the last month that you think the audience might find interesting?

Sean

Oh, geez, learned in the last month. So I've been looking a lot into blockchain protocols, and I found it very interesting that you can get a lot of different mini computers, IoT devices, like a Raspberry Pi, if you're familiar with that little computer, and get them all talking to each other, and you can have them come to a consensus without doing the wasteful proof of work that everybody's worried about from an environmental perspective. So I guess I've learned that it really is truly possible to have this technology work, and work well, and I've demoed it, without having to get into the whole business of using massive computing cycles for miners and expending all this energy. So there are some fun applications for that tech that I hope people will look into, that go far beyond the criticisms of Bitcoin and some of these other things.
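To illustrate the kind of consensus-without-mining Sean is describing, here is a toy, hypothetical sketch of a proof-of-authority-style round robin: a fixed set of known nodes take turns producing blocks, so no energy-intensive proof-of-work puzzle is ever solved. The node names and data are made up, and this is only an illustration of the general idea, not the specific protocol he demoed.

```python
# Toy sketch of proof-of-authority-style consensus: a fixed, trusted set of
# validators (e.g., a handful of Raspberry Pis) take turns producing blocks.
# No mining puzzle is solved, so the energy cost is negligible.
import hashlib
import time
from dataclasses import dataclass

VALIDATORS = ["pi-node-1", "pi-node-2", "pi-node-3"]  # hypothetical node names

@dataclass
class Block:
    height: int
    prev_hash: str
    data: str
    validator: str

    def hash(self) -> str:
        payload = f"{self.height}{self.prev_hash}{self.data}{self.validator}"
        return hashlib.sha256(payload.encode()).hexdigest()

def next_block(chain: list[Block], data: str) -> Block:
    prev = chain[-1]
    # Round robin: the validator whose turn it is "signs" the next block.
    validator = VALIDATORS[(prev.height + 1) % len(VALIDATORS)]
    return Block(prev.height + 1, prev.hash(), data, validator)

chain = [Block(0, "0" * 64, "genesis", VALIDATORS[0])]
for reading in ["sensor:21.5C", "sensor:21.7C"]:
    chain.append(next_block(chain, reading))
    time.sleep(0.1)  # stand-in for a block interval

for b in chain:
    print(b.height, b.validator, b.hash()[:12])
```

In a real deployment the validators would exchange and verify signed blocks over the network; the point here is simply that agreement among a known set of small devices does not require proof of work.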

Darshan

It's interesting, I just gave a talk yesterday on blockchain and the life sciences, so your timing is amazing. What I find fascinating is, and I know you aren't saying this, but I feel like Elon Musk completely agrees with you; he just goes Dogecoin, Dogecoin, Dogecoin. And his point is kind of like yours: you don't have to do the tremendous amount of proof-of-work stuff. But your point is definitely interesting, so thank you for sharing that. My last question for you is: what is something that made you happy in the last week?

Sean

So I've been quite happy that my hops are coming in. I grow hops; I do home brewing. And this year, for better or for worse, it's been very humid here, and the weather has been, for whatever reason, ideal, I guess. The hops are coming in like crazy. So that's huge.

Darshan

So do you share it? Or is it only for personal enjoyment?

Sean

I definitely have to share. If it were only for personal enjoyment, then I would be in trouble.

Darshan

Very, very cool. Again, Sean, this was wonderful. Having you on was a lot of fun, so thank you so much, and I hope you'll consider coming back. And people, again, just to point out, can reach you on Twitter at @seandiggity, and you can see that on your screen if you're looking at it. And please...

Sean

...check out the report from the ExpressVPN Digital Security Lab, the report we did on opioid recovery and treatment apps. I want to make sure I plug that.

Darshan

Thank you, thank you, I appreciate that. And that's great. Thanks again, Sean. We'll have you back soon.

Sean

This is the DarshanTalks podcast, the regulatory guy, irregular podcast, with host Darshan Kulkarni. You can find the show on Twitter at @DarshanTalks or on the show's website at DarshanTalks.com.
