DarshanTalks Podcast

Blockchain

https://pdcn.co/e/media.blubrry.com/darshantalks/content.blubrry.com/darshantalks/ep361_looking-at-the-future-of-medical-affairs_mixdown.mp3

Darshan

Hey everyone, welcome to the DarshanTalks podcast. This is your host, Darshan Kulkarni. It's my mission to help patients and you trust the products you depend on. As you know, I'm an attorney, I'm a pharmacist, and I advise companies with FDA-regulated products. So if you think about drugs, wonder about devices, are concerned about cannabis, or obsess over pharmacy, this is the podcast and live stream for you. I do have to say, since I'm a lawyer, this is not legal advice. I'm doing this because I actually just enjoy the conversation, and it's educational, and because of that I find myself learning something new each time. It'd be nice if someone's listening, so if you like what you hear, please like, leave a comment, please subscribe. And if you are in the technology space, please share, because today's conversation is going to be exactly the type of conversation you want to hear: we're going to talk about consent, and we're going to talk about technology and consent. So feel free to share this. If you want to find me, you can always find me on Twitter @DarshanTalks, or just go to our website at DarshanTalks.com. I briefly hinted at this, but our podcast, or live stream, today is going to be about consent. It's going to be about data dignity. What does that mean, and how is this trend going to change the life sciences and regulations as we continue? Our guest today is the CEO of Acoer. He is a friend of mine; I've known him for several years at this point. I'm hiding his name for the moment because it's going to be a surprise, but he's literally known globally for the type of stuff he does around blockchain and distributed ledger technologies in general. And our guest today is Jim Nasr. Jim, good to have you back.

Jim

It's great to be back, Darshan. Always good to talk with you, whether it's offline or our usual jibber-jabbering away here with the rest of the world.

Darshan

always. So we have to start with the obvious question. Why the flag behind you?

Jim

Because I'm an Atlanta United fan, and I live in Atlanta. I'm a founding member. And yeah, you've got to support the team.

Darshan

So, Jim — when I worked with Jim, Jim was known to be incredibly busy, spending a lot of time and a lot of energy on his work. So knowing that he's a founding member of the team does not surprise me in the least, and yet I'm surprised that he has time to do anything. So kudos to you, Jim. Now, Jim, let's actually talk about today's topic. Our topic today, and we hinted at this briefly, is this idea of data dignity. Could you talk a little bit about what that is?

Jim

Yeah. So really, the idea is that we as individuals these days create a lot of data, and we create a lot of value because of that data. However, as things have evolved, we are largely the product of a really very small number of companies. As a simple example, take Facebook or Instagram: you go and post pictures or videos or whatever, and you're creating data. However, you don't really own that data, and you're essentially the product, right? So whatever advertising or whatever direction those organizations want to take with your data, that's their right. And really, ultimately, it's this idea that, you know, we're in a place now where, as creators of that data, we need to at the very least have some idea of what's happening to our data: whether it's being used for various purposes, whether there is a secondary market for it, and things like this. At the very least, trace and monitor it, if not have a stake in its value — if there is a marketplace, be able to share in it, as an example. And of course, it also gets into this area of ethics, which I know you're certainly on top of — ethical usage of your data by organizations, particularly without consent. That's a very big deal.
I think we're getting to a stage where, whether it's regulators, individuals like us, or organizations, there's a much greater awareness of this concept of a data footprint, and we need to participate in that, because we are the creators and owners of data as individuals. That's a fact. You know this obviously very well when it comes to medical data and health data, but it's certainly a lot more than that. It's not just traditional clinical data; so much data is being generated by us through other means, such as sensor-based data and wearables and genomic data — you name it — and all of that really is valuable. We're very much in a data-as-a-currency space.

Darshan

So you talk about this data-as-a-currency space, you talk about the fact that people should have data dignity and should be able to control what happens with that data — including saying, no, you can't have my data — which I think sounds amazing in concept. But at the same time, if someone goes, you know what, Darshan, I want you to give me your login information, and in exchange I'll let you see this article — I know from my perspective it's a very lopsided exchange, but I kind of go, it's easier to give you my email address or whatever so that I can see the article and move on. And I know that in, I don't know, five years, that data is probably going to be worthless, so do I really need to worry about it? So talk to me about the value of data dignity when data itself gets stale. How does that play out?

Jim

Well, I mean, you know, we can't make a blanket statement about everything; there are definitely different sensitivities in data and different values. Look at patient health data — the collective data, and again, not just the clinical data in an EHR, but the collected data of a patient. There's a lot of value in that, tremendous value, and not just for the primary market. As an example, all of us have been going through COVID, right, and patients are going through the cycle of being involved in clinical trials. There's definitely value in that for the sponsors: ultimately they're making compounds, they're making drugs, they're selling them and making money, and so on and so forth. But then there's also a significant amount of value beyond that initial interaction, whether it's in the marketing or pharmacovigilance phases, or whether it's secondary data marketplaces, where you take data from different places and essentially generate new assessments, if you like — observations — based off of it. So that's a different, much more valuable kind of data than, as an example, providing your email for an isolated newsletter. Now, the really scary thing, Darshan — and this is why we've gone down this path of exploring how we can protect individuals' rights and look at transactions associated with any kind of consent, as an example — is that in isolation, that snapshot of your data interaction may not be that significant. But in collection, as an overall 360-degree view, it is extremely valuable. It reveals a lot about you — probably a lot more than you would want to reveal about yourself, certainly publicly, or to any private organization. And I think that'd be the case for almost all of us, right?
I think that's the issue: many of us can't see the forest for the trees, right? We're getting hit by the trees, but really there's a big forest in there. And it's very easy — unfortunately, very easy — and it could be as simple as using cookies, or it could be much more comprehensive and sophisticated, using machine learning and predictive analytics and things like this, to essentially build a profile of every individual based on that data trail. And if you have no idea what's going on, and particularly no idea of who's doing what else with your data, and you've not provided consent or have no ability to monitor some of that — yeah, that's not a good place to be. I think that's like the opposite of having data dignity. That's where the issue of unethical data usage is most likely going to happen. And, you know, none of this is science fiction. This idea of identity theft — it's really happening, it's happening to a lot of people. And it's not just because you went to a restaurant and somebody wrote down your credit card number; that's, seriously, kind of a caricature version of it. There are a lot more sophisticated, much more electronic, much more behind-the-scenes things happening that we really don't know about. And I think, ultimately, when we look at all of us essentially being tethered to this — our phones, everything goes through there, globally — that's really the trail that we're talking about.

Darshan

Sorry, I had a car going by outside. There are so many things to talk about here, so I want to raise the first thing, because it caught my attention. Before moving on to anything else, can you talk to me a little bit about the identity theft — what's happening behind the scenes? Because I still think I'm pretty sophisticated, and I recognize that there are some elements of identity theft which are more subversive than the typical "I stole your ATM card information." But as someone who lives in technology, what are the types of identity theft that are more subversive?

Jim

We could speak about this for a long time. You know, to me, the biggest, scariest thing is this idea that there are a number of organizations — and honestly, they're not organizations you've probably ever heard of — that are actively building massive profiles of every individual, and then they sell those profiles for all kinds of reasons and to all kinds of organizations. I'm not trying to suggest this is all unethical. But again, let's go back to this whole idea of distributed computing, and why we're even doing it. Well, one reason — and the purists, if you like, behind it believe in this — is the idea of removing one entity as having all the power, all the control. Because if I knew that all the world's information was housed in your database — in your basement, as an example — my number one objective is probably going to be trying to hack into your database, right? Because there's a lot of value there. And when one or two or three entities have that kind of control, they're certainly susceptible to attacks and being hacked, even if they're ethical themselves; there are all kinds of leakages that happen. This idea of decentralized computing is really the idea that, instead of having one or two giant mega-centers, you could have 10,000 little data holders, if you like, and they all work together in a collaborative way. Because hacking through 10,000 databases, especially ones that are synchronized constantly, is very, very, very difficult — a whole lot more difficult than hacking one giant data set. So back to identity theft: really, ultimately, as these profiles are being generated, largely without our consent, and once they've got our information, those profiles can certainly be used in all kinds of ways.
Whether it's completely unethical, whether it's borderline unethical and just without consent, or whether it's something that you have somehow consented to without really understanding the full extent of the consent — which is a lot of times the issue too. You know, you go in, you sign a piece of paper, and it's kind of out of context; it's out of context of a much bigger picture, but in reality it's part of a much bigger picture. And I think clever people and clever technologies allow all that stuff to be exposed, you know.
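Jim's point about 10,000 synchronized data holders being harder to attack than one mega-center can be illustrated with a toy model. This is a minimal sketch, not any real blockchain protocol: each "node" holds a hash chain over the same records, so a rewrite on one replica produces a head hash the honest majority doesn't share.

```python
import hashlib

def chain_hashes(records):
    """Build a simple hash chain: each entry commits to all previous entries."""
    h = "genesis"
    chain = []
    for r in records:
        h = hashlib.sha256((h + r).encode()).hexdigest()
        chain.append(h)
    return chain

records = ["alice consents", "bob revokes", "carol renews"]
nodes = [chain_hashes(records) for _ in range(5)]   # 5 replicas of the ledger

# An attacker rewrites history on one replica only.
tampered = ["alice consents", "bob consents", "carol renews"]
nodes[2] = chain_hashes(tampered)

# Honest nodes agree on the head hash; the tampered replica stands out.
heads = [chain[-1] for chain in nodes]
honest_head = max(set(heads), key=heads.count)      # majority head
print([h == honest_head for h in heads])            # [True, True, False, True, True]
```

In a real network the replicas reach agreement through a consensus protocol rather than a simple majority count, but the core property is the same: an attacker must rewrite most of the synchronized copies, not just one.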

Darshan

So what I hear you saying is, you're not using the word identity theft in the typical nefarious way. You're using the term identity theft in a much broader context, which is: your identity being used in ethical and non-ethical ways that basically subvert your data dignity, for lack of a better term?

Jim

Well, I mean, look, we're splitting hairs here. I think, ultimately, there's a crime involved, right? This is not accidental. This idea of having essentially 360-degree views of individuals without their knowledge — that's a business, right? Those are businesses, and there are all kinds of companies that do this stuff. They're aggregators — a number of them in the medical space — and they use intermediaries to essentially collect nuggets of information, and they create their own proprietary schema, if you like, and then they sell that. And again, most people are unaware; this is a huge, huge business. But the reality of all of this is that, you know, I think there are way better ways around it, right? I remember from my days at the CDC: instead of paying lots of money to aggregators, it would have been a whole lot better — cleaner data, much more engaging — to work with patients, or representatives of patients, directly than to get the data from a few aggregators who get paid a lot of money for it. That's just one example. But I think it's getting to be a much bigger problem, a much bigger picture, just because there's so much happening on the data side. The other side of this, though — and I'm going to talk about this — is that the regulators have kind of woken up. I think there's a lot more regulatory awareness of what's happening, even if individuals like you and I may still not pay attention to it — though I think more of us are, just because of conversations like this one. And we just saw this, I think last week: Amazon was fined by the EU for all kinds of data violations related to GDPR, the General Data Protection Regulation. So the whole idea, ultimately, is that none of this is a surprise.
I mean, it's not like you and I just uncovered this, you know. It's time to address it, because this is a massive situation — both ethical and economic — that, out of control, can impact all of us.

Darshan

So you talk about the value of this data, and I want to get into this a little bit more. You're talking about regulators waking up to it — some regulators are waking up with this idea that we need to protect people. But then you've got Israel coming out and saying, we are the startup nation, we want COVID data to be available, just come here, develop new drugs, new vaccines with us. You've got China saying, we want this data because we want global surveillance. How does all that get managed? How do you see that balance, if you will, between individuals saying, we want data dignity, we want control, versus governments saying, no, we want the control, we don't want you to have it? Do you see the distinction?

Jim

I do. And again, obviously we can't necessarily go through every exception to the rule. The rule, as I see it — and again, this goes back to the mobile smartphone; that's the reason why. Because if you asked, how come 20 years ago this wasn't such a big deal? The reason is that 20 years ago we didn't have, you know, a billion smartphones running around creating data and tying us individually to engagements. I think that's fundamentally what's been happening. And also the fact — and this is very well proven already — that we are generating more data, particularly unstructured data, now than we have ever created through the whole course of humanity; the numbers are really out there. So the point, though, is that this idea that an individual has data rights is kind of a new thing. It really wasn't something that was very seriously thought through, and even the use cases were not that mature, 20 years ago. But everything has changed. And so, therefore, all of us — regulators, technologists, entrepreneurs, advisors — are trying to catch up to some of these ideas, because the space moves so much faster than all of the checks and balances around it. Now, does that preclude you from being fully open and transparent about sharing data, whatever? The answer is no; obviously most people do that on social media freely.
The thing, though — and this is where we kind of come from — is that people are doing this, and ease of use and consumerism are incredibly important, again going back to the mobile phone. But it doesn't mean that technologists like us can just say, well, we just build whatever, and we don't care about security, privacy, accountability, traceability. We can't do that. Because one day, when people realize en masse all the things that are happening and how they're individually impacted, or could be impacted, they're going to be at the very least alarmed, if not have a serious reaction of, whoa, what is going on? So I think it's completely incumbent on us, as technologists and entrepreneurs, to build in these degrees of accountability, security, privacy preservation, traceability, real-time information back to individuals — because we see it, even if not everybody else does at this moment. Those are the things that are kind of different. Because, you know, the alternative is to say, you know what, we're just going to rely on three organizations — Facebook, Amazon, Apple, as an example, and that's pretty close to the mark — to essentially control everything. Look at the internet right now: it's largely controlled by four or five organizations. That's the truth of it. And that was never the intention behind the internet. It was never, let's make Bezos and Zuckerberg the richest people in the world. That was never the intention behind it, right?

Darshan

So you talk about having more control. You talked about this idea that —

Jim

I'm not sure I necessarily think control. I think control is one of those overloaded words, and it has a lot of connotations that I don't necessarily believe are practical right now. I think it's much more about traceability, monitoring, auditability at this stage. Control is like one or two stages removed from where we're at right now.

Darshan

Fair enough. So let's say we're talking about monitoring and having a deeper understanding of what's out there. Let me ask you this question: how do you enable that to happen through the technology solutions you have?

Jim

Yeah, so it's a good question. I think this is really the crux of what we have been working on — actually, going back to our open pharma days, working with people who generally didn't understand these kinds of concepts, and even before that. Ultimately, we have an opportunity, particularly with public blockchain technology — and I'm very specific about that, because it's not just any technology — to have an immutable ledger, a public ledger, that shows, for example, that Jim and Darshan talked at 11am Eastern on August 5. It doesn't say what you and I talked about; there is no reference directly to this podcast, per se. If we wanted to provide a link to this podcast, we'd probably have that in a private database, where the ledger shows that Jim and Darshan talked at 11, and if you really want to find out what it was, you click on this link, and the link says, okay, you have private permission to be able to tap into this, as an example. So this idea of traceability, this idea of monitoring, particularly in real time — not a month down the line — is completely doable now, and blockchain technology allows us to do it. There are a few nuances, but for instance, you can use non-fungible tokens. As an example, you could take your podcast — which is a digital asset; say it's an hour-long podcast saved as an mp3 file, whatever — and associate it with a non-fungible token. A non-fungible token is a unique identifier in the world; there will not be another one just like it. It's unique to this event. And then if you want to track, for instance, how many people looked at your podcast over a course of time, from different sources — not just from your website, which you have some control over, but from all kinds of different sources — or who shared it with whom later on, you could potentially do it, just because that token is unique.
Now, could you see that they actually looked at a particular piece of the video? Probably not — that's the next step. So that's really this idea of traceability, accountability, auditability. Now, in this case the podcast is public, so it's a use case that may not make sense. But there are many, many other use cases where you want some kind of a reference that's immutable, public, and — in our lingo — computationally trustworthy: using cryptography, you can trust it. So you're not just relying on a UI or a private database to say, yes, this happened; you're relying on public cryptography. And from there on, any kind of transaction associated with it can be related, can be associated with a particular token. That's the basic idea. But really, ultimately, all of this comes down to this idea of tokenization, which is at the heart of a public blockchain. When you use tokenization, you basically — I think of it as value creation and attribution. So if you're creating value — you and I speaking, we're creating some kind of value, maybe just for the two of us, but two is better than none — that value is now associated with some kind of a token, and the ledger essentially shows your reference token. Pretty simple idea. And you can expand the idea of tokenization: there are regular tokens, which are essentially tradeable — fungible. For instance, one bitcoin: if I bought a bitcoin 10 years ago, it would have been, say, $2, versus one bought yesterday at $40,000. It's still one bitcoin, right?
So if I wanted to exchange that one bitcoin with you, it doesn't matter whether I bought it at $2 or bought it yesterday at $40,000 — it's basically fungible. There is no differentiation between the one from 10 years ago and the one from yesterday, from your perspective — though for me it's almost a $40,000 difference, right? But the idea is, it's fungible; that's one kind of token. And then the other kind, which has become very popular lately, is non-fungible: the token that was issued at that time for this podcast — and a podcast is a very good example — is unique. There's no other podcast like this in the world, ever. By definition, this time frame, this conversation, is specific; there is not another conversation that's going to happen that will be identical to this one. So we could associate a non-fungible token with this podcast: we could fingerprint it using a hashing technology, associate the podcast in hashed form with a token, and thereafter, for the rest of eternity, there will be a unique token. You could never trade that particular token for something else that would be identical to it, because it's a non-fungible token. That's kind of the idea — that's really where the tokenization idea comes from, and the things we can do to actually take some of these ideas that we've talked about and make them practical, make them real.
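Jim's "hash the asset, keep it off-ledger, publish only a unique token" pattern can be sketched in a few lines. This is an illustrative toy, not Acoer's implementation or any real NFT standard; the function names and record fields are invented for the example. The key ideas it shows are the two he names: the token ID is unique (non-fungible), and the cryptographic hash lets anyone verify the asset without the ledger ever holding the asset itself.

```python
import hashlib
import json
import time
import uuid

def mint_nft_record(asset_bytes: bytes, description: str) -> dict:
    """Create a minimal NFT-style record: a unique token ID paired with
    a cryptographic fingerprint of the asset. The asset stays off-ledger;
    only the hash and metadata would be published."""
    return {
        "token_id": str(uuid.uuid4()),                          # globally unique
        "asset_hash": hashlib.sha256(asset_bytes).hexdigest(),  # content fingerprint
        "description": description,
        "minted_at": int(time.time()),
    }

def verify_asset(record: dict, asset_bytes: bytes) -> bool:
    """Anyone holding the asset can check it against the public record."""
    return hashlib.sha256(asset_bytes).hexdigest() == record["asset_hash"]

podcast = b"...one hour of mp3 audio..."   # stand-in for the real file
record = mint_nft_record(podcast, "DarshanTalks podcast, Aug 5, 11am ET")
print(json.dumps(record, indent=2))
print(verify_asset(record, podcast))          # True: the asset matches the record
print(verify_asset(record, b"other audio"))   # False: any change breaks the hash
```

On a real public blockchain the record would be written to the ledger and the uniqueness guaranteed by the chain's token standard rather than a UUID, but the hash-as-fingerprint relationship is the same.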

Darshan

So we're literally running out of time here, but I'm going to ask you one last question, because it's something you and I discussed right before we got on, and I think it's kind of important. We talked about immutability, we talked about tokenization, we talked about taking data and having more access — and you were wary of the word control; it's monitoring, traceability, I forget the exact terms you used. But the idea being, you can do all these things. My big question, though, is: how do you reconcile that with the requirements of GDPR, or CCPA, or the new Colorado law? How does all that work together when they say you need to have the right to delete the information?

Jim

Yeah, you know, it's really kind of a good-news situation, Darshan, because, in fact, those regulations in some ways really expedite the thinking and building of solutions in this way. Because the regulators are expecting it of you. For instance, let's go back to GDPR. One of the elements of GDPR is basis for consent, and it's the same thing with CCPA, the California Consumer Privacy Act: they both explicitly say, basis for consent — you have to prove consent. And it's not just a one-time, one-and-done thing. As you know, in the world of clinical trials, we're very much in a world of dynamic consent now, where consent is not a snapshot; it's something that changes. So you could have a consent that you start with, and then maybe it expires, maybe it's revoked, maybe it's renewed, maybe there's a chain-of-custody event. It's much more dynamic; it's got multiple states in it. So really, through this immutability and accountability — again, going back to this idea of a public ledger and using cryptography — we're able to show, with technologies that are proven, not relying on one entity, not just on my word, that transactions happened in order and with the kind of accountability the regulatory requirements expect. And then later on, if a regulator — the FDA — comes and says, hey, you've got these 20,000 children participating in this COVID-19 study, show me all of the consents that the parents signed — a real scenario, right — and show me whose were revoked, and show me that if one of them asked to be deleted, it has been done: all of this can be done in a timely manner. In fact, if you build it correctly, it can be done essentially on demand, in real time.
And those are the kinds of technologies we have built, because we're really big believers that, circa 2021, almost everything we're talking about here can be done in real time. And the truth is, it really can be. This does not have to be like the old days of, let's do a discovery session, let's spend six months and $18 million with lawyers.
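The dynamic consent Jim describes — granted, then possibly expired, revoked, or renewed, with an auditable history — is essentially a state machine over an append-only log. The sketch below is a hypothetical illustration in plain Python, not Acoer's product or any regulatory-approved schema; the state names and class design are assumptions made for the example. The point is that revocation or deletion requests add events rather than erase history, so the "show me every consent and what happened to it" query is just a read of the log.

```python
from datetime import datetime, timezone

# Allowed consent-state transitions (a simplified model).
TRANSITIONS = {
    "granted": {"revoked", "expired", "renewed"},
    "renewed": {"revoked", "expired"},
    "expired": {"renewed"},
    "revoked": set(),               # revocation is terminal here
}

class ConsentRecord:
    """One participant's consent as an append-only event log,
    mimicking what an immutable ledger would store."""
    def __init__(self, participant_id: str):
        self.participant_id = participant_id
        self.events = [("granted", datetime.now(timezone.utc))]

    @property
    def state(self) -> str:
        return self.events[-1][0]

    def transition(self, new_state: str) -> None:
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.events.append((new_state, datetime.now(timezone.utc)))

    def audit_trail(self):
        # A regulator's "show me the consents" query is a read of the log.
        return [(state, ts.isoformat()) for state, ts in self.events]

consent = ConsentRecord("parent-001")
consent.transition("revoked")
print(consent.state)           # "revoked"
print(consent.audit_trail())   # the full ordered history survives revocation
```

On a ledger-backed system each event would also carry a cryptographic proof of ordering, but the state machine and the audit query look much like this.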

Darshan

That's where I draw the line — we need to eat too.

Jim

We need to make you more useful. That is my objective in life. Thank you.

Darshan

Thank you for helping there. Hey, Jim, this was, as usual, amazing. Let me ask you four questions before we let you go. The first: based on what we've discussed, what would you like to ask the audience?

Jim

I think I would like to ask the audience what it is that interests them in this kind of conversation. I typically see two communities. One is the crypto community — they're already very interested, they're engaged, and they're really aware of what's happening. From their side, the kinds of questions I'm asked, or that they seem to be interested in, are: what is it you guys are doing in the crypto space that's different? That's one community. The other community is from the domain — people who are maybe regulators, or in the pharma space or the healthcare space — and they're curious, more curious about why blockchain. You know, like, is this all smoke and mirrors? And I think both of those communities are completely valid. It's great to have that interaction, because you get a different kind of cadence when you're interacting with each, and it's important to be able to relate to, work with, and learn from both.

Darshan

So, yeah, let's find out. Let's see what they come back with. Let me ask you another question: what is something you've learned in the last month that people would be surprised by?

Jim

Hmm — personal or in general?

Darshan

Either — in general.

Jim

You know, I think what I've learned over the last month, going back to our world of technology, is that there is a real thirst right now in what I'm going to call the practical community for people to actually see real solutions for blockchain. There's been a huge amount of cryptocurrency work and ICOs, and there's been a ton of infrastructure people talking until they're blue in the face about layer twos and off-chain and all kinds of wallet things — and all of that is interesting for the geeks, but it's really not that interesting in general in terms of solving problems. But I've seen more people come and say, well, my problem actually is — I'll give you a real one — I'm doing loan origination for properties; how can I use blockchain? Show me what we can do with blockchain. Because that's a big problem — it's inefficient, we go through multiple banks and lawyers and things like this, and the paper gets lost. Can we use blockchain for that? Those are the kinds of questions I'm seeing. And particularly on NFTs — all of a sudden everybody's talking about NFTs, from basketball stars to people on the street to people like you and I, and now the questions are more educated questions. So all that is good.

Darshan

Um, so the next question is, what is something that made you happy in the last week?

Jim

Let's see. So I've been watching the Olympics, and I'm into cycling these days — I do a lot of cycling, so obviously personal interest is at work here. But this Austrian woman won the road race, out of nowhere — a Cinderella story. She broke away, essentially by herself, for like 100 miles, and won it. I mean, it's a low-profile event compared with the rest of the Olympics, but to me it was just phenomenal — one of the most surprising results and incredible performances I have ever seen in sports, and I follow sports regularly.

Darshan

So I think that the closest thing is Boris Becker, when he came out of nowhere and won the Grand Slam, I believe it was.

Jim

You're saying '85? At Wimbledon? Yeah.

Darshan

It was Wimbledon, yeah. There you go.

Jim

Yeah. But okay. I was only seven months old at that time.

Darshan

I wasn't even born, so there's that. But yeah, this has been awesome, Jim. I have been posting throughout how people can find you, but for those of you who are listening — Jim, how can they find you? Is that your fourth question? That's my fourth question.

Jim

All right — I'm just kidding around. Yeah, just the usual means: LinkedIn, Twitter. I think those are the easiest paths to find me, and I'm happy to connect.

Darshan

Perfect. Jim, thank you so much for coming on. It was wonderful to have you and let's chat soon.

Jim

Sounds good. Cheers.

Jim

This is the DarshanTalks podcast — regulatory guy, irregular podcast — with host Darshan Kulkarni. You can find the show on Twitter @DarshanTalks, or at the show's website, DarshanTalks.com.
