June 1, 2025

AI ethics, healthcare and regulation - Artem Trotsyuk, PhD (AI Fellow: Stanford)

Today, I'm joined by Artem Trotsyuk, an AI Fellow at Stanford, for a deep dive into the evolving landscape of artificial intelligence, especially within healthcare. Our conversation explores the fascinating intersection of technology, entrepreneurship, and personal development, highlighting Artem's unique journey from aspiring doctor to a leader in AI ethics and innovation.

We discuss how AI learns and adapts, drawing parallels to human "nature vs. nurture," and delve into the critical ethical considerations surrounding AI deployment in healthcare, from mitigating bias to ensuring patient safety. Artem also shares invaluable insights on the entrepreneurial path, particularly for academics transitioning into deep tech, emphasizing the often-underestimated importance of commercialization alongside groundbreaking technical solutions.

Artem Trotsyuk : / artemtrotsyuk
Learning with Rishad: https://www.learningwithrishad.com/
Rishad Usmani: / rishadusmani

[0:00:00] Learning & Unlearning from Childhood: Artem shares deep insights on key lessons from his upbringing, including developing independence and overcoming challenges like public speaking.

[0:01:42] AI's "Nature vs. Nurture": A thought-provoking discussion on how AI learns, distinguishing between pre-coded behaviors and the exciting future of AI learning through physical interaction.

[0:04:54] Artem's Unique Career Path: Hear about Artem's fascinating journey from aspiring medical doctor to pursuing a PhD in engineering and computer science, including his work on smart wound care devices.

[0:07:22] Focus on AI Ethics & Stanford AI Fellowship: Artem details his current role at Stanford, focusing on the ethical deployment of AI systems, mitigating bias, and ensuring patient safety.

[0:08:45] The Academic-to-Market Gap: I discuss with Artem the common challenge academic founders and physicians face: overemphasizing product development while overlooking the critical importance of marketing and commercialization.

[0:14:47] The Importance of Sales & Marketing for Deep Tech Startups: Practical advice on why commercial strategy is paramount, even for cutting-edge technical solutions.

[0:16:17] Key Startup Focus Areas Beyond the Product: Expanding on what truly drives startup success beyond just having a great idea.

[0:21:47] Personal Startup & Commercialization Experiences: Artem shares examples from his own background.

[0:29:47] Building a Strong Team Beyond Technical Expertise: Insights into the diverse skill sets needed for a thriving startup.

[0:35:47] Defining Success in Entrepreneurship: Exploring a broader definition of success beyond just financial returns.

[0:41:47] Ethical Considerations & Regulatory Challenges in AI: A deep dive into the risks and necessary frameworks for AI in drug discovery, generative models, and ambient intelligence.

[0:46:47] Balancing Innovation and Regulation: Discussion on how policymakers can foster innovation while ensuring safety and ethical use of AI.

[0:53:32] Ronjon Nag Mention: A brief mention of another AI expert and investor for future discussions. Tune in to gain a deeper understanding of AI's evolving role in medicine and the entrepreneurial journey behind cutting-edge health tech!

 

WEBVTT



00:06:08.000 --> 00:06:10.000
Rishad Usmani: Hi, Artem, thanks so much for joining me today.

00:06:10.000 --> 00:06:12.000
Artem Trotsyuk: Hey, thanks for having me.

00:06:13.000 --> 00:06:21.000
Rishad Usmani: There are things that we learned in our childhood which help us and contribute to our success in lots of ways, both career and personal life.

00:06:21.000 --> 00:06:24.000
Rishad Usmani: And there are things we have to unlearn from our childhood.

00:06:24.000 --> 00:06:29.000
Rishad Usmani: What are some things you learned from your childhood which have helped you? What are some things you've had to unlearn?

00:06:30.000 --> 00:06:37.000
Artem Trotsyuk: Totally so. It's super interesting question. Um, I think as a kid, there was a lot of.

00:06:39.000 --> 00:06:44.000
Artem Trotsyuk: As a 1st child. There's a lot of the need to.

00:06:45.000 --> 00:06:49.000
Artem Trotsyuk: Find and be independent, and find um.

00:06:50.000 --> 00:06:56.000
Artem Trotsyuk: Find solutions to problems because no one else is going to be there to help you out to do those problems, to solve those problems.

00:06:56.000 --> 00:07:04.000
Artem Trotsyuk: And I'd say that was the thing that I had to do early on. And it's something that continued on into a lot of the work that I do now.

00:07:04.000 --> 00:07:09.000
Artem Trotsyuk: One thing I did have to unlearn is the fear of public speaking.

00:07:09.000 --> 00:07:18.000
Artem Trotsyuk: So growing up. I never liked to take the stage, even speaking in front of a class was always a difficult thing.

00:07:18.000 --> 00:07:23.000
Artem Trotsyuk: Uh, I'd get so nervous to do that. And you know, nowadays you still kind of get the jitters, but.

00:07:23.000 --> 00:07:32.000
Artem Trotsyuk: There are psychological tricks that I've learned that help me be a better public speaker.

00:07:32.000 --> 00:07:38.000
Artem Trotsyuk: And not worry so much when I'm on the stage, for example. And so that's been one thing that I've had to unlearn.

00:07:38.000 --> 00:07:42.000
Artem Trotsyuk: But I'd say everything else has been kind of like, I'd say, you know.

00:07:42.000 --> 00:07:53.000
Artem Trotsyuk: It's the nature and nurture argument. I do think that there's an aspect where the environment that you're born into can help mold you into the next stage of your life.

00:07:55.000 --> 00:07:59.000
Rishad Usmani: And maybe extrapolate that environment into AI.

00:07:59.000 --> 00:08:02.000
Rishad Usmani: And as we get into AI learning new things.

00:08:02.000 --> 00:08:06.000
Rishad Usmani: How much of. And and maybe this question doesn't.

00:08:06.000 --> 00:08:11.000
Rishad Usmani: Completely apply for AI. But how much of AI learning, do you think is nature versus nurture?

00:08:11.000 --> 00:08:17.000
Rishad Usmani: Is something inherent in AI and its capabilities, or is it something it's learning from its environment?

00:08:17.000 --> 00:08:21.000
Artem Trotsyuk: Oh, you hit, I mean, you brought up a super interesting point. So the interesting part is, um.

00:08:23.000 --> 00:08:28.000
Artem Trotsyuk: AI right now is very much in the.

00:08:28.000 --> 00:08:35.000
Artem Trotsyuk: Non-physical domain for the most part, meaning that it's all on your screen. You can't physically touch it yet.

00:08:35.000 --> 00:08:41.000
Artem Trotsyuk: You could. I mean what self-driving cars and waymos, and some Tesla robots.

00:08:41.000 --> 00:08:44.000
Artem Trotsyuk: They're kind of learning from their environments.

00:08:44.000 --> 00:08:51.000
Artem Trotsyuk: But the concept is that the majority of AI tools are SaaS tools, and those tools are.

00:08:51.000 --> 00:08:54.000
Artem Trotsyuk: Primarily in a non-physical, tangible world.

00:08:55.000 --> 00:09:01.000
Artem Trotsyuk: And so a lot of the models that are used to kind of interact with the world. They're evolving with the.

00:09:01.000 --> 00:09:04.000
Artem Trotsyuk: The premise that um.

00:09:04.000 --> 00:09:14.000
Artem Trotsyuk: Someone's coding them. So that's kind of the nature component, right? You're kind of ingraining in their code certain lines of code for how to handle action items and scenarios.

00:09:14.000 --> 00:09:17.000
Artem Trotsyuk: The more.

00:09:17.000 --> 00:09:20.000
Artem Trotsyuk: Interesting part will be when.

00:09:20.000 --> 00:09:29.000
Artem Trotsyuk: AI is more interacting with the physical world. And that's where it starts to learn new things and.

00:09:29.000 --> 00:09:35.000
Artem Trotsyuk: I. It's not there yet, but I feel like that is the next frontier with robotics, for example.

00:09:35.000 --> 00:09:40.000
Artem Trotsyuk: Um robotics right now, primarily are, they're programmed to do specific sets of tasks. So.

00:09:41.000 --> 00:09:42.000
Artem Trotsyuk: I went to.

00:09:43.000 --> 00:09:50.000
Artem Trotsyuk: I got a smoothie the other day just because the thing that was making it was one of those robots that kind of.

00:09:50.000 --> 00:09:55.000
Artem Trotsyuk: It just had a bunch of fruit, and it was programmed to pick up the fruit. And then.

00:09:55.000 --> 00:10:05.000
Artem Trotsyuk: Put it into a blender, and then mix it, and then pour it into a cup and push that cup out. But it was pre-programmed with a specific set of instructions.

00:10:05.000 --> 00:10:07.000
Artem Trotsyuk: Now.

00:10:08.000 --> 00:10:16.000
Artem Trotsyuk: It would have been a different experience. Instead, there was just a touch screen, I selected what I wanted, and then it just output a drink for me.

00:10:16.000 --> 00:10:21.000
Artem Trotsyuk: Imagine it was a humanoid robot standing there, and I would have a conversation with it.

00:10:22.000 --> 00:10:24.000
Artem Trotsyuk: What would that be like? That's kind of where this.

00:10:24.000 --> 00:10:26.000
Artem Trotsyuk: Like the future state of.

00:10:26.000 --> 00:10:29.000
Artem Trotsyuk: If this is going to be a future reality.

00:10:29.000 --> 00:10:38.000
Artem Trotsyuk: Like that. Then imagine having the AI continually learn by interacting not only with humans, but also.

00:10:38.000 --> 00:10:44.000
Artem Trotsyuk: With other AI robots, for example, or AI, other AI systems.

00:10:44.000 --> 00:10:47.000
Artem Trotsyuk: What is that gonna look like? I don't know. But uh.

00:10:47.000 --> 00:10:50.000
Artem Trotsyuk: I think it's an interesting.

00:10:51.000 --> 00:10:58.000
Artem Trotsyuk: Next shift in technology development that we're in right now that we're observing a lot of it's moving pretty quickly.

00:10:58.000 --> 00:11:01.000
Artem Trotsyuk: But it's still more or less.

00:11:01.000 --> 00:11:06.000
Artem Trotsyuk: Not not everything, but more or less constrained to the computer, for now.

00:11:07.000 --> 00:11:13.000
Rishad Usmani: So currently, you're an AI fellow at Stanford. Tell me more about kind of your day to day.

00:11:13.000 --> 00:11:22.000
Rishad Usmani: Work there, and then walk me through the trajectory of your career, and you can start as early as high school. And and what does your career look like so far.

00:11:23.000 --> 00:11:27.000
Artem Trotsyuk: Yeah, my career is pretty nonlinear. I grew up.

00:11:27.000 --> 00:11:30.000
Artem Trotsyuk: Wanting to be a doctor.

00:11:30.000 --> 00:11:36.000
Artem Trotsyuk: And instead of going to medical school, I ended up going to graduate school.

00:11:36.000 --> 00:11:39.000
Artem Trotsyuk: So kind of still became a doctor.

00:11:39.000 --> 00:11:42.000
Artem Trotsyuk: But not that type of doctor. The other doctor.

00:11:43.000 --> 00:11:51.000
Artem Trotsyuk: Um, I sometimes get asked the question, like, oh, can you prescribe me something? I'm like, no, I can't. Unfortunately, I'm not that type of doctor, you know.

00:11:51.000 --> 00:12:00.000
Artem Trotsyuk: Ah, so things were very nonlinear. And actually, I'm very happy that I chose the PhD route rather than the MD route. There's a lot of.

00:12:00.000 --> 00:12:03.000
Artem Trotsyuk: Um. I feel like opportunities that have been uh.

00:12:03.000 --> 00:12:05.000
Artem Trotsyuk: Ones. I've been able to kind of.

00:12:05.000 --> 00:12:08.000
Artem Trotsyuk: Tap into because of having the.

00:12:08.000 --> 00:12:10.000
Artem Trotsyuk: More um.

00:12:11.000 --> 00:12:17.000
Artem Trotsyuk: Research-heavy experience that you get through a PhD allows for you to kind of see a lot of different components. So.

00:12:18.000 --> 00:12:27.000
Artem Trotsyuk: I studied engineering and computer science. Through that, we developed smart devices for wound care. And then seeing clinical translation from the research to a bedside.

00:12:27.000 --> 00:12:35.000
Artem Trotsyuk: Application was super interesting. But then also, how do you integrate AI safely with patient data, going through and mining the data for.

00:12:36.000 --> 00:12:41.000
Artem Trotsyuk: Useful corollaries to predict wound healing. What sensor should I.

00:12:41.000 --> 00:12:43.000
Artem Trotsyuk: Put into a device.

00:12:43.000 --> 00:12:48.000
Artem Trotsyuk: That would output a successful result, meaning that I.

00:12:49.000 --> 00:12:57.000
Artem Trotsyuk: A wound is getting better. And here the sensors are changing. Here's what I need to. And here's how I need to modulate the treatment parameters, etc.

00:12:57.000 --> 00:13:00.000
Artem Trotsyuk: And then, in doing that, it was quite interesting to see.

00:13:02.000 --> 00:13:08.000
Artem Trotsyuk: How AI is being deployed in the clinical space, and then how AI development.

00:13:08.000 --> 00:13:12.000
Artem Trotsyuk: Where AI research is happening on the research side with.

00:13:12.000 --> 00:13:18.000
Artem Trotsyuk: With patient-related data. I was interested in that.

00:13:18.000 --> 00:13:23.000
Artem Trotsyuk: And that I transitioned into more a fellowship role. After graduating to.

00:13:23.000 --> 00:13:29.000
Artem Trotsyuk: Ah! To extrapolate more of these risks that are tied into AI development, and so.

00:13:29.000 --> 00:13:32.000
Artem Trotsyuk: That's where a lot of the.

00:13:33.000 --> 00:13:35.000
Artem Trotsyuk: The work now has been thinking about.

00:13:35.000 --> 00:13:37.000
Artem Trotsyuk: If you are to deploy AI systems.

00:13:38.000 --> 00:13:46.000
Artem Trotsyuk: What is reasonable? How do we ensure that we are mitigating bias within the algorithms? How do you make sure that patients are.

00:13:46.000 --> 00:13:48.000
Artem Trotsyuk: Safe, and that we want to, uh.

00:13:49.000 --> 00:13:56.000
Artem Trotsyuk: Not have unforeseen risks or problems for vulnerable patient populations.

00:13:56.000 --> 00:14:02.000
Artem Trotsyuk: And how do we ensure equitable deployment of AI tooling, and.

00:14:02.000 --> 00:14:06.000
Artem Trotsyuk: A lot of these different topics. We get to kind of think about more on the.

00:14:06.000 --> 00:14:15.000
Artem Trotsyuk: From the ethics lens. And then thinking about, how do we work with policymakers to to translate that into more policy efforts that can.

00:14:16.000 --> 00:14:22.000
Artem Trotsyuk: Help people not only understand the limitations and the possibilities of AI systems.

00:14:22.000 --> 00:14:30.000
Artem Trotsyuk: But also, okay, if we were to create policies, how do we not hinder innovation, but also make sure that we are.

00:14:30.000 --> 00:14:33.000
Artem Trotsyuk: Uh um uh.

00:14:34.000 --> 00:14:38.000
Artem Trotsyuk: Modulating risk such that it's not.

00:14:38.000 --> 00:14:44.000
Artem Trotsyuk: It's not propagating and just kind of doing whatever. And we're we're considering.

00:14:44.000 --> 00:14:48.000
Artem Trotsyuk: Considering patient safety and so forth. So that's kind of how.

00:14:48.000 --> 00:14:54.000
Artem Trotsyuk: How my research work informed me in terms of the AI work at Stanford.

00:14:55.000 --> 00:14:57.000
Rishad Usmani: I have. I have 2 different questions.

00:14:58.000 --> 00:15:03.000
Rishad Usmani: The first one is, this is something I find in academics and physicians quite a bit.

00:15:03.000 --> 00:15:06.000
Rishad Usmani: Um, is that they value their product.

00:15:07.000 --> 00:15:12.000
Rishad Usmani: And the technical problem they're solving, and the technical solution they've developed, I would say 80 to 90%.

00:15:13.000 --> 00:15:17.000
Rishad Usmani: And they value distribution the other 10 to 20%. And there's a graveyard of.

00:15:17.000 --> 00:15:20.000
Rishad Usmani: Excellent solutions, which would have.

00:15:20.000 --> 00:15:22.000
Rishad Usmani: Done a lot of good in the world.

00:15:22.000 --> 00:15:24.000
Rishad Usmani: And saved a lot of lives.

00:15:24.000 --> 00:15:29.000
Rishad Usmani: That didn't make it to market, and did not commercialize because of not valuing the latter.

00:15:29.000 --> 00:15:32.000
Rishad Usmani: What are kind of your thoughts here.

00:15:32.000 --> 00:15:36.000
Rishad Usmani: And and you know, to kind of build on this as a physician.

00:15:36.000 --> 00:15:41.000
Rishad Usmani: Wound healing, a lot of it is just clinical, right? I look at a wound and I'm like, is it healing?

00:15:42.000 --> 00:15:46.000
Rishad Usmani: There's certain things I look at, but it's very subjective. There's no.

00:15:46.000 --> 00:15:50.000
Rishad Usmani: You can get into pressure and VAC wounds and.

00:15:50.000 --> 00:15:53.000
Rishad Usmani: You know the specific pressure in the wound, and if it's a.

00:15:54.000 --> 00:15:56.000
Rishad Usmani: If it's alive, or if there's dead tissue, and.

00:15:56.000 --> 00:16:01.000
Rishad Usmani: And measuring it. But, to be frank, in most of the clinical community, we just look at it.

00:16:01.000 --> 00:16:05.000
Rishad Usmani: And we just say it's necrotic or it's not necrotic. Is it warm? Is it cold?

00:16:06.000 --> 00:16:11.000
Rishad Usmani: But it's very subjective. And then asking me to use a solution to replicate that.

00:16:11.000 --> 00:16:15.000
Rishad Usmani: Usually requires tons of data and FDA approval.

00:16:15.000 --> 00:16:20.000
Rishad Usmani: So I guess I guess it's a 2 part question. One is, how do you think about distribution? And then what do you think drives.

00:16:20.000 --> 00:16:24.000
Rishad Usmani: Clinical adoption for medical devices and technologies such as these.

00:16:26.000 --> 00:16:31.000
Artem Trotsyuk: The hardest part is, I think, what you're alluding to there: changing a physician's mind.

00:16:32.000 --> 00:16:34.000
Artem Trotsyuk: In how to do their job.

00:16:34.000 --> 00:16:40.000
Artem Trotsyuk: Right, and because there's a certain aspect of kind of like.

00:16:41.000 --> 00:16:45.000
Artem Trotsyuk: And if I were to make a parallel with sports, athletes.

00:16:46.000 --> 00:16:53.000
Artem Trotsyuk: Um. They got to a certain place in where they are in their career by doing things a certain way, and it's been working.

00:16:53.000 --> 00:16:55.000
Artem Trotsyuk: And things have been going well.

00:16:55.000 --> 00:16:58.000
Artem Trotsyuk: So why would a sports athlete.

00:16:58.000 --> 00:17:04.000
Artem Trotsyuk: Change their diet if they made it to the NFL by eating burgers and nachos, for example.

00:17:04.000 --> 00:17:10.000
Artem Trotsyuk: And telling them that you can't eat that anymore, because now you have to do something better, it's like, well, don't tell me what to do.

00:17:10.000 --> 00:17:16.000
Artem Trotsyuk: I've gotten here because I've been doing what I've been doing, and it works really well for me. Clearly, I'm performing on the field.

00:17:16.000 --> 00:17:19.000
Artem Trotsyuk: Similarly with physicians. Often it's there's an aspect of.

00:17:19.000 --> 00:17:27.000
Artem Trotsyuk: I've seen a lot of cases. I've treated a lot of cases. I know how this is done. Why would I adopt a new solution? That is.

00:17:28.000 --> 00:17:33.000
Artem Trotsyuk: Not within my workflow, or is going to disrupt my.

00:17:33.000 --> 00:17:37.000
Artem Trotsyuk: My my flow, standard care, etc.

00:17:37.000 --> 00:17:45.000
Artem Trotsyuk: And I think what's super important there is tying in and having physician partnership and buy-in early on. So if you're developing solutions.

00:17:45.000 --> 00:17:50.000
Artem Trotsyuk: There's a component of working with the physicians to actually develop something that's useful because there are problems that are still.

00:17:50.000 --> 00:17:55.000
Artem Trotsyuk: Not solved in the medical domain that physicians don't have the time to solve.

00:17:55.000 --> 00:18:01.000
Artem Trotsyuk: But they'd love for someone to go solve that problem. And so that's the biggest part is.

00:18:01.000 --> 00:18:07.000
Artem Trotsyuk: If I were to partner with you, for example, and create something, what would be actually useful for you that you'd be like, yeah.

00:18:07.000 --> 00:18:15.000
Artem Trotsyuk: That would help me do my job better rather than that is not a useful tool for me to use, and I'm not interested in that tool.

00:18:15.000 --> 00:18:27.000
Artem Trotsyuk: So that's where I think the difference is between things that are more adopted within the community versus not. And you can get more physician buy-in by, say, having a co-founder who is a physician, if you're developing a med tool.

00:18:27.000 --> 00:18:35.000
Artem Trotsyuk: Or if you have some folks on the board, those folks at least can advocate for you, or advocate on the behalf of what you're developing.

00:18:35.000 --> 00:18:40.000
Artem Trotsyuk: So on our smart bandage technology. We were fortunate that it was being.

00:18:40.000 --> 00:18:44.000
Artem Trotsyuk: Led by Dr. Geoffrey Gurtner, who is.

00:18:44.000 --> 00:18:53.000
Artem Trotsyuk: A physician scientist. He sees patients. He was at the wound care center. Frequently he was treating wounds, so he saw the problems that were happening in that space.

00:18:53.000 --> 00:18:58.000
Artem Trotsyuk: And he was keen on having solutions that can actually be put into patients' lives, and.

00:18:58.000 --> 00:19:03.000
Artem Trotsyuk: Create things that would actually be useful. And so for him it was a very clear.

00:19:03.000 --> 00:19:11.000
Artem Trotsyuk: ROI in investing the research time into it, in terms of having some sort of product that can then eventually be shipped to patients.

00:19:11.000 --> 00:19:17.000
Artem Trotsyuk: But then, again, that is because of his interest in it. If he didn't have an interest in it, it would be a different story.

00:19:18.000 --> 00:19:24.000
Rishad Usmani: You've kind of touched both the academic world, but also the finance and the VC world.

00:19:24.000 --> 00:19:27.000
Rishad Usmani: Tell me about how you got into.

00:19:28.000 --> 00:19:32.000
Rishad Usmani: Your gig as an entrepreneur in residence, and then maybe tell me a bit about your work with OpenAI.

00:19:33.000 --> 00:19:39.000
Artem Trotsyuk: Yeah. So the the beautiful part about Silicon Valley is that there are so many.

00:19:39.000 --> 00:19:44.000
Artem Trotsyuk: Opportunities for you to just show up and participate in.

00:19:44.000 --> 00:19:47.000
Artem Trotsyuk: And I mean that in a like, in a.

00:19:48.000 --> 00:19:53.000
Artem Trotsyuk: In the way one of my mentors, Ronjon Nag, at one point said.

00:19:53.000 --> 00:20:00.000
Artem Trotsyuk: You know, there are ideas lying on the ground. You just got to pick one up and go with it, kind of thing. Like, there's.

00:20:00.000 --> 00:20:07.000
Artem Trotsyuk: There's plenty of things to do. It's just a matter of execution. That's the first thing, he's like. Another part is.

00:20:07.000 --> 00:20:14.000
Artem Trotsyuk: Just show up. You don't know who's going to be in the room. You don't know who's there. You don't know what you can do if you're not there and you're not present.

00:20:14.000 --> 00:20:21.000
Artem Trotsyuk: And so there's this, you've seen now this push of people going back into the office or returning to work and so forth.

00:20:21.000 --> 00:20:29.000
Artem Trotsyuk: And there, there's a certain aspect of convenience, of being working from home and being able to do a lot of the work from home.

00:20:29.000 --> 00:20:35.000
Artem Trotsyuk: And I feel like there's a good balance between work from home versus work in the office. But there's a certain value-add where, if you're doing.

00:20:35.000 --> 00:20:40.000
Artem Trotsyuk: Problem solving or critical problems that you need to identify solutions for.

00:20:40.000 --> 00:20:45.000
Artem Trotsyuk: To be in a group set setting where you can work there together with folks, and so.

00:20:46.000 --> 00:20:50.000
Artem Trotsyuk: Um with kind of the entrepreneurial side. There is a lot of.

00:20:50.000 --> 00:20:53.000
Artem Trotsyuk: When I was doing my Phd program.

00:20:53.000 --> 00:20:58.000
Artem Trotsyuk: Um. People are doing startups all the time at Stanford. There's.

00:20:58.000 --> 00:21:04.000
Artem Trotsyuk: The Graduate School of Business. There's a ton of startups coming out from there. There's the.

00:21:05.000 --> 00:21:08.000
Artem Trotsyuk: Biodesign, folks, there's the.

00:21:08.000 --> 00:21:16.000
Artem Trotsyuk: There's a lot of other entrepreneurship hubs on campus and so forth. So I was quite keen on understanding what that.

00:21:16.000 --> 00:21:20.000
Artem Trotsyuk: Ecosystem was like, particularly from a scientist's perspective.

00:21:20.000 --> 00:21:26.000
Artem Trotsyuk: Where we, as a scientist, bring technical expertise to the table.

00:21:26.000 --> 00:21:31.000
Artem Trotsyuk: However, we don't necessarily know the non-technical stuff.

00:21:31.000 --> 00:21:41.000
Artem Trotsyuk: And how do we learn the non-technical in order to talk to people about the technical, but also have a frame of reference? Like earlier, talking about the valley of death in.

00:21:41.000 --> 00:21:43.000
Artem Trotsyuk: Translating technologies.

00:21:43.000 --> 00:21:55.000
Artem Trotsyuk: How do we? If you weren't exposed to that, and you didn't know that it existed, you would have one perspective on it. So it's the flip side of the coin that I think is super valuable, that you get from an ecosystem.

00:21:56.000 --> 00:22:02.000
Artem Trotsyuk: And at Stanford you just you get that as an ecosystem. And so that's where I was able to.

00:22:03.000 --> 00:22:08.000
Artem Trotsyuk: Plug in and kind of, participate, and learn from that ecosystem, and then learn and.

00:22:08.000 --> 00:22:14.000
Artem Trotsyuk: Bring the domain expertise of computer science and engineering.

00:22:14.000 --> 00:22:20.000
Artem Trotsyuk: Biology, wound healing, all of that, to the table when evaluating a company and looking at a company deal.

00:22:20.000 --> 00:22:29.000
Artem Trotsyuk: And then taking that as reference point, and understanding how the market dynamics respect and view that situation so.

00:22:29.000 --> 00:22:38.000
Artem Trotsyuk: That's kind of how I got involved on the investment side, because there was a certain aspect of technical domain expertise that I was able to bring to the table that.

00:22:38.000 --> 00:22:49.000
Artem Trotsyuk: Someone, say, who's only getting an MBA does not have, because there's a component of, just like when you're doing a PhD program, you go deep into a topic, and you're the world's expert in a specific topic.

00:22:49.000 --> 00:23:01.000
Artem Trotsyuk: So you're going to be able to talk about that topic. I can talk about wound healing and stuff that works and doesn't work. I've been to all those conferences, all those trade shows; a lot of stuff doesn't work. There are a lot of people trying solutions, but they're all within the same scope, right? Everyone's trying.

00:23:01.000 --> 00:23:08.000
Artem Trotsyuk: A better dressing, a better hydrogel, a better skin substitute. There's all the stuff that people are trying, but.

00:23:08.000 --> 00:23:16.000
Artem Trotsyuk: It isn't moving the needle forward. And is that going to really push the needle on innovation forward, or is this just another thing.

00:23:16.000 --> 00:23:21.000
Artem Trotsyuk: That isn't really going to do anything. And you're just not going to have a product market fit and so forth.

00:23:21.000 --> 00:23:27.000
Artem Trotsyuk: So that's kind of how I got involved with that. And it was. It's been a fun journey to be able to be on both the investment side.

00:23:28.000 --> 00:23:31.000
Artem Trotsyuk: As well as the operation side.

00:23:31.000 --> 00:23:34.000
Artem Trotsyuk: And kind of straddle both, and see.

00:23:34.000 --> 00:23:36.000
Artem Trotsyuk: Two sides of the coin, necessarily.

00:23:37.000 --> 00:23:42.000
Rishad Usmani: We're seeing this with AI scribes. And even though they don't technically.

00:23:42.000 --> 00:23:45.000
Rishad Usmani: Work as well as the standard of care. They're worse than the standard of care.

00:23:46.000 --> 00:23:48.000
Rishad Usmani: They're receiving.

00:23:48.000 --> 00:23:52.000
Rishad Usmani: Immense market adoption because they make things easier for me.

00:23:52.000 --> 00:23:59.000
Rishad Usmani: Do you think there's a parallel here to be drawn in medical device where maybe something doesn't even work as well as the current standard of care.

00:23:59.000 --> 00:24:02.000
Rishad Usmani: But it just makes my job easier and.

00:24:02.000 --> 00:24:06.000
Rishad Usmani: Or do you think success, and by success I mean an exit for a startup.

00:24:07.000 --> 00:24:09.000
Rishad Usmani: Is more contingent on the product working.

00:24:10.000 --> 00:24:15.000
Rishad Usmani: Or their ability to sell into clinical workflows and.

00:24:16.000 --> 00:24:18.000
Artem Trotsyuk: It's both. So.

00:24:18.000 --> 00:24:24.000
Artem Trotsyuk: In order for them to sell into the clinical workflow. I hope the product has some sort of workable something or another.

00:24:24.000 --> 00:24:26.000
Artem Trotsyuk: So in your example of scribes. Uh.

00:24:26.000 --> 00:24:34.000
Artem Trotsyuk: You're right, they're not fully there, but at least they get you 80% or 90% of the way there, and you can quickly correct it, which makes it faster for you.

00:24:34.000 --> 00:24:37.000
Artem Trotsyuk: And actually makes it better for the patient, because then you have all the.

00:24:37.000 --> 00:24:41.000
Artem Trotsyuk: Quick transcription notes, and you can kind of make sure that the patients have enough.

00:24:41.000 --> 00:24:47.000
Artem Trotsyuk: Documented history such that you have a more representative view of what you and the patients talked about.

00:24:47.000 --> 00:24:50.000
Artem Trotsyuk: Um for medical devices.

00:24:50.000 --> 00:24:52.000
Artem Trotsyuk: You would have.

00:24:52.000 --> 00:24:58.000
Artem Trotsyuk: A certain need to have the device work well, I hope, because in order for it to get approved.

00:24:59.000 --> 00:25:01.000
Artem Trotsyuk: You would need to.

00:25:01.000 --> 00:25:03.000
Artem Trotsyuk: Document and prove that the device actually works.

00:25:03.000 --> 00:25:06.000
Artem Trotsyuk: Uh the.

00:25:07.000 --> 00:25:17.000
Artem Trotsyuk: What I've noticed on the adoption side, it goes back to one. Is someone going to pay for something that you're developing. So in the medical device space. It's quite difficult, because.

00:25:17.000 --> 00:25:27.000
Artem Trotsyuk: You need to find someone who's willing to buy what you're building, and if you don't have someone who's willing to buy what you're building, don't build it to begin with. That's kind of the model that I have now adopted.

00:25:27.000 --> 00:25:30.000
Artem Trotsyuk: Because you can spend all this money.

00:25:30.000 --> 00:25:34.000
Artem Trotsyuk: Invest your money to build something, and then you have no.

00:25:34.000 --> 00:25:37.000
Artem Trotsyuk: Interested party to say, here's some money for it.

00:25:37.000 --> 00:25:43.000
Artem Trotsyuk: And then you're stuck because you've developed something. You think the hospitals are going to buy it. Hospitals are actually not going to buy it.

00:25:43.000 --> 00:25:47.000
Artem Trotsyuk: And then what? And then you're kind of stuck with just holding the bag of your device that you just developed.

00:25:48.000 --> 00:25:50.000
Artem Trotsyuk: So there's this.

00:25:50.000 --> 00:25:58.000
Artem Trotsyuk: True component around: if you're going to build something in the medical device space, if it's something that you're planning on implementing with patients, that needs to be.

00:25:58.000 --> 00:26:02.000
Artem Trotsyuk: If it's something that needs to be prescribed by a physician.

00:26:02.000 --> 00:26:08.000
Artem Trotsyuk: You want that to work well. If it's something that people can just buy at your Walgreens or your CVS.

00:26:08.000 --> 00:26:14.000
Artem Trotsyuk: As kind of their own option, that's really up to them. They can go and buy something as a device.

00:26:14.000 --> 00:26:17.000
Artem Trotsyuk: Or some sort of feature like, say.

00:26:18.000 --> 00:26:21.000
Artem Trotsyuk: Uh! I don't know uh.

00:26:21.000 --> 00:26:28.000
Artem Trotsyuk: A a specific type of walking instrument, for example, or right now. Uh, I've seen on uh.

00:26:28.000 --> 00:26:31.000
Artem Trotsyuk: I've seen ads where there's the.

00:26:32.000 --> 00:26:47.000
Artem Trotsyuk: The exoskeletons that help people walk upstairs and so forth, and they're lighter, faster, better. And you can kind of buy them for only $799 or something like that for the basic one. But if you want the faster battery pack, it's $1,200 or something like that.

00:26:47.000 --> 00:26:50.000
Artem Trotsyuk: There's there's a whole gamut of things, but those are, if you like.

00:26:50.000 --> 00:26:52.000
Artem Trotsyuk: The ones that are.

00:26:52.000 --> 00:26:55.000
Artem Trotsyuk: Consumer oriented. So it's a consumer market.

00:26:55.000 --> 00:27:01.000
Artem Trotsyuk: Play rather than specifically a hospital play. And so it really depends on what the the.

00:27:01.000 --> 00:27:05.000
Artem Trotsyuk: The entrepreneur is building.

00:27:05.000 --> 00:27:08.000
Artem Trotsyuk: And what they want to deploy through the market fit. But.

00:27:08.000 --> 00:27:11.000
Artem Trotsyuk: Anyone who's building a startup needs to have.

00:27:12.000 --> 00:27:18.000
Artem Trotsyuk: Some sort of product-market fit idea early on. And then, especially if you're going into something that is going to be more.

00:27:18.000 --> 00:27:22.000
Artem Trotsyuk: Business development heavy. You should start those relationships early.

00:27:22.000 --> 00:27:24.000
Artem Trotsyuk: That's the general gist, now.

00:27:24.000 --> 00:27:30.000
Rishad Usmani: What are your thoughts on digital therapeutics as a category, after what happened with Pear Therapeutics?

00:27:30.000 --> 00:27:32.000
Rishad Usmani: Um. We're kind of seeing.

00:27:32.000 --> 00:27:37.000
Rishad Usmani: The category not take off as much, and it seems like we have this inherent.

00:27:37.000 --> 00:27:39.000
Rishad Usmani: Um. You know.

00:27:40.000 --> 00:27:42.000
Rishad Usmani: Inherent feel that I can take a pill.

00:27:42.000 --> 00:27:51.000
Rishad Usmani: To make something better. But if it's a digital solution, it's less likely to work. And it's not really grounded in fact or any logic.

00:27:51.000 --> 00:27:54.000
Rishad Usmani: It's just the way I think we as a market.

00:27:54.000 --> 00:27:58.000
Rishad Usmani: Or as humans feel. What are your thoughts on digital therapeutics? Do you kind of see them.

00:27:59.000 --> 00:28:01.000
Rishad Usmani: Are you Bullish on them, or are you bearish on them?

00:28:03.000 --> 00:28:07.000
Artem Trotsyuk: Uh, depending on what domain they're in.

00:28:07.000 --> 00:28:15.000
Artem Trotsyuk: I think right now we're in a space where digital anything is easily disrupted.

00:28:15.000 --> 00:28:21.000
Artem Trotsyuk: And it's a question of if they are to invest in a company that has a digital angle.

00:28:21.000 --> 00:28:24.000
Artem Trotsyuk: Of digital therapeutics. For example, a mental health app.

00:28:24.000 --> 00:28:27.000
Artem Trotsyuk: For example, if I were to invest specifically in that.

00:28:28.000 --> 00:28:31.000
Artem Trotsyuk: Ah category does.

00:28:32.000 --> 00:28:37.000
Artem Trotsyuk: That bring a return to the fund. That is kind of like a a big question here.

00:28:38.000 --> 00:28:44.000
Artem Trotsyuk: I don't know if it does right now, at this moment, on May 30th, 2025.

00:28:44.000 --> 00:28:50.000
Artem Trotsyuk: It could be something different tomorrow it might have been something different yesterday. I bring this up because.

00:28:51.000 --> 00:28:53.000
Artem Trotsyuk: There. It is a very.

00:28:53.000 --> 00:28:59.000
Artem Trotsyuk: Disruptable space, and a consumer play is very hard, especially in digital.

00:28:59.000 --> 00:29:02.000
Artem Trotsyuk: You're trying to get people to use your app.

00:29:02.000 --> 00:29:09.000
Artem Trotsyuk: Versus another app. Attention spans are difficult to capture. People are busy, people are distracted with everything.

00:29:09.000 --> 00:29:12.000
Artem Trotsyuk: Trying to have them get yet another app is quite hard.

00:29:12.000 --> 00:29:18.000
Artem Trotsyuk: And right now there's something called vibe coding, where you can just build your own app if you want.

00:29:18.000 --> 00:29:25.000
Artem Trotsyuk: So then the higher order question. Now, I'm literally asking if someone's pitching a idea of like an app.

00:29:25.000 --> 00:29:29.000
Artem Trotsyuk: I ask, and I say, can I vibe code this up myself?

00:29:29.000 --> 00:29:31.000
Artem Trotsyuk: Because if the answer is Yes.

00:29:31.000 --> 00:29:35.000
Artem Trotsyuk: Then you have no business. It kind of. I hate to say it, but.

00:29:35.000 --> 00:29:38.000
Artem Trotsyuk: If I can take your app.

00:29:38.000 --> 00:29:41.000
Artem Trotsyuk: Recreate it for $1.99.

00:29:41.000 --> 00:29:45.000
Artem Trotsyuk: In 30 min, and launch it by.

00:29:45.000 --> 00:29:50.000
Artem Trotsyuk: We're using a few services to kind of launch it onto a web app store or something.

00:29:51.000 --> 00:29:55.000
Artem Trotsyuk: What is the business model for people who are developing digital health apps.

00:29:56.000 --> 00:30:01.000
Artem Trotsyuk: So it comes down to: have you captured a market segment, and are consumers willing to.

00:30:01.000 --> 00:30:09.000
Artem Trotsyuk: Buy into your service. Specifically, that's your data. That's your moat. It's the data that the people are allowing you to capture and the people that you've.

00:30:10.000 --> 00:30:16.000
Artem Trotsyuk: Pulled in and said, Please stay, and they want to stay and use your service. Otherwise you have no digital moat.

00:30:16.000 --> 00:30:20.000
Artem Trotsyuk: Everyone else can now pull that up. I think I even heard.

00:30:20.000 --> 00:30:23.000
Artem Trotsyuk: A talk from, or a mention.

00:30:23.000 --> 00:30:25.000
Artem Trotsyuk: That the Linkedin founder.

00:30:26.000 --> 00:30:29.000
Artem Trotsyuk: Said that he vibe coded up LinkedIn.

00:30:29.000 --> 00:30:35.000
Artem Trotsyuk: In 30 minutes. The same website that they spent many, many dollars, and.

00:30:35.000 --> 00:30:37.000
Artem Trotsyuk: A lot of time on.

00:30:37.000 --> 00:30:39.000
Artem Trotsyuk: One person 30 min. He's like.

00:30:40.000 --> 00:30:42.000
Artem Trotsyuk: Crazy. You can do literally the same thing.

00:30:42.000 --> 00:30:46.000
Artem Trotsyuk: By just using some of these new tools that are being developed.

00:30:46.000 --> 00:30:50.000
Artem Trotsyuk: So then that mean that for all the founders listening.

00:30:50.000 --> 00:30:53.000
Artem Trotsyuk: Rethinking that strategy, like, what is your moat?

00:30:53.000 --> 00:31:01.000
Artem Trotsyuk: That is the most important part, and specifically in digital, how fast you're getting it into people's hands. And are they staying.

00:31:01.000 --> 00:31:04.000
Artem Trotsyuk: Are these people staying with you versus going somewhere else?

00:31:05.000 --> 00:31:07.000
Rishad Usmani: How, how far are we from.

00:31:08.000 --> 00:31:11.000
Rishad Usmani: A point where a person like me, who has.

00:31:11.000 --> 00:31:13.000
Rishad Usmani: No coding knowledge.

00:31:13.000 --> 00:31:16.000
Rishad Usmani: Um. I can maybe say Hello, world, and that's about it.

00:31:17.000 --> 00:31:21.000
Rishad Usmani: Now, how far are are we from me being able to vibe code.

00:31:21.000 --> 00:31:23.000
Rishad Usmani: A solution. Um.

00:31:23.000 --> 00:31:26.000
Rishad Usmani: For a mental health app, say, which has that.

00:31:25.000 --> 00:31:27.000
Artem Trotsyuk: You can do it today.

00:31:27.000 --> 00:31:29.000
Artem Trotsyuk: You can do it. Today, you can go on.

00:31:28.000 --> 00:31:30.000
Rishad Usmani: So that means.

00:31:30.000 --> 00:31:37.000
Artem Trotsyuk: Lovable.dev. That's like one example, and you can use language, and you can chat in that.

00:31:37.000 --> 00:31:42.000
Artem Trotsyuk: Site and say, I want to create mental health app with these features, measuring these markers.

00:31:43.000 --> 00:31:45.000
Artem Trotsyuk: It's going to go and think for about 10 to 15 min.

00:31:45.000 --> 00:31:47.000
Artem Trotsyuk: And it's going to output you an app.

00:31:48.000 --> 00:31:53.000
Artem Trotsyuk: And then you can kind of say, Oh, I don't like this feature. Can you please modify this? Or can you change this and etc, etc?

00:31:53.000 --> 00:31:57.000
Artem Trotsyuk: And you modify as you like, but you can do it now. There's tools out there that.

00:31:56.000 --> 00:32:00.000
Rishad Usmani: And it will debug the back end as well.

00:32:00.000 --> 00:32:08.000
Artem Trotsyuk: It does, yes. And so if something's not working, for example, if you're trying to push something and it's not working, it says error. You tell the app, you tell Lovable, like.

00:32:08.000 --> 00:32:18.000
Artem Trotsyuk: There's an error. I'm confused. It's not allowing me to do it, and it will say things like running a script to identify errors, fixing the errors. It looks like we identified errors, and we fixed them and stuff.

00:32:18.000 --> 00:32:25.000
Artem Trotsyuk: So you'll get workable something. And then, if you were to deploy that, then there's a few other steps you take in terms of.

00:32:25.000 --> 00:32:29.000
Artem Trotsyuk: QC, and making sure that it's on a specific server and stuff. And so.

00:32:29.000 --> 00:32:40.000
Artem Trotsyuk: Those few things, say, you can use chat to identify what the steps are for you to launch an app, and what vibe coding tools exist for you to go from a prototype version to a fully deployable version.

00:32:40.000 --> 00:32:44.000
Artem Trotsyuk: But you don't necessarily need to.

00:32:44.000 --> 00:32:48.000
Artem Trotsyuk: Ah, write your own code. It you can do it now if you want.

00:32:48.000 --> 00:32:51.000
Rishad Usmani: So for for an AI solution.

00:32:51.000 --> 00:32:57.000
Rishad Usmani: Does defensibility now lie solely in the distribution advantage? Or is there some.

00:32:58.000 --> 00:33:00.000
Rishad Usmani: Inherent product defensibility.

00:33:00.000 --> 00:33:06.000
Artem Trotsyuk: That's the hard part right now. Everyone's trying to figure that out, like, what is the defensibility of AI solutions in which, if I can.

00:33:07.000 --> 00:33:09.000
Artem Trotsyuk: Vibe code it out.

00:33:09.000 --> 00:33:18.000
Artem Trotsyuk: What happens to the, you know, how do you IP-protect AI? It's quite hard, right? That's a very hard problem.

00:33:18.000 --> 00:33:20.000
Artem Trotsyuk: That people are so solving right now.

00:33:21.000 --> 00:33:25.000
Artem Trotsyuk: But I go back to: if you're developing any kind of AI solution.

00:33:25.000 --> 00:33:27.000
Artem Trotsyuk: Your your moat.

00:33:27.000 --> 00:33:32.000
Artem Trotsyuk: Is your people aspect or your business, or your clients, whoever you're.

00:33:32.000 --> 00:33:38.000
Artem Trotsyuk: Saying are going to use that. That's your moat. Because if you have a solution which you have hundreds of thousands of people on.

00:33:38.000 --> 00:33:44.000
Artem Trotsyuk: And they like it, and they continue coming back to it. Then you have a growth strategy in which you can pull in more people.

00:33:44.000 --> 00:33:47.000
Artem Trotsyuk: If you don't have that, and they can churn and go to another.

00:33:47.000 --> 00:33:55.000
Artem Trotsyuk: Place, it doesn't matter how great your app is; someone else can develop that, they can pull your customers away from you, and the customers don't come back.

00:33:55.000 --> 00:33:57.000
Artem Trotsyuk: Doesn't matter.

00:33:57.000 --> 00:34:03.000
Rishad Usmani: That being said, what are your thoughts on the future of knowledge work? Are we all marketers in the future?

00:34:04.000 --> 00:34:07.000
Artem Trotsyuk: I think we're all creatives in the future.

00:34:07.000 --> 00:34:09.000
Artem Trotsyuk: Because.

00:34:09.000 --> 00:34:16.000
Artem Trotsyuk: In order for you to get a good output from AI tools. You need to have a good.

00:34:16.000 --> 00:34:19.000
Artem Trotsyuk: Way to prompt the tool to do the work.

00:34:19.000 --> 00:34:22.000
Artem Trotsyuk: And prompting comes from having.

00:34:22.000 --> 00:34:25.000
Artem Trotsyuk: What understanding, well good understanding.

00:34:25.000 --> 00:34:31.000
Artem Trotsyuk: Of the English language in this case, because a lot of the tools are being built out primarily in English, but there are.

00:34:31.000 --> 00:34:33.000
Artem Trotsyuk: Other languages coming online.

00:34:33.000 --> 00:34:36.000
Artem Trotsyuk: Being able to ask.

00:34:36.000 --> 00:34:39.000
Artem Trotsyuk: The AI, the right way.

00:34:39.000 --> 00:34:42.000
Artem Trotsyuk: Comes from being able to communicate.

00:34:42.000 --> 00:34:45.000
Artem Trotsyuk: In English the right way.

00:34:45.000 --> 00:34:51.000
Artem Trotsyuk: In order for the tool to understand you. So you going back to being more creative thinkers.

00:34:51.000 --> 00:34:57.000
Artem Trotsyuk: And creatively walking through a task is something that more people are going to get used to.

00:34:57.000 --> 00:34:59.000
Artem Trotsyuk: I don't buy the idea that.

00:35:00.000 --> 00:35:05.000
Artem Trotsyuk: AI will replace people, because that's been an argument that's been said over the last 100 years.

00:35:05.000 --> 00:35:14.000
Artem Trotsyuk: And we're still around. And there's still money to be made. And there's still different types of jobs. There's new types of jobs that come out every single time. There's 1 of these pushes.

00:35:14.000 --> 00:35:22.000
Artem Trotsyuk: Right, you know, even an influencer. When I was growing up, there was no such job as being an influencer.

00:35:22.000 --> 00:35:25.000
Artem Trotsyuk: That's a whole category and a huge market, right?

00:35:25.000 --> 00:35:27.000
Artem Trotsyuk: Um.

00:35:29.000 --> 00:35:31.000
Artem Trotsyuk: How does work.

00:35:31.000 --> 00:35:35.000
Artem Trotsyuk: Change? That's the interesting part. No one really knows, because.

00:35:35.000 --> 00:35:38.000
Artem Trotsyuk: Technology is moving so quickly. But if you've noticed, kind of.

00:35:38.000 --> 00:35:41.000
Artem Trotsyuk: Some AI tools are growing really fast.

00:35:41.000 --> 00:35:43.000
Artem Trotsyuk: But then there's some companies that are like.

00:35:44.000 --> 00:35:51.000
Artem Trotsyuk: It's not working as well, so we're scaling back the amount of AI that we're deploying; we're going to have more humans in the loop.

00:35:51.000 --> 00:35:56.000
Artem Trotsyuk: Why is it that it's not working well? Well, it's because the solution.

00:35:56.000 --> 00:36:01.000
Artem Trotsyuk: Is hallucinating. There's an aspect that the solution was not trained on certain set of data.

00:36:02.000 --> 00:36:04.000
Artem Trotsyuk: It wants to provide you with an output.

00:36:04.000 --> 00:36:09.000
Artem Trotsyuk: And so it provides you with what it thinks is a reasonable output. Given the structure of the data.

00:36:09.000 --> 00:36:15.000
Artem Trotsyuk: But it's not the correct output, based on what you know to be correct.

00:36:16.000 --> 00:36:18.000
Artem Trotsyuk: So then the higher order question is.

00:36:19.000 --> 00:36:21.000
Artem Trotsyuk: How do we.

00:36:21.000 --> 00:36:33.000
Artem Trotsyuk: How do we rectify this, right? How do we balance that? There's still a lot of human-in-the-loop component that needs to happen. And so I'm not at the point of saying AI is going to replace us yet. I feel like.

00:36:34.000 --> 00:36:38.000
Artem Trotsyuk: AI will only enable us, in terms of doing even more interesting fun stuff.

00:36:40.000 --> 00:36:43.000
Rishad Usmani: This is a full circle moment where the most.

00:36:43.000 --> 00:36:49.000
Rishad Usmani: Competitive, and the most useful degrees of the future might be arts and language, degrees.

00:36:49.000 --> 00:36:51.000
Rishad Usmani: And not STEM degrees.

00:36:51.000 --> 00:36:53.000
Artem Trotsyuk: Yeah, people.

00:36:52.000 --> 00:36:54.000
Rishad Usmani: Which is a very, very interesting reality.

00:36:54.000 --> 00:36:58.000
Artem Trotsyuk: I think people need to understand how computers work.

00:36:58.000 --> 00:37:00.000
Artem Trotsyuk: How AI systems work.

00:37:00.000 --> 00:37:05.000
Artem Trotsyuk: How architecture stuff works, how large language models work.

00:37:05.000 --> 00:37:08.000
Artem Trotsyuk: But I don't necessarily.

00:37:08.000 --> 00:37:10.000
Artem Trotsyuk: Believe that.

00:37:10.000 --> 00:37:12.000
Artem Trotsyuk: You, uh.

00:37:13.000 --> 00:37:16.000
Artem Trotsyuk: Will not be able to get.

00:37:16.000 --> 00:37:19.000
Artem Trotsyuk: Some interesting application layer on top of that.

00:37:20.000 --> 00:37:23.000
Artem Trotsyuk: I think that we don't necessarily know.

00:37:23.000 --> 00:37:28.000
Artem Trotsyuk: What jobs might be created, because it's moving so quickly.

00:37:28.000 --> 00:37:31.000
Artem Trotsyuk: And it's uncertain what work will look like.

00:37:32.000 --> 00:37:35.000
Artem Trotsyuk: Clearly work is being augmented and certain.

00:37:35.000 --> 00:37:40.000
Artem Trotsyuk: Laborious tasks are being offset with tooling that's being developed.

00:37:41.000 --> 00:37:46.000
Artem Trotsyuk: But it's still in its early phases of being iterated on and tested.

00:37:46.000 --> 00:37:52.000
Artem Trotsyuk: And so the higher order question is, what is that equilibrium? Once we reach a certain.

00:37:53.000 --> 00:37:56.000
Artem Trotsyuk: Adoption, a certain level of adoption.

00:37:56.000 --> 00:37:58.000
Artem Trotsyuk: What is that equilibrium? And what does that look like?

00:37:58.000 --> 00:38:01.000
Artem Trotsyuk: And I feel like that's still not decided yet.

00:38:02.000 --> 00:38:08.000
Rishad Usmani: And do you kind of buy into the Sequoia philosophy that the future unicorns and decacorns.

00:38:08.000 --> 00:38:12.000
Rishad Usmani: Will be application layers built on these platform companies.

00:38:12.000 --> 00:38:14.000
Rishad Usmani: Versus should we still.

00:38:14.000 --> 00:38:18.000
Rishad Usmani: Be building new models, new platform companies for AI.

00:38:19.000 --> 00:38:28.000
Artem Trotsyuk: I've heard that discussion around kind of Web 1.0, Web 2.0, the Facebooks and the Googles, and those are all built off of the Internet. And so this is similarly.

00:38:28.000 --> 00:38:33.000
Artem Trotsyuk: AI tools, or AI language models.

00:38:34.000 --> 00:38:40.000
Artem Trotsyuk: The models themselves are your foundation level, and then the applications are built on top of that.

00:38:41.000 --> 00:38:47.000
Artem Trotsyuk: I don't think any of us have seen strong, convincing applications that have come out just yet.

00:38:47.000 --> 00:38:49.000
Artem Trotsyuk: Because they're all sort of integrated, even.

00:38:49.000 --> 00:38:55.000
Artem Trotsyuk: For example, Google's recent announcements with Veo and a few others, the things that they came out with.

00:38:55.000 --> 00:38:57.000
Artem Trotsyuk: Pushed out.

00:38:57.000 --> 00:39:00.000
Artem Trotsyuk: Not only, like, killed a bunch of startups.

00:39:00.000 --> 00:39:09.000
Artem Trotsyuk: Literally, but in a way, a lot of startups were working in certain verticals specific to creating video and creating images and such.

00:39:09.000 --> 00:39:15.000
Artem Trotsyuk: And then you have a Google showcase in which they're dropping video models, image models. And so.

00:39:15.000 --> 00:39:22.000
Artem Trotsyuk: How does that redefine the application-layer companies that work in that space to be really good at, specifically, video creation, for example?

00:39:22.000 --> 00:39:24.000
Artem Trotsyuk: Uh.

00:39:25.000 --> 00:39:27.000
Artem Trotsyuk: We don't necessarily know.

00:39:28.000 --> 00:39:30.000
Artem Trotsyuk: Yet what the next Facebook will be.

00:39:31.000 --> 00:39:34.000
Artem Trotsyuk: In kind of looking at the landscape.

00:39:34.000 --> 00:39:39.000
Artem Trotsyuk: But there's still a lot more work to be done in model development, to make them be.

00:39:39.000 --> 00:39:43.000
Artem Trotsyuk: Similar to human thought, human reasoning.

00:39:43.000 --> 00:39:47.000
Artem Trotsyuk: That a lot of people are still working through. So it's still pretty early stages.

00:39:47.000 --> 00:39:53.000
Artem Trotsyuk: A lot of the exciting work happening right now in the AI space is focusing on reasoning, focusing on.

00:39:54.000 --> 00:39:58.000
Artem Trotsyuk: The decision making, focusing on.

00:40:00.000 --> 00:40:02.000
Artem Trotsyuk: Modeling, psychology.

00:40:03.000 --> 00:40:06.000
Artem Trotsyuk: And AI systems, and that in itself.

00:40:07.000 --> 00:40:09.000
Artem Trotsyuk: Is still fairly new. And that's where social science.

00:40:10.000 --> 00:40:16.000
Artem Trotsyuk: Comes in, and having people who are studying social science partner with computer scientists to develop more.

00:40:16.000 --> 00:40:18.000
Artem Trotsyuk: Human like models is super.

00:40:18.000 --> 00:40:20.000
Artem Trotsyuk: Exciting, in my opinion.

00:40:20.000 --> 00:40:24.000
Rishad Usmani: So one of the key things, I think, and I'm not technical.

00:40:24.000 --> 00:40:27.000
Rishad Usmani: Is the Internet has no owner.

00:40:27.000 --> 00:40:32.000
Rishad Usmani: Because the military kind of standardized the TCP/IP protocols. And from what I understand, those are like the.

00:40:33.000 --> 00:40:35.000
Rishad Usmani: How-to for designing an Internet and its protocols.

00:40:35.000 --> 00:40:38.000
Rishad Usmani: Um. Anyone can. Then go ahead and build.

00:40:39.000 --> 00:40:41.000
Rishad Usmani: The Internet isn't 1 Internet.

00:40:41.000 --> 00:40:43.000
Rishad Usmani: It's multiple autonomous networks.

00:40:43.000 --> 00:40:49.000
Rishad Usmani: We're not seeing this in the AI Foundation models. People own these models. There's companies that own the base models.

00:40:50.000 --> 00:40:52.000
Rishad Usmani: Do you think that.

00:40:52.000 --> 00:40:58.000
Rishad Usmani: Um, you know, the same thing will happen to AI foundation models where they will just be ubiquitous and.

00:40:58.000 --> 00:41:03.000
Rishad Usmani: Where the core structure of an AI foundation model will be public knowledge.

00:41:03.000 --> 00:41:07.000
Rishad Usmani: And then everyone can build it. And it's kind of standardized or.

00:41:07.000 --> 00:41:11.000
Rishad Usmani: Do you kind of see that happening with AI foundation models, what happened with the Internet?

00:41:11.000 --> 00:41:17.000
Artem Trotsyuk: It's already moving in that direction, right? You have Facebook releasing Llama.

00:41:17.000 --> 00:41:21.000
Artem Trotsyuk: You have the open-source community, DeepSeek.

00:41:21.000 --> 00:41:24.000
Artem Trotsyuk: People are developing different models. So.

00:41:24.000 --> 00:41:26.000
Artem Trotsyuk: It's already moving in a direction of.

00:41:26.000 --> 00:41:28.000
Artem Trotsyuk: The money.

00:41:28.000 --> 00:41:33.000
Artem Trotsyuk: To be made in the potential future is going to be more on the application side.

00:41:33.000 --> 00:41:39.000
Artem Trotsyuk: Rather than on the model side, since the models are becoming more a commodity like the Internet as you mentioned. So.

00:41:39.000 --> 00:41:42.000
Artem Trotsyuk: I think that it is moving in that direction, and.

00:41:42.000 --> 00:41:45.000
Artem Trotsyuk: You'll see a lot more of that.

00:41:45.000 --> 00:41:47.000
Artem Trotsyuk: Every company.

00:41:47.000 --> 00:41:51.000
Artem Trotsyuk: Models themselves will become more commodities.

00:41:51.000 --> 00:41:54.000
Artem Trotsyuk: But the way in which the companies.

00:41:55.000 --> 00:41:57.000
Artem Trotsyuk: Tune the models.

00:41:57.000 --> 00:41:59.000
Artem Trotsyuk: For their specific applications.

00:41:59.000 --> 00:42:04.000
Artem Trotsyuk: Is kind of what you were talking earlier about the application layer, in which, if you were to deploy.

00:42:04.000 --> 00:42:09.000
Artem Trotsyuk: A custom GPT or a specific video model or a specific image model.

00:42:09.000 --> 00:42:15.000
Artem Trotsyuk: That might be tuned on what your company has as proprietary data, sitting on top of.

00:42:15.000 --> 00:42:17.000
Artem Trotsyuk: The base layer.
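
A minimal sketch of what that application-layer tuning can look like, assuming a Hugging Face style workflow in Python; the base model ID, the data file, and the hyperparameters below are illustrative placeholders, not anything discussed in the conversation.

```python
# Illustrative sketch: tuning a commodity open base model on a company's
# proprietary text (assumes the Hugging Face `transformers` and `datasets`
# libraries; model ID, file name, and settings are placeholders).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "meta-llama/Llama-3.2-1B"  # any open base model; the base itself is the commodity
tok = AutoTokenizer.from_pretrained(base)
if tok.pad_token is None:
    tok.pad_token = tok.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# The proprietary layer: text only this company owns, one example per line.
data = load_dataset("text", data_files={"train": "proprietary_notes.txt"})
data = data.map(lambda ex: tok(ex["text"], truncation=True, max_length=512),
                batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tuned-model",
                           per_device_train_batch_size=2,
                           num_train_epochs=1),
    train_dataset=data["train"],
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
)
trainer.train()  # the differentiation lives in this layer, on top of the base model
```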

00:42:17.000 --> 00:42:21.000
Rishad Usmani: What are you most excited about in AI and drug discovery,

00:42:21.000 --> 00:42:28.000
Rishad Usmani: In generative models for large data, and in ambient intelligence? And what are some risks you see in those three categories?

00:42:29.000 --> 00:42:32.000
Artem Trotsyuk: I think it's super interesting to have.

00:42:34.000 --> 00:42:37.000
Artem Trotsyuk: The opportunity to mine a lot more data. Now.

00:42:37.000 --> 00:42:41.000
Artem Trotsyuk: So if we're looking from a pure research perspective.

00:42:41.000 --> 00:42:43.000
Artem Trotsyuk: Because of the.

00:42:43.000 --> 00:42:46.000
Artem Trotsyuk: Ability for us to have lower costs on compute.

00:42:46.000 --> 00:42:55.000
Artem Trotsyuk: And the ability for us to dissect larger amounts of data, there's still a very large foray of opportunity that we have not fully.

00:42:55.000 --> 00:43:01.000
Artem Trotsyuk: Dissected that could lead us to identify new.

00:43:01.000 --> 00:43:03.000
Artem Trotsyuk: Modalities for treating diseases.

00:43:03.000 --> 00:43:08.000
Artem Trotsyuk: Or different interventions that we have yet to discover. So that's on the research side.

00:43:09.000 --> 00:43:11.000
Artem Trotsyuk: I think that there's an opportunity there.

00:43:12.000 --> 00:43:16.000
Artem Trotsyuk: Ah for research scientists. Now, if research scientists find some.

00:43:16.000 --> 00:43:22.000
Artem Trotsyuk: Crazy version of Covid that should probably not be known as a.

00:43:22.000 --> 00:43:27.000
Artem Trotsyuk: A variant, or releasing that data into.

00:43:27.000 --> 00:43:29.000
Artem Trotsyuk: The ether, that might.

00:43:29.000 --> 00:43:33.000
Artem Trotsyuk: Make sense for some more safeguarding.

00:43:33.000 --> 00:43:36.000
Artem Trotsyuk: Or guardrailing around certain AI tooling.

00:43:36.000 --> 00:43:41.000
Artem Trotsyuk: To not have that be an output, and that's where a lot of the folks in the trust and safety.

00:43:41.000 --> 00:43:44.000
Artem Trotsyuk: Teams and a lot of folks who are doing preparedness work.

00:43:44.000 --> 00:43:49.000
Artem Trotsyuk: Are tying in their efforts in that space, and.

00:43:49.000 --> 00:43:53.000
Artem Trotsyuk: Are coalescing around this idea of how do we ensure reliable.

00:43:53.000 --> 00:43:55.000
Artem Trotsyuk: And useful AI.

00:43:55.000 --> 00:44:00.000
Artem Trotsyuk: But also taking into account that we want to prevent.

00:44:00.000 --> 00:44:05.000
Artem Trotsyuk: Bad actors from doing bad. The general consensus, even with the work we've studied on.

00:44:05.000 --> 00:44:08.000
Artem Trotsyuk: Risk mitigation is that.

00:44:10.000 --> 00:44:16.000
Artem Trotsyuk: If a bad actor wants to do bad, they will still do bad. You cannot stop them.

00:44:16.000 --> 00:44:21.000
Artem Trotsyuk: What you can do is you can make it harder for them to do that bad.

00:44:21.000 --> 00:44:26.000
Artem Trotsyuk: For example, so you can make it harder for people to get a scalable.

00:44:26.000 --> 00:44:29.000
Artem Trotsyuk: Product out that is going to scale.

00:44:29.000 --> 00:44:31.000
Artem Trotsyuk: To uh.

00:44:31.000 --> 00:44:36.000
Artem Trotsyuk: Infect a bunch of people, for example. Perhaps it's harder for them to develop a.

00:44:36.000 --> 00:44:41.000
Artem Trotsyuk: Biologic that is something that can be sequenced in a large-scale facility.

00:44:44.000 --> 00:44:50.000
Artem Trotsyuk: And for research scientists, it is important to kind of think about, if you are to publish some data.

00:44:50.000 --> 00:44:53.000
Artem Trotsyuk: If you are to think about the risks associated with the work that you're doing.

00:44:54.000 --> 00:44:57.000
Artem Trotsyuk: How does that fit into the narrative.

00:44:57.000 --> 00:45:00.000
Artem Trotsyuk: To make sure that you're developing AI that is safe.

00:45:00.000 --> 00:45:02.000
Artem Trotsyuk: That is.

00:45:02.000 --> 00:45:06.000
Artem Trotsyuk: Not going to harm people, but also moves the needle forward in discovery.

00:45:07.000 --> 00:45:09.000
Artem Trotsyuk: I'm more bullish on AI tooling.

00:45:10.000 --> 00:45:16.000
Artem Trotsyuk: As a mode for novel discovery, and I think that there's many more opportunities that we have yet to tap into.

00:45:16.000 --> 00:45:20.000
Artem Trotsyuk: Protein sequencing, for example, as a means to identify.

00:45:20.000 --> 00:45:26.000
Artem Trotsyuk: New modalities. Perhaps there's more personalized therapeutics as a.

00:45:26.000 --> 00:45:32.000
Artem Trotsyuk: As a means to develop something for you as an individual that will work for your body. Research has recently shown that.

00:45:33.000 --> 00:45:35.000
Artem Trotsyuk: Even if you and I, for example.

00:45:35.000 --> 00:45:42.000
Artem Trotsyuk: Eat a bag of chips, my body will react differently to those chips than your body will.

00:45:42.000 --> 00:45:46.000
Artem Trotsyuk: What does that mean for us if we are trying to create therapies.

00:45:46.000 --> 00:45:54.000
Artem Trotsyuk: Or drugs. If my body reacts to chips by not spiking my glucose and your body spikes its glucose.

00:45:54.000 --> 00:45:57.000
Artem Trotsyuk: Should you have a different type of.

00:45:57.000 --> 00:45:59.000
Artem Trotsyuk: Therapeutic that's built for you.

00:46:00.000 --> 00:46:06.000
Artem Trotsyuk: That would allow for your body to be more receptive to that therapeutic, whereas for me, it wouldn't really make a difference right.

00:46:07.000 --> 00:46:11.000
Artem Trotsyuk: That, I think is super exciting. That is something that you don't necessarily get to see.

00:46:12.000 --> 00:46:14.000
Artem Trotsyuk: As a.

00:46:15.000 --> 00:46:20.000
Artem Trotsyuk: As an opportunity. If you don't have the ability to mine.

00:46:20.000 --> 00:46:25.000
Artem Trotsyuk: A bunch more data in an efficient way to get novel insights. So,

00:46:25.000 --> 00:46:30.000
Artem Trotsyuk: Going back to what I was saying earlier, we're still way too early to kind of decide.

00:46:30.000 --> 00:46:33.000
Artem Trotsyuk: And say, based on what's happening on the AI front, that.

00:46:35.000 --> 00:46:42.000
Artem Trotsyuk: We should completely stop and not do anything, just to mitigate risk. There are certain things we need to protect against.

00:46:42.000 --> 00:46:48.000
Artem Trotsyuk: But also understanding that we have many, many more discoveries that have yet to happen.

00:46:48.000 --> 00:46:53.000
Artem Trotsyuk: And so it's the balance of educating the researchers while at the same time.

00:46:53.000 --> 00:46:55.000
Artem Trotsyuk: Allowing for the researchers to do their jobs.

00:46:55.000 --> 00:46:58.000
Artem Trotsyuk: By being able to develop and deploy these AI tools.

00:47:00.000 --> 00:47:07.000
Rishad Usmani: Do you? And I'll ask a very broad question, and then it does lead to a more specific question. But I'd like you to answer the broad question.

00:47:07.000 --> 00:47:13.000
Rishad Usmani: First. Do you think the universe is more probabilistic,

00:47:13.000 --> 00:47:18.000
Rishad Usmani: More based on correlation, kind of more like the quantum mechanics view of the universe?

00:47:19.000 --> 00:47:23.000
Rishad Usmani: Or do you think it is more deterministic, more causative.

00:47:25.000 --> 00:47:27.000
Artem Trotsyuk: I don't know. I mean, that is a good question.

00:47:28.000 --> 00:47:33.000
Artem Trotsyuk: Um haven't really thought about that too much. To be honest, I think.

00:47:34.000 --> 00:47:37.000
Artem Trotsyuk: There's so many mysteries of the universe that we.

00:47:38.000 --> 00:47:41.000
Artem Trotsyuk: Don't know, and all of.

00:47:41.000 --> 00:47:45.000
Artem Trotsyuk: The mysteries of the universe are tied to our best hypotheses.

00:47:45.000 --> 00:47:47.000
Artem Trotsyuk: Of what's happening in the world.

00:47:47.000 --> 00:47:54.000
Artem Trotsyuk: Or how we think certain things operate. But the more we study, the more we realize that some of our hypotheses might not make sense.

00:47:54.000 --> 00:47:59.000
Artem Trotsyuk: So not sure. It's a good question, but I don't know if I have a good answer for you.

00:48:00.000 --> 00:48:06.000
Rishad Usmani: But so when we think about, and I have a very strong inherent bias here which will come across in my questioning.

00:48:06.000 --> 00:48:09.000
Rishad Usmani: When we think about medicine.

00:48:09.000 --> 00:48:13.000
Rishad Usmani: And AI regulation specifically within healthcare, and the FDA.

00:48:13.000 --> 00:48:17.000
Rishad Usmani: We seem to be stuck on this concept of explainability or causation.

00:48:17.000 --> 00:48:20.000
Rishad Usmani: We must understand why this happens.

00:48:20.000 --> 00:48:25.000
Rishad Usmani: And if AI say comes up with a new molecule that we cannot even detect.

00:48:25.000 --> 00:48:27.000
Rishad Usmani: Right. It says there's a molecule that exists.

00:48:27.000 --> 00:48:30.000
Rishad Usmani: Or there is a substance that exists in our blood.

00:48:30.000 --> 00:48:32.000
Rishad Usmani: Or in our body, called.

00:48:33.000 --> 00:48:41.000
Rishad Usmani: You know, and I'm just making this up, call it a "genome" or something, right? Something we haven't heard of. And this is what predicts cancer across the board.

00:48:42.000 --> 00:48:46.000
Rishad Usmani: And AI says, you know, I have this much correlation data, millions of data points.

00:48:46.000 --> 00:48:56.000
Rishad Usmani: This is what you should look out for. What I need is a picture of you to detect it, that's all I need. I don't need a blood sample, I don't need anything. Or I need your voice, something we don't use as a diagnostic.

00:48:56.000 --> 00:49:01.000
Rishad Usmani: So I'm kind of getting deeper into the woods here. But the question is.

00:49:01.000 --> 00:49:06.000
Rishad Usmani: Are we doing humanity? And AI a disservice by being stuck.

00:49:06.000 --> 00:49:08.000
Rishad Usmani: On causation and explainability.

00:49:09.000 --> 00:49:11.000
Rishad Usmani: And I'm kind of equating the two here, determinism.

00:49:12.000 --> 00:49:16.000
Rishad Usmani: Versus just recognizing, you know, all we can find out is correlation.

00:49:18.000 --> 00:49:23.000
Rishad Usmani: And and maybe it's because we're not smart enough. Maybe it's because we are limited by 5 senses.

00:49:23.000 --> 00:49:29.000
Rishad Usmani: But we just can't. Like, it's an exercise in futility, and we're just wasting time.

00:49:29.000 --> 00:49:32.000
Rishad Usmani: Being stuck on explainability and causation, we should just.

00:49:32.000 --> 00:49:35.000
Rishad Usmani: Focus on correlation. What are your thoughts there?

00:49:35.000 --> 00:49:37.000
Artem Trotsyuk: Well, you know, like.

00:49:37.000 --> 00:49:42.000
Artem Trotsyuk: So uh, you bring up 2 topics that I will.

00:49:42.000 --> 00:49:47.000
Artem Trotsyuk: Push on, based on the stuff that we talk about in the classes that we teach on.

00:49:48.000 --> 00:49:52.000
Artem Trotsyuk: Intro to AI, and there's an aspect of.

00:49:52.000 --> 00:49:56.000
Artem Trotsyuk: Does correlation imply causation, and vice versa.

00:49:56.000 --> 00:50:00.000
Artem Trotsyuk: If I were to say that, for example, it's.

00:50:01.000 --> 00:50:05.000
Artem Trotsyuk: Today is dry and hot and sunny, and it's summer weather.

00:50:06.000 --> 00:50:08.000
Artem Trotsyuk: Causation could mean that.

00:50:08.000 --> 00:50:13.000
Artem Trotsyuk: I want ice cream or causation could mean that I get a sunburn.

00:50:13.000 --> 00:50:16.000
Artem Trotsyuk: But does getting a sunburn.

00:50:16.000 --> 00:50:19.000
Artem Trotsyuk: And getting ice cream.

00:50:19.000 --> 00:50:21.000
Artem Trotsyuk: Is that a correlation meaning that.

00:50:21.000 --> 00:50:25.000
Artem Trotsyuk: If I get a sunburn, then I also get ice cream.

00:50:25.000 --> 00:50:32.000
Artem Trotsyuk: But does that necessarily correlate to it being dry, hot, and sunny weather, for example? And there's.

00:50:32.000 --> 00:50:35.000
Artem Trotsyuk: There are many more different types. For example.

00:50:35.000 --> 00:50:38.000
Artem Trotsyuk: For example, if you have germs.

00:50:38.000 --> 00:50:41.000
Artem Trotsyuk: A cause of that could be that it can lead to bad smells.

00:50:41.000 --> 00:50:49.000
Artem Trotsyuk: Or a cause can be that it leads to diseases. But does bad smell correlate to disease? Not always. And so.

00:50:49.000 --> 00:50:52.000
Artem Trotsyuk: It's kind of this interesting foray of causation.

00:50:53.000 --> 00:50:59.000
Artem Trotsyuk: Versus correlation that we bring up in our class all the time when people try to link the two.
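
A toy numerical version of that classroom example, assuming Python with NumPy; the numbers are made up, with sunny weather as the hidden common cause so that sunburn and ice cream correlate strongly even though neither causes the other.

```python
# Toy illustration of correlation without causation (assumes NumPy).
# "sun" is the hidden common cause; sunburn and ice cream are both driven by it.
import numpy as np

rng = np.random.default_rng(0)
sun = rng.uniform(0, 1, 10_000)                  # how hot and sunny each day is
sunburn = 2.0 * sun + rng.normal(0, 0.3, 10_000)
ice_cream = 3.0 * sun + rng.normal(0, 0.3, 10_000)

# Sunburn and ice cream correlate strongly (roughly 0.85 here)...
print(np.corrcoef(sunburn, ice_cream)[0, 1])

# ...but restricting to equally sunny days removes most of that correlation,
# which is what a shared cause, rather than a causal link, predicts.
hot = sun > 0.8
print(np.corrcoef(sunburn[hot], ice_cream[hot])[0, 1])
```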

00:51:00.000 --> 00:51:02.000
Artem Trotsyuk: I think that.

00:51:02.000 --> 00:51:04.000
Artem Trotsyuk: For.

00:51:04.000 --> 00:51:08.000
Artem Trotsyuk: For what you touched upon on explainable AI.

00:51:08.000 --> 00:51:17.000
Artem Trotsyuk: It's important for us to understand how the AI is making certain decisions, such that we can kind of evaluate to make sure it's not hallucinating.

00:51:17.000 --> 00:51:21.000
Artem Trotsyuk: So for folks. When I hear explainable AI.

00:51:21.000 --> 00:51:26.000
Artem Trotsyuk: For me. That means that I want to understand how it's making its decisions.

00:51:26.000 --> 00:51:36.000
Artem Trotsyuk: So that I can evaluate against my knowledge base to determine if that makes sense. If I'm saying I'm going to trust the black box.

00:51:36.000 --> 00:51:42.000
Artem Trotsyuk: There's been plenty of documented cases now in which the black box does not make a good decision.

00:51:42.000 --> 00:51:48.000
Artem Trotsyuk: Or it makes a decision that is disadvantageous to certain individuals or certain populations of individuals who.

00:51:48.000 --> 00:51:50.000
Artem Trotsyuk: Might not have their data.

00:51:50.000 --> 00:51:56.000
Artem Trotsyuk: Represented in the data set that the AI is trained on. And so, because of that.

00:51:56.000 --> 00:51:59.000
Artem Trotsyuk: It creates an output, but that output is perhaps.

00:52:00.000 --> 00:52:07.000
Artem Trotsyuk: Going to impact them in a negative way. And so for me to have explainable AI, it allows for me to understand, hey.

00:52:07.000 --> 00:52:10.000
Artem Trotsyuk: I get, how you made your decision.

00:52:10.000 --> 00:52:14.000
Artem Trotsyuk: I think this makes sense. I trust your output.

00:52:14.000 --> 00:52:20.000
Artem Trotsyuk: Or I see how you made a decision. I see where the faults are. These are the faults we should address.

00:52:20.000 --> 00:52:24.000
Artem Trotsyuk: This way you have AI that's more fair.

00:52:24.000 --> 00:52:26.000
Artem Trotsyuk: At the end of the day, and has a lot more.

00:52:26.000 --> 00:52:30.000
Artem Trotsyuk: Equitable opportunity for everyone.

00:52:30.000 --> 00:52:35.000
Artem Trotsyuk: And not just one demographic or another. So that's how I view that.
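
One simple way to poke at that "how did it decide" question is to measure which inputs a trained model actually relies on; below is a small sketch assuming scikit-learn, with synthetic data standing in for whatever patient variables a deployed system would really use.

```python
# Minimal sketch of interrogating a black-box model's decisions (assumes scikit-learn).
# The synthetic features stand in for real patient variables; in practice you would
# also check performance separately for groups under-represented in the training data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2_000, n_features=8, n_informative=3,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Which inputs is the model actually leaning on when it makes a call?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for i, score in enumerate(result.importances_mean):
    print(f"feature {i}: importance {score:.3f}")
```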

00:52:35.000 --> 00:52:37.000
Artem Trotsyuk: But I do think that.

00:52:39.000 --> 00:52:41.000
Artem Trotsyuk: If AI enables.

00:52:42.000 --> 00:52:43.000
Artem Trotsyuk: Novel discovery.

00:52:44.000 --> 00:52:49.000
Artem Trotsyuk: That's a great thing. If it can say that by this molecule, like you mentioned earlier.

00:52:50.000 --> 00:52:56.000
Artem Trotsyuk: You can have this output, and all I need is a picture. Well, hey, if I run a study.

00:52:56.000 --> 00:53:07.000
Artem Trotsyuk: In which I identify that molecule in a bunch of people, and all I need is their picture. And I throw that into a system. And it's 99% accurate. Well, maybe that's a new way for doing standard of care. So.

00:53:07.000 --> 00:53:11.000
Artem Trotsyuk: It goes back full circle to saying, hey, AI.

00:53:11.000 --> 00:53:17.000
Artem Trotsyuk: Can be an enabler and allow for you to be superhuman by doing your job better.

00:53:17.000 --> 00:53:25.000
Artem Trotsyuk: But also having it be explainable, is super important. Because if you don't understand how it's making decisions.

00:53:25.000 --> 00:53:33.000
Artem Trotsyuk: How much reliance are we putting on the system, right? And in lower-stakes use case scenarios like scribing, which you mentioned earlier.

00:53:33.000 --> 00:53:39.000
Artem Trotsyuk: That's pretty self-explanatory: audio in, text out. If it can listen.

00:53:39.000 --> 00:53:45.000
Artem Trotsyuk: And create a correct transcription, it can create a relatively good output. You understand how that system works.

00:53:45.000 --> 00:53:51.000
Artem Trotsyuk: Now for more complicated systems that impact people.

00:53:51.000 --> 00:53:55.000
Artem Trotsyuk: And their livelihoods, I would hope to understand how it works. Otherwise.

00:53:56.000 --> 00:54:03.000
Artem Trotsyuk: I could just be following a black box and its directions. And then whoever programmed that black box, or if the AI has been programmed in a way that is.

00:54:04.000 --> 00:54:07.000
Artem Trotsyuk: Detrimental to individuals, right.

00:54:07.000 --> 00:54:11.000
Artem Trotsyuk: So that's how I view it. It's an interesting topic.

00:54:11.000 --> 00:54:17.000
Artem Trotsyuk: But often the idea of causation versus correlation does come up in terms of does one correlate to another.

00:54:17.000 --> 00:54:19.000
Artem Trotsyuk: And for now.

00:54:19.000 --> 00:54:26.000
Artem Trotsyuk: There is this balance of actually understanding how AI works, such that it's not, as we were saying, creating stuff that's.

00:54:26.000 --> 00:54:30.000
Artem Trotsyuk: Completely irrelevant to what you're trying to do.

00:54:31.000 --> 00:54:35.000
Rishad Usmani: Last question, Artem. Say tomorrow is a hot, sunny day. You're having your ice cream.

00:54:35.000 --> 00:54:41.000
Rishad Usmani: Not getting a sunburn, and you run into your 20-year-old self.

00:54:41.000 --> 00:54:45.000
Rishad Usmani: Tomorrow, and your 20-year-old self asks you.

00:54:45.000 --> 00:54:49.000
Rishad Usmani: What advice do you have for me? What would you tell them?

00:54:51.000 --> 00:54:55.000
Artem Trotsyuk: Great question, and I get that a lot.

00:54:56.000 --> 00:54:57.000
Artem Trotsyuk: Um.

00:54:58.000 --> 00:55:01.000
Artem Trotsyuk: There are two lenses to look at that through.

00:55:01.000 --> 00:55:03.000
Artem Trotsyuk: One lens is.

00:55:03.000 --> 00:55:05.000
Artem Trotsyuk: I am where I am today.

00:55:05.000 --> 00:55:11.000
Artem Trotsyuk: Because of this, the decisions and the things that happened to get me to the point in where I am today.

00:55:11.000 --> 00:55:17.000
Artem Trotsyuk: So then, if I am happy with where I am today, would I have changed anything, any.

00:55:17.000 --> 00:55:19.000
Artem Trotsyuk: Chain of.

00:55:20.000 --> 00:55:24.000
Artem Trotsyuk: Actions that happened to get to where I am? That first step is.

00:55:24.000 --> 00:55:26.000
Artem Trotsyuk: I'm great today.

00:55:24.000 --> 00:55:32.000
Rishad Usmani: The caveat being the world is very different today. So if you do the same steps you did in the past 20 years. In the next 20 years. You probably won't get to where you are today.

00:55:33.000 --> 00:55:36.000
Rishad Usmani: So you probably have to change some things to get to where you are today.

00:55:35.000 --> 00:55:39.000
Artem Trotsyuk: Yeah, but true, true, true. But uh.

00:55:39.000 --> 00:55:42.000
Artem Trotsyuk: But also it's kind of like this idea of.

00:55:43.000 --> 00:55:46.000
Artem Trotsyuk: The guided path on which you were on.

00:55:46.000 --> 00:55:52.000
Artem Trotsyuk: So if we're saying that I met my 20-year-old self in 2025.

00:55:52.000 --> 00:55:54.000
Artem Trotsyuk: And I told my 20 year old self.

00:55:54.000 --> 00:56:01.000
Artem Trotsyuk: These are the lessons that I've learned, and these are the mistakes you should avoid from 2025 onward.

00:56:01.000 --> 00:56:06.000
Artem Trotsyuk: That would be a different framing. If I were to go back in time and restart.

00:56:06.000 --> 00:56:14.000
Artem Trotsyuk: And redo things, I would not change anything, because everything that has happened has taught me certain lessons that have helped me get to where I am today.

00:56:14.000 --> 00:56:17.000
Artem Trotsyuk: Now, if I'm saying, if the question that you're asking is.

00:56:17.000 --> 00:56:21.000
Artem Trotsyuk: In 2025, your 20 year old self is kickstarting their career.

00:56:21.000 --> 00:56:24.000
Artem Trotsyuk: Is that the question kind of like.

00:56:23.000 --> 00:56:25.000
Rishad Usmani: Yeah.

00:56:24.000 --> 00:56:30.000
Artem Trotsyuk: Okay. So in 2025, if you are kickstarting your career and learning.

00:56:31.000 --> 00:56:36.000
Artem Trotsyuk: And jumping in first, I would tell my 20-year-old self to pick up CS.

00:56:36.000 --> 00:56:45.000
Artem Trotsyuk: So learn how systems work. You don't have to be a proficient professional coder, but you should generally understand how computer systems operate.

00:56:45.000 --> 00:56:51.000
Artem Trotsyuk: More so now than before. How do large language models operate? What is chain of thought? What is reasoning? What's persuasion?

00:56:52.000 --> 00:56:57.000
Artem Trotsyuk: I would encourage myself to read more books on social science and psychology.

00:56:57.000 --> 00:57:01.000
Artem Trotsyuk: Because understanding how the mind works will help you understand how.

00:57:01.000 --> 00:57:03.000
Artem Trotsyuk: Future computer systems will work.

00:57:03.000 --> 00:57:10.000
Artem Trotsyuk: First, that's inherent. And I would encourage myself to diversify my experiences.

00:57:10.000 --> 00:57:13.000
Artem Trotsyuk: In different forays that allow for me to.

00:57:13.000 --> 00:57:16.000
Artem Trotsyuk: To capture a broad scale of.

00:57:18.000 --> 00:57:23.000
Artem Trotsyuk: Exposure and experiences earlier on, and then test those out in different application layers.

00:57:24.000 --> 00:57:32.000
Artem Trotsyuk: There's a lot of uncertainty on exactly what the job market will look like a year from now, two years from now, five years from now, what is safe, what is not safe.

00:57:32.000 --> 00:57:39.000
Artem Trotsyuk: It's hard to tell, because today it's safe, tomorrow it's not safe. Today it's changing, tomorrow they're reverting back. Policies are changing all the time.

00:57:39.000 --> 00:57:44.000
Artem Trotsyuk: What is safe is identifying something that you are really good at.

00:57:44.000 --> 00:57:48.000
Artem Trotsyuk: By bringing certain types of skills to the table.

00:57:48.000 --> 00:57:53.000
Artem Trotsyuk: And whatever those skills are that you're just good at, you bring those. And you sell those skills. Your skills are more valuable.

00:57:53.000 --> 00:58:01.000
Artem Trotsyuk: Then you can always learn how to vibe code. You can always learn new tools that are being developed. But how are you applying your skills with those new tools.

00:58:01.000 --> 00:58:03.000
Artem Trotsyuk: In a demographic, and so.

00:58:04.000 --> 00:58:07.000
Artem Trotsyuk: Yeah, for me, that's what I would tell my 20-year-old self.

00:58:07.000 --> 00:58:12.000
Artem Trotsyuk: As a starting point. But if we were to reflect back from 20 years ago till now.

00:58:12.000 --> 00:58:14.000
Artem Trotsyuk: I would not have changed anything.

00:58:15.000 --> 00:58:17.000
Rishad Usmani: Awesome thanks. So much for joining me today.

00:58:17.000 --> 00:58:24.000
Artem Trotsyuk: Yeah, no, this was fun. I hope that your audience gets a kick out of it and they take an AI class or two.

00:58:24.000 --> 00:58:29.000
Artem Trotsyuk: We teach them regularly at Stanford. They can take a class with me or Ronjon.

00:58:29.000 --> 00:58:32.000
Artem Trotsyuk: And kind of learn about these systems and.

00:58:33.000 --> 00:58:35.000
Artem Trotsyuk: Or reach out to me directly. If they have any questions.

00:58:39.000 --> 00:58:43.000
Rishad Usmani: So if there's anything you said that you want me to remove, feel free to email me.

00:58:43.000 --> 00:58:44.000
Rishad Usmani: Um.

00:58:44.000 --> 00:58:49.000
Artem Trotsyuk: You can cut whatever you don't want to include there. Totally fine with that.

00:58:49.000 --> 00:58:55.000
Rishad Usmani: Okay. I think you mentioned someone who would be interested in speaking about ethics and AI.

00:58:55.000 --> 00:58:57.000
Rishad Usmani: And consciousness of AI.

00:58:57.000 --> 00:58:59.000
Rishad Usmani: And AI rights.

00:58:58.000 --> 00:59:00.000
Artem Trotsyuk: Yeah.

00:59:00.000 --> 00:59:02.000
Artem Trotsyuk: Ronjon Nag.

00:59:04.000 --> 00:59:05.000
Artem Trotsyuk: See if you're interested.

00:59:04.000 --> 00:59:05.000
Rishad Usmani: Are you able to.

00:59:06.000 --> 00:59:08.000
Rishad Usmani: Okay, I'll look at his profile.

00:59:08.000 --> 00:59:16.000
Artem Trotsyuk: Yeah, look him up. If you're interested, send me a note and I will reach out. I'll ask him, and you guys can talk about AI consciousness and the boundaries of humanity.

00:59:09.000 --> 00:59:10.000
Rishad Usmani: Yeah.

00:59:17.000 --> 00:59:20.000
Rishad Usmani: Thanks. Yeah, that was very interesting. I really enjoyed that.

00:59:20.000 --> 00:59:25.000
Artem Trotsyuk: Cool. No, Ronjon would love to talk to you, I'm sure. But look up his profile.

00:59:25.000 --> 00:59:27.000
Artem Trotsyuk: And see if he'd be useful for you.

00:59:28.000 --> 00:59:34.000
Rishad Usmani: Yeah, and I'll release this. I'll tag you on all social media. But this was, I loved this, you know.

00:59:34.000 --> 00:59:37.000
Rishad Usmani: I need to have more people such as yourself.

00:59:37.000 --> 00:59:41.000
Rishad Usmani: Cause I just have physicians and VCs, and it kind of gets a little bit mundane. So.

00:59:41.000 --> 00:59:45.000
Rishad Usmani: More AI experts that I can learn from is amazing. Thanks for taking the time to join me.

00:59:45.000 --> 00:59:53.000
Artem Trotsyuk: Ronjon would be super keen on talking AI. It's also his bread and butter. He's also an investor.

00:59:53.000 --> 01:00:02.000
Artem Trotsyuk: Founder, but his passion right now, he's all about vibe coding, and how he can build something up super quickly and trying all these tools out.

00:59:59.000 --> 01:00:01.000
Rishad Usmani: Yeah.

01:00:02.000 --> 01:00:07.000
Artem Trotsyuk: And then customizing tools for your own different needs and applications, and so forth, so.

01:00:07.000 --> 01:00:10.000
Artem Trotsyuk: Yeah, look up his profile. Send me a ping if he'll be useful for you.

01:00:10.000 --> 01:00:13.000
Artem Trotsyuk: And if yes, then I'll go ahead and reach out to him and make an intro.

01:00:13.000 --> 01:00:15.000
Rishad Usmani: Awesome. Yeah. And if I can help in any way.

01:00:15.000 --> 01:00:17.000
Rishad Usmani: I'm happy to.

01:00:17.000 --> 01:00:19.000
Artem Trotsyuk: Sounds good. Alright, Rishad, thank you so much. Talk to you soon.

01:00:20.000 --> 01:00:21.000
Rishad Usmani: Thanks, Artem, bye.