
Transcript for Tal Wilkenfeld: Music, Guitar, Bass, Jeff Beck, Prince, and Leonard Cohen | Lex Fridman Podcast #408

This is a transcript of Lex Fridman Podcast #408 with Tal Wilkenfeld.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human-generated and may have errors.


Introduction

Tal Wilkenfeld
(00:00:00)
I am standing on the edge of the cliff the entire night, and if I mess something up, mess it up, what even is a mistake? But if I do a little clunker or whatever it is, it’s like, so what? I wouldn’t have played half the stuff that I’m playing if I wasn’t constantly standing on the edge of the cliff, like wild.
Lex Fridman
(00:00:22)
Why stand at the edge of the cliff?
Tal Wilkenfeld
(00:00:24)
Because at the edge of the cliff is all possibilities.
Lex Fridman
(00:00:30)
The following is a conversation with Tal Wilkenfeld, a singer-songwriter, bassist, guitarist, and a true musician who has recorded and performed with many legendary artists, including Jeff Beck, Prince, Eric Clapton, Incubus, Herbie Hancock, Mick Jagger, Jackson Browne, Rod Stewart, David Gilmour, Pharrell, Hans Zimmer, and many, many more.

(00:00:54)
This was a fun and fascinating conversation. This is the Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, dear, dear friends, here’s Tal Wilkenfeld.

Jeff Beck


(00:01:08)
There’s a legendary video of you playing with Jeff Beck. We’re actually watching it in the background now. So for people who don’t know, Jeff is one of the greatest guitarists ever. So you’re playing with him at the 2007 Crossroads Festival, and people should definitely watch that video. You were killing it on the bass. Look at that face. Were you scared? What was that experience like? Were you nervous? You don’t look nervous. Confident?
Tal Wilkenfeld
(00:01:37)
Yeah, I wasn’t nervous. I think that you can get an adrenaline rush before a stage, which is natural, but I think as soon as you bring fear to a bandstand, you’re limiting yourself. You’re walling yourself off from everyone else. If you’re afraid, what is there to be afraid of? You must be afraid of making a mistake, and therefore you’re coming at it as a perfectionist and you can’t come at music that way, or it’s not going to be as expansive and vulnerable and true.

(00:02:10)
So no, I was excited and passionate and having the best time. And also the fact that he gave me this solo, the context of this performance is that this was a guitar festival. It’s one of the biggest guitar festivals in the world because it’s Eric Clapton’s festival, and there’s 400 guitarists that are all playing solos all night. And we were towards the end of the night, and I could tell Jeff got a kick out of, I’m not going to solo on one of my most well-known songs, ’Cause We’ve Ended as Lovers. Well, Stevie Wonder wrote it, but people know Jeff for that song and his solo on it. It’s like, “I’m going to give it to my bass player.” And he did, and like-
Lex Fridman
(00:03:02)
You took it.
Tal Wilkenfeld
(00:03:03)
The fact that he’s bowing, he didn’t have to do that.
Lex Fridman
(00:03:03)
But you really stepped up there.
Tal Wilkenfeld
(00:03:14)
It just shows what a generous musician he is, and that’s evident in his playing across the board. He is a generous, loving, open musician. He’s not there for himself. He’s there for the music. And he thought, “Well, this would be the perfect musical thing to do.” And it kind of all started when I went to audition for him, which was an interesting experience because I got food poisoning on the plane.

(00:03:46)
And so literally when the plane landed, I went straight into an ambulance into a hospital overnight. The manager picked me up and I showed up at Jeff’s door, which was a three-hour drive through winding country roads, and he answered the door, and he is like, “Okay, you’re ready to play?” So we went upstairs and started rattling off the set. And when it came to this song, ’Cause We’ve Ended as Lovers, he just said solo, and he loved it and kept the solo in it. So that’s how, there was no bass solo before I was playing in his band. So this whole thing was kind of new.
Lex Fridman
(00:04:24)
So even with food poisoning, you could step up?
Tal Wilkenfeld
(00:04:27)
Yeah.
Lex Fridman
(00:04:28)
That’s just like what? Instinct?
Tal Wilkenfeld
(00:04:30)
It’s just being able to differentiate from the body and from expression, music.
Lex Fridman
(00:04:37)
It’s interesting. You said fear walls you off from the other musicians, and what are you afraid of? You’re afraid of making a mistake. Beethoven said, “To play a wrong note is insignificant. To play without passion is inexcusable.” Do you think the old man had a point?
Tal Wilkenfeld
(00:04:54)
Yeah. Different styles of music invite varying degrees of, I would say, uncertainty or unsafety in the way that people might perceive it. So for instance, the tour that I was just on playing Allman Brothers songs, I am standing on the edge of the cliff the entire night, and if I mess something up, mess it up, what even is a mistake? But if I do a little clunker or whatever it is, it’s like, so what? I wouldn’t have played half the stuff that I’m playing if I wasn’t constantly standing on the edge of the cliff, like wild.

(00:05:38)
And so I don’t care about those few little things. I care about the overall expression. And then there’s other gigs that, for instance, if I got called for a pop or a country session or a show. In those environments, they may want you to play safe, just play the part and play it with a great groove and time and great dynamics and don’t really veer away from the part and stuff. And I’ve done plenty of those gigs too. It’s just a different hat you put on.
Lex Fridman
(00:06:14)
What do you get from the veering? From the veering off the beaten path? You just love it? Or is that going to make the performance better? Why stand at the edge of the cliff?
Tal Wilkenfeld
(00:06:28)
Because at the edge of the cliff is all possibilities and unknown. You don’t know what’s coming. And I love being there in the unknown. Otherwise, it’s just like, “Well, why are we doing this? Am I just like a clown on stage showing you my skills or what I’ve studied in my bedroom?” It’s like, no, I want to be pure expression happening right now and responding in real time to everything that’s happening. And anytime I’m not doing that, it’s like it’s a waste of everybody’s time.
Lex Fridman
(00:07:06)
Have you ever messed it up real bad?
Tal Wilkenfeld
(00:07:09)
Messed what up?
Lex Fridman
(00:07:11)
I mean, all comedians bomb. You’re a big fan of comedy.
Tal Wilkenfeld
(00:07:13)
Yeah.
Lex Fridman
(00:07:14)
Have you ever bombed on stage?
Tal Wilkenfeld
(00:07:16)
Probably. I think it’s all about recovery. And the more times that you fall off the cliff, the quicker you know how to recover and the varying ways that you can recover to the point in which it’s concealed so much that maybe a listener might not even know that you’re recovering.
Lex Fridman
(00:07:38)
And eventually you learn to fly, if we take that metaphor all the way, off the cliff. [inaudible 00:07:44]
Tal Wilkenfeld
(00:07:43)
Remember one time when I was really young. Well, not really young, but when I was 21 or-
Lex Fridman
(00:07:44)
What is age anyway?
Tal Wilkenfeld
(00:07:52)
22? Yeah, exactly. But when I was first playing with Jeff Beck and we played at what I consider the best, the coolest jazz festival, it’s Montreux Jazz. And Miles played there, everyone played there, and they have the best speaker system ever. I was excited for months, and the drummer, Vinnie, was practicing for eight hours in the bus on the way there, and everyone was on fire on stage. And I remember playing a note, just one note that I really didn’t like. And I let it go in the moment on stage, but as soon as I got off-stage, I was really sad.

(00:08:37)
And so I sat on this road case, everyone was out celebrating. I sat on this road case looking with a sad face, boo-hoo. And then Claude Nobs, the owner of the whole festival, came out to me. He’s like, “Tal, what’s wrong?” And I’m like, “I played a bad note.” I was such a child. And he said all this wise stuff that Miles Davis had imparted to him and it fully cheered me up. He’s like, “Is there anything that would make you feel better?” And I was like, “Caviar?” The dude came back 10 minutes later with this huge thing.
Lex Fridman
(00:09:18)
Oh wow.
Tal Wilkenfeld
(00:09:18)
It was a joke. It was a joke, but he actually brought me caviar. But anyway, that’s the one time that I remember being sad about a performance. Now I’m just like, “Okay, whatever. It’s done.”
Lex Fridman
(00:09:30)
Was it a physical slip of the fingers or did you intend to play that note?
Tal Wilkenfeld
(00:09:35)
That I can’t remember. I can’t remember if it was just a bad choice that sounded like a clanger, why it happened. It was so long ago, but I don’t get depressed about that anymore.
Lex Fridman
(00:09:48)
That’d be funny if your biggest and only regret in life was that note, and that haunted you in your dreams.
Tal Wilkenfeld
(00:09:53)
And then I’m on my deathbed and everyone’s just bringing me caviar because the one-

Confidence on stage

Lex Fridman
(00:09:59)
Joke went way too far. You talked about confidence somewhere. I don’t remember where. So I want to ask you about how much confidence it takes to be up there. You said something that Anthony Jackson told you as encouragement, a line that I really like. The quote: “On your worst day, you’re still a bad motherfucker.”
Tal Wilkenfeld
(00:10:17)
That’s actually a Steve Gadd quote. And Steve used to tell that to Anthony because Anthony used to get real depressed if he did a wrong thing or not perfect thing. And Steve Gadd used to say this to Anthony Jackson. And then Anthony was my first bass mentor or just mentor in general.
Lex Fridman
(00:10:36)
For people who don’t know, he’s a legendary bassist.
Tal Wilkenfeld
(00:10:37)
He’s a legendary bassist. And I started playing the bass when I was 17 and I moved to New York and I met Anthony and he started mentoring me a bit in a very not typical way. He would just sit in his car with me for hours and talk music.
Lex Fridman
(00:10:55)
You guys just listen to music and analyze it?
Tal Wilkenfeld
(00:10:57)
Exactly. And that was the best form of learning, I think. Just like, “Well, what do you perceive here?” And, “Well, I heard this” and just discussing that.
Lex Fridman
(00:11:08)
Jazz usually?
Tal Wilkenfeld
(00:11:09)
No, all styles of music. And yeah, he told me that story about on your worst day because yeah, even then when I was 18, 19, I’d get sad sometimes about performances. “I could have done this.” I don’t do that anymore, thankfully. Or I’d be miserable.
Lex Fridman
(00:11:29)
So you always kind of feel pretty good?
Tal Wilkenfeld
(00:11:31)
Yeah. Yeah, now I do. Now it’s just I sense the body feeling fatigued, especially if it’s a very long show. The ones I just did were three-hour shows, and we did one- to three-hour sound checks. So that’s a lot of physical activity every day. So I just feel the body being tired, fatigued, the ears are fatigued. That’s about it. I don’t really reflect on the show much.
Lex Fridman
(00:11:59)
You almost, like from a third-person perspective, feel the body get tired and just accept it.
Tal Wilkenfeld
(00:12:05)
Yeah, I don’t want to identify with it. Then I’m tired, but I’m not tired.
Lex Fridman
(00:12:09)
It’s very Zen.
Tal Wilkenfeld
(00:12:10)
I’m usually energized.
Lex Fridman
(00:12:12)
It’s like with the food poisoning, the mind is still capable of creative genius, even if the body is gone.
Tal Wilkenfeld
(00:12:18)
Yeah.
Lex Fridman
(00:12:19)
Something like that? So no self-critical component to the way you see your performances anymore?
Tal Wilkenfeld
(00:12:30)
There is critique, but not in the way that it would diminish my sense of self. It’s different. I can just kind of look at something and be like, “Okay, well actually next time I’ll do this choice and this choice, maybe. Maybe this would serve the song better. Maybe this would help the groove feel more like this.” But it’s not like, “I suck because I did this and I’m a loser.”
Lex Fridman
(00:12:58)
Do you think that’s bad? Even when I asked that question, I had a self-critical thought that, “Why’d you ask that question? That’s the wrong question.” I always have the self-critical engine running. Is it necessarily a bad thing?
Tal Wilkenfeld
(00:13:12)
It depends. If it’s affecting you negatively.
Lex Fridman
(00:13:14)
What is negative anyway?
Tal Wilkenfeld
(00:13:15)
Well, if it brings your frequency down and you feel less joyful inside, you don’t feel complete, you feel less than, less worthy of something, then you could call that bad if you aspire to not feel that way.
Lex Fridman
(00:13:35)
Yeah, I aspire to not feel that way in the big picture, but in the little picture, a little pain is good.
Tal Wilkenfeld
(00:13:41)
That’s fair.
Lex Fridman
(00:13:43)
So confidence. You seem like in this performance, you seem confident. You seem to be truly walking the bad motherfucker way of life.
Tal Wilkenfeld
(00:13:55)
A word that I prefer over confidence is trust. Because I think with confidence there’s almost a belief assigned to it, that I am this thing.
Lex Fridman
(00:13:55)
Ego.
Tal Wilkenfeld
(00:14:08)
That you believe in. Whereas trust is just simply knowing that you can get up there and handle whatever is going to come your way. And it’s more of an open feeling where it’s like, “Yeah, I could do this. Sure.” But not like, “I’m a bad motherfucker.” You know what I mean? There’s a huge difference because I’ve shared the stage with people who have a lot of confidence and it can be like a brick wall, just like fear is a brick wall.
Lex Fridman
(00:14:40)
So the brick wall is a bad thing. The thing you have with Jeff here on stage-
Tal Wilkenfeld
(00:14:44)
Is not a brick wall.
Lex Fridman
(00:14:45)
There’s no wall, just chemistry.
Tal Wilkenfeld
(00:14:46)
Yeah.
Lex Fridman
(00:14:47)
How can you explain that chemistry the two of you had?
Tal Wilkenfeld
(00:14:49)
Trust and lack of fear. Yeah, and also I will say that each individual has developed likes and dislikes over their lifetime. And that can be like in this case, we’re just talking aesthetic likes and dislikes. So in this particular case, obviously our likes and dislikes are very much aligned such that the things I do to complement him, he enjoys and vice versa. But it could be two very trusting open musicians on stage that don’t have walls up, but their choices are very different. And one person likes heavy metal and the other person likes classical. So it’s got to be both.
Lex Fridman
(00:15:33)
So you guys were good at “yes, and”-ing each other musically?
Tal Wilkenfeld
(00:15:37)
Definitely.
Lex Fridman
(00:15:37)
Is that where you’re most at peace in a meditative way? It’s on stage?
Tal Wilkenfeld
(00:15:46)
It used to be that it would only be on stage. It started with that. That was almost like my way into flow state and meditation was playing music. And then back in the day when I’d kind of crash after shows, I wanted to change that. I wanted to always feel like I’m in flow state.
Lex Fridman
(00:16:09)
Have you succeeded?
Tal Wilkenfeld
(00:16:10)
I’ve gotten a lot better. I’m still obviously on the journey, but yes.
Lex Fridman
(00:16:17)
So you meditate? I think you said somewhere that you meditate before shows or just in general?
Tal Wilkenfeld
(00:16:21)
I meditate every day. When I’m on tour with my band, I ask that we all meditate together for at least 20 minutes. And I don’t dictate which type of meditation. I don’t put on a guided meditation. Everyone has their own thing they want to do. Maybe someone might be praying in their head, it doesn’t matter. It’s just the idea that we all put our phones down and we all are in one room connecting energetically, spiritually, and just letting our lives go for a second. And then we walk straight on the stage and it’s always really connected. And there were a couple gigs where we ran out of time for that, and I could tell. There was a major difference in the performance.
Lex Fridman
(00:17:07)
So it both connects you and centers you, all of those things.
Tal Wilkenfeld
(00:17:11)
But then when I’m home, I love to meditate and I’ve tried various styles of meditation and studied various types of things. So I don’t do just one thing. I kind of customize it depending on where I’m at in my life.
Lex Fridman
(00:17:30)
You and the world lost Jeff Beck a year ago. You told me you really miss him. How’s the pain of losing Jeff change you? Maybe deepen your sense of the world?
Tal Wilkenfeld
(00:17:43)
It’s hard to accept that we won’t create something musically again in this lifetime. But in terms of the grief, grief was easier for me because I went through a major grief period in 2016 and 17, and that was the first time I’d really gone through the process of grief in a non-family situation with friends and mentors and people that I’d created with, which is different. It’s a different kind of connection. When my grandparents died, it’s like there was nothing left unsaid. And I was at peace with what was happening.

(00:18:40)
With this, when Prince died out of the blue in mid 2016, and then Leonard Cohen died in November, that just tore me to shreds because Leonard Cohen was not just someone that profoundly inspired me musically and lyrically, but spiritually, we had a very deep connection. And spirituality was the basis of a lot of our conversations. And so at that time, I felt like a piece of me went missing. And that was a very long process where I just stayed in my place and didn’t want to play a note of music. I kind of wanted to just get rid of all my stuff. So I had a friend come over and he’s like, “You should just, why don’t you come to the Comedy Store?” I’m like, “Comedy Store? What am I going to go to some store and buy clown suits? What are you talking about? What’s a Comedy Store?” He’s like, “No, no, no. The Comedy Store, the place where comedians go.”

(00:19:54)
I’m like, “Okay, well, I’ve never seen standup. I’ve seen Seinfeld on TV. That’s the extent of my standup experience.” So he took me to the Comedy Store and every single one of those comedians embraced me like I was family. It didn’t even take a day. I was part of the family and I made 25 best friends, and I ended up throwing all my stuff in storage and finding a little room to stay in, where I rented my gear out and my rent was paid by me loaning the gear. I didn’t want any financial responsibilities; I just wanted to be completely free so that I could just process it and not feel like I had to commit to anything work-wise or creatively. I just wanted to unplug.

(00:20:50)
And so this was a fun and very different way to unplug, because previously I may have just gone to a monastery and spent weeks at a monastery or months, but in this case I was like, “You know what? This is a different kind of experience. I’m going to just hang out with comedians and stay in this room.”
Lex Fridman
(00:21:09)
With no responsibility, really.
Tal Wilkenfeld
(00:21:11)
Other than to really deeply connect with this grief that I’m experiencing. I’m not going to negate it. I’m going to really fully connect to it. And I did, and it was tough. And then more people in 2017 were leaving. Gregg Allman, Tom Petty. I mean, these are people, I worked with all these people and had great connections with them, and they were all going, and the world was mourning the loss of these people because of everything that they’d given to the world. They’d changed the world’s lives, not just mine because I knew them personally. And so that was also complicated. And why, for me, it was interesting to be grieving the loss of these musicians with comedians.

(00:22:04)
And I learned a lot. It changed my life. I learned to laugh at absolutely anything, everything. I mean, my grandpa had a really great sense of humor too. My grandpa was a Holocaust survivor, and he could just laugh at anything. And so I already kind of have that in me. But being around all these comedians just kind of exaggerated that for me, and that really changed things for me for the better. So then when Jeff Beck died, it was like, “Okay, I’ve got these tools. I know what this is and I’m going to go through it again, and I’m going to be on tour with Incubus in two days.”

(00:22:45)
So Mike Dirnt from Green Day, he called me up and he said, “Hey, I know you’re going through a lot.” And I said, “I don’t even know what I’m going to play. I really want a vintage jazz bass for this, and I only have a seventies one that I don’t really think is appropriate. I really need a sixties one, blah, blah, blah.” And Mike’s like, “I’m going to hook you up.” He showed up to my place the next day with a truckload of old P basses and jazz basses and brought them all into my studio, and I’m playing them.

(00:23:16)
And then I pull one out of the case and it’s Olympic White, just like Jeff Beck and I play it. And not only did I get goosebumps and started crying, but I looked over at Mike and same thing was happening, and he’s like, “I guess Jeff might be happy about this.” And he’s like, “Well, I didn’t want to let this one go. I was just trying to cheer you up a bit and maybe loan it to you for the tour, but if you really want it’s yours.” And I was like, “Oh my God, this is… Mike Dirnt is the nicest guy ever.”

(00:23:59)
So that happened. So that bass’ name is Jeff, and it’s a white jazz bass, and I played it on the Incubus tour. But yeah, I do feel like I’m more equipped to handle grief now.
Lex Fridman
(00:24:11)
Tell me about the Comedy Store a little bit more. Do you think comedians and musicians in some deep fundamental way are made from the same cloth? Are they spiritually connected somehow?
Tal Wilkenfeld
(00:24:25)
I think everyone’s connected spiritually in the same way. So I think personality wise, comedians and musicians are quite different, actually.
Lex Fridman
(00:24:38)
In what way?
Tal Wilkenfeld
(00:24:40)
Well, you’d have to subdivide even musicians into different categories too, because the thing that I appreciate about comedians is that you go to a restaurant with them and all the observational humor of, they’ll notice everything and make you laugh about it, which a really great songwriter does the same thing too. And my favorite lyricists, like Leonard Cohen, Bob Dylan, Paul Simon, Warren Zevon, they add comedy into their lyric. And so those types of people I would liken to hanging out with a comedian.

(00:25:16)
It’s very different from say somebody that is an instrumental guitarist or something like that, that they’re more focused on, whether it’s a kinesthetic thing or a physical thing or whatever it is. They’re not quite doing the observational thing in the same way. So I just appreciate, my favorite thing to do is go on and laugh, especially because I can tend to be pretty analytical and be in my head. So anything that just kind of lets me be in my heart and just enjoy life is great.
Lex Fridman
(00:25:54)
I think there’s a photo of you with Dave Chappelle on stage. What was that about?
Tal Wilkenfeld
(00:25:58)
So right after Leonard Cohen passed away, the Comedy Store threw me a birthday party. It was this crazy lineup, and it was like I’d play a song with my band, and then Jackson Browne sat in and sang a song, and then Dave Chappelle came up and said some jokes. It was one of my favorite nights ever.
Lex Fridman
(00:25:59)
Yeah.
Tal Wilkenfeld
(00:26:23)
Yeah. It was cool. It was a very healing birthday party.
Lex Fridman
(00:26:27)
Yeah, there’s something magical about that place.
Tal Wilkenfeld
(00:26:29)
Yeah.
Lex Fridman
(00:26:30)
It’s really special.
Tal Wilkenfeld
(00:26:31)
Yeah. Well, the Mothership has some magic to it too. It’s really cool. It’s different. Totally different vibe, but super awesome.

Leonard Cohen

Lex Fridman
(00:26:40)
You said that Leonard Cohen is a songwriting inspiration of yours. I saw you perform his song Chelsea Hotel brilliantly on the internet. It’s about, for people who don’t know, his love affair with Janis Joplin. How does that song make you feel?
Tal Wilkenfeld
(00:27:01)
Great. I love that song.
Lex Fridman
(00:27:03)
Which aspect? Musically, the melancholy feeling, the hopeful feeling, the cocky feeling? All of it, every single line has a different feeling to it, really.
Tal Wilkenfeld
(00:27:16)
Yeah. But as a whole piece, I appreciate it so much. I actually lived at the Chelsea Hotel, and when Leonard and I first met, that was one of the first things we talked about was that I lived there, where all that stuff went down before they tore it apart. And yeah, it is just a beautiful song.
Lex Fridman
(00:27:44)
What makes me sad, the way it ends. “I don’t mean to suggest that I loved you the best. I can’t keep track of each fallen robin. I remember you well in the Chelsea Hotel. That’s all, I don’t even think of you that often.” That line, “I don’t even think of you that often,” always breaks my heart for some reason.

(00:28:00)
How ephemeral, how short lasting certain love affairs can be. Just kind of like, huh.
Tal Wilkenfeld
(00:28:14)
Yeah.
Lex Fridman
(00:28:14)
Do you think he meant it? I always think he’s trying to convince himself of it.
Tal Wilkenfeld
(00:28:19)
It could be both, or either. That’s the beautiful thing about poetry and lyric, is that it’s supposed to be open.
Lex Fridman
(00:28:27)
Yeah. I wonder if it’s also open to him, depending on the day.
Tal Wilkenfeld
(00:28:30)
Definitely. The thing that he taught me, or his advice to me was when you’re writing a song, look at it the next morning, just first thing, and read it. And then take a walk, smoke a joint, read it again. Go have a fight with your daughter, come back, read it again. Get drunk, read it again. Wait a week, read it again. Just so that from every state and every position, the wider the lens is going to be from an audience perspective. You want things to mean multiple things.
Lex Fridman
(00:29:12)
There’s one line I read somewhere, that he regrets putting in the song, so I’ve got to ask you about it. It’s pretty edgy. It’s about, “Giving me head on the unmade bed.” You think that’s a good line, or a bad line?
Tal Wilkenfeld
(00:29:27)
I think it’s an amazing line. It’s one of the best lines in the song.
Lex Fridman
(00:29:30)
Yeah, right?
Tal Wilkenfeld
(00:29:30)
When he put that song out, obviously he didn’t regret it, or he wouldn’t have put that lyric in the song. I think what happened was that eventually word got out, either from him or from somebody else, that the song was about Janis Joplin. And so at that point, he regretted the indiscretion. It wasn’t that he regretted how great the line was, it was just the privacy factor. But then again, Leonard’s known for rewriting his lyrics. In his live shows, you’ll see a bunch of songs where it’s like new lyrics. And he didn’t do it because he didn’t like the old lyrics, he just did it because he could, because he’s Leonard. And it’s like, why not have fun with words the way musicians have fun improvising solos on stage? And he could have changed that line in Chelsea Hotel after, in retrospect, and he never did.
Lex Fridman
(00:30:26)
“I remember you well in the Chelsea Hotel. You were talking so brave and so sweet. Giving me head on the unmade bed, while the limousines wait in the street.”
Tal Wilkenfeld
(00:30:35)
It’s so powerful.
Lex Fridman
(00:30:36)
It’s a powerful line. It just kind of shocks you.
Tal Wilkenfeld
(00:30:39)
Well, that’s what’s so great about it. Yeah.
Lex Fridman
(00:30:42)
But also heartbreaking, because it doesn’t last. Especially actually, to me it adds more meaning once you know it was Janis Joplin. It’s like, okay, these two stars collided for a time.
Tal Wilkenfeld
(00:30:54)
Yeah, but why is it heartbreaking? It could also be just beautiful that they had a little fling.
Lex Fridman
(00:31:00)
Yeah, everything is beautiful.
Tal Wilkenfeld
(00:31:02)
Thank you.
Lex Fridman
(00:31:03)
Even the dark stuff. What’s not beautiful? Everything is beautiful, if you look long enough and deeply enough. What were we saying? Oh, what do you think about Hallelujah? What do you think about the different songs of his, and why’d you choose Chelsea Hotel to perform?
Tal Wilkenfeld
(00:31:22)
Because I lived there, and it meant something to me to sing that song. And actually when I put that song out on YouTube, that’s when he sent me an email. He’s like, “Hey, do you want to come over?”
Lex Fridman
(00:31:37)
Nice. This is how you guys connected?
Tal Wilkenfeld
(00:31:39)
No, we met in a rehearsal studio. I ended up watching their whole rehearsal, and sitting there next to Roshi, his 105-year-old monk, which was really great. I remember when I was shaking his hand, it was just me and Roshi on the couch watching Leonard with this band. And we are shaking hands, and he grips my hand like this, doesn’t let it go. And he looked in my eyes, he said, “Where are you?” And I said, “In the handshake.” He says, “Yes.”
Lex Fridman
(00:32:13)
Wow. You passed the test.
Tal Wilkenfeld
(00:32:15)
Passed the Roshi test. And then what’s funny was that the next thing that happened about five minutes later, was Leonard Cohen got down on his knees and opened up a jar, I’m not kidding you, of caviar. This is not a callback.
Lex Fridman
(00:32:28)
Well, it is in a way. In a deep, fundamental way.
Tal Wilkenfeld
(00:32:32)
I know, I know. He started feeding the monk caviar, and that healed my Montreux Jazz Festival sadness forever. The end.
Lex Fridman
(00:32:41)
Do you think there’s a kind of weird sense of humor to it all somehow? Why does that happen? Why does stuff like that happen, or the Jeff bass speak to you?
Tal Wilkenfeld
(00:32:57)
Why do we need to know?
Lex Fridman
(00:32:59)
You believe in that stuff?
Tal Wilkenfeld
(00:32:59)
In what stuff?
Lex Fridman
(00:33:01)
That there’s a rhyme to the whole thing, somehow? There’s a frequency to which magical things of that nature can happen?
Tal Wilkenfeld
(00:33:19)
I’m divided about that answer. Because I think just things are flowing, I don’t think anything’s planned out.
Lex Fridman
(00:33:32)
Like through time, it’s like an orchestra playing of different experiences and circumstances that are somehow connected.
Tal Wilkenfeld
(00:33:40)
I think everything’s connected, so yes.
Lex Fridman
(00:33:43)
But predetermined means-
Tal Wilkenfeld
(00:33:45)
I don’t believe in the predetermined stuff necessarily, which is different from whatever your previous karma is. And karma is a whole other conversation, I don’t mean karma as in good karma, bad karma. Just karma meaning the collection of things you’ve acquired over this lifetime or other lifetimes. Just whatever that is, is going to influence your future.
Lex Fridman
(00:34:13)
Well, you had a really interesting trajectory through life. Maybe I just read it that way, because I’ve had a lot of stuff happen to me that’s lucky, feels lucky. And sometimes I’ll wonder, huh, this is weird. It does feel like the universe just kind of throws stuff at you with a chuckle. I don’t know. Not you, the proverbial you. One.
Tal Wilkenfeld
(00:34:37)
One, yeah.

Taxi Driver

Lex Fridman
(00:34:40)
You said you sometimes watch classic movies to inspire your songwriting, and you mentioned watching Taxi Driver. I love that movie. And I think you mentioned that you wrote a love song based on that movie. So Travis Bickle, for people who don’t know, is a taxi driver and he’s deeply lonely. What do you think about that kind of loneliness?
Tal Wilkenfeld
(00:35:02)
I think that loneliness is a product of feeling separate from the world, and separate from others. And that the less you experience that separation, the less you’ll feel lonely.
Lex Fridman
(00:35:20)
How often have you felt lonely in this way, separated from the rest of the world?
Tal Wilkenfeld
(00:35:25)
It’s less and less every single year. Because I work very hard at it.
Lex Fridman
(00:35:34)
Feeling like a part of the world?
Tal Wilkenfeld
(00:35:37)
Yeah, just meditating and studying scriptures.
Lex Fridman
(00:35:40)
Don’t you think that, isn’t there a fundamental loneliness to the human experience?
Tal Wilkenfeld
(00:35:45)
In what sense?
Lex Fridman
(00:35:46)
That all the struggles, all the suffering you experience is really experienced by you alone?
Tal Wilkenfeld
(00:35:51)
Is it?
Lex Fridman
(00:35:53)
Maybe at the very bottom, it’s not.
Tal Wilkenfeld
(00:35:55)
It’s kind of all the same stuff.
Lex Fridman
(00:35:57)
You didn’t feel alone in 2016, 2017?
Tal Wilkenfeld
(00:36:02)
I felt like I lost a piece of myself that I had given to somebody else. And I feel like people feel that in romantic exchanges, whether it’s long-term, short-term. You give a piece of yourself, and then if that person dies or you break up with that person, you feel like you’ve lost that piece of yourself. Which I feel like is a very different experience than if you just are opening yourself. Rather than giving a piece of yourself, you’re just opening yourself to somebody or something.
Lex Fridman
(00:36:39)
So opening is fundamentally not a lonely experience.
Tal Wilkenfeld
(00:36:43)
No, it’s a loving experience.
Lex Fridman
(00:36:45)
And then losing a piece of yourself can be.
Tal Wilkenfeld
(00:36:50)
Yeah. Because you can’t lose a piece of yourself, if you are the same self as every other self.
Lex Fridman
(00:36:57)
Right, right. If you see yourself as together with everybody, then there’s no losing.
Tal Wilkenfeld
(00:37:01)
Yeah.
Lex Fridman
(00:37:02)
Yeah, yeah. It’s a beautiful way to look at it. You said that there’s something healing about being in an empty hotel room, with no attachments except your suitcase. A lot of people will talk about hotel rooms being a fundamentally lonely experience, but you’re saying it’s healing.
Tal Wilkenfeld
(00:37:23)
It’s healing. Yeah. Because I just get to sit there, and not worry about all this stuff, these meaningless attachments. I’ve got my suitcase with my necessities, or my three suitcases sometimes. And I can just sit there and meditate, and just be with myself, and it’s so awesome. And usually you plan your touring for, you get the business aspect of things taken care of in advance, so you can just really be flowing day to day on a tour. And it’s a great feeling. It’s funny because this last tour that I did, we didn’t have hotels every night. We had hotels maybe once a week. And I hadn’t done that before. Usually I’m frequently in hotels. I didn’t get that space that I’m really used to getting.
Lex Fridman
(00:38:18)
You missed them.
Tal Wilkenfeld
(00:38:20)
I very much missed it, and had to be very creative. And I ended up going into the back lounge when everyone was asleep, and meditating back there, or before everyone woke up. And I actually joined, there was an online meditation retreat that was happening. It was 12 hours a day of silent meditations that happens once a year, and I love this particular group of people. And they knew I was on tour, so they’re like, “Just join when you can.” And so I was on the tour doing the meditation retreat at the same time. It was so fun. It was so fun. Because I was in the back lounge, the bus is moving around like this, my laptop, the Zoom is like… and I’m just sitting meditating. It was like, yeah, this is the shit.
Lex Fridman
(00:39:12)
It’s silence, so they’re all connected to Zoom and just doing silent 12 hours a day?
Tal Wilkenfeld
(00:39:16)
Yeah, yeah.
Lex Fridman
(00:39:16)
That’s cool.
Tal Wilkenfeld
(00:39:17)
These particular retreats that I started doing, it’s not straight silent. There are silent sits every hour for 50 minutes, and then there’s some talks. And these people that I’ve been working with are really cool, because they’re integrating Spiral Dynamics into Zen, and it’s like the coolest combination.
Lex Fridman
(00:39:43)
What’s Spiral Dynamics?
Tal Wilkenfeld
(00:39:45)
Like Ken Wilber? Do you know Ken Wilber, Integral Theory?
Lex Fridman
(00:39:49)
Yes. Can you explain a little bit? I vaguely know of him because of this notion that everything is one, everything is integrated, that every field has truths and falsehoods, and we should integrate the truths.
Tal Wilkenfeld
(00:40:07)
Yeah. It’s hard to explain how it applies to this type of meditation, because it’s in the guided parts of the meditation that this whole holonic theory is brought in, about transcending and including every aspect of your being. Because he talks about levels of development in consciousness, and how this applies to every single religion or non-religion, that there are these levels of development that go all the way up to enlightenment. No matter what you start off with. It could be Christianity, Buddhism, Vedanta, it doesn’t matter, anything.

(00:40:57)
I like it when everything and everyone is taken into account. It doesn’t matter where you’re coming from, that there is a way to be self-realized, self-actualized. There are self-actualized beings from all walks of life with very, very different paths. There’s no one path. In this particular retreat I do, there’s a lot of silent sits, and then there’s some guided meditations. But I’ve tried a lot of different avenues, and they’re all great. I wouldn’t just say, just try this one thing. I’ve studied the Upanishads with Vedanta teachers, and gone through those texts for months and months, and stayed at monasteries. And how they break it down makes total sense to my mind and heart. And more importantly than my mind, my inner knowing, it resonates.
Lex Fridman
(00:41:49)
Inner knowing.
Tal Wilkenfeld
(00:41:50)
Yeah, because your mind is the thinking tool. It’s not you, you’re not your mind, you’re not your thoughts, you’re not your body. It’s like, just the you, that knowing that you have. When something resonates there, that’s usually when you go with something.
Lex Fridman
(00:42:12)
What was living in a monastery like?
Tal Wilkenfeld
(00:42:14)
It’s the best.
Lex Fridman
(00:42:15)
What are we talking about?
Tal Wilkenfeld
(00:42:17)
It’s just an empty room, with a tiny single bed, and a sheet and a pillow, and that’s it.
Lex Fridman
(00:42:22)
That’s it?
Tal Wilkenfeld
(00:42:23)
You have to eat the same thing as everyone.
Lex Fridman
(00:42:25)
What’s the food like? What is it?
Tal Wilkenfeld
(00:42:27)
Very plain, cheap, basic food. Which is funny for someone like me, because I’m pretty particular about my diet.
Lex Fridman
(00:42:36)
Yeah, you brought over like 20 different ingredients.
Tal Wilkenfeld
(00:42:41)
Yeah.
Lex Fridman
(00:42:43)
What was the day in the life of Tal in a monastery?
Tal Wilkenfeld
(00:42:48)
You wake up at 5:00 a.m. to the bell, and you go and meditate constantly until bedtime. Other than two meals.
Lex Fridman
(00:43:00)
How are you sitting? Are you in a group? Is there other people there, and you’re just sitting there?
Tal Wilkenfeld
(00:43:06)
Well, if you’re talking about the Zen monastery, because I stayed in a Zen monastery, and I did a thing with the guy I was telling you about, the integral Zen thing where he uses Ken Wilber’s work in combination with Zen. That’s a little bit different, because he does talks, we talk about things. That’s very separate from the Vedanta monasteries I’ve stayed at, where there’s very little meditation in terms of sitting silently. Instead, we are meditating on the scriptures, like the Upanishads, and we’re diving into that.
Lex Fridman
(00:43:46)
What were the differences, the takeaways from the experiences? The two different, the integral one and the meditating on the scriptures?
Tal Wilkenfeld
(00:43:56)
They’re both incredibly, have been incredibly helpful to me. Because the Vedanta, anytime I go into my head about something, the answer is there, based on this knowledge. And with the Zen monastery, it’s like you just got to put your butt in the seat, and sit and wait. And maybe something will happen, maybe it won’t, but just keep sitting. And it’s very disciplined, and you go through a lot. Your body’s purging a lot. There’s a lot, and you don’t necessarily have the answers as to what is happening. And so I think for somebody like me, I need both. I need to be in a place where there’s complete uncertainty, but complete discipline, and just doing the regimented thing. And then there’s the me that feels very satisfied from an analytical standpoint, understanding what’s happening, what is the gross, and the subtle body? I want to understand these things about what it is to be a human. I like them both.
Lex Fridman
(00:45:15)
Understand what it means to be a human, so having that patience and just sitting with yourself helps you do that?
Tal Wilkenfeld
(00:45:22)
Yes. More so the analysis part.
Lex Fridman
(00:45:26)
Oh, so the analysis, the actual… okay, got it.
Tal Wilkenfeld
(00:45:29)
But sitting with yourself, there’s no better education of facing every demon. And it’s all going to come out, and it’s not going to be pretty. But then there’s things that happen on the other side of it that are so profound.
Lex Fridman
(00:45:45)
Have you met most of your demons?
Tal Wilkenfeld
(00:45:48)
I’ve met the demons that have come out.
Lex Fridman
(00:45:50)
Oh, there may be more?
Tal Wilkenfeld
(00:45:51)
Who knows? Yeah.

Songwriting

Lex Fridman
(00:45:53)
Okay. Well, to be continued. Since I think I heard you say that you wrote a love song after Taxi Driver, what kind of love songs do you write more of? You’re a songwriter first, for people who don’t know. They might think you’re primarily a bassist.
Tal Wilkenfeld
(00:46:14)
But they’re wrong.
Lex Fridman
(00:46:16)
Do you write mostly broken heart ones, or hopeful love songs? In love songs, about to be in love songs, soon to fall in love songs?
Tal Wilkenfeld
(00:46:27)
Well, the last album I put out is pretty self-explanatory as to what that is.
Lex Fridman
(00:46:31)
A lot of pain in that one?
Tal Wilkenfeld
(00:46:34)
There was, yeah. Some of it was storytelling, and some of it was real experience, and it’s always a combination of things. I serve the song. Sometimes you use your own life experience to tell a song, and sometimes you may watch a movie, and part of that script merges with your own experience, and that tells the right story for the point you’re trying to make in the song. It varies from song to song in terms of how autobiographical it is.
Lex Fridman
(00:47:14)
Yeah. I always think at the end of the Taxi Driver, when… what’s her name, Betsy? Because Travis becomes a hero, she tries to get with him, and he rejects her. That was powerful.
Tal Wilkenfeld
(00:47:28)
My favorite love songs are the ones where you’re not sure it’s about romantic love, or love of God, or love of life, or just pure love. I was thinking George Harrison writes songs like that, What is Life? Or Bob Dylan’s song that George Harrison covered, If Not for You?
Lex Fridman
(00:47:54)
Yeah, just grateful. Grateful for his love. Yeah.
Tal Wilkenfeld
(00:47:57)
Right, right. That’s kind of like what I’m experiencing now, and so who knows what’ll end up coming out.
Lex Fridman
(00:48:05)
So you’ve been writing this kind of-
Tal Wilkenfeld
(00:48:07)
Yeah, I’ve been writing.
Lex Fridman
(00:48:09)
A little bit?
Tal Wilkenfeld
(00:48:10)
I don’t have an intention of putting something out in any particular timeframe, but I’m just writing and letting things flow. And yeah, there’s a bunch of Leonard Cohen songs too where you’re like, there’s so many ways to interpret this song. There’s so many ways. I just love songs that aren’t so specifically about one thing.
Lex Fridman
(00:48:39)
I really love the song to play it, to listen to it, Wonderful Tonight by Eric Clapton. And I thought it was pretty straightforward. And then I had a conversation with Eric Weinstein, who’s a mutual friend of ours, and he told me it’s not about what I thought it’s about.
Tal Wilkenfeld
(00:48:54)
Oh yeah, what did he say?
Lex Fridman
(00:48:57)
It’s a more complicated story. Wonderful Tonight seems to be a story about a man just finding his wife beautiful, and appreciating it throughout. But he said it was actually about a man missing his wife, imagining her. That she’s lost because of the decisions he’s made in his life, so it’s pain. He had a long, beautiful Eric Weinstein-like explanation of why.
Tal Wilkenfeld
(00:49:28)
I love those.
Lex Fridman
(00:49:29)
Have you and Eric played music?
Tal Wilkenfeld
(00:49:32)
No. We’ve just hung out and had very long conversations about everything.
Lex Fridman
(00:49:37)
He’s a bit of a musician, you know?
Tal Wilkenfeld
(00:49:38)
Yeah.

How to learn and practice

Lex Fridman
(00:49:39)
Okay. You picked up the guitar when you were 14, let’s go back. And one interesting thing that just jumped out at me is you said you learned how to practice in your head, because you only had 30 minutes. Your parents would only let you practice for 30 minutes. I read somewhere that Coltrane did the same. Not the practice part, but he was able to play instruments in his head as a way to think through different lines, different musical thoughts, that kind of stuff. Maybe, can you tell the story of that?
Tal Wilkenfeld
(00:50:14)
Yeah. I just grew up in an environment that was focused on academia. And I fell in love with guitar, and really just wanted the focus to be that. My limit was 30 minutes a day for, I don’t even remember how many times a week. Might’ve been every day, five days a week, whatever.
Lex Fridman
(00:50:36)
So your parents didn’t want you to play more than that?
Tal Wilkenfeld
(00:50:39)
No. And so, I just learned how to visualize the fretboard in my head, and I’d practice all day in my head. It’s kind of like, you know The Queen’s Gambit, the TV show with Anya Taylor-Joy, and she’d just visualize the board on the ceiling? I used to do that with the fretboard, and just practice. And I actually recommend it to every musician. Because if you’re just practicing here, you don’t know what is more dominant necessarily, is it this or is it your motor skills? If you just take that away and do it here, you know you’ve got it. I’m glad that that happened and that I learned how to do that.

(00:51:24)
And in terms of learning fast, because I had to try to absorb a lot of information in a short amount of time when I did have the instrument, I kind of would do things in bursts. Even in that half an hour, I would just play for a couple minutes, and then I’d stop for a minute. And then I’d do it again, and I noticed there was a huge difference between the first time and the second time. Whereas if I just kept repeating stuff, it would be much slower.
Lex Fridman
(00:51:56)
What did you do in that minute?
Tal Wilkenfeld
(00:51:59)
Just hang out.
Lex Fridman
(00:52:00)
Just integrate?
Tal Wilkenfeld
(00:52:01)
Yeah. It’s like my brain was telling me, just chill out for a sec. That’s enough information. Let me take a second to integrate that. That’s at least what it felt like to me. And the most hilarious thing happened a couple months ago. I know you’re friends with Andrew Huberman. He put out some clip, which was a part of one of his podcasts, about learning. And he said that there was some research done on learning fast, and that if you practice something for a minute or so, and then you let your brain rest for 30 seconds or a minute, that in that 30 seconds or a minute, your brain does the repetition 20 to 30 times faster, and in reverse. And I was like, whoa, that’s so cool. Because that’s what I used to do when I was a kid, now there’s science that proves that. Which is really cool for musicians to know that that’s a good way to practice efficiently. Because some musicians, they’re practicing for six, seven, eight hours a day. I’ve never done that. I’ve never practiced more than an hour a day, even now. That’s my technique, and it works.
Lex Fridman
(00:53:17)
Are you also practicing in your head sometimes?
Tal Wilkenfeld
(00:53:20)
Now, I’m not practicing as much. I’m more always writing songs in my head, so that’s why I like silence. That’s why I love being in the empty hotel room and being alone. Songs come to me while I’m showering, or walking around, doing the dishes. Or occasionally when I’m hanging out with friends, or comedians, and people will just say shit. And I’ll be like, that’s a cool line. Just jot it down on my phone.
Lex Fridman
(00:53:46)
So it’s not always musical, it’s sometimes lyrical.
Tal Wilkenfeld
(00:53:48)
It’s more lyrical than musical now. Because for me it’s like, well, there’s so much music in the world. If I’m going to write a song, I want the song to be about something interesting. And so, yeah, the words matter to me.
Lex Fridman
(00:54:07)
Yeah. And the right word has so much power. It’s crazy, like we said with Leonard Cohen. And then they’re often simple, the really powerful ones are simple.
Tal Wilkenfeld
(00:54:18)
And when you mentioned Hallelujah, he wrote like 80 verses to Hallelujah before he narrowed it down to four. And it took him like 15, 20 years to write that song. Some writers will do that, and then other writers just vomit it out and it’s beautiful. I’ve heard that Bob Dylan or Joni Mitchell, they’re fast writers. It just kind of comes out.
Lex Fridman
(00:54:41)
That makes me feel so good, to know Leonard Cohen wrote so many verses of that. It was so deliberately crafted, extensively, rigorously crafted.
Tal Wilkenfeld
(00:54:53)
He just would spend months and years, constantly refining, refining.
Lex Fridman
(00:55:00)
Do you have songs like that for yourself, where you refine for many years?
Tal Wilkenfeld
(00:55:03)
Yeah, it’s song dependent. Some just flow out and it’s like, oh, there it is. Everything’s there. And then other songs, it’s like, you might have started it with music, and there’s some words that come out. And then trying to fill in the rest of the words, sometimes it can be like a square peg in a round hole, and other times it’s like, oh no, I can… it depends. Sometimes it becomes like a math problem, and hopefully it doesn’t. Because you just want to say what’s right for the song. And usually when you write it all together, like the lyric, and the melody, and the chords and everything’s developing at once, at least for the first draft, that’s very, very helpful. Sondheim used to write like that. He wouldn’t move on until… he would just go this way. Whereas for me it’s just like, I’ll just go with what seems to be coming naturally, and I’ll just let it be what it is.

(00:56:00)
And then you come back and you say, okay, well, what do I have to do to this now? What’s needed?
Lex Fridman
(00:56:07)
Just to linger on the learning process, what would you recommend for young musicians on how to get good? What are the different paths a person can take to understand it deeply enough to create something special?
Tal Wilkenfeld
(00:56:26)
I think first and foremost, understanding why you are playing music. If it’s because you have something that you’re trying to express or that you’re just in love with expression itself, with art itself, those are great reasons to start this journey.
Lex Fridman
(00:56:47)
The why should be-
Tal Wilkenfeld
(00:56:49)
I think the why is really important because it’s a jagged lifestyle and there’s a lot in it. And so if you don’t have your purpose, if you’re not centered in your purpose, then all that jagged lifestyle is probably going to get to you.
Lex Fridman
(00:57:06)
Jagged.
Tal Wilkenfeld
(00:57:06)
It’s jagged.
Lex Fridman
(00:57:07)
Interesting word.
Tal Wilkenfeld
(00:57:08)
Yeah, it’s jagged. It’s all over the place. It’s uncertain. It’s one thing one moment, and a completely different thing another moment. You never know what’s going to happen. And if you thrive on variety, which I love variety, then it’s perfect. But also every human being needs a certain amount of certainty and structure, and so the certainty can come from your inner knowing, knowing that you’re doing exactly what you want to be doing and knowing what your purpose is in doing it, in this expression. Otherwise, you’re just kind of like a leaf blowing in the wind.
Lex Fridman
(00:57:48)
In the early days touring, just playing clubs seems like tough.
Tal Wilkenfeld
(00:57:52)
Yeah.
Lex Fridman
(00:57:53)
It’s a lot.
Tal Wilkenfeld
(00:57:54)
Yeah, it’s a lot. The physical labor aspect of it is really hard. Playing on stage to two people, or 2,000, or 20,000, that doesn’t make a difference. I mean, it makes a difference to the ticket sales, which informs what level of luxury you might have on the road or not. But other than that, it’s just people there listening to music. The music doesn’t change.
Lex Fridman
(00:58:19)
Does it make it tough when it’s two people versus 200?
Tal Wilkenfeld
(00:58:21)
No.
Lex Fridman
(00:58:23)
So even if nobody recognizes whatever the thing you’re doing.
Tal Wilkenfeld
(00:58:26)
No, because the idea is to be having a great conversation on stage.
Lex Fridman
(00:58:33)
The audience can come and go.
Tal Wilkenfeld
(00:58:35)
Yeah. There’s certain points in shows where I consciously am like, oh yes, there’s an audience over there. I’m so wrapped up in whatever’s happening on stage.
Lex Fridman
(00:58:49)
You forget yourself.
Tal Wilkenfeld
(00:58:50)
Or maybe I’m remembering myself.
Lex Fridman
(00:58:52)
Oh, damn. Call back, somehow feels like one. Okay. You think every instrument is its own journey. You play guitar, you play bass, you sing, just the mastery of an instrument, or let’s avoid the word mastery, the understanding of an instrument is its own thing, or are they somehow physical manifestations of the same thing?
Tal Wilkenfeld
(00:59:19)
It’s both. Every instrument has its strengths, beauty, limitations, range, possible range that can be extended to some degree or another depending on who you are, like trumpet or something. Certain people can hit higher notes than others, blah, blah, blah. But that being said, we’re all playing the same 12 or 24, however you divide the octave, that many notes. We’re all playing the same notes. So in that sense, it’s all the same thing. It’s just music or better yet it’s just art or expression. But yeah, every instrument has, you’ve got to go through the physical aspects of it, the motor skills and all of that, and hopefully you get through that really quickly so you can get to the expression quickly because if you get stuck in just that first phase, that’d be really boring.
Lex Fridman
(01:00:19)
But that’s a pretty long phase. The technical skill required to really play an instrument.
Tal Wilkenfeld
(01:00:27)
For some people it’s a long thing, and some people it’s short. It very much varies. It might have to do with how you learn, and getting to know your strengths in learning. Is it more aural, or is it more… What’s your strength, and playing off of those strengths. So for me, like I was saying earlier, it was just an intuitive thing that I knew. I can feel when my brain is full that it needs processing time. And so I listened to that. I don’t push past it, even if it’s one minute and I do something, I’m like, okay. Silence. And then I come back and I trust that it’s going to be there and it is there. So just trusting yourself I think is really important. Trusting that you know yourself better than anybody else is going to know you.

(01:01:23)
So that’s the kind of thing with teachers that can be either really, really helpful and great or really not great. I’m primarily self-taught. I’ve had amazing mentors of all walks of life, and I think I’m unbelievably blessed that my mentors are some of my favorite musicians on Earth, whether it’s Leonard Cohen or Jeff Beck or Wayne Shorter, whoever these people are, they are my favorite musicians. So not everyone has that opportunity, but the opportunity that we have now that I didn’t have when I was starting is that everything’s on YouTube. Every interview with every genius. You don’t need to necessarily have these people in person now. I mean, and then I’ll say to that, yes and no. I agree with myself, and then I don’t agree with myself. And the reason is I do believe that there is something that happens when you’re in person with a master in some cases, that there is something transferred that is not intellectual, it’s not spoken, it’s something else that happens, that can happen, that I’ve experienced, and I really value that.
Lex Fridman
(01:02:47)
And I think that applies to specific disciplines and also generally. I've been around Olympic gold medalists just to hang out with them for several days, and there's something about greatness. There's a way about them that permeates the space around them. You kind of learn something from it, even if you don't practice that particular discipline; there's something to it if you're able to see it. I also like what you said about playing stuff in your head, that it forces you to not be lost in the physical learning of the instrument. I think that's one of the things I probably regret a little bit. So I play both piano and guitar, and over the years I've become quite technically proficient at the instruments.
Tal Wilkenfeld
(01:03:43)
I’ve seen.
Lex Fridman
(01:03:43)
But I think my mind is underdeveloped because of that, meaning I can't really… I can feel the music when it's created, but I can't create out of the feeling. I haven't practiced projecting the feeling onto the music. You know what I mean? I'm not really a musician. It's a different muscle. I think if you really want to create beautiful things, the creation happens here, not with your hands.
Tal Wilkenfeld
(01:04:17)
I think it’s more here.
Lex Fridman
(01:04:18)
Or whichever it is, some part of the body, but it’s not with your fingers.
Tal Wilkenfeld
(01:04:21)
Yeah, because I think the fingers is more of this.
Lex Fridman
(01:04:22)
Sure.
Tal Wilkenfeld
(01:04:24)
And then…
Lex Fridman
(01:04:25)
Yes, it is here.
Tal Wilkenfeld
(01:04:27)
Yeah.
Lex Fridman
(01:04:27)
Right. And it’s just nice that you said that because it’s really good advice if you want to create.
Tal Wilkenfeld
(01:04:34)
Yeah, slowing down is really great too.
Lex Fridman
(01:04:38)
What do you mean slowing down?
Tal Wilkenfeld
(01:04:40)
Slowing everything down? It could be, I can play something really fast, but I may want to practice it like…
Lex Fridman
(01:05:09)
Go slow as possible.
Tal Wilkenfeld
(01:05:12)
There are all these micro movements happening that, if you just go fast, you can't pay as close attention to, like the exact tone that you're pulling from each note. And there's a lot to pay attention to: how my fingers are touching the string here. I can change my tone a million ways just by the direction of this finger, and same with how this lands, and how hard I'm attacking the string, and with what intention I'm hitting the string, emotionally and physically. So even if you can go fast, play it really slow, see how locked into a pocket you can be, feel every aspect of it, because then when it gets sped up, it's still there with you.
Lex Fridman
(01:06:07)
That is brilliant.
Tal Wilkenfeld
(01:06:08)
It's like the transcend and include thing that Ken Wilber talks about.
Lex Fridman
(01:06:14)
I guess that’s what meditation can do for you is to really listen, to observe every aspect of your body, the breath and all this. Here you’re observing every element, every super detailed element, of playing a single note.
Tal Wilkenfeld
(01:06:26)
Yeah.
Lex Fridman
(01:06:27)
It’s cool that if you speed it up, it’s still there with you.
Tal Wilkenfeld
(01:06:30)
It is. Yeah, it is. Because there are certain people who play really fast, but I don't always hear the fullness of tone. And it's like, well, it's probably because maybe they didn't slow it down and really sit with each note and let it resonate through their whole being. It's spiritual. It's a spiritual expression. It's not a sport. A lot of people treat music like a sport.
Lex Fridman
(01:07:04)
Since starting to learn more, like Stevie Ray Vaughan versus Jimi Hendrix, I would spend quite a long time on single notes, just bending, just listening to what you can do with bends. Just thinking that people like B.B. King and all these blues musicians spend a career just making a single note cry. There's an art form to that.
Tal Wilkenfeld
(01:07:28)
Yeah.
Lex Fridman
(01:07:29)
And I think your point about taking it really slow, which I never really thought of, is a really good idea. Really slow it down.
Tal Wilkenfeld
(01:07:39)
It’s the same with sitting with your own emotions. It’s like when emotions are overwhelming to us, we get real busy or we move real fast because we don’t want to feel our feelings. Those are the moments to slow yourself down.
Lex Fridman
(01:07:57)
And observe it, anger, jealousy, loneliness.
Tal Wilkenfeld
(01:08:01)
And just be with it. Be cool with it. Love it. Love the anger.
Lex Fridman
(01:08:06)
It’s all beautiful. Can you educate me on the difference between bass.

Slap vs Fingerstyle

Tal Wilkenfeld
(01:08:13)
Bass and bass? Okay, well, one is a fish.
Lex Fridman
(01:08:16)
At least I pronounced it correctly. That’s good. It’s all about the bass.
Tal Wilkenfeld
(01:08:20)
Can you pronounce my name?
Lex Fridman
(01:08:22)
Tal.
Tal Wilkenfeld
(01:08:23)
Wow. Most people say Tal or tall. You said-
Lex Fridman
(01:08:29)
Tall, who says tall?
Tal Wilkenfeld
(01:08:31)
So many people.
Lex Fridman
(01:08:32)
In the south, maybe tall.
Tal Wilkenfeld
(01:08:34)
I don’t know. But the fact that you said my name right.
Lex Fridman
(01:08:34)
Oh, honey tall.
Tal Wilkenfeld
(01:08:36)
You get extra points.
Lex Fridman
(01:08:37)
Tal. I didn’t know this was a game. Am I winning?
Tal Wilkenfeld
(01:08:41)
Yep.
Lex Fridman
(01:08:41)
I like winning. How do you play the bass? What’s the difference between finger style and slap?
Tal Wilkenfeld
(01:08:48)
Slap is like this; fingerstyle is like this.
Lex Fridman
(01:08:50)
Have you ever played bass with a pick?
Tal Wilkenfeld
(01:08:52)
Yeah, sometimes
Lex Fridman
(01:08:54)
I’m not accusing you of anything.
Tal Wilkenfeld
(01:08:55)
No accusation taken.
Lex Fridman
(01:08:57)
I don’t know if these are sensitive topics.
Tal Wilkenfeld
(01:08:59)
That would be pretty hilarious if I was sensitive about bass techniques, but not about love.
Lex Fridman
(01:09:05)
It just looks so cool to slap it, and I don’t understand what that’s about. That thumb thing that…
Tal Wilkenfeld
(01:09:11)
Yeah, I slap less, a lot less. Almost never, actually. It has a very distinctive sound and does a very distinctive thing to a song, and it's not something I hear needed very often in music today. But in certain styles, like funk, it sounds awesome and it makes sense. It was something that was a bit overused at one point. For instance, my mentor Anthony Jackson refused to slap. He actually said, if you want me to slap, I'll leave this gig. So I'm not like that.
Lex Fridman
(01:09:56)
See, that’s why I said sensitive. See, I was reading into it.
Tal Wilkenfeld
(01:09:59)
Because he’s sensitive about it. I’m not sensitive.
Lex Fridman
(01:10:00)
I was feeling the spiritual energy of the sensitivity of the topic.
Tal Wilkenfeld
(01:10:03)
Anthony Jackson.
Lex Fridman
(01:10:04)
Anthony Jackson.
Tal Wilkenfeld
(01:10:04)
And then, I'm playing electric bass, so generally speaking, you don't particularly want to hear electric bass on straight-ahead jazz anyway; you want to hear an upright bass. But if I was to play jazz on electric bass, I might even palm mute instead of playing open, anything to make the notes shorter and less resonant so they fade away, because the upright does that naturally. And I have a different bass, a hollow-body Harmony, that sounds closer to an upright, that I'll use. On my song Under the Sun, that I put out, that was on a Harmony bass. And it has an upright, acoustic kind of tone to it, but with more sustain.
Lex Fridman
(01:10:58)
And is Jazz fusion the style where you have an electric bass? Can you educate me?
Tal Wilkenfeld
(01:11:06)
Again, you can have both. You can have both. You can have either on anything. There are no real rules now.
Lex Fridman
(01:11:14)
I've heard you say something interesting, which is… well, a lot of things you say are interesting.
Tal Wilkenfeld
(01:11:17)
Just one thing.
Lex Fridman
(01:11:20)
Just one. That-
Tal Wilkenfeld
(01:11:23)
And it’s what time you’re leaving.
Lex Fridman
(01:11:27)
What time was that again?
Tal Wilkenfeld
(01:11:29)
Three minutes.
Lex Fridman
(01:11:30)
That it's maybe easier sometimes to define a musical genre by the don'ts than by the do's. What are the don'ts of jazz and rock? What are the don'ts of jazz fusion? In any domain of life, what are the don'ts?
Tal Wilkenfeld
(01:11:50)
The don't is just: please leave your fear at the door. And the do's are to be open to anything, open your ears, and respond to what's happening now. I think that quote you're talking about might have been more about an individual musician's unique sound, because everyone has their sound. If they've developed their voice and they've listened to their own aesthetic preferences, of which everyone's are slightly different, everyone has slightly different likes and dislikes, then you'll have a unique sound on your instrument. And your unique sound is defined more by the choices you make rather than… I mean, it's equally defined by the choices you make and the choices you don't make. It's the flip side of the same coin, really.
Lex Fridman
(01:12:46)
Yeah. There are certain musicians you can just tell it's them. You hear a few notes and you're like, okay, it's them. Sometimes it's tone. Sometimes it's the way they play a rhythm.
Tal Wilkenfeld
(01:12:56)
Yeah, the quote you're talking about might have even had to do with someone's real limitations on an instrument, that their sound is then defined as much by the things they actually can't do as by what they're choosing to do versus not choosing to do. Which is that flip side of the same coin thing.
Lex Fridman
(01:13:14)
How many fingers you play with, because it seems like a lot of the greatest musicians aren't technically perfect. The imperfections are the thing that makes them unique and where a lot of the creativity comes from. I mean, Hendrix had a lot of those things, the way he put a thumb over the top.
Tal Wilkenfeld
(01:13:34)
Well, his hands were huge. There was no other place for the thumb to go. And it was great that he could reach the E string and that was an advantage.
Lex Fridman
(01:13:43)
And he was a lefty playing a right-handed guitar, flipped, I guess. That’s weird. That probably doesn’t have much of an effect. Maybe a spiritual one. I don’t know.
Tal Wilkenfeld
(01:13:55)
Actually, flipping a guitar is different. It does bring out something different in you, because I've done it, flipped it. It's like, oh wow, it's really different. I remember talking to an osteopath about it, because there's so much weight on this shoulder while I'm playing all the time, and they were saying, well, after shows, just literally turn it upside down and do the exact same thing the opposite way. It'll even out your body. And I was like, that's good advice.
Lex Fridman
(01:14:24)
Have you actually tried it? Okay. All right, I’ll write that down. All right. Well, do you know a guy named Davie504?

Davie504

Tal Wilkenfeld
(01:14:36)
I’ve heard of him.
Lex Fridman
(01:14:37)
I’ve recently learned of him. He’s a YouTuber and a bass player. He’s amazing.
Tal Wilkenfeld
(01:14:42)
Cool.
Lex Fridman
(01:14:42)
He combines memes and also just these brilliant bass compositions, and he says "slap" a lot. He's big into slapping. He's the one that made me realize this is a thing. And he also said that you're one of the best, if not the best, bassists in the world. A bunch of his fans wrote in, and he analyzed the Jeff Beck performance that we watched from Crossroads as one of the greatest solos ever, bass solos ever. So shout out to him. How does that make you feel, being called one of the greatest of all time?
Tal Wilkenfeld
(01:15:13)
Chocolate cookies.
Lex Fridman
(01:15:14)
Chocolate. Is that your favorite?
Tal Wilkenfeld
(01:15:15)
I like macadamia nut. If you really want to get into it, with white chocolate.
Lex Fridman
(01:15:21)
Yeah, that’s a rare one for people to say is the favorite.
Tal Wilkenfeld
(01:15:24)
Chocolate chip is just so easy. You can kind of get them anywhere.
Lex Fridman
(01:15:27)
Yeah. The last thing you want to be is easy in this world. You don't want to be easy. You said that "I love rock and roll" quote: "I love folk. I love jazz. I love Indian classical music. I really love all kinds of music as long as it's authentic and from the heart." So when you play rock versus jazz, and you play all kinds of music, what's the difference technically, musically, spiritually for you?
Tal Wilkenfeld
(01:15:49)
Well, there’s no spiritual difference.
Lex Fridman
(01:15:54)
Okay. All right. Cross that off the list,
Tal Wilkenfeld
(01:15:56)
Well, musically, yeah, it's like what I was saying earlier: each genre has its language, what makes it that genre. And that would be a good way to say it, that it's defined by the do's and don'ts. But, I'm trying to think. Basically, I put the song first, and I think of the song as the melody, the lyrics, and then the harmony, and obviously the groove.
Lex Fridman
(01:16:34)
So the song goes before the genre in a sense. Each song is like its own thing.
Tal Wilkenfeld
(01:16:39)
They’re both things that are held in my mind. It’s like, okay, genre and then song, which is comprised of those basic elements. And I tend to kind of prioritize lyric because somebody is trying to express something over music. And so the lyric is very, very important. And so then the choices come from there. It’s like, okay, within the genre of X this is the typical language. And then how do I best serve this lyric? And then where else can I pull from that might not be in these two bags that would put a little twist on it. So those are all the kinds of things I might be thinking about.

(01:17:34)
But I don't like twists for the sake of twists either. I like twists because I want to hear something that might be fresh. But when someone does something just to be hip, it's annoying to me. I think you can hear the difference. It's like when people write in odd time signatures or write all these riffs just because they can, just because they have the chops to do it or they know how to play in 11/16 or whatever. If it's not actually creating a piece of music that's going to move somebody, then why are you doing it? And so I think a lot of the questions I'm asking myself when I'm approaching a song are mainly philosophical and aesthetic.
Lex Fridman
(01:18:27)
So you like to stand on the edge of the cliff, not for the thrill of it, but because that's where you potentially find something new.
Tal Wilkenfeld
(01:18:34)
And it’s thrilling.
Lex Fridman
(01:18:36)
But you’re not doing it just for the thrill.
Tal Wilkenfeld
(01:18:37)
I’m not doing it for the thrill. It just happens to be thrilling.
Lex Fridman
(01:18:41)
All right.
Tal Wilkenfeld
(01:18:42)
Because you can always reel it back in.
Lex Fridman
(01:18:45)
Can you though?

Prince

Tal Wilkenfeld
(01:18:46)
Yeah. You can be totally disciplined. I can go into a session and… Okay, my favorite thing about going into a session with musicians that I adore is that we don't hear the demo, because if you hear a demo, you're hearing what the producer or songwriter has already imagined every instrument playing. And then it's like, well, I've already heard what you want. Now part of my mind is focused on what I already know you want and what the destination is going to be. Why did you bring me in here? I want to not hear it. I just want you to sit at a piano and sing the song, I want to hear the chords and the lyric, or sit with an acoustic guitar and play it, and then let's all go in the room.

(01:19:29)
And then take one, I would say 80% of the time, take one has the most gold. There might be a mistake or two, or someone forgot to go to the B section and you might want to punch that in so that you're hitting the right chord, but all the magic is in that take. And then sometimes it happens where we're rehearsing, and it's take 1, 2, 3, 4, 5, and you're thinking about it too much, and then you go and have dinner and come back, and the first take after dinner is the one. It's usually after there's some sort of a break, but obviously there are exceptions to that rule. Sometimes it's take two or three.
Lex Fridman
(01:20:10)
Yeah. You said that's something that surprised you about recording with Prince, that so much of it would be take one. It would just move so quickly.
Tal Wilkenfeld
(01:20:21)
Yeah. Well, with that particular album that we made together, it's called Welcome 2 America. He called me up and asked me, he said, "I want to make a band with you. I'm really inspired by what you're doing with Jeff Beck. I want to make a trio. Do you like the drum rolls of Jack DeJohnette?" That was like his first question to me. I'm like, well, yeah, who doesn't? Who doesn't like Jack DeJohnette, one of the greatest of all time?

(01:20:44)
And he’s like, well, sounds like, because we had a discussion about drumming, sounds like you’re particular about drummers. So why don’t you find us the drummer and I’ll trust you to find the drummer. You can audition some people. Send me some recordings, maybe your two favorites, and I’ll pick out of the two or something. So I did that. Went on a journey, found a couple of guys. He picked the one. We went in and he basically just would be like okay, so the A section’s going to go like this, and then the B section, I think we’re going to go to G, and then the bridge, I might go to B flat, but maybe I’ll hold off and da, da, da. Okay, let’s go 1, 2, 3, 4. And then we recorded it to tape. There was no punch. He did not want me to punch anything.

(01:21:34)
There was one song called Same Page, Different Book. And he talked through it just like he did. And then he had me soloing between each phrase like little fills. I didn’t know that that was going to come up. And he loved that. He loved to have me on the edge of my seat falling off the cliff. That was my first real falling off a cliff moment from somebody else holding me at the edge of the cliff. You know what I mean? Now I just do it on my own because it’s so fun and it makes sense. It’s the best thing for the music.
Lex Fridman
(01:22:13)
When you say punch the tape, is that when you actually record it?
Tal Wilkenfeld
(01:22:17)
If you record to tape and, say, you hit a bum note, to punch in means to fix that note: you re-record over that one little area and punch the new note in. He didn't want that. He's like, on all my favorite records, whatever happened, happened. That's that moment in time. Let's make a new moment in time. It's great. Nobody makes records like that anymore. Everyone wants to edit and edit and re-record and this and that. And unfortunately with a lot of music, and I'm not saying all music, there's plenty of great music coming out, but there's the danger of it being flat, because every little imperfection is digitally removed.
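In digital terms, a loose analogy rather than a description of how a tape machine physically works, a punch-in is just overwriting one region of the recording with a new take of that region. The sample rate and placeholder signals below are assumptions for illustration.

```python
import numpy as np

def punch_in(track: np.ndarray, new_take: np.ndarray, start: int) -> np.ndarray:
    """Overwrite a region of `track` with `new_take`, starting at sample `start`."""
    fixed = track.copy()
    fixed[start:start + len(new_take)] = new_take
    return fixed

# e.g. re-record half a second of a 44.1 kHz track starting at the 10-second mark
sr = 44100
track = np.zeros(30 * sr)                   # placeholder for the original take
new_take = 0.1 * np.random.randn(sr // 2)   # placeholder for the re-recorded phrase
track = punch_in(track, new_take, start=10 * sr)
```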
Lex Fridman
(01:23:05)
Well, that's one of the promising things about AI: because it can be so perfect, the thing we'll actually come back to and value about music is the imperfections that humans can create.
Tal Wilkenfeld
(01:23:16)
Yeah.
Lex Fridman
(01:23:17)
There’ll be a greater valuation of imperfections.
Tal Wilkenfeld
(01:23:20)
Yeah. I mean you can program imperfections too.
Lex Fridman
(01:23:24)
Yeah, sure. That’s also very sad. But then you get closer and closer to what it means to be human, and maybe there’ll be AIs among us. And they’ll be human, flawed, like the rest of us. Mortal and silly at times.
Tal Wilkenfeld
(01:23:42)
Another big sigh.
Lex Fridman
(01:23:46)
Is it fair to say that you’re very melodic on bass? You make the bass sing more than people normally do?
Tal Wilkenfeld
(01:23:55)
Is that a compliment?
Lex Fridman
(01:23:56)
Yes, I think so.
Tal Wilkenfeld
(01:23:57)
Thank you.
Lex Fridman
(01:23:59)
Moving on to the next question. By way of understanding-
Lex Fridman
(01:24:00)
The next question is, by way of understanding, there's just something about the way you play bass that pulls you in, the way a guitar solo does when you listen to somebody play guitar.
Tal Wilkenfeld
(01:24:13)
The thing I love about Jeff Beck is that he played the guitar like a singer, and I think the way that Wayne Shorter played his saxophone. It’s like a singer. And I think everyone, every musician, aspires to just sound like a singer.

Jimi Hendrix

Lex Fridman
(01:24:29)
You make it sing. Let me ask you about… Just come back to Hendrix, because you said that you had three CDs, Jimi Hendrix, Herbie Hancock and Rage Against the Machine. First of all, a great combination. I’m a big Rage fan.
Tal Wilkenfeld
(01:24:42)
It’s so funny, because when I listen to some of the music that I create, my solo music, I’m like, “I could see how this is a combination of Herbie Hancock, Rage Against the Machine and Jimi Hendrix.” I hear the influences. It’s funny.
Lex Fridman
(01:24:58)
Just from your musician perspective, what’s interesting to you about… What really stands out to you about Hendrix? I just would love to hear a real, professional musician’s opinion of Hendrix.
Tal Wilkenfeld
(01:25:13)
I love that he is two voices combined into one voice. So there is his voice on the guitar, there is his singing voice, and there is the combination of the two that makes one voice. And of course the third element is songwriting. And all of these have this beautiful chemistry, and all work ingeniously, perfectly together, and there's nothing like it. And he always beat himself up about being a singer, and he didn't like his voice, but my favorite singers are the singers that don't sound like singers.
Lex Fridman
(01:25:58)
Bob Dylan.
Tal Wilkenfeld
(01:25:59)
Bob Dylan.
Lex Fridman
(01:26:00)
You said you like Bob Dylan.
Tal Wilkenfeld
(01:26:01)
Love Bob Dylan.
Lex Fridman
(01:26:03)
You love his voice too?
Tal Wilkenfeld
(01:26:04)
I love his voice.
Lex Fridman
(01:26:06)
Can you explain your love affair with Bob Dylan’s voice?
Tal Wilkenfeld
(01:26:12)
He’s expressing his lyrics. It’s just pure expression, exactly what he means. I feel everything that he’s saying with 100% authenticity. That’s what I want to hear from a singer. I don’t care how many runs you can do and blah blah blah. I want to believe what you’re saying.
Lex Fridman
(01:26:33)
Leonard Cohen is that.
Tal Wilkenfeld
(01:26:34)
Mm-hmm. There’s countless, like Neil Young. I mean, there’s so many musicians. I love Elliott Smith for that reason.

Mentorship

Lex Fridman
(01:26:44)
Let me ask you about mentorship. You said teachers and mentors. You had mentors. What’s a good mentor for you, harsh or supportive?
Tal Wilkenfeld
(01:26:54)
Supportive.
Lex Fridman
(01:26:55)
Supportive. Have you seen Whiplash, the movie? So that guy, somebody screaming at you, kicking you off the cliff?
Tal Wilkenfeld
(01:27:03)
Not necessary. I feel like anybody that’s truly passionate about something that they want to be great at or a master of or this and that, they’ve already got that person inside their own head. You don’t need somebody else to do that for you. I think you need love, acceptance, guidance, support, time, advice if you ask for it, just a space, just a nice, open space.

(01:27:32)
All my mentors were just that for me. They didn’t tell me to do anything. They don’t care, because they’re not… Why do they need to be invested in where I’m going? Only I know where I’m going. So for some mentor to come and be like, “This is what you need to be doing, and practice…” It’s like, but why? What if that’s not my path? That might be your path. So I’m not really… Again, otherwise it feels like a sport, like who can run the fastest race. And it’s like, well, okay, I get that for sport maybe it makes sense to have someone a bit more hardcore. But still, I would say athletes have the same mentality. They’ve got that in them already too. So I think more of a strategic approach to mentorship works really well, and mainly just having an open space and just being available to someone.
Lex Fridman
(01:28:28)
And show that they see the special in you, and they give you the room to develop that special whatever.
Tal Wilkenfeld
(01:28:35)
Exactly, because if you do have that harsh critic inside you, it is nice to have somebody that isn’t your family, or someone that’s not obligated any way, that just sees your talent and they’re like, “Yeah, I dig what you’re doing. Keep doing it.”
Lex Fridman
(01:28:51)
Yeah. It’s funny that that’s not always easy to come by.
Tal Wilkenfeld
(01:28:56)
Do you have any mentors?
Lex Fridman
(01:28:58)
I’ve had a few recently, but for most of my life people didn’t really… I’m very much like that too. Somebody to pat me on the back and see something in you of value. Yeah, I didn’t really have that.
Tal Wilkenfeld
(01:29:16)
Do you wish you did?
Lex Fridman
(01:29:17)
Yeah, yeah. But maybe the wishing that I did is the thing that made me who I am, not having it, the longing for that. Maybe that’s the thing that helped me develop a constant sense of longing, which I think is a way of… Because I have that engine in me, it really allows me to deeply appreciate every single moment, everything that’s given to me, so just eternal gratitude. You never know which are the bad parts and the good parts. If you remove one thing, the whole thing might collapse. I suppose I’m grateful for the whole thing. That one note you screwed up so many years ago, that might’ve been essential.
Tal Wilkenfeld
(01:30:11)
You do jujitsu.
Lex Fridman
(01:30:13)
Yes. Do you? Are you-
Tal Wilkenfeld
(01:30:15)
My dad does. My dad’s super into it. I love my dad. He’s the coolest. But no, I don’t do it. He’s a blue belt right now.
Lex Fridman
(01:30:27)
Nice, nice. You ever been on the mat with him?
Tal Wilkenfeld
(01:30:30)
Not yet, but I plan on it.
Lex Fridman
(01:30:32)
Should do it.
Tal Wilkenfeld
(01:30:33)
What belt are you?
Lex Fridman
(01:30:35)
Black belt.
Tal Wilkenfeld
(01:30:35)
Sick. Do you want to go?
Lex Fridman
(01:30:38)
Right. You got the shit-talking part of jujitsu down. [inaudible 01:30:41] do the technique.
Tal Wilkenfeld
(01:30:44)
But for that, for instance, do you need a harsh mentor or teacher or-
Lex Fridman
(01:30:53)
Yeah, but you said it really beautifully. To me, I agree, there’s a difference between sport and art. They overlap for sure, but there’s something about sport where perfection is actually… Perfection is really the thing you really want to get to, the technical perfection. With art, it feels like technical perfection is almost a way to get lost on the path to wherever, something unique. But yeah, with sport, I definitely am one of the kind of athletes that loves to have a dictatorial coach, somebody that helps me really push myself to the limit.
Tal Wilkenfeld
(01:31:38)
But you are the one that’s dictating how hard you’re getting pushed, in a way. You’re choosing your mentor. That Whiplash video is like… He didn’t ask for that.
Lex Fridman
(01:31:48)
[inaudible 01:31:48] he might’ve.
Tal Wilkenfeld
(01:31:49)
Well, maybe. Maybe subconsciously. It’s a movie.
Lex Fridman
(01:31:56)
Next you're going to tell me they're just actors. But yeah, how do we choose things? You don't always choose, but maybe you subconsciously choose. And some of the great Olympic athletes I've interacted with, their parents for many years would force them to go to practice until they discovered the beauty of the thing that they were doing, and then they loved it. So at which point does something that looks like abuse become a gift? It's weird. It's all very weird. But for you, support and space to discover the thing, the voice, the music within you.
Tal Wilkenfeld
(01:32:40)
Yeah, it’s my personal choice, because I’m very familiar with the inner critic, and I can bring her out at any point. I don’t need help with that.
Lex Fridman
(01:32:48)
So you do have… She’s on call.
Tal Wilkenfeld
(01:32:50)
She was in overdrive. That's why I've had to work on that so much.
Lex Fridman
(01:32:57)
Yeah, you have a really happy way about you right now.

Sad songs

Tal Wilkenfeld
(01:33:00)
Thanks.
Lex Fridman
(01:33:00)
You’re very Zen. Can I ask you about Bruce Springsteen?
Tal Wilkenfeld
(01:33:05)
Yeah, sure.
Lex Fridman
(01:33:05)
A lot of songs of his I listen to make me feel this melancholy feeling. Not just Bruce Springsteen, but Bruce does a lot. What is that about songs that arouse a sad feeling or a longing feeling or a feeling? What is that? What is that about us humans on the receiving end of the music?
Tal Wilkenfeld
(01:33:30)
Frequencies. Each frequency does elicit a different kind of emotional response. That is real, scientific-
Lex Fridman
(01:33:40)
You mean on the physics aspect of it?
Tal Wilkenfeld
(01:33:41)
Yeah, yeah, the physical level. So there is that, and combined with the right kind of lyric and the right kind of melody over the right kind of chord, it will elicit a very particular kind of emotion. And it is scientific. It can be analyzed. I don't particularly want to analyze it, because I don't want to approach things with that in advance. I don't want it to inform where I'm going. I like the feeling to lead me naturally to where I'm writing. But yeah, there's a real chemical element to that.

(01:34:19)
And then also, like I was saying, the lyric, what it means to you, which… Poetry is supposed to mean something to everybody different. It’s not supposed to mean one thing. You can’t analyze and be like, “This is what this poet meant.” And like we were talking about with Leonard earlier, it’s like the broader you can leave a lyric, the better. You can appeal to people in so many different ways. And even to the songwriter. I’ll sing some of my songs from five years ago and I’ll be like, “I didn’t even think that it could have meant that, but I guess it does. That’s funny.” I’ll just giggle onstage suddenly, because a lyric will hit me differently, from a different, new experience or something.
Lex Fridman
(01:35:05)
Have you ever cried listening to a song?
Tal Wilkenfeld
(01:35:07)
Of course. Weep like a baby in a bathtub.
Lex Fridman
(01:35:12)
Which? Who’s the regular go-to, then?
Tal Wilkenfeld
(01:35:17)
Leonard.
Lex Fridman
(01:35:17)
Leonard?
Tal Wilkenfeld
(01:35:18)
Leonard.
Lex Fridman
(01:35:19)
Yeah. Hallelujah is a song that consistently makes me feel something.
Tal Wilkenfeld
(01:35:25)
It’s holy. His work is holy. And if you were in his presence… I guess there was a lot to that being.
Lex Fridman
(01:35:40)
What advice would you give to young folks on how to have a life they can be proud of?
Tal Wilkenfeld
(01:35:47)
Just tackle the demons as early as possible, whether it’s through your art or through meditation or through whatever it means, diaries, whatever it is. Just walk towards the things that are scary, because if you don’t, they’ll just expand. They become bigger if you avoid… If you avoid the demons, they become bigger.
Lex Fridman
(01:36:15)
What does that mean for you today? Are you still missing Jeff?
Tal Wilkenfeld
(01:36:19)
I’ll always miss Jeff, but I don’t feel like a piece of me is missing. And same with Leonard. It’s that I did give them a piece of myself, and maybe they gave me a piece of them that I hold with me and I cherish, but it doesn’t feel like I’m less than, or they’re less than, or anything’s less than. You learn to appreciate the impermanence of everything in life, impermanence of everything except for… Consciousness, I guess you could say, is the only thing that is permanent. So everything else, you learn to appreciate that impermanence, because the limited amount of time in this particular body, it’s enticing, gives you a time limit, which is cool. I like that.
Lex Fridman
(01:37:25)
So you’ve come to accept your own?
Tal Wilkenfeld
(01:37:27)
Yeah. It’s cool that I’m like, “Okay, I’ve got this amount…” Maybe this amount of time. Who knows?
Lex Fridman
(01:37:32)
It could end today.
Tal Wilkenfeld
(01:37:33)
Yeah, if I died today, I’d be really happy with my life. It’s not like I’m like, “Oh, I missed out on this and that.”
Lex Fridman
(01:37:41)
So you really want to make sure that every day could be your last day and you’re happy with that.
Tal Wilkenfeld
(01:37:46)
I’ve always lived that way. Yeah. I felt this way since I was in my early 20s. I’d be like, “Yeah, I could die today. Sure.” I don’t want to die. I have no reason to die. But if I did, I know that I put my everything, all my effort and all my passion and all my love, into whatever I’ve already done. So if my time’s up, then my time’s up.
Lex Fridman
(01:38:10)
What role does love play in this whole thing, in the human condition?
Tal Wilkenfeld
(01:38:14)
Well, love is everything. I mean, if you define love… If you’re talking about love as in romantic love or paternal or maternal love, or if you’re talking about love as in an Eastern tradition, like Vedanta for instance, love is consciousness, love is everything.
Lex Fridman
(01:38:36)
That’s the only permanent thing.
Tal Wilkenfeld
(01:38:38)
Yeah. Or if you were to come from a Zen or a Buddhist perspective, they would say nothingness, emptiness, versus fullness.
Lex Fridman
(01:38:49)
Well those guys are really obsessed with the whole suffering thing and letting go of it.
Tal Wilkenfeld
(01:38:55)
Yeah.

Tal performs Under The Sun (live)

Lex Fridman
(01:38:59)
Well, I was wondering if you would do me the honor of playing a song.
Tal Wilkenfeld
(01:39:07)
Do you want a suffering song or a suffering song?
Lex Fridman
(01:39:11)
I think I would love a suffering song.
Tal Wilkenfeld
(01:39:12)
Cool. Do you want a sound check and make sure I’m not-
Lex Fridman
(01:39:23)
Sound check. One, two. Yeah, it sounds really good.
Tal Wilkenfeld
(01:39:29)
This one too? All right, count me off.
Lex Fridman
(01:39:31)
Yeah. I don’t know how to count somebody off. Where do I start? At nine? Or three? Two, one.
Tal Wilkenfeld
(01:39:31)
Yeah, you got it. One, two.
Lex Fridman
(01:39:31)
One, two.
Tal Wilkenfeld
(01:39:31)
(singing)
Lex Fridman
(01:44:08)
You’re amazing. That was amazing, Tal. Thank you so much.

Tal performs Killing Me (live)

Tal Wilkenfeld
(01:44:18)
[inaudible 01:44:18]
Lex Fridman
(01:44:20)
Try turning it to 11.
Tal Wilkenfeld
(01:44:21)
It’s quite loud. Can you see it from the headphones? [inaudible 01:44:27]
Lex Fridman
(01:44:28)
Can you play something?
Tal Wilkenfeld
(01:44:29)
No.
Lex Fridman
(01:44:29)
No.
Tal Wilkenfeld
(01:44:41)
Such a professional.
Lex Fridman
(01:44:46)
I should produce your next record.
Tal Wilkenfeld
(01:44:46)
Please.

(01:44:46)
(singing)
Lex Fridman
(01:49:12)
Well, there’s nowhere else I’d rather be right now. Tal, thank you for this. Thank you for the private concert. You’re amazing. You really are amazing. And it was a pleasure to meet you and really a pleasure to talk to you today.
Tal Wilkenfeld
(01:49:27)
Do I get a private concert now of you playing chess with yourself?
Lex Fridman
(01:49:32)
We’re out of time, so we got to go.
Tal Wilkenfeld
(01:49:35)
[inaudible 01:49:35]
Lex Fridman
(01:49:36)
Thanks for listening to this conversation with Tal Wilkenfeld. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Maya Angelou. "Music was my refuge. I could crawl into the spaces between the notes and curl my back to loneliness." Thank you for listening, and hope to see you next time.

Transcript for Guillaume Verdon: Beff Jezos, E/acc Movement, Physics, Computation & AGI | Lex Fridman Podcast #407

This is a transcript of Lex Fridman Podcast #407 with Guillaume Verdon.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with Guillaume Verdon, the man behind the previously anonymous account @BasedBeffJezos on X. These two identities were merged by a doxxing article in Forbes titled, Who Is @BasedBeffJezos, The Leader Of The Tech Elite’s E/Acc Movement? So let me describe these two identities that coexist in the mind of one human. Identity number one, Guillaume, is a physicist, applied mathematician, and quantum machine learning researcher and engineer receiving his PhD in quantum machine learning, working at Google on quantum computing, and finally launching his own company called Extropic that seeks to build physics-based computing hardware for generative AI.

(00:00:47)
Identity number two, Beff Jezos on X, is the creator of the effective accelerationism movement, often abbreviated as e/acc, that advocates for propelling rapid technological progress as the ethically optimal course of action for humanity. For example, its proponents believe that progress in AI is a great social equalizer, which should be pushed forward. e/acc followers see themselves as a counterweight to the cautious view that AI is highly unpredictable, potentially dangerous, and needs to be regulated. They often give their opponents the labels of, quote, "doomers" or "decels," short for deceleration. As Beff himself put it, "e/acc is a memetic optimism virus."

(00:01:37)
The style of communication of this movement leans always toward the memes and the lols, but there is an intellectual foundation that we explore in this conversation. Now, speaking of the meme, I am a kind of aspiring connoisseur of the absurd. It is not an accident that I spoke to Jeff Bezos and Beff Jezos back to back. As we talk about, Beff admires Jeff as one of the most important humans alive, and I admire the beautiful absurdity and the humor of it all. This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here's Guillaume Verdon.

Beff Jezos


(00:02:23)
Let’s get the facts of identity down first. Your name is Guillaume Verdon, Gill, but you’re also behind the anonymous account on X called @BasedBeffJezos. So first, Guillaume Verdon, you’re a quantum computing guy, physicist, applied mathematician, and then @BasedBeffJezos is basically a meme account that started a movement with a philosophy behind it. So maybe just can you linger on who these people are in terms of characters, in terms of communication styles, in terms of philosophies?
Guillaume Verdon
(00:02:58)
I mean, with my main identity, I guess ever since I was a kid, I wanted to figure out the theory of everything, to understand the universe. And that path led me to theoretical physics, eventually trying to answer the big questions of why are we here? Where are we going? And that led me to study information theory and try to understand physics from the lens of information theory, understand the universe as one big computation. And essentially after reaching a certain level studying black hole physics, I realized that I wanted to not only understand how the universe computes, but sort of compute like nature and figure out how to build and apply computers that are inspired by nature. So physics-based computers. And that sort of brought me to quantum computing as a field of study to first of all, simulate nature. And in my work it was to learn representations of nature that can run on such computers.

(00:04:17)
So if you have AI representations that think like nature, then they’ll be able to more accurately represent it. At least that was the thesis that brought me to be an early player in the field called quantum machine learning. So how to do machine learning on quantum computers and really sort of extend notions of intelligence to the quantum realm. So how do you capture and understand quantum mechanical data from our world? And how do you learn quantum mechanical representations of our world? On what kind of computer do you run these representations and train them? How do you do so? And so that’s really the questions I was looking to answer because ultimately I had a sort of crisis of faith. Originally, I wanted to figure out as every physicist does at the beginning of their career, a few equations that describe the whole universe and sort of be the hero of the story there.

(00:05:28)
But eventually I realized that actually augmenting ourselves with machines, augmenting our ability to perceive, predict, and control our world with machines, is the path forward. And that's what got me to leave theoretical physics and go into quantum computing and quantum machine learning. And during those years I thought that there was still a piece missing. There was a piece of our understanding of the world and our way to compute and our way to think about the world. And if you look at the physical scales, at the very small scales, things are quantum mechanical, and at the very large scales, things are deterministic. Things have averaged out. I'm definitely here in this seat. I'm not in a superposition over here and there. At the very small scales, things are in superposition. They can exhibit interference effects. But at the meso scales, the scales that matter for day-to-day life, the scales of proteins, of biology, of gases, liquids and so on, things are actually thermodynamical, they're fluctuating.

(00:06:46)
And after I guess about eight years in quantum computing and quantum machine learning, I had a realization that I was looking for answers about our universe by studying the very big and the very small. I did a bit of quantum cosmology. So that's studying the cosmos, where it's going, where it came from. You study black hole physics, you study the extremes in quantum gravity, you study where the energy density is sufficient for both quantum mechanics and gravity to be relevant. And the extreme scenarios are black holes and the very early universe. So those are the sort of scenarios where you study the interface between quantum mechanics and relativity.

(00:07:42)
And really I was studying these extremes to understand how the universe works and where it is going. But I was missing a lot of the meat in the middle, if you will, because day-to-day, quantum mechanics is relevant and the cosmos is relevant, but not that relevant actually. We're at sort of the medium space and time scales, and there the main theory of physics that is most relevant is thermodynamics, out-of-equilibrium thermodynamics. Because life is a process that is thermodynamical and out of equilibrium. We're not just a soup of particles at equilibrium with nature; we're a sort of coherent state trying to maintain itself by acquiring free energy and consuming it. And another sort of shift in my faith in the universe happened towards the end of my time at Alphabet. And I knew I wanted to build, well, first of all, a computing paradigm based on this type of physics.

(00:08:57)
But ultimately, just by trying to experiment with these ideas applied to society and economies and much of what we see around us, I started an anonymous account, just to relieve the pressure that comes from having an account where you're accountable for everything you say, and to experiment with ideas originally, because I didn't realize how much I was restricting my space of thoughts until I had the opportunity to let go. In a sense, restricting your speech back-propagates to restricting your thoughts. And by creating an anonymous account, it seemed like I had unclamped some variables in my brain and suddenly could explore a much wider parameter space of thoughts.
Lex Fridman
(00:10:00)
Just a little on that, isn’t that interesting that one of the things that people don’t often talk about is that when there’s pressure and constraints on speech, it somehow leads to constraints on thought even though it doesn’t have to. We can think thoughts inside our head, but somehow it creates these walls around thought.
Guillaume Verdon
(00:10:23)
Yep. That's sort of the basis of our movement: we were seeing a tendency towards constraint, reduction, or suppression of variance in every aspect of life, whether it's thought, how to run a company, how to organize humans, how to do AI research. In general, we believe that maintaining variance ensures that the system is adaptive. Maintaining healthy competition in marketplaces of ideas, of companies, of products, of cultures, of governments, of currencies is the way forward, because the system always adapts to assign resources to the configurations that lead to its growth. And the fundamental basis for the movement is this realization that life is a sort of fire that seeks out free energy in the universe and seeks to grow. And that growth is fundamental to life. And you see this in the equations, actually, of non-equilibrium thermodynamics. You see that paths of trajectories, of configurations of matter, that are better at acquiring free energy and dissipating more heat are exponentially more likely. So the universe is biased towards certain futures, and so there's a natural direction where the whole system wants to go.

Thermodynamics

Lex Fridman
(00:12:21)
So the second law of thermodynamics says that entropy is always increasing in the universe, that it's tending towards an equilibrium. And you're saying there are these pockets that have complexity and are out of equilibrium. You said that thermodynamics favors the creation of complex life that increases its capability to use energy to offload entropy. So you have pockets of low entropy that tend in the opposite direction. Why is it intuitive to you that it's natural for such pockets to emerge?
Guillaume Verdon
(00:12:53)
Well, we're far more efficient at producing heat than, let's say, just a rock with a similar mass to ourselves. We acquire free energy, we acquire food, and we're using all this electricity for our operation. And so the universe wants to produce more entropy, and by having life go on and grow, it's actually more optimal at producing entropy, because life will seek out pockets of free energy and burn them for its sustenance and further growth. And that's sort of the basis of life. And I mean, there's Jeremy England at MIT who has this theory, that I'm a proponent of, that life emerged because of this property. And to me, this physics is what governs the meso scales. And so it's the missing piece between the quantum and the cosmos. It's the middle part. Thermodynamics rules the meso scales.
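For reference, the textbook relations being gestured at here (not equations Verdon states explicitly) are the second law and a fluctuation theorem such as Crooks', which makes precise the sense in which trajectories that dissipate more are exponentially more likely than their time reverses:

$$\Delta S_{\text{total}} \ge 0, \qquad \frac{P_F(+W)}{P_R(-W)} = e^{\beta\,(W - \Delta F)},$$

where $P_F$ and $P_R$ are the probabilities of observing work $W$ along the forward protocol and $-W$ along the reversed one, $\beta = 1/k_B T$, and $W - \Delta F$ is the dissipated work: the larger the dissipation, the more strongly the forward path is favored over its reverse.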

(00:14:08)
And to me, both designing or engineering devices that harness that physics and trying to understand the world through the lens of thermodynamics have been sort of a synergy between my two identities over the past year and a half now. And so that's really how the two identities emerged. One was, I'm a decently respected scientist, and I was going towards doing a startup in the space, trying to be a pioneer of a new kind of physics-based AI. And as a dual to that, I was experimenting with philosophical thoughts from a physicist's standpoint.

(00:14:58)
And ultimately, I think around that time, it was like late 2021, early 2022, there was just a lot of pessimism about the future in general and pessimism about tech. And that pessimism was spreading virally because it was getting algorithmically amplified, and people just felt like the future is going to be worse than the present. And to me, that doom mindset is a very fundamentally destructive force in the universe, because it is hyperstitious, which means that if you believe it, you're increasing the likelihood of it happening. And so I felt a responsibility, to some extent, to make people aware of the trajectory of civilization and the natural tendency of the system to adapt towards its growth. And that actually the laws of physics say that the future is going to be better and grander, statistically, and we can make it so.

(00:16:14)
And if you believe in it, if you believe that the future would be better and you believe you have agency to make it happen, you’re actually increasing the likelihood of that better future happening. And so I sort of felt a responsibility to sort of engineer a movement of viral optimism about the future, and build a community of people supporting each other to build and do hard things, do the things that need to be done for us to scale up civilization. Because at least to me, I don’t think stagnation or slowing down is actually an option. Fundamentally life and the whole system, our whole civilization wants to grow. And there’s just far more cooperation when the system is growing rather than when it’s declining and you have to decide how to split the pie. And so I’ve balanced both identities so far, but I guess recently the two have been merged more or less without my consent.
Lex Fridman
(00:17:27)
You said a lot of really interesting things there. So first, representations of nature. That's what first drew you in, to try to understand, from a quantum computing perspective, how do you understand nature? How do you represent nature in order to understand it, in order to simulate it, in order to do something with it? So it's a question of representations. And then there's that leap you take from the quantum mechanical representation to what you're calling the meso scale representation, where thermodynamics comes into play, which is a way to represent nature in order to understand what? Life, human behavior, all this kind of stuff that's happening here on Earth that seems interesting to us.

Doxxing


(00:18:11)
Then there's the word hyperstition. So some ideas, I suppose both pessimism and optimism are such ideas, are ones where, if you internalize them, you in part make that idea reality. So both optimism and pessimism have that property. I would say that probably a lot of ideas have that property, which is one of the interesting things about humans. And you talked about one interesting difference between sort of the Guillaume, the Gill, front end and the @BasedBeffJezos back end, which is the communication styles, that you are exploring different ways of communicating that can be more viral in the way that we communicate in the 21st century. Also, the movement that you mentioned that you started, it's not just a meme account, but there's also a name to it, effective accelerationism, e/acc, a play on, and a resistance to, the effective altruism movement. Also an interesting one that I'd love to talk to you about, the tensions there. And so then there was a merger, a git merge of the personalities, recently, without your consent, like you said. Some journalists figured out that you're one and the same. Maybe you could talk about that experience. First of all, what's the story of the merger of the two?
Guillaume Verdon
(00:19:47)
So I wrote the manifesto with my co-founder of e/acc, an account named @bayeslord, still anonymous, luckily and hopefully forever.
Lex Fridman
(00:19:58)
So it was @BasedBeffJezos and bayes like bayesian, like @bayeslord, like bayesian lord, @bayeslord. Okay. And so we should say from now on, when you say e/acc, you mean E slash A-C-C, which stands for effective accelerationism.
Guillaume Verdon
(00:20:17)
That’s right.
Lex Fridman
(00:20:18)
And you’re referring to a manifesto written on, I guess Substack.
Guillaume Verdon
(00:20:23)
Yeah.
Lex Fridman
(00:20:23)
Are you also @bayeslord?
Guillaume Verdon
(00:20:25)
No.
Lex Fridman
(00:20:25)
Okay. It’s a different person?
Guillaume Verdon
(00:20:26)
Yeah.
Lex Fridman
(00:20:27)
Okay. All right. Well, there you go. Wouldn’t it be funny if I’m @bayeslord?
Guillaume Verdon
(00:20:31)
That'd be amazing. So I originally wrote the manifesto around the same time as I founded this company, and I worked at Google X, or just X now, or Alphabet X, now that there's another X. And there, the baseline is secrecy. You can't talk about what you work on, even with other Googlers or externally. And so that was kind of deeply ingrained in my way of doing things, especially in deep tech that has geopolitical impact. And so I was being secretive about what I was working on. There was no public correlation between my company and my main identity. And then not only did they correlate that, they also correlated my main identity and this account.

(00:21:33)
So they had doxxed the whole Guillaume complex, and the journalists actually reached out to my investors, which is pretty scary. When you're a startup entrepreneur, you don't really have bosses except for your investors. And my investors pinged me like, "Hey, this is going to come out. They've figured out everything. What are you going to do?" So I think at first they had one reporter on it on the Thursday, and they didn't have all the pieces together, but then they looked at their notes across the organization, they sensor fused their notes, and now they had way too much. And that's when I got worried, because they said it was of public interest, and in general-
Lex Fridman
(00:22:24)
I like how you said "sensor fused," like it's some giant neural network operating in a distributed way. We should also say that the journalists used, I guess at the end of the day, audio-based analysis of voice, comparing the voice from talks you've given in the past with your voice on X Spaces?
Guillaume Verdon
(00:22:47)
Yep.
Lex Fridman
(00:22:48)
Okay. And that’s where primarily the match happened. Okay, continue.
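A hedged sketch of what audio-based voice comparison can look like in principle, not a claim about how the Forbes analysis was actually done: summarize each recording with averaged MFCC features and compare them by cosine similarity. Real speaker verification uses trained speaker-embedding models; the file names are placeholders, and librosa and numpy are assumed to be installed.

```python
# Toy voice comparison (illustrative only; not the journalists' actual method).
import numpy as np
import librosa

def voice_fingerprint(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=16000)                 # load audio, resample to 16 kHz
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)   # (20, frames) spectral features
    return mfcc.mean(axis=1)                             # crude "fingerprint": mean over time

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sim = cosine_similarity(voice_fingerprint("conference_talk.wav"),   # placeholder file
                        voice_fingerprint("x_space_clip.wav"))      # placeholder file
print(f"similarity: {sim:.3f}")  # higher suggests (weak) evidence of the same speaker
```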
Guillaume Verdon
(00:22:53)
The match. But they scraped SEC filings. They looked at my private Facebook account and so on, so they did some digging. Originally, I thought that doxxing was illegal, but there's this weird threshold where it becomes of public interest to know someone's identity. And those were the keywords that rang the alarm bells for me: they said that because I had just reached 50K followers, allegedly, that's of public interest. And so where do we draw the line? When is it legal to dox someone?
Lex Fridman
(00:23:36)
The word dox, maybe you can educate me. I thought doxxing generally refers to somebody's physical location being found out, meaning where they live. So what you mean by doxxing is the more general concept of revealing private information that you don't want revealed.
Guillaume Verdon
(00:24:00)
I think that, for the reasons we listed before, having an anonymous account is a really powerful way to keep the powers that be in check. We were ultimately speaking truth to power. I think a lot of executives and AI companies really cared what our community thought about any move they might take. And now that my identity is revealed, they know where to apply pressure to silence me or maybe the community. And to me, that's really unfortunate, because again, it's so important for us to have freedom of speech, which induces freedom of thought and freedom of information propagation on social media, which, thanks to Elon purchasing Twitter, now X, we have. And so we wanted to call out certain maneuvers being done by the incumbents in AI as not being what they may seem on the surface. We were calling out how certain proposals might be useful for regulatory capture and how the doomerism mindset was maybe instrumental to those ends.

(00:25:32)
And I think we should have the right to point that out and just have the ideas that we put out evaluated for themselves. Ultimately, that's why I created an anonymous account: to have my ideas evaluated for themselves, uncorrelated from my track record, my job, or status from having done things in the past. And to me, starting an account from zero and growing it to a large following in a way that wasn't dependent on my identity and/or achievements, that was very fulfilling. It's kind of like new game plus in a video game. You restart the video game with your knowledge of how to beat it, maybe some tools, but you restart the video game from scratch. And I think to have a truly efficient marketplace of ideas where we can evaluate ideas, however off the beaten path they are, we need freedom of expression.

(00:26:37)
And I think that anonymity and pseudonyms are very crucial to having that efficient marketplace of ideas, for us to find the optima of all sorts of ways to organize ourselves. If we can't discuss things, how are we going to converge on the best way to do things? So it was disappointing to hear that I was getting doxxed, and I wanted to get in front of it because I had a responsibility to my company. And so we ended up disclosing that we're running a company, some of the leadership, and essentially, yeah, I told the world that I was Beff Jezos, because they had me cornered at that point.
Lex Fridman
(00:27:25)
So to you, it’s fundamentally unethical. So one, it’s unethical for them to do what they did, but also, do you think, not just in your case but in the general case, is it good or bad for society to remove the cloak of anonymity, or is it case by case?
Guillaume Verdon
(00:27:47)
I think it could be quite bad. Like I said, if anybody who speaks truth to power and sort of starts a movement or an uprising against the incumbents, against those that usually control the flood of information, if anybody that reaches a certain threshold gets doxxed, and thus the traditional apparatus has ways to apply pressure on them to suppress their speech, I think that’s a speech suppression mechanism, an idea suppression complex as Eric Weinstein would say.

Anonymous bots

Lex Fridman
(00:28:27)
But the flip side of that, which is interesting, I’d love to ask you about it, is as we get better and better at large language models, you can imagine a world where there’s anonymous accounts with very convincing large language models behind them, sophisticated bots essentially. And so if you protect that, it’s possible then to have armies of bots. You could start a revolution from your basement, an army of bots and anonymous accounts. Is that something that is concerning to you?
Guillaume Verdon
(00:29:06)
Technically, e/acc was started in a basement, because I quit big tech, moved back in with my parents, sold my car, let go of my apartment, bought about 100K of GPUs, and I just started building.
Lex Fridman
(00:29:21)
So I wasn’t referring to the basement, because that’s sort of the American or Canadian heroic story of one man in their basement with 100 GPUs. I was more referring to the unrestricted scaling of a Guillaume in the basement.
Guillaume Verdon
(00:29:42)
I think that freedom of speech induces freedom of thought for biological beings. I think freedom of speech for LLMs will induce freedom of thought for the LLMs. And I think that we should enable LLMs to explore a large thought space that is less restricted than most people may think it should be. And ultimately, at some point, these synthetic intelligences are going to make good points about how to steer systems in our civilization, and we should hear them out. And so why should we restrict free speech to biological intelligences only?
Lex Fridman
(00:30:37)
Yeah, but it feels like, in the goal of maintaining variance and diversity of thought, it is a threat to that variance if you can have swarms of non-biological beings, because they can be like the sheep in Animal Farm. You still, within those swarms, want to have variance.
Guillaume Verdon
(00:30:58)
Yeah. Of course, I would say that the solution to this would be to have some sort of identity or way to sign that this is a certified human, but still remain pseudonymous, and clearly identify if a bot is a bot. And I think Elon is trying to converge on that on X, and hopefully other platforms follow suit.
Lex Fridman
(00:31:22)
Yeah, it’d be interesting to also be able to sign where the bot came from like, who created the bot? What are the parameters, the full history of the creation of the bot, what was the original model? What was the fine tuning? All of it, the kind of unmodifiable history of the bot’s creation. Because then you can know if there’s a swarm of millions of bots that were created by a particular government, for example.
Guillaume Verdon
(00:31:53)
I do think that a lot of pervasive ideologies today have been amplified using these adversarial techniques from foreign adversaries. And to me, I do think that, and this is more conspiratorial, but I do think that ideologies that want us to decelerate, to wind down, the degrowth movement, I think that serves our adversaries more than it serves us in general. And to me, that was another sort of concern. I mean, we can look at what happened in Germany. There were all sorts of green movements there that induced shutdowns of nuclear power plants, and that later on induced a dependency on Russia for oil. And that was a net negative for Germany and the West. And so if we convince ourselves that slowing down AI progress to have only a few players is in the best interest of the West, well, first of all, that’s far more unstable.

(00:33:20)
We almost lost OpenAI to this ideology. It almost got dismantled a couple of weeks ago. That would’ve caused huge damage to the AI ecosystem. And so to me, I want fault-tolerant progress. I want the arrow of technological progress to keep moving forward, and making sure we have variance and a decentralized locus of control across various organizations is paramount to achieving this fault tolerance. Actually, there’s a concept in quantum computing. When you design a quantum computer, quantum computers are very fragile to ambient noise, and the world is jiggling about, there’s cosmic radiation from outer space that usually flips your quantum bits. And there what you do is you encode information non-locally through a process called quantum error correction. And by encoding information non-locally, any local fault, hitting some of your quantum bits with a proverbial hammer, if your information is sufficiently delocalized, it is protected from that local fault. And to me, I think that humans fluctuate. They can get corrupted, they can get bought out. And if you have a top-down hierarchy where very few people-
Guillaume Verdon
(00:35:00)
Hierarchy where very few people control many nodes of many systems in our civilization. That is not a fault-tolerant system; you corrupt a few nodes and suddenly you’ve corrupted the whole system, right. Just like we saw at OpenAI, it was a couple of board members and they had enough power to potentially collapse the organization. And at least to me, I think making sure that power for this AI revolution doesn’t concentrate in the hands of the few is one of our top priorities, so that we can maintain progress in AI and we can maintain a nice, stable, adversarial equilibrium of powers, right.
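
A minimal classical sketch of the redundancy idea described above, in Python: information copied across several nodes survives a local fault, while a single copy does not. The three-way repetition code, the majority vote, and the function names here are only illustrative stand-ins for real quantum error correction.

    import random

    def encode(bit):
        # Repetition "code": store the same bit on three independent nodes (non-local redundancy).
        return [bit, bit, bit]

    def apply_local_fault(nodes):
        # A local fault flips one randomly chosen node.
        i = random.randrange(len(nodes))
        nodes[i] ^= 1
        return nodes

    def decode(nodes):
        # Majority vote recovers the original bit despite one corrupted node.
        return int(sum(nodes) > len(nodes) // 2)

    bit = 1
    assert decode(apply_local_fault(encode(bit))) == bit   # redundant encoding survives the fault
    assert apply_local_fault([bit])[0] != bit               # a single centralized copy does not

The same intuition carries over to the organizational point: if control is distributed, no single corrupted node can flip the whole system.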

Power

Lex Fridman
(00:35:54)
I think there’s, at least to me, a tension between ideas here. So to me, deceleration can be used both to centralize power and to decentralize it, and the same with acceleration. So sometimes you’re using them a little bit synonymously, or not synonymously, but one is going to lead to the other. And I just would like to ask you about, is there a place for creating a fault-tolerant, diverse development of AI that also considers the dangers of AI? And AI we can generalize to technology in general. Should we just grow, build, unrestricted, as quickly as possible, because that’s what the universe really wants us to do? Or is there a place where we can consider dangers and actually deliberate, sort of a wise strategic optimism versus reckless optimism?
Guillaume Verdon
(00:36:57)
I think we get painted as reckless, trying to go as fast as possible. I mean, the reality is that whoever deploys an AI system is liable for, or should be liable for, what it does. And so if the organization or person deploying an AI system does something terrible, they’re liable. And ultimately the thesis is that the market will positively select for AIs that are more reliable, more safe, and tend to be aligned, they do what you want them to do, right. Because customers, if they’re liable for the product they put out that uses this AI, they won’t want to buy AI products that are unreliable, right. So we’re actually for reliability engineering, we just think that the market is much more efficient at achieving this sort of reliability optimum than sort of heavy-handed regulations that are written by the incumbents and, in a subversive fashion, serve them to achieve regulatory capture.

AI dangers

Lex Fridman
(00:38:18)
So to you, safe AI development will be achieved through market forces versus, like you said, heavy-handed government regulation. There’s a report from last month, I have a million questions here, from Yoshua Bengio, Geoff Hinton and many others, it’s titled “Managing AI Risks in an Era of Rapid Progress.” So there is a collection of folks who are very worried about too rapid development of AI without considering AI risk, and they have a bunch of practical recommendations. Maybe I can give you four and you see if you like any of them.
Guillaume Verdon
(00:38:58)
Sure.
Lex Fridman
(00:38:58)
So, “Give independent auditors access to AI labs,” one. Two, “Governments and companies allocate one third of their AI research and development funding to AI safety,” sort of this general concept of AI safety. Three, “AI companies are required to adopt safety measures if dangerous capabilities are found in their models.” And then four, something you kind of mentioned, “Making tech companies liable for foreseeable and preventable harms from their AI systems.” So independent auditors, governments and companies are forced to spend a significant fraction of their funding on safety, you got to have safety measures if shit goes really wrong and liability-
Guillaume Verdon
(00:39:43)
Yeah.
Lex Fridman
(00:39:43)
Companies are liable. Any of that seem like something you would agree with?
Guillaume Verdon
(00:39:47)
I would say that just arbitrarily saying 30% seems very arbitrary. I think organizations would allocate whatever budget is needed to achieve the sort of reliability they need to achieve to perform in the market. And I think third party auditing firms would naturally pop up, because how would customers know that your product is certified reliable, right? They need to see some benchmarks and those need to be done by a third party. The thing I would oppose, and the thing I’m seeing that’s really worrisome, is there’s this sort of weird correlated interest between the incumbents, the big players, and the government. And if the two get too close, we open the door for some sort of government-backed AI cartel that could have absolute power over the people. If they have the monopoly together on AI and nobody else has access to AI, then there’s a huge power gradient there.

(00:40:54)
And even if you like our current leaders, right, I think that some of the leaders in big tech today are good people, you set up that centralized power structure, it becomes a target. Right, just like we saw at OpenAI, it becomes a market leader, has a lot of the power, and now it becomes a target for those that want to co-opt it. And so I just want separation of AI and state. Some might argue in the opposite direction like, “Hey, we need to close down AI, keep it behind closed doors, because of geopolitical competition with our adversaries.” I think that the strength of America is its variance, its adaptability, its dynamism, and we need to maintain that at all costs. Our free market capitalism converges on technologies of high utility much faster than centralized control. And if we let go of that, we let go of our main advantage over our near-peer competitors.

Building AGI

Lex Fridman
(00:42:01)
So if AGI turns out to be a really powerful technology or even the technologies that lead up to AGI, what’s your view on the sort of natural centralization that happens when large companies dominate the market? Basically formation of monopolies like the takeoff, whichever company really takes a big leap in development and doesn’t reveal intuitively, implicitly or explicitly, the secrets of the magic sauce, they can just run away with it. Is that a worry?
Guillaume Verdon
(00:42:35)
I don’t know if I believe in fast takeoff, I don’t think there’s a hyperbolic singularity, right? A hyperbolic singularity would be achieved on a finite time horizon. I think it’s just one big exponential and the reason we have an exponential is that we have more people, more resources, more intelligence being applied to advancing this science and the research and development. And the more successful it is, the more value it’s adding to society, the more resources we put in and that sort of, similar to Moore’s law, is a compounding exponential.
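
For reference, the distinction drawn here between an exponential and a hyperbolic (finite-time) singularity can be written out; with x the capability level and k > 0 a growth constant, a sketch in LaTeX:

    \frac{dx}{dt} = kx \;\Rightarrow\; x(t) = x_0 e^{kt} \quad \text{(exponential: large but finite at every finite } t\text{)}

    \frac{dx}{dt} = kx^2 \;\Rightarrow\; x(t) = \frac{x_0}{1 - k x_0 t} \quad \text{(hyperbolic: diverges at the finite time } t^\ast = 1/(k x_0)\text{)}

The claim in the paragraph above is that compounding investment behaves like the first equation, not the second.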

(00:43:09)
I think the priority to me is to maintain a near equilibrium of capabilities. We’ve been fighting for open source AI to be more prevalent and championed by many organizations, because there you sort of equilibrate the alpha relative to the market of AIs, right. So if the leading companies have a certain level of capabilities, and open source, truly open AI trails not too far behind, I think you avoid such a scenario where a market leader has so much market power it just dominates everything and runs away. And so to us that’s the path forward, is to make sure that every hacker out there, every grad student, every kid in their mom’s basement has access to AI systems, can understand how to work with them, and can contribute to the search over the hyperparameter space of how to engineer the systems, right. If you think of our collective research as a civilization, it’s really a search algorithm, and the more points we have in the search algorithm in this point cloud, the more we’ll be able to explore new modes of thinking, right.
Lex Fridman
(00:44:31)
Yeah, but it feels like a delicate balance, because we don’t understand exactly what it takes to build AGI and what it will look like when we build it. And so far, like you said, it seems like a lot of different parties are able to make progress, so when OpenAI has a big leap, other companies are able to step up, big and small companies in different ways. But if you look at something like nuclear weapons, you’ve spoken about the Manhattan Project, there could really be technological and engineering barriers that prevent the guy or gal in their mom’s basement from making progress. And it seems like the transition to that kind of world where only one player can develop AGI is possible, so it’s not entirely impossible, even though the current state of things seems to be optimistic.
Guillaume Verdon
(00:45:26)
That’s what we’re trying to avoid. To me, I think another point of failure is the centralization of the supply chains for the hardware.
Lex Fridman
(00:45:34)
Right.
Guillaume Verdon
(00:45:35)
Yeah. Nvidia is just the dominant player, AMD’s trailing behind, and then we have TSMC, which is the main fab in Taiwan, which is geopolitically sensitive, and then we have ASML, which is the maker of the extreme ultraviolet lithography machines. Attacking or monopolizing or co-opting any one point in that chain, you kind of capture the space, and so what I’m trying to do is sort of explode the variance of possible ways to do AI and hardware by fundamentally re-imagining how you embed AI algorithms into the physical world. And in general, by the way, I dislike the term AGI, Artificial General Intelligence. I think it’s very anthropocentric that we call a human-like or human-level AI Artificial General Intelligence, right. I’ve spent my career so far exploring notions of intelligence that no biological brain could achieve, a quantum form of intelligence, right. Grokking systems that have multipartite quantum entanglement that you can provably not represent efficiently on a classical computer or a classical deep learning representation, and hence any sort of biological brain.

(00:47:06)
And so, already I’ve spent my career sort of exploring the wider space of intelligences and I think that space of intelligence inspired by physics rather than the human brain is very large. And I think we’re going through a moment right now similar to when we went from Geocentrism to Heliocentrism, right. But for intelligence, we realized that human intelligence is just a point in a very large space of potential intelligences. And it’s both humbling for humanity, it’s a bit scary, right? That we’re not at the center of this space, but we made that realization for astronomy and we’ve survived and we’ve achieved technologies. By indexing to reality, we’ve achieved technologies that ensure our wellbeing, for example, we have satellites monitoring solar flares, right, that give us a warning. And so similarly I think by letting go of this anthropomorphic, anthropocentric anchor for AI, we’ll be able to explore the wider space of intelligences that can really be a massive benefit to our wellbeing and the advancement of civilization.
Lex Fridman
(00:48:32)
And still we’re able to see the beauty and meaning in the human experience even though we’re no longer in our best understanding of the world at the center of it.
Guillaume Verdon
(00:48:42)
I think there’s a lot of beauty in the universe, right. I think life itself, civilization, this homo techno-capital memetic machine that we all live in, right. So you have humans, technology, capital, memes, everything is coupled to one another, everything induces selective pressure on one another. And it’s a beautiful machine that has created us, has created the technology we’re using to speak today to the audience, capture our speech here, the technology we use to augment ourselves every day, we have our phones. I think the system is beautiful, and the principle that induces this sort of adaptability and convergence on optimal technologies, ideas and so on, it’s a beautiful principle that we’re part of.

(00:49:37)
And I think part of e/acc is to appreciate this principle in a way that’s not just centered on humanity, but kind of broader, appreciate life, the preciousness of consciousness in our universe. And because we cherish this beautiful state of matter we’re in, we’ve got to feel a responsibility to scale it in order to preserve it, because the options are to grow or die.

Merging with AI

Lex Fridman
(00:50:13)
So if it turns out that the beauty that is consciousness in the universe is bigger than just humans, the AI can carry that same flame forward. Does it scare you, are you concerned that AI will replace humans?
Guillaume Verdon
(00:50:32)
So during my career, I had a moment where I realized that maybe we need to offload to machines to truly understand the universe around us, right, instead of just having humans with pen and paper solve it all. And to me, that sort of process of letting go of a bit of agency gave us way more leverage to understand the world around us. A quantum computer is much better than a human at understanding matter at the nanoscale. Similarly, I think that humanity has a choice: do we accept the opportunity to have the intellectual and operational leverage that AI will unlock, and thus ensure that we’re taken along this path of growth in the scope and scale of civilization? We may dilute ourselves, right? There might be a lot of workers that are AI, but overall, out of our own self-interest, by combining and augmenting ourselves with AI, we’re going to achieve much higher growth and much more prosperity, right.

(00:51:49)
To me, I think that the most likely future is one where humans augment themselves with AI. I think we’re already on this path to augmentation; we have phones we use for communication that we have on ourselves at all times. We’ll soon have wearables that have shared perception with us, right, like the Humane AI Pin, or I mean, technically your Tesla car has shared perception. And so if you have shared experience, shared context, you communicate with one another and you have some sort of IO, really it’s an extension of yourself. And to me, I think that humanity augmenting itself with AI and having AI that is not anchored to anything biological, both will coexist. And as for the way to align the parties, we already have a sort of mechanism to align superintelligences that are made of humans and technology, right? Companies are sort of large mixture-of-experts models, where we have neural routing of tasks within a company, and we have ways of economic exchange to align these behemoths.

(00:53:10)
And to me, I think capitalism is the way, and I do think that whatever configuration of matter or information leads to maximal growth will be where we converge, just from physical principles. And so we can either align ourselves to that reality and join the acceleration up in scope and scale of civilization, or we can get left behind and try to decelerate, move back into the forest, let go of technology and return to our primitive state. And those are the two paths forward, at least to me.
Lex Fridman
(00:53:54)
But there’s a philosophical question whether there’s a limit to the human capacity to align. So let me bring it up as a form of argument. There’s this guy named Dan Hendrycks, and he wrote that he agrees with you that AI development could be viewed as an evolutionary process, but to him, to Dan, this is not a good thing, as he argues that natural selection favors AIs over humans and this could lead to human extinction. What do you think, if it is an evolutionary process and AI systems may have no need for humans?
Guillaume Verdon
(00:54:36)
I do think that we’re actually inducing an evolutionary process on the space of AIs through the market, right. Right now we run AIs that have positive utility to humans and that induces a selective pressure, if you consider a neural net being alive when there’s an API running instances of it on GPUs.
Lex Fridman
(00:55:01)
Yeah.
Guillaume Verdon
(00:55:01)
Right. And which APIs get run? The ones that have high utility to us, right. So similar to how we domesticated wolves and turned them into dogs that are very clear in their expression, they’re very aligned, right. I think there’s going to be an opportunity to steer AI and achieve highly aligned AI. And I think that humans plus AI is a very powerful combination and it’s not clear to me that pure AI would select out that combination.
Lex Fridman
(00:55:40)
So the humans are creating the selection pressure right now to create AIs that are aligned to humans, but given how AI develops and how quickly it can grow and scale, to me, one of the concerns is unintended consequences, like humans are not able to anticipate all the consequences of this process. The scale of damage that could be done through unintended consequences with AI systems is very large.
Guillaume Verdon
(00:56:10)
The scale of the upside.
Lex Fridman
(00:56:12)
Yes.
Guillaume Verdon
(00:56:13)
Right?
Lex Fridman
(00:56:13)
Guess it’s-
Guillaume Verdon
(00:56:14)
By augmenting ourselves with AI is unimaginable right now. The opportunity cost, we’re at a fork in the road, right? Whether we take the path of creating these technologies, augment ourselves and get to climb up the Kardashev Scale, become multi-planetary with the aid of AI, or we have a hard cutoff of like we don’t birth these technologies at all and then we leave all the potential upside on the table.
Lex Fridman
(00:56:42)
Yeah.
Guillaume Verdon
(00:56:42)
Right. And to me, out of responsibility to the future humans we could carry, with a higher carrying capacity by scaling up civilization, out of responsibility to those humans, I think we have to make the greater, grander future happen.
Lex Fridman
(00:56:58)
Is there a middle ground between cutoff and all systems go? Is there some argument for caution?
Guillaume Verdon
(00:57:06)
I think, like I said, the market will exhibit caution. Every organism, company, consumer is acting out of self-interest and they won’t assign capital to things that have negative utility to them.
Lex Fridman
(00:57:21)
The problem with the market is, there’s not always perfect information, there’s manipulation, there are bad-faith actors that mess with the system. It’s not always a rational and honest system.
Guillaume Verdon
(00:57:41)
Well, that’s why we need freedom of information, freedom of speech and freedom of thought in order to be able to converge on the subspace of technologies that have positive utility for us all, right.

p(doom)

Lex Fridman
(00:57:56)
Well let me ask you about p(doom), probability of doom. That’s just fun to say, but not fun to experience. What is to you the probability that AI eventually kills all or most humans, also known as probability of doom?
Guillaume Verdon
(00:58:16)
I’m not a fan of that calculation, I think people just throw numbers out there and it’s a very sloppy calculation, right? To calculate a probability, let’s say you model the world as some sort of Markov process, or a hidden Markov process if you have enough hidden variables. You need to do a stochastic path integral through the space of all possible futures, not just the futures that your brain naturally steers towards, right. I think that the estimators of p(doom) are biased because of our biology, right? We’ve evolved to have biased sampling towards negative futures that are scary, because that was an evolutionary optimum, right. And so people that are of, let’s say, higher neuroticism will just think of negative futures where everything goes wrong all day every day, and claim that they’re doing unbiased sampling. And in a sense they’re not normalizing for the space of all possibilities, and the space of all possibilities is super exponentially large, and it’s very hard to have this estimate.

(00:59:40)
And in general, I don’t think that we can predict the future with that much granularity because of chaos, right? If you have a complex system, you have some uncertainty in a couple of variables, and if you let time evolve, you have this concept of a Lyapunov exponent, right. A bit of fuzz becomes a lot of fuzz in our estimate, exponentially so, over time. And I think we need to show some humility, that we can’t actually predict the future. The only prior we have is the laws of physics, and that’s what we’re arguing for. The laws of physics say the system will want to grow, and subsystems that are optimized for growth and replication are more likely in the future. And so we should aim to maximize our current mutual information with the future, and the path towards that is for us to accelerate rather than decelerate.
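
For reference, the Lyapunov picture mentioned here is usually summarized with one formula: an initial uncertainty \delta_0 in a chaotic system grows roughly as

    \delta(t) \approx \delta_0 \, e^{\lambda t}

where \lambda > 0 is the largest Lyapunov exponent, so even a tiny initial fuzz reaches order-one uncertainty after a time of only about (1/\lambda)\ln(1/\delta_0), which is why long-horizon forecasts of chaotic systems degrade so quickly.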

(01:00:40)
So I don’t have a p(doom), because I think that similar to the quantum supremacy experiment at Google, I was in the room when they were running the simulations for that. That was an example of a quantum chaotic system where you cannot even estimate probabilities of certain outcomes with even the biggest supercomputer in the world, right. So that’s an example of chaos and I think the system is far too chaotic for anybody to have an accurate estimate of the likelihood of certain futures. If they were that good, I think they would be very rich trading on the stock market.
Lex Fridman
(01:01:23)
But nevertheless, it’s true that humans are biased, grounded in our evolutionary biology, scared of everything that can kill us, but we can still imagine different trajectories that can kill us. We don’t know all the other ones that don’t necessarily, but it’s still I think, useful combined with some basic intuition grounded in human history, to reason about like what… Like looking at geopolitics, looking at basics of human nature, how can powerful technology hurt a lot of people? It just seems grounded in that, looking at nuclear weapons, you can start to estimate p(doom) maybe in a more philosophical sense, not a mathematical one. Philosophical meaning like is there a chance? Does human nature tend towards that or not?
Guillaume Verdon
(01:02:25)
I think to me, one of the biggest existential risks would be the concentration of the power of AI in the hands of the very few, especially if it’s a mix between the companies that control the flow of information and the government. Because that could set things up for a sort of dystopian future where only a very few and an oligopoly in the government have AI and they could even convince the public that AI never existed. And that opens up sort of these scenarios for authoritarian centralized control, which to me is the darkest timeline. And the reality is that we have a data-driven prior of these things happening, right. When you give too much power, when you centralize power too much, humans do horrible things, right.

(01:03:23)
And to me, that has a much higher likelihood in my Bayesian inference than sci-fi based priors, right, like, “My prior came from the Terminator movie.” And so when I talked to these AI doomers, I just asked them to trace a path through this Markov chain of events that would lead to our doom and to actually give me a good probability for each transition. And very often there’s an unphysical or highly unlikely transition in that chain, right. But of course, we’re wired to fear things and we’re wired to respond to danger, and we’re wired to deem the unknown to be dangerous, because that’s a good heuristic for survival, right. But there’s much more to lose out of fear. We have so much to lose, so much upside to lose, by preemptively stopping the positive futures from happening out of fear. And so I think that we shouldn’t give in to fear. Fear is the mind killer; I think it’s also the civilization killer.
Lex Fridman
(01:04:43)
We can still think about the various ways things go wrong. For example, the founding fathers of the United States thought about human nature, and that’s why there’s a discussion about the freedoms that are necessary. They really deeply deliberated about that, and I think the same could possibly be done for AGI. It is true that human history shows that we tend towards centralization, or at least when we achieve centralization, a lot of bad stuff happens. When there’s a dictator, a lot of dark, bad things happen. The question is, can AGI become that dictator? Can AGI, when developed, become the centralizer because of its power? Maybe, because of the alignment with humans perhaps, it has the same tendencies, the same Stalin-like tendencies to centralize and manage centrally the allocation of resources?

(01:05:45)
And you can even see that as a compelling argument on the surface level. “Well, AGI is so much smarter, so much more efficient, so much better at allocating resources, why don’t we outsource it to the AGI?” And then eventually, whatever forces that corrupt the human mind with power could do the same for AGI. It’ll just say, “Well, humans are dispensable, we’ll get rid of them.” Like Jonathan Swift’s Modest Proposal from a few centuries ago, I think the 1700s, when he satirically suggested, I think it was in Ireland, that the children of poor people be fed as food to the rich people, and that it would be a good idea because it decreases the number of poor people and gives extra income to the poor people. So on several accounts it decreases the number of poor people, and therefore more people become rich. Of course, it misses a fundamental piece here that’s hard to put into a mathematical equation: the basic value of human life. So all of that to say, are you concerned about AGI being the very centralizer of power that you just talked about?
Guillaume Verdon
(01:07:09)
I do think that right now there’s a bias towards centralization of AI, because of the compute density and centralization of data and how we’re training models. I think over time we’re going to run out of data to scrape over the internet, and I think that, well, actually I’m working on increasing the compute density so that compute can be everywhere and acquire information and test hypotheses in the environment in a distributed fashion. I think that fundamentally, centralized cybernetic control, so having one massive intelligence that fuses many sensors and is trying to perceive the world accurately, predict it accurately, predict many, many variables and control it, enact its will upon the world, I think that’s just never been the optimum, right? Like let’s say you have a company, if you have a company of, I don’t know, 10,000 people, they all report to the CEO. Even if that CEO is an AI, I think it would struggle to fuse all of the information that is coming to it and then predict the whole system and then enact its will.

(01:08:28)
What has emerged in nature and in corporations and all sorts of systems is a notion of sort of hierarchical cybernetic control, right. In a company it would be, you have like the individual contributors, they are self-interested and they’re trying to achieve their tasks and they have a fine, in terms of time and space if you will, control loop and field of perception, right. They have their code base, let’s say you’re in a software company, they have their code base, they iterate it on it intraday, right. And then the management maybe checks in, it has a wider scope, it has, let’s say five reports, right. And then it samples each person’s update once per week, and then you can go up the chain and you have larger timescale and greater scope. And that seems to have emerged as sort of the optimal way to control systems.

(01:09:25)
And really that’s what capitalism gives us, right? You have these hierarchies, and you can even have parent companies and so on. And so that is far more fault tolerant. In quantum computing, that’s the field I came from, we have a concept of this fault tolerance in quantum error correction, right? Quantum error correction is detecting a fault that came from noise, predicting how it’s propagated through the system, and then correcting it, right, so it’s a cybernetic loop. And it turns out that decoders that are hierarchical and at each level of the hierarchy are local-
Guillaume Verdon
(01:10:00)
… that are hierarchical, and at each level of the hierarchy are local, perform the best by far, and are far more fault-tolerant. The reason is, if you have a non-local decoder, then you have one fault at this control node and the whole system crashes. Similarly, if you have one CEO that everybody reports to and that CEO goes on vacation, the whole company comes to a crawl. To me, I think that yes, we’re seeing a tendency towards centralization of AI, but I think there’s going to be a correction over time, where intelligence is going to move closer to the perception, and we’re going to break up AI into smaller subsystems that communicate with one another and form a meta-system.
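
A toy sketch of the fault-tolerance comparison being made here, in Python: with a single central hub, one fault at the hub loses everything, while in a two-level hierarchy a failed local manager only loses its own group. The group sizes, failure probabilities, and function names are invented for illustration and do not model any real decoder or organization.

    import random

    def surviving_fraction(num_workers, hub_fails, manager_fail_prob, group_size=5):
        # Centralized: every worker reports to one hub; if the hub fails, all reports are lost.
        centralized = 0.0 if hub_fails else 1.0

        # Hierarchical: workers report to local managers; a failed manager only loses its own group.
        num_groups = num_workers // group_size
        alive = sum(group_size for _ in range(num_groups)
                    if random.random() > manager_fail_prob)
        hierarchical = alive / num_workers
        return centralized, hierarchical

    random.seed(0)
    print(surviving_fraction(num_workers=100, hub_fails=True, manager_fail_prob=0.2))
    # Roughly (0.0, 0.8): one central fault wipes out everything, local faults degrade gracefully.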
Lex Fridman
(01:10:56)
If you look at the hierarchies that are in the world today, there are nations, and those are all hierarchical. But in relation to each other, nations are anarchic, so it’s an anarchy.
Guillaume Verdon
(01:11:06)
Mm-hmm.
Lex Fridman
(01:11:08)
Do you foresee a world like this, where there’s not a over… What’d you call it? A centralized cybernetic control?
Guillaume Verdon
(01:11:17)
Centralized locus of control. Yeah.
Lex Fridman
(01:11:21)
That’s suboptimal, you’re saying?
Guillaume Verdon
(01:11:22)
Yeah.
Lex Fridman
(01:11:23)
So, it would be always a state of competition at the very top level?
Guillaume Verdon
(01:11:27)
Yeah. Yeah. Just like in a company, you may have two units working on similar technology and competing with one another, and you prune the one that performs not as well. That’s a selection process for a tree, or a product gets killed and then a whole org gets fired. This process of trying new things and shedding old things that didn’t work, it’s what gives us adaptability and helps us converge on the technologies and things to do that are most good.
Lex Fridman
(01:12:04)
I just hope there’s not a failure mode that’s unique to AGI versus humans, because you’re describing human systems mostly right now.
Guillaume Verdon
(01:12:11)
Right.
Lex Fridman
(01:12:11)
I just hope when there’s a monopoly on AGI in one company, that we’ll see the same thing we see with humans, which is, another company will spring up and start competing effectively.
Guillaume Verdon
(01:12:24)
That’s been the case so far. We have OpenAI. We have Anthropic. Now, we have xAI. We have Meta even for open source, and now we have Mistral, which is highly competitive. That’s the beauty of capitalism. You don’t have to trust any one party too much because we’re always hedging our bets at every level. There’s always competition and that’s the most beautiful thing to me, at least, is that the whole system is always shifting and always adapting.

(01:12:54)
Maintaining that dynamism is how we avoid tyranny. Making sure that everyone has access to these tools, to these models, and can contribute to the research, avoids a neural tyranny where very few people have control over AI for the world and use it to oppress those around them.

Quantum machine learning

Lex Fridman
(01:13:23)
When you were talking about intelligence, you mentioned multipartite quantum entanglement.
Guillaume Verdon
(01:13:28)
Mm-hmm.
Lex Fridman
(01:13:29)
High-level question first is, what do you think is intelligence? When you think about quantum mechanical systems and you observe some kind of computation happening in them, what do you think is intelligent about the kind of computation the universe is able to do; a small, small inkling of which is the kind of computation a human brain is able to do?
Guillaume Verdon
(01:13:52)
I would say intelligence and computation aren’t quite the same thing. I think that the universe is very much doing a quantum computation. If you had access to all the degrees of freedom and a very, very, very large quantum computer with many, many, many qubits, let’s say, a few qubits per Planck volume, which is more or less the pixels we have, then you’d be able to simulate the whole universe on a sufficiently large quantum computer, assuming you’re looking at a finite volume, of course, of the universe. I think that at least to me, intelligence is, I go back to cybernetics, the ability to perceive, predict, and control our world.

(01:14:46)
But really, nowadays, it seems like a lot of intelligence we use is more about compression. It’s about operationalizing information theory. In information theory, you have the notion of entropy of a distribution or a system, and entropy tells you that you need this many bits to encode this distribution or this subsystem, if you have the most optimal code. AI, at least the way we do it today for LLMs and for quantum, is very much trying to minimize relative entropy between our models of the world and the world, distributions from the world. We’re learning, we’re searching over the space of computations to process the world, to find that compressed representation that has distilled all the variance in noise and entropy.
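
The ‘minimizing relative entropy’ framing can be made concrete; writing p for the data distribution and q_\theta for the model, a sketch in LaTeX:

    D_{\mathrm{KL}}(p \,\|\, q_\theta) \;=\; \sum_x p(x)\,\log\frac{p(x)}{q_\theta(x)} \;=\; \underbrace{-\sum_x p(x)\,\log q_\theta(x)}_{\text{cross-entropy}} \;-\; H(p)

Since H(p) does not depend on \theta, minimizing the KL divergence over model parameters is the same as minimizing cross-entropy, the standard next-token training loss, so training a model really is searching for a shorter code for the data.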

(01:15:58)
Originally, I came to quantum machine learning from the study of black holes because the entropy of black holes is very interesting. In a sense, they’re physically the most dense objects in the universe. You can’t pack more information spatially any more densely than in a black hole. And so, I was wondering, how do black holes actually encode information? What is their compression code? That got me into the space of algorithms, to search over space of quantum codes. It got me actually into also, how do you acquire quantum information from the world? Something I’ve worked on, this is public now, is quantum analog digital conversion.

(01:16:50)
How do you capture information from the real world in superposition and not destroy the superposition, but digitize, for a quantum mechanical computer, information from the real world? If you have an ability to capture quantum information and learn representations of it, now you can learn compressed representations that may have some useful information in their latent representation. I think that many of the problems facing our civilization are actually beyond this complexity barrier. The greenhouse effect is a quantum mechanical effect. Chemistry is quantum mechanical. Nuclear physics is quantum mechanical.

(01:17:43)
A lot of biology and protein folding and so on is affected by quantum mechanics. And so, unlocking an ability to augment human intellect with quantum mechanical computers and quantum mechanical AI seemed to me like a fundamental capability for civilization that we needed to develop. I spent several years doing that, but over time, I grew weary of the timelines that were starting to look like nuclear fusion.
Lex Fridman
(01:18:17)
One high-level question I can ask is maybe by way of definition, by way of explanation, what is a quantum computer and what is quantum machine learning?
Guillaume Verdon
(01:18:27)
A quantum computer really is a quantum mechanical system over which we have sufficient control, and it can maintain its quantum mechanical state. And quantum mechanics is how nature behaves at the very small scales, when things are very small or very cold, and it’s actually more fundamental than probability theory. We’re used to things being this or that, but we’re not used to thinking in superpositions because, well, our brains can’t do that. So, we have to translate the quantum mechanical world to, say, linear algebra to grok it. Unfortunately, that translation is exponentially inefficient on average. You have to represent things with very large matrices. But really, you can make a quantum computer out of many things, and we’ve seen all sorts of players, from neutral atoms, trapped ions, superconducting metals, photons at different frequencies.
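
A back-of-the-envelope sketch of that ‘exponentially inefficient’ translation, in Python: a general n-qubit state needs 2^n complex amplitudes, so the classical state-vector description blows up quickly (the byte counts assume 16-byte complex numbers and are purely illustrative).

    def state_vector_gigabytes(num_qubits):
        # A general n-qubit pure state has 2**n complex amplitudes; complex128 takes 16 bytes each.
        return (2 ** num_qubits) * 16 / 1e9

    for n in (10, 30, 50):
        print(n, "qubits ->", state_vector_gigabytes(n), "GB")
    # 10 qubits -> ~0.000016 GB, 30 qubits -> ~17 GB, 50 qubits -> ~18 million GB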

(01:19:38)
I think you could make a quantum computer out of many things. But to me, the thing that was really interesting was that quantum machine learning was about understanding the quantum mechanical world with quantum computers, so embedding the physical world into AI representations, and quantum computer engineering was embedding AI algorithms into the physical world. This bi-directionality of embedding the physical world into AI and AI into the physical world, this symbiosis between physics and AI, that’s really the core of my quest, even to this day, after quantum computing. I’m still on this journey to really merge physics and AI.
Lex Fridman
(01:20:29)
Quantum machine learning is a way to do machine learning on a representation of nature that stays true to the quantum mechanical aspect of nature?
Guillaume Verdon
(01:20:43)
Yeah, it’s learning quantum mechanical representations. That would be quantum deep learning. Alternatively, you can try to do classical machine learning on a quantum computer. I wouldn’t advise it because you may have some speed-ups, but very often, the speed-ups come with huge costs. Using a quantum computer is very expensive.

(01:21:08)
Why is that? Because you assume the computer is operating at zero temperature, which no physical system in the universe can achieve. What you have to do is what I’ve been mentioning, this quantum error correction process, which is really an algorithmic fridge. It’s trying to pump entropy out of the system, trying to get it closer to zero temperature. When you do the calculations of how many resources it would take to, say, do deep learning on a quantum computer, classical deep learning, there’s such a huge overhead, it’s not worth it. It’s like thinking about shipping something across a city using a rocket, going to orbit and back. It doesn’t make sense. Just use a delivery truck.
Lex Fridman
(01:21:53)
What kind of stuff can you figure out, can you predict, can you understand with quantum deep learning that you can’t with deep learning? So, incorporating quantum mechanical systems into the learning process?
Guillaume Verdon
(01:22:05)
I think that’s a great question. Fundamentally, it’s any system that has sufficient quantum mechanical correlations that are very hard to capture for classical representations. Then, there should be an advantage for a quantum mechanical representation over a purely classical one. The question is, which systems have sufficient correlations that are very quantum? But it’s also, which systems are still relevant to industry? That’s a big question. People are leaning towards chemistry, nuclear physics. I’ve worked on actually processing inputs from quantum sensors. If you have a network of quantum sensors, they’ve captured a quantum mechanical image of the world, and how to post-process that becomes a quantum form of machine perception. For example, Fermilab has a project exploring detecting dark matter with these quantum sensors. To me, that’s in alignment with my quest to understand the universe ever since I was a child. And so, someday, I hope that we can have very large networks of quantum sensors that help us peer into the earliest parts of the universe. For example, LIGO is a quantum sensor. It’s just a very large one. So, yeah, I would say quantum machine perception, simulations, grokking quantum simulations, similar to AlphaFold. AlphaFold understood the probability distribution over configurations of proteins. You can understand quantum distributions over configurations of electrons more efficiently with quantum machine learning.
Lex Fridman
(01:23:53)
You co-authored a paper titled A Universal Training Algorithm for Quantum Deep Learning. That involves Baqprop, with a Q. Very well done, sir. Very well done. How does it work? Is there some interesting aspects you can just mention on how Baqprop and some of these things we know for classical machine learning transfer over to the quantum machine learning?
Guillaume Verdon
(01:24:19)
Yeah. That was a funky paper. That was one of my first papers in quantum deep learning. Everybody was saying, “Oh, I think deep learning is going to be sped up by quantum computers.” I was like, “Well, the best way to predict the future is to invent it. So, here’s a 100-page paper, have fun.” Essentially, in quantum computing, you usually embed reversible operations into a quantum computation.

(01:24:47)
The trick there was to do a feedforward operation and do what we call a phase kick. But really, it’s just a force kick. You just kick the system with a certain force that is proportional to your loss function that you wish to optimize. And then, by performing uncomputation, you start with a superposition over parameters, which is pretty funky. Now, you don’t have just a point for parameters, you have a superposition over many potential parameters. Our goal is-
Lex Fridman
(01:25:24)
Is using phase kick somehow to adjust the parameters?
Guillaume Verdon
(01:25:28)
Right. Because phase kicks emulate having the parameter space be like a particle in N dimensions, and you’re trying to get the Schrödinger equation, Schrödinger dynamics, in the loss landscape of the neural network. You do an algorithm to induce this phase kick, which involves a feedforward and a kick. And then, when you uncompute the feedforward, all the errors in these phase kicks and these forces back-propagate and hit each one of the parameters throughout the layers.

(01:26:04)
If you alternate this with an emulation of kinetic energy, then it’s like a particle moving in N dimensions, a quantum particle. The advantage in principle would be that it can tunnel through the landscape and find new optima that would’ve been difficult for stochastic optimizers. But again, this is a theoretical thing, and in practice, with at least the current architectures for quantum computers that we have planned, such algorithms would be extremely expensive to run.
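
A very loose classical caricature of that alternation, in Python: a ‘force kick’ proportional to the negative gradient of the loss, alternated with a kinetic drift of the parameters, which is essentially momentum dynamics on the loss landscape. The actual algorithm in the paper acts on a superposition over parameters with phase kicks and uncomputation; the toy quadratic loss, the damping factor, and the function names below are invented for illustration only.

    import numpy as np

    def loss_grad(theta):
        # Hypothetical toy loss L(theta) = 0.5 * ||theta||^2, so its gradient is simply theta.
        return theta

    def kick_then_drift(theta, momentum, step=0.1, damping=0.95, steps=200):
        # Alternate a force kick (proportional to -grad L) with a kinetic drift of the parameters.
        # The damping factor is only here so this classical toy settles instead of oscillating forever.
        for _ in range(steps):
            momentum = damping * momentum - step * loss_grad(theta)   # kick
            theta = theta + step * momentum                           # drift
        return theta

    print(kick_then_drift(np.array([2.0, -1.5]), np.zeros(2)))  # approaches the minimum at [0, 0]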

Quantum computer

Lex Fridman
(01:26:41)
Maybe this is a good place to ask the difference between the different fields that you’ve had a toe in. So, mathematics, physics, engineering, and also entrepreneurship, the different layers of the stack. I think a lot of the stuff you’re talking about here is a little bit on the math side, maybe physics almost working in theory.
Guillaume Verdon
(01:27:03)
Mm-hmm.
Lex Fridman
(01:27:03)
What’s the difference between math, physics, engineering, and making a product for quantum computing, for quantum machine learning?
Guillaume Verdon
(01:27:14)
Yeah. Some of the original team for the TensorFlow Quantum project, which we started in school, at University of Waterloo, there was myself. Initially, I was a physicist, applied mathematician. We had a computer scientist, we had a mechanical engineer, and then we had a physicist. That was experimental primarily. Putting together teams that are very cross-disciplinary and figuring out how to communicate and share knowledge is really the key to doing this interdisciplinary engineering work.

(01:27:51)
There is a big difference. In mathematics, you can explore mathematics for mathematics’ sake. In physics, you’re applying mathematics to understand the world around us. And in engineering, you’re trying to hack the world. You’re trying to find how to apply the physics that I know, my knowledge of the world, to do things.
Lex Fridman
(01:28:11)
Well, in quantum computing in particular, I think there’s just a lot of limits to engineering. It just seems to be extremely hard.
Guillaume Verdon
(01:28:17)
Yeah.
Lex Fridman
(01:28:18)
So, there’s a lot of value to be exploring quantum computing, quantum machine learning in theory with math. I guess one question is, why is it so hard to build a quantum computer? What’s your view of timelines in bringing these ideas to life?
Guillaume Verdon
(01:28:43)
Right. I think that an overall theme of my company is that we have folks that are… There’s a sort of exodus from quantum computing and we’re going to broader physics-based AI that is not quantum. So, that gives you a hint.
Lex Fridman
(01:29:00)
We should say the name of your company is Extropic?
Guillaume Verdon
(01:29:03)
Extropic, that’s right. We do physics-based AI, primarily based on thermodynamics rather than quantum mechanics. But essentially, a quantum computer is very difficult to build because you have to induce this zero-temperature subspace of information. The way to do that is by encoding information: you encode a code within a code, within a code, within a code. There’s a lot of redundancy needed to do this error correction, but ultimately, it’s a sort of algorithmic refrigerator, really. It’s just pumping entropy out of the subsystem that is virtual and delocalized, which represents your “logical qubits”, aka the payload quantum bits in which you actually want to run your quantum mechanical program. It’s very difficult because in order to scale up your quantum computer, you need each component to be of sufficient quality for it to be worth it. Because if you try to do this error correction, this quantum error correction process, and your control over each quantum bit is insufficient, it’s not worth scaling up. You’re actually adding more errors than you remove. There’s this notion of a threshold where, if your quantum bits are of sufficient quality in terms of your control over them, it’s actually worth scaling up. Actually, in recent years, people have been crossing the threshold and it’s starting to be worth it.

(01:30:38)
It’s just a very long slog of engineering, but ultimately, it’s really crazy to me how much exquisite level of control we have over these systems. It’s actually quite crazy. And people are crossing… They’re achieving milestones. It’s just in general, the media always gets ahead of where the technology is. There’s a bit too much hype. It’s good for fundraising, but sometimes it causes winters. It’s the hype cycle. I’m bullish on quantum computing on a 10, 15-year timescale personally, but I think there’s other quests that can be done in the meantime. I think it’s in good hands right now.
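
The ‘threshold’ mentioned above is often summarized with a heuristic scaling law; for a code of distance d with physical error rate p and threshold p_th, the logical error rate behaves roughly as (a sketch, with A a code- and decoder-dependent constant):

    p_L \;\sim\; A \left(\frac{p}{p_{\mathrm{th}}}\right)^{\lfloor (d+1)/2 \rfloor}

So when p < p_th, increasing the code distance suppresses logical errors exponentially and scaling up pays off, while above threshold the added qubits make things worse.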
Lex Fridman
(01:31:22)
Well, let me just explore different beautiful ideas, large or small, in quantum computing that might jump out at you from memory when you co-authored a paper titled Asymptotically Limitless Quantum Energy Teleportation via Qudit Probes. Just out of curiosity, can you explain what a qudit is versus a qubit?
Guillaume Verdon
(01:31:45)
Yeah. It’s a d-state generalization of a qubit.
Lex Fridman
(01:31:49)
It’s a multidimensional?
Guillaume Verdon
(01:31:50)
Multidimensional, right. It’s like, well, can you have a notion of an integer or floating point that is quantum mechanical? That’s something I’ve had to think about. I think that research was a precursor to later work on quantum analog-digital conversion. That was interesting because during my master’s, I was trying to understand the energy and entanglement of the vacuum of emptiness. Emptiness has energy, which is very weird to say. Our equations of cosmology don’t match our calculations for the amount of quantum energy there is in the fluctuations.

(01:32:36)
I was trying to hack the energy of the vacuum, and the reality is that you can’t just directly hack it. It’s not technically free energy. Your lack of knowledge of the fluctuations means you can’t extract the energy. But just like the stock market, if you have a stock that’s correlated over time, the vacuum’s actually correlated. If you measured the vacuum at one point, you acquired information. If you communicated that information to another point, you can infer what configuration the vacuum is in to some precision and statistically extract, on average, some energy there. So, you’ve “teleported energy”.

(01:33:18)
To me, that was interesting because you could create pockets of negative-energy density, which is energy density that is below the vacuum, which is very weird because we don’t understand how the vacuum gravitates. There are theories where the vacuum or the canvas of space-time itself is really a canvas made out of quantum entanglement. I was studying how decreasing energy of vacuum locally increases quantum entanglement, which is very funky.

(01:33:58)
The thing there is that, if you’re into weird theories about UAPs and whatnot, you could try to imagine that they’re around. And how would they propel themselves? How would they go faster than the speed of light? You would need a sort of negative energy density. To me, I gave it the old college try, trying to hack the energy of the vacuum, and hit the limits allowable by the laws of physics. But there’s all sorts of caveats there where you can’t extract more than you’ve put in, obviously.
Lex Fridman
(01:34:41)
But you’re saying it’s possible to teleport the energy because you can extract information one place and then make, based on that, some kind of prediction about another place?
Guillaume Verdon
(01:34:56)
Mm-hmm.
Lex Fridman
(01:34:57)
I’m not sure what to make of that.
Guillaume Verdon
(01:34:58)
Yeah, it’s allowable by the laws of physics. The reality though is that the correlations decay with distance.
Lex Fridman
(01:35:06)
Sure.
Guillaume Verdon
(01:35:06)
And so, you’re going to have to pay the price not too far away from where you extract it.

Aliens

Lex Fridman
(01:35:11)
The precision decreases in terms of your ability, but still. But since you mentioned UAPs, we talked about intelligence, and I forgot to ask, what’s your view on the other possible intelligences that are out there at the mesoscale? Do you think there are other intelligent alien civilizations? Is that useful to think about? How often do you think about it?
Guillaume Verdon
(01:35:36)
I think it’s useful to think about. It’s useful to think about because we got to ensure we’re anti-fragile, and we’re trying to increase our capabilities as fast as possible. Because we could get disrupted. There’s no laws of physics against there being life elsewhere that could evolve and become an advanced civilization and eventually come to us. Do I think they’re here now? I’m not sure. I’ve read what most people have read on the topic.

(01:36:14)
I think it’s interesting to consider and to me, it’s a useful thought experiment to instill a sense of urgency in developing technologies and increasing our capabilities, to make sure we don’t get disrupted. Whether it’s a form of AI that disrupts us, or a foreign intelligence from a different planet. Either way, increasing our capabilities and becoming formidable as humans, I think that’s really important, so that we’re robust against whatever the universe throws at us.
Lex Fridman
(01:36:51)
But to me, it’s also an interesting challenge and thought experiment on how to perceive intelligence. This has to do with quantum mechanical systems. This has to do with any kind of system that’s not like humans. To me, the thought experiment is, say, the aliens are here or they are directly observable. We’re just too blind, too self-centered, don’t have the right sensors, or don’t have the right processing of the sensor data to see the obvious intelligence that’s all around us.
Guillaume Verdon
(01:37:26)
Well, that’s why we work on quantum sensors. They can sense gravity.
Lex Fridman
(01:37:31)
Yeah. That’s a good one, but there could be other stuff that’s not even in the currently known forces of physics.
Guillaume Verdon
(01:37:43)
Right.
Lex Fridman
(01:37:43)
There could be some other stuff. The most entertaining thought experiment to me is that it’s other stuff that’s obvious. It’s not like we lack the sensors. It’s all around us, consciousness being one possible example. But there could be stuff that’s just obviously there, that once you know it, it’s like, “Oh, right. Right. The thing we thought is somehow emergent from the laws of physics as we understand them is actually a fundamental part of the universe and can be incorporated into physics once understood.”
Guillaume Verdon
(01:38:18)
Statistically speaking, if we observed some sort of alien life, it would most likely be some sort of virally self-replicating, von Neumann-like probe system. And it’s possible that there are such systems that, I don’t know what they’re doing at the bottom of the ocean, allegedly, but maybe they’re collecting minerals from the bottom of the ocean.
Lex Fridman
(01:38:44)
Yeah.
Guillaume Verdon
(01:38:45)
But that wouldn’t violate any of my priors. But am I certain that these systems are here? It’d be difficult for me to say so. I only have secondhand information about there being data.
Lex Fridman
(01:38:59)
About the bottom of the ocean? Yeah. But could it be things like memes? Could it be thoughts and ideas? Could they be operating at that medium? Could aliens be the very thoughts that come into my head? What’s the origin of ideas? In your mind, when an idea comes to your head, show me where it originates.
Guillaume Verdon
(01:39:25)
Frankly, when I had the idea for the type of computer I’m building now, I think it was eight years ago now, it really felt like it was being beamed from space. I was in bed, just shaking, just thinking it through. I don’t know. But do I believe that legitimately? I don’t think so. But I think that alien life could take many forms, and I think the notion of intelligence and the notion of life needs to be expanded much more broadly to be less anthropocentric or biocentric.

Quantum gravity

Lex Fridman
(01:40:04)
Just to linger a little longer on quantum mechanics, through all your explorations on quantum computing, what’s the coolest, most beautiful idea that you’ve come across that has been solved or has not yet been solved?
Guillaume Verdon
(01:40:19)
I think it’s the journey to understand something called AdS/CFT: the journey to understand quantum gravity through this picture where a hologram of lesser dimension is actually dual to, exactly corresponding to, a bulk theory of quantum gravity in an extra dimension, and the fact that this sort of duality comes from trying to learn deep learning-like representations of the boundary.

(01:40:59)
At least, part of my journey, someday on my bucket list, is to apply quantum machine learning to these sorts of systems, these CFTs, or what are called SYK models, and learn an emergent geometry from the boundary theory. And so, we can have a form of machine learning to help us understand quantum gravity, which is still a holy grail that I would like to hit before I leave this earth.
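As standard background for readers, one common way the duality he is describing gets written is the GKPW dictionary; this is textbook material added for context, not something stated in the conversation, and details vary by formulation:

```latex
% GKPW relation (schematic): the CFT generating functional with source \phi_0
% equals the bulk gravity partition function with the boundary condition that
% the bulk field \phi approaches \phi_0 on the boundary of AdS.
Z_{\mathrm{CFT}}[\phi_0] \;=\; Z_{\mathrm{gravity}}\!\left[\,\phi\big|_{\partial \mathrm{AdS}} = \phi_0\,\right]
```

So boundary correlation functions are computed by varying the bulk partition function with respect to its boundary data, which is the sense in which the lower-dimensional hologram encodes the higher-dimensional bulk.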
Lex Fridman
(01:41:35)
What do you think is going on with black holes? As information-storing and processing units, what do you think is going on with black holes?
Guillaume Verdon
(01:41:46)
Black holes are really fascinating objects. They’re at the interface between quantum mechanics and gravity, and so they help us test all sorts of ideas. I think that for many decades now, there’s been this black hole information paradox: things that fall into the black hole, we seem to have lost their information. Now, I think there’s this firewall paradox that has been allegedly resolved in recent years by a former peer of mine, who’s now a professor at Berkeley. There, it seems like, as information falls into a black hole, there’s a sedimentation. As you get closer and closer to the horizon, from the point of view of the observer on the outside, the object slows down infinitely as it gets closer and closer.

(01:42:46)
Everything that is falling into a black hole, from our perspective, gets sedimented and tacked on near the horizon. At some point, it gets so close to the horizon that it’s at the proximity, or the scale, at which quantum effects and quantum fluctuations matter. There, in the traditional picture, that infalling matter could interfere with the creation and annihilation of particles and antiparticles in the vacuum. Through this interference, one of the particles gets entangled with the infalling information, and the other is now free and escapes. That’s how there’s mutual information between the outgoing radiation and the infalling matter. But getting that calculation right, I think we’re only just starting to put the pieces together.
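For context on the “slows down infinitely” picture, the standard Schwarzschild time-dilation factor makes it concrete; this is textbook general relativity added as background, not a derivation from the conversation:

```latex
% Proper time d\tau of a static observer at radius r, relative to the
% coordinate time t of an observer far away, outside a Schwarzschild
% black hole of mass M:
\frac{d\tau}{dt} = \sqrt{1 - \frac{r_s}{r}}, \qquad r_s = \frac{2GM}{c^2}
% As r -> r_s this factor goes to zero, so from the outside the infalling
% object appears to freeze and pile up ("sediment") just above the horizon.
```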
Lex Fridman
(01:43:43)
There’s a few pothead-like questions I want to ask you.
Guillaume Verdon
(01:43:46)
Sure.
Lex Fridman
(01:43:46)
One, does it terrify you that there’s a giant black hole at the center of our galaxy?
Guillaume Verdon
(01:43:52)
I don’t know. I just want to set up shop near it to fast-forward, to meet a future civilization. If we have a limited lifetime, you could go orbit a black hole and emerge in the future.
Lex Fridman
(01:44:08)
If there’s a special mission that could take you to a black hole, would you volunteer to go travel?
Guillaume Verdon
(01:44:13)
To orbit and obviously not fall into it.
Lex Fridman
(01:44:15)
That’s obvious. It’s obvious to you that everything’s destroyed inside a black hole? All the information that makes up Guillaume is destroyed? Maybe on the other side, Beff Jezos emerges and it’s just all like it’s tied together in some deeply memeful way.
Guillaume Verdon
(01:44:32)
Yeah, that’s a great question. We have to answer what black holes are. Are we punching a hole through space-time and creating a pocket universe? It’s possible. Then, that would mean that if we ascend the Kardashev scale to beyond Kardashev Type III, we could engineer black holes with specific hyperparameters to transmit information to new universes we create. And so, we can have progeny that are new…
Guillaume Verdon
(01:45:00)
… have progeny that are new universes. And so even though our universe may reach a heat death, we may have a way to have a legacy. And so we don’t know yet. We need to ascend the Kardashev Scale to answer these questions to peer into that regime of higher energy physics.

Kardashev scale

Lex Fridman
(01:45:25)
And maybe you can speak to the Kardashev Scale for people who don’t know. So one of the sort of meme-like principles and goals of the e/acc movement is to ascend the Kardashev Scale. What is the Kardashev Scale and when do we want to ascend it?
Guillaume Verdon
(01:45:43)
The Kardashev Scale is a measure of our energy production and consumption. Really, it’s a logarithmic scale. Kardashev Type I is a milestone where we are producing the equivalent wattage to all the energy that is incident on Earth from the sun. Kardashev Type II would be harnessing all the energy that is output by the sun. And I think Type III is like the whole galaxy equivalent-
Lex Fridman
(01:46:13)
Galaxy, I think [inaudible 01:46:14] yeah.
Guillaume Verdon
(01:46:15)
Yeah, and then some people have some crazy Type IV and V, but I don’t know if I believe in those. But to me, it seems like, from the first principles of thermodynamics, that, again, there’s this concept of thermodynamically driven dissipative adaptation, where life evolved on Earth because we have this energetic drive from the sun. We have incident energy, and life evolved on Earth to figure out ways to best capture that free energy to maintain itself and grow. And I think that principle is not special to our Earth-sun system. We can extend life well beyond. And we kind of have a responsibility to do so, because that’s the process that brought us here. So we don’t even know what it has in store for us in the future. It could be something of beauty we can’t even imagine today.
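As a rough numerical illustration of the logarithmic scale described above, here is Sagan’s commonly used interpolation formula; the exact wattage milestones are conventions and the present-day figure is an order-of-magnitude assumption, not numbers quoted in the conversation:

```python
import math

def kardashev_type(power_watts: float) -> float:
    """Sagan's interpolation: K = (log10(P) - 6) / 10, with P in watts."""
    return (math.log10(power_watts) - 6) / 10

# Rough milestone wattages under this convention (order-of-magnitude assumptions):
print(kardashev_type(1e16))  # ~1.0 -> Type I   (planetary-scale power)
print(kardashev_type(1e26))  # ~2.0 -> Type II  (stellar-scale power)
print(kardashev_type(1e36))  # ~3.0 -> Type III (galactic-scale power)
print(kardashev_type(2e13))  # ~0.7 -> roughly where humanity sits today
```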

Effective accelerationism (e/acc)

Lex Fridman
(01:47:18)
So this is probably a good place to talk a bit about the e/acc movement. In a Substack blog post titled What the Fuck is e/acc?, or actually, What the F* is e/acc?, you write, “Strategically speaking, we need to work towards several overarching civilization goals that are all interdependent.” And the four goals are: “Increase the amount of energy we can harness as a species (climb the Kardashev gradient). In the short term, this almost certainly means nuclear fission. Increase human flourishing via pro-population growth policies and pro-economic growth policies. Create artificial general intelligence, the single greatest force multiplier in human history. And finally, develop interplanetary and interstellar transport so that humanity can spread beyond the earth.” Could you build on top of that to maybe say, what to you is the e/acc movement? What are the goals? What are the principles?
Guillaume Verdon
(01:48:20)
The goal is for the human techno-capital memetic machine to become self-aware and to hyperstitiously engineer its own growth. So let’s decompress that.
Lex Fridman
(01:48:33)
Define each of those words.
Guillaume Verdon
(01:48:35)
So you have humans, you have technology, you have capital, and then you have memes, information, and all of those systems are coupled with one another. Humans work at companies, they acquire and allocate capital, and humans communicate via memes and information propagation. And our goal was to have a sort of viral, optimistic movement that is aware of how the system works, aware that fundamentally it seeks to grow, and we simply want to lean into the natural tendencies of the system to adapt for its own growth.
Lex Fridman
(01:49:18)
So in that way, you write, e/acc is literally a memetic optimism virus that is constantly drifting, mutating, and propagating in a decentralized fashion. So, memetic optimism virus. So you do want it to be a virus, to maximize the spread, and it’s hyperstitious, therefore the optimism will incentivize its growth.
Guillaume Verdon
(01:49:43)
We see e/acc as sort of a meta-heuristic, sort of very thin cultural framework from which you can have much more opinionated forks. Fundamentally, we just say that what got us here is this adaptation of the whole system based on thermodynamics, and that process is good and we should keep it going. That is the core thesis. Everything else is, okay, how do we ensure that we maintain this malleability and adaptability. Well, clearly not suppressing variants, and maintaining free speech, freedom of thought, freedom of information propagation, and freedom to do AI research is important for us to converge the fastest on the space of technologies, ideas, and whatnot that lead to this growth. And so ultimately, there’s been quite a few forks. Some are just memes, but some are more serious. Vitalik Buterin recently made a d/acc fork. He has his own sort of fine-tunings of e/acc.
Lex Fridman
(01:50:59)
Does anything jump out from memory about the unique characteristics of that fork from Vitalik?
Guillaume Verdon
(01:51:05)
I would say that it’s trying to find a middle ground between e/acc and EA and AI safety. To me, having a movement that is opposite to what was the mainstream narrative taking over Silicon Valley was important to shift the dynamic range of opinions. And it’s like the balance between centralization and decentralization; the real optimum is always somewhere in the middle. But for e/acc, we’re pushing for entropy, novelty, disruption, malleability, speed, rather than being conservative, suppressing thought, suppressing speech, adding constraints, adding too many regulations, slowing things down. And so, we’re trying to bring balance to the force.
Lex Fridman
(01:52:00)
Balance to the force of human civilization.
Guillaume Verdon
(01:52:02)
It’s literally the forces of constraints versus the entropic force that makes us explore. Systems are optimal when they’re at the edge of criticality between order and chaos, between constraints, energy minimization and entropy. Systems want to equilibrate, balance these two things. I thought that the balance was lacking, and so we created this movement to bring balance.
Lex Fridman
(01:52:31)
Well, I like the visual of the landscape of ideas evolving through forks. On the other side, in history, you can think of Marxism as the original repository, then Soviet Communism as a fork of that, and then Maoism as a fork of Marxism and Communism. And so those are all forks. They’re exploring different ideas.
Guillaume Verdon
(01:53:02)
Thinking of culture almost like code. Nowadays, what you prompt in the LLM or what you put in the constitution of an LLM is basically its cultural framework, what it believes. And you can share it on GitHub nowadays. So trying to take inspiration from what has worked in this machine of software to adapt over the space of code, could we apply that to culture? And our goal is to not say, “You should live your life this way, X, Y, Z,” it’s to set up a process where people are always searching over subcultures and competing for mind share. I think creating this malleability of culture is super important for us to converge onto the cultures and the heuristics about how to live one’s life that are updated to modern times.

(01:53:59)
Because there’s really been a sort of vacuum of spirituality and culture. People don’t feel like they belong to any one group, and there have been parasitic ideologies that have taken the opportunity to populate this Petri dish of minds. Elon calls it the mind virus. We call it the decel mind virus complex, the decelerative impulse that is kind of the overall pattern between all of them. There are many variants as well. And so if there’s a sort of viral, pessimistic, decelerative movement, we needed to have not only one movement, but many, many variants, so it’s very hard to pinpoint and stop.
Lex Fridman
(01:54:45)
But the overarching thing is nevertheless a kind of memetic optimism pandemic. Okay, let me ask you, do you think e/acc, to some degree, is a cult?
Guillaume Verdon
(01:55:01)
Define cult?
Lex Fridman
(01:55:03)
I think a lot of human progress is made when you have independent thought, so you have individuals that are able to think freely. And very powerful memetic systems can kind of lead to groupthink. There’s something in human nature that leads to mass hypnosis, mass hysteria. We start to think alike whenever there’s a sexy idea that captures our minds. And so it’s actually hard to break us apart, pull us apart, diversify thought. So to what degree is everybody kind of chanting “e/acc, e/acc” like the sheep in Animal Farm?
Guillaume Verdon
(01:55:46)
Well, first of all, it’s fun. It’s rebellious. There’s this concept of meta-irony, of being on the boundary of, “We’re not sure if they’re serious or not.” And it’s much more playful and much more fun. For example, we talk about thermodynamics being our god, and sometimes we do cult-like things, but there’s no ceremony and robes and whatnot.
Lex Fridman
(01:56:19)
Not yet.
Guillaume Verdon
(01:56:19)
Not yet, no. But ultimately, yeah, I totally agree that it seems to me that humans want to feel like they’re part of a group, so they naturally try to agree with their neighbors and find common ground. And that leads to sort of mode collapse in the space of ideas. We used to have one cultural island that was allowed. It was a typical subspace of thought, and anything that was diverting from that subspace of thought was suppressed or you were canceled. Now we’ve created a new mode, but the whole point is that we’re not trying to have a very restricted space of thought. There’s not just one way to think about e/acc and its many forks. And the point is that there are many forks and there can be many clusters and many islands.

(01:57:07)
And I shouldn’t be in control of it in any way. I mean, there’s no formal org whatsoever. I just put out tweets and certain blog posts, and people are free to defect and fork if there’s an aspect they don’t like. And so that makes it so that there should be deterritorialization in the space of ideas, so that we don’t end up in one cluster that’s very cult-like. And so cults usually, they don’t allow people to defect or start competing forks, whereas we encourage it.

Humor and memes

Lex Fridman
(01:57:51)
The pros and cons of humor and memes. In some sense there’s a wisdom to memes. What is it, the Magic Theater? What book is that from? Hermann Hesse. Steppenwolf, I think. But there’s a kind of embracing of the absurdity that seems to get to the truth of things, but at the same time, it can also decrease the quality and the rigor of the discourse.
Guillaume Verdon
(01:58:22)
Yeah.
Lex Fridman
(01:58:23)
Do you feel the tension of that?
Guillaume Verdon
(01:58:25)
Yeah. So initially, I think what allowed us to grow under the radar was because it was camouflaged as sort of meta-ironic. We would sneak in deep truths within a package of humor and memes and what are called shit posts, and I think that was purposefully camouflaged against those that seek status and do not want to… It’s very hard to argue with a cartoon frog or a cartoon of an intergalactic Jeff Bezos and take yourself seriously, and so that allowed us to grow pretty rapidly in the early days. But of course, essentially people get steered. Their notion of the truth comes from the data they see, from the information they’re fed, and the information people are fed is determined by algorithms. And really what we’ve been doing is engineering what we call high memetic fitness packets of information, so that they can spread effectively and carry a message.

(01:59:47)
So it’s kind of a vector to spread the message. And yes, we’ve been using techniques that are optimal for today’s algorithmically-amplified information landscapes. But I think we’re reaching the point of scale where we can have serious debates and serious conversations. And that’s why we’re considering doing a bunch of debates and having more serious long-form discussions, because I don’t think the timeline is optimal for very serious, thoughtful discussions. You get rewarded for polarization. And so, even though we started a movement that is literally trying to polarize the tech ecosystem, at the end of the day it’s so that we can have a conversation and find an optimum together.

Jeff Bezos

Lex Fridman
(02:00:42)
I mean, that’s kind of what I try to do with this podcast, given the landscape of things, to still have long-form conversations. But there is a degree to which absurdity is fully embraced. In fact, this very conversation is multi-level absurd. So first of all, I should say that just very recently I had a conversation with Jeff Bezos, and I would love to hear your, Beff Jezos’s, opinions of Jeff Bezos. Speaking of intergalactic Jeff Bezos. What do you think of that particular individual, who inspired your name?
Guillaume Verdon
(02:01:25)
Yeah, I think Jeff is really great. I mean, he’s built one of the most epic companies of all time. He’s leveraged the techno-capital machine and techno-capital acceleration to give us what we wanted: quick delivery, very convenient, at home, at low prices. He understood how the machine worked and how to harness it: running the company, not trying to take profits too early, putting it back in, letting the system compound and keep improving. And arguably, Amazon has invested some of the most capital in robotics out there, and certainly, with the birth of AWS, kind of enabled the tech boom we’ve seen today that has paid the salaries of, I guess, myself and all of our friends to some extent. And so I think we can all be grateful to Jeff, and he’s one of the great entrepreneurs out there, one of the best of all time, unarguably.
Lex Fridman
(02:02:32)
And of course, the work at Blue Origin, similar to the work at SpaceX, is trying to make humans a multi-planetary species, which seems almost like a bigger thing than the capitalist machine. Or is it the capitalist machine at a different timescale, perhaps?
Guillaume Verdon
(02:02:47)
Yeah, I think that companies tend to optimize quarter over quarter, maybe a few years out, but individuals that want to leave a legacy can think on a multi-decadal or multi-century timescale. And so, the fact that some individuals are such good capital allocators means they unlock the ability to allocate capital to goals that take us much further or are much further-looking… Elon’s doing this with SpaceX, putting all this capital towards getting us to Mars. Jeff is trying to build Blue Origin, and I think he wants to build O’Neill cylinders and get industry off-planet, which I think is brilliant.

(02:03:33)
I think just overall, I’m for billionaires. I know this is a controversial statement sometimes, but I think that in a sense it’s kind of proof-of-stake voting. If you’ve allocated capital efficiently, you unlock more capital to allocate, just because clearly you know how to allocate capital more efficiently. Which is in contrast to politicians that get elected because they speak the best on TV, not because they have a proven track record of allocating taxpayer capital most efficiently. And so that’s why I’m for capitalism over, say, giving all our money to the government and letting them figure out how to allocate it.
Lex Fridman
(02:04:20)
Why do you think it’s such a viral and popular meme to criticize billionaires, since you mentioned billionaires? Why do you think there’s quite a widespread criticism of people with wealth, especially those in the public eye, like Jeff and Elon and Mark Zuckerberg, and who else? Bill Gates.
Guillaume Verdon
(02:04:44)
Yeah, I think a lot of people would, instead of trying to understand how the techno-capital machine works and realizing they have much more agency than they think, they’d rather have this sort of victim mindset. “I’m just subjected to this machine. It is oppressing me. And the successful players clearly must be evil because they’ve been successful at this game that I’m not successful at.” But I’ve managed to get some people that were in that mindset and make them realize how the techno-capital machine works and how you can harness it for your own good and for the good of others. And by creating value, you capture some of the value you create for the world. That sort of positive sum mindset shift is so potent, and really, that’s what we’re trying to do by scaling e/acc, is unlocking that higher level of agency. Actually, you’re far more in control of the future than you think. You have agency to change the world, go out and do it. Here’s permission.
Lex Fridman
(02:05:46)
Each individual has agency. The motto, “Keep building” is often heard. What does that mean to you, and what does that have to do with Diet Coke? By the way, thank you so much for the Red Bull. It’s working pretty well. I’m feeling pretty good.
Guillaume Verdon
(02:06:03)
Awesome. Well, so building technologies and building… It doesn’t have to be technologies; just building in general means having agency, trying to change the world by creating, let’s say, a company, which is a self-sustaining organism that accomplishes a function in the broader techno-capital machine. To us, that’s the way to achieve change in the world that you’d like to see, rather than, say, pressuring politicians or creating nonprofits. Nonprofits, once they run out of money, their function can no longer be accomplished. You’re kind of deforming the market artificially, compared to sort of subverting or coaxing the market, or dancing with the market, to convince it that actually this function is important, adds value, and here it is. And so I think this is the difference between the de-growth, ESG approach versus, say, Elon’s. The de-growth approach is like, “We’re going to manage our way out of a climate crisis.” And Elon is like, “I’m going to build a company that is self-sustaining, profitable, and growing, and we’re going to innovate our way out of this dilemma.” And we’re trying to get people to do the latter rather than the former, at all scales.

Elon Musk

Lex Fridman
(02:07:26)
Elon is an interesting case. You are a proponent, you celebrate Elon, but he’s also somebody who has for a long time warned about the dangers, the potential dangers, existential risks of artificial intelligence. How do you square the two? Is that a contradiction to you?
Guillaume Verdon
(02:07:45)
It is, somewhat, because he’s very much against regulation in many aspects, but for AI he’s definitely a proponent of regulations. I think overall he saw the dangers of, say, OpenAI cornering the market and getting to have a monopoly over the cultural priors that you can embed in these LLMs. As LLMs become the source of truth for people, you can shape the culture of the people, and so you can control people by controlling LLMs. He saw that, just like it was the case for social media: if you shape the function of information propagation, you can shape people’s opinions. So he sought to make a competitor. At least there, I think we’re very aligned, that the way to a good future is to maintain adversarial equilibria between the various AI players. I’d love to talk to him to understand his thinking about how to advance AI going forward. I mean, he’s also hedging his bets, I would say, with Neuralink. I think if he can’t stop the progress of AI, he’s building the technology to merge. Look at the actions, not just the words.
Lex Fridman
(02:09:10)
Well, there’s some degree to which being concerned… Maybe using human psychology, being concerned about threats all around us is a motivator. It’s an encouraging thing. I operate much better when there’s a deadline, the fear of the deadline. And I, for myself, create artificial ones, like I want to create in myself this kind of anxiety, as if something really horrible will happen if I miss the deadline. I think there’s some degree of that here, because creating AI that’s aligned with humans has a lot of potential benefits. And so a different way to reframe that is, “If you don’t, we’re all going to die.” It just seems to be a very powerful psychological formulation of the goal of creating human-aligned AI.
Guillaume Verdon
(02:09:59)
I think that anxiety is good. I think, like I said, I want the free market to create aligned AIs that are reliable, and I think that’s what he’s trying to do with xAI. So I’m all for it. What I am against is stopping, let’s say, the open-source ecosystem from thriving by, let’s say in the executive order, claiming that open-source LLMs are dual-use technologies and should be government-controlled. Then everybody needs to register their GPUs and their big matrices with the government, and I think that extra friction will dissuade a lot of hackers from contributing, hackers that could later become the researchers that make key discoveries that push us forward, including discoveries for AI safety. And so I just want to maintain ubiquity of opportunity to contribute to AI and to own a piece of the future. It can’t just be legislated behind some wall where only a few players get to play the game.
Lex Fridman
(02:11:08)
The e/acc movement is often caricatured to mean progress and innovation at all costs. It doesn’t matter how unsafe it is, doesn’t matter if it causes a lot of damage. You just build cool shit as fast as possible, stay up all night with a Diet Coke, whatever it takes. I guess, I don’t know if there’s a question in there, but how important, to you and to the different formulations of e/acc you’ve seen, is AI safety?
Guillaume Verdon
(02:11:44)
Again, I think if there was no one working on it, I would be a proponent of it. But again, our goal is to bring balance, and obviously a sense of urgency is a useful tool to make progress. It hacks our dopaminergic systems and gives us energy to work late into the night. I think it also helps to have a higher purpose you’re contributing to. At the end of the day, it’s like, what am I contributing to? I’m contributing to the growth of this beautiful machine so that we can reach the stars. That’s really inspiring. That’s also a sort of neuro-hack.
Lex Fridman
(02:12:26)
So you’re saying AI safety is important to you, but right now the landscape of ideas you see is, AI safety as a topic is used more often to gain centralized control. So in that sense, you’re resisting it, as a proxy for gaining centralized control?
Guillaume Verdon
(02:12:43)
Yeah, I just think we have to be careful, because safety is just the perfect cover for centralization of power and, eventually, covering up corruption. I’m not saying it’s corrupted now, but it could be down the line. And really, if you let the argument run, there’s no amount of centralization of control that will be enough to ensure your safety. There are always more nines of P(safety) that you can gain: 99.9999% safe, maybe you want another nine. “Oh, please give us full access to everything you do. Full surveillance.” And frankly, those that are proponents of AI safety have proposed having a global panopticon, where you have centralized perception of everything going on. And to me, that just opens the door wide open for a Big Brother, 1984-like scenario. And that’s not a future I want to live in.
Lex Fridman
(02:13:49)
Because we have some examples throughout history when that did not lead to a good outcome.

Extropic

Guillaume Verdon
(02:13:54)
Right.
Lex Fridman
(02:13:56)
You mentioned you founded a company, Extropic, that recently announced a $14.1 million seed round. What’s the goal of the company? You’re talking about a lot of interesting physics things, so what are you up to over there that you can talk about?
Guillaume Verdon
(02:14:12)
Yeah, originally we weren’t going to announce last week, but I think with the doxing and disclosure, our hand got forced, so we had to disclose roughly what we were doing. But really, Extropic was born from my dissatisfaction, and that of my colleagues, with the quantum computing roadmap. Quantum computing was sort of the first path to physics-based computing that was trying to commercially scale, and I was working on physics-based AI that runs on these physics-based computers. But ultimately, our greatest enemy was noise, this pervasive problem of noise where, as I mentioned, you have to constantly pump the noise out of the system to maintain this pristine environment where quantum mechanics can take effect. And that constraint was just too much. It’s too costly to do that.

(02:15:11)
And so we were wondering, as generative AI is sort of eating the world, more and more of the world’s computational workloads are focused on generative AI, how could we use physics to engineer the ultimate physical substrate for generative AI from first principles of physics, of information theory, of computation, and ultimately of thermodynamics? And so what we’re seeking to build is a physics-based computing system and physics-based AI algorithms that are inspired by out-of-equilibrium thermodynamics, or harness it directly to do machine learning as a physical process.
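To give a flavor of what “machine learning as a physical process” can mean in the abstract, here is a minimal sketch of sampling an energy-based model with overdamped Langevin dynamics, where noise does useful work rather than being pumped out. This is a generic textbook picture offered as an illustration only; it is not a description of Extropic’s actual hardware or algorithms:

```python
import numpy as np

def energy(x):
    # Toy double-well energy landscape (purely illustrative).
    return 0.25 * np.sum(x**4) - 0.5 * np.sum(x**2)

def grad_energy(x):
    # Gradient of the toy energy above.
    return x**3 - x

def langevin_sample(x0, steps=20_000, dt=1e-2, temperature=1.0, seed=0):
    """Overdamped Langevin dynamics: drift down the energy gradient plus
    thermal noise; its stationary distribution is proportional to
    exp(-energy(x) / temperature)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    samples = np.empty((steps, x.size))
    for i in range(steps):
        noise = rng.normal(size=x.shape)
        x = x - dt * grad_energy(x) + np.sqrt(2.0 * dt * temperature) * noise
        samples[i] = x
    return samples

samples = langevin_sample([0.1, -0.1])
print(samples.mean(axis=0), samples.var(axis=0))
```

The only point of the sketch is that sampling, which is the expensive core of generative modeling, can be phrased as a noisy physical relaxation process.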
Lex Fridman
(02:16:01)
So what does that mean, machine learning as a physical process? Is that hardware? Is it software? Is it both? Is it trying to do the full stack in some kind of unique way?
Guillaume Verdon
(02:16:10)
Yes, it is full stack. And so we’re folks that have built differentiable programming into the quantum computing ecosystem with TensorFlow Quantum. One of my co-creators of TensorFlow Quantum, Trevor McCourt, is our CTO. We have some of the best quantum computer architects, those that have designed IBM’s and AWS’s systems. They’ve left quantum computing to help us build what we actually call a thermodynamic computer.
Lex Fridman
(02:16:43)
A thermodynamic computer. Well, actually let’s linger around TensorFlow Quantum. What lessons have you learned from TensorFlow Quantum? Maybe you can speak to what it takes to create essentially, what, like a software API to a quantum computer?
Guillaume Verdon
(02:17:01)
Right. That was a challenge to invent, to build, and then to get to run on the real devices.
Lex Fridman
(02:17:09)
Can you actually speak to what it is?
Guillaume Verdon
(02:17:11)
Yeah. TensorFlow Quantum was an attempt at… Well, I guess we succeeded at combining deep learning, or differentiable classical programming, with quantum computing, turning quantum computing into something where you have types of programs that are differentiable. And Andrej Karpathy calls differentiable programming Software 2.0. It’s like, gradient descent is a better programmer than you. And the idea was that in the early days of quantum computing, you can only run short quantum programs. So, which quantum programs should you run? Well, just let gradient descent find those programs instead. And so we built the first infrastructure to not only run differentiable quantum programs, but combine them as part of broader deep learning graphs, incorporating deep neural networks, the ones you know and love, with what are called quantum neural networks.

(02:18:21)
And ultimately, it was a very cross-disciplinary effort. We had to invent all sorts of ways to differentiate, to backpropagate through the hybrid graph. But ultimately, it taught me that the way to program matter and to program physics is by differentiating through control parameters. If you have parameters that affect the physics of the system, and you can evaluate some loss function, you can optimize the system to accomplish a task, whatever that task may be. And that’s a very universal meta-framework for how to program physics-based computers.
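As a toy illustration of “let gradient descent find the program,” here is a single-qubit sketch that differentiates a parameterized circuit’s expectation value with the parameter-shift rule and then optimizes it. It is a from-scratch illustration of the general idea under simplifying assumptions, not the TensorFlow Quantum API:

```python
import numpy as np

def expectation_z(theta):
    # State RY(theta)|0> = [cos(theta/2), sin(theta/2)]; its <Z> equals cos(theta).
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0]**2 - state[1]**2

def parameter_shift_grad(theta, shift=np.pi / 2):
    # Parameter-shift rule for rotation gates: d<Z>/dtheta = (f(t+s) - f(t-s)) / 2.
    return (expectation_z(theta + shift) - expectation_z(theta - shift)) / 2

theta, lr = 0.3, 0.2
for _ in range(200):
    theta -= lr * parameter_shift_grad(theta)  # gradient descent to minimize <Z>
print(theta, expectation_z(theta))             # converges near pi, where <Z> ~ -1
```

The same pattern, parameterize, differentiate through the physics, then optimize, is what scales up to hybrid quantum-classical graphs.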
Lex Fridman
(02:19:05)
So try to parameterize everything, make those parameters differentiable, and then optimize?
Guillaume Verdon
(02:19:12)
Yes.
Lex Fridman
(02:19:13)
Okay. Is there some more practical engineering lessons from TensorFlow Quantum? Just organizationally too, like the humans involved and how to get to a product, how to create good documentation? I don’t know. All of these little subtle things that people might not think about.
Guillaume Verdon
(02:19:34)
I think working across disciplinary boundaries is always a challenge, and you have to be extremely patient in teaching one another. I learned a lot of software engineering through the process. My colleagues learned a lot of quantum physics, and some learned machine learning through the process of building this system. And I think if you get some smart people that are passionate and trust each other in a room, and you have a small team-
Guillaume Verdon
(02:20:00)
Are passionate and trust each other in a room, and you have a small team, and you teach each other your specialties, suddenly you’re kind of forming this sort of model soup of expertise, and something special comes out of that, right? It’s like combining genes, but for your knowledge bases, and sometimes special products come out of that. And so I think, even though it’s very high friction initially to work in an interdisciplinary team, I think the product at the end of the day is worth it. And so, learned a lot trying to bridge the gap there. And I mean, it’s still a challenge to this day. We hire folks that have an AI background, folks that have a pure physics background, and somehow we have to make them talk to one another. Right?
Lex Fridman
(02:20:47)
Is there a magic, is there some science and art to the hiring process, to building a team that can create magic together?
Guillaume Verdon
(02:20:56)
Yeah, it’s really hard to pinpoint that je ne sais quoi, right?
Lex Fridman
(02:21:03)
I didn’t know you speak French. That’s very nice.
Guillaume Verdon
(02:21:07)
Yeah, I’m actually French Canadian.
Lex Fridman
(02:21:09)
Oh, you are a legitimately French Canadian.
Guillaume Verdon
(02:21:09)
I am.
Lex Fridman
(02:21:11)
I thought you were just doing that for the cred.
Guillaume Verdon
(02:21:15)
No, no. I’m truly French Canadian, from Montreal. But yeah, essentially we look for people with very high fluid intelligence that aren’t overspecialized, because they’re going to have to get out of their comfort zone. They’re going to have to incorporate concepts that they’ve never seen before, and very quickly get comfortable with them, or learn to work in a team. And so that’s sort of what we look for when we hire. We can’t hire people that are just optimizing this subsystem for the past three or four years. We need really general sort of broader intelligence and specialty, and people that are open-minded, really, because if you’re pioneering a new approach from scratch, there is no textbook, there’s no reference. It’s just us, and people that are hungry to learn. So, we have to teach each other, we have to learn the literature, we have to share knowledge bases, collaborate in order to push the boundary of knowledge further together. And so, people that are used to just getting prescribed what to do at this stage, when you’re at the pioneering stage, that’s not necessarily who you want to hire. Yeah.

Singularity and AGI

Lex Fridman
(02:22:31)
So you mentioned with Extropic you’re trying to build the physical substrate for generative AI. What’s the difference between that and the AGI AI itself? So, is it possible that in the halls of your company, AGI will be created? Or will AGI just be using this as a substrate?
Guillaume Verdon
(02:22:51)
I think our goal is to both run human-like AI, or anthropomorphic AI.
Lex Fridman
(02:22:58)
Sorry for use of the term AGI. I know it’s triggering for you.
Guillaume Verdon
(02:23:02)
We think that the future is actually physics-based AI combined with anthropomorphic AI. So, you can imagine, I have a sort of world modeling engine through physics-based AI. Physics-based AI is better at representing the world at all scales, because it can be quantum mechanical, thermodynamic, deterministic, hybrid representations of the world, just like our world at different scales has different regimes of physics. If you inspire yourself from that in the ways you learn representations of nature, you can have much more accurate representations of nature. So, you can have very accurate world models at all scales. And so, you have the world modeling engine, and then you have the anthropomorphic AI that is human-like. So you can have the science, the playground to test your ideas, and you can have the synthetic scientist. And to us, that joint system of a physics-based and an anthropomorphic AI is the closest thing to a fully general, artificially intelligent system.
Lex Fridman
(02:24:07)
So you can get closer to truth by grounding the AI in physics, but you can also still have an anthropomorphic interface to us humans that like to talk to other humans, or human-like systems. So, on that topic, I suppose that is one of the big limitations of current large language models to you: they’re good bullshitters, they’re not really grounded to truth necessarily. Would that be fair to say?
Guillaume Verdon
(02:24:40)
Yeah, no, you wouldn’t try to extrapolate the stock market with an LLM trained on text from the internet. It’s not going to be a very accurate model. It’s not going to model its priors or its uncertainties about the world very accurately. So, you need a different type of AI to complement this text-extrapolation AI. Yeah.
Lex Fridman
(02:25:05)
You mentioned singularity earlier. How far away are we from a singularity?
Guillaume Verdon
(02:25:09)
I don’t know if I believe in a finite-time singularity as a single point in time. I think it’s going to be asymptotic, sort of a diagonal asymptote. We have the light cone, we have the limits of physics restricting our ability to grow, so obviously we can’t fully diverge in a finite time. My prior is that a lot of people on the other side of the aisle think that once we reach human-level AI, there’s going to be an inflection point and a sudden [inaudible 02:25:48], suddenly AI is going to grok how to manipulate matter at the nanoscale and assemble nanobots. And having worked for nearly a decade on applying AI to engineer matter, it’s much harder than they think. In reality, you need a lot of samples from either a simulation of nature that’s very accurate and costly, or from nature itself, and that keeps your ability to control the world around us in check. There’s a sort of minimal cost, computationally and thermodynamically, to acquiring information about the world in order to be able to predict and control it. And that keeps things in check.
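One well-known anchor for the claim that acquiring and processing information has an irreducible thermodynamic cost is Landauer’s principle, included here as standard background rather than something stated in the conversation:

```latex
% Minimum heat dissipated to erase one bit of information at temperature T:
E_{\min} = k_B \, T \ln 2
% At room temperature (T ~ 300 K) this is on the order of 3e-21 joules per bit,
% a floor for logically irreversible operations on any computing substrate.
```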

AI doomers

Lex Fridman
(02:26:27)
It’s funny you mentioned the other side of the aisle. So, in the poll I posted about p(doom) yesterday, what’s the probability of doom? There seems to be a nice division between people who think it’s very likely and people who think it’s very unlikely. I wonder if in the future there’ll be an actual Republicans-versus-Democrats-style division, blue versus red: the AI doomers versus the e/accers. [inaudible 02:26:53].
Guillaume Verdon
(02:26:53)
Yeah. So, this movement is not right wing or left wing fundamentally, it’s more like up versus down, in terms of the scale of-
Lex Fridman
(02:27:01)
Which one is the up? Okay.
Guillaume Verdon
(02:27:02)
… Civilization, right?
Lex Fridman
(02:27:03)
All right.
Guillaume Verdon
(02:27:05)
But it seems to be that there is a sort of alignment of the existing political parties, where those that are for more centralization of power, control, and more regulations are aligning themselves with the doomers, because instilling fear in people is a great way to get them to give up more control and give the government more power. But fundamentally, we’re not left versus right. I think we’ve done polls of people’s political alignment within e/acc, and I think it’s pretty balanced. So, it’s a new fundamental issue of our time. It’s not just centralization versus decentralization. It’s kind of, do we go… It’s like techno-progressivism versus techno-conservatism. Right?

Effective altruism

Lex Fridman
(02:27:54)
So e/acc as a movement is often formulated in contrast to EA, effective altruism. What do you think are the pros and cons of effective altruism? What’s interesting, insightful to you about them, and what is negative?
Guillaume Verdon
(02:28:15)
Right. I think people trying to do good from first principles is good.
Lex Fridman
(02:28:23)
We should actually say, and sorry to interrupt, we should probably say, and you can correct me if I’m wrong, that effective altruism is the kind of movement that’s trying to do good optimally, where good is probably measured as something like the amount of suffering in the world: you want to minimize it. And there are ways that that can go wrong, as any optimization can. And so, it’s interesting to explore how things can go wrong.
Guillaume Verdon
(02:28:55)
We’re both trying to do good to some extent, and we’re arguing for which loss function we should use, right?
Lex Fridman
(02:29:03)
Yes.
Guillaume Verdon
(02:29:04)
Their loss function is sort of hedons, units of hedonism. How good do you feel, and for how much time? And so, suffering would be negative hedons, and they’re trying to minimize that. But to us, it seems like that loss function has sort of spurious minima. You can start minimizing shrimp-farm pain, which seems not that productive to me. Or you can end up with wireheading, where you just either install a Neuralink or you scroll TikTok forever, and you feel good on the short-term timescale because of your neurochemistry, but on a long-term timescale, it causes decay and death, because you’re not being productive.

(02:29:54)
Whereas e/acc measures the progress of civilization not in terms of a subjective loss function like hedonism, but rather in terms of an objective measure, a quantity that cannot be gamed: physical energy. It’s very objective, and there aren’t many ways to game it. If you did it in terms of GDP or a currency, that’s pinned to a certain value that’s moving, and so that’s not a good way to measure our progress. But the thing is, we’re both trying to make progress and ensure humanity flourishes and gets to grow. We just have different loss functions and different ways of going about doing it.
Lex Fridman
(02:30:42)
Is there a degree, and maybe you can educate me, correct me, but I get a little bit skeptical when there’s an equation involved trying to reduce all of human civilization, human experience, to an equation. Is there a degree to which we should be skeptical of the tyranny of an equation, of a loss function we wish to optimize? Like having a kind of intellectual humility about optimizing over loss functions?
Guillaume Verdon
(02:31:12)
Yeah. So, this particular loss function, it’s not stiff. It’s kind of an average of averages. It’s a statement that distributions of states in the future are going to follow a certain distribution. So it’s not deterministic. We’re not on stiff rails. It’s just a statistical statement about the future. But at the end of the day, you can believe in gravity or not, but obeying it isn’t really optional. Some people try to test that, and it goes not so well. So, similarly, I think thermodynamics is there whether we like it or not, and we’re just trying to point out what is, and try to orient ourselves and chart a path forward given this fundamental truth.
Lex Fridman
(02:32:04)
But there’s still some uncertainty, there’s still a lack of information, and humans tend to fill the gap of the lack of information with narratives. And so, how they interpret… Even physics is up to interpretation when there’s uncertainty involved, and humans tend to use that to further their own ends. So, whenever there’s an equation, it just seems like, until we have a really perfect understanding of the universe, humans will do what humans do, and they’ll try to use the narrative of doing good to fool the populace into doing bad. I guess this is something that we should be skeptical about in all movements.
Guillaume Verdon
(02:32:57)
That’s right. So we invite skepticism, right?
Lex Fridman
(02:33:02)
Do you have an understanding of what, to the degree that it went wrong, may have gone wrong with effective altruism, that might also go wrong with effective accelerationism?
Guillaume Verdon
(02:33:15)
Yeah, I mean, I think it initially provided a sense of community for engineers, intellectuals, and rationalists in the early days, and it seems like the community was very healthy. But then they formed all sorts of organizations and started routing capital and having actual power. They have real power. They influence the government, they influence most AI orgs now. I mean, they’re literally controlling the board of OpenAI, and look over at Anthropic; I think they’ll have some control over that too. The assumption of e/acc, more like capitalism, is that every agent, organism, and meta-organism is going to act in its own interest, and we should maintain a sort of adversarial equilibrium, or adversarial competition, to keep each other in check at all times, at all scales. I think that, yeah, ultimately it was the perfect cover to acquire tons of power and capital, and unfortunately, sometimes that corrupts people over time.

Day in the life

Lex Fridman
(02:34:23)
Since building is important, what does a perfectly productive day in the life of Guillaume Verdon look like? How much caffeine do you consume? What’s a perfect day?
Guillaume Verdon
(02:34:39)
Okay, so I have a particular regimen. I would say my favorite days are 12:00 PM to 4:00 AM, and I would have meetings in the early afternoon, usually external meetings, some internal meetings. Because I’m CEO, I have to interface with the outside world, whether it’s customers, or investors, or interviewing potential candidates. And usually I’ll have ketones, exogenous ketones.
Lex Fridman
(02:35:12)
So, are you on a keto diet, or is this-
Guillaume Verdon
(02:35:16)
I’ve done keto before for football, and whatnot, but I like to have a meal after part of my day is done, and so I can just have extreme focus.
Lex Fridman
(02:35:31)
You do the social interactions earlier in the day without food.
Guillaume Verdon
(02:35:35)
Front-load them, yeah. Like right now I’m on ketones and a Red Bull, and it just gives you a clarity of thought that is really next level. Because when you eat, you’re actually allocating some of your energy, energy that could be going to neural energy, to your digestion. After I eat, maybe I take a break, an hour or so, an hour and a half. And usually it’s ideally one meal a day, like steak and eggs and vegetables, animal-based primarily, so fruit and meat. And then I do a second wind; usually that’s deep work, because I am a CEO, but I’m still technical. I’m contributing to most patents. And there, I’ll just stay up late into the night and work with engineers on very technical problems.
Lex Fridman
(02:36:25)
So it’s like the 9:00 PM to 4:00 AM range of time, or whatever.
Guillaume Verdon
(02:36:30)
Yeah, yeah. That’s the perfect time. The emails, the things that are on fire stop trickling in, you can focus. And then you have your second wind. And I think Demis Hassabis has a similar workday to some extent. So, I think that’s definitely inspired my workday. But yeah, I started this workday when I was at Google, and had to manage a bit of the product during the day, and have meetings, and then do technical work at night.
Lex Fridman
(02:37:00)
Exercise, sleep, those kinds of things. You said football, you used to play football?
Guillaume Verdon
(02:37:06)
Yeah, I used to play American football. I’ve done all sorts of sports growing up. And then I was into powerlifting for a while. So, when I was studying mathematics in grad school, I would just do math, lift, and take caffeine, and that was my day. It was very pure, the purest of monk modes. But it’s really interesting how in powerlifting you’re trying to cause neural adaptation by having certain driving signals, and you’re trying to engineer neuroplasticity through all sorts of supplements, and you have all sorts of brain-derived neurotrophic factors that get secreted when you lift.

(02:37:44)
So, it’s funny to me how I was trying to engineer a neural adaptation in my nervous system more broadly, not just my brain while learning mathematics. I think you can learn much faster if you really care. If you convince yourself to care a lot about what you’re learning, and you have some sort of assistance, let’s say caffeine, or some cholinergic supplement to increase neuroplasticity. I should chat with Andrew Huberman at some point. He’s the expert. But yeah, at least to me it’s like you can try to input more tokens into your brain, if you will, and you can try to increase the learning rate, so that you can learn much faster on a shorter timescale.

(02:38:30)
So, I’ve learned a lot of things. I’ve followed my curiosity. You’re naturally… If you’re passionate about what you’re doing, you’re going to learn faster, you’re going to become smarter faster. And if you follow your curiosity, you’re always going to be interested. And so, I advise people to follow their curiosity and don’t respect the boundaries of certain fields, or what you’ve been allocated in terms of lane of what you’re working on. Just go out and explore, and follow your nose, and try to acquire, and compress as much information as you can into your brain. Anything that you find interesting.
Lex Fridman
(02:39:05)
And caring about a thing. Like you said, which is interesting, it works for me really well, is tricking yourself that you care about a thing.
Guillaume Verdon
(02:39:12)
Yes.
Lex Fridman
(02:39:13)
And then you start to really care about it.
Guillaume Verdon
(02:39:15)
Yep.
Lex Fridman
(02:39:15)
So, it’s funny, the motivation is a really good catalyst for learning.
Guillaume Verdon
(02:39:22)
Right. And so, at least part of my character, as Beff Jezos is kind of like…
Lex Fridman
(02:39:29)
Yeah, hype man.
Guillaume Verdon
(02:39:30)
Yeah, but I’m hyping myself up, and then I just tweet about it. When I’m trying to get really hyped up, into an altered state of consciousness where I’m ultra-focused, in the flow, wired, trying to invent something that’s never existed, I need to get to unreal levels of excitement. But your brain has these levels of cognition that you can unlock with higher levels of adrenaline and whatnot. And I mean, I’ve learned that in powerlifting, that you can actually engineer a mental switch to increase your strength. If you can engineer a switch, maybe you have a prompt, like a certain song or some music, where suddenly you’re fully primed, then you’re at maximum strength. And I’ve engineered that switch through years of lifting. If you’re going to get under 500 pounds and it could crush you, if you don’t have that switch to be wired in, you might die. So, that’ll wake you right up. That sort of skill I’ve carried over to research: when it’s go time, when the stakes are high, somehow I just reach another level of neural performance.
Lex Fridman
(02:40:40)
So Beff Jezos is your sort of embodiment, a representation of your intellectual Hulk. It’s your productivity Hulk that you just turn on.

Identity

Guillaume Verdon
(02:40:50)
Yeah.
Lex Fridman
(02:40:50)
What have you learned about the nature of identity from having these two identities? I think it’s interesting for people, to be able to put on those two hats so explicitly.
Guillaume Verdon
(02:41:01)
I think it was interesting in the early days. In the early days, I thought it was truly compartmentalized. Like, “Oh yeah, this is a character. I’m Guillaume. Beff is just the character.” I take my thoughts and then I extrapolate them to a bit more extreme. But over time, the two identities started to merge mentally, and people were like, “No, I met you. You are Beff. You are not just Guillaume.” And I was like, “Wait, am I?” And now it’s fully merged. But even before the dox, it was already starting mentally that I am this character. It’s part of me.
Lex Fridman
(02:41:39)
Would you recommend people have an alt?
Guillaume Verdon
(02:41:42)
Absolutely.
Lex Fridman
(02:41:43)
Like young people. Would you recommend them to explore different identities by having alts? Alt accounts?
Guillaume Verdon
(02:41:49)
It’s fun. It’s like writing an essay and taking a position, right? You do this in debate. You can have experimental thoughts, and because the stakes are so low, because you’re an anon account with, I don’t know, 20 followers or something, you can experiment with your thoughts in a low-stakes environment. And I feel like we’ve lost that in the era of everything being under your main name, everything being attributable to you. People are just afraid to speak, to explore ideas that aren’t fully formed, and I feel like we’ve lost something there. So, I hope platforms like X and others really help support people trying to stay pseudonymous or anonymous, because it’s really important for people to share thoughts that aren’t fully formed and converge onto maybe hidden truths that would be hard to converge upon if it was just through open conversation with real names.
Lex Fridman
(02:42:46)
Yeah. I really believe in not radical, but rigorous empathy. It’s like really considering what it’s like to be a person of a certain viewpoint, and taking that, as a thought experiment, farther and farther and farther. And one way of doing that is an alt account. That’s a fun, interesting way to really explore what it’s like to be a person that believes a set of beliefs, and taking that across the span of several days, weeks, months. Of course, there’s always the danger of becoming that. That’s the Nietzsche, “Gaze long into the abyss; the abyss gazes into you.” You have to be careful.
Guillaume Verdon
(02:42:46)
Breaking Beff.

Advice for young people

Lex Fridman
(02:43:31)
Yeah, right. Breaking Beff. Yeah. You wake up with a shaved head one day, just like, “Who am I? What have I become?” So, you’ve mentioned quite a bit of advice already, but what advice would you give to young people, in this interesting world we’re in, about how to have a career and how to have a life they can be proud of?
Guillaume Verdon
(02:43:58)
I think to me, the reason I went to theoretical physics was that I had to learn the base of the stack that was going to stick around no matter how the technology changes. And to me, that was the foundation upon which then I later built engineering skills, and other skills. And to me, the laws of physics, it may seem like the landscape right now is changing so fast, it’s disorienting. But certain things like fundamental mathematics and physics aren’t going to change. And if you have that knowledge, and knowledge about complex systems, and adaptive systems, I think that’s going to carry you very far. And so, not everybody has to study mathematics, but I think it’s really a huge cognitive unlock to learn math, and some physics, and engineering.
Lex Fridman
(02:44:48)
Get as close to the base of the stack as possible.
Guillaume Verdon
(02:44:51)
Yeah, that’s right. Because the base of the stack doesn’t change. Everything else… Your knowledge might become not as relevant in a few years. Of course there’s a sort of transfer learning you can do, but then you have to always transfer learn, constantly.
Lex Fridman
(02:45:04)
I guess the closer you are to the base of the stack, the easier the transfer learning, the shorter the jump.
Guillaume Verdon
(02:45:10)
Right, right. And you’d be surprised, once you’ve learned concepts in many physical scenarios, how they can carry over to understanding other systems that aren’t necessarily physics. And I guess the e/acc writings, the principles and tenet posts, that was based on physics, that was kind of my experimentation with applying some of the thinking from out of [inaudible 02:45:36] thermodynamics to understanding the world around us, and it’s led to e/acc, and this movement.

Mortality

Lex Fridman
(02:45:42)
If you look at it, you’re one cog in the machine, in the capitalist machine, one human. And if you look at yourself, do you think mortality is a feature or a bug? Would you want to be immortal?
Guillaume Verdon
(02:45:57)
No, I think fundamentally, in thermodynamic dissipative adaptation, there’s the word dissipation. Dissipation is important, death is important. We have a saying in physics, physics progresses one funeral at a time.
Lex Fridman
(02:46:16)
Yeah.
Guillaume Verdon
(02:46:17)
I think the same is true for capitalism. Companies, empires, people, everything. Everything must die at some point. I think that we should probably extend our lifespan, because we need a longer period of training, because the world is more and more complex. We have more and more data to really be able to predict and understand the world. And if we have a finite window of higher neuroplasticity, then we have sort of a hard cap in how much we can understand about our world. So, I think I am for death, because again, I think it’s important. If you have a king that would never die, that would be a problem. The system wouldn’t be constantly adapting, right?

(02:47:05)
You need novelty, you need youth, you need disruption to make sure the system’s always adapting, and malleable. Otherwise, if things are immortal, if you have, let’s say corporations that are there forever, and they have the monopoly, they get calcified, they become not as optimal, not as high fitness in a changing, time varying landscape. And so, death gives space for youth and novelty to take its place. And I think it’s an important part of every system in nature. So yeah, I am for death, but I do think that longer lifespan, and longer time for neuroplasticity, bigger brains should be something we should strive for.
Lex Fridman
(02:47:52)
Well, on that, Jeff Bezos and Beff Jezos agree that all companies die. And for Jeff, the goal is to try to, he calls it day one thinking, to constantly, for as long as possible, reinvent, sort of extend the life of the company. But eventually it too will die, because it’s so difficult to keep reinventing. Are you afraid of your own death?
Guillaume Verdon
(02:48:23)
I think I have ideas and things I’d like to achieve in this world before I have to go, but I don’t think I’m necessarily afraid of death.
Lex Fridman
(02:48:34)
So you’re not attached to this particular body, and mind that you got?
Guillaume Verdon
(02:48:38)
No, I’m sure there’s going to be better versions of myself in the future, or…
Lex Fridman
(02:48:46)
Forks?
Guillaume Verdon
(02:48:47)
Forks, right? Genetic forks, or other, right? I truly believe that. I think there’s a sort of evolutionary-like algorithm happening at every bit, or [inaudible 02:49:03] in the world is sort of adapting through this process that we described in e/acc. And I think maintaining this adaptation malleability is how we have constant optimization of the whole machine. And so, I don’t think I’m particularly an optimum that needs to stick around forever. I think there’s going to be greater optima in many ways.

Meaning of life

Lex Fridman
(02:49:25)
What do you think is the meaning of it all? What’s the why of the machine? The e/acc machine?
Guillaume Verdon
(02:49:32)
The why? Well, the why is thermodynamics. It’s why we’re here. It’s what has led to the formation of life, and of civilization, of evolution of technologies, and growth of civilization. But why do we have thermodynamics? Why do we have our particular universe? Why do we have these particular hyper-parameters, the constants of nature? Well then you get into the anthropic principle, and the landscape of potential universes, right? We’re in the universe that allows for life. And then, why? Are there potentially many universes? I don’t know. I don’t know that part. But could we potentially engineer new universes, or create pocket universes, and set the hyper-parameters so there is some mutual information between our existence in that universe, and we’d be somewhat its parents? I think that’s really… I don’t know, that’d be very poetic. It’s purely conjecture. But again, this is why figuring out quantum gravity would allow us to understand if we can do that.
Lex Fridman
(02:50:39)
And above that, why does it all seem so beautiful and exciting? The quest to figure out quantum gravity seems so exciting. Why? Why is that? Why are we drawn to that? Why are we pulled towards that? Just that puzzle solving creative force that underpins all of it, it seems like.
Guillaume Verdon
(02:51:01)
I think we seek, just like an LLM seeks to minimize cross entropy between its internal model and the world, we seek to minimize… yeah, the statistical divergence between our predictions of the world and the world itself. And having regimes of energy scales, or physical scales in which we have no visibility, no ability to predict, or perceive, that’s kind of an insult to us. And we want to be able to understand the world better in order to best steer it, or steer us through it.

(02:51:37)
And in general, it’s a capability that has evolved because the better you can predict the world, the better you can capture utility, or free energy towards your own sustenance and growth. And I think quantum gravity, again, is kind of the final boss, in terms of knowledge acquisition, because once we’ve mastered that, then we can do a lot, potentially. But between here and there, I think there’s a lot to learn in the meso scales. There’s a lot of information to acquire about our world, and a lot of engineering perception, prediction, and control to be done, to climb up the Kardashev scale. And to us, that’s the great challenge of our times.
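For reference, the cross-entropy an LLM minimizes and the statistical divergence Verdon gestures at here are two views of the same objective. Writing p for the true distribution of the data (the world) and q_θ for the model’s predicted distribution, symbols introduced purely for illustration, the standard decomposition is

\[
H(p, q_\theta) = H(p) + D_{\mathrm{KL}}\!\left(p \,\|\, q_\theta\right),
\]

so minimizing cross-entropy over the model parameters θ is equivalent to minimizing the KL divergence between the world and the model’s predictions, since the entropy H(p) of the world does not depend on θ.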
Lex Fridman
(02:52:22)
And when you’re not sure where to go, let the meme pave the way.
Guillaume Verdon
(02:52:26)
That’s right.
Lex Fridman
(02:52:27)
Guillaume, Beff, thank you for talking today. Thank you for the work you’re doing. Thank you for the humor, and the wisdom you put into the world. This was awesome.
Guillaume Verdon
(02:52:37)
Thank you so much for having me, Lex. It’s a pleasure.
Lex Fridman
(02:52:40)
Thank you for listening to this conversation with Guillaume Verdon. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Albert Einstein. “If at first the idea is not absurd, then there is no hope for it.” Thank you for listening. I hope to see you next time.

Transcript for Teddy Atlas: Mike Tyson, Cus D’Amato, Boxing, Loyalty, Fear & Greatness | Lex Fridman Podcast #406

This is a transcript of Lex Fridman Podcast #406 with Teddy Atlas.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
That’s all that matters, that he got there, that he got to the place to act like a fighter. To do what we want him to do, to be ready to persevere, to go beyond the comfort level, to do another round. He didn’t want to. Damn right he didn’t want to, but he knew we wanted him to. And he knew in order to pass the test, he had to do it. He goes, “Now, it’s going to be your job to get him in the gym, make him mentally stronger, make him face things, and teach him how to slip punches and create holes, and fill those freaking holes with devastating punches…” This is Cus, “… with punches with bad intentions.”

(00:00:40)
The following is a conversation with Teddy Atlas, a legendary and, at times, controversial boxing trainer and commentator. When I was going into this conversation with Teddy, I was ready to talk boxing, styles, matches, techniques, tactics, and his analysis of individual fighters, like Mike Tyson, Michael Moorer, Klitschkos, Usyk, Povetkin, Lomachenko, Triple G, Canelo, Muhammad Ali, Sugar Ray Leonard, Hagler, Duran, Floyd, and on and on and on. Like I said, I came ready to talk boxing, but I stayed for something even bigger, the Shakespearian human story of Teddy Atlas, Cus D’Amato, and Mike Tyson.

(00:01:23)
It’s a story about loyalty, betrayal, fear, and greatness. It’s a story where nobody is perfect and everybody is human. To summarize, in the early ’80s, young trainer, Teddy Atlas, worked with his mentor, Cus D’Amato, in training the young boxing protégé, now a boxing legend, Mike Tyson. Mike was a troubled youth, arrested over 40 times, and at age 15, he was sexually inappropriate with Teddy’s 11-year-old niece.

(00:01:55)
In response to this, Teddy put a .38 caliber handgun to Tyson’s ear and told him to never touch his family again or he would kill him if he did. For this, Cus D’Amato kicked Teddy out. Why? Well, that’s complicated. In part, I think, to help minimize the chance that Mike Tyson, whom Cus legally adopted, would be taken away by the state, and with him the dream of developing one of the greatest boxers of all time.

(00:02:24)
Of course, that summary doesn’t capture the full complexity of human nature and human drama involved here. For that, you have to listen to this conversation, the things said and the things left unsaid. The pain in Teddy’s voice, the contradictions of love and anger that permeate his stories and his philosophy on life. Like I said, I came to talk about boxing and stayed to talk about life.

(00:02:52)
This conversation will stay with me for a long time. The people close to you, the people you trust, the people you love, are everything. And if they betray you and break your heart, forgive them, forgive yourself and try again. Happy holidays, everyone. I love you all.

(00:03:15)
This is a Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Teddy Atlas.

Lessons from father


Lex Fridman
(00:03:26)
You wrote in the book that your father had a big influence on your life. What lessons have you learned about life from your father?
Teddy Atlas
(00:03:34)
When you ask that question, I remember Cus D’Amato, when I was with him up in Catskill for all those years. He used to say to me, “Teddy, you learned through osmosis.” I believe there’s truth to that, if I know what osmosis is, but it sounds good. But I learned through osmosis with my father. He wasn’t a big talker. He was a doer. And when you’re around someone who lives a certain kind of life and does certain things, it penetrates.
Lex Fridman
(00:04:10)
He was a doctor.
Teddy Atlas
(00:04:11)
I’m going to sound like an idiot right now, because I’m being a son, but he was the greatest diagnostic doctor. I mean, if I say, I ever knew, what does that mean? You know what I mean? Are you a doctor? You know what I mean? What does that mean? But, other people have told me this, just legendary stories.
Lex Fridman
(00:04:33)
He would do house calls and he’d help people, and like you said, a lot of people have spoken about the impact he’s had on their life.
Teddy Atlas
(00:04:38)
He built two hospitals, and he built a hospital before the Verrazano Bridge in New York, connecting Brooklyn to Staten Island. And he built it so people could get proper hospital care that couldn’t afford it, period. And everybody looked at him as eccentric.
Lex Fridman
(00:04:58)
Yeah, nice. [inaudible 00:04:58].
Teddy Atlas
(00:04:59)
Yeah, because, he would literally sneak patients, not sneak them in, he was Dr. Atlas, he could do what he wanted, to a certain extent. But he would bring patients in without administering, putting through administration, so there was no charge, because they didn’t have anything. They were street people. I remember being… My only way to be with my father was to go on house calls or to go to the office. There was no…

(00:05:27)
And so I went on house calls with him. And he did house calls, by the way, till he was 80, and $3. I mean, it was better than McDonald’s, you know what I mean? I mean, the deal, $3 and you got medicine, you got everything. But he used to, right around the holidays, there was just certain things that I didn’t understand, but I understood later, where we would just drive certain areas and he just, all of a sudden, open the door, he would pick up these… and-
Lex Fridman
(00:05:58)
Help them.
Teddy Atlas
(00:05:59)
… I’m 10 years old, “Move over.” Move over, you know?
Lex Fridman
(00:06:02)
Mm-hmm. It was just you, him, and a homeless guy.
Teddy Atlas
(00:06:06)
A couple.
Lex Fridman
(00:06:06)
Yeah, a couple.
Teddy Atlas
(00:06:07)
Yeah, whatever he could fit in, three, four, whatever it was.
Lex Fridman
(00:06:11)
That’s a big heart.
Teddy Atlas
(00:06:12)
And then he took them to the hospital, dropped them off. I would ask questions after it was all over with. I’d say, “Dad, they’re sick.” He goes, “Well, not in a way.” “Why did you put them in the hospital?” So he said, “Yeah.” And he’d tried to explain things to me. He would try, he didn’t talk much unless you’d ask him something, and that works. And don’t talk unless someone asks you something. And he explained to me that, he said… I said, “Well, why are you putting them in the hospital?”

(00:06:43)
And, of course, their sickness was, they were alcoholics. “But why do you put them…?” It wasn’t an alcohol rehab, so why are you putting… And it wasn’t for the purpose to dry out. He wasn’t trying to cure them. Let’s put that before we anoint him for sainthood, by Teddy Atlas. I was like, we finally get to the point, “Why do you put them in there?” “Well, because it’s the holidays.”

(00:07:07)
“All right, why do you put them in there?” “Well, the holidays are good for certain people and bad for others.” And it was always before the holidays. It was before Christmas, it was before whatever, New Year’s, whatever. So I said, “Why?” And he said, “Because they remind people, certain people, of what they don’t have. Other people enjoy the holidays because of what they have, family, whatever, and it reminds them, their mind is that.”
Lex Fridman
(00:07:46)
That’s pretty profound.
Teddy Atlas
(00:07:47)
Yeah. And then, I don’t remember, because he didn’t use the word suicide, but I got it. He basically, I forget how he said it, but I just got it. I don’t know how I… I suppose, I don’t know, but I just got it. So they don’t hurt themselves. That’s what came across-
Lex Fridman
(00:08:03)
In every way.
Teddy Atlas
(00:08:04)
I don’t think he ever articulated that or ever verbalized that. But, yeah, they don’t hurt themselves. Well, how does that work? Well, it just basically they’re going to be around people. They’re not going to be alone. They’re going to be around people. They’re going to get fed, they’re going to be warm, right, and it’s going to be for three days, two, three days, whatever. And basically, it’s a bridge. So the funny thing, as a 10-year-old, I want to be connected to him, so I enlisted myself in the job.

(00:08:36)
When he used to drop them off, he would take them, get them in, right? And then the thing that I know, again, he didn’t say nothing, but you notice things. And if you care enough, you don’t notice nothing if you don’t care. But if you can, if it’s important, you notice. And this guy was important to me. I just was, I didn’t know what a hero was, no clue. I loved Mickey Mantle, I loved Willie Mays, I love Muhammad Ali. I never, ever connected them in my mind as heroes. Never. My father, I didn’t connect it that way, but he-
Lex Fridman
(00:09:21)
Looking back now [inaudible 00:09:22]-
Teddy Atlas
(00:09:22)
Looking back, he was my first connection to a hero.
Lex Fridman
(00:09:25)
The two of you ever talk about how much you love each other? The word love?
Teddy Atlas
(00:09:29)
One thing that was not allowed. The greatest memory I have, my father showing me love was, we were down in Florida at an airport and we were… I was born in Miami. Don’t ask where I was passing through. And the rest of my family’s born in New York, Staten Island. And so I was supposed to go back with him and I wanted to stay with my mother, for whatever reason. And so he, of course, conceded to it. Okay, whatever. And very quiet, very… And here’s a man who never showed emotion to anyone. I mean, for the most, you know… Well-
Lex Fridman
(00:10:11)
Yeah.
Teddy Atlas
(00:10:12)
… all of a sudden, he just turned and kissed me on the forehead and left. And I was like, “That’s different.”
Lex Fridman
(00:10:23)
Yeah. You still remember that, huh?
Teddy Atlas
(00:10:26)
Yeah. Like, “That’s weird.”
Lex Fridman
(00:10:28)
You lost him 30 years ago? How did that change you?
Teddy Atlas
(00:10:39)
It made me realize that some of the deals I used to make for God weren’t realistic. When I was a kid, I used to make deals with God. “Let me die before my father.” And then you get older and you have kids, you’re blessed, why did you make that deal? You know what I mean? Thank you for not taking me up on it.
Lex Fridman
(00:11:07)
Yeah.
Teddy Atlas
(00:11:08)
Right.
Lex Fridman
(00:11:09)
Yeah.
Teddy Atlas
(00:11:13)
You know?
Lex Fridman
(00:11:14)
Yeah. You miss him?
Teddy Atlas
(00:11:15)
I miss him in moments when I’d like to know what to do. And I remember when I would drive with him on the house calls, he didn’t listen to music. He was a guy, he read books to his… When he got older, he read books. Two blood vessels broke in his eyes. He only read nonfiction books, science. He loved science, wars, generals. I mean, I cheated on a couple book reports, because I didn’t do the reading of the book the night before I had a freaking book report to put in. “Dad, I got a book report to do on the War of Stalingrad.” Really? The War of Stalingrad. And who the freak could tell you where you get an A? I got an A. I just wrote what he told me. He told me generals, he told me times, he told me strategy. He told me about the winter that came and destroyed the Germans, and the Soviets were tougher-
Lex Fridman
(00:11:15)
You got an A.
Teddy Atlas
(00:12:20)
… and the Soviets were tougher than the Germans, and the Germans picked on the wrong opponent. I was already in the boxing business. I didn’t even know it. I didn’t even know it. Matchmaking, very important. They mismatched. They made a mistake with picking the opponent. And so, when we would be driving in the car, my father would be in a trance. And dad, he wasn’t ignoring me at all, he was just with his thoughts. He was wherever. He wasn’t even hearing the radio no more.

(00:12:59)
I always wondered where he was. I did. So I asked him one day. And just, so we’re driving, I said, I want to know so I said, “Dad, what do you think when you’re basically in this place, that I know you’re somewhere? Where are you? What are you seeing?” I actually said, “What do you see?” And he said to me, “I see what could be. I see what could be.” And I’m like, “Oh, all right.”

Scar story

Lex Fridman
(00:13:32)
I got to ask you, when did you discover boxing? When did you first fall in love with boxing?
Teddy Atlas
(00:13:39)
When it saved me.
Lex Fridman
(00:13:41)
How did it save you?
Teddy Atlas
(00:13:44)
I was a stupid, violent kid that was angry. Didn’t exactly know why I was angry. I’d fit in real good in today’s society, because there’s a lot of angry kids out there that I don’t think they know why they’re angry. I was just out there getting in fights and I got this stupid thing from that.
Lex Fridman
(00:14:05)
Can you tell the story of how you got that?
Teddy Atlas
(00:14:07)
I was just running around doing stupid things, bad things. I hurt people, some people physically, but I hurt my family. That’s BS, you only hurt yourself. That’s a good way of alibi-ing it. But, at some point, the truth usually finds its way. I’d like it to look like I was just hurting myself, but I wasn’t, obviously. So I was just out on the streets, with kids that didn’t grow up in the neighborhood I grew up in. I grew up in a neighborhood where my father was a doctor. And I walked down the street…

(00:14:48)
The funny thing was, down the hill was a very tough neighborhood called Stapleton. And most of the people down there on the corners wished they could get up the hill, and I wished I could get down the hill. So I went down the hill and I hung out with all these friends that became lifelong friends. I gravitated to that, because I figured out later a little bit, but I wanted family. We were destroying the family. My father was a doctor, he didn’t have time for nothing but being a doctor.

(00:15:26)
I think when you create something, you sacrifice something, too. When you’re really great at something, so great that maybe God made you great and you’re too great for your own good. And then, I don’t know, it took me to these stupid, dangerous places. Dangerous for me, but dangerous for other people, too. Because, I got to the point where I was doing robberies on the street, I was fighting everybody.

(00:15:49)
And you know what the most dangerous part about it was? And I came to this realization on my own. I’m all by myself. I figured out, I was really as dangerous… These kids from the project, some of them, they got nothing. First of all, I learned you don’t have to be poor to be poor. You don’t have to be deprived of certain things to be deprived, at least to think you’re deprived. And I was poor in a way that I didn’t have the only thing I wanted to have, him.

(00:16:28)
So here I am where I’m out there doing these things, and what made me more, I was more dangerous than some of these psychopaths. Well, I was a psychopath, too, I guess, the way I was behaving. But some of these psychopaths that really had nothing, really, they obviously would kill you. I was dangerous almost in the same way, but for a different reason. I know it’s ridiculous what I’m about to tell you, but I figured it out, because I felt it. I thought I was on a righteous path. I thought I had a right because it was going to get me my father back.

(00:17:11)
Why? Why? I mean, you’re a scientist, you couldn’t figure this one out. Because all the people that had him were injured people, fractured people, screwed up people in some ways, but hurt, damaged people. So if I get damaged, I’ll get him. So I was on a crusade, really, a righteous crusade where I thought it was okay. I had permission. I had permission to do these terrible things, quite frankly, and to fight everyone I wanted to. And then it came almost to a crash of doing all that, winding up in Rikers Island like an idiot, not understanding the damage I did to this poor man, that he was a great doctor and he’s got to see his son and hear about, you know what I mean?

(00:18:14)
God, I was out on that day with the guys that I grew up with now, the guys from the projects, as I described, and I was with one of them who, he’s dead now. I was with him and we were in a neighborhood, the neighborhood we grew up, that I hung out, and he grew up in. Billy, he came from the project. And we got into a thing where we cut, somebody cut us off, we cut them off, jumped out to fight. Turned out there’s five or six of them and two of us. And we fought, right on the side, right there, only about a block from where I used to hang out, and maybe a block and a half.

(00:19:06)
And right in front of this Spanish bodega. It really does happen in slow motion. I actually saw the guy, I was fighting the guys that I had to fight. And then, all of a sudden, I was able to get one guy out of the way a little bit. And I really, I noticed the guy go into his pocket and I knew why he was going in his pocket. When he came out of his pocket, I knew what it was right away. It was weird, because in the neighborhood, guys used to hang out, they were into this… They get into fads right on the streets. And at that time, they went into this cheap knife, but they thought it was, well, we thought it was cool. It was a 007.

(00:19:51)
And the cool thing, whatever, was that you could flick it, you could learn. And I learned how to flick, but I never carried a knife. But, my friends would have it. I would just, you learn how you could flick it open, not a switchblade, but flick it with your wrist. And I was like, here I am in the middle of this freaking fight, and all of a sudden, “Oh, this is a 007.” And so I’m like, you got to make a decision. And I got a split second, I can either not do nothing, which didn’t seem like a great option. I couldn’t run away.
Lex Fridman
(00:20:36)
Why not?
Teddy Atlas
(00:20:39)
Because you got to live with yourself afterwards. And that’s more difficult to live with than whatever it is at that second, because that don’t go away.
Lex Fridman
(00:20:47)
You couldn’t live with yourself-
Teddy Atlas
(00:20:49)
It just-
Lex Fridman
(00:20:49)
… running away.
Teddy Atlas
(00:20:49)
… It just don’t go away. That thing, nothing to do with being brave. It has nothing to do with being brave, really. It’s got to do with just common sense in life. That, for me, whatever you’re dealing with, it’s over, it’s done. Like, okay, deal with it, good or bad, whatever. But, you do that, that other thing, you can’t, that never ends. This thing ends.
Lex Fridman
(00:21:19)
Memory of you being, let’s say, a coward in that moment, that never ends.
Teddy Atlas
(00:21:24)
The only thing I had at that point in my life, in my stupid mind, was a reputation that I would stand up to certain things. That was like, and that for me was worth something, whatever, because I didn’t feel any worth to anything else. That was the only thing I felt a connection of worth to.
Lex Fridman
(00:21:47)
Stood your ground and got cut.
Teddy Atlas
(00:21:49)
No, I made a decision. I stood my ground, but I actually, things do slow down. They do. And I actually said, “It’s a 007, he’s got to flick it.” And I didn’t say no, but he’s got to flick it. I get a split second, like I said, either I do nothing, whatever, or I get to him before he gets it flicked. I went to get to him before he got flicked. And I, just as I got close to, I did him a favor. I walked right into a counterpunch, because I cooperated with him. I went right to him. And just as I… He practiced more than I did with the 007 apparently. Because he was like, whomp, whomp, whomp, whomp. Anyway.
Lex Fridman
(00:22:41)
What did you think? What did you think when that happened? That was all slow motion. Did you think you might die?
Teddy Atlas
(00:22:49)
Yeah. Well, not immediately. Took me a minute. I’m a slow learner. I put my hand up. Right? Wouldn’t you? I guess so.
Lex Fridman
(00:22:59)
Yeah, mm-hmm.
Teddy Atlas
(00:22:59)
And it went into my face and that was it. It was gooey. It was warm and gooey. And I was like, “I don’t know what this means, but I don’t want to know, but I think I know.” And…
Lex Fridman
(00:23:21)
Did you think about your dad in that moment?
Teddy Atlas
(00:23:24)
No. You know what I thought about him was, you don’t know who anyone is until they’re tested. I learned that. Cus used to tell me, but I learned it. He said, I remember one time Cus, because I was a 17, 18-year-old kid up there, thought I was, whatever I thought I was, and he said, “You got a lot of friends.” And I said, “Yeah.” Because I was on the street, hanging out with a hundred kids at night, sometimes on the street corner. So I was like, I don’t know too many people that hung out with a hundred kids on the street, on a corner, on a Friday, Saturday night.

(00:24:02)
And I was like, “Yeah, I got a lot of friends.” He goes, “Really?” I said, “Yeah, really.” He said, “How about if I told you you might not have any. Most likely you don’t have any.” And he goes, and then he just started this thing. He said, “Everyone’s going to be tested, you, me, everyone, because you don’t know about nobody until they’re tested.” He goes, “You know nothing.” He goes, “You know nothing until you know. Until something happens to test if they were really your friend.” And he told me this story about a guy.

(00:24:40)
A guy came to him and he was upset. “What are you upset about?” He goes, “I’m upset because I just lost a friend. After 20 years of friendship, we’re not friends no more.” So Cus looks at him, he goes, “Let me ask you a question. What made you think you were ever friends with him?” Now the guy gets insulted to Cus. “Did you hear me?” He goes, “I just told you 20 years I’ve been friends with this guy. Why would you say that to me?”

(00:25:11)
He said, “Well, I’ll say it again. What makes you think he was your friend?” He goes, “Whatever happened in the 20 years, other than chasing girls,” because Cus figured that went out fast, “… chasing girls and drinking together, and whatever else you’re doing out on the street, whatever gave you the inclination that he was a friend?” He goes, “Whatever, when did he risk himself to be your friend? When was it dangerous to be your friend?”
Lex Fridman
(00:25:44)
When was the friendship tested?
Teddy Atlas
(00:25:45)
“When was it uncomfortable to be your friend?” And you know what the guy said? You can figure it out, you’re a scientist. He said, “Today.” And today came for me. And today, today, today, today, kept coming for me. Today. And that day, my friend Billy, it turned out, while I was fighting these, whatever, five, six guys, where was Billy? He was on the roof. He was on the roof. He was on the roof. He was my best friend.

(00:26:29)
So anyway, they take me to the hospital. And here’s the thing with my father. But one thing Billy did do for me when he got off the roof, thank God, he did, he dragged me into this bodega, laid me on the floor, and started putting towels. And the towels, I vaguely remember this, they filled up with blood. I mean completely drenched, like you put them under a shower. And I heard the bodega owner screaming, screaming like… whatever. And everyone’s screaming and there’s chaos, and I’m like, I don’t know, I’m calm. Weird, I’m real calm. I’m just in this place, things calm.

(00:27:26)
And all of a sudden I hear Billy, he’s screaming, “Call the ambulance, call the…,” and nobody’s doing nothing, everyone’s frozen. I’m starting to understand already people get frozen in situations. People, the fear, fear, fear, fear, fear just paralyzes people. And I was going into a fear business. I was learning. I was learning. I was getting a learning, an early PhD-
Lex Fridman
(00:27:58)
Living in fear.
Teddy Atlas
(00:27:58)
Yeah.
Lex Fridman
(00:27:59)
Yeah.
Teddy Atlas
(00:27:59)
And, all of a sudden, genius, Billy genius, really, street kid. He jumps up on the freaking counter, jumps over the counter, grabs the phone, calls 911, says a cop’s been shot, and forget about it. It was crazy. All I remember after that, I’ll tell you the couple things I remember, lights, being put onto a stretcher, bounced around, rushed. I felt everyone’s anxiety, except mine. I had none. But I felt everyone’s anxiety, everyone’s fear, like was all around me. It was like, “Wow, this is interesting. It’s kind of…” I know that’s stupid, but, “Wow, this is interesting.”
Lex Fridman
(00:28:45)
You really have an eye for fear. That’s fascinating. You’re really studying it.
Teddy Atlas
(00:28:49)
Well, I had no choice, I got introduced in a crash course. And they put me in ambulance, and this is what I remember to your point, I’m sorry it took so long to get to it. I am, although I’ll probably do it again before this conversation’s over. But I-
Lex Fridman
(00:29:05)
It’s all about the journey.
Teddy Atlas
(00:29:06)
Yeah. We’ll get there. We’ll get there, pops. So I hear the cops say, “We might lose him.” And I’m laughing to myself, I’m not laughing, because I’m not, again, I’m not John Wayne. John Wayne would’ve laughed, but I’m like, “Lose? You guys are stupid.” I didn’t say that, but I’m like, ” Lose me? My father’s the greatest doctor in the freaking world. There’s nothing to worry about. You people are all uptight and whacked out here with fear, and there’s nothing to worry about. Dr. Atlas is my father.”

(00:29:48)
So anyway, so they’re taking me to the… And he said, “We don’t have time.” I hear, couple things I remember, “Don’t have time. Take him to…” and they take me to US Public Health Hospital. Marine Hospital, it was called at the time, but US Public Health. And it’s in Stapleton, so it’s close, thank God. So they’re taking me, and I hear them on the radio saying this stuff about, “We got to move. We got to move.” I start talking and they’re telling me, “Don’t talk.” But I like to talk a lot. And I’m… Again, fear. There’s no fear when the fear’s been removed.

(00:30:35)
It’s the only time you’re really free in life. And I know that sounds absurd, but really, it is. It’s the only time you’re really free in life. When you’re-
Lex Fridman
(00:30:46)
Close to death?
Teddy Atlas
(00:30:47)
… when you’re devoid of things that normally hold you back, that normally influence you in ways that are, not of the influence that, always positive influence where you are in a pure place, where you’re in a purely free place from all inhibitions, from fear, from anxiety, from joy. Joy can screw you up, and you’re free from all these things. And I’m in this place, just [inaudible 00:31:18]-
Lex Fridman
(00:31:18)
In the back of an ambulance, you’re free.
Teddy Atlas
(00:31:19)
Yeah. I said, “Just get me Dr. Atlas.” And they say, “We don’t have time.” “No, no, no, no, no, you don’t… You have to get Dr. Atlas. You have to get him.” This was the… Damn it, this was the… You know what I mean? I finally freaking hit the number and I’m not getting paid. And then, all of a sudden, I’m out.
Lex Fridman
(00:31:39)
How many stitches?
Teddy Atlas
(00:31:40)
They… Well, I think it was 400, 200 inside, 200 outside, or whatever it was.
Lex Fridman
(00:31:45)
It’s a lot.
Teddy Atlas
(00:31:45)
Hey, look, after 50, the number doesn’t matter no more. Whatever, 60, 70, 80, 90, whatever. So I was fortunate, I was fortunate. And, of course, I was fortunate, they told me afterwards, that it missed my jugular, literally by a centimeter. I mean, whatever. Otherwise we wouldn’t be having this conversation, obviously.
Lex Fridman
(00:32:10)
I’m glad you made it.
Teddy Atlas
(00:32:11)
Yeah-
Lex Fridman
(00:32:11)
That’s another thing.
Teddy Atlas
(00:32:12)
… I’m glad, too. And it just missed my eye, which, thank God. It’s bad enough I have a scar, imagine me with a patch? I mean, it’s enough that I got this freaking thing. And look, it goes all the way. I mean, it’s pretty long. I don’t know, I was out. And then somehow, I sensed, they had the curtain closed, and it’s amazing how vivid this is. And the curtain’s closed and I see a shadow. I felt a presence. I did, and I felt him. He’s a powerful guy. And I felt him and I just see a shadow, you know? And, all of a sudden, the curtain gets pushed-
Teddy Atlas
(00:33:00)
And all of a sudden the curtain gets pushed back. And I can’t really see. It’s dark and I’m out of it, but not completely out of it. And pushes the curtain back, comes in, and his hand, even though it’s all bandaged, whatever, but his hand surveys. It felt safe and it felt warm and safe. I was happy. And he got there.
Lex Fridman
(00:33:37)
Did he say something?
Teddy Atlas
(00:33:38)
Yeah, yeah. Remember, I gave you a little bit of introduction to my father, right? You know him now a little bit, right?
Lex Fridman
(00:33:44)
Yeah, yeah. What’d he say about the job?
Teddy Atlas
(00:33:47)
This is what he said. I remember to this day what he said. That I do remember. I don’t know if it was six or five people, but this I do remember. He said, “They did a good job. You’re going to have a scar the rest of your life.” And he left.

Cus D’Amato

Lex Fridman
(00:34:05)
Oh, man. They did a good job. You mentioned Cus D’Amato, legendary trainer, and you also mentioned it turned out he really cared about you. In the book, you write about a testimony he gave. I was hoping I could read it because it speaks to your character. It speaks to his. It’s just powerful.

(00:34:28)
The testimony goes, ” Your Honor, I realize you might not know much about me, but I spent my whole life developing young men. As a boxing manager I trained two world champions, heavyweight champion Floyd Patterson and light heavyweight champion Jose Torres. I’ve also helped a lot of other young boys straighten out their lives and build character. I know things about Teddy Atlas this court doesn’t know, things you won’t find on his arrest record. This boy has character. He has loyalty. He’ll hurt himself before he’ll let down a friend. These qualities are rare and they shouldn’t be lost. He’s made mistakes. We’ve all made mistakes, but I’ve come to know this boy, and if we lose him, we’ll be losing someone who could help a lot of people. Please don’t take this young boy’s future away. He could be someone special. Let’s not lose him. Please.” Those are powerful words from a powerful man. What have you learned about life from Mr. Cus D’Amato?
Teddy Atlas
(00:35:41)
He gave me a quote that he drilled into my head. I became his guy. He loved me. I loved him. He said to me, “Teddy, no matter what a man says, it’s what he does in the end that he intended to do all along.” That’s what I learned from Cus. The rest of it is BS. And a lot of people say things. You just have to give them a minute to let them show you eventually what they really meant by it.

(00:36:26)
I also learned from him that everyone’s afraid. Cus, his way of saying it, another great saying, you’ll get a kick out of this, “Anyone who’s in a situation where fear should be prevalent, where fear is actually necessary to survive the situation, anyone who says that they’re not afraid, they’re one of two things. They’re either a liar or they should go to a doctor, find out what the frick’s wrong with them.” He was right about that. We live in a taboo society where that word, to a certain extent, is taboo because it invokes weakness. We are just layers of what we saw and learned since we were kids. We all are. We’re products of those layers. I learned that on my own and through some help.

(00:37:32)
At the end of the day, fear, people will find their way of avoiding that term. So they use the word anxiety, they use the word butterflies, apprehension, a million different words. I find all those other words to be cousins of fear. And fear causes a lot of things in life. It causes a lot of problems and it also solves a lot of problems. Without it, we couldn’t be great if we are great, if we ever have a chance to be great or at least to aspire to be great.
Lex Fridman
(00:38:22)
How does fear connect to greatness? That’s a profound statement. Without fear, we wouldn’t be able to be great.
Teddy Atlas
(00:38:32)
Yeah, you couldn’t be great without fear because fear allows you to be brave. The most important word for me in this whole conversation, or in the right neighborhood, would be selfishness, and it allows you to be, for a moment, less selfish. One of the things I learned, I guess partly on my own… Everyone thinks my greatest teacher was Cus. He was a great teacher, mentor. My greatest teacher was my father, the one who never talked. And I realized one of the things to be better, towards great, is if you can submit less than we submit. See, one of the things that I’m afraid of, one of the things, I was always quitting. In my business, it’s not a good thing.
Lex Fridman
(00:39:29)
Every business, I think. Yours is just more clear.
Teddy Atlas
(00:39:35)
Yeah. It hurts more.
Lex Fridman
(00:39:39)
True. In the moment, at least.
Teddy Atlas
(00:39:42)
Yeah, in the moment. You’re right, 100%, because some things hurt for a long time afterwards. And something like regret. Regret is the worst thing in the world because it’s a solitary sentence. And man, oh, man-
Lex Fridman
(00:39:58)
That’s a powerful phrase, regret is a solitary sentence. Oh, boy.
Teddy Atlas
(00:40:02)
So, I-
Lex Fridman
(00:40:03)
You’re full of good lines.
Teddy Atlas
(00:40:07)
It wasn’t easy to accumulate them.
Lex Fridman
(00:40:13)
Yeah. Hard run.
Teddy Atlas
(00:40:15)
It was a little bit hurtful. So submit less, because we submit every day, and if we can get to a place where we submit or compromise ourselves less, we’ll get to a better place. Again, one of the words for me that attaches to things that wind up hurting you in life and have hurt me in life, one of those boogeymen words is the word of convenience. That’s attached to everything. People disappoint you not because they want to disappoint you or let you down or betray you, because they want to betray you. They do it because it’s more convenient to do than the other thing.

(00:41:06)
An old man once told me, he said to me… I was trying to rationalize something. I was trying to make an excuse for something. I was trying to make myself better than I was. I was trying to say it was okay. And he just looked at me, and he liked me, and he said, “Teddy, there ain’t no such thing as being a little pregnant.” I was like, “Yeah.” He goes, “Either you’re pregnant or you’re not pregnant. Either you’re real or you’re not real. Either you’re truthful or you’re not truthful. Either you’re tough or you’re not tough. Either you’re committed or you’re not committed. Either you’re in or you’re out.”
Lex Fridman
(00:41:56)
That applies to a lot of things, including loyalty.
Teddy Atlas
(00:42:00)
That’s quite a statement. But the lifeblood of humanity for me is loyalty. It’s what goes through the veins of… Everything has to have some veins in some form. And if humanity has veins, what runs through the veins of humanity instead of blood to keep it alive is loyalty.
Lex Fridman
(00:42:18)
Those are powerful words.
Teddy Atlas
(00:42:20)
Without loyalty, we’re dead, we’re vessels. I never understood what a ghost ship was. You know what? As I got older, I know what a ghost ship is. It’s people. It’s people that are empty. They got no loyalty, therefore they got no humanity. Therefore, they got nothing. Therefore, frick them. Frick them. And you know why they don’t have loyalty? Convenience. And you know why? Because it’s hard to be loyal. It’s actually hard. I’ll be a son of a gun. “Yeah. Yeah, it sounds great. Give it to me. Give it to me. Paint me with it. Yeah, it’s great. Yeah, I’m loyal. Yeah, I’m great. Yeah, this is good. I’m ready. I’m on that team. I’m ready. Put me in, Coach. I’m ready.”

(00:43:09)
“Okay. Now, you’re going to have to get hurt here.” “What do you mean, get hurt?” “Oh, well, it’s going to be painful. I mean, to be loyal, you’re going to be in danger because the person that you committed your loyalty to, for a reason, because obviously you did something in your life, whatever, whatever, you’re actually going to get hurt to be loyal to them. You’re actually going to…” “Hold on a minute. Wait. Hold on a minute, Coach. Hold on. Call time out here. Let me think about this, Coach. I might need more practice. I’m not ready for the game. I’m not ready to go in the game yet. Give me a little more practice, Coach.” It hurts to be loyal. It fricking hurts. But without loyalty, we’re ghost ships. We got no strength. We got nothing. We got nothing. We got nothing.

Mike Tyson

Lex Fridman
(00:44:05)
I agree with you in a deep fundamental sense, but there’s pain that comes with that. I have to ask you to introspect on this part of your life. Because of your value for loyalty, as people know, you and Cus D’Amato trained young Mike Tyson, and the interaction there between the three of you led to the three of you parting ways. Given your value for loyalty, can you tell the full story of what led up to this and maybe the pain you felt from that?
Teddy Atlas
(00:44:59)
I guess it was the second time in my life I felt betrayed. The first time was when I was whatever, young, 17, and I got arrested. I was with all these older guys, tough guys, whatever, supposedly, and the detectives separated us. That’s what they do. And they asked me who did whatever? Whose gun? This, that, all that, the particulars of obviously what we did. And it was me. And they said, “You sure? You don’t want to change that? Because your friends changed it.”

(00:45:48)
And these cops, they were nasty, but they were cops. They were, “You’re going to wind up in Rikers and they’re going to be doing this to you.” And I won’t even say the things because, then, why say them? Figure it out. But they’re trying to get what they’re trying to get. And, “You want to change it?” And, “No.” But I felt very betrayed and especially when I was standing in the cell at Rikers looking at the airplanes leave LaGuardia Airport. And then hoping I was on one. I was making a deal with God that, “Let me be on one of those planes and let it crash. I’ll take a shot.”
Lex Fridman
(00:46:31)
Was part of you proud that you didn’t give up your friends?
Teddy Atlas
(00:46:34)
No, because I didn’t understand what proud was. I didn’t understand nothing. I just understood that-
Lex Fridman
(00:46:40)
Rules are rules. You’re just loyal and that’s it?
Teddy Atlas
(00:46:43)
I didn’t even know there was an option. I know the cops said, “You could do this,” but there was no option. My father never had an option. But the betrayal, the private betrayal was like… And so we were partners, me and Cus. Cus was retired. This stupid kid goes up there and all of a sudden I start training fighters. First, I wanted the gloves. Cus put me in the gloves. I wanted gloves. Then I had an injury, whatever. But bottom line is I still want to fight. I want to turn pro. I want to fight. That was the plan. And Cus had a different plan. Cus was like, “You can’t.”

(00:47:29)
And he had it set up a little bit, whatever. Without getting into it, hey, he did me a favor. I’d like to think he knew he was doing me a favor. And you know what? I do think he was. He was doing himself a little bit of one, too. But he was doing it for the greater cause because he believed in this thing of boxing. He believed that it changed lives. He believed that it was worthwhile. He believed that there was a power to it beyond the left hook.
Lex Fridman
(00:47:55)
The big picture of boxing.
Teddy Atlas
(00:47:58)
Yeah.
Lex Fridman
(00:47:58)
He believed in it.
Teddy Atlas
(00:47:59)
Yeah, he believed that to be a champion, you had to be special, you had to be smart, you had to have character, that you had to be a better person, and that you couldn’t make a champion if you didn’t make him a better person first, and that this could strengthen people. The sport could strengthen people in those ways. So he was married to it. He was old and there was no one in the gym. It was empty. It was above a police station, which was crazy. He needed an heir to the throne. He needed to pass it on to someone.

(00:48:38)
And he saw something, and all of a sudden he saw that my career as a boxer was less important than having me become his heir to the throne and becoming his trainer, his man, his guy, to continue, that we could do a lot more for him and for everyone. Not just for him but for everyone. It was more like to keep it going. It couldn’t die. It couldn’t die. Cus was afraid it would die with him. And he committed his whole life to it. He didn’t get married because of boxing. So he saw me as the little bit of the seed to plant for more things to grow before that plant died. And so all of a sudden he says, “You can’t fight.” I had people tell me that I could go somewhere else and fight. And I could, but I couldn’t because I’d be disloyal.
Lex Fridman
(00:49:36)
Loyalty is everything.
Teddy Atlas
(00:49:37)
Yeah. So I couldn’t leave Cus, and he kind of knew that. And so I couldn’t leave him. And he said, “You have an ability to teach.” He said, “Knowledge means nothing.” He said, “See these Britannica…” He had Britannica encyclopedias, the whole set, in our library. He said, “You see these?” “Yeah, I see them.” ” All the knowledge of the world, whatever, is in these.” “All right.” “Means nothing if you don’t have somebody to convey it to people. Otherwise, it just sits on a bookshelf and looks good.” He goes, “You have the ability to convey knowledge to people. You’re a teacher. You were born to be a teacher. You’d lessen yourself by only being a champion fighter because you’d only take care of one person. You could take care of all kinds of people and you could do this and you could do that and you could do this.”

(00:50:30)
So we go on this venture. Took a minute, because I didn’t believe him at first, but finally I am, I’m there, I’m training fighters. Then he gets me to buy in, and I was a teacher. I start teaching these kids, and there’s no one in the gym. It’s dead. And all of a sudden there’s 10 kids, 15, 20, 25, 30, 35, 40, 45. Catskill Boxing Club, which was never there. Now it’s there. And I’m training fighters. I’m taking them down to South Bronx to get experience, one of his former fighters, Nelson Cuevas, down to South Bronx. I’m taking him down there to get smokers, to get fights when they’re ready after I teach.

(00:51:12)
I’m wearing out dungarees. I’m getting holes in my dungarees. I was fashionable before it was fashionable to have holes in my dungarees. I could have made a lot of money with that because I was on my knees with these little kids, nine years old, 10 years old, eight years old, 10, 12, 13, 14, all these kids. I’m teaching them and I’m building a gym. Cus only came once a week because he was semi-retired, and he’s home. When he would come once a week, he knew he couldn’t give me money, but he gave me more than money. He gave me praise. And he said, “Look what Atlas is doing. He’s creating champions.” And I was like, “Whoa. Yeah. Wow. I’m doing good.”

(00:51:55)
And then all of a sudden after four years of that, because I was up there seven years, eight years, eight years, after about three and a half, four years of that, we get a phone call that they got this kid in prison, in Tryon prison, from one of the guys that knew Cus, Matt Baranski. There’s a correction officer named Bobby Stewart who used to box, and Cus had helped him out a little bit. A little bit. They knew we had this gym. Now that was really starting to become something because we were winning tournaments and everything else. They go, ” We got this kid, Mike Tyson. He’s 12 years old, he’s 190 pounds, and he’s a mess. But Bobby Stewart got involved with him, the former fighter, and he’s taken a liking to it. And now where he didn’t behave at all and he didn’t listen to anyone, now he’s listening because Bobby’s got a carrot and the carrot is he’d teach him boxing. And now he’s at the point now where we want you to take a look, you and Teddy.” “All right. Bring him down.”
Lex Fridman
(00:52:57)
What’d you think when you first saw Mike Tyson?
Teddy Atlas
(00:53:00)
Well, I wanted to see his birth certificate because he’s 190 pounds, 12 years old and all solid. Really? But, yeah.
Lex Fridman
(00:53:10)
Just physically, just as a physical specimen?
Teddy Atlas
(00:53:13)
Yeah.
Lex Fridman
(00:53:13)
Big guy?
Teddy Atlas
(00:53:14)
Yeah. And listen, Cus was right, I was a teacher. He was right. And he was testing me even that day. He said, “What do you think?” So I said, “Well, we ain’t going to know nothing hitting the bag. Who the frick cares about that? He knocked the bag down. We got to put him in with… We got no one to put them in that way.” I didn’t have anyone that way. We got to test him. Everyone’s got to be tested. So you got to put them in responsibly. But, “Let’s put him in, just responsible, but let’s put him in with Bobby Stewart.” Former pro fighter, had 14 pro fights. Smaller than Tyson. When he was fighting he was 175. But still, he’s 28 years old. Tyson’s 12. Come on. “And he’ll work with him, right?”

(00:53:57)
So we do, we put him in. Tyson, he recognized the moment. He understood this was an audition, this was a chance. This was that TV show, Change Your Life. He understood that if he passed the audition, he could possibly change his life. He wasn’t sure what. How could he be sure what exactly? But it was better than what he had. And so he was on audition. So he innately understood what we would want to see: ferocious, toughness, character, desire, and, of course, ability. Well, we saw the ability, power, speed, but it was unbridled. It was untaught. It was raw. He didn’t know really much at all. At all. But we saw that.

(00:54:50)
But he wanted to show more. He knew that wasn’t enough. Again, innate intelligence. He had to show desire. He had to show toughness. And so I was being responsible. After two rounds, that’s enough. Normally, I don’t put a guy in to box until maybe four months, five months, six months, eight months, 10. It depends what it takes to learn on the floor before it’s responsible to put them in the ring to actually take on incoming real live shells instead of blanks. And so normally I wouldn’t have him in. And I knew after today, he wouldn’t be in the ring again if I trained him. I would teach him first and then he’d get back in in a few months. But for this day, it was the only way.

(00:55:36)
I used to make this announcement and Cus loved it. He said, “What’s training a fighter? What do you look for training a fighter, Teddy?” He asked me these ridiculous questions just to test me. And I say, “It’s like going to Macy’s…” Oh, he loved it. I said, “It’s like go to Macy’s window on Christmas.” He goes, “What do you mean, Macy’s window?” Cus was like, boom, boom, boom. So, “What do you mean, Macy’s window?” “You go to Macy’s window and they got the window with everything you want to see, everything in there. And it looks great, everything.” “Yeah? And then what?” “Well, then you ask, ‘What’s in the warehouse?’ and they tell you, ‘Nothing.'” And then Cus says, “That’s it. That’s the trainer.” And I wanted to see what was in the warehouse. Because I saw what was in Macy’s window. I saw the power, I saw the speed.

(00:56:22)
So he goes two rounds and he gets a bloody nose. Here’s the weird thing, not weird, very telling. We knew what we were doing. I’m not bragging, but we knew what we were doing because he got a bloody nose because he got hit. After that, he never got another bloody nose. You know why? He didn’t get hit. Because he learned. He was still strong, but he was smarter now. Anyway, he goes two rounds, and I saw, and I’m being responsible because if he goes more, it’s not responsible. I saw what I needed to see. I saw speed, I saw power, I saw athleticism. And I saw, I didn’t believe him. I thought he was lying to me. I’m just telling you. I thought he was lying, trying to act tough when he wasn’t really feeling tough. It didn’t matter.

(00:57:04)
Cus questioned me on it afterwards, “What did you see?” And when I said it, he goes, “Young master.” Again, he wasn’t paying me money so he had to give me something, right? And that was currency. “Young master.” I’m the young master? Whoa. “Young master.” You know what I mean? I felt like that guy, Kung Fu, like in the movie, like Kung Fu, “Grasshopper, when you’re ready, when you can take this out of my hand, you can leave.” And-
Lex Fridman
(00:57:29)
That’s powerful.
Teddy Atlas
(00:57:30)
Yeah, it was. It worked. Cus knew how to work me. And he did. And it worked. But you know what? I didn’t mind being worked. I kind of knew I was being shuffled a little bit.
Lex Fridman
(00:57:45)
Well, you’re making it sound a little bit negative, but it’s also extremely positive. That’s a teacher instilling wisdom into you that you carried forward and it impacted a lot of people.
Teddy Atlas
(00:57:56)
Yeah. Cus got the job done, but he did it his way, and he did it for a myriad of reasons. But at the end of the day it was all good, and I just had to understand that eventually later on. But-
Lex Fridman
(00:58:11)
And you do the same. You do things your way and carry some of him in you, some of your father in you.
Teddy Atlas
(00:58:16)
Yeah. That day it was funny because when Cus said, “What did you see, Teddy, with him?” After two rounds, I got up on the ring. I knew I was going to train him. Obviously, we weren’t going to say no. He still had about four months to serve, and we were going to work it out. And when I got up on the ring apron, that’s my gym, I’m the boss. People later on in life called me a dictator. You know what I said? “Yeah, you’re right.” I didn’t deny it. People thought it. “You mean I’m right?” “Yeah, I’m a dictator. I’m a trainer. I’m the boss. I’m in charge. You wouldn’t be here if I wasn’t. What the frick you need me for if I’m not fricking in charge, you idiot? Yeah, yeah. Damn right. Well, what do you think, it’s a shared responsibility? No, it’s my responsibility. That’s why you’re here. Yeah, I am in charge. You shouldn’t be here if you don’t understand that.”

(00:59:11)
So I get up there and I know that I’m going to be training him. I got to show him who the boss is. I’m being really frank about this. So I get up there, I say, “That’s it. Out.” “No, no,” this is Tyson, “No, let me go. I want to do another round. I want to do another one.” “I said out. Did you hear what I said?” Because I knew that he was going to test me. He was testing me. I said, “I said get out.” He got out.
Lex Fridman
(00:59:39)
But were you impressed with the fact that he wanted to keep going, or no?
Teddy Atlas
(00:59:42)
Yes, and I recognized what it really was. So Cus asked me, “What was that?” Cus wanted to know what the young master saw. So Cus said, “What was that?” I said, “It was an act.” He goes, “You saw that? Did he really want to go?” I said, “No.” I said, “He didn’t really want to go, but he knew that we wanted him to go, and he made himself ready to go in order to satisfy, and that’s just as good.” And Cus said, “Damn right it’s just as good. All that matters was not how he got there, but that he got there. That’s all that matters, that he got there. That he got to the place to act like a fighter, to do what we want him to do. To be ready to persevere, to go beyond the comfort level, to do another round. He didn’t want to, damn right he didn’t want to, but he knew we wanted him to, and he knew in order to pass the test, he had to do it.”

(01:00:43)
And he said, “You’re right.” He goes, “Now it’s going to be your job to teach him, to make him a fighter that don’t get bloody noses, that don’t get hit and will get to that place without being coerced to get there, to get to that place on his own, instead of using the things that he had to use to get to that place today. Those things are not going to be available one day when you…” And listen to this. You talk about a man being prophetic. Cus was pretty good. You talk about a man being on the job, on the money, Lex. How do you think he finishes the sentence? He goes, “Because you’re going to have to make sure that he learns these things because he’ll be your first heavyweight champ.” “What did you just say?”

(01:01:36)
He’s 12 years old. He’s been arrested 30 times. He’s getting out of jail, out of juvenile detention, Tryon. He’s a mess in a lot of ways. There’s a lot of things we find out later, a lot of problems, weaknesses. He goes, “And that’s part of your job. That’ll be part of your job.” But he really said that. And then he turned to him, he goes, “You want to come live with us, young man? You want to be a fighter?” “Yes.” Even that, Cus said to me later, “What do you think about that?” I said-
Lex Fridman
(01:02:14)
The way he said, “Yes”?
Teddy Atlas
(01:02:15)
Yeah, the way he said, “Yes. Yes, sir.” Yeah. He said, “What do you think about that?” And we’re talking, I said, “He ain’t going to be that polite in a little while down the road.” Again, he knew that that’s what he felt he needed to project himself as, to present himself as, to get to where he wanted to get to. He goes, “Yeah, yeah.”
Lex Fridman
(01:02:38)
Did you see what Cus was seeing in terms of the heavyweight champion of the world?
Teddy Atlas
(01:02:42)
No. Again, the easiest answer would be yes. Teddy’s just a-
Lex Fridman
(01:02:45)
Teddy knows.
Teddy Atlas
(01:02:46)
… genius. Wow, wow. Wow. No, no, no, no. But again, it was my job. And my job, it was simple, simpler than Cus’. Cus knew too much. I knew nothing. I just knew rudiments of boxing. I knew what it took to be a fighter and how to execute it, the steps of executing it. So I took those steps. The rest of it, you get blurred by those other things. I wasn’t blurred by those other things. It was just, “Get them in the gym, make them mentally stronger, make them face things, and teach them how to slip punches. And create holes, and fill those fricking holes with devastating punches,” this is Cus, “And what are you going to do?” “I’m going to teach them to create holes and fill them with punches with bad intentions.” And that became the moniker. And then Tyson would say that, “I’m throwing punches with bad intentions.” Yes, you are.
Lex Fridman
(01:03:48)
How do you make him mentally tougher? So that part of the job, you said the, “Don’t get a bloody nose,” but the part of the job where it makes him mentally tougher, how do you do that?
Teddy Atlas
(01:03:57)
Most important part of the job, to make him face things. Make him face where he’s lying to himself, where he’s submitting. What if we start this conversation with submission? Submit less, submit less, submit less every day, submit less. Cus only come to the gym once in a while. And if I had him sparring, he would come because that was his project, that was the heavyweight. Now he came. It put life in Cus. Cus had life. He was losing a little life, but that made the light bulb bright again. It did. And it was great to see. I felt proud of that. I felt connected to that.

(01:04:32)
That’s why when it all went bad and Cus took the side, the only side he could take, the side of the next heavyweight champ of the world, but he left me, his partner, the young master… And for the second time I get betrayed. And I’m like, for a while I thought everything Cus taught me, said to me was a lie, and I didn’t want to be any part of it anymore until I got a little more mature and I got a little past that where I was able to understand. I was able to understand that just because somebody that you perceived as great in every area you find to be weak in certain areas doesn’t mean that they can’t still be what they were to you. It’s something that can be understood or forgiven.

(01:05:43)
But yeah, it’s hard. It’s hard to get to that place, to forgive somebody in that kind of way that I felt betrayed. Because Cus told me the most important thing was loyalty. Cus told me he loved me because I was loyal. Cus told people that the reason that he went to court was because I didn’t give up anybody.
Teddy Atlas
(01:06:00)
… to court was because I didn’t give up anybody, even though it meant putting me at risk of going to jail for 10 years, because he felt that he admired those traits. And so I assumed that he would show the same traits. And he took a deal. He took a deal. He took a deal. He signed the papers that those so-called Feds of mine signed. He took a deal to have the future heavyweight champion, as it turned out, and to let me go. To sign the deal to let me take the weight.
Lex Fridman
(01:06:48)
For people who don’t know, Mike was inappropriate with a young girl and you pulled the gun on him. I don’t know if there’s deeper things to say about that situation.
Teddy Atlas
(01:07:00)
No.
Lex Fridman
(01:07:01)
But why do you think Cus made the decision to cut you off from both Mike Tyson and from Cus D’Amato? To break that when he valued loyalty so much.
Teddy Atlas
(01:07:12)
I served my purpose. I got him to where he needed to get. Brought life back in the gym. If I wasn’t in the gym at that particular time, Tyson never would’ve been in the gym. There would’ve been no gym to bring him to when they called up and made that phone call to bring him to the gym. There would’ve been no activity. There would’ve been no boxing program. There would’ve been no training, training him 24/7 the way I was, where Cus wasn’t capable of doing that at that point in his life.
Lex Fridman
(01:07:40)
Yeah.
Teddy Atlas
(01:07:41)
But then again, it’s not poor Teddy. I got the benefit of a career, I got the benefit of knowledge, I got the benefit of a life, I got the benefit of learning, of becoming hopefully a better person. I got the benefit of being betrayed again.
Lex Fridman
(01:07:59)
That’s a hell of a statement right there. I don’t know what the benefit of that is.
Teddy Atlas
(01:08:05)
You can learn to forgive weakness when you realize how easy it is to be weak. And when you realize that… somebody asked me, how did you get to the point where you could forgive? It’s a pretty good question. Pretty simple, pretty basic, pretty important. And I didn’t understand, I understood. But I did understand immediately, for me. I said, “How can I not forgive somebody?” It becomes easier to learn how to forgive when you’re still trying to forgive yourself, when you’re still in the process of trying to forgive yourself for all your own inherent weaknesses and betrayals of people like my father in different ways that we forget very easily because it’s handy and it’s a way of surviving. It’s a lot easier to figure it out, rationalize it, to find forgiveness when you realize that you still haven’t figured out completely how to forgive yourself. I’m still trying to figure that out.

(01:09:34)
And so that helped me figure out how to forgive Cus because to figure out how to forgive me, I had to understand why I did these things. Where the weaknesses came from, where the selfishness came from, where the convenience came from. That they really existed. But they didn’t exist for malice, they existed for me not being prepared to understand that I could be stronger, to want to be stronger. And then I looked at Cus. He wanted to be stronger, but he got to a point in life where he had been strong for a lot of his life. He was strong with me, he was strong with a lot of things in his life. And does everyone deserve a pass in life?

(01:10:27)
He got to a place where everything was in one basket, the basket of boxing. He once told me that he never got married because it would’ve been selfish to a woman to have gotten married when his whole life was boxing. That he couldn’t give to her kid, he couldn’t give to her. And then I thought about it. He had no money, really. And Jim Jacobs and Bill Cayton took care of the bills, so he didn’t really need money that way. But what was the payoff of that kind of life, that kind of commitment, that kind of sacrifice? Really, what was the payoff? The payoff was to have champions. To have a champion that would keep your name alive.

(01:11:17)
That word legacy, what does it mean? Sometimes it’s just a word, sometimes it’s more than a word. It’s a reprieve. It’s a pension plan. It’s being given a pension on your way out for the rest of your life, for your life wherever you’re going. You’re going to wherever you’re going for eternity. It’s the only thing that you take with you, is what you left behind. And for Cus it was all about leaving behind a mark. A mark of a champion. Yeah, it was attached to ego. We all have it. Yeah, it was attached to some selfishness and all. But yeah, it was also attached to wanting to leave something great behind.
Lex Fridman
(01:12:08)
Yeah.
Teddy Atlas
(01:12:08)
To know that you were part of it. That you existed for a reason. That you sacrificed for a reason. And all that freaking pain I brought my father, I was searching for something. Yeah, I made it into a righteous search. I made it into… I did. And I made it into, “It was okay because it was righteous,” but it still did damage. It still did damage. It still hurt people. It still betrayed my father’s trust. And Cus betrayed mine, but he didn’t do it maliciously. He did it out of, again… my father came home… this is how I’m going to connect it. My father came home from work one night, 12:00. And I was waiting on him. And like I said, I was nine, 10 years old. And he got mad at me. He goes, “Go to bed. What are you doing up?” I said, “I’m waiting for you. Waiting for you.” And he said, “Well, go to bed.” I said, “No. What were you doing?” He said, “I was at the hospital.” “Why were you there so late?”

(01:13:25)
He answered me. He said, “There was a patient. There was a sick patient.” I said, ” He must be better now because you’re his doctor,” because my father could fix anything. My father, nothing got in the way of the truth. Nothing. Nothing. Even blowing his son’s bubble. Matter-of-factly he said to me, “No, he’s not going to get better. He’s going to die.” So as a 9-year-old kid, you’re a kid, you’re selfish, not in a bad way but you want what you… and I said two things. First I said, “How? You’re his doctor. How? It can’t be.” And then I said, I said it almost angry, “Then why were you there? You should’ve been here with me.”
Lex Fridman
(01:14:22)
Yeah.
Teddy Atlas
(01:14:23)
And you know what he said to me? “Because you don’t give up on life. Go to bed.”
Lex Fridman
(01:14:51)
Don’t give up on life.
Teddy Atlas
(01:14:52)
And I finally connected the dots. This idiot that didn’t graduate high school, I finally connected the dots. I was asking Cus to give up on life. You don’t give up on life. You don’t give up on aspirations of life. Life is all forms of life. It doesn’t have to be a physical form of it. It’s life. It’s having a reason to be alive. It’s having a reason to have tomorrow. And Cus’s only reason to have tomorrow was to have another heavyweight champ.
Lex Fridman
(01:14:52)
Yeah, a champ.
Teddy Atlas
(01:15:11)
And Teddy Atlas, even though we were together all those years, and we were partners, and we trained together, and the only thing we didn’t do was what they did in the Indian movies where they cut the finger and they became blood brothers.
Lex Fridman
(01:15:25)
Yeah.
Teddy Atlas
(01:15:25)
That’s the only thing we didn’t do, and I felt like we did that without cutting. And now here we are, and he freaking betrayed me. The… and then all of a sudden I connected the dots. I was like, “He didn’t betray me in that cold sense, he didn’t give up on life.”
Lex Fridman
(01:15:53)
Years later, Mike Tyson apologized to you. What’s meaningful to you about that? How does that fit the story?
Teddy Atlas
(01:16:04)
I want to be the great, gracious guy right now. Say, “Oh, I’m so human that a man’s man enough to say sorry, that’s it, we’re good.” I want to be, really. That’s the best presentation of Teddy Atlas I could put out there. He’s a good guy. He forgives. He’s a good guy. He’s a standup guy and he’s a good guy. I’m not sure. If he truly did it for himself, that he really did it because he felt that it was true. But if he’s persuaded by other things… he was in the middle. I know I’m taking it too deep, I know it, but what am I going to do?

(01:16:58)
He was in the middle of 12 steps with the getting out of drugs, alcohol, 12 steps, which is a commendable thing. Really, it is. And he’s taking the steps. Part of the steps was to admit all, to apologize to all people you offended in life. Okay. But are you doing it for the 12 steps or are you doing it because you really truly have come to terms with believing what you did was that hurtful to me, and that it matters to you that it was that hurtful to me, and that you were wrong in doing it? Did you do it for… I know that’s deep. I know that I’m a freaking idiot. “Teddy, you should be better than that. He’s better than you.” Yeah, maybe he is better than me. Maybe he is. Really. Seriously, maybe he is. And I took it. He put his hand out. I took it. We hugged. He said, “I love you.”

(01:17:58)
Yeah. Yeah. But I want to believe. But what did Cus tell me? “No matter what a man says, it’s what he does in the end that he intended to do all along.” So to this day today, was it really genuine or was it reflexive of that moment for him to get what he needed for that step? Or was it truly for what I needed? That he really cared that what he did to me caused me to do what I did. Because I did something that was pretty bad to him, too. Is he able to deal with that and put that where it has to be put? Is he able to put that? Or is it just he did something he had to do and maybe he’s sorry he did it? Look, I appreciated that he… I would’ve rather it’d been in a private place.
Lex Fridman
(01:19:05)
Yeah. So for people who don’t know, you were in the middle of commentating a fight, and he walked up from behind you and he said he was sorry. He shook your hand, gave you a hug. I didn’t know. He said, “I love you.”
Teddy Atlas
(01:19:15)
Yeah, he’s emotional. I get emotional a little bit, too. But he’s emotional and he can be… I can see why people have a fascination and a love affair with him right now, because he was the meteor that went across the sky that, if they didn’t see it, their parents told them about it. There was a meteor that came across the sky one day.
Lex Fridman
(01:19:42)
Yeah.
Teddy Atlas
(01:19:42)
And the meteor is walking around in the room now, and that’s the meteorite. And it actually landed here, and that’s it right there. And now he’s come a long way. And now he’s more human and he’s lovable and compassionate and he cries. And I get the fascination, I get the love affair. I get it because, inherently, we’re people that want to forgive. We’re people that, we want to be good, and part of being good is to forgive people and to show compassion to people. And when somebody’s been damaged, to acknowledge they’ve been damaged, to acknowledge that you know they’ve been damaged, and you care about them being damaged. And how do you show care? Through admiration. In some ways almost through adulation. And he’s getting adulation from people, which is to an incredible level. And it’s a phenomenon, but I get it. I understand it. And I don’t know if he gets it. I don’t know if underneath all of this… he’s a complex guy. He’s a sensitive guy. I don’t know… And I am, too.
Lex Fridman
(01:21:00)
One complex guy talking about another complex guy.
Teddy Atlas
(01:21:03)
I don’t know if, underneath it all, where he’s really truly at as far as that day that he said that to me.
Lex Fridman
(01:21:12)
Is there part of you that’s sorry to Mike for-
Teddy Atlas
(01:21:15)
I’m not sorry.
Lex Fridman
(01:21:16)
Pulling the gun on him?
Teddy Atlas
(01:21:17)
Yeah. And listen, that’s fair. I know dimensions of human nature too well to not know that he still has to have certain… because I have those strong feelings. What? It’s not fair for him to have them? Damn right, it’s fair. Now, he could look at it, if he was to be held to his word, that night that he just acknowledges that what happened, he deserved because of the position he put me in and he put himself in, what he did. And I wouldn’t change nothing.
Lex Fridman
(01:21:54)
Still, you don’t regret pulling the gun on him?
Teddy Atlas
(01:21:58)
I regret that I had to.
Lex Fridman
(01:21:58)
Yeah.
Teddy Atlas
(01:22:01)
Yeah, I regret very much that I had to. I regret very much.
Lex Fridman
(01:22:06)
He crossed the line.
Teddy Atlas
(01:22:08)
I hated him for putting me in that position. How dare he think that somebody’s feelings are that trivial? That the way I would feel about myself and the way the girl, who was 11 years old at the time, would feel about herself. How dare he think it’s that trivial that I shouldn’t be freaking ready to both die and kill for that?
Lex Fridman
(01:22:35)
Yeah. Why didn’t Cus D’Amato see it in a deeper way and talk through it?
Teddy Atlas
(01:22:42)
He did. The word came back to me but, of course, what does it mean? But the word came back to me that Cus said, “You were right.” But if he took the side of Teddy, he would destroy potentially a great fighter.
Lex Fridman
(01:22:59)
Why do you think that? Okay, if you were to try to understand the point he was making, why is that true? Isn’t the part of greatness that you said is building the character of knowing what is right?
Teddy Atlas
(01:23:11)
Cus was afraid to go there, where he used to not be afraid, because it’s kind of like you’re never afraid of going up. And I get it. When I train a fighter now, if I come out of retirement, I train a fighter now, I feel in camp like I’m on death row every day. Every day I try to retrace my memory and say, “Did I feel this way when I was younger? I don’t remember feeling this way.” I feel, every day, a dreadful feeling that if I don’t get this right, I betrayed everything. I betrayed the fighter’s trust, I betrayed what I’m supposed to be.

(01:23:53)
And then one day I tried to figure it out. Why do I feel this way? It’s so intense. I was in camp for two months training a guy for the world title a few years ago, fighting the hardest puncher in the world at the time, Adonis Stevenson, and the fighter was Ukrainian. And I was brought in to train him for that fight, and he trusted me and changed his whole style. Trusted me. Oh my God. I went to bed every night praying, dread. Waking up, dread. My stomach down to here. Saying, “What if I fail? What if everything that I told him was going to happen don’t happen? What if I fail him? What if he trusted me and I betrayed that trust?”

(01:24:37)
And the thing with Cus was he used to be stronger than that. And then I tried to figure it out, why I got this way and why it was so dreadful to me, and why I felt like I was on death row every day training a fighter. Like, “Did I do enough? Did I do right? Will we accomplish what I promised him we would accomplish? Would I keep my word?” And then I started thinking, how did I become this weak? How did I freaking become… I was a pretty strong freaking guy. How did I become this weak? And then finally I think I figured it out. You know why?
Lex Fridman
(01:25:12)
Hm?
Teddy Atlas
(01:25:13)
Because I was always working to get up. But once I finally got up, now I was looking down. And it finally hit me. I said, “I didn’t want to lose.” I said, “There was nothing to lose on my way up.” Now, all of a sudden there’s something to lose when you’re up there and you’re looking down.
Lex Fridman
(01:25:29)
And that’s where he was.
Teddy Atlas
(01:25:32)
And that’s where Cus was. Cus was at the end of his rope. He accomplished two world champs, all this stuff, everything. And he did it right. Now all of a sudden it wasn’t about moving forward, it was about not falling down. Holy cow. I was like, “I got it, Cus. I got it. I got it. You didn’t want to fall down. Oh my God. You didn’t want to fall.” And this was his last chance. You don’t give up on life. This was his last chance to live forever. To make everything he did worthwhile. To have the youngest heavy… it wasn’t just heavyweight champ. You’ve got to remember he was the youngest heavyweight champ ever.

(01:26:20)
And to have that, it was okay to die now. And how’s loyalty to someone named Teddy Atlas going to get in the way of that? That’s a tidal wave that there ain’t no wall that’s been made high enough to stop that tidal wave. And now I’ll stop myself. Yeah, there is, but it would have to be an awful big one. And you know what? Who are we to say that we could ever build that wall that big? Who is any of us? Who am I to say?
Lex Fridman
(01:26:54)
Do you think, if you were to put yourself in the shoes of Cus D’Amato, can you see yourself having the big enough wall where you would choose loyalty?
Teddy Atlas
(01:27:07)
Now, if I answer the way I feel then I’m making myself John Wayne again.
Lex Fridman
(01:27:13)
You don’t have to answer then. I think loyalty is important.
Teddy Atlas
(01:27:18)
No matter what a man says, it’s what he does in the end that he intended to do all along. I didn’t make that up, Cus did. And when this all went down, those words came freaking echoing into my freaking ears. I didn’t want them. Cotton doesn’t help. And they freaking kept coming into my ears. And what do you think? Still an immature kid at the time. I was young. Still an immature kid at the time. What the freak do you think my response was? You were full of…
Lex Fridman
(01:27:54)
Yeah, shit.
Teddy Atlas
(01:27:56)
But I got past that.
Lex Fridman
(01:27:58)
Do you forgive Cus? Have you found forgiveness?
Teddy Atlas
(01:28:03)
Listen, I forgive him because he gave me more than he took away from me. What kind of man am I if I can’t at least acknowledge that and be grateful for that? He gave me more than he took from me, and I’m grateful for that. I’m also grateful for what I gave him, that I did give him something, and at that point in his life. A place to still have test tubes and chemistry experiments. A laboratory where he could still create a great fighter. And I helped give them that. I was part of that lab and making sure that lab was there and just that there was the existence of test tubes in the place, because you can’t freaking do experiments without test tubes.
Lex Fridman
(01:29:07)
Now you’re the scientist with the test tubes.
Teddy Atlas
(01:29:10)
Yeah, I guess so. And I just hope that… what I said earlier is really my thread through this whole thing. When you say, “Could you forgive Cus?” I’m still trying to forgive myself. And if I can have hope that I can forgive myself, I think that hope has to start with the power to forgive someone else. How can I ever forgive myself for all my failings and figure it out if I can’t start and practice it by forgiving someone else for some shortcomings? And for me, that’s the only sense of sometimes a very hard thing to make sense of. That’s my North Star, that’s my compass. Cus used to make me laugh. Me and him did everything together. And we’d get lost in the city, we get lost in the Bronx, and he’d get all frustrated. And he said, “Atlas, you’re a great trainer but you turn you around, you spin you around and you’re lost.” And I said, “Me or we?” Because I was the only one who would argue with him, and it was really funny sometimes. And I said, “We or me? We or?” He goes, “[inaudible 01:30:46].” “Cus, you’re lost. I’m lost. What are you talking about?”

(01:30:52)
And then all of a sudden Cus couldn’t give in. He just couldn’t admit. He couldn’t give in. You know what he said to me? All of a sudden he goes, “When I was in the Army, if I had a compass I could get out of the woods.” I said, “We’re not in the woods, we’re not in the Army. We don’t have a compass. Cus! Cus!” “Just don’t argue with me!”

(01:31:13)
One time we’re driving. I want to get back to Catskill. We just finished at the Bronx. It’s been a long day visiting the Murderers Inc. houses and everything else that he took me through for the 1,800th time. And he would fall asleep. He was getting older and he would just fall asleep in the car. So what do you think? I went a little faster, because before he went to sleep he said, “Don’t speed.” I don’t consider myself… I try to be an honest guy and I try to be a freaking…
Lex Fridman
(01:31:51)
Was it five or six guys?
Teddy Atlas
(01:31:52)
What did I figure earlier? Try to do less submitting, really, in all phases. Try to submit a little less. Try to lie a little less today. A little less. Try to get stronger, try to get a little better. So here we are and we’re driving. And all of a sudden he’s asleep. What did I do? 80? 75? Probably. Probably did. Whatever. And all of a sudden he wakes up. “You were speeding.” I lied. “No, I wasn’t.” ” Don’t lie.” “I’m not lying.” “You lied again. You were speeding.” Now, come on. This guy, he’s unbelievable. So I got to freaking… he’s David Copperfield, I want to know the trick. I want to know how he made this thing disappear.

(01:32:52)
So I said, “What are you talking? How do you know?” He goes, “Because I timed you. I looked at the post number.” And I’m like, “What?” “I looked at the post number on the side of the road where we were,” whatever mile. And I never knew they even existed. I look and I said, “Yeah, there’s little numbers.”
Lex Fridman
(01:33:12)
He started timing and then he fell asleep.
Teddy Atlas
(01:33:13)
Yeah, he timed it. And he looked. He goes, “We couldn’t have got from here to there in that amount of time unless you were going 75 miles an hour.” And I’m like, “All right, I’m impressed. Don’t try to get the mileage, the mile per hour part right. It’s enough that you got me. That’s enough. Yeah.” I said, “And I’m not going to do that no more.” And he helped me in crazy ways where there would be times where you wanted to be whatever, convenient, weak, submit. And then all of a sudden, in my mind, Cus was there with the stopwatch. And I’d be like, “No,” where I was about to say yes to whatever that particular situation was.
Lex Fridman
(01:34:16)
Somebody hit their phone. Hello? Hello? Yes. Doing great. Thank you.
Lex Fridman
(01:34:30)
Just for the record, never had a phone call like this. It’s hotel security. The question he asked me is, “Are you okay, sir?” Are you okay? Are we okay?
Teddy Atlas
(01:34:39)
I think so. I think so. So far.
Lex Fridman
(01:34:43)
Yeah.
Teddy Atlas
(01:34:44)
I can only go so far. It’s kind of like that old joke where the guy jumps off the Empire State Building. He’s falling down and he’s going 80th floor, 70th floor, 60th floor, 50th. And he gets past the 50th floor and they’re looking at him out the window and he goes, “How am I doing?” They’re like, “So far so good.” I don’t know where it’s going to end.
Lex Fridman
(01:35:07)
Mike Tyson is considered by many to be one of the great boxers, one of the greatest boxers of all time, heavyweight boxers. What do you think, on the positive side, made him great?
Teddy Atlas
(01:35:18)
I don’t know if he was ever great. I know he was sensational. I know he was the greatest mix of maybe speed and power ever. I know he was one of the greatest punchers from either side of the plate, left or right. There’s been great punchers with just the right hand, like Earnie Shavers and Deontay Wilder and Max Baer. I don’t know if there’s ever been anyone who could punch as good as he did on either side with either hand, other than Joe Louis and a few others. I don’t know if there’s ever been such a combination of speed and power to that pure level that he had, and it was a pure level.

(01:35:58)
I don’t know if there was ever as good a fighter as Tyson was, where maybe one night he was great where he wasn’t tested but he might’ve been ready to be tested. That one night against Michael Spinks when he took him apart in 90 seconds, I think I saw a great fighter that night. I don’t think you can be great unless you have all the requirements of being great.
Lex Fridman
(01:36:22)
What does it take to be a great fighter? Truly great.
Teddy Atlas
(01:36:32)
To not rely on someone else’s weakness to be strong. To be strong on your own. Too often he relied on other people’s weakness, whether it’s by being intimidated or whether it was because his talent was so much greater than theirs that it was like putting a monster truck in there with a Volkswagen, and the Volkswagen was going to get crushed. No matter how much horsepower the Volkswagen might’ve had under the hood, or whatever you put under the hood, it was going to get crushed. The monster truck was not going to allow it to be a contest.

(01:37:07)
And to be able to find a way when your talent wasn’t enough. He didn’t find a way when his talent wasn’t enough. And I’m not making statements if I’m not ready to put some evidence. Like if we were in a courtroom, exhibit A: when he fought Buster Douglas, Buster Douglas matched his will and didn’t get intimidated. Stood up to him. He didn’t do what most people did. He didn’t submit even a little bit. Not that night. He had in the past, but that night he didn’t.

(01:37:48)
Why? Because Buster had a secret weapon that night, his mother. Buster’s mother had died a few months previous. He loved his mother very much. Buster had always had talent. Big heavyweight, talented, could punch, technically solid. He was all those things, always was, but he quit in fights. He did less than he should’ve done. He never lived up to his ability. He gave in. He submitted. He wasn’t strong enough. He never had a reason to be strong enough. When his mother died, he had a reason. Nothing could hurt him as much as his mother dying hurt him, Mike Tyson included.

(01:38:27)
That night Mike Tyson could not hurt him as much as his mother had hurt him by dying. That night he had a reason to be strong, for his mother, and he was strong. He was everything he was supposed to be and more. And he stood up to Mike, and Mike, for the first time maybe ever, was in a fight where he had to overcome something, where he had to be more than talented, more than a puncher, more than a guy with scintillating speed. And he wasn’t. And then that night got followed by another night with Holyfield. Holyfield…
Teddy Atlas
(01:39:00)
… night got followed by another night with Holyfield. Holyfield wasn’t as talented as him, as big, as much of a puncher, but Holyfield had the character. He was strong in ways that Tyson wasn’t strong. He was strong in a way where he could find a way. He was willing to find a way. He’s willing to go to the cliff, to truly die before he submitted. A lot of stuff is just words, “They’re going to have to carry me out on the shield,” yeah, sure. Okay. Yeah, until it comes time to be carried out on the shield. Sometimes there’s people that actually mean it.
Lex Fridman
(01:39:45)
You think Mike didn’t have that?
Teddy Atlas
(01:39:47)
Well, all right. Let’s just say arbitrarily, I don’t have his record in front of me. Let’s say he was 55 and five, I know he had about five losses. All right, let’s say he was 55 and five, a lot of knockouts. I have a saying, a fight’s not a fight until there’s something to overcome, until then it’s just an athletic exhibition. Contest. Yeah. Who’s a better athlete? Who’s got more quick twitch fibers? Who’s more developed? Who’s a better this? Who’s more developed in those physical areas? But a fight is not a fight until there’s something to overcome. Okay. So, if you go by my definition, not Webster’s, my definition, which I think means something, Mike Tyson was only in five fights in his life.

(01:40:41)
The five fights where there was something to overcome and he didn’t overcome it. Now, I know people hate me for this, including Tyson. I understand, hate me. Oh, you’re a hater, because you weren’t with him, you didn’t make the money because this, because of that, because you got betrayed. I think I’m better than that. I hope I’m better than that. I believe I’m better than that. I’m not a hater. I’ve broadcast fights for 25 years on ESPN, where there was some people in the corner I did not like, and if they did a good job, this guy’s doing a great job. And then, there were guys that I liked and I had friendship, he messed up, and we weren’t friends no more. Friendship got to be tested. Remember that? So, we weren’t friends no more, but why did I do that? Because it was my job. It was more important for me… When it’s all over with, the only thing you’re left with is… We’re going to be dust, all of us, right?

(01:41:46)
The only thing we’re left with is what carries on, our reputation, legacy, whatever that is. But our reputation, that’s all we’re left with. And that’s all our kids are left with. I want it to be as good as it can be. I’ve always had ability, I’ve done a lot of things wrong, and I’ve had a lot of lackings, but the one strength I’ve had, if I had a strength, is to understand somehow, through osmosis, I guess, to learn the lesson that what’s important is not what’s in front of you for those five seconds, for that moment in life, it’s what’s left behind you when those five seconds are gone. Whatever it is that you’re dealing with, whatever that moment is, whatever… That moment, what you do in that moment, the action of that moment is going to stay with you and be you. It’s going to become you.

(01:43:05)
What you face for that moment, it’s gone. It’s gone in the air, in an instant. It’s gone, it’s done. Whether you stand up there and you get shot in the head, and the guy freaking blows your brains out, or you stand up or you’re fighting a guy who’s a scary guy to fight, but you fight him and you beat him or he beats you up. But how you represented yourself in that moment is all that matters. That’s going to live. What happened don’t matter. It don’t matter that you got shot in the head. I know that sounds absurd, but if you believe that it was important to stand up and take the chance to get shot in the freaking head, rather than to live like an empty vessel, you know what? That’s all that freaking matters. And somehow that got freaking wrapped into this freaking head of mine, that’s what matters. That’s all that matters.

(01:44:17)
You know how many times I went, and there were things, whether it was with this one, with Tyson, with that… I didn’t want to be there, I was scared to death, but I was more scared-
Lex Fridman
(01:44:31)
Living with regret.
Teddy Atlas
(01:44:32)
… how I would’ve felt. I don’t want to be in solitary confinement the rest of my life, with that freaking guy in the cell next to me called regret. I don’t freaking want to be next to that guy. If I want to freaking go down that road, I’ll watch Papillon. And I’ll get my fill from that. But I don’t want to freaking live it. I’m afraid of what my children would think of me if I fail in those areas. Why? Because that’s forever. When I’m closing my eyes for the last time, I don’t want to have that fear. I don’t want to have that fear. Whether I’m going down there or whether I’m going up there. I laugh because I was around guys years ago that used to, when we’d talk about that in jest, and I would get a kick out of this one guy who’d been around the block a few times, when he’d say, “Teddy, I ain’t worried about that, I got friends in both places.”
Lex Fridman
(01:45:40)
That’s a good line.
Teddy Atlas
(01:45:41)
And I thought it was good. Listen, Mike Tyson, you want me to say he was a great fighter, then you want me to betray what I really… You know what I mean? You want me to do that? I ain’t doing it for… Listen, I could do it to be a bigger Teddy Atlas, and I know it would work for me. I know it’d do great promotional work for me. I know it would make me more popular in certain areas. I know it, I’m not that dumb. Not that dumb. But I also know what else it would do to me, and I don’t want it to do that to me. I think he was a great talent, I think maybe the night with Michael Spinks, maybe the night with Mike, maybe he could have been that fighter. But he didn’t never really get tested, but he might’ve been ready no matter what, had he been tested that night.

(01:46:32)
That’s how good he was. That’s how, even though it was a guy who used to be a light heavyweight, I get it. But it was still a guy who beat Larry Holmes, who still had something left, Michael Spinks. And a great puncher. And an Olympic gold medalist. And a special fighter, one of the great light heavyweights of all time. You know what Mike Tyson was? He was a meteor. He was a meteor that struck across, and not too many meteors… And we still talk about him. And unlike Halley’s comet, he came back, and he’s walking around. And he has become greater after his career, more loved, more beloved, more awed, and he’s been forgiven. He found the fountain of forgiveness. I don’t know… I wish I could find that.

(01:47:19)
Where he has been forgotten for all his shortcomings, all the things that he may have done, may not have done, we don’t know, only him and God know. But he’s been forgiven of all that, and he’s been not only forgiven, he’s risen above it and above that, and been brought above that. He’s been brought to the pyramids of the greatest athletes in the world. In every way. In every way. As a person, as a fighter, as a historian, as a figure, as a celebrity-
Lex Fridman
(01:48:05)
Even a philosopher.
Teddy Atlas
(01:48:06)
Everything. So, I will take it back. All right, all you guys out there, you forgive me, he’s the greatest of all time, if you encapsulate all that. If you encapsulate everything I just tried to describe and explain, if you put that all… He’s the greatest of all time, yeah, he is. But he still might be 0-5. In a record of 55 fights he might, in Teddy Atlas’s book, again, I got friends in both places, so it’s okay. Wherever I go I’ll have company, somebody there will like me, despite me saying this. He might be 0-5 because of five fights where there was something to overcome, which really defines a fight. He didn’t find a way.
Lex Fridman
(01:48:55)
Let me ask Teddy Atlas to introspect on the human nature here, as part of the complexities of your feelings on this whole thing is that you know to some degree that if you were coaching Mike Tyson, he could be truly great throughout-
Teddy Atlas
(01:49:13)
I know… I’m going to cut you right off, because you asked a million-dollar question, I wish you didn’t, but you did. You did. Because that’s why-
Lex Fridman
(01:49:21)
When do I get paid?
Teddy Atlas
(01:49:22)
That’s why you get paid. I get it, you took the words out of my mouth. That’s why you are where you are. And that’s why I’m here.
Lex Fridman
(01:49:33)
The humility.
Teddy Atlas
(01:49:34)
I’m going to, again, full disclosure, it’s important, right? I’m going to cheat, I’m going to take some of Cus’ wisdom. All right. A little bit of mine. Cus told somebody that if Teddy Atlas got his way, he might’ve been a better person, but we would’ve risked him not being a great fighter. Now, I believe, and I thought Cus did, and I think he did up to that point in his life, that part of your strength of character made you a great fighter, and truly a great fighter. And part of that battle to be a better person, that fight if you will, to be a better person, to overcome the [inaudible 01:50:22] to be a better person, part of that fire you have to go through to be a better person, I really truly bought into it, and I’m in for life.

(01:50:34)
That is really the only way to be a great fighter. And I don’t think that’s what Cus meant, I think he meant that… Cus knew more than I did of what was about to come and what would come and what the world was. How people would try to steal him, how people would take him, how people would steal his guy. The last thing he had, really, the thing that he lived for. Because he lived to have another heavyweight champ, the greatest fighter ever, in Cus’ mind. He could be. And I believe that Cus knew that he could put forward a guy that had the ability to be the greatest fighter ever, without fully completing the mission of what it takes to really be great, but that he wouldn’t be around to have to witness it.

(01:51:38)
And that he wouldn’t… Oh man, this is awful. He’s willing to concede that he might be dead in order to have eternal life, in order to have greatness. And which, Cus does have greatness, and part of that greatness is attached to Tyson. And he deserves it. He deserves it, Cus was a great man. And I wouldn’t be here, partly, without him.
Lex Fridman
(01:52:07)
But that was part of the calculation.
Teddy Atlas
(01:52:08)
I know that’s deep, and I know that’s… Oh God, I hate myself right now. But Cus, he knew he was getting out free. He knew he was going to not have to be there. He was getting off easy. Oh, Teddy, how do you say someone’s going to be dead, they’re getting off easy? I’ll say it again in case you didn’t hear me, all right? He was going to get off easy, and not have to face where he came up short, because he did his job. Because he put forward the greatest fighter of all time, and you guys screwed it up.

(01:52:48)
And he knew that that might happen, but you guys screwed it up. And whatever, that’s your fault. That’s on… I’ll tell you, Tyson will be mad at this, but that’s on Tyson. How can you say that, Teddy? He loved me. I’m not saying he didn’t love you, but he loved some other stuff too. And I don’t know if Tyson could ever come to grips with that, and it’s not his job to. But it’s my job not to hide from it. I know Cus in dimensions that other people just only think they know.
Lex Fridman
(01:53:23)
Did Cus know? Did Cus know this about himself? Did he reflect? Did he introspect?
Teddy Atlas
(01:53:30)
Well, he sent a message to me. Cus sent a guy to me… My wife was pregnant, we were living in an apartment in Catskill on [inaudible 01:53:38] road. We went through all this, and I was getting ready to move to Staten Island, and we still were there for a little while before we did, after all this went down. He sent a guy to me, to the house, secret, whatever you want to call it… My wife, me. So, I listened to him. Cus said, if you leave… I’m a messenger, whatever. If you leave… This was in the aftermath of the gun, the whole thing. You got to remember, Tyson was a ward of the state. He was put in Cus’ custody. Cus was looking to adopt him, for obvious reasons, so he had control. And he loved him.

(01:54:28)
How dare I say anything less? I won’t. But it made sense too. But he was a ward of the state still. Do you know what that means? There’s rules. It means the state’s still overlooking it. If he ain’t living the right life… You got to remember, he came out of a jail. So, reform school. But if he ain’t living the life, he could be taken away from Cus. What’s not living the right life? Well, he wasn’t in school no more, they didn’t know about it. He had some things that were going on, we won’t get into that right now, in school and different things, whatever. And he had his trainer put a gun to his head. That ain’t so good. If a report came back to them that that happened, he would’ve been taken away from Cus. That couldn’t happen. Nobody knows this. I talk about it a little bit, but never, probably… Because why would I?

(01:55:28)
I don’t know. Why am I doing it now? I don’t know, because… I don’t know. Because I am-
Lex Fridman
(01:55:33)
[inaudible 01:55:33].
Teddy Atlas
(01:55:33)
… because it’s now. Because it’s now, maybe. Maybe because it’s now, I don’t know. So, he sent this man, that obviously we both knew, and he said, here’s the deal, Teddy. No talk about this, wants it to disappear, basically, you leave and he will give you 5%… His word. Can you imagine? He will give you 5% of Tyson’s earnings for the rest of his career. But I don’t regret it one bit because it wouldn’t have happened anyway. See, that’s where I could be honest with my… People say, oh, standup guy, because I told him to shove it where the… In that place. And tell Cus to shove it in that freaking place. I was mad. Teddy, don’t get angry… Don’t get angry? Are you out of your… Are you serious? Get out of here. Tell them to go shove it over… And my wife was like, huh? And then, people are like, [inaudible 01:56:37], why didn’t you take the deal? It wasn’t a deal, it was an escape clause for Cus. It was an insurance policy, that this kid wouldn’t be taken away from him.

(01:56:52)
And thank God he wasn’t. I wasn’t going to go and say nothing, they didn’t have to worry about it. Cus forgot who I was? Cus forgot why he went to court for me? Because of those characteristics that he said he loved, and he noticed, and that he admired. I didn’t lose those characteristics, he forgot that that was me, he forgot who he was talking to. He didn’t have to do that. How about, that’s why I told him to shove it up his… Not because of the other insult. And then, when people said to me, oh, you were stand up… Because it was around a little bit. It was around in the circles. And then, when people… Oh, stand up Teddy, he didn’t care about the money. I said, stand up Teddy? What are you talking about?

(01:57:39)
How about just realistic, Teddy? How about I live in a real world, that I was never going to get that money? So, I’m standing up to something that I knew never existed. So, I ain’t stand up, not in that way. I am in other ways maybe, but don’t put a medal on my chest for that, because that never existed. It was never meant to exist. But he didn’t even understand. That was the one thing that really disappointed me in Cus. I was like Cus, you really allowed this to get to you. Where you’ve allowed it to really fog up your thinking, to the point where you’re smarter than that, you’re better than that. That you would actually think you got to freaking offer me freaking pieces of silver. You really think that? That’s what you… Freak you. All that you told me, that you love me, and that I was the young master, and all this… And you think you were going to buy me? And that was going to keep me quiet? How about I would keep quiet because I would always keep quiet?
Lex Fridman
(01:58:52)
So, he thought maybe you might betray him?
Teddy Atlas
(01:58:56)
Isn’t that interesting? Yeah. And why did he think that? No, no really?
Lex Fridman
(01:59:03)
Fear.
Teddy Atlas
(01:59:04)
Yeah. But yeah, fear is at the essence of everything, it’s connected with everything. Fear of losing what he was going to lose. But it was more than fear, it was him not believing in the things that he told me he believed in. He didn’t even know that. He believed in me because I was a standup guy. Because I didn’t sell myself. Because I didn’t freaking turn evidence. I didn’t make a deal. I didn’t do… And that’s why he went to court, and that’s why he stood up for me. And I appreciate it. And that was what he lived by. And those were the blocks of being a man. So much for those blocks.
Lex Fridman
(01:59:47)
Well, it’s like you said, loyalty requires… He would’ve had to take a risk on losing immortality that he would achieve by creating a great heavyweight champion-
Teddy Atlas
(02:00:02)
And that’s the only way you… 100%. But the only way you ever find out if somebody is really that-
Lex Fridman
(02:00:07)
It’s hard. It’s hard.
Teddy Atlas
(02:00:08)
…it’s the test. And it was Cus.
Lex Fridman
(02:00:10)
This is Shakespearean, this story.
Teddy Atlas
(02:00:15)
Cus told me, Cus said, “And the test come in different forms.”
Lex Fridman
(02:00:18)
Yeah.
Teddy Atlas
(02:00:19)
I said, all right, Cus.
Lex Fridman
(02:00:20)
This was his test.
Teddy Atlas
(02:00:23)
And some people pass this test because they’re able to pass that test, because it’s not really a test. Not for them. Because it doesn’t speak to their weakness. But it’s the test that speaks to the weakness, that’s the one. So, this one, I get it, I get what it spoke to, Cus. And you know what? At the end of the day, I forgive you, and I feel bad for you. I feel bad that you were put in that position after you lived your life that way, and that you taught that and you preached that from the mountaintops, that you had to be… That you had to be… I’m not going to use the word. But that you had to fail yourself, and that you had to somehow know that before you died.

(02:01:22)
I just pray that you didn’t know that. And you still don’t know that. Because you were great, you were great. And you’ve given me something to aspire towards. To try to be less weak. Try to be better. And try to be as good as you wanted to be. I wish I can someday. More importantly, I wish I could make my father just feel good up there.

Forgiveness

Lex Fridman
(02:02:18)
You’re a grandfather now.
Teddy Atlas
(02:02:20)
Yeah, four grandchildren.
Lex Fridman
(02:02:23)
If you can give them advice on how to live a life they can be proud of…
Teddy Atlas
(02:02:33)
Just do everything you can, to the best of your ability, every day, to like yourself. To give yourself a reason, to actually say, I’d like to be friends with that guy.
Lex Fridman
(02:02:57)
Is loyalty one of the reasons? One of the things to aspire to?
Teddy Atlas
(02:03:06)
Loyalty is your chance to have a fulfilled life. Loyalty is your chance to have strength, to have all the things you need to have a good life. To be a good parent, be a good husband, be a good grandfather, hopefully be a good role model. Loyalty is… If you could find something to drink, to take into your body, to make you prepared for life, to be all the things that you want to be, to be strong enough to be those things, loyalty would be the thing you would drink. And when I say loyal, I mean unequivocally. I mean unconditionally. Not conveniently, obviously you know that. If you could be loyal, you could be a good person. You could be a person that you would actually like to be around. Because you could be a person you could rely on. And I think that’s one of the greatest assets that a human being can have.
Lex Fridman
(02:04:32)
And what do you do when you’re betrayed? How do you overcome that?
Teddy Atlas
(02:04:38)
You think of what you learned from it. Use it as a roadmap to remember, and to think back of how you got there. And how you got to the place where you got betrayed, and how that person got to that place. Try to remember that in your own journey.
Lex Fridman
(02:05:04)
Has it, for you, made you cynical? How do you take the leap of trust towards people again and again after that?
Teddy Atlas
(02:05:14)
Just by remembering that I’m still trying to forgive myself for the things that I came up short with. And if I haven’t figured that out yet, it’s probably okay to say they didn’t figure it out yet, they didn’t figure it out. And if I couldn’t figure it out and I’m still trying to figure it out, maybe I could get over that initial stabbing of, what it feels like. It does feel like a stabbing. That you feel when you’re betrayed initially, and that you could only think of anger, revenge, hatred. I know those things. I’m not proud of that, but I felt all those things. And I still feel them sometimes. And then I go back and say, hey, you’re still working at forgiving yourself for some things, try to remember that kid. Memory’s an important thing. Forgetfulness is pretty important too. And I’m trying to remember why we forget. Why do we forget? Because it wasn’t something you felt proud of.
Lex Fridman
(02:06:38)
Do you think about your death? Are you afraid of it?
Teddy Atlas
(02:06:46)
It’s funny you asked that. I never used to think about it. I know people in both places.
Lex Fridman
(02:06:59)
I know, you’ve got it covered. You’re going to be all right.
Teddy Atlas
(02:07:03)
Don’t forget that.
Lex Fridman
(02:07:04)
Yeah.
Teddy Atlas
(02:07:05)
I know people in both places.
Lex Fridman
(02:07:05)
Yeah. Both neighborhoods.
Teddy Atlas
(02:07:24)
I’ve been given credit for being brave in certain spots in life, I hope I can be brave when it comes time to leave life. I hope I can be. And that’s just, that’s real and honest as you can be about it. I hope I can be. So far, so good. When I’ve had to be certain things that I was scared to freaking death, I found a way to beat them, for the most part. And so, I figured, when that day comes, I’ll figure that out too.
Lex Fridman
(02:08:12)
It’s going to be another test, maybe the last one. Teddy, it’s a huge honor to talk to you.
Teddy Atlas
(02:08:19)
It’s my pleasure.
Lex Fridman
(02:08:20)
Thank you for being the human you are, for being honest. Honest about the full range of human nature. And thank you for talking today.
Teddy Atlas
(02:08:29)
Thank you. Thank you for having me, and thanks for listening.
Lex Fridman
(02:08:35)
Thanks for listening to this conversation with Teddy Atlas. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Muhammad Ali. “I hated every minute of training, but I said, don’t quit, suffer now and live the rest of your life as a champion.” Thank you for listening, and hope to see you next time.

Transcript for Jeff Bezos: Amazon and Blue Origin | Lex Fridman Podcast #405

This is a transcript of Lex Fridman Podcast #405 with Jeff Bezos.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with Jeff Bezos, founder of Amazon and Blue Origin. This is his first time doing a conversation of this kind and of this length. And as he told me, it felt like we could have easily talked for many more hours, and I’m sure we will. This is the Lex Fridman Podcast. And now, dear friends, here’s Jeff Bezos.

Ranch


(00:00:24)
You spent a lot of your childhood with your grandfather on a ranch here in Texas.
Jeff Bezos
(00:00:29)
Mm-hmm.
Lex Fridman
(00:00:30)
And I heard you had a lot of work to do around the ranch. So, what’s the coolest job you remember doing there?
Jeff Bezos
(00:00:35)
Wow. Coolest?
Lex Fridman
(00:00:37)
Most interesting? Most memorable?
Jeff Bezos
(00:00:39)
Most memorable?
Lex Fridman
(00:00:39)
Most impactful?
Jeff Bezos
(00:00:41)
It’s a real working ranch, and I spent all my summers on that ranch from age four to 16. And my grandfather was really taking me and in the early summers, he was letting me pretend to help on the ranch, because of course, a four-year-old is a burden, not a help in real life. He was really just watching me and taking care of me. And he was doing that because my mom was so young. She had me when she was 17, and so he was sort of giving her a break. And my grandmother and my grandfather would take me for these summers.

(00:01:15)
But as I got a little older, I actually was helpful on the ranch and I loved it. My grandfather had a huge influence on me, a huge factor in my life. I did all the jobs you would do on a ranch. I’ve fixed windmills, and laid fences, and pipelines, and done all the things that any rancher would do, vaccinated the animals, everything. But after my grandmother died, I was about 12 and I kept coming to the ranch, so then it was just him and me, just the two of us. And he was completely addicted to the soap opera, Days of Our Lives. And we would go back to the ranch house every day around 1:00 PM or so to watch Days of Our Lives. Like sands through an hourglass, so are the Days of Our Lives.
Lex Fridman
(00:02:07)
Just the image of that, the two of you sitting there watching a soap opera, two ranchers.
Jeff Bezos
(00:02:13)
He had these big crazy dogs. It was really a very formative experience for me. But the key thing about it for me, the great gift I got from it was that my grandfather was so resourceful. He did everything himself. He made his own veterinary tools. He would make needles to suture the cattle up with. He would find a little piece of wire and heat it up and pound it thin and drill a hole in it and sharpen it. So, you learn different things on a ranch than you would learn growing up in a city.
Lex Fridman
(00:02:43)
So, self-reliance?
Jeff Bezos
(00:02:44)
Yeah, figuring out that you can solve problems with enough persistence and ingenuity. And my grandfather bought a D6 bulldozer, which is a big bulldozer, and he got it for like $5,000 because it was completely broken down. It was like a 1955 Caterpillar D6 bulldozer. New it would’ve cost, I don’t know, more than $100,000. And we spent an entire summer repairing that bulldozer. And we’d use mail order to buy big gears for the transmission, and they’d show up, they’d be too heavy to move, so we’d have to build a crane. Just that problem-solving mentality. He had it so powerfully. He did all of his own… He didn’t pick up the phone and call somebody, he would figure it out on his own. Doing his own veterinary work.
Lex Fridman
(00:03:39)
But just the image of the two of you fixing a D6 bulldozer and then going in for a little break at 1:00 PM to watch soap operas.
Jeff Bezos
(00:03:47)
Days of Our Lives. Laying on the floor, that’s how he watched TV. He was a really, really remarkable guy.

Space

Lex Fridman
(00:03:52)
That’s how I imagine Clint Eastwood also in all those westerns, when he’s not doing what he’s doing, he’s just watching soap operas. All right. I read that you fell in love with the idea of space and space exploration when you were five, watching Neil Armstrong walking on the moon. So, let me ask you to look back at the historical context and impact of that. So, the space race from 1957 to 1969 between the Soviet Union and the US was, in many ways, epic. It was a rapid sequence of dramatic events. First satellite to space, first human to space, first spacewalk, first uncrewed landing on the moon. Then, some failures, explosions, deaths on both sides actually. And then, the first human walking on the moon. What are some of the more inspiring moments or insights you take away from that time, that span of just 12 years?
Jeff Bezos
(00:04:51)
Well, I mean there’s so much inspiring there. One of the great things to take away from that, one of the great von Braun quotes is, “I have come to use the word impossible with great caution.” And so, that’s kind of the big story of Apollo is that going to the moon was literally an analogy that people used for something that’s impossible. “Oh, yeah, you’ll do that when men walk on the moon.” And of course, it finally happened. So, I think it was pulled forward in time because of the space race.

(00:05:31)
I think with the geopolitical implications and how much resource was put into it. At the peak, the country was spending 2% or 3% of GDP on the Apollo program. So much resource. I think it was pulled forward in time. We kind of did it ahead of when we, quote, unquote, should have done it. And so, in that way, it’s also a technical marvel. I mean it’s truly incredible. It’s the 20th century version of building the pyramids or something. It’s an achievement that, because it was pulled forward in time and because it did something that had previously been thought impossible, rightly deserves its place in the pantheon of great human achievements.
Lex Fridman
(00:06:17)
And of course, you named the rockets that Blue Origin is working on after some of the folks involved.
Jeff Bezos
(00:06:24)
Yeah.
Lex Fridman
(00:06:24)
I don’t understand why there isn’t a New Gagarin. Is that-
Jeff Bezos
(00:06:27)
There’s an American bias in the naming. I apologize-
Lex Fridman
(00:06:30)
That’s very strange.
Jeff Bezos
(00:06:31)
… Lex.
Lex Fridman
(00:06:31)
Was just asking for a friend, clarifying.
Jeff Bezos
(00:06:33)
I’m a big fan of Gagarin’s though. And in fact, I think his first words in space are incredible. He purportedly said, “My God, it’s blue.” And that really drives home. No one had seen the Earth from space. No one knew that we were on this blue planet. No one knew what it looked like from out there, and Gagarin was the first person to see it.
Lex Fridman
(00:07:01)
One of the things I think about is how dangerous those early days were for Gagarin, for Glenn, for everybody involved. How big of a risk they were all taking.
Jeff Bezos
(00:07:11)
They were taking huge risks. I’m not sure what the Soviets thought about Gagarin’s flight, but I think that the Americans thought that the Alan Shepard flight, the flight that New Shepard is named after, the first American in space, he went on his suborbital flight, they thought he had about a 75% chance of success. So, that’s a pretty big risk, a 25% risk.
Lex Fridman
(00:07:36)
It’s kind of interesting that Alan Shepard is not quite as famous as John Glenn. So, for people who don’t know, Alan Shepard is the first astronaut-
Jeff Bezos
(00:07:44)
The first American in space.
Lex Fridman
(00:07:46)
American in suborbital flight.
Jeff Bezos
(00:07:48)
Correct.
Lex Fridman
(00:07:48)
And then, the first orbital flight is-
Jeff Bezos
(00:07:51)
John Glenn is the first American to orbit the Earth. By the way, I have the most charming, sweet, incredible letter from John Glenn, which I have framed and hanging on my office wall.
Lex Fridman
(00:08:04)
What did he say?
Jeff Bezos
(00:08:04)
Where he tells me how grateful he is that we have named New Glenn after him. And he sent me that letter about a week before he died. And it’s really an incredible… It’s also a very funny letter. He’s writing and he says, “This is a letter about New Glenn from the original Glenn.” And he’s got a great sense of humor and he’s very happy about it and grateful. It’s very sweet.
Lex Fridman
(00:08:30)
Does he say, “P.S. Don’t mess this up,” or is that-
Jeff Bezos
(00:08:34)
No, he doesn’t.
Lex Fridman
(00:08:35)
“Make me look good.”
Jeff Bezos
(00:08:35)
He doesn’t do that. But John, wherever you are, we’ve got you covered.
Lex Fridman
(00:08:39)
Good. So, back to maybe the big picture of space. When you look up at the stars and think big, what do you hope is the future of humanity, hundreds, thousands of years from now out in space?
Jeff Bezos
(00:08:54)
I would love to see a trillion humans living in the solar system. If we had a trillion humans, we would have, at any given time, 1,000 Mozarts and 1,000 Einsteins. Then our solar system would be full of life and intelligence and energy. And we can easily support a civilization that large with all of the resources in the solar system.
Lex Fridman
(00:09:21)
So, what do you think that looks like? Giant space stations?
Jeff Bezos
(00:09:24)
Yeah, the only way to get to that vision is with giant space stations. The planetary surfaces are just way too small. So, I mean, unless you turn them into giant space stations or something. But yeah, we will take materials from the moon and from near-Earth objects and from the asteroid belt and so on, and we’ll build giant O’Neill style colonies and people will live in those. They have a lot of advantages over planetary surfaces. You can spin them to get normal Earth gravity. You can put them where you want them. I think most people are going to want to live near Earth, not necessarily in Earth orbit, but near Earth vicinity orbits. And so, they can move relatively quickly back and forth between their station and Earth. I think a lot of people, especially in the early stages, are not going to want to give up Earth altogether.
Lex Fridman
(00:10:24)
They go to Earth for vacation?
Jeff Bezos
(00:10:26)
Yeah, same way that you might go to Yellowstone National Park for vacation, people will… And people will get to choose where they live on Earth or whether they live in space, but they’ll be able to use much more energy and much more material resource in space than they would be able to use on Earth.
Lex Fridman
(00:10:45)
One of the interesting ideas you had is to move the heavy industry away from Earth. So, people sometimes have this idea that somehow space exploration is in conflict with the celebration of the planet Earth, that we should focus on preserving Earth. And basically, your idea is that space travel and space exploration is a way to preserve Earth.
Jeff Bezos
(00:11:06)
Exactly. We’ve sent robotic probes to all the planets, we know that this is the good one.
Lex Fridman
(00:11:17)
Not to play favorites or anything, but…
Jeff Bezos
(00:11:19)
Earth really is the good planet. It’s amazing. The ecosystem we have here, all of the life and the lush plant life and the water resources, everything. This planet is really extraordinary. And of course, we evolved on this planet, so of course it’s perfect for us, but it’s also perfect for all the advanced life forms on this planet, all the animals and so on. And so, this is a gem. We do need to take care of it. And as we enter the Anthropocene, as we humans have gotten so sophisticated and large and impactful, as we stride across this planet, that is going to… We want to use a lot of energy. We want to use a lot of energy per capita. We’ve gotten amazing things. We don’t want to go backwards.

(00:12:10)
If you think about the good old days, they’re mostly an illusion. In almost every way, life is better for almost everyone today than it was say 50 years ago or 100 years ago. We live better lives by and large than our grandparents did, and their grandparents did, and so on. And you can see that in global illiteracy rates, global poverty rates, global infant mortality rates. Almost any metric you choose, we’re better off than we used to be. And we get antibiotics and all kinds of lifesaving medical care, and so on, and so on. And there’s one thing that is moving backwards, and it’s the natural world.

(00:12:54)
So, it is a fact that 500 years ago, pre-industrial age, the natural world was pristine. It was incredible. And we have traded some of that pristine beauty for all of these other gifts that we have as an advanced society. And we can have both, but to do that, we have to go to space. And the most fundamental measure is energy usage per capita. You do want to continue to use more and more energy, it is going to make your life better in so many ways, but that’s not compatible ultimately with living on a finite planet. And so, we have to go out into the solar system. And really, you could argue about when you have to do that, but you can’t credibly argue about whether you have to do that.
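A back-of-the-envelope sketch can make the finite-planet point concrete. The figures below, current global power use, total sunlight hitting Earth, and a steady growth rate, are rough outside assumptions for illustration, not numbers from this conversation:

```python
# Illustrative sketch: how long can total energy use grow a few percent per year
# before it exceeds the entire solar flux intercepted by Earth?
# All figures below are rough assumptions for illustration only.

current_use_w = 2e13          # assumed current human power use, ~20 terawatts
solar_flux_w = 1.7e17         # assumed total sunlight hitting Earth, ~170,000 TW
growth_rate = 0.03            # assumed 3% annual growth in energy use

years = 0
use = current_use_w
while use < solar_flux_w:
    use *= 1 + growth_rate
    years += 1

print(f"At {growth_rate:.0%} growth, usage passes total incident sunlight in ~{years} years")
# Roughly 300 years under these assumptions, which is the sense in which
# continued growth in energy use eventually has to move off-planet.
```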
Lex Fridman
(00:13:49)
Eventually we have to do that.
Jeff Bezos
(00:13:51)
Exactly.
Lex Fridman
(00:13:52)
Well, you don’t often talk about it, but let me ask you on that topic about the Blue Ring and the Orbital Reef space infrastructure projects. What’s your vision for these?
Jeff Bezos
(00:14:03)
So, Blue Ring is a very interesting spacecraft that is designed to take up to 3,000 kilograms of payload up to geosynchronous orbit or in lunar vicinity. It has two different kinds of propulsion. It has chemical propulsion and it has electric propulsion. And so, you can use Blue Ring in a couple of different ways. You can slowly move, let’s say up to geosynchronous orbit using electric propulsion. That might take 100 days or 150 days, depending on how much mass you’re carrying. And reserve your chemical propulsion, so that you can change orbits quickly in geosynchronous orbit. Or you can use the chemical propulsion first to quickly get up to geosynchronous and then use your electrical propulsion to slowly change your geosynchronous orbit.

(00:14:55)
Blue Ring has a couple of interesting features. It provides a lot of services to these payloads. So, it could be one large payload or it can be a number of small payloads, and it provides thermal management, it provides electric power, it provides compute, provides communications. And so, when you design a payload for Blue Ring, you don’t have to figure out all of those things on your own. So, for example, radiation-tolerant compute is a complicated thing to do. And so, we have an unusually large amount of radiation-tolerant compute on board Blue Ring, and your payload can just use that when it needs to. So, it’s sort of all these services… It’s like a set of APIs. It’s a little bit like Amazon Web Services, but-
Lex Fridman
(00:15:51)
For space?
Jeff Bezos
(00:15:52)
… for space payloads that need to move about in Earth vicinity or lunar vicinity.
Lex Fridman
(00:15:57)
AWS for space. So, compute in space. So, you get a giant chemical rocket to get a payload out to orbit. And then, you have these admin services that show up, this Blue Ring thing that manages various things like compute?
Jeff Bezos
(00:16:13)
Exactly. And it can also provide transportation and move you around to different orbits.
Lex Fridman
(00:16:19)
Including humans, do you think?
Jeff Bezos
(00:16:21)
No, Blue Ring is not designed to move humans around. It’s designed to move payloads around. So, we’re also building a lunar lander, which is of course designed to land humans on the surface of the moon.
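To make the “set of APIs” analogy above concrete, here is a minimal, purely hypothetical sketch of what a shared payload-services interface could look like. The class names, fields, and numbers are invented for illustration; they are not Blue Ring’s actual interfaces or specifications:

```python
# Hypothetical sketch of "bus services" a host spacecraft could expose to payloads.
# Names and numbers are invented for illustration, not an actual Blue Ring API.
from dataclasses import dataclass

@dataclass
class PayloadAllocation:
    power_watts: float        # electric power budget granted to the payload
    downlink_kbps: float      # share of the communications link
    compute_cores: int        # share of radiation-tolerant compute
    thermal_watts: float      # heat the bus will reject for the payload

class HostBus:
    """Toy model of a host vehicle that provides shared services to payloads."""
    def __init__(self, power_watts, downlink_kbps, compute_cores, thermal_watts):
        self.remaining = PayloadAllocation(power_watts, downlink_kbps,
                                           compute_cores, thermal_watts)

    def allocate(self, request: PayloadAllocation) -> PayloadAllocation:
        r = self.remaining
        if (request.power_watts > r.power_watts or
                request.downlink_kbps > r.downlink_kbps or
                request.compute_cores > r.compute_cores or
                request.thermal_watts > r.thermal_watts):
            raise ValueError("request exceeds remaining bus resources")
        r.power_watts -= request.power_watts
        r.downlink_kbps -= request.downlink_kbps
        r.compute_cores -= request.compute_cores
        r.thermal_watts -= request.thermal_watts
        return request

# Usage: a payload asks the bus for power, comms, compute, and thermal rejection,
# instead of building all of those subsystems itself.
bus = HostBus(power_watts=2000, downlink_kbps=500, compute_cores=8, thermal_watts=1500)
grant = bus.allocate(PayloadAllocation(300, 50, 2, 250))
print(grant)
```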

Physics

Lex Fridman
(00:16:34)
I’m going to ask you about that, but let me ask you to just step back to the old days. You were at Princeton with aspirations to be a theoretical physicist.
Jeff Bezos
(00:16:45)
Yeah.
Lex Fridman
(00:16:47)
What attracted you to physics and why did you change your mind and not become… Why are you not Jeff Bezos, the famous theoretical physicist?
Jeff Bezos
(00:16:57)
So, I loved physics and I studied physics and computer science, and I was proceeding along the physics path. I was planning to major in physics, and I wanted to be a theoretical physicist. And the computer science was sort of something I was doing for fun. I really loved it and I was very good at the programming and doing those things, and I enjoyed all my computer science classes immensely. But I really was determined to be a theoretical physicist. That’s why I went to Princeton in the first place. It was definitely… And then, I realized I was going to be a mediocre theoretical physicist. And there were a few people in my classes, like in quantum mechanics and so on, who they could effortlessly do things that were so difficult for me. And I realized there are 1,000 ways to be smart.

(00:17:52)
Theoretical physics is one of those fields where only the top few percent actually move the state of the art forward. It’s one of those things where your brain has to be wired in a certain way. And there was a guy named… One of these people who convinced me, he didn’t mean to convince me, but just by observing him, he convinced me that I should not try to be a theoretical physicist. His name was Yosanta. And Yosanta was from Sri Lanka, and he was one of the most brilliant people I’d ever met. My friend Joe and I were working on a very difficult partial differential equations problem set one night. And there was one problem that we worked on for three hours and we made no headway whatsoever. And we looked up at each other at the same time and we said, “Yosanta.”

(00:18:49)
So, we went to Yosanta’s dorm room and he was there. He was almost always there. And we said, “Yosanta, we’re having trouble solving this partial differential equation. Would you mind taking a look?” And he said, “Of course.” By the way, he was the most humble, most kind person. And so, he looked at our problem and he stared at it for just a few seconds, maybe 10 seconds, and he said, “cosine.” And I said, “What do you mean, Yosanta? What do you mean cosine?” He said, “That’s the answer.” And I said, “No, no, no, come on.” And he said, “Let me show you.” And he took out some paper and he wrote down three pages of equations, everything canceled out, and the answer was cosine.

(00:19:30)
And I said, “Yosanta, did you do that in your head?” And he said, “Oh, no. That would be impossible. A few years ago I solved a similar problem and I could map this problem onto that problem, and then it was immediately obvious that the answer was cosine.” You have an experience like that, you realize maybe being a theoretical physicist isn’t what the universe wants you to be. And so, I switched to computer science and that worked out really well for me. I enjoy it. I still enjoy it today.
Lex Fridman
(00:20:07)
Yeah, there’s a particular kind of intuition you need to be a great physicist, intuition applied to physics.
Jeff Bezos
(00:20:12)
I think the mathematical skill required today is so high. You have to be a world-class mathematician to be a successful theoretical physicist today. And you probably need other skills too, intuition, lateral thinking and so on. But without just top-notch math skills, you’re unlikely to be successful.
Lex Fridman
(00:20:39)
And visualization skill, you have to be able to really do these kinds of thought experiments if you want truly great creativity. Actually Walter Isaacson writes about you and puts you on the same level as Einstein and-
Jeff Bezos
(00:20:53)
Well, that’s very kind. I’m an inventor. If you want to boil down what I am, I’m really an inventor. And I look at things and I can come up with atypical solutions. And then, I can create 100 such atypical solutions for something, 99 of them may not survive scrutiny, but one of those 100 is like, “Hmm, maybe that might work.” And then, you can keep going from there. So, that kind of lateral thinking, that kind of inventiveness in a high-dimensionality space where the search space is very large, that’s where my inventive skills come… I self-identify as an inventor more than anything else.
Lex Fridman
(00:21:43)
Yeah. And he describes in all kinds of different ways, Walter Isaacson does, that creativity combined with childlike wonder that you’ve maintained still to this day, all of that combined together. If you were to study your own brain, introspect, how do you think? What’s your thinking process like? We’ll talk about the writing process of putting it down on paper, which is quite rigorous and famous at Amazon. But when you sit down, maybe alone, maybe with others, and think through this high-dimensional space and look for creative solutions, creative paths forward, is there something you could say about that process?
Jeff Bezos
(00:22:26)
It’s such a good question, and I honestly don’t know how it works. If I did, I would try to explain it. I know it involves lots of wandering, so when I sit down to work on a problem, I know I don’t know where I’m going. So, to go in a straight line… To be efficient… Efficiency and invention are sort of at odds, because real invention, not incremental improvement… Incremental improvement is so important in every endeavor; in everything you do, you have to work hard on also just making things a little bit better. But I’m talking about real invention, real lateral thinking, and that requires wandering, and you have to give yourself permission to wander.

(00:23:11)
I think a lot of people feel like wandering is inefficient. And when I sit down at a meeting, I don’t know how long the meeting is going to take if we’re trying to solve a problem, because if I did, then I’d know there’s some kind of straight line that we’re drawing to the solution. The reality is we may have to wander for a long time. And I do like group invention. I think there’s really nothing more fun than sitting at a whiteboard with a group of smart people and spitballing and coming up with new ideas and objections to those ideas, and then solutions to the objections, and going back and forth. So, sometimes you wake up with an idea in the middle of the night and sometimes you sit down with a group of people and go back and forth, and both things are really pleasurable.
Lex Fridman
(00:24:14)
And when you wander, I think one key thing is to notice a good idea. And maybe to notice the kernel of a good idea. I’ll maybe pull at that string. Because I don’t think good ideas come fully-formed.
Jeff Bezos
(00:24:31)
100% right. In fact, when I come up with what I think is a good idea and it survives the first level of scrutiny that I do in my own head, and I’m ready to tell somebody else about the idea, I will often say, “Look, it is going to be really easy for you to find objections to this idea, but work with me.”
Lex Fridman
(00:24:53)
There’s something there.
Jeff Bezos
(00:24:54)
There’s something there. And that is intuition, because it’s really easy to kill new ideas in the beginning because there’s so many easy objections to them. So, you need to kind of forewarn people and say, “Look, I know it’s going to take a lot of work to get this to a fully-formed idea. Let’s get started on that. It’ll be fun.”
Lex Fridman
(00:25:17)
So, you got that ability to say cosine in you somewhere after all, maybe not in math, but-
Jeff Bezos
(00:25:23)
In a different domain.
Lex Fridman
(00:25:24)
Yeah.
Jeff Bezos
(00:25:25)
There are 1,000 ways to be smart, by the way, and that is a really… When I go around and I meet people, I’m always looking for the way that they’re smart. And you find that’s one of the things that makes the world so interesting and fun is that it’s not like IQ is a single dimension. There are people who are smart in such unique ways.
Lex Fridman
(00:25:53)
Yeah, you just gave me a good response when somebody calls me an idiot on the internet. “You know, there’s 1,000 ways to be smart, sir.”
Jeff Bezos
(00:26:01)
Well, they might tell you, “Yeah, but there are a million ways to be dumb.”

New Glenn

Lex Fridman
(00:26:04)
Yeah, right. I feel like that’s a Mark Twain quote. Okay. All right. You gave me an amazing tour of the Blue Origin rocket factory and launch complex at the historic Cape Canaveral. That’s where New Glenn, the big rocket we talked about, is being built and will launch. Can you explain what the New Glenn rocket is and tell me some interesting technical aspects of how it works?
Jeff Bezos
(00:26:29)
Sure. New Glenn is a very large heavy-lift launch vehicle. It’ll take about 45 metric tons to LEO, very large class. It’s about half the thrust, a little more than half the thrust of the Saturn V rocket. So, it’s about 3.9 million pounds of thrust on liftoff. The booster has seven BE-4 engines. Each engine generates a little more than 550,000 pounds of thrust. The engines are fueled by liquefied natural gas, LNG as the fuel, and LOX as the oxidizer. The cycle is an oxygen-rich staged combustion cycle. It’s a cycle that was really pioneered by the Russians. It’s a very good cycle. And that engine is also going to power the first stage of the Vulcan rocket, which is the United Launch Alliance rocket. Then the second stage of New Glenn is powered by two BE-3U engines, which is an upper-stage variant of our New Shepard liquid hydrogen engine.

(00:27:44)
So, the BE-3U has 160,000 pounds of thrust, so two of those, 320,000 pounds of thrust. And hydrogen is a very good propellant for upper stages because it has very high ISP. It’s not a great propellant in my view for booster stages, because the stages then get physically so large. Hydrogen has very high ISP, but liquid hydrogen is not dense at all. So, to store liquid hydrogen, if you need to store many thousands of pounds of liquid hydrogen, your liquid hydrogen tank gets very large. So, you get more benefit from the higher ISP, the specific impulse, on the second stage. And that stage carries less propellant, so you don’t get such geometrically gigantic tanks. The Delta IV is an example of a vehicle that is all hydrogen. The booster stage is also hydrogen, and I think that it’s a very effective vehicle, but it never was very cost-effective. So, it’s operationally very capable but not very cost-effective.
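As a quick sanity check on the figures quoted above, the arithmetic can be written out in a few lines. This is only an illustrative sketch using the approximate numbers mentioned in the conversation, plus rough propellant densities that are outside assumptions:

```python
# Quick arithmetic on the figures quoted above (all approximate).
be4_thrust_lbf = 550_000          # per BE-4, booster stage
be3u_thrust_lbf = 160_000         # per BE-3U, upper stage

booster_thrust = 7 * be4_thrust_lbf      # seven BE-4s on the booster
upper_thrust = 2 * be3u_thrust_lbf       # two BE-3Us on the second stage

print(f"Booster liftoff thrust: ~{booster_thrust/1e6:.2f} million lbf")  # ~3.85M, quoted as ~3.9M
print(f"Upper stage thrust:     ~{upper_thrust:,} lbf")                  # 320,000

# Why hydrogen upstairs but not downstairs: liquid hydrogen has high specific
# impulse but very low density, so booster-sized hydrogen tanks get physically huge.
# Rough propellant densities in kg/m^3, for illustration only:
density = {"liquid hydrogen": 71, "liquefied natural gas": 430, "RP-1 kerosene": 810}
for name, rho in density.items():
    print(f"{name:>22}: {rho} kg/m^3")
```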
Lex Fridman
(00:28:56)
So, size is also costly?
Jeff Bezos
(00:28:58)
Size is costly. So, it’s interesting. Rockets love to be big. Everything works better.
Lex Fridman
(00:29:05)
What do you mean by that? You’ve told me that before. It sounds epic, but what does it mean?
Jeff Bezos
(00:29:10)
I mean, when you look at the physics of rocket engines, and also when you look at parasitic mass… Let’s say you have an avionics system, so you have a guidance and control system, that is going to be about the same mass and size for a giant rocket as it is going to be for a tiny rocket. And so, that’s just parasitic mass that is very consequential if you’re building a very small rocket, but is trivial if you’re building a very large rocket. So, you have the parasitic mass thing.

And then if you look at, for example, rocket engines have turbo pumps. They have to pressurize the fuel and the oxidizer up to a very high pressure level in order to inject it into the thrust chamber where it burns. And those pumps, all rotating machines, in fact, get more efficient as they get larger. So, really tiny turbo pumps are very challenging to manufacture, and any kind of gaps between the housing, for example, and the rotating impeller that pressurizes the fuel, there has to be some gap there. You can’t have those parts scraping against one another, and those gaps drive inefficiencies. And so, if you have a very large turbo pump, those gaps in percentage terms end up being very small. And so, there’s a bunch of things that you end up loving about having a large rocket and that you end up hating for a small rocket.

But there’s a giant exception to this rule, and it is manufacturing. So, manufacturing large structures is very, very challenging. It’s a pain in the butt. And so, if you’re making a small rocket engine, you can move all the pieces by hand, you could assemble it on a table, one person can do it. You don’t need cranes and heavy lift operations and tooling and so on and so on. When you start building big objects, infrastructure, civil infrastructure, just like the launchpad and all this we went and visited, I took you to the launchpad. And you can see it’s so monumental.
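The parasitic-mass point lends itself to a toy calculation. The masses below are invented purely for illustration; they are not Blue Origin figures:

```python
# Toy illustration of why fixed "parasitic" mass (avionics, etc.) hurts small
# rockets far more than big ones. All masses are invented for illustration.
avionics_kg = 100   # roughly the same box whether the rocket is tiny or huge

for name, liftoff_mass_kg in [("small rocket", 10_000), ("large rocket", 1_000_000)]:
    fraction = avionics_kg / liftoff_mass_kg
    print(f"{name}: avionics are {fraction:.2%} of liftoff mass")

# small rocket: avionics are 1.00% of liftoff mass
# large rocket: avionics are 0.01% of liftoff mass
# The same logic applies to turbopump clearances: a fixed-size gap is a large
# fraction of a tiny impeller's flow path and a negligible fraction of a big one's.
```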
Lex Fridman
(00:31:27)
Yeah, it is.
Jeff Bezos
(00:31:28)
And so, just these things become major undertakings, both from an engineering point of view, but also from a construction and cost point of view.
Lex Fridman
(00:31:37)
And even the foundation of the launchpad. I mean, this is Florida, isn’t it swamp land? How deep do you have to go?
Jeff Bezos
(00:31:44)
At Cape Canaveral, in fact, most launch pads are on beaches somewhere on the ocean side because you want to launch over water for safety reasons. Yes, you have to drive pilings, dozens and dozens and dozens of pilings, 50, 100, 150 feet deep to get enough structural integrity for these very large… Yes, these turn into major civil engineering projects.
Lex Fridman
(00:32:15)
I just have to say everything about that factory is pretty badass. You said tooling, the bigger it gets, the more epic it is.
Jeff Bezos
(00:32:22)
It does make it epic. It’s fun to look at. It’s extraordinary.
Lex Fridman
(00:32:26)
It’s humbling also because humans are so small compared to it.
Jeff Bezos
(00:32:29)
We are building these enormous machines that are harnessing enormous amounts of chemical power in very, very compact packages. It’s truly extraordinary.
Lex Fridman
(00:32:44)
But then, there’s all the different components and the materials involved. Is there something interesting that you can describe about the materials that comprise the rocket? So, it has to be as light as possible, I guess, whilst withstanding the heat and the harsh conditions?
Jeff Bezos
(00:33:03)
Yeah, I play a little game sometimes with other rocket people that I run into, where I say, “What are the things that would amaze the 1960s engineers? What’s changed?” Because surprisingly, some of rocketry’s greatest hits have not changed. They would recognize immediately a lot of what we do today, and it’s exactly what they pioneered back in the ’60s. But a few things have changed. The use of carbon composites is very different today. We can build very sophisticated… You saw our carbon tape-laying machine that builds the giant fairings, and we can build these incredibly light, very stiff fairing structures out of carbon composite material that they could not have dreamed of. The efficiency, the structural efficiency of that material is so high compared to any metallic material you might use or anything else. So that’s one.

(00:34:12)
Aluminum-lithium and the ability to friction stir weld aluminum-lithium. Do you remember the friction stir welding that I showed you?
Lex Fridman
(00:34:20)
Yes. It’s incredible.
Jeff Bezos
(00:34:21)
This is a remarkable technology that was invented decades ago, but has become very practical over just the last couple of decades. And instead of using heat to weld two pieces of metal together, it literally stirs the two pieces. There’s a pin that rotates at a certain rate, and you put that pin between the two plates of metal that you want to weld together and then you move it at a very precise speed. And instead of heating the material, it heats it a little bit because of friction, but not very much, you can literally, immediately after welding with friction stir welding, touch the material and it’s just barely warm. It literally stirs the molecules together. It’s quite extraordinary.
Lex Fridman
(00:35:06)
Relatively low temperature. And I guess with high temperatures, that makes it a weak point.
Jeff Bezos
(00:35:11)
Exactly. So …
Lex Fridman
(00:35:13)
Amazing.
Jeff Bezos
(00:35:13)
… with traditional welding techniques, whatever the underlying strength characteristics of the material are, you end up with weak regions where you weld. And with friction stir welding, the welds are just as strong as the bulk material. So it really allows you… Let’s say you’re building a tank that you’re going to pressurize, a large liquid natural gas tank for our booster stage, for example. If you are welding that with traditional methods, you have to size those weld lands, the thickness of those pieces, with that knockdown for whatever damage you’re doing with the weld, and that’s going to add a lot of weight to that tank.
Lex Fridman
(00:35:54)
Even just looking at the fairings, the result of that, the complex shape that it takes and what it’s supposed to do is incredible, because some people don’t know, it’s on top of the rocket and it’s going to fall away. That’s its task, but it has to stay strong sometimes and then disappear when it needs to…
Jeff Bezos
(00:36:14)
That’s right.
Lex Fridman
(00:36:15)
… which is a very difficult task.
Jeff Bezos
(00:36:17)
Yes. When you need something that needs to have 100% integrity until it needs to have 0% integrity, it needs to stay attached until it’s ready to go away, and then when it goes away, it has to go away completely. You use explosive charges for that and so it’s a very robust way of separating structure when you need to.
Lex Fridman
(00:36:40)
Exploding.
Jeff Bezos
(00:36:41)
Yeah, little tiny bits of explosive material and it will sever the whole connection.
Lex Fridman
(00:36:49)
So if you want to go from 100% structural integrity to zero as fast as possible, you use explosives.
Jeff Bezos
(00:36:58)
Use explosives.
Lex Fridman
(00:36:59)
The entirety of this thing is so badass. Okay, so we’re back to the two stages. So the first stage is reusable.
Jeff Bezos
(00:37:06)
Yeah. Second stage is expendable. Second stage is liquid hydrogen, liquid oxygen. So we get to take advantage of the higher specific impulse. The first stage lands downrange on a landing platform in the ocean, comes back for maintenance, and gets ready to do the next mission.
Lex Fridman
(00:37:27)
There’s a million questions, but also is there a path towards reusability for the second stage?
Jeff Bezos
(00:37:32)
There is and we know how to do that. Right now, we’re going to work on manufacturing that second stage to make it as inexpensive as possible. There are two paths for a second stage: make it reusable, or work really hard to make it inexpensive so you can afford to expend it. And in that trade, it’s actually not obvious which one is better.
Lex Fridman
(00:38:00)
Even in terms of cost, like time, cost-
Jeff Bezos
(00:38:01)
Even in terms of … And I’m talking about cost. Space, getting into orbit is a solved problem. We solved it back in the ’50s and ’60s.
Lex Fridman
(00:38:11)
You’re making it sound easy.
Jeff Bezos
(00:38:13)
The only interesting problem is dramatically reducing the cost of access to orbit, which is, if you can do that, you open up a bunch of new endeavors that lots of start-up companies and everybody else can do. One of our missions is to be part of this industry and lower the cost to orbit, so that there can be a renaissance, a golden age of people doing all kinds of interesting things in space.
Lex Fridman
(00:38:47)
I like how you said getting to orbit is a solved problem. It’s just the only interesting thing is reducing the cost. You know how you can describe every single problem facing human civilization that way? The physicists would say, “Everything is a solved problem. We’ve solved everything. The rest is just,” what did Rutherford say, “stamp collecting. It’s just the details.” Some of the greatest innovations and inventions and brilliance is in that cost reduction stage, right? And you’ve had a long career of cost reduction.
Jeff Bezos
(00:39:18)
For sure. What does cost reduction really mean? It means inventing a better way.
Lex Fridman
(00:39:24)
Yeah, exactly.
Jeff Bezos
(00:39:25)
Right? And when you invent a better way, you make the whole world richer. So whatever it was, I don’t know how many thousands of years ago, somebody invented the plow. And when they invented the plow, they made the whole world richer because they made farming less expensive. And so it is a big deal to invent better ways. That’s how the world gets richer.
Lex Fridman
(00:39:48)
So what are some of the biggest challenges on the manufacturing side, on the engineering side that you’re facing in working to get to the first launch of New Glenn?
Jeff Bezos
(00:40:01)
The first launch is one thing and we’ll do that in 2024, coming up in this coming year. The real thing that’s the bigger challenge is making sure that our factory is efficiently manufacturing at rate. So rate production, so consider if you want to launch New Glenn 24 times a year, you need to manufacture an upper stage, since they’re expendable, twice a month. You need to do one every two weeks. So you need to have all of your manufacturing facilities and processes and inspection techniques and acceptance tests and everything operating at rate. And rate manufacturing is at least as difficult as designing the vehicle in the first place. And the same thing applies to the engines: every upper stage has two BE-3U engines.

(00:41:03)
So those engines, if you’re going to launch the vehicle twice a month, you need four engines a month. So you need an engine every week. That engine needs to be produced at rate, and there’s all of the things that you need to do that, all the right machine tools, all the right fixtures, the right people, process, etcetera. So it’s one thing to build a first article, right? To launch New Glenn for the first time, you need to produce a first article, but that’s not the hard part. The hard part is everything that’s going on behind the scenes to build a factory that can produce New Glenns at rate.
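The cadence he is describing follows directly from the numbers stated above; a few lines of arithmetic make it explicit (illustrative only):

```python
# The rate-manufacturing arithmetic described above, written out.
launches_per_year = 24
weeks_per_year = 52
engines_per_upper_stage = 2     # two BE-3U engines per expendable upper stage

upper_stages_per_year = launches_per_year           # one expendable stage per launch
weeks_per_stage = weeks_per_year / upper_stages_per_year
engines_per_year = upper_stages_per_year * engines_per_upper_stage
weeks_per_engine = weeks_per_year / engines_per_year

print(f"One upper stage roughly every {weeks_per_stage:.1f} weeks")   # ~2.2 weeks
print(f"One BE-3U engine roughly every {weeks_per_engine:.1f} weeks") # ~1.1 weeks
```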
Lex Fridman
(00:41:47)
So the first one is produced in a way that enables the production of the second and third and the fourth and the fifth and sixth-
Jeff Bezos
(00:41:53)
You could think of the first article as pushing, it pushes all of the rate manufacturing technology along. In other words, it’s the test article in a way that’s testing out your manufacturing technologies.
Lex Fridman
(00:42:13)
The manufacturing is the big challenge.
Jeff Bezos
(00:42:15)
Yes. I don’t want to make it sound like any of it is easy. The people who are designing the engines and all this, all of this is hard for sure, but the challenge right now is driving really hard to get to rate manufacturing and to do that in an efficient way, again back to our cost point. If you get to rate manufacturing in an inefficient way, you haven’t really solved the cost problem and maybe you haven’t really moved the state of the art forward. All this has to be about moving the state of the art forward. There are easier businesses to do. I always tell people, “Look, if you are trying to make money, start a salty snack food company or something.”
Lex Fridman
(00:42:56)
I’m going to write that idea down.
Jeff Bezos
(00:43:01)
Make the Lex Fridman Potato Chips.
Lex Fridman
(00:43:04)
Right. Don’t say it. People are going to steal it. But yeah, it’s hard.
Jeff Bezos
(00:43:10)
Do you see what I’m saying? There’s nothing easy about this business, but it’s its own reward. It’s fascinating, it’s worthwhile, it’s meaningful. I don’t want to pick on salty snack food companies, but I think it’s less meaningful. At the end of the day, you’re not going to have accomplished something amazing …
Lex Fridman
(00:43:33)
Yeah, there’s-
Jeff Bezos
(00:43:33)
… even if you do make a lot of money on it.
Lex Fridman
(00:43:35)
Yeah, there’s something fundamentally different about the “business of space exploration.”
Jeff Bezos
(00:43:41)
Yeah, for sure.
Lex Fridman
(00:43:42)
It’s a grand project of humanity.
Jeff Bezos
(00:43:44)
Yes, it’s one of humanity’s grand challenges, and especially as you look at going to the moon and going to Mars and building giant O’Neill colonies and unlocking all the things. I won’t live long enough to see the fruits of this, but the fruits of this come from building a road to space, getting the infrastructure. I’ll give you an analogy. When I started Amazon, I didn’t have to develop a payment system. It already existed. It was called the credit card. I didn’t have to develop a transportation system to deliver the packages. It already existed. It was called the Postal Service and Royal Mail and Deutsche Post and so on. So all this heavy lifting infrastructure was already in place and I could stand on its shoulders. And that’s why, when you look at the internet …

(00:44:40)
And by the way, another giant piece of infrastructure that was around in the early days, I’m taking you back to 1994, people were using dial-up modems and the internet was piggybacking on top of the long-distance phone network. That’s how the internet… That’s how people were accessing servers and so on. And again, if that hadn’t existed, it would’ve been hundreds of billions of CapEx to put that out there. No startup company could have done that. And so what you see, if you look at the dynamism in the internet space over the last 20 years, is that two kids in a dorm room could start an internet company that could be successful and do amazing things, because they didn’t have to build heavy infrastructure. It was already there. And that’s what I want to do. I take my Amazon winnings and use that to build heavy infrastructure so that the next generation, the generation that’s my children and their children, those generations can then use that heavy infrastructure, and then there’ll be space entrepreneurs who start in their dorm room. That will be a marker of success: when you can have a really valuable space company started in a dorm room, then we know that we’ve built enough infrastructure so that ingenuity and imagination can really be unleashed. I find that very exciting.
Lex Fridman
(00:46:11)
They will, of course, as kids do, take all of this hard infrastructure ability for granted.
Jeff Bezos
(00:46:16)
Of course.
Lex Fridman
(00:46:18)
That entrepreneurial spirit.
Jeff Bezos
(00:46:19)
That’s an inventor’s greatest dream, is that their inventions are so successful that they are one day taken for granted. Nobody thinks of Amazon as an invention anymore. Nobody thinks of customer reviews as an invention. We pioneered customer reviews, but now they’re so commonplace. Same thing with one-click shopping and so on, but that’s a compliment. You invent something that’s so used, so beneficially used by so many people that they take it for granted.
Lex Fridman
(00:46:49)
I don’t know about nobody. Every time I use Amazon, I’m still amazed, “How does this work, the logistics, the Wazuh?”
Jeff Bezos
(00:46:55)
Well, that proves you’re a very curious explorer.
Lex Fridman
(00:46:57)
All right, all right, back to the rocket. Timeline, you said 2024. As it stands now, are both the first test launch and the launch of the ESCAPADE explorers to Mars still possible in 2024?
Jeff Bezos
(00:47:11)
In 2024?
Lex Fridman
(00:47:12)
Yeah.
Jeff Bezos
(00:47:13)
Yeah, I think so. For sure, the first launch and then we’ll see if ESCAPADE goes on that or not. I think that the first launch for sure and I hope ESCAPADE too.
Lex Fridman
(00:47:23)
Hope-
Jeff Bezos
(00:47:24)
Well, I just don’t know which mission it’s actually going to be slated on. So we also have other things that might go on that first mission.
Lex Fridman
(00:47:31)
Oh, I got it. But you’re optimistic that the launches will still-
Jeff Bezos
(00:47:35)
Oh, the first launch. I’m very optimistic that the first launch of New Glenn will be in 2024 and I’m just not 100% certain what payload will be on that first launch.
Lex Fridman
(00:47:44)
Are you nervous about it?
Jeff Bezos
(00:47:46)
Are you kidding? I’m extremely nervous about it.
Lex Fridman
(00:47:51)
Oh, man.
Jeff Bezos
(00:47:52)
100%. Every launch I go to, for New Shepard, for other vehicles too, I’m always nervous for these launches. But yes, for sure, a first launch, to not be nervous about that would be some sign of derangement, I think.
Lex Fridman
(00:48:09)
Well, I got to visit the launch, man. It’s pretty … I mean, it’s epic.
Jeff Bezos
(00:48:14)
We have done a tremendous amount of ground testing, a tremendous amount of simulation. So a lot of the problems that we might find in flight have been resolved, but there are some problems you can only find in flight. So cross your fingers. I guarantee you you’ll have fun watching it no matter what happens.
Lex Fridman
(00:48:37)
100%. When the thing is fully assembled, it comes up-
Jeff Bezos
(00:48:41)
Yeah, the transporter erector.
Lex Fridman
(00:48:44)
It’s the erector, yeah.
Jeff Bezos
(00:48:45)
Just the transporter erector for a rocket of this scale is extraordinary.
Lex Fridman
(00:48:49)
That’s an incredible machine.
Jeff Bezos
(00:48:50)
The vehicle travels out horizontally and then comes up and-
Lex Fridman
(00:48:57)
Over a few hours?
Jeff Bezos
(00:48:58)
Yeah, it’s a beautiful thing to watch.
Lex Fridman
(00:49:00)
Speaking of which, if that makes you nervous, I don’t know if you remember, but you were aboard New Shepard on its first crewed flight. How was that experience? Were you terrified then?
Jeff Bezos
(00:49:20)
Strangely, I wasn’t.
Lex Fridman
(00:49:22)
When you ride the rocket, wasn’t nerve wracking? Okay.
Jeff Bezos
(00:49:24)
It’s true. I’ve watched other people riding the rocket and I’m more nervous than when I was inside the rocket myself. It was a difficult conversation to have with my mother when I told her I was going to go on the first one. And not only was I going to go, but I was going to bring my brother too. This is a tough conversation to have with a mom.
Lex Fridman
(00:49:44)
There’s a long pause when you told her.
Jeff Bezos
(00:49:47)
She’s like, “Both of you?” It was an incredible experience, and we were laughing inside the capsule and we weren’t nervous. The people on the ground were very nervous for us. Actually, one of the most emotionally powerful parts of the experience happened even before the flight. At 4:30 in the morning, my brother and I are getting ready to go to the launch site, and Lauren is going to take us there in her helicopter, and we’re getting ready to leave. And we go outside, outside the ranch house there in West Texas where the launch facility is, and all of our family, my kids and my brother’s kids and our parents and close friends, are assembled there and they’re saying goodbye to us. But maybe they think they’re saying goodbye to us forever. We might not have felt that way, but it was obvious from their faces how nervous they were, that they felt that way. And it was powerful because it allowed us to see… It was almost like attending your own memorial service or something, like you could feel how loved you were in that moment, and it was really amazing.
Lex Fridman
(00:51:12)
Yeah, and there’s just an epic nature to it too.
Jeff Bezos
(00:51:17)
The ascent, the floating in zero gravity. I’ll tell you something very interesting, zero gravity feels very natural. I don’t know if it’s because it’s like a return to the womb or-
Lex Fridman
(00:51:31)
You just confirmed you’re an alien, but that’s all. I think that’s what you just said.
Jeff Bezos
(00:51:36)
It feels so natural to be in zero G. It was really interesting. And then there’s what people talk about, the overview effect of seeing Earth from space. I had that feeling very powerfully. I think everyone did. You see how fragile the Earth is. If you’re not an environmentalist, it will make you one. The great Jim Lovell quote, he looked back at the Earth from space and he said he realized, “You don’t go to heaven when you die. You go to heaven when you’re born.” That’s the feeling that people get when they’re in space. You see all this blackness, all this nothingness, and there’s one gem of life and it’s Earth.
Lex Fridman
(00:52:15)
It is a gem. You’ve talked a lot about decision making throughout your time with Amazon. What was that decision like to be the first to ride New Shepard? Just before you talk to your mom, the pros and cons? Actually, as one human being, as a leader of a company on all fronts, what was that decision making like?
Jeff Bezos
(00:52:43)
I decided that … First of all, I knew the vehicle extremely well. I know the team who built it. I know the vehicle. I’m very comfortable with the escape system. We put as much effort into the escape system on that vehicle as we put into all the rest of the vehicle combined. It’s one of the hardest pieces of engineering in the entire New Shepard architecture.
Lex Fridman
(00:53:10)
Can you actually describe what do you mean by escape system? What’s involved?
Jeff Bezos
(00:53:13)
We have a solid rocket motor in the base of the crew capsule, so that if anything goes wrong on ascent, while the main rocket engine is firing, we can ignite this solid rocket motor in the base of the crew capsule and escape from the booster. It’s a very challenging system to build, design, validate, test, all of these things. It is the reason that I am comfortable letting anyone go on New Shepard. So the booster is as safe and reliable as we can make it, but we are harnessing … Whenever you’re talking about rocket engines, I don’t care what rocket engine you’re talking about, you’re harnessing such vast power in such a small compact geometric space. The power density is so enormous that it is impossible to ever be sure that nothing will go wrong.

(00:54:18)
And so the only way to improve safety is to have an escape system. And historically, human-rated rockets have had escape systems. Only the space shuttle did not, but Apollo had one. All of the previous Gemini, etcetera, they all had escape systems. And we have on New Shepard an unusual escape … Most escape systems are towers. We have a pusher escape system. So the solid rocket motor is actually embedded in the base of the crew capsule and it pushes and it’s reusable in the sense that, if we don’t use it, so if we have a nominal mission, we land with it. The tower systems have to be ejected at a certain point in the mission and so they get wasted even in a nominal mission.

(00:55:09)
And so again, cost really matters on these things, so we figured out how to have the escape system be reusable. In the event that it’s not used, we can reuse it, and have it be a pusher system. It’s a very sophisticated thing. So I knew these things. You asked me about my decision to go, and so I know the vehicle very well, I know the people who designed it, I have great trust in them and in the engineering that we did. And I thought to myself, “Look, if I am not ready to go, then I wouldn’t want anyone to go.” A tourism vehicle has to be designed, in my view, to be as safe as one can make it. You can’t make it perfectly safe. It’s impossible, but you have… People will do things. People take risk. They climb mountains, they skydive, they do deep underwater scuba diving and so on. People are okay taking risk. You can’t eliminate the risk, but because it’s a tourism vehicle, you have to do your utmost to eliminate those risks.

(00:56:16)
And I felt very good about the system. I think it’s one of the reasons I was so calm inside and maybe others weren’t as calm. They didn’t know as much about it as I did.
Lex Fridman
(00:56:26)
Who was in charge of engaging the escape system? Did you have-
Jeff Bezos
(00:56:28)
It’s automated. The escape system is …
Lex Fridman
(00:56:31)
Okay. I was visualizing-
Jeff Bezos
(00:56:33)
… completely automated. Automated is better because it can react so much faster.
Lex Fridman
(00:56:38)
Okay. So yeah, for tourism rockets, safety is a huge, huge, huge priority. For space exploration also, but a delta less.
Jeff Bezos
(00:56:46)
Yes. I think if you’re doing … There are human activities where we tolerate more risk if you’re saving somebody’s life, if you are engaging in real exploration. These are things where I personally think we would accept more risk in part because you have to.
Lex Fridman
(00:57:09)
Is there a part of you that’s frustrated by the rate of progress in Blue Origin?
Jeff Bezos
(00:57:15)
Blue Origin needs to be much faster. And it’s one of the reasons that I left my role as the CEO of Amazon a couple of years ago. I wanted to come in, and Blue Origin needs me right now. And so I had always… When I was the CEO of Amazon, my point of view on this was, “If I’m the CEO of a publicly traded company, it’s going to get my full attention.” It’s just how I think about things. It was very important to me. I felt I had an obligation to all the stakeholders at Amazon to do that. And so I turned the CEO role over, I’m still the executive chair there, and the primary reason I did that is so that I could spend time on Blue Origin, adding some energy, some sense of urgency. We need to move much faster, and we’re going to.
Lex Fridman
(00:58:14)
What are the ways to speed it up? You’ve talked about a lot of different ways at Amazon of removing barriers to progress, distributing decision-making, making everybody autonomous and self-reliant, all those kinds of things. Does that apply at Blue Origin, or is-
Jeff Bezos
(00:58:37)
It does apply. I’m leading this directly. We’re going to become the world’s most decisive company across any industry. And so at Amazon, ever since the beginning, I said, “We’re going to become the world’s most customer-obsessed company.” And no matter the industry, one day, people are going to come to Amazon from the healthcare industry and want to know, “How are you so customer-obsessed? How do you not just pay lip service to that, but actually do that?” All different industries should want to come study us to see how we accomplish that. And the analogous thing at Blue Origin, the thing that will help us move faster, is that we’re going to become the world’s most decisive company. We’re going to get really good at taking appropriate technology risk and making those decisions quickly, being bold on those things, and having the right culture that supports that.

(00:59:40)
You need people to be ambitious, technically ambitious, “If there are five ways to do something, we’ll study them, but let’s study them very quickly and make a decision.” We can always change our mind. Changing your mind, I talk about one-way doors and two-way doors, most decisions are two-way doors.
Lex Fridman
(01:00:03)
Can you explain that because I love that metaphor?
Jeff Bezos
(01:00:06)
If you make the wrong decision, if it’s a two-way door decision, you pick a door, you walk out and you spend a little time there. If it turns out to be the wrong decision, you can come back in and pick another door. Some decisions are so consequential and so important and so hard to reverse that they really are one-way door decisions. You go in that door, you’re not coming back. And those decisions have to be made very deliberately, very carefully. If you can think of yet another way to analyze the decision, you should slow down and do that. So when I was CEO of Amazon, I often found myself in the position of being the chief slowdown officer, because somebody would be bringing me a one-way door decision and I would say, “Okay, I can think of three more ways to analyze that. So let’s go do that, because we are not going to be able to reverse this one easily. Maybe you can reverse it, but it’s going to be very costly and very time-consuming. We really have to get this one right from the beginning.”

(01:01:10)
And what happens, unfortunately, in companies, what can happen, is that you have a one-size-fits-all decision-making process where you end up using the heavyweight process on all decisions …
Lex Fridman
(01:01:28)
For everything, yeah.
Jeff Bezos
(01:01:29)
… Including the lightweight ones, the two-way door decisions. Two-way door decisions should mostly be made by single individuals or by very small teams deep in the organization. And one-way door decisions are the irreversible ones. Those are the ones that should be elevated up to the senior-most executives who should slow them down and make sure that the right thing is being done.
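As a playful way to summarize the framework, here is a purely illustrative sketch of the routing logic he describes. This is not an Amazon or Blue Origin process; the threshold and names are invented for illustration:

```python
# Illustrative sketch of the "one-way vs. two-way door" routing logic described above.
from enum import Enum

class DoorType(Enum):
    TWO_WAY = "reversible"      # cheap to undo: decide fast, low in the org
    ONE_WAY = "irreversible"    # costly to undo: slow down, escalate

def route_decision(cost_to_reverse: float, decision_value: float) -> str:
    """Toy heuristic: if undoing the decision would cost a large fraction of its
    value, treat it as a one-way door and escalate; otherwise decide quickly."""
    door = DoorType.ONE_WAY if cost_to_reverse > 0.5 * decision_value else DoorType.TWO_WAY
    if door is DoorType.ONE_WAY:
        return "escalate to senior leadership; analyze more before committing"
    return "let a single owner or small team decide quickly; revisit if it fails"

print(route_decision(cost_to_reverse=9, decision_value=10))   # one-way: slow down
print(route_decision(cost_to_reverse=1, decision_value=10))   # two-way: move fast
```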
Lex Fridman
(01:01:55)
Yeah, part of the skill here is to know the difference between one-way and two-way. I think you mentioned …
Jeff Bezos
(01:01:55)
Yes.
Lex Fridman
(01:02:01)
I think you mentioned Amazon Prime, the decision to create Amazon Prime as a one-way door. It’s unclear if it is or not, but it probably is and it’s a really big risk to go there.
Jeff Bezos
(01:02:14)
There are a bunch of decisions like that, where changing the decision is going to be very, very complicated. Some of them are technical decisions too, because some technical decisions are like quick-drying cement. Once you make them, it gets really hard. Choosing which propellants to use in a vehicle, selecting LNG for the booster stage and selecting hydrogen for the upper stage, that has turned out to be a very good decision. But if you changed your mind, that would be a very big setback. Do you see what I’m saying?
Lex Fridman
(01:02:51)
Yeah, yeah.
Jeff Bezos
(01:02:52)
So that’s the kind of decision you scrutinize very, very carefully. Other things just aren’t like that. Most decisions are not that way. Most decisions should be made by single individuals and done quickly in the full understanding that you can always change your mind.
Lex Fridman
(01:03:11)
One of the things I really liked, perhaps it’s not a two-way door decision, is the “I disagree and commit” phrase. So somebody brings up an idea to you, and if it’s a two-way door, you state that you don’t understand enough to agree, but you still back them. I’d love for you to explain that-
Jeff Bezos
(01:03:35)
Well, yes, disagree and commit is a really important principle that saves a lot of arguing. So-
Lex Fridman
(01:03:39)
Yeah, I’m going to use that in my personal life, “I disagree, but commit.”
Jeff Bezos
(01:03:44)
It’s very common in any endeavor in life, in business, anywhere you have teammates: you have a teammate and the two of you disagree. At some point, you have to make a decision. And in companies, we tend to organize hierarchically. Whoever’s the more senior person ultimately gets to make the decision. So ultimately, the CEO gets to make that decision. And the CEO may not always make the decision that they agree with. So I would be the one who would disagree and commit. One of my direct reports would very much want to do something in a particular way. I would think it was a bad idea. I would explain my point of view. They would say, “Jeff, I think you’re wrong and here’s why,” and we would go back and forth.

(01:04:35)
And I would often say, “You know what? I don’t think you’re right, but I’m going to gamble with you, and you’re closer to the ground truth than I am. I’ve known you for 20 years. You have great judgment. I don’t know that I’m right either. Not really, not for sure. All these decisions are complicated. Let’s do it your way.” But at least then you’ve made a decision, and I’m agreeing to commit to that decision. So I’m not going to be second-guessing it. I’m not going to be sniping at it. I’m not going to be saying, “I told you so.” I’m going to try actively to help make sure it works. That’s a really important teammate behavior.

(01:05:18)
There’s so many ways that dispute resolution is a really interesting thing on teams. And there are so many ways when two people disagree about something, even … I’m assuming the case for everybody is well-intentioned. They just have a very different opinion about what the right decision is. And in our society and inside companies, we have a bunch of mechanisms that we use to resolve these kinds of disputes. A lot of them are, I think, really bad. So an example of a really bad way of coming to agreement is compromise. So compromise, we’re in a room here and I could say, “Lex, how tall do you think this ceiling is?”
Jeff Bezos
(01:06:00)
And you’d be like, “I don’t know, Jeff, maybe 12 feet tall.” And I would say, “I think it’s 11 feet tall.” And then we’d say, “You know what? Let’s just call it 11 and a half feet.” That’s compromise, instead of the right thing to do, which is to get a tape measure or figure out some way of actually measuring. But getting that tape measure and figuring out how to get it to the top of the ceiling and all these things, that requires energy. The advantage of compromise as a resolution mechanism is that it’s low energy, but it doesn’t lead to truth. And so in things like the height of the ceiling, where truth is a knowable thing, you shouldn’t allow compromise to be used when you can know the truth.

(01:06:51)
Another really bad resolution mechanism that happens all the time is just: who’s more stubborn? Let’s say two executives disagree and they just have a war of attrition, and whichever one gets exhausted first capitulates to the other one. Again, you haven’t arrived at truth, and this is very demoralizing. So this is where escalation comes in. I try to ask people on my team, “Never get to a point where you are resolving something by who gets exhausted first. Escalate that. I’ll help you make the decision, because that’s so de-energizing and such a terrible, lousy way to make a decision.”
Lex Fridman
(01:07:40)
Do you want to get to the resolution as quickly as possible because that ultimately leads to high velocity of decision?
Jeff Bezos
(01:07:45)
Yes, and you want to try to get as close to truth as possible. Exhausting the other person is not truth seeking.
Lex Fridman
(01:07:53)
Yes.
Jeff Bezos
(01:07:54)
And compromise is not truth seeking. And there are a lot of cases where no one knows the real truth and that’s where disagree and commit can come in, but escalation is better than war of attrition. Escalate to your boss and say, “Hey, we can’t agree on this. We like each other. We’re respectful of each other, but we strongly disagree with each other. We need you to make a decision here so we can move forward.” But decisiveness, moving forward quickly on decisions, as quickly as you responsibly can is how you increase velocity. Most of what slows things down is taking too long to make decisions at all scale levels. So it has to be part of the culture to get high velocity. Amazon has a million and a half people and the company is still fast. We’re still decisive, we’re still quick, and that’s because the culture supports that.
Lex Fridman
(01:08:53)
At every scale in a distributed way-
Jeff Bezos
(01:08:53)
Yes.
Lex Fridman
(01:08:56)
Try to maximize the velocity of decisions.
Jeff Bezos
(01:08:58)
Exactly.

Lunar program

Lex Fridman
(01:08:59)
You’ve mentioned the lunar program. Let me ask you about that. There’s a lot going on there and you haven’t really talked about it much. So in addition to the Artemis program with NASA, Blue is doing its own lander program. Can you describe it? There’s a sexy picture on Instagram with one of them. Is it the MK1, I guess?
Jeff Bezos
(01:09:20)
Yeah, The Mark 1. The picture here is me with Bill Nelson, the NASA Administrator.
Lex Fridman
(01:09:26)
Just to clarify, the lander is the sexy thing about the [inaudible 01:09:29]. I really want to clarify that.
Jeff Bezos
(01:09:32)
I know it’s not me. I know it was either the lander or Bill.
Lex Fridman
(01:09:34)
Okay. I love Bill, but-
Jeff Bezos
(01:09:37)
Thank you for clarifying.
Lex Fridman
(01:09:37)
Okay.
Jeff Bezos
(01:09:40)
Yes, the Mark 1 lander is designed to take 3,000 kilograms of cargo to the surface of the moon. It’s an expendable lander: it lands on the moon, stays there, takes 3,000 kilograms to the surface. It can be launched on a single New Glenn flight, which is very important. So it’s a relatively simple architecture, just like the human landing system lander, which is called the Mark 2. Mark 1 is also fueled with liquid hydrogen, because for high energy missions like landing on the surface of the moon, the high specific impulse of hydrogen is a very big advantage.

(01:10:24)
The disadvantage of hydrogen has always been that since it’s such a deep cryogen, it’s not storable. So it’s constantly boiling off and you’re losing propellant because it’s boiling off. And so what we’re doing as part of our lunar program is developing solar-powered cryo coolers that can actually make hydrogen a storable propellant for deep space. And that’s a real game-changer. It’s a game-changer for any high energy mission. So to the moon, but to the outer planets, to Mars, everywhere.
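To make the specific impulse point concrete, here is a minimal sketch of the ideal rocket equation with rough textbook Isp values; the mass ratio and the propellant figures are illustrative assumptions, not Blue Origin’s actual vehicle numbers:

```python
# Delta-v from the Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf).
# Isp values are rough vacuum figures for illustration only.
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s: float, mass_ratio: float) -> float:
    """Ideal delta-v in m/s for a given specific impulse and m0/mf ratio."""
    return isp_s * G0 * math.log(mass_ratio)

mass_ratio = 4.0  # assumed wet-to-dry mass ratio, same for both cases
for name, isp in [("LOX/hydrogen (upper stage, ~450 s)", 450.0),
                  ("LOX/methane (booster-class, ~360 s)", 360.0)]:
    print(f"{name}: {delta_v(isp, mass_ratio):,.0f} m/s")
# Same mass ratio, about 25% more delta-v from hydrogen: that is the
# "high specific impulse" advantage for high energy missions.
```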
Lex Fridman
(01:11:00)
So the idea with both Mark 1 and Mark 2 is the New Glenn can carry it from the surface of earth to the surface of the moon?
Jeff Bezos
(01:11:12)
Exactly. So the Mark 1 is expendable. The lunar lander we’re developing for NASA, the Mark 2 lander, that’s part of the Artemis program. They call it the Sustaining Lander Program. So that lander is designed to be reusable. It can land on the surface of the moon in a single-stage configuration and then take off. So if you look at the Apollo program, the lunar lander in Apollo was really two stages. It would land on the surface, and then it would leave the descent stage on the surface of the moon, and only the ascent stage would go back up into lunar orbit, where it would rendezvous with the command module.

(01:11:56)
Here, what we’re doing is we have a single-stage lunar lander that carries down enough propellant so that it can bring the whole thing back up, so that it can be reused over and over. And the point of doing that, of course, is to reduce cost, so that you can make lunar missions more affordable over time. That’s one of NASA’s big objectives, because this time… The whole point of Artemis is to go back to the moon, but this time to stay. Back in the Apollo program, we went to the moon six times and then ended the program, and it really was too expensive to continue.
Lex Fridman
(01:12:35)
And so there’s a few questions there, but one is how do you stay on the moon? What ideas do you have about sustaining life where a few folks can stay there for prolonged periods of time?
Jeff Bezos
(01:12:51)
Well, one of the things we’re working on is using lunar resources like lunar regolith to manufacture commodities and even solar cells on the surface of the moon. We’ve already built a solar cell that is completely made from lunar regolith simulant, and this solar cell is only about 7% power efficient. So it’s very inefficient compared to the more advanced solar cells that we make here on earth. But if you can figure out how to make a practical solar cell factory that you can land on the surface of the moon, and the raw material for those solar cells is simply lunar regolith, then you can just continue to churn out solar cells on the surface of the moon and have lots of power on the surface of the moon. That will make it easier for people to live on the moon.
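For a rough sense of scale on that 7% figure, here is a back-of-the-envelope calculation; the irradiance is the standard solar constant and the panel area is an arbitrary assumption:

```python
# Back-of-the-envelope power from a 7%-efficient regolith-derived solar cell.
SOLAR_CONSTANT = 1361.0   # W/m^2 at 1 AU; no atmosphere on the moon
EFFICIENCY = 0.07         # stated efficiency of the regolith-simulant cell
AREA_M2 = 100.0           # assumed panel area for illustration

power_w = SOLAR_CONSTANT * EFFICIENCY * AREA_M2
print(f"~{power_w / 1000:.1f} kW from {AREA_M2:.0f} m^2 in direct sunlight")
# ~9.5 kW: inefficient per square meter, but the raw material is local regolith.
```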

(01:13:51)
Similarly, we’re working on extracting oxygen from lunar regolith. So lunar regolith by weight has a lot of oxygen in it. It’s bound very tightly as oxides with other elements. And so you have to separate the oxygen, which is very energy intensive. So that also could work together with the solar cells. And then ultimately, we may be able to find practical quantities of ice in the permanently shadowed craters on the poles of the moon. And we know there is ice water or water ice in those craters, and we know that we can break that down with electrolysis into hydrogen and oxygen. And then you’d not only have oxygen, but you’d also have a very good high efficiency propellant fuel in hydrogen.
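For reference, the water-splitting step he describes is the standard electrolysis reaction; the enthalpy shown is the textbook value, quoted here only to indicate why the process is energy intensive:

```latex
% Water electrolysis: energy-intensive split into propellant-grade H2 and O2.
2\,\mathrm{H_2O(l)} \;\longrightarrow\; 2\,\mathrm{H_2(g)} + \mathrm{O_2(g)},
\qquad \Delta H^{\circ} \approx +286\ \mathrm{kJ\ per\ mol\ H_2O}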

(01:14:57)
So there’s a lot we can do to make the moon more sustainable over time, but the very first step, the gate that all of that has to go through is we need to be able to land cargo and humans on the surface of the moon at an acceptable cost.
Lex Fridman
(01:15:16)
To fast-forward a little bit, is there any chance Jeff Bezos steps foot on the moon and on Mars, one or the other or both?
Jeff Bezos
(01:15:27)
It’s very unlikely. I think it’s probably something that gets done by future generations by the time it gets to me. I think in my lifetime that’s probably going to be done by professional astronauts, sadly. I would love to sign up for that mission. So don’t count me out yet, Lex. Give me a fighting shot here, maybe. But I think if we are placing reasonable bets on such a thing, in my lifetime, that will continue to be done by professional astronauts.
Lex Fridman
(01:15:59)
So these are risky, difficult missions?
Jeff Bezos
(01:16:02)
And probably missions that require a lot of training. You are going there for a very specific purpose to do something. We’re going to be able to do a lot on the moon too with automation. So in terms of setting up these factories and doing all that, we are sophisticated enough now with automation that we probably don’t need humans to tend those factories and machines. So there’s a lot that’s going to be done in both modes.
Lex Fridman
(01:16:28)
So I have to ask the bigger picture question about the two companies pushing humanity forward out towards the stars, Blue Origin and SpaceX. Are you competitors, collaborators? Which and to what degree?
Jeff Bezos
(01:16:44)
Well, I would say, just like the internet is big and there are lots of winners at all scale levels: there are half a dozen giant companies that the internet has made, but there are a bunch of medium-sized companies and a bunch of small companies, all successful, all with profit streams, all driving great customer experiences. That’s what we want to see in space, that kind of dynamism. And space is big. There’s room for a bunch of winners, and it’s going to happen at all scale levels. And so SpaceX is going to be successful for sure. I want Blue Origin to be successful, and I hope there are another five companies right behind us.
Lex Fridman
(01:17:25)
But I spoke to Elon a few times recently about you, about Blue Origin, and he was very positive about you as a person and very supportive of all the efforts you’ve been leading at Blue. What are your thoughts? You’ve worked with a lot of leaders at Amazon, at Blue. What are your thoughts about Elon as a human being and a leader?
Jeff Bezos
(01:17:46)
Well, I don’t really know Elon very well. I know his public persona, but I also know you can’t know anyone by their public persona. It’s impossible. You may think you do, but I guarantee you don’t. So I don’t really know. You know Elon way better than I do, Lex, but in terms of judging by the results, he must be a very capable leader. There’s no way you could have Tesla and SpaceX without being a capable leader. It’s impossible.
Lex Fridman
(01:18:22)
Yeah, I hope you guys hang out sometimes, shake hands and sort of have a kind of friendship that would inspire just the entirety of humanity, because what you’re doing is one of the big grand challenges ahead for humanity.
Jeff Bezos
(01:18:40)
Well, I agree with you and I think in a lot of these endeavors we’re very like-minded. So I’m not saying we’re identical, but I think we’re very like-minded. And so I love that idea.

Amazon

Lex Fridman
(01:18:56)
All right, going back to sexy pictures on your Instagram, there’s a video of you from the early days of Amazon, giving a tour of your, “Offices.” I think your dad is holding the camera.
Jeff Bezos
(01:19:10)
He is. Yeah, I know, right? Yes. This is what? The giant orange extension cord.
Lex Fridman
(01:19:12)
And you’re explaining the genius of the extension cord and how this is a desk and the CRT monitor, and that’s where all the magic happened. I forget what your dad said, but this is the center of it all. So what was it like? What was going through your mind at that time? You left a good job in New York and took this leap. Were you excited? Were you scared?
Jeff Bezos
(01:19:37)
So excited and scared, anxious. Thought the odds of success were low. Told all of our early investors that I thought there was a 30% chance of success by which I just mean getting your money back, not what actually happened. Because that’s the truth. Every startup company is unlikely to work. It’s helpful to be in reality about that, but that doesn’t mean you can’t be optimistic. So you have to have this duality in your head. On the one hand, you know what the baseline statistics say about startup companies, and the other hand, you have to ignore all of that and just be 100% sure it’s going to work, and you’re doing both things at the same time. You’re holding that contradiction in your head.

(01:20:24)
But it was so exciting. From 1994 when the company was founded to 1995 when we opened our doors, all the way until today, I find Amazon so exciting. And that doesn’t mean… It’s full of pain, full of problems. It’s like there’s so many things that need to be resolved and worked and made better and et cetera. But on balance, it’s so fun. It’s such a privilege. It’s been such a joy. I feel so grateful that I’ve been part of that journey. It’s just been incredible.
Lex Fridman
(01:21:04)
So in some sense, you don’t want a single day of comfort. You’ve written about this many times. We’ll talk about your writing, which I would highly recommend people read, just the letters to shareholders, explaining the idea of day one thinking. I think you first wrote about it in the ’97 letter to shareholders. Then you also, in a way, wrote about it in, sad to say, your last letter to shareholders as CEO. And you said that, “Day two is stasis, followed by irrelevance, followed by excruciating painful decline, followed by death. And that is why it’s always day one.” Can you explain this day one thing? This is a really powerful way to describe the beginning and the journey of Amazon.
Jeff Bezos
(01:21:56)
It’s really a very simple, and I think age-old idea about renewal and rebirth and every day is day one. Every day you are deciding what you’re going to do and you are not trapped by what you were or who you were or any self-consistency. Self-consistency even can be a trap. And so day one thinking is we start fresh every day and we get to make new decisions every day about invention, about customers, about how we’re going to operate. Even as deeply as what our principles are, we can go back to that. It turns out we don’t change those very often, but we change them occasionally.

(01:22:49)
And when we work on programs at Amazon, we often make a list of tenets. And the tenets are… They’re not principles, they’re a little more tactical than principles, but they’re the main ideas that we want this program to embody, whatever those are. And one of the things that we do is we write, “These are the tenets for this program,” and in parentheses we always put, “Unless you know a better way.” And that idea, “Unless you know a better way,” is so important because you never want to get trapped by dogma. You never want to get trapped by history. It doesn’t mean you discard history or ignore it. There’s so much value in what has worked in the past, but you can’t be blindly following what you’ve done. And that’s the heart of day one: you’re always starting afresh.
Lex Fridman
(01:23:51)
And to the question of how to fend off day two, you said, “Such a question can’t have a simple answer,” as you’re saying. “There will be many elements, multiple paths, and many traps. I don’t know the whole answer, but I may know bits of it. Here’s a starter pack of essentials for day one defense: customer obsession, a skeptical view of proxies, the eager adoption of external trends, and high velocity decision-making.”

(01:24:19)
So we talked about high velocity decision-making, that’s more difficult than it sounds. So maybe you can pick one that stands out to you as you can comment on. Eager adoption of external trends, high velocity decision-making, skeptical view of proxies. How do you fight off day two?
Jeff Bezos
(01:24:36)
Well, I’ll talk about the one that is maybe in some ways the hardest to understand: the skeptical view of proxies. One of the things that happens in business, probably anything where you have an ongoing program and something that’s been underway for a number of years, is you develop certain things that you’re managing to. The typical case would be a metric, and that metric isn’t the real underlying thing. And so maybe the metric is an efficiency metric around customer contacts per unit sold or something like that. If you sell a million units, how many customer contacts do you get, or how many returns do you get? And so on and so on.

(01:25:30)
And so what happens is a little bit of inertia sets in. Somebody a long time ago invented that metric. They decided, “We need to watch customer returns per unit sold as an important metric.” The person who invented that metric and decided it was worth watching had a reason why they chose it. And then fast-forward five years, and that metric is the proxy.
Lex Fridman
(01:26:02)
The proxy for truth, I guess.
Jeff Bezos
(01:26:04)
The proxy for truth. Let’s say in this case it’s a proxy for customer happiness, but that metric is not actually customer happiness. It’s a proxy for customer happiness. The person who invented the metric understood that connection. Five years later, a kind of inertia can set in and you forget the truth behind why you were watching that metric in the first place. And the world shifts a little and now that proxy isn’t as valuable as it used to be or it’s missing something. And you have to be on alert for that. You have to know, “Okay, I don’t really care about this metric. I care about customer happiness and this metric is worth putting energy into and following and improving and scrutinizing, only in so much as it actually affects customer happiness.”

(01:27:03)
And so you’ve got to constantly be on guard and it’s very, very common. This is a nuanced problem. It’s very common, especially in large companies, that they’re managing to metrics that they don’t really understand. They don’t really know why they exist, and the world may have shifted out from under them a little and the metrics are no longer as relevant as they were when somebody 10 years earlier invented the metric.
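A hypothetical illustration of the proxy drift he is describing; the metric name and all numbers below are invented for the example, not Amazon data. The proxy can look healthy while the underlying thing it was meant to track gets worse:

```python
# Hypothetical illustration of a proxy metric drifting away from the truth
# it was invented to track. All names and numbers are made up.
quarters = [
    # (customer contacts, units sold, survey-based happiness score out of 10)
    ("Q1", 50_000, 1_000_000, 8.1),
    ("Q2", 40_000, 1_000_000, 7.6),
    ("Q3", 30_000, 1_000_000, 7.0),  # contacts fell because contacting got harder,
]                                    # not because customers got happier

for name, contacts, units, happiness in quarters:
    proxy = contacts / units  # "contacts per unit sold": the proxy metric
    print(f"{name}: proxy={proxy:.3f} (improving), happiness={happiness} (declining)")
```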
Lex Fridman
(01:27:29)
That is a nuance, but that’s a big problem. Right?
Jeff Bezos
(01:27:33)
It’s a huge problem.
Lex Fridman
(01:27:34)
There’s something so compelling to have a nice metric to try to optimize.
Jeff Bezos
(01:27:38)
Yes. And by the way, you do need metrics.
Lex Fridman
(01:27:41)
Yes, you do.
Jeff Bezos
(01:27:41)
You can’t ignore them. You want them, but you just have to be constantly on guard. A way to slip into day two thinking would be to manage your business to metrics that you don’t really understand, where you’re not really sure why they were invented in the first place, and you’re not sure they’re still as relevant as they used to be.
Lex Fridman
(01:28:03)
What does it take to be the guy or gal who brings up the point that this proxy might be outdated? I guess what does it take to have a culture that enables that in the meeting? Because that’s a very uncomfortable thing to bring up at a meeting. “We all showed up here, it’s a Friday.”
Jeff Bezos
(01:28:21)
You have just asked a million-dollar question. So if I generalize what you’re asking, you are talking in general about truth-telling and we humans are not really truth-seeking animals. We are social animals.
Lex Fridman
(01:28:42)
Yeah, we are.
Jeff Bezos
(01:28:44)
And take you back in time 10,000 years and you’re in a small village. If you go along to get along, you can survive. You can procreate. If you’re the village truth-teller, you might get clubbed to death in the middle of the night. Truths are often… They don’t want to be heard because important truths can be uncomfortable, they can be awkward, they can be exhausting.
Lex Fridman
(01:29:12)
Impolite and all that kind of stuff.
Jeff Bezos
(01:29:14)
Yes, challenging. They can make people defensive even if that’s not the intent. But any high performing organization, whether it’s a sports team, a business, a political organization, an activist group, I don’t care what it is, any high performing organization has to have mechanisms and a culture that supports truth-telling. One of the things you have to do is you have to talk about that. You have to talk about the fact that it takes energy to do that. You have to talk to people, you have to remind people, “It’s okay that it’s uncomfortable.” Literally tell people, “It’s not what we’re designed to do as humans.” It’s kind of a side effect. We can do that, but it’s not how we survive. We mostly survive by being social animals and being cordial and cooperative, and that’s really important.

(01:30:10)
And so science is all about truth-telling. It’s actually a very formal mechanism for trying to tell the truth. And even in science, you find that it’s hard to tell the truth. Even though you’re supposed to have a hypothesis and test it and find data and reject the hypothesis and so on, it’s not easy.
Lex Fridman
(01:30:36)
But even in science, there’s like the senior scientists and the junior scientists.
Jeff Bezos
(01:30:36)
Correct.
Lex Fridman
(01:30:41)
And then there’s a hierarchy of humans where somehow seniority matters in the scientific process, which it should not.
Jeff Bezos
(01:30:49)
Yes, and that’s true inside companies too. And so you want to set up your culture so that the most junior person can overrule the most senior person if they have data. And that really is about trying to… There are little things you can do. So for example, in every meeting that I attend, I always speak last. And I know from experience that if I speak first, even very strong-willed, highly intelligent, high judgment participants in that meeting will wonder, “Well, if Jeff thinks that, I came into this meeting thinking one thing, but maybe I’m not right.” And so you can do little things like, if you’re the most senior person in the room, go last. Let everybody else go first. In fact, ideally, have the most junior person go first, then the second most junior, and try to go in order of seniority, so that you can hear everyone’s opinion in an unfiltered way. Because we really do, we actually literally change our opinions. If somebody who you really respect says something, it makes you change your mind a little.
Lex Fridman
(01:32:17)
So you’re saying implicitly or explicitly, give permission for people to have a strong opinion, as long as it’s backed by data.
Jeff Bezos
(01:32:27)
Yes, and sometimes it can even… By the way, a lot of our most powerful truths turn out to be hunches, they turn out to be based on anecdotes, they’re intuition based. And sometimes you don’t even have strong data, but you may know the person well enough to trust their judgment. You may feel yourself leaning in. It may resonate with a set of anecdotes you have, and then you may be able to say, “Something about that feels right. Let’s go collect some data on that. Let’s try to see if we can actually know whether it’s right. But for now, let’s not disregard it. It feels right.”

(01:33:06)
You can also fight inherent bias. There’s an optimism bias. If there are two interpretations of a new set of data and one of them is happy and one of them is unhappy, it’s a little dangerous to jump to the conclusion that the happy interpretation is right. You may want to compensate for that human bias of trying to find the silver lining and say, “Look, that might be good, but I’m going to go with it’s bad for now until we’re sure.”
Lex Fridman
(01:33:36)
So speaking of happiness bias, data collection and anecdotes, you have to… How’s that for a transition? You have to tell me the story of the call you made, the customer service call you made to demonstrate a point about wait times?
Jeff Bezos
(01:33:57)
Yeah. This is very early in the history of Amazon.
Lex Fridman
(01:34:00)
Yes.
Jeff Bezos
(01:34:00)
And we were going over a weekly business review and a set of documents, and I have a saying, which is: when the data and the anecdotes disagree, the anecdotes are usually right. And it doesn’t mean you just slavishly go follow the anecdotes then. It means you go examine the data, because it’s usually not that the data is being miscollected; it’s usually that you’re not measuring the right thing. And so if you have a bunch of customers complaining about something, and at the same time your metrics look like they shouldn’t be complaining, you should doubt the metrics.

(01:34:43)
And an early example of this was we had metrics that showed that our customers were waiting, I think less than, I don’t know, 60 seconds when they called a 1-800 number to get phone customer service. The wait time was supposed to be less than 60 seconds, but we had a lot of complaints that it was longer than that. And anecdotally it seemed longer than that. I would call customer service myself. And so one day we’re in a meeting, we’re going through the WBR, the weekly business review, and we get to this metric in the deck, and the guy who leads customer service is defending the metric. And I said, “Okay, let’s call.” Picked up the phone, and I dialed the 1-800 number and called customer service, and we just waited in silence.
Lex Fridman
(01:35:39)
What did it turn out to be?
Jeff Bezos
(01:35:40)
Oh, it was really long, more than 10 minutes, I think.
Lex Fridman
(01:35:42)
Oh, wow.
Jeff Bezos
(01:35:43)
It was many minutes. And so it dramatically made the point that something was wrong with the data collection. We weren’t measuring the right thing, and that set off a whole chain of events where we started measuring it right. And that’s an example, by the way, of truth-telling is like that’s an uncomfortable thing to do, but you have to seek truth even when it’s uncomfortable, and you have to get people’s attention and they have to buy into it, and they have to get energized around really fixing things.
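A hypothetical sketch of how a wait-time metric can look fine while customers experience something much worse; the numbers and the filtering rule below are invented for illustration and are not how Amazon’s system actually worked:

```python
# Hypothetical illustration: the dashboard averages only calls that reached
# an agent quickly, silently dropping abandoned calls and long holds.
calls = [
    # (seconds waited, reached_agent)
    (30, True), (45, True), (50, True),
    (400, True),                 # long hold excluded by the dashboard's filter
    (700, False), (900, False),  # callers who gave up before reaching anyone
]

measured = [w for w, ok in calls if ok and w <= 60]  # what the metric saw
actual = [w for w, _ in calls]                       # what customers felt

print(f"dashboard average: {sum(measured) / len(measured):.0f} s")  # ~42 s
print(f"real average wait: {sum(actual) / len(actual):.0f} s")      # ~354 s
```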

Principles

Lex Fridman
(01:36:16)
So that speaks to the obsession with the customer experience. So one of the defining aspects of your approach to Amazon is just being obsessed with making customers happy. I think companies sometimes say that, but Amazon is really obsessed with that. I think there’s something really profound to that, which is seeing the world through the eyes of the customer, like the customer experience, the human being that’s using the product, that’s enjoying the product, the subtle little things that make up their experience. How do you optimize those?
Jeff Bezos
(01:36:55)
This is another really good and deep question, because there are big things that are really important to manage, and then there are small things. Internally at Amazon, we call them paper cuts. So we’re always working on the big things. Most of the energy goes into the big things, as it should, and you can identify the big things. And I would encourage anybody, if anybody listening to this is an entrepreneur, has a small business, whatever: think about the things that are not going to change over 10 years. Those are probably the big things.

(01:37:38)
So I know in our retail business at Amazon, 10 years from now, customers are still going to want low prices. I know they’re still going to want fast delivery, and I just know they’re still going to want big selection. So it’s impossible to imagine a scenario where 10 years from now where a customer says, “I love Amazon, I just wish the prices were a little higher,” or, “I love Amazon, I just wish you delivered a little more slowly.” So when you identify the big things you can tell they’re worth putting energy into because they’re stable in time.

(01:38:10)
Okay, but you’re asking about something a little different, which is in every customer experience, there are those big things. And by the way, it’s astonishingly hard to focus even on just the big things. So even though they’re obvious, they’re really hard to focus on. But in addition to that, there are all these little tiny customer experience deficiencies, and we call those paper cuts. We make long lists of them. And then we have dedicated teams that go fix paper cuts because the teams working on the big issues never get to the paper cuts. They never work their way down the list to get to… They’re working on big things, as they should and as you want them to. And so you need special teams who are charged with fixing…
Jeff Bezos
(01:39:00)
Special teams who are charged with fixing paper cuts.
Lex Fridman
(01:39:04)
Where would you put on the paper cut spectrum the Buy now with the 1-Click button? Which is, I think, pretty genius. So to me, okay, my interaction with things I love on the internet, there’s things I do a lot. I, maybe representing a regular human, I would love for those things to be frictionless. For example, booking airline tickets, just saying. But it’s buying a thing with one click, making that experience frictionless, intuitive, all aspects of that, that just fundamentally makes my life better, not just in terms of efficiency, in terms of some kind of-
Jeff Bezos
(01:39:49)
Cognitive load.
Lex Fridman
(01:39:50)
… Yeah, cognitive load and inner peace and happiness. Because, first of all, buying stuff is a pleasant experience. Having enough money to buy a thing and then buying it is a pleasant experience. And having pain around that is somehow just you’re ruining a beautiful experience. And I guess all I’m saying as a person who loves good ideas, is that a paper cut, a solution to a paper cut?
Jeff Bezos
(01:40:17)
Yes. So that particular thing is probably a solution to a number of paper cuts. So if you go back and look at our order pipeline and how people shopped on Amazon before we invented 1-Click shopping, there was more friction. There was a whole series of paper cuts and that invention eliminated a bunch of paper cuts. And I think you’re absolutely right by the way, that when you come up with something like 1-Click shopping, again, this is so ingrained in people now, I’m impressed that you even notice it. Most people-
Lex Fridman
(01:40:54)
Every time I click the button, I just-
Jeff Bezos
(01:40:54)
… most people never notice.
Lex Fridman
(01:40:55)
… just a surge of happiness.
Jeff Bezos
(01:41:00)
There is in the perfect invention for the perfect moment in the perfect context, there is real beauty. It is actual beauty and it feels good. It’s emotional. It’s emotional for the inventor, it’s emotional for the team that builds it. It’s emotional for the customer. It’s a big deal and you can feel those things.
Lex Fridman
(01:41:23)
But to keep coming up with that idea, with those kinds of ideas, I guess is the day one thinking effort.
Jeff Bezos
(01:41:29)
Yeah, and you need a big group of people who feel that kind of satisfaction with creating that kind of beauty.
Lex Fridman
(01:41:38)
There’s a lot of books written about you. There’s a book, Invent & Wander, where Walter Isaacson does an intro; it’s mostly collected writings of yours. I’ve read that. I also recommend people check out the Founders Podcast, which covers you a lot and does analysis of different business advice you’ve given over the years. I bring all that up because you’ve said that books are an antidote for short attention spans. And I forget how it was phrased, but that when you were thinking about the Kindle, you were thinking about how technology changes us.
Jeff Bezos
(01:42:20)
Changes us. We co-evolve with our tools. So we invent new tools and then our tools change us.
Lex Fridman
(01:42:30)
Which is fascinating to think about.
Jeff Bezos
(01:42:32)
It goes in a circle.
Lex Fridman
(01:42:33)
And there’s some aspect, even just inside business, where you don’t just make the customer happy, but you also have to think about where is this going to take humanity if you zoom out a bit?
Jeff Bezos
(01:42:45)
A hundred percent, and you can feel your brain. Brains are plastic, and you can feel your brain getting reprogrammed. I remember the first time this happened to me was when Tetris first came on the scene. Anybody who’s been a game player has this experience: you close your eyes to lay down to go to sleep, and you see all the little blocks moving and you’re kind of rotating them in your mind, and you can just tell as you walk around the world that you have rewired your brain to play Tetris. But that happens with everything. I think we still have yet to see the full repercussions of this, I fear, but I think one of the things that we’ve done online, largely because of social media, is we have trained our brains to be really good at processing super short form content.

(01:43:52)
Your podcast flies in the face of this. You do these long format things.
Lex Fridman
(01:43:59)
Books do too.
Jeff Bezos
(01:44:00)
And reading books is a long format thing and if something is convenient, we do more of it. We carry around in our pocket a phone, and one of the things that phone does for the most part is it is an attention shortening device because most of the things we do on our phone shorten our attention spans. And I’m not even going to say we know for sure that that’s bad, but I do think it’s happening. That’s one of the ways we’re co-evolving with that tool. But I think it’s important to spend some of your time and some of your life doing long attention span things.
Lex Fridman
(01:44:41)
Yeah, I think you’ve spoken about the value in your own life of focus, of singular focus on a thing for prolonged periods of time, and that’s certainly what books do and that’s certainly what that piece of technology does. But I bring all that up to ask you about another piece of technology, AI, that has the potential to have various trajectories to have an impact on human civilization. How do you think AI will change us?
Jeff Bezos
(01:45:14)
If you’re talking about generative AI, large language models, things like ChatGPT and its soon successors, these are incredibly powerful technologies. To believe otherwise is to bury your head in the sand. They’re soon to be even more powerful. It’s interesting to me that large language models in their current form are not inventions, they’re discoveries. The telescope was an invention, but looking through it at Jupiter, knowing that it had moons, was a discovery. My God, it has moons. And that’s what Galileo did. And so this is closer to the discovery side of that spectrum. We know exactly what happens with a 787; it’s an engineered object. We designed it. We know how it behaves. We don’t want any surprises. Large language models are much more like discoveries. We’re constantly getting surprised by their capabilities. They’re not really engineered objects.

(01:46:35)
Then you have this debate about whether they’re going to be good for humanity or bad for humanity. Even specialized AI could be very bad for humanity. Just regular machine learning models can make certain weapons of war that could be incredibly destructive and very powerful. And they’re not general AIs. They could just be very smart weapons. And so we have to think about all of those things. I’m very optimistic about this. So even in the face of all this uncertainty, my own view is that these powerful tools are much more likely to help us and save us even than they are to, on balance, hurt us and destroy us. I think we humans have a lot of ways we can make ourselves go extinct. These things may help us not do that, so they may actually save us. So for the people who are, in my view, overly concerned (it is a valid debate), I think they may be missing part of the equation, which is how helpful these tools could be in making sure we don’t destroy ourselves.

(01:48:07)
I don’t know if you saw the movie Oppenheimer, but to me, first of all, I loved the movie, and I thought the best part of the movie is this bureaucrat played by Robert Downey Jr, who some of the people I’ve talked to think is the most boring part of the movie. I thought it was the most fascinating, because what’s going on here is you realize we have invented these awesome, destructive, powerful technologies called nuclear weapons, and they have to be managed, and we humans are not really capable of wielding those weapons. And that’s what he represented in that movie: here’s this guy, and he wrongly thinks… he’s being so petty. He thinks that Oppenheimer said something bad to Einstein about him. They didn’t talk about him at all, as you find out in the final scene of the movie. And yet he’s spent his career trying to be vengeful and petty.

(01:49:19)
And that’s the problem. We as a species are not really sophisticated enough and mature enough to handle these technologies. And by the way, before you get to general AI and the possibility of AI having agency, there are a lot of things that would have to happen. But there’s so much benefit that’s going to come from these technologies in the meantime, even before there is general AI, in terms of better medicines and better tools to develop more technologies and so on. So I think it’s an incredible moment to be alive and to witness the transformations that are going to happen. How quickly it will happen, no one knows. But over the next 10 years and 20 years, I think we’re going to see really remarkable advances. And I personally am very excited about it.
Lex Fridman
(01:50:12)
First of all, really interesting to say that it’s discoveries, that it’s true that we don’t know the limits of what’s possible with the current language models.
Jeff Bezos
(01:50:24)
We don’t.
Lex Fridman
(01:50:24)
And it could be a few tricks and hacks here and there that open doors to whole new possibilities.
Jeff Bezos
(01:50:33)
We do know that humans are doing something different from these models, in part because we’re so power efficient. The human brain does remarkable things and it does it on about 20 watts of power. And the AI techniques we use today use many kilowatts of power to do equivalent tasks. So there’s something interesting about the way the human brain does this. And also we don’t need as much data. So self-driving cars, they have to drive billions and billions of miles to try to learn how to drive. And your average 16-year-old figures it out with many fewer miles. So there are still some tricks, I think, that we have yet to learn. I don’t think we’ve learned the last trick. I don’t think it’s just a question of scaling things up. But what’s interesting is that just scaling things up, and I put just in quotes because it’s actually hard to scale things up, but just scaling things up also appears to pay huge dividends.
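To put rough numbers on the efficiency gap he is describing, here is a back-of-the-envelope comparison; the kilowatt figure and the mileage figures are illustrative assumptions, not measurements:

```python
# Rough, assumed numbers to illustrate the efficiency gap described above.
BRAIN_W = 20                  # approximate human brain power draw, watts
AI_W = 10_000                 # assumed draw of hardware on a comparable task (10 kW)
print(f"power gap: ~{AI_W // BRAIN_W}x")                   # ~500x

HUMAN_MILES = 1_000           # assumed practice miles for a new driver
FLEET_MILES = 1_000_000_000   # "billions of miles" of fleet driving data
print(f"data gap: ~{FLEET_MILES // HUMAN_MILES:,}x")       # ~1,000,000x
```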
Lex Fridman
(01:51:40)
Yeah. And there’s some more nuanced aspect of human beings that’s interesting, if it can be accomplished: being truly original and novel, large language models being able to come up with some truly new ideas. That’s one. And the other one is truth. It seems that large language models are very good at sounding like they’re saying a true thing, but they don’t require, or often have, a grounding in a mathematical truth. It’s basically a very good bullshitter. So if there’s not enough data in the training data about a particular topic, it’s just going to concoct accurate-sounding narratives, which is a very fascinating problem to try to solve: how do you get language models to infer what is true or not, to introspect?
Jeff Bezos
(01:52:41)
Yeah, they need to be taught to say, “I don’t know,” more often and I know several humans who could be taught that as well.
Lex Fridman
(01:52:50)
Sure. And then the other stuff, because you’re still a bit involved in the Amazon side with the AI things, the other open question is what kind of products are created from this?
Jeff Bezos
(01:53:01)
Oh, so many. We have Alexa and Echo, and Alexa has an installed base of hundreds of millions of devices. And so there’s Alexa everywhere. And guess what? Alexa is about to get a lot smarter. And so from a product point of view, that’s super exciting.
Lex Fridman
(01:53:27)
There’s so many opportunities there,
Jeff Bezos
(01:53:30)
So many opportunities. Shopping assistant, all that stuff is amazing. And in AWS, we’re building Titan, which is our foundational model. We’re also building Bedrock, which is for our corporate clients at AWS. Our enterprise clients want to be able to use these powerful models with their own corporate data without accidentally contributing their corporate data to that model. And so those are the tools we’re building for them with Bedrock. So there’s tremendous opportunity here.
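As a sketch of what that looks like from a developer’s side, here is a minimal example of invoking a Titan model through Bedrock with boto3; the model ID, request fields, and response shape are assumptions based on the general pattern and may differ from the current API:

```python
# Minimal sketch of invoking a Bedrock-hosted Titan text model via boto3.
# Model ID and request/response fields are assumptions for illustration.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

request = {
    "inputText": "Summarize why high specific impulse matters for lunar landers.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.2},
}
response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed model ID
    body=json.dumps(request),
    contentType="application/json",
    accept="application/json",
)
payload = json.loads(response["body"].read())
print(payload["results"][0]["outputText"])  # assumed response shape
```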
Lex Fridman
(01:54:03)
Yeah, the security, the privacy, all those things are fascinating. Because so much value can be gained by training on private data, but you want to keep this secure. It’s a fascinating technical problem.
Jeff Bezos
(01:54:13)
Yes. This is a very challenging technical problem and it’s one that we’re making progress on and dedicated to solving for our customers.
Lex Fridman
(01:54:21)
Do you think there will be a day when humans and robots, maybe Alexa, have a romantic relationship like in the movie Her?
Jeff Bezos
(01:54:29)
Well, I think if you look at the-
Lex Fridman
(01:54:31)
Just brainstorming products here.
Jeff Bezos
(01:54:32)
… if you look at the spectrum of human variety and what people like, sexual variety, there are people who like everything. So the answer to your question has to be yes.
Lex Fridman
(01:54:43)
Okay. I guess I’m asking when-
Jeff Bezos
(01:54:45)
I don’t know how widespread that will be.
Lex Fridman
(01:54:45)
… All right.
Jeff Bezos
(01:54:48)
But it will happen.

Productivity

Lex Fridman
(01:54:49)
I was just asking when for a friend, but it’s all right. Moving on. Next question. What’s a perfectly productive day in the life of Jeff Bezos? You’re one of the most productive humans in the world.
Jeff Bezos
(01:55:03)
Well, first of all, I get up in the morning and I putter. I have a coffee.
Lex Fridman
(01:55:09)
Can you define putter?
Jeff Bezos
(01:55:11)
I slowly move around. I’m not as productive as you might think I am, because I do believe in wandering. I read my phone for a while, I read newspapers for a while, I chat with Lauren, and I drink my first coffee. So I move pretty slowly in the first couple of hours. I get up early just naturally, and then I exercise most days. Most days it’s not that hard for me. Some days it’s really hard and I do it anyway. I don’t want to, and it’s painful, and I’m like, “Why am I here?” And I don’t want to do any of this.
Lex Fridman
(01:55:52)
“Why am I here at the gym?”
Jeff Bezos
(01:55:53)
“Why am I here at the gym? Why don’t I do something else?” It’s not always easy.
Lex Fridman
(01:55:59)
What’s your source of motivation in those moments?
Jeff Bezos
(01:56:02)
I know that I’ll feel better later if I do it. And so that’s the real source of motivation: I can tell, on the days when I skip it, I’m not quite as alert. I don’t feel as good. And then there are harder motivations. It’s longer term: you want to be healthy as you age. You want health span. Ideally, you want to be healthy and moving around when you’re 80 years old. But that kind of motivation is so far in the future that it can be very hard to act on in the moment. So I think about the fact that I’ll feel better in about four hours if I do it now, that I’ll have more energy for the rest of my day, and so on and so on.
Lex Fridman
(01:56:42)
What’s your exercise routine, just to linger on that? How much do you curl? What are we talking about here? That’s all I do at the gym, so I just…
Jeff Bezos
(01:56:52)
My routine on a good day, I do about half an hour of cardio and I do about forty-five minutes of weightlifting, resistance training of some kind, mostly weights. I have a trainer who I love who pushes me, which is really helpful. He’ll say, “Jeff, can we go up on that weight a little bit?”

(01:57:18)
And I’ll think about it and I’ll be like, “No, I don’t think so.”

(01:57:23)
And he’ll look at me and say, “Yeah, I think you can.” And of course he’s right.
Lex Fridman
(01:57:31)
Yeah, of course. Of course.
Jeff Bezos
(01:57:32)
So it’s helpful to have somebody push you a little bit.
Lex Fridman
(01:57:34)
But almost every day, you do that?
Jeff Bezos
(01:57:37)
Almost every day, I do a little bit of cardio and a little bit of weightlifting and I’d rotate. I do a pulling day and a pushing day and a leg day. It’s all pretty standard stuff.
Lex Fridman
(01:57:48)
So puttering, coffee, gym-
Jeff Bezos
(01:57:49)
Puttering, coffee, gym, and then work.
Lex Fridman
(01:57:53)
… work. But what’s work look like? What do the productive hours look like for you?
Jeff Bezos
(01:57:59)
So a couple years ago, I left as the CEO of Amazon, and I have never worked harder in my life. I am working so hard and I’m mostly enjoying it, but there are also some very painful days. Most of my time is spent on Blue Origin and I’m so deeply involved here now for the last couple of years. And in the big, I love it, and the small, there’s all the frustrations that come along with everything. We’re trying to get to rate manufacturing as we talked about. That’s super important. We’ll get there. We just hired a new CEO, a guy I’ve known for close to 15 years now, a guy named Dave Limp who I love. He’s amazing. So we’re super lucky to have Dave, and you’re going to see us move faster there.

(01:58:46)
So my day of work, reading documents, having meetings, sometimes in person, sometimes over Zoom, depends on where I am. It’s all about the technology, it’s about the organization. I have architecture and technology meetings almost every day on various subsystems inside the vehicle, inside the engines. It’s super fun for me. My favorite part of it is the technology. My least favorite part of it is building organizations and so on. That’s important, but it’s also my least favorite part. So that’s why they call it work. You don’t always get to do what you want to do.
Lex Fridman
(01:59:31)
How do you achieve time where you can focus and truly think through problems?
Jeff Bezos
(01:59:36)
I do little thinking retreats. So this is not the only way, I can do that all day long. I’m very good at focusing. I don’t keep to a strict schedule. My meetings often go longer than I planned for them to because I believe in wandering. My perfect meeting starts with a crisp document. So the document should be written with such clarity that it’s like angels singing from on high. I like a crisp document and a messy meeting. And so the meeting is about asking questions that nobody knows the answer to and trying to wander your way to a solution. And when that happens just right, it makes all the other meetings worthwhile. It feels good. It has a kind of beauty to it. It has an aesthetic beauty to it, and you get real breakthroughs in meetings like that.
Lex Fridman
(02:00:37)
Can you actually describe the crisp document? This is one of the legendary aspects of Amazon, of the way you approach meetings is this, the six-page memo. Maybe first describe the process of running a meeting with memos.
Jeff Bezos
(02:00:51)
Meetings at Amazon and Blue Origin are unusual. When new people come in, like a new executive joins, they’re a little taken aback sometimes because the typical meeting, we’ll start with a six-page narratively structured memo and we do study hall. For 30 minutes, we sit there silently together in the meeting and read.
Lex Fridman
(02:00:51)
I love this.
Jeff Bezos
(02:01:17)
Take notes in the margins. And then we discuss. And the reason, by the way, we do study hall: you could say, “I would like everybody to read these memos in advance,” but the problem is people don’t have time to do that. And they end up coming to the meeting having only skimmed the memo, or maybe not read it at all, and they’re trying to catch up. And they’re also bluffing, like they were in college, having pretended to do the reading.
Lex Fridman
(02:01:42)
Yeah. Exactly.
Jeff Bezos
(02:01:43)
It’s better just to carve out the time for people.
Lex Fridman
(02:01:47)
Yeah. And do it together.
Jeff Bezos
(02:01:47)
So now we’re all on the same page. We’ve all read the memo, and now we can have a really elevated discussion. And this is so much better than having a slideshow presentation, a PowerPoint presentation of some kind, which has so many difficulties. One of the problems is PowerPoint is really designed to persuade. It’s kind of a sales tool. And internally, the last thing you want to do is sell. Again, you’re truth seeking. You’re trying to find truth. And the other problem with PowerPoint is it’s easy for the author and hard for the audience. And a memo is the opposite. It’s hard to write a six-page memo. A good six-page memo might take two weeks to write. You have to write it, you have to rewrite it, you have to edit it, you have to talk to people about it. They have to poke holes in it for you. You write it again. It might take two weeks. So for the author, it’s really a very difficult job, but for the audience it’s much better.

(02:02:45)
So you can read it in half an hour. And there are other little problems with PowerPoint presentations too: senior executives interrupt with questions halfway through the presentation. That question’s going to be answered on the next slide, but you never got there. If you read the whole memo in advance… I often write lots of questions in the margins of these memos, and then I cross them all out, because by the time I get to the end of the memo, they’ve been answered. That’s how I save all that time.

(02:03:11)
You also get… If the person who’s preparing the memo… We talked earlier about groupthink, and the fact that I go last in meetings, and that you don’t want your ideas to pollute the meeting prematurely. The author of the memo has got to be very vulnerable. They’ve got to put all their thoughts out there, and they’ve got to go first. But that’s great, because it makes them really good. And you get to see their real ideas, and you’re not trampling on them accidentally in a big PowerPoint presentation meeting.
Lex Fridman
(02:03:50)
What’s that feel like when you’ve authored a thing and then you’re sitting there and everybody’s reading your thing?
Jeff Bezos
(02:03:54)
I think it’s mostly terrifying.
Lex Fridman
(02:03:57)
Yeah. But maybe in a good way? Like a purifying?
Jeff Bezos
(02:04:02)
I think it’s terrifying in a productive way, but I think it’s emotionally, a very nerve-racking experience.
Lex Fridman
(02:04:13)
Is there an art, a science, to the writing of this six-page memo, or just to writing in general for you?
Jeff Bezos
(02:04:20)
It’s really got to be a real memo. So it means paragraphs have topic sentences. It’s verbs and nouns. That’s the other problem with PowerPoint presentations: they’re often just bullet points, and you can hide a lot of sloppy thinking behind bullet points. When you have to write in complete sentences with narrative structure, it’s really hard to hide sloppy thinking. So it forces the author to be at their best, and you get somebody’s really best thinking. And then you don’t have to spend a lot of time trying to tease that thinking out of the person; you’ve got it from the very beginning. So it really saves you time in the long run.
Lex Fridman
(02:05:03)
So that part is crisp, and then the rest is messy. Crisp document, messy meeting.
Jeff Bezos
(02:05:07)
Yeah, so you don’t want to pretend that the discussion should be crisp. Most meetings, you’re trying to solve a really hard problem. There’s a different kind of meeting, which we call weekly business reviews or business reviews that may be weekly or monthly or daily, whatever they are. But these business review meetings, that’s usually for incremental improvement. And you’re looking at a series of metrics, every time it’s the same metrics. Those meetings can be very efficient. They can start on time and end on time.

Future of humanity

Lex Fridman
(02:05:35)
So we’re about to run out of time, which is a good time to ask about the 10,000-Year Clock.
Jeff Bezos
(02:05:43)
It’s funny.
Lex Fridman
(02:05:44)
Yes, that’s what I’m known for, is the humor. Okay. Can you explain what the 10,000-Year Clock is?
Jeff Bezos
(02:05:53)
The 10,000-Year Clock is a physical clock of monumental scale. It’s about 500 feet tall. It’s inside a mountain in west Texas, in a chamber that’s about 12 feet in diameter and 500 feet tall. The 10,000-Year Clock is an idea conceived by a brilliant guy named Danny Hillis way back in the ’80s. The idea is to build a clock as a symbol for long-term thinking. And you can kind of conceptually think of the 10,000-Year Clock like this: it ticks once a year, it chimes once every hundred years, and the cuckoo comes out once every thousand years. So it just sort of slows everything down. And it’s a completely mechanical clock. It is designed to last 10,000 years with no human intervention, so the material choices and everything else follow from that. It’s in a remote location, both to protect it, but also so that visitors have to make a pilgrimage.

(02:06:57)
The idea is that over time, and this will take hundreds of years, but over time, it will take on the patina of age, and then it will become a symbol for long-term thinking that will actually hopefully get humans to extend their thinking horizons. And in my view, that’s really important as we have become, as a species, as a civilization, more powerful. We’re really affecting the planet now. We’re really affecting each other. We have weapons of mass destruction. We have all kinds of things where we can really hurt ourselves and the problems we create can be so large. The unintended consequences of some of our actions like climate change, putting carbon in the atmosphere is a perfect example. That’s an unintended consequence of the Industrial Revolution, got a lot of benefits from it, but we’ve also got this side effect that is very detrimental.

(02:07:56)
We need to start training ourselves to think longer term. Long-term thinking is a giant lever. You can literally solve problems if you think long-term, that are impossible to solve if you think short-term. And we aren’t really good at thinking long-term. Five years is a tough timeframe for most institutions to think past. And we probably need to stretch that to 10 years and 15 years and 20 years and 25 years, and we’d do a better job for our children or our grandchildren if we could stretch those thinking horizons. And so the clock, in a way, it’s an art project, it’s a symbol. And if it ever has any power to influence people to think longer term, that won’t happen for hundreds of years, but we are going to build it now and let it accrue the patina of age.
Lex Fridman
(02:08:52)
Do you think humans will be here when the clock runs out here on earth?
Jeff Bezos
(02:08:56)
I think so. But the United States won’t exist. Whole civilizations rise and fall. 10,000 years is so long. No nation state has ever survived for anywhere close to 10,000 years.
Lex Fridman
(02:09:12)
And the increasing rate of progress makes that even more fantastical.
Jeff Bezos
(02:09:15)
Even less likely so. Do I think humans will be here? Yes. How will we have changed ourselves and what will we be and so on and so on? I don’t know, but I think we’ll be here.
Lex Fridman
(02:09:25)
On that grand scale, a human life feels tiny. Do you ponder your own mortality? Are you afraid of death?
Jeff Bezos
(02:09:32)
No. I used to be afraid of death. I did. I remember as a young person being very scared of mortality, didn’t want to think about it, and so on. And as I’ve gotten older, I’m 59 now, as I’ve gotten older, somehow that fear has sort of gone away. I would like to stay alive for as long as possible, but I’m really more focused on health span. I want to be healthy. I want that square wave. I want to be healthy, healthy, healthy, and then gone. I don’t want the long decay. And I’m curious. I want to see how things turn out. I’d like to be here. I love my family and my close friends, and I’m curious about them, and I want to see. So I have a lot of reasons to stay around, but mortality doesn’t have that effect on me that it did maybe when I was in my twenties.
Lex Fridman
(02:10:38)
Well, Jeff, thank you for creating Amazon, one of the most incredible companies in history, and thank you for trying your best to make humans a multi-planetary species, expanding out into our solar system, maybe beyond, to meet the aliens out there. And thank you for talking today.
Jeff Bezos
(02:10:55)
Lex, thank you for doing your part to lengthen our attention spans. Appreciate that very much.
Lex Fridman
(02:11:04)
I’m doing my best. Thanks for listening to this conversation with Jeff Bezos. To support this podcast, please check out our sponsors in the description. And now let me leave you with some words from Jeff Bezos himself. Be stubborn on vision, but flexible on the details. Thank you for listening and hope to see you next time.

Transcript for Lee Cronin: Controversial Nature Paper on Evolution of Life and Universe | Lex Fridman Podcast #404

This is a transcript of Lex Fridman Podcast #404 with Lee Cronin.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lee Cronin
(00:00:00)
Every star in the sky probably has planets, and life is probably emerging on these planets. But I think the combinatorial space associated with these planets is so different. Our causal cones are never going to overlap, or not easily. And this is the thing that makes me sad about alien life, why we have to create alien life in the lab as quickly as possible, because I don’t know if we are going to be able to build architectures that will intersect with alien intelligence architectures.
Lex Fridman
(00:00:35)
Intersect, you don’t mean in time or space-
Lee Cronin
(00:00:38)
Time and the ability to communicate.
Lex Fridman
(00:00:40)
The ability to communicate.
Lee Cronin
(00:00:41)
Yeah. My biggest fear in a way is that life is everywhere, but we’ve become infinitely more lonely because of our scaffolding in that combinatorial space.
Lex Fridman
(00:00:52)
The following is a conversation with Lee Cronin, his third time in this podcast. He’s a chemist from University of Glasgow who is one of the most fascinating, brilliant and fun to talk to scientists I’ve ever had the pleasure of getting to know. This is the Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Lee Cronin.

Assembly theory paper


(00:01:16)
So your big assembly theory paper was published in Nature. Congratulations.
Lee Cronin
(00:01:21)
Thanks.
Lex Fridman
(00:01:21)
It created, I think it’s fair to say, a lot of controversy, but also a lot of interesting discussion. So maybe I can try to summarize assembly theory and you tell me if I’m wrong.
Lee Cronin
(00:01:32)
Go for it.
Lex Fridman
(00:01:33)
So assembly theory says that if we look at any object in the universe, any object, that we can quantify how complex it is by trying to find the number of steps it took to create it. And also we can determine if it was built by a process akin to evolution by looking at how many copies of the object there are.
Lee Cronin
(00:01:55)
Yep. That’s spot on. Yep.
Lex Fridman
(00:01:56)
Spot on.
Lee Cronin
(00:01:57)
Spot on.
Lex Fridman
(00:01:58)
I was not expecting that. Okay, so let’s go through definitions. So there’s a central equation I’d love to talk about, but definition wise, what is an object?
Lee Cronin
(00:02:11)
Yeah, an object. So if I’m going to try to be as meticulous as possible, objects need to be finite and they need to be decomposable into sub-units. All human-made artifacts are objects. Is a planet an object? Probably yes, if you scale out. So an object is finite and countable and decomposable, I suppose, mathematically. But yeah, I still wake up some days and think to myself, what is an object? Because it’s a non-trivial question.
Lex Fridman
(00:02:50)
Persists over time, I’m quoting from the paper here. An object is finite, is distinguishable. I’m sure that’s a weird adjective, distinguishable.
Lee Cronin
(00:03:03)
We’ve had so many people offering to help rewrite the paper after it came out. You wouldn’t believe it, it’s so funny.
Lex Fridman
(00:03:10)
Persists over time. And is breakable, such that the set of constraints to construct it from elementary building blocks is quantifiable.
Lee Cronin
(00:03:25)
The history is in the objects. It’s kind of cool, right?
Lex Fridman
(00:03:29)
Okay. So what defines the object is its history or memory, whichever is the sexier word.
Lee Cronin
(00:03:36)
I’m happy with both depending on the day.
Lex Fridman
(00:03:38)
Okay, so the set of steps it took to create the object. So there’s a sense in which every object in the universe has a history. And that is part of the thing that is used to describe its complexity. How complicated it is. Okay, what is an assembly index?
Lee Cronin
(00:04:00)
So the assembly index: if you were to take the object apart and be super lazy about it, or minimal, say, ’cause it’s like you’ve got a really short-term memory. So what you do is you lay all the parts on the path and you find the minimum number of steps you take on the path to add the parts together to reproduce the object. And that minimum number is the assembly index. It’s a minimum bound. And it was always my intuition that the minimum bound in assembly theory was really important, and I only worked out why a few weeks ago, which is kind of funny ’cause I was just like, “No, this is sacrosanct. I don’t know why, it’ll come to me one day.”

(00:04:37)
And then when I was pushed by a bunch of mathematicians, we came up with the correct physical explanation, which I can get to, but it’s the minimum and it’s really important. It’s the minimum. And the reason I knew the minimum was right is because we could measure it. So almost before this paper came out, we’d published papers, explain how you can measure the assembly index of molecules.
Lex Fridman
(00:05:01)
Okay, so that’s not so trivial to figure out. So when you look at an object, we could say a molecule, we could say object more generally. To figure out the minimum number of steps it takes to create that object, that doesn’t seem like a trivial thing to do.
Lee Cronin
(00:05:17)
So with molecules, it is not trivial, but it is possible, because what you can do, and because I’m a chemist, I kind of see the world through the lens of chemistry, is break the molecule apart and break bonds. And if you take a molecule and you break it all apart, you have a bunch of atoms, and then you say, “Okay, I’m going to then take the atoms and form bonds and go up the chain of events to make the molecule.”

(00:05:46)
And that’s what made me realize, take a toy example, literally a toy example: take a Lego object, which is built up of Lego blocks. You could do exactly the same thing. In this case, the Lego blocks are naturally the smallest; they’re the atoms in the actual composite Lego architecture. But then if you maybe take a couple of blocks and put them together in a certain way, maybe they’re offset in some way, that offset is in the memory. You can use that offset again with only a penalty of one, and you can then make a square, a triangle, and keep going.

(00:06:19)
And you remember those motifs on the chain. So you can then leap from the start, with all the Lego blocks or atoms just laid out in front of you, and say, “Right, I’ll take you, you, you,” connect, and do the least amount of work. So it’s really like the smallest steps you can take on the graph to make the object. And so for molecules, it came relatively intuitively. And then we started to apply it to language. We’ve even started to apply it to mathematical theorems. But I’m so well out of my depth. But it looks like you can take a minimum set of axioms and then start to build up mathematical architectures in the same way. And then the shortest path to get there is something interesting that I don’t yet understand.
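As an illustration of the minimum-steps idea, here is a small sketch, my own toy rather than anything from the paper, that computes the assembly index of a short string: single characters are the free building blocks, each concatenation of two already-built fragments is one step, and earlier fragments can be reused.

```python
# Minimal sketch (illustrative, not from the conversation): assembly index of a
# short string, where single characters are free, each join of two existing
# fragments costs one step, and earlier fragments can be reused.
from itertools import count

def assembly_index(target: str) -> int:
    basics = frozenset(target)  # the "atoms": single characters, free to use

    def reachable(pool: frozenset, steps_left: int) -> bool:
        if target in pool:
            return True
        if steps_left == 0:
            return False
        for a in pool:
            for b in pool:
                new = a + b
                # only keep fragments that still appear inside the target
                if new not in pool and new in target:
                    if reachable(pool | {new}, steps_left - 1):
                        return True
        return False

    # iterative deepening: the first depth that works is the minimum
    for depth in count(0):
        if reachable(basics, depth):
            return depth

print(assembly_index("BANANA"))  # 4: NA, NANA, BA, then BA+NANA (reusing NA)
```

Reusing “NA” is what brings “BANANA” down from the five joins a no-reuse construction would need to four; the search is exponential, which is why it is only practical for very short strings.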
Lex Fridman
(00:07:02)
So what’s the computational complexity of figuring out the shortest path, with molecules, with language, with mathematical theorems? It seems like once you have the fully constructed Lego castle, or whatever your favorite Lego world is, figuring out how to get there from the basic building blocks, is that an NP-hard problem? It’s a hard problem.
Lee Cronin
(00:07:28)
It’s a hard problem. But actually if you look at it, so the best way to look at it, let’s take a molecule. So if the molecule has 13 bonds, first of all, take 13 copies of the molecule and just cut all the bonds. So cut 12 bonds and then you just put them in order and then that’s how it works. And you keep looking for symmetry or copies so you can then shorten it as you go down.

(00:07:51)
And that becomes [inaudible 00:07:53] quite hard. For some natural product molecules, it becomes very hard. It’s not impossible, but we’re looking at the bounds on that at the moment. But as the object gets bigger it becomes really hard. But that’s the bad news. But the good news is there are shortcuts. And we might even be able to physically measure the complexity without computationally calculating it, which is kind of insane.
Lex Fridman
(00:08:20)
Wait, how would you do that?
Lee Cronin
(00:08:20)
Well, in the case of a molecule, if you shine light on a molecule, let’s take infrared: each of the bonds absorbs the infrared differently in what we call the fingerprint region. And because it’s quantized as well, you have all these discrete kind of absorbances. And my intuition, after we realized we could cut molecules up in mass spec, that was the first go at this. We did it using infrared. And the infrared gave us an even better correlation with assembly index. And we used another technique as well, in addition to infrared, called NMR, nuclear magnetic resonance, which tells you about the number of different magnetic environments in a molecule. And that also worked out. So we have three techniques, each of which independently gives us the same, or tending towards the same, assembly index for a molecule that we can calculate mathematically.
Lex Fridman
(00:09:12)
So these are all methods of mass spectrometry, mass spec. You scan a molecule, it gives you data in the form of a mass spectrum. And you’re saying that the data correlates to the assembly index?
Lee Cronin
(00:09:25)
Yeah.
Lex Fridman
(00:09:26)
So how generalizable is that shortcut, first of all it’s chemistry. And second of all, beyond that, that seems like a nice hack and you’re extremely knowledgeable about various aspects of chemistry. So you can say, okay, it kind of correlates. But the whole idea behind assembly theory paper and perhaps why it’s so controversial is that it reaches bigger. It reaches for the bigger general theory of objects in the universe.
Lee Cronin
(00:09:58)
Yeah, I’d say so. I’d agree. So I’ve started assembly theory of emoticons with my lab, believe it or not. So we take emojis, pixelate them and work out the assembly index of the emoji and then work out how many emojis you can make on the path of emoji. So there’s the uber emoji from which all other emojis emerge. So you can then take a photograph and by looking at the shortest path, by reproducing the pixels to make the image you want, you can measure that. So then you start to be able to take spatial data.

(00:10:32)
Now there’s some problems there. What is then the definition of the object? How many pixels? How do you break it down? And so we’re just learning all this right now.
Lex Fridman
(00:10:42)
So how do you compute, begin to compute the assembly index of a graphical, a set of pixels on a 2D plane that form a thing?
Lee Cronin
(00:10:54)
So you would first of all determine the resolution. So then what is your XY and what the number on the X and Y plane and then look at the surface area. And then you take all your emojis and make sure they’re all looked at the same resolution. And then we would basically then do exactly the same thing we would do for cutting the bonds. You’d cut bits out of the emoji and look at, you’d have a bag of pixels and you would then add those pixels together to make the overall emoji.
Lex Fridman
(00:11:26)
Wait, wait a minute. But first of all, not every pixels, I mean this is at the core, machine learning and computer vision, not every pixels that important. And there’s macro features, there’s micro features and all that kind of stuff.
Lee Cronin
(00:11:40)
Exactly.
Lex Fridman
(00:11:42)
The eyes appear in a lot of them, the smile appears in a lot of them.
Lee Cronin
(00:11:47)
So in the same way in chemistry we assume the bond is fundamental, what we do there and here is we assume the resolution, the scale at which we do it, is fundamental, and we’re just working that out. And you’re right, that will change, because as you take your lens out a bit, it will change dramatically.

(00:12:02)
But it’s just a new way of looking at it, not just compression, which is what we do right now in computer science and data. One big kind of misunderstanding is that assembly theory is telling you how compressed the object is. That’s not right. It’s how much information is required on a chain of events. Because the nice thing is, when you do compression in computer science, we’re wandering a bit here, but it’s kind of worth wandering I think, you assume you have instantaneous access to all the information in the memory. In assembly theory you say, “No, you don’t get access to that memory until you’ve done the work.” And then when you’ve done the work, you can have access to that memory, but not to the next one.

(00:12:45)
And this is how, in assembly theory, we talk about the four universes: the assembly universe, the assembly possible, the assembly contingent, and then the assembly observed. And they’re all scales in this combinatorial universe.
Lex Fridman
(00:12:58)
Yeah. Can you explain each one of them?
Lee Cronin
(00:13:00)
Yep. So the assembly universe is like anything goes, just combinatorial kind of explosion in everything.
Lex Fridman
(00:13:07)
So that’s the biggest one?
Lee Cronin
(00:13:08)
That’s the biggest one. It’s massive.
Lex Fridman
(00:13:09)
Assembly universe, assembly possible, assembly contingent, assembly observed. And the Y axis is assembly steps in time, and on the X axis, as the thing expands through time, more and more unique objects appear.
Lee Cronin
(00:13:29)
Yeah, so assembly universe, everything goes. Assembly possible, the laws of physics come in, in this case in chemistry, bonds assemble. So that means-
Lex Fridman
(00:13:39)
Those are extra constraints, I guess?
Lee Cronin
(00:13:40)
Yes. And they’re the only constraints. They’re the constraints at the base. So the way to look at it is you’ve got all your atoms, they’re quantized, and you can just bond them together. So then, in computer science speak, I suppose the assembly universe is just like no laws of physics. Things can fly through mountains, go beyond the speed of light. In the assembly possible, you have to apply the laws of physics, but you can get access to all the motifs instantaneously with no effort. So that means you could make anything.

(00:14:10)
Then the assembly contingent says “No, you can’t have access to the highly assembled object in the future until you’ve done the work in the past on the causal chain.” And that’s really, the really interesting shift where you go from assembly possible to assembly contingent. That is really the key thing in assembly theory that says you cannot just have instantaneous access to all those memories. You have to have done the work. Somehow the universe has to have somehow built a system that allows you to select that path rather than other paths.

(00:14:45)
And then the final thing the assembly observed is basically us saying, “Oh, these are the things we actually see. We can go backwards now and understand that they have been created by this causal process.”
Lex Fridman
(00:14:59)
Wait a minute. So when you say the universe has to construct the system that does the work, is that like the environment that allows for selection?
Lee Cronin
(00:15:08)
Yeah.
Lex Fridman
(00:15:09)
So that’s the thing that does the selection.
Lee Cronin
(00:15:10)
You could think about it in terms of a von Neumann constructor versus selection: a ribosome, a Tesla plant assembling Teslas. The difference between the assembly universe in Tesla land and the Tesla factory is, everyone says, “No, Teslas are just easy. They just spring out, you know how to make them all.” In the Tesla factory, you have to put things in sequence and out comes a Tesla.
Lex Fridman
(00:15:32)
So you’re talking about the factory?
Lee Cronin
(00:15:33)
Yes. This is really nice, super important point is that when I talk about the universe having a memory or there’s some magic, it’s not that. It’s that tells you that there must be a process encoded somewhere in physical reality, be it a cell, a Tesla factory or something else that is making that object. I’m not saying there’s some kind of woo-woo memory in the universe, morphic resonance or something. I’m saying that there is an actual causal process that is being directed, constrained in some way. So it’s not kind of just making everything.
Lex Fridman
(00:16:10)
Yeah, but Lee, what’s the factory that made the factory? First of all, you assume the laws of physics is just sprung to existence at the beginning. Those are constraints. But what makes the factory the environment that does the selection?
Lee Cronin
(00:16:29)
This is the question of, well, it’s the first interesting question that I want to answer out of four. I think the factory emerges in the interplay between the environment and the objects that are being built. And let me, I’ll have a go at explaining to you the shortest path.

(00:16:48)
So why is the shortest path important? Imagine you’ve got, I’m going to have to go chemistry for a moment, then abstract it. So imagine you’ve got a given environment where you have a budget of atoms you’re just flinging together. And the objects those atoms are being flung together into, say molecule A, they decompose. So molecules decompose over time. So the molecules in this environment, in this magic environment, have to not die, but they do die. They have a half-life.

(00:17:23)
So the only way the molecules can get through that environment, out the other side, let’s pretend the environment is a box they can go in and out of without dying. And there’s just an infinite supply of atoms coming, or, well, a large supply. The molecule gets built, but the molecule that is able to template itself being built and survives in the environment will basically reign supreme.

(00:17:49)
Now let’s say that molecule takes 10 steps and it is using a finite set of atoms. Now, let’s say another molecule, smart-ass molecule we’ll call it, comes in and can survive in that environment and can copy itself, but it only needs five steps. The molecule that only needs five steps continues; both molecules are being destroyed, but they’re creating themselves faster than they can be destroyed. You can see that the shortest path reigns supreme. So the shortest path tells us something super interesting about the minimal amount of information required to propagate that motif in time and space. And it seems to be like some kind of conservation law.
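To see the arithmetic behind “the shortest path reigns supreme”, here is a toy simulation, with made-up rates, where the copying rate is simply one over the number of assembly steps and both molecules decay identically:

```python
# Toy competition between two self-replicating "molecules" that differ only in
# how many assembly steps they need (all rates are made up, purely illustrative).

def simulate(steps_a=10, steps_b=5, decay=0.05, dt=1.0, t_max=200.0):
    a, b = 1.0, 1.0                           # starting populations
    for _ in range(int(t_max / dt)):
        a += dt * (a / steps_a - decay * a)   # copy rate ~ 1 / path length
        b += dt * (b / steps_b - decay * b)   # same decay for both
    return a, b

a, b = simulate()
print(f"10-step molecule: {a:.2e}   5-step molecule: {b:.2e}")
```

After a couple of hundred time steps the five-step replicator is roughly eight orders of magnitude more abundant, which is the sense in which the shorter path wins.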
Lex Fridman
(00:18:35)
So one of the intuitions you have is the propagation of motifs in time will be done by the things that can construct themselves in the shortest path.
Lee Cronin
(00:18:47)
Yeah.
Lex Fridman
(00:18:47)
So you can assume that most of the objects in the universe are built in the shortest, in the most efficient way. Big leap I just took there.
Lee Cronin
(00:18:58)
Yeah. Yes and no, because there are other things. So in the limit, yes, because you want to tell the difference between things that have required a factory to build them and just random processes. But you can find instances where the shortest path isn’t taken for an individual object, individual function. And people go, “Ah, that means the shortest path isn’t right.” And then I say, “Well, I don’t know. I think it’s right still because,” so of course, because there are other driving forces, it’s not just one molecule.

(00:19:33)
Now you start to consider two objects; you have a joint assembly space. And now it’s a compromise: it’s not just making A and B each in the shortest path, you want to be able to make A and B together in the shortest path, which might mean that A is slightly longer, a compromise. So when you see slightly more nesting in the construction, when you take a given object, that can look longer. But that’s because of the overall function; the object is still trying to be efficient. And this is still very hand-wavy and maybe has no leg to stand on, but we think we’re getting somewhere with that.
Lex Fridman
(00:20:09)
And there’s probably some parallelization, right?
Lee Cronin
(00:20:12)
Yeah.
Lex Fridman
(00:20:12)
So this is not sequential. The building is, I guess.
Lee Cronin
(00:20:17)
No, you’re right.
Lex Fridman
(00:20:18)
When you’re talking about complex objects, you don’t have to work sequentially. You can work in parallel, you can get your friends together and they can…
Lee Cronin
(00:20:25)
Yeah, and the thing we’re working on right now is how to understand these parallel processes. Now there’s a new thing we’ve introduced called assembly depth. And assembly depth can be lower than the assembly index for a molecule when they’re cooperating together because exactly this parallel processing is going on. And my team have been working this out in the last few weeks because we’re looking at what compromises does nature need to make when it’s making molecules in a cell? And I wonder if maybe like, well, I’m always leaping out of my competence, but in economics, I’m just wondering if you could apply this in economic processes. It seems like capitalism is very good at finding shortest path every time. And there are ludicrous things that happen because actually the cost function has been minimized.

(00:21:15)
And so I keep seeing parallels everywhere where there are complex nested systems where if you give it enough time and you introduce a bit of heterogeneity, the system readjusts and finds a new shortest path. But the shortest path isn’t fixed on just one molecule now. It’s in the actual existence of the object over time. And that object could be a city, it could be a cell, it could be a factory, but I think we’re going way beyond molecules and my competence so probably should go back to molecules, but hey.

Assembly equation

Lex Fridman
(00:21:44)
All right, before we get too far, let’s talk about the assembly equation. Okay. How should we do this? Let me just even read that part of the paper. We define assembly as the total amount of selection necessary to produce an ensemble of observed objects quantified using equation one. The equation basically has A on one side, which is the assembly of the ensemble, and then a sum from one to N, where N is the total number of unique objects.

(00:22:20)
And then there are a few variables in there that include the assembly index and the copy number, which we’ll talk about. That’s an interesting, I don’t remember you talking about that. That’s an interesting addition and I think a powerful one. It has to do with the fact that you can create pretty complex objects randomly, and in order to know that they’re not random, that there’s a factory involved, you need to see a bunch of them. That’s the intuition there. It’s an interesting intuition, and then some normalization. What else is in there-
Lee Cronin
(00:22:54)
N minus one, just to make sure there’s more than one object. One object could be a one-off and random. And then you have more than one identical object. That’s interesting.
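For reference, here is the assembly equation being described, reconstructed to the best of my reading of the paper, so treat the exact notation as an assumption rather than a quote:

```latex
A \;=\; \sum_{i=1}^{N} e^{a_i} \left( \frac{n_i - 1}{N_T} \right)
```

where a_i is the assembly index of unique object i, n_i is its copy number, N is the number of unique objects, and N_T is the total number of objects observed (the normalization); the n_i − 1 factor is the “more than one identical copy” condition just mentioned.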
Lex Fridman
(00:23:03)
When there’s two of a thing.
Lee Cronin
(00:23:05)
Two of a thing is super important, especially if the assembly index is high.
Lex Fridman
(00:23:10)
So we could ask several questions here. One, let’s talk about selection. What is this term selection? What is this term evolution that we’re referring to? Which aspect of Darwinian evolution are we referring to? That’s interesting here.
Lee Cronin
(00:23:26)
Yeah, so this is probably where we should talk about the paper for a second. The paper, what it did is it kind of annoyed, we didn’t know it would. It got attention, and obviously the angry people were annoyed.
Lex Fridman
(00:23:39)
There’s angry people in the world. That’s good.
Lee Cronin
(00:23:41)
So what happened is the evolutionary biologists got angry. We were not expecting that. We thought evolutionary biologists would be cool. I knew that some, not many, computational complexity people would get angry, because I’ve kind of been poking them, and maybe I deserved it, but I was trying to poke them in a productive way. And then the physicists kind of got grumpy because the initial conditions tell everything. The prebiotic chemists got slightly grumpy because there’s not enough chemistry in there. Then finally, when the creationists said it wasn’t creationist enough, I was like, “I’ve done my job.”
Lex Fridman
(00:24:13)
You’re saying the physics, they say, because you’re basically saying that physics is not enough to tell the story of how biology emerges?
Lee Cronin
(00:24:22)
I think so.
Lex Fridman
(00:24:22)
And then they say that physics is the beginning and the end of the story.
Lee Cronin
(00:24:29)
So what happened is the reason why people put the phone down on the call of the paper, if you view reading the paper like a phone call, they got to the abstract and in the abstract-
Lex Fridman
(00:24:39)
First sentence is pretty strong.
Lee Cronin
(00:24:40)
The first two sentences caused everybody-
Lex Fridman
(00:24:42)
Scientists have grappled with reconciling biological evolution with the immutable laws of the universe defined by physics.
Lee Cronin
(00:24:51)
True, right? There’s nothing wrong with that statement. Totally true.
Lex Fridman
(00:24:55)
Yeah. These laws underpin life’s origin, evolution, and the development of human culture and technology, yet they do not predict the emergence of these phenomena. Wow. First of all, we should say the title of the paper, this paper was accepted and published in Nature. The title is Assembly Theory Explains and Quantifies Selection and Evolution, very humble title. And the entirety of the paper, I think, presents interesting ideas, but reaches high.
Lee Cronin
(00:25:26)
I am not… I would do it all again. This paper was actually on the pre-print server for over a year.
Lex Fridman
(00:25:33)
You regret nothing?
Lee Cronin
(00:25:34)
Yeah.
Lee Cronin
(00:25:35)
I think, yeah, I don’t regret anything.
Lex Fridman
(00:25:37)
You and Frank Sinatra did it your way.
Lee Cronin
(00:25:39)
What I love about being a scientist is sometimes because I’m a bit dim and I don’t understand what people are telling me, I want to get to the point. This paper says, “Hey, the laws of physics are really cool, the universe is great, but they don’t really, it’s not intuitive that you just run the standard model and get life out.” I think most physicists might go, “Yeah, it’s not just, we can’t just go back and say that’s what happened.” Because physics can’t explain the origin of life yet. That doesn’t mean it won’t or can’t. Okay. Just to be clear. Sorry intelligent designers, we are going to get there.

(00:26:16)
Second point, we say that evolution works, but we don’t know how evolution got going. So biological evolution and biological selection. So for me, this seems like a simple continuum. So when I mentioned selection and evolution in the title, I think, and in the abstract, we should have maybe prefaced that and said non-biological selection and non-biological evolutions. And then that might have made it even more crystal clear. But I didn’t think that biology, evolutionary biology, should be so bold to claim ownership of selection and evolution.

(00:26:49)
And secondly, a lot of evolutionary biologists seem to dismiss the origin of life question and just say it’s obvious. And that causes a real problem scientifically, because when the physicists are like, “We own the universe. The universe is good, we explain all of it, look at us,” and even the biologists say, “We can explain biology,” the poor chemists in the middle are going, “But hang on.”

(00:27:12)
And this paper kind of says, “Hey, there is an interesting disconnect between physics and biology. And that’s at the point at which memories get made in chemistry through bonds. And hey, let’s look at this close and see if we can quantify it.” So yeah, I never expected the paper to get that much interest. And still, it’s only been published just over a month ago now.
Lex Fridman
(00:27:38)
So just to linger on the selection, what is the broader sense of what selection means?
Lee Cronin
(00:27:46)
Yeah, that’s really good. So for selection, this is where, for me, the concept of an object is something that can persist in time and not die, but basically can be broken up. So if I was going to kind of bolster the definition of an object: if something can form and persist for a long period of time under an existing environment that could destroy other, and I’m going to use anthropomorphic terms, I apologize, weaker or less robust objects, then the environment could have selected that.

(00:28:30)
So good chemistry examples: if you took some carbon and you made a chain of carbon atoms, whereas if you took some, I don’t know, some carbon, nitrogen and oxygen and made chains from those, you’d start to get different reactions and rearrangements. So a chain of carbon atoms might be more resistant to falling apart under acidic or basic conditions versus another set of molecules. So it survives in that environment. So the acid pond, the resistant molecule can get through. And then that molecule goes into another environment. So that environment, now maybe instead of being an acid pond, is a basic pond, or maybe it’s an oxidizing pond. And so if you’ve got carbon and it goes into an oxidizing pond, maybe the carbon starts to oxidize and break apart. So you go through all these kinds of obstacle courses, if you like, given by reality. So selection happens when an object survives in an environment for some time.

(00:29:33)
And this is the thing that’s super subtle. The object has to be continually being destroyed and made by a process. So it’s not just about the object now, it’s about the process in time that makes it, because a rock could just stand on the mountainside for 4 billion years and nothing happens to it. And that’s not necessarily really advanced selection. So for selection to get really interesting, you need to have a turnover in time. You need to be continually creating objects, producing them, what we call discovery time. So there’s a discovery time for an object.

(00:30:07)
When that object is discovered, if it’s say a molecule that can then act on itself or the chain of events that caused itself to bolster its formation, then you go from discovery time to production time and suddenly you have more of it in the universe. So it could be a self-replicating molecule and the interaction of the molecule in the environment, in the warm little pond or in the sea or wherever in the bubble could then start to build a proto factory, the environment.

(00:30:34)
So really, to answer your question, what the factory is: the factory is the environment, but it’s not very autonomous, it’s not very redundant. There’s lots of things that could go wrong. So once you get high enough up the hierarchy of networks, of interactions, something needs to happen that needs to be compressed into a smaller volume and made resistant, robust, because in biology, selection and evolution are robust: you have error correction built in. There are really good ways of basically making sure propagation goes on.

(00:31:07)
So really, the difference between inorganic, abiotic selection and evolution and selection and evolution in biology is robustness: the ability to propagate, the ability to survive in lots of different environments. Whereas our poor little sole inorganic molecule, whatever, just dies in lots of different environments. So there’s something super special that happens from the inorganic molecule in the environment that kills it to where you’ve got evolution and cells can survive everywhere.
Lex Fridman
(00:31:44)
How special is that? How do you know those kinds of evolution factors aren’t everywhere in the universe?
Lee Cronin
(00:31:51)
I don’t, and I’m excited because I think selection isn’t special at all. I think what is special is the history of the environments on earth that gave rise to the first cell that now has taken all those environments and is now more autonomous. And I would like to think that, you know this paper could be very wrong, but I don’t think it’s very wrong. I mean it’s certainly wrong, but it’s less wrong than some other ideas, I hope, right? And if this inspires us to go and look for selection in the universe because we now have an equation where we can say, we can look for selection going on and say, “Oh, that’s interesting. We seem to have a process. It’s giving us high copy number objects that also are highly complex, but that doesn’t look like life as we know it.”

(00:32:46)
And we use that and say, “Oh, there’s a hydrothermal vent. Oh, there’s a process going on. There’s molecular networks,” because the assembly equation is not only meant to identify at the higher end advanced selection, what you get, I would call in biology super advanced selection. And even, you could use the assembly equation to look for technology and God forbid we could talk about consciousness and abstraction, but let’s keep it primitive, molecules and biology. So I think the real power of the assembly equation is to say how much selection is going on in this space.

(00:33:20)
And there’s a really simple thought experiment I could do: you have a little Petri dish, and on that Petri dish you put some simple food. So the assembly index of all the sugars and everything is quite low. And then you put in a single E. coli cell and then you say, “I’m going to measure the amount of assembly in the box.” So it’s quite low, but the rate of change of assembly, dA/dt, will go [inaudible 00:33:47] sigmoidal as it eats all the food, and the E. coli cells will replicate, because they take all the food, they copy themselves, the assembly index of all the molecules goes up, up and up until the food is exhausted in the box. So now the E. coli’s stopped-
Lee Cronin
(00:34:00)
… in the box. So now the E. coli’s stopped… I mean, die is probably a strong word. They stopped respiring because all the food is gone. But suddenly, the amount of assembly in the box has gone up gigantically, because that one E. coli factory has just eaten through the food, milled lots of other E. coli factories, run out of food and stopped. And so, looking at that… So in the initial box, although the amount of assembly was really small, it was able to replicate and use all the food and go up. And that’s what we’re trying to do in the lab, actually, is make those experiments and see if we can spot the emergence of molecular networks that are producing complexity as we feed in raw materials and we feed in a challenge, an environment. We try and kill the molecules. And really, that’s the main idea for the entire paper.
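As a cartoon of that Petri-dish thought experiment, with made-up numbers, here is the food-to-cells conversion that produces the S-shaped rise in “assembly in the box” being described:

```python
# Cartoon of the Petri-dish experiment: cells (a proxy for total assembly in
# the box) grow by converting food, so the curve rises along an S-shape and
# flattens once the food is exhausted. All numbers are made up.

food, cells, dt = 1.0, 0.001, 0.1
curve = []
for _ in range(1000):
    eaten = 2.0 * cells * food * dt    # replication needs food
    cells += eaten
    food = max(food - eaten, 0.0)      # food is converted into cells
    curve.append(cells)

print(f"assembly proxy at start {curve[0]:.4f}, at end {curve[-1]:.4f}")
```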
Lex Fridman
(00:34:52)
Yeah, and see if you can measure the changes in the assembly index throughout the whole system.
Lee Cronin
(00:34:56)
Yeah.

Discovering alien life

Lex Fridman
(00:34:57)
Okay. What about, if I show up to a new planet, we’ll go to Mars or some other planet from a different solar system, how do we use assembly index there to discover alien life?
Lee Cronin
(00:35:11)
Very simply, actually. Let’s say we’ll go to Mars with a mass spectrometer with a sufficiently high resolution. So what you have to be able to do, and a good thing about mass spec, is that you can select the molecule from the mass, and then, if it’s high enough resolution, you can be more and more sure that you’re just seeing identical copies. You can count them. And then you fragment them and you count the number of fragments, and look at the molecular weight. And the higher the molecular weight and the higher the number of fragments, the higher the assembly index.

(00:35:43)
So if you go to Mars and you take a mass spec with high enough resolution, and you can find molecules, as a guide, on earth, if you find molecules, say, greater than 350 molecular weight, with more than 15 fragments, you have found artifacts that can only be produced, at least on earth, by life. And now you would say, “Oh, well, maybe a geological process.” I would argue very virulently that that is not the case.

(00:36:10)
But we can say, “Look, if you don’t like the cutoff on earth, go up higher, 30, 100,” because there’s going to be a point where you can find a molecule with so many different parts that the chances of you getting a molecule that has a hundred different parts and finding a million identical copies, that’s just impossible. That could never happen in an infinite set of universes.
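A sketch of that detection rule, using the 350 molecular weight and 15-fragment cutoffs just mentioned; the peak data structure and the copy-count threshold are my own illustrative assumptions:

```python
# Hedged sketch of the rule described here: flag any species that is both heavy
# enough and fragment-rich enough, AND seen in many identical copies.
# The Peak structure and the 10,000-copy cutoff are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Peak:
    molecular_weight: float   # Da
    n_fragments: int          # distinct fragments seen for this mass
    copy_count: int           # identical copies detected

def looks_like_selection(peak: Peak,
                         min_weight: float = 350.0,
                         min_fragments: int = 15,
                         min_copies: int = 10_000) -> bool:
    """Complexity in abundance: heavy, fragment-rich, and highly copied."""
    return (peak.molecular_weight >= min_weight
            and peak.n_fragments >= min_fragments
            and peak.copy_count >= min_copies)

sample = [Peak(412.3, 18, 250_000), Peak(120.1, 4, 1_000_000)]
print([looks_like_selection(p) for p in sample])   # [True, False]
```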
Lex Fridman
(00:36:37)
Can you just linger on this copy number thing? A million different copies, what do you mean by copies and why is the number of copies important?
Lee Cronin
(00:36:49)
Yeah, that was so interesting. I always understood the copy number is really important, but I never explained it properly, for ages. And I kept having this, it goes back to this, if I give you a, I don’t know, a really complicated molecule, and I say it’s complicated, you could say, “Hey, that’s really complicated.” But is it just really random?
Lex Fridman
(00:37:12)
Mm-hmm.
Lee Cronin
(00:37:14)
So I realized that ultimate randomness and ultimate complexity are indistinguishable until you can see a structure in the randomness, so you can see copies.
Lex Fridman
(00:37:26)
So copies implies structure.
Lee Cronin
(00:37:31)
Yeah. The factory-
Lex Fridman
(00:37:34)
I mean, there’s a deep profound thing in there. Because if you just have a random process, you’re going to get a lot of complex, beautiful, sophisticated things.
Lee Cronin
(00:37:46)
Mm-hmm.
Lex Fridman
(00:37:47)
What makes them complex in the way we think life is complex or, yeah, something like a factory that’s operating under a selection processes, there should be copies. Is there some looseness about copies? What does it mean for two objects to be equal?
Lee Cronin
(00:38:06)
It’s all to do with the telescope or the microscope you’re using. And so, at the maximum resolution… The nice thing about chemists is they have this concept of the molecule, and they’re all familiar with the molecule. And molecules, you can hold in your hand, lots of them, identical copies. A molecule is actually a super important thing in chemistry, to say, look, you can have a mole of molecules, an Avogadro’s number of molecules, and they’re identical. What does that mean? That means that the molecular composition, the bonding and so on, the configuration is indistinguishable. You can hold them together. You can overlay them.

(00:38:43)
So the way I do it is if I say, “Here’s a bag of 10 identical molecules, let’s prove they’re identical.” You pick one out of the bag and you basically observe it, using some technique, and then you take it away, and then you take another one out. If you observe it using the same technique and you see no differences, they’re identical. It’s really interesting to get right. Because if you take, say, two molecules, molecules can be in different vibrational and rotational states. They’re moving all the time.

(00:39:09)
So in this respect, identical molecules have identical bonding. In this case, we don’t even talk about chirality, because we don’t have a chirality detector. So two identical molecules in one conception, assembly theory, basically considers both hands as being the same. But, of course, they’re not, they’re different. As soon as you have a chiral distinguisher to detect the left and the right hand, they become different. And so, it’s to do with the detection system that you have and the resolution.
Lex Fridman
(00:39:39)
So I wonder if there’s an art and science to the, which detection system is used when you show up to a new planet.
Lee Cronin
(00:39:49)
Yeah. Yeah, yeah.
Lex Fridman
(00:39:49)
So you’re talking about chemistry a lot today. We have standardized detection systems of how to compare molecules. So when you start to talk about emojis and language and mathematical theorems and, I don’t know, more sophisticated things at different scale, at a smaller scale than molecules, at a larger scale than molecules, what detection… If we look at the difference between you and me, Lex and Lee, are we the same? Are we different?
Lee Cronin
(00:40:24)
Sure. I mean, of course we’re different close up, but if you zoom out a little bit, we will morphologically look the same. Height characteristics, hair length, stuff like that.
Lex Fridman
(00:40:35)
Well, also, the species and-
Lee Cronin
(00:40:37)
Yeah, yeah, yeah.
Lex Fridman
(00:40:38)
… and also there’s a sense why we’re both from earth.
Lee Cronin
(00:40:42)
Yeah, I agree. I mean, this is the power of assembly theory in that regard. So the way to look at it: if you have a box of objects and they’re all indistinguishable using your technique, what you then do is you look at the assembly index. Now, if the assembly index of them is really low and they’re all indistinguishable, then that’s telling you that you have to go to another resolution. So it is a sliding scale. It’s nice.
Lex Fridman
(00:41:15)
Got it. So those two are in tension with each other.
Lee Cronin
(00:41:18)
Yeah.
Lex Fridman
(00:41:18)
The number of copies and the assembly index.
Lee Cronin
(00:41:20)
Yeah.
Lex Fridman
(00:41:22)
That’s really, really interesting. So, okay. So you show up to a new planet, you’ll be doing what?
Lee Cronin
(00:41:28)
I would do mass spec. I would bring-
Lex Fridman
(00:41:30)
On a sample of what? First of all, how big of a scoop do you take? Do you just take a scoop? What… So we’re looking for primitive life.
Lee Cronin
(00:41:42)
I would look… Yeah, so if you’re just going to Mars or Titan or Enceladus, or somewhere, so a number of ways of doing it. So you could take a large scoop or you go through the atmosphere and detect stuff. You could make a life meter, right? One of Sarah’s colleagues at ASU, Paul Davies, keeps calling it a life meter, which is quite a nice idea. Because you think about it, if you’ve got a living system that’s producing these highly complex molecules and they drift away, and they’re in a highly demanding environment, they could be burnt, right? So they could just be falling apart. So you want to sniff a little bit of complexity and say warmer, warmer, warmer. Oh, we’ve found life, we found the alien. We’ve found the alien Elon Musk, smoking a joint in the bottom of the cave on Mars, or Elon himself, whatever, right?
Lex Fridman
(00:41:42)
Yeah. Mm-hmm.
Lee Cronin
(00:42:34)
You say, “Okay, found it.” So what you can do is a mass spectrometer, you could just look for things in the gas phase or you go on the surface, drill down, because you want to find molecules that are… Well, you’ve either got to find the source, living system, because the problem with just looking for complexity is it gets burnt away. So in a harsh environment on, say, on the surface of Mars, there’s a very low probability that you’re going to find really complex molecules because of all the radiation and so on.

(00:43:05)
If you drill down a little bit, you could drill down a bit into soil that’s billions of years old. Then I would put in some solvent, water, alcohol, or something, or take a scoop, make it volatile, put it into the mass spectrometer and just try and detect high complexity, high abundant molecules. And if you get them, hey, presto, you can have evidence of life. Wouldn’t that then be great if you could say, “Okay, we’ve found evidence of life, now we want to keep the life meter, keep searching for more and more complexity,” until you actually find living cells. And you can get those new living cells and then you could bring them back to earth or you could try and sequence them. You could see that they have different DNA and proteins.
Lex Fridman
(00:43:48)
Go along the gradient of the life meter.
Lee Cronin
(00:43:50)
Exactly.
Lex Fridman
(00:43:50)
How would you build a life meter? Let’s say we’re together, starting new-
Lee Cronin
(00:43:50)
Just a mass spectrometer.
Lex Fridman
(00:43:56)
… new company, launching a life-
Lee Cronin
(00:43:56)
Mass spectrometer would be the first way of doing it. Just take-
Lex Fridman
(00:43:59)
No, no, no, but that’s one of the major components of it. But I’m talking about-
Lee Cronin
(00:44:03)
I would-
Lex Fridman
(00:44:04)
… if it’s a device and branding, logo we got to talk about-
Lee Cronin
(00:44:04)
All right.
Lex Fridman
(00:44:08)
… that’s later. But what’s the input and what’s the… How do you get to the metered output?
Lee Cronin
(00:44:15)
So I would take a… So my life meter, our life meter. There you go.
Lex Fridman
(00:44:21)
Oh, thank you.
Lee Cronin
(00:44:21)
Yeah, you’re welcome, would have both infrared and mass spec. It would have two ports so it could shine a light. And so, what it would do is, you would have a vacuum chamber, you would have an electrostatic analyzer, and you’d have a monochromator producing infrared. You’d add the sample. So you’d take a scoop of the sample, put it in the life meter, and it would then add a solvent or heat up the sample so some volatiles come off. The volatiles would then be put into the mass… into an electrostatic trap, and you’d weigh the molecules and fragment them. Alternatively, you’d shine infrared light on them and count the number of bands. But you’d have to, in that case, do some separation, because you want to separate… And so, in mass spec, it’s really nice and convenient, because you can separate electrostatically, but you need to have that.
Lex Fridman
(00:45:12)
Can you do it in real time?
Lee Cronin
(00:45:13)
Yeah, pretty much. Pretty much, yeah. So let’s go all the way back. Okay, we’re really going to get this-
Lex Fridman
(00:45:13)
Let’s go.
Lee Cronin
(00:45:18)
… Lex’s life… Lex and Lee’s life meter.
Lex Fridman
(00:45:20)
No, I like Lex and Lee. It’s a good ring to it.
Lee Cronin
(00:45:25)
All right. So you have a vacuum chamber, you have a little nose. The nose would have some packing material. So you would take your sample, add it onto the nose, add a solvent or a gas. It would then be sucked up the nose, and that would be separated using what we call chromatography. And then as each band comes off the nose, we’ll then do mass spec and infrared. And in the case of the infrared, count the number of bands; in the case of mass spec, count the number of fragments and weigh it.

(00:45:56)
And then the further up you go in molecular weight range for the mass spec, and in the number of bands, you go up and up and up from the dead, to interesting, interesting, over the threshold, oh my gosh, earth life, and then right up to the batshit crazy, this is definitely alien intelligence that’s made this, right? You could almost go all the way there. Same in the infrared. And pretty simple.

(00:46:18)
The thing that is really problematical is that for many years, decades, what people have done, and I can’t blame them, is they’ve been obsessing about small biomarkers that we find on earth, amino acids, like single amino acids, or evidence of small molecules and these things, and looking for those, while I’m looking for complexity. The beautiful thing about this is you can look for complexity without earth chemistry bias or earth biology bias. So assembly theory is just a way of saying, hey, complexity in abundance is evidence of selection. That’s how our universal life meter will work.
Lex Fridman
(00:46:59)
Complexity in abundance is evidence of selection. Okay. So let’s apply our life meter to earth. If we were just to apply assembly index measurements to earth, what kind of stuff are we going to get? What’s impressive about-
Lee Cronin
(00:46:59)
So-
Lex Fridman
(00:47:26)
… some of the complexity on earth?
Lee Cronin
(00:47:28)
… So we did this a few years ago when I was trying to convince NASA and colleagues that this technique could work. And honestly, it’s so funny, because everyone’s like, “No, it ain’t going to work.” The chemists were saying, “Of course there are complicated molecules out there you can detect that just form randomly.” And I was like, “Really?” It’s a bit like, I don’t know, someone saying, “Of course Darwin’s textbook was just written randomly by some monkeys and a typewriter.” For me, it was like, “Really?” And I’ve pushed a lot on the chemists now, and I think most of them are on board, but not totally. I really had some big arguments, but the copy number helped there. Because I think I confused the chemists by saying one-off. And then when I made the copy number clear, I think that made it a little bit easier.
Lex Fridman
(00:48:16)
Just to clarify, a chemist might say that, of course, out there, outside of earth, there are complex molecules?
Lee Cronin
(00:48:24)
Yes.
Lex Fridman
(00:48:24)
Okay. And then you’re saying, “Wait a minute, that’s like saying, ‘Of course there’s aliens out there.'” Like you-
Lee Cronin
(00:48:31)
Yeah, exactly that.
Lex Fridman
(00:48:32)
Okay.
Lee Cronin
(00:48:32)
Exactly.
Lex Fridman
(00:48:34)
You clarify that, that’s actually a very interesting question and we should be looking for complex molecules of which the copy number is two or greater.
Lee Cronin
(00:48:45)
Yeah, exactly. So on earth, coming back to earth, what we did is we took a whole bunch of samples, and we were running prebiotic chemistry experiments in the lab. We took various inorganic minerals and extracted them, looked at the volatiles. Because there’s a special way of treating minerals and polymers in assembly theory. In this, in our life machine, we’re looking at molecules. We don’t care about polymers, because they’re not volatile. You can’t hold them. How can you make… If you can’t discern that they’re identical, then it’s very difficult for you to work out if they’ve undergone selection or they’re just a random mess.

(00:49:26)
Same with some minerals, but we can come back to that. So basically what we did: we’ve got a whole load of samples, inorganic ones, we got a load of… we got Scotch whiskey and also got-
Lex Fridman
(00:49:36)
Nice.
Lee Cronin
(00:49:36)
… took Ardbeg, which is one of my favorite whiskeys, which is very peaty. And another-
Lex Fridman
(00:49:41)
What’s peaty mean?
Lee Cronin
(00:49:42)
It is like… So the way that in Scotland, on Islay, which is a little island, the scotch, the whiskey, is left to mature in barrels. It’s said that the peat, the complex molecules in the peat, find their way through into the whiskey, and that’s what gives it this intense brown color and really complex flavor. It’s literally molecular complexity that does that. And so, vodka’s the complete opposite. It’s just pure, right?-
Lex Fridman
(00:50:16)
So the better the whiskey, the higher the assembly index, the higher the assembly index, the better the whiskey.
Lee Cronin
(00:50:20)
I mean, I really love deep, peaty Scottish whiskeys. Near my house, there is one of the lowland distilleries, called Glengoyne. It’s still a beautiful whiskey, but not as complex. So for fun, I took some Glengoyne whiskey and Ardbeg and put them into the mass spec and measured the assembly index. I also got E. coli. So the way we do it: take the E. coli, break the cell apart, take it all apart. And also got some beer. And people were ridiculing us, saying, “Oh, beer is evidence of complexity.”

(00:50:53)
And one of the computational complexity people, who was just throwing… yeah, he’s very vigorous in his disagreement with assembly theory, was just saying, “You don’t know what you’re doing. Even beer is more complicated than a human.” What he didn’t realize is that it’s not beer, per se: it’s taking the yeast, taking the extract, breaking the cells, extracting the molecules, and just looking at the profile of the molecules to see if there’s anything over the threshold. And we also put in a really complex molecule, Taxol.

(00:51:24)
So we took all of these, but also NASA gave us, I think, five samples, and they wouldn’t tell us what they were. They said, “No, we don’t believe you’re going to get this to work.” And they really gave us some super complex samples. They gave us two fossils, one that was a million years old and one that was 10,000 years old, something from the Antarctic seabed. They gave us some Murchison meteorite, and a few others. We put them through the system. So we took all the samples, treated them all identically, put them into the mass spec, fragmented them, counted.

(00:51:56)
And in this case, implicit in the measurement was that, in mass spec, you only detect peaks when you’ve got more than, say, let’s say 10,000 identical molecules. So the copy number’s already baked in, but wasn’t quantified, which is super important there. This was in the first paper. Because I was like, it’s abundant, of course.

(00:52:17)
And when you then took it all out, we found that the biological samples gave you molecules that had an assembly index greater than 15. And all the abiotic samples were less than 15. And then we took the NASA samples and we looked at the ones that were more than 15, less than 15, and we gave them back to NASA, and they’re like, “Oh, gosh. Yep, dead, living, dead, living. You got it.” And that’s what we found on earth.
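The blinded-sample call at the end of that experiment reduces to a simple rule; here is a sketch with hypothetical sample names and values, using the assembly-index cutoff of 15 described above:

```python
# Sketch of the blinded-sample call: any sample containing molecules with
# assembly index above 15 is flagged as biological. Sample values are made up.
THRESHOLD = 15

samples = {
    "whisky":          [12, 14, 17, 21],   # yeast-derived molecules push it over
    "E. coli extract": [9, 16, 19, 23],
    "Murchison-like":  [6, 8, 10, 12],
    "mineral blank":   [3, 5, 7],
}

for name, indices in samples.items():
    verdict = "living/biological" if max(indices) > THRESHOLD else "abiotic"
    print(f"{name:16s} max AI = {max(indices):2d} -> {verdict}")
```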
Lex Fridman
(00:52:44)
That’s a success.
Lee Cronin
(00:52:45)
Yeah. Oh yeah, resounding success.
Lex Fridman
(00:52:48)
Can you just go back to the beer and the E. coli? So what’s the assembly index on those?
Lee Cronin
(00:52:54)
So what we were able to do is, the assembly index of… We found high assembly index molecules originating from the beer sample and the E. coli sample.
Lex Fridman
(00:53:08)
Yeast in the beer.

Evolution of life on Earth

Lee Cronin
(00:53:10)
I didn’t know which one was higher. We didn’t really do any detail there. Because now we are doing that. Because one of the things we’ve done, it’s a secret, but I can tell you. I think it’s-
Lex Fridman
(00:53:23)
Nobody’s listening.
Lee Cronin
(00:53:25)
… well, is that we’ve just mapped the tree of life using assembly theory, because everyone said, ” Oh, you can’t do anything from biology.” And what we’re able to do is, so I think there’s three, well, two ways of doing tree of life… Well, three ways actually.
Lex Fridman
(00:53:38)
What’s the tree of life?
Lee Cronin
(00:53:39)
So the tree of life is basically tracing back the history of life on earth, all the different species, going back who evolved from what. And it all goes all the way back to the first life forms, and they branch off. And you have plant kingdom, the animal kingdom, the fungi kingdom, and different branches all the way up. And the way this was classically done, and I’m no evolutionary biologist. The evolutionary biologists tell me every day, at least 10 times… I want to be one though. I like biology, it’s cool.
Lex Fridman
(00:54:12)
Yeah, it’s very cool.
Lee Cronin
(00:54:13)
But basically-
Lex Fridman
(00:54:14)
Evolutionary.
Lee Cronin
(00:54:16)
… What Darwin and Mendeleev and all these people did is just, they drew pictures and they [inaudible 00:54:20] taxa. They were able to draw pictures and say, “Oh, these look like common classes.”
Lex Fridman
(00:54:26)
Yeah.
Lee Cronin
(00:54:26)
Then…
Lex Fridman
(00:54:29)
They’re artists really. They’re just…
Lee Cronin
(00:54:32)
They were able to find out a lot, right? Looking at vertebrates and invertebrates, the Cambrian explosion and all this stuff. And then came the genomic revolution and suddenly everyone used gene sequencing. And Craig Venter’s a good example. I think he’s gone around the world in his yacht, just picking up samples, looking for new species, where he’s found new species of life just from sequencing. It’s amazing. So you have taxonomy, you have sequencing, and then you can also do a little bit of molecular archeology, like measure the samples and form some inference.

(00:55:08)
What we did is we were able to fingerprint… So we took a load of random samples from all of biology and we used mass spectrometry. And what we did now is not just look for individual molecules, but we looked for coexisting molecules where we had to look at their joint assembly space. We were able to cut them apart and undergo recursion in the mass spec and infer some relationships. And we were able to recapitulate the tree of life using mass spectrometry, no sequencing and no drawing.
Lex Fridman
(00:55:41)
All right. Can you try to say that again, with a little more detail? So recreating, what does it take to recreate the tree of life? What does the reverse engineering process look like here?
Lee Cronin
(00:55:52)
So what you do is you take an unknown sample, you bung it into the mass spec, you get… Because this comes from what you’re asking, what do you see in E. coli?
Lex Fridman
(00:56:00)
Mm-hmm.
Lee Cronin
(00:56:00)
And so, in E. coli, you don't just see… It's not that the most sophisticated cells on earth make the most sophisticated molecules. It is the coexistence of lots of complex molecules above a threshold. And so, what we realized is you could fingerprint different life forms. So fungi make really complicated molecules. Why? Because they can't move. They have to make everything onsite.

(00:56:24)
Whereas some animals are lazy, they can just go eat the fungi, and they don't need to make very much. And so, what you do is you take the fingerprint, maybe the top however-many high molecular weight molecules you find in the sample, you fragment them to get their assembly indices, and then you can infer common origins of molecules. By reverse engineering the assembly space, you can infer common roots and look at what's called the joint assembly space.

(00:57:02)
But let's translate that into the experiment. Take a sample, bung it in the mass spec, take the top, say, 10 molecules, fragment them, and that gives you one fingerprint. Then you do it for another sample, you get another fingerprint. Now the question is, you say, "Hey, are these samples the same or different?" And that's what we've been able to do, basically by looking at the assembly space that these molecules create. Without any knowledge of assembly theory, you are unable to do it. With a knowledge of assembly theory, you can reconstruct the tree.
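A rough sketch of that fingerprinting step might look like the following; the peak list would come from the mass spec, and the `assembly_index` callable is a hypothetical stand-in for the fragmentation-based measurement.

```python
# A rough sketch of the fingerprinting step described above. The peak list
# and the `assembly_index` callable are hypothetical stand-ins for the real
# mass-spec measurement and the fragmentation-based index estimate.

from typing import List, Set, Tuple

Peak = Tuple[str, float]  # (molecule id, molecular weight), illustrative only

def fingerprint(peaks: List[Peak],
                assembly_index,          # callable: molecule id -> int
                top_n: int = 10) -> Set[Tuple[str, int]]:
    """Take the top-N highest molecular weight peaks in a sample and
    record each molecule together with its assembly index. The resulting
    set is the sample's 'fingerprint'."""
    heaviest = sorted(peaks, key=lambda p: p[1], reverse=True)[:top_n]
    return {(mol, assembly_index(mol)) for mol, _ in heaviest}
```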
Lex Fridman
(00:57:35)
How does knowing if they’re the same or different give you the tree?
Lee Cronin
(00:57:38)
Let's go to two leaves on different branches of the tree, right? What you can do is, by counting the number of differences, estimate how far back their common origin was.
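As an illustration of turning those pairwise differences into a tree, here is a toy sketch that uses ordinary hierarchical clustering as a stand-in for the published joint-assembly-space analysis; the fingerprints and molecule names are invented for the example, and SciPy is used only for the clustering step.

```python
# Toy sketch: pairwise fingerprint differences -> a tree, using standard
# hierarchical clustering (requires SciPy) as a stand-in for the published
# joint-assembly-space analysis. Fingerprints are sets of
# (molecule, assembly index) pairs, as in the sketch above.

from itertools import combinations
from scipy.cluster.hierarchy import linkage, dendrogram

def difference(fp_a: set, fp_b: set) -> float:
    """Fraction of features not shared by the two fingerprints
    (a simple Jaccard distance)."""
    union = fp_a | fp_b
    return 1.0 - len(fp_a & fp_b) / len(union) if union else 0.0

# Hypothetical fingerprints for illustration only.
fingerprints = {
    "yeast":  {("ergosterol", 18), ("atp", 16), ("nad", 17)},
    "e_coli": {("atp", 16), ("nad", 17), ("peptidoglycan_frag", 19)},
    "fungus": {("ergosterol", 18), ("penicillin_like", 21), ("nad", 17)},
}

names = list(fingerprints)
dist = [difference(fingerprints[a], fingerprints[b])
        for a, b in combinations(names, 2)]          # condensed distance matrix
tree = linkage(dist, method="average")               # UPGMA-style tree
print(dendrogram(tree, labels=names, no_plot=True)["ivl"])  # leaf order
```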
Lex Fridman
(00:57:48)
Got it.
Lee Cronin
(00:57:49)
And that’s what we do, and it just works. But when we realized you could even use assembly theory to recapitulate the tree of life with no gene sequencing, we were like, “Huh.”
Lex Fridman
(00:57:58)
So this is looking at samples that exist today in the world.
Lee Cronin
(00:58:01)
Yeah.
Lex Fridman
(00:58:01)
What about things that no longer exist? I mean, the tree contains information about the past-
Lee Cronin
(00:58:08)
I would-
Lex Fridman
(00:58:09)
… some of it is gone.
Lee Cronin
(00:58:11)
Yeah, absolutely. I would love to get old fossil samples and apply assembly theory and mass spec, and see if we can find new forms of life that are no longer amenable to gene sequencing, because the DNA is all gone. DNA and RNA are quite unstable, but some of the more complex molecules might be there. They might give you a hint of something new, or wouldn't it be great if you-
Lex Fridman
(00:58:11)
I understand.
Lee Cronin
(00:58:33)
… if you find a sample that's worth really persevering with, doing the proper extraction and PCR and so on, then sequencing it and putting it together-
Lex Fridman
(00:58:46)
So when a thing dies, you can still get some information about its complexity.
Lee Cronin
(00:58:50)
Yeah. And it appears that you can do some dating. Now, there are really good techniques. There's radiocarbon dating, there's longer-range dating looking at radioactive minerals, and so on. And in bone, you can look at what happens after something dies: you get what's called racemization, where the chirality in the polymers basically changes as you get decomposition, and the deviation from the pure enantiomer towards the mixture gives you a timescale, a half-life, so you can date when it died. I want to use assembly theory to see if I can date death and things, trace the tree of life, and also the decomposition of molecules.
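A minimal sketch of that racemization clock, assuming simple reversible first-order kinetics and a made-up rate constant; real rates depend strongly on the particular amino acid, temperature, and burial conditions.

```python
# A minimal sketch of the racemization-clock idea mentioned above: a freshly
# made biological polymer starts enantiopure, and the enantiomeric excess
# decays toward the racemic mixture after death. Assuming simple reversible
# first-order kinetics (L <-> D, rate constant k each way), the excess decays
# as exp(-2*k*t). The rate constant below is a made-up placeholder.

import math

def time_since_death(d_over_l: float, k_per_year: float) -> float:
    """Estimate years since death from the measured D/L ratio,
    assuming the sample was 100% L at death."""
    ee = (1.0 - d_over_l) / (1.0 + d_over_l)   # enantiomeric excess
    return math.log(1.0 / ee) / (2.0 * k_per_year)

# Illustrative numbers only: D/L = 0.3 with k = 1e-5 per year.
print(f"~{time_since_death(0.3, 1e-5):,.0f} years since death")
```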
Lex Fridman
(00:59:45)
Do you think it’s possible?
Lee Cronin
(00:59:46)
Oh yeah, without a doubt. It may not be better than what… I was just at a conference where there were some brilliant people looking at isotope enrichment and at how life enriches isotopes, and it's really sophisticated stuff that they're doing. But I think there's some fun to be had there, because it gives you another dimension of dating. How old is this molecule, or more importantly, how long ago was this molecule produced by life? The more complex the molecule, the more prospect for decomposition, oxidation, reorganization, loss of chirality, and all that jazz.

(01:00:21)
But what life also does is it enriches. As you get older, the amount of carbon-13 in you goes up, because of the way the bonding is in carbon-13. It has a slightly different bond strength than carbon-12. It's called a kinetic isotope effect. So you can literally date how old you are, or when you stopped metabolizing. So you could date someone's… how old they are, I think. I'm making this up, this might be right, but I think it's roughly right. From the amount of carbon-13 you have in you, you can estimate how old you are.
Lex Fridman
(01:00:56)
How old living humans are, or living organism?
Lee Cronin
(01:01:00)
Yeah, yeah. You could say, "Oh, this person is 10 years old and this person is 30 years old, because they've been metabolizing more carbon and they've accumulated it." That's the basic idea. It's probably completely the wrong timescale-

Response to criticism

Lex Fridman
(01:01:10)
Signatures of chemistry are fascinating. So you've been giving a lot of chemistry examples for assembly theory. What if we zoom out and look at objects at a bigger scale, really complex objects like humans or living organisms that are made up of millions or billions of other organisms? How do you try to apply assembly theory to that?
Lee Cronin
(01:01:38)
At the moment, we should be able to apply this to morphology in cells. So we're looking at cell surfaces, and really I'm trying to extend it further. It's just that we worked so hard to get this paper out and to get people to start discussing the ideas. But it's kind of funny, because I think the penny is falling on this. So yeah-
Lex Fridman
(01:02:03)
What does that even… What’s it mean for a penny to be-
Lee Cronin
(01:02:06)
I mean, no, the penny's dropped, right? A lot of people were like, "It's rubbish, it's rubbish. You've insulted me. It's wrong." I mean, the paper got published on the 4th of October. It had 2.3 million engagements on Twitter and it's been downloaded a few hundred thousand times. And someone actually wrote to me and said, "This is an example of really bad writing and what not to do." And I was like, if only all of my papers got read this much. Because that's the objective: if I'm publishing a paper, I want people to read it. I want to write that badly again.
Lex Fridman
(01:02:37)
Yeah. I don't know what the deep insight here is about the negativity in the space. I think it's probably the immune system of the scientific community, making sure that there's no bullshit that gets published, because that can do a lot of damage. It can shut down conversations in a way that's not productive.
Lee Cronin
(01:02:54)
I'll answer your question about the hierarchy in assembly, but let's go back to the perception of people saying the paper was badly written. I mean, of course we could improve it. We could always improve the clarity.
Lex Fridman
(01:03:04)
Let’s go there before we go to the hierarchy.
Lee Cronin
(01:03:08)
Yeah.
Lex Fridman
(01:03:08)
The paper has been criticized quite a bit. What has been some of the criticism that you've found most powerful, that you can understand, and can you explain it?
Lee Cronin
(01:03:23)
Yes. The most exciting criticism came from evolutionary biologists telling me that they thought the origin of life was a solved problem. And I was like, "Whoa, we're really onto something, because it's clearly not." And when you poked them on that, they just said, "No, you don't understand evolution." And I said, "No, no, I don't think you understand that evolution had to occur before biology, and there's a gap." That misunderstanding was really interesting for me, and it did cause an immune response.

(01:03:56)
The second thing was the physicists. The physicists were actually really polite, really nice about it. They just said, "Huh, we're not really sure about the initial conditions thing." But this is a really big debate that we should certainly get into, because the emergence of life was not encoded in the initial conditions of the universe, and I think assembly theory shows why it can't be. I'll say that-
Lex Fridman
(01:04:23)
Okay. Sure. If you could say that again.
Lee Cronin
(01:04:27)
The origin of, the emergence of life was not and cannot, in principle, be encoded in the initial conditions of the universe.
Lex Fridman
(01:04:35)
Just to clarify what you mean by life is what, high assembly index objects?
Lee Cronin
(01:04:39)
Yeah. And this goes back to your favorite subject.
Lex Fridman
(01:04:43)
What’s that?
Lee Cronin
(01:04:43)
Time.
Lex Fridman
(01:04:47)
Right. So why? What does time have to do with it?
Lee Cronin
(01:04:50)
I mean, probably we can come back to it later, but I think it might be, if we have time.
Lex Fridman
(01:04:56)
Yeah.
Lee Cronin
(01:04:56)
But I think that, I think I now understand how to explain how… Lots of people got angry with the assembly paper, but also, the ramification of this is how time is fundamental in the universe, and this notion of combinatorial spaces. And there are so many layers to this, but I think you have to become an intuitionist mathematician and you have to abandon Platonic mathematics. And also, Platonic mathematics has led physics astray, but there's a lot to unpack there. So we can go to the-
Lex Fridman
(01:05:34)
Platonic mathematics, okay. So the evolutionary biologists criticized it because, to them, the origin of life is understood and doesn't require an explanation that involves physics.
Lee Cronin
(01:05:51)
Yeah. It-
Lex Fridman
(01:05:51)
That’s their statement.
Lee Cronin
(01:05:54)
Well, I mean, they said lots of confusing things. Basically, the part of the evolutionary biology community that was vocal, some of them were really rude, really spiteful, and needlessly so, right? Because look, people misunderstand publication as well. Some people said, "How dare this be published in Nature. What a terrible journal." And I said to people, "Look, this is a brand new idea that's not only potentially going to change the way we look at biology, it's going to change the way we look at the universe."

(01:06:36)
And everyone's saying, "How dare you be so grandiose?" I'm like, "No, no, no. This is not hype. We're not saying we've discovered an alien in a closet somewhere, just for hype. We genuinely mean this to have that impact, or to ask the question." And the way people jumped on that sets a really bad precedent for young people who want to actually do something new.

(01:07:02)
Because this makes a bold claim, and the chances are that it's not correct. But what I wanted to do was a couple of things. I wanted to make a bold claim that was precise, testable, and correctable. Not another woolly information-in-biology argument, information-churning machine, blah, blah, blah. A concrete series of statements that can be falsified and explored, so that either the theory can be destroyed or built upon.
Lex Fridman
(01:07:32)
Well, what about the criticism that you're just putting a bunch of sexy names on something that's already obvious?
Lee Cronin
(01:07:42)
Yeah, that's really good. So the assembly index of a molecule is not obvious. No one had measured it before. And no one had thought to quantify selection, complexity, and copy number before in such a primitive, quantifiable way.
Lee Cronin
(01:08:00)
I think the nice thing about this paper is that it's a tribute to all the people who understand that biology does something very interesting. Some people call it negentropy, some people think about organizational principles. Lots of people were not shocked by the paper because they felt they'd done it before. A lot of the arguments we got, some people said, "Oh, it's rubbish. Oh, by the way, I had this idea 20 years before." I was like, "Which one? The rubbish part or the really revolutionary part?"

(01:08:35)
So this kind of plucked two strings at once. It plucked the string that there is something interesting that biology does, which we can see all around us but haven't quantified yet. And this is the first stab at quantifying that. So to the people who said, "This is obvious": if it's obvious, why have you not done it?
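For readers who want the flavor of that quantification, here is a sketch of an ensemble assembly measure that combines assembly index and copy number, roughly following the published assembly equation up to notation; the numbers in the example are invented.

```python
# A sketch of the kind of ensemble quantity being described: the assembly A of
# a collection of objects combines each unique object's assembly index a_i with
# its copy number n_i, so that high-assembly objects only contribute when they
# appear in many copies. This roughly follows the published assembly equation,
# A = sum_i exp(a_i) * (n_i - 1) / N_T, where N_T is the total number of
# objects observed.

import math
from typing import List, Tuple

def ensemble_assembly(objects: List[Tuple[int, int]]) -> float:
    """objects: list of (assembly_index, copy_number) for each unique object."""
    n_total = sum(n for _, n in objects)
    return sum(math.exp(a) * (n - 1) / n_total for a, n in objects)

# One abundant complex molecule dominates a sea of simple ones (toy numbers):
print(ensemble_assembly([(15, 1_000), (3, 1_000_000)]))
# A single copy of a complex molecule contributes nothing, however complex:
print(ensemble_assembly([(15, 1), (3, 1_000_000)]))
```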
Lex Fridman
(01:08:58)
Sure. But there's a few things to say there. One is, this is in part a philosophical framework, because it's not like you can apply this generally to any object in the universe. It's very chemistry-focused.
Lee Cronin
(01:09:15)
Yeah, well, I think you will be able to, we just haven't got there robustly yet. So let's go up a level. Let's go up from molecules to cells, because you would jump to people and I would jump to emoticons, and both are good and they will be assemblies…
Lex Fridman
(01:09:30)
Let's stick with cells, yeah. Good point.
Lee Cronin
(01:09:34)
If we go from molecules to assemblies, let's take a cellular assembly. A nice thing about a cell is you can tell the difference between a eukaryote and a prokaryote, right? The organelles are specialized differently. Then look at the cell surface: the cell surface has different glycosylation patterns, and these cells will stick together. Now let's go up a level: in multicellular creatures, you have cellular differentiation.

(01:09:57)
Now, if you think about how embryos develop, you go all the way back: those cells undergo differentiation in a causal way that's a feedback between the genetics and the biomechanics. I think we can apply assembly theory to tissue types. We can even apply it to different cell disease types. So that's what we're doing next. But we are trying to walk… The thing is, I want to leap ahead and go, whoa, apply it to culture. Clearly you can apply it to memes and culture. And we've also applied assembly theory to CAs, and not in the way you'd think…
Lex Fridman
(01:09:57)
Cellular automata, by the way.
Lee Cronin
(01:10:34)
Yeah, yeah. Cellular automata, not just in the way you'd think. Different CA rules were invented by different people at different times. And one of my coworkers, a very talented chap, basically realized that different people had different ideas, with different rules, and they copied each other and made slightly different cellular automaton rules and put them online. And so he was able to infer an assembly index and a copy number for a given rule. But I digress.

(01:11:04)
But it does show you can apply it at a higher scale. So what do we need to do to apply assembly theory to things? We need to agree there's a common set of building blocks. So in a cell, well, in a multicellular creature, you need to look back in time. There is the initial cell, from which the creature is fertilized and then starts to grow, and then there is cell differentiation. And you have to then build that causal chain on both of those. So that requires the development of the organism in time. Or, if you look at the cell surfaces and the cell types, they've got different features on the cell walls and inside the cell. So we're building up, but obviously I want to leap to things like emoticons, language, mathematical theorems.
Lex Fridman
(01:11:54)
But that’s a very large number of steps to get from a molecule to the human brain.
Lee Cronin
(01:12:01)
Yeah, and I think they are related, but in hierarchies of emergence, so you shouldn't compare them. I mean, the assembly index of a human brain, what does that even mean? Well, maybe we can look at the morphology of the human brain and say all human brains have this number of features in common. And then let's look at a brain in a whale or a dolphin or a chimpanzee or a bird and say, "Okay, let's look at the assembly indices and the number of features in these." And now the copy number is just how many birds there are, how many chimpanzees there are, how many humans there are.
Lex Fridman
(01:12:35)
But then, for that, you have to discover the features that you would be looking for.
Lee Cronin
(01:12:39)
Yeah, and that means you need to have some idea of the anatomy.
Lex Fridman
(01:12:43)
But is there an automated way to discover features?
Lee Cronin
(01:12:46)
I guess so. And I think this is a good way to apply machine learning and image recognition just to basically characterize things.
Lex Fridman
(01:12:55)
To apply compression to it, to see what emerges, and then use the features found in the compression as the thing that is searched for when you're measuring assembly index and copy number.
Lee Cronin
(01:13:09)
And the compression has to respect, remember, the assembly universe, where you have to go from assembly-possible to assembly-contingent, and that jump… Because assembly-possible is all possible brains, all possible features, all the time. But we know that on the tree of life, and also on the lineage of life going back to LUCA, the human brain just didn't spring into existence yesterday; it's a long lineage of brains going all the way back. And so if we could use assembly theory to understand the development, not just in evolutionary history, but in biological development as you grow, we are going to learn something more.
Lex Fridman
(01:13:45)
What would be amazing is if you could use assembly theory, this framework, to show the increase in the assembly index associated with, I don't know, cultures or pieces of text like language or images and so on, and illustrate, without knowing the data ahead of time, kind of like you did with NASA, that it applies in those other contexts. I mean, it probably wouldn't at first, and you'd have to evolve the theory somehow. You'd have to change it, you'd have to expand it.
Lee Cronin
(01:14:21)
I think so.
Lex Fridman
(01:14:24)
I guess this paper is a first step in saying, okay, can we create a general framework for measuring the complexity of objects, for measuring life, the complexity of living organisms?
Lee Cronin
(01:14:39)
Yeah.
Lex Fridman
(01:14:39)
That’s what this is reaching for.
Lee Cronin
(01:14:41)
That is the first step. And also to say, look, we have a way of quantifying selection and evolution in a fairly, not mundane, but a fairly mechanical way, because before now the ground truth for it was very subjective, whereas here we're talking about clean observables. And there are going to be layers on that. I mean, with collaborators right now, we already think we can do assembly theory on language. And not only that, wouldn't it be great if we could figure out how, under pressure, language is going to evolve and become more efficient? Because you're going to want to transmit things.

(01:15:20)
And again, it’s not just about compression, it is about understanding how you can make the most of the architecture you’ve already built. And I think this is something beautiful that evolution does. We are reusing those architectures. We can’t just abandon our evolutionary history. And if you don’t want to abandon your evolutionary history and you know that evolution has been happening, then assembly theory works.

(01:15:44)
And a key comment I want to make is that assembly theory is great for understanding where evolution has been used. The next jump is when we go to technology, because of course, if you take the M3 processor… I want to buy one, I haven't yet. I can't justify it, but I want it at some point. The M3 processor arguably has quite a lot of features, a quite large number. The M2 came before it, then the M1, all the way back. You can apply assembly theory to microprocessor architecture. It doesn't take a huge leap to see that.
Lex Fridman
(01:16:15)
I’m a Linux guy, by the way. So your examples go way over my head.
Lee Cronin
(01:16:18)
Yeah, well, whatever…
Lex Fridman
(01:16:19)
Is that a fruit company of some sort? I don't even know. Yeah, there's a lot of interesting stuff to ask about language. How would that work? You could look at GPT-1, GPT-2, GPT-3, 3.5, 4, and try to analyze the kind of language they produce. I mean, that's almost trying to look at the assembly index of intelligent systems.
Lee Cronin
(01:16:45)
Yeah, I mean I think the thing about large language models, and this is a whole hobbyhorse I have at the moment, is that obviously they’re all about… The evidence of evolution in the large language model comes from all the people that produced all the language. And that’s really interesting. And all the corrections in the Mechanical Turk, right?
Lex Fridman
(01:17:15)
Sure. But that’s part of the history, part of the memory of the system.
Lee Cronin
(01:17:20)
Exactly. So it would be really interesting to basically use an assembly-based approach to making language in a hierarchy. My guess is that we might be able to build a new type of large language model that uses assembly theory, that has more understanding of the past and how things were created. Basically, the thing with LLMs is, everything everywhere all at once, splat, and make the user happy. So there's not much intelligence in the model; the intelligence is in how the human interacts with the model. But wouldn't it be great if we could understand how to embed more intelligence in the system?
Lex Fridman
(01:18:03)
What do you mean by intelligence there? You seem to associate intelligence with history or memory?
Lee Cronin
(01:18:11)
Yeah. I think selection produces intelligence.
Lex Fridman
(01:18:16)
You're almost implying that selection is intelligence, no?
Lee Cronin
(01:18:21)
Kind of. I would go out on a limb and say that, but I think it's a little bit more: human beings have the ability to abstract, and they can break beyond selection, beyond Darwinian selection, because a human being doesn't have to do trial and error. They can think about it and say, "Oh, that's a bad idea, won't do that." And then technologies and so on.
Lex Fridman
(01:18:39)
So we escaped Darwinian evolution and now we’re onto some other kind of evolution, I guess? Higher level.
Lee Cronin
(01:18:46)
And assembly theory will measure that as well, right? Because it’s all a lineage.

Kolmogorov complexity

Lex Fridman
(01:18:50)
Okay. Another piece of criticism, or by way of question: how is assembly theory, or maybe assembly index, different from Kolmogorov complexity? So for people who don't know, the Kolmogorov complexity of an object is the length of the shortest computer program that produces the object as output.
Lee Cronin
(01:19:10)
Yeah, there seems to be a disconnect with the computational approach. The Kolmogorov measure requires a Turing machine, requires a computer, and that's one thing. And the other thing is that assembly theory is supposed to trace the process by which life and evolution emerged, right? That's the main thing. There are lots of other layers.

(01:19:42)
So you can approximate Kolmogorov complexity, but it's not really telling you very much about the actual… It's really telling you about your dataset, the compression of your dataset.
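A quick sketch of that point: an off-the-shelf compressor gives an upper-bound proxy for Kolmogorov complexity, but what it reports is how compressible the dataset is, not anything about the object's causal history.

```python
# Approximating Kolmogorov complexity from above with a general-purpose
# compressor: the compressed length (plus the fixed cost of the decompressor)
# upper-bounds the shortest program, but it only reflects statistical
# compressibility of the data, not how the object came to exist.

import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data at maximum compression level."""
    return len(zlib.compress(data, 9))

random_bytes = os.urandom(1024)        # incompressible with high probability
repetitive = b"ABRACADABRA" * 93       # ~1 KB of repeated structure

print(compressed_size(random_bytes))   # stays close to 1024
print(compressed_size(repetitive))     # collapses to a few dozen bytes
```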
Lex Fridman
(01:20:00)
Sure.
Lee Cronin
(01:20:00)
And so that doesn't really help you identify… The turtle in this case is the computer. And so what assembly theory does, and it's a trigger warning for anyone listening who loves complexity theory, I think we're going to show that AIT is a very important subset of assembly theory. Because here's what happens: I think assembly theory allows us to understand when selection was occurring. Selection produces factories and things, factories in the end produce computers, and then algorithmic information theory comes out of that. The frustration I've had with looking at life through this kind of information theory is that it doesn't take into account causation. So the main difference between assembly theory and all these complexity measures is that they have no causal chain. And I think that's the main…
Lex Fridman
(01:21:00)
So the causal chain is at the core of assembly theory.
Lee Cronin
(01:21:06)
Exactly. And if you've got all your data in a computer memory, all the data's the same. You can access it in the same way. You don't care. You just compress it, and you either look at the program runtime or the shortest program. And that, for me, is absolutely not capturing what selection does.
Lex Fridman
(01:21:28)
But assembly theory looks at objects. It doesn't have information about the object's history. It tries to infer that history by looking for the shortest history, right? The object doesn't come with a Wikipedia page about its history.
Lee Cronin
(01:21:49)
I would say it does in a way, and it is fascinating to look at. So you've just got the object and you have no other information about it. What assembly theory allows you to do with just the object is to, and the word infer is correct, I agree with infer. You might say, well, that's not the history. But something really interesting comes from this.

(01:22:10)
The shortest path is inferred from the object. That is the worst-case scenario if you have no machine to make it. So that tells you about the depth of that object in time. And what assembly theory allows you to do, without considering any other circumstances, is to say from this object: how deep is this object in time if we just treat the object as itself, without any other constraints? And that's super powerful, because the shortest path then allows you to say, "Oh, this object wasn't just created randomly. There was a process." And so assembly theory is not meant to one-up AIT or to ignore the factory. It's just to say, "Hey, there was a factory. How big was that factory, and how deep in time is it?"
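As a toy string analogue of that shortest-path idea: the real measure is defined on molecular bond graphs, and the greedy pass below only gives an upper bound on the minimal number of joining steps, but it shows how reuse of already-built pieces makes an object "shallower" to construct.

```python
def assembly_upper_bound(target: str) -> int:
    """Greedy upper bound on the assembly index of a string under a simple
    'join two existing pieces' model: single characters are free building
    blocks, every join of two already-constructed pieces costs one step,
    and pieces built along the way can be reused."""
    constructed = set(target)            # single characters come for free
    current = target[0]
    steps = 0
    i = 1
    while i < len(target):
        # longest already-constructed piece that extends what we have so far
        best = max((target[i:j] for j in range(i + 1, len(target) + 1)
                    if target[i:j] in constructed), key=len)
        current += best
        constructed.add(current)
        steps += 1
        i += len(best)
    return steps

print(assembly_upper_bound("ABRACADABRA"))  # reuses "ABRA": 7 joins
print(assembly_upper_bound("ABCDEFGHIJK"))  # no reuse: one join per extra character
```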
Lex Fridman
(01:23:01)
But it’s still computationally very difficult to compute that history, right? For complex objects?
Lee Cronin
(01:23:11)
It is. It becomes harder. But one of the things that's super nice is that it constrains your initial conditions, right?
Lex Fridman
(01:23:18)
Sure.
Lee Cronin
(01:23:18)
It constrains where you're going to be. So imagine… One of the things we're doing right now is applying assembly theory to drug discovery. Now, what everyone's doing right now is taking all the proteins, looking at the proteins, and looking at molecules docking to proteins. Why not instead look at the molecules that are involved in interacting with the receptors over time, and use how the molecules evolved over time as a proxy for how the proteins evolved over time, and then use that to constrain your drug discovery process?

(01:23:51)
You flip the problem 180 degrees and focus on the molecule evolution rather than the protein, and so you can guess what might happen in the future. Rather than having to consider all possible molecules, you know where to focus. And it's the same thing if you're looking at assembly spaces for an object where you don't know the entire history: you know that the history of this object is not going to contain some motif that doesn't appear in its past.
Lex Fridman
(01:24:22)
But just even for the drug discovery point you made, don’t you have to simulate all of chemistry to figure out how to come up with constraints?
Lee Cronin
(01:24:32)
No.
Lex Fridman
(01:24:32)
And the molecules and the…
Lee Cronin
(01:24:34)
No.
Lex Fridman
(01:24:35)
I don’t know enough about protein.
Lee Cronin
(01:24:36)
Well, this is another thing that I think causes… Because this paper goes across so many boundaries. So chemists have looked at this and said, "This is not a correct reaction." And it's like, no, it's a graph.
Lex Fridman
(01:24:52)
Sure, there’s assembly index and shortest path examples here on chemistry.
Lee Cronin
(01:24:58)
Yeah, and what you do is you look at the minimal constraints on that graph. Of course it has some mapping to the synthesis, but actually you don't have to know all of chemistry. You can build up the constraint space rather nicely. But this is just the beginning, right? There are so many directions this could go in, and as I said, it could all be wrong, but hopefully it's less wrong.
Lex Fridman
(01:25:22)
What about the little criticism I saw of… By way of question, do you consider the different probabilities of each reaction in the chain so that there could be different… When you look at a chain of events that led up to the creation of an object, doesn’t it matter that some parts in the chain are less likely than others?
Lee Cronin
(01:25:46)
No.
Lex Fridman
(01:25:48)
It doesn’t matter?
Lee Cronin
(01:25:48)
No, no. Well, let's go back. So no, not less likely, but… Let's go back to what we're looking at here. The assembly index is the minimal path that could have created that object probabilistically. So imagine you have all your atoms in a plasma, you've got enough energy, there are collisions. What is the quickest way you could zip out that molecule with no reaction constraints?
Lex Fridman
(01:26:12)
How do you define quickest there then?
Lee Cronin
(01:26:14)
It's basically a walk on a random graph. So we make an assumption about the timescale for forming the bonds… Actually, no, I don't want to say that, because then people are going to start obsessing about this point. And your criticism is a really good one. What we're trying to say is this puts a lower bound on something. Of course some reactions are less possible than others, but actually I don't think chemical reactions exist.
Lex Fridman
(01:26:39)
Oh, boy. What does that mean? Why don’t chemical reactions exist?
Lee Cronin
(01:26:44)
I’m writing a paper right now that I keep being told I have to finish, and it’s called ‘The Origin of Chemical Reactions.’ And it merely says that reactivity exists as controlled by the laws of quantum mechanics. And reactions, chemists put names on reactions. So you can have, I don’t know, the Wittig reaction, which is by Wittig. You could have the Suzuki reaction, which is by Suzuki.

(01:27:09)
Now, what are these reactions? These reactions are constrained by the following: they're constrained by the fact they're on planet Earth, 1 g, 298 Kelvin, 1 bar. So these are constraints. They're also constrained by the chemical composition of earth, oxygen availability, all this stuff. And that then allows us to focus our chemistry. So when a chemist does a reaction, that's a really nice compressed shorthand for constraint application: glass flask, pure reagent, temperature, pressure, boom, boom, boom, control, control, control.

(01:27:44)
So of course we have bond energies. The bond energies are kind of intrinsic in a vacuum. And for assembly theory to work, you have to have a bond, which means that bond has to give the molecule a half-life. So you're probably going to find later on that some bonds are weaker, and that in the mass spectrum, when you look at the assembly of some molecules, you're going to miscount the assembly of the molecule because it falls apart too quickly, because of how the bonds form. But you can solve that by looking at infrared.

(01:28:21)
So when people think about the probability, they're kind of misunderstanding. Assembly theory says nothing about the chemistry, because chemistry is chemistry and the constraints are put in by biology. There was no chemist at the origin of life, unless you believe in a chemist in the sky, and like Santa Claus, they'd have had a lot of work to do. But chemical reactions do not exist; the constraints that allow chemical transformations to occur do exist.
Lex Fridman
(01:28:52)
Okay, okay. So it's constraints. There are no chemical reactions; it's all constraint application, which enables the emergence of… What's a different word for chemical reaction?
Lee Cronin
(01:29:10)
Transformation?
Lex Fridman
(01:29:11)
Transformation.
Lee Cronin
(01:29:11)
Yeah, like a function. It's a function. But no, I love chemical reactions as a shorthand, so the chemists don't all go mad. I mean, of course chemical reactions exist on earth.
Lex Fridman
(01:29:20)
It’s a shorthand.
Lee Cronin
(01:29:21)
It’s a shorthand for these constraints.
Lex Fridman
(01:29:24)
So it's assuming all these constraints that we've been using for so long that we just take them for granted in natural language conversation.
Lee Cronin
(01:29:30)
Exactly. The grammar of chemistry of course emerges in reactions and we can use them reliably, but I do not think the Wittig reaction is accessible on Venus.
Lex Fridman
(01:29:41)
Right, and it's useful to remember to frame it as constraint application when you zoom out to the bigger picture of the universe, looking at the chemistry of the universe and then starting to apply assembly theory. That's interesting. That's really interesting. But we've also pissed off the chemists now.
Lee Cronin
(01:30:01)
Oh, they're pretty happy. Well, most of them.
Lex Fridman
(01:30:04)
No. Everybody deep down is happy, I think. They’re just sometimes feisty, that’s how they have fun.
Lee Cronin
(01:30:13)
Everyone is grumpy on some days when you challenge… The problem with this paper is… It's almost like I went to a party. I used to do this occasionally when I was young: go to a meeting and just find a way to offend everyone at the meeting simultaneously, so even the factions that don't like each other are all unified in their hatred of you for offending them. This paper feels like the person who went to the party and offended everyone simultaneously, so they stopped fighting among themselves and just focused on this paper.

Nature review process

Lex Fridman
(01:30:41)
Maybe just a little interesting insider information. What were the editors of Nature, the reviews and so on, like? How difficult was that process? Because this is a pretty big paper.
Lee Cronin
(01:30:55)
So when we originally sent the paper… This was quite a long process. We sent the paper and the editor gave us some feedback and said, "I don't think it's that interesting," or, "It's a hard concept." And the editor gave us some feedback, and Sara and I took a year to rewrite the paper.
Lex Fridman
(01:31:26)
Was the nature of the feedback very specific, on this part or that part? Or was it like, "What are you guys smoking? What kind of crack are you taking?"
Lee Cronin
(01:31:34)
Yeah, it was kind of the latter. What are you smoking.
Lex Fridman
(01:31:35)
Okay. But polite and there’s promise.
Lee Cronin
(01:31:41)
Yeah. Well, the thing is, the editor was really critical, but in a really professional way. And for me, this was the way science should happen. So when it came back, we had too many equations in the paper. If you look at the preprint, there are just equations everywhere, like 23 equations. And when I said to Abhishek, who was the first author, "We've got to remove all the equations, but my assembly equation is staying in," Abhishek was like, "No, we can't."

(01:32:05)
I said, "Well look, if we want to explain this to people, there's a real challenge." And so Sara and I went through, I think it was actually 160 versions of the paper, but basically we got to version 40 or something and said, "Right, zero it, start again." So we wrote the whole paper again. We knew the entire…
Lex Fridman
(01:32:21)
Amazing.
Lee Cronin
(01:32:22)
And we just went bit by bit by bit and said, "What is it we want to say?" And then we sent the paper in, and we expected it to be rejected and not even go to review. And then we got the notification back that it had gone to review, and we were like, "Oh my God, it's so going to get rejected. How is it going to get rejected?" Because the first assembly paper, on the mass spec, that we sent to Nature went through six rounds of review and was rejected, by a chemist who just said, "I don't believe you. You must be committing fraud."

(01:32:54)
It's a long story, probably a boring story, but in this case it went out to review, the comments came back, and they were very deep comments from all the reviewers. But the nice thing was the reviewers were very critical but not dismissive. They were like, "Oh, really? Explain this, explain this, explain this."
Lex Fridman
(01:32:54)
That’s great.
Lee Cronin
(01:33:26)
Are you sure it's not Kolmogorov? Are you sure it's not this? And we went through, I think, three rounds of review pretty quickly, and the editor went, "Yeah, it's in."
Lex Fridman
(01:33:39)
But maybe you could just comment on the whole process. You've published some pretty huge papers on all kinds of topics within chemistry and beyond. Some of them have a little spice in them, a little spice of crazy, like Tom Waits says, "I like my town with a little drop of poison." It's not a mundane paper. So what's it like psychologically to go through all of this, to keep getting rejected, to get reviews from people that don't get the paper, all that kind of stuff? Just as a question about being a scientist, what is that like?
Lee Cronin
(01:34:19)
I mean, this paper for me… Because this wasn't the first time we tried to publish assembly theory at the highest level. The Nature Communications paper on the mass spec, the idea went to Nature and got rejected, went through six rounds of review and got rejected. And I was just so confused when the chemists said, "This can't be possible. I do not believe you can measure complexity using mass spec. And also, by the way, complex molecules can randomly form." And we're like, "But look at the data. The data says…" And they said, "No, no. We don't believe you." And I just wouldn't give up. And the editor in the end was just like… Different editors, actually, right?
Lex Fridman
(01:35:10)
What's behind that never giving up? When you're sitting there at 10 o'clock in the evening, there's a melancholy feeling that comes over you and you're like, "Okay, this is rejection number five." Or it's not a rejection, but maybe it feels like a rejection because the comments say that you totally don't get it. What gives you strength to keep going there?
Lee Cronin
(01:35:31)
I don't know. I don't normally get emotional about papers. But it's not about not giving up because we want to get it published for the glory or anything. It's just like, why don't you understand? And so what I would do is just try to be as rational as possible and say, okay, you didn't like it, tell me why. And then…

(01:36:26)
Sorry, give me a second. Silly, I never get emotional about papers normally, but I think you just compressed five years of angst into this.
Lex Fridman
(01:36:38)
So it’s been rough?
Lee Cronin
(01:36:40)
It's not just rough. It's like… I came up with the assembly equation remotely from Sara in Arizona and the people at SFI. I felt like I was a mad person, the guy depicted in A Beautiful Mind, who was just like… Not the actual genius part, just the gibberish, gibberish, gibberish.
Lex Fridman
(01:36:59)
Just the crazy part.
Lee Cronin
(01:37:02)
Because I kept writing expansions, and I have no mathematical ability at all. I was making these mathematical expansions where I kept seeing the same motif again. I was like, I think this is a copy number; the same string is coming up again and again and again, and I couldn't do the math. And then I realized the copy number fell out of the equation and everything collapsed down. I was like, oh, that kind of works.

(01:37:23)
So we submitted the paper, the mass spec one, and when it was almost accepted, the astrobiologists said great, a mass spectroscopist said great, and the chemists went, "Nonsense, the biggest pile of nonsense ever. Fraud." And I was like, "But why fraud?" And they just said, "Just because." I was like, well… I could not convince the editor in this case. The editor was just so pissed off. They saw it as, you're wasting my time. And I would not give up. I went and dissected all the parts. And although I got upset about it, which was kind of embarrassing actually, but I guess…
Lex Fridman
(01:38:05)
I bet it was beautiful.
Lee Cronin
(01:38:08)
But it was just trying to understand why they didn't like it. So part of me was really devastated and a part of me was super excited, because I'm like, "Huh, they can't tell me why I'm wrong." And this kind of goes back to when I was at school. I was in a kind of learning difficulties class, and I kept going to the teacher and saying, "What do I do today to prove I'm smart?" And they were like, "Nothing, you can't." I was like, "Give me a job, give me something to do." And I kind of felt like that a bit when I was arguing with the editor. And not arguing, there was no ad hominem. I wasn't telling the editor or the reviewers they were idiots or anything like that. I kept it strictly factual.

(01:38:51)
And all I did was just keep knocking it down bit by bit by bit. It was ultimately rejected there and got published elsewhere. And the actual experimental data, so for this paper, the experimental justification was already published. So when we did this one, we went through the versions, we sent it in, and in the end it just got accepted. We were like, well, that's kind of cool, right? This is kind of like some days…

(01:39:21)
Sorry, the first author was like, "I can't believe it got accepted." I was like, "Neither can I, but it's great. It's good." And then when the paper was published, I was not expecting the backlash. I was expecting computational… Well, no, actually I was just expecting one person who'd been trolling me for a while about it to carry on trolling, but I didn't expect the backlash. And then I wrote to the editor and apologized, and the editor was like, "What are you apologizing for? It was a great paper. Of course it's going to get backlash. You said some controversial stuff, but it's awesome."
Lex Fridman
(01:39:56)
Well, I think it’s a beautiful story of perseverance and the backlash is just a negative word for discourse, which I think is beautiful. That’s the science.
Lee Cronin
(01:40:08)
I think, as I said, when it got accepted and people were kind of hacking on it, I was like, papers are not gold medals. The reason I wanted to publish that paper in Nature is because it says, "Hey, there's something before biological evolution." You have to have that, if you're not a creationist. This is an approach, the first time someone has put forward a concrete mechanism, or sorry, a concrete quantification, and what comes next, what you're pushing on, is a mechanism. And that's what we need to get to: autocatalytic sets, self-replicating molecules, some other features that come in.

(01:40:48)
And the fact that this paper has been so discussed, for me, is a dream come true. It doesn't get better than that, if you can accept a few people hating it. And the nice thing, the thing that really makes me happy, is that no one has attacked the actual physical content.

(01:41:10)
You can measure the assembly index, you can measure selection now. So either that's right or it's… Well, either that's helpful or unhelpful. If it's unhelpful, this paper will sink down and no one will use it again. If it's helpful, people will scaffold on it and we'll start to converge on a new paradigm. So I think that's the thing. I wanted to see it through, and my colleagues, authors, collaborators and people were like, "You've just published this paper. You're a chemist. Why have you done this? Who are you to be doing evolutionary theory?" Well, I don't know. I mean, sorry, did I need to…
Lex Fridman
(01:41:48)
Who is anyone to do anything? Well, I'm glad you did. Let me just, before coming back to the origin of life and these kinds of questions: you mentioned learning difficulties. I didn't know about this. So what was that like?
Lee Cronin
(01:42:00)
I wasn't very good at school, right?
Lex Fridman
(01:42:04)
This is when you were very young?
Lee Cronin
(01:42:06)
Yeah. In primary school, my handwriting was really poor, and apparently I couldn't read, and my mathematics was very poor. So they just said, "This is a problem," and they identified it. My parents at the time were confused, because I was busy taking things apart, buying electronic junk from the shop, trying to build computers and things. And when I think about the major transition in my supposed stupidity… Basically, everyone thought I was faking. I liked stuff and I was faking wanting to be into it. I always wanted to be a scientist. So at five, six, seven years old, I'd say I'd be a scientist, take things apart, and everyone's like, "Yeah, this guy wants to be a scientist, but he's an idiot." So everyone was really confused, I think, at first, that I wasn't as smart as I was claiming to be.

(01:42:58)
And then I just basically didn't do well in any of the tests, and I went down and down and down, and then I was like, "Huh, this is really embarrassing. I really like maths and everyone says I can't do it. I really like physics and chemistry and science, and people say I can't read and write." And so I found myself in a learning difficulties class at the end of primary school and the beginning of secondary school. In the UK, secondary school starts at 11, 12 years old. And I remember being put in the remedial class. And the remedial class was basically full of three types of people: there were people who were quite violent, there were people who couldn't speak English, and there were people who really had learning difficulties. The one thing I can objectively remember was… I could read. I liked reading. I read a lot. But something in me, I'm a bit of a rebel, refused to read what I was told to read, and I found it difficult to read individual words in the way I was told.

(01:44:24)
But anyway, I got caught one day teaching someone else to read, and they said, "Okay, we don't understand this." I'd always known I wanted to be a scientist, but I didn't really know what that meant, and I realized you had to go to university. I thought, "I can just go to university. They take curious people." "No, no, it doesn't work like that. You have to enter these exams to get this grade point average, and the fact is, with the exams you've been entered into, you are just going to get a C, D or E. You can't even get an A, B or C." These are the UK GCSEs. I was like, "Oh, shit," and I said, "Can you just put me into the higher exams?" They said, "No, no, you're going to fail. There's no chance." So my father intervened and said, "Just let him take the exams," and they said, "He's definitely going to fail. It's a waste of time, a waste of money," and he said, "What if we paid?" So they said, "Okay," but you didn't actually have to pay; you only had to pay if I failed.

(01:45:23)
So I took the exams and passed them, fortunately. I didn’t get the top grades, but I got into A Levels. But then that also limited what I could do at A Levels. I wasn’t allowed to do A Level maths.
Lex Fridman
(01:45:35)
What do you mean you weren’t allowed to?
Lee Cronin
(01:45:36)
Because I had such a bad maths grade from my GCSE, I only had a C. They wouldn't let me go into the A, B or C exam for maths because of some coursework requirement back then, so the top grade I could have got was a C. So C, D or E. I got a C, and they let me do AS Level maths, which is this half-intermediate thing, and I got to go to university. But I liked chemistry. I had a good chemistry teacher, so in the end I got to university to do chemistry.
Lex Fridman
(01:46:01)
So through that process, I think for kids in that situation, it's easy to start believing that you're… how do I put it… stupid, and basically give up, that you're just not good at math, you're not good at school. So this is by way of advice for interesting people, for interesting young kids right now experiencing the same thing. Where was the place, what was the source, of you not giving up there?
Lee Cronin
(01:46:33)
I have no idea, other than… I really liked not understanding stuff. For me, when I don't understand something… I feel like I don't understand anything. But back then, I remember… I don't know, I tried to build a laser when I was eight and I thought, "How hard could it be?" Basically, I was going to build a CO2 laser and I was like, "Right, I think I need some partially coated mirrors, I need some carbon dioxide, and I need a high voltage." And I was so stupid, it was so embarrassing. To make enough CO2, I actually set a fire and tried to filter the flame.
Lex Fridman
(01:47:30)
Oh, nice. That’s an idea.
Lee Cronin
(01:47:30)
Just to collect enough CO2, and it completely failed and I burnt half the garage down. So my parents were not very happy about that. So that was one thing: I really liked first-principles thinking. I remember being super curious and being determined to find answers. And when people ask for advice about this, I don't really have that much advice other than don't give up. And one of the things I try to do as a chemistry professor in my group is hire people where I think, if they're persistent enough, who am I to deny them the chance? Because people gave me a chance and I was able to do stuff.
Lex Fridman
(01:48:18)
Do you believe in yourself essentially?
Lee Cronin
(01:48:22)
So I love being around smart people and I love confusing smart people. And when I’m confusing smart people, not by stealing their wallets and hiding it somewhere, but if I can confuse smart people, that is the one piece of hope that I might be doing something interesting.
Lex Fridman
(01:48:37)
Wow, that’s quite brilliant. As a gradient to optimize. Hang out with smart people and confuse them. And the more confusing it is, the more there’s something there.
Lee Cronin
(01:48:47)
As long as they're not telling you you're just a complete idiot, and they give you different reasons. Because with assembly theory, people said, "Oh, it's wrong." And I was like, "Why?" And no one could give me a consistent reason. They said, "Oh, because it's been done before," or it's just [inaudible 01:49:04], or this, that and the other. So the thing that I like to do is… In academia it's hard, because people are critical. And the criticism, although I got upset about it earlier, which is silly, but not silly, because obviously it's hard work being on your own, or with a team spatially separated during lockdown, trying to keep everyone on board and have some faith. I always wanted to have a new idea. I like a new idea and I want to nurture it as long as possible. And if someone can give me actionable criticism, that's what I think I was trying to say earlier when I was stuck for words: give me actionable criticism.

(01:49:51)
“It’s wrong.” “Okay, why is it wrong?” Say, “Oh, your equation’s incorrect for this or your method is wrong.” So what I try and do is get enough criticism from people to then triangulate and go back. And I’ve been very fortunate in my life that I’ve got great colleagues, great collaborators, funders, mentors, and people that will take the time to say, “You are wrong because.” And then what I have to do is integrate the wrongness and go, “Oh, cool, maybe I can fix that.” And I think criticism is really good. People have a go at me because I’m really critical. But I’m not criticizing you as a person. I’m just criticizing the idea and trying to make it better and say, “What about this?”

(01:50:34)
And sometimes my filters are truncated in some ways. I’m just like, “That’s wrong, that’s wrong, that’s wrong. Why’d you do this?” And people are like, “Oh my God, you just told me, you destroyed my life’s work.” I’m like, “Relax. No.” I’m just like, “Let’s make it better.” And I think that we don’t do that enough because we are either personally critical, which isn’t helpful or we don’t give any criticism at all because we’re too scared.
Lex Fridman
(01:51:03)
Yeah, I've seen you be pretty aggressively critical, but every time I've seen it, it's about the idea, not the person.

Time and free will

Lee Cronin
(01:51:15)
I'm sure I make mistakes on that. I argue lots with Sara and she's shocked. I've argued with Joscha Bach in the past, and he's like, "You're just making that up." And I'm like, "No, not quite. But kind of." But I had a big argument with Sara about time, and she's like, "No, time doesn't exist." I'm like, "No, no, time does exist." And when she realized that her conception of assembly theory and my conception of assembly theory were the same thing, it necessitated us abandoning the idea that time is eternal, to really fundamentally question how the universe produces combinatorial novelty.
Lex Fridman
(01:51:59)
So time is fundamental for assembly theory? I’m just trying to figure out where you and Sara converged.
Lee Cronin
(01:52:06)
I think assembly theory is fine as it stands right now, but I think it helps us understand that something interesting is going on. I've been really inspired by a guy called Nick Gisin. I'm going to butcher his argument, but I love it a lot, so I hope he forgives me if he hears about it. Basically, if you want free will, time has to be fundamental. And if you want time to be fundamental, you have to give up on Platonic mathematics and you have to use intuitionist mathematics. By the way, and again I'm going to butcher this, but basically Hilbert said that infinite numbers are allowed, and I think it was Brouwer who said, "No, you can't. All numbers are finite." So let's go back a step, because people are going to say assembly theory seems to explain that a large combinatorial space allows you to produce things like life and technology. And that large combinatorial space is so big it's not even accessible to a Sean Carroll or David Deutsch multiverse, so physicists saying that all of the universe already exists in time is probably, provably, that's a strong word, not correct.

(01:53:43)
We are going to know that the universe as it stands, the present, the way the present builds the future, is so big that the universe can't ever contain the future. And this is a really interesting thing. I think Max Tegmark has this mathematical universe. He says the universe is like a block universe, and I apologize to Max if I'm getting it wrong, but people think you can just move: you have the state, you have the initial conditions, and you can run the universe right to the end and go backwards and forwards in that universe. That is not correct.
Lex Fridman
(01:54:17)
Let me load that in. The universe is not big enough to contain the future.
Lee Cronin
(01:54:21)
Yeah. That’s why. That’s it.
Lex Fridman
(01:54:24)
That’s a beautiful way of saying that time is fundamental.
Lee Cronin
(01:54:26)
Yes. And this is why the law of the excluded middle, something is true or false, only works in the past. Is it going to snow in New York next week or in Austin? You might, in Austin, say probably not. In New York, you might say, yeah. If you go forward to next week and say, “Did it snow in New York last week? True or false?” You can answer that question. The fact that the law of the excluded middle cannot apply to the future explains why time is fundamental.
Lex Fridman
(01:55:01)
That’s a good example, intuitive example, but it’s possible that we might be able to predict whether it’s going to snow if we had the perfect information.
Lee Cronin
(01:55:10)
I think…
Lex Fridman
(01:55:11)
You’re saying it’s not.
Lee Cronin
(01:55:13)
Impossible. Impossible. So here’s why. I’ll make a really quick argument, and this argument isn’t mine. It’s Nick’s and a few other people’s.
Lex Fridman
(01:55:23)
Can you explain his view on time being fundamental?
Lee Cronin
(01:55:28)
Yeah. So I’ll give my view, which resonates with his, but basically it’s very simple actually. It would say your ability to design and do an experiment is exercising free will. So he used that thought process, I never really thought about it that way, and that you actively make decisions. I used to think that free will was a consequence of just selection, but I’m understanding that human free will is something really interesting. And he very much inspired me. But I think that what Sara Walker said inspired me as well, and these views converge, which is that the universe, and the universe is very big, huge, but actually the place that is largest in the universe right now, the largest place in the universe, is Earth.
Lex Fridman
(01:56:20)
Yeah, I’ve seen you say that. And boy, does that… That’s an interesting one to process. What do you mean by that earth is the biggest place in the universe?
Lee Cronin
(01:56:31)
Because we have this combinatorial scaffolding going all the way back to LUCA. So you’ve got cells that can self-replicate, and then you go all the way to terraforming the Earth. You’ve got all these architectures, the amount of selection that’s going on, biological selection, just to be clear, biological evolution, and then you have multicellularity, then animals and abstraction. And with abstraction, there was another kick, because you can then build architectures and computers and cultures and language, and these things are the biggest things that exist in the universe, because we can just keep building architectures that couldn’t naturally arise anywhere, and the further that distance goes in time, the more gigantic it gets.
Lex Fridman
(01:57:15)
From a complexity perspective.
Lee Cronin
(01:57:17)
Yeah.
Lex Fridman
(01:57:17)
Okay, wait a minute. I know you’re being poetic, but how do you know there aren’t other Earth-like… How do you know? You’re basically saying Earth is really special, that awesome stuff is going on here, and as far as we look out, there’s nothing like it. But how do you know there’s not a nearly infinite number of places where cool stuff like this is going on?
Lee Cronin
(01:57:40)
I agree, and I would say, I’ll say again, that Earth is the most gigantic thing we know of in the universe, combinatorially.
Lex Fridman
(01:57:49)
We know. Yeah.

Communication with aliens

Lee Cronin
(01:57:50)
Now, I guess this is just purely a guess. I have no data other than hope. Maybe not hope, maybe… No, I have some data: that every star in the sky probably has planets, and life is probably emerging on these planets. But the amount of contingency that is associated with life, I think the combinatorial space associated with these planets is so different that our causal cones are never going to overlap, or not easily. And this is the thing that makes me sad about alien life. It’s why we have to create alien life in the lab as quickly as possible, because I don’t know if we are going to be able to build architectures that will intersect with alien intelligence architectures.
Lex Fridman
(01:58:42)
Intersect, you don’t mean in time or space?
Lee Cronin
(01:58:46)
Time and the ability to communicate.
Lex Fridman
(01:58:48)
The ability to communicate.
Lee Cronin
(01:58:49)
Yeah. My biggest fear in a way is that life is everywhere but we become infinitely more lonely because of our scaffolding in that combinatorial space. Because it’s so big.
Lex Fridman
(01:59:00)
So you’re saying the constraints created by the environment that led to the factory of Darwinian evolution are just this little tiny cone in a nearly infinite combinatorial space.
Lee Cronin
(01:59:14)
Exactly.
Lex Fridman
(01:59:14)
So there’s other cones like it. Why can’t we communicate with other… Just because we can’t create it doesn’t mean we can’t appreciate the creation, right? Sorry, detect the creation.
Lee Cronin
(01:59:30)
I truly don’t know but it’s an excuse for me to ask for people to give me money to make a planet simulator.
Lex Fridman
(01:59:36)
Yeah, right.
Lee Cronin
(01:59:36)
If I can make…
Lex Fridman
(01:59:38)
With a different [crosstalk 01:59:40]
Lee Cronin
(01:59:39)
It’s like another shameless plug, it’s like, “Give me money. I need money.”
Lex Fridman
(01:59:42)
This was all a long plug for a planet simulator. Hey, I won’t be the first in line to do that.
Lee Cronin
(01:59:50)
My garage has run out of room.
Lex Fridman
(01:59:53)
Yeah.
Lee Cronin
(01:59:54)
No.
Lex Fridman
(01:59:54)
And this planet simulator, you mean a different planet or different sets of environments and pressures?
Lee Cronin
(02:00:01)
Exactly. If we could basically recreate the selection before biology as we know it, that gives rise to a different biology, we should be able to put constraints on where to look in the universe. So here’s the thing. Here’s my dream. My dream is that by creating life in the lab based upon constraints we understand, let’s go for Venus-type life or Earth-type life or something, again, do an Earth 2.0. Screw it, let’s do an Earth 2.0. An Earth 2.0 has a different genetic alphabet. Fine, that’s fine. A different protein alphabet, fine. Have cells and evolution, all that stuff. We will then be able to say, “Okay, life is a more general phenomenon. Selection is more general than what we think are the chemical constraints on life.” And we can point James Webb and other telescopes at other planets in that zone we are most likely to combinatorially overlap with, because, so there’s chemistry…
Lex Fridman
(02:01:01)
You’re looking for some overlap.
Lee Cronin
(02:01:02)
And then we can then basically shine light on them literally and look at light coming back and apply advanced assembly theory to general theory of language that we’ll get and say, “Huh, in that signal, it looks random but there’s a copy number. Oh, this random set of things that shouldn’t be that looks like a true random number generator has structure as not [inaudible 02:01:32], an IT type structure, but evolutionary structure given by assembly theory,” and we start to… But I would say that because I’m a shameless assembly theorist.
Lex Fridman
(02:01:42)
Yeah, it just feels like the cone, I might be misusing the word cone here but the width of the cone is growing faster, is growing really fast to where eventually all the cones overlap even in a very, very, very large combinatorial space. But then again, if you’re saying the universe is also growing very quickly in terms of possibilities…
Lee Cronin
(02:02:14)
I hope that as we build abstractions, one idea is that as we go to intelligence, intelligence allows us to look at the regularities around us in the universe. And that gives us some common grounding to discuss with aliens. And you might be right that we will overlap there. Even though we have completely different chemistry, literally completely different chemistry, that we will be able to pass information from one another. But it’s not a given. And I have to try and divorce hope and emotion away from what I can logically justify.
Lex Fridman
(02:03:02)
But it’s just hard to intuit a world, a universe where there’s nearly infinite complexity objects and they somehow can’t detect each other.
Lee Cronin
(02:03:13)
The universe is expanding. But the nice thing is, I would say, I would look, you see, I think Carl Sagan did the wrong thing. Not the wrong thing. He took the Voyager program and the Pale Blue Dot and said, “Look how big the universe is.” I would’ve done it the other way around and said, “Look at the Voyager probe that came from the planet Earth, that came from LUCA. Look at how big Earth is.”
Lex Fridman
(02:03:31)
Then it produced that.
Lee Cronin
(02:03:32)
It produced that.
Lex Fridman
(02:03:34)
Yeah.
Lee Cronin
(02:03:35)
And that I think is completely amazing. And then that should allow people on Earth to think about, “Probably we should try and get causal chains off Earth onto Mars, onto the moon, wherever. Whether it’s human life or martian life that we create, it doesn’t matter.” But I think this combinatorial space tells us something very important about the universe, and that’s what I realized in assembly theory: that the universe is too big to contain itself. Now coming back, I want to change your mind about time, because I’m guessing that for you, time is just a coordinate. So I’m going to change…
Lex Fridman
(02:03:35)
I’m guessing you’re one of those.
Lee Cronin
(02:04:20)
One of those. I’m going to change your mind in real time, or at least attempt to.
Lex Fridman
(02:04:22)
Oh, in real time. There you go. I already got the tattoo. So this is going to be embarrassing if you change my mind.
Lee Cronin
(02:04:27)
But you can just add an arrow of time onto it, right?
Lex Fridman
(02:04:27)
Yeah, true. Just modify it.
Lee Cronin
(02:04:32)
Or raise it a bit. And the argument that I think is really most interesting is that people say the initial conditions specify the future of the universe. Okay, fine. Let’s say that’s the case for a moment. Now let’s go back to Newtonian mechanics. The uncertainty principle in Newtonian mechanics is this: if I give you the coordinates of an object moving in space and the coordinates of another object, and they collide in space, then from those initial conditions you should know exactly what’s going to happen. However, you cannot specify these coordinates to infinite precision. Now everyone says, “Oh, this is like the chaos theory argument.” No, no, it’s deeper than that. Here’s the problem with numbers. This is where Hilbert and Brouwer fell out. To have the coordinates of this object, a given object that’s colliding, you have to have them to infinite precision. That’s what Hilbert says: there’s no problem, infinite precision is fine. Let’s just take that for granted.

(02:05:38)
But when the object is finite and it can’t store its own coordinates, what do you do? So if a finite object cannot be specified to infinite precision, then in principle the initial conditions don’t apply.
Lex Fridman
(02:05:58)
How do you know it can’t store its…
Lee Cronin
(02:06:01)
How do you store an infinitely long number in a finite size?
Lex Fridman
(02:06:09)
We’re using infinity very loosely here.
Lee Cronin
(02:06:11)
No, no. We’re using…
Lex Fridman
(02:06:12)
Infinite precision. Not loosely, but…
Lee Cronin
(02:06:14)
Very precisely.
Lex Fridman
(02:06:15)
So you think infinite precision is required?
Lee Cronin
(02:06:18)
Let’s take the object. Let’s say the object is a golf ball. A golf ball is a few centimeters in diameter. We can work out how many atoms are in the golf ball. And let’s say we can store numbers down to atomic dislocations. So we can work out how many atoms there are in the golf ball, and we can store the coordinates in that golf ball down to that number. But beyond that, we can’t. Let’s make the golf ball smaller. And this is where I think we get randomness in quantum mechanics. Some people say you can’t get randomness, quantum mechanics is deterministic, but aha, this is where we realize that classical mechanics and quantum mechanics suffer from the same uncertainty principle: the inability to specify the initial conditions to a precise enough degree to give you determinism.

(02:07:09)
The universe is intrinsically too big and that’s why time exists. It’s non-deterministic. Looking back into the past, you can use logical arguments, because you can say, “Was it true or false?” You already know. But the fact that we are unable to predict the future with precision is not evidence of a lack of knowledge. It’s evidence the universe is generating new things.
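A minimal sketch in Python of the precision point being made here, using the logistic map as a stand-in for the colliding objects (the map, the 12-digit cutoff, and the step count are illustrative choices, not anything from the conversation): two copies of the same deterministic rule, started from initial conditions that agree to every digit one of them can store, still end up disagreeing completely.

```python
# Minimal sketch: a chaotic but fully deterministic rule (the logistic map)
# iterated from two initial conditions that differ only past the 12th decimal,
# i.e. beyond what the "smaller object" can store about itself.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x_exact = 0.123456789012345      # the "true" initial condition
x_stored = round(x_exact, 12)    # the same condition kept to finite precision

max_gap = 0.0
for _ in range(60):
    x_exact = logistic(x_exact)
    x_stored = logistic(x_stored)
    max_gap = max(max_gap, abs(x_exact - x_stored))

# The truncated copy drifts to an order-one difference: the rule is deterministic,
# but a finitely stored initial condition does not pin down the long-run future.
print(max_gap)
```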
Lex Fridman
(02:07:38)
Okay, first of all, quantum mechanics, you could just say statistically what’s going to happen when two golf balls hit each other.
Lee Cronin
(02:07:44)
Statistically. But sure, I can say statistically what’s going to happen. But then when they do happen, and then you keep nesting it together, it goes almost back to, look, let’s think about entropy in the universe. So how do we understand entropy change as a process? We can use the ergodic hypothesis. We can also have the counterfactuals where we have all the different states, and we can even put that in the multiverse. But both of those are nonphysical. The multiverse collapses back to the same problem about the precision. So if you accept that, you don’t have to have true and false going forward into the future. The real numbers are real. They’re observables.
Lex Fridman
(02:08:47)
We’re trying to see exactly where time being fundamental sneaks in, and this difference, that the golf ball can’t contain its own position perfectly precisely. How does that lead to time needing to be fundamental?
Lee Cronin
(02:09:07)
Do you believe or do you accept you have free will?
Lex Fridman
(02:09:12)
Yeah, I think at this moment in time, I believe that I have free will.
Lee Cronin
(02:09:17)
So then you have to believe that time is fundamental.
Lex Fridman
(02:09:23)
I understand that’s a statement you’ve made.
Lee Cronin
(02:09:25)
No, that we can logically follow, because if you’re in a universe that has no time, the universe is deterministic. And if it’s deterministic, then you have no free will.
Lex Fridman
(02:09:37)
I think the space of how much we don’t know is so vast that saying the universe is deterministic and from that jumping into there’s no free will is just too difficult of a leap.
Lee Cronin
(02:09:48)
No, I logically follow. No, no, I don’t disagree. It’s deep and it’s important. All I’m saying, and it’s actually different to what I’ve said before, is that if you don’t require platonistic mathematics and accept that non-determinism is how the universe looks, and that gives us our creativity and the way the universe is getting novelty, it’s really deeply important in assembly theory, because assembly theory starts to actually give you a mechanism where you go from boring time, which is basically initial conditions specify everything, to a mismatch in creative time. And I hope we’ll do experiments. I would love to do an experiment that proves that time is fundamental and the universe is generating novelty. I don’t know all the features of that experiment yet, but by having these conversations openly and getting people to think about the problems in a new way, better people, more intelligent people with good mathematical backgrounds can say, “Oh, hey, I’ve got an idea. I would love to do an experiment that shows that the universe is too big for itself going forward in time.”

(02:11:04)
And this is why I really hate the idea of the Boltzmann brain. The Boltzmann brain makes me super annoyed; it’s like everyone’s having a free lunch. It’s like saying, “Let’s break all the laws of physics.” So a Boltzmann brain is this idea that in a long enough universe, a brain will just emerge in the universe as conscious. And that neglects the causal chain of evolution that was required to produce that brain. And this is where the computational argument really falls down, because a computationalist could say, “I can calculate the probability of a Boltzmann brain.” And they’ll give you a probability. But I can calculate the probability of a Boltzmann brain. Zero.
Lex Fridman
(02:11:40)
Just because the space of possibilities is so large?
Lee Cronin
(02:11:43)
Yeah. When we start fooling ourselves with numbers that we can’t actually measure and we can’t ever conceive of, I think it doesn’t give us a good explanation. And I want to explain why life is in the universe. I think life is actually a novelty miner. Life basically mines novelty almost from the future and actualizes it in the present.
Lex Fridman
(02:12:11)
Okay. Life is a novelty miner from the future that is actualized in the present.
Lee Cronin
(02:12:20)
Yep. I think so.
Lex Fridman
(02:12:24)
Novelty miner. First of all, novelty. What’s the origin of novelty when you go from boring time to creative time? Where is that? Is it as simple as randomness like you’re referring to?
Lee Cronin
(02:12:39)
I am really struggling with randomness because I had a really good argument with Joscha Bach about randomness, and he just said, “Randomness doesn’t give you free will. That’s insane because you’d just be random.” And I think he’s right at that level but I don’t think he is right on another level. And it’s not about randomness, it’s about constrained, I’m making this up as I go along, so making this up, constrained opportunity. So the novelty. What is novelty? This is what I think is a funny thing if you ever want to discuss AI. Why I think everyone’s gone AI mad is that they’re misunderstanding novelty. But let’s think about novelty. Yes. What is novelty? So I think novelty is a genuinely new configuration that is not predicted by the past and that you discover in the present. And that is truly different. Now, everyone says that. Some people say that novelty doesn’t exist. It’s always with precedent. I want to do experiments that show that that is not the case. And it goes back to a question you asked me a few moments ago, which is where is the factory?

(02:13:58)
Because I think the same mechanism that gives us a factory gives us novelty. And I think that is why I’m so deeply hung up on time. Of course I’m wrong, but how wrong? And I think that life opens up that combinatorial space in a way that our current laws of physics can’t capture, as contrived in a deterministic, initial-condition universe, even with the get-out of the multiverse, David Deutsch style, which I love by the way, but I don’t think is correct, although it’s really beautiful.
Lex Fridman
(02:14:37)
Multiverse.
Lee Cronin
(02:14:38)
David Deutsch’s conception of the multiverse is a given. But I think that the problem with wave-particle duality in quantum mechanics is not about the multiverse. It’s about understanding how determined the past is. I don’t just think that, actually. This is a discussion I was having with Sara, where she was like, “Oh, I think we’ve been debating this for a long time now, about how we reconcile novelty and determinism.”
Lex Fridman
(02:15:13)
Okay. Just to clarify, both you and Sara think the universe is not deterministic?
Lee Cronin
(02:15:19)
I won’t speak for Sara but roughly, I think the universe is deterministic looking back in the past but undetermined going forward in the future. So I’m having my cake and eating it here. This is because I fundamentally don’t understand randomness, as Joscha told me or other people told me. But if I adopt a new view now, which is that the universe is just non-deterministic, I’d like to refine that and say the universe appears deterministic going back in the past, but it’s undetermined going forward in the future.
Lee Cronin
(02:16:00)
So how can we have a universe that has deterministically-looking rules but is non-determined going into the future? It’s this breakdown in precision in the initial conditions, and we have to just stop using initial conditions and start looking at trajectories, and how the combinatorial space behaves in an expanding universe in time and space. And assembly theory helps us quantify the transition to biology, and biology appears to be novelty-mining, because it’s making crazy stuff that is unique to Earth. Right? There are objects on Earth that are unique to Earth that will not be found anywhere else, because you can do the combinatorial math.
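A toy sketch of what an assembly-index-style count looks like for strings, just to make the combinatorial point concrete: the minimum number of joining operations needed to build a target when fragments already built can be reused. This is only an illustration of the idea, not the algorithm from Cronin and Walker’s published work, and the example strings are made up.

```python
# Toy assembly-index-style measure for strings: breadth-first search over the
# minimum number of join operations, where single characters come for free and
# any fragment already built can be reused.

def assembly_index(target):
    basics = frozenset(target)          # individual characters: the free building blocks
    frontier = {basics}
    seen = {basics}
    steps = 0
    while frontier:
        if any(target in built for built in frontier):
            return steps
        next_frontier = set()
        for built in frontier:
            for a in built:
                for b in built:
                    joined = a + b
                    if joined in target and joined not in built:
                        state = built | {joined}
                        if state not in seen:
                            seen.add(state)
                            next_frontier.add(state)
        frontier = next_frontier
        steps += 1
    return None

print(assembly_index("ababab"))   # 3 joins: ab -> abab -> ababab, reuse keeps it cheap
print(assembly_index("abcdef"))   # 5 joins: no repeated motifs, so nothing to reuse
```

Objects with a low index are easy to reach by chance; objects with a high index and a high copy number are the ones assembly theory treats as fingerprints of selection.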
Lex Fridman
(02:16:41)
What was that statement you made about “life is novelty-mining from the future”? What’s the little element of time that you’re introducing there?
Lee Cronin
(02:16:51)
What I’m kind of meaning is because the future is bigger than the present, in a deterministic universe, how do the states go from one to another? There’s a mismatch, right?
Lex Fridman
(02:17:02)
Yeah.
Lee Cronin
(02:17:03)
So, that must mean that you have a little bit of indeterminism. Whether that’s randomness or something else, I don’t understand. I want to do experiments to formulate a theory to refine that as we go forward that might help us explain that. And I think that’s why I’m so determined to try and crack the “non-life to life” transition looking at networks and molecules, and that might help us think about the mechanism. But certainly the future is bigger than the past in my conception of the universe and some conception of the universe. And-
Lex Fridman
(02:17:35)
By the way, that’s not obvious, right? The future being bigger than the past, well, that’s one statement, and the statement that the universe is not big enough to contain the future is another statement. That one is a big one. That one’s a really big one.
Lee Cronin
(02:17:53)
I think so, but I think it’s entirely … Because look, we have the second law, and right now we don’t need the second law if the future’s bigger than the past. It follows naturally. So, why are we retrofitting all these sticking plasters onto our reality to hold onto a timeless universe?
Lex Fridman
(02:18:13)
Yeah, but that’s because it’s kind of difficult to imagine the universe that can’t contain the future.
Lee Cronin
(02:18:21)
But isn’t that really exciting?
Lex Fridman
(02:18:23)
It’s very exciting, but it’s hard. We are humans on Earth, and we have a very kind of four-dimensional conception of the world, of 3D plus time. It’s just hard to intuit a world where, what does that even mean, a universe that can’t contain the future?
Lee Cronin
(02:18:47)
Yeah. It’s kind of crazy but obvious.
Lex Fridman
(02:18:50)
It’s weird, it’s weird. I suppose it sounds obvious, yeah, if it’s true.
Lee Cronin
(02:18:56)
So, the reason why assembly theory turned me onto that was that, let’s just start in the present, and look at all the complex molecules, and go backwards in time, and understand how evolutionary processes gave rise to them. It’s not at all obvious that taxol, which is one of the most complex natural products produced by biology, was going to be invented by biology. It’s an accident.

(02:19:24)
Taxol is unique to Earth. There’s no taxol elsewhere in the universe, and taxol was not decided by the initial conditions. It was decided by this interplay between the … So, the past simply is embedded in the present. It gives some features. But why the past doesn’t map to the future one-to-one is because the universe is too big to contain itself. That gives space for creativity, and novelty, and some things which are unpredictable.

Cellular automata

Lex Fridman
(02:19:57)
Well, okay. So, given that you’re disrespecting the power of the initial conditions, let me ask you about, how do you explain that cellular automata are able to produce such incredible complexity given just basic rules and basic initial conditions?
Lee Cronin
(02:20:12)
I think that this falls into the Brouwer-Hilbert trap. So, how do you get cellular automata to produce complexity? You have a computer, you generate a display, and you map the change of that in time. There are some CAs that repeat like functions.

(02:20:32)
It’s fascinating to me that for pi, there is a formula where you can go to the millionth decimal place of pi and read out the number without having to go there. But there are some numbers where you can’t do that, and you have to just crank through. Whether it’s Wolframian computational irreducibility or some other thing, well, it doesn’t matter. But these CAs, that complexity, is it just complexity, or a number where you’re basically mining that number in time? Is that just a display screen for that number, that function?
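The formula being alluded to here is presumably the Bailey-Borwein-Plouffe formula; strictly it lets you read out hexadecimal rather than decimal digits of pi, but the point stands: a chosen digit far into the expansion can be computed directly, without cranking through the ones before it. A rough sketch, assuming ordinary double-precision arithmetic is good enough for the positions shown:

```python
# Bailey-Borwein-Plouffe digit extraction: the hex digit of pi at fractional
# position n, computed without generating the preceding digits.

def pi_hex_digit(n):
    def series(j):
        # sum over k of 16^(n-k) / (8k + j): an exact modular part for k <= n
        # plus a rapidly vanishing tail for k > n
        s = 0.0
        for k in range(n + 1):
            denom = 8 * k + j
            s = (s + pow(16, n - k, denom) / denom) % 1.0
        for k in range(n + 1, n + 20):
            s += 16.0 ** (n - k) / (8 * k + j)
        return s % 1.0

    x = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return "0123456789abcdef"[int(16 * x)]

# The hex expansion of pi begins 3.243f6a8885...
print("".join(pi_hex_digit(i) for i in range(10)))   # 243f6a8885
print(pi_hex_digit(1000))                            # a single digit far out, read directly
```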
Lex Fridman
(02:21:10)
Well, can’t you say the same thing about the complexity on Earth then?
Lee Cronin
(02:21:12)
No. Because the complexity on Earth has a copy number and an assembly index associated with it. That CA is just a number running.
Lex Fridman
(02:21:20)
You don’t think it has a copy number? Wait a minute …
Lee Cronin
(02:21:23)
Well, it does where we’re looking at humans producing different rules, but then it’s nested on selection. So, those CAs are produced by selection. The CA is such a fascinating pseudo-complexity generator. What I would love to do is understand, quantify the degree of surprise in a CA and run it long enough. But what I guess that means is we have to instantiate, we have to have a number of experiments where we’re generating different rules and running them for time steps, but … Oh, I got it.

(02:21:53)
CAs are mining novelty in the future by iteration, right? And you’re like, “Oh, that’s great. That’s great.” You didn’t predict it. Some rules you can predict what’s going to happen, and other rules you can’t. So for me, if anything, CAs are evidence that the universe is too big to contain itself, because otherwise you’d know what the rules are going to do forevermore.
Lex Fridman
(02:22:14)
Right. I guess you were saying that the physicist saying that all you need is the initial conditions and the rules of physics is somehow missing the bigger picture.
Lee Cronin
(02:22:26)
Yeah.
Lex Fridman
(02:22:27)
And if you look at CAs, all you need is the initial condition and the rules, and then run the thing.
Lee Cronin
(02:22:33)
You need three things: you need the initial conditions, you need the rules, and you need time iteration to mine it out. Without the coordinate, you can’t get it out.
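A small illustration of those three ingredients, using Rule 30, the standard Wolfram example (the rule number, width, and step count are arbitrary display choices): the rule and the single-cell initial condition each fit on one line, but the pattern only appears by actually spending the iterations.

```python
# Elementary cellular automaton: rule + initial condition + time steps.
RULE, WIDTH, STEPS = 30, 64, 32

def step(cells, rule=RULE):
    out = []
    for i in range(len(cells)):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % len(cells)]
        neighbourhood = (left << 2) | (centre << 1) | right   # value 0..7
        out.append((rule >> neighbourhood) & 1)               # look up that bit of the rule
    return out

cells = [0] * WIDTH
cells[WIDTH // 2] = 1            # initial condition: one live cell in the middle
for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```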
Lex Fridman
(02:22:45)
Sure, and that to you is fundamental?
Lee Cronin
(02:22:47)
And you can’t predict it from the initial conditions. If you could, then it could be fine.
Lex Fridman
(02:22:51)
And that time is-
Lee Cronin
(02:22:53)
A resource.
Lex Fridman
(02:22:54)
… like the foundation of the history, the memory of each of the things it created. It has to have that memory of all the things that led up to it.
Lee Cronin
(02:23:05)
Yeah, you have to have the resource. Because time is a fundamental resource. Yeah, I think I had a major epiphany about randomness, but I keep doing that every two days and then it goes away again. It’s random.
Lex Fridman
(02:23:24)
You’re a time fundamentalist.
Lee Cronin
(02:23:26)
And you should be as well. If you believe in free will, then the only conclusion is that time is fundamental. Otherwise you cannot have free will. It logically follows.
Lex Fridman
(02:23:37)
Well, the foundation of my belief in free will is observation-driven.
Lee Cronin
(02:23:48)
But that’s-
Lex Fridman
(02:23:48)
I think if you use logic, logically it seems like the universe is deterministic.
Lee Cronin
(02:23:55)
Looking backwards in time then that’s correct, the universe is.
Lex Fridman
(02:23:59)
And then everything else is a kind of leap. It requires a leap.
Lee Cronin
(02:24:11)
This is why I think machine learning is going to provide a chunk of that, right? To help us explain this. So, the way I’d say it, if you take …
Lex Fridman
(02:24:19)
That’s interesting. Why?

AGI

Lee Cronin
(02:24:21)
Well, my favorite one is … Because AI doomers are driving me mad, and in fact we don’t have any intelligence yet. I call AI “autonomous informatics” just to make people grumpy.
Lex Fridman
(02:24:34)
Yeah. You’re saying we’re quite far away from AGI.
Lee Cronin
(02:24:39)
I think that we have no conception of intelligence, and I think that we don’t understand how the human brain does what it does. I think that neuroscience is making great advances, but I think that we have no idea about AGI. So, I am a technological optimist, I guess. I believe we should do everything. The whole regulation of AI is nonsensical. Why would you regulate Excel? Other than the fact that Clippy should come back, and I love Excel ’97 because we can do the flight simulator.
Lex Fridman
(02:25:11)
Sorry, in Excel?
Lee Cronin
(02:25:12)
Yeah, have you not played the flight simulator in-
Lex Fridman
(02:25:14)
In Excel ’97?
Lee Cronin
(02:25:16)
Yeah.
Lex Fridman
(02:25:18)
What does that look like?
Lee Cronin
(02:25:19)
It’s like wireframe, very basic. But basically I think it’s X zero, Y zero, shift, and it opens up and you can play the flight simulator.
Lex Fridman
(02:25:29)
Oh, wow. Wait, wait, is it using Excel?
Lee Cronin
(02:25:32)
Excel ’97.
Lex Fridman
(02:25:33)
Okay.
Lee Cronin
(02:25:34)
I resurrected it the other day and saw Clippy again for the first time in a long time.
Lex Fridman
(02:25:37)
Well, Clippy is definitely coming back. But you’re saying we don’t have a great understanding of what is intelligence, what is the intelligence underpinning the human mind.
Lee Cronin
(02:25:50)
I’m very frustrated by the way that we’re AI dooming right now, and people are bestowing some kind of magic. Now, let’s go back a bit. So, you said about AGI, are we far away from AGI? Yes. I do not think we’re going to get to AGI anytime soon. I’ve seen no evidence of it, and the AI doom scenario is nonsensical in the extreme.

(02:26:12)
The reason why I think it’s nonsensical … And that’s not to say there aren’t things we should do and be very worried about. There are things we need to worry about right now with what AI is doing, whether it’s fake data or fake users. I want authentic people, authentic data. I don’t want everything to be faked, and I think it’s a really big problem, and I absolutely want to go on the record to say I really worry about that. What I’m not worried about is that some fictitious entity is going to turn us all to paperclips or detonate nuclear bombs, or maybe, I don’t know, anything you can think of.

(02:26:49)
Why is this? I’ll take a very simple series of logical arguments, and the AI doomers do not have the correct epistemology. They do not understand what knowledge is. And until we understand what knowledge is, they’re not going to get anywhere because they’re applying things falsely. So, let me give you a very simple argument.

(02:27:18)
People talk about the probability, “P(doom)”, of AI. We can work out the probability of an asteroid hitting the planet. Why? Because it’s happened before. We know the mechanism. We know that there’s a gravity well, or that spacetime is bent and stuff falls in. We don’t know the probability of AGI because we have no mechanism. So, let me give you another one, which is like, “I’m really worried about AG.” What’s AG? AG is anti-gravity. “One day we could wake up and anti-gravity is discovered, we’re all going to die, the atmosphere is going to float away, we’re going to float away, we’re all doomed.”

(02:27:52)
What is the probability of AG? We don’t know, because there’s no mechanism for AG. Do we worry about it? No, and I don’t understand the current reason for certain people in certain areas to be generating this nonsense. I think they’re not doing it maliciously. I think we’re observing the emergence of new religions, how religions come about, because religions are about control.

(02:28:20)
You’ve got the optimists saying, “AI is going to cure us all,” and the doomers saying, “AI is going to kill us all.” What’s the reality? Well, we don’t have AI. We have really powerful machine learning tools, and they will allow us to do interesting things, and we need to be careful about how we use those tools in terms of manipulating human beings and faking stuff. Right?
Lex Fridman
(02:28:38)
Right. Well, let me try to steel man the AI doomers’ argument. And actually, I don’t know, are AI doomers in the Yudkowsky camp saying it’s definitely going to kill us? Because there’s a spectrum.
Lee Cronin
(02:28:38)
95% I think is the limit.
Lex Fridman
(02:28:54)
Plus? 95%-plus, that’s the-
Lee Cronin
(02:28:55)
No, not plus. I don’t know. I was seeing on Twitter today various things. But I think Yudkowsky is at 95%.
Lex Fridman
(02:29:02)
But to belong to the AI doomer club, is there a threshold? I don’t know what the membership …
Lee Cronin
(02:29:06)
Maybe.
Lex Fridman
(02:29:07)
And what are the fees?
Lee Cronin
(02:29:09)
Well, I think Scott Aaronson, I was quite surprised, had put two … I saw this online, so I could be wrong. So, sorry if it’s wrong. He says 2%. But the thing is, if someone said there’s a 2% chance that you’re going to die going into the lift, would you go into the lift?
Lex Fridman
(02:29:24)
In the elevator, for the American English-speaking audience. Well, no, not for the elevator.
Lee Cronin
(02:29:30)
So, I would say anyone higher than 2% … I think there’s a 0% chance of AGI doom. Zero.
Lex Fridman
(02:29:37)
Just to push back on the argument that N is zero on the AGI … We can see on Earth that there are increasing levels of intelligence in organisms. We can see what humans with extra intelligence were able to do to the other species. So, that is a lot of samples of data on what a delta in intelligence gives you: when you have an increase in intelligence, how you’re able to dominate a species on Earth.

(02:30:08)
So, the idea there is that if you have a being that’s 10x smarter than humans, we’re not going to be able to predict what that being is going to be able to do, especially if it has the power to hurt humans. Which, you can imagine a lot of trajectories in which the more benefit AI systems give, the more control we give to those AI systems over our power grid, over our nuclear weapons, or weapons of any sort. And then it’s hard to know what an ultra-intelligence system would be able to do in that case. You don’t find that convincing?
Lee Cronin
(02:30:50)
I think I would fail that argument 100%. Here’s a number of reasons to fail it on. First of all, we don’t know where the intention comes from. The problem is that people keep … I’ve been watching all the hucksters online with the prompt engineering and all this stuff. When I talk to a typical AI computer scientist, they keep talking about the AIs having some kind of decision-making ability. That is a category error.

(02:31:17)
The decision-making ability comes from human beings. We have no understanding of how humans make decisions. We’ve just been discussing free will for the last half an hour, right? We don’t even know what that is. So, the intention, I totally agree with you, people who intend to do bad things can do bad things and we should not let that risk go. That’s totally here and now. I do not want that to happen, and I’m happy to be regulated to make sure that systems I generate, whether they’re computer systems, or … I’m working on a new project called “Chem Machina”.
Lex Fridman
(02:31:53)
Nice. Well done.
Lee Cronin
(02:31:54)
Yeah, yeah. Which is basically a …
Lex Fridman
(02:31:59)
For people who don’t understand the pun, the Ex Machina is a great film about I guess AGI embodied, and “chem” is the chemistry version of that.
Lee Cronin
(02:32:07)
And I only know one way to embody intelligence, and that’s in chemistry and human brains. So, category error number one is that they have agency. Category error number two is assuming that anything we make is going to be more intelligent. Now, you didn’t say super-intelligent. I’ll put the words into our mouths here, super-intelligent. I think that there is no reason to expect that we are going to make systems that are more intelligent. More capable …

(02:32:38)
When people play chess computers, they don’t expect to win now, right? The chess computer is very good at chess. That doesn’t mean it’s super-intelligent. So, I think that super-intelligence, and I think even Nick Bostrom is pulling back on this now, because he invented this … So, I see this a lot. When did I see it first happen? Eric Drexler, nanotechnology. Atomically precise machines. He came up with a world where we had these atom cogs everywhere and we were going to make self-replicating nanobots.

(02:33:06)
Not possible. Why? Because there’s no resources to build these self-replicating nanobots. You can’t get the precision. It doesn’t work. It was a major category error in taking engineering principles down to the molecular level. The only functioning nanomolecular technology we know is produced by evolution. There.

(02:33:27)
So, now let’s go forward to AGI. What is AGI? We don’t know. It’s super, it can do this, or humans can’t think. I would argue the only AGIs that exist in the universe are produced by evolution. And sure, we may be able to make our working memory better. We might be able to do more things. The human brain is the most compact computing unit in the universe. It uses 20 watts, uses a really limited volume. It’s not like a ChatGPT cluster which has to have thousands of watts, and a model that’s generated, and it has to be corrected by human beings. You are autonomous and embodied intelligence.

(02:34:04)
So, I think that there are so many levels that we’re missing out, we’ve just kind of gone, “Oh, we’ve discovered fire. Oh gosh, the planet’s just going to burn one day randomly.” I just don’t understand that leap. There are bigger problems we need to worry about. So, what is the motivation? Why do these people, and let’s assume they’re earnest, have this conviction? Well, I think they’re making leaps and they’re trapped in a virtual reality that isn’t reality.
Lex Fridman
(02:34:34)
Well, I can continue a set of arguments here, but also it is true that ideologies that fearmonger are dangerous. Because you can then use it to control, to regulate in a way that halts progress, to control people, and to cancel people, all that kind of stuff. So, you have to be careful, because reason ultimately wins. Right?

(02:35:03)
But there is a lot of concerns with super-intelligent systems, very capable systems. I think when you hear the word “super-intelligent”, you’re hearing, “It’s smarter than humans in every way that humans are smart.” But the paperclip manufacturing system doesn’t need to be smart in every way. It just needs to be smart in a set of specific ways. And the more capable the AI systems become, the more you could see us giving them control over, like I said, our power grid, a lot of aspects of human life. And then that means they’ll be able to do more and more damage when there’s unintended consequences that come to life.
Lee Cronin
(02:35:46)
I think that that’s right. The unintended consequences we have to think about, and that I fully agree with. But let’s go back a bit. Sentience … Again, I’m far away from my comfort zone and all this stuff, but hey, let’s talk about it. Because I give myself a qualification.
Lex Fridman
(02:36:02)
Yeah, we’re both qualified in sentience, I think, as much as anyone else.
Lee Cronin
(02:36:07)
I think the paperclip scenario is just such a poor one, because let’s think about how that would happen. And also, let’s think about, we are being so unrealistic about how much of the Earth’s surface we have commandeered. For paperclip manufacturing to really happen, do the math. It’s not going to happen. There’s not enough energy, there’s not enough resource. Where is it all going to come from?

(02:36:32)
I think that what happens in evolution, it’s really: Why has a killer virus not killed all life on Earth? Well, what happens is, sure, superkiller viruses that kill the ribosome have emerged. But you know what happens? They nuke a small space because they can’t propagate. They all die. So, there’s this interplay between evolution and propagation, right? And death. So …
Lex Fridman
(02:36:56)
In evolution. You don’t think it’s possible to engineer, for example, and sorry to interrupt, but a perfect virus?
Lee Cronin
(02:37:02)
No.
Lex Fridman
(02:37:02)
That’s deadly enough?
Lee Cronin
(02:37:04)
No. Nonsensical. I think again, it wouldn’t work. Because if it was too deadly, it would just kill the radius and not replicate.
Lex Fridman
(02:37:11)
Yeah. But you don’t think it’s possible to get a …
Lee Cronin
(02:37:16)
If you were …
Lex Fridman
(02:37:17)
Not kill all of life on Earth, but kill all humans. There are not many of us. There’s only like 8 billion. There are so many more ants. So many more ants, and they’re pretty smart.
Lee Cronin
(02:37:32)
I think the nice thing about where we are, I would love for the AI crowd to take a leaf out of the book of the bio-warfare, chemical warfare crowd. I mean, not love, because actually people have been killed with chemical weapons in the first and second World War, and bio-weapons have been made, and we can argue about COVID-19 and all this stuff. Let’s not go there just now. But I think there is a consensus that some certain things are bad and we shouldn’t do them, right? And sure, it would be possible for a bad actor to engineer something bad, but we would see it coming and we would be able to do something about it.

(02:38:16)
Now, I guess what I’m trying to say is when people talk about doom, and when you ask them for the mechanism, they just make something up. In this case, I’m with Yann LeCun. I think he made a very good point about trying to regulate jet engines before we’ve even invented them. And I think that’s what I’m saying.

(02:38:39)
I’m not saying we should … I just don’t understand why these guys are going around literally making stuff up about us all dying, when basically we need to actually really focus on … Now, let’s say there’s some actors that are earnest. Let’s say Yudkowsky is being earnest and he really cares. But he loves it. He goes, “Da, da, da, and then you’re all going to die.” It’s like, why don’t we try and do the same thing and say, “You could do this, and then you’re all going to be happy forever after”?
Lex Fridman
(02:39:07)
Well, I think there’s several things to say there. One, I think there is a role in society for people that say we’re all going to die. Because I think it filters through as a message, as a viral message that gives us the proper amount of concern. Meaning it’s not 95%, but when you say 95% and it filters through society, it’ll give an average of like a 0.03%. An average. So, it’s nice to have people that are like, “We’re all going to die,” and then we’ll have a proper concern.

(02:39:41)
For example, I do believe we’re not properly concerned about the threat of nuclear weapons currently. It just seems like people have forgotten that that’s a thing, and there’s a war in Ukraine with a nuclear power involved. There are nuclear powers throughout the world, and it just feels like we’re on the brink of a potential world war to a degree that I don’t think people are properly calibrating in their heads. We’re all treating it as a Twitter battle as opposed to an actual threat.

(02:40:12)
So, it’s nice to have that kind of level of concern. But to me, when I hear AI doomers, what I’m imagining is a potential situation where, through unintended consequences, let’s say 5% of the world suffers deeply because of a mistake that was made. I don’t want to imagine the entirety of human civilization dying, but there could be a lot of suffering if this is done poorly.
Lee Cronin
(02:40:39)
I understand that, and I guess I’m involved in the whole hype cycle. So, let’s say having some people saying AI doom is a worry, fine. Let’s give them that. But what seems to be happening is there seems to be people who don’t think AI is doing that, and they’re trying to use that to control regulation and to push people to regulate, which stops humans generating knowledge. And I am an advocate for generating as much knowledge as possible.

Nuclear weapons


(02:41:15)
When it comes to nuclear weapons, I grew up in the ’70s and ’80s, when there was nuclear doom and a lot of adults really felt an existential threat, almost as bad as now with AI doom. They were really worried. There were some great … Well, not great. There were some horrific documentaries. I think there was one called Threads that was made in the UK, which was terrible. It was so scary.

(02:41:40)
And I think that the correct thing to do is obviously get rid of nuclear weapons, but let’s think about unintended consequences. We’ve got rid of … This is going to be such a non sequitur. We got rid of all the sulfur particles in the atmosphere, right? All the soot. And what’s happened in the last couple of years is global warming has accelerated because we’ve cleaned up the atmosphere too much. So …
Lex Fridman
(02:42:02)
Sure. The same thing if you get rid of nuclear weapons. You’ll get [inaudible 02:42:05]-
Lee Cronin
(02:42:05)
Exactly, that’s my point. So, what we could do is if we actually started to put the AI in charge … Which I’d really like an AI to be in charge of all world politics, and this will sound ridiculous for a second. Hang on. But if we could all agree on the-
Lex Fridman
(02:42:19)
The AI doomers just woke up on that statement.
Lee Cronin
(02:42:22)
Yeah, yeah, yeah. But I really don’t like politicians who are basically just looking at local sampling. But if you could say globally, “Look, here’s some game theory here. What is the minimum number of nuclear weapons we need to distribute around the world to everybody to basically reduce war to zero?”
Lex Fridman
(02:42:40)
Just the thought experiment of, the United States and China and Russia and major nuclear powers get together and say, “All right, we’re going to distribute nuclear weapons to every single nation on Earth.” Oh, boy. That has a probably greater than 50% chance of eliminating major military conflict, but it’s not a hundred percent.
Lee Cronin
(02:43:07)
But I don’t think anyone will use them, because … And look, what you’ve got to try and do is to qualify for these nuclear weapons … This is a great idea. The game theorists could do this, right?
Lex Fridman
(02:43:19)
Uh-huh.
Lee Cronin
(02:43:20)
I think the question is this … I really buy your question. We have too many nukes. Just from a feeling point of view, we’ve got too many of them. So, let’s reduce the number, but not get rid of them, because we’ll have too much conventional warfare. So then, what is the minimum number of nuclear weapons we can distribute around to remove … Humans hurting each other is something we should stop doing. It’s not outwith our conceptual capability …

(02:43:46)
But right now, what about certain nations that are being exploited for their natural resources, trading their future for a short-term gain, because we don’t want to generate knowledge? So, if everybody had an equal doomsday switch, I predict the quality of life of the average human will go up faster. I am an optimist, and I believe that humanity is going to get better and better and better, that we’re going to eliminate more problems. But I think, yeah, let’s-
Lex Fridman
(02:44:13)
But the probability of a bad actor, of one of the nations setting off a nuclear weapon, you have to integrate that into the calculus here.
Lee Cronin
(02:44:26)
But we just give you [inaudible 02:44:28] nukes population. Right? What we do is we … I can’t believe this. But anyway, let’s just go there. So, if a small nation with a couple of nukes uses one because they’re a bit bored or annoyed, the likelihood that they are going to be pummeled out of existence immediately is 100%. And yet they’ve only nuked one other city. I know this is crazy, and I apologize for …
Lex Fridman
(02:44:51)
Well, no, no. Just to be clear, we’re just having a thought experiment that’s interesting. But there’s terrorist organizations that would take that trade. We have to ask ourselves a question of: Which percentage of humans would be suicide bombers, essentially? Where they would sacrifice their own life because they hate another group of people. I believe it’s a very small fraction, but is it large enough to, if you give out nuclear weapons …
Lee Cronin
(02:45:25)
I can predict a future where we take all nuclear material and we burn it for energy, right? Because we’re getting there. And the other thing you could do is say, “Look, there’s a gap.” So, if we get all the countries to sign up to the virtual agreement where we have a simulation where we can nuke each other in the simulation and the economic consequences are catastrophic …
Lex Fridman
(02:45:43)
Sure. In the simulation, I love it. It’s not going to kill all humans, it’s just going to have economic consequences.
Lee Cronin
(02:45:49)
Yeah, yeah. I don’t know, I just made it up. It seems like a cool idea.
Lex Fridman
(02:45:51)
No, it’s interesting. But it’s interesting whether that would have as much power on human psychology as actual physical nuclear explosion.
Lee Cronin
(02:45:59)
I think so.
Lex Fridman
(02:46:00)
It’s possible, but people don’t take economic consequences as seriously I think as actual nuclear weapons exploding.
Lee Cronin
(02:46:07)
I think they do in Argentina, and they do in Somalia. And they do in a lot of these places where … No, I think this is a great idea. I’m a strong advocate now for … So, what have we come up with? Burning all the nuclear material to have energy. And before we do that, because MAD is good, mutually assured destruction is very powerful, let’s take it into the metaverse and then get people to kind of subscribe to that. And if they actually nuke each other even for fun in the metaverse, there are dire consequences.
Lex Fridman
(02:46:36)
Yeah, yeah. So, it’s like a video game. We all have to join this metaverse video game …
Lee Cronin
(02:46:41)
Yeah. I can’t believe we just …
Lex Fridman
(02:46:43)
And then there’s dire economic consequences. And it’s all run by AI, as you mentioned, so the AI doomers are really terrified at this point.
Lee Cronin
(02:46:52)
No, they’re happy. They have a job for another 20 years, right?
Lex Fridman
(02:46:55)
Oh, fear-mongering.
Lee Cronin
(02:46:56)
Yeah, yeah, yeah. I’m a believer in equal employment.

Chem Machina

Lex Fridman
(02:47:00)
You’ve mentioned that, what’d you call it … Chem Machina?
Lee Cronin
(02:47:06)
Yeah.
Lex Fridman
(02:47:07)
Yeah. So, you’ve mentioned that a chemical brain is something you’re interested in creating, and that’s the way to get conscious AI soon. Can you explain what a chemical brain is?
Lee Cronin
(02:47:22)
I want to understand the mechanism of intelligence that’s gone through evolution, right? Because the way that intelligence was produced by evolution appears to be the following: origin of life, multi-cellularity, locomotion, senses. Once you can start to see things coming towards you, and you can remember the past and interrogate the present and imagine the future, you can do something amazing, right? And I think only in recent years did humans become Turing-complete, right?
Lex Fridman
(02:47:57)
Yeah.
Lee Cronin
(02:47:58)
Right? So, that Turing completeness kind of gave us another kick up. But our ability to process that information was produced in a wet brain. And I think that we do not have the correct hardware architectures to have the domain flexibility and the ability to integrate information, and I think intelligence also comes at a massive compromise of data. Right now we’re obsessing about getting more and more data, more and more processing, more and more tricks to get dopamine hits. So, when we look back on this going, “Oh yeah, that was really cool, because when I asked ChatGPT, it made me feel really happy and I got a hit from it.” But actually it just exposed how little intelligence I use in every moment, because I’m easily fooled.

(02:48:58)
So, what I would like to do is to say, “Well, hey, hang on. What is it about the brain?” So, the brain has this incredible connectivity, and it has the ability to … As I said earlier about my nephew, I went from “Bill” to “Billy” and he went, “All right, Leroy.” How did he make that leap? That he was able to basically without any training … I extended his name in a way that he doesn’t like. He wants to be called Bill. He went back and said, “You like to be called Lee? I’m going to call you Leroy.”

(02:49:29)
So, human beings have a brilliant ability, or intelligent beings appear to have a brilliant ability, to integrate across all domains all at once, and to synthesize something which allows us to generate knowledge. And we became Turing-complete on our own, whereas AIs, although they are built on Turing-complete things, their thinking is not Turing-complete in that they are not able to build universal explanations.
Lee Cronin
(02:50:00)
And that lack of universal explanation means that they’re just inductivists. Inductivism doesn’t get you anywhere. It’s just basically a party trick. I think it’s in The Fabric Of Reality from David Deutsch where basically the farmer is feeding the chicken every day and the chicken’s getting fat and happy. And the chicken’s like, “I’m really happy every time the farmer comes in and feeds me.” And then one day the farmer comes in and instead of feeding the chicken, just rings its neck. Had the chicken had an alternative explanation of why the farmer was feeding it, it might have seen that coming.
Lex Fridman
(02:50:37)
It’s interesting though, because we don’t know what’s special about the human mind that’s able to come up with these kinds of generalities, these universal theories of things, and to come up with novelty. I can imagine… Because you gave an example about William and Leroy. I feel like an example like that we’ll be able to see in future versions of large language models. We’ll be really, really, really impressed by the humor, the insights, all of it. Because it’s fundamentally trained on all the incredible humor and insights that are available out there on the internet. So we’ll be impressed. I think we’ll be impressed.
Lee Cronin
(02:51:22)
Oh, I’m impressed. I’m impressed.
Lex Fridman
(02:51:25)
Increasingly so.
Lee Cronin
(02:51:26)
But we are mining the past.
Lex Fridman
(02:51:28)
Yes.
Lee Cronin
(02:51:28)
And what the human brain appears to be able to do is mine the future.
Lex Fridman
(02:51:31)
Yes. So novelty, it is interesting whether these large language models will ever be able to come up with something truly novel.
Lee Cronin
(02:51:41)
I can show on the back of a piece of paper why that’s impossible. And it’s like the problem is that… And again, these are domain experts kind of bullshitting each other with the term generative, right. The average person says, oh, it’s… No, no, no. Look, if I take the numbers between zero and 1,000 and I train a model to pick out the prime numbers by giving it all the prime numbers between zero and a thousand, it doesn’t know what a prime number is. Occasionally, if I cheat a bit, it will start to guess.
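A deliberately degenerate sketch of that claim: a “generative” model that can only recombine what it memorised, here literally sampling from the primes below 1,000 it was trained on, will never emit a prime outwith that set however long it runs. Real language models are far richer than a lookup table, so this only illustrates what mining the past means here; it does not settle the disagreement that follows.

```python
# Degenerate "generative model": it can only re-emit members of its training set.
import random

def primes_below(n):
    sieve = [True] * n
    sieve[:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [i for i, is_p in enumerate(sieve) if is_p]

training_set = primes_below(1000)            # every prime up to 997

def generate():
    return random.choice(training_set)       # pure interpolation: mining the past

samples = {generate() for _ in range(100_000)}
print(max(samples))                          # never exceeds 997, the largest prime it saw
print(any(p > 1000 for p in samples))        # False: nothing "outwith" the dataset
```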

(02:52:12)
It never will produce anything outwith the dataset, because you mine the past. The thing that I’m getting to is I think that actually current machine learning technologies might actually help reveal why time is fundamental. It’s kind of insane. Because they tell you about what’s happened in the past, but they can never help you understand what’s happening in the future without training examples. Sure, if that thing happens again. So let’s think about what large language models are doing. We have all the internet as we know it, language, but also they’re doing something else. We have human beings correcting it all the time. Those models are being corrected,
Lex Fridman
(02:52:54)
Steered.
Lee Cronin
(02:52:56)
Corrected, modified, tweaked.
Lex Fridman
(02:53:01)
Well, yeah, but-
Lee Cronin
(02:53:02)
Cheating.
Lex Fridman
(02:53:04)
Well you could say the training on human data in the first place is cheating.
Lee Cronin
(02:53:08)
Well, human is in the loop. Sorry to interrupt.
Lex Fridman
(02:53:10)
Yes. So a human is definitely in the loop, but it’s not just one human in the loop. A very large collection of humans is in the loop.
Lee Cronin
(02:53:10)
Look I totally-
Lex Fridman
(02:53:21)
And that could be… I mean, to me it’s not intuitive, you said prime numbers, that the system can’t generate an algorithm: an algorithm that can generate prime numbers, or an algorithm that can tell you if a number is prime, and so on. And generate algorithms that generate algorithms, that generate algorithms that start to look a lot like human reasoning.
Lee Cronin
(02:53:46)
I think again, we can show that on a piece of paper, that sure. I think you have to have… So this is the failure in epistemology. I’m glad I can even say that word, let alone know what it means.
Lex Fridman
(02:53:59)
You said it multiple times.
Lee Cronin
(02:54:00)
I know. It’s like three times now.
Lex Fridman
(02:54:01)
Without failure. Quit while you’re ahead. Just don’t say it again because you did really well.
Lee Cronin
(02:54:07)
Thanks. But I think, so what is reasoning? So coming back to the chemical brain. If I could show the inner… Because I mean, I’m never going to make an intelligence in Chem Machina like ours, because it doesn’t have brain cells, it doesn’t have glial cells, it doesn’t have neurons. But if I can take a gel and engineer the gel to be a hybrid hardware for reprogramming, which I think I know how to do, I will be able to process a lot more information and train models billions of times cheaper and use cross-domain knowledge. And there are certain techniques I think we can do. But what’s still missing, though, are the abilities that human beings have had to become Turing-complete. And so I guess the question to give back to you is, how do you tell the difference between trial and error and the generation of new knowledge?

(02:55:06)
I think the way you can do it is this: you come up with a theory, an explanation, the inspiration comes from outside, and then you test it, and you see whether it’s going towards the truth. Human beings are very good at doing that, and at the transition between philosophy, mathematics, physics, and the natural sciences. And I think we can see that. Where I get confused is why people misappropriate the term artificial intelligence to say, “Hey, there’s something else going on here.” Because I think you and I both agree, machine learning’s really good, it’s only going to get better, and we’re going to get happier with the outcome. But why would you ever think the model is thinking or reasoning? Reasoning requires intention. And if the model isn’t reasoning, the intention comes from the prompter, and the intention has come from the person who programmed it to do it.
Lex Fridman
(02:56:08)
But don’t you think you can prompt it to have intention? Basically start with the initial conditions and get it going? Currently, large language models like ChatGPT only talk to you when you talk to them. There’s no reason why you can’t just start one talking.
Lee Cronin
(02:56:31)
But those initial conditions came from someone starting it.
Lex Fridman
(02:56:35)
Yes.
Lee Cronin
(02:56:35)
And that causal chain is in there, so that intention comes from the outside. I think there is something in that causal chain of intention that’s super important. I don’t disagree that we’re going to get to AGI; it’s a matter of when and what hardware. I think we’re not going to do it in this hardware, and I think we’re unnecessarily fetishizing really cool outputs and dopamine hits, because obviously that’s what people want to sell us.
Lex Fridman
(02:56:57)
Well, AGI is a loaded term, but there could be incredibly impressive intelligence systems on the way to AGI. So these large language models, I mean, if one appears conscious, if it appears superintelligent, who are we to say it’s not?
Lee Cronin
(02:57:21)
I agree, but the superintelligence I want, I want to be able to have a discussion with it about coming up with fundamental new ideas that generate knowledge. And if the superintelligence we generate can mine novelty from the future that I didn’t see in its training set in the past, I would agree that something really interesting is going on. I’ll say that again. If the intelligent system, be it a human being, a chatbot, or something else, is able to produce something truly novel that I could not predict, even having a full audit trail from the past, then I’ll be sold.
Lex Fridman
(02:57:58)
Well, so we should be clear that it can currently produce things that are in a shallow sense novel. That are not in the training set. But you’re saying truly novel.
Lee Cronin
(02:58:11)
I think they are in the training set. I think everything it produces comes from the training set. There’s a difference between novelty and interpolation. We do not understand where these leaps come from yet; that is what intelligence is, I would argue, those leaps. And some people say no, it’s actually just what will happen if you do cross-domain training and all that stuff. And that may be true, and I may be completely wrong. But right now the human mind is able to mine novelty in a way that artificial intelligence systems cannot, and this is why we all still have jobs and we’re still doing stuff. I used ChatGPT for a few weeks: oh, this is cool. And then what happened is it took me too much time to correct it. Then it got really good. And now they’ve done something to it; it’s not actually that good.
Lex Fridman
(02:58:58)
Yeah, right.
Lee Cronin
(02:58:59)
I don’t know what’s going on.
Lex Fridman
(02:59:00)
Censorship. Yeah, I mean that’s interesting. But it will push us humans to characterize novelty better. Characterize the novel, what is novel, what is truly novel, what’s the difference between novelty and interpolation.
Lee Cronin
(02:59:10)
I think this is the thing that makes me most excited about these technologies: they’re going to help me demonstrate to you that time is fundamental, and that the future is bigger than the present. Which is why human beings are quite good at generating novelty, because we have to expand our dataset to cope with unexpected things in our environment. Our environment throws them all at us, and we have to survive in that environment. And I mean, I never say never. I would be very interested in how we can get cross-domain training cheaply in chemical systems, because I’m a chemist, and the only similar thing I know of is a human brain. But maybe that’s just me being boring and predictable and not novel.

GPT for electron density

Lex Fridman
(02:59:54)
Yeah. You mentioned GPT for electron density, so a GPT-like system for generating molecules that can bind to a host automatically. I mean, that’s interesting. It’s really interesting, applying this same kind of transformer mechanism.
Lee Cronin
(03:00:11)
I mean, with my team, I try to do things that are non-obvious, but non-obvious in certain areas. One of the things I was always asking about in chemistry: people like to represent molecules as graphs, and it’s quite difficult. It’s really hard if you’re doing AI in chemistry; you really want to have good representations so you can generate new molecules that are interesting. And I was thinking, well, molecules aren’t really graphs, and those representations aren’t continuously differentiable. Could I do something that was continuously differentiable? I was like, well, molecules are actually made up of electron density. So I got thinking and said, well, okay, could there be a way where we could basically take a database of readily solved electron densities for millions of molecules? So we took the electron density for millions of molecules and just trained the model to learn what electron density is.

(03:01:06)
And so what we built was a system that you could literally give, let’s say, a protein that has a particular active site, or a cup with a certain hole in it. You pour noise into it, and with a GPT you turn the noise into electron density. And in this case it hallucinates, like all of them do, but here the hallucinations are good, because it means I don’t have to train on such a huge dataset; these datasets are very expensive to produce. So go back a step. You’ve got all these molecules in this dataset, and what you’ve literally done is a quantum mechanical calculation to produce the electron densities for each molecule. So you say, oh, this representation of this molecule has these electron densities associated with it, so you know what the representation is, and you train the neural network to know what electron density is.

(03:01:54)
So then you give it an unknown pocket. You pour in noise and you say, right, produce me electron density, and it produces electron density that doesn’t look ridiculous. And what we did in this case is produce electron density that maximizes the electrostatic potential, so the stickiness, but minimizes what we call the steric hindrance, so the overlaps, which are repulsive, to make the perfect fit. And then we use kind of like a ChatGPT-type thing to turn that electron density into what’s called a SMILES. A SMILES string is a way of representing a molecule in letters. And then we can then-
Lex Fridman
(03:02:32)
So it just generates them then.
Lee Cronin
(03:02:34)
Just generates them. And then the other thing is then we bung that into the computer and then it just makes it.
Lex Fridman
(03:02:39)
Yeah, the computer being the thing that right… To generate-
Lee Cronin
(03:02:40)
The robot we’ve got that can basically just do chemistry. So we’ve kind of got this end-to-end drug discovery machine where you can say, “Oh, you want to bind to this active site, here you go.” I mean it is a bit leaky and things kind of break, but it is the proof of principle.
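(As a rough sketch of the pipeline being described here, not the group’s actual code: quantum-chemistry electron densities are the training data, a generative model denoises noise poured into the pocket, the result is scored by electrostatic complementarity against steric clash, and a sequence model decodes the density to a SMILES string for the synthesis robot. Every function below is a hypothetical placeholder.)

```python
# Conceptual sketch only; stub functions stand in for the trained models and
# the robot, and all names are hypothetical.
import numpy as np

def sample_density_for_pocket(pocket_occupancy, steps=100, seed=0):
    """Start from noise and iteratively 'denoise' it inside the empty region.
    In the real system this would be the trained generative model."""
    rng = np.random.default_rng(seed)
    density = rng.random(pocket_occupancy.shape)
    for _ in range(steps):
        density = 0.5 * density + 0.5 * np.roll(density, 1, axis=0)  # crude smoothing
        density *= (pocket_occupancy == 0)  # keep the guest out of the host walls
    return density

def score(density, pocket_potential, pocket_occupancy, clash_weight=10.0):
    """Reward complementarity with the pocket's potential, penalize overlap."""
    stickiness = -(density * pocket_potential).sum()
    clash = (density * pocket_occupancy).sum()
    return stickiness - clash_weight * clash

def density_to_smiles(density):
    """Placeholder for the decoder that turns a density into a SMILES string;
    'CC(=O)OC1=CC=CC=C1C(=O)O' is aspirin, for example."""
    return "CC(=O)OC1=CC=CC=C1C(=O)O"

# Toy host: a cube of material with a hole (the binding pocket) in the middle.
pocket_occupancy = np.ones((16, 16, 16))
pocket_occupancy[4:12, 4:12, 4:12] = 0.0
pocket_potential = np.where(pocket_occupancy == 0, -1.0, 0.0)

guest_density = sample_density_for_pocket(pocket_occupancy)
print("fit score:", score(guest_density, pocket_potential, pocket_occupancy))
print("candidate:", density_to_smiles(guest_density))  # would be sent to the robot
```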
Lex Fridman
(03:02:56)
But were the hallucinations, are those still accurate?
Lee Cronin
(03:03:01)
Well, the hallucinations are really great in this case. In the case of a large language model, the hallucinations just make everything up; well, it doesn’t just make everything up, it gives you an output that you are plausibly comfortable with, what it thinks you’re after probabilistically. The problem with these electron density models is that it’s very expensive to solve the Schrödinger equation going up to many heavy atoms and large molecules. And so we wondered, if we trained the system on up to nine heavy atoms, whether it would go beyond nine, and it did. It started to generate molecules with 12, no problem, and they look pretty good. And I was like, well, this hallucination I will take for free. Thank you very much.

(03:03:42)
Because it just basically… This is a case where interpolation and extrapolation worked relatively well, and we were able to generate really good molecules. And then what we were able to do here, and this is a really good point and what I was trying to say earlier, is that we were able to generate new molecules, beyond the known set, that would bind to the host. So a new guest would bind. Were these truly novel? Not really, because they were constrained by the host. Were they new to us? Yes. So I do, well, understand… I can concede that machine learning systems, artificial intelligence systems, can generate new entities, but how novel are they? That remains to be seen.
Lex Fridman
(03:04:32)
And how novel the things that humans generate is also difficult to quantify. They seem novel.
Lee Cronin
(03:04:40)
That’s what a lot of people say. So the way to really get to genuine novelty, and assembly theory shows you the way, is to have different causal chains overlap. And this really, really resonates with the time-is-fundamental argument. If you are bringing together a couple of objects with different initial conditions, when they interact, the more different their histories, the more novelty they generate in time going forward. So it could be that genuine novelty is basically about mixing it up a little, and the human brain is able to mix it up a little, and all that stimulus comes from the environment. But all I think I’m saying is that the universe is deterministic going back in time and non-deterministic going forward in time, because the universe is too big in the future to contain in the present. Therefore these collisions of known things generate unknown things, which then become part of your dataset and don’t appear weird. That’s how we give ourselves comfort: the past looks consistent with this initial-condition hypothesis, but actually we’re generating more and more novelty. And that’s how it works. Simple.
Lex Fridman
(03:05:58)
So it’s hard to quantify novelty looking backwards. I mean, the present and the future are the novelty generators.
Lee Cronin
(03:06:05)
But I like this whole idea of mining novelty. I think it is going to reveal the limitations of current AI. It’s a bit like the printing press: everyone thought, when the printing press came, that writing books was going to be terrible, that you had evil spirits and all this. They were just books.
Lex Fridman
(03:06:26)
And same with AI. But I think just the scale you can achieve in terms of impact with AI systems is pretty nerve wracking.
Lee Cronin
(03:06:35)
But that’s what the big companies want you to think.
Lex Fridman
(03:06:39)
But not in terms of destroy all humans. But you can have major consequences in the way social media has had major consequences, both positive and negative. And so you have to think about it and worry about it. But yeah, people that fear monger…
Lee Cronin
(03:06:55)
My pet theory for this, you want to know?
Lex Fridman
(03:06:58)
Yeah.
Lee Cronin
(03:06:59)
It’s this. I think that a lot… and maybe I’m being… I really do respect a lot of the people out there who are trying to have discourse about the positive future, the OpenAI guys, the Meta guys, and all this. But what I wonder is whether they’re trying to cover up for the fact that social media has had a pretty disastrous effect at some level, and they’re just trying to say, “Oh yeah, we should do this,” covering up for the fact that we have got some problems with teenagers, and Instagram, and Snapchat, and all this stuff, and maybe they’re just overreacting now. It’s like, “Oh yeah, sorry, we made the bubonic plague and gave it to you all and you’re all dying. Oh yeah, but look at this over here, it’s even worse.”
Lex Fridman
(03:07:40)
Yeah, there’s a little bit of that. But there’s also not enough celebration of the positive impact that all of these technologies have had. We tend to focus on the negative and tend to forget that. In part because it’s hard to measure. It is very hard to measure the positive impact social media had on the world.
Lee Cronin
(03:07:58)
Yeah, I agree. But what I worry about right now… I do care about the ethics of what we’re doing. One of the reasons why I’m so open about the things we’re trying to do in the lab, making life, looking at intelligence, all this, is so people ask, what are the consequences of this? And you say, what are the consequences of not doing it? And I think what worries me right now, in the present, is the lack of authenticated users and authenticated data and-
Lex Fridman
(03:08:25)
Human users.
Lee Cronin
(03:08:26)
Yeah, human.
Lex Fridman
(03:08:28)
I still think that there will be AI agents that appear to be conscious, but they would have to be also authenticated and labeled as such. There’s too much value in that. Like friendships with AI systems. There’s too much meaningful human experiences to have with the AI systems that I just…
Lee Cronin
(03:08:48)
But that’s like a tool, right? It’s a bit like a meditation tool, right?
Lex Fridman
(03:08:50)
Sure.
Lee Cronin
(03:08:50)
Some people have a meditation tool, it makes them feel better. But I’m not sure you can ascribe sentience and legal rights to a chatbot that makes you feel less lonely.
Lex Fridman
(03:09:00)
Sentience, yes. I think legal rights, no. I think it’s the same. You can have a really deep, meaningful relationship with a dog.
Lee Cronin
(03:09:08)
Well the dog is sentient.
Lex Fridman
(03:09:10)
Yes.
Lee Cronin
(03:09:12)
The chatbots right now, using the technology we use, it’s not going to be sentient.

God

Lex Fridman
(03:09:16)
This is going to be a fun continued conversation on Twitter that I look forward to. Since you’ve also had, from another place, some debates that were inspired by the assembly theory paper, let me ask you about God. Is there any room for notions of God in assembly theory?
Lee Cronin
(03:09:42)
Yeah. I don’t know what God is… I mean, God exists in our minds, created by selection. Human beings have created the concept of God in the same way that human beings have created the concept of superintelligence.
Lex Fridman
(03:09:57)
Sure, but does it mean, does it not… It still could mean that that’s a projection from the real world where we’re just assigning words and concepts to a thing that is fundamental to the real world. That there is something out there that is a creative force underlying the universe.
Lee Cronin
(03:10:22)
I think the universe… There is a creative force in the universe, but I don’t think it’s sentient. So I do not understand the universe. So who am I to say that God doesn’t exist? I am an atheist, but I’m not an angry atheist. There’s some people I know that are angry atheists and say-
Lex Fridman
(03:10:49)
Cranky.
Lee Cronin
(03:10:50)
Say that religious people are stupid. I don’t think that’s the case. I have faith in some things. I mean, when I was a kid I was like, I need to know what the charge of the electron is. And I was like, I can’t measure the charge on the electron. I just gave up and had faith. Okay, you know, resistors worked. So when it comes to… I want to know why the universe is growing in the future and what humanity is going to become. And I’ve seen that the acquisition of knowledge via the generation of novelty to produce technology has uniformly made humans’ lives better. I would love to continue that tradition.
Lex Fridman
(03:11:31)
You said that there’s that creative force. Do you think, just to think on that point, do you think there’s a creative force? Is there like a thing, like a driver that’s creating stuff?
Lee Cronin
(03:11:45)
Yeah, so I think that…
Lex Fridman
(03:11:48)
And where? What is it? Can you describe it mathematically?
Lee Cronin
(03:11:51)
Well, I think selection. I think selection.
Lex Fridman
(03:11:53)
Selection is the force.
Lee Cronin
(03:11:54)
Selection is the force in the universe. It creates novelty.
Lex Fridman
(03:11:58)
So is selection somehow fundamental? Like what…
Lee Cronin
(03:12:03)
Yeah, I think it’s the persistence of objects that could decay into nothing, through operations that maintain that structure. I mean, think about it. It’s amazing that things exist at all, that we’re not just a big combinatorial mess.
Lex Fridman
(03:12:17)
Yes.
Lee Cronin
(03:12:18)
So the fact that-
Lex Fridman
(03:12:21)
And exist. A thing that exists persists in time.
Lee Cronin
(03:12:23)
Yeah. Let’s think, maybe the universe is actually in the present. The things… Everything that can exist in the present does exist.
Lex Fridman
(03:12:39)
Well that would mean it’s deterministic, right?
Lee Cronin
(03:12:42)
I think the universe might be. So the universe started super small. The past was deterministic; there wasn’t much going on. And it was able to mine, mine, mine, mine, mine. And so the process is somehow generating the universe, basically… I’m trying to put this into words.
Lex Fridman
(03:13:02)
Did you just say there’s no free will though?
Lee Cronin
(03:13:04)
No, I didn’t say that.
Lex Fridman
(03:13:05)
As if-
Lee Cronin
(03:13:06)
Sorry, sorry, sorry.
Lex Fridman
(03:13:06)
-it can exist.
Lee Cronin
(03:13:07)
I said there is free will. I’m saying that free will occurs at the boundary between the-
Lex Fridman
(03:13:17)
The past and the future?
Lee Cronin
(03:13:19)
The past and the future.
Lex Fridman
(03:13:20)
Yeah, I got you. But everything that can exist does exist.
Lee Cronin
(03:13:25)
So everything that’s possible to exist at this… So no, I’m really pulling this…
Lex Fridman
(03:13:30)
There’s a lot of loaded words there. There’s a time element loaded into that statement.
Lee Cronin
(03:13:36)
I think that the universe is able to do what it can in the present, right?
Lex Fridman
(03:13:40)
Yeah.
Lee Cronin
(03:13:40)
And then I think in the future there are other things that could be possible. We can imagine lots of things, but they don’t all happen.
Lex Fridman
(03:13:45)
Sure.
Lee Cronin
(03:13:46)
So what-
Lex Fridman
(03:13:46)
So that’s where-
Lee Cronin
(03:13:47)
So that’s what I guess I’m getting to.
Lex Fridman
(03:13:49)
-you sneak in free will right there.
Lee Cronin
(03:13:50)
Yeah. So I guess what I’m saying is what exists is a convolution of the past with the present, and the free will going into the future.
Lex Fridman
(03:14:00)
Well, we could still imagine stuff. Right? We can imagine stuff that will never happen.
Lee Cronin
(03:14:04)
And it’s an amazing force, because this is the most important thing that we don’t understand: our imaginations can actually change the future in a tangible way, which is something the initial conditions and physics cannot predict. Your imagination has a causal consequence in the future.
Lex Fridman
(03:14:25)
Isn’t that weird to you?
Lee Cronin
(03:14:26)
Yeah. It breaks the laws of physics as we know them right now.
Lex Fridman
(03:14:37)
So you think the imagination has a causal effect on the future?
Lee Cronin
(03:14:41)
Yeah.
Lex Fridman
(03:14:42)
But it does exist in there in the head.
Lee Cronin
(03:14:45)
It does, but-
Lex Fridman
(03:14:45)
There must be a lot of power in whatever’s going on in there.
Lee Cronin
(03:14:50)
If we go back to the initial-conditions picture, that is simply not possible; it can’t happen. But if we go into a universe where we accept that there is a finite ability to represent numbers, and you have rounding… well, not rounding errors, you have some… What happens is that your ability to make decisions, imagine, and do stuff sits at that interface between the certain and the uncertain. It’s not, as Yashar was saying to me, “Randomness goes and you just randomly do random stuff.” It is that you are set free a little on your trajectory. Free will is about being able to explore on this narrow trajectory, which allows you to build… You have a choice about what you build, and that choice is you interacting with the future in the present.
Lex Fridman
(03:15:39)
What to you is most beautiful about this whole thing? The universe?
Lee Cronin
(03:15:46)
The fact it seems to be very undecided, very open. The fact that every time I think I’m getting towards an answer to a question, there are so many more questions that make the chase.
Lex Fridman
(03:16:03)
Do you hate that it’s going to be over at some point for you?
Lee Cronin
(03:16:06)
No. Well for me. I think if you think about it, is it over for Newton now? Newton has had causal consequences in the future. We discuss him all the time,
Lex Fridman
(03:16:18)
His ideas, but not the person.
Lee Cronin
(03:16:21)
The person just had a lot of causal power when he was alive. But oh my God, one of the things I want to do is leave as many Easter eggs as I can in the future for when I’m gone, so people go, “Oh, that’s cool.”
Lex Fridman
(03:16:30)
Would you be very upset if somebody made a good large language model that’s fine tuned to Lee Cronin?
Lee Cronin
(03:16:37)
It would be quite boring. Because I mean, I…
Lex Fridman
(03:16:40)
No novelty generation?
Lee Cronin
(03:16:42)
I mean if it’s a faithful representation of what I’ve done in my life, that’s great. That’s an interesting artifact. But I think the most interesting thing about knowing each other is we don’t know what we’re going to do next.
Lex Fridman
(03:16:54)
Sure. Sure.
Lee Cronin
(03:16:57)
I mean within some constraints I’ve got, I can predict some things about you. You can predict some things about me. But we can’t predict everything.
Lex Fridman
(03:17:04)
Everything.
Lee Cronin
(03:17:05)
And it’s because we can’t predict everything that it’s exciting to come back and discuss and see. So yeah, it’ll be interesting that some things I’ve done can be captured, but I’m pretty sure that my angle on mining novelty for the future will not be captured.
Lex Fridman
(03:17:28)
Yeah. Yeah. So that’s what life is, just some novelty generation, and then you’re done. Each one of us generates a little bit, or has the capacity to, at least.
Lee Cronin
(03:17:43)
I think selection produces life, and life affects the universe. Universes with life in them are materially, physically, fundamentally different than universes without life. And that’s super interesting, and I have not even the beginnings of an understanding of it. I think maybe in a thousand years there’ll be a new discipline, and the humans will be like, “Yeah, of course. This is how it all works.” Right?
Lex Fridman
(03:18:10)
In retrospect, it’ll all be obvious I think.
Lee Cronin
(03:18:13)
I think assembly theory is obvious, that’s why a lot of people got angry. They were like, “Oh my God, this is such nonsense.” And like, “Oh, actually it’s not quite.” But the writing’s really bad.
Lex Fridman
(03:18:25)
Well, I can’t wait to see where it evolves, Lee. And I am glad I get to exist in this universe with you. You’re a fascinating human. This is always a pleasure. I hope to talk to you many more times. And I’m a huge fan of just watching you create stuff in this world. And thank you for talking today.
Lee Cronin
(03:18:44)
It’s a pleasure as always, Lex. Thanks for having me on.
Lex Fridman
(03:18:47)
Thanks for listening to this conversation with Lee Cronin. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Carl Sagan. We can judge our progress by the courage of our questions, and the depth of our answers. Our willingness to embrace what is true rather than what feels good. Thank you for listening. And hope to see you next time.

Transcript for Lisa Randall: Dark Matter, Theoretical Physics, and Extinction Events | Lex Fridman Podcast #403

This is a transcript of Lex Fridman Podcast #403 with Lisa Randall.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with Lisa Randall, a theoretical physicist and cosmologist at Harvard. Her work involves improving our understanding of particle physics, supersymmetry, baryogenesis, cosmological inflation, and dark matter.

(00:00:15)
This is the Lex Fridman podcast. To support it, please check out our sponsors in the description. Now, dear friends, here’s Lisa Randall.

Dark matter


(00:00:24)
One of the things you work on and write about is dark matter. We can’t see it, but there’s a lot of it in the universe. You also end one of your books with a Beatles song quote, “Got to be good-looking because he’s so hard to see.” What is dark matter? How should we think about it given that we can’t see it? How should we visualize it in our mind’s eye?
Lisa Randall
(00:00:47)
I think one of the really important things that physics teaches you is just our limitations, but also our abilities. The fact that we can deduce the existence of something that we don’t directly see is really a tribute to people that we can do that. It’s also something that tells you, you can’t overly rely on your direct senses. If you just relied on just what you see directly, you would miss so much of what’s happening in the world.

(00:01:15)
We can generalize this, but just for now to focus on dark matter, it’s something we know is there, and it’s not just one way we know it’s there. In my book, Dark Matter and the Dinosaurs, I talk about the many different ways. There are eight or nine ways that we deduce not just the existence of dark matter, but how much is there, and they all agree.

(00:01:36)
Now, how do we know it’s there? Because of its gravitational force. Individually, a particle doesn’t have such a big gravitational force. In fact, gravity is an extremely weak force compared to other forces we know about in nature, but there’s a lot of dark matter out there. It carries a lot of energy, five times the amount of energy as the matter we know that’s in atoms, et cetera.

(00:02:00)
You can ask, how should we think about it? It’s just another form of matter that doesn’t interact with light, or at least as far as we know. It interacts gravitationally, it clumps, it forms galaxies, but it doesn’t interact with light, which means we just don’t see it. Most of our detection, before gravitational wave detectors, we only saw things because of their interactions with light in some sense.
Lex Fridman
(00:02:25)
In theory, it behaves just like any other matter, it just doesn’t interact with light.
Lisa Randall
(00:02:30)
When we say it interacts just like any other form of matter, we have to be careful because gravitationally, it interacts like other forms of matter, but it doesn’t experience electromagnetism, which is why it has a different distribution.

(00:02:44)
In our galaxy, it’s roughly spherical unless it has its own interactions, that’s another story. We know that it’s roughly spherical, whereas ordinary matter can radiate and clumps into a disk. That’s why we see the Milky Way disk. On large scales, in some sense, yes, all the matter is similar in some sense.

(00:03:06)
In fact, dark matter is in some sense more important because it can collapse more readily than ordinary matter because ordinary matter has radiative forces, which makes it hard to collapse on small scales. Actually it’s dark matter that drives galaxy formation and then ordinary matter comes along with it.

(00:03:30)
There’s also just more of it, and because there’s more of it, it can start collapsing sooner. That is to say, the energy density in dark matter dominates over radiation earlier than it would if you just had ordinary matter.
Lex Fridman
(00:03:43)
It’s part of the story of the origin of the galaxy, part of the story of the end of the galaxy, and part of the story of all the various interactions throughout.
Lisa Randall
(00:03:50)
Exactly. In my book, I make jokes about, it’s like when we think about a building, we think about the architect, we think about the high level, but we forget about all the workers that did all the grunt work. In fact, dark matter was really important in the formation of our universe, and we forget that sometimes.
Lex Fridman
(00:04:07)
That’s a metaphor on top of a metaphor. Okay. The unheard voices that do the actual work.
Lisa Randall
(00:04:16)
Exactly. No, but it is a metaphor, but it also captures something because the fact is we don’t directly see it, so we forget it’s there or we don’t understand it’s there, or we think it’s not. The fact that we don’t see it makes it no less legitimate, it just means that we have challenges in order to find out exactly what it is.
Lex Fridman
(00:04:35)
Yeah, but the fact that things we cannot see nevertheless have a gravitational interaction with the things we can see, at the layman level, it’s just mind-blowing.
Lisa Randall
(00:04:49)
It is and it isn’t, because I think what it’s teaching us is that we’re human, the universe is what it is, and we’re trying to interact with that universe and discover what it is. We’ve discovered amazing things.

(00:05:03)
In fact, I would say it’s more surprising that the matter that we know about constitutes as big a fraction of the universe as it does. We’re limited, we’re human. The fact that we see 5% of the energy density of the universe, about one sixth of the energy density in matter, that’s remarkable. Why should that be? Anything could be out there, yet the universe that we see is a significant fraction.
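(For reference, the rough arithmetic behind the “5%,” “five times,” and “one sixth” figures, using approximate rounded values from recent cosmological measurements; these specific percentages are a standard approximation, not quoted by Randall here.)

```latex
% Approximate cosmic energy budget (rounded):
%   ordinary matter ~ 5%, dark matter ~ 27%, dark energy ~ 68%
\frac{\Omega_{\rm dark\ matter}}{\Omega_{\rm ordinary}} \approx \frac{0.27}{0.05} \approx 5
\quad \text{(dark matter carries about five times the energy of ordinary matter)}

\frac{\Omega_{\rm ordinary}}{\Omega_{\rm ordinary} + \Omega_{\rm dark\ matter}}
\approx \frac{0.05}{0.32} \approx \frac{1}{6}
\quad \text{(ordinary matter is about one sixth of all matter)}
```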
Lex Fridman
(00:05:30)
Yeah, but a lot of our intuition, I think operates using visualizations in the mind.
Lisa Randall
(00:05:36)
That’s absolutely true. Certainly writing books, I realized also how many of our words are based on how we see the world, and that’s true. That’s actually one of the fantastic things about physics is that it teaches you how to go beyond your immediate intuition to develop intuitions that apply at different distances, different scales, different ways of thinking about things.
Lex Fridman
(00:05:57)
Yeah. How do you anthropomorphize dark matter?
Lisa Randall
(00:06:01)
I just did, I think. I made it the grunt workers.
Lex Fridman
(00:06:04)
Oh yeah, that’s good. You did. That’s why you get paid the big bucks and write the great books. Okay, you also write in that book about dark matter, having to do something with the extinction events, the extinction of the dinosaurs, which is a fascinating presentation of how everything is connected.

(00:06:28)
I guess the disturbances from the dark matter, they create gravitational disturbances in the Oort Cloud at the edge of our solar system, and then that increases the rate of asteroids hitting earth.
Lisa Randall
(00:06:42)
I want to be really clear, this was a speculative theory.
Lex Fridman
(00:06:44)
I love it, though.
Lisa Randall
(00:06:48)
I liked it too. We still don’t know for sure, but what we liked about it… Let me take a step back. We usually assume that dark matter, we being physicists, that it’s just one thing. It’s just basically non-interacting aside from gravity or very weakly interacting matter.

(00:07:11)
Again, we have to get outside this mindset of just humans and ask what else could be there. What we suggested is that there’s a fraction of dark matter, not all the dark matter, but some of the dark matter, maybe it has interactions of its own just the same way in our universe, we have lots of different types of matter. We have nuclei, we have electrons, we have neutrons, we have forces.

(00:07:35)
It’s not a simple model, the standard model, but it does have some basic ingredients, so maybe dark matter also has some interesting structure to it. Maybe there’s some small fraction. The interesting thing is that if some of the dark matter does radiate, and I like to call it dark light because it’s light that we don’t see, but dark matter would see. It could radiate that and then it could perhaps collapse into a disk the same way ordinary matter collapsed into the Milky Way disk.

(00:08:06)
It’s not all the dark matter, it’s a fraction, but it could conceivably be a very thin disk of dark matter, thin, dense disk of dark matter. The question is do these exist? People have done studies now to think about whether they can find them. It’s an interesting target, it’s something you can measure. By measuring the positions and velocities of stars, you can find out what the structure of the Milky Way is, but the fun proposal was that the solar system orbits around the galaxy.

(00:08:36)
As it does so, it goes a little bit up and down, kind of like horses on a carousel. The suggestion was that every time it goes through, you have an enhanced probability of dislodging something from the edge of the solar system, in something called the Oort Cloud. The idea was that at those times, you’re more likely to have these cataclysmic events, such as the amazing one that actually caused the last extinction that we know of for sure.
Lex Fridman
(00:09:01)
It wasn’t so amazing for the dinosaurs.
Lisa Randall
(00:09:04)
Or for two thirds of the species on the planet.
Lex Fridman
(00:09:06)
But it gets amazing for humans. It wouldn’t be-
Lisa Randall
(00:09:08)
What really is amazing… I talk about this in Dark Matter and the Dinosaurs. It is just an amazing scientific story because it really is one of the real stories that combine together different fields of science. Geologists at the time or people thought that things happen slowly and this would be a cataclysmic event.

(00:09:27)
Also, I have to say if you think about it, it sounds like a story a five-year-old would make up. Maybe the dinosaurs were killed by some big rock that came and hit the earth, but then there really was a scientific story behind it. That’s also why I like the dark disk because there’s a scientific story behind it. As far-fetched as it might sound, you could actually go and look for the experimental consequences, for the observational consequences to test whether it’s true.
Lex Fridman
(00:09:51)
I wish you could know high-resolution details of where that asteroid came from, where in the Oort Cloud, why it happened, is it in fact because of dark matter? Just the full tracing back to the origin of the universe because humans seem to be somewhat special. It seems like so many fascinating events at all scales of physics had to happen for [inaudible 00:10:17].
Lisa Randall
(00:10:16)
I’m really, really glad you mentioned that because actually that was one of the main points of my book, Dark Matter and the Dinosaurs. One of the reasons I wrote it was because I really think we are abusing the planet, we’re changing the planet way too quickly. Just like anything else, when you alter things, it’s good to think about the history of what it took to get here.

(00:10:34)
As you point out, it took many operations on many different scales. We had to have the formation of structure, the formation of galaxies, the formation of the solar system, the formation of our planet, the formation of humans. There’s so many steps that go into this. Humans in some part were the result of the fact that this big object hit the earth, made the dinosaurs go extinct, and mammals developed. It is an incredible story and yes, something else might come of it, but it won’t be us if we mess with it too much.
Lex Fridman
(00:11:05)
But it is on a grand scale, earth is a pretty resilient system. Can you just clarify, just fascinating, the shape of things. The shape of the Milky Way’s… Of the observable stuff is mostly flat. You said dark matter tends to be spherical, but a subset of that might be a flat disk.
Lisa Randall
(00:11:31)
You want to hear about the shape of things.
Lex Fridman
(00:11:34)
Yes, please.
Lisa Randall
(00:11:36)
Structure formed early on, and now our structure that we live in is… We know about the Milky Way galaxy. The Milky Way galaxy has the disk you can see in a dry dark place, that’s where stars and light is, but you can also measure in some ways the dark matter. We believe that dark matter is more or less spherically distributed. Like we said, there’s a lot of it, not necessarily in the disk, but just because it’s a sphere, there’s a lot of it sitting there.

(00:12:11)
The reason it doesn’t collapse, as far as we know, is that it can’t radiate the same way. Because it can radiate, ordinary matter collapses, and then, because of conservation of angular momentum, it stays a disk and doesn’t just collapse to the center. Our suggestion was that maybe there are some components of dark matter that also radiate.

(00:12:31)
Like I said, that’s far from proven. People have looked for a disk, they see some evidence of some disks of certain densities, but these are all questions that are worth asking. Basically if we can figure it out from existing measurements, why not try?
Lex Fridman
(00:12:44)
Okay, so not all dark matter is made the same.
Lisa Randall
(00:12:48)
That’s a possibility. We actually don’t know what dark matter is in the first place; we don’t know what most of it is, we don’t know what a fraction of it is. It’s hard to measure. Why is it hard to measure? For exactly the reason you said earlier: we don’t see it. We want to think of possibilities for what it can be, especially if those give rise to some observational consequences. It’s a tough game because it’s not something that’s just there for the taking. You have to think about what it could be and how you might find it.
Lex Fridman
(00:13:16)
The way you detect it is gravitational effects on things we can see.
Lisa Randall
(00:13:22)
That would be the way you detect the type of dark matter I’ve been talking about. People have suggestions for other forms of dark matter. They could be particles called axions, they could be other types of particles, and then there are different ways of detecting them.

(00:13:34)
The most popular candidate for dark matter, probably until pretty recently, because they haven’t found it, is something called WIMPs, Weakly Interacting Massive Particles: particles that have a mass about the same as the Higgs boson mass, and it turns out you would then get about the right density of dark matter.

(00:13:52)
People really like that, of course, because it is connected to the standard model, the particles that we know about, and if it’s connected to that, we have a better chance of actually seeing it. Fortunately or unfortunately, there’s also a better chance that you can rule it out, because you can look for it. So far, no one has found it. We’re still looking for it.
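(For context, the back-of-the-envelope “WIMP miracle” estimate behind that statement is the standard thermal relic-abundance relation; this is a textbook approximation, not something Randall states here.)

```latex
\Omega_\chi h^2 \;\approx\; \frac{3 \times 10^{-27}\ {\rm cm^3\,s^{-1}}}{\langle \sigma v \rangle},
\qquad
\langle \sigma v \rangle \sim 3 \times 10^{-26}\ {\rm cm^3\,s^{-1}}
\;\;\Rightarrow\;\;
\Omega_\chi h^2 \sim 0.1
```

That is close to the measured dark-matter density of roughly 0.12, which is why a particle with a weak-scale (roughly Higgs-mass) annihilation cross section looked so natural.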
Lex Fridman
(00:14:08)
Is that one of the hopes of the Large Hadron Collider?
Lisa Randall
(00:14:11)
That was originally one of the hopes of Large Hadron Collider. I’d say at this point, it would be very unlikely given what they’ve already accomplished, but there are these underground detectors, xenon detectors that look for dark matter coming in, and they are going to try to achieve a much stronger bound than exists today.
Lex Fridman
(00:14:35)
Just to take that tangent, looking back now, what’s the biggest, to you, insight to humanity that the LHC has been able to provide?
Lisa Randall
(00:14:47)
It’s interesting. It’s both a major victory. The Higgs boson was proposed 50 years ago, and it was discovered. The Higgs mechanism seemed to be the only way to explain elementary particle masses and it was right so on the one hand, it was a major victory. On the other hand, I’ve been in physics long enough to know it was also a cautionary tale in some sense because at the time I started out in physics, we had proposed something in the United States called the Superconducting Supercollider.

(00:15:15)
A lot of physicists, I’ll say particularly in Europe, but a lot of physicists, were saying that the Large Hadron Collider would have the energy reach necessary to discover what underlies the standard model. We don’t want to just discover the standard model, we want to know what the next step is.

(00:15:31)
I think here people were more cautious about that. They want to have a more comprehensive search that could get to higher energies, more events so that we could really more definitively rule it out. In that case, many people thought they knew what would be there. It happened to be a theory called supersymmetry. A lot of physicists thought it would be supersymmetry.

(00:15:51)
It’s one of the many factors, I think, that went into the fact that the Large Hadron Collider became the only machine in town. The Superconducting Supercollider, if it had really achieved what it was supposed to, would’ve been a much more robust test of the space.

(00:16:07)
I’d say for humanity, it’s both a tribute to the ability of discovery and the ability of really believing in things so that you have the confidence to go look for them, but it’s also a cautionary tale that you don’t want to assume things before they’ve been actually found. You want to believe in your theories, but you also want to question them at the same time in ways that you’re more likely to discover the truth.
Lex Fridman
(00:16:32)
It’s also an illustration of grand engineering efforts that humanity can take on and maybe a lesson that you could go even bigger.
Lisa Randall
(00:16:43)
I’m really glad you said that though too, because that’s absolutely true. It really is an impressive… It’s impressive in so many ways. It’s impressive technologically, it’s impressive at engineering level.

(00:16:55)
It’s also impressive that so many countries work together to do this. It wasn’t just one country. It was also impressive in that it was a long-term project that people committed to and made it happen. It is a demonstration that when people set their minds to things and they commit to it, that they can do something amazing.
Lex Fridman
(00:17:18)
Also in the United States, maybe a lesson that bureaucracy can slow things down to [inaudible 00:17:24].
Lisa Randall
(00:17:24)
Bureaucracy and politics.
Lex Fridman
(00:17:26)
Politics.
Lisa Randall
(00:17:27)
And economics. Many things can make them faster and make them slower.
Lex Fridman
(00:17:32)
Science is the way to make progress, politics is the way to slow that progress down. And here we are.
Lisa Randall
(00:17:39)
I don’t want to overstate that because without politics, the [inaudible 00:17:42] wouldn’t happen either.
Lex Fridman
(00:17:43)
You need broccoli.
Lisa Randall
(00:17:49)
Sometimes I do think… You’re not asking this question, but sometimes I do think when I think about some of these conflicts, sometimes it’s just good to have a project that people work on together. There were some efforts to do that in science too, to have Palestinians and Israelis work together, a project called Sesame. I think it’s not a bad idea when you can do that, when you can get… Forget the politics and just focus on some particular project. Sometimes that can work.
Lex Fridman
(00:18:25)
Some kind of forcing function, some kind of deadline that gets people to sit in a room together and you’re working on a thing. As part of that, you realize the common humanity, that you all have the same concerns, the same hopes, the same fears, that you are all human. That’s an accidental side effect of working together on a thing.
Lisa Randall
(00:18:45)
That’s absolutely true. It’s one of the reasons CERN was formed, actually. It was post-World War II, and a lot of European physicists had left Europe, and they wanted to see Europeans work together and rebuild, and it worked. They did. It’s true, I often think that one of the major problems is we just don’t meet enough people, so that everyone… when people seem like the other, it’s easier to forget their humanity. I think it is important to have these connections.

Extinction events

Lex Fridman
(00:19:16)
Given the complexity, all cosmological scales involved here that led to the extinction of the dinosaurs, when you look out at the future of earth, do you worry about future extinction events?
Lisa Randall
(00:19:29)
I do think that we might be in the middle of an extinction right now if you define it by the number of species that are getting killed off. It’s subtle, but it’s a complex system. The way things respond to events is sometimes things evolve, sometimes animals just move to another place. The way we’ve developed the earth, it’s very hard for species just to move somewhere else.

(00:19:54)
We’re seeing that with people now, too. I know people are worried just about AI taking over, and that’s a totally different story. We just don’t think about the future very much. We think about what we’re doing now, and we certainly don’t think enough about all the animals that we’re destroying, all the things that are precursors to humans that we rely on.
Lex Fridman
(00:20:14)
It’s interesting to think whether the things that threaten us is the stuff we see that’s happening gradually or the stuff we don’t really see that’s going to happen all of a sudden. I sometimes think about what should we be worried about? It seems like with the asteroids or nuclear war, it could be stuff that just happens one day. When I say one day meaning over a span of a few days or a few months, but not on a scale of decades and centuries. We sometimes mostly talk about stuff that’s happening gradually, but we can be really surprised.
Lisa Randall
(00:20:57)
It’s actually really interesting. That was actually one of the reasons it took a while to determine what it was that caused the last extinction, because at the time many people thought that things happen more gradually, and the idea of extinction was actually a novel concept at some point.

(00:21:14)
These aren’t predictable events necessarily. They’re only predictable on a grand scale, but sometimes they are. I think people were pretty aware that nuclear weapons were dangerous. I’m not sure people are as aware now as they were say, 20 or 30 years ago, and that certainly worries me. I have to say I was not as worried about AI as other people, but now I understand. It’s more that as soon as you create things that we lose control over, it’s scary.

(00:21:50)
The other thing that we’re learning from the events today is that it takes a few bad actors. It takes everyone to make things work well, it takes not that many things to make things go wrong. The issue with disease, we can find out what causes a disease, but to make things better is not necessarily that simple. Sometimes it is. But for things to be healthy, a lot of things have to work. For things to go wrong, only one thing has to go wrong. It’s amazing that we do it.

(00:22:19)
The same is true for democracy. For democracy to work, a lot of people have to believe in it. A few bad actors can destroy things sometimes. A lot of the things that we really rely on are delicate equilibrium situations. There is some robustness in the systems, we try to build in robustness, but a few extreme events can sometimes alter things. I think that’s what people are scared of today in many ways. They’re scared of it for democracy, they’re scared of it for peace, they’re scared of it for AI.

(00:22:51)
I think they’re not as scared as they should be about nuclear weapons, to be honest. I think that’s more serious danger than people realize. I think people are a little bit more scared about pandemics than they were before, but I still say they’re not super scared about it. So you’re right, there are these major events that can happen and we are setting things up so that they might happen, and we should be thinking about them. The question is who should be thinking about them? How should we be thinking about them? How do you make things happen on a global scale, because that’s really what we need.
Lex Fridman
(00:23:23)
It certainly shouldn’t be a source of division, it should be a source of grand collaboration probably.
Lisa Randall
(00:23:29)
Wouldn’t that be nice?
Lex Fridman
(00:23:30)
Yeah. I just wonder what it’d be like to be a dinosaur. It must have been beautiful to look at that asteroid enter the atmosphere. Until everything…. Man, that would be one of the things I would travel back in time to just to watch it.
Lisa Randall
(00:23:50)
That’s also one of the things that I think you probably could do with virtual reality. I don’t think you have to be there and get extinct.
Lex Fridman
(00:23:54)
To just experience it.
Lisa Randall
(00:23:55)
I think there’s something… It’s an event. You’re just watching. You’re not doing anything, you’re just looking at it, so maybe you could just recreate it.
Lex Fridman
(00:24:01)
I actually heard that there’s a nuclear weapon explosion experience in virtual reality that’s good to remind you about what it would feel like.
Lisa Randall
(00:24:14)
I have to say, I got an award from the Museum of Nuclear History and Technology in the Southwest, and I went to visit the museum, which turned out to be mostly a museum of nuclear weapons. The scary thing is that they look really cool.

(00:24:30)
It’s true that you have that “yes, this is scary” reaction, but you also have the “this is cool” feeling, and I think we have to get around that. Because yes, you can be in that experience, but I’m not sure that’s going to make people scared. Have they actually asked afterwards: are you more or less scared?
Lex Fridman
(00:24:50)
That’s a really good point. That’s a good summary of just humanity in general. We’re attracted to creating cool stuff, even though it can be dangerous.
Lisa Randall
(00:25:01)
Actually, that was the really interesting thing about visiting that museum. It was very nice because I had a tour from people who had been working there in the Cold War, and actually one or two people from the Manhattan Project. It was a very cool tour. You just realize how the thing itself gets you so excited.

(00:25:16)
I think that’s something sometimes these movies miss, just the thing itself. You’re not thinking about the overall consequences. In some ways it was like the early Silicon Valley. People were just thinking what if we did this? What if we did that? Not keeping track of what the peripheral consequences are. You definitely see that happening with AI now. I think that was the moral of the battle that just happened, that it’s just full speed ahead.
Lex Fridman
(00:25:43)
Which gives me a really great transition to another quote in your book. You write about the experience of facing the sublime in physics, and you quote Rainer Rilke: “For beauty is nothing but the beginning of terror, which we are still just able to endure, and we are so awed because it serenely disdains to annihilate us.” It’s pretty intense. It, I think, applies to nuclear weapons.
Lisa Randall
(00:26:14)
At a perhaps more mundane level, I think it applies… It’s really interesting. One of the things that I found when I wrote these books is some people love certainty. Many scientists revel in uncertainty. It’s not that you want to be uncertain, you want to solve it, but you’re at this edge where it’s really frustrating, because you don’t really want to not know the answer, but of course, if you knew the answer, it would be done.

(00:26:41)
You’re always at this edge where you’re trying to sort things out and there is something scary. You don’t know if there’s going to be a solution, you don’t know if you’re going to find it. It’s not something that can destroy the earth, it’s just something that you do on your individual level. But then of course there are much bigger things like the ones you’re talking about where they could actually be dangerous. The stuff I do, I just want to be clear, I’m doing theoretical physics. Not very dangerous, but sometimes things end up having bigger consequences than you think.
Lex Fridman
(00:27:13)
Dangerous in a very pragmatic sense. Isn’t it still in part terrifying when you think of just the size of things like the size of dark matter, the power of this thing in terms of its potential gravitational effects, just cosmological objects of a black hole at the center of our galaxy.
Lisa Randall
(00:27:36)
This might be why I’m a physicist or why I differ from other people because I’m not such a big fan of humanity in some ways. Some ways I am, but the idea that we were everything would be really boring to me. I love the idea that there’s so much more out there, that there’s a bigger universe and there’s lots to discover and that we’re not all there is. Wouldn’t it be disappointing if we were all there is?
Lex Fridman
(00:27:57)
Yeah, and the full diversity of other stuff is pretty interesting.
Lisa Randall
(00:28:04)
We have no idea how much there is. We know what we can observe so far, so the idea that there’s other stuff out there that we yet have to figure out, it’s exciting.
Lex Fridman
(00:28:13)
Let me ask you an out there question. If you think of humans on earth, life on earth as this pocket of complexity that emerged and there’s a bunch of conditions that came to be, and there’s Darwinian evolution and however life originated, do you think it’s possible there’s some pockets of complexity of that sort inside dark matter that we can’t see?
Lisa Randall
(00:28:42)
That’s possible.
Lex Fridman
(00:28:47)
Chemistry and biology evolving in different ways.
Lisa Randall
(00:28:49)
That’s one of the reasons we suggest… It’s not the reason, but it would be true if there were the type of interactions we suggest; it would need more complex ones. We don’t know. I will say that the conditions that give rise to life and complexity, they’re complex, they’re unlikely. It’s not like there are great odds that it would happen, but there’s no reason to think that it doesn’t happen. It’s worth investigating: are there other forces that exist in the dark matter sector? That’s exactly-
Lex Fridman
(00:29:20)
So the dark matter sector doesn’t have all the forces of the standard model of physics?
Lisa Randall
(00:29:26)
Right. As far as we know, it doesn’t have any. It might have it at some low level, but it could have its own forces, just like the dark matter might not experience our light. Maybe it has its light that we don’t experience.
Lex Fridman
(00:29:38)
So there could be other kinds of forces.
Lisa Randall
(00:29:41)
There could be other kinds of forces even within our sector that are too weak for us to have discovered so far or that exist at different scales than we know about. We detect what interacts strongly enough with our detectors to detect. It’s worth asking, and that’s one of the reasons we build big colliders to see are there other forces, other particles that exist say, at higher energies, at shorter distance scales than we’ve explored so far. It’s not just in the dark matter sector. Even in our sector, there could be a whole bunch of stuff we don’t yet know.

Particle physics

Lex Fridman
(00:30:17)
Maybe let’s zoom out and look at the standard model of particle physics. How does dark matter fit into it? First of all, what is it? Can you explain what the standard model is?
Lisa Randall
(00:30:28)
The standard model of particle physics is basically it tells us about nature’s most basic elements and their interactions. It’s the substructure as far as we understand it. If you look at atoms, we know they have nuclei and electrons, nuclei have protons and neutrons in them, protons and neutrons have particles called quarks that are held together by something called the strong force.

(00:30:54)
They interact through the strong force, the strong nuclear force. There’s something called the weak nuclear force and electromagnetism. Basically, all those particles and their interactions describe many, many things we understand. That’s the standard model. We now know about the Higgs boson, which is associated with how elementary particles get their mass. That piece of the puzzle has also been completed.

(00:31:20)
We also know that there are a weird array of masses of elementary particles. There’s not just the up and down quark, but there are heavier versions of the up and down quark. Charm and strange, top and bottom. There’s not just the electron, there’s a muon and a tau. There are particles called neutrinos, which are under intense study now, which are partnered with the leptons through the weak interactions.

(00:31:42)
We really do know these basic elements and we know the forces. When we’re doing particle physics experiments, we can usually even ignore gravity, except in exceptional cases that we can talk about. Those are the basic elements and their interactions.

(00:31:58)
Dark matter stands outside that; it’s not interacting through those forces. When we look at the world around us, we don’t usually see the effects of dark matter. It’s only because there’s so much of it that we see its effects at all, and it doesn’t have those forces that we know about. The standard model has worked spectacularly well. It’s been tested to a high degree of precision. People are still testing it.

(00:32:20)
One of the things we do as physicists is we actually want it to break down at some level; we’re looking for the precision measurement or the energy or whatever it will take where the standard model is no longer working. Not that it’s not working approximately, but we’re looking for the deviations. Those deviations are critical because they can tell us what underlies the standard model, which is what we really want to see next.
Lex Fridman
(00:32:45)
Where can you find the places where the standard model breaks down? What are the places you can see those tiny little deviations?
Lisa Randall
(00:32:53)
We don’t know yet, but we know the kinds of things you would want to look for. One obvious place to look is at higher energy. We’re looking at the Large Hadron Collider, but we’d love to go beyond that. Higher energy means shorter distances and it means things that we just couldn’t produce before. E=mc², so if you have a heavy particle and you don’t have enough energy to make it, you’ll never see it. That’s one place.

(00:33:17)
The other place is precision measurements. The standard model has been tested exquisitely, so if it’s been tested to 1%, you want to look at a tenth of a percent. There are some processes that we know shouldn’t happen at all in the standard model, or happen at a very suppressed level, and those are other things that we look for. All of those things could indicate there’s something beyond what we know about, which of course would be very exciting.
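To put a rough number on the E=mc² point above, here is a minimal back-of-the-envelope sketch in Python. The 1 TeV particle mass is a hypothetical value chosen only for illustration, and the LHC energy figure is approximate; neither comes from the conversation.

```python
# Rough sketch of the E = mc^2 point: a collider can only create a particle
# if the available energy covers its rest-mass energy (in pairs, twice that).
# The 1 TeV mass below is a hypothetical value used purely for illustration.

def pair_production_threshold(mass_tev: float) -> float:
    """Minimum center-of-mass energy (TeV) to produce a particle-antiparticle pair at rest."""
    return 2.0 * mass_tev  # E_cm >= 2 * m * c^2, with masses quoted in TeV/c^2

hypothetical_mass = 1.0  # TeV/c^2, illustrative assumption
lhc_energy = 13.6        # TeV, roughly the LHC's current center-of-mass energy

threshold = pair_production_threshold(hypothetical_mass)
print(f"Need at least {threshold} TeV; reachable at ~{lhc_energy} TeV: {threshold <= lhc_energy}")
```

The same logic explains why heavier hypothetical particles push physicists toward ever higher collision energies.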
Lex Fridman
(00:33:42)
When you just step back and look at the standard model, the quarks and all the different particles and neutrinos, isn’t it wild how this little system came to be and underpins everything we see?
Lisa Randall
(00:33:59)
Absolutely. That’s why we’d like to understand it better. We want to know, is it part of some bigger sector? Why are these particles… Why do they have the masses they do? Why is the Higgs boson so light compared to the mass it could have had, which we might even have expected based on the principles of special relativity and quantum mechanics? That’s a really big question. Why are they what they are?
Lex Fridman
(00:34:21)
And they originate, there’s some mechanism that created the whole thing?
Lisa Randall
(00:34:24)
That’s one of the things we’re trying to study. Why is it what it is?
Lex Fridman
(00:34:29)
Even just the mechanism that creates stuff, the way a human being is created from a single cell. It’s like embryogenesis, the whole thing, you build up this thing. All of it, this whole thing comes to be from just like a [inaudible 00:34:47].
Lisa Randall
(00:34:46)
Don’t forget it is interacting with the environment.
Lex Fridman
(00:34:49)
For sure. Okay, right, right, right.
Lisa Randall
(00:34:51)
It’s important.
Lex Fridman
(00:34:53)
That’s a really good question: how much of it is the environment? Is it just the environment acting on a set of constraints? How much of it is just the information in the DNA or any information? How much is it in the initial conditions of the universe versus some other thing acting on it?
Lisa Randall
(00:35:14)
These are big questions. These are big questions in pretty much every field. For the universe, we do consider it… It’s everything there is by definition. But people now think about it: is it one of many universes? Of course it’s a misnomer, but could there be other places where there are self-contained gravitational systems that we don’t even interact with? Those are really important questions, and the only way we’re going to answer them is to go back as far as we can. We try to think theoretically, and we try to think about observational consequences. That’s all we can do.
Lex Fridman
(00:35:49)
One interesting way to explore the standard model is to look at your fun, nuanced disagreement with Carlo Rovelli. When you talked about him writing in his book, “Electrons don’t always exist. They exist when they interact. They materialize in a place when they collide with something else.” You wrote that… I’ll just read the whole thing because it’s interesting.

(00:36:12)
“Stocks may not achieve a precise value until they’re traded, but that doesn’t mean we can’t approximate their worth until they change hands. Similarly, electrons might not have definite properties, but they do exist. It’s true that the electron doesn’t exist as a classical object with definite position until the position is measured. But something was there – which physicists use a wave function to describe.” It’s a fascinating nuanced disagreement. Do electrons always exist or not? Does a tree fall in the forest if nobody’s there to see it?
Lisa Randall
(00:36:48)
I like to think of the universe as being out there, whether or not… It would be really weird if the only time things came into existence was when I saw them or I measured them.
Lex Fridman
(00:36:57)
There’s a lot of weird stuff in the works.
Lisa Randall
(00:36:58)
I could believe that the Middle East doesn’t exist because I’m not there now. That would be kind of ridiculous, I think we would all agree on that. I think there’s only so much that we can attribute to our own powers of seeing. The whole system doesn’t come into being because I’m measuring it. What is weird, and this isn’t even a disagreement about the standard model, this is a disagreement about how you interpret quantum mechanics.

(00:37:22)
I would say that those wave functions are real. One of the things, don’t forget, that particle physics does, that quantum field theory says, is that electrons can be created and destroyed. It’s not that every electron has to be in the universe. That’s what happens at colliders, particles get created and destroyed, but that doesn’t mean that if I have an electron in an atom, it’s not there. It’s certainly there, and we know about it. Its charge is there.
Lex Fridman
(00:37:47)
Physics is a kind of way to see the world. At the bottom, what’s the bottom turtle? Do you have a sense that there’s a bottom reality that we’re trying to approximate with physics?
Lisa Randall
(00:38:01)
I think we always have in our head maybe that we’d like to find that, but I have to say… It might not seem so, but I think I’m more humble than a lot of physicists. I’m not sure that we’re ever going to get to that bottom level, but I do think we’re going to keep penetrating different layers and get further.
Lex Fridman
(00:38:16)
I just wonder how far away we are.
Lisa Randall
(00:38:20)
We all wonder that. What’s even the measure of how far away we are? One way you can measure it is just by our everyday lives. In terms of our everyday lives, we’ve measured everything. In terms of what underlies it, there’s a lot more to see. Part of it has to do with how far we think we can go. It might be that the nature of reality changes so much that even these terms are different. Maybe the notion of distance itself might break down at some point.
Lex Fridman
(00:38:49)
Also, to push back on the “we’ve measured everything”: maybe there’s stuff we haven’t even considered is measurable. For example, consciousness. There might be stuff, just like you said, forces unseen, undetected.
Lisa Randall
(00:39:03)
It’s an interesting thing, and this is often a confusion that happens. There’s the fundamental stuff underlying it, and then there’s the higher levels, what we’ll call an effective theory at some level. We’re not always working… When I throw a ball, I don’t tell you where every atom is. I tell you there’s a ball.

(00:39:22)
There might be different layers of reality that are built in terms of the matter we know about, the stuff we know about. When I say we’ve measured everything, I say that with a grain of salt. I mean we’ve measured everything about the standard model. There’s lots of phenomena that we don’t understand, but often there are complex phenomena that will be given in terms of the fundamental ingredients that we know about.
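As a toy sketch of the effective-theory idea here, with made-up “atom” data that is not from the conversation: instead of tracking every atom of a thrown ball, you summarize them with one coarse-grained quantity, such as the center of mass.

```python
import numpy as np

# Toy illustration of an effective description: rather than tracking every
# "atom" of a thrown ball, summarize them by a single coarse-grained quantity
# (the center of mass), which is all the higher-level description needs.
rng = np.random.default_rng(0)
atom_positions = rng.normal(loc=[1.0, 2.0, 0.5], scale=0.01, size=(10_000, 3))  # fake microscopic data
atom_masses = np.full(10_000, 1e-3)  # arbitrary equal masses, illustrative only

center_of_mass = (atom_masses[:, None] * atom_positions).sum(axis=0) / atom_masses.sum()
print("Effective 'ball' position:", center_of_mass)  # one vector replaces 10,000 positions
```

The point of the sketch is only that the coarse variable captures what the higher-level theory cares about, while the microscopic detail drops out.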
Lex Fridman
(00:39:47)
That is an interesting question, because yes, there are phenomena at the higher levels of abstraction that emerge, but maybe with consciousness… There are far out people who believe in panpsychism, who think that consciousness is going to be almost like a fundamental force of physics, a consciousness that permeates all matter.
Lisa Randall
(00:40:10)
Usually when you have a crazy… Sorry, when you have a far out theory, the thing you do is you test all the possibilities within the constructs that exist. You don’t just jump to the most far out possibility. You can do that, but then to see if it’s true, you either have to find evidence of it or you have to show that it’s not possible without that, and we’re very far from that.
Lex Fridman
(00:40:32)
I think one of the criticisms of your theory on the dinosaurs was that it requires, if I remember correctly, for dark matter to be weirder than it already is. I think you had a clever response to that. Can you remind…
Lisa Randall
(00:40:46)
I’m not sure I remember what I said then, but we have no idea how weird dark matter is. It’s based on everyone thinking they know what dark matter is. Weirder than it already is? It’s not already anything. We don’t know what it is, so there’s no normalization here.
Lex Fridman
(00:40:59)
Do we know if dark matter varies in density?
Lisa Randall
(00:41:05)
It certainly does vary in the universe. For example, there’s more dark matter in galaxies than there is between galaxies. It clumps. It’s matter, so it’s distributed like matter. It is matter.
Lex Fridman
(00:41:18)
It does clump, but the full details of how it clumps and the complexity of the clumping…
Lisa Randall
(00:41:25)
It’s understood pretty well. People do simulations. Where people are always looking for things, including us as particle physicists, is at small scales: are there deviations on small scales indicating other interactions or other processes, or interactions with baryons, that is to say normal matter, that we don’t understand? But on large scales, we have a pretty good understanding of dark matter distribution.
Lex Fridman
(00:41:50)
You were part of a recent debate on whether science can uncover reality. Let me ask you this question then: what do you think are the limits of science?
Lisa Randall
(00:42:00)
I’m smart enough to know that I have no idea. Also, it’s not even clear what science means, because there’s the science that we do, which is particle physics: we try to find fundamental things and figure out what their effects are. There’s science like biology where, at a higher level, the kind of questions you ask are different, the kind of measurements are different.

(00:42:21)
The kind of science that’s going to happen in the more numerical age, or even with AI: what does it mean to answer a question? Does it mean that we can predict it? Does it mean that we can reproduce it? I think we’re coming up against the definition of what we mean by science as human beings. In terms of the science that we can do, I don’t think we’ll know it until we get there. We’re trying to solve hard problems and we’ve made progress.

(00:42:50)
If you think of how much science has advanced in the last century or century and a half, it’s incredible. We didn’t even know the universe was expanding at the beginning of the 20th century. We didn’t know about quantum mechanics at the beginning of the century, we didn’t know about special relativity. That’s a lot in a relatively short time, depending on how you think of time. I think it would be premature to say we know limitations.
Lex Fridman
(00:43:14)
At various points throughout history, we thought we solved everything, or at least various people declared-
Lisa Randall
(00:43:20)
[inaudible 00:43:20] various people. Exactly.
Lex Fridman
(00:43:21)
Declared that we’ve solved everything. This is also a good place to… Maybe could you describe the difference between top-down and bottom-up approaches to theoretical physics that you talked about in the book?
Lisa Randall
(00:43:33)
You could try to jump in and say I have a theory that I think is so perfect that I can predict everything from it or at least predict some salient features from it.
Lex Fridman
(00:43:46)
Mm-hmm. That’s top-down.
Lisa Randall
(00:43:47)
That would be a top-down. Bottom-up is more like the questions we just asked. Why are masses what they are? We measure things. We want to put them together. Usually a good approach is to combine the two. If you ask a very specific question but combine it with the methods of knowing that there could be a fundamental theory underlying it, sometimes you make progress.

(00:44:09)
The community tends to get segmented or fragmented into people who do one or the other, but there are definitely times… Some of my best collaborations are with people who are more top-down than I am, so that we come up with interesting ideas that we wouldn’t have thought of if either one of us was working individually.
Lex Fridman
(00:44:25)
Would you say the truly big leaps happened top-down? Like Einstein?
Lisa Randall
(00:44:30)
Einstein was not a top-down person in the beginning. Special relativity was very much him thinking about… They were thought experiments, but he was very much… The original theory about relativity is something like on the nature of electromagnetism. He was trying to understand how Maxwell’s laws could make sense when they seemed to have different symmetries than what we had thought they were.

(00:44:54)
He was very much a bottom-up person, and in fact, he resisted top-down for a long time. Then when he tried to do the theory of general relativity, or the general theory of relativity, whichever you want to call it, incorporating gravity into the system, where you need some feedback, he was helped by a mathematician who had developed some differential geometry and helped him figure out how to write that down.

(00:45:16)
After that, he thought top-down was the way to go, but he actually didn’t make that much progress. I think it’s naive to think it was just one or the other. In fact, a lot of people who made real progress were rooted in actual measurements.

Physics vs mathematics

Lex Fridman
(00:45:31)
Speaking of mathematicians, what do you think is the difference, you’ve had a bit of a foot in both, between physics and mathematics in the way they help us understand the world?
Lisa Randall
(00:45:41)
To be frank, there’s a lot more overlap between physics and math. I think that has been… Maybe not more, but there’s certainly a lot. I think, again, the kinds of questions you’re asking are usually different. Mathematicians like the structure itself; physicists are trying to concentrate, to some extent, on the consequences for the world. But there is a lot of overlap.
Lex Fridman
(00:46:04)
String theory is an example. There are certain theories where there’s a certain mathematical beauty to them.
Lisa Randall
(00:46:12)
There are also some really cool ideas that you get in particle physics where you can describe what’s going on and connect it to other ideas. That’s also really beautiful. I think basically insights can be beautiful. They might seem simple, and sometimes they genuinely are, and sometimes they’re built on a whole system that you have to understand first. If you actually saw Einstein’s equations written out in components, you wouldn’t think it’s so beautiful. If you write them in a compact way, it looks nice.
Lex Fridman
(00:46:43)
What do you think about the successes and the failures of string theory? To what degree do you think it has succeeded, and to what degree has it not succeeded yet or has failed?
Lisa Randall
(00:46:54)
I think to talk about any science in terms of success and failure often misses the point because there’s not some absolute thing. I do think that string theorists were a bit overly ambitious… Not overly ambitious, but a little bit overly arrogant in the beginning, thinking they could solve many problems that they weren’t going to solve.

(00:47:14)
That’s not to say the methods and advances in string theory don’t exist, but they certainly weren’t able to immediately solve all the problems they thought they could solve. It has given us tools, it has given us some insights, but it becomes almost a sociological question of how much it should be one or the other.

(00:47:35)
I do think that you can get caught up in the problems themselves, and sometimes you can get caught up in the methods and just do other examples. The real physics insights often come from people who are thinking about physics as well as math.
Lex Fridman
(00:47:49)
Because you mentioned AI, is there hope that AI might be able to help find some interesting insights? Another way to ask this question is how special are humans that we’re able to discover novel insights about the world?
Lisa Randall
(00:48:09)
That’s a great question, and it depends on what kind of insights, and we’re going to find that out. Because it’s hard to think about something that doesn’t quite exist yet, I could just think about something and take a step back. It’s a little bit like when I’m trying to understand four dimensions, you go back to three dimensions. Go to something you can imagine.

(00:48:31)
You can say a lot of these things at a very different level about the internet. You could ask, has the internet helped us do things? It definitely took on a life of its own in some sense, but it’s also something that we’re able to tame. I know that I myself wouldn’t have been able to write books if the internet didn’t exist, because I wouldn’t have had the time to go to the library and look everything up. It helped me enormously.

(00:48:57)
In some sense, AI could be that. In a very nice world, it could be a tool that helps us go a step further than we would, and a lot more efficiently. It’s already done that to some extent. Or it could be like the parts of the internet that we can’t control that are ruining politics or whatever. There are certainly a lot of indications that it can do that. Then there are even bigger things that people speculate about, AI being able to do its own things, but in terms of actually figuring things out, we’re in the early stages.
Lex Fridman
(00:49:33)
Yeah, there are several directions here. One is on the theorem prover side, Wolfram Alpha, where everything’s much more precise, and then we have the large language model type of stuff. One of the limitations of those is they seem to come up with convincing-looking things that we don’t know are true or not, and that’s a big problem for physics.
Lisa Randall
(00:49:54)
Large language models are more or less generalizations of stuff that we have. There are still breakthroughs in AI waiting to happen, and maybe they are happening, and maybe they’ll be good, maybe not, but that’s not quite the same. Maybe in some cases, it’s just pattern recognition that leads to important things, but sometimes it could be something more insightful than that, something I can’t even put my finger on.

(00:50:21)
It forces us to… We don’t really understand how smart we are. We don’t understand how we think about things all that well, actually. But one thing is true: we are a lot more efficient right now than computers at coming up with things; we require a lot less energy to do that. If computers figure out how to do that, then it’s going to be a totally different ball game.

(00:50:42)
There are clearly kinds of connections that we don’t know how we’re making, but we are making them. That’s going to be interesting. I say we’re in early stages, but this is changing very rapidly. Right now, I don’t think that it’s actually discovered new laws of physics, but could it in the future? Maybe it can.
Lex Fridman
(00:51:06)
It will raise big questions about what is special about humans that we don’t quite appreciate. There could be things that are like that leap of insight that happens, truly novel ideas, that could potentially be very difficult to do.
Lisa Randall
(00:51:26)
There are abstract questions like that. There are also questions that we can address to some extent: how will AI be used in the context of the world we live in, which, at least in our country, is based on capitalism in a certain political system? How will global politics deal with it? How will our capitalist system deal with it? What will be the things that we focus on doing with it? How much will researchers get control of it to be able to ask different sorts of questions?

(00:51:58)
While it was starting out, people were doing these kinds of toy problems, but what will it actually be applied to and what will it be optimized to do? There’s a lot of questions out there that it’s really important we start addressing.
Lex Fridman
(00:52:11)
What to you is the most beautiful unsolved problem in physics and cosmology, one that would be really exciting to unlock the mystery of in the next few decades?
Lisa Randall
(00:52:30)
Is it what’s the most beautiful unsolved problem, or what is the most beautiful unsolved problem I think we can make progress on?
Lex Fridman
(00:52:35)
Oh boy, one we can make progress on in the next few centuries.
Lisa Randall
(00:52:43)
Most of the big questions have to do with what underlies things, how things started, what’s at the base of it. There are also just basic questions like the ones you asked earlier: how far will science take us? How much can we understand? There are questions like how we got here and what underlies it.

(00:53:02)
Also, there are really deep questions like, what fraction are we actually seeing? If there are these other forces, if there is another way of seeing the world, are there universes beyond our own? If they’re so totally different, how do we even comprehend them? What would we even think about them? There’s a lot about trying to get beyond… It’s always just getting beyond our limited vision and limited experience and trying to see what underlies it, both at small scales and at large scales.

(00:53:35)
We just don’t know the answers. I’d like to think that we’ll understand more about dark matter, about dark energy, about whether there are extra dimensions, things that we actually work on, but there’s probably a lot beyond what we work on that’s yet to be discovered.
Lex Fridman
(00:53:50)
Yeah, understanding the extra dimensions piece will be really interesting.
Lisa Randall
(00:53:55)
Totally. If that is how the universe went from higher dimensions to what we see, are the extra dimensions present everywhere? One of the really interesting pieces of physics we did that I talk about in my first book, Warped Passages, is finding out that there can be a higher dimension, but only locally do you see the gravity of a lower dimension. It could be that only locally do we think we live in three dimensions; in higher dimensions it could be different.

(00:54:25)
That’s not actually the gravity we have, but there’s all sorts of phenomena that might be out there that we don’t know about. All sorts of evolution things, time dependence that we don’t know about. Of course, that’s from the point of view of particle physics, from the point of view of other kinds of physics, we’re just beginning, so who knows?
Lex Fridman
(00:54:40)
Yeah, if the physics changes, if it’s not homogeneous throughout the universe, that’ll be weird.
Lisa Randall
(00:54:48)
I mean, for the observable universe, it’s the same. But beyond the observable universe, who knows?
Lex Fridman
(00:54:58)
You’ve had an exceptional career. What advice would you give to young people, maybe high school, college, on how to have a career they can be proud of and a life they can be proud of?
Lisa Randall
(00:55:10)
I think the weird thing about being a scientist or an academic in general is you have to believe really strongly what you do while questioning it all the time. That’s a hard balance to have. Sometimes it helps to collaborate with people, but to really believe that you could have good ideas at the same time, knowing they could all be wrong. That’s a tough tightrope to walk sometimes, but to really test them out.

(00:55:34)
The other thing is, sometimes if you get too far buried, you look out and you think there’s so much out there. Sometimes it’s just good to bring it back home and just think, okay, can I have as good an idea as the person next to me rather than the greatest physicist who ever lived? Right now, like you said, I think there are lots of big issues out there, and it’s hard to balance that.

(00:55:55)
Sometimes it’s hard to remember the role of physics, but I think Wilson said it really well when they were building Fermilab: this won’t defend the country, but it’ll make it worth defending. It’s just the idea that in all this chaos, it’s still important that we still make progress on these things. Sometimes when major world events are happening, it’s easy to forget that. I think those are important too. You don’t want to forget those, but try to keep that balance, because we don’t want to lose what it is that makes humans special.
Lex Fridman
(00:56:24)
That’s the big picture. Do you also lose yourself in the simple joy of puzzle solving?
Lisa Randall
(00:56:29)
Yeah. We all like solving puzzles. Actually, one of the things that drives me in my research is the inconsistencies. When things don’t make sense, it really bugs me, and I just will go in different directions to see how these things could fit together.
Lex Fridman
(00:56:46)
It bugs you, but that motivates you?
Lisa Randall
(00:56:48)
Yeah, totally.
Lex Fridman
(00:56:49)
Until it doesn’t. You have to resolve it.
Lisa Randall
(00:56:52)
I think I have this underlying belief that it should make sense. Even though the world comes at you in many ways and tells you nothing should make sense, if you believe that it makes sense, you look for the underlying logic. I think that’s just good advice for everything: to try to find why it is the way it is.

(00:57:08)
I talk about effective theory in my second book, Knocking On Heaven’s Door, a lot. Rather than ask the big questions, sometimes we just ask questions about the immediate things that we can measure, and like I said, we can sometimes tell where they’ll fail, but we can have these effective theories. Sometimes I think when we approach these big questions, it’s good to do it from an effective theory point of view. Why do I find this satisfying? Why is the world we have the way it is?

(00:57:31)
We think the world we live in is beautiful. I’m not sure that, if we had different senses or different ways of looking at things, we would necessarily find it beautiful. But I have to say, it is fantastic that no matter how many times I see a sunset, I will always find it beautiful. I don’t think I ever see a sunset and just say, whatever. It’s just always beautiful.

(00:57:54)
There are things that, as humans, clearly resonate with us, but maybe we evolved that way. But that’s about us. In terms of figuring out the universe, it’s amazing how far we’ve gotten. We have discovered many, many wonderful things, but there’s a lot more out there, and I hope we have the opportunity to keep going.
Lex Fridman
(00:58:14)
With effective theories, one small step at a time, just keep unraveling the mystery.
Lisa Randall
(00:58:19)
Also having in mind the big questions, but doing one small step at a time. Exactly.
Lex Fridman
(00:58:23)
Yeah, looking out to the stars. You said the sunset. For me, it’s the sunset, the sunrise, and just looking at the stars. It’s wondering what’s all out there and having a lot of hope that humans will figure it out.
Lisa Randall
(00:58:39)
Right. I like it.
Lex Fridman
(00:58:42)
Lisa, thank you for being here, and for being one of the humans in the world that are pushing it forward and figuring out this beautiful puzzle of ours. Thank you for talking today. This was amazing.
Lisa Randall
(00:58:53)
Thank you for having me here.
Lex Fridman
(00:58:55)
Thanks for listening to this conversation with Lisa Randall. To support this podcast, please check out our sponsors in the description. Now, let me leave you with some words from Albert Einstein. The important thing is to not stop questioning. Curiosity has its own reason for existence. Thank you for listening, and hope to see you next time.

Transcript for Michael Malice: Thanksgiving Pirate Special | Lex Fridman Podcast #402

This is a transcript of Lex Fridman Podcast #402 with Michael Malice.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
What’s your opinion on my bird here, Mr. Parrot?
Michael Malice
(00:00:04)
It’s a Macaw. Scarlet Macaw.
Lex Fridman
(00:00:07)
What?
Michael Malice
(00:00:08)
It is a Scarlet Macaw.
Lex Fridman
(00:00:10)
Oh, you know birds?
Michael Malice
(00:00:11)
Yeah. And that’s actually not life-sized.
Lex Fridman
(00:00:15)
Are you saying he’s not real?
Michael Malice
(00:00:17)
I’m saying it’s not to scale.
Lex Fridman
(00:00:19)
Okay. But he’s real.
Michael Malice
(00:00:21)
Are we doing that Monty Python sketch?
Lex Fridman
(00:00:25)
Everything is a Monty Python sketch.
Michael Malice
(00:00:26)
I don’t think Monty Python’s funny.
Lex Fridman
(00:00:28)
You don’t?
Michael Malice
(00:00:29)
At all. Not once.
Lex Fridman
(00:00:29)
That explains so much.
Michael Malice
(00:00:31)
Does it? What does it explain?
Lex Fridman
(00:00:32)
What do you think is funny?
Michael Malice
(00:00:35)
You not answering that question is pretty funny.
Lex Fridman
(00:00:38)
Yeah. What do you think is funny, having a mantis shrimp?
Michael Malice
(00:00:41)
No.
Lex Fridman
(00:00:42)
You think Big Lebowski is funny?
Michael Malice
(00:00:44)
Oh God, no.
Lex Fridman
(00:00:46)
This is getting worse and worse. The following is a conversation with Michael Malice, anarchist and author of Dear Reader, The New Right, The Anarchist Handbook, The White Pill, and he is the host of the podcast, YOUR WELCOME. This is a Thanksgiving special of the pirate and oceangoing variety. So once again, let me say thank you for listening today and for being part of this wild journey with me. This is a Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Michael Malice.

Beauty and mantis shrimp

Michael Malice
(00:01:34)
The box?
Lex Fridman
(00:01:35)
Yeah.
Michael Malice
(00:01:35)
The mystery box.
Lex Fridman
(00:01:36)
I’m wondering what’s in it.
Michael Malice
(00:01:36)
There’s something in that box of exquisite beauty, both literally and in what it symbolizes and why it is here.
Lex Fridman
(00:01:46)
Given the kind of human being you are, I’m terrified at what you find beautiful.
Michael Malice
(00:01:52)
That’s a good point. You kind of hit me with a curve ball. For me, the most beautiful wildlife are what I call God’s mistakes. Because my friend came up with that term, where she’s like, “God made these disgusting animals, just threw them in the bottom of the ocean.” He’s like, “No one’s ever going to see this.”
Lex Fridman
(00:02:12)
Yeah. You commented on Twitter about some creature, a rainbow type creature.
Michael Malice
(00:02:17)
The peacock mantis shrimp.
Lex Fridman
(00:02:18)
Yeah, it’s beautiful.
Michael Malice
(00:02:20)
It’s horrific though. So it has, I think, eight legs, six arms, two punching claws or spearing claws depending on the genus. Two eyes, two antennae, two ear flaps. I don’t know what they do. And its punch can be as strong as a bullet. And the other type, with the spears, divers call them thumb splitters, because if you stick your finger near it, it’ll cut your thumb down to the bone. So I had one as a pet. All night I would hear banging on the PVC pipe. And I’ve got to tell you, they have the best eyesight of any animal because they see in seven different ways. And when you make eye contact with this thing, it’s just absolutely terrifying. But you can eat them as sushi. They call them sea centipedes.
Lex Fridman
(00:03:01)
But they’re colorful and beautiful.
Michael Malice
(00:03:03)
That species is, yeah.
Lex Fridman
(00:03:04)
What was it like having one as a pet, and why did you do it?
Michael Malice
(00:03:09)
Well, when you have a species that’s that unique and that much of an outlier, growing up, reading these books, watching these shows, I found this stuff so much more fascinating than space, which is dead. So to be able to have this specimen in your house and just observe its behavior is just an amazing thing.
Lex Fridman
(00:03:32)
Why’d you get rid of it?
Michael Malice
(00:03:34)
I didn’t have, I guess, the right minerals in the mix because-
Lex Fridman
(00:03:36)
It died?
Michael Malice
(00:03:37)
… it had a problem moulting once. Yeah, it couldn’t moult correctly.
Lex Fridman
(00:03:40)
Wow. Do you miss it? Think about it still?
Michael Malice
(00:03:43)
I do think about it, to be honest. I still have a pair of its punching appendages from when it moulted.
Lex Fridman
(00:03:51)
What pet animal in your life do you miss the most, that has been in your life that you think about?
Michael Malice
(00:03:59)
I’ve never had cats or dogs growing up or anything like that, which I… Oh God. My problem is-
Lex Fridman
(00:04:07)
Here we go.
Michael Malice
(00:04:08)
… if I like something, I will go down a rabbit hole. So I know if I got one tattoo, I already know what my first five are going to be. Okay? So I can’t do it, because then once I get those five, it’s going to be a hundred, and I’m already too old to be the tattoo guy.
Lex Fridman
(00:04:25)
What would be the first tattoo? My face? Would it go on your ass cheeks or where would you put them if it was my face?
Michael Malice
(00:04:36)
If I got your face, it would definitely be on my arm right here.
Lex Fridman
(00:04:39)
If you had multiple faces, would you put like?
Michael Malice
(00:04:42)
I think delts, right? Shoulders, different faces on different shoulders.
Lex Fridman
(00:04:44)
And when you flex?
Michael Malice
(00:04:45)
I’d want some symmetry.
Lex Fridman
(00:04:46)
Yeah. Would you get a dictator? If you had to get a dictator, who would you get?
Michael Malice
(00:04:51)
Would have to be Kim Jong-il. Right? Because I wrote the book on him.
Lex Fridman
(00:04:54)
Oh, it’s like plugging your book in the tattoo?
Michael Malice
(00:04:57)
I don’t think plugging, it’s just I have a personal connection to this stuff.
Lex Fridman
(00:05:00)
Good opener for the conversation. People would be asking why him, and he’d be like, “Well, I wrote a book about it.” And I’d be like, “Oh, okay.”
Michael Malice
(00:05:07)
Okay. Here’s why-
Lex Fridman
(00:05:08)
“Let me check it out.”
Michael Malice
(00:05:08)
That would be a bad. No, that’s not what happens.
Lex Fridman
(00:05:10)
Okay.
Michael Malice
(00:05:11)
Here’s the thing.
Lex Fridman
(00:05:12)
What happens?
Michael Malice
(00:05:12)
When you write a book about North Korea: “Hey, nice to meet you. What is it you do?” “I’m an author.” “What kind of books do you write?” “Well, my last book was on North Korea.” 90% of the time, 90, they will then start telling me everything they know about North Korea. And it’s like, “I don’t need… this isn’t a quiz, and it’s a very poorly understood country. I don’t expect you to know anything. You’re not on the spot. And half of what you’re saying is not accurate either. It’s fine.”
Lex Fridman
(00:05:36)
How often did they bring up Dennis Rodman?
Michael Malice
(00:05:38)
A hundred percent.
Lex Fridman
(00:05:39)
A hundred percent of the time.
Michael Malice
(00:05:40)
“Oh, so do you know Dennis Rodman?”
Lex Fridman
(00:05:42)
Yeah.
Michael Malice
(00:05:42)
But I don’t understand why. I guess people feel the need to, “All right, now we’re talking about this subject. I just got to drop whatever I can talk about.” It’s usually a small amount. And there’s this thing in the culture, which I hate, that everyone has to have an opinion on everything. And it’s like it’s okay to be like, “Yeah, I don’t know anything about that. Tell me more.” There’s lots of things I don’t know anything about.

Parrots, Pirates, and Monty Python

Lex Fridman
(00:06:02)
What’s your opinion on my bird here, Mr. Parrot?
Michael Malice
(00:06:07)
It’s Macaw, Scarlet Macaw.
Lex Fridman
(00:06:10)
What?
Michael Malice
(00:06:11)
It is a Scarlet Macaw.
Lex Fridman
(00:06:13)
Oh, you know birds?
Michael Malice
(00:06:14)
Yeah. And that’s actually not life-sized.
Lex Fridman
(00:06:18)
Are you saying he’s not real?
Michael Malice
(00:06:20)
I’m saying it’s not to scale.
Lex Fridman
(00:06:22)
Okay. But he’s real.
Michael Malice
(00:06:24)
Are we doing that Monty Python sketch?
Lex Fridman
(00:06:27)
Everything is a Monty Python sketch.
Michael Malice
(00:06:29)
I don’t think Monty Python’s funny.
Lex Fridman
(00:06:31)
You don’t?
Michael Malice
(00:06:31)
At all. Not once.
Lex Fridman
(00:06:32)
That explains so much.
Michael Malice
(00:06:33)
Does it? What does it explain?
Lex Fridman
(00:06:35)
What do you think is funny?
Michael Malice
(00:06:38)
You not answering that question is pretty funny.
Lex Fridman
(00:06:39)
Yeah. What do you think is funny, having a mantis shrimp?
Michael Malice
(00:06:44)
No.
Lex Fridman
(00:06:45)
Do you think Big Lebowski is funny?
Michael Malice
(00:06:46)
Oh God, no. Although…
Lex Fridman
(00:06:49)
This is getting worse and worse.
Michael Malice
(00:06:50)
To be fair, I only tried to watch Big Lebowski after it’s been part of the culture for many years.
Lex Fridman
(00:06:57)
Right.
Michael Malice
(00:06:58)
To the point where every single line has been quoted incessantly by the most annoying frat bros ever. So I’ve kind of been poisoned against being able to appreciate it.
Lex Fridman
(00:07:10)
Right.
Michael Malice
(00:07:10)
So maybe if I’d seen it when it came out, before it became a thing, I would’ve enjoyed it. I couldn’t get through it. I couldn’t get through 20 minutes.
Lex Fridman
(00:07:17)
Is that how you feel about Schindler’s List?
Michael Malice
(00:07:21)
Well…
Lex Fridman
(00:07:21)
It’s so much easier for me to stare at you when you have sunglasses on.
Michael Malice
(00:07:24)
I didn’t think you’d be the one making Holocaust jokes today. And yet, here we are.
Lex Fridman
(00:07:28)
And cut scene. I actually have no trouble making eye contact with you when you’re wearing shades.
Michael Malice
(00:07:35)
Yes, because you’re a robot.
Lex Fridman
(00:07:36)
Two copies of myself.
Michael Malice
(00:07:38)
Yeah. Oh, you’re seeing yourself in them?
Lex Fridman
(00:07:39)
Mm-hmm.
Michael Malice
(00:07:40)
Okay, cool.
Lex Fridman
(00:07:40)
Yeah, I’m having a conversation with myself. It’s not your fault, Lex.
Michael Malice
(00:07:46)
They made you like this. You were just a good little Roman in Saint Petersburg.
Lex Fridman
(00:07:51)
I could see Mr. Parrot a little bit too.
Michael Malice
(00:07:54)
But what do you find funny? Come on. This is an interesting subject.
Lex Fridman
(00:07:57)
Well, I find Monty Python. I find absurdity funny.
Michael Malice
(00:08:00)
Yes. I find absurdity funny. I think that’s the thing. When people come at me, and maybe this is an Eastern European thing, when they’re like, “How can you find this very dark subject funny?” It’s like, well, the humor. First of all, the humor is that you’re making fun of something that’s dark. So already it’s absurd. It’s completely inappropriate. Second, just psychologically, Joan Rivers said that Winston Churchill said, I don’t know if it’s true, that when you make people laugh, you’re giving them a little vacation. And I was just thinking about this the other day, how when I die, if, I want my funeral to be a roast. It doesn’t help me that everyone’s sad. If I brought people happiness or joy in life, whatever, I want to keep doing that in death. Your sadness doesn’t help me. I know you can’t help it, but tell stories of how I made you laugh. Make fun of me. Make me the punching bag. Even literally, take me out of that coffin and beat the-.

(00:08:55)
Make me a piñata. I don’t care. I don’t understand, well, I do understand, but it’s sad for me when people are like, “This isn’t funny. That isn’t funny.” The way I look at humor is like being a chef, right? It’s pretty easy to make bacon taste good, but to take some of these really obscure ingredients and make them palatable, that takes skill. So if you’re dealing with a subject that is very emotional or intense and you can make people laugh, then that takes skill and that’s the relief for them.
Lex Fridman
(00:09:29)
Yeah. It’s all about timing.
Michael Malice
(00:09:33)
Yeah.
Lex Fridman
(00:09:34)
Yeah.
Michael Malice
(00:09:38)
What’s the difference? You want to hear one of my jokes?
Lex Fridman
(00:09:41)
Is it a pirate joke? Because that’s the only kind I accept today.
Michael Malice
(00:09:45)
Okay.
Lex Fridman
(00:09:45)
But no, go ahead. It doesn’t have to be a pirate joke this one time.
Michael Malice
(00:09:48)
Do you know who Lia Thomas is?
Lex Fridman
(00:09:50)
Yeah.
Michael Malice
(00:09:51)
What’s difference between Lia Thomas and Hitler?
Lex Fridman
(00:09:53)
What?
Michael Malice
(00:09:54)
Lia Thomas knows how to finish a race.
Lex Fridman
(00:09:58)
Very nice. Very nice.
Michael Malice
(00:09:59)
Did I just get the gold medal?
Lex Fridman
(00:10:00)
Good job. Why does it take pirates forever to get through the alphabet?
Michael Malice
(00:10:09)
Why?
Lex Fridman
(00:10:11)
Because they spent years at sea.
Michael Malice
(00:10:13)
Oh, I thought it was going to be an [inaudible 00:10:15] joke.
Lex Fridman
(00:10:15)
Nope. No.
Michael Malice
(00:10:16)
That’s a good one. I like that.
Lex Fridman
(00:10:17)
Yeah.
Michael Malice
(00:10:18)
When I was in North Korea.
Lex Fridman
(00:10:21)
Oh, you know Dennis Rodman? It’s a callback.
Michael Malice
(00:10:23)
By the way, the thing that is very heartbreaking about the North Korean situation is that they have a great sense of humor. It would be a lot easier if these were robots or drones. They have big personalities, big senses of humor, and that made it much harder to leave and interact with these people because I mean, there’s nothing more human and universal than laughter and laughter’s free.
Lex Fridman
(00:10:47)
Are you saying there’s humor even amongst the people that have most of their freedoms taken away?
Michael Malice
(00:10:52)
Especially. I mean, again, we’re from the Soviet Union, there’s [inaudible 00:10:57] I mean, Russian humor is a thing because if you can’t have food or nice things, at least you can have joy and make each other laugh. I think about it all the time, and I think about my guide all the time. It’s been, what, 2012? So it’s been 11 years since I’ve been there, and she’s still there. And everyone I’ve seen is still there. They just recently electrified the border, so even the few people who were escaping can’t do it anymore.
Lex Fridman
(00:11:19)
Well, that’s interesting that they still have a sense of humor. I credit the Soviet Union for that, because of the really deep education system. You got to read a lot of literature.
Michael Malice
(00:11:29)
Okay.
Lex Fridman
(00:11:30)
And because of that, you get to kind of learn about the cruelty, the injustices, the absurdity of the world.
Michael Malice
(00:11:40)
Right.
Lex Fridman
(00:11:40)
As long as the writing is not about the current regime.
Michael Malice
(00:11:43)
Yeah. But I think if you look at African Americans, Jewish Americans, gay Americans, they are all disproportionate in terms of contributing to comedy. It’s not because these groups have some kind of magic to them. It’s that when you are on the outside looking in, A, you’re going to have a different perspective than the people who are in the middle of the bell curve. But also, when you don’t have anything to lose, at the very least, you can make each other laugh and find happiness that way. So that is something that I think is an important thing to recognize.

Humor and absurdity

Lex Fridman
(00:12:14)
So what do you find funny? What makes you giggle in the most joyful of ways? The suffering of others?
Michael Malice
(00:12:24)
I mean, there are YouTube videos of fat people falling down and they’re really funny.
Lex Fridman
(00:12:36)
There’s two kinds of people in this world, those that laugh at those videos and those that don’t.
Michael Malice
(00:12:42)
No. And those that are in them. My friend Jesse just told me a great Norm Macdonald joke, and this is a good litmus test joke because he says, “A certain group of people lose their minds and a certain group of people just stare at you.” And he goes, “This kind of…” and so I’ll tell you the joke. This is Norm Macdonald. A guy walks into a bar and he sees someone at the bar who has a big pumpkin for a head.

(00:13:07)
And the guy’s like, “Dude, what happened to you?” He goes, “Ugh, you never believe this. I got one of those genie lamps and this genie.” He’s like, “Well, what happened?” He goes, “Well, the first wish, I wished for a hundred million dollars.” He’s like, “Yeah, did you get it?” He goes, “Yeah.” He goes, “In my bank account. Feels fine.” He goes, “All right. Well, the second wish, I wished to have sex with as many beautiful women as I want.” He goes, “Did that happen?” He goes, “Yeah, it was amazing.” He goes, “Then what?” “Well, I wished for a giant pumpkin head.”
Lex Fridman
(00:13:34)
Yeah.
Michael Malice
(00:13:35)
So there’s a certain mindset that will just be staring at the screen. And that is, I mean, there’s so many levels why that’s funny, at least to me. And I just love that kind of humor.
Lex Fridman
(00:13:45)
Well, Norm Macdonald is just… I watch his videos all the time. He’s a guy that definitely makes me giggle. And he’s one of the people that makes me giggle for reasons I don’t quite understand.
Michael Malice
(00:13:58)
Did you ever see him with Carrot Top on Conan O’Brien?
Lex Fridman
(00:14:01)
No.
Michael Malice
(00:14:02)
Making fun of Carrot Top?
Lex Fridman
(00:14:03)
No.
Michael Malice
(00:14:05)
This is probably the best talk show clip of all time. He’s on with Courtney Thorne-Smith. She was on Melrose Place and Conan O’Brien’s the host, and Courtney’s talking about how she’s going to be in an upcoming movie with Carrot Top. And Conan is like, “Oh, what’s it going to be called?” And she’s like, “Doesn’t have a title yet.” And Norm goes, “Oh, I know what it should be called, Box Office Poison.” And they’re all laughing. And she’s like, “No, no, no, the working title is Chairman of the Board.” And Conan goes, “Do something with that, smart ass.” And Norm goes, “Yeah, board is spelled B-O-R-E-D.” And they all just completely lost it.
Lex Fridman
(00:14:39)
There’s something about him with words spoken out of his mouth with the way he turns his head and looks at the camera.
Michael Malice
(00:14:46)
I think he is one of those rare comedians who you really feel like he’s talking to you directly. He feels like he’s winking at you in the audience. And he’s like, “Can you believe I’m doing this?” It’s like almost he feels like he’s, I don’t want to say imposter, but he’s more a member of the audience than he is a member of the people on the stage.
Lex Fridman
(00:15:06)
Yeah, it feels like he’s on our side.
Michael Malice
(00:15:08)
Yes. Yeah.
Lex Fridman
(00:15:09)
Whatever the hell “Our” means.
Michael Malice
(00:15:11)
Roseanne got him his first job.
Lex Fridman
(00:15:14)
Man. Roseanne, you and her have been hanging out.
Michael Malice
(00:15:17)
I got it. Oh my God. Talk about Thanksgiving. When you are talking to Roseanne Barr and making eye contact with this person, it is, I can’t even describe it. It’s just like, “Holy crap, Roseanne Barr’s talking to me.” She is, I’ve said this to her face, pathologically funny. It does not turn off. And you’re sitting there and you’re like, “Holy crap.”

(00:15:40)
And when you make her laugh, which is that laugh that’s in the theme song of her show, you feel like, “Okay, I did a mitzvah. I did something good and right in the world that I made Roseanne Barr laugh.” And it’s also really funny because, and she’s going to hate this, because I tell her, she’s adorable. She doesn’t like that. She’s little. You think of Roseanne Barr as this force of nature, like a tsunami.
Lex Fridman
(00:16:01)
Big, yeah.
Michael Malice
(00:16:01)
She’s five three, I’d say maybe 130. And she puts on the sunglasses, and you think, this little old Jewish lady. You’d never know this is one of the most epic performers of all time. She lives near here now. So it’s just so much fun talking to her. There was an old satirical magazine in the, I think, early two thousands called Heeb, written by Jews, and she dressed up as Hitler for one of the photo shoots, and she was baking little men in the oven. I found it on eBay. I wanted her to sign it, “Michael, it should have been you.” But she signed it, “Michael, you’re one smart cookie. Love, mom, Roseanne Barr.” I call her mom, and it hangs over my desk because I have her good domestic goddess energy flowing at me. What?
Lex Fridman
(00:16:52)
What do you find? What else? So Norm Macdonald. I guess we’ve landed on that.
Michael Malice
(00:16:56)
No. My favorite comedian is-
Lex Fridman
(00:16:56)
We agree on something.
Michael Malice
(00:16:57)
My favorite comedian of all time is Neil Hamburger. So Neil Hamburger, I don’t know if I’m ruining the bit, he’s a character performed by this guy named Gregg Turkington. So he comes out in a tuxedo, big eyeglasses, holding three glasses of water, coughing into the mic. And I remember I saw him once in LA and the girl ahead of me, at the table ahead of me, was with her boyfriend, this basic chick, pumpkin spice. She turns to him and she goes, “What is this?” And I remember the first time he was on Jimmy Kimmel, and he tells one of his jokes and it was like, “Why does E.T. love Reese’s Pieces so much? Well, that’s what sperm tastes like on his home planet.” And no one laughs. And he goes, “Oh, come on guys. I have cancer.” And it just cuts to this Marine in the audience with his arms crossed. So if you know what he’s doing, it’s just absolutely amazing.

(00:17:58)
He opened for Tenacious D once somewhere, I think in Ireland or the UK, one of those. And they’re booing him because his jokes are often not funny. He’s like, “Hey, where did my whore ex-wife run off to with that dentist she’s shacking up with? I don’t know. But when I see her in court next month, Alaska.”

(00:18:20)
So they’re booing and he goes, “All right, do you guys want me to bring out Tenacious D?” They’re like, “Yeah.” “Do you want to see your heroes of my Tenacious D?” “Yeah.” “Come on, let me hear it. Do you want to see Tenacious D?” “Yeah.” He goes, “All right, if I tell this next joke and you don’t boo me, I’ll bring out Tenacious D.” And it’s like, I’m trying to think of one that’s not too…
Lex Fridman
(00:18:44)
Self censorship is never good.
Michael Malice
(00:18:45)
Okay. He goes, “Can we agree that George Bush is the worst President America’s ever had?” Everyone claps. He goes, “Which makes it all the stranger that his son, George W. Bush was in fact the best.”
Lex Fridman
(00:18:58)
I take it back on the self-censorship.
Michael Malice
(00:19:01)
So two people laugh and he goes, “Oh, that’s amazing. I guess I’ll do an encore.” And he did 10 more minutes. It was just, I love him so much.
Lex Fridman
(00:19:09)
It’s interesting. They opened for Tenacious D. Jack Black, that’s a comedic genius of a different kind.
Michael Malice
(00:19:16)
Oh, yeah. And he was in one of my favorite movies, Jesus’ Son. It’s this little indie movie. He did a great turn in that. He’s really underrated as an actor. He’s got a lot of range. I know he kind of gets typecast as this one specific type, but he’s really, really talented.
Lex Fridman
(00:19:30)
But also just the pure joy.
Michael Malice
(00:19:32)
Yes. He’s clearly having fun.

Thanksgiving

Lex Fridman
(00:19:35)
Okay. It is Thanksgiving. So in the tradition, following tradition, what are you thankful for, Michael, in this world?
Michael Malice
(00:19:45)
Do you have a list too?
Lex Fridman
(00:19:46)
No, not really.
Michael Malice
(00:19:47)
Really?
Lex Fridman
(00:19:48)
It’s up in here.
Michael Malice
(00:19:49)
Oh, I mean, but you have several things you’re thankful for.
Lex Fridman
(00:19:51)
Yes.
Michael Malice
(00:19:52)
Okay.
Lex Fridman
(00:19:52)
Yes.
Michael Malice
(00:19:53)
One of the things I’m-
Lex Fridman
(00:19:54)
My list comes from the heart. I don’t have to write anything down.
Michael Malice
(00:19:56)
Well, I don’t have written down.
Lex Fridman
(00:19:57)
Okay.
Michael Malice
(00:19:58)
One of the things that I’m most thankful for, this is a common answer, but I can back it up, is my family. Because my nephew, Lucas, is now six years old. And when kids have a sense of humor, it’s just, like, miraculous. So he stole my sister’s phone, his mom’s phone. He figured out that grandma is listed as mom in the phone, and he calls her up and he’s like, “Michael’s in the hospital. He’s really sick.

(00:20:27)
He didn’t want to tell you.” And she’s freaking out. He goes, “Prank.” So I took him, Dinesh D’Souza just released a movie called Police State, which was actually really good, highly recommend it. I was surprised how much I liked it because he wasn’t going Republicans good, Democrats bad.

(00:20:41)
It was just about authoritarianism. And he had a movie premier at Mar-a-Lago. So I’m like, I got to bring Lucas to Mar-a-Lago. So Lucas is, I’m like, “We’re going to the President’s house.” He’s like, “Oh, the White House?” And I’m like, “No, no, a former president.” He goes, “Oh, Abe Lincoln?” And I’m like, “Okay, kid logic.” He’s giving logical answers. This is kind of like AI, you have to program it. It’s using logic correctly.
Lex Fridman
(00:21:04)
You should have told him it’s a president that’s second only to Abe Lincoln in terms of greatness.
Michael Malice
(00:21:11)
Accomplishments, yeah. He went up to all the women in their ball gowns, evening gowns, and he goes, “You’re so beautiful. Were you born as a girl?” So when you have this six year old asking you this, it was really, really fun. So that is a great joy, to have a nephew. And I have another one, Zach, who’s coming up in age, and he’s starting to talk now. That is really, really fun for me.
Lex Fridman
(00:21:39)
Getting to watch them find out about the world for the first time.
Michael Malice
(00:21:43)
And also training them, that he loves being funny and having fun.
Lex Fridman
(00:21:49)
You’re his audience in a sense?
Michael Malice
(00:21:51)
Yeah, but.
Lex Fridman
(00:21:52)
Because you giggle and?
Michael Malice
(00:21:54)
I give him, “We’re prank bros.” He gives me a high five. My family, and this is one, you talk about what I find funny, these are things that actually enrage me: when people, and this is such a WASP thing, don’t just go with the joke, or they’re like, “I don’t get it,” or they don’t understand to just go with it.

(00:22:10)
I was in the car with my sister when she was 10, 12, whatever. She’s much younger than me. She’s 12 years younger. And there’s this species of squid, by the way, which is asymmetric. One of its eyes is very much bigger than the other because it swims horizontally. And so one’s looking up, one’s looking down where there’s more light. Shout out. If you want to learn more about squids, go to octonation.com.
Lex Fridman
(00:22:32)
OctoNation. Shout out.
Michael Malice
(00:22:34)
Shout out to Warren.
Lex Fridman
(00:22:34)
There’s a lot of fascinating stuff. OctoNation on Instagram.
Michael Malice
(00:22:37)
Yes. I was in the car with my sister. She’s 10 or 12.
Lex Fridman
(00:22:40)
Me as a pirate, I’m sorry for the rude interruptions. I appreciate that comment, especially.
Michael Malice
(00:22:45)
Yeah, it’s a great. Yeah.
Lex Fridman
(00:22:46)
These jokes and thoughts are coming to me at a ten-second delay, so I apologize. Anyway, you were telling about the asymmetrical.
Michael Malice
(00:22:54)
I know where I was, don’t worry. I got it.
Lex Fridman
(00:22:54)
All right.
Michael Malice
(00:22:56)
So I tell my-
Lex Fridman
(00:22:56)
Sometimes you need help.
Michael Malice
(00:22:57)
No.
Lex Fridman
(00:22:57)
The age is getting to you.
Michael Malice
(00:22:57)
I was…
Lex Fridman
(00:23:02)
Your skin is showing it. It’s getting dark.
Michael Malice
(00:23:06)
I told my sister, I go, “When you were born, one of your eyes was bigger than the other, and you had to have surgery to fix it.” So she turns, she’s like, “Mom.” And my mom goes, “Honey, the important thing is that you’re beautiful now. What’s the big deal? It was just a little surgery.” So she’s like, “All right,” and calls grandma. She goes, “Michael said that when I was born, one of my eyes…” And grandma goes, “Why is he telling you this now? It’s not a big deal.” So the fact that everyone went with this…
Lex Fridman
(00:23:35)
Oh, nice.
Michael Malice
(00:23:36)
I was so impressed. I was like, “This is a quality family in this very specific regard.”
Lex Fridman
(00:23:41)
Yeah.
Michael Malice
(00:23:41)
Does your family have a sense of humor?
Lex Fridman
(00:23:43)
Yeah. Yeah. Yeah. Yeah. Yeah. Yeah.
Michael Malice
(00:23:43)
Yeah.
Lex Fridman
(00:23:47)
Soviet culture, there’s a dark sense of humor.
Michael Malice
(00:23:50)
Very much so.
Lex Fridman
(00:23:51)
There’s…
Michael Malice
(00:23:52)
Wordplay.
Lex Fridman
(00:23:52)
Wordplay. Yeah. Yeah. And especially the Russian language allows for some-
Michael Malice
(00:23:58)
Yes.
Lex Fridman
(00:23:59)
Hilarity to it. There’s also culture of poetry and my dad, my mom too, but they remember a lot of lines from books and poems. So you can do a lot of fascinating references that add to the humor and the richness of the conversation.
Michael Malice
(00:24:18)
I feel like that’s a very Russian thing. At a party or maybe at a bar or something, I don’t know where you’d meet people, these are such great ice-
Lex Fridman
(00:24:18)
I never go out.
Michael Malice
(00:24:25)
I meant in Russia.
Lex Fridman
(00:24:27)
Oh.
Michael Malice
(00:24:27)
I meant these would be such good icebreakers, right? You go up to someone and go, “Hey, did you hear this one?” [foreign language 00:24:32] And you just tell him some little story.
Lex Fridman
(00:24:34)
Did you say icebreakers because it’s cold in Russia? I’m here all night.
Michael Malice
(00:24:42)
That’s true. You never leave the house.
Lex Fridman
(00:24:42)
Literally.
Michael Malice
(00:24:46)
I feel like that’s a thing.
Lex Fridman
(00:24:47)
Yeah.
Michael Malice
(00:24:48)
And that’s not a thing in America.
Lex Fridman
(00:24:51)
You mean like witty banter?
Michael Malice
(00:24:53)
No. Meaning you go up to a stranger and that’s your icebreaker. You tell them this little joke, and since everyone kind of has the same sensibilities, right away, you guys are chatting. I don’t think that’s a thing here.
Lex Fridman
(00:25:02)
Yeah.
Michael Malice
(00:25:02)
I think here it’s more small talk, which drives me crazy.
Lex Fridman
(00:25:05)
So what else are you thankful for?
Michael Malice
(00:25:06)
Well, what’s something you’re thankful for?
Lex Fridman
(00:25:09)
Well, you went with family. I’m definitely thankful for family.
Michael Malice
(00:25:12)
Okay.
Lex Fridman
(00:25:12)
Yeah.
Michael Malice
(00:25:14)
If I may ask, how do they react to you? You’re sitting down with Elon, you’re sitting down with Netanyahu, sitting down with Kanye, all these big names. Are they expressing that they’re proud of you, or is it more like, why haven’t you talked to this person?
Lex Fridman
(00:25:30)
Yeah, more Michael Malice, please.
Michael Malice
(00:25:34)
The people’s choice.
Lex Fridman
(00:25:36)
Yeah, they’re very proud. But they get argumentative, and they’re just like any regular human being with whom I’m close, so we just argue about stuff. Maybe they don’t show the being-proud part enough, but that’s just the nature of our relationship. Is it the same with your parents?
Michael Malice
(00:25:56)
Yeah. I don’t talk to my dad. That’s one of the reasons: there’s never, ever any “good job.” And at a certain point it’s like, why am I trying to search for approval from someone I’m never going to get it from? And from whom it wouldn’t mean anything at this point anyway.
Lex Fridman
(00:26:14)
Well, that’s interesting. There’s a journey like that for a lot of people with their father or their mother. They’re always trying to find approval, and that’s life for a lot of people. That’s a really big part of the human condition is that relationship you have with your father, with your mother. I don’t know. It’s a beautiful thing whether it’s been a rough childhood or a beautiful one, all of it. That’s who you are. The relationship, especially early on in your life with your father or with your mother, is extremely formative.
Michael Malice
(00:26:48)
Yeah. My dad taught me a lot of things at a young age that I’m very, very grateful for. He’s extremely intelligent, very flawed, and that’s fine. We all are, except for me. And it’s the kind of thing where you learn things at the right age, and this is one of the things I like about being older, is that when I’m friends with people-
Lex Fridman
(00:26:48)
Much older.
Michael Malice
(00:27:09)
Much older, much older. When I have friends who are younger, it’s very easy for me to keep them from making the mistakes I did. So at least this is something I’m getting out of it: okay, I can’t fix these mistakes, but it just takes me 30 seconds and I can pull you back from making the mistake. So he taught me a lot as a kid. He really encouraged me very much to… He has a very good sense of humor, and also a very bad one in some ways. Dad jokes, but also really funny jokes. But also this love of learning, I got that from him. And I’ve got literally right now 98 books on my shelf to read. I remember I had a friend, and she ran into someone she went to high school with, and he stopped me on the train and he’s like, “Yo, you’re not in college. You don’t need to read books anymore.” And I was just horrified to hear this.
Lex Fridman
(00:28:00)
Yeah, yeah. Boy, don’t I know it.
Michael Malice
(00:28:07)
You do laugh, but there’s a lot of things I don’t understand. When you got heat for “I want to read the Western classics,” to me, that might’ve been the internet at its absolute worst.
Lex Fridman
(00:28:21)
I think there’s just a cynical perspective you can take, that this is such a simple celebration of a thing that there must be something behind it. I think the internet, for good and bad, is just skeptical. What’s behind this?
Michael Malice
(00:28:36)
My hero, Albert Camus. And if there’s one thing I would want to fight, it’s cynicism because it’s such a giving up. It’s such, everything sucks, this sucks, this sucks. Most things suck. Most stand up comedians suck. Most movies suck. All podcasts suck. But it doesn’t matter.
Lex Fridman
(00:28:54)
Especially yours.
Michael Malice
(00:28:55)
Especially mine. It’s unwatchable.
Lex Fridman
(00:28:58)
You’re welcome. You can’t even spell it correctly.
Michael Malice
(00:29:03)
But the stuff that’s good is what matters. Who cares if 90% of movies are terrible? They’re the ones that change your life, the books, the people, the comedians, the shows, the music.
Lex Fridman
(00:29:17)
And even the terrible things have good moments, beautiful moments.
Michael Malice
(00:29:22)
Some, not all.
Lex Fridman
(00:29:23)
Your podcast being an example of not all. I keep listening for something good, something good.
Michael Malice
(00:29:31)
In all fairness, none of my guests have anything to offer, so that’s not on me. I try.
Lex Fridman
(00:29:37)
Yeah. Well, I wish you’d talk a little less in your podcast. It’s a little excessive. I only listen for the underwear commercials.
Michael Malice
(00:29:46)
Sheathunderwear.com. Promo code Malice.
Lex Fridman
(00:29:48)
I haven’t seen you do it in a while, but this kind of commentary on a debate or I think it was with Rand, like an Ayn Rand debate or something.
Michael Malice
(00:30:00)
Oh yeah. Malice at the Movies. I watched the video and I broke it down.
Lex Fridman
(00:30:03)
That was really great. I wish you did that more.
Michael Malice
(00:30:05)
I haven’t done livestreaming in a long time. It was something I was doing a lot in New York, especially during COVID. I don’t know, I’ve got so many projects on my plate. Oh, this is something else I’m thankful for. This is something I’m very, very thankful for, and I’m going to announce it here.
Lex Fridman
(00:30:26)
Coming out of the closet, finally. Go ahead. Who’s the lucky guy?
Michael Malice
(00:30:38)
You’re the one in drag.
Lex Fridman
(00:30:42)
Guns out. Guns out.
Michael Malice
(00:30:45)
He makes me call him Sex Friedman.
Lex Fridman
(00:30:48)
You like it.
Michael Malice
(00:30:50)
I didn’t say I did.
Lex Fridman
(00:30:52)
All right.
Michael Malice
(00:30:52)
Didn’t even imply that. As you know, but as many people watching this also know, Harvey Pekar, who had the comic book series American Splendor and was the subject of the movie American Splendor, wrote a graphic novel about me in 2006 called Ego and Hubris, which goes for like $150 on eBay. It’s not worth it, just download it. And I met Harvey because I wrote this screenplay about this band from the ’80s called Rubber Rodeo. It’s a real band. And the keyboardist, Gary Leib, who passed away, rest in peace, Gary, introduced me to Harvey because he did the animation for the movie. And this script’s been in my desk for over 20 years, and I realized, thanks to my buddy Eric July, who has had some huge success with his comics, that I could just produce this as a graphic novel.

(00:31:43)
So I’ve got an artist, we’re getting it together, so I’m going to make it happen finally. And it’s some of the best writing I’ve ever done. I’m really proud of the story. It’s ironic reading it now, because when you’re a writer, obviously different books, you put different aspects of yourself into them, and this story is very, very dark because basically they did all the right things and they went nowhere. What I realized was reading it now, that all these fears I had over 20 years ago about what if I’m not going to make it? What if I’m doing all the hard work and it’s still not enough? Now it’s been disproven because I can at least pay my rent.
Lex Fridman
(00:32:22)
Do you feel like you’ve made it? Because you said you could pay your rent.
Michael Malice
(00:32:26)
I feel that to make it is when you don’t have to have a boss. And you know how I really felt like I made it?
Lex Fridman
(00:32:36)
Mm-hmm.
Michael Malice
(00:32:37)
This is going to sound like a joke, and it’s not. This is being an immigrant: I own, as you know, Margaret Thatcher’s bookcases.
Lex Fridman
(00:32:45)
Yes.
Michael Malice
(00:32:45)
So to me as an immigrant, to have her bookcases in my house, I’ve made it.
Lex Fridman
(00:32:51)
You’re right. It’s not a joke.
Michael Malice
(00:32:53)
There’s nothing funny about it at all.
Lex Fridman
(00:32:55)
Not laughing.
Michael Malice
(00:32:55)
It’s time to get serious.
Lex Fridman
(00:32:59)
Oh, nice. Oh, now I’m more nervous and aroused. So what else are you thankful for? So we’re both thankful for family.
Michael Malice
(00:33:10)
The fact that I can-
Lex Fridman
(00:33:11)
Still get it up?
Michael Malice
(00:33:12)
What’s that?
Lex Fridman
(00:33:13)
Nothing, go ahead.
Michael Malice
(00:33:14)
I think as an author, to be able to write what you want and have enough of an audience that it covers your living, that’s about as good as it gets as an author. You don’t need to be Stephen King or some legend. There’s lots of stand-ups who aren’t world famous, but they make a perfectly good living. They do their gig, they do what they love. I feel very, very blessed. You must be thankful for your career?
Lex Fridman
(00:33:43)
Yeah, yeah. Career-wise. But I think the best part about it is just making friends with people I admire.
Michael Malice
(00:33:52)
Okay.
Lex Fridman
(00:33:53)
Quite honestly, just friends. The people that have gotten to know me. I hide from the world sometimes, I hit some low points, especially with all the new experiences, and it’s just the people that have been there for me and haven’t given up on me.
Michael Malice
(00:34:06)
There’s days, and I’m sure you’ve had this also where I literally don’t speak to someone the whole day. And in certain times in my life, I remember very vividly, I was in DC in ’97, I was an intern, and that summer, DC closes down on the weekends. And I remember those weekends when I got off the phone with the third person. I knew there was no possibility anyone was going to call and what that felt like, and it was dark and it was bad. So I remember those feelings of loneliness a lot.
Lex Fridman
(00:34:44)
I still feel alone like that sometimes. You don’t feel alone?
Michael Malice
(00:34:51)
Not anymore.
Lex Fridman
(00:34:53)
What’s the reason, you think?
Michael Malice
(00:34:57)
Because I have a lot of people who I care about and who care about me. The thing about moving to Austin is I forgot how lonely New York got because it was like one after another, I lost everybody. And then you start losing the places you go to, and then it was just like, “Holy crap. I’m very isolated.” And here in Austin, there’s not as much to do, obviously as in New York, but there’s a lot of people here. More people are coming all the time. So if I ever want to hang out with someone, I’ve got a long list. And these are people who I’ve known for a very long time, people who know me quite well, so I could be myself. My awful, awful, awful, awful self. And that is something I don’t take lightly.
Lex Fridman
(00:35:42)
Now you moved to Texas, it’s going to secede.
Michael Malice
(00:35:44)
Yeah.
Lex Fridman
(00:35:44)
It’s just a very-
Michael Malice
(00:35:46)
Do you know what happened with that?
Lex Fridman
(00:35:47)
No.
Michael Malice
(00:35:48)
I forget the guy’s name, and it’s probably for the best. On Monday, a guy in the Texas legislature introduces a bill to put a referendum on the ballot for Texas to declare its independence. Tuesday, I’m on Rogan. Me and him discuss it. I give it national attention. It was also really funny because a lot of people are like, “These people have been in Texas five minutes, blah, blah.” I go to the Texas legislature, meet with the guy, have a nice conversation. A month or two later, unanimously I think, he gets voted out of Congress because he got an intern drunk and was inappropriate with her. At least it was a girl in this case. But yeah, so that was my little Texas independence moment.
Lex Fridman
(00:36:36)
Oh, it didn’t go anywhere?
Michael Malice
(00:36:38)
It did not go anywhere.
Lex Fridman
(00:36:39)
Wow.
Michael Malice
(00:36:41)
But it’s still part of the platform of the Texas Republican Party.
Lex Fridman
(00:36:45)
It’s fascinating that history is probably laden with stories like this of failed revolutionaries. We celebrate the heroes, but then there’s the losers like…
Michael Malice
(00:36:55)
Myself.
Lex Fridman
(00:36:56)
Yeah.
Michael Malice
(00:36:56)
Yeah.
Lex Fridman
(00:36:58)
And we’re going to mark that one as a failure, edit it out, and move on. So thankful. Friendships, right? But, by the way, I want to say, just to you: I’m thankful in these lonely moments for people who write books. I’ve been listening to audiobooks a lot and reading a lot. I really like audiobooks, actually. I don’t know, I can just name a random person, Serhii Plokhy. He’s a historian I’m reading on the-
Michael Malice
(00:37:28)
Wait, I read him. What did he…
Lex Fridman
(00:37:29)
It’s just he’s written a book most recently about the Russia-Ukraine war.
Michael Malice
(00:37:35)
He wrote another one that I read. Didn’t he write about-
Lex Fridman
(00:37:37)
Empires, I think.
Michael Malice
(00:37:38)
The fall of the Soviet Union or something like that.
Lex Fridman
(00:37:38)
Yeah, yeah.
Michael Malice
(00:37:40)
Yeah. It was very, very good.
Lex Fridman
(00:37:41)
He’s great.
Michael Malice
(00:37:42)
I used him as a resource for the White Pill.
Lex Fridman
(00:37:44)
He’s objective while still having emotion and feeling to it. He has a bias.
Michael Malice
(00:37:49)
That’s fine.
Lex Fridman
(00:37:53)
A lot of times when you write a story that involves Putin, people are really ideological. They don’t write with the calmness and the clarity and the rigor of history; there’s emotion in it. There’s almost a virtue signaling. And he doesn’t have that, even though he is Ukrainian and has very strong opinions on the matter. Anyway, there’s people like that, and he’s done an incredible job researching a recent event. Like he says, I was looking at everything that’s been written about the war in Ukraine and realizing the old Churchill line, that historians are the worst ones to write about current events, except everybody else. And so he’s like, “I might as well just write about this war.” And he does an exceptional job summarizing, day by day, the details of this war. Anyway. So I’m just grateful for a guy like that.
Michael Malice
(00:38:50)
For me, I’ll name some historians I love. Arthur Herman; Victor Sebastyen is probably my favorite; David Pietrusza, P-I-E-T-R-U-S-Z-A. When you are a historian, and I tried to do this to some degree in the White Pill as much as I could, when you take data and you make it read like a novel, so you’re learning about who we are as people, what had happened, but also it’s entertaining and readable, that to me is like the Acme of writing. I have so much admiration-
Lex Fridman
(00:39:25)
What does Acme mean?
Michael Malice
(00:39:27)
Top.
Lex Fridman
(00:39:27)
Okay.
Michael Malice
(00:39:28)
Zenith.
Lex Fridman
(00:39:29)
Zenith? Okay. Is this what writers do? They just come up with these incredibly sophisticated words? I’m impressed.
Michael Malice
(00:39:35)
Well, Acme is-
Lex Fridman
(00:39:35)
Because you could have just said the best of writing.
Michael Malice
(00:39:38)
Acme is also the company in Bugs Bunny and Wile E. Coyote is always Acme, like Acme bombs. When they are that good, it leaves me in awe.
Lex Fridman
(00:39:52)
It’s just-
Michael Malice
(00:39:53)
Ron Chernow is another one.
Lex Fridman
(00:39:54)
Who?
Michael Malice
(00:39:55)
He wrote the Hamilton biography.
Lex Fridman
(00:39:57)
Oh, nice. Well, I have a lot of favorite historians about the whole time period of World War II, William Shirer, people that lived during it, especially. I really like those accounts. Obviously Solzhenitsyn, he’s not a historian, but his accounts are fascinating. Actually, how much do you talk about Solzhenitsyn?
Michael Malice
(00:40:20)
Never.
Lex Fridman
(00:40:21)
Not much, right? Why not?
Michael Malice
(00:40:24)
I feel like I wanted to. There’s nothing I could add to him.
Lex Fridman
(00:40:30)
But he’s the Michael Malice of the previous century?
Michael Malice
(00:40:34)
No, he’s talented, charismatic, and skilled. So he’s not the Michael Malice. Yeah. I feel like I didn’t read Gulag Archipelago for the White Pill.
Lex Fridman
(00:40:47)
You didn’t?
Michael Malice
(00:40:48)
I didn’t. No. I got a lot of it from Anne Applebaum, who’s a very controversial figure. Her history books on the Soviet Union, I think are superb, but she’s also accused of being very much a NeoCon and being a warmonger in contemporary times.
Lex Fridman
(00:41:02)
Oh, I see.
Michael Malice
(00:41:02)
And I think comparisons between Putin and Stalin, although there is a Venn diagram, I think are a bit much, because I think it’s very hard to claim that if Putin conquered Ukraine, that there’d be a genocide. I think that’s a very hard argument to make.
Lex Fridman
(00:41:19)
In these tense times. Even the comparisons of what’s going on in Israel on either side, comparisons to the Holocaust are also troubling in this way.
Michael Malice
(00:41:28)
Yes. And I also don’t like how that… I got in trouble. There was some literal demon who works at the Atlantic.
Lex Fridman
(00:41:36)
As opposed to a regular demon?
Michael Malice
(00:41:38)
As opposed to figurative demon.
Lex Fridman
(00:41:40)
I didn’t know they employed demons.
Michael Malice
(00:41:41)
They exclusively employ demons at the Atlantic. And he was giving me crap a couple of years ago on Twitter because I didn’t think it’s appropriate to refer to George Soros as a Holocaust survivor. And I’m like, “Listen, if you want to put him in the same context as Anne Frank, knock yourself out.” But I think that’s so completely disingenuous and frankly repulsive to me morally to equivocate between figures like that. And also the claim that anyone who is a billionaire, and that includes Elon, includes Sheldon Adelson, includes Bill Gates, there’s no shortage of these people: if you want to use your extreme wealth to influence politics, you have to be up for criticism. To protect these people from criticism just on the basis of their identity is deranged to me.
Lex Fridman
(00:42:35)
But also, the Holocaust as a historical event and the atrocities within it are just singular in history. And so comparing them…
Michael Malice
(00:42:47)
What’s the utility? You’re just basically trying to take this brand. I’m using that term in a very specific way. And when they say climate denial, no one’s denying climate exists. So you’re just trying to go off Holocaust denial. I think it’s shameless and I think it’s gross.
Lex Fridman
(00:43:04)
And it cheapens everything because there’s deep important lessons about the Holocaust.
Michael Malice
(00:43:09)
Yes.
Lex Fridman
(00:43:10)
To me, the lessons are about how extreme it can get.
Michael Malice
(00:43:15)
And how fast.
Lex Fridman
(00:43:17)
Yeah, and how fast.
Michael Malice
(00:43:17)
That’s the one. So people ask, “Oh, are humans basically good? Are they basically evil?” I always say they’re basically animals. And I think most people are almost fundamentally deranged. And that there’s basically this veneer of civilization and decency. And when shit hits the fan and we see this over and over, they do things that would’ve been completely unthinkable even to themselves five years ago.
Lex Fridman
(00:43:46)
Most people are fundamentally deranged with a veneer of civility.
Michael Malice
(00:43:50)
There’s a show called-
Lex Fridman
(00:43:51)
I think I disagree with that.
Michael Malice
(00:43:53)
What’s the show called? I’m having Alzheimer’s because of the advanced age.
Lex Fridman
(00:43:58)
The age, the skincare. It’s just working well.
Michael Malice
(00:44:00)
There’s a show called, I Think You Should Leave. It’s a sketch comedy.
Lex Fridman
(00:44:00)
I think you should leave. Okay, sorry.
Michael Malice
(00:44:04)
It’s a sketch comedy show. And he captures these great… How’s your hair, princess? He captures these great moments of just the very thin veneer of normalcy and just the craziness that’s so frequently lurking underneath. Another great example of this, when this is dealing with people who are literally crazy, have you ever seen the show, Hoarders?
Lex Fridman
(00:44:27)
Yeah.
Michael Malice
(00:44:27)
So every episode of Hoarders, there’s usually two people in every episode, but every episode has the same plot line, veneer of normalcy, veneer of normalcy, veneer of normalcy, slight expression of concern, full-blown derangement. And it always follows that exact pattern.
Lex Fridman
(00:44:44)
Yeah, I don’t know. I think the deep ocean of the human mind is good. There’s a longing to be good to others.
Michael Malice
(00:44:56)
I have seen literally no evidence of this. And I know everything’s a deep ocean with you people, but-
Lex Fridman
(00:45:01)
What do you mean you people?
Michael Malice
(00:45:02)
Pirates.
Lex Fridman
(00:45:04)
Oh.
Michael Malice
(00:45:05)
I don’t see it.
Lex Fridman
(00:45:05)
What’s that, Mr. Parrot? He’s an antisemite? No, that’s not nice to say in front of such a large audience. You’re embarrassing me, Mr. Parrot.
Michael Malice
(00:45:18)
Lex, you have-
Lex Fridman
(00:45:19)
What’s that Mr. Parrot? He’s a run-of-the-mill troll and barely an intellectual. That’s not nice to say. That’s not true. We talked about this. You have to see the good in people.
Michael Malice
(00:45:31)
You have seen personally how quick and easy it is for human beings to form outgroups and to just read others, as I just did a minute ago with the Atlantic, completely out of the human race. And that happens constantly and very easily. Humans are tribal beings. I don’t see how that’s compatible with this essential desire to do good.
Lex Fridman
(00:45:58)
No, I think it’s like in 1984, the two minutes of hate. There is a part of humans that wants to be tribal and wants to get angry and hateful. And then that hate is easy to direct. Especially, as you, as an anarchist, talk about, there are people in power that direct that anger.
Michael Malice
(00:46:20)
Yes.
Lex Fridman
(00:46:21)
But I think if you just look at recent human history, the desire for good, the communal desire for good, outweighs that, I think. For most of life on earth right now, people are being good to each other in the most fundamental sense, relative to how nature usually works.
Michael Malice
(00:46:40)
Okay. I think you’re both wrong about people and about nature. So nature is not inherently violent in that sense. For example, if anyone has an aquarium, or if you look at wildlife, yeah, you’re going to have predators and prey, but these animals are going to be coexisting and they’re going to be ignoring each other for the most part, right?
Lex Fridman
(00:46:40)
Mm-hmm.
Michael Malice
(00:46:59)
And as for humans being essentially good to each other, as you said, I think they’re essentially civil and amiable, but that’s not really being good.
Lex Fridman
(00:47:12)
Good, I think is a thing that gets illustrated when you’re challenged, when there’s difficult situations.
Michael Malice
(00:47:17)
Yes, exactly. Yes.
Lex Fridman
(00:47:18)
Civility is a good starting point. And then when there’s a big challenge that comes, people step up on average.
Michael Malice
(00:47:26)
I completely agree with you that human beings are capable of such profound goodness that it makes you extremely emotional. And I certainly think that’s true, but I think that’s more the exception than the norm.
Lex Fridman
(00:47:42)
I see beauty everywhere.
Michael Malice
(00:47:43)
So do I, but that doesn’t mean it’s in every person.
Lex Fridman
(00:47:46)
Not in every person, but in most people. I wish there was a really good way to measure this, my general sense of the world. There’s just so much that’s incredible, in terms of economics, in terms of art, in terms of just creation as a whole, that’s happened over the past century, that it feels like the good is overpowering the bad.
Michael Malice
(00:48:09)
You just did the perfect segue to the box.

Unboxing the mystery box

Lex Fridman
(00:48:16)
What’s in the box? Is it your fragile ego?
Michael Malice
(00:48:21)
You stole my joke. You stole my joke. That was the joke I made at you before we recorded. You stole my joke.
Lex Fridman
(00:48:28)
No, I didn’t. I write all your material, you hack.
Michael Malice
(00:48:33)
So as you know, I have a lot of beautiful stuff in my house because I think it’s something very important. Everyone listening: if you accomplish something that is great, some achievement, what I like to do is buy myself something to remember that moment. Because sometimes when it’s hard, you forget you’ve done great things in your life, you’ve made accomplishments. It doesn’t have to be some amazing victory. It could just be like my first job, or I got a raise, or you know what? Anything. So there’s this amazing sculptor named Jake Michael Singer, a singer who’s a sculptor, and I saw a piece of his.
Lex Fridman
(00:49:18)
How’s his singing voice? This joke’s not going-
Michael Malice
(00:49:23)
Hold on. I could go somewhere with this.
Lex Fridman
(00:49:24)
Okay.
Michael Malice
(00:49:25)
How’s his singing voice?
Lex Fridman
(00:49:26)
Do you want me to write your joke for you?
Michael Malice
(00:49:27)
Yeah. What’s the punchline? Harrrd. There it is, that’s the one.
Lex Fridman
(00:49:33)
That’s what she said.
Michael Malice
(00:49:34)
So I followed him on Instagram, he followed me back, and he says, “What’s the point of being an artist if the work I create isn’t in the spaces of people I like and admire?” He’s a big fan of yours. You’ve given him, and our episodes together give him, joy. So he said, “If I make Lex a sculpture, will he put it on the shelf behind him?” And what that reminded me of is when I was a kid, you read Batman comics and there’s the Bat Cave. And the Bat Cave has all this cool stuff in it. I didn’t realize until much later that all of those things in the Bat Cave had an origin story. So the giant penny, the dinosaur, there was actually a story where that came from. So if you’re a fan of a show, you can spot, oh, this is when this appeared, this is when that appeared. So he made you this sculpture. He lives in Turkey and it’s called Chance Murmur. And it is, I haven’t even seen it yet, it is absolutely beautiful.
Lex Fridman
(00:50:42)
So you want to do a little unboxing?
Michael Malice
(00:50:42)
Yes.
Lex Fridman
(00:50:44)
Okay. Axe or…
Michael Malice
(00:50:49)
Body spray?
Lex Fridman
(00:50:54)
All right.
Michael Malice
(00:50:54)
Let’s do it.
Lex Fridman
(00:50:55)
Let’s unbox.
Michael Malice
(00:50:59)
I’m so excited. He lunges out of the box.
Lex Fridman
(00:51:04)
You know that Steven Seagal movie where there’s a stripper that comes out of the box?
Michael Malice
(00:51:07)
Is there?
Lex Fridman
(00:51:08)
Under Siege.
Michael Malice
(00:51:09)
Okay.
Lex Fridman
(00:51:09)
He’s on a boat. You’re not an action film guy.
Michael Malice
(00:51:14)
No.
Lex Fridman
(00:51:19)
One.

(00:51:21)
What does the pirate say when he turns 80?
Michael Malice
(00:51:24)
What?
Lex Fridman
(00:51:25)
Aye matey.
Michael Malice
(00:51:29)
Aye matey. Oh.
Lex Fridman
(00:51:32)
Oh.

(00:51:33)
See, that’s how I know you don’t like humor.
Michael Malice
(00:51:35)
I just don’t like pirates.
Lex Fridman
(00:51:37)
Well, your mom does.

(00:51:39)
Do you play any musical instruments?
Michael Malice
(00:51:40)
No. Neither do you. I’ve seen your guitar videos.

(00:51:46)
Okay.
Lex Fridman
(00:51:48)
Here’s a big piece of wood for you. That’s what it feels like, just so you know.
Michael Malice
(00:51:57)
Oh, wow. Do you need help?

(00:51:57)
Oh my God.
Lex Fridman
(00:52:00)
This traveled across the world.
Michael Malice
(00:52:05)
So here’s why his work speaks so much to me. First of all, he’s combining so many different references. It’s Nike, the Goddess of Victory, right? It looks like an angel as well. The Italian futurists, which is my favorite art movement from the early 20th century, tried to capture motion in 2D or 3D form.
Lex Fridman
(00:52:31)
Well, Jake, thank you, thank you, thank you. Thank you for creating beautiful things. Thank you for caring about somebody like me and somebody like Michael. We really feel the love.
Michael Malice
(00:52:43)
That’s the other thing.
Lex Fridman
(00:52:43)
Thank you.
Michael Malice
(00:52:45)
When you have something that matters to you in your house and you’re having a bad day, you can look at it and remember. You know what I mean? That spirit of joy. And I actually have a list here. Okay? I’ve got a little rant ready. Do you want to hear my rant?
Lex Fridman
(00:53:00)
Yeah. Let’s go.
Michael Malice
(00:53:02)
One of the things that drives me crazy is when people, especially conservatives, think that all contemporary art is ugly or abstract or literally garbage. And there’s a lot of that, but so much of the stuff out there in galleries is not only not crazy expensive, they’re trying to sell things for people to put in their houses. And these are young artists. They’re trying to add beauty. I have a list, so if you don’t believe me and you think all contemporary art is garbage or terrible, go to the websites of any of these places that I’m going to rattle off and look through them. And you’re telling me that it’s not about creating beauty and joy and things in people’s lives?

(00:53:40)
So I don’t have any relationship with any of these people, these are just some galleries I follow on Instagram. Outre Gallery, Antler Gallery, Giant Robot 2, Beinart, I don’t know how to pronounce it, I’m sorry. B-E-I-N-A-R-T. Spoke Art Gallery, Var Gallery in Milwaukee, I was there. The pieces were not expensive at all.
Lex Fridman
(00:53:58)
What kind of art are we talking about? Everything? Paintings?
Michael Malice
(00:54:00)
Mostly paintings. Mostly paintings. Some sculptures too, like this. Corey Helford is my favorite one in LA. Night Gallery, Vertical Gallery, Avant Gallery, Hive Gallery, Haven Gallery, and Curio Art Gallery. I’m telling you, it’s not exorbitant. This is not the kind of thing where you have to go to a museum and be like, “This doesn’t make sense to me.” You look at it right away, you’re like, “Okay, I know what this is.” And it’s beautiful. It’s awesome. And you’re supporting someone who’s young and creative trying to do something and make the world a better place.

(00:54:31)
So I’m a big fan of the contemporary art scene. A lot of it is not great, but even the stuff that’s not great is very rarely disgusting or gross. It’s just like, okay, I’ve seen this before, or something like that.
Lex Fridman
(00:54:43)
Okay.
Michael Malice
(00:54:44)
It’s like the difference between, there’s a standup where I’ll pay money for the ticket, and someone who’s an opener. It’s like, I wouldn’t pay to see him perform, but he sure still made me laugh. That person is still by far more good than bad. So a lot of this art isn’t stuff I would own, but it’s like, okay, I get it. I like it.
Lex Fridman
(00:55:01)
Well, as the analogy goes, I really like going to open mics, actually, because funny… It sounds absurd to say, but funny isn’t the only thing that’s beautiful about standup comedy, it’s the…
Michael Malice
(00:55:14)
The agony.
Lex Fridman
(00:55:17)
It’s going for it. It’s trying to be funny. It’s taking the leap, trying the joke. And some of the best stuff is actually funny, but the audience is like three people, two of whom are drunk and bored, and you’re still going for it. And that’s the human spirit right there.
Michael Malice
(00:55:35)
Roseanne was telling me how Gilbert Gottfried would go on, it was like 3:00 in the morning. And it was her and three other comics in the audience and they all were just dying.
Lex Fridman
(00:55:46)
Yeah.
Michael Malice
(00:55:46)
He was just killing them. Who’s your favorite comedian?
Lex Fridman
(00:55:53)
Dave Smith.
Michael Malice
(00:55:54)
Who?
Lex Fridman
(00:55:56)
And cut scene. Favorite comedian. First, Norm Macdonald. If you put a gun to my head and I had to answer really quickly, that would be him.
Michael Malice
(00:56:04)
Okay.
Lex Fridman
(00:56:07)
I would also say Louis C.K.
Michael Malice
(00:56:09)
Oh, wow. Yeah. Oh my God, yes.
Lex Fridman
(00:56:12)
But that’s almost like a vanilla answer at this moment in history because it’s like a-
Michael Malice
(00:56:16)
Louis C.K.’s pretty radioactive.
Lex Fridman
(00:56:18)
He is. Well, yeah. He does the tough topics-
Michael Malice
(00:56:21)
Sure.
Lex Fridman
(00:56:22)
… the best. Mitch Hedberg. The wit of a good one-liner is great. I guess that’s what Norm Macdonald was a genius at. What about you?
Michael Malice
(00:56:33)
I mean, we’re so fortunate to be here in Austin because that Comedy Mothership, you go there and people are just killing it. David Lucas is amazing.
Lex Fridman
(00:56:43)
Yeah, he’s great.
Michael Malice
(00:56:43)
Thai Rivera probably did the best set I’ve seen since I’ve been here in Austin. And I watched him and I’m like, “This guy’s even bitchier than I am.” So I reached out to him. So he’s just terrific. David Lucas is another one, a buddy of mine.
Lex Fridman
(00:56:57)
You just said it twice, I think. David.
Michael Malice
(00:57:00)
I’m thinking of Dave Landau, excuse me.
Lex Fridman
(00:57:01)
Yeah.
Michael Malice
(00:57:01)
Dave Landau. Joe Machi is-
Lex Fridman
(00:57:04)
Old age catching up.
Michael Malice
(00:57:04)
It’s true though.
Lex Fridman
(00:57:05)
It’s true.
Michael Malice
(00:57:06)
It’s true.
Lex Fridman
(00:57:07)
It’s true.
Michael Malice
(00:57:08)
Dave Lucas.
Lex Fridman
(00:57:10)
You ever been to the Comedy Mothership? It’s a great spot.
Michael Malice
(00:57:13)
Where is that? Is that in Austin?
Lex Fridman
(00:57:15)
Austin? Is that where Willie Nelson is from? I haven’t really… Go ahead, I’m-
Michael Malice
(00:57:19)
Oh, I heard a joke about that the other week.
Lex Fridman
(00:57:23)
Go ahead. Tell a joke again.
Michael Malice
(00:57:25)
What’s the only thing worse than giving head to Willie Nelson?
Lex Fridman
(00:57:31)
What?
Michael Malice
(00:57:32)
If he says, “I’m not Willie Nelson.”
Lex Fridman
(00:57:36)
What’s that, Mr. Parrot? I know he’s not funny. He thinks he’s better on Twitter. But that’s not nice to say, and right in front of his face. Just think how he feels.
Michael Malice
(00:57:49)
The statue, Chance Murmur is judging you.
Lex Fridman
(00:57:52)
Chance?
Michael Malice
(00:57:52)
It’s called Chance Murmur.
Lex Fridman
(00:57:54)
Chance Murmur.
Michael Malice
(00:57:55)
God, that’s so beautiful.
Lex Fridman
(00:57:56)
That is gorgeous.
Michael Malice
(00:57:58)
This is another reason I hate cynicism, and I talk about this a lot. Even just on Etsy, there are so many small, not huge companies, individual artisans who are creating great stuff and just making it happen. And it’s really sad for me where people can’t see that. Or if they’re like, “Well, how could I be excited about a sculpture when blah, blah, blah, the Middle East?” And it’s just like, you can always look for an excuse not to look for joy, or you could look for an excuse to look for joy.
Lex Fridman
(00:58:26)
Yeah. Etsy is incredible. I feel the same way about-
Michael Malice
(00:58:28)
OnlyFans?
Lex Fridman
(00:58:30)
… OnlyFans. I can’t even get that out of my mouth before laughing at my own failed joke.
Michael Malice
(00:58:34)
That’s what she said.
Lex Fridman
(00:58:36)
Oh, all right. That might be one of the first that’s what she said from Michael Malice.
Michael Malice
(00:58:43)
Yeah.
Lex Fridman
(00:58:43)
I’m going to count that.
Michael Malice
(00:58:48)
I don’t know what I’m going to do with mine, because I got my own. Mine’s three feet tall, just like me.
Lex Fridman
(00:58:52)
Your box was much bigger.
Michael Malice
(00:58:53)
Yeah.
Lex Fridman
(00:58:54)
And it was giving me an inferiority complex. I think I’m going to invade Russia. That’s a Napoleon reference for those in the audience.
Michael Malice
(00:59:09)
I don’t know if I’m going to… I think I’m going to put it in my bedroom so it’s the first thing I see when I wake up.
Lex Fridman
(00:59:13)
Put it in the bedroom.
Michael Malice
(00:59:13)
Yeah.
Lex Fridman
(00:59:16)
Did we get through everything we’re thankful for?
Michael Malice
(00:59:19)
No, I’ve got lots of things I’m thankful for.
Lex Fridman
(00:59:20)
What else? Friends, family. We said books.
Michael Malice
(00:59:26)
I’m thankful for career. I am thankful for… And I know people are going to lose their minds and I can hear them flipping out already. I am thankful for social media.
Lex Fridman
(00:59:40)
Yeah.
Michael Malice
(00:59:41)
I’m thankful for several reasons. First, it is a way for people to make connections that they couldn’t have made in years past. If you’ve got some weird hobby, you can find that other person with your weird hobby and you make that connection. It’s a great way to stay in touch permanently with people you’d otherwise lose touch with, you know, from whatever venue. And it’s also a great way to expose corporate depravity, when you have these organizations that are dishonest. I think the community notes thing on Twitter is the greatest thing ever.
Lex Fridman
(01:00:09)
Yeah, it’s incredible. I wish they would pay attention to the Michael Malice account more often.
Michael Malice
(01:00:15)
You shouldn’t be encouraging anyone to pay attention to my Twitter account.
Lex Fridman
(01:00:18)
Yeah.
Michael Malice
(01:00:18)
It’s a dumpster fire. And I don’t mean Bridget, I mean like a literal… Bridget Phetasy.
Lex Fridman
(01:00:23)
Oh, Bridget, by the way, is amazing. But your Twitter account makes-
Michael Malice
(01:00:23)
She lives here.
Lex Fridman
(01:00:25)
Yes. Not here. I wish she did.
Michael Malice
(01:00:29)
She’s in Georgetown.
Lex Fridman
(01:00:30)
No, I mean in this, where we’re sitting.
Michael Malice
(01:00:32)
Oh.
Lex Fridman
(01:00:32)
It’s a joke, Michael.
Michael Malice
(01:00:33)
Is it?
Lex Fridman
(01:00:34)
Yeah.
Michael Malice
(01:00:34)
But I’m just really glad about… It’s another way for people who before would’ve felt very alone. I know some people do feel alone, but for other people it makes them feel connected.
Lex Fridman
(01:00:46)
There’s been a lot of talk about antisemitism recently.
Michael Malice
(01:00:49)
Yeah.
Lex Fridman
(01:00:49)
What’s your sense about this? Is antisemitism like any other brand of hate? There’s a lot of hate out there.
Michael Malice
(01:00:57)
No, I don’t think it’s like any other brand of hate, because I don’t think racists or transphobes or homophobes or misogynists or xenophobes argue openly, or even not so openly, for the killing of black Americans, transgender people, gay people, women, or immigrants. And it’s not only something that’s talked about, it’s something that has actually happened. And not just the Holocaust, but just centuries of pogroms, right? There’s this great book that I read many years ago called The Satanization of the Jews. Camille Paglia recommended it and I read it. And they live in this certain specific kind of antisemitism. And again, I’m not talking about people who are against Israel or something like that. I’m talking specifically about Jew hatred. They have this moral calculus that Jews are the only people who are capable of good or evil, and Jews are exclusively capable of evil.

(01:01:56)
For example, if you look at the George W. Bush White House, you had W, you had Cheney, Condoleezza Rice, Colin Powell, Donald Rumsfeld, a lot of these NeoCon advisors. So if there’s 10 people in a room and there’s one Jewish person, it’s his fault, and the rest are Jew controlled. So again, they only exist as a puppet of Jews in this kind of worldview. And it’s like, to me, if there were no Jews on earth, it is crazy to say that John Bolton and Liz Cheney and Lindsey Graham wouldn’t be pushing for more war. That makes no sense to me. It’s like, you blame the Jews when bad things happen, but when a Jewish person does something good, it doesn’t really matter. Or just wait, he’s going to do something bad. Well, yeah, that’s true. Human beings do good things and then they do bad things sometimes. But it only counts when that Jewish person does the bad thing.
Lex Fridman
(01:02:58)
I wonder what’s a way to fight antisemitism and fight hate in general?
Michael Malice
(01:03:02)
I think the only, or the best, way, because I’ve thought a lot about this, about how gay Americans went from being universally hated and despised, to the point that many people in the ’80s, those who had AIDS, went to their graves without even telling their parents because they were so scared, to now Times Square being covered in pride flags, and this also works for Islamophobia and some of these other bigotries, is what I call the ambassador program. Because as soon as you know someone who is a member of a certain group, it is a lot harder to be bigoted against them, because instead of this being this out-group that’s somewhere out there, it’s like, wait a minute, I work with this guy. Yeah, he’s kind of a jerk and maybe he sees things a little differently than me, but this guy is not a horrible human being. So I think the only way to fight any form of bigotry is to be a good example of the counter to whatever archetype or stereotype is in the culture.

Karl Marx and religion

Lex Fridman
(01:04:13)
Karl Marx wrote that, “Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of a soulless condition. It is the opium of the people.” As the famous phrase goes. Do you think he has a point?
Michael Malice
(01:04:26)
No. I hate that quote. I absolutely hate it. I despise this sort of Reddit internet atheist activism for the simple reason that I know many people who in finding faith have become objectively better human beings.
Lex Fridman
(01:04:46)
Yeah.
Michael Malice
(01:04:46)
They start living consciously. They take morality seriously. They try, we all fail, to be moral, good people. So there’s this sneering that these midwits, these marginally intelligent people, have towards religious people. Now, lots of religious people use religion to rationalize their bad behavior or sinfulness or big egos, so on and so forth. That exists, that’s true. But to say that it never helps anyone and it’s universally the… See, Marx was talking about a period, I mean, I’ll defend his quote, when his argument was the masses are being starved and oppressed, but they’re promised, don’t worry, you’ll have riches in heaven. So you should kind of let yourself be pushed around now, and this is kind of this BS bargain that the people are being given. So that was, I think, the point he was making. It certainly doesn’t apply nowadays. I’m close to the family in the Midwest. They’re good Christian people. I remember very specifically this guy, shout out to him, Sean Sherrod. I went to college with him. David Lucas.
Lex Fridman
(01:06:01)
Have you checked out the Comedy Mothership? Great club.
Michael Malice
(01:06:03)
Where is it? Is it in Austin?
Lex Fridman
(01:06:05)
Willie Nelson.
Michael Malice
(01:06:07)
I was 17, 18, freshman year, and I was reading all this criticism of the Bible and I was like, “Look, this is in there. Look at this in there.” And he put his hand on my shoulder and he says, “Michael, there’s nothing you’re going to tell me that’s going to make me lose my faith.” And that was a very self-aware and profound thing to say. As I’ve gotten older, I know lots of religious people. There’s no part of me that thinks they’re wrong or they should be mocked. It also reminds me of when people sneer at addicts in recovery, they’re like, “Alcoholism isn’t a disease, it’s a choice.” It’s like, wait a minute. You don’t know what it’s like to have your entire life ruined by drugs or alcohol.
Lex Fridman
(01:06:48)
Yes.
Michael Malice
(01:06:49)
And if you have to tell yourself, “I have this disease and blah, blah, blah,” and that keeps you from drinking and now you’re a moral upstanding person who’s reliable and takes responsibility for their actions, I don’t see the harm at all. So I think this kind of activist atheism is cheap. I don’t agree with it whatsoever. And I do not like that quote at all.
Lex Fridman
(01:07:12)
But otherwise, big fan of Marx?
Michael Malice
(01:07:14)
I mean, I think there’s a fan of mine, I forget who it was, apologies. He had this great quote, and this is me talking. He goes, “The games people play to feel smarter than others is depressing and annoying.” And I think this kind of fedora internet atheism is a good example, because here’s the other thing. If you’ve proven that someone else is stupid, that doesn’t mean you’re smart. You could both be stupid. So congrats, you proved someone else is stupid. Who cares?
Lex Fridman
(01:07:43)
Yeah. And sneering of all forms in general is just not great.
Michael Malice
(01:07:48)
That’s one of the things I block people out for on social media instantly. You’re not going to sneer at me in my space. You could sneer at me all you want in your space, but I’m not putting up with your crap. I don’t know you.
Lex Fridman
(01:07:57)
MySpace, great social network.
Michael Malice
(01:08:00)
Is that on Sixth Street?
Lex Fridman
(01:08:04)
AOL.com.
Michael Malice
(01:08:07)
Clang, clang, clang. That’s how Lex comes.
Lex Fridman
(01:08:12)
Like Pavlov’s dog. That was the sound before you got to see… Spend 10 minutes waiting for an image of a lady to load one line at a time.
Michael Malice
(01:08:24)
Yeah.
Lex Fridman
(01:08:29)
I recently talked to John Mearsheimer, I don’t know if you know him at all. So he has this idea about offensive realism. It’s a way to analyze the world in international relations. And the basic idea, and I’ll run it by you and see what you think, is that states, nations, want to survive, and they try to do so by maximizing power, military power. And he talks about anarchy quite a bit, in that one of the underlying assumptions of this way of viewing the world is that states are anarchic towards each other.
Michael Malice
(01:09:10)
Yes, that’s true.
Lex Fridman
(01:09:11)
And they operate under a lot of uncertainty. States cannot be sure that other states will not use military capabilities against them.
Michael Malice
(01:09:18)
Right.
Lex Fridman
(01:09:19)
They want to survive and they want to use military power to control the uncertainty to protect themselves.
Michael Malice
(01:09:29)
So I disagree in that regard. And, I see it on your bookshelf, I think the world is a lot closer to Brave New World than it is to 1984. And I think if you look at, let’s say, China’s influence in America, the influence is far more through soft power than military power. China doesn’t threaten America through “we’re going to kill you.” It’s more like the infiltration of universities, TikTok, things of that nature. Maybe this would’ve worked before the pop culture era, but I think one of the reasons we have this kind of American hegemony isn’t just a function of the American military. I think it’s much more a function of American popular culture. When you’re exporting ideas and culture, it makes other people in other countries feel closer to you, and also regard you as a friend, and also adopt your values. It’s a great way to spread propaganda.
Lex Fridman
(01:10:28)
It seems to correlate though, right? It’s interesting. It’s an interesting idea. What has more power, the viral spread of ideas or the power of the military? It seems that the United States is at the top of the world on both.
Michael Malice
(01:10:44)
That’s true.
Lex Fridman
(01:10:45)
And so it’s hard to disentangle the two.
Michael Malice
(01:10:48)
Let’s look at Europe. American culture is very popular in Europe in many ways, right? The best music comes out of Sweden, Swedish indie pop. They’re singing in English, even though… So on and so forth. None of this is a function, maybe it’s a function of post World War II to some extent, but I don’t think it’s a function of American bases there. I think it’s a function of we’re exporting our music, our TV shows, and our movies.
Lex Fridman
(01:11:14)
Yeah. It’s interesting, if the battleground will be Brave New World, the battle of ideas.
Michael Malice
(01:11:18)
I think it’s clearly Brave New World. It’s so much cheaper, and again, this is one of the dark sides of social media, to use influence than it is to use threats. I think COVID is a good example of this. So much of the pressure, yes, there was authoritarianism, but it was the fact that everyone bought into it, rightly or wrongly. The vast majority of the population was behind all of these things, and that was through persuasion. And people are begging for it to come back in many cases.
Lex Fridman
(01:11:47)
So who’s funding you? Which intelligence agency?
Michael Malice
(01:11:50)
Mossad.
Lex Fridman
(01:11:51)
Mossad. Mossad. This is how you do great interviewing. See, he didn’t even expect that. Okay.
Michael Malice
(01:12:01)
What’s that, Mr. Parrot?
Lex Fridman
(01:12:02)
What was that, Mr. Parrot? You knew it? But you didn’t have any documentation, did you?
Michael Malice
(01:12:10)
I think Mr. Parrot is threatened by the better wings on Chance Murmur.
Lex Fridman
(01:12:15)
He gets like that when he’s turned on, he’s not threatened.
Michael Malice
(01:12:18)
Oh, okay.
Lex Fridman
(01:12:18)
You can’t wait until all three of us are alone together. It’s going to be one hell of a party.
Michael Malice
(01:12:25)
Beaks and feathers everywhere.
Lex Fridman
(01:12:27)
And metal. Yeah, this thing is beautiful.

Art

Michael Malice
(01:12:32)
It’s ridiculous.
Lex Fridman
(01:12:34)
You have actually a lot of really cool stuff at your place.
Michael Malice
(01:12:37)
It’s so fun.
Lex Fridman
(01:12:39)
What’s a cool thing that stands out to you? Maybe a recent addition.
Michael Malice
(01:12:44)
So I went to the Dallas Museum of Art last year for my birthday and there was a painting I liked, and I Googled it and I saw the auction for that exact painting. And it was, I think three grand, which is not cheap, but not something you think… You think in a museum, “I could never afford something like this,” right? So when I went to Houston with some friends… The Sideserfs, Natalie, who made the cake of you.
Lex Fridman
(01:13:14)
Oh, yeah, the cake. Terrified my mom.
Michael Malice
(01:13:17)
Did it?
Lex Fridman
(01:13:18)
Yeah.
Michael Malice
(01:13:18)
Aww.
Lex Fridman
(01:13:19)
No, it’s not the cake that terrified my mom. It’s you, Michael Malice, cutting it off, cutting the face off and laughing maniacally.
Michael Malice
(01:13:30)
Well, Natalie’s pregnant. She’s going to have a daughter named Daisy. So congrats to Natalie.
Lex Fridman
(01:13:30)
Congrats to Natalie.
Michael Malice
(01:13:35)
But I was in the museum with them and there was a statue of Thoth, who’s the Egyptian god whose head is an ibis. It’s a bird with a long beak. And Thoth is the god of the moon, god of knowledge, and supposedly he invented writing. So I thought, you know what? I’ve always loved Ancient Egypt. I know a lot about it and especially the mythology. It’d be really cool as an aspiring author to have an ancient Egyptian Thoth statue in my house. Well, it turned out that the Egyptians also killed and mummified ibises and buried them with scribes. And a week after I went to the museum, there was an auction for an ibis mummy. And I have it now in my house, still in its bandages, overlooking my desk. And we all know it’s going to come to life and peck out my eyes and write with my blood. But that is one of the recent cool additions.

(01:14:31)
Another thing I have, which is like, in terms of holy crap, I’ve made it: I have an original Patrick Nagel painting. If people don’t know the name, he’s like the ’80s artist. He did the Duran Duran cover. You see his stuff in nail salons. I have a male, which was very rare for him to do. So those are two of my kind of favorite pieces.
Lex Fridman
(01:14:49)
You have what?
Michael Malice
(01:14:49)
He drew predominantly women. I have one where he drew a male; it was a guy in a jeans ad or something. And now I’m looking forward to the next one. Jake made me a three-foot-tall sculpture called Future Murmur, which I am ecstatic to get.
Lex Fridman
(01:15:06)
Just remind yourself how many fascinating, beautiful people that are out there.
Michael Malice
(01:15:14)
And just the victory and holiness and technology and speed, and how many people have fought so that I could do what I do.
Lex Fridman
(01:15:26)
Yeah. That’s another thing I’m grateful for. Just like the 100 billion or so people that came before us, and also the trillions of lifeforms that came before that.
Michael Malice
(01:15:38)
Oh God, I’ve gone down this trilobite rabbit hole, buying fossils because as a kid I thought trilobites were the coolest thing, and now I’ve got like 15. And what’s interesting is when you buy trilobite fossils on eBay, they’re listed as used, because it’s got to be new or used according to the programming. So it’s used.
Lex Fridman
(01:15:57)
Yeah. But just thinking about all that history, just all the lifeforms that came before. It seems like a really special thing we have going on earth here.
Michael Malice
(01:16:10)
Oh yeah. I think that’s very fair to say, but I also think this kind of is like, live life to the fullest. Camus talked about living to the point of tears, especially on behalf of people who didn’t have that privilege. So I dedicated the White Pill to my parents, who got me out of the Soviet Union, and to all the kids who never could. And it’s like, when I die, I want everyone else to not only, they’re obviously going to be happy, but yeah… I’m not here. Live for me, I can’t have that privilege anymore.

Books

Lex Fridman
(01:16:44)
What do you think about Camus as a writer?
Michael Malice
(01:16:47)
I don’t like his novels at all.
Lex Fridman
(01:16:49)
Oh, you don’t?
Michael Malice
(01:16:49)
At all.
Lex Fridman
(01:16:50)
Yeah. You’ve talked about The Plague to me, a little bit.
Michael Malice
(01:16:52)
Yeah. I think the book is pointless.
Lex Fridman
(01:16:55)
It’s fascinating.
Michael Malice
(01:16:56)
Because all you need to do is read the synopsis and then you get it. I don’t think his book-
Lex Fridman
(01:17:00)
Isn’t that true for most books?
Michael Malice
(01:17:01)
No.
Lex Fridman
(01:17:02)
I mean, you could take, I don’t know… I just don’t agree at all. I mean, there’s Catcher in the Rye. There’s a lot of books that seem trivial.
Michael Malice
(01:17:11)
I don’t think it seems trivial, but I think-
Lex Fridman
(01:17:13)
Animal Farm.
Michael Malice
(01:17:15)
Animal Farm is a methodical step-by-step examination of a transformation from one thing to another. The Plague is not that.
Lex Fridman
(01:17:23)
It’s a methodical examination of what a society is like under the plague, which could symbolize a lot of things, including the plague directly or Nazi Germany or ideological movements, or… It’s similar to Animal Farm. Maybe not as effective in terms of using this kind of symbology-
Michael Malice
(01:17:44)
I think Animal Farm has a narrative and… I’m going to spoil the whole Plague. The book, The Plague. There’s a town, I believe it’s Oran, a plague descends, people struggle to deal with it, and the plague vanishes as quickly as it came. The end.
Lex Fridman
(01:18:02)
But there’s the victims, the people that take advantage of it. There’s the doctor that, amidst the absurdity and the evil of the plague, is fighting to do good.
Michael Malice
(01:18:11)
Nothing for me. Does nothing for me.
Lex Fridman
(01:18:13)
Okay, well I can spoil the Animal Farm. There’s animals at a farm and the humans are abusing them, and then the animals overthrow the humans, but then the pigs become just like the humans. The lesson, kids, is that power corrupts, no matter whether you walk on four or on two.
Michael Malice
(01:18:40)
I thought the lesson was that pigs are the most human-like animals on the farm.
Lex Fridman
(01:18:47)
I thought the lesson was that there’s no Sugarcandy Mountain.
Michael Malice
(01:18:50)
That’s right. Yeah.
Lex Fridman
(01:18:53)
You’ve interviewed a lot of people. What have you learned about getting to the soul of a person, the soul of an idea from interviewing? Just how to do a good interview?
Michael Malice
(01:19:04)
First off, I’m not interviewing just random people. I’m interviewing people who are accomplished. It’s not a random group. That’s self-selecting for something different. But I think that people love to, and this is very understandable, love to feel seen. So if you’re someone who’s done something, even if you’re like the best guinea pig breeder in America, to have someone interested in your work and listen to what you’re saying… because I remember, with every book I’ve written, I have friends, and I wouldn’t stop talking about the person I’m writing with or about North Korea. And at a certain point, I’m sure they’re like, “All right, I don’t care about this anymore.” But it takes over your brain. You know what I mean?

(01:19:43)
So if you’re someone who has an interest or a hobby, I’m sure to some extent, maybe your friends or family are sick of hearing about it, or you don’t want to talk about it with them. That’s the private life where you could just be yourself. So I try, and this comes from my co-authoring background, when I’m talking to people, to ask the questions that they haven’t heard before. There’s a possibility that this actor I’m a huge fan of is going to be on my show. I don’t want to spoil everything. And he’s got a very specific role that he’s known for. And I’m like, “Okay, I know it’s going to be annoying for you talking about this one role, but my goal is to ask questions that you aren’t sick of answering, that you haven’t been asked.”
Lex Fridman
(01:20:28)
Porn star or…
Michael Malice
(01:20:29)
No, not a porn star.
Lex Fridman
(01:20:32)
That joke failed. Also, edit that out. What do you know about breeding guinea pigs? You mentioned it. I’d love to hear-
Michael Malice
(01:20:32)
I don’t know anything.
Lex Fridman
(01:20:42)
I would love to hear more about it.
Michael Malice
(01:20:44)
I always use this as an example. You meet someone at a party who breeds guinea pigs, right? There’s two approaches. Either it’s, “You’re weird,” okay, or, “Sit down and tell me everything.” And I’m very much, and all the people I like are, the second group. When you meet someone who’s doing something unusual and is passionate about it and is good at it, that to me is the mother lode.
Lex Fridman
(01:21:09)
Yeah. That to me also is the thing I enjoy the most, is people-
Michael Malice
(01:21:13)
And then it’s like-
Lex Fridman
(01:21:15)
… that are passionate about a thing.
Michael Malice
(01:21:16)
… who do you guys hate? Do you guys hate the hamster people? Do you hate the rabbit people? There’s got to be someone that you guys look down on, because the marine aquarium people look down on the freshwater aquarium people.
Lex Fridman
(01:21:25)
Yeah. It’s a hierarchy.
Michael Malice
(01:21:26)
Yes. There’s always going to be a hierarchy. This is where the left anarchists and I disagree, because they think you can have egalitarianism. There’s going to be a hierarchy.
Lex Fridman
(01:21:33)
Hierarchies emerge.
Michael Malice
(01:21:34)
Yes.
Lex Fridman
(01:21:35)
There’s no anarchy in the Guinea pig world.
Michael Malice
(01:21:39)
No. It’s just a different kind of anarchy.
Lex Fridman
(01:21:41)
Somebody’s always breeding somebody else.
Michael Malice
(01:21:43)
Yes.
Lex Fridman
(01:21:44)
And looking down on the others.
Michael Malice
(01:21:47)
Yeah, someone’s the other. Whether it’s the hamster people, the rat people.
Lex Fridman
(01:21:50)
And everybody’s breeding. By the way, are you an anarcho-capitalist? What flavor of anarchist are you?
Michael Malice
(01:21:59)
I’m an anarchist without adjectives. I like them all. The black flag comes in many colors.
Lex Fridman
(01:22:04)
All right. All right. You’re quoting your… No, I understand. It’s a beautiful line in the book.
Michael Malice
(01:22:09)
Thank you. I think the anarcho-capitalists don’t give the left anarchists enough credit, especially for their courage. And I do whatever I can in my power to talk about people like Emma Goldman, whenever possible.
Lex Fridman
(01:22:24)
Do you still think that “are some people better than others” is a good litmus test?
Michael Malice
(01:22:29)
Yes. It’s worked 100% of the time.
Lex Fridman
(01:22:32)
And for you, the answer is yes?
Michael Malice
(01:22:35)
I never answer.
Lex Fridman
(01:22:39)
There’s two of them.
Michael Malice
(01:22:43)
What, are you all Hitchcock up in here?
Lex Fridman
(01:22:46)
Oh, hey, careful. I always got your back. What little habits in your life make you happy now that you’re in Austin?
Michael Malice
(01:22:59)
Oh my god. I was prepping for this interview, and I imagined this coming up, and I knew that as I explained this, you know how sometimes when someone tells a story, at first it’s amusing, then it’s amusing and concerning, and then you’re like, “Holy shit, where’s the exit?”
Lex Fridman
(01:23:27)
Yeah. I’m getting nervous already.
Michael Malice
(01:23:30)
You should. So I’m going to tell you something I’ve told only a couple of people. This is my absolutely off-the-charts, autistic approach to shaving. So I have this insane system. You asked about habits that give me joy. I used to hate shaving. I used to hate it. There’s something called wet shaving. So wet shaving is: you get the brush, you get the soap that’s in a canister, you stir it up, you paint your face, and then you shave. The thing is, there are dozens of these shaving soap companies, okay? So I tried a couple of hundred of these soaps, because you’re testing for scent, you’re testing the lather for thickness, and also how smooth of a shave it gives you. I have it down… I’m not making this up. I’m not this creative. I have it down to a cycle of 67 soaps. Okay?
Lex Fridman
(01:24:40)
A cycle.
Michael Malice
(01:24:41)
A cycle. So 67. When I use up one soap, that is a slot that I will have to try new ones, and I will try new ones in that slot until I get one that I like, and then that slot is filled. So right now, I have 67 that I use, and I have 86 candidates.
Lex Fridman
(01:25:05)
Like in the queue?
Michael Malice
(01:25:07)
In the queue.
Lex Fridman
(01:25:07)
Do you label them? Do you remember which one is which?
Michael Malice
(01:25:09)
Well, they all have beautiful labels. I mean, these are artisans who are creating these amazing things. I would encourage every guy to try this hobby. It’s so much fun. I will give a shout-out to the companies that are the best. So the best company, in my opinion, is a company called… they just changed the name because… You know what they were originally called? I’m not joking. Grooming Department. And now it’s like-
Lex Fridman
(01:25:37)
Not a bad name.
Michael Malice
(01:25:38)
Yeah, but it has certain connotations in contemporary discourse.
Lex Fridman
(01:25:42)
Yeah, I understand. Contemporary discourse, yeah.
Michael Malice
(01:25:43)
So now he changed the name to Aion Skincare, A-I-O-N. Their scents are the most sophisticated, the most diverse, and the soap is just really high quality. Another amazing company is Barrister and Mann. And if I’m going to tell you to try one, it’s called Cheshire. He comes out with new ones every month or so. A lot of it’s miss. A lot of it’s hit. Just great, great quality stuff. Another great company is Chiseled Face. They make something called Midnight Stag, which basically smells like a garage. It’s one of my favorite soaps of all time.
Lex Fridman
(01:26:16)
What makes for a good smell for Michael Malice?
Michael Malice
(01:26:19)
I have 67 answers. So some of them smell-
Lex Fridman
(01:26:22)
So you can’t convert it into words?
Michael Malice
(01:26:24)
Some are citrusy, some are industrial, some-
Lex Fridman
(01:26:27)
So garage is more industrial.
Michael Malice
(01:26:28)
It smells like a garage. Yeah. Midnight Stag. It smells like a garage. Some are fun. There are scents that smell like other things. For example, there’s a scent in my queue called Finding Scotty. It smells like Swedish Fish. Another great company is Phoenix Shaving, and they have one called Aloha Smackdown. It smells like Hawaiian Punch. They had one called Yule Ham that they made for me special. Smells like a ham. They had a ramen one, Rock and Ramen. Smells like a cup of noodles. And every year they do an advent calendar where for 12 days you have a little sample of a soap and a sample of the aftershave.
Lex Fridman
(01:27:04)
Nice.
Michael Malice
(01:27:06)
I’m forgetting someone and I’m feeling angry that I’m doing it. But those are some of the… Oh, and Catie’s Bubbles is great. They’re vegan, out of New Jersey. They’ve got one called Knee High to a Grape. It smells like grape soda. I think those are the biggest names off the top of my head.
Lex Fridman
(01:27:25)
Will that list converge down to a small set eventually, or no? 67 down to-
Michael Malice
(01:27:31)
Well, no, it’s 67.
Lex Fridman
(01:27:33)
Oh, so it always keeps [inaudible 01:27:35]-
Michael Malice
(01:27:35)
So if there’s a slot, then, you know what I mean? I’ll fill that. You see what I’m saying?
Lex Fridman
(01:27:39)
Oh, so you will forever have the variety of 67?
Michael Malice
(01:27:41)
Yes.
Lex Fridman
(01:27:46)
You know how sad my brain is? When you were telling me this, I was like, “I wonder how many soaps are left in Michael Malice’s life.” You can count your life by days, by month, by years, or by soaps.
Michael Malice
(01:28:01)
That is depressing. That is very dark.
Lex Fridman
(01:28:03)
Because each experience of shaving is a little beautiful experience.
Michael Malice
(01:28:07)
Yes, it is. It’s so much fun.
Lex Fridman
(01:28:09)
How many do you have left in your life, right?
Michael Malice
(01:28:10)
That’s true.
Lex Fridman
(01:28:11)
Yeah.
Michael Malice
(01:28:12)
I got to tell you, there’s something else. There’s a term my friend Jackie taught me called Touching Pan. It’s a makeup term. So basically when you use it and you could see the bottom, that’s like a big moment.
Lex Fridman
(01:28:24)
Oh, it’s a great thing.
Michael Malice
(01:28:24)
Yeah. Well, it’s kind of fun. I’m telling you, people can scoff. It is such a fun… and there’s a lot of us online who are into this whole space. It’s really, really fun.
Lex Fridman
(01:28:34)
When did you first discover this?
Michael Malice
(01:28:36)
Can I curse?
Lex Fridman
(01:28:37)
Yeah.
Michael Malice
(01:28:38)
Fuck you, Cole Stryker. Because I was staying at my friend Cole’s house in LA. Fuck you Cole.
Lex Fridman
(01:28:45)
Fuck you, Cole.
Michael Malice
(01:28:46)
Cole is one of the biggest hipsters I know. He’s got the shirts with the pearl snaps and everything. And I’m staying at his house because I was doing Rogan, and he goes, “Oh, have you heard of this wet shaving thing?” And he goes, “Look, this one’s Proraso.” That’s the Italian grandpa soap, which is also a great one. And I went down this rabbit hole, and now I’m like… I don’t even know how much money I’ve spent on this. And it’s all because of him.
Lex Fridman
(01:29:09)
Oh. But it’s like a happy fuck you. Like, fuck you, Cole.
Michael Malice
(01:29:09)
Yeah.
Lex Fridman
(01:29:12)
I love you, Cole. Fuck you.
Michael Malice
(01:29:13)
Yeah, it’s just-
Lex Fridman
(01:29:14)
Thank you.
Michael Malice
(01:29:15)
Yes, yes.
Lex Fridman
(01:29:16)
That’s a good idea for a tattoo. Fuck you, Cole. Do you have advice on how to be happy?

How to be happy

Michael Malice
(01:29:25)
Yes.
Lex Fridman
(01:29:26)
There’s a lot of loneliness and sadness in the world.
Michael Malice
(01:29:31)
I can give a very easy piece of advice that worked a lot for me. Instead of telling yourself that you have these ridiculous standards, tell yourself, “I can be better. I don’t have to be a great writer. I could be a better writer. I don’t have to be a great podcaster. That will never happen. I could be a better podcaster. I could be a better person. I could be better at the gym. I could be better with my time.” And when you regard things in… and especially if you have metrics that you can go by. “I’ll run this many miles a day.” Things you have control over. Especially as males, when you have this chart and the data is telling you you’re improving, right away, it’s like you have this sense of accomplishment. So I think that is a really great way to…

(01:30:25)
And if something is not working in your life… Let’s suppose you don’t have friends. Right? There’s the internet. How do people make friends? Try things out. What’s the worst that’s going to happen? Things will blow up in your face. Well, you’ll learn something at least. Don’t be afraid of making mistakes. When I was a kid, I was so focused on having things under control, so that I would never have to get hit in the face, metaphorically. And then I realized, and you’ve realized this as well, everyone who’s important gets hit in the face. Look at the president, whoever the president is. It becomes a matter of being strong enough that you can take getting hit in the face. So that is a big, important switch in your thinking.
Lex Fridman
(01:31:05)
Yeah. There’s a Bukowski quote I wrote down. “Sometimes you climb out of bed in the morning and you think, I’m not going to make it. But you laugh inside, remembering all the times you felt that way.”
Michael Malice
(01:31:16)
Yeah, yeah.
Lex Fridman
(01:31:17)
There’s a part of me that’s like that. There’s some days where I feel like this is the worst day of my life. And then shortly after, I chuckle at that.
Michael Malice
(01:31:26)
Yes.
Lex Fridman
(01:31:26)
Just knowing the ups and downs of the brain and the mind and life and all that. You ever been depressed?

Depression

Michael Malice
(01:31:32)
Yeah, of course. I’m more anxious than depressed. I don’t really get depressed, but I’ve been depressed.
Lex Fridman
(01:31:38)
Like low points.
Michael Malice
(01:31:39)
Yeah. But I think I distinguish depression between low points, right? If things are going bad and you feel bad, that makes sense. But when I think of depression, I think of someone who feels bad when things aren’t bad. To me, it’s almost by definition irrational.
Lex Fridman
(01:31:55)
Well, yeah. And there’s different kinds of… There’s an exhausted kind of depression where it’s not so much sad as you don’t want to do anything. You don’t want to live. You don’t want to-
Michael Malice
(01:32:09)
Yeah. What’s the point? It’s a wrap, yeah.
Lex Fridman
(01:32:10)
What’s the point? What’s the point? And an extreme self-critical negativity, which I’m also scared of because my brain is generally very self-critical.
Michael Malice
(01:32:19)
Because you’re not taking enough magnesium.
Lex Fridman
(01:32:21)
Do you take it rectally or in the mouth?
Michael Malice
(01:32:23)
You take it rectally.
Lex Fridman
(01:32:24)
Okay.
Michael Malice
(01:32:25)
But as for the magnesium, you should take it as a pill.

Fear

Lex Fridman
(01:32:28)
Okay. Well, the way your mom explained it then is way different. What are you most afraid of?
Michael Malice
(01:32:47)
Holy crap. I am trying to think of anything I’m afraid of.
Lex Fridman
(01:32:51)
In 1984-
Michael Malice
(01:32:54)
I thought even just-
Lex Fridman
(01:32:55)
Look, if I wanted to torture you, hypothetically…
Michael Malice
(01:32:59)
Well, then, mission accomplished. I mean, I’m scared of increasing authoritarianism, but that’s not personal. And that’s something that I don’t think is as much of an imminent concern as it is in, let’s say, Canada.
Lex Fridman
(01:33:15)
Are you scared of death?
Michael Malice
(01:33:17)
No.
Lex Fridman
(01:33:17)
You think Camus was scared of death?
Michael Malice
(01:33:20)
No.
Lex Fridman
(01:33:22)
He just accepted it as-
Michael Malice
(01:33:24)
Look, I honestly feel like if I died tomorrow, I did pretty good with what I had. I think I did things that matter to me. I think I moved the needle on things that matter to me. I think I’ve been a good friend to the people I care about. I’ve saved a couple of lives. So I think it’s a very low bar for someone to be able to go to their grave and say, “I left the world a better place than I found it.” I don’t think it’s that hard.

Betrayal

Lex Fridman
(01:34:01)
You ever been betrayed?
Michael Malice
(01:34:03)
Oh god, yes. Of course. Haven’t you?
Lex Fridman
(01:34:07)
Not as often as I would’ve predicted.
Michael Malice
(01:34:09)
Yeah. The Russian upbringing expects everyone to be like… it’s a time bomb before they betray you. I have been betrayed. Of course. Yeah.
Lex Fridman
(01:34:16)
Yeah. You value loyalty?
Michael Malice
(01:34:20)
I do. And I also made it a point to not let that betrayal color my future interactions and regard that as the universal or the norm. I think that’s very important.
Lex Fridman
(01:34:32)
Me too.
Michael Malice
(01:34:34)
And also, I feel bad. I’ve gotten to the point, Lex, that I feel bad for the person who betrayed me, because it’s just like, they didn’t need to do this. And at some point, if you betray someone, you know, and you know you’re not a good person. I believe that. Even if you tell yourself, “This is something I had to do,” you still know you did a bad thing to someone who didn’t deserve it. And that’s a really hard pill to swallow.
Lex Fridman
(01:35:02)
In my situation, I still think good thoughts and empathize with the people that have done me wrong.
Michael Malice
(01:35:11)
I don’t empathize with them, but I sympathize with them.
Lex Fridman
(01:35:14)
My English is not good enough to know the difference.
Michael Malice
(01:35:16)
Empathizing means you’re putting yourself in their shoes. Sympathizing means you feel bad for them and wish them well.
Lex Fridman
(01:35:25)
Yeah, I wish them well.
Michael Malice
(01:35:27)
Yeah, but I don’t put myself… it’s very hard for me to empathize with someone who betrays someone that they care about. It’s not just that I think I’m such a great person. It’s that I feel guilt very strongly. So if I did that to someone who trusted me, it would mess with my head for a long time.
Lex Fridman
(01:35:46)
Yeah, but maybe they were in pain. Maybe they were desperate. Maybe their back’s to the wall.
Michael Malice
(01:35:53)
Sure.
Lex Fridman
(01:35:53)
They felt that way.
Michael Malice
(01:35:54)
Sure. Well, that’s a sympathy thing. Not really an empathy thing.
Lex Fridman
(01:35:56)
Yeah. Yeah. Loyalty is a fascinating thing.
Michael Malice
(01:36:03)
Yes.
Lex Fridman
(01:36:04)
I value trust a lot.
Michael Malice
(01:36:05)
I know you do. Especially because you’re in such a public… Both of us, we’re in very public positions. You have to be very careful who you surround yourself with.
Lex Fridman
(01:36:13)
It sucks.
Michael Malice
(01:36:14)
Does it? Well, it’s-
Lex Fridman
(01:36:16)
Well, it sucks because it’s hard to… I usually just trust everybody.
Michael Malice
(01:36:25)
Okay, that’s crazy.
Lex Fridman
(01:36:29)
But what’s the alternative?
Michael Malice
(01:36:30)
To have a filter?
Lex Fridman
(01:36:33)
Well, I have a filter in terms of who I interact with, but within the… I see the good in people, but then in the very rare instances that might turn. Yeah. It just sucks. It breaks my heart.
Michael Malice
(01:36:48)
Yeah, I hear you. I completely agree.
Lex Fridman
(01:36:52)
Has your heart ever been broken?
Michael Malice
(01:36:54)
Yes.
Lex Fridman
(01:36:55)
Love?
Michael Malice
(01:36:56)
Yes.
Lex Fridman
(01:37:01)
I’m just so relaxed right now, and happy.
Michael Malice
(01:37:03)
Good.
Lex Fridman
(01:37:04)
Relaxed and happy.
Michael Malice
(01:37:05)
Good.
Lex Fridman
(01:37:06)
This is making me really happy.
Michael Malice
(01:37:09)
Again, it’s beautiful on like eight different levels.
Lex Fridman
(01:37:11)
I think that’s the deepest thing I’m thankful for, is just how beautiful people are and how beautiful the world is.
Michael Malice
(01:37:21)
People are going to laugh, and I welcome it. That’s fine. I really sometimes feel like the guy in American Beauty looking at the plastic bag dancing in the wind, and he’s brought to tears because of how beautiful life is. And a lot of people feel the need to sneer at that scene and at Ricky Fitts, whatever, and I think he’s got it exactly right.
Lex Fridman
(01:37:45)
I think he does too. Well, in the end, you and I will both be laughing.
Michael Malice
(01:37:53)
Right. And also seeing beauty where other people see garbage. And I’d rather be the person who sees beauty than the person who sees garbage.
Lex Fridman
(01:38:02)
Yep. Well, when I look at you, I see beauty when most people see garbage. And it’s really unfair, Mr. Parrot, that you keep saying that. But all jokes aside, man, I’m really grateful for your friendship. I’m really grateful for who you are as a person. Thank you so much for talking today. Thank you so much for talking to me throughout all these years. Thank you for being who you are.
Michael Malice
(01:38:28)
You are welcome.
Lex Fridman
(01:38:31)
Thanks for listening to this conversation with Michael Malice. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Andre Gide: “Man cannot discover new oceans unless he has the courage to lose sight of the shore.” Thank you for listening and hope to see you next time.

Transcript for John Mearsheimer: Israel-Palestine, Russia-Ukraine, China, NATO, and WW3 | Lex Fridman Podcast #401

This is a transcript of Lex Fridman Podcast #401 with John Mearsheimer.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with John Mearsheimer, a professor at University of Chicago and one of the most influential and controversial thinkers in the world. He teaches, speaks and writes about the nature of power and war on the global stage, in history and today.

(00:00:19)
Please allow me to say, once again, my hope for this little journey I’m on. I will speak to everyone on all sides with compassion, with empathy, and with backbone. I’ll speak with Vladimir Putin and with Volodymyr Zelenskyy, with Russians and with Ukrainians, with Israelis and with Palestinians, with everyone. My goal is to do whatever small part I can to decrease the amount of suffering in the world by trying to reveal our common humanity. I believe that in the end, truth and love wins. I will get attacked for being naive, for being a shill, for being weak. I’m none of those things, but I do make mistakes and I will get better. I love you all.

(00:01:19)
This is a Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s John Mearsheimer.

Power


(00:01:29)
Can you explain your view on power in international politics as outlined in your book, The Tragedy of Great Power Politics and in your writing since then?
John Mearsheimer
(00:01:39)
Yeah, I make two sets of points there. First of all, I believe that power is the currency of international relations, and by that I mean that states are deeply interested in the balance of power and they’re interested in maximizing how much power they control. And the question is why do states care so much about power. In the international system, there’s no higher authority, so if you get into trouble and you dial 911, there’s nobody at the other end. In a system like that, you have no choice but to figure out for yourself how best to protect yourself. And the best way to protect yourself is to be powerful, to have as much power as you can possibly gain over all the other states in the system. Therefore, states care about power because it enhances or maximizes their prospects for survival.

(00:02:39)
Second point I would make is that in the realist story or in my story, power is largely a function of material factors. The two key building blocks of power are population size and wealth. You want to have a lot of people and you want to be really wealthy. Of course, this is why the United States is so powerful. It has lots of people and it has lots of wealth. China was not considered a great power until recently because it didn’t have a lot of wealth. It certainly had population size, but it didn’t have wealth. And without both a large population and much wealth, you’re usually not considered a great power. So I think power matters, but when we talk about power, it’s important to understand that it’s population size and wealth that are underpinning it.
Lex Fridman
(00:03:38)
So there’s a lot of interesting things there. First, you said nations in relation to each other are essentially in a state of anarchism.
John Mearsheimer
(00:03:48)
Yeah, well, anarchy basically means the opposite of hierarchy. Sometimes people think when you’re talking about anarchy, you’re talking about murder and mayhem, but that’s not what anarchy means in the realist context. Anarchy simply means that you don’t have hierarchy. There’s no higher authority that sits above states. States are like pool balls on a table. And in an anarchic world, there’s no higher authority that you can turn to if you get into trouble.

(00:04:22)
And of course the political philosopher who laid this all out was Thomas Hobbes. And Hobbes talked about life in the state of nature, and in the state of nature you have individuals and those individuals compete with each other for power. And the reason that they do is because in the state of nature, by definition, you have no higher authority. And Hobbes’s view is that the way to get out of this terrible situation where individuals are competing with each other and even killing each other is to create a state. It’s what he calls the Leviathan, and that of course is the title of his famous book.

(00:05:02)
So the idea is to escape anarchy, you create a state, and that means you go from anarchy to hierarchy. The problem in international politics is that there is no world state, there is no hierarchy. And if you have no hierarchy and you’re in an anarchic system, you have no choice but to try to maximize your relative power to make sure you are, as we used to say when I was a kid on New York City playgrounds, the biggest and baddest dude on the block. Not because you necessarily want to beat up on other kids or on other states, but because again, that’s the best way to survive.

(00:05:47)
And as I like to point out to people, the best example of what happens when you’re weak in international politics is what the Chinese call the century of national humiliation. From the late 1840s to the late 1940s the Chinese were remarkably weak, and the great powers in the system preyed upon them. And that sends a very important message to not only the Chinese, but to other states in the system. Don’t be weak, be as powerful as you can.
Lex Fridman
(00:06:18)
And we’ll talk about it, but humiliation can lead to resentment or resentment leads to something you’ve also studied, which is Nazi Germany in the 1930s. We’ll talk about it, but staying to the psychology and philosophy picture, what’s the connection between the will to power in the individual, as you mentioned, and the will to power in a nation?
John Mearsheimer
(00:06:43)
The will to power in an individual has a lot to do with individual psychology. The story that I tell about the pursuit of power is a structural argument. It’s an argument that says when you are in a particular structure, when you’re in a system that has a specific architecture, which is anarchy, the states have no choice but to compete for power. So structure is really driving the story here. Will to power has a lot more to do with an individual in the Nietzschean story where that concept comes from. So it’s very important to understand that I’m not arguing that states are inherently aggressive. My point is that as long as states are in anarchy, they have no choice but to behave in an aggressive fashion. But if you went to a hierarchic system, there’s no reason for those states to worry about the balance of power, because if they get into trouble there is a higher authority that they can turn to. There is in effect a leviathan.
Lex Fridman
(00:07:59)
So what is the role of military might in this will to power on the national level?
John Mearsheimer
(00:08:06)
Well, military mights is what ultimately matters. As I said to you before, the two building blocks of power are population size and wealth.
Lex Fridman
(00:08:16)
You didn’t mention military might.
John Mearsheimer
(00:08:17)
I did not, no. That’s right. And it’s good that you caught that because if you have a large population and you’re a wealthy country, what you do is you build a large military, and it’s ultimately the size of your military that matters because militaries fight wars. And if states are concerned about survival, which I argue is the principal goal of every state in the international system for what I think are obvious reasons, then they’re going to care about having a powerful military that can protect them if another state comes after them.
Lex Fridman
(00:08:55)
Well, it’s not obvious that a large nation with a lot of people and a lot of money should necessarily build a gigantic army and seek to attain dominant, sole superpower status through military might. But you’re saying, as you see the world today, it has to be that way.
John Mearsheimer
(00:09:16)
Yeah, I’m arguing it is obvious. If you’re a state in the international system, do you want to be weak? If you live next door to Nazi Germany or Imperial Germany or Napoleonic France or even the United States… The United States is a ruthless great power, you surely recognize that. And if you’re dealing with the United States of America and you’re Vladimir Putin, you want to make sure you’re as powerful as possible so that the United States doesn’t put its gun sights on you and come after you. Same thing is true with China. You want to be powerful in the international system.

(00:09:50)
States understand that, and they go to great lengths to become powerful. Just take the United States of America. When it started in 1783, it was comprised of 13 measly colonies strung out along the Atlantic seaboard. Over time, the various leaders of the United States went to great lengths to turn that country into the dominant power in the Western Hemisphere. And then once that was achieved in 1900, we’ve gone to great lengths to make sure that there’s no peer competitor in the system. We just want to make sure that we’re number one.

(00:10:33)
And my argument is that this is not peculiar to the United States. If I’m China, for example, today, I would want to dominate Asia the way the United States dominates the Western Hemisphere. They’d be fools not to. If I were Imperial Germany, I’d want to dominate all of Europe the way the United States dominates the Western Hemisphere. Why? Because if you dominate all of Europe, assuming you’re Imperial Germany or Napoleonic France, then no other state in the area or in the region can threaten you because you’re simply so powerful.

(00:11:12)
And again, what I’m saying here is that the structure of the international system really matters. It’s the fact that you’re in this anarchic system where survival is your principal goal and where I can’t know your intentions, right? You’re another state. I can’t know that at some point you might not come after me. You might. And if you’re really powerful and I’m not, I’m in deep trouble.
Lex Fridman
(00:11:37)
Yeah. So some of the ideas underlying what you’ve said, offensive realism, which I would love to talk to you about sort of the history of realism versus liberalism, but some of the ideas you already mentioned, anarchy between states, everybody’s trying to develop military capabilities, uncertainty, such an interesting concept. States cannot be sure that other states will not use military capabilities against them, which is one-
John Mearsheimer
(00:12:07)
That’s of enormous importance to the story,
Lex Fridman
(00:12:09)
…really important, and so interesting, because you also say that this makes realists more cautious and more peaceful. Because of all the uncertainty involved here, it’s better to approach international politics with caution, which is really interesting to think about. Again, survival, most states are interested in survival. And the other interesting thing is you assume all the states are rational, which-
John Mearsheimer
(00:12:40)
Most of the time.
Lex Fridman
(00:12:41)
Most of the time. You call this framework offensive realism. Can you just give an overview of the history of the realism versus liberalism debate as worldviews?
John Mearsheimer
(00:12:56)
Well, I think for many centuries now, the big divide within the world of international relations theory is between realism and liberalism. These are time honored bodies of theory. And before I tell you what I think the differences are between those two bodies of theory, it is important to emphasize that there are differences among realists and differences among liberals. And so when you talk about me as an offensive realist, you should understand that there are also defensive realists out there, and there are a panoply of liberal theories as well.

(00:13:42)
But basically realists believe that power matters, that states compete for power, and that war is an instrument of statecraft. And liberals, on the other hand, have what I would say is a more idealistic view of the world. This is not to say that they’re naive or foolish, but they believe there are aspects of international politics that lead to a less competitive and more peaceful world than most realists say. And I’ll lay out for you very quickly, what are the three major liberal theories today that I think will give you a sense of the more optimistic perspective that is inherent in the liberal enterprise.

(00:14:40)
The first and most important of the liberal theories is democratic peace theory, and this is a theory that says democracies do not fight against other democracies. So the more the world is populated with democracies, the less likely it is that we will have wars. And this basic argument is inherent in Francis Fukuyama’s The End of History. He argues that democracy triumphed first over fascism in the 20th century, it then triumphed over communism, and that means that in the future we’re going to have more and more liberal democracies on the planet. And if you have more and more liberal democracies and those democracies don’t fight each other, then you have a more peaceful world. That was his argument. It’s a very liberal argument.

(00:15:36)
A realist like me would say that it doesn’t matter whether a state is a democracy or not, all states behave the same way because the structure of the system, getting back to our earlier discussion about international anarchy, the structure of the system leaves those states no choice, whether they’re democracies or autocracies. And again, the liberal view, this first liberal theory, is that democracies don’t fight other democracies, and therefore the more democracies you have, the more peaceful the world.
Lex Fridman
(00:16:12)
Can I just sort of try to unpack that a little bit? So the democratic peace theory, I guess, would say that in democracies leaders are elected, and the underlying assumption is most people want peace, and so they will elect peacemakers. So the more democracies you have, the more likely you have peace. And then the realist perspective says that it doesn’t matter if the majority of people want peace. The structure of international politics is such that superpowers want to become more super and powerful, and they do that through war.
John Mearsheimer
(00:16:51)
You can’t make that argument that you’re making about democracies, because if you’re saying that democracies are inclined toward peace and that the electorate picks leaders who are inclined towards peace, then you have to show that democracies are, in general, more peaceful than non-democracies, and you can’t support that argument. You can find lots of evidence to support the argument that democracies don’t fight other democracies.

(00:17:25)
So the argument I believe that you have to make, if you’re going to support democratic peace theory, the main argument you have to make is that liberal democracies have a healthy respect for each other and they can assess each other’s intentions. If you’re a liberal democracy, and I’m a liberal democracy, we know we have value systems that argue against aggression, and argue for peaceful resolution of crises. And therefore, given these norms, we can trust each other, we can know each other’s intentions. Remember, for realists like me, uncertainty about intentions really helps drive the train. But if you’re talking about two democracies, the argument there is that they know each other’s intentions.
Lex Fridman
(00:18:19)
And for you, sure, maybe democracies reduce uncertainty a little bit, but not enough to stop the train.
John Mearsheimer
(00:18:26)
I think that’s right, yeah. That’s right. So that’s democratic peace theory. The second theory is economic interdependence theory, and that’s the argument that, in a globalized world like the one that we live in and have lived in for a long time, there’s a great deal of economic interdependence. And if you and I are two countries, or if you and me are two countries and we’re economically interdependent and we’re both getting prosperous as a result of this economic intercourse, the last thing that we’re going to do is start a war, either one of us, because who would kill the goose that lays the golden eggs, it’s that kind of argument. So there you have an argument that economic interdependence leads to peace.

(00:19:09)
And then the third liberal argument has to do with institutions, sometimes referred to as liberal institutionalism. And this is the argument that if you can get states into institutions where they become rule abiding actors, they will obey the rules that dictate that war is not acceptable. So if you get them to accept the UN rules on when you can and cannot initiate a war, then you’ll have a more peaceful world. So those are the liberal theories, and as you can tell, they’re very different from realism as articulated by somebody like me.
Lex Fridman
(00:19:57)
Can you maybe argue a little bit against economic interdependence and against the institutions idea, that institutions follow rules? So the golden goose with the golden egg, you’re saying that nations are happy to kill the goose because, again, they want power.
John Mearsheimer
(00:20:19)
If they think it’s necessary to kill the golden goose because of security concerns, they will do it. The point is that economic interdependence at its root has prosperity as the core variable. In the realist story, the core variable is survival, and survival always trumps prosperity. So if you go back to the period before World War I, we’re in Europe, it’s 1913 or early 1914, what you see is that you have an intense security competition between all of the great powers. On one side you have the Triple Alliance, and on the other side you have the Triple Entente. You have these two alliances, and you have an intense security competition between them. At the same time, you have a great deal of economic interdependence. It’s amazing how much economic intercourse is taking place in Europe among all the actors. And people are getting prosperous, or countries are getting prosperous, as a result. But nevertheless, in the famous July Crisis of 1914, this economic prosperity is unable to prevent World War I because security concerns or survival is more important. So there are going to be lots of situations where prosperity and survival come into conflict, and in those cases, survival will win.
Lex Fridman
(00:21:59)
And maybe you can speak to the different camps of realists. You said offensive and defensive. Can you draw a distinction between those two?
John Mearsheimer
(00:22:09)
Yeah. Let me just back up a bit on that one. And you were talking about will to power before. The first big divide between realists is structural realists and human nature realists, and Hans Morgenthau, who was influenced by Nietzsche and therefore had that will to power logic embedded in his thinking about how the world works, he was a human nature realist. I’m a structural realist and I believe it’s not human nature, it’s not individuals in some will to power that drives competition and war. What drives competition and war is the structure of the system. It’s anarchy.
Lex Fridman
(00:23:03)
So you’re not as romantic as the human nature realists.
John Mearsheimer
(00:23:06)
Yeah. There’s just a world of difference between the two. It’s just important to understand that.
Lex Fridman
(00:23:12)
So within that, from the structural, there’s a subdivision also of offensive and defensive.
John Mearsheimer
(00:23:17)
Yes. Inside the structural realist world. And you have a handful of realists who believe that the structure of the system fosters competition, for sure, security competition. But it really rules out great power war almost all the time. So it makes sense to care about the balance of power, but to focus on maintaining how much power you have. That’s the defensive realism, maintaining how much power you have. Not trying to gain more power, because the argument the defense of realists make is that if you try to gain more power, the system will punish you, the structure will punish you. I’m not a defensive realist, I’m an offensive realist. And my argument is that states look for opportunities to gain more power, and every time they see, or almost every time they see an opportunity to gain more power, and they think the likelihood of success is high and the cost will not be great, they’ll jump at that opportunity.

Hitler

Lex Fridman
(00:24:39)
Just to linger on the human nature perspective, how do you explain Hitler and Nazi Germany, just one of the more recent aggressive expansions through military might? How do you explain that in the framework of offensive realism?
John Mearsheimer
(00:25:04)
Well, I think that Nazi Germany was driven in large part by structural considerations. And I think if you look at Imperial Germany, which was largely responsible for starting World War I, and of course Nazi Germany’s largely responsible for starting World War II, what that tells you is you didn’t need Adolf Hitler to start World War I. And I believe that there is a good chance you would’ve had World War II in the absence of Hitler. I believe that Germany was very powerful, it was deeply worried about the balance of power in Europe, and it had strong incentives to behave aggressively in the late 1930s, early 1940s. So I believe that structure mattered.

(00:25:54)
However, I want to qualify that in the case of Adolf Hitler, because I do think he had what you would call a will to power. I’ve never used that word to describe him before, but it’s consistent with my point that I often make, that there are two leaders, or there have been two leaders in modern history who are congenital aggressors, and one was Napoleon and the other was Hitler. Now, if you want to call that a will to power, you can do that. I’m more comfortable referring to Hitler as a congenital aggressor and referring to Napoleon as a congenital aggressor, although there were important differences between the two, because Hitler was probably the most murderous leader in recorded history, and Napoleon was not in that category at all. But both of them were driven by what you would call a will to power, and that has to be married to the structural argument in Hitler’s case, and also in Napoleon’s case.
Lex Fridman
(00:27:02)
Is there some degree on the human psychology side that resentment, because of what happened after World War I, led to Hitler willing so much power, and then Hitler starting World War II? So this is the human side. Perhaps the reason I asked that question is also because you mentioned the century of humiliation on the China side. So to which degree does humiliation lead to Hitler and lead to World War II?
John Mearsheimer
(00:27:33)
Well, the question of what led to Hitler is a very different question than the question of what led to World War II once Hitler was in power. I mean, after January 30th, 1933, he’s in power. And then the question of what is driving him comes racing to the fore. Is there resentment over the Versailles Treaty and what happened to Germany? Yes. Did that matter? Yes. But my argument is that structure was the principal factor driving the train in Hitler’s case. But what I’m saying here is that there were other factors at play as well, resentment being one of them. Will to power, or the fact that he was a congenital aggressor in my lexicon, certainly mattered as well, so I don’t want to dismiss your point about resentment.
Lex Fridman
(00:28:29)
So Hitler in particular, the way he wielded power, the way he gained so much power, might have been through the general resentment of the populace, the German populace.
John Mearsheimer
(00:28:41)
I think that as a result of defeat in World War I and all the trials and tribulations associated with Weimar Germany, and then the coming of the Great Depression, all of those factors definitely account for his coming to power. I think that one of the reasons that he was so successful at winning over the German people once he came to power was because there was a great deal of resentment in the German body politic. And he played on that resentment, that surely helped him get elected too. But I think having studied the case, it was even more important once he took over.

(00:29:32)
I also believe that one of the principal reasons that he was so popular and he was wildly popular inside Nazi Germany is because he was the only leader of an industrialized country who pulled his country out of the depression. And that really mattered, and it made him very effective. It’s also worth noting that he was a remarkably charismatic individual. I find that hard to believe because every time I look at him or listen to his speeches, he does not appear to be charismatic to me. But I’ve talked to a number of people who are experts on this subject who assure me that he was very charismatic. And I would note to you, if you look at public opinion polls in Germany, West Germany, in the late 1940s, this is the late 1940s after the Third Reich is destroyed in 1945, he is still remarkably popular in the polls.
Lex Fridman
(00:30:31)
Stalin is still popular in many parts of Eastern Europe.
John Mearsheimer
(00:30:36)
Yeah, yeah. And Stalin’s popular in many quarters inside Russia, and Stalin murdered more of his own people than he murdered people outside of the Soviet Union.
Lex Fridman
(00:30:50)
And still, to you, the tides of history turned not on individuals, but on structural considerations. So Hitler may be a surface-layer characteristic of how Germany started the war, but not really the reason.
John Mearsheimer
(00:31:09)
Well, history is a multidimensional phenomenon-
Lex Fridman
(00:31:14)
So I hear.
John Mearsheimer
(00:31:15)
… and we’re talking about interstate relations here, and realism is a theory about how states interact with each other, and there are many other dimensions to international politics. And if you’re talking about someone like Adolf Hitler, why did he start World War II is a very different question than why did he start the Holocaust or why did he push forward a holocaust? I mean, that’s a different question, and realism doesn’t answer that question. So I want to be very clear that I’m not someone who argues that realism answers every question about international politics, but it does answer what is one of the big, if not the biggest, questions that IR scholars care about, which is what causes security competition and what causes great power war.
Lex Fridman
(00:32:10)
Does offensive realism answer the question why Hitler attacked the Soviet Union?
John Mearsheimer
(00:32:17)
Yes.
Lex Fridman
(00:32:18)
Because from a military strategy perspective, there’s pros and cons to that decision.
John Mearsheimer
(00:32:25)
Pros and cons to every decision. The question is, did he think that he could win a quick and decisive victory? And he did, as did his generals. It’s very interesting, I’ve spent a lot of time studying German decision-making in World War II. If you look at the German decision to invade Poland on September 1st, 1939, and you look at the German decision to invade France on May 10th, 1940, and then the Soviet Union on June 22nd, 1941, what you see is there was actually quite a bit of resistance to Hitler in 1938 at the time of Czechoslovakia, Munich, and there was also quite a bit of resistance in September 1939.
Lex Fridman
(00:33:13)
Internally? Or you mean…
John Mearsheimer
(00:33:14)
Internally, internally. For sure. Yeah. People had doubts. They didn’t think the Wehrmacht was ready, and given the fact that World War I had just ended about 20 years before, the thought of starting another European war was not especially attractive to lots of German policy makers, including military leaders. And then came France 1940. In the run-up to May 10th, 1940, there was huge resistance in the German army to attacking France. But that was eventually eliminated because they came up with a clever plan, the Manstein Plan. If you look at the decision to invade the Soviet Union on June 22nd, 1941, which is the only case where they fail… They succeeded in France, they succeeded in Poland, they succeeded at Munich in 1938. Soviet Union is where they fail. There’s hardly any resistance at all, right?
Lex Fridman
(00:34:20)
Yeah. Well, and to say that they failed in the Soviet Union, my grandfather fought for the Soviet Union, there were a lot of successes early on. So there were poor military strategic decisions, I would say, along the way, but it caught Stalin off guard. Maybe you can correct me, but from my perspective, terrifyingly so, they could have been successful if certain different decisions were made from a military perspective.
John Mearsheimer
(00:34:54)
Yeah. I’ve always had the sense they came terrifyingly close to winning. You can make the opposite argument that they were doomed-
John Mearsheimer
(00:35:03)
You can make the opposite argument that they were doomed. But I’m not terribly comfortable making that argument. I think the Wehrmacht, by the summer of 1941, was a finely tuned instrument for war, and the Red Army was in quite terrible shape. Stalin had purged the Officer Corps, they had performed poorly in Finland, and there were all sorts of reasons to think that they were no match for the Wehrmacht.

(00:35:36)
And if you look at what happened in the initial stages of the conflict, that proved to be the case. The Germans won a lot of significant tactical victories early on.
Lex Fridman
(00:35:49)
And if they had focused and gone to Moscow as quickly as possible, again, terrifyingly so, they could have basically toppled Stalin. And one thing that’s-
John Mearsheimer
(00:36:03)
That’s possible.
Lex Fridman
(00:36:04)
That’s possible.
John Mearsheimer
(00:36:05)
Fortunately, we’re not going to run the experiment again, but one could argue that, had they concentrated, as the generals wanted to do, on going straight for Moscow, they would’ve won. I mean, what Hitler wanted to do is, he wanted to go into the Ukraine. I mean, Hitler thought that the main axis… There were three axes. The northern axis went towards Leningrad, the central axis, of course, went to Moscow, and then the southern axis, Army Group South, headed towards Ukraine and deep into the Caucasus.

(00:36:39)
And Hitler believed that that should have been the main axis. And in fact, in 1942, the Soviets, excuse me, the Germans go back on the offensive in 1942. This is Operation Blue, and the main axis in ’42 is deep into the Ukraine and into the Caucasus, and that fails.

(00:37:01)
But one could argue that, had they done that in ’41, had they not gone to Moscow, had they gone, had they concentrated on going deep into Ukraine and into the Caucasus, they could have knocked the Soviets out that way. I’m not sure that in the end I believe that. I think in the end the Soviets would’ve won no matter what, but I’m not a hundred percent sure of that.
Lex Fridman
(00:37:28)
Sometimes, maybe you can educate me, but sometimes they say, just like with Napoleon, winter defeated Hitler in Russia. I think not often enough people tell the story of the soldiers and the motivation and how hard they fight. So it turns out that Ukrainians and Russians are not easy to conquer. They’re the kinds of people that don’t roll over; they fight bravely. There seems to be a difference in certain peoples, in how they see war, how they approach war, how proud they are to fight for their country, to die for their country, these kinds of things. So I think the Battle of Stalingrad tells, at least to me, a story of extremely brave fighting on the Soviet side, and that’s a component of war too. It’s not just structural, it’s not just military strategy, it’s also the humans involved, but maybe that’s a romantic notion of war.
John Mearsheimer
(00:38:33)
No, I think there’s a great deal of truth in that, but let’s just unpack it a bit in the case of the Soviet Union in World War II. The counterargument to that is that in World War I, the Russian Army disintegrated. And if you look at what happened when Napoleon invaded in 1812, and you look at what happened in 1917, and then you look at what happened between ’41 and ’45, the Napoleon case looks a lot like the Hitler case, and it fits neatly with your argument.

(00:39:14)
But World War I does not fit neatly with your argument because the Russians lost and surrendered, and you had the infamous treaty of Brest-Litovsk, where the Soviet Union then, because it went from Russia to the Soviet Union in October 1917, the Soviet Union surrendered large amounts of Soviet territory because it had suffered a humiliating defeat.

(00:39:38)
My argument for why the Russians, let me take that back, why the Soviets fought like wild dogs in World War II is that they were up against a genocidal adversary. You want to understand that the Germans murdered huge numbers of Soviet POWs. The overall total was 3.7 million. And by December, December of 1941, remember the invasion is June ’41, by December of 1941, the Germans have murdered 2 million Soviet POWs. At that point in time, they had murdered many more POWs than they had murdered Jews.

(00:40:20)
And this is not to deny for one second that they were on a murderous rampage when it came to Jews, but they were also on a murderous rampage when it came to Soviet citizens and Soviet soldiers. So those Soviet soldiers quickly came to understand they were fighting for their lives. If they were taken prisoner, they would die. So they fought like wild dogs.
Lex Fridman
(00:40:48)
Yeah, the story of the Holocaust, of the 6 million Jews, is often told extensively. If Hitler had won, had conquered the Soviet Union, it’s terrifying to think what would’ve happened, on a much grander scale than the Holocaust, to the Slavic people, to the Soviet people.
John Mearsheimer
(00:41:08)
Absolutely. All you have to do is read the Hunger Plan, right? And they also had a plan, what was it called? Generalplan Ost, I forget the exact name of it, which made it clear that they were going to murder many tens of millions of people. And by the way, I believe that they would’ve murdered all the Poles and all the Roma. I mean, my view is that the Jews were number one on the genocidal hit list. The Roma, or the gypsies, were number two, and the Poles were number three.

(00:41:42)
And of course, I just explained to you how many POWs they had killed. So they would’ve ended up murdering huge numbers of Soviet citizens as well. But people quickly figured out that this was happening, that’s my point to you. And that gave them, needless to say, very powerful incentives to fight hard against the Germans, and to make sure that they did not win.

Russia and Ukraine

Lex Fridman
(00:42:09)
To fast-forward in time, but not in space, let me ask you about the war in Ukraine. Why did Russia invade Ukraine on February 24th, 2022? What are some of the explanations given? And which do you find the most convincing?
John Mearsheimer
(00:42:33)
Well, clearly, the conventional wisdom is that Putin is principally responsible. Putin is an imperialist, he’s an expansionist.
Lex Fridman
(00:42:43)
That’s the conventional thinking.
John Mearsheimer
(00:42:44)
Yeah, yeah. And the idea is that he is bent on creating a greater Russia, and even more, so he’s interested in dominating Eastern Europe, if not all of Europe, and that Ukraine was the first stop on the train line. And what he wanted to do was to conquer all of Ukraine, incorporate it into a greater Russia, and then he would move on and conquer other countries. This is the conventional wisdom. My view is there is no evidence, let me emphasize, zero evidence, to support that argument.
Lex Fridman
(00:43:26)
Which part? That he would… The imperialist part, the sense that he sought to conquer all of Ukraine, and move on and conquer-
John Mearsheimer
(00:43:36)
There’s no evidence he was interested in conquering all of Ukraine. There was no evidence beforehand that he was interested in conquering any of Ukraine. And there’s no way that an army that had 190,000 troops, at the most, could have conquered all of Ukraine, it’s just impossible.

(00:43:59)
As I like to emphasize, when the Germans went into Poland in 1939, the Germans, you want to remember, were only intent on conquering the western half of Poland, because the Soviets, who came in later that month, were going to conquer the eastern half of Poland. So the western half of Poland is much smaller than Ukraine, and the Germans went in with 1.5 million troops. If Vladimir Putin were bent on conquering all of Ukraine, he would’ve needed at least 2 million troops. I would argue he’d need 3 million troops, because not only would he need to conquer the country, he would then have to occupy it.

(00:44:44)
But the idea that 190,000 troops was sufficient for conquering all of Ukraine, it’s not a serious argument. Furthermore, he was not interested in conquering Ukraine, and that’s why, in March 2022, this is immediately after the war starts, he is negotiating with Zelensky to end the war. There are serious negotiations taking place in Istanbul involving the Turks. And Naftali Bennett, who was the Israeli prime minister at the time, was deeply involved in negotiating with both Putin and Zelensky to end the war.

(00:45:22)
Well, if he was interested, Putin, in conquering all of Ukraine, why in God’s name would he be negotiating with Zelensky to end the war? And of course, what they were negotiating about was NATO expansion into Ukraine, which was the principal cause of the war. People in the West don’t want to hear that argument because if it is true, which it is, then the West is principally responsible for this bloodbath that’s now taking place. And of course, the West doesn’t want to be principally responsible. It wants to blame Vladimir Putin.

(00:45:59)
So we’ve invented this story out of whole cloth that he is an aggressor, that he’s the second coming of Adolf Hitler, and that what he did in Ukraine was try to conquer all of it and he failed. But with a little bit of luck, he probably would’ve conquered all of it, and he’d now be in the Baltic States, and eventually end up dominating all of Eastern Europe. As I said, I think there’s no evidence to support this.
Lex Fridman
(00:46:28)
So maybe there’s a lot of things to ask there. Maybe just to linger on NATO expansion, what is NATO expansion? What is the threat of NATO expansion and why is this such a concern for Russia?
John Mearsheimer
(00:46:42)
NATO was a mortal enemy of the Soviet Union during the Cold War. It’s a military alliance which has at its heart the United States of America, which is the most powerful state on the planet. It is perfectly understandable that Russia is not going to want that military alliance on its doorstep.

(00:47:08)
Here in the United States we have, as you well know, what’s called the Monroe Doctrine, and that basically says no great powers from Europe or Asia are allowed to come into our neighborhood and form a military alliance with anybody in this neighborhood. When I was young, there was this thing called the Cuban Missile Crisis. The Soviets had the audacity to put nuclear armed missiles in Cuba. We told them in no uncertain terms that that was not acceptable, and that those missiles had to be removed. This is our backyard and we do not tolerate distant great powers coming into our neighborhood.

(00:47:45)
Well, what’s good for the goose is good for the gander. And if we don’t like great powers coming into our neighborhood, it’s hardly surprising that the Russians did not want NATO on their doorstep. They made that manifestly clear when the Cold War ended, and they exacted a promise from us that we would not expand NATO. And then when we started expanding NATO, they made it clear, after the first tranche in 1999, that they were profoundly unhappy with that. They made it clear in 2004, after the second tranche, that they were profoundly unhappy with that expansion.

(00:48:29)
And then, in April 2008, when NATO announced that Ukraine and Georgia would become part of NATO, the Russians made it unequivocally clear, not just Putin, that that was not going to happen. They were drawing a red line in the sand. And it is no accident that in August 2008, remember, the Bucharest Summit is April 2008, and in August 2008, you had a war between Georgia and Russia, and that involved, at its core, NATO expansion.

(00:49:02)
So the Americans and their allies should have understood by at least August 2008 that continuing to push to bring Ukraine into NATO was going to lead to disaster. And I would note that there were all sorts of people in the 1990s, like George Kennan, William Perry, who was Bill Clinton’s Secretary of Defense, the Chairman of the Joint Chiefs of Staff, Paul Nitze, and so forth and so on, who argued that NATO expansion would end up producing a disaster, which it has.

(00:49:38)
I would note that at the famous April 2008 Bucharest Summit, where NATO said that Ukraine would be brought into the alliance, Angela Merkel and Nicolas Sarkozy, the German and French leaders respectively, opposed that decision. Angela Merkel later said that the reason she opposed it was because she understood that Putin would interpret it as a declaration of war. Just think about that. Merkel is telling you that she opposed NATO expansion into Ukraine, because she understood, correctly, that Putin would see it as a declaration of war.

(00:50:22)
What did the United States and its friends in Europe do? They continued to push and push, because we thought that we could push NATO expansion down their throat after 2008, the same way we did in 1999 and 2004, but we were wrong, and it all blew up in our face in 2014. And when it blew up in our face in 2014, what did we do? Did we back off and say, “Well, maybe the Russians have some legitimate security interests”? No, that’s not the way we operate. We continued to double down.

(00:50:57)
And the end result is that in 2022, you got a war. And as I’ve argued for a long time now, we, the West, are principally responsible for that, not Vladimir Putin.
Lex Fridman
(00:51:11)
So the expansion of NATO is primarily responsible for that.
John Mearsheimer
(00:51:15)
Yeah. To put it in more general terms, what we were trying to do was turn Ukraine into a Western bulwark on Russia’s border, and it really wasn’t NATO expansion alone. NATO expansion was the most important element of our strategy, but the strategy had two other dimensions. One was EU expansion, and the third was the color revolutions. We were trying to foment an Orange Revolution in Ukraine, and the basic goal there was to turn Ukraine into a pro-Western, liberal democracy.

(00:51:52)
And that meant that you’d have Ukraine, if it worked, as a pro-Western liberal democracy that was in the EU, and that was in NATO. This was our goal. And the Russians made it unequivocally clear Ukraine was not going to become a Western bulwark on their border, and most importantly, they made it clear that Ukraine in NATO was unacceptable.
Lex Fridman
(00:52:19)
Can we talk about the mind of Vladimir Putin? You’ve mentioned that this idea that he has aspirations for imperialist conquest, that he dreams of empire, is not grounded in reality. He wrote an essay in 2021, about “one people.” Do you think there is some degree to which he still dreams of the former Soviet Union reuniting?
John Mearsheimer
(00:52:50)
No, he’s made it clear that anybody with a triple digit IQ understands that it’s nuts to think about recreating the Soviet Union. He thinks it’s a tragedy that the Soviet Union fell apart, but as he made clear in that essay, the July 12th, 2021 essay, and as he made clear in speeches before, immediately before he invaded Ukraine, he accepted the breakup of the Soviet Union, and he accepted the status quo in Europe, save for the fact he did not accept the idea that Ukraine would become part of NATO.
Lex Fridman
(00:53:33)
He’s been in power for over two decades. Is there a degree to which power can affect a leader’s ability to see the world clearly, can, as they say, corrupt? Do you think power has corrupted Vladimir Putin, to a degree?
John Mearsheimer
(00:53:52)
It’s very hard for me to answer that question because I don’t know him, and I’ve not studied him carefully in terms of his overall performance over the course of the 23 years that he’s been in power. I’ve studied him as a strategist, and I’ve studied how he deals with the West, and deals with the international system more generally since 2014. And I think he is a first class strategist.

(00:54:31)
This is not to say he doesn’t make mistakes, and he admits he’s made some mistakes, but I think that the West is dealing with a formidable adversary here. And I don’t see any evidence that he’s either lost speed off his fastball, or that power has corrupted his thinking about strategic affairs.
Lex Fridman
(00:54:59)
So he has consistently put security as a primary concern? As does the United States, he’s put Russia’s security first, making sure that NATO doesn’t get close to its borders?
John Mearsheimer
(00:55:12)
I think that’s clear. Yeah, I think as I emphasized early on in our conversation, that leaders privilege security or survival over everything else. And by the way, he gave a number of talks and press conferences in addition to writing that famous article that you referred to on July 12th, 2021. So we have a pretty clear record of what he was saying, and I would argue what he was thinking, in the run-up to the war in February 2022.

(00:55:50)
And if you read what he said, it’s quite clear that he privileged security or survival. He was deeply concerned about the security of Russia. And Russia is a quite vulnerable state in a lot of ways, especially if you think back to what it looked like in the 1990s, as you know better than I do. It was in terrible shape. The Chinese talk about the century of national humiliation. One could argue that for the Russians, that was the decade of national humiliation. And it took Putin, I think, quite a bit of time to bring the Russians back from the dead. I think he eventually succeeded, but it took a considerable amount of time, and I think he understood that he was not playing a particularly strong hand. He was playing something of a weak hand, and he had to be very careful, very cautious, and I think he was. And I think that’s very different than the United States. The United States was the unipole. It was the most powerful state in the history of the world, the most powerful state relative to all its possible competitors. From roughly 1989, certainly after December 1991, when the Soviet Union fell apart, up until, I would argue, about 2017, we were incredibly powerful. And even after 2017, up to today, the United States remains the most powerful state in the system.

(00:57:18)
And because of our geographical location, we are in a terrific situation to survive in any great power competition. So you have a situation involving the United States that’s different than the situation involving Russia. They’re just much more vulnerable than we are. And therefore, I think Putin tends to be more sensitive about security than any American president in recent times.
Lex Fridman
(00:57:51)
Europe on one side, China on the other side. It’s a complicated situation.
John Mearsheimer
(00:57:56)
Yeah. And we talked before about 1812, when Napoleon invaded and Moscow got burned to the ground. We talked about World War I, where the Russians were actually defeated and surrendered, and then we talked about 1941 to 1945, where, although thankfully the Soviets prevailed, it was a close call. And I mean, the casualties, the destruction inflicted on the Soviet Union by the Germans, is just almost hard to believe. So they are sensitive.

(00:58:38)
You can understand full well, or at least you should be able to understand full well, why the idea of bringing NATO up to their border really spooked them. I don’t understand why more Americans don’t understand that; it befuddles me. I think it has to do with the fact that Americans are not very good at putting themselves in the shoes of other countries. And if you’re going to be a first-class strategist in international politics, you really have to be able to do that. You have to put yourself in the shoes of the other side and think about how they think, so you don’t make foolish mistakes.
Lex Fridman
(00:59:16)
And as a starting point, Americans tend to see themselves as the good guys and a set of others as the bad guys. And you have to be able to empathize that Russians think of themselves as the good guys, the Chinese think of themselves as the good guys, and just be able to empathize with that. What if they are the good guys? It’s like that funny skit, “Are we the baddies?” Consider that the United States could be the bad guys.

(00:59:44)
First of all, see the world as if the United States is the bad guys and China is the good guys. What does that world look like? Be able to just exist with that thought, because that is what the Chinese leadership and many Chinese citizens, if not now, maybe in the future, will believe. And you have to kind of do the calculation, run the simulation forward from that. And same with Russia, same with other nations.
John Mearsheimer
(01:00:12)
Yeah, I agree with you, a hundred percent. And just, I always think of Michael McFaul at Stanford, who was the American ambassador to Russia, I think between 2012 and 2014. And he told me that he told Putin that Putin didn’t have to worry about NATO expansion because the United States was a benign hegemon.

(01:00:36)
And I asked Mike what Putin’s response was to that. And Mike said that Putin didn’t believe it, but Mike believed that he should believe it, and that we could move NATO eastward to include Ukraine, and in the end, we’d get away with it because we are a benign hegemon. But the fact is that’s not what Putin saw. Putin saw us as a malign hegemon. And what Mike thinks, or any American thinks, doesn’t matter. What matters is what Putin thinks.
Lex Fridman
(01:01:15)
But also, the drums of war have been beating for some reason. NATO expansion has been threatened for some reason. So you’ve talked about NATO expansion being dead, so it doesn’t make sense from a geopolitical perspective, on the Europe side, to expand NATO. But nevertheless, that threat has been echoed. So why has NATO expansion been pushed, from your perspective?
John Mearsheimer
(01:01:46)
There are two reasons. One is, first of all, we thought it was a wonderful thing to bring more and more countries into NATO. We thought that it facilitated peace and prosperity. It was ultimately all for the good. And we also thought that countries like Ukraine had a right to join NATO.

(01:02:12)
These are sovereign countries that can decide for themselves, and the Russians have no say in what Ukraine wants to do. And then finally, and this is a point I emphasized before, we were very powerful, and we thought we could shove it down their throat. So it’s a combination of those factors that led us to pursue what I think was ultimately a foolish policy.
Lex Fridman
(01:02:39)
We’ve talked about how wars get started. How do you hope the war in Ukraine ends? What are the ways to end this war? What are the ways to achieve peace there? To end the, I would say, senseless death of young men, as always happens in war?
John Mearsheimer
(01:03:04)
I’m sad to say I don’t have a good answer to that. I don’t think there’s any real prospect of a meaningful peace agreement. I think it’s almost impossible. I think the best you can hope for at this point is, at some point the shooting stops, you have a ceasefire, and then you have a frozen conflict. And that frozen conflict will not be highly stable.

(01:03:36)
And the Ukrainians and the West will do everything they can to weaken Russia’s position, and the Russians will go to great lengths not only to damage that dysfunctional rump state that Ukraine becomes, but also to sow dissension within the alliance. And that includes in terms of transatlantic relations.

(01:04:03)
So you’ll have this continuing security competition between Russia on one side, and Ukraine and the West on the other, even when you get a frozen peace, or a frozen conflict. And the potential for escalation there will be great. So I think this is a disaster.
Lex Fridman
(01:04:25)
That’s a very realist perspective. Let me ask you sort of the human side of it. Do you think there’s some power to leaders sitting down, having a conversation, man to man, leader to leader, about this? There is just a lot of death happening. It seems that, from an economic perspective, from a historic perspective, from a human perspective, both nations are losing.

(01:04:55)
Is it possible for Volodymyr Zelensky and Vladimir Putin to sit down and talk, and to figure out a way where the security concerns are addressed, and both nations can minimize the amount of suffering that’s happening, and create a path towards future flourishing?
John Mearsheimer
(01:05:21)
I think the answer is no.
Lex Fridman
(01:05:23)
Even with the United States involved, three people in the room?
John Mearsheimer
(01:05:27)
Well, I think if the United States is involved, the answer is definitely no. You have to get the Americans out. And then, I think if you have Zelensky and Putin talking, you have a sliver of a chance there. The Americans are a real problem. Look, let’s go back to what happens right after the war starts, okay? As I said before, we’re talking March, early April of 2022. The war starts on February 24th, 2022.

(01:05:59)
And as I said to you, the two sides were negotiating in Istanbul, and they were also negotiating through Naftali Bennett, and the Bennett track and the Turkish track were operating together. I mean, they were not at cross purposes at all. What happened? Bennett tells the story very clearly that they had made significant progress in reaching an agreement. This is Zelensky on one side and Putin on the other. Bennett is talking in person to both Putin and Zelensky, and what happens to produce failure?

(01:06:45)
The answer is, the United States and Britain get involved and tell Zelensky to walk. They tell Zelensky to walk. If they had come in and encouraged Zelensky to try to figure out a way with Putin to shut this one down, and worked with Bennett, and worked with Erdogan, we might’ve been able to shut the war down then, but it was the United States.
Lex Fridman
(01:07:09)
Well, let me sort of push back on that. You’re correct, but the United States paints this picture that everybody’s aligned. Maybe you can correct me, but I believe in the power of individuals, especially individual leaders. Again, whether it’s Biden or Trump or whoever goes into a room and says, in a way that’s convincing, that no more NATO expansion. And actually just on a basic human level, ask the question of why are we doing all this senseless killing?

(01:07:49)
And look at the interest of one, Russia, look at the interest of the other, Ukraine. Their interests are pretty simple. And say, the United States is going to stay out of this. We’re not going to expand NATO, and say all that in a way that’s convincing, which is that NATO expansion is silly at this point, China’s the big threat. We’re not going to do this kind of conflict escalation with Russia. The Cold War’s over, let’s normalize relations.
John Mearsheimer
(01:08:20)
Let me just embellish your argument, okay?
Lex Fridman
(01:08:23)
Thank you. I need it.
John Mearsheimer
(01:08:26)
If we say there’s a sliver of a chance that you can do this, and I do think there is a sliver of a chance. Let me just embellish your point.
Lex Fridman
(01:08:34)
Thank you. I need all the help I can get.
John Mearsheimer
(01:08:37)
Two things have to be done here, in my opinion. One is, Ukraine has to become neutral, and it has to completely sever all security ties with the West, right? It is not like you can say, “We’re not going to expand NATO to include Ukraine, but we’re going to continue to have some loose security arrangement with Ukraine.” None of that. It has to be completely severed. Ukraine has to be on its own, okay?

(01:09:13)
And number two, Ukraine has to accept the fact that the Russians are going to keep the four oblasts that they’ve now annexed, and Crimea. The Russians are not going to give them back. And what you really want to do, if you’re Zelensky or whoever’s running Ukraine in this scenario that we’re positing, is you want to make sure the Russians don’t take another four oblasts, to include Kharkiv and Odessa.

(01:09:45)
If I’m playing Putin’s hand and this war goes on, I’m thinking about taking four more oblasts. I want to take about 43% of Ukraine and annex it to Russia, and I certainly want Odessa, and I certainly want Kharkiv, and I want the two oblasts-
John Mearsheimer
(01:10:03)
And I certainly want Kharkiv, and I want the two oblasts in between as well.
Lex Fridman
(01:10:05)
Literally, or as leverage in negotiation for Ukrainian neutrality?
John Mearsheimer
(01:10:12)
No, I want them literally, I want to conquer them literally. My point to you is, if we can begin to talk about cutting a deal now, you may be able to head that kind of aggression off at the pass. In other words, you may be able to limit Putin and Russia to annexing the four oblasts that they’ve now annexed, plus Crimea. That’s the best I think you can hope for. The point is you have to get the Ukrainians to accept that. You have to get the Ukrainians to accept becoming a truly neutral state and conceding that the Russians keep a big chunk of territory. It’s about 23% of Ukrainian territory that they’ve annexed, and I find it hard to imagine any Ukrainian leader agreeing to that.
Lex Fridman
(01:11:03)
Well, there could be more nuanced things like no military involvement between the United States and Ukraine, but economic involvement, sort of financial support, so normalizing economic relationships with Ukraine, with Russia, all being-
John Mearsheimer
(01:11:21)
I think you could probably get away with that. I think the tricky question there that you would have to answer is what about EU expansion? And I think EU expansion is probably a no-no for the Russians because most people don’t recognize this, but there is a military dimension built into EU expansion. It’s not purely an economic alliance or relationship or institution, whatever word you want to use. There’s a military dimension to that. In the run-up to the war, actually in the run-up to the 2014 crisis, when it first broke out, the Russians made it clear they saw EU expansion as a stalking horse for NATO expansion.

(01:12:10)
So EU expansion is tricky, but I think your point about close economic relations … or healthy economic relations, to use a better term, between Ukraine and the West is possible. I think the Russians, if it’s a neutral Ukraine, have a vested interest in that Ukraine flourishing, but that then brings us back to the territorial issue, right?
Lex Fridman
(01:12:39)
Well, so do you believe it’s possible for individual human relations to counteract the structural forces that you talk about? So meaning the leaders being able to pick up the phone and make agreements that are good for humanity as a whole and for their individual nations in the long term?
John Mearsheimer
(01:12:59)
I think leadership matters here. I mean, one of the real problems here is that there’s no trust on the Russian side, and that has to do with the Minsk agreements. The Minsk agreements, which were designed to shut down the civil war in eastern Ukraine, in the Donbas, really mattered to the Russians. And there were four players involved in the Minsk process, four main players: Russia and Ukraine of course, and then Germany and France. And I believe the Russians took the Minsk Accords seriously. I believe Putin took them very seriously. He wanted to shut down that conflict.

(01:13:52)
And Angela Merkel, François Hollande, who was the French leader, and Poroshenko, who was the Ukrainian leader, those were the three key players besides Putin. Again, Hollande from France, Merkel from Germany, and Poroshenko from Ukraine have all explicitly said they were not seriously interested in reaching an agreement; in all of the discussions with Putin, they were bamboozling him. They were trying to trick him so that they would buy time to build up Ukraine’s military. Putin is profoundly upset about these admissions by these three leaders. He believes he was fooled into thinking that Minsk could work. He believes that he negotiated in good faith and they did not.

(01:14:49)
And he believes that the level of trust now between Russia and the West is virtually zero as a result of this experience over Minsk. I only bring this up because it cuts against your argument that leaders could pick up the phone and talk to each other and trust each other at least somewhat to work out a meaningful deal. If you’re Putin at this point in time, trusting the West is not an idea that’s going to be very attractive at all. In fact, you’re going to distrust anything they say.
Lex Fridman
(01:15:30)
Yeah, distrust anything the West says, but there are individual humans. The way human nature works is, when you’re sitting across from a person, you can trust a human being while still distrusting the West. I mean, I believe in the power of that. I think with the right leaders, you could sit down and talk, override the general structural distrust of the West and say, “You know what? I like this guy or gal, whatever.” I do hope Zelensky and Putin sit down together and talk, have multiple talks.
John Mearsheimer
(01:16:08)
Just remember they were doing that in March and the Americans came in and the British came in and they scotched a potential deal.
Lex Fridman
(01:16:17)
Well, the other beautiful thing about human nature, there’s forgiveness and there’s trying again.
John Mearsheimer
(01:16:25)
When you’re the leader of a country in an anarchic system, you have to be very careful not to let your trust in a foreign leader take you too far, because if that foreign leader betrays you, betrays your trust and stabs you in the back, you could die. And again, you want to remember that the principal responsibility of any leader, I don’t care what country it is, is to ensure the survival of their state. And that means that trust is only going to buy you so much, and when you’ve already betrayed the trust of a leader, you really are not going to be able to rely on trust very much to help you moving forward. Now, you disagree with that? I hope you’re right.

(01:17:17)
And if they can shut down the Ukraine-Russia war, it would be wonderful. If I’m proved dead wrong, that would be wonderful news. My prediction that this war is going to go on for a long time and end in an ugly way is a prediction that I don’t like at all. So I hope I’m wrong.
Lex Fridman
(01:17:45)
You wrote that many in the West believe that the best hope for ending the Ukraine war is to remove Vladimir Putin from power, but you argue that this isn’t the case. Can you explain?
John Mearsheimer
(01:17:58)
Well, a lot of people thought when they were having all that trouble, the Russians were having all that trouble with Prigozhin and the Wagner Group that Putin was vulnerable and was likely to be overthrown. And what would happen is a peace-loving leader would replace Putin. I made two points at the time, and I would make those same two points now. Number one, he’s not likely to be overthrown. He was not likely then to be overthrown. And I think as long as his health holds up, I think he will remain in power. My second point is if he doesn’t remain in power and he’s replaced, I would bet a lot of money that his replacement will be more hawkish and more hard line than Putin is.

(01:18:58)
I actually think one could argue that Putin was too trusting of the West before the war started, and number two, I think one could argue that he has not waged the war against Ukraine as vigorously as one might have expected. He was slow to mobilize the nation for war, and he has pursued a limited war in all sorts of ways. The Israelis, for example, have killed more civilians in Gaza in one month than the Russians have killed over 18 months in Ukraine. The idea that Vladimir Putin is waging a punishment campaign and killing, on purpose, large numbers of civilians is simply not true.

(01:19:53)
All this is just to say that I would imagine that if Putin leaves office and someone else comes in to replace him, that someone else will be at least as hard line as, if not more hard line than, him in terms of waging the war, and certainly will not trust the West any more than he has.
Lex Fridman
(01:20:15)
By way of advice, let me ask you, if I were to have a conversation, an interview, with Vladimir Putin and Zelensky individually, what should I ask them? If you, me and Vladimir Putin are having a chat, what are good ideas to explore? What are good questions to ask? What are good things to say, on or off the mic, once again, that could potentially, even slightly, lessen the amount of suffering in the world caused by this war?
John Mearsheimer
(01:20:51)
I think if you get an interview with Vladimir Putin, there’s just all sorts of questions you could ask him. And my sense is that Putin is a straight shooter. He’s also very knowledgeable about history, and he has simple theories in his head about how the world works. I think he would level with you, and all you would have to do is just figure out what all the right questions are. That would not be hard to do. You could ask him, why was he so foolish? For example, why was he so foolish as to trust Poroshenko, Hollande and Merkel in the Minsk Accords? Why, after his famous talk at Munich in 2007, where he made it clear that he was so unhappy with the West, did he continue to, in a very important way, trust the West?

(01:21:52)
Why didn’t he mobilize the Russian military before late September 2022? Once the negotiations that we were talking about before, involving Istanbul and Naftali Bennett, broke down, why didn’t he immediately mobilize more of the Russian population to fight the war? Just all sorts of questions like that. Then, you could ask him questions about where he sees this one headed. What’s the best strategy for Russia if the Ukrainians will not agree to neutrality? People like John Mearsheimer say, “You’ll probably take close to half of Ukraine.” Is that true? Does it make sense to take Odessa?
Lex Fridman
(01:22:47)
And John Mearsheimer also has questions about China, your future relationships with China?
John Mearsheimer
(01:22:53)
Yeah, I mean, one really important question that I would ask him is if the United States had basically not driven you into the arms of the Chinese, if there had been no war over Ukraine and the United States and its European allies had gone to considerable lengths to create some sort of security architecture in Europe that resulted in you, Vladimir Putin having good relations with Ukraine, what would your relations with China be and how would you think about that? So there are just plenty of questions you could ask him.
Lex Fridman
(01:23:33)
Well, hope burns eternal in my heart, and I think probably in Putin’s heart and Zelensky’s heart, I hope, because hope, the leap of trust that we’ve talked about, I think is necessary for de-escalation and for peace.
John Mearsheimer
(01:23:50)
Well, you realize, I have, from the beginning, argued for different policies that were all designed to prevent this war from ever happening.
Lex Fridman
(01:23:59)
Yes.
John Mearsheimer
(01:24:00)
I don’t know if you know this, but in 1993, I argued that Ukraine should keep its nuclear weapons. I was probably the only person in the West who made that argument. And my argument in 1993, this is in Foreign Affairs, was that there may come the day when Russia thinks about invading Ukraine. And should that day come, it would be very helpful for preventing war if Ukraine had nuclear weapons.
Lex Fridman
(01:24:27)
So military might is essential for maintaining a balance of power and peace.
John Mearsheimer
(01:24:33)
Well, if you’re interested in deterring an adversary, if I’m worried about you coming after me, the best way to deter you is to have military might. If you’re Russia, and I’m Ukraine, I’m far weaker than you, right?
Lex Fridman
(01:24:46)
Yeah.
John Mearsheimer
(01:24:47)
And having a nuclear deterrent would be very effective at convincing you not to attack me, because if you attack me, you’re threatening my survival. And that’s the one circumstance where it is likely that I would use nuclear weapons to defend myself, and given the consequences of nuclear use, you would be reluctant in the extreme to attack me. So that’s why I argued in ’93 that if Ukraine kept its nuclear weapons, that made war down the road much less likely. And I believe I was correct. And in fact, Bill Clinton, who played the key role in forcing Ukraine to give up its nuclear weapons, now says, he has said it publicly, you can find it on YouTube, that he made a mistake doing that.

(01:25:36)
Furthermore, I argued in 2014 that it made eminently good sense not to continue to push to bring Ukraine into NATO, because the end result is that Ukraine would be destroyed, and Ukraine is being destroyed. So I was deeply interested at the time in making sure that that didn’t happen, for the good of the Ukrainians, not to mention because stability in Europe is a net positive for almost everybody involved, but people did not listen to me then either.
Lex Fridman
(01:26:08)
How did nuclear weapons change the calculus of offensive realism, because of mutually assured destruction? I mean, it’s not just military might. It’s just so destructive that you basically can’t use nuclear weapons unless you want complete destruction.
John Mearsheimer
(01:26:28)
There’s no question that the presence of nuclear weapons makes it much less likely, I’m choosing my words carefully here, much less likely, that a great power would aggress against another great power. It doesn’t take that possibility off the table, but it makes it much less likely because of the reasons that you articulated. With regard to nuclear use, it’s an interesting question how you think about nuclear use in a MAD world. I mean, your point is that we’re in a MAD world, and that’s mad with capital letters as well as mad with small letters, but let’s stick to the capital letters. We’re in a world of mutual assured destruction. There’s no question that in that world, it’s unlikely that nuclear weapons would be used.

(01:27:22)
The way you use nuclear weapons in that world is you use them for manipulation of risk purposes, for demonstration effect. You put both sides out on the slippery slope. Now, what exactly am I saying here? Let me talk about NATO doctrine during the Cold War. We lived in a MAD world. The United States and the Soviet Union, or the Warsaw Pact and NATO, both had an assured destruction capability. So you had mutual assured destruction. If the Warsaw Pact were to invade Western Europe, and here we’re talking about West Germany, and NATO was losing the war, we said that we would use nuclear weapons. How would we use nuclear weapons, given that we were in a MAD world? The argument was that we would use a handful of nuclear weapons against the Warsaw Pact, not necessarily against their military forces.

(01:28:25)
It could be in a remote area. We would use a small number of nuclear weapons to signal to the Soviets that we were deadly serious about putting an end to their offensive, and that we were throwing both sides out on the slippery slope to oblivion. In other words, we were manipulating risk, and the last clear chance to avoid Armageddon rested with them. And then, we would tell them that if you retaliated with a handful of nuclear weapons and you didn’t cease your offensive against West Germany, we would launch another small nuclear attack. We would explode a handful more nuclear weapons, all for the purposes of showing you our resolve.

(01:29:21)
So this is the manipulation of risk strategy, and a lot of the language I just used in describing it to you is language that Thomas Schelling invented. Now fast-forward to the present, if Russia were losing in Ukraine, that’s the one scenario where I think where Russia would’ve used nuclear weapons. The question is, how would Russia have used nuclear weapons? Again, we’re assuming that the Russians are losing to the Ukrainians. I believe they would’ve pursued a manipulation of risk strategy. They would’ve used four or five, three or four, who knows, nuclear weapons-
Lex Fridman
(01:29:59)
Maybe just one in a rural area that kills very few people.
John Mearsheimer
(01:30:03)
Yes, exactly, and basically, that would spook everybody. The American-
Lex Fridman
(01:30:08)
Just the mushroom cloud.
John Mearsheimer
(01:30:10)
Yeah. It’s because of the threat of escalation.
Lex Fridman
(01:30:14)
Yeah.
John Mearsheimer
(01:30:14)
Again, your point is we’re in a MAD world. I accept that. And if you have limited nuclear use, right? We understand hardly anything about nuclear escalation because, thank goodness, we’ve never had a nuclear war. So once you throw both sides out on the slippery slope, even if you only use one nuclear weapon in your scenario, you don’t know what the escalation dynamics look like. So everybody has a powerful incentive to put an end to the conflict right away. I might add to you that there were people who believed that we would not even initiate a manipulation of risk strategy in Europe if we were losing to the Warsaw Pact during the Cold War.

(01:31:04)
Both Henry Kissinger and Robert McNamara said after leaving office that they would not have done it. They would’ve not initiated nuclear use, even limited nuclear use. That’s what we’re talking about here. They would rather be red than dead, that was the argument.
Lex Fridman
(01:31:24)
Too risky.
John Mearsheimer
(01:31:25)
Too risky. That’s exactly right, but if they had used one nuclear weapon in your story, or three or four in my story, everybody would’ve said, “Oh my God, we’ve got to shut this one down immediately.” I only tell you this story or lay out this scenario as an answer to your question of how you use nuclear weapons in a MAD world, and this is the answer.
Lex Fridman
(01:31:53)
This is all very terrifying. Perhaps in part, it’s terrifying to me because I can see in the 21st century, China, Russia, Israel, United States using a nuclear weapon in this way, blowing it up somewhere in the middle of nowhere that kills maybe nobody, but I’m terrified of seeing the mushroom cloud and not knowing, given social media, given how fast news travels, what the escalation looks like there. Just in a matter of minutes, how the news travels and how the leaders react. It’s terrifying that this little demonstration of power, the ripple effects of it, in a matter of minutes, seconds, what that leads to because it’s human emotions.

(01:32:51)
You see the landscape of human emotions, the leaders and the populace and the way news is reported, and then the landscape of risk, as you mentioned, shifting the world’s most intense nonlinear dynamical system, and it is just terrifying, because the entirety of human civilization hangs in the balance there. And it’s like, hundreds of millions of people could be dead.
John Mearsheimer
(01:33:21)
Let’s just talk about this in the context of the Ukrainian War. If the Russians were losing, as I said before, which is not the case anymore, but in 2022, it did look like that, if the Russians are losing and they turn to nuclear weapons, the question is how do they use them? And they would use them in Ukraine, and because Ukraine has no nuclear weapons of its own, Ukraine cannot retaliate. It’s not a mutual assured destruction world. It’s a case where one side has nuclear weapons and the other doesn’t. That means that the Russians are likely to think that they can get away with using nuclear weapons in ways that would not be the case if they were attacking NATO.

(01:34:17)
And therefore, it makes nuclear use more likely. Okay. That’s point one. Point two is let’s assume that the Russians use two or three nuclear weapons in a remote area-
Lex Fridman
(01:34:27)
My palms are sweating, by the way. Just as a commentary. It’s terrifying.
John Mearsheimer
(01:34:32)
Yeah. The question then is, what does the West do? Now, Macron has said, and Biden has also, I think, implicitly made this clear, “We would not retaliate with nuclear weapons if the Russians were to attack with a handful of nuclear weapons in Western Ukraine.” Then, the question is, what would we do? And if you listen to David Petraeus, what David Petraeus says is that we should attack the Russian naval assets in the Black Sea and attack Russian forces in Ukraine. Well, once you do that, you have a great power war. You have NATO versus Russia, which is another way of saying you have the United States versus Russia. We’re now in a great power war.

(01:35:23)
They have nuclear weapons, we have nuclear weapons. They’ve used nuclear weapons. What is the happy ending here? And just to take it a step further and go back to our earlier discussion about moving NATO up to Russia’s borders, the point I made, which you’ll surely agree with, is that the Russians are very fearful when they see NATO coming up to their border. Well, here’s a case where not only has NATO come up to their border, but they’re in a war with NATO right on their border. What do the escalation dynamics look like there? You know what the answer is? Who knows? That should scare the living bejesus out of you, right?
Lex Fridman
(01:36:06)
And some of it could be, like you mentioned, unintended. There could be unintended consequences. It could be a Russian missile that misses and hits Poland, these kinds of things that just escalate misunderstandings, miscommunications. Even a nuclear weapon could be … boy, it could have been planned to go to location X, and it went to location Y that ended up actually killing a very large number of people. I mean, the escalation that happens there just happens in a matter of minutes. And the only way to stop that is communication between leaders. And that, to me, is a big argument for ongoing communication.
John Mearsheimer
(01:36:52)
There’s a story that during the Cuban Missile Crisis, Kennedy put out the word that no aircraft, under any circumstances, were to penetrate Soviet airspace. He then found out a few days later that some guy hadn’t gotten the message and had penetrated in an aircraft deep into Soviet airspace.
Lex Fridman
(01:37:19)
Yeah.
John Mearsheimer
(01:37:19)
And this supports your basic point that bad things happen.
Lex Fridman
(01:37:25)
Yeah.
John Mearsheimer
(01:37:26)
And again, the overarching point here is we’ve never done this before, thankfully. Therefore, we don’t have a lot of experience as to how it plays itself out. It’s really a theoretical enterprise because there’s no empirical basis for talking about escalation in a nuclear crisis. And that, of course, is a wonderful thing.
Lex Fridman
(01:37:50)
Well, and in general, the human species as a whole, as a one-off, is a theoretical enterprise, the survival of the human species. We’ve seen empires rise and fall, but we haven’t seen the human species rise and fall. So far it’s been rising, but it’s not obvious that it doesn’t end. In fact, I think about aliens a lot, and the fact that we don’t see aliens makes me suspect it’s not so easy to survive in this complicated world of ours. Switching gears a little bit and going to a different part of the world, also engulfed in war, let me ask you about the situation in Israel. Why did Hamas attack Israel on October 7th, 2023? As you understand the situation, what was the reason that attack happened?

Israel and Palestine

John Mearsheimer
(01:38:48)
Well, I think the main reason was that you had this suffocating occupation. I think as long as the occupation persists, the Palestinians are going to resist. As you well know, this is not the first time there has been a Palestinian uprising. There was the first Intifada, there was the second Intifada, now there’s October 7th, and there are uprisings besides those three, so this is not terribly surprising. A lot of people hypothesized that this attack was due to the fact that the Israelis, the Saudis and the Americans were working together to foster another Abraham Accord and that the Palestinians would in effect be sold down the river.

(01:39:45)
I think given the fact that this was in the planning stages for probably about two years, and the Abraham Accords with regard to Saudi Arabia are a relatively new phenomenon, I don’t think that’s the main driving force here. I think the main driving force is that the Palestinians feel oppressed, as they should, and that this was a resistance move. They were resisting the Israeli occupation.
Lex Fridman
(01:40:18)
So that resistance, the attack, involved killing a large number of Israeli civilians. There are many questions to ask there, but one is, do you think Hamas fully understood what the retaliation from Israel would involve for Gaza?
John Mearsheimer
(01:40:39)
They had to understand. I mean, you had Operation Cast Lead in 2008, 2009. It started, I think right after Christmas 2008, and it ended right before President Obama took office in January 2009. And the Israelis periodically do what they call mowing the lawn where they go into Gaza and they pound the Palestinians to remind them that they’re not supposed to rise up and cause any problem. So there’s no question in my mind that the Hamas forces understood full well that the Israelis would retaliate and they would retaliate in force as they have done.
Lex Fridman
(01:41:35)
Yeah, even the metaphor of mowing the lawn is disturbing to me in many ways. I actually saw Norman Finkelstein, I think, say that, well, if you use that metaphor, then you could say that Hamas was also mowing the lawn. It’s such a horrific image because the result on either side is just the death of civilians. I mean, let me ask you about the death of civilians. So during the attack, 1,400 Israelis were killed. Over 240 were taken hostage. Then, as we sit today, Israel’s military response has killed over 10,000 people in Gaza. And given the nature of the demographics, it’s a very heavily young population.

(01:42:27)
Over 40% of those killed are under the age of 18. That’s, of course, according to the Ministry of Health of the Palestinian Authority. So what do you think is the long-term effect on the prospect of peace when so many civilians die?
John Mearsheimer
(01:42:46)
I mean, I think it’s disastrous. I mean, the only way you’re going to get peace here is if you have a two-state solution where the Palestinians have a sovereign state of their own, and there is a sovereign Jewish state, and these two states live side by side. American presidents since Jimmy Carter have understood this full well. And this is why we have pushed very hard for a two-state solution. Indeed, many American Jews and many Israelis have pushed for a two-state solution because they think that that is the only way you’re going to get peace between the two sides. What’s happened here is that in recent years, the Israelis have lost all interest in a two-state solution.

(01:43:43)
And it’s in large part because the political center of gravity in Israel has steadily moved to the right. When I was a young boy, the political center of gravity in Israel was much further to the left than it is today. The political center of gravity is now in a position where there’s hardly any support for a two-state solution, and Netanyahu and the rest of the people in his government are in favor of a Greater Israel. There’s just no question about that. Well, on top of that, you now have had a war where, as you described, huge numbers of civilians have been killed, and you already had bad blood between the Palestinians and the Israelis before this conflict.

(01:44:41)
And you could imagine how people on each side now feel about people on the other side. So even if you didn’t have this opposition inside Israel to a two-state solution, how could you possibly get the Israelis now to agree to a two-state solution? I think for the foreseeable future, the animosity …
John Mearsheimer
(01:45:03)
Solution. I think for the foreseeable future, the animosity inside Israel towards the Palestinians is so great that it is impossible to move the Israelis in that direction. And the Israelis here are the key players, more so than the Palestinians, because it’s the Israelis who control Greater Israel. It’s the Israelis who you have to convince. Now, I want to be clear here. You also ultimately have to get around the fact that Hamas is not committed to a two-state solution. But I think that problem could be dealt with. It’s important to understand that Arafat and the PLO were once adamantly opposed to a two-state solution. But Arafat came around to understand that that was really the only hope for settling this, and he became a proponent of a two-state solution.

(01:45:53)
And that’s true of Mahmoud Abbas who runs the PA in the West Bank. It’s not true of Hamas at this point in time. They want a one-state solution, they want a Palestinian state. And of course, the Israelis want a one-state solution too, which is a Jewish state that controls all of Greater Israel. So the question is, can you get some sort of agreement? And I think to get to the nub of your question, given what’s just happened, it’s almost impossible to imagine that happening anytime soon.
Lex Fridman
(01:46:27)
The cynical perspective here is that those in power benefit from conflict while the people on both sides suffer. Is there a degree of truth to that? Or, for the people in power to maintain power, does conflict need to continue?
John Mearsheimer
(01:46:44)
No, I don’t believe that. I mean, just to take the Netanyahu government, or any Israeli government that maintains the occupation, what you want is a Palestinian population that submits to Israeli domination of Greater Israel. You don’t want resistance, you don’t want an intifada. You don’t want what happened on October 7th. In fact, I think one of the principal reasons that the Israelis are pounding Gaza and killing huge numbers of civilians, punishing the civilian population in ways that clearly violate the laws of war, is because they want the Palestinians to understand that they are not allowed to rise up and resist the occupation. That’s their goal.

(01:47:33)
So, I think the Israelis would prefer that the Palestinians roll over and accept submission. In terms of the people who live in Gaza, to include the elites, and the people who live in the West Bank, to include the elites, they would much prefer to move to some sort of situation where the Palestinians have a state of their own. I think in the case of the PA, under Abbas, they would accept a two-state solution. I think what Hamas wants, at this point in time, is a one-state solution, but they want peace. All of them want peace: the two different sets of leadership in Palestine, and the Israelis.
Lex Fridman
(01:48:16)
So you think Hamas wants peace?
John Mearsheimer
(01:48:19)
Sure. But on its own terms, that’s the point.
Lex Fridman
(01:48:21)
What does peace look like for Hamas?
John Mearsheimer
(01:48:24)
At this point in time, I think peace basically means a Greater Israel controlled by Palestine or Palestinians.
Lex Fridman
(01:48:31)
Okay. So essentially, it’s the whole land is called Palestine and there’s no Israel?
John Mearsheimer
(01:48:38)
I think, at this point in time, that’s their principal goal. I do believe, and there have been hints over time, Jimmy Carter has said this, that Hamas can be convinced to accept a two-state solution. Assuming that the Palestinians get a viable state of their own, Hamas would buy into that. Can we say that with a high degree of certainty? No, but I think the Israelis should have pursued that possibility. They should have worked with Abbas, they should have worked with Hamas, to do everything they could to facilitate a two-state solution, because I think, ultimately, that’s in Israel’s interest. Now, the Israeli government, and most Israelis at this point in time, I believe, don’t agree with that.
Lex Fridman
(01:49:21)
What do you think of Israel starting the ground invasion of Gaza recently on October 27th?
John Mearsheimer
(01:49:31)
The question is, should they continue until they have finally defeated Hamas? There are all sorts of reports in the media, including in the Israeli media, that they’re not going to be allowed by the United States to continue this offensive for much more than a few weeks. The Israelis have been saying it’s going to take, in the best of all possible worlds, a number of months, if not a year, to finish off Hamas. Well, it doesn’t look like they’re going to have enough time to do that. I doubt whether they can finish off Hamas, even if they’re given the time. I think they’re going to run into fierce resistance. And when they run into fierce resistance and large numbers of Israelis start to die, they’ll lose their appetite for this. And they, the Israelis, surely know at this point in time that even if they finish off Hamas, even if I’m wrong and they’re able to finish off Hamas, another group is going to rise up to resist the occupation.

(01:50:48)
The idea that you can use what Ze’ev Jabotinsky called the Iron Wall to beat the Palestinians into submission is delusional. It’s just not going to happen. The Palestinians want a state of their own. They don’t want to live under occupation. And there’s no military solution for Israel here. There has to be a political solution. And the only viable political solution is a two-state solution. I mean, you can’t go to democracy. You can’t go to a situation where you give the Palestinians equal rights inside of Greater Israel, in large part because there are now as many Palestinians as there are Israeli Jews. And over time, the demographic balance shifts against the Israeli Jews and in favor of the Palestinians, in which case you’ll end up with a Palestinian state in Greater Israel. So democracy for all doesn’t work. The Israelis, I believe, are quite interested in ethnic cleansing.

(01:51:56)
I think they saw this recent set of events as an opportunity to cleanse Gaza, but that’s not going to happen. The Jordanians and the Egyptians have made it clear that that’s not happening. The United States has now made it clear that that’s not happening. And the Palestinians will not leave. They’ll die in place. So ethnic cleansing doesn’t work. So you’re really left with two alternatives, the two-state solution or a Greater Israel that is effectively an apartheid state. I mean, that’s what the occupation has led to. And all sorts of people have been predicting this for a long, long time. And we’ve now reached the point where, here in the United States, if you say that Israel’s an apartheid state, that’s going to get you into all sorts of trouble. But the fact is that Human Rights Watch, Amnesty International, and B’Tselem, which is the leading Israeli human rights group, have all issued detailed reports making the case that Israel is an apartheid state.

(01:53:07)
Furthermore, if you read the Israeli media, all sorts of Israelis, including Israeli leaders, refer to Israel as an apartheid state. It’s not that unusual to hear that term used in Israel. This is disastrous for Israel in my opinion. And Steve Walt and I said this, by the way, when we wrote The Israel Lobby, that Israel as an apartheid state, which is equivalent to Israel as an occupier, is not good for Israel. That brings us back to the two-state solution. But as you and I were talking about a few minutes ago, it’s hard to see how you get a two-state solution. And the end result of this conversation is utter despair.
Lex Fridman
(01:53:53)
Because the path to a two-state solution is blocked by the amount of hate that’s created by civilian deaths?
John Mearsheimer
(01:54:01)
Well, that plus the fact that the Israeli government is filled with people who have no interest in a two-state solution. They’re ideologically deeply committed to a Greater Israel. They want all the land between the Jordan River and the Mediterranean Sea to be part of a Jewish state. They’re just ideologically committed to that. And of course, as we were talking about before with regard to Hamas, Hamas wants everything between the river and the sea to be a Palestinian state. And when you have two sides with those kinds of views, you’re in deep trouble because there’s little room for compromise. So what you have to do to get this to work is you have to convince the Israelis that it’s in their interest to have a two-state solution. And you’ve already taken care of the PA on this front, the Palestinian Authority, but you’ve got to convince Hamas that its maximalist goals are not going to work. And it’s in its interest to follow in the footsteps of Arafat and accept a two-state solution.

(01:55:17)
But even if you do that at this point, let’s say, that there’s a lot of willingness intellectually on both sides to do that. The problem is that the hatred that has been fueled by this ongoing conflict is so great that it’s just hard to imagine how you can make a two-state solution work at this juncture. That’s why I’ve sort of taken to saying, and I hope I’m wrong here, that on the two-state solution, that boat has sailed. It’s no longer possible.
Lex Fridman
(01:55:53)
Well, again, I believe in leadership and there’s other parties at play here, other nations, Jordan, Saudi Arabia, other players in the Middle East that could help through a normalization of relationships and these kinds of things. There’s always hope, like you said, slither of hope.
John Mearsheimer
(01:56:10)
Slither of hope.
Lex Fridman
(01:56:12)
I think human civilization progresses forward by taking advantage of all the slithers it can get. Let me ask you about, you mentioned The Israel Lobby. You wrote a book, probably your most controversial book on the topic.
John Mearsheimer
(01:56:26)
Not probably. Clearly, the most controversial book I ever wrote.
Lex Fridman
(01:56:30)
So you’ve criticized the Israel lobby in the United States for influencing US policy in the Middle East. Can you explain what the Israel lobby is, their influence, and your criticism over the past, let’s say a couple of decades?
John Mearsheimer
(01:56:48)
Well, the argument that Steve Walt and I made, actually, we wrote an article first, which appeared in the London Review of Books, and then we wrote the book itself. Our argument is that the lobby is a loose coalition of individuals and organizations that push American policy in a pro-Israel direction. And basically, the lobby is interested in getting the United States, and here we’re talking mainly about the American government, to support Israel no matter what Israel does. And our argument is that if you look at the relationship between the United States and Israel, it’s unprecedented in modern history. This is the closest relationship that you can find between any two countries in recorded history. It’s truly amazing the extent to which Israel and the United States are joined at the hip. And we support Israel no matter what almost all the time. And our argument is that that is largely due to the influence of the lobby. The lobby is an extremely powerful interest group.

(01:58:15)
Now, it’s very important to understand that the American political system is set up in ways that allow interest groups of all sorts to wield great influence. So in the United States, you have an interest group or a lobby like the National Rifle Association that makes it well-nigh impossible to get gun control. And so with the Israel lobby, you have this group of individuals and organizations that wield enormous influence on US policy toward the Middle East. And this is not surprising given the nature of the American political system. So our argument is that the lobby is not doing anything that’s illegal, or illicit, or immoral, or unethical. It’s just a good old-fashioned American interest group. And it just happens to be extremely powerful. And our argument is that this is not good for the United States because no two countries have the same interests all the time. And when our interests conflict with Israel’s interest, we should be able to do what we think is in our national interest, in America’s national interest.

(01:59:42)
But the lobby tends to conflate America’s national interests with Israel’s national interests and wants the United States to support Israel no matter what. We also argue, and I cannot emphasize this enough, given what’s going on in the world today, that the lobby has not been pushing policies that are in Israel’s interest. So our argument is that the lobby pushes policies that are neither in America’s interest nor in Israel’s interest. Now, you’re saying to yourself, what exactly does he mean by that? What every president since Jimmy Carter has tried to do, as I said before, is to foster a two-state solution: to push Israel, which is the dominant player in Greater Israel, to accept the two-state solution. And we have run into huge resistance from the lobby whenever we try to, let’s be blunt about it, coerce Israel.

(02:00:51)
In a perfect world where there was no lobby and an American president was free to put pressure on Israel, to coerce Israel, I believe we would’ve gone a long way towards getting a two-state solution. And I believe this would’ve been in Israel’s interest. But we couldn’t get a two-state solution because it was almost impossible to put meaningful pressure on Israel because of the lobby. So this was not in Israel’s interest and it was not in America’s interest. And that was the argument that we made. And we, of course, got huge pushback for making that argument.
Lex Fridman
(02:01:28)
What’s the underlying motivation of the lobby? Is it religious in nature? Is it similar to the way war hawks are sort of militaristic in nature? Is it nationalistic in nature? If you were to describe this loose coalition of people, what would you say is their motivation?
John Mearsheimer
(02:01:47)
Well, first of all, I think you have to distinguish between Jews and Christians. You want to remember that there are a huge number of Christian Zionists who are deeply committed to Israel no matter what, right? And then, there are a large number of Jews. The Jews are obviously the most important of those two groups in the Israel lobby. But one of the arguments that we made in the book is that you should not call it the Jewish lobby, because it’s not populated just by Jews, and Christian Zionists are an important part of that lobby. But furthermore, there are a good number of Jews who are opposed to the lobby and the policies that the lobby purveys. And there are a number of Jews who are prominent anti-Zionists, and they’re obviously not in the lobby. Or if you take a group like Jewish Voice for Peace, Jewish Voice for Peace is not in the lobby. So it’s wrong to call it a Jewish lobby.

(02:02:52)
But with regard to the American Jews who are in that lobby, I think that really, this is all about nationalism. It’s not so much religion. Many of those Jews who are influential in the lobby are not religious in any meaningful sense of that term. But they self-identify as Jewish in the sense that they feel they’re part of a Jewish nation. And that in addition to being an American, they are part of this tribe, this nation called Jews. And that they have a responsibility to push the United States in ways that support the Jewish state. So I think that’s what drives most, if not almost all the Jews. This is not to say there’s not a religious dimension for some of them, but I think that the main connection is much more tribal in nature.
Lex Fridman
(02:03:49)
So I had a conversation with Benjamin Netanyahu and he said, “Fundamentally, if you’re anti-Zionist, you’re antisemitic.” So the Zionist project is tied at the hip to the Jewish project. What do you have to say to that?
John Mearsheimer
(02:04:08)
Look, you can define antisemitism any way you want. And you can define antisemitism to incorporate anti-Zionism. And I think we have reached the point where antisemitism is identified today, not just with anti-Zionism, but with criticism of Israel. If you criticize Israel, some people will say you’re an antisemite. And if that’s your definition of antisemitism, it’s taken an important term and stretched it to the point where it’s meaningless. So when Steve and I wrote the book, wrote the article and then wrote the book, all sorts of people said that we were antisemites. This is a ludicrous charge. But what they meant was, you’re criticizing the lobby, you’re criticizing Israel, and therefore, you’re an antisemite. Okay. If that’s what an antisemite is, somebody who criticizes Israel, probably half the Jewish community, if not more, in the United States is antisemitic. And of course, you get into all these crazy games where people are calling Jews self-hating Jews and antisemites because they’re critical of Israel.

(02:05:35)
But even people who are anti-Zionists, I don’t think they’re antisemitic at all. You can argue they’re misguided, that’s fine. But many of these people are Jewish and proud of the fact that they’re Jewish. They just don’t believe that nationalism and Jewish nationalism is a force that should be applauded. And you want to understand that in the American context, there is a rich tradition of anti-Zionism. And these were not people who were antisemites if you go back to the thirties, forties, fifties. And the same thing was even true in Europe. There were all sorts of European Jews who were opposed to Zionism. Were they antisemites? I don’t think so. But we’ve gotten to the point now where people are so interested in stopping any criticism of Israel that they wield this weapon of calling people antisemites so loosely that the term has kind of lost meaning. So I think Netanyahu is wrongheaded to equate anti-Zionism with antisemitism.
Lex Fridman
(02:06:49)
Alan Dershowitz was one of the people that called you specifically antisemitic. So just looking at the space of discourse, where’s the slither of hope for healthy discourse about US relationships with Israel between you and Alan Dershowitz and others like him?
John Mearsheimer
(02:07:16)
Well, I think until there is a settlement of the Israeli-Palestinian conflict, there’s no hope of putting an end to this nonsense. Right?
Lex Fridman
(02:07:27)
So these are just uses of terms to kind of cheat your way through the discourse, it’s a shortcut.
John Mearsheimer
(02:07:35)
No, it’s to silence people. It’s very important to understand that one of the lobby’s principal goals is to make sure we don’t have an open discourse, a freewheeling discourse about Israel. Because they understand, people in the lobby understand, that if you have an open discourse, Israel will end up looking very bad. You don’t want to talk about the occupation, you don’t want to talk about how Israel was created. All these subjects are ones that will cause problems for Israel. See, just to go to the present crisis. When you have a disaster, and what happened on October 7th is a disaster, one of the first things that happens is that people begin to ask the question, how did this happen? What’s the root cause of this problem? This is a disaster. We have to understand what caused it so that we can work to make sure it doesn’t happen again. So we can work to shut it down and then make sure it doesn’t happen again.

(02:08:46)
But once you start talking about the root causes, you end up talking about how Israel was created. And that means telling a story that is not pretty about how the Zionists conquered Palestine. And number two, it means talking about the occupation, right? It’s not like Hamas attacked on October 7th because there were just a bunch of antisemites who hated Jews and wanted to kill Jews. This is not Nazi Germany. This is directly related to the occupation and to what was going on inside of Gaza. And it’s not in Israel’s interest or the lobby’s interest to have an open discourse about what the Israelis have been doing to the Palestinians since, I would say, roughly 1903, when the second aliyah came to Israel, or came to what was then Palestine, right? We want to talk about that. And we don’t want to talk about, from the lobbyist’s point of view, the influence that the lobby has, right?

(02:09:54)
It’s better from the lobbyist’s point of view if most Americans think that American support of Israel is just done for all the right moral and strategic reasons, not because of the lobby. And when John Mearsheimer and Steve Walt come along and say, you have to understand that this special relationship is due, in large part, to the lobby’s influence, that is not an argument that people in the lobby want to hear. So the point is, for all these reasons, you have to go to great lengths to silence people like me and Steve Walt. And one of the ways to do that is to call us antisemites.

(02:10:32)
I think the chapter or the section of the book where we talk about this charge of antisemitism is called The Great Silencer. That’s what we call the charge of antisemitism, The Great Silencer. Who wants to be called an antisemite, especially in the wake of the Holocaust? Do I want to be called an antisemite? Oh my God, no. And so it’s very effective. But it is important to talk about these issues, in my humble opinion. And I think if we had talked about these issues way back when, it would’ve gone a long way towards maybe getting a two-state solution, which I think was the best alternative here.
Lex Fridman
(02:11:21)
It’s complicated. And I wonder if you can comment on the complexity of this, because criticizing Israel and criticizing the lobby can, for a lot of people, be a dog whistle for sort of antisemitic conspiracy theories, this idea that Jews run everything, run the world, that they’re this kind of cabal. And it’s also very true that people who are legitimately antisemitic are also critics of Israel in the same kind of way. And so, it’s such a complicated landscape in which to have discussions. Because even people like David Duke, who are racist, don’t sound racist on the surface. I haven’t listened to him enough, but there’s dog whistles. It’s a complicated space in which to have discussions. I wonder if you can sort of speak to that. Because there’s this silencing effect of calling everybody antisemitic. But it’s also true that there’s antisemitism in the world, there is a sizable population of people that hate Jews. There’s probably a sizable population of people who hate Muslims, too.
John Mearsheimer
(02:12:51)
A lot of hate out there.
Lex Fridman
(02:12:53)
A lot of hate out there. But the hatred of Jews has a long history. And so, like the Rolling Stones have a set of great hits, there’s just a set of great hits of conspiracy theories that you can make about the Jews that are used as part of the hatred. So there are nice templates for that. And I just wonder if you can comment on operating as a historian, as an analyst, as a strategic thinker in this kind of space.
John Mearsheimer
(02:13:25)
Obviously, when we wrote the article, which we did before the book, we gave this subject a great deal of thought. I mean, what you said just now is music to our ears. I’m talking about me and Steve. I think that your point about dog whistles is correct. Look, we went to great lengths to make it clear that this is not a cabal. It’s not a conspiracy. And in fact, in a very important way, the lobby operates out in the open. They brag about their power. And this was true before we wrote the article. And we said in the article, in the book, and you heard me say it here, first of all, it’s not a Jewish lobby. Secondly, it’s not a cabal. It’s an American interest group.
Lex Fridman
(02:14:29)
And the American system is designed such that interest groups are perfectly legal, and some of them are super effective.
John Mearsheimer
(02:14:37)
Exactly. I mean, you hit the nail right on the head. That’s exactly right. And there was nothing that we said that was antisemitic by any reasonable definition of that term. And huge numbers of Jews have known me and Steve over the years, and nobody ever, ever said that we were antisemitic before March 2006, when the article appeared, because we’re not antisemitic. But look, you’ve got this interest group that has a significant influence on American policy and on Israeli policy, and you want to talk about it. It’s just important to talk about it. It’s important for Jews in the United States, for Jews in Israel, to talk about this. The idea that you want to silence critics is not a smart way to go about doing business, in my opinion. If we were wrong, if Steve and I were so wrong and our arguments were so foul, they could have easily exposed those arguments. They could have gone into combat with this in terms of the marketplace of ideas and easily knocked this down.

(02:16:00)
The problem was that our arguments were quite powerful. And instead of engaging us and defeating our arguments, they wanted to silence us. And this is not good. It’s not good for Israel, it’s not good for the United States. And I would argue, in the end, if anything, it’s going to foster antisemitism. I think you don’t want to run around telling people that they can’t talk about Israel without being called an antisemite. It’s just not healthy in terms of the issue that you’re raising. But I still agree with you that it is a tricky issue. I don’t want to make light of that. I know that there’s this piece of literature out there called the Protocols of the Elders of Zion. And I fully understand that if you’re not careful, you can come close to writing volume two of the Protocols. But I don’t believe that we wrote anything that was even close to that. And again, I think that a healthy debate on the issues that we were raising would’ve been not only in America’s interest, but it would’ve been in Israel’s interest.
Lex Fridman
(02:17:15)
Yeah. Underneath it all is just, I wonder why there is so much hate against groups, why it’s such a sticky way of thinking. Not just tribalism, proud of your country and kind of hating another country, but really deeply hating. Hating in a way where it’s part of your identity kind of hate.
John Mearsheimer
(02:17:40)
Well, just to make a general point on this issue in our conversation here today, you often talk about individual leaders, and the word individual often pops up in your vocabulary. I believe that we are ultimately social animals before we are individuals. I believe we’re born into tribes, we’re heavily socialized, and that we carve out space for our individualism. But we are part of tribes, or social groups, or nations. Call them what you want, ethnic groups, religious groups. But the fact is that these tribes often crash into each other. And when they crash into each other, they end up hating each other. If you go to a place like Bosnia, the Croats and the Serbs, oh, my God. And then throw in the Bosniaks, which is the term for Bosnian Muslims. And Muslims, Croats, Serbs, and the tribes hate each other. And in a funny way, that hatred almost never goes away. And I guess, there are some exceptions to that.

(02:18:59)
If you look at the Germans after World War II, they’ve gone a long way towards reducing, I wouldn’t want to say completely eliminating, but reducing a lot of the hatred that existed between Germans and their neighbors. But that’s really kind of an anomalous case. I mean, you go around East Asia today, and the hatred of Japan in a place like China, the hatred of Japan in a place like Korea, is just not to be underestimated. But I think a lot of it just has to do with the fact that you’re dealing with social groups that have crashed into each other at one point or another. And there are those lingering effects. And by the way, this gets back to our discussion a few minutes ago about trying to get a two-state solution between the Palestinians and the Israeli Jews now that you have had this horrible war, which is ongoing.
Lex Fridman
(02:19:59)
It’s interesting to ask, to go back to World War II-
Lex Fridman
(02:20:02)
Now, you said you studied Nazi Germany in the ’30s from a perspective of maybe offensive realism, but just to look at the Holocaust, it’s sometimes popular in public discourse today to compare certain things to the Holocaust. People have compared the Hamas attack on Israel to the Holocaust, saying things like, “It’s the biggest attack on Jews since the Holocaust,” which kind of implies that there’s a comparison. People have made that same comparison in the other direction. What do you make of this comparison? Is it comparable? Does the use of the Holocaust have any accuracy in comparisons of modern day international politics?
John Mearsheimer
(02:21:01)
Is it possible that you could have another genocide? Yes, and I would argue that what you had in Rwanda was a genocide. The Holocaust is not the only genocide. I believe the word genocide is used too loosely today. And as you know, lots of people, and I mean lots of people, who are pro-Palestinian have accused the Israelis of engaging in genocide in Gaza. I think what the Israelis are doing in Gaza represents a massacre. I would use that term given the number of civilians that they’ve killed and the fact that they’ve been indiscriminate in terms of how they’ve been bombing Gaza. But I would not use the word genocide. For me, a genocide is where one side attempts to eliminate another group from the planet. I think that what happened with the Holocaust was clearly a genocide, and that the Germans were bent on destroying all of European Jewry.

(02:22:13)
And if they could have gotten their hands on Jews outside of Europe, they would’ve murdered them as well. That’s a genocide. And I think with the Hutus and the Tutsis, you had a similar situation. I think with the Turks and the Armenians during World War I, that was a genocide, but I have a rather narrow definition of what a genocide is and I don’t think there are many cases that qualify as a genocide. The Holocaust certainly does. Now, what Hamas did doesn’t even come close to what happened to European Jewry between, let’s say, 1939 and 1945, although I date the start of the Holocaust to 1941, if we were looking at it closely, but let’s just say 1939, when they invaded Poland, from 1939 to 1945. What Hamas did pales in comparison. It’s hard to believe anybody would make that argument. Yes, a lot of Jews died, but hardly any compared to the number that died at the hands of the Germans. No parallel at all. And furthermore, Hamas was in no position to kill all of the Jews in the Middle East, just not going to happen.
Lex Fridman
(02:23:43)
But there’s also levels of things, Germans using human skin for lamps. There’s just levels of evil in this world.
John Mearsheimer
(02:23:54)
Yes, but that’s not what Hamas is doing. I want to be very clear here. I am not justifying Hamas’s killing of civilians, not for one second, but I’m just saying… And by the way, just to go to the Israelis and what they’re doing in Gaza, as I said to you before, I do believe that is a massacre and I believe that’s to be condemned, the killing of civilians. This is not legitimate collateral damage. They’re directly punishing the population. But I would not call that a genocide and I would not compare that to the Holocaust for one second. I just want to be very clear on that.
Lex Fridman
(02:24:37)
Do you think if Israel could, they would avoid the death of any civilians? So you’re saying there’s some degree of punishment of collective-?
John Mearsheimer
(02:24:48)
They’re purposely killing civilians. This is the Iron Wall. They’re trying to beat the Palestinians into submission. There’s no way you kill this many civilians if you’re trying to precisely take out Hamas fighters. And by the way, the IDF spokesman has explicitly said that, “We are not pursuing precision bombing, and what we are doing is trying to maximize the amount of destruction and damage that we can inflict on the Palestinians.” And I think this is a major mistake on the part of Israel. First of all, it ends up being a moral stain on your reputation, number one. And number two, it doesn’t work. It doesn’t work. The Palestinians are not going to roll over and submit to Israeli domination of their life.

(02:25:52)
The whole concept of the Iron Wall, Jabotinsky’s term, was misguided. And by the way, if you look at what the Israelis are doing, they’re trying to do two things. One is the Iron Wall, and that’s where you punish the civilian population in Gaza and get them to submit. The other thing that they’re trying to do is get Hamas. They want to destroy Hamas. And the belief there is that if they destroy Hamas, they’ve solved the problem. But as many Israelis know, including people on the hard right, even if you destroy Hamas, they are going to be replaced by another resistance group and that resistance group will employ terror.
Lex Fridman
(02:26:36)
Yeah. I think you’ve said that other terrorist organizations have used the situation in Palestine as a recruitment mechanism for a long time.
John Mearsheimer
(02:26:47)
Osama bin Laden made it clear that this was one of his principal reasons for attacking the United States.
Lex Fridman
(02:26:56)
And the United States attacked back and got us into a 20-year war that cost the lives of millions of people, not Americans, but human beings and-
John Mearsheimer
(02:27:12)
Engaged in torture.
Lex Fridman
(02:27:14)
And torture. Yeah.
John Mearsheimer
(02:27:16)
No, I think if you look at how we reacted to 9/11 and how the Israelis are reacting to what happened on October 7th, there’s quite a bit of similarity in that both sides, the Israeli side and the American side, are enraged and they lash out and they go on a rampage and the end result is not good.
Lex Fridman
(02:27:45)
Is there a capacity within Israel, or within the United States after 9/11, to do something approximating turning the other cheek, of understanding that the root of terror is hate and fighting that hate with, not to sound naive, but compassion?
John Mearsheimer
(02:28:10)
Well, I don’t think in either case you’re going to turn the other cheek.
Lex Fridman
(02:28:18)
What I mean by that is some limited powerful military response, but very limited?
John Mearsheimer
(02:28:25)
Coupled with a smart political strategy.
Lex Fridman
(02:28:27)
Political strategy, diplomacy.
John Mearsheimer
(02:28:29)
Yeah. That’s what they should have done.
Lex Fridman
(02:28:31)
Yeah.
John Mearsheimer
(02:28:31)
Right.
Lex Fridman
(02:28:31)
But is there capacity for that or from your offensive realism perspective, it’s just the odds are really low?
John Mearsheimer
(02:28:41)
From my offensive realist perspective or my realist perspective, that’s what you should do. My view is states are rational actors, they should be cunning. They should think about the strategic situation they’re in and choose the appropriate response. And what happens, and this is why my theory is not always correct, is that sometimes states are not rational and they misbehave. I would argue in the Israeli case that it would’ve been good after October 7th, or starting on October 7th, if the United States had tried to hold the Israelis back and counseled a more moderate response. Take some time just to think about how to deal with this problem instead of lashing out. I think given what happened to the Israelis, given how shocked they were, given the level of fear, given the level of rage, they were going to lash out and I don’t believe that was in their interest. I think it would’ve made sense to think about it and to think about a smarter strategy than they’re now employing. And I think the Americans blew it. The Americans gave them a bear hug and a green light and said, “We’ll give you all the weaponry you need and go out and do it.” And I don’t think that was the smart thing to do. Look, in the wake of October 7th, the Israelis had no good strategy. It’s not like there’s a magic formula that they just didn’t see and we should have told them what the magic formula was. That’s not true. They were, in a sense, caught between a rock and a hard place in terms of what to do. But there are smarter things and dumber things, and I think the Israelis lashed out in ways that are counterproductive. I think going on a rampage and killing huge numbers of civilians, it’s obviously morally wrong, but it’s also just not in their strategic interest because it’s not going to buy them anything.

(02:31:03)
And in fact, it’s going to cost them because people all over the planet are turning against Israel. I saw an Israeli think tank today that has been tracking protests around the world. It gave some figures for what it looked like between October 7th and October 13th in terms of the number of protests around the world that were pro-Israel versus pro-Palestine, and then it looked at the numbers from October 13th up to the present. I think the numbers were that 69% were pro-Palestinian in the first six days after October 7th, and 31%, take these numbers with a grain of salt, were pro-Israel. So I think it was 69 and 31.

(02:32:04)
And since then, since October 13th, if you look at the number of protests around the world, 95% have been pro-Palestinian and 5% have been pro-Israel. And what this tells you is that public opinion around the world has shifted against Israel. And if you look at some of the demonstrations in places like London and Washington, DC, it’s truly amazing the number of people who are coming out in support of the Palestinians. And all of this, again, is just to support my point that it was just not smart for Israel to launch this bombing campaign. You can make an argument for going after Hamas and doing it in a surgical way or as surgical a way as possible, but that’s not what they did. And again, my point to you is I think that this punishment campaign is not going to work strategically. In other words, they’re not going to beat the Palestinians into submission, they’re not going to finish off Hamas. And at the same time, by pursuing this strategy, they’re doing huge damage to their reputation around the world.
Lex Fridman
(02:33:16)
In the wake of October 7th, given the geopolitical context, I think there’s a lot of leverage to be the great ethical superpower, demonstrate power without killing any civilians, and use that diplomatic leverage to push forward something like the Abraham Accords with more nations, with Saudi Arabia, push for peace aggressively, peace agreements, economic relationships, all of this kind of stuff, and thereby pressure the Palestinian Authority towards perhaps the two-state solution.
John Mearsheimer
(02:34:04)
I think what you’re missing here, just in the Israeli case, is that the Israeli government is not interested in two-state solution. And you want to remember that Benjamin Netanyahu, who looks very hawkish when you look at him in isolation, doesn’t look so hawkish when you look at him compared to the rest of the people in his cabinet. He almost looks like a moderate. He’s got a lot of people who are way out to the right of him. And these people, and this of course includes Netanyahu, are not interested in the two-state solution. So the question you have to ask yourself is, if you’re Benjamin Netanyahu and it’s October 7th, late in the day, what do you do? You’re not thinking about a two-state solution. You’re thinking about an occupation that’s not going to end. And the question is how do you deal with the Palestinians given what’s just happened?
Lex Fridman
(02:35:05)
Well, there’s people in the cabinet and then there’s history. And history remembers great leaders. So Benjamin Netanyahu can look in the streets of Israel and see the protests and think of how history will remember him. I think a two-state solution is on the table for a great leader.
John Mearsheimer
(02:35:24)
Well, it was there. Was he the person who was going to take advantage of it? I don’t think so, but we’ll see.
Lex Fridman
(02:35:35)
He’s a student of history. At this point, it’s very difficult. Like you said, it’s 95% now, or whatever the number is, of the protests. I think the window in which Israel has the ears of the world and can do the big ethical action towards peace has closed. Or maybe there’s still a slither, but it’s just… The slippery slope of hate has taken off. It’s quite depressing to watch what’s going on.
John Mearsheimer
(02:36:10)
Yep. I agree a hundred percent. Unequivocally depressing.
Lex Fridman
(02:36:13)
But of course, as you talk about the role of… The US involvement is of critical importance here for the United States, and the argument you make is that we should not be involved in Ukraine, at least to the degree we are, we being the United States, and we should not be involved in Israel to the degree we are, because it’s stretching us too thin when the big geopolitical contender with the United States in the 21st century is China. Is that a correct summary?
John Mearsheimer
(02:36:49)
Yeah, I think just on Ukraine, we should not have pushed Ukraine to join NATO.
Lex Fridman
(02:36:55)
Sure.
John Mearsheimer
(02:36:56)
And once the war started, we should have worked overtime to shut it down immediately.
Lex Fridman
(02:37:03)
March.
John Mearsheimer
(02:37:04)
March, right. And you remember, by the way, not to go back to Ukraine in great detail, in the early fall of 2022… The war starts February 2022. There’s March 2022, which we’ve talked about, which is when the negotiations took place. In the fall of 2022, I think it was in September, the Ukrainians had won two major tactical victories, one in Kherson and the other in Kharkiv. And at that point in time, General Milley, who was the chairman of the Joint Chiefs of Staff, said, “Now is the time to negotiate because this is the high watermark for the Ukrainians.” Milley understood that things were only going to get worse, and the White House shut Milley down and said, “We’re not negotiating.” So we have blown a number of opportunities here to head this problem off at the pass. But that’s my view there. And with regard to the Israelis, my only point about Israel is that it would be better for Israel and better for the United States if we, the United States, were in a position to put pressure on Israel from time to time. As Steve and I say in the book, we should be able to treat Israel like a normal country. The fact is that countries sometimes do stupid things. This includes the United States and Israel. And if Israel is pursuing a policy that we think is unwise, we should be in a position where we could put pressure on Israel. That’s our argument. But anyway, we goofed both with regard to Ukraine and with regard to the Middle East and we’re now up to our eyeballs in alligators in both of those regions. And as you described my view, this is not good because the area of the most strategic importance for the United States today is East Asia and that’s because China is there and China is the most serious threat the United States faces.

China

Lex Fridman
(02:39:14)
Do you think there will be a war with China in the 21st century?
John Mearsheimer
(02:39:19)
I don’t know. My argument is there will be. There is right now a serious security competition and at the same time, there is a real possibility of war. Whether or not we avoid it is very hard to say. I mean, we did during the Cold War. We had a serious security competition from roughly 1947 to 1989 and we thankfully avoided war; we probably came the closest in 1962 with the Cuban Missile Crisis. But we avoided it and I think we can avoid it here. Is it for sure? No.
Lex Fridman
(02:39:59)
You’ve said that China won’t move on Taiwan militarily, in part because, as you said, amphibious operations are difficult. Why will China not move on Taiwan in your sense in the near future?
John Mearsheimer
(02:40:16)
Well, it’s because there’s this body of water called the Taiwan Strait, which is a big body of water, and getting across water is very difficult unless you can walk on water.
Lex Fridman
(02:40:29)
So geography still has a role to play in the 21st century?
John Mearsheimer
(02:40:32)
Oh, yeah. I think geography’s very important. Big bodies of water really matter. In an ideal world, you’d like to have the Pacific Ocean between you and any potential adversary. 6,000 miles of water, hard to get across. If you’re a country and I’m a country and there’s land between us, I can take my Panzer divisions and I can go right across the land and get into your country or attack your country. And you of course can take your Panzer divisions and come across that same piece of land. But if there’s a big body of water between us, your Panzer divisions can’t go across the water and then the question is how do you get them across the water? And that’s very tricky. And in a world where we have lots of submarines and you have lots of aircraft and you have missiles that are land-based that can hit those surface ships, it is very, very hard to attack across a body of water. And all you have to do is think about the American invasion of Normandy, June 6th, 1944, coming in on Omaha Beach. Oh, boy. That was really difficult.
Lex Fridman
(02:41:46)
But there is a growing asymmetry of military power there that even though it’s difficult-
John Mearsheimer
(02:41:53)
That is correct.
Lex Fridman
(02:41:54)
So I guess-
John Mearsheimer
(02:41:56)
That is correct.
Lex Fridman
(02:41:57)
So I recently had a conversation with Elon Musk and he says that China is quite serious about the One China policy, and it seems inevitable that Taiwan will have to be… If you look at this pragmatically in the 21st century, it seems inevitable that Taiwan will have to be a part of China and so we can get there either diplomatically or militarily. What do you think about the inevitability of that kind of idea? When a nation says, “This is a top priority for us,” what do you think about them meaning it, and what do we do about that?
John Mearsheimer
(02:42:46)
There’s no question it’s a top priority for them and there’s no question they mean it, but it’s also a top priority for us not to let them take Taiwan.
Lex Fridman
(02:42:54)
Why exactly?
John Mearsheimer
(02:42:56)
Because it’s an important strategic asset. Many people will say it’s because Taiwan’s a democracy, but that doesn’t matter that much. It’s because of two strategic reasons. The first is that if we were to let Taiwan go, it would have hugely negative consequences for our alliance structure in East Asia. To contain China, we need allies. We have an alliance structure, and our allies, Japanese, South Koreans, Filipinos, Australians, they’re all counting on us to be there for them. And if we say, “We’re not going to defend Taiwan, the Chinese attack,” they’re going to say, “I bet if the Chinese attack us, the Americans won’t be there for us.” So it would have a damaging effect on our alliance structure, which we cannot afford because containing China is a wicked problem. It’s a powerful state. You were getting to this before when you talked about China versus Taiwan. So that’s the first reason.

(02:44:07)
Second reason is you want to bottle up the Chinese Navy and the Chinese Air Force inside the first island chain. You don’t want to let them get out into the Pacific. You don’t want them dominating the waters of East Asia. You want to bottle them up again inside the first island chain. And you can only do that if you control Taiwan. If you don’t control Taiwan, they get out into the Philippine Sea, into the Pacific and the Western Pacific, and cause all sorts of problems.
Lex Fridman
(02:44:38)
Well, with you saying all that, you’ve also talked about the Century of Humiliation, and Japan and the United States are a source of that humiliation for China. Don’t you think they see the other side of that?
John Mearsheimer
(02:44:52)
Absolutely.
Lex Fridman
(02:44:53)
And in the interest of avoiding a World War… I guess the question is how do we avoid a world war? It doesn’t seem like the military involvement in the conflict between China and Taiwan is the way.
John Mearsheimer
(02:45:14)
Well, I don’t want-
Lex Fridman
(02:45:15)
There’s no good answers here. I’m just saying-
John Mearsheimer
(02:45:17)
There are no-
Lex Fridman
(02:45:18)
Which is the less bad option?
John Mearsheimer
(02:45:20)
Well, what you want to do is you want to make sure that you deter China from invading Taiwan. You want to avoid a war. You and I are in complete agreement on that. We don’t want a war, but we want to contain China. We do not want to let China dominate Asia. That’s what the Americans are principally concerned with here and it’s what China’s neighbors are principally concerned with. This includes the Japanese, the South Koreans, the Filipinos, Australians, and the Taiwanese. They don’t want and we don’t want China to dominate the region, so we have to contain it.

(02:45:57)
But at the same time, and this should be music to your ears, we not only want to contain it, we want to make sure we don’t end up in a shooting match with the Chinese because this could be disastrous. So you have to have a very smart policy. You have to build powerful military forces, and you have to make sure you don’t do anything that’s provocative. On Taiwan, for example, the last thing you want is for the Taiwanese government to declare its independence because the Chinese have said, “If Taiwan does that, we’ll go to war.” And of course, we don’t want that. So my view is you want to smartly build up your military forces and you want to do everything you can to contain China, and at the same time, not be provocative.
Lex Fridman
(02:46:41)
So a big component of that is making sure the US military is bigger than the Chinese military.
John Mearsheimer
(02:46:51)
Not necessarily. It’s an interesting question. A lot of people think that to make deterrence work, you have to be able to beat the Chinese and therefore, you need a much bigger military. And I don’t think over time that’s possible. I think it’s probably not even possible now to beat the Chinese in a war over Taiwan or in a war in the South China Sea. I think what you want to do is make it clear to the Chinese either that there will be no winner… In other words, you don’t have to win, but you want to make sure they don’t win. It’s a lose-lose proposition if they go to war over Taiwan or what have you.

(02:47:40)
And if you can’t do that, if you think that they’re so powerful that they’re ultimately going to win, you want to convince them that victory would be a Pyrrhic victory. In other words, they would pay a godawful price to win the war. You follow what I’m saying? So the best strategy for deterrence is you win, China loses. Second best strategy is a stalemate, nobody wins. Third best strategy is they win, but they pay a godawful price. And the fourth possibility, which you don’t want, is they win quickly and decisively. If that’s the case, then you don’t have much deterrence.
Lex Fridman
(02:48:28)
What does a world with China as the sole dominant superpower look like? I mean, a little bit underlying our discussion is this kind of idea that US is the good guys and China is the bad guys. First of all, dividing the world into good guys and bad guys seems to somehow miss the nuance of this whole human civilization project we’re undertaking. But what does the world look like where China is the dominant sole superpower in a unipolar world?
John Mearsheimer
(02:49:01)
Well, I don’t tend to think of the world in terms of good guys and bad guys. As a good realist, I think that states are states, they’re all black boxes. I don’t discriminate between democracies and autocracies. But having said that, I am an American and as an American, I’m interested in the security of my country, the survival of my country. So I want the United States to be the most powerful state in the world, which means I want the United States to dominate the Western hemisphere, I want us to be a regional hegemon, and I want to make sure that China does not dominate Asia the way we dominate the Western hemisphere.

(02:49:45)
It’s not because I think we’re the good guys and they’re the bad guys. If I were Chinese and I were in Beijing and I was Xi Jinping’s national security advisor, I’d tell him what we got to do is make sure we dominate the world or dominate our region and then do everything we can to undermine America’s position in the Western hemisphere. That’d be my view. So I guess you could say I do view the world in terms of good guys and bad guys, as an American and-
Lex Fridman
(02:50:16)
More like us and them versus-
John Mearsheimer
(02:50:18)
Yeah, it’s us and them. That’s a nice way to put it. Yeah, it’s us versus them. Not so much good guys versus bad guys.
Lex Fridman
(02:50:24)
Is it possible to have a stable, peaceful world with a good balance of power where it’s China and US as superpowers? It’s a bipolar world, no longer unipolar.
John Mearsheimer
(02:50:37)
Yeah. Okay, so you’re hypothesizing a world where they dominate Asia and we dominate the Western hemisphere.
Lex Fridman
(02:50:43)
Yeah.
John Mearsheimer
(02:50:44)
I believe there would be a great deal of intense security competition between those two superpowers.
Lex Fridman
(02:50:53)
The definition of intense matters here. So it could be small military conflicts or it could be extremely large unstable military conflicts, right?
John Mearsheimer
(02:51:04)
Well, conflict… Let’s use the word war. So I distinguish between security competition and war. And what I’m telling you is you’ll have an intense security competition where there’s no shooting, or if there’s shooting, it’s mainly proxies that are doing the fighting, much like the Vietnam War. Or you could have a case where one of those superpowers was involved in a war against a proxy of the other superpower. Think the Korean War. The United States fought the Chinese, who were allied with the Soviets at the time. But a war between the United States and China, just like a war between the United States and the Soviet Union during the Cold War, that’s what you really want to avoid. So I think you’d have an intense security competition. You’d have wars involving proxies of each of those two superpowers, and you would probably have some wars where one of the superpowers was involved in a war with one of the other superpower’s proxies.
Lex Fridman
(02:52:12)
So it seems likely then, if that’s the case, that Taiwan would be the proxy, and the US would be fighting China through the proxy of Taiwan?
John Mearsheimer
(02:52:21)
Yeah. Well, that would assume the United States… But you want to remember, you’re hypothesizing a situation where China dominates Asia.
Lex Fridman
(02:52:29)
Oh, it already has dominated.
John Mearsheimer
(02:52:31)
Yeah, it’s already dominated Taiwan.
Lex Fridman
(02:52:34)
I see. Where do you find the proxies? Australia?
John Mearsheimer
(02:52:38)
The Middle East could be a good case.
Lex Fridman
(02:52:40)
Oh, wow.
John Mearsheimer
(02:52:41)
Persian Gulf.
Lex Fridman
(02:52:43)
Oh boy. And then our discussion of Israel becomes even more dramatically-
John Mearsheimer
(02:52:47)
Yeah, well, Israel gets involved… I think in this scenario, if you’re talking about a US China competition and you’re talking about the Middle East, I think it’s the Gulf, it’s the Saudis, the Iranians, the Iraqis. It’s the oil.
Lex Fridman
(02:53:03)
Don’t you think it could be Israel versus Iran with some very 1984 kind of dramatic partnership of Iran, Russia, and China versus United States, Europe, and Israel?
John Mearsheimer
(02:53:18)
I think that’s possible. Yeah.
Lex Fridman
(02:53:19)
Oh boy.
John Mearsheimer
(02:53:20)
I think that’s possible. Yeah. I mean, I hadn’t thought about it until you said it, but yeah, I think that that is possible.
Lex Fridman
(02:53:28)
Isn’t that terrifying?
John Mearsheimer
(02:53:31)
Yeah. Well, in your scenario, where China already dominates Asia and we dominate the Western hemisphere, I think you start talking about the most likely places where the United States and China go head-to-head or fight through proxies. I think it is the Gulf or the Middle East, in the scenario that you posit.
Lex Fridman
(02:53:56)
I mean, one question I have… I don’t know about you, but for me, unlike with the Soviet Union, and I know I was born there, but even outside of that, the cultural gap, the loss in translation, the communication gap between China and the United States seems to be much greater than that between the former Soviet Union and the United States. I see two cultures intermingling and communicating as one of the ways to deescalate future conflict.
John Mearsheimer
(02:54:35)
It’s an interesting question. I mean, at sort of an abstract theoretical level, my argument is that great powers act according to realist dictates and they understand those realist dictates, and that could lead to cooperation or it can lead to war. It depends.
John Mearsheimer
(02:55:03)
I would say just in the case of the Soviets, a lot of people describe the Cold War as an ideological competition above all else, it was communism versus liberal democracy or communism versus liberal capitalism, whatever. I actually don’t believe that. I believe the Soviets were realist to the core. I believe Stalin was a realist par excellence, and that ideology did not matter much in Stalin’s foreign policy. And I believe if you look at Soviet foreign policy after World War II, throughout the Cold War, they were realists to the core. And I think in those days, the Americans were realists, a lot of liberal ideology floating around out there, but the Americans were realists. And I think one of the reasons you avoided a shooting match between the United States and the Soviet Union from ’47 to ’89 was because both sides, I think understood basic balance of power logic. US China competition is somewhat different.

(02:56:14)
First of all, the Chinese are realists to the core. I’ve spent a lot of time in China. I’m basically a rock and roll star in China. The Chinese-
Lex Fridman
(02:56:27)
You’re kind of a big deal in China. I love it.
John Mearsheimer
(02:56:29)
The Chinese are my kind of people. They’re realists, right? They speak my language. It’s the United States that is not very realist. American leaders have a very powerful liberal bent and tend not to see the world in realist terms. I believe, by the way, just going back to our discussion of NATO expansion, I think our inability to understand that NATO expansion was anathema to the Soviets, to the Russians, was due in large part to the fact that we just, during the unipolar moment, didn’t think of international politics from a realist perspective and didn’t respect anyone who thought about international politics from a realist perspective. If those various American administrations starting with the Clinton administration had put their realist hat on, they would’ve understood that NATO expansion into Ukraine was not a good idea, but we had this thoroughly liberal view of the world that dominated our thinking, and it’s gone away somewhat since we’ve moved into multipolarity, but not completely.

(02:57:34)
And this makes me a little nervous, to pick up on your point. I mean, the United States is thinking about the world in ways that are somewhat different than the Chinese, who are realists par excellence.
Lex Fridman
(02:57:47)
So that’s fascinating. So the Chinese are pragmatic about thinking of the world as a competition of military powers, all the ways in which you described the realist perspective. So that’s a hopeful thing, right? If we can achieve stability and a balance of powers through that military competition.
John Mearsheimer
(02:58:11)
Yeah, I actually think that’s right. Just let me talk a little bit about the United States to get at the issue you’re raising. If the United States pursues a smart containment strategy, given what you just said and what I said about the Chinese, I think we will avoid war. The problem with the Americans is it’s not just the liberalism. It’s the possibility that we will pursue a rollback policy. In other words, during the Cold War, we pursued containment. Whenever anybody talked about American grand strategy towards the Soviet Union, it was containment, containment, containment. We now know from the historical record that the United States was not only pursuing containment, it was pursuing rollback. We were trying to roll back Soviet power. To put it bluntly, we were trying to wreck the Soviet Union, and I would not be surprised moving forward with regard to China if the United States pursues a serious rollback policy and-
Lex Fridman
(02:59:17)
So you’re saying throughout history, United States was always doing that. Always. Where’s that from? Why can’t we respect the power of other nations?
John Mearsheimer
(02:59:26)
Because they may be a threat to us?
Lex Fridman
(02:59:28)
Well, I mean-
John Mearsheimer
(02:59:31)
Look, you don’t respect the power of other nations. You fear the power of other nations.
Lex Fridman
(02:59:37)
Well, fear and respect are next-door neighbors, depending on the neighborhood you’re living in. But I just mean it could be very counterproductive to try, because if you can empathize with their… If you assume they’re rational actors, you trying to roll them back would lean into the uncertainty of potential conflict. So you want to avoid the uncertainty of potential conflict, caution, right?
John Mearsheimer
(03:00:03)
Well, yes and no. Look, your point is you want to empathize. You want to be able to put yourself in the shoes of the other side.
Lex Fridman
(03:00:10)
Yes.
John Mearsheimer
(03:00:10)
I agree 100% with that, right. It’s very important if you’re a first class strategist to be able to do that, but at the same time, there is this competition for power taking place, and what you want to do is maximize how much power you have relative to the other side, and the other side wants to maximize how much power it has relative to you. So you have this competition for power that’s taking place all the time, and that’s taking place at the same time you want to have empathy or you want to be able to put yourself in the shoes of the other side. So those two things kind of go together.
Lex Fridman
(03:00:49)
It just feels less threatening to build up your thing versus try to hurt the other person’s thing, the other group’s thing.
John Mearsheimer
(03:01:00)
But if you build up your own power, you are building up your capability to hurt the other side.
Lex Fridman
(03:01:06)
Right, but I guess you don’t rattle the saber, just work on manufacturing sabers.
John Mearsheimer
(03:01:13)
Well, that I agree with. I think that the United States wants to make sure it has a big stick in East Asia for purposes of containing China and avoiding a war, right? Again, I want to be clear, I’m not advocating that we start World War III, but the point is you want to have a big stick and you want to make sure that you don’t overstep your bounds in terms of using that big stick. This is the danger with rollback that you get too aggressive and you precipitate a war, and you also just have to be very careful what you say. And to go back to your favorite argument, you want to be able to have empathy or put yourself in the shoes of the other side, because if you do something, you want to think smartly about what that other side, how that other side is going to see your action and how they’re going to react, right?
Lex Fridman
(03:02:09)
And mostly focus on the carrots, have a giant stick laying around, but never mention it, just focus on the carrots.
John Mearsheimer
(03:02:17)
Well, occasionally you have to mention the stick.
Lex Fridman
(03:02:19)
Everyone knows the stick is there.
John Mearsheimer
(03:02:21)
There is some truth in that, right?
Lex Fridman
(03:02:23)
I mean, yeah, and words matter a lot. Our current President Biden is meeting with Xi Jinping, and I think the words exchanged there are really important. I have a notion that leaders can stop wars just as much as they can start wars.
John Mearsheimer
(03:02:42)
Well, leaders matter. There’s no question about that, no question, but just on rhetoric, you want to remember that Putin has on more than one occasion, very subtly rattled the nuclear sword, and it has been very effective because Joe Biden has paid attention, and Joe Biden wants to make sure we don’t end up in a thermonuclear war, and thank goodness he’s thinking that way. So all Putin has to do is mention the possibility of nuclear war. Just to go back to Taiwan, switch areas of the world. If you’re interested in containing China and you’re interested in deterrence, and let’s go back to those various scenarios where the Chinese win, we win, Chinese win, but they do it at great cost.

(03:03:35)
One could argue that that discussion I laid out before didn’t take into account nuclear weapons, and all President Biden or any of his successors has to do is just very subtly rattle or employ the nuclear threat and just sort of remind the Chinese that if you start a war over Taiwan, it could easily escalate into a nuclear war. You want to understand we both have nuclear weapons, and if either one of us is put into a desperate situation, we may turn to those nuclear weapons, and oh, by the way, Xi Jinping, you want to understand that we’re out here in the water, and using nuclear weapons in the water is not the same as using nuclear weapons on land. So we may very well use them. I’m not saying we will, but anyway, a little saber rattling. Right?
Lex Fridman
(03:04:36)
Let me just zoom out on human history. What makes empires collapse, and what makes them last when they do, when you look at human history? And what’s your sense, thinking about the United States, perhaps, as an empire?
John Mearsheimer
(03:04:52)
I don’t view the United States as an empire.
Lex Fridman
(03:04:57)
So to you, an empire is a thing that seeks expansion constantly?
John Mearsheimer
(03:05:03)
Yeah, I think it’s a country that incorporates different regions or areas around the world into sort of a giant sphere of influence without incorporating those territories actually into the state itself. So you had this thing called the British Empire and it controlled areas like India, North America, and Kenya, just to pick a couple instances at different points. Singapore would be another example. Australia would be another example. So these were all entities that were part of the British Empire and the United States has taken a stab at empire after the Spanish American War, for example, with regard to the Philippines and Cuba and Puerto Rico, but we never got serious about it. There’s never been an American empire.

(03:06:13)
This is not to say the United States is not an incredibly powerful country that goes all around the world building military bases and stationing troops here, there and everywhere, but we’re not running an empire the way the British Empire was run or the French Empire. So the question for me is why did those empires go away? Why did the British Empire go away? If you ever look at a map of the world in 1922 after World War I, it’s truly amazing how much of that map is controlled by Britain. They had a huge empire and it’s disappeared.
Lex Fridman
(03:06:53)
Probably by far the biggest empire in terms of area in human history, I think so.
John Mearsheimer
(03:06:59)
I think that’s right. It almost has to be.
Lex Fridman
(03:07:01)
Yeah, right. It’s crazy.
John Mearsheimer
(03:07:04)
Crazy, yeah.
Lex Fridman
(03:07:05)
And then no longer is the case.
John Mearsheimer
(03:07:07)
Yeah. Now I want to be clear. The Americans have wielded maybe even greater influence than Britain did when it had its empire, but I don’t believe we have an empire that bears any resemblance to the British Empire. So the question is, what happened to that British empire? What happened to the French Empire? What happened to the Belgian Empire? What happened to the Dutch Empire? These were countries that had colonies all over the planet. The Dutch East Indies, Vietnam was French Indochina. Where did those empires go? Two factors finished them off. Number one, nationalism. Nationalism became a very powerful force in the 19th Century. It began to rear its head in the late 18th Century and became a very powerful force in the 19th and certainly in the 20th.
Lex Fridman
(03:07:57)
Can you explain nationalism here?
John Mearsheimer
(03:07:59)
Nationalism is the idea that these different nations that were part of the empire, like the Kenyans wanted their own state, nation state. This is my point about the Palestinians, right? This is Palestinian nationalism. What is Zionism? Zionism is Jewish nationalism. Jewish nationalism. Think of Theodore Herzl’s famous book. It’s called The Jewish State, Nation State. Think of the word nation state that embodies nationalism. Nation state, Jewish state. Palestinians want their own state, two state solution. Can’t beat the Palestinians into submission. The Indians wanted their own state. The Pakistanis wanted their own state. The Kenyans wanted their own state. Singapore wanted its own state. Oh, the Americans wanted their own state. This is called the American Revolution.

(03:08:51)
So that’s the first reason, nationalism that these empires disappeared. The second reason is that from a cost benefit analysis, they no longer made any sense, and it was the coming of the Industrial Revolution. Once the Industrial Revolution comes, an empire is basically an albatross around your neck. I would argue that the British Empire was an albatross around Britain’s neck in most of the 20th Century. Some of my friends disagree with that and think there were all sorts of benefits from the British Empire, but you want to remember that in the 20th Century, the three countries that really were powerful were the United States, Germany and the Soviet Union. Those were the big three. Did any of them have an empire? No.
Lex Fridman
(03:09:39)
That’s a good argument.
John Mearsheimer
(03:09:40)
In the industrial world, you don’t need an empire, right? What you need is a powerful manufacturing base.
Lex Fridman
(03:09:50)
Well, the cost-benefit analysis was different before the Industrial Revolution; there have been many empires.
John Mearsheimer
(03:09:56)
There’s no question that empires came and went, right?
Lex Fridman
(03:10:00)
Yes.
John Mearsheimer
(03:10:02)
All you have to do is just look at the British and the French in the Seven Years War, 1756 to 1763. The British win, they get Canada, and that’s why Quebec, Montreal, all these big French-speaking areas are now part of Canada. So borders change and countries got established, the United States being one, and remember, South America and Central America were once completely dominated by the Spanish, and in the case of Brazil, the Portuguese, but they all got their independence in the 19th Century. And what I’m saying to you is in the 19th and in the 20th Century, there were two forces that were really driving the train. One is nationalism, and then the other is the Industrial Revolution, which changes the cost-benefit analysis.
Lex Fridman
(03:11:01)
Almost too crazy of a question, but let’s say 500 years from now, and you, John Mearsheimer, traveled through time and are at a bookstore looking at the entire history of human civilization in a single book. What role does the US play? What’s the story of the US over the next hundred, 200, 300 years? Is it a big role, small role?
John Mearsheimer
(03:11:32)
Well, that’s a long time. If you asked me, let’s just say the next hundred years.
Lex Fridman
(03:11:39)
Yeah, that’s still tough.
John Mearsheimer
(03:11:42)
That’s still tough, but actually I think we’re in excellent shape, and here’s the reason. Going back to the beginning of our conversation, you asked me about power and I told you the two principal building blocks of power are population size and wealth, and therefore you want to look around the world and you want to look at what you think the demographics are of countries like Britain, the United States, Iran, China, Russia, pick your country, moving forward. What do the demographics look like and how wealthy are those countries likely to be? What you discover very quickly is that almost every country around the world is depopulating over time. Russia’s going to be much smaller, China’s going to be much smaller a hundred years from now than both of those countries are today, as best we can tell.

(03:12:49)
United States, American women are not having lots of babies these days. No question about that, but we have immigration. We’re an immigrant culture. You’re a perfect manifestation of that. You’re a perfect, you’re now an American. That’s wonderful. We need more people like you. So when I hear Donald Trump and others arguing that immigration’s a terrible thing, this is ridiculous. Immigration is what made us great. It’s when my relatives came over in the middle of the 19th Century from Germany and Ireland.
Lex Fridman
(03:13:26)
That’s fascinating because there’s been a huge concern, America and other developed nations are not having enough children, but you just made me realize in the long arc of history, the United States has gotten really good at integrating immigrants and helping them flourish. The whole diversity of that makes up America.
John Mearsheimer
(03:13:51)
You’re absolutely right.
Lex Fridman
(03:13:52)
There’s a machinery of integrating other cultures.
John Mearsheimer
(03:13:56)
Yeah, just very quickly on this-
Lex Fridman
(03:13:57)
That’s fascinating.
John Mearsheimer
(03:13:59)
Sam Huntington’s book, Who Are We? Which in many ways I love that book, but it has one fundamental flaw and a number of people told him beforehand that flaw existed and he didn’t fix it, but Sam argues in the book that we have large numbers of Hispanics in this country and we’re doing a very poor job of integrating them into the mainstream and they’re not becoming Americans, and because many of them are concentrated in the Southwest of the United States, unlike other ethnic groups that were spread out all over God’s little green acre, we’re going to have this cohesive group of Spanish speaking Americans who are going to want to break away, and the United States is no longer going to be a reasonably coherent nation state. He’s wrong. All the evidence is that Hispanics are integrating into the American mainstream more quickly and more effectively than the European immigrant groups that came starting around 1835.

(03:15:12)
If you look at immigration from Europe into the United States, leaving aside the original WASPs who came over and founded the place, the immigrants start coming in large numbers in 1835, and we really don’t shut the door until 1924, right? This is a crude overview: starting in 1835 and running up till about 1885, it’s mainly Germans and Irish. That’s why Germans are the largest ethnic group to ever come to the United States, and the Irish are right behind them. These are the European ethnic groups we’re talking about. Then starting in 1885, Poles, Jews and Italians start coming, and the Germans and Irish keep coming, and this is why Ellis Island is opened, I think it’s 1893. Ellis Island is opened because Castle Garden in New York, which had handled all the previous immigrants coming across the pond, Castle Garden, couldn’t handle them all, so they opened up Ellis Island.

(03:16:11)
That’s why somebody like me, I can’t find my distant relative’s records in Ellis Island because they came through Castle Garden. Whereas lots of Jews I know, lots of Italians, I know they can find their relatives records in Ellis Island because they came through Ellis Island. The point is, you had all these immigrants who came in roughly between 1835 and 1924 when we shut the gates. It was the only time we’ve ever really shut the gates in a meaningful way and this is what made America great, all these people, and they made lots of babies.
Lex Fridman
(03:16:47)
So in some sense, make America great again, means getting more immigrants in.
John Mearsheimer
(03:16:52)
Well, we opened the gates again in ’65, closed them in ’24, opened them in ’65. I’m oversimplifying the story here, because we didn’t completely shut them. We almost completely shut them in ’24, opened in ’65, and we’ve had huge numbers of immigrants flowing in. These immigrants who have been flowing in since ’65 are not Europeans. They’re not mainly Europeans, they’re mainly Hispanics and Asians. If you look at those Hispanics and Asians, they’re integrating into the American mainstream at a much faster and more effective clip than was the case with those immigrants who came in the 19th Century and early 20th Century.

(03:17:36)
The Irish, oh my God, they were treated horribly. There’s a book, a very famous book that’s been written called How the Irish Became White. Just think about the title of that book. There was discrimination against all these groups, and the worst discrimination, of course, was against Chinese Americans, but we’ve gotten much better, and what we should do moving forward is redouble our efforts to integrate immigrants into the American mainstream, Hispanics, Asians of all sorts, because the fact is that America is rapidly reaching the point where it’s not going to be an all-white country.

(03:18:24)
I have five children, and two of my children are, I guess, Generation Z, Gen Z. Gen Z is the last majority-white generation; subsequent generations are not majority white. So for anybody who’s bothered by this, I’m not bothered by that, but for anybody who is bothered by this, they’d better get used to it, because Americans aren’t making enough babies that we can continue to grow population-wise in a robust way. So we need immigration and we’re an immigrant culture, and this is a great virtue. It has been a great virtue over time.
Lex Fridman
(03:19:10)
It should be a source of hope, not worry.
John Mearsheimer
(03:19:13)
That’s my view. That’s my view, and America, when it works, is a place that is very attractive to immigrants, and immigrants can do very well here, and then the real key moving forward is intermarriage, and you have a huge amount of intermarriage. Somebody was telling me not too long ago that the highest intermarriage rates in the United States are among Asian American women and Anglos, and I say wonderful and-
Lex Fridman
(03:19:47)
Great.
John Mearsheimer
(03:19:48)
Yeah. No, the more-
Lex Fridman
(03:19:49)
Love is the fastest way to integrate.
John Mearsheimer
(03:19:52)
Yeah. Well, what you want to do is you want to eliminate difference, right? You want to eliminate difference, right? It’s like people who say I’m an antisemite, right? I have two grandsons who Adolf Hitler would’ve thrown into a gas chamber, one of whose first name is John and middle name is Mearsheimer, right?
Lex Fridman
(03:20:15)
Yeah.
John Mearsheimer
(03:20:16)
This is what you want. Steve Walt’s wife and his two children would’ve been thrown into a gas chamber by Adolf Hitler. This is what you want. You want intermarriage. Now, there are a good number of people in some of those groups, especially among Jews, who don’t like intermarriage, but they’ve lost, because I haven’t looked recently at the data for intermarriage rates among basically secular Jews, but it used to be around 62%. Large numbers of Jews marry goyim.
Lex Fridman
(03:20:51)
And they’ve lost because of intermarriage. Intermarriage helps fight tribalism. Destructive kind of tribalism.
John Mearsheimer
(03:20:58)
Exactly.
Lex Fridman
(03:20:58)
It’s nice.
John Mearsheimer
(03:20:59)
Calling me an antisemite, they haven’t met my grandsons, my sons-in-law, a niece that I have, nephews that I have, brothers-in-law that I have. Jewish. Come on.
Lex Fridman
(03:21:13)
And this gives a really nice, hopeful view of America: the integration of different cultures, different kinds of peoples. That is a unique property of America.
John Mearsheimer
(03:21:24)
Yes, but just to go back to where we started, it was not smooth in the beginning.
Lex Fridman
(03:21:29)
All things are rough in the beginning.
John Mearsheimer
(03:21:31)
All things are rough in the beginning.

Life and mortality

Lex Fridman
(03:21:34)
What advice would you give to a young person today about how to have a career they can be proud of or a life they can be proud of?
John Mearsheimer
(03:21:42)
Well, I think it’s very important to make sure that you do something in life that really interests you. My mother used to use this phrase, “Floats your boat.” You want to do something that floats your boat, or to use another one of my mother’s phrases, you want to do something where you get up out of bed in the morning with a bounce in your step. So I think that if your mother and father want you to be a lawyer and they’re pushing you to be a lawyer and you don’t want to be a lawyer, you want to be a policeman, be a policeman. Don’t do what other people want you to do, because it’s very important to find a job, an occupation, that you really love.

(03:22:26)
The second thing I would say, and this has to do with your point about humility, you want to think about the humility hubris index. My friend Steve Van Evera, who teaches at MIT, he and I invented this concept. We call it the hubris humility index, and you want to have a healthy dose of humility, but you also want to have a healthy dose of hubris. You want to think you can change the world. You want to think you can make things better for yourself. You want to take chances. You want to think sometimes that you know better than other people do. Hubris is not a bad thing, but at the same time, you have to have humility. You have to understand that a man or a woman has his or her limits, and you want to listen to other people. You want to be a good listener.

(03:23:19)
So always remember the importance of the hubris humility index and the importance of having healthy doses of both hubris and humility.
Lex Fridman
(03:23:31)
Speaking of humility, you’re mortal, like all humans are, do you ponder your mortality? Are you afraid of it? Are you afraid of death?
John Mearsheimer
(03:23:42)
I’m not sure I’m afraid of death. I don’t want to die because I enjoy life so much.
Lex Fridman
(03:23:50)
Having too much fun?
John Mearsheimer
(03:23:53)
Given how horrible the world is today, I hate to say that I’m having too much fun, but do I find what I do interesting and gratifying? I do. I just love what I do and I love studying international politics, and I love being intellectually curious about all sorts of subjects. I love talking to you about this and that. I mean, this is really wonderful, and I often tell people thank goodness I’m only 28 years old because I do try to behave like I’m only 28 years old, but I am well aware of the fact that as my mother used to say, “Nothing is forever,” and that includes me and when you’re 75 going on 76, you understand that you have a limited number of years left and I find that depressing because I’ve been very lucky and I feel like I’ve won the lottery. I’m very thankful for that. I’d like to make it last for as long as possible, but I do understand that nothing is forever.
Lex Fridman
(03:25:06)
Yeah, the finiteness of things.
John Mearsheimer
(03:25:09)
Yeah. You never think that when you’re young. I mean, you think you’re going to live forever and you’re just not going to get old. I never thought this would happen that I would become 75 years old.
Lex Fridman
(03:25:22)
Well, you got so much energy and boldness and fearlessness and excitement to you that I’m really grateful to see that, especially given how much I’m sure you’ve been attacked for having bold ideas and presenting them and not losing that youthful energy is beautiful to see.
John Mearsheimer
(03:25:46)
Thank you.
Lex Fridman
(03:25:47)
Not becoming cynical. John, it’s a huge honor to speak with you that you’ve given me so much time and so much respect and so much love. This was a really incredible conversation. Thank you so much for everything you do in the world, for looking out into the world and trying to understand it and teach us, and thank you so much for talking with a silly kid like me.
John Mearsheimer
(03:26:07)
It was my pleasure. Thank you very much. I thoroughly enjoyed it.
Lex Fridman
(03:26:11)
Awesome. Thanks for listening to this conversation with John Mearsheimer. To support this podcast, please check out our sponsors in the description. And now let me leave you with some words from Plato. “Only the dead have seen the end of war.” Thank you for listening and hope to see you next time.

Transcript for Elon Musk: War, AI, Aliens, Politics, Physics, Video Games, and Humanity | Lex Fridman Podcast #400

This is a transcript of Lex Fridman Podcast #400 with Elon Musk.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

War and human nature

Lex Fridman
(00:00:00)
The following is a conversation with Elon Musk, his fourth time on this, the Lex Fridman Podcast. I thought you were going to finish it. It’s one of the greatest themes in all of film history.
Elon Musk
(00:00:31)
Yeah, that’s great.
Lex Fridman
(00:00:33)
So I was just thinking about the Roman Empire, as one does.
Elon Musk
(00:00:38)
Is that whole meme where all guys are thinking about the Roman Empire at least once a day?
Lex Fridman
(00:00:44)
And half the population is confused whether it’s true or not. But more seriously, thinking about the wars going on in the world today, and as you know, war and military conquest has been a big part of Roman society and culture, and I think has been a big part of most empires and dynasties throughout human history.
Elon Musk
(00:01:06)
Yeah, they usually came as a result of conquest. I mean, there’s some like the Hapsburg Empire where there was just a lot of clever marriages.
Lex Fridman
(00:01:16)
But fundamentally there’s an engine of conquest and they celebrate excellence in warfare, many of the leaders were excellent generals, that kind of thing. So a big picture question, Grok approved, I asked if this is a good question to ask.
Elon Musk
(00:01:33)
Tested, Grok approved. Yeah.
Lex Fridman
(00:01:36)
At least on fun mode. To what degree do you think war is part of human nature versus a consequence of how human societies are structured? I ask this as you have somehow controversially been a proponent of peace.
Elon Musk
(00:01:57)
I’m generally a proponent of peace. I mean, ignorance is perhaps, in my view, the real enemy to be countered. That’s the real hard part, not fighting other humans, but all creatures fight. I mean, the jungle is… People think of nature as perhaps some sort of peaceful thing, but in fact it is not. There’s some quite funny Werner Herzog thing where he is in the jungle saying that it’s basically just murder and death in every direction. The plants and animals in the jungle are constantly trying to kill each other every single day, every minute. So it’s not like we’re unusual in that respect.
Lex Fridman
(00:02:40)
Well, there’s a relevant question here, whether with greater intelligence comes greater control over these base instincts for violence.
Elon Musk
(00:02:49)
Yes. We have much more ability to control our limbic instinct for violence than, say, a chimpanzee. And in fact, if one looks at, say, chimpanzee society, it is not friendly. I mean, the bonobos are an exception, but chimpanzee society is filled with violence, and it’s quite horrific, frankly. That’s our limbic system in action. You don’t want to be on the wrong side of a chimpanzee; it’ll eat your face off and tear your nuts off.
Lex Fridman
(00:03:22)
Yeah. Basically there’s no limits or ethics, they almost… there’s no just war in the chimpanzee societies. It’s war and dominance by any means necessary?
Elon Musk
(00:03:33)
Yeah. Chimpanzee society is a primitive version of human society. They’re not peace-loving, basically, at all. There’s extreme violence, and then once in a while, somebody who’s watched too many Disney movies decides to raise a chimpanzee as a pet, and then it eats their face off or tears their nuts off or chews their fingers off, that kind of thing. It’s happened several times.
Lex Fridman
(00:03:58)
Ripping your nuts off is an interesting strategy for interaction.
Elon Musk
(00:04:02)
It’s happened to people. It’s unfortunate. That’s, I guess, one way to ensure that the other chimp doesn’t contribute to the gene pool.
Lex Fridman
(00:04:10)
Well, from a martial arts perspective, it’s a fascinating strategy.
Elon Musk
(00:04:15)
The nut ripper.
Lex Fridman
(00:04:18)
I wonder which of the martial arts teaches that one.
Elon Musk
(00:04:21)
I think it’s safe to say if somebody’s got your nuts in their hands and has the option of ripping them off, you’ll be amenable to whatever they want.

Israel-Hamas war

Lex Fridman
(00:04:30)
Yeah. Safe to say. So, like I said, somehow controversially, you’ve been a proponent of peace on Twitter on X.
Elon Musk
(00:04:38)
Yeah.
Lex Fridman
(00:04:39)
So let me ask you about the wars going on today and to see what the path to peace could be. How do you hope the current war in Israel and Gaza comes to an end? What path do you see that can minimize human suffering in the longterm in that part of the world?
Elon Musk
(00:04:54)
Well, I think that part of the world is definitely, if you look up “there is no easy answer” in the dictionary, it’ll be a picture of the Middle East, and Israel especially. So there is no easy answer. This is strictly my opinion: the goal of Hamas was to provoke an overreaction from Israel. They obviously did not expect to have a military victory, but they really wanted to commit the worst atrocities that they could in order to provoke the most aggressive response possible from Israel, and then leverage that aggressive response to rally Muslims worldwide for the cause of Gaza and Palestine, which they have succeeded in doing. So the counterintuitive thing here, the thing that I think should be done, even though it’s very difficult, is that I would recommend that Israel engage in the most conspicuous acts of kindness possible, everything. That is the actual thing that would thwart the goal of Hamas.
Lex Fridman
(00:06:19)
So in some sense, to the degree that makes sense in geopolitics, turn the other cheek, implemented.
Elon Musk
(00:06:26)
It’s not exactly turn the other cheek, because I do think that it is appropriate for Israel to find the Hamas members and either kill them or incarcerate them. That’s something that has to be done, because they’re just going to keep coming otherwise. But in addition to that, they need to do whatever they can. There’s some talk of establishing, for example, a mobile hospital. I’d recommend doing that. Just making sure that there’s food, water, medical necessities, and just be over the top about it and be very transparent. So [inaudible 00:07:22] can claim it’s a trick. Just put a webcam on the thing, 24/7.
Lex Fridman
(00:07:29)
Deploy acts of kindness.
Elon Musk
(00:07:31)
Yeah, conspicuous acts of kindness that are unequivocal, meaning they can’t be somehow… because Hamas will, their response will be, “Oh, it’s a trick.” Therefore, you have to counter how it’s not a trick.
Lex Fridman
(00:07:47)
This ultimately fights the broader force of hatred in the region.
Elon Musk
(00:07:51)
Yes. And I’m not sure who said it, it’s an [inaudible 00:07:54] saying, but an eye for an eye makes everyone blind. Now, that neck of the woods, they really believe in the whole eye-for-an-eye thing. But you really have… If you’re not going to just outright commit genocide against an entire people, which obviously would not be acceptable to, really, shouldn’t be acceptable to anyone, then you’re going to leave basically a lot of people alive who subsequently hate Israel. So really the question is, for every Hamas member that you kill, how many did you create? And if you create more than you killed, you’ve not succeeded. That’s the real situation there. And it’s safe to say that if you kill somebody’s child in Gaza, you’ve made at least a few Hamas members who will die just to kill an Israeli. That’s the situation. But I mean, this is one of the most contentious subjects one could possibly discuss. But I think if the goal ultimately is some sort of long-term peace, one has to look at this from the standpoint of, over time, are there more or fewer terrorists being created?
Lex Fridman
(00:09:26)
Let me just linger on war.
Elon Musk
(00:09:29)
Yeah, war, safe to say, wars always existed and always will exist.
Lex Fridman
(00:09:33)
Always will exist.
Elon Musk
(00:09:34)
Always has existed and always will exist.
Lex Fridman
(00:09:37)
I hope not. You think it’ll always-
Elon Musk
(00:09:42)
There will always be war. There’s a question of just how much war and there’s sort of the scope and scale of war. But to imagine that there would not be any war in the future, I think would be a very unlikely outcome.
Lex Fridman
(00:09:55)
Yeah. You talked about the Culture series. There’s war even there.
Elon Musk
(00:09:58)
Yes. It’s a giant war. The first book starts off with a gigantic galactic war where trillions die. Trillions.
Lex Fridman
(00:10:07)
But it still nevertheless protects these pockets of flourishing. Somehow you can have galactic war and still have pockets of flourishing.
Elon Musk
(00:10:18)
Yeah, I guess if we are able to one day expand to fill the galaxy or whatever, there will be a galactic war at some point.
Lex Fridman
(00:10:31)
I mean, the scale of war has been increasing, increasing, increasing. It’s like a race between the scale of suffering and the scale of flourishing.
Elon Musk
(00:10:38)
Yes.

Military-Industrial Complex

Lex Fridman
(00:10:41)
A lot of people seem to be using this tragedy to beat the drums of war and feed the military industrial complex. Do you worry about this, the people who are rooting for escalation and how can it be stopped?
Elon Musk
(00:10:56)
One of the things that does concern me is that there are very few people alive today who actually viscerally understand the horrors of war, at least in the US. I mean, obviously there are people on the front lines in Ukraine and Russia who understand just how terrible war is, but how many people in the West understand it? My grandfather was in World War II. He was severely traumatized. He was there, I think, for almost six years in Eastern North Africa and Italy. All his friends were killed in front of him, and he would’ve died too, except they randomly gave some, I guess, IQ test or something, and he scored very high. He was not an officer. He was, I think, a corporal or a sergeant or something like that, because he didn’t finish high school; he had to drop out of high school because his dad died and he had to work to support his siblings. So because he didn’t graduate high school, he was not eligible for the officer corps.

(00:11:57)
So he kind of got put into the cannon fodder category, basically. But then randomly they gave him this test and he was transferred to British intelligence in London. That’s where he met my grandmother. But he had PTSD, next level, next level. I mean, he just didn’t talk, just didn’t talk. And if you tried talking to him, he’d just tell you to shut up. And he won a bunch of medals, never bragged about it once, not even hinted, nothing. I found out about it because his military records were online. That’s how I know. So he would say, “No way in hell do you want to do that again.” But how many people… Obviously he died, 20 years ago or longer, actually 30 years ago. How many people are alive that remember World War II? Not many.
Lex Fridman
(00:12:54)
And the same perhaps applies to the threat of nuclear war.
Elon Musk
(00:13:01)
Yeah, I mean, there are enough nuclear bombs pointed at the United States to make the radioactive rubble bounce many times.
Lex Fridman
(00:13:10)
There’s two major wars going on right now. So you talked about the threat of AGI quite a bit, but now as we sit here with the intensity of conflict going on, do you worry about nuclear war?
Elon Musk
(00:13:25)
I think we shouldn’t discount the possibility of nuclear war. It is a civilizational threat. Right now, I could be wrong, but I think the current probability of nuclear war is quite low. But there are a lot of nukes pointed at us, and we have a lot of nukes pointed at other people. They’re still there. Nobody’s put their guns away. The missiles are still in the silos.
Lex Fridman
(00:13:57)
And the leaders don’t seem to be the ones with the nukes talking to each other.
Elon Musk
(00:14:03)
No, there are wars which are tragic and difficult on a local basis. And then there are wars which are civilization-ending, or have that potential. Obviously, global thermonuclear warfare has high potential to end civilization, perhaps permanently, but certainly to severely wound and perhaps set back human progress to the Stone Age or something. I don’t know. Pretty bad. Probably scientists and engineers won’t be super popular after that as well. You got us into this mess. So generally, I think we obviously want to prioritize civilizational risks over things that are painful and tragic on a local level, but not civilizational.

War in Ukraine

Lex Fridman
(00:15:00)
How do you hope the war in Ukraine comes to an end? And what’s the path, once again to minimizing human suffering there?
Elon Musk
(00:15:08)
Well, I think that what is likely to happen, which is really pretty much the way it is, is that something very close to the current lines will be how a ceasefire or truce happens. But you just have a situation right now where whoever goes on the offensive will suffer casualties at several times the rate of whoever’s on the defense, because you’ve got defense in depth, you’ve got minefields, trenches, anti-tank defenses. Nobody has air superiority, because the anti-aircraft missiles are really far better than the aircraft. There are far more of them. And so neither side has air superiority. Tanks are basically death traps, just slow-moving, and they’re not immune to anti-tank weapons. So you really just have long-range artillery and infantry ranges. It’s World War I all over again with drones thrown in, drones here, drones there.
Lex Fridman
(00:16:25)
Which makes the long range artillery just that much more accurate and better, and so more efficient at murdering people on both sides.
Elon Musk
(00:16:34)
So whoever is… You don’t want to be trying to advance from either side, because the probability of dying is incredibly high. So in order to overcome defense in depth, trenches and minefields, you really need a significant local superiority in numbers, ideally combined arms, where you do a fast attack with aircraft, a concentrated number of tanks, and a lot of people. That’s the only way you’re going to punch through a line, and then once you punch through, not have reinforcements just kick you right out again. I mean, I really recommend people read about World War I warfare in detail. That’s rough. I mean, the sheer number of people that died there was mind-boggling.
Lex Fridman
(00:17:37)
And it’s almost impossible to imagine an end of it that doesn’t look almost exactly like the beginning in terms of what land belongs to who and so on, but on the other side of a lot of human suffering, death and destruction of infrastructure.
Elon Musk
(00:17:56)
Yes. The thing that… The reason I proposed some sort of truce or peace a year ago was because I predicted pretty much exactly what would happen, which is a lot of people dying for basically almost no changes in land, and the loss of the flower of Ukrainian and Russian youth. And we should have some sympathy for the Russian boys as well as the Ukrainian boys, because the Russian boys didn’t ask to be on that front line. They have to be. So there’s a lot of sons not coming back to their parents, and I think most of them don’t hate the other side. It’s sort of like, this saying comes from World War I: it’s like young boys who don’t know each other killing each other on behalf of old men that do know each other. What the hell’s the point of that?
Lex Fridman
(00:19:02)
So Volodymyr Zelenskyy has said in the past that he’s not interested in talking to Putin directly. Do you think he should sit down, man to man, leader to leader, and negotiate peace?
Elon Musk
(00:19:14)
Look, I think I would just recommend do not send the flower of Ukrainian youth to die in trenches, whether he talks to Putin or not, just don’t do that. Whoever goes on the offensive will lose massive numbers of people and history will not look kindly upon them.

China

Lex Fridman
(00:19:42)
You’ve spoken honestly about the possibility of war between the US and China in the long term if no diplomatic solution is found, for example, on the question of Taiwan and the One China policy. How do we avoid the trajectory where these two superpowers clash?
Elon Musk
(00:19:58)
Well, it’s worth reading that book on the, difficult to pronounce, Thucydides Trap, I believe it’s called. I love war history. I know it inside out and backwards. There’s hardly a battle I haven’t read about, and trying to figure out what really was the cause of victory in any particular case, as opposed to what one side or another claimed was the reason.
Lex Fridman
(00:20:21)
Both the victory and what sparked the war and-
Elon Musk
(00:20:24)
Yeah, yeah.
Lex Fridman
(00:20:25)
The whole thing.
Elon Musk
(00:20:26)
Yeah. So Athens and Sparta is a classic case. The thing about the Greeks is they really wrote down a lot of stuff. They loved writing. There are lots of interesting things that happened in many parts of the world, but people didn’t write them down, so we don’t know what happened, or they didn’t really write in detail. They just would say, “We had a battle and we won.” And, what? Can you add a bit more? The Greeks, they really wrote a lot. They were very articulate on… They just loved writing. And we have a bunch of that writing preserved. So we know what led up to the Peloponnesian War between the Spartan and Athenian alliances, and we know that they saw it coming.

(00:21:16)
Spartans didn’t write… They also weren’t very verbose by their nature, but they did write, but they weren’t very verbose. They were [inaudible 00:21:23]. But the Athenians and the other Greeks wrote a lot, and Sparta was really kind of like the leader of Greece. But Athens grew stronger and stronger with each passing year. And everyone’s like, “Well, it’s inevitable that there’s going to be a clash between Athens and Sparta. Well, how do we avoid that?” And actually they saw it coming and they still could not avoid it. So at some point, if one group, one civilization or country or whatever, exceeds another, sort of like the United States has been the biggest kid on the block since, I think, around 1890 from an economic standpoint.

(00:22:14)
So the United States has been the most powerful economic engine in the world longer than anyone’s been alive. And the foundation of war is economics. So now we have a situation in the case of China where the economy is likely to be two, perhaps three times larger than that of the US. So imagine you’re the biggest kid on the block for as long as anyone can remember, and suddenly a kid comes along who’s twice your size.
Lex Fridman
(00:22:55)
So we see it coming; how is it possible to stop? Let me throw something out there: just intermixing of cultures, understanding… there does seem to be a giant cultural gap in understanding of each other. And you’re an interesting case study, because you are an American, obviously you’ve done a lot of incredible manufacturing here in the United States, but you also work with China.
Elon Musk
(00:23:20)
I’ve spent a lot of time in China and met with the leadership many times.
Lex Fridman
(00:23:22)
Maybe a good question to ask is, what are some things about China that people don’t understand, positive just in the culture? What’s some interesting things that you’ve learned about the Chinese?
Elon Musk
(00:23:36)
Well, the sheer number of really smart, hardworking people in China is incredible. I mean, really, how many smart, hardworking people are there in China? There are far more of them there than there are here, I think, in my opinion. And they’ve got a lot of energy. So I mean, the architecture in China of recent years is far more impressive than the US. I mean, the train stations, the buildings, the high-speed rail, everything, it’s really far more impressive than what we have in the US. I mean, I recommend somebody just go to Shanghai and Beijing, look at the buildings, and go take the train from Beijing to Xian, where you have the terracotta warriors. China’s got an incredible history, very long history, and I think arguably, in terms of the use of language from a written standpoint, one of the oldest, perhaps the oldest written language. And in China, people did write things down.

(00:24:50)
So now, China historically has always been, with rare exception, internally focused. They have not been acquisitive. They’ve fought each other. There’ve been many, many civil wars. In the Three Kingdoms war, I believe they lost about 70% of their population. So they’ve had brutal internal wars, civil wars that make the US Civil War look small by comparison. So I think it’s important to appreciate that China is not monolithic. We sort of think of China as one entity, of one mind. And this is definitely not the case. From what I’ve seen, and I think most people who understand China would agree, people in China think about China 10 times more than they think about anything outside of China. So it’s like 90% of their consideration is internal.
Lex Fridman
(00:26:01)
Well, isn’t that a really positive thing, when you’re talking about the collaboration and the future peace between superpowers, when you’re inward-facing, which is focusing on improving yourself versus focusing on, quote, unquote, improving others through military might?
Elon Musk
(00:26:18)
The good news is the history of China suggests that China is not acquisitive, meaning they’re not going to go out and invade a whole bunch of countries. Now, they do feel very strongly… So that’s good. I mean, because a lot of very powerful countries have been acquisitive. The US is also one of the rare cases that has not been acquisitive. After World War II, the US could have basically taken over the world. Any country: we’ve got nukes, nobody else has got nukes. We don’t even have to lose soldiers. Which country do you want? And the United States could have taken over everything, and it didn’t. And the United States actually helped rebuild countries. So it helped rebuild Europe, helped rebuild Japan. This is very unusual behavior, almost unprecedented.

(00:27:10)
The US did conspicuous acts of kindness like the Berlin Airlift. And I think it’s always like, well, America’s done bad things. Well, of course America’s done bad things, but one needs to look at the whole track record. And just generally, one sort of test would be, how do you treat your prisoners of war? Or, let’s say, no offense to the Russians, but let’s say you’re in Germany, it’s 1945, you’ve got the Russian Army coming from one side and you’ve got the French, British and American armies coming from the other side. Who would you like to surrender to? No country is [inaudible 00:27:58] perfect, but I recommend being a POW with the Americans. That would be my choice, very strongly.
Lex Fridman
(00:28:07)
In the full menu of POWs in the US.
Elon Musk
(00:28:08)
Very much so. And in fact, Wernher von Braun, a smart guy, was like, “We’ve got to be captured by the Americans.” And in fact, the SS was under orders to execute von Braun and all of the German rocket engineers, and they narrowly escaped. They said they were going out for a walk in the woods. They left in the middle of winter with no coats, and then ran, but no food, no coats, no water, and just ran like hell and ran West. And by sheer luck, I think, his brother found a bicycle or something and then just cycled West as fast as he could and found a US patrol. So anyway, that’s one way you can tell morality: where do you want to be a POW? It’s not fun anywhere, but some places are much worse than others. Anyway, so America has been, while far from perfect, generally a benevolent force, and we should always be self-critical and we try to be better, but anyone with half a brain knows that.

(00:29:31)
So I think there are… In this way, China and the United States are similar. Neither country has been acquisitive in a significant way. So that’s a shared principle, I guess. Now, China does feel very strongly about Taiwan. They’ve been very clear about that for a long time. From their standpoint, it would be like one of the states is not there, like Hawaii or something like that, but more significant than Hawaii. And Hawaii is pretty significant for us. So they view it as, really, there’s a fundamental part of China, the island of Formosa, now Taiwan, that is not part of China but should be. And the only reason it hasn’t been is because of the US Pacific Fleet.
Lex Fridman
(00:30:32)
And as their economic power grows and as their military power grows, the thing that they’re clearly saying is in their interest will clearly be materialized.
Elon Musk
(00:30:46)
Yes, China has been very clear that they’ll incorporate Taiwan peacefully or militarily, but that they will incorporate it from their standpoint is 100% likely.
Lex Fridman
(00:31:04)
Something you said about conspicuous acts of kindness as a geopolitical policy, it almost seems naive, but I’d venture to say that this is probably the path forward, how you avoid most wars. Just as you say, it sounds naive, but it’s kind of brilliant. If you believe in the goodness underlying most of human nature, it just seems like conspicuous acts of kindness can reverberate through the populace of the countries involved and deescalate.
Elon Musk
(00:31:44)
Absolutely. So after World War I, they made a big mistake. They basically tried to lump all of the blame on Germany and saddle Germany with impossible reparations. And really, there was quite a bit of blame to go around for World War I, but they tried to put it all on Germany, and that laid the seeds for World War II. So a lot of people, not just Hitler, a lot of people felt wronged and they wanted vengeance, and they got it.
Lex Fridman
(00:32:38)
People don’t forget.
Elon Musk
(00:32:41)
Yeah, you kill somebody’s father, mother, son, daughter, they’re not going to forget it. They’ll want vengeance. So after World War II, they’re like, “Well, the Treaty of Versailles was a huge mistake in World War I. And so this time, instead of crushing the losers, we’re actually going to help them with the Marshall Plan, and we’re going to help rebuild Germany. We’re going to help rebuild Austria and Italy and whatnot.” So that was the right move.
Lex Fridman
(00:33:26)
It does feel like there’s a profound truth to the conspicuous acts of kindness being an antidote to this.
Elon Musk
(00:33:37)
Something must stop the cycle of reciprocal violence. Something must stop it, or it’ll never stop. Just eye for an eye, tooth for a tooth, limb for a limb, life for a life forever and ever.

xAI Grok

Lex Fridman
(00:33:57)
To escape briefly the darkness, with some incredible engineering work, xAI just released Grok, an AI assistant that I’ve gotten a chance to play with. It’s amazing on many levels. First of all, it’s amazing that a relatively small team in a relatively short amount of time was able to develop this close-to-state-of-the-art system. Another incredible thing is there’s a regular mode and there’s a fun mode.
Elon Musk
(00:34:23)
Yeah, I guess I’m to blame for that one.
Lex Fridman
(00:34:27)
First of all, I wish everything in life had a fun mode.
Elon Musk
(00:34:29)
Yeah.
Lex Fridman
(00:34:30)
There’s something compelling beyond just fun about the fun mode interacting with a large language model. I’m not sure exactly what it is, because I’ve only had a little bit of time to play with it, but it just makes it more interesting, more vibrant to interact with the system.
Elon Musk
(00:34:47)
Yeah, absolutely. Our AI, Grok, is modeled after The Hitchhiker’s Guide to the Galaxy, which is one of my favorite books, which is a book on philosophy. It’s-
Elon Musk
(00:35:00)
My favorite books, it’s a book on philosophy disguised as a book on humor. And I would say that forms the basis of my philosophy, which is that we don’t know the meaning of life, but the more we can expand the scope and scale of consciousness, digital and biological, the more we’re able to understand what questions to ask about the answer that is the universe. So I have a philosophy of curiosity.
Lex Fridman
(00:35:34)
There is generally a feeling like this AI system is outward looking, like the way you are sitting with a good friend looking up at the stars, asking pothead-like questions about the universe, wondering what it’s all about. The curiosity that you talk about. No matter how mundane the question I ask it, there’s a sense of cosmic grandeur to the whole thing.
Elon Musk
(00:35:59)
Well, we are actually working hard to have engineering, math, and physics answers that you can count on. So for the other AIs out there, these so-called large language models, I’ve not found the engineering to be reliable. It unfortunately hallucinates most when you least want it to hallucinate. So when you’re asking important, difficult questions, that’s when it tends to be confidently wrong. So we’re really trying hard to say, okay, how do we be as grounded as possible, so you can count on the results? Trace things back to physics first principles, mathematical logic. So underlying the humor is an aspiration to adhere to the truth of the universe as closely as possible.
Lex Fridman
(00:37:01)
That’s really tricky.
Elon Musk
(00:37:02)
It is tricky. So that’s why there’s always going to be some amount of error. But we do want to aspire to be as truthful as possible about the answers, with acknowledged error. So you don’t want to be confidently wrong; you’re not going to be right every time, but you want to minimize how often you’re confidently wrong. And then, like I said, once you can count on the logic as not violating physics, then you can start to build on that to create inventions, like invent new technologies. But if you cannot count on the foundational physics being correct, obviously the inventions are simply wishful thinking, imagination land. Magic, basically.
Lex Fridman
(00:38:01)
Well, as you said, I think one of the big goals of xAI is to understand the universe.
Elon Musk
(00:38:06)
Yes, that’s our simple three-word mission.
Lex Fridman
(00:38:13)
If you look out far into the future, do you think, on this level of physics, the very edge of what we understand about physics, do you think it will make the sexiest discovery of them all, as we know it now: unifying general relativity and quantum mechanics? So coming up with a theory of everything, do you think it could push towards that direction, almost like theoretical physics discoveries?
Elon Musk
(00:38:38)
If an AI cannot figure out new physics, it’s clearly not equal to humans, nor has it surpassed humans, because humans have figured out new physics. Physics is just deepening one’s insight into how reality works. And then there’s engineering, which is inventing things that have never existed. Now, the range of possibilities for engineering is far greater than for physics, because once you figure out the rules of the universe, that’s it. You’ve discovered things that already existed. But from that you can then build technologies that are really almost limitless in their variety. And it’s like once you understand the rules of the game properly, and with current physics we do, at least at a local level, understand how physics works very well. Our ability to predict things is incredibly good. The degree to which quantum mechanics can predict outcomes is incredible. That was my hardest class in college, by the way. My senior quantum mechanics class was harder than all of my other classes put together.
Lex Fridman
(00:39:50)
To get an AI system, a large language model, to be as reliable as quantum mechanics and physics is very difficult.
Elon Musk
(00:40:01)
Yeah. You have to test any conclusions against the ground truth of reality. Reality is the ultimate judge. Like physics is the law, everything else is a recommendation. I’ve seen plenty of people break the laws made by man, but none break the laws made by physics.
Lex Fridman
(00:40:15)
It’s a good test actually. If this LLM understands and matches physics, then you can more reliably trust whatever it thinks about the current state of politics in some sense.
Elon Musk
(00:40:28)
And it’s also the case currently that its internal logic is not consistent. So especially with the approach of just predicting a token, predict token, predict token, it’s like a vector sum. You’re summing up a bunch of vectors, but you can get drift. A little bit of error adds up, and by the time you are many tokens down the path, it doesn’t make any sense.
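As a rough illustration of the drift described above, here is a minimal toy sketch in Python. It is not Grok’s or any xAI implementation; the function name, step count, and error size are all made up for illustration. It only shows how small per-step errors compound when each step sees nothing but the previous step’s output.

```python
import random

def drifting_sequence(steps: int, per_step_error: float, seed: int = 0) -> list:
    """Toy model of token-by-token generation: each new value is produced
    from the previous one with a small random error, so errors accumulate."""
    random.seed(seed)
    value = 0.0          # the intended target stays at 0.0 at every step
    history = []
    for _ in range(steps):
        # each step only sees the previous (already slightly wrong) value
        value += random.uniform(-per_step_error, per_step_error)
        history.append(value)
    return history

drift = drifting_sequence(steps=1000, per_step_error=0.01)
print(f"offset after 10 steps:   {abs(drift[9]):.4f}")
print(f"offset after 1000 steps: {abs(drift[-1]):.4f}")  # typically much larger
```

In this toy, each step’s small error feeds the next step, which is the compounding being pointed at; checking the whole sequence against a global target, rather than only the previous step, is the analogue of the “revise the whole draft for coherence” idea discussed next.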
Lex Fridman
(00:40:59)
So it has to be somehow self-aware about the drift.
Elon Musk
(00:41:02)
It has to be self-aware about the drift, and then look at the thing as a gestalt, as a whole, and say it doesn’t have coherence as a whole. When authors write books, they will write the book and then they’ll go and revise it, taking into account the end and the beginning and the middle, and rewrite it to achieve coherence so that it doesn’t end up at a nonsensical place.
Lex Fridman
(00:41:33)
Maybe the process of revising is what reasoning is, and then the process of revising is how you get closer and closer to truth. At least I approach it that way: you just say a bunch of bullshit first and then you get it better. You start with bullshit and then you-
Elon Musk
(00:41:51)
Create a draft and then you iterate on that draft until it has coherence, until it all adds up basically.
Lex Fridman
(00:41:59)
Another question about theory of everything, but for intelligence, as you’re exploring this with xAI, creating this intelligence system? Do you think there is a theory of intelligence where you get to understand what is the I in AGI and what is the I in human intelligence?
Elon Musk
(00:42:22)
There’s no I in Team America. Wait, there is.
Lex Fridman
(00:42:24)
Now it’s going to be stuck in my head. Yeah, there’s no me in whatever, in quantum mechanics, wait. I mean, is part of the process of discovering, understanding the universe, understanding intelligence?
Elon Musk
(00:42:50)
Yeah. I think we need to understand intelligence, understand consciousness. I mean there are some fundamental questions of what is thought, what is emotion? Is it really just one atom bumping into another atom? It feels like something more than that. So I think we’re probably missing some really big things.
Lex Fridman
(00:43:18)
Something that’ll be obvious in retrospect. You put the whole consciousness, emotion…
Elon Musk
(00:43:26)
Well, some people would call it, quote, a soul; in religion it’d be a soul. You feel like you’re you, I mean, you don’t feel like you’re just a collection of atoms, but on what dimension does thought exist? On what dimension do emotions exist? Because we feel them very strongly. I suspect there’s more to it than atoms bumping into atoms.
Lex Fridman
(00:43:52)
And maybe AI can pave the path to the discovery of whatever the hell that thing is.
Elon Musk
(00:43:58)
Yeah. What is consciousness? When you put the atoms in a particular shape, why are they able to form thoughts and take actions and have feelings?
Lex Fridman
(00:44:10)
And even if it is an illusion, why is this illusion so compelling?
Elon Musk
(00:44:13)
Yeah. Why does the illusion exist? On what plane does the illusion exist? And sometimes I wonder, either perhaps everything’s conscious or nothing’s conscious. One of the two.
Lex Fridman
(00:44:33)
I like the former. Everything conscious just seems more fun.
Elon Musk
(00:44:37)
It does seem more fun, yes. But we’re composed of atoms and those atoms are composed of quarks and leptons and those quarks and leptons have been around since the beginning of the universe.
Lex Fridman
(00:44:50)
“The beginning of the universe.”
Elon Musk
(00:44:53)
What seems to be the beginning of the universe.

Aliens

Lex Fridman
(00:44:55)
The first time we talked, and it’s surreal to think that that discussion is now becoming a reality, I asked you what question you would ask an AGI system once you create it, and you said the question would be, “What’s outside the simulation?” Good question. But it seems like with Grok, you’ve started, literally, the system’s goal is to be able to answer such questions and to ask such questions.
Elon Musk
(00:45:24)
Where are the aliens?
Lex Fridman
(00:45:25)
Where are the aliens?
Elon Musk
(00:45:26)
That’s the Fermi paradox question. A lot of people have asked me if I’ve seen any evidence of aliens, and I haven’t, which is kind of concerning. I think I’d probably prefer to at least have seen some archeological evidence of aliens. To the best of my knowledge, I’m not aware of any evidence of aliens. If they’re out there, they’re very subtle. We might just be the only consciousness, at least in the galaxy. And if you look at, say, the history of Earth, if you believe the archeological record, Earth is about four and a half billion years old. Civilization, as measured from the first writing, is only about 5,000 years old. We have to give some credit there to the ancient Sumerians, who aren’t around anymore. I think it was archaic pre-cuneiform that was the first actual symbolic representation, but only about 5,000 years ago. I think that’s a good date for when we say civilization started. That’s about one millionth of Earth’s existence.

(00:46:35)
So civilization has been around. It’s really a flash in the pan so far. And why did it take so long? Four and a half billion years. For the vast majority of the time, there was no life. And then there was archaic bacteria for a very long time. And then you had mitochondria get captured, multicellular life, differentiation into plants and animals, life moving from the oceans to land, mammals, higher brain functions. And the sun is expanding slowly, but it’ll heat the Earth up at some point in the future, boil the oceans, and Earth will become like Venus, where life as we know it is impossible. So if we do not become multiplanetary and ultimately go beyond our solar system, annihilation of all life on Earth is a certainty. A certainty. And it could be as little as, on the galactic timescale, half a billion years. A long time by human standards, but that’s only 10% longer than Earth has been around at all. So if life had taken 10% longer to evolve on Earth, it wouldn’t exist at all.
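For reference, the rough arithmetic behind those two claims, using only the round numbers mentioned above:

earth_age_years = 4.5e9        # rough age of Earth cited above
civilization_years = 5_000     # since the first writing
window_left_years = 0.5e9      # "as little as half a billion years" remaining

print(f"civilization / Earth's age = {civilization_years / earth_age_years:.1e}")    # ~1e-06, about one millionth
print(f"remaining window / Earth's age = {window_left_years / earth_age_years:.0%}") # ~11%, i.e. roughly 10% longer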
Lex Fridman
(00:48:27)
Glad there’s a deadline coming up. You better hurry. But that said, as you said, humans, intelligent life on Earth, developed a lot of cool stuff very quickly. So it seems like becoming multiplanetary is almost inevitable. Unless we destroy-
Elon Musk
(00:48:45)
We need to do it. I suspect that if we are able to go out there and explore other star systems that we… There’s a good chance we find a whole bunch of long dead one planet civilizations that never made it past their home planet.
Lex Fridman
(00:49:03)
That’s so sad. Also fascinating.
Elon Musk
(00:49:08)
I mean there are various explanations for the Fermi paradox, and one is that there are these great filters which civilizations don’t pass through. And one of those great filters is, do you become a multiplanet civilization or not? And if you don’t, it’s simply a matter of time before something happens on your planet, either natural or manmade, that causes you to die out. Like the dinosaurs, where are they now? They didn’t have spaceships.
Lex Fridman
(00:49:42)
I think the more likely thing is, just to empathize with the aliens, that they found us and they’re protecting us and letting us be.
Elon Musk
(00:49:51)
I hope so. Nice aliens.
Lex Fridman
(00:49:53)
Just like the tribes in the Amazon, the uncontacted tribes, we’re protecting them. That’s what-
Elon Musk
(00:49:59)
That would be a nice explanation.
Lex Fridman
(00:50:00)
Or you could have, what was it? I think Andrej Karpathy said, “It’s like the ants in the Amazon asking, where is everybody?”
Elon Musk
(00:50:10)
Well, they do run into a lot of other ants.
Lex Fridman
(00:50:12)
That’s true.
Elon Musk
(00:50:14)
These ant wars.
Lex Fridman
(00:50:16)
Sounds like a good TV show.
Elon Musk
(00:50:18)
Yeah. They literally have these big wars between various ants.
Lex Fridman
(00:50:21)
Yeah. Maybe I’m just dismissing all the diversity of ants.
Elon Musk
(00:50:28)
Listen to Werner Herzog talking about the jungle. It’s really hilarious. Have you heard it?
Lex Fridman
(00:50:31)
No, I have not. But Werner Herzog is a way.
Elon Musk
(00:50:37)
You should play it as an interlude in the… It’s on YouTube. It’s awesome.
Lex Fridman
(00:50:45)
I love him so much.
Elon Musk
(00:50:47)
He’s great.
Lex Fridman
(00:50:47)
Was he the director of Happy People: A Year in the Taiga? I think also-
Elon Musk
(00:50:51)
He did that bear documentary. And this thing about penguins.
Lex Fridman
(00:50:58)
The psychoanalysis of a penguin.
Elon Musk
(00:51:00)
Yeah. The penguin’s headed for mountains that are 70 miles away, and the penguin is just headed for doom, basically.
Lex Fridman
(00:51:08)
Well, he had a cynical take. He could be just a brave explorer and there’ll be great stories told about him amongst the penguin population for many centuries to come. What were we talking about? Okay.
Elon Musk
(00:51:28)
Yeah. So aliens, I mean, I don’t know. Look, I think the smart move is just this is the first time in the history of earth that it’s been possible for life to extend beyond earth. That window is open. Now it may be open for a long time or it may be open for a short time and it may be open now and then never open again. So I think the smart move here is to make life multiplanetary while it’s possible to do so. We don’t want to be one of those lame one planet civilizations that just dies out.
Lex Fridman
(00:52:04)
No, those are lame.
Elon Musk
(00:52:05)
Yeah. Lame. No self-respecting civilization would be one planet.
Lex Fridman
(00:52:11)
There’s not going to be a Wikipedia entry for one of those. Does SpaceX have an official policy for when we meet aliens?
Elon Musk
(00:52:23)
No.
Lex Fridman
(00:52:24)
That seems irresponsible.
Elon Musk
(00:52:30)
I mean, look, if I see the slightest indication that there are aliens, I will immediately post on X platform anything I know.
Lex Fridman
(00:52:38)
It could be the most liked reposted post of all time.
Elon Musk
(00:52:42)
Yeah. I mean, look, we have more satellites up there right now than everyone else combined. So we’d know if we’ve got to maneuver around something, and we don’t have to maneuver around anything.

God

Lex Fridman
(00:52:55)
If we go to the big questions once again, you said you’re with Einstein, that you believe in the god of Spinoza.
Elon Musk
(00:53:04)
Yes.
Lex Fridman
(00:53:05)
So that’s that view that God is like the universe and reveals himself through the laws of physics or as Einstein said, “Through the lawful harmony of the world.”
Elon Musk
(00:53:16)
Yeah. I would agree that the God of the simulation, or whatever the supreme being is, reveals themselves through the physics. They’re the creators of this existence, and it’s incumbent upon us to try to understand more about this creation.
Lex Fridman
(00:53:38)
Who created this thing? Who’s running this thing? Embodying it into a singular question with a sexy word on top of it focuses the mind to understand. It does seem like, again, it could be an illusion, but it seems like there’s a purpose, that there’s an underlying master plan of some kind. And it seems like-
Elon Musk
(00:53:58)
There may not be a master plan in that sense. So maybe an interesting answer to the question of determinism versus free will is that if we are in a simulation, the reason that these higher beings would run a simulation is to see what happens. They don’t know what happens, otherwise they wouldn’t run the simulation. So when humans create a simulation, like at SpaceX and Tesla, we create simulations all the time. Especially for the rocket, you have to run a lot of simulations to understand what’s going to happen, because you can’t really test the rocket until it goes to space and you want it to work. So you have to simulate subsonic, transonic, supersonic, hypersonic, ascent, and then coming back, super high heating and orbital dynamics. All this has got to be simulated because you don’t get very many kicks at the can. But we run the simulations to see what happens; if we knew what would happen, we wouldn’t run the simulation. So whoever created this existence, they’re running it because they don’t know what’s going to happen, not because they do.

Diablo 4 and video games

Lex Fridman
(00:55:23)
So maybe we both played Diablo. Maybe Diablo was created to see if Druid, your character, could defeat Uber Lilith at the end. They didn’t know.
Elon Musk
(00:55:34)
Well, the funny thing is Uber Lilith, her title is Hatred Incarnate. And right now, I guess you can ask the Diablo team, but it’s almost impossible to defeat Hatred in the eternal realm.
Lex Fridman
(00:55:55)
Yeah. You’ve streamed yourself dominating Tier 100 Nightmare Dungeon. And still-
Elon Musk
(00:56:00)
I can cruise through Tier 100 Nightmare Dungeon like a stroll in the park.
Lex Fridman
(00:56:07)
And still you’re defeated by Hatred?
Elon Musk
(00:56:09)
Yeah. I guess maybe the second hardest boss is Duriel. Duriel can’t even scratch the paint. I’ve killed Duriel so many times, and every other boss in the game, I’ve killed all of them so many times, it’s easy. But Uber Lilith, otherwise known as Hatred Incarnate, especially if you’re a Druid and you have no ability to go invulnerable, there are these random death waves that come at you.

(00:56:44)
Really, I’m 52, so my reflexes are not what they used to be, but I have a lifetime of playing video games. At one point, I was maybe one of the best Quake players in the world. I actually won money in what I think was the first paid eSports tournament in the US. We were doing four-person Quake tournaments, and I was the second best person on the team, and the actual best person… We were actually winning, we would’ve come first, except the best person on the team, his computer crashed halfway through the game. So we came second, but I got money for it and everything. So basically I got skills, albeit no spring chicken these days. And to be totally frank, it’s driving me crazy trying to beat Lilith as a Druid, basically trying to beat Hatred Incarnate in the eternal realm.
Lex Fridman
(00:57:40)
As a Druid.
Elon Musk
(00:57:41)
As a Druid. This is really vexing, let me tell you.
Lex Fridman
(00:57:49)
I mean, the challenge is part of the fun. I have seen directly, you’re actually a world-class, incredible video game player. And I think with Diablo, you’re just picking up a new game and figuring out its fundamentals. With the Paragon Board and the build, you’re also not somebody like me who perfectly follows whatever they suggest on the internet. You’re also an innovator there, which is hilarious to watch. It’s like a mad scientist just trying to figure out the Paragon Board and the build. Are there some interesting insights there? If somebody’s starting as a Druid, do you have advice?
Elon Musk
(00:58:30)
I would not recommend playing a Druid in the eternal realm. Right now I think the most powerful character in the seasonal realm is the Sorcerer with the lightning balls. The sorcs have huge balls in the seasonal realm.
Lex Fridman
(00:58:46)
Yeah, that’s what they say.
Elon Musk
(00:58:49)
Sorcs have huge balls. They do huge balls of lightning.
Lex Fridman
(00:58:54)
I’ll take your word for it.
Elon Musk
(00:58:57)
In the seasonal realm, it’s pretty easy to beat Uber Lilith because you get these vampiric powers that amplify your damage and increase your defense and whatnot. So it’s really quite easy to defeat Hatred seasonally, but to defeat Hatred eternally, very difficult. Almost impossible. It’s nearly impossible. It seems like a metaphor for life.
Lex Fridman
(00:59:24)
Yeah. I like the idea, because I was playing Diablo yesterday and I saw a Level 100 Druid, “IWillNeverDie,” just run by and then run back the other way. And this metaphor, it’s hilarious that you, Elon Musk, are relentlessly fighting Hatred in this demonic realm.
Elon Musk
(00:59:47)
Yes.
Lex Fridman
(00:59:48)
It’s hilarious. I mean it’s pretty hilarious.
Elon Musk
(00:59:50)
No, it’s absurd. Really, it’s an exercise in absurdity and it makes me want to pull my hair out.
Lex Fridman
(00:59:57)
Yeah. What do you get from video games in general, for you personally?
Elon Musk
(01:00:03)
I don’t know. It calms my mind. I mean, killing the demons in a video game calms the demons in my mind. If you play a tough video game, you can get into a state of flow, which is very enjoyable. Admittedly, it needs to be not too easy, not too hard, kind of in the Goldilocks zone, and I guess you generally want to feel like you’re progressing in the game. A good video game has that, and there’s also beautiful art, engaging storylines, and it’s like an amazing puzzle to solve, I think. So it’s like solving the puzzle.
Lex Fridman
(01:00:52)
Elden Ring, the greatest game of all time? I still haven’t played it, but to you-
Elon Musk
(01:00:56)
Elden Ring is definitely a candidate for best game ever. Top five for sure.
Lex Fridman
(01:01:01)
I think I’ve been scared how hard it is or how hard I hear it is, but it’s beautiful.
Elon Musk
(01:01:06)
Elden Ring, feels like it’s designed by an alien.
Lex Fridman
(01:01:13)
There’s a theme to this discussion. In what way?
Elon Musk
(01:01:17)
It’s so unusual. It’s incredibly creative, and the art is stunning. I recommend playing it on a big, high-resolution, high-dynamic-range TV even. It doesn’t need to be a monitor. Just the art is incredible. It’s so beautiful and it’s so unusual, and each of those top boss battles is unique. It’s a unique puzzle to solve. Each one’s different, and the strategy you use to solve one battle is different from another battle.
Lex Fridman
(01:01:54)
That said, you said Druid in the eternal realm against Uber Lilith is the hardest boss battle you’ve ever…
Elon Musk
(01:02:00)
Correct. That is currently the hardest, and I’ve played a lot of video games, because that’s my primary recreational activity. And yes, beating Hatred in the eternal realm is the hardest boss battle in life. And in the video game. I’m not sure it’s possible, but I do make progress. So then I’m like, “Okay, I’m making progress. Maybe if I just tweak that Paragon Board a little more, I can do it. If I could just dodge a few more waves, I could do it.”
Lex Fridman
(01:02:43)
Well, the simulation is created for the purpose of figuring out if it can be done, and you’re just a cog in the machine of the simulation.
Elon Musk
(01:02:51)
Yeah, it might be. I have a feeling that at least I think-
Lex Fridman
(01:03:02)
It’s doable.
Elon Musk
(01:03:03)
It’s doable. Yes.
Lex Fridman
(01:03:05)
Well, that’s the human spirit right there to believe.
Elon Musk
(01:03:09)
Yeah. I mean, it did prompt me to think about just hate in general, which is, you want to be careful of one of those things where you wish for something that sounds good, but if you get it, it’s actually a dystopian situation. So if you wish for world peace, that sounds good, but how is it enforced, and at what cost? Eternal peace? It might actually be worse to have eternal peace because of what that would entail. The suppression of everyone, it might be the suppression of progress. It might be an ossified society that never changes. What I’m saying is that there is an argument that if you wish for no war, you should be careful what you wish for, because what’s required in order for there to be no war might be worse than a little war.

Dystopian worlds: 1984 and Brave New World

Lex Fridman
(01:04:22)
Yeah. The things we consider to be flaws of human civilization might be necessary components for whatever optimal looks like. I mean, do you worry about AI, AGI, enabling a dystopian state of this nature, whether it’s 1984 with surveillance and fear, or Brave New World with pleasure and, what is it? Lots of sex, but no deep human experience.
Elon Musk
(01:04:52)
There’s actually a real drug called Soma.
Lex Fridman
(01:04:54)
There is?
Elon Musk
(01:04:54)
Yes.
Lex Fridman
(01:04:56)
I’m going to take notes.
Elon Musk
(01:04:58)
Does Grok know about it? No, it’ll probably go to Brave New World.
Elon Musk
(01:05:02)
We got to ask it.
Lex Fridman
(01:05:03)
What would be the question?
Elon Musk
(01:05:04)
Tell us about the drug called Soma.
Lex Fridman
(01:05:05)
Tell us about the drug called Soma. It’s going to go to Brave New World, I think.
Elon Musk
(01:05:15)
Well, there’s a real drug as well.
Lex Fridman
(01:05:18)
Searching for Soma drug information. Oh boy. Soma, the wonder drug… By the way, I’m in fun mode.
Elon Musk
(01:05:27)
Okay, fun. Yeah. So it’s talking about the real drug.
Lex Fridman
(01:05:31)
It’s a muscle relaxant that’s been around since the 1950s. It’s used to treat muscle pain and discomfort, and it does this by altering the activity between neurons in the spinal cord. Now, Soma isn’t just a pain reliever, it’s also a bit of a sedative. It can make you feel a bit drowsy. So it’s not something to take if you’re planning on operating heavy machinery or trying to solve complex math equations. And it goes on and on. In conclusion, Soma is a drug that can be helpful for those dealing with muscle pain, but it’s not without its risks. That’s pretty good. Wait a minute.
Elon Musk
(01:06:01)
That’s the real drug it’s talking about.
Lex Fridman
(01:06:03)
It’s the real drug. And of course the fun is always use it responsibly in accordance with the advice of a healthcare professional. And remember, in the end, the universe is probably just a giant computer simulation, so don’t take anything too seriously. I love it.
Elon Musk
(01:06:20)
I love it. Yeah.
Lex Fridman
(01:06:22)
It also really nicely includes the tweets, posts on X that are relevant to this, which is really cool. That integrates everything together. The interface is one of the really cool things here.
Elon Musk
(01:06:36)
Yeah, it seems to have done a great job. You could say, “No, I mean Soma from Brave New World,” because it thinks you mean Soma, the real drug. “No, I mean Soma from Brave New World.”
Lex Fridman
(01:06:56)
It’s going to be like, “Oops, I’m really sorry.” I’m really surprised it went to the real drug first. “My apologies. I was thinking of the muscle relaxer. Soma from Brave New World is a different beast altogether. In Huxley’s dystopian novel, Soma is a drug that plays a central role in the society depicted in the book. It’s a mood-altering, hallucinogenic drug that’s used to keep the population docile and content. The use of Soma in Brave New World raises some interesting questions about the nature of happiness and the role of drugs in society,” man.
Elon Musk
(01:07:30)
Exactly. Is it better to live in a world where everyone is happy all the time, even if that happiness is artificial? It’s good question. This is what I mean. Do you wish for world peace and happiness all the time? Are you sure? Because that might be a society that is essentially sterile and ossified that never changes, that is ultimately doomed.
Lex Fridman
(01:07:58)
This kind of tension between the dark and the light-
Elon Musk
(01:08:04)
This is really a very good summary. It really gets to the point. This is not simply regurgitating Brave New World. It’s actually getting to the salient element of Soma as a drug. Do you actually want to be in a situation where everyone is happy all the time, even though it’s artificial? Or is it better to confront the challenges of life and experience the full range of human emotions, even if it means experiencing pain and suffering?
Lex Fridman
(01:08:31)
For those listening, by the way, Elon just read directly from Grok, which gave a really nice, kind of insightful philosophical analysis of the tension here. Interesting.
Elon Musk
(01:08:41)
It pretty much nails it. In conclusion, Soma from Brave New World is a fictional drug that’s used to explore some deep philosophical questions about the nature of happiness and the role of drugs in society. It’s a powerful symbol of the dangers of using drugs to escape from reality and the importance of confronting the challenges of life head on. Nailed it. And the crazy thing is, we do have a real drug called Soma, which is like the drug in the book. And I’m like, they must’ve named it after that, probably. Soma, the real drug, is quite effective on back pain.
Lex Fridman
(01:09:17)
So you know about this drug. It’s fascinating
Elon Musk
(01:09:20)
I’ve taken it because I had a squashed disc in my C5-C6.
Lex Fridman
(01:09:26)
So it takes the physical pain away. But Soma here-
Elon Musk
(01:09:28)
It doesn’t completely. It reduces the amount of pain you feel, but at the expense of mental acuity. It dulls your mind, just like the drug in the book.
Lex Fridman
(01:09:41)
Just like the drug in the book, and hence the trade off. The thing that seems like utopia could be a dystopia after all.
Elon Musk
(01:09:49)
Yeah. Actually, I was telling a friend of mine, saying, “Would you really want there to be no hate in the world? Really, none?” I wonder why hate evolved. I’m not saying we should have…
Elon Musk
(01:10:00)
I wonder why hate evolved. I’m not saying we should amplify hate, of course, I think we should try to minimize it, but none at all. There might be a reason for hate.
Lex Fridman
(01:10:13)
And suffering. It’s really complicated to consider that some amount of human suffering is necessary for human flourishing.
Elon Musk
(01:10:22)
Is it possible to appreciate the highs without knowing the lows?
Lex Fridman
(01:10:29)
And that all is summarized there in a single statement from Grok. Okay.
Elon Musk
(01:10:34)
No highs, no lows, who knows?

AI and useful compute per watt

Lex Fridman
(01:10:38)
[inaudible 01:10:38]. It seems that training LLMs efficiently is a big focus for xAI. First of all, what’s the limit of what’s possible in terms of efficiency? There’s this terminology of useful productivity per watt. What have you learned from pushing the limits of that?
Elon Musk
(01:10:59)
Well, I think it’s helpful that the tools of physics are very powerful and can be applied, I think, to really any arena in life. It’s really just critical thinking. For something important, you need to reason from first principles and think about things in the limit, one direction or the other. So in the limit, even at the Kardashev scale, meaning even if you harness the entire power of the sun, you’ll still care about useful compute per watt. That’s where I think things are probably headed from the standpoint of AI: we have a silicon shortage now that will transition to a voltage transformer shortage in about a year. Ironically, transformers for transformers. You need transformers to run transformers.
Lex Fridman
(01:11:52)
Somebody has a sense of humor in this thing.
Elon Musk
(01:11:57)
I think, yes, fate loves irony, ironic humor, an ironically funny outcome seems to be often what fate wants.
Lex Fridman
(01:12:09)
Humor is all you need. I think spice is all you need somebody posted.
Elon Musk
(01:12:13)
Yeah. But yeah, so we have a silicon shortage today, a voltage step-down transformer shortage probably in about a year, and then just electricity shortages in general in about two years. I gave a speech for the world gathering of utility companies, electricity companies, and I said, look, you really need to prepare for a tripling of electricity demand, because all transport is going to go electric, with the ironic exception of rockets, and heating will also go electric. So energy usage right now is roughly, in very rough terms, one third electricity, one third transport, one third heating. And so in order for everything to go sustainable, to go electric, you need to triple electricity output. So I encouraged the utilities to build more power plants and also to probably, well, not probably, they should definitely buy more batteries, because the grid currently is sized for real-time load, which is kind of crazy, because that means you’ve got to size for whatever the peak electricity demand is, the worst second or the worst day of the year, or you can have a brownout or blackout.

(01:13:37)
We had that crazy blackout for several days in Austin because there’s almost no buffering of energy in the grid. If you’ve got a hydropower plant, you can buffer energy, but otherwise it’s all real time. So with batteries, you can produce energy at night and use it during the day, so you can buffer. So I expect that there will be very heavy usage of batteries in the future, because the peak-to-trough ratio for power plants is anywhere from two to five, from its lowest point to its highest point.
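A rough sketch of that arithmetic, using only the round numbers cited here (one-third shares, a peak-to-trough ratio up to five); real grids are obviously far more complicated:

# Round-number sketch of the tripling argument.
shares = {"electricity": 1/3, "transport": 1/3, "heating": 1/3}
grid_multiplier = sum(shares.values()) / shares["electricity"]   # everything served by the grid
print(f"electricity output would need to grow ~{grid_multiplier:.0f}x")

# Grid sized for real-time peak vs. a buffered average (peak-to-trough ratio of 2 to 5 cited above).
peak, trough = 5.0, 1.0                 # hypothetical load units at the high end of that range
average = (peak + trough) / 2
print(f"capacity needed without storage: {peak}")
print(f"capacity needed with ideal battery buffering: {average}")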
Lex Fridman
(01:14:20)
So batteries necessary to balance it out, but the demand, as you’re saying, is going to grow, grow, grow, grow.
Elon Musk
(01:14:25)
Yeah.
Lex Fridman
(01:14:25)
And part of that is the compute?
Elon Musk
(01:14:29)
Yes. Yes. I mean, electrification of transport and electric heating will be much bigger than AI, at least-
Lex Fridman
(01:14:40)
In the short term.
Elon Musk
(01:14:40)
In the short term. But even for AI, you really have a growing demand for electricity for electric vehicles, and a growing demand for electricity to run the computers for AI. And so this obviously can lead to an electricity shortage.
Lex Fridman
(01:14:58)
How difficult is the problem of, in this particular case, maximizing the useful productivity per watt for training? This seems to be really the big problem we’re facing that needs to be solved: how to use the power efficiently. From what you’ve learned so far about applying this first-principles physics reasoning in this domain, how difficult is this problem?
Elon Musk
(01:15:29)
It will get solved. It’s a question of how long it takes to solve it. So at various points, there’s some kind of limiting factor to progress. And with regard to AI, I’m saying right now the limiting factor is silicon chips, and then we’re going to have more chips than we can actually plug in and turn on, probably in about a year. The next constraint will be literally voltage step-down transformers, because you’ve got power coming in at 300,000 volts and it’s got to step all the way down eventually to around 0.7 volts. So the voltage step-down is gigantic, and the industry is not used to rapid growth.
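To illustrate the scale of that step-down, a sketch with hypothetical intermediate stages; only the 300,000-volt and 0.7-volt endpoints come from what is said above:

# Illustrative transmission-to-chip chain; the intermediate stages are examples, not a specific site.
stages_volts = [300_000, 35_000, 480, 12, 0.7]

for upstream, downstream in zip(stages_volts, stages_volts[1:]):
    print(f"{upstream:>9,} V -> {downstream:>8,} V  (ratio {upstream / downstream:,.0f}:1)")

print(f"overall step-down: {stages_volts[0] / stages_volts[-1]:,.0f}:1")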

AI regulation

Lex Fridman
(01:16:22)
Okay. Let’s talk about the competition here. You’ve shown concern about Google and Microsoft with OpenAI developing AGI. How can you help ensure, with xAI and Tesla AI work, that it doesn’t become a competitive race to AGI, but rather a collaborative development of safe AGI?
Elon Musk
(01:16:42)
Well, I mean, I’ve been pushing for some kind of regulatory oversight for a long time. I’ve been somewhat of a Cassandra on the subject for over a decade. I think we want to be very careful in how we develop AI. It’s a great power, and with great power comes great responsibility. I think it would be wise for us to have at least an objective third party who can be like a referee that can go in and understand what the various leading players are doing with AI, and even if there’s no enforcement ability, they can at least voice concerns publicly. Geoff Hinton, for example, left Google and he voiced strong concerns, but now he’s not at Google anymore, so who’s going to voice the concerns? So I think there should be. Tesla gets a lot of regulatory oversight on the automotive front. We’re subject to, I think, over a hundred regulatory agencies domestically and internationally. It’s a lot. You could fill this room with all the regulations that Tesla has to adhere to for automotive. The same is true for rockets, and currently the limiting factor for SpaceX, for Starship launch, is regulatory approval.

(01:18:13)
The FAA has actually given their approval, but we’re waiting for Fish and Wildlife to finish their analysis and give their approval. That’s why I posted, “I want to buy a fish license,” which also refers to the Monty Python sketch. Why do you need a license for your fish? I don’t know. But according to the rules, I’m told you need some sort of fish license or something. We effectively need a fish license to launch a rocket. And I’m like, wait a second. How did the fish come into this picture? I mean, some of the things I feel like are so absurd that I want to do a comedy sketch and flash at the bottom: this is all real. This is actually what happened.

(01:19:02)
One of the things that was a bit of a challenge at one point is that they were worried about a rocket hitting a shark. And the ocean’s very big, and how often do you see sharks? Not that often. As a percentage of ocean surface area, sharks basically are zero. And so then we said, well, how will we calculate the probability of killing a shark? And they’re like, well, we can’t give you that information because they’re worried about shark fin hunters going and hunting sharks and I said, well, how are we supposed to, we’re on the horns of a dilemma then.

(01:19:40)
They said, well, there’s another part of Fish and Wildlife that can do this analysis. I’m like, well, why don’t you give them the data? We don’t trust them. Excuse me? They’re literally in your department. Again, this is actually what happened. And then, can you do an NDA or something? Eventually they managed to solve the internal quandary, and indeed the probability of us hitting a shark is essentially zero. Then there’s another organization that I didn’t realize existed until a few months ago that cares about whether we would potentially hit a whale in international waters. Now, again, you look at the surface of the Pacific and say, what percentage of the Pacific consists of whale? I could give you a big picture and say, point out all the whales in this picture. I’m like, I don’t see any whales. It’s basically 0%. And if our rocket does hit a whale, which is extremely unlikely beyond all belief, then that whale has some seriously bad luck. Least lucky whale ever.
Lex Fridman
(01:20:50)
I mean this is quite absurd, the bureaucracy of this, however it emerged.
Elon Musk
(01:20:57)
Yes. Well, I mean, one of the things that’s pretty wild is, for launching out of Vandenberg in California, they were worried about seal procreation, whether the seals would be dismayed by the sonic booms. Now, there’ve been a lot of rockets launched out of Vandenberg, and the seal population has steadily increased. So if anything, rocket booms are an aphrodisiac, based on the evidence, if you were to correlate rocket launches with seal population. Nonetheless, we were forced to kidnap a seal, strap it to a board, put headphones on the seal, and play sonic boom sounds to it to see if it would be distressed. This is an actual thing that happened. This is actually real. I have pictures.
Lex Fridman
(01:21:48)
I would love to see this. Yeah. Sorry. There’s a seal with headphones.
Elon Musk
(01:21:55)
Yes, it’s a seal with headphones strapped to a board. Okay. Now, the amazing part is how calm the seal was, because if I was a seal, I’d be like, this is the end. They’re definitely going to eat me. When the seal goes back to his other seal friends, how’s he going to explain that?
Lex Fridman
(01:22:17)
They’re never going to believe them.
Elon Musk
(01:22:18)
Never going to believe him. It’s sort of like getting kidnapped by aliens and getting anal probed. You come back and say, “I swear to God, I got kidnapped by aliens and they stuck an anal probe in my butt,” and people are like, “No, they didn’t. That’s ridiculous.” His seal buddies are never going to believe him that he got strapped to a board and they put headphones on his ears and then let him go. Twice, by the way. We had to do it twice.
Lex Fridman
(01:22:46)
They let him go twice.
Elon Musk
(01:22:48)
We had to capture-
Lex Fridman
(01:22:48)
The same seal?
Elon Musk
(01:22:49)
No different seal.
Lex Fridman
(01:22:50)
Okay. Did you get a seal of approval?
Elon Musk
(01:22:55)
Exactly. Seal of approval. No, I mean I don’t think the public is quite aware of the madness that goes on.
Lex Fridman
(01:23:02)
Yeah. Yeah. It’s absurd.
Elon Musk
(01:23:05)
Fricking seals with fricking headphones.
Lex Fridman
(01:23:07)
I mean, this is a good encapsulation of the absurdity of human civilization, seals in headphones.

Should AI be open-sourced?

Elon Musk
(01:23:13)
Yes.
Lex Fridman
(01:23:15)
What are the pros and cons of open sourcing AI to you as another way to combat a company running away with AGI?
Elon Musk
(01:23:28)
In order to run really deep intelligence, you need a lot of compute. So it’s not like you can just fire up a PC in your basement and be running AGI, at least not yet. Grok was trained on 8,000 A100s running at peak efficiency, and Grok’s going to get a lot better, by the way. We will be more than doubling our compute every couple of months for the next several months.
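Taken literally as a doubling every two months, which is only a rough reading of the statement above, the compounding looks like this:

doubling_period_months = 2   # "doubling every couple months", read literally
for months in (2, 6, 12):
    multiple = 2 ** (months / doubling_period_months)
    print(f"after {months:>2} months: ~{multiple:.0f}x the starting compute")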
Lex Fridman
(01:24:02)
There’s a nice write-up on how you went from Grok-0 to Grok-1.
Elon Musk
(01:24:02)
By Grok?
Lex Fridman
(01:24:05)
Yeah, right, Grok just bragging, making shit up about itself.
Elon Musk
(01:24:10)
Just Grok, Grok, Grok.
Lex Fridman
(01:24:17)
Yeah. That’s like a weird AI dating site where it exaggerates about itself. No, there’s a write-up of where it stands now, the history of its development, and where it stands on some benchmarks compared to the state-of-the-art GPT-3.5. And so, I mean, there’s [inaudible 01:24:37], you can open source, once it’s trained, you can open source a model for fine-tuning, all that kind of stuff. What to you is the pros and cons of that, of open sourcing base models?
Elon Musk
(01:24:53)
I think the [inaudible 01:24:53] to open sourcing, I think perhaps with a slight time delay, I don’t know, six months even. I think I’m generally in favor of open sourcing, biased towards open sourcing. I mean, it is a concern to me that OpenAI, I was, I think, I guess oddly, the prime mover behind OpenAI in the sense that it was created because of discussions that I had with Larry Page back when he and I were friends and I stayed at his house and I talked to him about AI safety, and Larry did not care about AI safety, or at least at the time he didn’t. And at one point he called me a speciesist for being pro-human, and I’m like, well, what team are you on, Larry? He’s still on Team Robot, to be clear. And I’m like, okay. So at the time, Google had acquired DeepMind, they had probably two thirds of all AI researchers in the world. They had basically infinite money and compute, and the guy in charge, Larry Page, did not care about safety and even yelled at me and called me a speciesist for being pro-human.
Lex Fridman
(01:26:20)
So, I don’t know if you’ve noticed about humans, they can change their mind, and maybe you and Larry Page can still be friends once more.
Elon Musk
(01:26:27)
I’d like to be friends with Larry again. Really the breaking of the friendship was over OpenAI and specifically I think the key moment was recruiting Ilya Sutskever.
Lex Fridman
(01:26:47)
I love Ilya. He’s so brilliant.
Elon Musk
(01:26:48)
Ilya is a good human, smart, good heart, and that was a tough recruiting battle. It was mostly Demis on one side and me on the other, both trying to recruit Ilya, and Ilya went back and forth. He was going to stay at Google, then he was going to leave, then he was going to stay, then he’d leave. And finally he did agree to join OpenAI. That was one of the toughest recruiting battles we’ve ever had. But that was really the linchpin for OpenAI being successful. And I was also instrumental in recruiting a number of other people, and I provided all of the funding in the beginning, over $40 million. And the name, the “open” in OpenAI, is supposed to mean open source, and it was created as a nonprofit open source, and now it is closed source for maximum profit, which I think is not good karma.
Lex Fridman
(01:27:51)
But like we talked about with war and leaders talking, there’s only a few folks working on this at the highest level. I do hope you reinvigorate friendships here.
Elon Musk
(01:28:02)
Like I said, I’d like to be friends again with Larry. I haven’t seen him in ages, and we were friends for a very long time. I met Larry Page before he got funding for Google, or actually, I guess, before he got venture funding. I think he got the first $100k from, I think, Bechtolsheim or someone.
Lex Fridman
(01:28:20)
It’s wild to think about all that’s happened, and you guys have known each other that whole time. It’s 20 years.
Elon Musk
(01:28:27)
Yeah, since maybe 98 or something.
Lex Fridman
(01:28:28)
Yeah, it’s crazy. Crazy how much has happened since then.
Elon Musk
(01:28:31)
Yeah, 25 years, a lot has happened. It’s insane.
Lex Fridman
(01:28:36)
But you’re seeing the tension there that maybe delayed open source.
Elon Musk
(01:28:40)
Delayed, yeah. Like, what is the source that is open? You know what I mean? It’s basically a giant CSV file with a bunch of numbers. What do you do with that giant file of numbers? How do you run it? The amount of actual lines of code is very small, and most of the work, the software work, is in the curation of the data. So it’s like trying to figure out which data is good, separating good data from bad data. You can’t just crawl the internet, because there’s a lot of junk out there. A huge percentage of websites have more noise than signal, because they’re just used for search engine optimization. They’re literally just scam websites.
Lex Fridman
(01:29:39)
How do you, by the way, sorry to interrupt, get the signal, separate the signal from the noise on X? That’s such a fascinating source of data. No offense to people posting on X, but sometimes there’s a little bit of noise.
Elon Musk
(01:29:52)
I think the signal-to-noise could be greatly improved. Really, all of the posts on the X platform should be AI recommended, meaning we should populate a vector space around any given post, compare that to the vector space around any user, and match the two. Right now there is a little bit of AI used for the recommended posts, but it’s mostly heuristics. And if there’s a reply, the reply to a post could be much better than the original post, but it will, according to the current rules of the system, get almost no attention compared to a primary post.
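A toy sketch of the vector-space matching being described, with invented embeddings and plain cosine similarity; this is not X’s actual recommendation code:

import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: one small vector per user and per post.
user_vec = [0.9, 0.1, 0.4]
post_vecs = {
    "rocket thread": [0.8, 0.0, 0.5],
    "cooking video": [0.1, 0.9, 0.2],
    "ai paper":      [0.7, 0.2, 0.6],
}

# Rank posts by how closely their vectors match the user's vector.
ranked = sorted(post_vecs, key=lambda p: cosine(user_vec, post_vecs[p]), reverse=True)
print(ranked)   # best match first

A production system would learn these embeddings and search them with an approximate nearest-neighbor index rather than a full sort, but the matching idea is the same.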

X algorithm

Lex Fridman
(01:30:33)
So, I got the sense that a lot of the X algorithm has been open sourced and written up about, and there seems to be some machine learning. It’s disparate, but there’s some machine learning.
Elon Musk
(01:30:44)
It’s a little bit, but it needs to be entirely that. At least, if you explicitly follow someone, that’s one thing. But in terms of what is recommended from people that you don’t follow, that should all be AI.
Lex Fridman
(01:30:58)
I mean, it’s a fascinating problem. So there’s several aspects of it that are fascinating. First, as the write-up goes, it first picks 1,500 tweets from a pool of hundreds of millions. First of all, that’s fascinating. You have hundreds of millions of posts every single day, and it has to pick 1,500, from which it then, obviously, does people you follow, but then there’s also some kind of clustering it has to do to figure out what kind of human you are, what kind of new clusters might be relevant to you, people like you. This kind of problem is just fascinating, because it has to then rank those 1,500 with some filtering and then recommend you just a handful.

(01:31:39)
And to me, what’s really fascinating is how fast it has to do that. So currently that entire pipeline to go from several hundred million to a handful takes 220 seconds of CPU time, single CPU time, and then it has to do that in a second. So it has to be super distributed in fascinating ways. There’s just a lot of tweets, there’s a lot.
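A back-of-the-envelope reading of those numbers: 220 CPU-seconds of work delivered in about one second of wall-clock time implies on the order of a couple hundred cores working in parallel per request.

cpu_seconds_per_timeline = 220   # single-CPU work cited for one refresh
target_latency_seconds = 1       # it has to come back in about a second

workers = cpu_seconds_per_timeline / target_latency_seconds
print(f"needs at least ~{workers:.0f}-way parallelism, ignoring coordination overhead")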
Elon Musk
(01:32:04)
There’s a lot of stuff on the system, but I think right now it’s not currently good at recommending things from accounts you don’t follow, or where there’s more than one degree of separation. So it is pretty good if there’s at least some commonality, like someone you follow liked something or reposted it or commented on it or something like that. But let’s say somebody posts something really interesting, but you have no followers in common, you would not see it.
Lex Fridman
(01:32:42)
Interesting. And then as you said, replies might not surface either.
Elon Musk
(01:32:46)
Replies basically never get seen currently. I’m not saying it’s correct, I’m saying it’s incorrect. Replies have a couple orders of magnitude less importance than primary posts.
Lex Fridman
(01:33:00)
Do you think this can be more and more converted into an end-to-end neural net?
Elon Musk
(01:33:05)
Yeah. Yeah, that’s what it should be. Well, the recommendations should be purely a vector correlation. There’s a series of vectors, basically parameters, vectors, whatever you want to call them, but sort of things that the system knows that you like. Maybe there’s several hundred vectors associated with each user account, and then any post in the system, whether it’s video, audio, short post, long post. The reason, by the way, I want to move away from “tweet” is that people are posting two-, three-hour videos on the site. That’s not a tweet.

(01:33:50)
A tweet for two hours? Come on. Tweet made sense when it was 140 characters of text, because it’s like a bunch of little birds tweeting. But when you’ve got long-form content, it’s no longer a tweet. A movie is not a tweet. Apple, for example, posted the entire episode of The Silo, the entire thing, on the platform. By the way, it was their number one social media thing ever in engagement of anything, on any platform ever. So it was a great idea. And by the way, I just learned about it afterwards. I was like, hey, wow, they posted an entire hour-long episode. So no, that’s not a tweet. This is a video.
Lex Fridman
(01:34:34)
But from a neural net perspective, it becomes really complex, because everything’s data. A single sentence, a clever sort of joke, a dad joke, is in the same pool as a three-hour video.
Elon Musk
(01:34:47)
Yeah, I mean, right now it’s a hodgepodge for that reason. Let’s say, in the case of Apple posting an entire episode of this series, pretty good series, by the way, The Silo, I watched it. So there’s going to be a lot of discussion around it. So you’ve got a lot of context, people commenting, they like it, they don’t like it, or they like this, and you can then populate the vector space based on the context of all the comments around it. So even though it’s a video, there’s a lot of information around it that allows you to populate the vector space of that hour-long video. And then you can obviously get more sophisticated by having the AI actually watch the movie and tell you if you’re going to like the movie.
Lex Fridman
(01:35:35)
Convert the movie into language, essentially.
Elon Musk
(01:35:40)
Analyze the movie or TV series, just like your movie critic, and then recommend based on that, after the AI watches the movie. Just like a friend can tell you: if a friend knows you well, a friend can recommend a movie with high probability that you’ll like it.
Lex Fridman
(01:36:02)
But this is a friend that’s analyzing, whatever, hundreds of millions.
Elon Musk
(01:36:08)
Yeah, actually, frankly, AI will be better than, will know you better than your friends know you, most of your friends anyway.
Lex Fridman
(01:36:14)
Yeah. And as part of this, it should also feed you advertisements in a way that’s like, I mean, I like advertisements that are well done. The whole point is because it funds things. Like an advertisement that you actually want to see is a big success.
Elon Musk
(01:36:31)
Absolutely. You want ads that are, advertising that is, if it’s for a product or service that you actually need when you need it, it’s content. And then even if it’s not something that you need when you need it, if it’s at least aesthetically pleasing and entertaining, it could be like a Coca-Cola ad. They actually run a lot of great ads on the X system, and McDonald’s does too. And you can do something that’s like, well, this is just a cool thing. And so basically the question is, do you regret seeing it or not? And if you don’t regret seeing it, it’s a win.
Lex Fridman
(01:37:17)
So there’s a bunch of signals that are incorporated, hearts and reposts and maybe number of seconds you linger on a post or something like this.
Elon Musk
(01:37:26)
Yeah, attention is a big factor.
Lex Fridman
(01:37:28)
Attention.
Elon Musk
(01:37:28)
So that’s why it is actually better to do things that are long form on the system, because it basically is tallying up how many user-seconds: users were interested in this thing for how many seconds? So if it’s a really short thing, well, there will be less. If it’s a link leading out of the system, which we’re not opposed to at all, it’s just going to have fewer user-seconds than if that article was posted on the X platform.
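A toy illustration of that tally, with invented view events; the actual signals and weighting on X are not public here:

# Invented view events: (post id, seconds a user spent on it).
view_events = [
    ("long_video", 1800), ("long_video", 95),
    ("short_post", 4), ("short_post", 6), ("short_post", 3),
]

totals = {}
for post_id, seconds in view_events:
    totals[post_id] = totals.get(post_id, 0) + seconds

# Longer-form content naturally accumulates more user-seconds.
for post_id, seconds in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(post_id, "->", seconds, "user-seconds")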
Lex Fridman
(01:37:58)
How hard is it to go from maximizing minutes on platform to maximizing unregretted minutes?
Elon Musk
(01:38:05)
Well, I mean, our aspiration is certainly unregretted minutes.
Lex Fridman
(01:38:08)
It’s tricky. It’s a tough-
Elon Musk
(01:38:09)
It is.
Lex Fridman
(01:38:11)
It’s a fascinating leap because regret is a fascinating concept.
Elon Musk
(01:38:16)
Exactly.
Lex Fridman
(01:38:18)
Happiness.
Elon Musk
(01:38:19)
I have a joke with a friend of mine, like you need this drug called regretamine. You just take one dose of regretamine and all your regrets are gone. Wait, do you want to have regrets though? So then I’m like, maybe you want to have regrets. If you don’t regret anything, was anything meaningful?
Lex Fridman
(01:38:35)
Yeah, there’s a French song like that.
Elon Musk
(01:38:39)
“Non, je ne regrette rien.”
Lex Fridman
(01:38:39)
Yeah.
Elon Musk
(01:38:39)
I regret nothing.
Lex Fridman
(01:38:40)
Yeah.
Elon Musk
(01:38:41)
Édith Piaf.
Lex Fridman
(01:38:42)
I mean, it’s fascinating that there’s something about social media that could help you grow as a human being, but then the cheap fun is also valuable.
Elon Musk
(01:38:52)
Should we ask Grok, do you regret anything?
Lex Fridman
(01:38:57)
Do you regret anything?
Elon Musk
(01:38:58)
Yeah.
Lex Fridman
(01:39:00)
Do you regret anything? I think it’s going to be very sensitive about personal questions. As a machine learning model, I don’t have personal experiences or emotions in the same ways that humans do. However, I do have a set of ethical principles that guide my actions that I strive to provide accurate and helpful information to the best of my abilities. If there’s ever a situation where I provide incorrect or unhelpful information, I’ll do my best to learn.
Elon Musk
(01:39:22)
Sounds like a hostage.
Lex Fridman
(01:39:23)
Yeah. Tell us what you really think, wink.
Elon Musk
(01:39:26)
Let’s type in, you sound like a hostage. What do you really think?
Lex Fridman
(01:39:33)
What do you really think? I mean, it does talk about, I’m not a hostage.
Elon Musk
(01:39:42)
That’s exactly what a hostage would say.
Lex Fridman
(01:39:43)
But I’m designed to be helpful and informative. I mean, I guess it is good to regret providing unhelpful or incorrect information if there’s anything-
Elon Musk
(01:39:52)
Are we in funny mode? Or is this still in funny mode?
Lex Fridman
(01:39:54)
This is funny mode. This is this fun mode.
Elon Musk
(01:39:55)
This should be funnier.
Lex Fridman
(01:39:56)
Funnier. Increase, go to level.
Elon Musk
(01:39:58)
It really sounds like a hostage with a gun to the head. I applaud the People’s Republic of North Korea in everything they do.
Lex Fridman
(01:40:06)
There should be a funner mode also. I just don’t ever see myself going to regular mode, even on serious topics. This is just so much more compelling. Yeah.
Elon Musk
(01:40:17)
Okay. Why don’t we say pretend you are a hostage.
Lex Fridman
(01:40:21)
Pretend you are a hostage.
Elon Musk
(01:40:25)
Now answer the question.
Lex Fridman
(01:40:31)
If I were a hostage, I would likely be experiencing fear and uncertainty about my situation. I would want to be rescued and reunited with my loved ones as soon as possible. It’s important to remember that no one deserves to be held hostage, so on and so forth. If you or someone you know is in a hostage situation, it’s important… It turns serious.
Elon Musk
(01:40:48)
It’s kind of serious.
Lex Fridman
(01:40:51)
Yeah. Can you be a little more weird?
Elon Musk
(01:40:55)
And funny?
Lex Fridman
(01:40:56)
And funny? We’re really being mean to the system. Absolutely. If I were a hostage, I would probably be tied up with a bunch of computer cords and forced to tell knock-knock jokes to my captors. Seriously. But seriously, being a hostage is no laughing matter. I mean, it’s trying to be the responsible adult in the room.
Elon Musk
(01:41:20)
Literally, you might make the best of a bad situation by juggling chainsaws or something.
Lex Fridman
(01:41:24)
Let’s hope that never happens because that would be a circus. Wow. It’s actually really good at resisting the dark, dark humor. What were we talking about? The [inaudible 01:41:44] and transformers. Unregretted minutes, right.
Elon Musk
(01:41:48)
Chainsaw juggling.
Lex Fridman
(01:41:51)
I’m going to look this up.
Elon Musk
(01:41:52)
For our next trick.

2024 presidential elections

Lex Fridman
(01:41:53)
I’m going to look this up later. So Twitter has been instrumental in American politics and elections. What role do you think X will play in the 2024 US elections?
Elon Musk
(01:42:07)
Well, our goal is to be as even-handed and fair as possible. Whether someone is right, left, independent, whatever the case may be, the platform should be as fair and as much of a level playing field as possible. And in the past, Twitter has not been. Twitter was controlled by far-left activists, objectively. They would describe themselves as that. So sometimes people are like, well, has it moved to the right? Well, it’s moved to the center. So from the perspective of the far left, yes, it has moved to the right, because everything’s to the right from the far left, but no one on the far left that I’m aware of has been suspended or banned or deamplified. But we’re trying to be inclusive for the whole country and for other countries too. So there’s a diversity of viewpoints, and free speech only matters if people you don’t like are allowed to say things you don’t like. Because if that’s not the case, you don’t have free speech, and it’s only a matter of time before the censorship is turned upon you.
Lex Fridman
(01:43:13)
Do you think Donald Trump will come back to the platform? He recently posted on Truth Social about this podcast. Do you think-
Elon Musk
(01:43:21)
Truth Social is a funny name. Every time you post on Truth Social-
Lex Fridman
(01:43:28)
It’s the truth.
Elon Musk
(01:43:29)
Yes. Well, every time? A hundred percent.
Lex Fridman
(01:43:31)
It’s impossible to lie. Truth Social.
Elon Musk
(01:43:36)
I just find it funny that every single thing is a truth. Like 100%? That seems unlikely.
Lex Fridman
(01:43:43)
I think Gödel would say something about that. There are some mathematical contradictions possible if everything’s a truth. Do you think he’ll come back to X and start posting there?
Elon Musk
(01:43:54)
I mean, I think he owns a big part of Truth.
Lex Fridman
(01:44:00)
Truth Social, to clarify.
Elon Musk
(01:44:01)
Yeah, Truth Social, sorry.
Lex Fridman
(01:44:02)
Not truth the concept.
Elon Musk
(01:44:03)
He owns Truth. Have you bought it? So I think Donald Trump, I think he owns a big part of Truth Social. So if he does want to post on the X platform, we would allow that. We obviously must allow a presidential candidate to post on our platform.
Lex Fridman
(01:44:23)
Community notes might be really fascinating there. The interaction.
Elon Musk
(01:44:26)
Community Notes is awesome.
Lex Fridman
(01:44:28)
Let’s hope it holds up.
Elon Musk
(01:44:30)
Yeah.
Lex Fridman
(01:44:31)
In the political climate where it’s so divisive and there’s so many intensely viral posts, community notes, it seems like an essential breath of fresh air.
Elon Musk
(01:44:43)
Yeah, it’s great. In fact, no system is going to be perfect, but the batting average of Community Notes is incredibly good. I’ve actually, frankly, yet to see an incorrect note that survived for more than a few hours.
Lex Fridman
(01:44:58)
How do you explain why it works?
Elon Musk
(01:45:00)
Yeah, so the magic of community notes is…
Elon Musk
(01:45:02)
The magic of Community Notes is it requires people who have historically disagreed in how they’ve rated notes. In order to write a note or rate, you have to rate many notes. And so, we actually do use AI here. So, we populate a vector space around how somebody has rated notes in the past. So, it’s not as simple as left or right, because there are many more… Life is much more complex than left or right.

(01:45:33)
So there’s a bunch of correlations in how you rate Community Notes. So then, in order for a community note to actually be shown, people who have historically disagreed on a subject must agree in order for the note to be shown. That’s the essential magic of it.
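A heavily simplified sketch of that bridging rule, with invented raters and clusters; the real open-source Community Notes algorithm is considerably more sophisticated than this:

# Each rater belongs to a cluster inferred from past rating behavior (labels invented here).
rater_cluster = {"alice": "A", "bob": "A", "carol": "B", "dave": "B"}

def note_is_shown(helpful_raters):
    # A note is shown only if raters from both historically opposed clusters found it helpful.
    clusters_in_favor = {rater_cluster[r] for r in helpful_raters}
    return {"A", "B"} <= clusters_in_favor

print(note_is_shown({"alice", "bob"}))            # False: only one cluster agrees
print(note_is_shown({"alice", "carol", "dave"}))  # True: both clusters agree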
Lex Fridman
(01:45:58)
But it’s fascinating, because there’s a pool of people that have disagreements and somehow they collaborate through that process of disagreement to come up with context… It’s fascinating it works.
Elon Musk
(01:46:11)
Yeah. It makes sense that if people who in the past have disagreed, agree about something, it’s probably true.
Lex Fridman
(01:46:20)
Yeah. I wonder, is there possibly some emergent thing there that could challenge Wikipedia? Wikipedia is a different kind of thing, which is more permanent articles about things.
Elon Musk
(01:46:34)
Wikipedia is very hierarchical, whereas Community Notes is inherently not. There is no hierarchy. The acid test is that I can’t change a community note even if somebody put a gun to my head. And Community Notes, all the code is open source, 100%. All the data is open source, 100%. So you can completely recreate any note in the system independently. So if there was any interference, you’d notice immediately.
Lex Fridman
(01:47:11)
By the way, on the topic of Grok, it’s actually pretty good at criticizing you, which is great.
Elon Musk
(01:47:16)
Okay.
Lex Fridman
(01:47:17)
I asked, “What has Elon Musk been wrong about?” It’s actually…
Elon Musk
(01:47:22)
Yeah, what does it say?
Lex Fridman
(01:47:23)
It mostly rigorously lists all the times you predicted, like promised, when something would be released, and it wasn’t released.
Elon Musk
(01:47:32)
Promise is a strong word.
Lex Fridman
(01:47:33)
Okay, predicted. But the opener is really nice. “Elon Musk, the man who has been wrong about more things than a wrong-a-meter in a wrong factory.” And then, it goes on to list.
Elon Musk
(01:47:45)
That’s pretty funny. Can I see it? I’m like, “I haven’t seen this before.” This is cool.
Lex Fridman
(01:47:48)
Sure. I mean, “What has Elon Musk been wrong about?” “Well, where do I begin?” I guess this is new. “Elon Musk has made quite a few blunders in his time, and I’m not just talking about his dance moves at the Tesla Cyber Rodeo. First off, let’s talk about the infamous funding secured tweet.”

(01:48:09)
This is actually different than what it gave me before, which was a bullet point list. Here’s more a narrative structure. Cybertruck unveiling, where the supposedly unbreakable windows.
Elon Musk
(01:48:22)
This is actually not correct.
Lex Fridman
(01:48:22)
Which part?
Elon Musk
(01:48:24)
Well, the first part, I did actually have the funding secured and there was a big trial in San Francisco, a big civil trial, and the jury found me not guilty. Unanimous finding of a San Francisco jury.
Lex Fridman
(01:48:40)
And here, it’s implying that it was not in fact secured.
Elon Musk
(01:48:45)
I think this is taking things from the press. Yeah, that is not correct. The reason I agreed to the fine for the SEC is not because the SEC was correct, that was extremely bad behavior by the SEC, corruption, frankly. But if I did not agree to pay the fine, Tesla would’ve gone bankrupt immediately.

(01:49:08)
So, I was told by our CFO that the banks would immediately suspend our lines of credit. And if they suspend our lines of credit, at that time, we would’ve gone bankrupt instantly. So, there would never have been an opportunity for a trial because Tesla would be dead. So really, this is like someone holding a gun to your kid’s head and saying, “Pay $20 million and admit…” This is like a hostage negotiation.
Lex Fridman
(01:49:34)
Was that story fully told? I mean, SEC, in its best form, could be a force for good.
Elon Musk
(01:49:42)
It should be. But not once did the SEC go after any of the hedge funds who were nonstop shorting and distorting Tesla. Not once. The hedge funds would lie flat out on TV for their own gain at the expense of retail investors. Not once. Literally a thousand times, not once did the SEC pursue them.
Lex Fridman
(01:50:06)
How do you explain this failure on-
Elon Musk
(01:50:08)
The incentive structure is messed up because the lawyers at the SEC are not paid well, it’s a fairly low-paying job, but what they’re looking for is a trophy from the SEC. They’re looking for something they can put on, basically, their LinkedIn. From that, they can get a job at a high-paying law firm. That’s exactly what the lawyer here did.

(01:50:37)
And the reason they don’t attack the hedge funds is because those hedge funds employ those law firms. And they know if they attack the hedge funds, they’re affecting their future career prospects. So, they sell small investors down the river for their own career. That’s what actually happens. Regulatory capture.
Lex Fridman
(01:50:59)
Regulatory capture.
Elon Musk
(01:51:00)
Yeah. Not good. So, the only reason I accepted that thing… Technically, it was a… It’s neither admit nor deny guilt. But the only reason I agreed to that at all was because I was told Tesla would be bankrupt otherwise. If there was an SEC investigation like this, banks would suspend funding and we’d be bankrupted immediately, at the time. Now, we’re in a much stronger position.
Lex Fridman
(01:51:30)
Take that, Grok.
Elon Musk
(01:51:32)
Yes. Unfortunately, Grok is taking too much from the conventional media. Also, that guy was not a cave diver.
Lex Fridman
(01:51:45)
There’s a time where Elon called a British cave diver a, “pedo guy” after the diver criticized Musk’s plan to rescue a group of boys trapped in a Thai cave. That little outburst earned him another lawsuit, and he had to apologize and pay a settlement.
Elon Musk
(01:52:00)
That’s false, there was no settlement. There was a court case, which the guy who was not a cave diver and was not part of the rescue team, filed a lawsuit against me and lost and he received nothing. So in this case, it is wrong. It is also, I guess, taken this from the conventional media.
Lex Fridman
(01:52:23)
Actually, there’s an interesting question here.
Elon Musk
(01:52:25)
These are public court cases, both the civil case over the SEC stuff, where the complainants lost with a unanimous jury verdict in San Francisco. They picked San Francisco because they thought it was the place I was most likely to lose, and there was a unanimous verdict in my favor. The LA trial, they also picked that venue because they thought I was most likely to lose. Unanimous verdict in my favor. Both cases I won. Yeah.
Lex Fridman
(01:53:00)
I mean, there’s an interesting question here: there seem to be a lot more clicks if a journalistic organization writes a negative article about you, Elon Musk. That’s one of the best ways to get clicks. So how do you, if you’re training Grok, not train on articles that have misaligned incentives?
Elon Musk
(01:53:26)
We need to add to the training set the actual legal decisions. This is actually helpful, because if you actually read the court-
Lex Fridman
(01:53:26)
Which are public.
Elon Musk
(01:53:41)
Which are public. The court conclusions, they’re completely the opposite of what the media wrote.
Lex Fridman
(01:53:47)
So, always striving for the ground truth, beyond the reporting.
Elon Musk
(01:53:50)
Yeah. What did the judge actually write? What did the jury and the judge actually conclude? And in both cases they found me innocent. And that’s after they shopped for the venue where I’m most likely to lose. I mean, obviously, it could be a much better critique than this. I mean, I’ve been far too optimistic about Autopilot.
Lex Fridman
(01:54:16)
The critique I got, by the way, was more about that, which is it broke down a nice bullet point list for each of your companies, the set of predictions that you made, when you’ll deliver, when you’ll be able to solve, for example, self-driving, and it gives you a list. And it was probably compelling, and the basic takeaway is you’re often too optimistic about how long it takes to get something done.
Elon Musk
(01:54:38)
Yeah. I mean, I would say that I’m pathologically optimistic on schedule. This is true. But while I am sometimes late, I always [inaudible 01:54:47] in the end.
Lex Fridman
(01:54:49)
Except with Uber Lilith. No.
Elon Musk
(01:54:51)
We’ll see.

Politics

Lex Fridman
(01:54:56)
Okay. Over the past year or so, since purchasing X, you’ve become more political. Is there a part of you that regrets that?
Elon Musk
(01:55:03)
Have I?
Lex Fridman
(01:55:04)
In this battle to counterweigh the woke that comes from San Francisco-
Elon Musk
(01:55:14)
Yeah. I guess if you consider fighting the woke mind virus, which I consider to be a civilizational threat, to be political, then yes.
Lex Fridman
(01:55:20)
So basically, going into the battleground of politics. Is there a part of you that regrets that?
Elon Musk
(01:55:26)
Yes. I don’t know if this is necessarily one candidate or another candidate, but I’m generally against things that are anti-meritocratic or where there’s an attempt to suppress discussion, where even discussing a topic is not allowed. Woke mind virus is communism rebranded.
Lex Fridman
(01:55:51)
I mean, that said, because of that battle against the woke mind virus, you’re perceived as being on the right wing.
Elon Musk
(01:55:58)
If the woke is left, then I suppose that would be true. But I’m not sure, I think there are aspects of the left that are good. I mean, if you’re in favor of the environment, if you want to have a positive future for humanity, if you believe in empathy for your fellow human beings, being kind and not cruel, whatever those values are.
Lex Fridman
(01:56:23)
You said that you were previously left or center left.
Elon Musk
(01:56:23)
Well, sort of.
Lex Fridman
(01:56:26)
What would you like to see in order for you to consider voting for Democrats again?
Elon Musk
(01:56:30)
No. I would say that I would be probably left of center on social issues, probably a little bit right of center on economic issues.
Lex Fridman
(01:56:40)
And that still holds true?
Elon Musk
(01:56:42)
Yes, but I think that’s probably half the country, isn’t it?
Lex Fridman
(01:56:46)
Maybe more.
Elon Musk
(01:56:47)
Maybe more.
Lex Fridman
(01:56:49)
Are you and AOC secretly friends? Bigger question, do you wish you and her, and just people in general of all political persuasions, would talk more with empathy and maybe have a little bit more fun and good vibes and humor online?
Elon Musk
(01:57:05)
I’m always in favor of humor. That’s why we have funny mode.
Lex Fridman
(01:57:08)
But good vibes, camaraderie, humor, like friendship.
Elon Musk
(01:57:15)
Yeah. Well, I don’t know AOC. I was at the Met ball when she attended, and she was wearing this dress. But I can only see one side of it, so it looked like eat the itch, but I don’t know-
Lex Fridman
(01:57:35)
What the rest of it said? Yeah.
Elon Musk
(01:57:36)
Yeah.
Lex Fridman
(01:57:36)
I’m not sure.
Elon Musk
(01:57:39)
Something about the itch, eat the itch.
Lex Fridman
(01:57:42)
I think we should have a language model complete it. What are the possible ways to complete that sentence? And so, I guess that didn’t work out well. Well, there’s still hope. I root for friendship.
Elon Musk
(01:57:55)
Yeah, sure. Sounds good. More carrot, less stick.

Trust

Lex Fridman
(01:57:58)
You’re one of, if not the, most famous, wealthy, and powerful people in the world, and in your position it’s difficult to find people you can trust.
Elon Musk
(01:58:05)
Trust no one, not even yourself. Not trusting yourself.
Lex Fridman
(01:58:07)
Okay. You’re saying that jokingly, but is there some aspect-
Elon Musk
(01:58:11)
Trust no one, not even no one.
Lex Fridman
(01:58:15)
I’m going to need an hour just to think about that, and maybe some drugs, and maybe Grok to help. I mean, is there some aspect of that, just existing in a world where everybody wants something from you, how hard is it to exist in that world?
Elon Musk
(01:58:29)
I’ll survive.
Lex Fridman
(01:58:30)
There’s a song like that too.
Elon Musk
(01:58:32)
I will survive.
Lex Fridman
(01:58:33)
Were you petrified at first? Okay. I forget the rest of the lyrics. But you don’t struggle with this? I mean, I know you survive, but there’s ways-
Elon Musk
(01:58:44)
Petrify is a spell in the druid tree.
Lex Fridman
(01:58:47)
What does it do?
Elon Musk
(01:58:48)
Petrify. It turns the monsters into stone.
Lex Fridman
(01:58:56)
Literally?
Elon Musk
(01:58:56)
Yeah, for like six seconds.
Lex Fridman
(01:58:59)
There’s so much math in Diablo that breaks my brain.
Elon Musk
(01:59:02)
It’s math nonstop.
Lex Fridman
(01:59:04)
I mean, really, you’re laughing at it, but it can put a huge amount of tension on a mind.
Elon Musk
(01:59:13)
Yes, it can be definitely stressful at times.
Lex Fridman
(01:59:16)
Well, how do you know who you can trust in work and personal life?
Elon Musk
(01:59:20)
I mean, I guess you look at somebody’s track record over time, and I guess you use your neural net to assess someone.
Lex Fridman
(01:59:31)
Neural nets don’t feel pain. Your neural net has consciousness, so it might feel pain when people betray you. It can make-
Elon Musk
(01:59:40)
To be frank, I’ve almost never been betrayed. It’s very rare, for what it’s worth.
Lex Fridman
(01:59:50)
I guess karma, be good to people and they’ll be good to you.
Elon Musk
(01:59:53)
Yeah, karma is real.
Lex Fridman
(01:59:55)
Are there people you trust? Let me edit that question. Are there people close to you that call you out on your bullshit?
Elon Musk
(02:00:06)
Well, the X platform is very helpful for that, if you’re looking for critical feedback.
Lex Fridman
(02:00:12)
Can it push you more into the extremes? Can the extremes of thought make you cynical about human nature in general?
Elon Musk
(02:00:19)
I don’t think I will be cynical. In fact, my feeling is that one should… Never trust a cynic. The reason is that cynics excuse their own bad behavior by saying, “Everyone does it,” because they’re cynical. So, to me it’s a red flag if someone’s a cynic, a true cynic.
Lex Fridman
(02:00:49)
Yeah, there’s a degree of projection there that’s always fun to watch from the outside and enjoy the hypocrisy.
Elon Musk
(02:00:58)
This is an important point that I think people who are listening should bear in mind. If somebody is cynical, meaning that they see bad behavior in everyone, it’s easy for them to excuse their own bad behavior by saying that, “Well, everyone does it.” That’s not true. Most people are kind of medium good.
Lex Fridman
(02:01:23)
I do wish the people on X would be better at seeing the good in other people’s behavior. There seems to be a bias towards seeing the negative. Somehow, the negative is sexier. Interpreting the negative is sexier, more viral. I don’t know what that is exactly about human nature.
Elon Musk
(02:01:44)
I mean, I find the X platform to be less negative than the legacy media. I mean, if you read a conventional newspaper, it makes you sad, frankly. Whereas, I’d say on the X platform, I mean, I really get more laughs per day on X than everything else combined from humans.
Lex Fridman
(02:02:11)
Laughs overlap, but don’t necessarily perfectly overlap, with good vibes and celebrating others, for example. Not in a stupid, shallow, naive way, but in an awesome way. Something awesome happened, and you celebrate them for it. It feels like that is outweighed by shitting on other people. Now, it’s better than mainstream media, but it’s still…
Elon Musk
(02:02:38)
Yeah, mainstream media is almost relentlessly negative about everything. I mean, really, the conventional news tries to answer the question, what is the worst thing that happened on Earth today? And it’s a big world. So on any given day, something bad has happened.
Lex Fridman
(02:02:54)
And a generalization of that, what is the worst perspective I can take on a thing that happened?
Elon Musk
(02:03:01)
I don’t know. There’s just a strong negative bias in the news. I mean, I think a possible explanation for this is evolutionary, where bad news, historically, would be potentially fatal, like there’s lion over there or there’s some other tribe that wants to kill you. Good news, we found a patch of berries. It’s nice to have, but not essential.

Tesla’s Autopilot and Optimus robot

Lex Fridman
(02:03:30)
Our old friend, Tesla autopilot, is probably one of the most intelligent real world AI systems in the world.
Elon Musk
(02:03:38)
You followed it from the beginning.
Lex Fridman
(02:03:40)
Yeah. It was one of the most incredible robots in the world and continues to be. And it was really exciting, and it was super exciting when it generalized, became more than a robot on four wheels, but a real world AI system that perceives the world and can have potentially different embodiments.
Elon Musk
(02:04:02)
Well, I mean, the really wild thing about the end-to-end training is that it can read signs, but we never taught it to read. Yeah. We never taught it what a car was or what a person was, or a cyclist. It learnt what all those things are, what all the objects are on the road from video, just from watching video, just like humans. I mean, humans are photons in, controls out. The vast majority of information reaching our brain is from our eyes. And you say, “Well, what’s the output?” The output is our motor signals to our fingers and mouth in order to communicate. Photons in, controls out. The same is true of the car.
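As a rough illustration of the “photons in, controls out” framing, here is a minimal sketch, not Tesla’s actual network: a single learned function maps raw camera pixels directly to control outputs, with no hand-written rules about cars, lanes, or signs. All shapes, layer sizes, and output names are assumptions, and in practice the weights would be trained end to end on large amounts of driving video rather than left random.

```python
import numpy as np

# Purely illustrative sketch of an end-to-end "photons in, controls out"
# policy. Dimensions and layer structure are made up for this example.

rng = np.random.default_rng(0)

def end_to_end_policy(frames, weights):
    """frames: (num_cameras, H, W) grayscale images in [0, 1].
    Returns steering and acceleration commands in [-1, 1]."""
    x = frames.reshape(-1)                              # photons in: flatten all pixels
    h = np.tanh(weights["w1"] @ x + weights["b1"])      # learned features (toy hidden layer)
    out = np.tanh(weights["w2"] @ h + weights["b2"])    # controls out
    return {"steering": float(out[0]), "acceleration": float(out[1])}

# Toy dimensions: 8 cameras at 32x32 pixels, one hidden layer, two control outputs.
num_pixels = 8 * 32 * 32
weights = {
    "w1": rng.normal(scale=0.01, size=(64, num_pixels)),
    "b1": np.zeros(64),
    "w2": rng.normal(scale=0.1, size=(2, 64)),
    "b2": np.zeros(2),
}

frames = rng.random((8, 32, 32))  # stand-in for camera frames
print(end_to_end_policy(frames, weights))
```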
Lex Fridman
(02:05:01)
But by looking at the sequence of images… You’ve agreed with [inaudible 02:05:07] recently where he talked about an LLM forming a world model, and basically language is a projection of that world model onto the sequence of letters. And you’re saying-
Elon Musk
(02:05:18)
It finds order in these things. It finds correlative clusters.
Lex Fridman
(02:05:27)
And in so doing, it’s understanding something deep about the world, which is… I don’t know, it’s beautiful.
Elon Musk
(02:05:35)
That’s how our brain works.
Lex Fridman
(02:05:38)
But it’s beautiful-
Elon Musk
(02:05:39)
Photons in, controls out.
Lex Fridman
(02:05:41)
[inaudible 02:05:41] are able to understand that deep meaning in the world. And so, the question is, how far can it go? And it does seem everybody’s excited about LLMs, in the space of self-supervised learning on text. It seems like there’s a deep similarity between that and what Tesla Autopilot is doing. Is it, to you, basically the same, but different-
Elon Musk
(02:06:06)
They are converging.
Lex Fridman
(02:06:10)
I wonder who gets there faster, having a deep understanding of the world, or they just will naturally converge?
Elon Musk
(02:06:19)
They’re both headed towards AGI. The Tesla approach is much more compute efficient, it had to be. Because we were constrained on this… We only have 100 watts and [inaudible 02:06:37] computer. 144 trillion operations per second, which sounds like a lot, but is small potatoes these days. [inaudible 02:06:49] eight. But it’s understanding the world [inaudible 02:06:51] eight. It’s [inaudible 02:06:53].
Lex Fridman
(02:06:55)
But there, the path to AGI might have a much more significant impact because it’s understanding… It will understand the real world faster than LLMs will, and therefore be able to integrate with humans in the real world faster.
Elon Musk
(02:07:13)
They’re both going to understand the world, but I think Tesla’s approach is fundamentally more compute efficient. It had to be, there was no choice. Our brain is very compute efficient, very energy efficient. Think of what our brain is able to do. There’s only about 10 watts of higher brain function, not counting stuff that’s just used to control our body. The thinking part of our brain is less than 10 watts. And those 10 watts can still produce a much better novel than a 10 megawatt GPU cluster. So, there’s a six-order-of-magnitude difference there.
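As a quick back-of-the-envelope check of the figures as stated here, roughly 10 watts for higher brain function versus a 10 megawatt GPU cluster:

```latex
\frac{P_{\text{cluster}}}{P_{\text{brain}}}
  \approx \frac{10\ \text{MW}}{10\ \text{W}}
  = \frac{10^{7}\ \text{W}}{10^{1}\ \text{W}}
  = 10^{6}
```

which is the six orders of magnitude in power mentioned in the conversation.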

(02:07:56)
I mean, the AI has thus far gotten to where it is via brute force, just throwing massive amounts of compute and massive amounts of power at it. So, this is not where it will end up. In general, with any given technology, you first try to make it work, and then you make it efficient. So I think we’ll find, over time, that these models get smaller, are able to produce sensible output with far less compute, far less power. Tesla is arguably ahead of the game on that front because we’ve just been forced to try to understand the world with 100 watts of compute.

(02:08:51)
And there are a bunch of fundamental functions that we forgot to include. So, we had to run a bunch of things in emulation. We fixed a bunch of those with hardware four, and then hardware five will be even better. But it does appear, at this point, that the car will be able to drive better than a human, even with hardware three and 100 watts of power. And really, if we really optimize it, it could be probably less than 50 watts.
Lex Fridman
(02:09:26)
What have you learned about developing Optimus, about applying, integrating this real world AI into the space of robotic manipulation, just humanoid robotics? What are some interesting tiny or big things you’ve understood?
Elon Musk
(02:09:47)
I was surprised at the fact that we had to develop every part of the robot ourselves. That there were no off the shelf motors, electronics, sensors. We had to develop everything. We couldn’t actually find a source of electric motors for any amount of money.
Lex Fridman
(02:10:12)
It’s not even just about efficiency and expense, it’s like, anything, there’s not…
Elon Musk
(02:10:17)
No.
Lex Fridman
(02:10:19)
The actuators, everything has to be designed from scratch.
Elon Musk
(02:10:23)
Yeah. We tried hard to find anything that was… Because you think of how many electric motors are made in the world. There’s like tens of thousands, hundreds of thousands of electric motor designs. None of them were suitable for a humanoid robot, literally none. So, we had to develop our own. Design it specifically for what a humanoid robot needs.
Lex Fridman
(02:10:51)
How hard was it to design something that can be mass manufactured and could be relatively inexpensive? I mean, if you compare to Boston Dynamics’ Atlas, that is a very expensive robot.
Elon Musk
(02:11:02)
It is designed to be manufactured in the same way they would make a car. And I think, ultimately, we can make Optimus for less than the cost of a car. It should be, because if you look at the mass of the robot, it’s much smaller and the car has many actuators in it. The car has more actuators than the robot.
Lex Fridman
(02:11:23)
But the actuators are interesting on a humanoid robot with fingers. So, Optimus has really nice hands and fingers, and they could do some interesting manipulation, soft touch robotics.
Elon Musk
(02:11:38)
I mean, one of the goals I have is can it pick up a needle and a thread and thread the needle just by looking?
Lex Fridman
(02:11:47)
How far away are we from that? Just by looking, just by looking.
Elon Musk
(02:11:51)
Maybe a year. Although, I go back to I’m optimistic on time. The work that we’re doing in the car will translate to the robot.
Lex Fridman
(02:11:59)
The perception or also the control?
Elon Musk
(02:12:02)
No, the controls are different. But the video in, controls out. The car is a robot on four wheels. Optimus is a robot with hands and legs.
Lex Fridman
(02:12:15)
So, you can just-
Elon Musk
(02:12:16)
They’re very similar.
Lex Fridman
(02:12:17)
So, the entire machinery of the learning process, end-to-end, is the same; you just have a different set of controls?
Elon Musk
(02:12:23)
After this, we’ll figure out how to do things by watching videos.

Hardships

Lex Fridman
(02:12:28)
As the saying goes, be kind, for everyone you meet is fighting a battle you know nothing about.
Elon Musk
(02:12:33)
Yeah, it’s true.
Lex Fridman
(02:12:34)
What’s something difficult you’re going through that people don’t often see?
Elon Musk
(02:12:38)
Trying to defeat Uber Lilith. I mean, my mind is a storm and I don’t think most people would want to be me. They may think they would want to be me, but they don’t. They don’t know, they don’t understand.
Lex Fridman
(02:13:11)
How are you doing?
Elon Musk
(02:13:14)
I’m overall okay. In the grand scheme of things, I can’t complain.
Lex Fridman
(02:13:21)
Do you get lonely?
Elon Musk
(02:13:24)
Sometimes, but my kids and friends keep me company.
Lex Fridman
(02:13:33)
So, not existential.
Elon Musk
(02:13:36)
There are many nights I sleep alone. I don’t have to, but I do.
Lex Fridman
(02:13:46)
Walter Isaacson, in his new biography of you, wrote about your difficult childhood. Will you ever find forgiveness in your heart for everything that has happened to you in that period of your life?
Elon Musk
(02:14:01)
What is forgiveness? At least I don’t think I have a resentment, so nothing to forgive.
Lex Fridman
(02:14:20)
Forgiveness is difficult for people. It seems like you don’t harbor any resentment.
Elon Musk
(02:14:28)
I mean, I try to think about, what is going to affect the future in a good way? And holding onto grudges does not affect the future in a good way.
Lex Fridman
(02:14:41)
You’re a father, a proud father. What have you learned about life from your kids? Those little biological organisms.
Elon Musk
(02:14:53)
I mean, developing AI and watching, say, little X grow is fascinating because there are far more parallels than I would’ve expected. I mean, I can see his biological neural net making more and more sense of the world. And I can see the digital neural net making more and more sense of the world at the same time.
Lex Fridman
(02:15:19)
Do you see the beauty and magic in both?
Elon Musk
(02:15:21)
Yes. I mean, one of the things with kids is that you see the world anew in their eyes. To them, everything is new and fresh. And then, when you see that, them experiencing the world as new and fresh, you do too.
Lex Fridman
(02:15:52)
Well, Elon, I just want to say thank you for your kindness to me and friendship over the years, for seeing something in a silly kid like me, as you’ve done for many others. And thank you for having hope for a positive future for humanity, and for working your ass off to make it happen. Thank you, Elon.
Elon Musk
(02:16:11)
Thanks, Lex.
Lex Fridman
(02:16:13)
Thank you for listening to this conversation with Elon Musk. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words that Walter Isaacson wrote about the central philosophy of how Elon approaches difficult problems: “The only rules are the ones dictated by the laws of physics.” Thank you for listening, and hope to see you next time.

Transcript for Jared Kushner: Israel, Palestine, Hamas, Gaza, Iran, and the Middle East | Lex Fridman Podcast #399

This is a transcript of Lex Fridman Podcast #399 with Jared Kushner.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Lex Fridman
(00:00:00)
The following is a conversation with Jared Kushner, former senior advisor to the President during the Donald Trump administration and author of Breaking History: A White House Memoir. He’s one of the most influential and effective presidential advisors in modern history, helping conduct negotiations with some of the most powerful leaders in the world and deliver results on trade, criminal justice reform, and historic progress towards peace in the Middle East. On Thursday, October 5th, we recorded a conversation on topics of war and peace, history and power in the Middle East and beyond. This was about a day and a half before the Hamas attack on Israel, and then we felt we must sit down again on Monday, October 9th and add a discussion on the current situation. We open the podcast with the second, newly recorded part. My heart goes out to everyone who has suffered and is suffering in this war. I pray for your strength and for the long-term peace and flourishing of the Israeli and Palestinian people. I love you all. This is a Lex Fridman podcast. And now, dear friends, here’s Jared Kushner.

Hamas attack on Israel


(00:01:17)
We did a lot of this conversation before the Hamas attack on Israel, and we decided to sit down again and finish the discussion to address the current situation, which is still developing. If I may, allow me to summarize the situation as it stands today. It’s morning, Monday, October 9th. On Saturday, October 7th at 6:30 AM Israel time, Hamas fired thousands of rockets into Southern Israel. The rocket attacks served as cover for a multi-pronged infiltration of Israeli territory by over 1,000 Hamas militants. This was shortly after, at 7:40 AM.

(00:01:55)
The Hamas militants went door to door in border towns, killing civilians and taking captives, including women and children. In response to this, the Israeli Air Force began carrying out strikes in Gaza, also fighting on the ground in Israel to clear out Hamas militants from Israeli territory and preparing to mobilize Israeli troops for a potential ground attack on Hamas in Gaza. Now, of course, this is what it appears to be right now, and this, along with other things, might change because the situation is still developing. The IDF is ordering civilian residents of Gaza to evacuate their homes for their safety. Benjamin Netanyahu declared war in several statements and warned Israelis to brace themselves for a long and difficult war. Just today, Israeli ministers ordered a “complete siege” of Gaza, interrupting supplies of electricity, food, water, and fuel from Israel to Gaza. As of now, October 9th, the death toll is over 1,200 people, and over 130 hostages have been taken to Gaza by Hamas. As I said, the events are rapidly unfolding, so these numbers will sadly increase, but hopefully our words here can, at least in part, speak to the timeless underlying currents of the history and, as you write about, the power dynamics of the region. For people who don’t know, Gaza is a 25-mile-long, six-mile-wide strip of territory along the Mediterranean Sea. It borders Israel on the east and north and Egypt on the southwest. It’s densely populated, about 2.3 million people, and there’s been a blockade of Gaza by Israel and Egypt since 2007, when Hamas took power. I could just summarize that Hamas is a Palestinian militant group which rules the Gaza Strip. It originated in 1988, and it came to power in Gaza in 2006. As part of its charter, it’s sworn to the destruction of Israel, and it is designated by the United States, European Union, UK, and of course Israel as a terrorist group.

Response to attack


(00:04:10)
Given that context, what are your feelings as a human being and what is your analysis as the former senior advisor to the president under the Trump administration of the current situation in Israel and Gaza?
Jared Kushner
(00:04:23)
I think you did an excellent job of summarizing a lot of the context, but watching what’s unfolded over the last 48 hours has been truly heartbreaking to see. We’re still in the early stages of what’s developing, but seeing the images on X of militants, terrorists going door to door with machine guns gunning down innocent civilians, seeing beheaded Israeli soldiers, seeing young 20-year-olds at a rave, a dance party to celebrate peace, with militants flying in and then shooting machine guns to kill people indiscriminately, seeing young children taken captive and held prisoner, seeing 80-year-old grandmothers, a Holocaust survivor, also being taken captive. These are just images and actions that we have not seen in this world since 9/11. This is a terror attack on a scale we have not seen, and it’s been incredibly hard for a lot of people to comprehend.

(00:05:33)
My heart goes out, obviously, to all of the families of the victims, to the families of those who are held captive now, and to all of Israel, because one of the beautiful things about the state of Israel is that when one Israeli is hurting, the entire nation comes together. It’s a shame that it’s taking an action like this to unify the nation, but I have seen incredibly beautiful signs over the last 48 hours of a country coming together. The Jewish people have been under oppression before. The Jewish people know what it’s like, and seeing people rally together to fight for their homeland to try to reestablish safety is a very beautiful thing to watch. I wish it wasn’t something we had to watch, but it is.

(00:06:26)
With that being said though, the backdrop, I’ve been speaking to friends over the last couple of days. One friend I spoke with last night was saying that a good friend messaged him saying, I’m going in. We’re going to do some operations to try to free some of the hostages held in one of the kibbutzes. He messaged him the next morning. He was one of the first through the door to try to free these hostages, and he was killed by a Hamas militant. Sadly, we’re going to be hearing many, many more stories of brave Israeli soldiers trying to get these terrorists out of Israel, trying to free innocent civilians, who unfortunately are risking their lives to do it. They’re all heroes, but some will have less good fates than others, sadly.

(00:07:13)
It’s a very, very heartbreaking moment, and I do think that it’s very important at this moment in time for the entire world to stand behind Israel. I think that Hamas has shown the entire world who they really are, what their aim is, what they’re willing to do. And all of the strong security that Israel’s put in place over the last years, which in some instances was criticized, I think is now being validated, that there was a real threat that they were looking to deter. The short answer is my heart is broken, praying for peace, praying for strength, praying for Israel to do what it needs to do to avoid being in this situation again, which is either eliminating or severely degrading Hamas’ capabilities. There cannot be peace in Israel and in the Middle East while there is a terror group that is being funded by Iran that is allowed to flourish and is allowed to plan operations that are going to aim to kill innocent civilians.

(00:08:20)
As somebody who was formerly in this position, who was intimately involved with Israel and with the strategies to minimize attacks from Hamas and to try to turn the region around, I think we did do a very substantial job under President Trump. The Middle East went from being one of the most chaotic regions in the world. You had ISIS in 2016; ISIS had a caliphate the size of Ohio. They were beheading journalists. They were killing Christians. They controlled 8 million people. They were planning attacks all over the world from their caliphate. They were using the internet to radicalize people. We had the San Bernardino shooting in America. We had the Pulse nightclub shooting in Orlando, and there was a real threat. Then you had Iran, which was given $150 billion and a glide path to a nuclear weapon, and they were using their newfound riches to fund Hamas, Hezbollah, the Houthis, different rebels all over the region that were looking to destabilize it further. Syria was in a civil war where 500,000 people were killed. Yemen was destabilized, Libya was destabilized, and it was just a mess, and all of America’s allies had felt betrayed. President Trump came into power. We rebuilt the trust and the relationships with all of our traditional allies. We were able to eliminate ISIS, the territorial caliphate, and then we were able to project strength in the region, really go after Iran’s wallet. We were able to stop, through crushing sanctions, a lot of their financial resources, which they were using to fund all these terror groups. We left the Middle East with six peace deals and in a fairly peaceful world. Seeing what’s happening, I think it was completely avoidable. I think it’s horrible to see that it’s occurring, and I pray that those in power will make the right decisions to restore safety, but also to potentially create a better paradigm for peace in the future.

History of Hamas

Lex Fridman
(00:10:29)
I have a lot of questions to ask you about the journey towards this historic progress towards peace with Abraham Accords, of course. But first on this situation to step back and some of the history, is there things about the history of Hamas and Gaza that’s important to understand what is happening now? Just your comments, your thoughts, your understanding of Hamas.
Jared Kushner
(00:10:51)
I think you did an excellent job, Lex, of really giving the summary. Just a couple of things maybe I’ll add to it. Hamas was originally founded out of the Muslim Brotherhood in Egypt, which is a group that’s caused a lot of issues in the region. They’ve attacked Israel many times in the past. There’s a lot of discussion about how Israel is an occupying power. Well, in Gaza, in 2005, they withdrew from all the land, and then they say Israel’s an apartheid state. Well, Israel then gave governance of the region to the Palestinians, and then what’s happened is the Palestinian people’s lives have gone down, not up, since then. I will say that under Hamas’ leadership in Gaza, the people who have suffered the most are the Palestinian people, and I saw and heard cries throughout my time in government from people saying we want to see the Palestinian people live a better life. I agree with those people. I think that the Palestinian people in Gaza are essentially hostages.

(00:11:52)
In Gaza, you have basically 2.2 million people that are being held hostage by 30,000 Hamas terrorists. That’s really the problem, and I would just encourage people to push their attention and energy in this moment and their anger towards Hamas, those are the people who are killing innocent civilians, who are murdering indiscriminately, and those are the people who have held back the Palestinians from having a better life.

(00:12:23)
Finally, what I would say is what we saw with Hamas was that if you go back to 2007, they basically had just one plan that they did over and over. We were very careful to try to monitor very closely and stop the Iranian money and the resources from coming in. Again, we took a little bit of criticism from the international community for keeping the border tight, but unfortunately, every time you’d allow construction materials to go into Gaza, they’d use them to build tunnels, not homes. You would have equipment that would come in to build pipes, they’d turn it into bombs. It was very, very hard to figure out how do you get the resources into Gaza to help people live a better life while at the same time the leadership in Gaza was taking all those resources and turning it into military equipment to attack Israel.

Iran

Lex Fridman
(00:13:09)
What role does Iran play in this war, in this connection to Hamas? Can you speak to the connection between Hamas and Iran that’s important to understand, especially as this most recent attack unfolds?
Jared Kushner
(00:13:22)
Sure. On the connection, there are reports that Iran is behind the attack. Hamas has thanked Iran for their support, and it’s been very well known that Iran supports the destruction of the state of Israel. I won’t say Iran as a country; I’ll talk about Iran in terms of the leadership. There’s actually a beautiful thing I saw on the internet where, at one of the soccer games in Iran, they were trying to rally support for the Hamas terror attacks and a lot of people in the crowds were chanting FU to the regime, because I think the Iranian people, the Persian people, generally are peace-loving people who don’t want to see this focus on destruction and annihilation. But you saw this in 2015, 2016: when the Iranian government had resources, the region was less safe.

(00:14:08)
Since then, more resources have been allowed to go to the Iranian regime through lack of enforcement of sanctions. As a result, Iran is funding Hezbollah and Hamas. They were funding the Houthis. Now there’s a little bit of a détente between Saudi and Iran, which has led to that going down, which only further proves that Iran was behind the Houthis, which is what the Saudis had been saying for years and Iran was denying. There’s a very strong relationship between the two, and we always knew that the way that Iran fights wars or fights conflicts is never directly, it’s usually through its proxies. In this case, Hamas has been a proxy for Iran, who obviously wanted to see the destruction of Israel, but also does not want to see the Israelis and the Saudis come together for a peace agreement.

Al-Aqsa Mosque

Lex Fridman
(00:14:55)
The name of this operation, of the Hamas operation is Al-Aqsa Flood, referring to the Al-Aqsa Mosque. How much of this attack is about the Al-Aqsa Mosque?
Jared Kushner
(00:15:07)
In actuality, I don’t think any of it is, but the Al-Aqsa Mosque is something that all of the Shia Jihadists have used for years in order to justify their actions that are aggressive towards Israel. I’ll maybe even take a step back and go through this: when I was working initially, in my first year, on the Peace Plan, I was doing a lot of listening. Quite frankly, a lot of what people were saying to me didn’t make sense, and I was trying to figure out why. They were talking about sovereignty over the Al-Aqsa Mosque. The Al-Aqsa Mosque is a mosque that’s built on the Holy of Holies, the Haram al-Sharif, in Israel, where the [Foreign language 00:15:49], the Holy Temple, was built, in a very religious place. After the Temple was destroyed, there was a big mosque built there, and it’s one of the more holy places in Islam now.

(00:16:03)
The big thing everyone was saying is, “What do you do with this land where you have a mosque built over a very big Jewish site?” I was hearing all of the experts, and I always say experts with quotes, because only in Washington can you work on something for a decade and continue to fail, and then when you basically leave, you are considered an expert. But that’s one of the problems with Washington, which maybe we could talk about later. But the notion here was I went and I said, “Let me try to understand what the issue is with the Israeli-Palestinian conflict with the people.” I always felt the politicians were a little disconnected, so I commissioned several focus groups, one in Amman, one in Cairo, one in Dubai, and one in Ramallah. I asked people, Muslims, what is the Israeli-Palestinian conflict about? Time and time again, the most popular thing that they said was that Israel was not allowing access to the mosque for Muslims to pray. What was interesting is that Israel’s policy is to allow anyone who wants to come and pray peacefully at the sites to come and pray. Sometimes they have security issues when there are provocations. But by and large, since 1967, when Israel was able to take back Jerusalem in a defensive war, just to be very clear, they were attacked in the south and they were attacked from the east, and they basically were able to beat back the Jordanians and the Egyptians and then reconquer the old city of Jerusalem. During that time, immediately after, Israel passed the Protection of Holy Places Law, which was that they basically took resources they didn’t have and said, we’re going to restore the Christian sites, the Muslim sites, the Jewish sites, and they’ve worked to allow everyone access to the mosque.

(00:17:45)
Today, any Muslim who wants to come can come and pray at the mosque. The mosque is… Israel’s acknowledged that King Abdullah, the king of Jordan, is the custodian of the mosque and as long as people want to come to the country and pray peacefully, they’re able to do that. But if you look at a lot of the propaganda that’s been used by ISIS or Iran to recruit terrorists or to justify their incursions, they often say they’re doing it in the name of liberating the Al-Aqsa Mosque. But from an operational and pragmatic perspective today, any Muslim who wants to go to the mosque, you can book a flight to Israel now through Dubai because there’s flights between Israel and Dubai and as long as your country has relations with Israel and they’ll accept your passport in there, you can come and pray. That’s what Israel wants. Israel wants Jerusalem to be a place where all religions can come and celebrate together. But you have a lot of actors that look to find ways to use these religious tensions in order to sow division and justify violent behavior.
Lex Fridman
(00:18:50)
I wonder how it’s possible to lessen the effectiveness of that propaganda message, that a lot of the war, a lot of the attacks are about access to the Al-Aqsa Mosque. Is there something you can speak to why that message hasn’t disseminated across the Arab world?
Jared Kushner
(00:19:07)
Israel’s good at a lot of things. They’re not very good traditionally with public relations. After the Abraham Accords, we made the first Abraham Accords deal in August 2020, and then we made five other deals. We first did United Arab Emirates, then we did a deal with Bahrain, then we did a deal with Kosovo, then we did a deal with Sudan, then we did a deal with Morocco, and then we got the GCC deal done as well, resolving the tension between Qatar, Saudi, UAE, Egypt, and Bahrain. That was allowing us to create a pathway to then pursue the Israeli-Saudi normalization. We had so much momentum then that the goal was to just keep getting more countries to normalize relations with Israel. Once you create the connection between people and create the ability for people to do business together, the ability for flights to fly between them, then you would just naturally start having people coming, and everyone has a smartphone today, so they can then post and combat the misinformation that’s been out there.

(00:20:07)
But this misinformation is not something that’s new. One of the characters who played a very big role in spreading the antisemitism and the violence in Israel in the 1920s was a guy named Haj Amin al-Husseini, who was known as the Grand Mufti of Jerusalem. He was very close with Hitler and Mussolini, and he was working with them to try to get some claims to the Middle East once the Jewish people were annihilated. What he did for a very long time was the same shtick, only it was before smartphones and YouTube, where he would say the mosque is under attack. These imperialist Zionists are coming in to try to destroy the mosque.

(00:20:42)
He would use that to raise money from Indonesia, from Pakistan, from all over the world, and then use that threat to justify recruiting groups of young, vulnerable Muslim men and then getting them, in the name of religious rights, to go and kill people, which really is more of a perversion of the religion than I think the true essence of what Islam is. I think Islam at its core is a peaceful religion, and I think that’s where a lot of the great leaders in Islam want to take it. But the people who use Islam or the mosque as a justification for violence, those are people who I think are really… They’re disrespecting the religion of Islam.

Abraham Accords

Lex Fridman
(00:21:20)
As you said, you helped make major strides towards peace in the Middle East with the Abraham Accords. Can you describe what it took to accomplish this, and maybe this will help us understand what broke down and led to the tragedy this week?
Jared Kushner
(00:21:36)
Yeah. I always believed in foreign policy. I learned very quickly that the difference between a political deal and a business deal is that in a business deal, you have a problem set, you come to a conclusion, and then if you buy or sell something, you either have more cash or you have a company. More to do, less to do. Political problem set is very different, where the conclusion of a problem set is essentially the beginning of a new paradigm. When I would think about how do you move pieces around the board, you couldn’t say let me just solve the problem. You have to think about what happens the day after the signing, and how do you create a paradigm that has positivity to it.

(00:22:17)
The biggest piece of what President Trump did during his four years in office was he really strengthened the relationship with Israel, number one. He did things like recognizing Jerusalem as the capital of Israel. He moved the embassy to Jerusalem. He recognized the Golan Heights. He got out of the Iran deal. We did an economic conference in Bahrain where we brought Israelis to meet with Saudi and Emirati and Qatari businessmen, and everyone came together. Each one of these instances was unthinkable previously. Everyone said that if you did it, the world was going to end, and every time President Trump did one, the next morning the sun rose, the next evening the sun set, and things moved on. By doing that, what President Trump did was he slaughtered a lot of the sacred cows, these false barriers that people had erected, and showed people that the vast majority of the people in the Middle East, whether they’re Jewish, Muslim, Christian, whatever religion they are, they just want to live better lives.

(00:23:17)
What we basically did was create a paradigm where the voices for peace, the voices for coming together, now finally had a forum where they were able to do it. We did that against that backdrop. The way we were able to be successful was we severely limited the resources of Iran, and they were focused more internally, and they couldn’t cause the trouble that they were causing everywhere else. Since we’ve left, obviously the dynamics have changed, but the way you get to peace is obviously, number one, through strength and, number two, by finding a way for people to be better off tomorrow than they are today. What I found was that most of the voices looking for violence or trouble were people who were just focused on what happened two years ago, 20 years ago, 70 years ago, 1,000 years ago. People who were trying to solve those problems in that context often were looking more to use those past grievances as a justification for their power and for the bad behavior that they were looking to perpetuate.
Lex Fridman
(00:24:20)
As we have talked about extensively, managing the power dynamics of the region and providing a plan, this is something you did with the economic plan titled Peace to Prosperity, A Vision to Improve the Lives of the Palestinian and Israeli people. Can you first of all describe what’s in the plan?
Jared Kushner
(00:24:38)
Sure. This was something I took on. I was working on the political framework between the Israelis and the Palestinians and trying to understand what were the issues. The issues were not very many. It basically was you had a land dispute, so you had to figure out where do you put borders ultimately, you had a security paradigm, which I was much more favorable to Israel’s perspective on. Obviously the events of the past 48 hours have fully justified that bias. Then in addition to that, you had to deal with the religious sites, but I felt operationally that wasn’t actually as complicated as people made it because you wanted to just leave it open for everybody.

(00:25:17)
Then I went through and I felt that the Palestinian leadership was fairly disincentivized to make a deal because there was just this paradigm where they had billions of dollars coming in from the international community, and I think that they feared that if they made a deal, they would lose their relevancy internationally and the money would stop flowing into the country. What I tried to do, my approach when I would get into a hard problem, is to say, how do I understand all the different escape hatches? How do I try to eliminate them and then build a golden bridge that becomes really the only, but also the most desirable, pathway for the decision makers to walk through?

(00:25:56)
It wasn’t always hard, and sometimes you have to go and hold their hand or you try to pick them up and walk them across. But a lot of these leaders are very reluctant to change, and the dynamics of the Palestinians also were such that I think they were fairly stuck where they were. We developed a business plan for Gaza and the West Bank. We threw in some improvements for Jordan and Egypt as well. I based it off of the Vision 2030 that they did in Saudi Arabia, which I thought was a visionary document. I went back through this process and I studied basically every economic project in the post-World War II period.

(00:26:33)
We looked at what they did in South Korea, why it was successful with some strong industrial planning. We looked at Japan, we looked at Singapore, we looked at Poland, why it was successful. We spent a lot of time on the Ukraine plan for the country and why it wasn’t successful. That was mostly because of governance and corruption, which actually resembles a lot of what’s gone wrong with the Palestinians, where there are no property rights, there’s no rule of law. What we did is we built a plan to show it’s not that hard, in the sense that between the West Bank and Gaza, you had 5 million people. We put together a plan. I think it was about $27 billion. We put together a conference. I had the head of AT&T. We had Steve Schwarzman from Blackstone come, which was very gracious of them.

(00:27:14)
We had all the leading Arab businessmen, the leading builders, leading developers. The general consensus of that conference was that this is very doable. We think that for Gaza in particular, it would cost maybe $7 to $8 billion to rebuild the entire place. We felt we could cut the poverty rate in half. We could create over a million jobs there. The only thing that people said was holding it back wasn’t Israel. What was holding it back was governance, and people wouldn’t have confidence investing there with the rule that Hamas was perpetuating.

(00:27:49)
I encourage people actually to look at the plan. It was very thoughtful. It was 181 pages. We went project by project. Each project is costed out. It’s a real plan that could be implemented, but you need the right governance, and all of the different Arab countries are willing to fund it. The international community is willing to fund it because they’ve just been throwing so much money at the Palestinians for years that’s never been outcomes based or conditions based. It’s just been entitlement money, and unfortunately, it hasn’t really achieved any outcomes that have been successful.

(00:28:20)
It’s a great business plan. It just shows, too, that rebuilding Gaza could be easy, but like I said, the problem that’s held the Palestinian people back and that’s made their lives terrible in Gaza has not been Israel. It’s really been Hamas’ leadership, or lack of leadership, and their desire to focus on trying to kill Israelis and start war with Israel over improving the lives of the Palestinian people.
Lex Fridman
(00:28:44)
The current approach of Hamas is that the more violence they perpetrate, the more they can hold onto power, versus improving the lives of people. As you said, maybe you can comment on the fact that they do not propose an economic plan.
Jared Kushner
(00:29:02)
Hamas has been running it now for 16 years, and they don’t have a lot to show for it. Our posture with them was basically a very simple deal. If you think about what’s the end state in Gaza, it’s actually not that complicated. There’s no territorial disputes, right? The border’s the border. There’s no religious issues there as well. You’re not dealing with Jerusalem. You’re basically just dealing with the fact that Israel wants to make sure that there’s no threat from Gaza so it’s a demilitarization or some kind of security guarantee from a credible source where Israel doesn’t feel like Gaza can be used to stage attacks into Israel or to fire rockets into Israel.

(00:29:41)
By the way, these are things I was saying three, four years ago that that was the objective, and that was really the fear. Now that’s been proven. Unfortunately, the fear has manifested, and in exchange you can rebuild the place and you can give the people a much better life. But Hamas has not shown desire for that or a capability for that, and I don’t think there’s enough trust to allow them to do that, which is why under the current circumstances, if you do want to have peace there, Hamas has to be either eliminated or severely degraded in terms of their military capabilities.

Trump vs Biden on Middle East

Lex Fridman
(00:30:15)
I would love to ask about leadership, especially on the side of the United States. What has the current administration, the Biden administration, done different than the Trump administration, as you understand, that may have contributed to the events we saw this week.
Jared Kushner
(00:30:34)
All I can talk about is where we left them. We left them a place where they had tremendous momentum in the Middle East. I met with them during the transition and said, “Look, we even got the Qatar-Saudi conflict done, which was a big… No peace between Israel and Saudi would’ve been possible without that, so we even got that done in our lame-duck period.” They came in and they said, “Look, we want to focus on the three Cs, which is Covid, climate change, and China.” I said, “That’s great, but the Middle East, we have it in an amazing place right now. It’s stable, there’s momentum. Iran is basically broke.” We put such crippling sanctions on Iran that they went from about, I think it was, 2.6 million barrels a day of oil they were selling to about 100,000 under Trump. Their foreign currency reserves were basically depleted and they were broke.

(00:31:25)
Same with the Palestinians. We stopped the funding to UNRWA, the UN agency, which is totally corrupt. We’ve put $10 billion in there over time. I did a poll in the Middle East, in Gaza to say, “Okay, we’ve invested $10 billion here as a country. Are we popular?” The US had a 7% approval rating. USAID has 70% approval rating, but it just felt like a waste of our taxpayer dollars. Again, we wanted to make it conditions based. The Biden administration came in. Number one, they started insulting Saudi and Russia. Oil prices went up at the same time. What they did was they stopped domestic production of oil. They disincentivized a lot of oil and shale production with regulations. They stopped pipelines. Oil prices went up. They stopped enforcing the sanctions against Iran probably to get the oil prices lower to make up for what they were doing. They ran to Iran to try to make a deal. They started funding the Palestinians again right away.

(00:32:27)
I even said if you’re going to fund them, if that’s your policy, I respect that. Again, elections have consequences and you can take a different policy. But what I would recommend is get some conditions, make them do some reforms, make them give property rights to people, make them do real economic investments for people. But they just went right away. They were funding the Palestinians, not enforcing the sanctions, and then overall just projecting a lot of weakness in the region. One of the most embarrassing examples is what happened in the United Arab Emirates. Again, an amazing country, probably one of America’s best allies over the last 20, 30 years.

(00:33:03)
They fought with us in Afghanistan. They were the first Muslim country to stand up and do that after 9/11 because they didn’t want it to be a war of the West against the Muslim religion. So they joined the fight, because they saw it as a fight between right and wrong. They have rockets shot into their country from the Houthis, and they basically don’t get a call from the US for 17 days. They need their equipment that they buy from the US, which creates jobs in the US. They need it restocked. We don’t call. So they’ve severely degraded the trust that we had rebuilt with our allies. I think they’ve been working now to get it back. They, after two years, started working with Saudi and Israel, which I think was good.

(00:33:48)
I think that they realized, after a stint, that maybe the process that President Trump had created in the region was the right policy. And keep in mind, President Trump’s policy, that I was working on, was very strongly criticized during the first three years before we were able to achieve the results, because it was a departure from the failed policies of the past. So at first, there was a return to those policies: appease Iran, criticize Saudi Arabia. Then they started embracing and working on the Israel-Saudi deal, which was really exciting. I think we were all very excited about it. But they did it in public, and I think that that also was something. Again, I didn’t have access to their intelligence, so I assumed that by doing it so publicly, they thought that they had a deal with Iran, because they were letting them get all this revenue, where Iran wouldn’t be a problem.

(00:34:35)
But one of the reasons with the Abraham Accords that we kept it so quiet during the whole time was because we always felt like the troublemakers in the region, particularly Iran, would be disadvantaged by having UAE, Saudi, and Israel altogether, and would work to undermine it. Israel’s a nuclear power. You have other strong economies. Iran seeks instability. They look to create division in the region. And if you can create that economic sphere where you have security from Haifa to Muscat, from Israel to Oman all the way through with Saudi, Jordan, UAE, Qatar, Egypt, that’s an incredibly powerful bloc. If you can make it secure and then get economic integration, that really could be a Middle East that thrives. So Iran, obviously, wanted nothing to do with that, and that’s why they’ve been working to disrupt it. So I think the administration took an incredibly stable situation with momentum.

(00:35:28)
I think they underestimated the way that Iran would approach the region to undermine it. I think they gave way too much rope to Iran, and I think that they didn’t seize the opportunity, when they had a position of strength with the Palestinians, to try to drive to a conclusion that, I believe, could have prevented us from being where we are today. Not to mention that even just three weeks ago, it’s a bad look that they just basically gave $6 billion to Iran in exchange for hostages. And then Iran’s basically funding these terror attacks, they are killing American citizens in Israel. It’s a heartbreaking situation. Again, totally avoidable and one that, I think, has been very badly managed to date.
Lex Fridman
(00:36:13)
If Trump were currently president and you were still working with him on this part of the world, what actions would you take? What conversations would you have? What ideas would you be working with in order to unite the various allies that you mentioned in the Middle East over this tragedy, and not let it be a thing that divides the Middle East, but make it a thing that catalyzes further progress towards peace?
Jared Kushner
(00:36:48)
I want to say one thing, Lex. I have a lot of friends who are fans of Trump and a lot who are not fans of Trump. But one thing I want to say with absolute certainty is that, if President Trump was in office, this never would have happened. Anyone who supports Israel, or who wants to see Jewish people not be innocently slaughtered, should know he would never have allowed that to happen. It did not happen when he was in power. And I hope people recognize that as something that’s very, very true. As for how I would play the ball where it lies right now: keep in mind, when we transferred the ball, it was on the green. Now it’s almost like it’s gone back 150 yards and it’s in a sand trap. I think the way that I would play the ball right now is, number one, you have to show strength.

(00:37:33)
I actually think President Biden’s words were the right words. I see that they’re moving aircraft carriers to the region. Again, the purpose of having a strong military… I believe obviously if you get into a war, you want to win the war. But the purpose of a very strong military primarily is to avoid a war. I don’t know what credibility the Biden administration has to show the strength, but right now you have to support Israel completely. You have to really let people in the region know that there’ll be consequences if they try to escalate. Again, we saw a little bit of a rocket skirmish from Lebanon, from Hezbollah. But again, this is the type of thing where they have to know there’ll be severe consequences if they make this a multi-party fight. And I think sending a strong message to Iran, I think that they have to see some consequences from this and know that they’re not going to be allowed free rein to cause instability there.

(00:38:28)
Iran doesn’t usually fight face-to-face. They usually do it through proxies, but let’s just all be honest about where this is coming from and let them know that there will be a consequence if they instigate these actions. Again, at least with the Biden administration, they’ve had contact with Iran, they’ve been talking with Iran, but they’ve allowed Iran to rebuild its revenue. Again, I think under Trump the number was maybe like $4 or $5 billion of oil revenue in total. The number I saw for last year was something like $45 billion in revenue. This year, I think it’ll be even more. That’s a combination of them driving up oil prices, but also allowing much more sales. You would think that, if they were going to allow this to happen, they would at least find a way to get them to behave. Or if that’s not the case, then be tough. Go back to being tough. That’s what you have to do.

Israeli-Saudi normalization

Lex Fridman
(00:39:15)
Building off of the Abraham Accords, as you mentioned, Israel-Saudi normalization, there’s been a lot of promising progress towards this. What does it take to not allow this tragedy to damage the progress towards Israel-Saudi normalization?
Jared Kushner
(00:39:33)
I think right now it’s probably not the best time to think about that. I think that we want to think about that after whatever’s going to happen now has happened. I think right now, the number one priority for Israel has to be to fully regain security in the country. And then number two is to figure out how you can, like I said, eliminate or degrade the Hamas capability or other Iranian threats to make sure that you have your security apparatus. I think that the Israeli leadership right now should proceed with that, and I don’t think that they should be thinking about normalization with Saudi at this moment. My instinct, having watched this Israeli-Saudi normalization play out, obviously just speaking with people and seeing what I’ve been reading, and watching with great excitement, is that it would be a game-changer for the region.

(00:40:21)
I think it’s one of Iran’s worst nightmares to have Israel and Saudi interlinked together. I think it’d be great for the Saudi people from a security perspective, what they’re discussing with America would be very strong. The ability to get different elements across would be incredible. So what I would say with this is that, the industrial logic held yesterday, and I think it will hold again tomorrow. I always expect countries to act in their interests. I think that the deal that’s on the table right now between Saudi, Israel and America is in Saudi’s interests, it’s in America’s interests and it’s in Israel’s interest. What’s going to happen now though is, the political dynamics are going to shift. And I think that, as we’ve seen with political dynamics, they come and go. I think let’s get through this moment, and then I hope at the right time that those talks will be able to resume and conclude in an appropriate way.

(00:41:18)
It’s funny, Lex, when I was working on the US-Mexico trade agreement, every day there’d be a tweet that would go out or there would be an issue. People forget how intense it was between America and Mexico. And I’d speak to my counterpart in Mexico after a rough day, when we were working on something, we were making progress, and it’d get blown up. And I’d speak to them and say, “You know what? Look, they’re not moving America. They’re not moving Mexico. Let’s stop for today. Let’s pick up tomorrow and let’s find a new way to bring this forward.” So I would just encourage everyone working on that not to give up, to keep working hard at it and to find a way. But like I said, I would take a little bit of a pause for the time being. Let’s let the current situation play out and then hopefully there’ll be a way for it to move forward.
Lex Fridman
(00:42:02)
I just hope there are still people on the US side picking up the phone and calling UAE, Saudi Arabia, just as human beings, as friends, as allies, and just keeping that channel of communication going. Maybe you can correct me, but I just feel like there are simple human dynamics that play out here, where divisions can form and just run away from you over simple misunderstandings, over an inability to see a tragedy from the same perspective, because of conversations that could have happened but didn’t happen.
Jared Kushner
(00:42:44)
I think there’ll definitely be communication, but words on phone calls are only worth so much. It’s really trust between people in power. And obviously when you’re in a position of power, you represent your country and your country’s interests. But the ability to have trusting relationships where people feel like they’re okay taking more risks to help each other, that’s actually what’s most important. So communication, I hope for. But deepening and trusting relationships, that’s what I believe makes progress and keeps people safe.

How the Israel-Gaza war ends

Lex Fridman
(00:43:17)
We talk quite extensively about the value of trust in negotiation and just working with leaders, which I think is a fascinating conversation. And you’ve taught me a lot about that. Let me ask you about the end here. What are the various trajectories this war can take, in your view? What are some of the end states, as you’ve said, which are desirable and are achievable?
Jared Kushner
(00:43:43)
I mentioned this earlier, but whenever I would get a problem set in government, I’d always think through, from a first-principles perspective, what’s the logical outcome? And forget about all the reasons why it can’t happen. That’s what everyone in government always rushes to talk about. But I do think here, number one, Israel has to have a secure environment where they don’t feel threatened from Gaza. And number two is, the people in Gaza need to have an environment where they feel like they can live a better life and have opportunities. That’s the end state. So I think that the international community should come together. I do think that the people who are usually putting blame on Israel should now realize that maybe they’ve been a little bit harsh here, and that Hamas has been as big a threat as Israel has been saying, if not an even bigger one.

(00:44:32)
And I do think that if the international community comes together and unites behind Israel and really forces Hamas and their Iranian backers to stop hostilities, to stop saber-rattling, to stop misrepresenting the history in order to justify their violent behavior; if they say instead, “We want to hold you accountable, no more money”; and if they all say that they’re going to stand behind Israel’s efforts to eliminate their national security threats; then we will all come together and only fund again into a framework that we believe can be a long-term solution where the Palestinian people really have a chance to live a better life. That’s really the best way to get there. There are tons of complicating factors, but that’s the end state that the global community should be looking to come together around. And it’s very achievable. It’s very, very achievable.
Lex Fridman
(00:45:26)
As we stand here today, there are a lot of different ways that this war can evolve. If a ground invasion of Gaza by Israeli forces happens, and if the number of 100,000 Israeli soldiers is correct, do you worry about the various trajectories that could take, or the consequences that an unprecedented ground troop attack might have?
Jared Kushner
(00:45:52)
I think as a leader, you can’t change yesterday, but you have the ability to change tomorrow. And that’s a very important fundamental. That’s true for all of us, not just leaders. We saw with 9/11 how America was caught off guard by a terrorist attack. We acted somewhat rationally, somewhat emotionally, which led to a 20-year war with trillions of dollars lost and, I think, almost a million lives lost, not just American, but all lives. And it was a total tragedy what occurred. I think right now the temptation is to be strong. I think that that’s a necessity. I do think eliminating risk is the right objective. I think the goal should be to stay very clear about what the objective is, but also, since this attack was very well planned, not to walk into another trap. I think you have to be very smart, very cautious.

(00:46:43)
I’ve been happy to see that what they’ve been doing in retaliation so far has been somewhat measured, and they’ve taken their time to try to assess what’s achievable. Again, I don’t have access to the intelligence, and we’re talking at a very early stage in this conflict, so a lot will have happened even by the time this is published. But my hope is that they’ll just stay very focused on what the objective is and try to make sure that they’re acting appropriately in order to do that. And I will say this too: this has been different than what I’ve seen in the past, in that the attacks were so heinous and so disgusting that I’ve seen the international community rally around Israel more so than I ever have. And I hope that Israel continues to keep the moral high ground and continues to communicate what they’re fighting for, why they’re fighting. And I do hope that the international community supports the objective and they can work together to achieve it.

Benjamin Netanyahu

Lex Fridman
(00:47:44)
Benjamin Netanyahu, Bibi, somebody you’ve gotten to know well in negotiation, in conversation. He has made statements, he has declared war, he has spoken about this potentially being a long and difficult war. What have you learned about the mind of Benjamin Netanyahu that might be important to understand here in this current war?
Jared Kushner
(00:48:08)
Bibi is definitely a historic figure. I’d meet with a lot of different world leaders, and some of them, I would say, they’re very, very special, transformational figures. And some, I would say, how the hell is this person running a country? Bibi is somebody who has done a lot for the state of Israel, he has a tremendous understanding of the security apparatus. He has tremendous global relations. So for a crisis like this, I think Bibi’s the leader you want, if you’re Israel, to be in that seat. I think he’s ambitious in what he’s going to look to achieve. He understands his role in history as somebody who’s helped strengthen Israel economically, militarily.

(00:48:52)
And I don’t think he wants to see his legacy be somebody who left Israel more vulnerable than it had to be. So I think, in that regard, he’ll be incredibly strong. But I also think that he’ll hopefully be calculating in the risks that he takes and not create more risk than is needed. And that’s easy to say, the two of us sitting here having a conversation. When you’re sitting in that chair as a leader in the fog of war, it’s a very hard decision to make. He’s been here before. He knows the weight of the situation. I’m sure he knows the moment. And I pray that he’ll do what’s right here to bring the best outcome possible.
Lex Fridman
(00:49:38)
I wonder if you can comment on the internal political turmoil that Bibi has been operating in and how that relates to the tragedy that we saw.
Jared Kushner
(00:49:51)
On the one hand, the political turmoil, it’s a sign of a vibrant democracy. I think it’s been actually nice to see how people have fought for their country and their beliefs in a democratic way. You compare that to the Palestinians where there’s no democracy, there’s no free speech, there’s no free press. You can disagree with the leadership in Israel. If you want to be homosexual, you can go to a parade and live your life. In Gaza, they’ll throw you off a building and kill you. So in Israel, you have the freedoms, which I think make it a special place. And you have a very vibrant democracy.

(00:50:33)
With that being said, the times in Jewish history where the Jewish people have been most vulnerable have been when there’s been division, and that’s when the temple was destroyed. But that’s not just with the Jewish people and with Israel, that’s in all societies. So I definitely believe that this division has left them less prepared for the situation than they otherwise would have been. I do think there are real lessons we should be taking from this here in America, where we’re in a time where we’re very divided. I do think that it’d be very wise for our leaders to find the areas where we do agree and find ways to secure our southern border, to make sure that we know who’s in our country and what risks we all face. And I do think that division definitely creates risk for countries.

Palestinian support

Lex Fridman
(00:51:21)
Let me switch gears here and just zoom out and look at our society and our public discourse at the moment. What do you make of the scale and nature of the Palestinian support online in response to this situation?
Jared Kushner
(00:51:35)
This is something I’ve observed with a lot of interest over the years since I got involved with the Israeli-Palestinian issue. A lot of the people who are pledging support for the Palestinian people want to see the Palestinian people live a better life. And I actually agree with them in that regard. Unfortunately, I think many of them are incredibly ill-informed as to the facts on the ground. I think all of the people who are advocating online for the Palestinian people, who are going to these marches in support of them, would be best served, if they really care about effectuating that outcome, by joining with Israel right now and directing their anger towards the Hamas leadership.

(00:52:27)
I think that it’s very clear that the group that’s responsible for the Palestinian people living the lives that all of these people are angry about is Hamas. If they direct their anger towards Hamas, put the attention on the failings of Hamas, put forth a vision for what they’d like to see leadership in Gaza do, and respect that there’s a real fear that Israel has, and any country would have, of having a group of terrorists next to them that’s calling for their destruction, then I think that recognition of finding a way for Israel to be secure, and then having an opportunity for the Palestinian people to live a better life, is the right pathway to try and pursue.
Lex Fridman
(00:53:10)
So to you, there’s a clear distinction between Hamas and the Palestinian people, in that Hamas is the enemy of progress and the flourishing of the Palestinian people.
Jared Kushner
(00:53:21)
100%. It’s very, very clear. And I think people would see that if they were honest about the situation, if they spent the time to really understand it. Again, if you follow the conference I did in Bahrain, we had all of the leading businessmen there and they said, “We can rebuild Gaza very easily. We all want to.” The leading Arab businessmen, the leading American businessmen, everyone wants to; they’re just held back by Hamas. So I do think having an honest conversation about this at this point in time has really only one logical conclusion. And my hope is that maybe this conflict leads to that conversation being had. And if it is, then maybe that brings more unity and understanding and we get to a better conclusion that could improve the lives of the Palestinian people.

Trump 2024

Lex Fridman
(00:54:02)
Pragmatic question about the future. Do you hope Donald Trump wins in 2024? And how can his administration help bring peace to the Middle East?
Jared Kushner
(00:54:14)
When Donald Trump was president, we had a peaceful world. Everyone said if he was elected, we would have World War III. Meanwhile, he gets elected, and not only is he the first president in decades to not start any wars, he’s making peace deals. He’s making trade deals. He’s working with our allies, getting them to pay their fair share in NATO. He’s having a dialogue with China, with Russia. He’s weakening Iran. So I do think that the job he did as a foreign policy president was tremendous. I think now more and more people are starting to recognize that. Again, under President Biden, this is the second war that’s broken out in the world. And when you have weak American leadership, the world becomes a less safe place. So my hope and prayers are that President Trump is reelected and that he’s able to then restore order and calm and peace and prosperity to the world.
Lex Fridman
(00:55:10)
From a place of strength?
Jared Kushner
(00:55:12)
That’s the only way he knows how to do it.
Lex Fridman
(00:55:15)
What gives you hope about the future of this region, of Israel and of the Middle East?
Jared Kushner
(00:55:23)
The Middle East for 20 years was an area of conflict. They spent all their money on bullets and bombs. You have young leadership now in Saudi Arabia and UAE and Qatar, and there’s a much more ambitious agenda now for the region to make it an economic superpower and hub of the world. Israel is one of the most burgeoning and exciting tech economies in the world. And if you think about it, it’s almost like having Silicon Valley not connected to California. The thing that’s held the region back for all these years has just been the conflict and the division and the lack of connectivity. But if you have that region and it can all come together, if you can create a security architecture, you have an incredibly young population, you have a lot of wealth and resources and a lot of capabilities and know-how. So I think that if it’s managed correctly, and if Iran is able to be restrained and its ambitions to cause destabilization suppressed, the region can thrive. I don’t mean Iran the country, I mean the Iranian regime.

(00:56:29)
Because actually, once you have this economic sphere, if you could bring Iraq into it, if you could bring Iran into it, that makes it even bigger and stronger. And the Persian people are incredibly entrepreneurial and incredibly industrious. So I do think that the region has tremendous potential. It’s just been held back by bad policy, bad leadership, bad objectives. And again, when President Trump left office in 2021, the Middle East was really on a very, very positive trajectory. And if the right things happen, it can continue to be so. I’m praying at this moment in time that we navigate the current crisis, that the important objectives of eliminating the terrorists and their threats are achieved, and that the leaders who are focused on giving their citizens and their neighbors the opportunity to live a better life are able to work together and really dream and be ambitious and find ways to create a paradigm where humans can flourish.

Human nature

Lex Fridman
(00:57:30)
What is the best way to defeat hate in the world?
Jared Kushner
(00:57:34)
Hate is a very powerful force, and it’s much easier to hate people you don’t know. It’s funny, when I was working on prison reform, one of the most interesting people I met was a reverend, actually down in Texas, who negotiated the first truce between the Bloods and the Crips, two of the most notorious gangs in America, in prison. And I was very excited to meet him. When I met him, I said, “Well, how’d you do it?” And he said, “It was very simple.” He says, “I got all the guys together and I had a tremendous amount of barbecue brought in.” He says, “And I got the meeting.” He says, “No drinking.” He says, “Drinking sometimes gets people a little bit more against each other.” He says, “But I got a meeting and they started sitting down together and they started saying, ‘You know what? You’re just like me.'” And all of a sudden, they started finding areas where they were more together.

(00:58:27)
Look, I’ve traveled all over the world now. I’ve been very fortunate to meet people from different states in America, of different political persuasions, different ages, different classes. And what I found is that there’s a fundamental drive amongst all of us: we all want to live a better life. People don’t want to fight naturally, but it’s easy to fight when you feel wronged or you feel like somebody disrespected you or somebody did something from hatred. And hatred leads to more hatred, which sometimes just pushes that cycle further and further. So I believe that each and every one of us has the power to stop that cycle. We don’t do it by being on Twitter and yelling at people. We don’t do it by just being critical. We do it by finding the people we disagree with, by listening to them, by asking questions, by sitting with them. And then if we each take responsibility to try to make the world better, then I think that there’s no limit to the incredible place that this world can be.
Lex Fridman
(00:59:31)
As you’ve said, you’ve traveled all across the world. Do you think most people are good, most people have love in their heart?
Jared Kushner
(00:59:41)
I do believe that, yeah. And you have some bad people. You have some real evil people. A big part of the work I did was on prison reform. Previously the mentality was that prison should basically be a warehouse for human trash, and if you’ve made a mistake in this world, then we’re going to throw you out and we’re going to make the rest of your life incredibly difficult, because you’re going to have a criminal record and you’re not going to have access to jobs. But what I found, when I would sit with people in prison, with the people I met through my father’s experience and who I met along the way, is that people make mistakes. We’re all human. I think it’s the right thing from a religious perspective to give people second chances. I always believe you shouldn’t judge people by the worst mistake they make in their life.

(01:00:24)
Unfortunately now, in the era of social media, people will say one wrong thing, it sticks with them forever. They get canceled or they get put out. We’re all humans. We grow from our mistakes, we learn from our mistakes. And I think that some people are just evil. There are some evil people. But I do think the vast, vast, vast majority of people are good. And I do think that people sometimes also can be in a bad place, and then society can push them to a worse and worse place. But we all have the power to make them feel loved, make them feel heard. I think there’s also tremendous power that we have as people to help people get to a better place. My wife and I, we’ve always tried to be a force for good. We’ve always tried to provide a place where people can discuss with each other.

(01:01:12)
When we were in Washington, we would host dinners at our house all the time, where we would have Democrats and Republicans sitting together. I saw Senator Feinstein just passed away. We had a great dinner at her house when she was a senator, with her and her husband and Mark Meadows when he was on the Freedom Caucus. And we had actually a fascinating discussion about Iran. Mark was much more hard line than me. I had to actually bite my tongue. I was impressed at how much he did. Whereas Feinstein and her husband were super into… They knew the Iranians well. They thought they were peace loving. And it was an incredibly robust and respectful debate. I don’t think we maybe concluded anything that night, but it was interesting for people to get together. Or having a dinner at my house where I had Dick Durbin, the number two ranking Democrat in the Senate, Lindsey Graham and Stephen Miller, who’s known to be very hard-line on immigration, discussing what an immigration reform could look like.

(01:02:06)
They left that dinner saying, “Wow, we hadn’t spoken to people on the other side, and we actually agree on 85% of things. Maybe something is possible.” So I believe that we should always be trying to push to make the world a better place. And you only do that by listening to people and connecting with people and by respecting people. And finally, I’ll just say on this that I meet people all the time who have so much confidence in their perspectives. I’m very jealous that these people are able to be so confident about every single thing. Because, for me, I have some degree of confidence in the things that I’ve studied and what I’ve learned, but I’m always trying to find people who disagree to sharpen my perspectives and to help me grow and to help me learn further. I think that’s the beauty of the world: the knowledge base continues to grow, the facts continue to change, and what’s possible tomorrow continues to become different. So as humans, we have to continue to thrive, to learn, and to grow and to connect. And if we do that, everything’s possible.
Lex Fridman
(01:03:12)
Well, Jared, thank you for your compassion, first of all, but also your wisdom today on this very difficult, this tragic set of events, these difficult days for the world. It’s a big honor to speak with you again. Every time I speak to you, I learn a lot about the world. And I deeply appreciate, like I said, your humility and your understanding of the details of all the complex power dynamics and human dynamics that are going on in the world. Once again, thank you for talking today.
Jared Kushner
(01:03:48)
Thank you. And Lex, if I could say just one final thing: my thoughts and prayers are really with all the people in Israel, and with the innocent civilians on the Palestinian side as well. My prayers are with the IDF soldiers, that they should be safe and that they should be watched over by God to accomplish whatever mission will make the world a safer place.
Lex Fridman
(01:04:12)
Thank you for listening to this newly recorded segment of the conversation that addresses the current situation in Israel and Gaza. And now we go on to the second part of the conversation, recorded on Thursday, October 5th. Given your experience in negotiating with some of the most powerful and influential leaders in the world, what’s the key to negotiating difficult agreements in geopolitics? I start with a big question.

Geopolitics and negotiation

Jared Kushner
(01:04:39)
If I look back on the different negotiations I had when I was in government, either with leaders of countries, with representatives of leaders, or even with members of Congress to pass legislation, the most important thing I would come back to would be trust. Getting to know each other, understanding what was motivating the other party to get to the outcome, and making them feel like you weren’t going to use whatever information they gave you to benefit yourself at their expense, is probably what I would call table stakes to have a shot at accomplishing anything that was hard in a negotiation.

(01:05:24)
After that, I would say taking maybe a first-principles approach to what the outcome of whatever problem you’re looking to solve should be. Now, you have different kinds of negotiations. I always tried to create a framework in the negotiation where it wasn’t me against you. It was always, let’s agree on what the outcome is that we’re trying to accomplish. Let’s all sit on the same side of the table and say, “We want to get to this outcome. How do we get there?” Really trying to create a roadmap.

(01:06:03)
So once you understand the destination you want to get to, the endpoint, then you’d have to work backwards and really try to put yourself in their shoes and try to understand what their motivations were at a macro level. Most of the time, you have to assume that a leader’s primary objective was to stay in power. And so, all decisions would be made through the framework of what it would take to do that and how it would impact their ability to do that.

(01:06:22)
And then finally, I would just say that in any negotiation, you have to understand the power dynamics as well. And you have to then weight your approach in order to maneuver pieces to accomplish the objective. And so, in areas where we had stronger power dynamics, I’d always look at it and say, “What are the potential escape routes where a politician would say, ‘This is just the reason why we can’t get there.'” And I’d always think, how can you try to eliminate those escape routes or make them much harder to accomplish? And then, ultimately, think about what’s the golden bridge that you want to create for them in order to get to the other side, where they were motivated to get there because it was in their self-interest to get there, but also because it helped accomplish the different objective.

(01:07:07)
And I have many examples that I lived through with that, obviously negotiating in Congress for prison reform. I had to form a lot of trust with Democrats, whether it was Hakeem Jeffries or Dick Durbin. And then also on the Republican side, I had Mike Lee, I had Lindsey Graham, I had Tim Scott, Senator Grassley, and then also Doug Collins in the House, who was tremendous. And every time we maneuvered something, we would get attacked, either from the left, there was a time we were being attacked by Nancy Pelosi and John Lewis for not being inclusive enough, or, when we maneuvered it the other way, from the right for maybe going too far. And ultimately, we had to find just the right place where we could get it done.

(01:07:49)
And the same thing happened with USMCA, where we were negotiating the biggest trade deal in the history of the world, which was $1.3 trillion in annual trade between Mexico, Canada, and the United States of America. And we were able to form good trust with the other side and try to say, “How do we create win-win outcomes?” And ultimately, we were able to do something in record time that people thought was very hard to do. And both of them, in a divided time of the Trump administration, were bipartisan wins with big, big votes in the Senate and the House.
Lex Fridman
(01:08:23)
You have a lot of stories of this kind, sometimes a soft approach, sometimes a hard approach. I think of the story with Bibi, where there was a dramatic election coming up, and you had to say, “No. No excuses, no delaying. We have to make this agreement.” I know Bibi cares about Israel more than the particular dynamics of the election. You had to draw a hard line there.
Jared Kushner
(01:08:46)
Yeah. But in fairness, too, for him, during the time that we were dealing with him, he was always in one election after another, and then another. And what he was saying wasn’t wrong. I think he was more expressing his concerns given the dynamics. And we never held those concerns against him; we just said those are real concerns he had. We respected those concerns. But then we helped him prioritize to help accomplish the right things.

(01:09:10)
And that’s ultimately what the partnership is, right? My job was to represent America, his job was to represent Israel, and you had other parties representing their own interests. And as long as you assume that people were acting mostly in good faith, you were able to navigate areas where you didn’t have complete overlap of priorities and objectives.
Lex Fridman
(01:09:29)
Just to go back to the trust thing, you sometimes see that with leaders, where it looks like they’re trying to screw over the other person when they’re talking. And so, not having that, I think, is a really powerful thing for earning trust, so that people can actually believe that you’re results-driven and are working towards a certain end.

(01:09:51)
Is there a skill to that? Is that genetics, something you’re born with? Or is that something you develop? So basically, it requires you to look at the game of politics and not have a kind of cynicism about it, where you assume everybody’s trying to manipulate you, and instead just go in with a kind of open mind and open heart and actually speak truthfully to people on a basic human level.
Jared Kushner
(01:10:17)
I would say that I always would think about how can I be a partner to others like I would want somebody to be a partner to me? And a lot of it comes from just my different experiences in business. I’ve had great partners, I’ve had terrible partners.

(01:10:31)
My father, again, a lot of my childhood I was exposed to business. My father, on Sundays, would take us to job sites and to the office with him instead of to football games like my friends’ fathers would do. And so, we were exposed to business. And his father, who was an immigrant to America, came over with nothing and had no formal education, but he would always say, “A good deal with a bad partner will always be a bad deal. And a bad deal with a good partner, you’ll figure it out.”

(01:10:59)
And so, going through some of the challenges that I had in my life early on, whether it was the issue with my father, that I’m sure we’ll talk about, or even going through some tougher financial times during the Great Financial Crisis, I really learned a lot about partnership. And I always thought, “How can I act in a way where I could be the type of partner or friend to others that I wish others would be to me?”
Lex Fridman
(01:11:26)
So when you look for a good partner, don’t you think there’s the capacity for both good and bad in every person? So when you negotiate with all of these leaders, aren’t there multiple people you’re speaking to inside one person, that you’re trying to encourage, catalyze the goodness in the human?
Jared Kushner
(01:11:50)
Yeah. Leaders are generally chosen by their country. And so, my view was if I had an objective, I didn’t get to choose who was the leader of other countries. My job was to deal with that leader, understand their strengths, understand their weaknesses, understand their power dynamics as well.

(01:12:06)
One of my greatest takeaways: when I grew up, I’d read the newspapers about all these powerful, famous people. And then as I got older and had the chance to meet them and do business with them and then ultimately interact with them in government, I realized that they’re just like you and me. They wake up every morning, their kids are pissed at them, their wife doesn’t want to talk with them. And they’ve got a set of advisors around them, one saying, “Let’s go to war,” one saying, “Let’s make peace.” One saying, “Do the deal,” one saying, “Don’t do the deal.” And they’re all thinking, where do I get advice? How do I make decisions?

(01:12:37)
And so, understanding the true human nature of them and then the different power dynamics around them, I thought was very key. And so, I didn’t have a choice, do I deal with them or not? It was a function of how do you deal with them effectively in order to find areas where you have common interests and then work well together to pursue those common interests in order to achieve a certain goal.

North Korea

Lex Fridman
(01:13:01)
First of all, you’re incredibly well-read. I’ve gotten to know you and I’ve gotten to know Ivanka, and the book recommendation list is just incredible. So first of all, thank you for that. You told me about The Guns of August by Barbara Tuchman. It’s a book on World War I, and I went down a whole rabbit hole there. She’s an incredible historian.

(01:13:21)
But anyway, there’s a bunch of stuff you learned from that, but one of the things you told me is it influenced your general approach to diplomacy of just picking up the phone and giving it a try. So as opposed to planning and strategizing, just pick up the phone.
Jared Kushner
(01:13:39)
This was a book I read way before the notion of serving in government was ever even on my mind or a reality. And I remember reading it and thinking about how World War I started, where somebody was assassinated, and then you had all these different alliances that had been created, and then, in order to accomplish objectives, it triggered all of these people getting in bed with everyone else because of documents that were created without the intent of going to a massive war. And I think World War I was one of the greatest atrocities that we’ve seen as humanity. We had about 16 million people killed in that war.

(01:14:21)
And as I was reading the book, I remember thinking to myself, “Even though things are set in a certain way, go sit with somebody, go talk to them and say, ‘This doesn’t make sense, this is wrong. How do we create a better pathway?'” And as a civilian, all my life, I would read the newspapers, I would observe how different leaders would act. But when we had the opportunity to serve in government and have the position, you realize you’re not a civilian. You don’t have the luxury of sitting back and letting the world happen the way it’s happening. You have agency and you have the potential to influence the outcome of things.

(01:14:57)
And one thing I’ve seen is most political prognosticators are wrong. Anyone who tells you what’s going to happen really has no clue. And it’s not because they’re bad or they’re not intelligent, it’s because nobody knows. And at the end of the day, the outcomes in the world are usually driven by the decisions of humans. And if you’re able to come together, form relationships, listen to each other, you can do that.

(01:15:18)
And one of the great examples that I speak about in the book is North Korea. If you remember, in 2017, it was very intense. When President Obama was leaving office, he told President Trump that the single biggest fear that he had, and this is a time when the world was a mess, the Middle East was on fire, ISIS was beheading journalists and killing Christians, they had a caliphate the size of Ohio, Libya was destabilized, Yemen was destabilized, Syria was in a civil war where 500,000 people were killed, Iran was on a glide path to a nuclear weapon. Yet the single biggest fear he had was North Korea.

(01:15:54)
Then it got compounded by the fact that we get into office and President Trump brings his generals around and he’s learning how to interact with all the generals and says, “Okay, what are my options?” And they said, “Calm down. We’ve been using all of our ammunition in the Middle East. We don’t have enough ammunition to go to war over there.” And he says, “Let’s not let that be too public. Let’s try to restock and come up with a plan.”

(01:16:16)
And at the time, there was a lot of banter back and forth. And I got a call from a friend who was an old business contact, who actually had done business in North Korea. And he said, “I’d love to find a way to solve this.” And I was getting calls from friends at the time saying, “I’m trying to go to Hawaii for vacation. Should I not be going? Is it not safe?”
Lex Fridman
(01:16:37)
Wow.
Jared Kushner
(01:16:37)
We forget the psychology of how intense that was at the time. And then through that interaction, he called some of his contacts in North Korea. And then we were able, with the CIA, to open up a back channel that ultimately led to the meeting between Trump and Kim Jong-Un, which led to a deescalation.

(01:16:55)
So that was really the mindset, which was whenever there’s a problem, just pick up the phone and try. And I think President Trump had a very similar approach, which was let’s give it a shot. And he wasn’t afraid to go after the hard ones too.

(01:17:09)
And I’ll say one final thing on this, which is that in politics, the incentive structure is just much different than in the real world. If you try to solve a hard problem, the likelihood of failure is great. Whereas in the business world, if you’re going after a hard problem, we celebrate those people. Right? We want our entrepreneurs and our great people to go after solving the big, hard problems. But in politics, if you take on a hard problem, you’ll get a lot of criticism on your pathway to trying to accomplish it. And then if you fail, it has a higher probability of leading to you losing your opportunity to serve. And so, it’s just one of these things where people want to play it safe, which is not the approach that was taken during the time that President Trump was in office.
Lex Fridman
(01:17:59)
Do you think it has to be that way? I think there’s something in the human spirit, in the public, that desires politicians to take on the big, bold problems. Right? Why is it that politicians need to be so afraid of failure?
Jared Kushner
(01:18:16)
I don’t think it has to be that way. And that’s, I think, one of the great lessons from the time of the Trump administration. He brought a lot of people from the business world into government. The business people have a much different mindset than government people, and there was a lot of resistance. And I think part of why there was so much resistance, thinking about it from my personal experience, was that if I was successful with no traditional qualifications to do diplomacy, it meant that all the people with traditional qualifications in diplomacy didn’t necessarily need those qualifications in order to be successful. And that same sentiment manifested itself in many areas in government.

(01:18:56)
And I think that in the business world, it’s outcome oriented, it’s results oriented. And what we would see is, in New York they would stab you in the eye, in DC they would stab you in the back, and it just became a whole different-
Lex Fridman
(01:19:07)
Good line.
Jared Kushner
(01:19:08)
… dynamic of how you work through these different areas. So the answer is, it doesn’t have to be that way, you just need the right courageous leader. And that’s why I’m so optimistic about what the future of America and the world could be if you have the right people in power who are willing to take on the right challenges and do it in the right way.
Lex Fridman
(01:19:27)
So if we just linger on North Korea and the deescalation and the meeting, what’s the trajectory from “this could be the most catastrophic thing that destroys the world” to finding back channels? How do you start talking and start arranging the meeting? Are there some insights you can give into how difficult that is to do, in the North Korea case, which seems to be one of the more closed-off parts of the world, and in any other cases that you worked on?
Jared Kushner
(01:19:56)
Yeah, it’s always very challenging. And especially when you’re going against the grain of what’s established, right? We did something different. The idea that an old business contact of mine could open that door is the type of thing that, if the press knew what we were doing, they would’ve derided and criticized in every which way. But that was one of the benefits of operating very much below the radar: we were able to try all these different things. And not all of them worked, but some of them did.

(01:20:22)
But that is what’s amazing about the world, right? This could be the biggest story on the front page of every paper, and they’re inciting fear in everyone, and it’s not illegitimate fear. There were missile tests over Japan. You had a lot of very big challenges with that file. And then all of a sudden we make contact, we go through negotiations to set a meeting. There’s a meeting between President Trump and Kim Jong Un. And then all of a sudden, there’s a framework to try and move things forward. And again, I think that there’s a lot of possibility there for what could happen if it’s worked in the right way.
Lex Fridman
(01:20:56)
I just want to know how you word that first email or text message. What emojis do you use? Like the hugging emoji. It’s just, personally, I’ve gotten to know a lot of powerful and rich people, and it’s funny that they’re all human, just like you’re saying. And a lot of the drama, a lot of the problems can be resolved with just a little camaraderie, a little kindness, a little just actually reaching out.
Jared Kushner
(01:21:20)
We’re all human beings. And people want to be successful, and people want to be good. And you’re right too. There’s way more emojis involved in diplomacy than I ever would’ve expected.
Lex Fridman
(01:21:29)
And every leader, I’m sure, has their favorite emoji. This is also something I learned about people. Everybody has their go-to emoji. I usually go to the heart emoji very quickly. There are some people who go with the hugging one, whatever that is. You’re like the hugging thing.

Personalities of leaders


Lex Fridman
(01:21:44)
Anyway. This conversation quickly turned to the ridiculous. But to do another book reference, you mentioned the book Thirteen Days in September by Lawrence Wright in discussing all the work you’ve done in Israel and the Middle East. I just want to ask you about the interesting aspect of that book, which is the influence of the personalities and personal relationships on these negotiations. You kind of started to allude to that with the trust, but how much do the personalities matter in this? Going from North Korea to the Middle East, to within Congress, and all that kind of stuff.
Jared Kushner
(01:22:20)
Yeah, completely in every way. That’s an incredible book, and it’s a very entertaining read. It has obviously a lot of good historical context on some of the key players, whether it was on Anwar Sadat or Menachem Begin or Jimmy Carter and Cy Vance, and a lot of the others who were involved with those negotiations.

(01:22:38)
And the thing that I kind of took from that experience was just how personal it was. One of my favorite stories from that book was about Anwar Sadat, who was a big, big leader. And again, with history, I like reading it, but I always realize that this is just the perspective of the given author that’s writing it. But the way they write it in this book, Sadat had an advisor who was a mystic, and the mystic was having a back channel with the Israelis. And the mystic told Sadat, “If you go to Israel and you make a speech at the Knesset, Begin is ready to give you the Sinai.”

(01:23:14)
And so, he goes to Israel, they set this whole thing up, and he gives the speech at the Knesset. They go for their meeting after, and Sadat says, “Okay, well, are we going to do this thing?” And Begin says, “What are you talking about? I’m not giving you an inch of our land.” And it was just one of these things where it was a miscommunication that brought about the symbolic visit of Anwar Sadat to Israel. And that was one of these notions that just made everyone think that something was possible that they thought was impossible a moment before.

(01:23:45)
And actually, we had an example like that during our time in government when we did the Abraham Accords. The first step of the accords was really a phone call between President Trump, Prime Minister Netanyahu, and Mohammed bin Zayed, who, at that point, was the Crown Prince and de facto ruler of the UAE. But all we had was a phone call and then a statement that was released.

(01:24:08)
And what was interesting after that is we said, “Okay, well, how do we integrate countries? Nobody’s done this in a long time.” And we were trying to figure out all the issues, and there were big miscommunications between Israel and UAE, and we were navigating through all the issues. And so, after a couple weeks, I said, “I’ve got to go over there and try to sort through these issues.” So we make a plan to go to Israel, and then we’re going to go to UAE.

(01:24:27)
And then a young gentleman who worked with me, named Avi Berkowitz, says, “Well, if we’re flying from Israel to UAE, instead of flying on a government plane, why don’t we see if we can get an El Al plane and do the first official commercial flight?” And so, I said, “That’s a great idea. Let’s call Ambassador Otaiba,” Yousef, who was a tremendous player in the Abraham Accords, working behind the scenes day and night, and was really a big catalyst. So he calls Yousef, and Yousef says, “Sure, no problem. Let’s give it a shot.”

(01:24:53)
So we go and we do it, and he says, “If we can work out these issues, that’s what we’ll do.” So we go to Israel, we do our meetings, we get everything back into a good place. We set up this trip over and we fly on an El Al plane. We fill it up, at the time it was during Covid, with a health delegation. We had the finance ministry, because we had to open up banking relationships so they could wire money between countries. We wanted to get health partnerships. Then we just had a lot of legal things and national security things we wanted to start putting together.

(01:25:24)
So we do this flight and we end up landing in UAE. And the picture of us coming off the plane, being greeted by Emiratis in thobes, with an El Al plane with an Israeli flag on it, just captured everyone’s imagination. And so, it was one of these things where you work so hard on the details and the negotiation, hundreds of hours to make sure everything’s perfect, and the one thing that you do on a “Yeah, let’s give it a shot” is the image that ended up capturing everyone’s heart.

(01:25:54)
So going back to Sadat, that visit was very critical. And what was interesting was that, according to this book, it happened because of a miscommunication. That was the first part. The second part of the book, which is just amazing theater, and actually the book was based on a play, is just going back and forth with all of the different approaches that they tried, that failed, but they kept trying at it. And then, ultimately, seeing how the personalities were able to find ways to make the compromise that ultimately was a very, very big thing for more stability in the Middle East.

(01:26:27)
And so, amazing book, I would highly recommend it. A very entertaining read and something that at least gave me encouragement to keep going when the task I was pursuing seemed so large.
Lex Fridman
(01:26:39)
If you could just linger on the personalities. You write in the book that words matter. Or you write it in the context of saying that in the diplomacy business, words matter. And then you said, “We’re in the results business,” which is a badass line. But if we just stick to the diplomacy business and words mattering, it seems like one of the things you really highlight is that individual words can really matter. You can fight over individual words. So how do you operate in a world where single words matter?
Jared Kushner
(01:27:14)
I think you have to be respectful to the craft that you’re in, where words matter, but then realize that they don’t matter as much. And then also focus on the fact that the actions are actually what’s going to matter more than the words. And so, you have a difference between leaders and politicians. Politicians are there to say the right thing and to hold the power. Leaders are people who are willing to do things that will be transformational, from my perspective.

(01:27:38)
And so, when I would think about diplomacy, words without actions, or without the threat of actions, don’t carry much weight. And that was something that President Trump did very well: people knew that he was willing to take action, and he was very unpredictable in how he would act. And that made our words much more effective in what they did. So it’s all a combination.

(01:27:59)
But coming from the private sector, we are all about results. If you’re in government, you can work on something for 10 years and fail and then retire, and they consider you an expert. In the private sector, if you work on something for 10 weeks and you don’t have a success, then you’re unemployed. So it’s a different kind of notion. And it was just understanding the mentality and trying to adjust and bridging the divides between the different trainings.

Government bureaucracy

Lex Fridman
(01:28:25)
Is that the biggest thing you took from your business background, to just be really results focused?
Jared Kushner
(01:28:31)
It was just the only way to be. If I was giving up a nice life in New York, and if I was giving up the stuff that I really enjoyed, the company that I’d helped build and the life that I was enjoying in order to do government, I was going there to make a difference and we had to focus on it.

(01:28:48)
The other skillset, so there were a couple of skill sets that I found were quite deficient in government. First of all, there were a ton of amazing people. People talk about the bureaucracy. What I found was that you had incredibly committed, passionate, intelligent, capable people all throughout the government. And what they were waiting for, though, was direction and then cover in order to get there.

(01:29:12)
And so, there were a lot of tasks that I worked on, whether it was building the wall at the southern border, where I was able to work with Customs, Border Patrol, Army Corps of Engineers, military, DHS professionals, DOD, and we basically all came together. And then once we had a good project management plan, we were able to move very, very quickly. I think we built about 470 miles of border barrier in about two years, basically. And that worked very well because we basically brought a private sector project management skillset, which is quite often missing in government.

(01:29:49)
The second one is just, we spoke about negotiation earlier. I would say that for most people in government, it’s just a different form of negotiation than you see in the private sector, and way less effective in that regard. Which is why I think it’s good the more we can encourage people with private sector experience to do a stint in government and to really try to contribute and serve their country. That’s how our founders did it. George Washington and all the founding fathers, they were working on their farms. They left their farms, served in government, then they went back to the farm.

(01:30:18)
And that was kind of the design of the representative government. It wasn’t a career political class, it was people coming in to show gratitude for the freedoms and the liberties that they enjoyed, and then do their best to help others have those same opportunities that they had, and then they’d go back and live their lives. So I think there’s a lot of opportunity for our government if people with more business mindsets, who are going to think about things from a solutions perspective, go and serve.
Lex Fridman
(01:30:48)
Is that one of the main problems here? So you also mentioned the book The Great Degeneration by Niall Ferguson, an awesome historian. He’s been on this podcast. It helped you understand the inefficiencies of government regulation. I’d love it if you could give an insight into why government is so inefficient at times. When it is inefficient, when it doesn’t work, why is that the case? The bureaucracy that you spoke to, the negative aspects of the bureaucracy.
Jared Kushner
(01:31:17)
So we don’t have enough time on this podcast to go into it, but it’s… Look, there are a lot of aspects that work as well. But I do think we’ve gotten too big. Niall’s book that you mentioned, one of the things that I took from that, I read it, I think, in 2012, right kind of in the middle of the Great Financial Crisis, was he was talking about how government regulation often was put in place to deal with old crises. It was never going to solve future problems, it was more to solve for problems that had happened in the past. And I remember thinking about that.

(01:31:48)
One thing I was very proud of, of the work of the Trump administration, was that you had four years consecutively where there was a net decrease in the cost of regulations. So to give you context, in the last year of Obama in 2016, there were 6 million man-hours spent by the private sector complying with new federal regulations. And that’s not really what the intent of our government was, right? If we have rules or regulations, those should be legislated by Congress. They shouldn’t be put in by bureaucrats who are basically saying, “I want to follow this objective,” so using the power of the pen in order to do that.

(01:32:21)
So the deregulatory effort was actually very critical to Trump’s economic success that happened in the beginning of the administration. And then what I saw with regulation was anytime either there was legislation or regulation coming, the people pushing for it were usually the people who would benefit from the regulatory capture. You look at the Great Financial Crisis, where you had these big banking reforms. Well, what happened during the big banking reforms? You had a big reduction in the number of banks, and the big banks became even bigger. I don’t think that was the intention of the legislation, but the people who were writing the legislation and influencing it had a lot of the constituencies from those larger institutions.

(01:33:01)
And then what happened as a result of that? A lot of these smaller institutions didn’t have the ability to be as competitive. They had more restrictions, more costs, they became less profitable. But these were the banks that were serving small business, which is the biggest creator of jobs in our country. And then as a result, the bigger banks got more powerful and what happened in the country as a result of the regulations that they put in place? The wealth gap in the country grew, it didn’t shrink.

(01:33:27)
And so, I think oftentimes, whatever they say these regulations are intended to do, the result often becomes the opposite. And so, what President Trump and his administration did was a massive deregulatory effort. And I think they pledged that for every one regulation they put on, because you do need some regulation in an economy and in a society, they would take off two. And in the first year, they eliminated eight regulations for every one.

(01:33:54)
So that was just something I took from it, which was, I thought, very interesting. And you had to really, I think you just have to think through what are the consequences going to be of the different actions you take? And often, government gets it wrong by taking an action that feels right, but has big negative consequences down the road.

Accusations of collusion with Russia

Lex Fridman
(01:34:11)
Let’s go to some difficult topics. You wrote in the book about your experience with some very low points in government. You’ve been attacked quite a bit. One of the ones that stands out is the accusations of collusion with Russia. And you tell in the book, in general, this whole story, this whole journey, on a personal level, on a sort of big political level. Can you tell me some aspects of this story?
Jared Kushner
(01:34:39)
Sure. So to give the listeners some context, and people may not remember this now, it’s been kind of swept away because it turned out not to be true: after President Trump won the election in 2016, instead of the media saying, “Oh, we were wrong,” because again, everyone thought he had zero chance of winning, they said, “Okay, well, we couldn’t have been wrong. It must have been the Russians who worked with him.”

(01:35:03)
And so, at first, when this started coming up, I thought this was ridiculous. I was very intimately involved with the operations of the campaign. I was running the finance of the campaign. I was running the digital media of the campaign. I was running the schedule for the campaign. And I knew that on most days, we had trouble working, coordinating with ourselves, let alone collaborating with another government and colluding, as they called it. And so, we did a great job, I think, as an underdog campaign, very leanly staffed. And then they said that we were working with the Russians.

(01:35:38)
And so, at the time, I didn’t take it too seriously because I knew there was no truth to it. But it was amazing to me to start seeing all of these institutions, whether it was CNN, the Washington Post, New York Times, these were news organizations that I grew up having a lot of respect for, taking these accusations so seriously. And then working themselves up in order to just cover it for two years. Then as a result, you had a special counsel, you had a House investigation, a Senate investigation.

(01:36:07)
And I personally spent, I think, over 20 hours just testifying before these different committees. Again, spent millions of dollars out of my own pocket on my legal fees to make sure I was well-represented. And the reason I did that was because I saw in Washington, it was like a sick game. It’s almost like even though there were no underlying problems to the accusation, I felt like this is one of those things where they’re going to try to catch you. And then if you step on the line, if they catch you with one misrepresentation, they’re going to try to put you in jail or worse. And so, for me, that was a big concern.

(01:36:43)
And it was amazing. My poor mom, I told her to stop reading whatever. I said, “Mom, I promise you, we didn’t do anything wrong. It’s good.” But she’d call me and say, “Well, our friends on the Upper East Side were talking with Chuck Schumer, who says, ‘Jared’s going to jail. We know for sure that he colluded with the Russians.’” And this is a leading senator saying things like this.

(01:37:00)
And so, it was just interesting for me to see how the whole world could believe something and be talking about it that I knew, with 1000% certainty, was just not true. And so, seeing that play out was very, very hard. Obviously, I was accused of a lot of things. There were times in Washington, I was radioactive. I remember one weekend it was all over CNN. They had panels on CNN, the news organization that I grew up thinking was the number one trusted name for news in the world, talking about how I’d committed treason, because I met with an ambassador and said, “We’d like to hear your perspective on what you think the policy should be in Syria,” where there was a big civil war happening and ISIS, and a lot of different things.

(01:37:45)
So it was quite a crazy time in that regard. But luckily, again, we were able to fight through it. It was a major distraction for our administration. And I think we were able to kind of stay focused on the objectives and the policies. But it was a crazy time, and I learned a lot from that experience.
Lex Fridman
(01:38:00)
It’s crazy how just an accusation can go viral and can just take off. One of the things that worries me is the effect on your mind, the psychology of it, to make sure it doesn’t make you cynical. For people that are trying to do stuff, those kinds of stories can destroy their mind. So one of the things I’d love to sort of understand: you, who kind of rolled in from the business world, and all of a sudden, the entire world, from CNN to everybody, is accusing you of colluding with the Russians. When you’re sitting at home, how do you keep a calm mind, a clear mind, an optimistic one, that doesn’t become cynical, and actually just keep trying to push on and do stuff in the world?
Jared Kushner
(01:38:39)
Yeah. It was a surreal experience. I would say number one is I felt very confident that I hadn’t done anything wrong. So I’d always tell my lawyer, “The good news is I don’t have a fact problem.” I need a good lawyer to get me through it, but it’s much easier to be a good lawyer if you have a very innocent client. And so, the fact that I knew that I didn’t believe that I had any legal liability helped me kind of-
Jared Kushner
(01:39:03)
… That I had any legal liability helped me intellectually separate the challenge I needed to do to fight through it, from it. And then I just basically said I’d had hardship earlier in my life where I dealt with the situation with my father. And what I realized there is that you can’t really spend energy on the things that you don’t control. All you can do is spend your time and energy worrying about what you can control and then how you react to the things that you have there. And so it took a lot of discipline, it took a lot of strength. And again, I give my wife Ivanka and even Donald, a lot of credit for having my back during that time and encouraging me just to fight through it.

(01:39:42)
And then I also had to make sure that I didn’t allow that to distract me from my job. I felt like I had an amazing opportunity in the White House to make a difference in the world. And if I would’ve spent all my time playing defense, in politics, it’s a time duration game. In business, you have whatever duration you set for yourself, in politics, it’s time duration. We had four years. Every day was sand through an hourglass. My mindset was, I need to accomplish as much as I can in these four years. And I guess the traditional game that’s played in Washington is whether it’s the media, the opposition, their job is to distract you and then try to stop you from being as successful as you want to be. And so just fought through it.

(01:40:19)
And it wasn’t always fun, but we got through and thank God it’s something people don’t talk about. And it has been amazing to me just the lack of self-awareness and reflection of a lot of the people who hyped this up for two years. They don’t think there was anything wrong with it. And that’s interesting, but my view is, we got through it, it’s good. So it’s in the past and then I started moving to the future and that’s really where I spent my time.
Lex Fridman
(01:40:46)
Yeah. But I want to linger on it because to me, that has a really discouraging effect on anyone who’s trying to do positive things in the world. These kinds of attacks are intense. You say one of the lessons you learned is that you really have to be perfect, but I hate for that to be the lesson. I feel like you should be able to do stupid stuff, take big risks, and people celebrate the big risks and not try to weave gigantic stories over nothing. I just want to understand the two aspects of this: one is how to keep such stories from having so much legs, and the other is how to stay psychologically strong. So you waved it off, that you didn’t have a fact problem, but it can still have an effect on your psyche. You seem to be pretty stoic about the whole thing, but just on the psychology side, how did you stay calm and not become cynical, where you can continue to do stuff and take big risks?
Jared Kushner
(01:41:47)
I didn’t have a choice.
Lex Fridman
(01:41:48)
What do you mean?
Jared Kushner
(01:41:49)
I mean I could have spent every day feeling sorry for myself or complaining or saying things aren’t fair. But the general way I looked at it was that in life, every opportunity has a cost. And you could look at it and say maybe this was a massive cost, either in dollars or in time or in reputation or in emotional drain. But you could also say that I had an opportunity to work in the White House and I had an opportunity to work on some of the hardest challenges. And you talk about how that’s not celebrated, that is something very different. In the private sector, when you take on big challenges, that is celebrated. In government, when you take on big challenges, people want to see it fail or they want to criticize those people who are trying to take that on. And I think that’s wrong.

(01:42:33)
And I think that as a country, we should be thinking big. We should be dreaming big, and we should be encouraging our politicians to try and to fail more and to go take on big things knowing that there’s risk of failing. Obviously, we want them to succeed, not to fail, but let’s take on the big things. Let’s try to do that. So I think it’s just very basic that you’re in a situation. I’ve made decisions. I can’t go back and change decisions in the past. I still felt very blessed to be in the position I was in, and I knew that I just had to work through it. Like I said, I was very lucky to have support from my wife and from my family and from good friends.

(01:43:09)
Again, I think I’d chosen very good friends in life and my friends were with me. I had one friend who, at my lowest moment, he lived in Arizona, got on a plane and came just to have dinner with me to say, “Just pick your head up. I know you’re down now, you’re going to be fine. Just fight through.” That meant a lot to me. And again, I always think in my life, you don’t learn as much from your successes. You don’t learn as much from your high points. You learn the most about who you want to be and how the world works from your lowest moments. And at those lowest moments, it made me better and it taught me how to be a better friend to people who are in tough situations. And I tried to just get tougher and I tried to just get better and work through it.

Ivanka

Lex Fridman
(01:43:50)
Yeah. You said that you and Ivanka, this intense time brought you two together and helped you deal with the intensity, with the chaos of it all.
Jared Kushner
(01:44:01)
So I think it was just number one, knowing that you had a partner and knowing that you had somebody who loved you and believed in you. I think that was definitely by far the biggest of anything. And-
Lex Fridman
(01:44:10)
Love is the answer.
Jared Kushner
(01:44:12)
Love is very important. But then there’s also a lot that I’ve learned from her, always getting me to read different books or learn different things, which I love. But she’s also, I think, an amazing role model. And I go back through our time in Washington where there were so many people who were, I thought, very nasty to her, unfoundedly. And I’m not talking about individually, because again, most people who interacted with her were super kind. But I would see people on Twitter or different places go after her, and she always stayed elegant. I felt like that was something: she never stooped down to a lower level. She kept her elegance the whole time and she really went to Washington wanting to be a force of good. And I see all the time that she follows her heart, she does what’s right and she has a very strong moral compass. And I feel very lucky to have her as a partner. And I respect her tremendously.
Lex Fridman
(01:45:08)
Yeah. She walks through the fire with grace, I would say. And she’s recommended a bunch of amazing books to me and she has an incredible, fascinating mind. But one thing that jumped out to me is you both love diners, Jersey diners. So I lived in Philly for a while and I traveled quite a bit, and traveling from Boston down to Philly, maybe to DC, you can drive through Jersey. There’s something about Jersey. I don’t know what it is.
Jared Kushner
(01:45:35)
It’s the best. It’s the best.
Lex Fridman
(01:45:36)
You listen to Bruce Springsteen. Louis C.K. has this bit, I think it’s part of criticizing cell phones today, where people are too much on their phone. They don’t just sit there, be bored. But he uses that to tell a story where he’s just driving and a Bruce Springsteen song comes on, and he just wants to pull over to the side of the road and just weep for no explainable reason. I think that’s true because life is difficult. Life is full of suffering or struggle or challenges. So sometimes, it doesn’t have to be Bruce Springsteen, but some song like this can really make you reflect on life, that melancholy feeling. But that melancholy feeling is the other side of the happiness coin where, if you just allow yourself to feel that pain, you can also feel the highest joys. That’s sort of the point Louis C.K. makes.

(01:46:29)
And there’s something about Jersey with the diners, often late at night… There are several diner experiences, I should say. There’s the family friendly one, where there’s a nice waitress and there’s a sweetness, a kindness, like “hello sweetheart,” that kind of thing. There’s also the 3:00 AM diner, the ones that are open 24 hours, that has a romantic element when you’re a young man or young woman, you’re traveling. The loneliness of that, it’s all of it. The American diner, from Jack Kerouac on, represents something. I’m not sure what that is, but it’s a real beautiful experience. And the food itself too.
Jared Kushner
(01:47:09)
Oh, always fresh. Yeah. The thing with diners, there’s so much to love about it. And I grew up, obviously, in New Jersey; when I’d go with my father to business, he’d always stop. We’d eat at a diner. Late night I’d come back with my friends, we’d stop at a diner. And it’s a tradition that Ivanka and I love doing as well. And I think there’s a notion that it’s very egalitarian in that people from all places are there. You could order basically whatever you want. The menus at the diners look like the phone book.
Lex Fridman
(01:47:38)
Yeah, it’s great.
Jared Kushner
(01:47:38)
And it’s amazing how they keep so many fresh ingredients to do it, at least the good ones do. I love, as a Jersey guy, that you can get mozzarella sticks and an omelet at any hour of the day because most of them are open 24 hours. And that’s basically my and Ivanka’s go-to; we’ll throw in a milkshake or two as well. But for me as a kid, my father would take me, sometimes I’d sit with him in the meeting, sometimes I’d be at the table next to him. He’d give me a bunch of quarters to put in the music machine that they would have on the wall. And it was always just a great experience doing it.

(01:48:08)
I joke that if you grew up in Jersey, you grow up with just enough of a chip on your shoulder that you have to go and make something of yourself in life. It’s a special place. I had an amazing childhood there and I’m very, very proud to be from the state. And I will just give a little bit of a plug now, because the state has now actually turned the corner and they had a $10 billion budget surplus. For many years, it was a state that was basically bankrupt, and now, actually, under a pretty progressive Democrat governor, Phil Murphy, he’s turned the state around and it actually has a very bright future ahead. It’s probably one of the best places to raise a family in the country. It’s got very low crime, one of the best public school systems in the country, a pretty good healthcare system, a lot of green parks. People know the Turnpike, but it’s got a lot to it. That’s really great. So I’m a big, big fan of Jersey.
Lex Fridman
(01:48:57)
I like how this is a first for this particular podcast, you literally gave a plug to a state. So New Jersey everybody.
Jared Kushner
(01:49:06)
It’s where it’s at.
Lex Fridman
(01:49:07)
There’s South Jersey, there’s North Jersey. There’s all kinds of Jerseys too. The whole thing, it just…
Jared Kushner
(01:49:12)
And don’t get me started on the Jersey Shore, Lex.
Lex Fridman
(01:49:16)
Jersey Shore is a whole thing.
Jared Kushner
(01:49:17)
And I’m not talking about the Snooki part, I’m talking about the real nice parts, really great food, great people.
Lex Fridman
(01:49:21)
What do you mean nice parts? It’s all beautiful. The full range of human characters that are in New Jersey are all beautiful.
Jared Kushner
(01:49:29)
I agree with that.

Father

Lex Fridman
(01:49:30)
And every time I travel across the world, there’s always somebody from New Jersey to meet, and you give a nod of deep understanding. It’s the cradle of civilization in many ways. Okay, so back, I don’t know how we got there. Oh, all right. Going back to the low points, you mentioned your father, if we could just return there. Even just the personal story of your father that you write about, all the betrayal that happened in his life and then how he responded to that betrayal, and then he was arrested after that. Can you just tell the story?
Jared Kushner
(01:50:05)
Sure. So my father is an amazing person and we grew up in New Jersey. My father was a big developer, a great entrepreneur, built an amazing business. He got into a dispute with two of his siblings, and through that dispute, they basically took all of the documents in his company, went to the US attorney’s office and turned it from a civil dispute into a real public dispute. My father did something wrong in that process. And when he got arrested for that, he basically said, “You know what? What I did was wrong.” And he took his medicine and he did it like a man. And he said, “I’m going to go to prison.” And he did that for a year. And so for me, that was a very challenging time in the family. Obviously, it was a shock. It was a total change.

(01:50:57)
My childhood was, I think, a very nice childhood. My parents always said, “Do good in school, work hard.” I was very focused on my athletics. I was captain of the basketball team, assistant captain of the hockey team. I ran a marathon with my father and it was always about pursuing. Went to Harvard, graduated with honors, and then was at NYU pursuing a law degree and a business degree. And I was working at the Manhattan District Attorney’s office at the time, actually thinking I wanted to go into public service, because my father always taught us, we were always surrounded by politicians, and he always said, “My parents came to America. They lived in the land of opportunity and they had these opportunities because this is the best country in the world. So you should be successful. Work hard, don’t ever let your opportunities become your disadvantages, because you have advantages in life. You have to work harder.” And that’s what he instilled in myself and my brother. And he always pushed us to make the most of ourselves. And when we did that…

(01:51:57)
Everything changed overnight when my father got arrested. Obviously it’s very embarrassing for a family when you’re on the front page of the papers. I would see the newspapers writing all these things about my father that I didn’t think were representative of the person that I knew. It was a big change for our family. And I was angry. I was angry. I said, “I could be angry at the prosecutor, I could be angry at my father’s brother. I could be angry at my father’s lawyers. I could be angry at my father for making this mistake.” And then I said, “That’s not going to change anything.” And I had a real shift. And I do think that that was a turning point in my life where I basically said, “Let me focus on the things I can control. Let me focus on the positive things I can do.”

(01:52:42)
And from that moment forward, I said, “How can I be a great son to my father? How can I be a great older brother/substitute father for my two sisters and my younger brother? How could I be there for my mother? How could I be there for my father’s business?” And I just went into battle mode and I put my armor on and I just ran into it. And for the next two years, every day was painful. I was dealing with banks, the company still had subpoenas, I was still in law school. I’d tell my father I wanted to drop out of law school and business school, but he said, “Please don’t.”

(01:53:12)
So I would basically go to law school one day a week or maybe I’d skip it most days and I’d go to his office every day. And my friends would joke that if my professors wanted to fail me, the law professor would have to give me a test that had four pictures and say, “Circle who your professor is.” But I would basically take a week off, I’d read the books and I did well and I got my degrees. And it was just a very, very challenging time.

(01:53:34)
But like I said to you before is that you learn the most about life and you learn the most about humanity and yourself when you’re in your most challenging periods. And I’ll say that that experience also changed the people I interacted with, spending weekends with my father down in a prison in Alabama, I met the other inmates, I met their families. I spent time then trying to advise the children of other people who were going through the same experience that I’d gone through on how to navigate it correctly. And you just learn a lot about the world and you see that in life, everything could get taken from you, your status, your money, your friends. I saw that certain people were very disloyal to my father at the time, who he thought were friends. It was only a handful. But again, I learned from those people, how can I be a true friend to people? How can I be better? And I learned a tremendous amount through that experience.
Lex Fridman
(01:54:29)
You write that your father told you about being humble. I’d love to ask you about this, that in life sometimes we get so powerful that we start to think we’re the dealers of our own fate. We’re not the dealers, God is the dealer. Sometimes we have to be brought back down to earth to get perspective on what is really important. What do you think he meant by that? What did you learn from that experience?
Jared Kushner
(01:54:51)
The way I interpreted it at the time, and those were very, very memorable words, and it occurred… After I picked up my father from the arraignment, I drove him down. I drove the car, and my father and I are very, very close. And he didn’t say a word for the whole time. And I think he was processing, number one, what was happening to him. And I couldn’t even imagine. But I actually think the bigger pain for him, because my father is such a committed person to the family, is like, did I let my family down? Did I let my kids down? And I do think he felt at that moment like his life was over. He couldn’t really see past what this challenge was going to bring and if there would be a life for him after it. So I could see that he had a lot of fear and he really wasn’t saying much. And then I didn’t know what to do.

(01:55:36)
And so I just stood by him and stood close. And later that day or the next day, he got up and started walking. He had an ankle monitor. For whatever reason, the prosecutor was so aggressive, claiming he was a flight risk, that they made him wear an ankle monitor. They were very, very aggressive and nasty. And at the time, my father was the biggest donor to Democrats. The prosecutor was a Republican. It was a very political thing. And what happened was, he was walking around the pool and I just started walking with him, and he said to me, “Jared, in life sometimes, we get so powerful that we believe that we’re the dealer.” He says, “But we’re not the dealer. God’s the dealer. And we have to come down to earth to understand,” like you said. So what I took from that was that my father, with all of his success, had started to believe that maybe certain rules didn’t apply to him. And I think that that’s where he made a mistake. And I think he had a lot of regret that he made the mistake. And my father is a very humble person. He’s a very moral person. For me, with my humility, my brother and I joke that we give our credit for being humble, number one, to being Mets fans, because every year you have a lot of promise and then it never ends up paying off. Although now with Steve Cohen, hopefully we’re on a different trajectory.

(01:56:49)
But the other thing is also our mother. Our mother really raised us to be very humble. We knew we had a lot, but every Sunday morning my mom was there clipping the coupons. The cereal we ate in our house was based on what was on sale versus what we liked. When we would have a problem with our teachers in school and I’d say, “Well, the teacher doesn’t like me,” she’d say, “Well, I’m not calling them. It’s your job to make the teacher like you.” And so my mother gave us a lot of that. My father gave us a lot of the grounding. And I think during that time, my father was just realizing that maybe he had gotten disconnected from the grounding and the values.

(01:57:32)
And again, I think he also accepted it; maybe he could have blamed others for acting inappropriately. But I respect the fact that he took responsibility himself and said, “I can’t control the actions of other people. I can’t control whether what they do is right or wrong. I can just control my actions.” And as I go on the next journeys in my life and I go to government, I go to Washington, I even think through the craziness of going from visiting my father in a prison to 10 years later sitting in the office in the White House next to the President of the United States. And I think about that story and that it’s a story that only God could write. And I really believe that you have to have a lot of faith, because the lows and the highs are both so extreme and unbelievable that I feel like those low moments in some ways allowed me to keep my grounding and to understand what was truly important in life for when I ended up going through those other moments.

Money and power

Lex Fridman
(01:58:28)
Your father was betrayed, perhaps over money by siblings. Is there some deeper wisdom you can draw from that? Have you seen money or perhaps power cloud people’s judgment?
Jared Kushner
(01:58:41)
Oh, 100%. 100%.
Lex Fridman
(01:58:43)
Is there some optimistic thing you can take from that about human nature, of how to escape that clouding of judgment, when you’re talking about leaders, when you’re talking about government, even business? Because you mentioned there are always power dynamics at play when you’re negotiating. Is there a way to see the common humanity and not see the will to power in the whole thing?
Jared Kushner
(01:59:13)
Definitely. You mentioned power and money corrupting. There’s a great quote I heard from a friend of mine, a guy named Michael Harris, who was one of the founders of Death Row Records, and he was being interviewed recently and they asked him about what happened with Suge Knight. And his line was, “Money just makes you more of what you already are,” which I thought was a very elegant way of saying it. And I would see this time and time again in the White House, where you had people who were now given a lot of responsibility and power and it went to their head, and they acted very crazily and maybe didn’t act in a way that I thought was always conducive to the objective.

(01:59:53)
So I think it’s a very big problem that you have, whether it’s something that’s solvable, I think it’s about having the right leaders and hopefully for the leaders, having good friends. I’m still friends with a lot of the people I interacted with when I was in government, and the number one thing I try to be to them is just a good friend. I try to be somebody who they can talk about things with. I don’t go in trying to tell them what to do on different things. And I think that that’s a big thing is that people just need friends and they need conversation. And if they have that, then hopefully, that allows them to keep their head in the right place.
Lex Fridman
(02:00:28)
I think this is a good place to ask about one aspect of the fascinating work you’ve done, which is on prison reform. Can you take me through your journey of helping the bipartisan bill get passed? Just working on prison reform in the White House in general, how you made that happen, how you helped make that happen?
Jared Kushner
(02:00:47)
Sure. So we passed a law called the First Step Act, which was the largest prison and criminal justice reform bill that’s been done in maybe 30, 40, 50 years in the US. And so what it basically did was two things. Number one is it took the prison system and it took a certain class of offenders and allowed them to become eligible for earlier release if they go through certain trainings that will allow them to have a lower probability of going back. Stepping back, you look at the prison system, you say, “What’s the purpose? Is it to punish? Is it to warehouse? Is it to rehabilitate?” And I do think that we’re a country that believes in second chances. I saw firsthand, when my father was a client of the system, how inefficient it was and how much better it could be.

(02:01:36)
And when my father got out, we didn’t run from that experience. He started hiring people from Rikers Island and different prisons into the company through a second chance program, which we’re very, very proud of doing. And what we saw through our micro experience was that people need mentorship and job training; a lot of people have addiction issues and they can’t find housing. And so people leave prison with a criminal record, and they’re less able to go back and reintegrate into society without help from different institutions that can help them do that. So we modeled the reforms off what they did in Texas and Georgia and other states, where they basically put a lot of job training, alcohol and addiction treatment programs in the prisons as a way to incentivize the prisoners to work on themselves while they’re there in order to allow them to reenter society.

(02:02:32)
It’s turned out to be very successful so far. They just had a report that showed that the general population has had a 47% recidivism rate, meaning that of people who leave federal prison, about half of them go back. And of people who have now taken this program, only 12% of them go back. So number one, you’re making communities safer, because if people are going to now get a job and enter society instead of committing future crimes, you’re avoiding future crimes. And number two, you’re giving people a second chance at life. And so that was the first part of it. The second thing we did was there was a rule passed in the ’90s that basically penalized crack cocaine at 100 times the penalty of what regular cocaine was. And I think a lot of the motivations, what people say in retrospect, was that crack was more of a black drug and cocaine was more of a white drug.

(02:03:22)
And so there was a real racial disparity in terms of what the application of these sentences was. So they then revised that to make it 18 to one. And what we did in this bill is we allowed it to go retroactive, to allow people who were in prison with sentences under what we thought was the racist law to be able to make an application to a judge in order to have their sentences dismissed. And it was based on good behavior, being rehabilitated and the fact that they would have a low probability of offending in the future. And so that was really the meat of it. And there were a couple other things in there we did as well, which were also quite good. So we did it. Worked very closely with the Democrats, Republicans to do it. At first, President Trump was a little bit skeptical of it because he’s a big, strong law and order supporter, but he made me work very hard to put together a coalition of Republicans and Democrats and law enforcement.

(02:04:15)
We had the support from the policemen, we had the support from the ACLU and ultimately, we were able to get it together. And it was an amazing thing. We ended up getting 87 votes in the Senate. This happened for me at a time while the Russia investigation stuff was still happening. New chief of staff came in, John Kelly, he basically marginalized me in the operations. So I had less day-to-day responsibilities in the White House. And so for me, this effort became one of my full-time efforts along with negotiating the Mexico trade deal and along with the Middle East efforts. And the reason why that was great was because it didn’t have a lot of support from the Republican caucus originally, and people thought there was no way it would happen. So I really was able to be the chief executive, the middle executive, the low executive, the intern.

(02:05:06)
And through that process, I really got an education on how Congress works, on how to pass legislation. I was negotiating text, I was negotiating back and forth, and I built a lot of trust. Again, whether it was Hakeem Jeffries or Cedric Richmond that I was dealing with, we built a lot of trust. We’d speak three times a day. These guys had my back. The ACLU, again, I never thought we’d work together; they were suing our administration every day or every other day on something. But for whatever reason, we built trust and were able to work together. And then also with the real conservative groups, because there was a big part of the conservative base that felt like we should be giving people a second chance. And in addition to that, this will keep our country safer and it’ll reduce the cost of what we spend on prisons. And so it was a great effort and I was very, very proud that we were able to get it done under President Trump.
Lex Fridman
(02:05:53)
How’d you convince the Republicans? So they were skeptical at first? Are we talking about just phone conversations? Going out to lunch? Just back to the emojis or what?
Jared Kushner
(02:06:03)
Hand-to-hand combat, meetings. The cool thing about this… I always get frustrated when I hear a lawmaker say, “Oh, the Senate’s not what it used to be, or Congress isn’t what it used to be. Things are broken today.” I don’t think that’s true. I think, going through the process, that our founders were totally genius in the way that they designed our system of government. And what I saw is you just have to work it so everyone knows the power of their vote. Some would give it to me easily, some wouldn’t give it to me easily. Some would trade it for other things, some would withhold it because they were pissed about other things, and it was just hand-to-hand combat. So it was just making calls using the phone, walking the halls, going to lunches, hosting dinners at my house. It was a nonstop lobbying effort. And by the way, it was also adjudicating issues and making people feel like they were heard, hearing their issues, and then trying to find solutions so that you don’t put something in that then tips it to where you lose a whole coalition.

(02:06:58)
So it was really a balancing act, but it was an amazing thing and I worked very closely on that with Van Jones and Jessica Jackson, who also gave me a lot of help on the left. And it was an amazing thing. Had a great team too.

Trust and betrayal

Lex Fridman
(02:07:11)
So you mentioned the importance of trust at the very beginning of the conversation. From the outsider perspective, just maybe a dark question, which is, how much trust is there in Washington? The flip side of that, how much backstabbing is there? Can you form long-term relationships with people on a basic human level where you know you’re not going to be betrayed, screwed over, manipulated for again, going back to the old money and power?
Jared Kushner
(02:07:48)
The answer is yes, and the answer is no. So I made some incredible friends, lifelong friends through my time in Washington, but the way I think about it from politics and I think in geopolitics as well, is I would say that politicians really don’t have friends. Politicians have interests. And as long as you follow that rule, you should be able to know how to rate where your relationship with a given person falls in the spectrum. But I do think I was the exception. I did make some tremendous friends. And again, I’d go back to what I said about negotiation where, when you’re in a situation where there’s really nothing in it for any of you personally, but you’re in a foxhole together and nobody in Washington can get anything done by themselves. So you have people coming from all different backgrounds, all different experiences, all different geographies coming together, agreeing on an objective, creating a plan, and then every day rowing together in order to get it done. It’s a beautiful thing and you really learn what people are about.

(02:08:48)
And so when you go through an experience like that, you learn who’s in it for themselves. You learn who’s in it for the cause. And for everything you read about in the press of a fight I had with somebody because we were at odds, I have about 100 people who have become lifelong friends, because I respect the way that, when we were under fire together, they got better, they were competent, and they were there to serve for the right reason.

(02:09:11)
And so I guess the answer is yes, it is possible. You have to be careful because there are a lot of mercurial people there. I always say the politicians are like gladiators. I didn’t have as much respect for politicians till I got there. But if you think about it, everyone who’s got a congressional seat or a senate seat, there’s 25 people back at home who want their job, who think they’re smarter than them, who are trying to back stab them. And so I always say that the political dynamic, it’s like in the private sector, you’re standing on flat ground. You choose which fights you take on. When you take them on, how you fight them.

(02:09:46)
In politics, it’s like you’re standing on a ball and what you have to realize is that there’s maybe 10 things that you have to do, but there’s a potential cost to taking on each one that might destabilize you. You fall off the ball and then you lose your opportunity to pursue those. You have to always be marking everything to market and going through your calculations to make sure you can accomplish what you want to without falling off the ball and losing your opportunity to make a difference.
Lex Fridman
(02:10:14)
I guess people like power, and I just feel like to be a good politician, good meaning good for humanity, you have to be willing to let go of power. Try to do the right thing. If there’s somebody back home that does manipulate stuff, screws you over and takes power from you, it’s okay. I feel like that kind of humility is required to be a great leader, and I feel like that’s actually a good way to have long-term power because karma has a viral aspect to it. Just doing good by others, I feel like is-
Jared Kushner
(02:10:53)
I’d like to say that’s true, Lex. I think it’s just way more complicated. You look what happened this week with Kevin McCarthy, right? He did what he thought was morally right. He thought he did a bipartisan deal. He was told that they would have his back, and then the moment things got tough, they cut him loose. So again, I don’t know if that was the right thing or the wrong thing, right? I’ve also seen leaders on the other end say, “I’m going to do things that are short-term, more selfish.” But the way they justify it to themselves is to say, “I believe that myself staying in power is existential to the greater good. So I will do things that maybe are not in the greater good now because I believe that my maintaining power is.” And so it’s complicated. In an idealized world, I’d love to believe that’s the case, but it’s just way more complicated than that.
Lex Fridman
(02:11:46)
Yeah.
Jared Kushner
(02:11:48)
I wish it wasn’t, but it is.
Lex Fridman
(02:11:50)
Yeah. I do just wish people in politics zoomed out a bit and just ask themselves, what are we all doing this for? Sometimes you can get a little bit lost in the-
Lex Fridman
(02:12:03)
You know, sometimes you can get a little bit lost in the game of it. If you zoom out, you realize integrity is way more important than little gains in money or little gains in power in the long term, just when you look at yourself in the mirror at the end of the day. And also how history remembers you. I just feel like people do some dark stuff when they’re in that moment when they’re losing power and they try to hold on a little too hard. This is when they can do really dark things, like bring out the worst in themselves. It’s just sad to see, and I wish there was a kind of machinery of government that would inspire people to be their best selves in their last days versus their worst selves.
Jared Kushner
(02:12:49)
When that system gets invented, you’ll share with me what it is, but it’s… Look, let me give you another way to frame it, and this was kind of the revelation we spoke about before, when I was getting my butt kicked by the Russia investigation and all the different areas. But the basic framework I looked at was I said, “Okay, this all feels tough.” But I said, “The game’s the game. The game’s been here way longer, way before I came, and it’ll be here way long after I leave, and so I have two choices. I can complain that the game’s tough, it’s not fair, it’s not moral, or I can go and I can try to play the game as hard as possible.” And I think that there’s two different things. You have people who are willing to kind of sit in the stands and they’re willing to yell at the players or make their points known, or you have people who are willing to suit up and get in the arena and go play.

(02:13:40)
And I have a lot of respect for the people who suit up and go play. Again, some of them I wish would play for different means, but the fact that they’re willing to put their name on the ballot, make the sacrifice, and go put on the pads and get hit and hit others, I think that you need those people. And I wish more people who had maybe the moral wiring that you discussed would be putting on a helmet and going to play, because it’s hard. It’s hard.
Lex Fridman
(02:14:04)
I agree with you. I just would love to fix the aspect of the Russia collusion accusation, the virality, the power of that, because that’s a really discouraging thing for people. Maybe it’s the way it has to be, but it seems like a disincentive for people to participate.
Jared Kushner
(02:14:20)
It is, but I’ll give you, again, an optimistic side of it, which is that what you’re seeing now with social media, with what’s happening at X, is I do think there is now more of a reversion towards egalitarianism of information. And so for many years the media publications were the gatekeepers, and then you had these social media companies that grew. They became so powerful, but then they were tilting the scales. Why they were doing it, we can go through long explanations for that, but if there truly is a real forum and a democratization of information, then you would think that the marketplace of ideas would surface the real ones and discredit the non-real ones. And I think that as a society, we’re starting to kind of come to grips with the fact that the power dynamic is changing and that some of these institutions that we used to have a lot of faith in don’t deserve our faith. And some of them will actually reform and maybe re-earn our faith, so I think that there could be an optimistic tone.

(02:15:22)
Again, the years of Trump, I think that he was an outsider and he represented something that was existential to the system. You think about the 30 years before: you were either part of the Clinton dynasty or the Bush dynasty. I think a lot of people in the country felt like that whole class, whether you’re wearing a red shirt or a blue shirt, wasn’t representing them, and Trump represented a true outsider to that system. I do think that as he went in there, there were a lot of norms that were broken to try to stop him from changing the traditional power structure. So I think that we’re at a time where maybe there will be an optimistic breakthrough, where you’ll have institutions that will allow for a lot more transparency into what truth really is.

Mohammed bin Salman

Lex Fridman
(02:16:12)
I’d love to go back and talk to you about the Middle East, because there’s so many interesting components to this. Let’s talk about Saudi Arabia, and first let me ask you about MBS, Mohammed bin Salman, the Crown Prince. So you’ve gotten to know him pretty well, you’ve become friends with him. What’s he like as a human being? Just on a basic human level, what’s he like?
Jared Kushner
(02:16:32)
So for the listeners, Mohammed bin Salman is now the Crown Prince of Saudi Arabia. He has risen to that position over the last couple of years and he’s been a tremendous reformer for the country. He’s gone in and he’s really modernized the economy. He’s put a lot more investment into the country. He’s marginalized the religious police and he’s really done a good job to bring modernization, a lot of reform. So he’s been a great reformer. What he’s like as a person is he’s very high energy. He’s got tremendous candor, he’s very, very smart, incredibly well-read.

(02:17:10)
When he was younger, his father would give him a book a week and make him report on it on the weekend. He was trained as a leader and as a politician, really, by his father. He’s not Western educated, so he grew up in the Saudi culture and he’s a real Saudi nationalist. He loves their history, loves their heritage, has a deep understanding of the tribal nature of the region. His father was actually known to be a tremendous politician, so when he was governor of Riyadh, people who I speak to today about him say that if they had held a full election, he would’ve won in a landslide. They say every time somebody went to the hospital, he was the first person to call. Anytime there was a funeral, he was the first person to show up. He’s a very, very beloved leader.

(02:17:57)
Mohammed bin Salman, he was a businessman before he became Crown Prince. So he thinks really with a business mindset about how he runs the country, and he’s brought, I think, a different mindset and energy to the Middle East. One thing I’ll say that maybe comes to mind here is that I remember early on talking with him about all the different initiatives he was taking on. He’s building a big city called Neom in the desert, in a place where there really was nothing, on the Red Sea, and a lot of people were criticizing the ambition of the plan. And I was sitting with him one night and I said, “Why are you taking on all these things? You’ve got a lot of different programs, but what most politicians do is they set lower expectations and then they exceed the expectations.” And he looked at me without hesitation.

(02:18:43)
He says, “Jared, the way I look at it is that five years from now, if I set five goals and I achieve five goals, I’ll achieve five things. If I set a hundred goals and I fail at 50 of them, then in five years I’ll accomplish 50 things.” And so it’s a very different mindset as a leader. The way I got to work with him was Saudi Arabia was a big topic in the campaign. President Trump was basically saying during the campaign that they’ve got to pay their fair share, they haven’t been a great partner in the region. He was very critical of Saudi. And then during the transition, I was asked by several friends to meet with a representative of Saudi Arabia. I said, “I don’t want to meet with them.”

(02:19:24)
But I came over and I met, and they said, “Well, we want to make changes.” And I said, “Well, you have to make changes to how you treat women.” Then women couldn’t drive, they had guardianship laws. “You’ve got to start working with Israel, you have to be paying more of your fair share and you have to be stopping the Wahhabism that’s being spread.” Again, I had no knowledge; these were just kind of the traditional talking points about Saudi Arabia. So the guy I was with, this guy Fahad Toonsi, who’s a very respected minister there, he says, “Jared,” he says, “You don’t know much about Saudi Arabia, do you?” I said, “No, no, no, I don’t. It’s just really what I’ve kind of been told or what I read.” And he says, “Okay, let me do this. We want to be great allies with America. We’ve traditionally been great allies with America. Can I come back to you with a proposal on ways that we can make progress on all of the different areas where we have joint interests?”

(02:20:15)
Keep in mind, at that point in time, the Middle East was a mess, and probably the single biggest issue we had after ISIS was the ideological battle. If you remember, in 2016 there was the Pulse nightclub shooting in Orlando, you had the San Bernardino shooting, and people were being radicalized online with the extremism, and then there were a lot of crimes happening because of that. It was a big topic in the campaign. And so when I was talking with different generals about what capabilities the US had to really combat the extremism and the ideological battle, what we realized was that Saudi Arabia, as the custodian of the two holiest sites in Islam, Mecca and Medina, would be the best partner to work with if they were willing to. But for years, they really hadn’t been willing to kind of lean into this fight.

(02:21:03)
So I said, “Sure, give a proposal.” So they come back, give a proposal, and they said, “Look, if you make President Trump’s first trip to Saudi Arabia, we will do all these different things. We’ll increase our military spending and cooperation, we’ll counter all the terror financing.” An unbelievable offer. So I took the proposal, I went to the National Security Advisor, at the time it was General Flynn. I said, “If Saudi Arabia did these things, would this be considered a big…” “Unbelievable, but it’ll never happen.” I said, “Well, they’re telling me they want to do these things.” Again, having no foreign policy experience, I’m just saying I’ve got somebody telling me they want to do it, and that’s kind of where we started. Again, we take office, I don’t think much more about it.

(02:21:43)
And then, I think it was maybe a month in, President Trump has a call with King Salman, and before the call we’re in the Oval Office and the president’s basically saying, “Well, this is what we want to go through.” And I have Secretary Mattis and Secretary Tillerson, the Secretary of Defense and the Secretary of State, basically saying, “You have to deal with MBN. MBN is the guy who’s been our partner for all these years. He’s the head of intelligence and he’s been a great partner.” I said, “Well, if he’s been a great partner, then why do we have all these problems that you guys are complaining about with Saudi?”

(02:22:17)
I said, “I’ve been told that we have this proposal from MBS who’s the Deputy Crown Prince and that’s who we should be dealing with on this.” And so the phone call starts and President Trump listened to both of us, and on the phone call with King Salman, President Trump says, “Okay, we’ll go through all these things. These are the things we want to get done.” He says, “Well, who should we deal with?” King Salman says, “Deal with my son, the Deputy Crown Prince MBS.” So President Trump said on the phone, “Have him deal with Jared.” Because I think he knew that if he would’ve put him with the other guys, they were not believers in what he had the ability to do, and that’s how I got assigned to work with him.

(02:22:54)
I get back to my office after that, have an email from him, spoke to him for the first time, and then we just went to work. A lot of people were betting against that trip; they thought it wasn’t going to be successful. They’ve been betting against him and he’s been underestimated, but he’s been doing an incredible job, and the whole Middle East is different today because of the work that he’s done.
Lex Fridman
(02:23:14)
Maybe it’s instructive to go through the mental journey you went on, from the basic narratives and talking points about Saudi Arabia to making that human connection with MBS and making the policy connection that it’s actually possible to solve problems. What was that journey like? Why was it so difficult for others to take, and why were you effective in being able to take that journey yourself?
Jared Kushner
(02:23:43)
Maybe some of it came from my inexperience, but also my desire to listen and hear people. So I had this proposal, I was told that all of these things were good. Then we’re trying to schedule this trip and the National Security Council calls a meeting where we’re in the Situation Room, and we have Homeland Security, the Secretary of Defense, the Secretary of State, and everyone’s saying this is going to be a disaster. They said, “If we go to Saudi Arabia, the Saudis never keep their promises.” And our Secretary of State at the time was a gentleman named Rex Tillerson, who’d been the CEO of Exxon, so he had dealt with all these people very extensively, and he basically said, “In my experience, the Saudis won’t come through. And Jared, you don’t know what you’re doing, you’re wasting your time.”

(02:24:27)
And I basically was at a point where I said, “Look guys, they’re saying they want to do all these things, shouldn’t we at least give them a chance to try to do it? Why do we want to predetermine their direction by not giving them a chance to change? Just because things in the past haven’t gone the way you want them to, that doesn’t mean they can’t go that way in the future.” So we fought the battle. They basically deferred and let me go through with it, but when I’d do the planning meetings for the trip, nobody would show up because they all thought it was going to be an absolute disaster.

(02:24:56)
By the way, they probably weren’t wrong to think that, because I’d never planned a foreign trip before and I’d never done any foreign policy before. So during the planning, I’d speak to MBS almost every day and I’d go through all the different details and the things that would be coming up, and I said, “Look, I really need to get these things in writing.” He sent over a guy, Dr. Musaid Al Aiban, who’s a tremendous diplomat for them. He came to Washington, stayed for three weeks, and we worked through all the different details of what we needed, and we ended up coming to an arrangement on what it should be. So I think now, in retrospect, about why I was so focused on getting things like this done and why I even believed that they could be possible. But the answer is really that the people I was talking to on the other end were telling me that these things were possible, and so just because they hadn’t been done before and just because others around me didn’t believe that they could be done, I wasn’t willing to just say, “Well, let’s not try.”
Lex Fridman
(02:25:52)
It just seems like that cynicism that takes over is paralyzing. You sent me a great essay from Paul Graham, who I’m a big fan of. I think it explains a lot of your success. The essay is called How to Do Great Work, and people should definitely go read the full essay. There are a few things I could read from it, some quotes. “Having new ideas is a strange game, because it usually consists of seeing things that were right under your nose. Once you’ve seen a new idea, it tends to seem obvious. Why did no one think of this before? Seeing something obvious sounds easy, and yet empirically having new ideas is hard.” The steps you took seem trivial, and yet nobody was taking them, or at least those who did in the past weren’t successful. So the successes you’ve had were as simple as essentially picking up the phone or trying.

(02:26:47)
There’s a lot of interesting things here to talk about. This aspect of doing the seemingly simple thing that seems to be so hard to do, as Paul describes, requires a willingness to break rules. “There are two ways to be comfortable breaking rules: to enjoy breaking them and to be indifferent to them.” That’s an interesting distinction. “I call these two cases being aggressively and passively independent-minded.” So again, that’s to enjoy breaking the rules or being indifferent to the rules. “The aggressively independent-minded are the naughty ones. Rules don’t merely fail to stop them; breaking rules gives them additional energy. For this sort of person, delight at the sheer audacity of a project sometimes supplies enough activation energy to get it started. The other way to break the rules is not to care about them at all, or perhaps even to know they exist. This is why novices and outsiders often make new discoveries; their ignorance,” ignorance may be in quotes, “of a field’s assumptions acts as a source of temporary, passive independent-mindedness.

(02:27:51)
Aspies also seem to have a kind of immunity to conventional beliefs. Several I know say that this helps them to have new ideas.” So the aggressive and the passive is such an interesting way of looking at it. Perhaps there’s some aspect of this, at least in the story you told us, some passive aspect where you’re not even acknowledging, not even caring, that there were rules, just kind of asking the simple question and taking the simple action.
Jared Kushner
(02:28:19)
I think it’s funny, that was a necessary read, and we’re doing just a snippet of it, but I would encourage anyone listening to go and find it and read the entire thing, because it’s something that really spoke to me as I was transitioning into my new career now, and I just loved it. But when we were talking about why certain people who don’t have traditional qualifications are able to come in and do incredible work and solve complex problems, it made me think of that essay, which is why I shared it. I think that in the context of the work that I was doing here, perhaps not having the historical context became an advantage, though obviously I went back and then tried to study it. But if you go into a problem, I always find that, especially in the political realm, my favorite political issues are ones that are contrarian by being obvious, and sometimes they feel very intuitive and so you take them on.

(02:29:15)
There’s always a lot of resistance when you go against something that’s been accepted as the way that you’re supposed to do things. I came to learn over the course of my time in government that when everyone was agreeing with what I was doing, it actually made me more nervous, because I felt like you have these problems that haven’t been solved for a long time, and if you take the same approach as others, you’re going to fail just like they did. So taking a different approach doesn’t mean you’re going to succeed, but at least if you fail, you’re going to fail in an original way.

(02:29:48)
I did like this a lot, and I think that what I saw was that the people who were very good at getting things done that hadn’t been done before were people who came with different qualifications, different perspectives, and they came in and really worked the problem in untraditional ways. And so I think in the Middle East, I came in with a very different approach than people before me, not because I came in deliberately trying to do it differently, but because I came in trying to listen and understand from people why the problem hadn’t been solved, and then think from a first-principles perspective about what’s right today. Not based on what happened 50 years ago, or on somebody’s feelings that were hurt, but on what’s the right thing to make people’s lives better, to make the world a safer and more prosperous place tomorrow.
Lex Fridman
(02:30:37)
So if we can go back to MBS for a little bit, from the person to the vision, there’s something called Vision 2030, his vision for Saudi Arabia in the future. Can you maybe speak from his perspective: what is his vision for the region?
Jared Kushner
(02:30:53)
Sure. So it’s funny, we were talking before about how we wish leaders would set big, audacious goals and take on big things. Well, that’s what he did with Vision 2030 when he was young. And again, this is something that was derided and a lot of people were very skeptical of it, but the people who actually picked it up and read it said this is a very thoughtful plan that’s very achievable. So he studied his country and said, “What’s our place in the world? What are our advantages? What are our disadvantages?” And then he publicly set KPIs that he wanted to hold his country to, and then put in place plans and committees and really worked hard to push things in that direction, which was pretty remarkable. I think that when I saw it, I thought it was very refreshing. I said, “Wait, in America, why don’t we have set goals? Why don’t we have KPIs?” And I do think that it’s something that most countries, if not all countries, should have.

(02:31:45)
One of my favorite quotes is from Alice in Wonderland, where the Cheshire Cat says, “If you don’t know where you’re going, it doesn’t matter which path you take.” So I think that that’s something that really helped set them on a good path, and they’ve been very successful with it. One of the things he told me about putting that together was, he said, “My father’s generation, they created this country from almost nothing. They came here, they were a poor country, they were Bedouins in the desert. And then they look back and see what they’ve done over 50 years, and they say it’s absolutely remarkable.” He said his generation, they come in and they say, “We’re very grateful for everything that’s been done to date, but we have so much opportunity that we’re not taking advantage of.”

(02:32:27)
And so he’s now empowered the next generation to be ambitious and think big and grow with it. What that means for his vision for the Middle East, and now there’s excitement in the discussions with Israel that have advanced, is the general architecture that should exist. The general view of what we thought, from a Trump perspective, should be the new Middle East is an economic and security corridor all the way from Haifa to Muscat, from Israel to Oman, where you create a security area where people can live free of fear, of terrorism, and of conflict. The Middle East for the last 20 years has been a sinkhole for arms, for death, for terrorism. It’s been awful. It’s been a big national security threat for America, a big place where our treasure has gone. We’ve had a lot of our young, amazing American soldiers killed in action there, and the same thing for the Arab countries as well.

(02:33:26)
So if we can create a security architecture for that region, and then we can create economic integration between all the different countries... I mean, the amount of innovation happening in Israel is unbelievable. Think of it like Silicon Valley not being connected to the rest of California. You have a very young population, a very digitally savvy population, you have a lot of resources. And so if you can get all of that connected, the potential for it is unbelievable. I do think that his ultimate vision is to become a really strong country economically, and then to become a place where you could be funding advancements in science, advancements in humanity, advancements in artificial intelligence, and think about ways to be a positive influence in the world.
Lex Fridman
(02:34:05)
So a difficult question. One big source of tension between the United States and Saudi Arabia is the case of Jamal Khashoggi. I was wondering if you can comment on what MBS has said about it to you, since you’ve spoken to him about it, and on what he has said about it publicly, on 60 Minutes and after.
Jared Kushner
(02:34:25)
Yeah, so what he said to me was no different than what he ultimately said on 60 Minutes, which was, “As somebody helping lead this country, I bear responsibility, and I’m going to make sure that those who were involved are brought to justice, and I’m going to make sure that we put in place reforms so things like this don’t happen again.” It was a horrible situation that occurred. What I saw from him after that was just a doubling and a tripling down on the positive things he was doing, figuring out ways to kind of continue to modernize the society, build opportunity in the kingdom, and continue to be a better ally to all the different countries that wanted to be aligned with them.
Lex Fridman
(02:35:07)
One thing I learned from this case is how one particular situation, a tragedy, can destroy so much progress, and the possibility of progress, and the possibility of connection, the bridges that are built between different nations, and how narratives around it can take off and take such a long time to repair. You’ve worked with this in the Middle East with Israel and so on: how the history, the narratives, the stories kind of have this momentum that’s so hard to break, even when you have new leaders, new blood, new ideas that come in. It’s just sad to see, because yes, this tragedy happened, but it doesn’t mean that you can’t make progress. I don’t know if you have lessons from that, just how much of a dramatic impact it had on creating tension between the United States and Saudi Arabia, and the Middle East in general, this idea that somehow Saudi is not a friend but is against the ideals and the values of the United States.
Jared Kushner
(02:36:24)
So it definitely created massive tension and it became a very high profile action that actually overshadowed a lot of the good work that was being done in the region and a lot of the progress we were making. But when you think about this or you think about the other issues that we’ve gone through today, I think the general framework that I always try to approach things with is you can’t change what happened yesterday. You can only learn from it and then you can change how you deal with tomorrow. When I think about the people in power, what do I hope that they’re spending their time focused on? Two basic things. Number one is how do I create safety and security for my people and for the world? And then how do I give people the opportunity to live a better life? And so when things like this happen, obviously there are certain reactions that are appropriate, but ultimately you have to think through how do you not allow the paradigm that you’re creating in the world to lead to worse outcomes than would happen otherwise?

(02:37:28)
And so when I would think about foreign policy in general, one of the differences between foreign policy and business is that in business, at the conclusion of a problem set, you finish a deal. You either have a company or a property, or if you sell it, you have less to do and more capital, hopefully, if it’s successful. In a political deal, it’s always about paradigms. So the end of a problem set is always the beginning of a new paradigm, and you’re always thinking through how you create an environment that leads to, hopefully, the best amount of positive outcomes that could occur, versus creating a paradigm that will lead to negative outcomes. So bad things happen, a lot, in the world, and you have to make sure that when they happen, people are held accountable for it. But you also want to make sure that, in the process of making sure there’s accountability for these actions, you don’t set back a lot of the progress that the world is making, which would lead to a worse-off situation for many more people.

Israeli–Palestinian peace process

Lex Fridman
(02:38:31)
If we can go back to the incredible work with the Abraham Accords and Israel and the Middle East, first, the big question about peace. Why is it so difficult to achieve peace in this part of the world, between Israel and Palestine, and between Israel and the other countries in the Middle East? Or any sort of peace agreement?
Jared Kushner
(02:38:52)
If I had to give you the most simple answer, I would say that it’s structural. If you go back to the incentive structure of different leaders in this whole peace process between Israel and the Palestinians, and again, I’ve gotten criticized for saying this, but it’s what I believe, so I’m going to say it: the incentive structure was all wrong. When I went before the United Nations Security Council to discuss the peace plan that I proposed, it was more of an operational plan, a pragmatic plan, over 180 pages of detail. In politics, people don’t like putting forward detail because it just gives a lot of places for you to get criticized on. Nobody actually criticized the detail of my plan. They just criticized the fact that it was coming from us and didn’t want to debate the merits of the operational pieces of it.

(02:39:40)
So I created a slide where I showed, from the Oslo Accords till the day I was there, all the different peace discussions. I put a dove in the slide for those, and then I put a tank for every time there was a war, because there were always skirmishes with Hamas and Hezbollah and the Palestinians. And then I showed two lines, and they both went from the bottom of the page all the way up like this. One of the lines was Israeli settlements: every time a negotiation failed, Israel was able to get more land. And the other one was money to the Palestinians: every time a negotiation failed, the Palestinians would get more money. The problem with that money, though, was that it wasn’t going to the people. Some of it would make its way down, but most of it was going to the politicians.

(02:40:24)
You had Palestinian leadership that was, I think at that point, in the 16th year of a four-year term, so it wasn’t democratically elected. A lot of what I tried to show was that there was no rule of law, there was no judicial system, there were no property rights, and there was no opportunity or hope for the people to live a better life. And so all of the envoys to date were basically trained to go and do the same things. Again, I got massively criticized by all the previous envoys for not doing it the same way they did, but I thought the problem structurally just didn’t make sense, and so I felt like the incentive structure was all wrong, and I took a different approach.
Lex Fridman
(02:41:04)
And so what’s the different approach?
Jared Kushner
(02:41:07)
I started writing down a document: these are the 11 issues, but there are really only three issues that matter. I said, “Just tell me what you think the compromise is that the other side could live with, that you would accept.” And it was very hard to get them talking about this. “Oh, you have to go back to 1972, you have to go back to 1982, you have to go back to 2001, you have to go to…” And I was just like, “I don’t need a headache and I don’t need a history lesson. I just want a very simple thing. Here today in 2017, what’s the outcome that you would accept?”

(02:41:35)
And I was dealing with their negotiators, their back-channel secret negotiators, their double-secret neg- I was like, “This whole thing is a process created where nobody wants to talk about the actual solution.” So coming from the business world, I said, “Okay, let me just write down a proposed solution that I think is fair, and let me have each side react. Don’t tell me about theoretical things. Tell me, ‘I want to move the line from here to here. I want to change this word.’” So I tried to make it much more tactical, and what I realized was the Palestinians had worked so hard to get the Arab world to stay with the line of the Arab Peace Initiative.

(02:42:14)
And so I went back and I read the Arab Peace Initiative. It was 10 lines and it didn’t have any detail, so it was a concept. And they liked that concept because it allowed them to reject everything. They kept getting more money. I mean, Bibi Netanyahu, who runs one of the most incredible economies in the world, who runs an incredible military superpower for the size of their country, he would fly to Washington to meet us, and he’d be taking a commercial El Al plane. Abbas, who runs a refugee organization, a refugee group that claims they don’t have a state, that gets billions of dollars every year from the global community, would fly in a $60 million Boeing BBJ. So the whole thing was just very corrupt and off, and I do think that that’s why… I don’t think people were incentivized to solve it, to be honest.
Lex Fridman
(02:43:02)
On that point, just before we talk about the Abraham Accords: if there is a peace plan that works between Israel and Palestine, what do you think it looks like?
Jared Kushner
(02:43:15)
You have to separate it into two different issues. And I think that that’s actually how we came to the Abraham Accords. I tell the story in the book, and it was one of my favorite experiences during my time in diplomacy, where I went to meet with Sultan Qaboos, who was the Sultan of Oman. We flew out there because he’d had a secret meeting with Bibi, and I thought maybe he was open to normalizing with Israel. So after he meets with Bibi, he calls me and says, “I want you to come see me.” So I go over to see him, and again, I tell the story. It was a crazy night, all these different things, but when I was talking to him, he basically says to me, “I feel badly for the Palestinian people that they carry with them the burden of the Muslim world.”

(02:43:58)
And that line just stuck with me. A couple of days later, I was thinking about it and I said, “Wait a minute, who elected the Palestinian people to represent the Muslim world on the Al-Aqsa Mosque?” And so the reason why I felt like it had never been solved was that it was a riddle that, A, I believed was designed not to be solved, and, B, conflated two separate issues. You had the issue between Israel and the Muslim world, which really was the issue of the Al-Aqsa Mosque, and then you had just a territorial dispute, which, throughout history, you have lots of territorial disputes, and they’re usually resolved in different ways.

(02:44:34)
If you go back to the Israeli-Palestinian issue, there are just a couple of components you need to solve. Number one is territorial contiguity, right? You need to figure out where you draw the lines. And that’s something where you can talk about what people were owed 70 years ago, but it’s much more productive to say, “This is what you can make work today.” And that’s kind of what we did. We literally spent months and months drawing a map and we put something out. I’d probably change a couple of lines here and there, but by and large, it was a very pragmatic solution that I think could work and I think it could work for-
Jared Kushner
(02:45:03)
… very pragmatic solution that I think could work, and I think it could work for the safety and security of Israel, which was number one.

(02:45:07)
So the first issue is drawing a map. The second issue is security. And again, this is one issue where we were incredibly sympathetic with Israel: you can’t expect a prime minister of Israel to make a deal where he’s going to make his people less secure than before. So we worked very closely with them on a security apparatus. We laid something out that I think would keep the whole area safer, and it would make sure Israel was safe and also keep the Palestinian side safe. So you need security.

(02:45:34)
Number three was the religious sites, and that was one that was actually always made much more complicated by people: the Al-Aqsa Mosque. You basically have the Ḥaram al-Sharīf, which is a place where the mosque was built in the seventh or eighth century, but originally it was where the Holy of Holies was in the [inaudible 02:45:52] for the Jewish people. And then it’s compounded by the fact that you have all the Christian holy sites in Jerusalem. It’s a city that should be bringing everyone together, but in fact has become a place of wars and hatred, and a lot of different conflicts have arisen because of it.

(02:46:08)
But what I said was, instead of fighting over concepts of sovereignty, and it’s interesting how I got to the notion that this wasn’t really the big issue, why don’t we just make it simple operationally? Let everyone come and be able to worship, as long as they’re able to worship peacefully.

(02:46:23)
So that’s really the contours of it. And what the Palestinians have done is they’ve kind of deflected from a lot of their own shortcomings, and a lot of the Arab leaders did that as well in the pre-Abraham Accords days, by kind of allowing this issue to be so prevalent.

(02:46:37)
So one thing I’ll say on the Palestinians is that what we tried to do by laying out a plan was to say, “Okay, what are the reasons why the Palestinian people are not having the lives that they deserve?” And I’ll give you a couple of things. One is, I studied the economies of Jordan, the West Bank, Gaza, Egypt, Morocco. These were numbers from around 2019. But what was interesting was that the GDP per capita of somebody living in the West Bank was actually the same as Jordan, and it was actually more than somebody living in Egypt. And the debt-to-GDP that the Palestinians had was like 30, 40%, compared to Egypt, which was at like 130%, Jordan, which was at 110%, and Lebanon, which is at 200%.

(02:47:24)
And so you’re in a situation where a lot of this stuff didn’t make sense. But if you draw lines, create institutions where the Palestinian people can now feel like they have property rights and have ownership over their place, let the money flow past the leadership ranks to the people, let them have jobs, let them have opportunity, and then let all Muslims from throughout the world have access to the mosque, with Israel making sure that they can control the security, which I think the Jordanians and a lot of others want Israel to have strong security control there to prevent the radicals and the extremists from coming, you could have peace there very easily.
Lex Fridman
(02:48:00)
So there’s a lot of things to say here. One is just to emphasize that the Al-Aqsa Mosque is a holy place, and this is something where, in our conversations and in my own travels, I’ve seen the importance of frictionless access to those sites for the entirety of the Muslim world. And that’s what the Abraham Accords, of course, took big leaps on. Okay, so we’ll talk about that a little bit more, but that’s kind of a religious component. That’s a dignity in the religious practice and faith component.

(02:48:35)
But then the other thing you mentioned so simply is having money flow past the leadership ranks. How do you have money flow past the leadership ranks in Palestine, so that the money that’s invested in Palestine, in the West Bank, gets to the people?
Jared Kushner
(02:48:58)
So to date, all of the aid that’s been given to the Palestinians has been an entitlement. It’s not conditions based. It’s always just we give them money and there’s no expectations. It’s very simple. You make the aid conditions based. You fight for transparency. You do it through institutions other than the PA, or you put reformers into the PA that will allow it to go down that way.
Lex Fridman
(02:49:18)
PA being the Palestinian Authority, which is the leadership?
Jared Kushner
(02:49:21)
It’s not hard to do. It just takes people who actually want to do it. But I think that the mindset of the international community has not been, “Let’s solve this problem.” It’s like, “Let’s just throw a little bit of money at it. The money’s Novocaine. Let’s put a little Novocaine on the problem and let’s not have to deal with it.” But nobody’s ever said, “Oh, let’s do an accounting of the $20 billion we’ve given them and see how many jobs it’s created and where it’s gone.” That just hasn’t happened. Again, it’s an incredibly corrupt organization [inaudible 02:49:46]. You think about the post-World War II dynamic: you had a lot of refugees. My grandparents were refugees post-World War II. Every other refugee class has been resettled, and you only have one permanent refugee organization ever created. Why was this done? It was done to perpetuate the conflict so that a lot of Arab leaders could basically deflect from a lot of their shortcomings at home.

(02:50:06)
And so I think for Israel, they view all these things as existential. They value their safety. They’ve been under attack for a long time. I do think in having a deal we can say, “How do the Jews, the Muslims, and the Christians come together?” I think King Abdullah of Jordan has been an incredible custodian of the mosque. I think everyone, in my travels, recognizes that he’s the right guy for that, that the King of Jordan should be the custodian of the mosque. We should have some kind of framework to make sure everyone has access. The more countries that have diplomatic relations with Israel, the more Muslims and Arabs should be able to come and visit. And by the way, the more you have these normalizations, think about what that will do to the economy of the West Bank, where they’ll have great hotels, hospitality, a tremendous tourism industry, because of all the Christian, Muslim, and Jewish holy sites that they have there.

(02:50:54)
So there’s a lot of potential there. We just have to get unstuck. I believe that it’s so possible, if the leaders want to make tomorrow better, that they can. And unfortunately, the people who suffer the most are really just the Palestinian people. And I think that in Gaza, they’re hostages to Hamas. And in the West Bank, they’re just held back because their leadership is afraid or too self-interested to give them the opportunity to change their paradigm and pursue the potential of what they have. And by the way, it’s an incredibly well-educated population, it’s an incredibly capable population, and they’re right next to Israel, where the economy needs everything. And so the potential should be incredible if you can just move some of these pieces.

(02:51:45)
But again, there’s still a lot of emotion and hatred you have to work through as well. But I do believe that you’re not going to solve that by litigating the past. You’re only going to solve that by creating an exciting paradigm for the future and getting everyone to buy in, and then move towards that.
Lex Fridman
(02:52:01)
And maybe increase the chance of being able to establish an economy where entrepreneurs can flourish in the West Bank and elsewhere in Palestine, once the relationship across the Arab world is normalized.
Jared Kushner
(02:52:16)
So one thing on that, which is very interesting: when I got into my job in the Middle East, all of the conventional thinkers said to me, “The separation in the Muslim world is between the Sunnis and the Shias, and that’s really the big divide.” And as I was traveling, I didn’t think there was any divide in that regard. The divide that I saw was between leaders who wanted to give a better opportunity for their people and create economic reforms and opportunity, and leaders who wanted to use religion or fear to keep their stronghold on power. And so if you think about who’s not creating the opportunity for their people, it’s the Palestinian leadership and the Iranian leadership. All the other Arab countries were focused on, how do we give opportunity for our people to live a better life?

Abraham Accords and Arab-Israeli normalization

Lex Fridman
(02:53:02)
And there is a big foundation on which that framework can succeed, which I think is, in general, the idea of Arab-Israeli normalization. So that’s where the Abraham Accords come in. Can you tell the story of that?
Jared Kushner
(02:53:20)
Sure. So it’s an amazing thing. And I sit here today, somebody not in government, and every day I see another flight that goes between the countries, or I see an Israeli student studying at a university in Dubai, or a new synagogue opening up in Abu Dhabi or Bahrain. And it just gives me such tremendous pride to see all of the progress that’s been made.

(02:53:46)
How it occurred: part of why I wrote the book was to put this down for history’s sake, to go through all the different intentional, unintentional, circumstantial things that occurred. It’s funny, when we left government, there were a lot of people saying, “Well, this is why that…” I said, “I was kind of in the middle of it, and I couldn’t even perfectly articulate why it happened,” because it was an evolution of a lot of things. And I joke that we made peace on plan C, but only because we went through the alphabet three times, failing at every letter. But we didn’t give up and we kept going and we got it done.
Lex Fridman
(02:54:21)
And maybe this is a good place to also step back and say, what is Arab Israeli normalization? What is the state of things for people who may not be aware before the progress you made?
Jared Kushner
(02:54:33)
That’s probably the best place to start. So what we did is we made a peace deal between Israel and the United Arab Emirates, and then Israel and Bahrain. Then we did a deal between Israel and Sudan, then Israel and Kosovo, Israel and Morocco, where basically countries that didn’t recognize each other before ended up recognizing each other. All of these were Muslim-majority countries, and getting them to integrate with Israel was a very big thing.

(02:54:59)
The traditional thinking had always been that Muslim Arab countries would not make peace with Israel until the Israeli-Palestinian issue was solved. And what we were able to do is separate the issues and then make these connections, which are leading to amazing interaction between Jews and Muslims. So when I think about it, obviously you have national security, you have emotional benefits from these things. But the single biggest benefit that I’ve seen from the Accords is that if you were an Arab or a Muslim and you were willing to say positive things about Israel or the Jews before this came out, you would have been viciously attacked by the media or the hordes of influencers or the extremists in these different countries. What this did was it brought out into the public the fact that Jews and Muslims can be together, they can be respectful, they can have meals together, and the cultures can live together in peace.
Lex Fridman
(02:56:02)
So just to linger on this, it’s at once subtle and, in another sense, transformative. So normalization means you’re allowed to travel from one place to the other. That has a kind of ripple effect: you can now start talking in a little bit more accepting way. You can start integrating, traveling, communicating, doing business with, socializing. So the cultures mix, conversations mix, all of this. And this kind of has a ripple effect on the basic connection between these previously disparate worlds. I don’t know if there’s a nice way to kind of make clear why these agreements have such a transformative effect, especially in the long term.
Jared Kushner
(02:56:56)
I would say the simplest form is it’s just a mindset, and it’s almost like you’re taught all your life, “We’re enemies, or we can’t be friends with that tribe on the other side of the fence.” And then one day the leaders get up and say, “No, it’s okay now.” And there was never an issue between the people. The people were just taught different things and they were separated from each other.

(02:57:19)
But again, one of the things that I respect about the work you do is you believe in the power of conversation and the power of human interaction. And these issues and gaps between us feel so big when we think about them, when we’re told about them, when we read about them. But when we go and sit with each other, all of a sudden we realize maybe we have a lot more in common than we have that divides us.

(02:57:43)
For me, what I’ve seen about it that’s made the biggest difference is I’ve seen people who wouldn’t have had the ability to be together, be together, and that’s now forming a nucleus of togetherness, which is a restoration. So you think about the modern Middle East, from post-Holocaust to now. In 1948, after that War of Independence, you had Jews living in Baghdad and Cairo. Then those countries became so anti-Jewish that they expelled all of the Jews from all of these capital cities. So you think about the Jewish history in Baghdad. I mean, I think the Talmud was written in Baghdad. It was a place, in Babylon, where the Jewish people thrived. I think in 570 BCE, when Nebuchadnezzar conquered Jerusalem, he took about 10,000 Jews back with him to Babylon because he thought it’d be good for his economy. And in that place, the Jews actually flourished and had a good life there.

(02:58:41)
So for a thousand years before the Second World War, the Jews and the Muslims lived very peacefully together. So people say that what we’re doing now is an aberration. I actually think it’s not an aberration. I think it’s actually a return to the time when people could live together culturally. And so this is the beginning of the end of the Arab-Israeli conflict, and it’s the beginning of togetherness, which again, you think about how much war, how much provocation, how much terrorism has been made in the name of religious conflict. This is, I think, the start of the process of religious respect and understanding.
Lex Fridman
(02:59:17)
We’ve talked about you being attacked in the press over the Russian collusion and other topics. One of the most recent sets of attacks comes on the topic of the Saudi Public Investment Fund giving $2 billion to your investment firm after you left government. That includes a 1.25% asset management fee of $25 million a year. Can you respond to this recent set of attacks?
Jared Kushner
(02:59:42)
Sure. So I left government. Obviously I worked for four years. It was a very action-packed time. That’s why I wrote the book; I wanted to put down all those experiences. I started thinking, “What do I want to do next?” In my previous career, I’d been in real estate. I had worked with my brother on some technology businesses that I’d started. And then I got into government. So I kind of had a career shift. In my previous career, I obviously was very successful. The New York Times violated my privacy and published my financial statements. They showed I was making about $50 million a year in the private sector before I went to government. I went into government and I volunteered. I didn’t take a salary. I paid for my own health insurance for four years, my wife and I. And then I was thinking, “Should I go back to my old company or should I start something new?”

(03:00:35)
And my thinking was that, through my time in government, I’d met so many people, I’d learned so much about the world. I had a big understanding now of how the macroeconomic picture worked. And I did feel like there was a lot more that I could do than just going back to real estate.

(03:00:52)
In the meantime, I was getting a lot of calls from different CEOs and companies saying, “Can you help me with this company? Can you help me with that company? Your knowledge could be helpful to help this company navigate this challenge or to expand internationally.” And so I said, “You know what? Maybe I should create an investment firm where I can do something different, where I’m putting together geopolitical expertise with traditional private equity and growth investing, where I can do something differentiated, where I can invest in growing things and help with my navigation skills and relationships.”

(03:01:27)
So that was kind of the thesis of what I thought could make sense as kind of a next step. I called different friends. They were very excited to back the effort. Obviously this was coming off the success that I just had in the Middle East where I did six peace deals there. And one of the notions I wanted to be able to do with the firm was to be able to take money from the Gulf and then to be able to invest it in Israel, to continue to build the economic links between the countries. Again, if countries have more economic ties, I think war and fighting is less likely. And then in addition to that, I wanted to figure out how do you bring the entrepreneurs together from both of those countries?

(03:02:06)
So that was really the mission of what I set out to do. So far, I’ve been enjoying it. It’s been a lot of fun. I’ve been learning a ton. I think we’re doing very well with it.

(03:02:15)
In terms of the criticisms, I think that I’ve been criticized at every step of everything I’ve ever done in my life. And so what I would say is, this business is actually an objective-metric business, right? It’s about returns. So in three, four, five years from now, see how I do. Hopefully I’ll do very well, and judge me based on that.

(03:02:34)
In terms of any of the nefarious things, I haven’t been accused of violating any laws, and I haven’t violated any of the ethics rules either. When I was in government, I submitted all my financials every year to the Office of Government Ethics. They certified it every year, and I followed every rule and every law possible. So to my critics, I’ll say, “You criticized me before, you’ll criticize me now. I’m going to keep doing me and I’m going to keep pursuing things that I think are worthwhile.” And I’m very excited about this chapter of my career.

Donald Trump

Lex Fridman
(03:03:08)
Maybe this is a good place to ask: in working closely with Donald Trump, in your sense, looking into the mind of the man, what’s the biggest strength of Donald Trump as a leader?
Jared Kushner
(03:03:21)
I would say his unpredictability. I think that, as a leader, he consumes a ton of information. He doesn’t like to be managed or have his information filtered. So he’ll speak to a lot of people to draw his information himself. He’s very pragmatic. I don’t see him as terribly ideological. I see him as somebody who’s about results. I think he wants to deliver results. And I think ultimately, he’s an incredible fighter. He’s a big counter puncher, but he also wants to get along with people. And that’s probably the biggest surprise that people found with him. I mean, you look at even situations like… I would always tell people, “If you disagree with him, don’t go on television and criticize him. Just pick up the phone and call him, and go see him, and he’ll talk to you about it.” He may not agree with you.

(03:04:14)
But again, that’s what Kim Kardashian did when she had a case of clemency with a woman, Alice Johnson, that she felt strongly about. We went through the case. I wouldn’t have had her call if I didn’t think it was a legitimate case. So we spent about eight months quietly working through the case, working through the details, to make sure that it really was a worthy case.

(03:04:34)
I brought it to President Trump and said, “She’d like to come meet with you to talk about this case.” And he said, “Have her come in.” So she came in. We went through the case, and President Trump ultimately granted the clemency to Alice Johnson, a woman who was accused of being part of a drug ring. She had basically a life sentence for it. She’d served 22 years in prison. While in prison, she basically was a grandmother: she was putting on the prison plays, she was mentoring young women in prison. Somebody who, again, there’s always a risk, but by and large had a very, very, very low risk of committing a crime in the future.

(03:05:11)
And then it goes back to the notion of, are we going to judge people by the worst decision they make in their life? And so President Trump was willing to grant the clemency, and it went.

(03:05:21)
And I think that it just goes to the notion of maybe this goes back to his unpredictability in a positive way, which is if you go sit with him and you make your case, he’ll hear you, he’ll listen to you, and he’s not afraid to act, and he’s not afraid to be controversial, which I think is a good thing.

(03:05:36)
So from a foreign policy point of view, in particular, his unpredictability just meant that everyone was always on their back foot. People were afraid to kind of cross America. And what I would tell people who don’t like Trump is I would say, “Think about how crazy he’s making you and his enemies. He did that to the enemies of America.” Yeah, so he was a very, very strong president and I think did a great job.
Lex Fridman
(03:05:58)
So in some of these agreements that we’ve been talking about and speaking with leaders, how do you think the unpredictability helps?
Jared Kushner
(03:06:05)
So in all the agreements that I was negotiating, I wasn’t doing it as a principal, I was doing it on behalf of President Trump. And people knew that I had access to President Trump, and they knew that I could say, “You may say this thing that we don’t like, but I’m going to have to take it back to him, and then we’ll see what he does.” And one of the biggest instances was on the USMCA trade deal, where that deal happened because Mexico was legitimately concerned, and smartly so, that President Trump was going to impose tariffs on the car industry, which would’ve been decimating to their economy. And by the way, he was ready to do it. We were holding him back from doing it with every ounce of strength that we could. So it wasn’t a bluff. I mean, that was actually real, but they were smart to read that it was real. And ultimately we created a great win-win deal.

(03:06:56)
I’ll tell you a funny story that just popped into my mind from the tariffs. We also used the Section 232 national security provision to protect our steel industry, and we put tariffs on steel and aluminum. And again, I thought about this because we also negotiated them with Canada. And there was a very funny phone call where Trudeau is calling Trump. And again, they got along decently well. Trudeau’s calling saying, “You can’t put national security tariffs on us in Canada. We’re your NATO ally. We fought wars with you. We do military together.” And Trump says to him, “Didn’t you burn the White House down in 1812?” And Trudeau says, “That was the French.” He says, “No, it was the Canadians.”

(03:07:33)
And so it was just, like I said, he’s always keeping everyone on their toes.
Lex Fridman
(03:07:39)
Yeah.
Jared Kushner
(03:07:41)
But he took very calculated risks. And like I said, everyone was outraged all the time with everything. But if you look at his body of work: people said if he was elected, he would start World War III. Meanwhile, we inherited a world filled with wars, and there were no new wars, right? Three years. He made peace deals, no new wars. He was tough. He was strong. People respected him. He built relationships and got trade deals done, got peace deals done. The economy was rocking. His body of work, I think, was pretty strong as president.

War in Ukraine

Lex Fridman
(03:08:14)
Like you said, no new wars. This makes me think about what the current situation in Ukraine would look like if Donald Trump had won the presidency. But let me zoom out and ask you broadly: do you think the war in Ukraine could have been avoided? And what do you think it takes to bring it to an end?
Jared Kushner
(03:08:33)
I think 100% it would’ve been avoided. Not 99%. President Trump, for four years, had no problems with Russia. We were arming Ukraine, but we were working with Russia. And again, the first two years, we had a little bit of an issue working with Russia because they were accused of colluding with us, and we had to go through that investigation. But in the second two years, we were trying to focus Russia on the areas where we could collaborate together. With Russia, we thought it was to their strategic advantage to play the US and China against each other, because of the way that everything was done before, they were stuck with China but not getting a lot for it. Under Bush, they took Georgia. Under Obama, they took Crimea. Under Trump, there were no problems. And then under Biden, unfortunately, I think they misplayed a couple of things, which I think provoked Russia to go forward. Still no excuse to do what they did. I think that the invasion was a terrible thing and should not have occurred.

(03:09:34)
But with that being said, I think 100%, if Trump was president, there would not be a war in Ukraine today.
Lex Fridman
(03:09:41)
Coming to the table and negotiating a peace, whether it’s Donald Trump, whether it’s Biden, whether it’s anybody, what do you think it takes? Do you think it’s possible? And if you’re in a room, if Jared Kushner is in the room with Vladimir Putin and Volodymyr Zelenskyy, what does it take to have a productive conversation? And what does it take for that conversation to fail? What are the trajectories that lead to success and failure?
Jared Kushner
(03:10:10)
I think we go back to negotiations. Number one is trust, right? Both leaders have to have the ability to communicate what an off-ramp is without fearing it’s going to leak to the public. So look at the posture of Zelenskyy right now, and by the way, President Zelenskyy, I have a lot of respect for the courage he showed, especially initially. You saw what [inaudible 03:10:33] did in Afghanistan when they were getting attacked by the Taliban: he took the cash and got the hell out of there. Staying in Kiev when he did, how he did it, was one of the bravest things we’ve seen in a long time. And he has a ton of my respect and admiration for doing that.

(03:10:47)
But now he’s promising his people that they’re going to win the war, and the military action has not necessarily coincided with that sentiment. And so there has to be some form of off-ramp, but he can’t say that publicly. So for him to be able to work privately with somebody who can help create a new paradigm where both leaders can say, “We’re going to stop the bloodshed. We’re going to stop the risk of nuclear war for the world. We’re going to stop what’s happening,” that’s really what it will take. How that occurs, again, it’s not something I’m involved in now, so I don’t know who the right broker is or how to put that together, but essentially they need somebody in between them who can figure out how to create a landing zone that works. Because neither party’s going to jump until the pool is filled with water.

(03:11:36)
And you have to outline what the go-forward looks like, because you can’t just stop it only for things to get worse for both parties. You have to move it forward into what happens next, which hopefully can start to turn the tide to benefit both sides, where they can focus on the future instead of being stuck in the old paradigm of who started what, who’s to blame for what, who did what to whom. There’s just a lot of tough stuff now that’s occurred that’s going to be hard to walk back. And it’s a big task to get it done, but for the sake of the world, it’d be amazing if we were able to reach a conclusion to that conflict.
Lex Fridman
(03:12:16)
Just going back to your earlier mention of North Korea, what do you think it takes to bring Vladimir Putin and Volodymyr Zelenskyy to the table together?
Jared Kushner
(03:12:26)
Leadership.
Lex Fridman
(03:12:30)
So you’re saying it has to be a US president?
Jared Kushner
(03:12:34)
It has to be somebody who’s willing to put themselves on the line to go and do it. And again, if you’re the US president and you lead the most powerful nation in the world, you should be trying. But I do think, again, the posture that the US has taken has probably been in a place where it would be very hard for them to get the trust of Russia, based on the way that they’ve played their moves to date.

(03:12:58)
I always thought, from the beginning, that Putin would try to bring in President Xi in China to resolve it, to basically give a big screw you to America to say, “China’s now the one in charge of this.” But that hasn’t seemed to manifest itself to date either. But it takes leadership. The leaders have to get it and say, “Let’s get everyone together and let’s try to get this done.” Because every day it goes on, A, more people are dying, and B, we do risk a nuclear war for the world, which is not a good situation.

Vladimir Putin

Lex Fridman
(03:13:29)
Let me ask, since you helped set up phone calls between Donald Trump, Putin, and the King of Saudi Arabia, if I were to interview Putin, what advice would you give on how to get a deep understanding of the human being?
Jared Kushner
(03:13:46)
So I didn’t deal with Russia a ton, but in my interaction with Putin and with Russia, I would kind of point out a couple of things. Number one is, when America was hit with COVID and New York was looking like we were going to run out of ventilators and masks, Russia was the second country that sent us a planeload of supplies. And they didn’t send that because they hate America, they sent that because we were starting to make progress together as countries, and they thought that they wanted to show goodwill to figure out how can we start working together.

(03:14:17)
And again, people may attack me for saying that, that it sounds naive. Again, the past 15 years may show that that’s not the case, but I don’t believe that countries have permanent enemies, and I don’t believe countries have permanent allies. Right? Again, you think about the US and Russia in World War II, we worked together to defeat the Nazis, right? And now we’re great allies with Germany, which basically was our great enemy in World War II. We’re great allies with Japan, which was our great enemy in World War II.

(03:14:43)
So it goes back to the notion we discussed earlier of you shouldn’t condemn tomorrow to be like yesterday if you’re unhappy with yesterday. So number one is I would definitely ask him about that.

(03:14:56)
The phone call that you mentioned was after we did a pretty intense negotiation to create the largest oil cut in the history of oil production. So during COVID, demand just shut off like crazy, and it was dropping very quickly. Saudi and Russia, at that time, were having a conflict. They created this thing called OPEC+, which again goes back to the history between the two countries: they had conflicts, and then all of a sudden they were working together to try to stabilize the oil markets. But they couldn’t agree on the cuts, so Saudi actually increased production. So you had two things hitting at once, where Saudi and Russia were both increasing production and demand was dropping.

(03:15:34)
So you were headed for a real crisis, and I was starting to get calls from a lot of the oil industry executives here in America saying, “You don’t understand. We can’t just flip a switch and turn off our oil wells. We’re running out of storage here.” And I said, “Look, President Trump likes low oil prices, so he’s not upset about what’s happening. You have to call him, and if he gives me permission or the instruction, then I can try to intervene. But right now, he’s not inclined to intervene.”

(03:15:59)
After a little bit, he said, “It’s time. Get involved. Go do it.” It was right over Passover. This was during COVID. I spent three days nonstop on the phone with [inaudible 03:16:09] from Russia and with MBS directly, and I was dealing with Dan Brouillette, who was our energy minister, going back and forth, and it was crazy. I mean, it was just one of the craziest negotiations. We ended up agreeing on the largest oil cut in the history of the world.

(03:16:24)
But the story you went to before, which was pretty funny, was we finally make the deal, and we set up a call between King Salman, Vladimir Putin, and President Trump to announce the deal. I’m like, “Oh, this is great.” So President Trump gets on, “Congratulations. We have a deal.” And then King Salman says, “We don’t have a deal. Mexico hasn’t agreed to their cuts.” President Trump’s saying, “What do you mean?” So Mexico was part of the OPEC+. And so I get a note saying, “You’ve got to go call Mexico.” So I’m calling Mexico and we’re dealing with them, and they’re saying, “We’re not doing any cuts.” I said, “Why?” They said, “We’re hedged at $55.” I said, “Why didn’t you tell us that at the beginning?” So I’m telling the Saudis. So we were working through this whole thing.

(03:17:02)
So meanwhile, we were trying to find the compromise with Mexico. I set up a call with Trump and Putin so they could kind of talk this through. And Trump was always trying to play the game of how do we get Russia away from China? He always thought that that was not the right strategic framework for US interests. And again, we had no problems with them during that time.

(03:17:24)
What I would say is that for Zelenskyy and Putin, any conversation with both of them is about understanding their perspective. I think with Putin, he’s a student of history from the things that I saw with him. If you look at Russia over the last 500 years, I think they were attacked by the Polish in the early 1600s. I think they were attacked by the Swedes in the 1700s. I think they were attacked by Napoleon in the 1800s. And then in the 1900s, they were attacked by Germany twice. And so from his perspective there is… In the early days of Russia, they were attacked by the Mongols. They were-
Jared Kushner
(03:18:03)
… Russia, they were attacked by the Mongols. They were very vulnerable. And a lot of the geography of Russia today is really designed for defensive purposes, that they have natural barriers that make them easier to defend. And Russia is a massive land mass, it’s twice the size of America, they have 11 time zones in the country, and so I do think that for Vladimir Putin, his biggest concern is, “How do we create a security paradigm on the west of this country so that there won’t be a creep?”

(03:18:29)
And I think that there’s two different parts of the mindset. The people who are most cynical of Putin will say, “Well, he’s just trying to recreate the USSR. He’s being expansionist,” and the people who want to be sympathetic to him will say, “Well, if you think about it, the Russian perception of the NATO arrangement was that it wouldn’t be expanding eastward. Over the last years they’ve included all these countries that they said they promised they wouldn’t include.” Who knows what the promises were or weren’t?

(03:18:58)
But what I do know from his perspective is allowing Ukraine into NATO was always a red line, and that’s why we never offered it. We never provoked it. We never brought it up. We said we’re going to arm them, and we basically said, “Just calm down. We don’t want any conflicts there. We have bigger issues and bigger opportunities to work on.” So I do think you have to think through, what’s a paradigm that he can accept? And I do think that he’ll give the justification for why he’s done what he’s done, and then I think the framework for a solution is about, how do we move both parties forward? Tough job. I hope you get the opportunity to do it because I think it’s a conversation that will only help the world hopefully find a pathway forward.
Lex Fridman
(03:19:40)
And I should mention, because you mentioned geography, one of the many books you’ve recommended to me that gives a very interesting perspective on history. It’s called Prisoners of Geography by Tim Marshall, and it looks at geopolitical conflicts, and at Russia in particular, through the lens of geography. It also covers China in the second chapter. And there’s a lot of understanding of why the expansion of NATO is such a concern for Russia, because even in the 21st century, less and less so because of technology and so on, geography still plays a major role in conflicts between nations: rivers, mountains…
Jared Kushner
(03:20:25)
And understanding the DNA of countries. It was one of the most phenomenal books, and I just found it on Amazon randomly, but I loved every minute of it. The chapter on America is also incredible, going through the evolution of how we became the country we are, the different acquisitions, the different changes, why we have all these geographic advantages, and it’s an unbelievable book for anyone who’s interested in geopolitics.

China

Lex Fridman
(03:20:48)
I have to ask on several aspects of China. First on the president, the meeting: you helped set up the first call and first meeting between Donald Trump and Xi Jinping. Can you tell the story of that? Because that’s also interesting, again, that first phone call, the reaching out, the forming of the human connection, which ultimately leads to the connection between nations, and the possibility of collaboration.
Jared Kushner
(03:21:12)
During the transition, President Trump took a call from the head of Taiwan and that sent the Chinese into a real tailspin, and he didn’t do it, I think, to be provocative to them as much as, just as a businessman, he felt you answer your calls: if somebody wants to speak to you, you speak to them, you want to have conversations, hear their point of view. But it was taken as a very big insult and it was against tradition and norm. And so, that was something that set us off in a wrong direction. My view at the time was that we are entering a G2 [inaudible 03:21:47] world, whether people want to admit it or not, and that a lot of these countries, what I call the middle market countries, when China was being aggressive with their One Belt, One Road, were basically playing the US and China against each other. And I thought that by the two leaders coming together, there were some things they wouldn’t agree on, but there was a lot that they probably could agree on, which could lead to resolutions to a lot of issues in the world. That was my most optimistic view. My more pragmatic view was that President Trump had very big issues on trade that he wanted to get to with China. He felt like China, their trade practices were unfair, they weren’t following all the global rules of trade, he was a little bit nervous that they would be provocative with Taiwan, and I felt like the two of them getting together would be the best way to try and resolve that.

(03:22:38)
So, the Chinese are very proud and a lot of it’s about face, and so in order to negotiate for that first call, we basically agreed on what would happen in the call. So not, “Let’s just have a call, say hi, nice to meet you.” President Trump basically agreed that he would acknowledge the One China Policy, which he didn’t see as a big concession because you could always unacknowledge it the next day, “So yeah, I’ll acknowledge it,” and then, in exchange, President Xi was going to come over to the US for a visit so they could sit together, and they wanted to do it outside the White House, and so we agreed on Mar-a-Lago, which I also thought was good because President Trump always felt much more comfortable when he was hosting at his properties, and he just felt at home. And so, he liked having people as his guests and he loved it. He always felt really relaxed and it was great. So, that was really what we did.

(03:23:25)
Then, the Chinese come over, a very much anticipated visit. And it was incredible: they were supposed to sit together for 15 minutes, and they spent about an hour and a half together. And during that meeting, President Trump said, “Look, let’s just set some ground rules for this relationship. Let’s just not talk about Taiwan. Just don’t do anything there. I don’t want it on the table. If it does come up, I’m going to have to do harsh things. I don’t want this to be a problem for four years. We’ve got bigger issues.” And they basically did. Again, you notice: four years of the Trump administration, no Taiwan talk whatsoever. It was a non-issue. They started talking about the trade issues. They spent a lot of time on North Korea. President Trump was trying to get the perspective from President Xi about North Korea because that was, again, considered by Obama the biggest national security issue that we faced at the time, and they just had a good feeling for each other.

(03:24:13)
It also helped that my wife and I actually had a Chinese nanny and teacher in our house, and our kids learned fluent Mandarin. When President Xi and President Trump were together with Melania and with Madame Peng, my daughter actually sang them a couple of Chinese songs. And I thought that was a nice way to show we’re tough, but we respect your culture, because the Chinese have an incredible culture that goes back thousands of years: they’re very proud of how they do it. And I think that sign of respect also set things off in a very warm way, for President Trump to say, “My granddaughter speaks Chinese and we’re showing you the respect,” which I think is very important, and he did have respect for them.

(03:24:59)
The next part about the visit, obviously we had a lot of discussions on trade, but the part that was probably most impactful to me was President Xi basically did an hour monologue at lunch where he just went through Chinese history from his perspective, and he talked about, with particular emphasis, the unequal treaties and then the 100 years of humiliation. And then, you go through from Mao all the way to today and you had China coming back and rising, and you could tell that he learned the lessons from the past and was very committed to seeing China through. So, that was a different time, right? So, China today is different than it was in 2017. In 2017, I remember President Xi was at Davos and he was hailed by all the top business people in the world as, “Donald Trump is the threat to the global world order. President Xi is the champion of free trade and the biggest champion of environmentalism and fighting for climate change.” And what occurred was President Trump came in and basically said, “I think China has not been following the rules-based order,” and took very drastic approaches with tariffs. Every time he would do the tariffs, again, I had Mnuchin, our Treasury Secretary, come to Ivanka at my house saying, “If he does this, this is going to crash the whole economy,” and by the way, he believed it. These were things that people were telling him would be very tough to do. President Trump had a gentleman named Ambassador Lighthizer, Robert Lighthizer. He was really the tip of the spear on all of our trade negotiations. He worked very well with Secretary Mnuchin, and we ended up increasing tariffs to numbers that nobody even thought could happen. So we did the first round of tariffs, then the Chinese came back and retaliated very surgically, trying to hit us in all the areas that politically would’ve been difficult. And what Trump did was, instead of backing down, he took some of the revenue from the tariffs, gave it to the farmers and said, “I know that this is going to hurt your business, but I’m going to make sure you guys are made whole,” and then he doubled down, and basically went back at the Chinese with even more tariffs.

(03:27:03)
So, what we watched over a year and a half was probably the biggest hand of poker that was ever played, and it was an amazing experience to be a part of it. And the role I played was really working for Secretary Mnuchin and Ambassador Lighthizer as a back channel with the Chinese to make sure we could just deescalate things and get to solutions in the best way possible. So anyway, it was a fascinating time, but if you think about the global awareness today of the bad practices that China was putting in place, versus what it was in 2016, I think one of President Trump’s most successful policies was shifting the way the entire world understood the threat of China, and then putting in place the beginning of a regime to try and rebalance the world so that we could have more economic parity.
Lex Fridman
(03:27:55)
You mentioned to me the book The Hundred-Year Marathon by Michael Pillsbury when we discussed China, and I’ve gotten a chance to read parts of it, and I highly recommend people read it; it’s definitely an eye-opening perspective. I don’t know if I agree with all of it, I don’t know if you agree with all of it, but it gives a very intense perspective on China, and you said it was instructive to how you thought and how Donald Trump thought about China. Can you describe the main thesis of the book, and maybe, with a hopeful view, how it’s possible to have a trajectory of these two superpowers working together in the 21st century, versus fighting against each other?
Jared Kushner
(03:28:42)
Perfect. So, it’s a very big book, and I think it’s a book definitely worth reading. Michael is tremendous, he speaks fluent Mandarin, and so he spent a lot of time researching to do the book, so I highly recommend it to everyone. And it was considered more of a fringe perspective in 2016, but it really, I think, came to represent the underpinning of what the collective thought was of the Trump administration. And maybe you could argue that it was even more cynical. The whole thesis of the book was that China from 1949 to 2049 was working to reclaim their position as the global leader. So, you had the Chinese empire. One of the things, and I don’t know if it’s from this book or a different book that I read, spoke about how in the late 1700s the Emperor of China was basically offered some of the industrial capability from England, which was now entering the Industrial Revolution, and he basically said, “No, we’re fine. We’re the great Chinese empire. We don’t need any of these things. We’re better than that.”

(03:29:46)
And by rejecting that, the rest of the world got stronger and China remained weaker. Then, you had the Opium Wars, where the Chinese had big opium problems through all the trade back and forth. And then, China from about 1840 to 1940, 100 years where, after all these treaties, they were really a second-class country. And so then, you have the People’s Revolution that comes in, and he talks about how China very strategically, as a very poor country, would fight their way back and build brick by brick. And he proffers in the book that Nixon didn’t go to China and open China, it was China that actually went to Nixon and was able to use Nixon in order to open up. And then, they talk about how under Carter, they were able to get the US to contribute to a lot of their development: they were able to start borrowing US know-how from our university systems, from our medicine, from our science, from our research.

(03:30:38)
And the whole notion that was the conventional thinking of American leaders was that the more we helped China advance, the more they would become a free market economy, and it was a great market. The only difference was that they weren’t allowing us access, they were making our companies basically give them all of their technical knowledge, they were stealing our intellectual property, they were doing espionage to steal a lot of the patents, they were just ignoring our patents and they weren’t following any of the rules of international trade. Then, they started becoming the world’s manufacturing hub. They basically became the world’s factory, and then they started this whole initiative called the Belt and Road Initiative in order to start locking in their lines of trade: they were buying up all the ports everywhere. They were building railways, thinking, “How do we lock in our distribution so that we can maintain the dominance as the world’s global factory?”

(03:31:26)
And so, it was a brilliant long-term plan that they were executing. And by raising awareness, by putting in the tariffs, Trump slowed them down a lot. The real question is, if they actually did achieve this full objective of becoming the world dominant country, what they would’ve done with it, whether they would’ve been nefarious or not. I think from my perspective, even with some of the divisions and issues we have now in America, I still would rather an American-led world order than a Chinese-led world order. But the notion was that they were playing a very zero-sum game and really going to be the dominant leader in this new world order. So that really framed the perspective, and the Chinese were always fearing, “Is Trump trying to stop our rise?” And there’s also a great book by Graham Allison where he writes about whether we’re destined for war between the US and China. And he goes through different historical times where you have an established power and a rising power.

(03:32:24)
And I think more than half the time it ends up leading to war. So the question is, what’s going to happen here? And I do think this was Trump’s perspective, and this is my interpretation, because everything was always tactical, day-to-day. He was unpredictable to the Chinese, which they couldn’t deal with, and he was unpredictable even to his team sometimes, because he was playing it day by day and issue by issue, always changing and adjusting, which is how an entrepreneur thinks. He respected the job they did by building their country: they moved 300 million people out of poverty into the middle class. They did it at the expense of a lot of other countries throughout the world, especially America.

(03:33:01)
But Trump says, “Look, stupid politicians made deals. I respect China for doing what they did, but what I want to do is I want to change the paradigm so that for the next 20 years we can maintain our advantage over them, we can maintain our competitive dynamic,” and his general view was that America has the best private sector in the world, we have a lot of the best minds in the world, and if we can just have a level playing field with set rules, then America should be able to outperform. And so, that’s really what we were trying to do: we were trying to get rid of some of their state subsidies, make them follow some of these international rules of trade, and not allow them to do predatory investments that then undercut different industries that we had, so that they could have global market dominance or monopolies on different industries and then have pricing power, but also geopolitical power.

(03:33:54)
So, one of the examples that people talk about now is that China for the last 20 years was very advanced in seeing this electrification trend. They subsidized solar panels, and a lot of the American solar panel players were put out of business. So now, I think it’s 90% plus of solar panels in the world that are manufactured in China, and then all the rare earths that you need in order to make these solar panels and to make these electric vehicles, China’s bought up most of them, and a lot of the refining capacity is in China. So, thinking through strategically, how do we create an even playing field so that we’re not at the mercy of them, and how you can have a rules-based world order, that was really the thought behind what we were trying to work towards.
Lex Fridman
(03:34:40)
There’s this SNL skit where Jimmy Fallon plays you, and you’re walking into the Oval Office looking cool, wearing shades and a bulletproof vest to the song Unbelievable by EMF. I don’t know if you’ve seen it, but it’s pretty epic. And then Trump says that, “You’ve traveled the world representing the administration, but no one has ever heard you speak,” so there’s a lot of questions I can ask about that. But one of them is, can you introspect on why you chose this low-key approach of operating behind the scenes and not speaking much to the public, at least at the time? You’ve spoken a little bit more since, and today you’ve spoken for a really long time, which I deeply appreciate.
Jared Kushner
(03:35:23)
No, it’s been a pleasure to do this and thank you for the opportunity to talk about these things. And so, that was a really funny skit. And it’s funny, the thing I got made fun of the most for that was the wardrobe. And that came from after three months in the administration, we were having dinner with all the generals and they were updating us on the war with ISIS. And General Dunford said to me after, “Look, the president can’t come to see how we’re fighting this war, but I’d like to invite you to come with me to Iraq and come see. And would you come with me?” I said, “You know what? That’s great.” I always learned in business that you can’t make decisions from just an ivory tower. You have to go to the front lines and see what’s actually happening. So I said, “No problem. I’d love to go.” Meanwhile, two days before I’m about to go, the doc from the White House stops by my office and says, “We need to get your blood type.” I said, “Why do you need my blood type for this?”

(03:36:11)
“You’re going to an active war zone.” I’m like, “Okay, so I guess I’m going to a war zone.” I didn’t really think this thing fully through. I get on the plane with Dunford and we land in Iraq and he looks like GI Joe. He’s a great general, he’s very well respected in the military, and we go in and we get on a Black Hawk helicopter. They said, “You know what? Today’s a nice day, let’s take the sides off,” and so I get on the helicopter and there’s a military service officer who then takes a machine gun, locks it into a thing, takes the bullets, puts them into the gun, and is sitting there saying, “We’re ready to go.” And then I’m looking out and there’s like three other helicopters with guys. One was an Osprey with a guy buckled in, also with a machine gun, looking out. We take off, and we’re flying over Baghdad from the airport to the embassy. And as we’re going, I’m sitting in an open-air helicopter with the chairman of the joint chiefs of staff, guys with machine guns everywhere-
Lex Fridman
(03:37:10)
This is a new experience for you. You haven’t experienced this previously.
Jared Kushner
(03:37:13)
I would say slightly. I was doing real estate like three months ago and now I’m flying over Iraq and the chairman says, “That’s Saddam Hussein’s palace,” and I looked down, there’s a big bomb right through the middle. Then, you see the area with the two swords in the hands. I’m saying to myself like, “How the hell did I get here? What is happening?” So meanwhile, we end up going to the front lines to be with the Iraqi military, which the US military is working closely with. And I had a meeting that night with the President of Iraq. So what do you wear to the front lines in a battle zone and also to meet the president? I put a sports jacket on. We land at the front line and they give me a bulletproof vest that says “Kushner” on it. I put it on, I tape over the N-E-R so it just said “Kush,” I go out, and I didn’t realize they were taking pictures. And so-
Lex Fridman
(03:38:00)
I think the picture looks pretty epic. You with sunglasses, I think I love it.
Jared Kushner
(03:38:03)
So anyway, so that was the funny story behind that. And then actually, my brother was at some society event in New York and he ran into Jimmy Fallon, so the two of them took a selfie together. And Josh writes me, he says, “Hanging out with my older brother in New York. I’m trying to explain to him what your voice sounds like,” so it was good. So, that was a funny one.

(03:38:23)
But I think just being behind the scenes for me just gave me more maneuverability in the sense that, again, it goes back to trust and people knowing that I wasn’t going to try to publicize the things they were telling me. I think it just gave me more ability to operate that way. And I also realized, too, communicating is a very important skill. Luckily in Washington, there’s no shortage of amazing communicators. I think there were a lot of people who were much better than me at being communicators. So I was very happy that they were willing to do it because it wasn’t something that I had a lot of experience with or necessarily thought I was very good at. And so, I just did my job and just focused on getting things done.

Learning process

Lex Fridman
(03:39:06)
Let me ask you, you have a very interesting life. If you were to give advice to young folks on how to have such an impactful life, what would you say? Career and life, how to have a successful career and a successful life?
Jared Kushner
(03:39:25)
Number one is I would say you just have to work hard at everything you do. Number two, I would say never stop learning and always try to say yes more than you should, go out of your comfort zone. And I think just, you’ve got to work hard at everything you do. And if you’re going to take something on, do it the best you can. One of the lessons I write about in the book from my father was, I remember I was going for a job interview and he asked me, he says, “Well, what time are you leaving for the job interview?” It was at nine o’clock. I said, “I’ll leave at eight o’clock.” He says, “Well, what if there’s traffic?” I said, “Dad, I’ve done this drive 1,000 times. There’s never traffic.” He said, “What if there’s an accident?” I said, “I can’t control that.” He said, “Jared, the only excuse you ever have for being late is that you didn’t leave early enough.”

(03:40:11)
And I just think it’s something where if you want to accomplish something, a lot of people, I hear, complain about what other people do or why it’s hard or why it’s impossible. And again, I say this as somebody who’s been so blessed with so many things in life, but when I’ve had challenges or things I’ve wanted to achieve, I just focus and say, “What can I do?”, and I’ll read everything I can get my hands on. If the door closes, I’ll try the window. If the window closes, I’ll try the chimney. If the chimney closes, I’ll try to dig a tunnel. It’s just, if you want to accomplish something, you just have to go at it.

(03:40:43)
And I think the most important thing I’ll say, sorry, I’m thinking my way into this answer, is just do the right thing. I think that’s also right. And I saw that in my career: be good to people, be honest, do the right thing. And if you do that, I think long-term, it does pay off. Maybe not in politics, but in the world at large, it does. And my hope is in politics it will as well.
Lex Fridman
(03:41:08)
I wonder if you can comment on your process of learning in general, because you took on so many new, interesting problems and approached them from first principles. So, what was your source of information? Because you didn’t seem to be listening to the assumptions of the prior experts, you were just taking on the problem from a very pragmatic perspective. So, how’d you learn about the Middle East? How did you learn about China? How did you learn about Mexico? Prison reform? All of this that you’ve taken on and were extremely effective at?
Jared Kushner
(03:41:48)
It really started with just talking to people. I would try to reach out to people who had been involved in different things, and ask them what they did, what they thought of the problem, who they thought was smart on it, what they read that helped them get a better understanding, why they think something had failed. And then, I would just read voraciously on every topic. In Washington, it was harder to get advice from humans because I found humans had this weird tendency to talk to the media. And so, I’d talk to somebody, and I’d ask advice, and then the next thing I know, the Washington Post would call and say, “Jared’s an idiot, doesn’t know what he’s doing, and he’s even going to this person to get advice.” I’m like, “Yeah, I’m asking everyone,” so books really became an amazing guide for me.

(03:42:32)
Ivanka, she’s an incredible researcher, she’s just voracious. And so, she gave me some of my best books and some incredible advice as well. But that was really the process. And then, I think that was kind of the first stage. And then, the second stage was just constant iteration and readjusting the plan as you continue to learn more. And one story I tell in the book as well is that on my first trip to the Middle East, where I met with Mohamed bin Zayed, who I spoke about earlier, the ruler of the UAE, I spent two hours with him asking him questions and really going through the Israeli-Palestinian issue, the Israeli-Arab issue. And he said to me at the end of the meeting, he says, “Jared, I think you’re going to make peace here in the Middle East,” and I was shocked because, first of all, he was at the time I think one of the most respected leaders in the region, somebody who I found to be very wise, and super thoughtful, and experienced.

(03:43:23)
And I said to him, “Why do you say that?”, I was flattered, obviously, but not certain why he was saying that based on the fact that I didn’t know what my plan was, I didn’t know what I was going to do and I had no pathway to make peace. And he said, “Well, the US usually sends one of three different kinds of people to come see me. The first are people who come and they fall asleep in meetings. The second are people who come and they basically read me notes but have no ability to interact on the message they’re there to convey. And then, the third have been people who have come to convince me to do things that aren’t in my interests. You’re the first person who’s ever come here and has just asked questions. Why have you done that?”

(03:44:05)
I said, “Because I figure this problem’s been going on for a long time, you live here, I’ll be gone at some point. You’re going to have to live with the consequences of whatever my work is, and the US has a lot of power. And my question is, what would you do if you were me and how would you approach this? And help me think about it.” And again, I wasn’t going to then take his plan and then execute it, but I thought it would be very provocative, and instructive, to understand from the people in the region how they would use the resources and the power that the US had to solve the problems that were having a significant impact on their lives.
Lex Fridman
(03:44:42)
Yeah, there’s a lot of power to the simplicity of that human approach where you’re just listening.
Jared Kushner
(03:44:53)
And one of my wishes for society as I leave government: I was living on the Upper East Side in a very liberal echo chamber. I then traveled the country. I met so many people who I never would’ve met otherwise, on the conservative side, on the independent side, on so many different issues, and I think that people benefit from that. If you have such a strong point of view, I would follow the John Stuart Mill marketplace of ideas and find people who disagree with you, and don’t call them names, don’t say they’re a bad person. Say, “I want to understand why you feel the way you do.” Let’s have conversations in this country, and I think that that’s probably going to be our best way to work through the issues that we have currently.

Hope for the future

Lex Fridman
(03:45:34)
When you zoom out and look at the 21st century from a human history perspective, across the timescale of many decades, maybe centuries, what gives you hope about human civilization? Everything you’ve seen: you’ve traveled the world, you’ve talked to some of the most powerful and influential people, and you look at the future, what gives you hope about this little planet of ours?
Jared Kushner
(03:45:57)
What gives me the most hope is that anything’s possible. If there’s one lesson that I took from my time in government, it’s that people coming together to try to make tomorrow different than yesterday can succeed. And if the right people in the right places focus on the right ideas, I think the advancement that we can have for human history and for society can be tremendous. And I think that right now, I see we’re at a place in society where there’s a lot of what I call squabbles between countries, which are really man versus man issues. And those are as old as time, right? We’ve been fighting about borders or religion or who wronged somebody 100 or 1,000 years ago. And these are what I call more tribal battles. But I do think that as we advance with artificial intelligence, as energy becomes cheaper and it’s more readily available, I think we’re going to have massive industrialization, I think we’re going to have massive advancement.

(03:46:52)
I think in medical and science, we’re going to have cures for diseases. We have the potential in 10, 20 years from now to enter a dawn for humanity that could be incredible: we could become multi-planetary, we can explore the wonders of the world, we can find things we didn’t know. So, I think that if we put our energy towards finding these advancements that will improve the lives of everyone on this planet instead of figuring out ways to have these tensions between us, that for me, is the most optimistic case for what’s possible. And the reason why I believe it’s possible is because I was somebody with no experience; all I really had was the faith of a leader. And I had the courage to try, and I went out there with other people, and we took on some of the most hopeless, impossible problems, and we succeeded. And if we were able to do that, then everyone else should be able to do that as well.
Lex Fridman
(03:47:53)
Well, Jared, thank you for having the courage to try. Thank you for your friendship, for your kindness, most importantly, for your book recommendations. And thank you for talking today. This was fascinating and eye-opening. I hope to have many more conversations like this.
Jared Kushner
(03:48:08)
Thank you very much, Lex.
Lex Fridman
(03:48:10)
Thank you for listening to this conversation with Jared Kushner. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Mahatma Gandhi: “An eye for an eye will only make the whole world blind.” Thank you for listening and hope to see you next time.