Transcript for Matthew Cox: FBI Most Wanted Con Man – $55 Million in Bank Fraud | Lex Fridman Podcast #409

This is a transcript of Lex Fridman Podcast #409 with Matthew Cox.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Matthew Cox
(00:00:00)
She found $40,000 in cash in my freezer one night. So she’s like, “What is going on?” So we have this conversation and I tell her, “Look, people are looking for me.” “Who?” “Law enforcement.” “Which ones?” “All of them.” She’s like, “For what?” I go, “Mostly bank fraud.” And she’s like, “Well, how are they not finding you? I mean, people know you, like your general contractor,” which I met four months before, this guy, six months before, this one, two months before. She’s like, “So-and-so, so-and-so…” And I’m like, “Right. Right.” She’s like, “I mean, they’ve got your name, they’ve got your…” I go, “Well, that’s identity theft.” And she was like, “What do you mean?” I said, “Well, my name’s not… it’s not Joseph Carter.” “What is your name?” I go, “Look, don’t even worry about it.”
Lex Fridman
(00:01:02)
The following is a conversation with Matthew Cox, a conman recently released from federal prison where he served 13 years for bank fraud, mortgage fraud, identity theft, passport fraud, and other charges. He has admitted guilt to all of it. He has written true-crime stories of many of his fellow prisoners. And now he continues this work by interviewing criminals about their crimes on his YouTube channel that I recommend called Inside True Crime. Exploring the mind of a criminal is exploring human nature at the extremes, often in its most raw and illuminating form. And that is something I definitely want to do with this podcast to understand the human mind and everything it is capable of. This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Matthew Cox.

Mortgage fraud


(00:01:59)
What was the first crime you committed?
Matthew Cox
(00:02:03)
The first mortgage I ever did.
Lex Fridman
(00:02:04)
A mortgage is me borrowing money from a bank to buy a house.
Matthew Cox
(00:02:08)
Yeah.
Lex Fridman
(00:02:09)
How can you find a way to commit crime in this? How can you do fraud in this space?
Matthew Cox
(00:02:16)
It’s very difficult for the average guy to commit fraud because there’s so many safeguards set up. If you were to go in and say, “I make $300,000 a year,” “Okay, well, we want your W-2s, we want your pay stubs. We’re going to call your employer. We’re going to check to make sure your employer… how long they’ve been incorporated. We’re going to check to make sure they’re registered.” It’s like your whole plan fell apart because the average guy can’t do that. He can’t even come up with the pay stub and W-2.

(00:02:43)
So the average person, or “I’m going to put down this much money,” but you’re going to borrow that money from the seller. Okay, well then they start asking for bank statements. “Where did the money come from? How long has it been in your bank?” You can’t even have it put in your bank for a day, get a letter. It’s got to have been there for 90 days or 60 days, depending on the bank. And so there’s all these ways… For the average person, it’s very difficult to commit fraud. The average guy that works at Walmart and makes $60,000 a year, and he’s been there for five years and he saved his deposit, that’s really the guy that those transactions are set up for. To borrow a mortgage from Bank of America, that’s the guy they’re looking for.
Lex Fridman
(00:03:24)
So to commit fraud in this space, you have to misrepresent some aspect of your identity, of how much you’re worth, how much money you have, this kind of stuff?
Matthew Cox
(00:03:33)
Right. You have to be able to lie to the bank. Anytime you lie to the bank, you’ve committed fraud. And it’s funny, when I was doing it, I would say, “Ah, it’s in the gray area.” There’s no gray area. You’re either lying in some capacity or you’re not. So for instance, the very first loan I did, my borrower had been 30 days late on her rent. So they’re really looking at the last two years. So when you go into the bank, most of what they’re asking is a two-year window.

(00:04:09)
They’re saying, “How long have you been on their job?” They care about two years, and “How long have you been at your residency?” They’re looking for two years. Now, you could be at three places in two years. That’s fine. As long as you consistently paid for two years. Well, she had been in an apartment complex, but she’d been 30 days late. Now she caught it up, but she was late. The bank doesn’t want to lend you money if you’ve been 30 days late. So I was a broker and I whited out the 30-day late. I just got rid of it. And my manager is the person that told me to do it. She said, “It’ll be fine.” And she was right, it was.
Lex Fridman
(00:04:52)
What did it feel like? So that was the first fraudulent action you committed.
Matthew Cox
(00:04:56)
Yeah, I was worried. I always say I sweated bullets for four or five days, but I was concerned and I don’t know that I was concerned that I had broken the law. I was concerned because I was behind on my truck payment, I was behind on my mortgage. I had banked on being a mortgage broker, and I’d gone deep, deep behind on all my bills to do this. So in the last minute when this loan isn’t going to close and I have to commit fraud to make that happen… And my fear was they were going to figure it out and maybe I’d get fired. I didn’t think I was going to go to jail because my manager assured me, “You’re not going to jail. You’ll get fired at best.” So my concern was they were going to catch it and I get fired and I wouldn’t get paid. I needed that money so bad.
Lex Fridman
(00:05:56)
So we’ll maybe paint the picture here. Where were you working? Who was the manager?
Matthew Cox
(00:06:00)
The manager, it’s funny because I don’t think I ever really mentioned this, her name was Gretchen Zaas. I don’t mind saying it because she eventually ended up going to jail for fraud. Her name was Gretchen Zaas and she was a manager. I was working for a company called Eagle Lending, and it was in Tampa, and this was my first month. So it was my very first deal, three or four weeks into that first month. And I walk in, I put the file in front of my manager, she looks through everything. “Oh, great. Good. Good.” And put this one piece of paper over here and sat there. And then when she was done, I said, “What’s going on?” She goes, “Perfect. File’s perfect.” She goes, “But your borrower was 30 days late on her rent,” and she says it’s done. She’s like, “That’s a deal killer.” And I was like, “Oh my gosh, what do I do?” And I remember she pulled out a thing, a whiteout. Remember a whiteout? Not that it sticks, but the one that…
Lex Fridman
(00:07:00)
Okay.
Matthew Cox
(00:07:01)
And she started going… And I was like, “What?” She goes, “If I was you…” And she handed… She said, “I’d white it out. Make a copy, stick it back in the file.” She said, “It’ll be fine.” I was like, “That’s fraud. I could go to jail.” And she was like, “They’re never going to catch it.” She said, “Look, I do stuff all the time.” She said, “They’re not going to catch it, and nobody’s calling the FBI.” She goes, “Worst case scenario, if underwriting catches it, then they’ll fire you. That’s it. Nobody’s calling… You’re not going to jail.” And I trusted her. I was like, “Okay.” And so I did what she said. I stuck it in the file. And I mean, like I said, for four or five days, I was like, “Oh my God, I’m so scared.”
Lex Fridman
(00:07:45)
How old were you at this point?
Matthew Cox
(00:07:46)
Probably 29. I think it was 29. I had gone to college and so many things had not worked out. I got a degree in fine arts. There’s not a lot of people looking for anyone with a fine arts degree. And I tried to be an insurance adjuster. Tried that for about a year, year and a half, that didn’t work out. Ended up working construction for a few years. And so finally the girl I was dating said, “You got to be a mortgage broker.” She had just started in the mortgage industry. And she was like, “You have to do this. You were born to do this. This is perfect for you.”
Lex Fridman
(00:08:31)
What did she see in you?
Matthew Cox
(00:08:32)
She said, “You’re a salesman.” Because I was like, “I can barely balance my checkbook. I don’t know anything about numbers.” And she was like, “It has nothing to do with that. It’s sales. It’s putting together deals. You’re good at that. You’re good at negotiating. You’re a natural salesman.” And I figured I need to try something.
Lex Fridman
(00:08:53)
So what aspect of mortgages is sales and deal making, what aspects require the charisma that you clearly have?
Matthew Cox
(00:09:02)
Well, one, you have clients that have lots of options. They can go to Bank of America, they can go to SunTrust, they can go to Chase. They have options if they have perfect credit. I ended up working for a company that was a subprime lender, and those people didn’t have a lot of options. Honestly, by the time they got to Eagle Lending, their options were over. So what ends up happening is you’re negotiating with sellers. You would think that a lot of the stuff in that industry that real estate agents should do, the brokers end up doing because real estate agents are used to… You meet them at the house or they take you to several houses, they open the door, they walk around, they write up a contract that’s legit, a legit contract, and you’re already pre-approved. Everything works out. But subprime, that’s not the case.

(00:10:03)
You got borrowers with horrific job history. They don’t have enough of the down payment. Maybe they have the down payment, but they don’t have the closing costs. So you have to go to the real estate agent and say, “Listen, I need you to raise the purchase price and have the seller pay the closing costs,” which is legal to a degree, but that’s not how they wrote up the contract. So now you’re having to get them to rewrite the contract or there’s little things you’re trying to do. And the more deals you get done, and the more you deal with certain real estate agents, the more you start to realize that they’re… You know which ones are completely above board and which ones are willing to twist the rules.
Lex Fridman
(00:10:49)
And a lot of it works on personal relationships.
Matthew Cox
(00:10:52)
Right. Right. For some reason, people tend to like me and trust me. I don’t know why. It hasn’t worked out for so many people, but people naturally seem to trust me. And so if I say, “Hey, I can close the loan, but you got to do this. It’ll be cool. Don’t worry, we do it all the time,” it’s like my third loan and “I’ve been doing this for years.” And they go, “Oh, okay.” And then they raise the purchase price, they add some money, they have the seller of the house give the borrower some money, they stick it in the bank or they put it in Escrow, the closing company. Now you’re starting to massage deals.
Lex Fridman
(00:11:30)
What was the second time you committed a crime? So how did it start to evolve from the whiteout?
Matthew Cox
(00:11:35)
Well, I mean, when that went through, I think a normal person probably would’ve said, “Wow, it was a one-time thing. Got away with it. I’m good.” But for me, it just emboldened me. I just got a check for, I don’t know what it was, $2,500, $3,500. I was thrilled. And by that time, I was already working on another deal. But that guy, he made… I forget, it’s something like… He had made, let’s say $45,000 the year before in his W-2. Based on his current track record, or the year-to-date on his pay stub, he made just enough money. But if you factored in last year’s W-2, he was shy. So if I changed that 45,000 to 51,000, then the loan closes. I get a check for 3,500 bucks. He gets into a house. I’m doing him a favor. I’m doing God’s work. So I fix it.

(00:12:43)
I kick back. I’m terrified a little bit, worried about it. Sure enough, it closes. Four or five days later, they call me, “He’s ready to close.” A week later, we close. I get a check. Next guy that comes in… I mean, I got very, very quickly… I was concerned, “Do you have a house? Do you have a deal? Is it ready? I can get you done.” Now, if you were in bankruptcy or something, there’s some things you just… You’d pull their credit and you just couldn’t help them. If they had a 550 credit score or something and no job. I mean, it had to be within reason, but very quickly it was changing W-2s, changing pay stubs, changing appraisals, fixing, like I said, verifications of rent. So it evolved very quickly for me.
Lex Fridman
(00:13:28)
And you’re essentially helping people.
Matthew Cox
(00:13:30)
That’s what I told myself.
Lex Fridman
(00:13:32)
Giving them a chance. People that have been really struggling financially in life. So you’ve been telling yourself that you’re doing a good thing for people.
Matthew Cox
(00:13:43)
I told myself that right up until… That those loans were solid and I was helping those people out, right up until I went to prison. And I was in prison and I had to write… The government asked me to write an ethics and fraud course to help teach the nation’s mortgage brokers. All loan officers and brokers have to take… I think it’s nine hours of continuing education every single year. And I was approached to write the ethics course, and it was about that time and about the same period of time I was writing my book, and I started reflecting on what I had done.

(00:14:28)
And the truth is, and this is a horrible thing to say, because the first time I ever heard somebody say this, I remember thinking, “Oh, that’s a horrible thing to say.” Some people should not own a house. They shouldn’t be allowed to borrow. They’re not in a position financially. And there were many occasions where I put someone in a house that they 100% swore they could afford. I was helping them. I told myself I was helping them, and a year and a half later, they’re going into foreclosure. Their stuff’s on the corner, they don’t know where to go. And the truth is that I’m not smarter than the actuaries that came up with those underwriting guidelines.
Lex Fridman
(00:15:10)
So in this whole process, how are you making money? Are you taking a percentage?
Matthew Cox
(00:15:15)
Broker fee. Yeah, I charge a broker fee, or you charge yield spread. So yield spread is… Let’s say the interest rate is 8% interest. If I charge them 25 basis points over the 8%, so I charge them eight and a quarter, 8.25, then I get 1% of the loan back as a fee. So if I charge them 8.5%, I get two points back. So if it’s a $100,000 piece of property and the bank says your interest rate is going to be 8%, and I tell you 8.5 and I’m charging you a $3,500 broker fee, now I’m making $5,500. So on even a $100,000 loan, you could make a nice chunk of change. I mean, it’s-
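The yield-spread arithmetic Cox describes can be sketched in a few lines of Python. The 1-point-per-25-basis-points ratio and the dollar figures come from his example in this exchange, not from any general lending rule:

```python
def broker_compensation(loan_amount, par_rate, charged_rate, broker_fee):
    """Broker pay = flat broker fee + yield spread premium.

    Per Cox's example: every 0.25% charged above the bank's par
    rate pays the broker 1 point (1% of the loan amount).
    """
    points = (charged_rate - par_rate) / 0.25      # points earned on the spread
    yield_spread = loan_amount * points / 100.0    # 1 point = 1% of the loan
    return broker_fee + yield_spread

# His numbers: $100,000 loan, 8% par rate, borrower charged 8.5%,
# plus a $3,500 broker fee -> $5,500 total.
print(broker_compensation(100_000, 8.0, 8.5, 3_500))  # 5500.0
```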
Lex Fridman
(00:16:03)
So how much gray area is here? You said that there really isn’t when you’re lying or not, but it feels like there is.
Matthew Cox
(00:16:10)
Well, every time I change something, it wasn’t gray area, I just committed fraud. At this level, you either meet the guidelines or someone has massaged it in such a way that they’ve committed fraud and that’s it. And there’s tons of ways where you can commit fraud and they just can’t figure it out. Does that make sense? I mean, you’ve committed fraud and it’s like they’ve looked at the entire… They look at all the documents and they double check everything, and they know there’s fraud in here and they just can’t find it.
Lex Fridman
(00:16:40)
Just because they can’t find it doesn’t mean it’s not fraud.
Matthew Cox
(00:16:42)
Exactly, doesn’t mean it wasn’t fraud.

Creating fake people

Lex Fridman
(00:16:44)
As part of this, you did a lot of fascinating things. One of the things you did, you talked about creating synthetic people, meaning creating fake identities. What does it take to do that well?
Matthew Cox
(00:16:57)
So your credit profile is made up of your name, date of birth, your address, and your Social Security number. And then there’s other things where you work, that sort of thing. But what people don’t realize is there’s so many people out there that think that the credit bureaus already know who you are, but the truth is, the first time the credit bureau has ever heard about you was when you told them. The first time you applied for a credit card, they created a credit profile at that moment. Prior to that, they had no idea. So the first time you apply, you give them your full name, date of birth, Social Security number and your address, and they create a credit profile and they say, “Hey, no record found of this person. He has no credit, nothing, probably got denied.”

(00:17:53)
Well, what I realized through the course of… Because eventually I ended up leaving that one company and I opened my own mortgage company. When I opened that mortgage company, I was on the inside. Does that make sense? I wasn’t just a broker that was sitting out with everybody else and would periodically come in and ask questions or would call underwriting, but I really didn’t understand what was happening and exactly what the underwriting guidelines were. Now, I was actually talking to the underwriters and you’re talking to the owners of the lending institutions and the banks, and you’re talking to all of the account executives.

(00:18:33)
And now, it wasn’t just Eagle Lending I was talking to, there were 40 different account executives coming in on a weekly basis trying to get us to sign up with their lender. And they’re on the inside coming in, showing you programs and saying, “Look, if your borrower is self-employed, we don’t ask for this or this, we just ask them to say they’re self-employed.” Liar loans. You’ve heard the term liar loans?
Lex Fridman
(00:19:02)
No.
Matthew Cox
(00:19:03)
Okay. Or no doc loans where they don’t ask for any documentation. If he’s got over, let’s say, a 700 credit score and he says he’s been a plumber and he works for himself, and he’s got over a 700 credit score, he just has to say he’s worked for himself for over two years, and-
Lex Fridman
(00:19:17)
They don’t ask any other questions.
Matthew Cox
(00:19:18)
They don’t ask for any documentation. He’s got the money in the bank. He’s got a 700 credit score, says he’s been on the job for two years, he’s self-employed. We’re going to raise his interest rate by 1%, and that’s it. He’s got the loan. But you start to know how things work because I hired a bunch of brokers to work underneath me, and when they would get caught, I would get the phone call.

(00:19:44)
So I get the phone call from the owner of a bank or a lending institute, a lender, and that lender says, “Hey, Matt, we got a problem.” I’m like, “What’s up?” He’s like, “Listen, we caught a fake W-2.” I’m like, “What do you mean?” “Yeah, your broker so-and-so sent us a file and this person had… There’s two fake W-2s and we’re assuming the pay stubs are fake.” And I’m like, “Are you serious? How did you even catch that?” And they go, “Oh, well, here’s what we did. We checked with sunbiz.gov,” which is the Secretary of State’s website that registers corporations. “And we checked, and the tax ID number didn’t match.” And now I know every W-2 has to have a matching tax ID number for whatever corporation issued it.
Lex Fridman
(00:20:34)
So there’s a sequence of checks they do to detect fraud in different documents like W-2s?
Matthew Cox
(00:20:38)
Right.
Lex Fridman
(00:20:39)
And then you’re slowly learning-
Matthew Cox
(00:20:41)
Yeah, exactly.
Lex Fridman
(00:20:42)
What’s the process for detecting-
Matthew Cox
(00:20:43)
I mean, I had a pretty good understanding anyway, but so I’m starting to learn that-
Lex Fridman
(00:20:47)
It’s common sense understanding. Yeah.
Matthew Cox
(00:20:48)
So I’m putting these things together. And I remember one time I had a woman come in and she came in and she had perfect credit. She had like a 750 credit score. I mean, it was perfect. And she came in and one of the brokers came in and said, “Hey, man, can I show you something?” I was like, “Yeah, what’s up?” He goes, “Look,” he said, “I’ve got this woman’s W-2s here.” I said, “Okay.” I looked at them and he goes, “Here’s her credit report.” And he goes, “Here’s the application. This is the Social Security number.” I went, “All right.” And he said, “This is the Social Security number on the W-2.” And I went, “Okay.” Keep in mind, you go to get a car loan or credit card, they never asked for these things.

(00:21:29)
I’m really shocked he even noticed it. I probably might not have even caught it, but they were different. And I went, “Really?” And he goes, “Yeah,” he said, “She just brought them in. She’s here.” And I was like, “Oh, bring her in here.” So she came in, sat down, I said, “Listen, here’s what we just found.” And she was like, “Oh, okay. You know what? I don’t want the loan. I just…” I go, “No, no, no, no, no.” I said, “Listen, you’re getting a loan. You got a 750 credit score. I don’t care what we have to do. We’re getting you the loan. I just want to know what’s going on. How did you get 750 credit scores under this Social Security number when clearly this is your real Social Security number? You’ve been working for this company for 10 years, and your credit profile says it’s only three years old.” And I was like, “What happened?”

(00:22:13)
And what she told me she did was she went through a divorce. She had been married for 10 years, used her husband’s… I mean, his surname for 10 years. So she has no credit under her maiden name. But when they got divorced, she switched to her maiden name because when she tried to get anything in her husband’s surname, it was denied, bad credit. So he had bad credit. Their credit went bad. So she switched to her name and a friend told her if she needed to get her electric or anything turned on, she could use her name and use her daughter’s or son’s Social Security number, which was like a four-year-old kid. So she used that and it went through, she had to put a deposit down, but it went through at least, it wasn’t denied. So that went through.

(00:23:07)
Then she went and she applied for an apartment with that. Sure enough, it went through. She had no credit, but they said, you don’t have bad credit. So she said once she moved into the apartment, she then started getting these pre-approved credit cards. So she goes, “But I knew I had applied there using my son’s Social Security number,” let’s say. So she started filling those out, and sure enough, she got a credit card and then she got two, and then she got a pre-approval from Ford Motor Credit. She went and got herself a new car, got approved. She’d been making the payments ever since. She has a 750 credit score. She thought she’d try her hand at buying a house in his Social Security number, and we caught it and she got a house in that name. We closed it. I just was like, “Wow, this is great.”
Lex Fridman
(00:23:53)
Can I ask you a question about that? Because it seems like she’s able to pay for everything.
Matthew Cox
(00:23:58)
Right.
Lex Fridman
(00:23:59)
So while this is highly illegal, is it unethical? It’s unethical in that it’s messing with the system on which a lot of people rely, but it feels like there’s some aspect of the system that’s broken in that it doesn’t give people like her a second chance.
Matthew Cox
(00:24:19)
She could have claimed bankruptcy and then two years later… Listen, two years out of bankruptcy, you can go into Bank of America and get a conventional mortgage, assuming you have perfect credit outside the bankruptcy, you have the down payment, you make enough money, there’s a whole bunch of underwriting guidelines you have to meet. But that’s possible. But you’re right. For instance, she wasn’t getting an apartment with her bad credit, she wasn’t getting her utilities turned on. She wasn’t getting any of those things done.
Lex Fridman
(00:24:48)
So getting your life back on track is just harder.
Matthew Cox
(00:24:50)
It’s extremely hard.
Lex Fridman
(00:24:51)
So there’s a temptation to take the shortcut and the shortcut is often going to be illegal.
Matthew Cox
(00:24:56)
Right. And she stumbled into it, but she basically explained it to me, and I don’t think she had walked out of my brokerage office before I went, and I just started making up names. And I think I went into our file cabinet and grabbed some people’s 1040s, which we had, their tax returns and looked up children’s Social Security numbers and just grabbed some random kids’ Social Security numbers and their name and went and pulled them. But I changed their date of birth to be an adult. Pulled it, and sure enough it came up, “No file found.” It didn’t say fraud alert or fraud or anything. They didn’t say mismatched this, mismatched that, didn’t say anything. It just said, “No file found.” Well, then we went and we applied for a couple credit cards using a child’s Social Security number, and then we went and pulled our own credit report.

(00:25:52)
And sure enough, it didn’t say no file found. It just said that there had been two inquiries applying for credit cards. So I was like, “Wow, that’s a credit profile.” So that turns into me calling Social Security and trying to get them to issue Social Security numbers to adults that had never had a Social Security number issued to them. I needed to get Social Security to give me a clean Social Security number. But I called up, and of course, I’m a novice, I don’t really know what I’m doing. So I call up and I say, “Hey, yeah, I never had a Social Security number issued.” And they were like, “How old are you?” And I was like, “I’m 31 years old.” And they were like, “Yeah, that’s not possible. Do you have a driver’s license?” “Yeah.” “You have a bank account?” “Yeah.” “You have a Social Security number. Bring your driver’s license in and we’ll pull it up.” Okay, well, that’s not going to happen.

(00:26:51)
Hang up, call back. “Hi, my son is seven years old or three years old, and he never had a Social Security number issued.” “Oh, okay. Was he born in a hospital?” “Yes.” “Well, he has one. He has one. Go ahead and get your son, come in here…” No, I’m not doing that. Hang up, call back. So I called back probably 10 times, and eventually someone said… I kept altering it, kept altering what I was saying until I got to the point where I was saying, “My son was born with a midwife, not in the hospital. And the pediatrician told us that we need to get Social Security to issue a Social Security number.”

(00:27:43)
And they would say, “Well, he should have issued it. But that does happen sometimes. So bring your son in and you can fill out the paperwork. We’ll have one issued. First, we’ll check to see if he never had one issued. And if he hasn’t, we’ll issue one.” And so then it turned into, “My son is out of the country and I need this.” And then that turned into, “Oh, I’m sorry. Well, how old is he?” I was like, “He’s three.” And they go, “Well, I’m sorry if he’s over the age of 12 months old, he has to come in.” Hang up the phone, call back. “My son is 10 months old, he’s out of the country, born with a midwife, never had a Social Security number.” And then they go, “Oh, okay, that’s fine. Just get his birth certificate and his shot record and you can come in, fill out the paperwork, we’ll issue you a Social Security number.”

(00:28:33)
And that’s what I did. So I figured out how to create a birth certificate. I ordered the security paper where, if you make a copy, it says, “Void if copied.” I had to order a bunch of that, and I went online and figured out how to make a fake birth certificate. It was great too, because the county actually, they give you a blank form and then they actually show you what it looks like filled out, like a handwritten one filled out. So I knew if he was born this day, he got these shots. Two months later, he got these shots. Six months later he got these shots. So I just filled it out. I even had to order a seal. So you have to have a seal that says “Hillsborough County Vital Statistics” or “Richland County Vital Statistics” or something. And I couldn’t get anybody to make that.

(00:29:23)
So I changed it to Richland County Office of Vital Records. And then I took 220-grit sandpaper and hit it over and over and over again to wear it down. And then I did the embossment on the corner and I printed it on the security paper, embossed it. Nobody looks at those things. You could see Richland County, you could kind of see that. And really, they just grab it and they go like this. This is what you realize after you… When I started getting driver’s licenses issued by the state DMV, I figured out eventually it was easier to just go into the DMV and have them give me a driver’s license than actually make one. But you notice they would just grab the thing, they’d feel the form and go, “Okay,” they don’t even look at it, which is upsetting if you put as much work into these documents as I did, for them to go, “Okay. Yeah, that’s good. Sit over there.” I felt like going like, “Hey, bro, take a look at this. This is artwork.”
Lex Fridman
(00:30:25)
Yeah. But they’re looking for the low hanging fruit of crappy fraud?
Matthew Cox
(00:30:31)
Right. Yeah. This stuff was right through.
Lex Fridman
(00:30:34)
Okay, so birth certificate gets you a Social Security number. So it’s interesting because you’ve done a lot of different approaches to creating synthetic people. There’s homeless people involved. So sometimes it’s grounded in real people or real names.
Matthew Cox
(00:30:56)
Right.
Lex Fridman
(00:30:56)
Some part is fake, some part is real sometimes, and sometimes it’s completely all fake.
Matthew Cox
(00:31:00)
Right, because now I have the name, I have the Social Security number. And what’s great is they mail it to you. What’s even better is then you get to pick whatever name you want. Because when you pick your child’s name, he doesn’t even have to have your last name, you pick any name. So I would pick a name and I’d just say, “Oh, my wife’s last name is this,” if they questioned it, which they never did. I’ve got a Social Security number, and then I would go apply for credit cards and I’d get denied of course, but they would all offer me a secured credit card. So I’d then fill out the secured credit card and I’d send the bank the money, and they would give me a secured credit card for $500, $300, $1,000, whatever it was. And then once you start making the payments, I pulled the credit and a credit profile shows up saying that this 31-year-old man with the Social Security number that I know was issued a couple of months ago, has three credit cards.

(00:31:57)
They don’t even say secured. They just say, “This credit card is $500. It was issued by Bank of America. This one was issued by Capital One, this one…” So I’ve got three of them, but I had no credit scores. So at that point, I kind of kicked back and waited and I just kept making payments. And I remember thinking to myself, “I’ll bet you that the credit bureaus don’t generate credit scores for at least a year.” And I was like, “God, this is going to be a year-long process.” And while that was happening, I was starting other ones because I figured at least in a year I’ll have a bunch of these… We call them phantom borrowers, but now they call them synthetic identities. So at least I would have these synthetic identities and maybe I could do something with them. But what happened was at six months, I went and I randomly pulled the person’s credit, and he had 705 credit scores, 705, 701, 695. I was like, “Oh, my God.” You only needed a 620 to get a 95% loan from the bank.

(00:33:02)
So I was like, “Oh my God, this is amazing.” Sure enough, a month later, the other ones I had started, all of them, bam, bam, bam.
Lex Fridman
(00:33:13)
So what do you do with a phantom borrower? How do you make money on this?
Matthew Cox
(00:33:17)
So I think most people, if you were just a scammer or a fraudster, you would probably just get credit cards and maybe build up that history, or maybe try and borrow a personal loan, which is limited. Personal loans used to be, you could go to an FDIC-insured bank, which lends money, and the personal loans they lend out max out at $15,000. So you could do that.
Lex Fridman
(00:33:52)
So you can go through this whole process of creating a fake identity, getting a card, paying it off, building up credit, and then you get $15,000 or so at the end.
Matthew Cox
(00:33:59)
Right. You get 15. Maybe if you want to keep making the payments, if you could wait a year, you could probably get 15,000, maybe even 20, 30,000, and a bunch of little smaller ones. You get 7,500. There was a $7,500 one from Citibank, $5,500 from American General. So you maybe get, what? 25,000, maybe 30,000 in personal loans.

(00:34:22)
Maybe you could then get another 20 or 30,000 in regular credit cards. 10,000 here, 8,000, 5,000, and then you go to the lower department store cards and you go to Home Depot, you get 1,000, you get 500. So it ends up being maybe you can get 50, 60,000, and if you were really good, you could get up to 80 or 100,000 in credit cards and personal loans if you really knew what you were doing. But-
Lex Fridman
(00:34:49)
Per person, per identity?
Matthew Cox
(00:34:50)
Per identity. But I had the ability to leverage those perfect credit profiles against properties, and I mean, ultimately that’s what I ended up doing, so each one of those identities was worth a few million.
Lex Fridman
(00:35:09)
Can you explain how that works, so to leverage them against property? So how does that work with the mortgage?
Matthew Cox
(00:35:14)
So what I did eventually, I mean this is down the road, but at this point my whole life had kind of gone off the rails. I was on federal probation, and so what I decided I was going to do was start running a scam, a much larger scam. And what I was going to do was I was going to start flipping properties, right? Buy houses cheap, fix them up and sell them.

(00:35:41)
There’s an area of Tampa called Ybor City. So I was going to start flipping houses in Ybor City. I thought, “Okay, I can buy these houses for,” you could buy a really crappy house at that time for 50, $60,000, let’s say 50. And then you could put $25,000 into it in renovations. You could renovate it for 25 and maybe you could get an appraisal for 100. So I thought what I could do is, “I can buy these houses, renovate them and sell them to regular people.”

(00:36:24)
But I also had been working on the synthetic identities. And then I thought, “well, or I could just sell them to synthetic identities.” And then I wouldn’t have to dump 25,000 into it, right? And these guys are perfect. They have perfect credit. I can provide W-2s and pay stubs because by this point I’m manufacturing businesses. So I’ve incorporated businesses, I’ve got websites for the businesses, W-2s, pay stubs, so these guys look perfect.

(00:36:51)
So I figure I’ll buy these properties for 50,000, sell them to these guys for 100. Maybe I’ll pocket 40 or 50,000. I don’t really have to do anything. But that seemed shortsighted. So I thought, “What would be even better is if I did a little bit of renovation and then I sold it for much higher.” Maybe I put in 10,000, clean up the outside of it, because these guys don’t care what the inside of the property looks like. They don’t exist.

(00:37:15)
“But how am I going to get an appraisal for $100,000?” Well, you know how appraisals work? Okay, so the bank sends an appraiser out, or at that time you could provide an appraisal and they can review it. So they’ll do what’s called a desktop review, where they review it on the computer and never go out to the property, or they send someone out. They call that a field review. They send someone out and they just look at the house. They don’t go in it though. So I have to clean up the outside of the house.

(00:37:53)
But the problem is if you’re trying to sell that house for let’s say 200,000, the other houses, they have to pick three comparable sales in the area that are also going to support a $200,000 sales price.

(00:38:11)
Well, there’s no other house that’s selling for 200,000 near this house. So I thought, “If I want to get these things appraised for 200, 250,000, I have to have comparable sales and that appraisal is going to be reviewed.” So what I did was I went out and I bought this house for 50,000 and I recorded the sale at 200,000.

(00:38:41)
So when you buy a house for $100,000, you pay $700 in doc stamps. But if you pay an extra 700 bucks, the sale shows up for 200,000. I’m buying these things for 50, so I’m paying $350, and I’m just paying an extra $1,050. So it ends up being $1,400, but the sale shows up at 200,000 on a house, a crack house, that I bought for $50,000.
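The doc-stamp arithmetic he describes checks out. A minimal sketch in Python, assuming the Florida deed rate of $0.70 per $100 of recorded sale price, which is the rate his figures imply:

```python
# Doc stamp tax arithmetic from the story above.
# Assumption: $0.70 per $100 of the recorded sale price
# (the rate implied by his numbers, not stated outright).

def doc_stamps(recorded_price: int) -> float:
    """Tax owed when recording a sale at the given price."""
    units_of_100 = -(-recorded_price // 100)  # round up to the next $100
    return units_of_100 * 0.70

actual_price = 50_000     # what the crack house really cost
recorded_price = 200_000  # what he recorded the sale at

tax_at_actual = doc_stamps(actual_price)      # $350
tax_at_recorded = doc_stamps(recorded_price)  # $1,400
extra_paid = tax_at_recorded - tax_at_actual  # $1,050
```

So for about a thousand extra dollars of recording tax, the public record shows a $200,000 comparable sale.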

(00:39:11)
Now I go, I trim the trees, we mow the yard, we clean up the porch, we put the porch rail on maybe, we paint it real nice. We black out all the windows. You can’t see inside, but from the curb it looks great. I get an appraisal. So I do that with that house. I do that with another house all within a mile. So I buy four houses knowing there’s a subject and three comparables for all of them.

(00:39:37)
So the first thing I did is I bought four houses for 50,000, 60,000, 40,000 and I recorded the values at 210, 200, 190. So I get an appraiser to come out there. He appraises it. Of course, he says, “It’s horrible,” but there’s comparables here. Now, of course it is in bad shape, and he says, “It’s in bad shape,” but I go ahead and I correct all that. So I correct it.

(00:40:02)
So now if you review the appraisal and you’re in California, or even if the appraiser comes to the house and looks at it from the street, it looks fine. But the truth is, I’ve got $60,000 into this property and you’re appraising it for 200,000. So the bank, they’re not going to lend 200, but they’ll lend 190. So the bank is ready to lend this synthetic borrower $190,000 on a house that I have 60,000 in. So I schedule a closing and we close on the house and I walk away with $130,000.
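As a quick check on those numbers, assuming the 95% loan-to-value cap he quoted earlier (all other figures are from his account):

```python
# Flip economics described above: cash in vs. loan proceeds.
purchase_price = 50_000    # crack-house purchase
light_renovation = 10_000  # curb-appeal work only
cash_in = purchase_price + light_renovation  # $60,000 invested

appraised_value = 200_000
max_ltv = 0.95             # the 95% loan-to-value cap he quotes
loan_amount = appraised_value * max_ltv      # $190,000

walk_away = loan_amount - cash_in            # $130,000 at closing
```

The spread between what the bank lends against the inflated appraisal and what he actually has in the deal is the entire profit.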

(00:40:45)
And the thing is, the problem was, by the time I got to this point, I knew so many people in the industry that nobody really had to show up at that point. Although I’ve had people show up for the synthetic identities and sign for them, at almost all the closings, nobody ever showed up.

(00:41:03)
I just showed up at the title agency and said, “Hey, my borrower, he’s at work right now. He can’t make it. Can I just take the file and I’ll have him sign all the documents at his work and I’ll bring them back? He’s like an hour and a half away from here. I’ll be back in two or three hours.” And they’re like, “Oh, wow, man, Matt, thank you so much.” And they would give it to me and I’d go sit in the parking lot and I’d sign all the documents and I’d wait an hour or two and I’d come back in and say, “Here you go.”
Lex Fridman
(00:41:32)
How were you able to keep all of this in your mind because you have to not slip up in any of these conversations?
Matthew Cox
(00:41:37)
It’s pretty easy for me to keep everything in the correct category. Does that make sense?
Lex Fridman
(00:41:48)
Sure.
Matthew Cox
(00:41:49)
I’m not great at a lot of things, but this I was very good at.
Lex Fridman
(00:41:53)
Well, there’s these phantom people that exist and they were becoming real people in your mind, as in you’re able to tell good stories with those people, right? Because if you’re talking to the appraiser, you’re talking to everybody involved.
Matthew Cox
(00:42:09)
Well, keep in mind, the appraiser almost never meets the borrower. Never. 99.99% of the time they never meet them.
Lex Fridman
(00:42:17)
But you have to talk about them?
Matthew Cox
(00:42:19)
Yeah.
Lex Fridman
(00:42:19)
So I guess what I’m asking is you’re able to converse fluently about these synthetic identities.
Matthew Cox
(00:42:26)
Yeah. They all had different jobs. They were all on the job for five years. A lot of it was-
Lex Fridman
(00:42:34)
Sure. There’s a template.
Matthew Cox
(00:42:36)
Exactly.
Lex Fridman
(00:42:38)
I got it.
Matthew Cox
(00:42:38)
Listen, as a matter of fact, almost every one of them had the same birthdate, because who knows? So it wasn’t difficult, and keep in mind, a lot of the brokers barely ever meet the borrower. They call in on the phone, but it didn’t matter anyway, because I’m walking in saying, “I got a slam dunk deal for you.” And they’re like, “Oh, wow, Matt, you got the W-2s, the pay stubs. You got all their rental history, you have everything done. It’s perfect. Thank you so much.” They’re happy to do it.

(00:43:09)
“Hey, I’ll print up the docs and I’ll have them go sign it.” “Great. Wow, thank you.” Assuming they didn’t already know about it, and almost everybody involved in this by the time I was done, was involved. There was probably 15 or 20 people that all knew what was going on.
Lex Fridman
(00:43:23)
They knew the full depth of it?
Matthew Cox
(00:43:26)
Yes. Maybe not 100% everything, but they definitely knew this is fraud.
Lex Fridman
(00:43:32)
And they were still going along with it?
Matthew Cox
(00:43:36)
Keep in mind that even when, I’ll give you an example. One of my, let’s say, and this happened with almost all of them, was, he would buy five houses. So the basic design was I buy the houses, I record the values higher, and this person buys all five houses, refinances them. He ends up borrowing a little bit over a million dollars in his name.

(00:44:11)
Then of course, then I go and I get personal loans from several banks. I get credit cards. I run up all of his credit cards. By this point, I’ve got 10, $20,000 worth of credit cards in the guy’s name. So the guys are all worth a million, a million and change. Well, once I stop paying, you start getting letters from the collection companies, right? From the banks, and then they sell them off. So after about three months, you’re getting tons of letters.

(00:44:37)
And what I would do is I would take my borrower’s name, I would go online and I would find, or I’d go in the newspaper and I would find an article about, let’s say a 12 car pile up. So there’s a huge accident on I-4. It’s very dangerous. So there’s a 12 car pile up, and someone in the accident was life flighted to Tampa General Hospital.

(00:45:04)
I would cut and paste that article and I would just insert my borrower’s name into the article saying that, “Brandon Green was life flighted to Tampa General Hospital. He’s currently in critical condition.” I would then print that article out on newsprint. I’d then make a copy of it. Cut it up, make a copy of the newsprint, highlight his name, and I would write a letter from Brandon Green’s fictional sister to the collection companies saying, “Several months ago, my brother was in a horrible car accident. He is currently…”

(00:45:41)
They’ve got the article, they have the highlighted name. He clearly was in this accident. “He is currently in a coma, and the doctors say, ‘Even if he wakes up from the coma, he will never work again.’ So you might as well just foreclose. Stop writing us letters and take the houses back.” And that’s all they’re looking for, is a reason.

(00:46:02)
At this point, even if they look into Brandon Green, they can’t figure out if he’s a real person or not because he’s got a social security number and everything went bad at the same time. He’s got multiple rental properties or his primary residence, all of his credit cards went bad, everything went bad. We have an excuse. We have a letter. That happens. People get divorced, they lose their job, they get in accidents. It’s reasonable.
Lex Fridman
(00:46:27)
When they look into it, it all looks legitimate.
Matthew Cox
(00:46:30)
Even if they ordered another appraisal, by this point it’s not four comparable sales or three or four comparable sales, by this point it’s 10, 15, 20, 30, 40, 50 because I kept making more and more of these guys.
Lex Fridman
(00:46:43)
What was your, just almost like a tangent, what’s your thinking process? There’s a lot of cleverness going on here. So the car pile up as a solution. The newspaper and you mail it. Are you sitting there alone and thinking through this? How do you come up with that idea? It’s a very interesting, a very clever, innovative idea.
Matthew Cox
(00:47:05)
So at first, I thought about making a fake death certificate. He died. But I thought, “I don’t know what if,” some of these places had primary mortgage insurance, “what if the primary mortgage insurance, what if they try and claim because he was dead or I don’t know. I don’t know that side.” So I’m like, “I don’t want to do that. I want to do something that’s semi verifiable and a third party’s telling you this is what happened.”

(00:47:32)
I thought, “Well, like the newspaper, or do I claim bankruptcy?” And I’ve done that. I’ve gone and gotten the bankruptcy forms. You can go to the bankruptcy court and they’ll give you forms to mail to all of your creditors. You mail them and they stop contacting you. They wait to be contacted or notified by the bankruptcy court. But my fear there is, “Nobody’s ever going to notify them. I’m not going through bankruptcy for one of these guys.”

(00:48:01)
So it was like, “This is a better bet than just writing a letter saying, ‘I’m going through a divorce. My wife’s keeping those houses. That’s her problem.'” There’s lots of things you could do, but to me this was, “How do you shut it down without him dying? How do you shut that down?” This is how you shut it down. He’s in a coma. He’ll never work again. He was in a car accident. Here’s the proof. He can’t even write you. I’m his sister. I wrote you the letter.
Lex Fridman
(00:48:29)
It’s a one-time letter that seems to tie up all the-
Matthew Cox
(00:48:33)
Exactly.
Lex Fridman
(00:48:33)
… loose ends.
Matthew Cox
(00:48:34)
Exactly. I don’t know exactly what sparked that as much as there were so many other avenues that I could have gone that I just didn’t know.
Lex Fridman
(00:48:46)
But you were thinking through all those different avenues?
Matthew Cox
(00:48:48)
Yeah.
Lex Fridman
(00:48:48)
Are you mostly thinking alone?
Matthew Cox
(00:48:51)
I mean, I had guys I was bouncing-
Lex Fridman
(00:48:53)
Ideas.
Matthew Cox
(00:48:54)
… ideas off of. There were other guys that were involved in the scam. I think that scam ended up making, I think the FBI said it was 11.5 million or something. But there were so many other people that were involved in that scam that were, this guy’s getting 50, this guy’s getting 17,000, 20,000, 25,000. And we’re just doing it constantly.

(00:49:19)
And so the bank would foreclose on that property. They’d take it back. They’d put it back on the MLS. They put it back on the MLS for 200,000. It wouldn’t sell. Then they’d drop it to 150. It wouldn’t sell. Then they’d drop it to 125, 130. It wouldn’t sell. They’d drop it to 90 and somebody would buy it for 90. It wasn’t worth 90. But by that point, we’d done so many houses at that point the whole area shot up.

(00:49:47)
The FBI said we did 109 houses. I don’t think that’s true. But-
Lex Fridman
(00:49:53)
Wow.
Matthew Cox
(00:49:55)
… When I end up leaving Tampa after that scam falls apart, and the FBI shows up, Forbes came out with an article, whatever, six months later, and they said that, “The Ybor City zip code was one of the top 20 fastest-appreciating areas in the country.” And everybody was like, “Oh, that’s Matt, because this place is a dump. This is a horrible place.” And I remember one time, I talked to a guy years later, and he was like, “All the comparable sales have dried up. When you left, there was just nothing even close to 200,000.”

Arrested by FBI

Lex Fridman
(00:50:33)
You mentioned right before telling the story of this elaborate scam that you were on federal probation. How did that happen?
Matthew Cox
(00:50:41)
So I mentioned that I owned the mortgage company.
Lex Fridman
(00:50:43)
Yes.
Matthew Cox
(00:50:43)
So I had started a mortgage company. I had maybe a dozen guys working for me, and there was fraud. I would say it wasn’t all fraud, but whatever, 60, 70% of it was fraud that was going in there. And from the outside of that business, it looked very legitimate. We were an FHA approved lender. We were a VA approved lender. We did conventional, probably signed up with 40 or 50 subprime lenders. But there was a considerable amount of fraud. It became a game, right?

(00:51:23)
I started getting just more and more creative. Like I said, every time I would get away with something, you become emboldened by it. It’s like, “Nice.” “Hey, the underwriter’s looking for this and looking for this.” And you sit there and go, “Man. What am I going to do? You know what we could do? We could create our own bank.” “What?” “Yeah. Here’s what we’re going to do. We’re going to go on…” How do they know if this bank exists? These people are in California, they’re in New York. They don’t know.

(00:51:48)
“So what we’re going to do is we’re going to go online,” and keep in mind, this is 2000, 2001. The internet’s in its infancy still, right? I remember GoDaddy, I think, had just come up with a site where you could build your own website. How cool is that? So I go online with a buddy of mine, and we create something called the Bank of Ybor. We cut and pasted things that we liked from other banks. We got a 1-800 number you could call, or an 866 number, whatever it was, and you could call it, and it would go to a voicemail.

(00:52:28)
So we set up this bank, and then I ended up making bank statements, which by this point, I already had been making bank statements to prove someone has their down payment. Because a lot of times people, they have good enough credit to borrow 95% or 90%, but they don’t have their down payment. So we’d raise the purchase price high enough to cover their five or 10% down payment.

(00:52:50)
We would bring their down payment for them, or we’d have the owner of the house bring the down payment for them. Then we would have a check cut out of the closing statement to a construction company that I owned, and we’d get our money back. So they get into the house for 100% financing, or 110%. Some of them turned into 130. We want to pay off their car, give them an incentive to sign. They still don’t have the money to buy it. So we were doing all kinds of insane things.

(00:53:23)
Well, at some point, remember Gretchen Zayas, my old manager?
Lex Fridman
(00:53:32)
Yeah, the original.
Matthew Cox
(00:53:33)
Yes.
Lex Fridman
(00:53:33)
The OG.
Matthew Cox
(00:53:34)
She came and worked for me for a short period of time, and then she and her husband went and opened their own mortgage company, which you should have known was going to be fraudulent from the get-go, because it was called Creative Financing. It was CFM, Creative Finance. No, Creative-
Lex Fridman
(00:53:54)
Creative was in the name.
Matthew Cox
(00:53:55)
Yeah, Creative was in the name.
Lex Fridman
(00:53:59)
It’s really on the nose.
Matthew Cox
(00:54:00)
So she’s doing very well, and we became very close, by the way. We’d go on vacation; we went to Puerto Rico together. I was married at the time. Our kids play together. We babysit. We go to each other’s parties. We’re close. We’re good friends. And she’s got her own mortgage company. She calls me up periodically and asks me, “Hey, can you make a W-2?” Or, “Hey, can you make me a pay stub?” “Sure, no problem.” We’re friends. That’s what fraudulent friends do.

(00:54:32)
So if I needed somebody to verify rent or verify somebody’s rental history or employment, she had a cell phone, she would answer that sort of thing for me. Well, what ends up happening is she gets in trouble. She starts doing fraudulent loans for some guys, and these guys are doing what’s called a cashback scam. So they’re getting a half a million dollar loan on a house that’s worth $300,000. So they’re buying the house for whatever, 600,000. It’s really only worth 300, 350.

(00:55:21)
But she happened to be in an area where she could get the appraisal jacked up. So they buy the house, they get two, $300,000 back, and it’s a straw man scam, right? It’s a cash-back straw man scam. So this is a real person that’s buying the house. He’s got perfect credit, but he’s willing to ruin his credit to get a couple hundred thousand in his pocket. So he never has any intention of paying. So it’s not a synthetic identity. It’s not a stolen identity. It’s a straw man. He’s not a fake person, but he’s just a straw man. He’s a stand-in.

(00:55:53)
So he stands in, he signs the paperwork, he buys the house. They end up getting two, 300,000. Well, this guy buys like five houses, so it’s two, $3 million. They’ve lost six, $700,000 and these guys never even make the first payment. They just let them go into foreclosure. So the bank immediately investigates and realizes this is fraud.

(00:56:15)
So the FBI comes in, they grab Pete and Gretchen. She has to hire an attorney, of course, and she doesn’t get thrown in jail or anything. They just come to their office and they tell them they’re investigating them. They know what’s going on and they’re like, “Well, look, we want to talk to you. You’re going to be indicted.” “Okay.”

(00:56:31)
So she comes to me. Well, actually Pete came to me and said, “Look, Matt, can you refinance our house and get us 75,000 out to pay our attorney?” I said, “No problem.” Gretchen gives me W-2s, pay stubs. The whole thing’s fake. I refinance. I get a second mortgage on her house. $75,000, they pay their attorney.

(00:56:51)
Their attorney immediately says, “You need to wear a wire on this guy. He just got you $75,000. I don’t know how you got $75,000.” The attorney knows something’s wrong because the attorney’s like, “Your whole mortgage company was just shut down. There’s no way you could borrow $75,000.” So he’s like, “This guy’s doing fraudulent stuff.” And she says, “Yes, of course he is.” And he says, “You need to work with the FBI, wear a wire against this guy.”

(00:57:16)
So she calls me one day and says, “Listen, I got to talk to you. The FBI is asking questions about you.” And I go, “What?” And she goes, “Yeah.” I was like, “Meet me at the pizza place down the street. Don’t come into my office,” because everybody knows she’s been indicted. Everybody in her office quit. When the FBI shows up and gives you a business card and announces they’re the FBI, everybody quits. So I said, “Don’t come here.” Because they already know; they’re already concerned.

(00:57:42)
So I go and I meet her and Pete, and we sit down at a restaurant, a little pizzeria. I sit down and she starts telling me that the FBI is asking questions about me. And I’m like, “Well, what are you talking about? What are they asking?” And she goes, “Look, they came in, they took all our files.” And I was like, “I didn’t know any of this.” I’m like, ” When did this happen?” She’s like, “A couple of weeks ago and they have some of your files.”

(00:58:05)
Because I had closed several loans for my wife at the time. We were buying rental properties. My wife didn’t have a job. So it’s all fraud. But I could not close those loans at my mortgage company because I own the property. So I’m selling those properties. I bought properties, renovated them, and sold them to my wife to get around something called seasoning.

(00:58:35)
Seasoning says you have to wait six months to a year to refinance at the market value. Otherwise, if you want to refinance, that’s fine, but you have to refinance at the price you purchased the property at. But I bought these properties for 80 or 100,000, renovated them, sold them for two, 300,000 to my wife, who didn’t even get a big mortgage. We were just trying to get around a guideline. But my wife was not working, and I provided W-2s and pay stubs.

(00:59:03)
So when she says all this, she says, “They’re looking at the loans you gave me, at your wife’s loans.” And I went, “Oh my God.” I said, “Well, you didn’t tell them that the W-2s were fake, did you? You didn’t tell them the pay stubs were fake, did you? You didn’t tell them that the down payments were? You didn’t tell them that we were married, did you?” I mean, just absolutely buried myself.

(00:59:27)
And as I’m telling her this, I kind of caught myself and I went, “Okay, wait, wait, wait a minute. Look. Okay, here’s what you’re going to tell them. You’re going to tell them you never met her. She called on the phone.” I start trying to devise a plan that will answer their questions without getting my wife in trouble or them in trouble. And if nobody cooperates, the whole thing should shut down. It doesn’t go anywhere. There’s nowhere for them to go if everybody just kind of stonewalls them.

(00:59:58)
So as I’m saying all this, Gretchen says, “Matt, we can’t lie to the FBI.” And I go, “What are you talking about? You’re already lying to the FBI. I mean, you’ve been lying to the FBI. I mean, I just refinanced your house.” And before I can really say anything, Pete jumps up, her husband stands up, and he says, “We’ve never lied to the FBI. We may not have told them everything, but we’ve never lied.” And I thought, “Who are you talking to?”

(01:00:26)
I know that’s not true. So you’re not saying that for my benefit. So I kind of look at them and I’m like, “What?” And I remember looking down, and this may mean nothing, but both of their cell phones were right next to me, right? And I remember they were probably just wearing wires. But I just remember thinking, “Those cell phones are microphones.”

(01:00:49)
They probably weren’t. But I remember thinking, “Oh, wow.” And I looked at her and I went, “Wow.” And I said, “Well, I hope you’re going to get something for this.” She immediately starts crying and she says, “Matt, I’m sorry. I have a kid. I can’t go to jail.”
Lex Fridman
(01:01:07)
Do you have kids at that point?
Matthew Cox
(01:01:08)
Yeah, I have a kid. I have a kid. And I was like, “Wow.”
Lex Fridman
(01:01:14)
What have you learned about friendship from that? Loyalty?
Matthew Cox
(01:01:18)
Oh, there’s no… It’s sweet.
Lex Fridman
(01:01:21)
That must have hurt.
Matthew Cox
(01:01:22)
It’s cute. I mean, I love the idea of it.
Lex Fridman
(01:01:25)
You don’t think that?
Matthew Cox
(01:01:26)
No. I’ll tell you why. So I go back to my office. I remember I told her, I said, “Tell the FBI agent to call me on the phone. Do not come to my office.” So I go back, I’m still trying to figure out how to weather this, right? I go back, I sit down. The phone rings. My secretary comes in and says, “Hey, Agent,” I’ll never forget the guy’s name, “Agent Scott Gale with the FBI.” And I was like, “Okay, he’s on the phone.”

(01:01:52)
She’s standing there. I was like, “Close the door. Get out and close it.” She’s like. So I get on the phone. He asked me if I’ll come down. I said, “Yeah, absolutely. Let’s schedule it for next Tuesday.” I put it off four or five days. I go to my brother-in-law immediately, who’s a lawyer. And he says, “Oh, yeah.” I don’t really tell him exactly what’s going on, but I tell him, “This is what’s happening kind of and I may be in trouble. I need a federal defense attorney.”

(01:02:21)
I don’t even know what a federal defense, I don’t even know the difference. But he said, “You need a federal defense attorney. It’s the FBI.” So we meet a couple lawyers. I end up getting a lawyer. I give him 75 grand. Initially, he had me convinced I was probably going to go to jail for a few years, but really that’s what they kind of do to justify you giving them $75,000.
Lex Fridman
(01:02:45)
Right.
Matthew Cox
(01:02:47)
But the more I thought about it and read, he gave me the guidelines, the sentencing guidelines that supposedly covered the fraud that I had committed. And I read them and I was like, “I’m not really in trouble here, because I’m looking at a felony, but I’m not going to go to jail.” Because there was no potential for the bank to lose money.

(01:03:14)
Because I bought the house with a hard money loan and then I renovated it with my own cash. And when I sold it, it appraised at 250,000. My ex-wife borrowed 180. So there’s plenty of equity. If the whole thing had gone into foreclosure, they still would’ve got their money back. And to be honest, by the time all of this happened, there was only three properties. It was five, but we’d already sold a few. At this point, we’d just sold another two. There’s like one or two properties left.

(01:03:43)
So at that moment, we were selling them. So I was like, “No,” and I kind of argued with him. But then he wanted 75 grand. I gave him 75 grand. And then he comes back and he says, “Good news. There was no potential loss. So I can get you probation.” Now here’s the thing, here’s what I always kind of look back at. When I first went into his office, he said, “Listen, you haven’t been indicted yet. I spoke with the FBI, I spoke with the US attorney, they believe, and they’ve been told…”

(01:04:16)
He said, “Look, they didn’t tell me exactly what they have, but they said that the evidence they have on you, based on two confidential informants, means you cannot go to trial.” And I was like, “Right.” Of course, I knew that one. And I was like, “Okay.” He said, “But you haven’t been indicted yet, and they are fairly certain that you’re running a mill, right? A fraud mill over there, and that you guys are churning out fraudulent loans.”

(01:04:45)
“Now they can’t come and raid your office and do anything about it yet, because so far they only have you. But here’s what I’m saying,” he said, “I can keep you from being indicted. It’s called pretrial intervention. What we’ll do is you go in, talk to the FBI, you grab a bunch of your mortgage brokers’ most egregious files. Grab them, bring those files to the FBI. Go work with the FBI, they will indict them, and you will not be indicted.”

(01:05:27)
And I said, which I kick myself to this day. I said, “Absolutely not. I’m not going to snitch on them. I’m not going to cooperate. I’m not going to,” I’d seen the Godfather, you’re not supposed to cooperate. You’re supposed to be loyal. “I’m not going to do any of that.” And so I say all of this where looking back, if I could go back in time, I would’ve gone into our weekly meeting with a dolly and I would’ve walked in front of everybody and scooped up two or three of the file cabinets and put them in the back of a truck…
Matthew Cox
(01:06:03)
… of everybody and scooped up two or three of the file cabinets and put them in the back of a truck and said, “Listen, you guys are going to be talking to the FBI soon. I suggest you get attorneys.” And I would’ve driven off but I didn’t. I thought, “No, be loyal. Don’t do that.” And what happened was when the other thing falls apart, when the next scam falls apart, every one of these people go to the FBI. Like they’re not even coming to them. These guys are going to the FBI with lawyers. “I want to cooperate. I want to tell you what Cox did. I want to help. I want to” … and I’m thinking I never had to get indicted to begin with.
Lex Fridman
(01:06:41)
So you think that most of these people, from your experience, are going to sacrifice all integrity. That’s a funny word, sacrifice-
Matthew Cox
(01:06:49)
I’m not sure that applies to this, but that’s all right.
Lex Fridman
(01:06:53)
They’re going to sacrifice friendships and loyalty just to save their own ass.
Matthew Cox
(01:07:00)
Yeah. I only had one person that did not talk to the FBI. I had one person that every time the FBI or the Secret Service went to that person’s door, she said, “Don’t come to my house again. I don’t have anything to say about Matt. I have nothing to do with any of this. Talk to my lawyer.” And this happened over and over again. And that’s my ex-wife. She’s a gangster.

Omerta: Code of silence

Interviewer
(01:07:24)
So are there people in this world you trusted or you still trust?
Matthew Cox
(01:07:30)
The problem is eventually I cooperate. And at the time, I didn’t want to cooperate. I didn’t believe in cooperation. But after seeing how many people cooperate and the way the system is set up, I think that my understanding of loyalty is vastly more realistic now. And I think that if you are committing crime, if you are absolutely like the things I did, I did a bunch of scumbag things. I mean, I’m not killing people, but I’m doing scumbag things. I’m lying, cheating, stealing. I’m a thief. You boil down to it. That’s what I am. So you can’t go around behaving like a scumbag, dealing with scumbags and then expect those same scumbags to suddenly abide by some kind of a street code and not roll over on you. And it does happen, but it’s in the 90 percentile of people that cooperate, 90 something percent. And people cooperate when they’re not even looking at any real time.

(01:08:46)
So if you’re looking at 30 years, and especially after going to prison, you go to prison and it’s like this guy’s a standup guy over here, he got 30 years. He could have cooperated against all of his co-defendants but he didn’t. Nobody comes to see him. His wife divorced him. His kids ended up in foster care. His friends are cleaning out his house. Nobody puts money on his books. Nobody comes to see him. Nobody answers his phone. Nothing. He took 30 years. Most of those guys turned around. They end up getting indicted for other things. Years later, they cooperate. And the best thing this guy’s got going for him is that he can walk around and say, well, he’s a stand-up guy. That guy’s going to the same halfway house as me. He’s going to do 30 years where I’m going to do 10.
Interviewer
(01:09:39)
A stand-up guy meaning he never snitched.
Matthew Cox
(01:09:41)
Right.
Interviewer
(01:09:42)
And so everybody’s seeing this example and saying, “Well, I’m going to snitch then.” But it sounds like what people are doing is they’re virtue signaling, like they would never snitch and actually do secretly.
Matthew Cox
(01:09:59)
I mean I remember I talked to one of the COs at the prison one time and I said, “Shit, 50% of the guys here snitched.” He goes, “It’s more than that.” “But listen,” he goes, “a hundred percent of them are lying about it.” He said, “There’s nobody here that’s going to tell you they snitched. Nobody.”

(01:10:20)
So there’s guys, tons of them that cooperate. If 80-90% of defendants cooperate, you start doing the math. And if you ask 10 guys in prison, all of them say, “I didn’t cooperate. I didn’t cooperate. I didn’t cooperate.” Okay. Well, you ask a hundred. “I didn’t cooperate.” Nobody’s going to say, “I cooperated.”
Interviewer
(01:10:38)
Does that break your heart a little bit that people back stab each other like this?
Matthew Cox
(01:10:44)
It does. It does but I have such a low opinion of people. You know what I’m saying? I don’t expect … It’s not that I don’t like people. It’s that I just don’t expect anything of them. I don’t expect you to look out for me. There was a time when I did. I thought, “I look out for you. You should look out for me.” But I just don’t expect that anymore.
Interviewer
(01:11:06)
See, but I think humanity flourishes because there is a lot of people out there that do the thing that is difficult to do in terms of integrity.
Matthew Cox
(01:11:17)
That may be but these aren’t people with integrity. These are criminals. If these were decent human beings, and all of them will tell you, “Well, why’d you do that?” “Oh, I was a drug addict” or “I needed the money.” Well, if you were a decent human being, you would have gotten off the drugs. You would’ve gone and gotten three jobs. You can work 80 hours a week. I’ve done it. You can work 84, 85, 80. You can work 90 hours a week. You can do that. “Oh, I did it for my kids.” No, you’re lazy. You could have worked three jobs for your kids. Instead, you decided to sell methamphetamine. “Well, I was addicted.” You could have gotten off meth. It wasn’t important. It was the easy way out. You’re not someone with integrity.

(01:11:56)
So for you to sit there and say, “Hey, I’m going to act like a scumbag, but now I got caught or you got caught and I don’t want you to tell on me.” Well, you’re a guy that robs banks. You stick guns in people’s faces. You kidnap people, you torture people. You sell drugs. You’re not a moral, ethical person, but you want everybody else to hold up to some ethical code while you’re robbing grandma. That’s not right. So I get the whole omerta code, and there was a time when I was delusional enough to believe that. But after going through it, no. And after going through it multiple times, no.
Interviewer
(01:12:43)
I have to really think about that. I deeply appreciate your honesty on this. There’s all kinds of criminals in this world, and they all have all kinds of stories. And your story is one of … I don’t know if it came from desperation versus a love of this kind of game. Like wasn’t part of it an attraction to the creative aspect of this, of breaking the rules when nobody else can and you figure out a way to do it?
Matthew Cox
(01:13:38)
I think initially it was I needed the money. That’s the first thing. You say, “Oh, okay. Well, I need” … and if you ask most guys, “Oh, well, man, I needed the money.” You needed the money. And I definitely needed the money. But then you get $50,000 in your bank and then you get a hundred, and then it’s 200, and then it’s half a million and then it’s a million. And what the hell are you still committing fraud for? You’ve got half a million or a million dollars in the bank or worth of real estate, or you’re making five, $10,000 a month just in rental income. Why are you still committing fraud?

(01:14:16)
So I think it morphs into the creativity, in part, for me. And two, it was a chance for me to prove to everybody how smart I was. It was done out of desperation initially, and then it just turned into pure narcissistic arrogance. “Look at me, look at how I can do things that nobody else can do. Look how smart I am. I just walked into Bank of America, handed them seven documents that were all fraudulent and they cut me a check for $250,000. Like, wow, I’m amazing. And guess what? They’re never going to get their check. And they won’t even know where to start to try and find the person because they’re looking for a phantom.”

(01:15:05)
And you feel great. I felt great. I felt like James Bond. I felt like 007. It was amazing. And it fed my need to feel important, even if that was a lie, because all that success was just a lie.
Interviewer
(01:15:29)
Well, no, you were good at it.
Matthew Cox
(01:15:31)
I was good at it, but it’s not-
Interviewer
(01:15:34)
It was illegal.
Matthew Cox
(01:15:35)
It’s not like I’m Elon Musk. You know what I’m saying? It’s not like I’m an exceptional human. I’m an exceptional human being at a horrific thing, at committing fraud.
Interviewer
(01:15:46)
Well, the question is how many people are getting hurt? Because-
Matthew Cox
(01:15:50)
The thing is, initially, nobody got hurt. That’s the thing. Nobody ever lost any money directly. I didn’t go and say, “Give me $50,000” and I ran off with your money. I wasn’t doing that. And that was a great justification. But at some point, and we’ll get into that, I take off on the run and people do lose money. I didn’t take that money directly. And for some reason, in my sick mind or whatever the case may be, that seems like a distinction to me that makes me feel okay, is that I never said, “Give me 300, give me $10,000,” and I ran off with it.

(01:16:29)
But I put people in a position where I damaged the title to their house and they had to go get a lawyer to fix that and so they had to go pay a lawyer $10,000. So I absolutely caused that person … To me, it’s you’re a victim and I owe you that money. And it was a shitty thing to do because, even at the time, I was like, “Oh, they’ll make a couple of phone calls, it’ll be fine.” It wasn’t fine. And if I had really put any thought into it at all, I would’ve known it’s going to really affect these people. And those people had done nothing wrong with the exception of trusting me. They rented me their house or they owner financed their house. They made the mistake of bumping into me and now they owe $10,000, $20,000 and I’m sure a ton of anguish.
Interviewer
(01:17:28)
So what happened when you were caught that first time?
Matthew Cox
(01:17:32)
So I was caught. I got three years probation. I took the probation.
Interviewer
(01:17:39)
What does that involve?
Matthew Cox
(01:17:40)
Initially, it was just a slap on the wrist.
Interviewer
(01:17:45)
Were you allowed to still practice-
Matthew Cox
(01:17:48)
Okay. So I wasn’t. I couldn’t own the mortgage company anymore. That was a good question because you would think wouldn’t it be great if I could keep on going? But what they said was you have to forfeit your brokerage license and your brokerage business license. And what I did was I transferred my brokerage business license to a guy that essentially bought my business. They allowed me to work as a consultant in the mortgage industry because my lawyer goes to the judge and says, “What else can he do?” And so I have a friend, his name’s Dave Walker. He was a CPA. He came in and he bought my business and he paid me like $9,000 a month and that covered my bills. My wife and I got divorced, so she’s my ex-wife.

(01:18:44)
And I don’t know what to do. I could have … You look back and it’s like I could have claimed bankruptcy. I could have moved into my parents’ spare room, something like that, because I lost everything in my divorce. I had a huge child support payment. Not that that has anything to do with my ex-wife. I absolutely signed up for that. I wanted to pay that but it was a chunk of change. So we’re talking about a couple thousand dollars a month for child support. She got all of the apartments that we had. We had about a million, million and a half dollars’ worth of apartments, which isn’t a lot now, but that’s probably five or six million dollars now. So she got all the apartments, so she got everything. So now I’m sitting here. I can’t be a mortgage broker. I can get my $9,000, but I have to help this guy run this company, train people, do that sort of thing.

(01:19:47)
So what I decided to do was I was going to start flipping houses.
Interviewer
(01:19:52)
Legitimately or not?
Matthew Cox
(01:19:53)
Well, initially, I thought about doing it legitimately but at the same time I was also in the middle of figuring out how to make these synthetic identities. So I’m making the payments every month. Remember? Two months in, three months. No credit scores. No credit scores. No credit scores. And I’m also saying I’m going to start buying houses, renovate them, sell them. So the truth is we actually renovated probably one house completely. I remember it was on 26th Street. We renovated the house completely-
Interviewer
(01:20:25)
On the outside and the inside?
Matthew Cox
(01:20:27)
Yeah, outside, inside. It’s done. It’s good.
Interviewer
(01:20:30)
Okay, great.
Matthew Cox
(01:20:31)
Me and this guy, actually Dave, Dave Walker, the guy that bought my business. So we renovate it and it just so happens at the same time, I go to pull credit one day and, wow, 700-plus credit scores. And I went we don’t have to sell this thing at all. I can sell it and put it in this guy’s name and let him refinance it. So that’s what we did. I ended up selling it to this synthetic identity.
Interviewer
(01:20:59)
Do you remember the first synthetic identity, the name?
Matthew Cox
(01:21:02)
The first one was a Joel Cologne, and then I started getting creative because the ones after that, I started naming … So I had Joel Cologne and an Alan Duncan, but then I … Do you remember the movie, Reservoir Dogs?
Interviewer
(01:21:16)
Mm-hmm.
Matthew Cox
(01:21:17)
So I started naming the characters after guys in the Reservoir Dogs. So I had a James Red, I had a Michael White, Lee Black. I had William Blue, David Silver, Brandon Green. So then I start developing these guys. Now I thought, “Oh, forget those normal things. I’m going with the Reservoir Dogs.” And I thought it was so cute too.
Interviewer
(01:21:44)
Do you think, in retrospect, that was a mistake?
Matthew Cox
(01:21:45)
It was so stupid. That was just … There’s so many things, so many mistakes I made. I mean within the fraud there are mistakes I made, but other than just the overall committing fraud, but it was just like I thought it was so cute. And then you get in front of the judge and the judge is hearing about the Reservoir Dogs and Mr. Green and Mr. Black, Mr. White, Mr. This, Mr. That. And he’s looking at me just like, “You jackass.” And what am I saying? I’m like, “Yeah, I thought that was cute.” But nothing’s cute. Plus I’m making fake banks.
Interviewer
(01:22:18)
What’s the purpose of the fake banks?
Matthew Cox
(01:22:19)
Well, sometimes you have to have your down payment in the bank. So they want three months’ worth of bank statements to see that, “Hey, he’s got his $50,000 in the bank.” And then the more properties you buy, they start to want to see what’s called reserves. They want to make sure that you can pay all your mortgage payments. If this guy loses his job, can this guy maintain all these mortgage payments for the next six months? And, see, they do that and they think you’re going to go, “Oh, no, he can’t do it.” They go, “Well, then we won’t lend it.” Well, when they do that to me, I go, “Of course, I do. Of course he’s got it. Let me send you over the bank statements. Oh, you want to call the bank? Call them.”
Interviewer
(01:22:59)
So there’s a phone number. There’s a website.
Matthew Cox
(01:23:01)
Yes. You can call. We’ll get on there. I’ll do the whole … “Hold on. Okay. What’s the name again? Do you have the account number? Hold on.” You wait a little bit and you come back. “Oh, okay, I got it here. I can’t tell you the exact amount right now, but what was his balance last month?” And you tell, “Oh, yep, that’s it. Exactly. Okay, thank you.” Click.
Interviewer
(01:23:23)
Would you do different voices or would you be-
Matthew Cox
(01:23:25)
No, I’ve done different voices or I’d just have somebody else do it. Gretchen would’ve done it or one of the brokers. Susan would’ve done it, one of the brokers that worked for me, or Kelly or Johnny Moon. I have so many guys and they just get on the phone and they do it because they’re all doing something fraud and we’re all working together. So, “Hey, I need you to call this guy. I need you to call this guy and verify this and say” … “I’m at the bank? Okay, I’m at the bank. Okay, cool.” And they call back and-
Interviewer
(01:23:50)
Does this feel like an organized system or was it more improv, just like dealing with the different situations?
Matthew Cox
(01:23:55)
The government would definitely say it was organized. I always say it was … you’re just a bunch of guys. You’re joking around with everybody and you’re helping each other, and it’s not like everybody’s kicking up to Tommy.
Interviewer
(01:24:12)
And then all these new puzzles come up and you figure out ways to solve these puzzles.
Matthew Cox
(01:24:16)
Right. You go in and you say, “Hey, I’ve got this loan. I need to get this loan. If this guy’s trying to buy this house and I need a loan that looks like this, where can we go?” And by the way, they cannot order a copy of his tax returns, so you don’t want to have to sign what’s called a 4506. So they’re like, “Oh, okay. Listen, so-and-so’s got a program.” And you go back and forth, “but you have to have this much in reserves. But you got the bank?” “Yeah, yeah, I got the bank. I could do that.” So you go in and you throw it out there to five or six guys and you’re going to come up with an answer.
Interviewer
(01:24:46)
So you’re on probation here. Just to self-reflect, did you start doing this while on probation because of the money or because it gave you meaning?
Matthew Cox
(01:24:58)
God, I mean a big part of that, the reason is I did not want to move back in with my parents and I didn’t want my father to see me struggling, and I didn’t want him to … My success, he had no idea, my success had been the first time he’d ever really been proud of me. Does that make sense?
Interviewer
(01:25:26)
Your financial success?
Matthew Cox
(01:25:27)
Yes.
Interviewer
(01:25:27)
At which point? When was the first time you told him you did something and it was like you could sense him being proud?
Matthew Cox
(01:25:33)
Oh, when I became a mortgage broker. When I became a mortgage broker and I went to work for the company, and we’re talking about within a week I got a client. Three days later, I got a client. A week later, got a client. Two days later, got a client. I closed four loans my first month and my dad was like, “Well, how much money are you going to make?” And I’m like, “Well, I’m charging this much, this. I got a point on the back. I got this. Boom. I’m thinking I’m going to walk home after taxes like 10, 11,000.” “Jesus God Almighty, are you serious? Well, see. Don’t start counting your chickens before that.” And then, whatever, three weeks later, four weeks later, boom, I got a check. It’s like $9,000 or something. And then the next month, it’s 12 and the next month it’s 16. And then they make me a manager and it just-
Interviewer
(01:26:23)
He didn’t know any of it was illegitimate.
Matthew Cox
(01:26:25)
No, he thinks, “My son, he’s brilliant. He’s great. He’s wonderful.” Was certainly not proud of me prior to that. But my dad was athletic. He was extremely bright. I mean brilliant. And I was a kid who had to be put into special schools, who barely graduated high school, who ended up going to college and getting a degree in fine arts because I was never going to be able to get a degree in business. It wasn’t going to happen.

(01:27:02)
So when I graduated college, I remember, with the degree in Fine Arts, he said, “The best thing you could do with that is maybe you could draw caricatures at Disney World.” You know what I’m saying? Which wasn’t a compliment. It wasn’t like, “Hey, you could draw” … And then I turned around and I tried to go to work for State Farm Insurance which is who he worked for. He worked for them for 40 something years, and I failed the aptitude test. So then I went and worked for another insurance company and I was an insurance adjuster, but I couldn’t keep up with the workload. Then I end up working construction. I’m still barely paying my bills. That’s basically where my dad felt like that’s … He was polite to me. We were cordial. But yeah, I think he felt he deserved a better kid.
Interviewer
(01:27:54)
Well, when you started doing mortgages, that’s when he was like-
Matthew Cox
(01:28:00)
Of course. He was like-
Interviewer
(01:28:01)
This kid’s got something.
Matthew Cox
(01:28:04)
I was driving a new … I just pulled in in a new car and I just bought a house that was four or five blocks away from his house, from where I grew up, from where he lived at that time, six blocks away from where my sister’s married to her lawyer husband. I’m doing pretty good. And then, within three months, my new wife, we buy a quadplex, and then we’re buying a triplex and another quadplex and a 10 unit and a duplex and another duplex and a quadplex. And it’s like what the hell’s going on? This guy is blowing up. He’s going on vacation here and vacation here.

(01:28:44)
So when the FBI comes in and they indict me, and I take the three years’ probation, probably the worst thing in the world other than going to prison would’ve been just having to just sell everything and go move in and start over and sell used cars. Not that there’s anything wrong with selling used cars, but I just felt like I just didn’t want to disappoint him any more than I already had. So I thought, “I’m going to flip houses and then I’ll start maybe a development company. So I’ll buy some vacant lots and all this and that.” The problem is these houses I’m buying for 50,000, if I fix them up and sell them, maybe I make $20,000, $25,000. And then you got to find a qualified borrower. It’s very hard to find a qualified borrower that wants to live in Ybor City back then.

Fake IDs

(01:29:35)
I still think it’s rough but those same houses are going for three and 400,000. So I’m buying houses. I got to get qualified borrowers. I do all the renovations. It’s a nightmare. Looking back, it’s like, “Well, then you got to bite the bullet. It’s just what you have to do.” I didn’t want to do that. I didn’t want to do it. Whether it was laziness or, I don’t know, I just thought, “I’m good at this. I’m going to run. I’m just going to start running a scam. I’m going to figure out how to drive the prices up, buy the houses for 50, record them at 200,000, and then have these synthetic identities, buy all the properties, refinance them, pull out the cash, make six months’ worth of payments, let them all go into foreclosure.” And that really, really started working well, very well.

(01:30:26)
I had one time where I had a guy, it was James Red, the synthetic identity was James Red and he had bought two or three houses, and there was somebody at the office who was friends of somebody who knew the title company where we were closing the loans, and he called her, her name was Mary, and said, “Mary, this guy, James Red, like Cox is doing something shady. James Red doesn’t even exist.” She goes and looks at her last couple files and she realizes, of course obviously, this guy never showed up. She remembers Cox picked up the files, and he’s saying he doesn’t exist. So she freaks out. She calls the mortgage broker. Mortgage broker calls me, mortgage broker calls me up and says, “Listen, Mary said she’s not closing the next loan unless James Red shows up.” And I went, “Wow, that’s a tough one.”

(01:31:22)
And she’s like, “Okay, so what do you want to do? Do you want to go to another title company?” We’re supposed to close in three days, two, three days. I said, “Well, I mean he’s going to have to show up then. I’ll figure it out. Give me a couple of days. Let me figure this out.” And she’s like, “Okay, well, I don’t know how that’s going to happen. He doesn’t exist.” Keep in mind at this point I don’t need IDs. I don’t need a real ID. I figured out how to make a real ID. I could make one. I could take sandpaper and sand off the information on a regular ID, and then I would print the corrected information in reverse on a piece of transparency, and I would glue it over there and you could still see the holograms and stuff. It actually worked pretty good. It’s not going to pass muster with a cop, but with somebody at the bank, I was able to go in and I would open a bank account with it.

(01:32:11)
Well, so one of the things I had done when I was closing these loans was I would go online and you have to pick a photo of somebody to put on the driver’s license. So I’m not making a fake ID for all these guys because I don’t need a fake ID for all these guys, not with my picture on it, but I need a copy of an ID, but I need a picture. Where do I get the picture? So I go to Hillsborough County’s arrest website, and I would find people that I knew that had been arrested. So I found a guy named Eric Tamargo who had been arrested. He had, I don’t know what it was, a DUI or domestic violence. I forget what it was but there was a picture of him.

(01:33:01)
So I print out the picture, I cut it up, I paste it onto a driver’s license, and I make a copy of it for James Red. That’s what I’ve been giving the title people. When I would close, I’d sign all the documents and I’d leave them that copy so that it looked like they made a copy of it. And then they would notarize all the documents, even though they’d never seen this person. They have a copy of his driver’s license. Everything’s signed. Cox said he signed it. It’s good, notarized. Here’s your check. So what I do is I think let me see if I can get Eric to do this. I knew he’d been to prison before, so I call up Eric and I remember one of my buddies like, “He’s never going to do this.” And I was like, “I think he will. I think he will.”

(01:33:46)
And that’s really that kind of like, “You think? What do you think? No.” “Let me try. Let me call him.” “I don’t know, bro.” That’s the kind of conversations you’re having but really, looking back-
Interviewer
(01:33:56)
I would love to hear the opener few sentences that you have with him.
Matthew Cox
(01:34:00)
I can tell you exactly what I said because it’s burned in my mind. He comes in. So what Eric was doing at that time, he was actually working for us. He worked for somebody else, but periodically we’d buy a house and we’d call him up and we’d say, “Hey, can you and your boss, can you guys come over and trim the trees of this house? Trim all the trees, take all the crap in the yard, clean it up?” They go, “Yeah, sure, no problem.” Because that’s what he did, worked for a handyman service. So they would come and they’d clean it up and they’d do that.

(01:34:24)
So I said, “Can you come over?” And he goes, “Yeah.” So he comes to the office, whatever, a few hours later, and he comes in the conference room. I said, “Hey, Eric, what’s going on?” And he says, “How’s it going?” I said, “Listen, I’m going to tell you something. I need a favor.” He’s like, “Okay, cool. What is it?” I said, “You know all these houses we’ve been having you go and clean up?” He’s like, “Yeah.” “You painted that one house. You did this.” “Yeah, yeah, yeah. I know. I know.” “Right. So here’s what we’ve been doing. I’ve been buying these houses for $50,000, recording them for 200, and then I have these fake people buy them.” And I explain, I just lay it out for him and he’s like, “Wow.” He’s like, “Fucking, bro, that’s ingenious, man. That’s smart.” Like, “Wow.”

(01:35:07)
I was like, “Okay. Yeah, I know. That’s great. So here’s the thing.” I said, “The title company, who’s been closing some of these loans, and we have a closing in a couple of days, she wants this guy James Red to show up, and I need someone to show up as James Red.” And he goes, “Wow.” He goes, “Who are you going to get to do that?” And I was just thinking just like, “You’re not understanding. I’m not confiding in you because I need a friend.” And I looked at him, I said, “Well, I was thinking you might do it.” He was like, “That’s a big favor.” I said, “It is a big favor.” “I could be in a lot of trouble.” And I said, “I know.” And he goes, “Well, wait a minute. I can’t go.” He said, “You have to give these people a driver’s license. You said the driver’s licenses, you were using mugshots. You said she’s closed a couple of these. She’s seen this guy’s picture.”

(01:35:55)
And I go, “She has seen his picture.” I said, “The thing is for James Red, I pulled the mugshot offline of you when you were arrested a couple of years ago.” And he jumps up and he goes, “You motherfucker.” And I go, “Whoa, whoa.” I said, “Eric, wait a minute. Hold on, hold on.” I said, “Listen, I only did that because I knew if it came down to this moment, you were the only person that I knew that could pull this off, that’d have the balls to walk in and do it.” And he sat there and he went, “Yeah, you’re right. You’re right.” And I couldn’t believe he fell … Listen, this guy would beat the brakes off me.

(01:36:32)
He’s like five ten, five eleven. He boxes. He’s a big guy. So it’s like I’ve weathered that part of the storm. And he sat there and he goes, “Right, right.” And he goes, “Well, I’m not doing it for free. I’m not doing it for nothing.” I said, “No, bro, of course not.” He’s like, “You’re making a lot of money.” I said, “Well, keep in mind a lot of that money goes back in the property. It’s not like we’re walking away with” … I think I said tens of thousands. We’re really walking away with hundreds of thousands. “It’s not like we’re walking away with a bunch of money, Eric. We got to buy more properties. We got to keep it going. We got to make the payments.” “I know but still I could get in a lot of trouble.” I said, “I understand, bro.” I go, “Well, what do you want?”

(01:37:09)
And I remember thinking if he asked for more than 10 or 15,000, I’ll do it myself. We’ll just change title companies and we’ll go and I’ll do it myself. And he sat there and he went, “I want $500.” And I went, “$500?” Listen, I almost started laughing. I put my hand over my mouth. I was like, “$500? It’s going to take you 30 minutes.” And he’s like, “I don’t care, bro. I could get in a lot of trouble.” I was like, “Well, I’m not paying you now. You got to sign first.” And he’s like, “Oh, you know I’ll sign. I’ll sign. I know you’re good for it.” For 500 bucks. I made a fake ID for him. He goes into the place, he signs James Red. Comes out.

(01:37:53)
What was funny about that was when we walked into the title company, we’re sitting in the lobby and Mary comes walking out, she looks at me and she goes, “Mr. Cox. I don’t know why you’re here.” She goes, “I told Kelly” … that was the broker … “I told the broker that I’m not closing the loan unless James Red shows up.” And Eric stands up on cue and he goes, “I’m James Red.” And she goes, “Hold on a second.” She runs in the back, comes back with a file, opens it up, looks at the picture, and she’s like, “Oh, I’m so sorry. Give me five minutes. I’ve got the file.” Prints up the docs. He goes in, signs.

(01:38:30)
And when we’re there, she’s passing out the checks, 5,000 here, 25,000 here, 35,000 here, 7,000 here, 6,000 here. So he sees all these checks and I’m like, “Oh, I got that. I have the construction company. No, no, no, I have that. I’ll take care of that. I’ll take care of that.” So I get all the checks and I leave. We go sit in my Audi and he sits down and he’s like, “Bro, that’s a lot of money.” “A lot of that money goes back into the properties, Eric.” And he’s like, “Ah, still, bro.” And I counted out 500 bucks. But listen, a week later-
Matthew Cox
(01:39:03)
And I counted out 500 bucks. But listen, a week later, we had another closing. So he comes in, I said, “Hey, bro.” He says, “Hey, what’s going on?” And I said, “I need you to do the James Red thing.” He goes, “Yeah, I’ve been thinking about that. I did that way too cheap.” I said, “I get it, man. Well, how much do you want? What do you want?” And I’m thinking, “If it’s more than 10 or 15, I’ll do it myself.” He sits there and he goes, “I want a thousand dollars.” I go, “A thousand dollars, oh my God.” So, I gave him a thousand dollars and he did another one.

(01:39:39)
But by that point it was like five or six. We’d done five or six with that guy. After five or six plus the credit cards, plus all the other things, their credit scores start dropping. If it was 700, now it’s down to like 600. And at 600, you couldn’t really borrow enough to make it worth it. So I go, “No, I have other people in the wings, waiting.” I’d go out and I’d run up the credit cards and pull all the money out of the banks and close the accounts and then stop paying.
Interviewer
(01:40:10)
And you said a lot of people knew.
Matthew Cox
(01:40:12)
Yeah.
Interviewer
(01:40:12)
So, he was one of the people and then-
Matthew Cox
(01:40:14)
He was one of the people.
Interviewer
(01:40:15)
Why do you think nobody said anything?
Matthew Cox
(01:40:18)
Well, I mean, I think everybody was making money. At that time, I had an appraiser. Eventually I ordered appraisal software and I just start doing the appraisals myself. Why give this guy 500 bucks?
Interviewer
(01:40:29)
So you were doing the appraisal yourself? How’s that possible? Is there a check against that, is there-
Matthew Cox
(01:40:37)
There is. It’s funny. Nobody ever questions that. You actually have to have a license to get the appraisal software. So, there’s an appraiser that we’re working with, I get her license and I create an email address as her.
Interviewer
(01:40:57)
Ah, so it was a synthetic appraiser.
Matthew Cox
(01:40:59)
Right, it was a real person. But I ended up ordering the appraisal software by emailing, it was called Alamo Appraisal Software. So, I end up emailing them as her, and they go, “Well, we can’t sell you the software. We need a copy of your license.” Boom, here’s your license. So, I send them the license and then we paid for it with a credit card. You could go get a Green Dot card, you go put 500 bucks on it, or a thousand. The software was like 1,500 bucks or something back then, it was a long time ago. So 1,500 bucks, they mail it to us, and now I’ve got the software. So, now I can do the appraisals myself.
Interviewer
(01:41:41)
What stops you from appraising it not for 200,000, but for even more?
Matthew Cox
(01:41:45)
There are no comparable sales. So, no matter what you send to the bank, they’re going to look at it. Their in-house appraiser is going to do a desktop review. He’s going to go online and check to make sure all of the comparable sales sold for what you said they sold for, are the same square footage, when they were built, what the pictures look like, how far away they are. He’s going to double-check everything, but he’s some guy who’s on salary and does whatever, 40 or 50 of these a day or something. It doesn’t take him long. So, it’s cheaper that way, where we pay for the appraisals, the whole thing.
Interviewer
(01:42:22)
Got it. So everybody’s getting paid.
Matthew Cox
(01:42:24)
Right.
Matthew Cox
(01:42:24)
So at this point, I’m doing that, right?
Interviewer
(01:42:27)
Yeah.
Matthew Cox
(01:42:28)
And I’m getting caught periodically.
Interviewer
(01:42:30)
Can you give an example? What do you mean getting caught?
Matthew Cox
(01:42:32)
I’m living in Tampa Heights, which is right next to Ybor City in Tampa. So, these are all little suburbs of Tampa, and they’re all built back in the 1890s, 1910s, 1920s. So, I bought this eight-unit building. I renovated it into a triplex. I mean, I’m driving an Audi. I’m dating a woman that I should not have been dating. I don’t know what she was thinking. So we are going on vacations, everything, life’s good. But every once in a while things happen, you get a phone call, “Hey, this is what just happened.” One time I got a phone call from the same broker, Kelly. Kelly calls me up and said, “Listen, we got a problem.” I want to say this was Alan Duncan. This was one of the first ones that I had done. We used him.

(01:43:28)
So, she calls me up and says, “Listen, Alan Duncan never made his first mortgage payment.” I had a friend of mine, or one of my co-defendants, when we closed on that loan, we both got checks for whatever, 40 or 50 grand. Keep in mind, we’re also buying, some of this money’s going into a business account. We’re buying property. So it’s not like I’m pocketing hundreds of thousands of dollars or even 20 or $30,000 on every closing. I’m more like, I’m getting 25, 10, 20, and this guy’s getting 10 and this guy’s getting 15, and then we’re taking 60 and we’re putting it into the business account. We’re buying a bunch of vacant lots, or we’re building some new houses. So we’re trying to take all this and turn it into a development company. But we still have to pay our bills. So, my buddy’s got to go to Amsterdam at least for two weeks. He’s from Belgium. Apparently you have to do that at least once a year. When I gave him the check, I said, “Here’s the 20 grand or 15 grand, but you got to make the payments on this thing for the next six months.” He goes, “No problem.” I said, “Okay.”

(01:44:46)
So, she calls me up a month and a half later and says, “Hey, Alan Duncan did not make his first payment.” And I went, “Oh my God.” He was actually renting the apartment downstairs from me. So, I run downstairs and I open the door and I go, “Bro,” I’m like, “did you make Duncan’s payment?” And he turns around and he’s like, “Is it due?” And I was like, “Oh my God.” So I run back, I grab the phone, I’m like, “He didn’t make it. He didn’t make it.” She’s like, “Okay, well here’s what’s happening. The account executive is calling. They’ve got the file.” It was SouthStar Bank. “SouthStar Bank has it. They reviewed it. They’ve already been ordering documents. They said there’s a problem there. It’s falling apart. The whole thing’s falling apart. They know something’s wrong.”
Interviewer
(01:45:32)
But they don’t know exactly what. It’s just something suspicious, or what?
Matthew Cox
(01:45:35)
She didn’t tell me that on the phone.
Interviewer
(01:45:36)
Okay.
Matthew Cox
(01:45:37)
She’s saying there’s something wrong. They’re freaking out. Because the account executive didn’t really know. She just got a phone call saying, “Hey, have you ever met this broker? Did she meet the guy? Who is the guy? He hasn’t paid. We’re calling the sale. Nobody’s answering.” And really, most of this was my buddy Rudy’s fault. He’s just not doing any of the things he’s supposed to be doing. So, we go to the office and I call SouthStar Bank. I get the secretary and I said, “Look, I need to talk to,” whatever the guy, the big guy was. One of them was the president and one was somebody else, vice president. So I said, “I need to talk to So-and-So, the vice president.” And she says, “I’m sorry, he’s in a business meeting.” I said, “Well, listen, tell him this is Alan Duncan. You need to go tell him it’s Alan Duncan on the phone right now. I’m sure he wants to talk to me.” And she’s like, “All right, hold on.”

(01:46:25)
I mean, 20 seconds later, speakerphone. “Hey, Mr. Duncan, this is so-and-so, and I’m here with our lawyer and the president of the bank and our head of fraud. We were just discussing you.” And I was like, “Okay, I understand that I haven’t made my first payment. It actually came back in the mail. I had the wrong address. That was completely my fault and I apologize.” I said, “But I can get you a cashier’s check today. I will overnight it, no problem. Hope that’s going to be okay.” They said, “Wait, we’re way past that, way past that.” I said, “Okay, well, what’s the issue?”

(01:47:06)
They were like, “Look, to be honest, I don’t think I’m talking to Alan Duncan. I don’t think there is an Alan Duncan. I mean, your social security number was issued a couple of years ago. We called the bank.” We had gone with SunTrust Bank, so it was a real bank, but it wasn’t our normal bank. And they called. “They don’t have any record of you.” And I was like, “Well, I’ve never been happy with SunTrust Bank. It sounds like a banking error.” And they’re like, “Yeah, this isn’t cute.”
Interviewer
(01:47:40)
He says, “I don’t think I’m talking to Alan Duncan right now.”
Matthew Cox
(01:47:43)
Right.
Interviewer
(01:47:43)
And you were-
Matthew Cox
(01:47:45)
Terrified.
Interviewer
(01:47:46)
But you have to be playing it cool, I guess.
Matthew Cox
(01:47:49)
What am I going to say? “No, you’re talking to Matt Cox”? I can’t say that. I’ve just got to keep running with it. Just like, “Okay, well look…” And he’s like, “We called the DMV, they don’t have a listing for you on their website. We don’t think you exist. We’re still waiting for a phone call back from so-and-so and so-and-so.” And I’m just like, “Oh my God.” I said, “Have you called the authorities yet?” And they were like, “No, we haven’t, but once we put our file together, we will.” Then the head of the fraud department, they said, “Oh, by the way, Mr…” I forget his name, but the head of the fraud department had worked for the FBI for 10 years or something, or 12 years.

(01:48:35)
By the way, the broker is there and my buddy Rudy is there. And I mean, he’s pacing the room, she’s in tears, crying. And I’m like, “Okay, well fellas,” I say, “Where’s this headed? Where’s this going? What are we doing?” So, they’re kind of chuckling and joking about it. I remember thinking, “What’s the deal? It’s weird.” And I said, “Look, let me just pay you back.” They said, “Ah, we’ll get the money. We’re not worried about it.” I said, “You don’t seem worried about getting any of the money back. Why don’t you just let me, I’ll cut you a check. I can get you the money back. What do I owe?” I owed them 150 or something. I forget exactly. It was nothing. I’m like, “I owe you 150,000. Let me cut you a check for 150,000.”

(01:49:22)
They were like, “No, no, we’ll get the money back when we foreclose on the property.” That’s when I was like, “Oh, they think the property’s worth like $195,000 or something.” I went, “Oh,” I said, “I understand. Okay, so do you have the appraisal in front of you?” They were like, “Yeah.” I said, “Open it up. Take a look at comp number one. That’s owned by a guy named Lee Black. Comp number two is owned by whatever, David Silver,” whatever the names were. I’m like, “Black, Silver, Red.” I said, “I am all those people.” And I said, “Let me tell you what I’ve done.” And I tell them, just laid it out, “Boom, boom, boom, boom.” I said, “So you can call the FBI, but you’re not going to get all your money back. Or you could let me give you your money back and we can let sleeping dogs lie. The whole thing goes away. I apologize. I had every intention of making all the payments. It’s a glitch. You caught me. My bad.”

(01:50:31)
So, these guys are all just like, “Oh, my God.” Now they put me on hold, they’re looking through the file, they come back. And I remember at some point we go back and forth, back and forth, and finally they come back and they said, “Listen, you still have the money?” I said, “Yeah.” Well, first they come back, they threaten me, “Oh, well, when we give this to the FBI, you’re…” I said, “That’s not true. The money was deposited into a bank account. It has since been moved. The bank account has been closed. It’s been removed in cash. That money is gone. You will never see that money. If I pay you back at all, it’ll be from another account.” So, the former FBI agent ends up saying, “He’s right. Even if we caught him red-handed, the likelihood that any of these funds will ever be recouped is zero.” Almost no money is ever recouped.

(01:51:20)
They put me on hold again and they come back and they go, “How quickly can you get us a cashier’s check?” That day I go get them a cashier’s check, overnight the cashier’s check. They never called the FBI. They never did anything. Now, at that point, we actually ditched that Alan Duncan. I remember at that point we went to the mall, ran up all the credit cards and just threw everything away and walked away, because it was shot. That guy was shot. I think we borrowed, whatever, $800,000 or $900,000 in his name.
Interviewer
(01:51:54)
So with the banks, it’s really, really all about the money.
Matthew Cox
(01:51:57)
Listen, when I go on the run, I got one where I was caught so red-handed, it’s insane how bad it was. Listen, that’s nothing. I got caught by Washington Mutual one time, where we had done six owner-occupied duplexes. So, if you say you’re going to live in a house, you can get about 95% financing. But if it’s an investment property, you’ve got to put down 20%, you get about 80% financing. So, a buddy of mine who was a sheriff’s deputy, we had his wife buy, I’m going to say, six owner-occupied duplexes, saying she lived in every single one of them.

(01:52:44)
Well, you can’t owner-occupy six dwellings. That’s fraud. Now granted, her W-2s and pay stubs were correct, but she didn’t put the down payments down. Even the down payments we didn’t put down; we actually got cash back. But months later, the lawyer from Washington Mutual ends up calling the mortgage broker and saying that they ended up with two of the owner-occupied duplexes, because Washington Mutual had a credit line extended to one of the lenders who’d lent the money. So, it actually was Washington Mutual’s money. So, a couple months later, when they went to package them together and sell them, they realized, we have the same customer with two duplexes, side by side, both owner-occupied. This is fraud. She comes in, she tells me, “Oh my gosh, this lawyer’s on the phone. This is what happened.” I’m like, “Oh wow, this is horrible.”

(01:53:41)
I end up getting on the phone with him. We have a conversation and he’s like, “Look, this is a big deal. We could call the FBI.” I’m like, “Look, who knows who was involved in this? Maybe somebody on your side was involved, maybe somebody on my side. I don’t know what my mortgage broker did. I’ll deal with her on my own. Why don’t you just let us refinance the properties?” Not only did we talk him into allowing us to refinance the properties, he gave us a reduced balance of what we owed him. Because we couldn’t borrow enough to pay him off. So, they took like a $20,000 hit just to refinance those properties. Never called the FBI, never did anything. Absolutely fraud.
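The occupancy-based financing split Cox describes reduces to simple loan-to-value arithmetic. A minimal sketch, with the understanding that the function name and the $200,000 price are hypothetical illustrations, not figures from the conversation:

```python
# Rough loan-to-value (LTV) limits as described in the conversation:
# ~95% financing if the borrower claims to live in the property,
# ~80% if it is declared an investment property.
# All names and figures below are illustrative assumptions.

def max_loan(price: float, owner_occupied: bool) -> float:
    """Maximum financing under the rough LTV limits described above."""
    ltv_percent = 95 if owner_occupied else 80
    return price * ltv_percent / 100

price = 200_000
print(max_loan(price, owner_occupied=True))   # owner-occupied: only 5% down
print(max_loan(price, owner_occupied=False))  # investment: 20% down
```

On a hypothetical $200,000 duplex, that is the difference between a $10,000 down payment and a $40,000 one, which is the incentive behind falsely claiming occupancy on all six properties.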

(01:54:27)
I had a broker one time; we got caught with over a million dollars in loans that he had done that were fraudulent. Pinnacle Bancorp, which was out of Chicago, the owner called me, and he was like, “Look, your mortgage broker did this.” There were a bunch of canceled checks. They were fake canceled checks. So, they looked like they had run through the bank for somebody’s rent, but they hadn’t. Does that make sense? You pay your rent, they deposit it, it goes to the bank and they’ve got all the numbers and everything. Well, I had a bunch that were blank; all you had to do was fill out your borrower’s information and then you cut and pasted his name and his address at the upper left-hand corner. You make a copy of it, it looks like canceled checks. We had 24 of them. Well, one of my brokers was using them for all of his files. Even if the person really had a rental history, he didn’t want to order it. He just did this, it was easier.
Interviewer
(01:55:19)
It’s faster, yeah. Wow.
Matthew Cox
(01:55:22)
So they catch a million dollars worth of loans. They called me up, and then they caught another million dollars, but they had already sold them to Household Bank. So, while I’m on the phone with the owner, his name’s Gary, and we’re talking, he’s like, “Look, this is what we found. This is this. This is what happened.” And I remember I said, “Gary, at the end of this conversation, if you think I’m cutting you a check for a million dollars,” I said, “I just don’t have it. I don’t have it.” This was when I owned the mortgage company. He says, “No, I’m asking you for your word that if any of these come back on us, they’re in Florida, they’re in your area. You’ll help us get rid of the properties. We’ll foreclose. We’re going to have to resell them. I don’t want to be flying down there. Just help us get rid of them.” I said, “Absolutely, of course, no problem.”

(01:56:10)
I said, “Well, what are you going to do with them?” He goes, “Well, they’re going to be a part of a package, like a $3 million package we’re selling to Household Bank.” The other ones they had caught had already been sold. The ethical thing to do is to contact Household Bank and say, “We will buy those back. We are going to take care of…” That’s not what happened. In fact, Gary flew down a couple weeks later, took me and several of the brokers, not that broker, but several of the brokers, out to dinner, had a few drinks, and he openly admitted it. He’s like, “Look, I don’t care if all the loans have fraud in them, as long as they don’t come back on me. That’s what I’m concerned about.” Because there was a clawback clause for one year. He’s like, “So, if they perform for one year, I don’t care.” That was it.
Interviewer
(01:56:55)
How many people in the industry do you think are operating like this? And by this, I mean in the aforementioned gray area.
Matthew Cox
(01:57:10)
I would say that after the 2008 financial crisis, it cleaned up considerably. But I would say at this point it’s just as bad as it ever was. Keep in mind, a lot of the loans that caused the problems, they call them liar loans: no-qualification loans, no-qual loans, no-income loans. Well, those loans exist again. There are subprime companies that are doing that again. I don’t think they call them subprime anymore. So, they’ve got some other name.
Interviewer
(01:57:53)
Yeah, rebranded.
Matthew Cox
(01:57:54)
Yeah, they’ve rebranded a little bit, but it’s happening all over again.
Interviewer
(01:57:58)
It just seems the whole real estate slash banking system is very prone to this kind of corruption.
Matthew Cox
(01:58:08)
But how can you fix it? A lot of the things they fixed, a lot of the manipulation they fixed. But if you tighten it too much, then the average person can’t get a loan. And the thing is, some of these loans, sometimes changing a W-2, should that person have gotten into that house? No, he shouldn’t have, he didn’t qualify. But he makes all of his payments. So it’s like, is it a fraudulent loan? Yeah, but it performs.

(01:58:39)
So, I forget what the FBI statistic was. It was like 20% or 30%. Prior to the financial crisis, they were saying that 20 or 30% of bank loans contained some kind of fraud, even if it was just a lie. If you want to cut 30% out of… That’s a ton. That’s a ton.

Getting caught

Interviewer
(01:59:04)
So, you’re on probation and you’re almost getting caught, you’re almost getting caught, and you’re doing these really large-scale scams. How does it get to the point where you’re on the run?
Matthew Cox
(01:59:19)
I’m doing multiple scams. So, it’s not just the Reservoir Dogs scams with those guys. I’m also creating other identities because I’ve got other people that are involved. They want to do a scam. So, this chick I was dating, she wanted to run a scam. So, I set up a scam. It’s semi-complicated, but the bottom line is we steal a real person’s identity. We get a driver’s license in her name, open up some bank accounts, go rent a piece of property in her name, and I transfer the deed for the property out of the real owner’s name into her stolen identity. We then refinance the house like three or four times. So, she starts going to these different closings. Her name is Allison, and she’s pretending to be a Puerto Rican woman named Rosie de Perez. Allison has brown hair and blue eyes. Rosie de Perez clearly doesn’t. So Allison, when we make the ID, she dyes her hair black, curls it a little bit, and gets the pictures taken of herself before she goes to the first closing to get a check for like a hundred thousand dollars. We’ve got three of these scheduled. Then she changes her hair color, she dyes it back like a dirty blonde, and she goes to the first closing and she gets a check for 100,000, let’s say. I don’t know what it was, like 95 or 105, whatever, roughly $100,000. She gets a check at the closing, they give it to her. We then go to the next closing. Well, at the next closing, the title person has her sign all the documents, but she’s looking at her like something’s not right. Looks at her ID, makes a copy of the ID, looks at it and says, “This doesn’t look like you. You don’t look Hispanic.” And she’s like, “I’m half Hispanic.” But keep in mind, the photograph was her. So she’s saying, “This doesn’t look like you,” but it’s her. Granted, she had the curly hair a little bit, but that’s it.
So Allison is like, “It’s me.” And she’s like, “Look, I’m not going to give you the check. Let’s just sign the documents. You can get the check. I’ll let you know.”

(02:02:10)
She goes, gets in my car. She says, “Yeah, listen, there’s a problem.” So we’re driving down the road, she explains it to me. I realize, “Okay, that’s done. It’s over. We’re not going back.” She’s like, “What about the other closing?” “No, no. No more closings, we’re done.” And it was probably more of a yell, screaming and yelling, like, “What the hell did you do? I told you not to change your hair. Why would you change your hair?” When she came in the day before, I was like, “What did you do? What did you do?” And she’s like, “I changed my hair. What’s the big deal? It’s still me.” Sure enough.

(02:02:42)
It’s not that I knew that that was going to happen, but why tempt fate?
Interviewer
(02:02:47)
How’d you meet Allison?
Matthew Cox
(02:02:49)
She was a mortgage broker.
Interviewer
(02:02:50)
Okay.
Matthew Cox
(02:02:54)
Sorry, she worked for another mortgage company. She couldn’t get a loan closed. The owner of that mortgage company called me and said, “Look, we got a loan, we need it closed.” And I said, “Great.” And when guys would call me, I’d say, “Great, I’ll come pick it up. I’ll give you a $300 or a $500 referral fee.” “No, no, it’s a couple hundred thousand dollars. We want to close it.” “Well, then close it.” “I can’t close it. We need a W-2 or we need this. We need that. We can’t figure out how to do it.” So, I go over there and typically I convince them, just give it to me or it’s not going to close. But you’d have to see this chick, she was gorgeous. She was gorgeous, very flirtatious. Made me feel like I was thin and handsome. So, she gets whatever she wants.

(02:03:41)
So, I’m like, “Okay, look, here’s what you do.” And I explained to her, “Do this, do this, this. Send it here. It’ll close.” And we closed it. Well, then she starts calling me, “Hey, how’s it going?” We go to lunch. Next thing you know, we start sleeping together. She realizes what’s happening. She says, “I want in on this.” So, now we do the closings. We’re on our way. I say, “Look, that check’s dead.” She goes, “What about the other one?” I go, “No, no, it’s all dead. We’re walking away.” Now, it was easy for me to say, because for me, I had money. She’s going through a divorce, she’s broke. None of this did I take into consideration at the time, by the way, to me it’s like, “Nah, that’s dead. We’re done. We’ll start over again.” To her, in her mind, that was a million-dollar scam. She was about to end up getting whatever it was, half or one-third of half a million dollars in the next week. Now she’s got nothing.

(02:04:42)
So, she says, “Look, let’s at least cash this one.” I had a buddy named Travis Hayes; we’ve been friends since high school. We were best friends, really close friends in high school, and we were still close. Travis was running a scam. Hers was in Clearwater, his was in Orlando. So, I’m all over the state at this point. He’s running an Orlando scam that’s already yielded half a million, maybe more. We’re still refinancing properties, right? So he’s about to close on another half a million dollars’ worth of properties.

(02:05:24)
He’s got a bank account that’s open. She says, “Let’s give it to Travis, have him deposit it in his account.” He’s already pulled like 300,000 out of the account. And she’s like, “Shouldn’t be a problem.” I was like, “No, no, no.” And she goes, “Let me call him.” I think I called him and I explained the situation. He said, “Do you think it’s okay?” And I said, “No, I don’t think it’s okay. I don’t think it’s okay at all.” And he’s like, “Nah, it’s not a big deal. Just give me the check.” So, I give him the check. He goes, he deposits the check. They say they’re going to hold it until it clears. That was kind of a thing back then. It takes, I don’t know how long it took, five days, six days, whatever it was. He was supposed to go back once it cleared, and he would’ve been able to start pulling money out. So, I call him one day, because Allison’s bugging me. I go, “Hey, where are you at?” He goes, “I’m actually on my way to Orlando.” And I said, “Oh, okay, so you let Allison know. I’m not getting any money.”

(02:06:21)
He said, “The bank manager called and said that because the check was over a hundred thousand dollars, they have to witness me endorsing the back of the check,” or they had to see something. He had to come in. I’m like, “Whoa.” I said, “Something’s wrong. Something’s wrong. Don’t go to the bank.” “What do you think is wrong?” I go, “I think the cops are waiting for you. That’s what I think is wrong.” And he goes, “No, the cops aren’t.” He goes, “Man, I’m in the parking lot right now. I just pulled into the parking lot. There’s no cops.” I’m like, “They’re not going to be in squad cars.” And he’s like, “No.” He said, “It’s fine. You’re overreacting, bro.” And I’ll never forget what he said. He said, “You’re shaking like a little girl, bro. Calm down. I got this. I’m cool with the manager.” The manager. Because you’ve chopped it up with the manager, he’s going to let your fraudulent check go through?

(02:07:06)
So, he walks in, the cops are in there, they locked the door. He told me later, they closed the door, locked it. The cops are in there. They grab him, and they bring him downtown. He didn’t say anything. He wouldn’t say anything. That’s not true, by the way. That’s just what he told me, that he wouldn’t say anything: “I told them, ‘I’m not talking to you, coppers.’”
Interviewer
(02:07:29)
Oh, that’s what he told you, but he actually did talk to them.
Matthew Cox
(02:07:31)
He actually did talk to them. What ends up happening is we can’t get in touch with him. So, we’re calling and calling and calling. Then, finally I decide, “You know what? I’m not going to call his cell phone anymore. I’m going to call the synthetic identity’s number.” So I go and I call the synthetic identity’s number. Somebody answers, and I go, “Hey, is so-and-so there?” And it’s a gruff, authoritative voice. This is law enforcement. He goes, “No, this is officer so-and-so. Who’s this?” I was like, “Oh, this is Lee Black.” He goes, “How do you know so-and-so?” I was like, “Oh…” Click, and I just hung up. And I had called from a pay phone.

(02:08:20)
So, I turned around and I said, “He got arrested.” Then later on that night, he showed up on the county website, the arrest website, showing he had been arrested. The next day he calls me and he asked me to get him out of jail. Like, “Hey, you got to go.” So, I have to give his brother-in-law money. We get him out of jail. He actually got out-
Interviewer
(02:08:44)
[inaudible 02:08:45]?
Matthew Cox
(02:08:45)
Yeah, he got out for nothing. And here’s where I should have known that he was cooperating: it went from like a $300,000 bond down to like $10,000. So it’s a thousand bucks for the bondsman. I didn’t know it at the time, but obviously that means they’re going to let him out of jail. He’s cooperating. So, they let him out of jail. I go and I get him a lawyer. This was state, by the way, it wasn’t federal. So I get him a lawyer for like $15,000.

(02:09:14)
He comes back and, of course, he tells me, “Look, they asked me a bunch of questions. I told them that…” He made up some story about how he’s working with another guy, but he doesn’t know the guy’s name. He made up a name. He has this whole thing where he tells them about me, but not me. None of the numbers led anywhere, because they all lead to cell phones that are only being used for those scams. So it’s a dead alley, a blind alley. I’m like, “Okay, okay.” And I’m paying him. He’s coming in, “Man, my truck’s no good. I need another truck.” I buy him another truck. “Hey man, the electric is going to get turned off. I need a thousand dollars.” “Of course, here’s a thousand dollars. I’m embarrassed you had to ask.” A week later, he needs 2,000 for this, a thousand for this, 2,000 for this. He wants to start a tree-trimming company. He needs to buy a tree trimmer. “How much are those? 5,000? Of course, here’s $5,000.” So I give him another 25,000, and he starts a tree-trimming business, which he runs to this day.

(02:10:17)
What I don’t know, is that the whole time he’s actually working with a task force that’s been put together.
Interviewer
(02:10:25)
Federal, or…
Matthew Cox
(02:10:26)
This is state at this point. It’s a state task force because there are multiple counties involved at this point. It wasn’t hard for him to explain. This comes back to Reservoir Dogs. All he had to say to the officers was, “Listen, you got to let me go. I can’t do any prison time. I’m going to tell you about a much, much bigger scam.” And they go, “Okay, well how can you prove that scam?” “Pull up Hillsborough County’s tax appraiser website. Okay, look up the name James Red. Look, all of these were bought six months ago. Six months later, they’re all in foreclosure. Pull up Lee Black. All of these were bought. Look, six months later, all of them are in foreclosure. Pull up Brandon Green, pull up…” So, all of these are going into foreclosure. What I thought was so cute wasn’t cute. It was just stupid.

(02:11:18)
So very quickly they put together a task force. He’s working with them on the task force, and we’re still buying houses, flipping houses, doing everything, because I believe him. He’s saying, “Look, if I have to go to jail for a year or so…” He hasn’t paid them back yet, but we’re saying he can pay them back. He’s like, “Look, when we get to that point, we’ll pay them back.” But we haven’t paid them back yet, because we have no way to show where that money came from. We could always go to one of his relatives, give his dad 40 grand, give his mom 20 grand, that kind of stuff, and start moving money that way. And all that money was taken out in cash, too. So we could always show up with a chunk in cash.
Matthew Cox
(02:12:02)
Regardless, it’s still in the process, and it could be six months or a year away because it’s a slow thing. I’ve already been through the process my first time when I got in trouble, and it was a year from the time that I was spoken to until I pled guilty and was sentenced. So I’m not concerned about it.

Going on the run from FBI


(02:12:24)
Well, that’s happening. We’re still flipping properties. And, one day… I have a buddy named Steve Sutton. Remember the sheriff’s deputy? Keep in mind, it’s funny because I’ve done bad loans for police officers, sheriffs, lawyers, doctors, across… everybody. These aren’t all-
Interviewer
(02:12:47)
Yeah, everybody.
Matthew Cox
(02:12:48)
… guys that… These aren’t all construction workers or mechanics or something. These are legitimate people that have credit problems or whatever the case may be. One day, I’m sitting at work and I’d been getting phone calls for the prior week from people at title companies saying, “Hey, Matt. Wanted to let you know we just had some subpoenas served on several of your files.” That had me concerned. Then a guy named Jeff Testerman starts making phone calls. Jeff Testerman is a reporter for the St. Petersburg Times. He’s calling people saying, “Hey, I noticed that you sold a piece of property to Lee Black. Have you ever met Mr. Black?” And they’re just hanging up on him or saying, “No, I don’t know what you’re talking about. I’m not sure what that guy’s name was. Let me call you back.” And I’m getting phone calls from people. So, I know something’s up with the newspaper. Now I know something’s being looked at, but nobody’s really talking.

(02:13:53)
I know that there are subpoenas being served and I’m nervous. I’m very concerned. One day, I’m in my office and the sheriff’s deputy walks in, Steve Sutton, in his uniform too, and everybody always stiffened when he would walk in. He walks in. I go, “Steve, what’s going on?” Usually he’s jolly and laughing and stuff. He says, “I got to talk to you outside.” I was like, “Okay.” I walk outside. “What’s up?” He says, “I used to date this girl in the Tampa Police Department,” or something. I was like, “Okay.” He said, “She showed up at my house this morning at six o’clock in the morning.” I went, “Okay.” He said, “She said that she’s been working on a task force.”

(02:14:39)
And, he said, “Apparently, one of your buddies got arrested in Orlando. They’re investigating some other thing in Clearwater. They’re investigating a ton of properties here in Ybor, Tampa Heights. And, there’s like a hundred properties involved. And, my name came up because you’ve sold some properties to me,” which I had. He’s like, “So, she came to me and said, ‘Look, your buddy, Cox…'” I was like, “Okay.” He goes… He said, “Well, the task force is on you. And, she said to stop talking to you because they’re going to come arrest you in a couple of days. They just handed over the task force findings to the FBI and the FBI is going to come arrest you in a couple of days. She said not to talk to you because you’re going to cooperate because all white collar guys cooperate. So, she thinks you’re going to cooperate and not to talk to you because she’s afraid you’re going to get me hemmed up. And, she said just to walk away.”

(02:15:42)
He was like, “So, I thought you should know.” I was like, “Okay.” He said, “What are you going to do?” I said, “Oh… you know…” Well, first, he said, “What should I do?” I go, “Tell them. Tell them that I arranged all the loans for you. You came in. You signed the paperwork. I filled out all the documents. I arranged everything.” I’m like, “You’re not a mortgage broker. You don’t know if this is legit. You have perfect credit. You signed the paperwork. You walked away with a check for 30,000. You don’t know.” He was like… Because, he did it because he had a job. He was a sheriff’s deputy.

(02:16:18)
I went in. I applied for a loan at a bank. They said, “You can buy the house and we’ll give you $30,000.” So, of course I’m going to do that. That’s not going to happen. But, he doesn’t know. I said, “Just tell them yeah. Tell them you’ll cooperate, absolutely.” He goes, “What are you going to do?” I said, “Me?” I said, “I’m leaving, bro. I’m leaving.” I said, “I can’t stay here. I can’t go to prison. I was just sentenced. I’m on federal probation right now. The judge isn’t going to be cool with me getting popped again. I can’t do it. Can’t do it.” I said, “I’m leaving. Can’t go to prison. I’m adorable, bro. I saw Shawshank Redemption. I know what’s going to happen. I can’t.”
Lex Fridman
(02:16:59)
You’re too good looking.
Matthew Cox
(02:16:59)
Yeah, I can’t do that. That’s not going to happen. I am not going to defend myself against a guy who’s six foot three and tatted up. No. I’m no benefit to a gang. I’m a nonviolent, soft, white collar criminal. I was like, “Yeah, I’m leaving, bro. I’m leaving.” Well, I actually went home… Well, actually, I was able to… I started cutting checks to people. I cut checks to Allison, to Johnny, to everybody I could think of. Here’s 5,000. Here’s 7,000. Here’s 8,000. Here’s six. Here’s nine. And, had them going into all these different bank accounts, pulling out cash. But, this is like a Thursday at four o’clock. The next day they show up with cash, write some more checks. They go again. I get about 80 grand in cash. That’s all I can get.

(02:17:50)
I go home that night. I start packing my bags. And, I was dating this chick named Rebecca Houck. We’d been dating about a month. And, she shows up at my house. I hadn’t returned her phone calls all day and apparently we’re supposed to go out and I’d forgotten about it. I had bigger issues. So, I’m packing a couple of duffle bags and she walks in and she’s like, “What’s going on?” I’m like, “I’m leaving.” “Where are you going? I thought we were supposed to go out and do something tonight.” I’m like, “I’m leaving. It’s over.” She says, “What happened?” I tell her what happened, this is what happened. She’s like, “Oh my God.” She had no idea.
Lex Fridman
(02:18:29)
She had no idea about anything you were doing?
Matthew Cox
(02:18:31)
No, I barely knew her. I mean, she’s coming over two, three times a week for a month. This isn’t love. This is a booty call. That’s all it is. We’re hanging out. We’re having sex and that’s it. I don’t even know you. She suddenly just begs to come with me. “You got to bring me with you. You have to this, you have to that.” I’m like, “What are you talking about? You’ve got a son. Your mom lives here.” She’s just in tears and crying. What she suddenly said, and this is what’s so funny about it, is that she had just moved from Vegas to St. Petersburg to work at the dog track, to work for a company that owned the dog track. A casino interest or a gambling company. She said, “You don’t even know why I’m here.” I was like, “Okay, why are you here?” She said, “I’m here because I was working for a law firm that worked for the casino company that I worked for.” She said, “I got caught embezzling…” Nothing, it was like 10 or $15,000, from her boss. She had a gambling habit. And, she said, “He didn’t call the police because we were sleeping together and he was afraid his wife would find out.” She said, “So, instead, he banished me here to St. Pete. My son just came to live with me. He’s been caught sneaking out.” Because, the father had raised him. He’d only been living with her since she got to Florida.

(02:20:10)
She’s like, “I was going to send him back. He’s failing school. He’s smoking pot. He’s been caught sneaking out after curfew.” I’m like, “Oh, okay. I don’t know any of this.” She’s like, “He was going back in December?” No, he was going back after the school year, which would’ve been like May. Okay. I’m like… Where before, five minutes earlier, I thought she was this sweet secretary, sweet innocent secretary, she’s like, “I’ve been married three times. I am a gambler. I’ve claimed bankruptcy. I’m sleeping with my boss.” She went to this thieving, adulterous gambler, and I thought these are all really beneficial to my future plans. And, I shouldn’t have… At that moment, I was so just flipped out and concerned. And, up and leaving your life and everything you know behind, that’s terrifying. Now, you’re alone in a strange place, in a place-
Lex Fridman
(02:21:17)
Is that the first time you’ve done something like that, leave to go on the road?
Matthew Cox
(02:21:21)
Yes. I’d never just up and moved. And, keep in mind, now I can’t call home. I’m leaving… There are things that I feel like get you caught. I’ve watched tons of these TV shows and there are certain things that get you caught. One of them is keeping in contact with anybody in your old life. I’m thinking that’s not going to happen. I’m not contacting anybody. I’m leaving and that’s it. That didn’t really happen. I kept in touch. I called my mom every once in a while. But, I was like, “Okay, that’s cool.”
Lex Fridman
(02:21:52)
Did the loneliness of that hit you early on or no? Like as you were packing…
Matthew Cox
(02:21:56)
It never did. Well…
Lex Fridman
(02:22:00)
You’re leaving your life… I mean, it feels like a fundamental transition.
Matthew Cox
(02:22:04)
Oh, listen. You think? Listen, not just that I’m leaving my son. I have a son. I was leaving everything. I was just terrified of going to prison. It was just so stupid. It was just arrogance. I should have stayed. I made things so much worse. But, I also thought I’m smart, I can figure this out. I can change my identity. Blend in. I’ll be fine.
Lex Fridman
(02:22:37)
Aren’t you already… People know what your face looks like.
Matthew Cox
(02:22:40)
They do. They do. But, one of the first things I did was I got plastic surgery.
Lex Fridman
(02:22:44)
What kind of plastic surgery?
Matthew Cox
(02:22:46)
I got a nose job. I got what they call a mini facelift. They go in through the back of your ears and they suck out all the fat in your neck.
Lex Fridman
(02:22:54)
Does that change appearance much?
Matthew Cox
(02:22:56)
A little bit. I was balding. I got two hair transplants, two hair grafts. The hair on my head, this isn’t my hair. It’s my hair, but it’s from back here. They cut it here-
Lex Fridman
(02:23:07)
It looks great.
Matthew Cox
(02:23:09)
Appreciate it. They reimplanted it there. Got liposuction, just some other stuff. And, got my teeth done, that sort of thing. That was my plan. I’ll go, I’ll take off. I got 80 grand. I’ll steal some more money. But, I let her come with me. We ran up all my credit cards over the next few days, packed up the car, traded in my Audi and got an Audi… I don’t know, it was like an A6 or a four-door, like the big four-door, whatever it was. Got that and drove straight to Atlanta. I wrote a letter to my parents before I left, just explaining this is what’s happening. I’m leaving. I’m done. I’m not going to prison. Love you. Sorry.

Identity theft

Lex Fridman
(02:24:03)
Sorry.
Matthew Cox
(02:24:03)
Sorry. Sorry. I know I’m a disappointment. Sorry. Bam. I take off, go to Atlanta. When we went to Atlanta, I already had the name of a guy named Scott Kugno that I’d done a loan for. I had his vital information. I have his name, date of birth, social security number, mother’s maiden name, and where he was born. One day we were having a conversation and I just slowly pried all that out of him. We’d done a loan for him. So, I already had his name, date of birth, social security number. But, to steal his identity, I need to know where he was born and his mother’s maiden name. Through the course of the conversation, I just pried, “Hey Kugno, is that… What is that? Is that like Irish? Is it…” No, it’s such and such. What’s your mom’s name? Oh, such and such. Oh, okay.

(02:25:05)
Were you born here? Were you born in… Weren’t you from? Ah, man. I was born here. I was born in such… Oh, Hillsborough County. It was no big deal. We get to Atlanta. I make a fake ID for both of us. But, keep in mind, I don’t have a driver’s license. I do. But, they’re fake. I can’t give this to a cop. Can’t give a driver’s license that says David Freeman.
Lex Fridman
(02:25:28)
What’s David’s residence? Florida or is it Georgia?
Matthew Cox
(02:25:31)
No, this is Florida. But, it was just a made up name. I’d gone to high school with a kid named David Freeman. So, I had an ID, but I can’t give that to a cop. That’s enough to rent a place or do something. So, we go to Atlanta, make an ID, set it up, make some business cards, set up a couple of websites, and get an HQ, which is a company where you can rent virtual offices and they’ll answer your phone for a hundred bucks a month and forward your calls. So, it seems like you have an office.

(02:26:08)
They give you a phone number that you call up and they say, “Hi, United Southern Bank.” They’ll answer the phone and forward messages. We get one of those, make a business card for Becky. She rents a house from a guy named Michael Shanahan. We rent Michael Shanahan’s house. It’s like $200,000. $200,000 house in Alpharetta. I then go to… Wait, I then order Scott Kugno’s birth certificate and social security card. I think I registered to vote in his name and I made a lease agreement in his name. And, I think that’s all I needed. Then I went to Alabama and got a driver’s license in his name. I went into the DMV, give them all these documents, which almost all of them are real except for the lease.

(02:27:05)
They said, “Sit over there.” I sit over there. I sit down. Boom. 20 minutes later I have a driver’s license. It was 20-something dollars. It was nothing. I get the driver’s license. Now, I’m driving this. I’m still driving a car, an Audi that is in the name of Matt Cox. I park that. I then go get social security to issue me a social security card in the name Scott Kugno. I then turn around and I go and I get a loan. You put down 20, 30%. There’s all these first-time buyers. 30% down. Get like a Honda or something. Now, we’re living in a house, we’ve got some furniture, bedroom furniture. I go downtown. I pull the title to this guy Michael Shanahan’s house, and I go downtown and I satisfy the loan on his house. He had two loans with Bank of America.

(02:27:59)
I create two satisfactions of loans from Bank of America. Michael Shanahan owns a house in the name Michael Shanahan. He has one mortgage with Bank of America and a second one. When you pay your mortgage off, the way public records knows it’s paid off is the lender mails public records a satisfaction of mortgage. It’s a one-page document, and it’s notarized.
Lex Fridman
(02:28:27)
You’ve got two of those.
Matthew Cox
(02:28:28)
I filled out two. I created two of them. I just ordered… You can do research. When I went downtown, I researched Bank of America satisfactions of mortgage. And, thousands show up. I just grab a couple of them and now I know what the basic template is. They’re all different by the way. It’s not like you even have to be that close. But, whatever. I mimicked some of them. I had a notary stamp. Not hard to get. You go into three different Office Depots and you say, “Hey, I need a notary stamp.” You give them the information and you come back four days… or whatever, a week later and they give it to you.

(02:29:09)
So, I’ve got these notary stamps. I notarize the satisfactions. I go downtown, I file them. Boom, the mortgages are gone. Keep in mind, Bank of America, he’s still paying the mortgages. They don’t know that they’ve been satisfied in public records. They’re not notified. Those are gone. But, it takes about a month or two for it to show up. Atlanta was that far behind. I think it was Fulton County. They were just way behind. So, we just have to dick around for a while. We’re going on little vacations. We’re going to New Orleans. We’re going to different places as Scott Kugno, driving a car as Scott Kugno.

(02:29:46)
We opened up several bank accounts. We opened multiple bank accounts. And then, we ended up going to Vegas. We do go to Vegas. But, what happened was we were driving around and I remember thinking, telling her, I was like, “This is a problem. We have to get real IDs, real driver’s licenses. I mean, this is real. But, this is a real person too. He may stumble across it.” What I did was I started running ads in magazines saying, “Home loans available. Good credit, bad credit, no problem. Call now.” Government loans, government… VA, FHA, whatever. Call this number. People start calling and I’m getting their information. One of the guys I got was Michael Eckert. Yeah, I remember. Michael Eckert. Poor Michael Eckert. I actually legally changed his name to Michael Johnson at one point. But, at this point, it was just Michael Eckert. I wanted to see… I’m bored, I want to see what the process is. How much does it cost? Is this possible? Let me see if I can change this guy’s name. It was 1,500 bucks. I changed it.
Lex Fridman
(02:30:56)
Without him ever showing up anywhere. So, you can fake-
Matthew Cox
(02:30:59)
Well, I have a driver’s license in his name.
Lex Fridman
(02:31:00)
Right.
Matthew Cox
(02:31:01)
I am him. So, he did show up. He showed up at the lawyer’s office.
Lex Fridman
(02:31:06)
Right.
Matthew Cox
(02:31:07)
So, I do that. I’m living in the house and we’re driving along one day and I’m saying, “We got to get real. These people that are calling…” One guy, I get his information. But, during the course of taking the application, I’m asking these government survey questions at the very end. There’s like 20 questions and I’m rambling them off. At some point, he was like… He volunteered it, I never even asked anybody about criminal history. He ended up saying something like, “Well, I do have a felony. Does that matter? It was a DUI. I’ve had a couple DUIs. But, I got my license back.” That was part of the reason he had bad credit. It was like, “Okay, no. Nope, it doesn’t matter. Don’t worry.”

(02:31:53)
I’m thinking you’re not getting a loan. So, I’m just taking… I’m just stealing from you, stealing your information. I get all this information. I’m gathering it. One of the things I said to Becky while we were sitting at this stoplight is I’m like, “We got to get people’s real information.” For instance, I said, “What if I steal somebody’s identity? I get a driver’s license in his name four states from where he lives, and he gets a DUI? I could get pulled over two years later and get arrested for a DUI that he got in Florida.” She’s like, “Well, what are you thinking? Are you thinking criminals, or are you thinking prisoners, mental patients?” I looked over and there was a homeless guy holding a sign. I went, “Like that guy.” I’ll never forget, she goes, she says, “The hobo?” I don’t know who calls them hobos. She’s like, “The hobo?” I said, “Yes. That guy.”

(02:32:48)
I said, “Hold on.” Pulled over to a Subway. Got out. She went inside to get Subway. I walk across the street, pulled out like 20 bucks. I said, “Hey bro, can I ask you some quick questions real quick?” He’s like, “Yeah, what’s up?” I go, “Here’s 20 bucks.” I said, “Listen. When was the last time you were gainfully employed?” He’s like, “Ah,” whatever, “10 years.” I’m like, “Oh, okay. Do you have a criminal record?” He’s like, “Ah, I’ve been arrested for misdemeanors, like vagrancy.” He names off some things, drunk in public, whatever. I was like, “Are you on probation?” He goes, “I can’t do probation. They don’t give us probation. They keep us for 90 days. They release us. The judge knows I can’t do… I’m not going to show up for a probation.” I’m like, “Okay, do you have a driver’s license?” He’s like, “Maybe, I don’t think so.” I go, “Did you get a DUI?” He’s like, “No, I think it’s just expired.” I go, “Do you have a driver’s license with you?” He’s like, “No, I got nothing.” I’m like, “Okay. Well…” He told me he lived in a tent in the woods. So, I gave him another 20 bucks, asked him a few more questions. I remember in the middle of it, he goes, “What, are you taking a survey or something?” I remember thinking… not thinking, I chuckled. I go, “You get a lot of surveyors out here like that?” He goes, “Yes. Sometimes.” I was like, “Really?” He goes, “Yeah.” He said, “People from halfway houses and…” What did he say? Social workers and stuff. They’ll come out and they’ll pass out stuff and they’ll ask us questions and stuff. I’m like, “Oh, okay.”

(02:34:25)
I thought, “That’s good to know.” I go back and grab Becky, and she’s like, “Oh, did you give him money?” I said, “I gave him like 40 or 60 bucks or something, forget what.” She was like, “What a waste of money.” I thought that was good. That was money well spent. I said, “That guy’s perfect. He’s got everything. He has no way to be contacted. He has no documentation on him.” I said, “He’s not going to drive a car. He’s not going to get a DUI. He has an expired license. I just have to get his license reinstated and I can be him.” I went home. I typed up what I called a federal statistical survey form, and I made a little thing. I mean, I went online. I mean, I’m always filling out federal documents as a mortgage broker. It looked identical. I mean, I had this little… the recycle symbol, and it was like Federal Form 17017. I print out these forms. I go buy a clipboard. I make a little Salvation Army ID. I pin it on me. I go out and I start-
Lex Fridman
(02:35:31)
Doing surveys.
Matthew Cox
(02:35:32)
I start surveying homeless people. Don’t judge me, bro. I was in a bad spot. I was in a bad spot. I see the judgment. I see the judgment. Let’s maintain civility here. Stay neutral. Stay neutral.
Lex Fridman
(02:35:46)
These homeless guys, I mean, they have a social security number. They have a birth certificate, I guess. I mean, they’re a real person. They’re a real person.
Matthew Cox
(02:35:56)
Right. They’re just not using their real person.
Lex Fridman
(02:35:58)
Yeah. They’re not actively engaging with the economic system, the financial system. They’re not employed. They don’t have housing, all that.
Matthew Cox
(02:36:07)
Yeah, they don’t file taxes. One of the questions I even asked the guy, one of the last questions, I said, “Do you believe that you will be gainfully employed within the next two years?” Every one of them said no, no, no. It was like, okay, they’re not even trying. They all had alcohol problems. Honestly, with the few of them I talked to, it was pretty clear. I mean, it takes literally five minutes, less than five minutes to fill out the form. I filled it out for them, of course. But, even filling it out, in that brief back and forth of questions, half of them, you could tell you’ve got some mental illness. Something’s not right with you. These aren’t guys that are going to go out and get jobs. They’re not cleaning up. They were perfect for my purposes, as horrible as I know that sounds.
Lex Fridman
(02:36:56)
Do you feel bad about this little small tangent?
Matthew Cox
(02:36:59)
No. Do I feel bad about it?
Lex Fridman
(02:37:02)
The homeless people in society are really… It’s a difficult life. Dealing with mental illness, dealing with drug addiction, all that stuff.
Matthew Cox
(02:37:11)
I mean, listen, being in prison and then the people that are in prison that are going to be homeless or have been homeless, or the mental illness that I’ve dealt with in halfway houses and even doing this, I don’t know what you do with these people. I don’t even know that you house them. You can’t necessarily even house them together. They cause such problems. I don’t know what the solution is other than just keeping them fed maybe and keeping them away from normal people so they don’t cause crime or whatever. I don’t know about housing them in one area. That seems like a mistake. There is absolutely no good solution to that problem. None. Because, it’s not like, “Hey, if we gave you a house and we gave you job training and we gave you this,” okay, you might get 5%, 10%. But, most of them are on the street because they’ve just messed up over and over and over again. They just gave up.
Lex Fridman
(02:38:11)
But, I guess we still have to remember that they’re human beings. We mentioned off-mic Soft White Underbelly, and he highlights the humanity of people who’ve had a real difficult life. He does it well.
Matthew Cox
(02:38:24)
Mark Laita, he is amazing. He’s amazing. One of the things he had said was, he was like, “These are real people.” He’s like, “They have stories and they need…” But, if you also talk to Mark, he’ll tell you, “You can’t give them money. You can’t…” He’s tried. Every time he’s reached out and tried to help these guys, put them in apartments, fed them, got them back on their feet, within six months, they’re back on the street. It just happens over and over and over again. I mean, I think the amount of money that would have to be dumped into correcting that problem, I don’t know. I mean, you can say, “Well, yeah, but you should just do it because it’s the right thing to do.” I don’t know who’s paying for it.
Lex Fridman
(02:39:13)
It’s complicated. But, for your purpose, they have a social security number.
Matthew Cox
(02:39:18)
They got 20 bucks. They seem very happy.
Lex Fridman
(02:39:20)
There you are with a clipboard, taking a survey.
Matthew Cox
(02:39:22)
Right. Took a survey, went home, ordered their… And, of course, they give me everything. Name, date of birth, social security number, mother’s maiden name, where they were born. Have they ever been in the armed services? Have they ever had a passport issued? What states have they had identification in? Have they ever been arrested? They ever been on probation? Have they ever claimed Social Security Disability? SSI. I mean, I had like 17 questions and it absolutely answered everything.

(02:39:50)
What high school did you go to? Because, high school transcripts are great for documentation. A lot of times they’ll ask you for high school: “Can you get us a copy of your high school transcripts?” That’s good to know. And, I’m a big believer in overkill. I mean, I ordered a ton of stuff. If I needed three things to get a driver’s license in your name, I’d come in with six. Because, what you do is you get in front of the guy at the DMV and you fumble through like, “Oh, I got this. What else do you need?” I know exactly what you need. But, they’ll be like, “Oh, is that a high school transcript? Yeah, I’ll take that. Oh, voter’s registration card. Give me that. Yeah, you’re perfect. You’re good. Sit down. Right over there.” That’s it.
Lex Fridman
(02:40:29)
Who, by the way, is lurking in the shadows trying to catch you? You’ve mentioned FBI, Secret Service, you mentioned… I think I’ve heard you mention US Marshals, which is interesting. Cops, in general, the police, CIA, I guess CIA is international only.
Matthew Cox
(02:40:44)
Yeah.
Lex Fridman
(02:40:44)
FBI is internal.
Matthew Cox
(02:40:45)
Yes.
Lex Fridman
(02:40:46)
Okay. When you’re doing this, who are you afraid of?
Matthew Cox
(02:40:52)
By the time I’ve gotten to Atlanta, within four or five days, the FBI raided my office. I guess I missed that.
Lex Fridman
(02:41:00)
Back in Florida.
Matthew Cox
(02:41:00)
Back in Florida. When I left and drove to Atlanta, remember, the FBI was going to show up a few days later. They were going to arrest me.
Lex Fridman
(02:41:08)
And, they did.
Matthew Cox
(02:41:09)
They did. They showed up… I left on a Sunday night or something. Because, for some reason in my stupid thinking, I thought, “Well, they won’t arrest me on the weekend.” Like they don’t work on the weekends. They came on a… whatever it was, like a Tuesday or Wednesday or Thursday. Within a few days, they’d come in the office, they raid it, they’re looking for me. But, I’m gone. Nobody knows where I am. Now, I’m surveying the homeless guys and I turn around and I’m ordering their documents. And, as their documents are showing up, I’m going to different states and getting IDs. I’m going to Florida. Over the course of this whole thing, I’ve had 27 driver’s licenses in seven different states. I’ve had two dozen passports. Because, if you’re going to get the driver’s license in the guy’s name, you might as well get… or an ID even, you might as well get a passport. Because, a passport’s not difficult to get. They don’t fingerprint you. All they’re doing is saying, “This is your ID and were you born here?” Then they run a check. It comes back or it doesn’t. Back then, you could do it expedited and I’d have it in two weeks. Now, it takes like 60 to 90 days to get one.
Lex Fridman
(02:42:24)
If you have multiple IDs for a single identity, that’s more proof.
Matthew Cox
(02:42:27)
Right.
Lex Fridman
(02:42:29)
Wait, what number did you say? How many IDs? How many identities?
Matthew Cox
(02:42:34)
I had… Well, I’ve had over 50 identities. But, I’ve had 27 driver’s licenses issued from state DMVs, Department of Motor Vehicles.
Lex Fridman
(02:42:45)
Legitimately?
Matthew Cox
(02:42:45)
Legitimately. I walked into the DMV, said, “Hi, my name’s Michael Eckert.” And, I just moved here about three weeks ago, four weeks ago. Here’s my lease. I lost my driver’s license, bro. I don’t know what I did with it. I don’t know. I don’t know what happened. I don’t know. They’re like, “It’s all right. What do you have? I need a proof of residency.” Well, I have my lease. Oh, okay. I need a primary. Okay, here’s my birth certificate. Okay. And, I need a secondary. Here’s my social security card. But, I also registered to vote.

(02:43:20)
My girlfriend made me vote immediately, and she said I would need that. Oh yeah, it’s perfect. You’re good. I don’t even need that. Okay, great. Stand over there. Pay that person. They call your number, 275. Forty-five minutes later, you go, you pay your 25 bucks. You stand in front of the screen. They take a picture. You got a driver’s license. You walk out, it’s still warm. It’s beautiful. It smells like popped plastic. It’s amazing. So, I am opening up different bank accounts in these guys’ names and just about-
Lex Fridman
(02:43:53)
Yeah, sorry. Well, what are you mostly doing with the identities? You opening up different bank accounts?
Matthew Cox
(02:43:57)
Right now?
Lex Fridman
(02:43:58)
Are you doing credit… starting to establish credit or no?
Matthew Cox
(02:44:01)
Some of them. I might order secured credit cards. So, I’m building their credit. It’s not helping me in any way. I’m just sending out $500 to get a Capital One card or a American… I’m sorry, a Bank of America secured credit card, whatever. So, I’m building their credit. But, not all of them. Only a few. Because, although I’m collecting them, I’m also going to be moving soon. I’m only here to get a few hundred thousand dollars and move. I need some kind of a base. So, I don’t want to start getting credit cards and building up a history in Atlanta in anybody’s name. But, I am getting driver’s licenses in other states, like North Carolina, South Carolina.

More scams

Lex Fridman
(02:44:49)
What’s the primary method of income here when you move to a place? South Carolina, how do you make a hundred thousand at this time?
Matthew Cox
(02:44:57)
Oh, well, right now, I’m living in this guy’s house and I satisfied his loans. The house is worth 200,000.
Lex Fridman
(02:45:02)
Got it.
Matthew Cox
(02:45:03)
So what happens is one day we go and we check public records. Remember I told you it takes months for it to show up?
Lex Fridman
(02:45:08)
Mm-hmm (affirmative).
Matthew Cox
(02:45:09)
And it shows up. He’s got no mortgages on the house. So now I turn around, and I make a fake ID in the name Michael Shanahan, and I’m living in his house, but I have no credit. There’s no credit. So with the ID, I’ve got a social security number, and I order some secured credit cards in his name. So if you pull that credit profile, it shows up saying he’s got some credit cards, but they’re only a month or two old. So I can’t go to Bank of America. I mean, I could, but I needed to get the money as quick as possible. I want to get out of Atlanta.

(02:45:51)
And at this point, by the way, there’s multiple articles showing up in Tampa. So the St. Petersburg Times is writing multiple articles about me.
Lex Fridman
(02:46:03)
With your face.
Matthew Cox
(02:46:04)
With my picture. Yeah. But honestly, it’s post-internet, but it’s in its infancy. Like nobody’s… It’s not huge. And honestly, it’s a local newspaper in Tampa. It’s not that big of a deal. I’m not concerned about that so much at this point. What I’m concerned about is getting a chunk of money and just moving on and kind of reestablishing ourselves in a better way, where we’re not committing fraud with the house we’re living in.

(02:46:36)
But I’m living in this place. I make a fake ID in the name Michael Shanahan, and I call up three hard money lenders. A hard money lender is a guy that lends his own money or other investors’ money on property, kind of like a bank, but he’s lending his own money so he doesn’t have to really meet the banking requirements, and he can charge a much higher interest rate. These guys are charging 12, 13% interest, simple interest, and they’re only lending you a much lower percentage of the value of your home. So they’re not lending you 90% of the value. They’re lending you 65%, 60%.

(02:47:17)
So I call three of these guys. They all come out to the house at different times, and each one of them says, “I’ll lend you 100,000,” or it’s like 150,000. They all lend roughly 150,000. So we schedule three separate closings. None of them know about the other person. So what I do is I close one loan on let’s say Monday, and then one on Tuesday, and then one on whatever, Wednesday or Thursday, or they may have all been the same day, to be honest, but I don’t remember.

(02:47:48)
The point is I go to three separate title companies or real estate attorneys, and we close, and I get checks. After costs and everything, the total ends up being roughly 400,000. So I’ve got 400,000.

(02:48:04)
Becky and I run another scam in Tallahassee, Florida, and we get like 50 grand. Plus, the 80 grand had dwindled down to close to nothing, because we had gone on several vacations. We went to Bermuda, and I think we went to Jamaica. We actually stayed at the Ritz in Jamaica. So it was very nice.
Lex Fridman
(02:48:25)
You and Becky. So Becky turned out to be pretty good in terms of scams on the road?
Matthew Cox
(02:48:30)
No, she was useless. She was horrible, and she just spent money all the time. And what I realized too, very quickly, is she’s bipolar. So she’s bipolar, and she’s absolutely insane. She smokes pot all the time.
Lex Fridman
(02:48:48)
Did that matter for you personally or did it actually affect how well you were able to do these particular scams?
Matthew Cox
(02:48:56)
It was that she was the type of person that would start an argument at 1:00 in the morning and scream at the top of her lungs and get the cops called. So I can’t have the police called. I can’t get taken downtown and fingerprinted. I can’t have the police showing up. I don’t know who’s looking. We haven’t had plastic surgery at this point. We’re still pulling money together.
Lex Fridman
(02:49:26)
Oh, Becky.
Matthew Cox
(02:49:27)
Yeah, Becky’s a problem. And at some point, actually, we send her to a psychiatrist, and they put her on Zoloft. And she takes it for a month or two, and then she stops taking it. She thought she was all better. Like you’re not all better.
Lex Fridman
(02:49:48)
So can you give me a timeline here? How long are you able to be on the road here successfully?
Matthew Cox
(02:49:53)
Three years.
Lex Fridman
(02:49:53)
Three years.
Matthew Cox
(02:49:54)
This is me. This is the first few months.
Lex Fridman
(02:49:55)
Three years. Three years.
Matthew Cox
(02:49:59)
Yeah. What happens is we get that little chunk of money, we deposit it into these bank accounts, and we start pulling out cash, which works out fine because we’ve got a bunch of accounts and we’re pulling out little amounts, 7,000, 5,000, 8,000. And I would cash checks against her accounts, and they would call her to verify, “There’s someone here trying to cash a check for $9,000. Can you verify the payee?” And they go, “Oh yeah, that’s Scott Cogno.” “Oh, okay, thank you.” And they cash the check.

(02:50:34)
These are new accounts, so it looks odd, but we were always… I open the account. So what ends up happening is we’re cashing them, and I remember getting really frustrated because it was just taking forever. And I had gone into a bank one time. And they have banks where they actually cash large checks. Like if you go into Bank of America and you try and cash a check for $15,000 or 25,000, they probably won’t do it. They’ll tell you, “We don’t have that much cash on hand. We don’t this, we don’t that.”

(02:51:12)
They have certain banks that do that. So they told me where one of those was. I went there, I had a check for like 29,000 that had been cut on a closing for Michael Shanahan. Remember I refinanced Michael Shanahan’s? I’ve got a check for 29,000 that was issued to Scott Cogno. So I’m sitting in the bank, I go in there and I say, “I need to cash this.” And she says, “You’re going to have to talk to the manager.” I go, okay. She says, “Go sit down over there.” I go sit down in the little glass cubicle.

(02:51:41)
He comes over and he says, “I see you’re trying to cash this check.” And I was like, right. He goes, “Why don’t you just deposit it in your own bank?” And I went, “My bank is a credit union or something and it’s in Florida. They’ll hold this thing for two weeks. I need the money now. I have people I need to pay.” He was like, “Well, I’m not sure.” And I was like, “Well, it’s fine. It’s a cashier’s check. It’s good.” And he goes, “No, it’s good. It’s good.” I said, “You have the money?” And he’s like, “Yeah, we have the money.” He said, “It’s just odd. Hold on.” He goes back in the back, and he comes back and he says, “Where’d you get the check?” I said, “It was a cashier’s check. It was drawn off of a closing for somebody’s property that we’re doing. The company I work for, we’re putting an addition on.” Okay, that makes sense.

(02:52:31)
Comes back, goes, “Well, why do you need cash?” And I was like, “I’m cashing guys’ checks that work for the company. There’s a lot of these guys that are Mexican guys. They give them a check, they go to a check cashing company or they get charged 5, 10%. So I cash them,” I’m like, I don’t under… What? The check’s good, right? And he’s like, “Yeah, we’re just trying to verify some stuff.” And he went, “Yeah, hold on.” And he leaves again.

(02:52:56)
And I remember my cell phone rang, and I pick up the phone, it’s Becky. She goes, ” What are you doing? What’s taking so long?” I go, “Ah, the guy’s being a jerk. He doesn’t want to give me the money.” Well, she’s like, “Oh my God, get out of the bank. Get out of the bank.” And I went, “I can’t get out of the bank. The guy’s got my ID, he’s got my credit card, my ID, and the check for 29,000. He’s going to call the police if I just jump up and run.” And I go, “Don’t call me again. I’ll let you know. It’ll be fine.” I hang up the phone.

(02:53:23)
She calls back, same conversation. She’s bouncing off the walls, she’s going crazy. I’m like, “It’ll be fine.” Hang up the phone. He comes back out and I said, “Hey, so what’s taking so long?” And he goes, “We’re trying to get in touch with Michael Shanahan to verify the check.” That’s not good for me. I’m thinking, right, right. Okay. Okay. And he walks away, the phone rings, it’s Becky, “What’s going on?” I go, “They’re trying to get ahold of Michael Shanahan.” She goes, “Oh my God! Oh my God!”

(02:53:58)
And I’m like, oh my God. And I remember thinking I shouldn’t have left her the keys. There’s a good chance I run out of this place and she’s not there.
Lex Fridman
(02:54:05)
But by the way, when you’re sitting there, you’re who? Scott? You’re Scott?
Matthew Cox
(02:54:09)
I’m Scott Cogno.
Lex Fridman
(02:54:10)
And then-
Matthew Cox
(02:54:10)
The other guy’s-
Lex Fridman
(02:54:11)
He’s calling Michael Shanahan, okay.
Matthew Cox
(02:54:12)
Right. They’re saying they’re trying to get in touch with Michael Shanahan. So then the phone rings, my cell rings again, and I look, and it’s not Becky. So I pick up the phone, I go, hello? And she says, “Hi, this is Kim from SunTrust Bank. Is this Michael Shanahan?” So I’m like, “Yes, it is Michael Shanahan.” And she says, “There’s a guy here, he’s trying to cash a check. It’s very large. Could you verify the payee?” And I go, “Sure. It’s Scott Cogno.” I said, “I believe the amount’s $29,000.” And she goes, “That’s right. Thank you very much. I appreciate it.” I said, okay. I said, “Hey, by the way, how’d you get my number? This is my cell number.” And she’s like, “Oh, I’m sorry. We called the title company, and the title company gave us your phone number.” Well, I closed those loans, so that’s my cell. If they had looked any other way, they could have gotten in touch with the real Michael Shanahan. So I was like, oh, okay. Hang up the phone.
Lex Fridman
(02:55:09)
You answered the phone from the bank while sitting in the bank-
Matthew Cox
(02:55:13)
As Scott Cogno.
Lex Fridman
(02:55:14)
As Scott pretending to be Mike.
Matthew Cox
(02:55:18)
Right. So I just verified the check myself.
Lex Fridman
(02:55:21)
As Matthew pretending to be Scott pretending to be Michael.
Matthew Cox
(02:55:24)
Right. So I wait there, terrified still. They come out about two minutes later, the manager comes out, plus a woman, I’m assuming maybe that was Kim. She never said anything. And she walks out, and he counts out the money twice. 29,000. 29,000. And I stand up, and I mean, I remember shoving the money in my pockets. Like I’m trying to get out of there so quick. I’m like, hey. I’m like, okay, cool. I’m thinking this whole thing feels bad.

(02:55:54)
And I’m getting up, and so I’m starting to walk out of the bank and he said, “Excuse me, Mr. Cogno?” And I said, yes, sir. I turned around. And he goes, “I’d like you to know that I feel very apprehensive about this transaction.” And I go, “Really? What is it exactly?” He goes, “I can’t put my finger on it.” And I go, “It’ll come to you.” And I turn around and I just bolt right out of there.

(02:56:21)
And keep in mind, a week or so later, the Secret Service shows up. Did you cash a check for $29,000? So what’s so funny is that was one of the last checks we cashed. So we ended up with like 400,000.
Lex Fridman
(02:56:33)
Was there a connection between the Secret Service and this guy?
Matthew Cox
(02:56:37)
No, the-
Lex Fridman
(02:56:37)
The apprehension.

FBI Most Wanted

Matthew Cox
(02:56:38)
So the FBI is looking for me kind of in Tampa, and they’ve put out a fugitive warrant for me, which is how the US Marshals got involved.
Lex Fridman
(02:56:47)
So the US Marshals track down fugitives.
Matthew Cox
(02:56:50)
Yes, federal fugitives, they track down.
Lex Fridman
(02:56:54)
But everybody’s after you. You’re on every list.
Matthew Cox
(02:56:57)
Right. I’m on the FBI’s most wanted list. At that point, the Secret Service got involved once I leave Atlanta. So when Becky and I pack up our bags and we leave Atlanta, the Secret Service got involved because of the identity theft, bank fraud involving identity theft. The Secret Service doesn’t just do counterfeiting and protect the president. They also protect the financial infrastructure of the United States, and they especially have jurisdiction when identity theft is involved.
Lex Fridman
(02:57:28)
So identity theft plus bank fraud there, that’s when they [inaudible 02:57:33]
Matthew Cox
(02:57:32)
They move. Yeah, that’s it. That’s their territory.
Lex Fridman
(02:57:36)
And the US Marshals are just fugitives.
Matthew Cox
(02:57:39)
US Marshals, just fugitives. They don’t do any investigations.
Lex Fridman
(02:57:42)
Okay, but they’re all kind of working together?
Matthew Cox
(02:57:44)
Yeah. Yeah. The US Marshals are, let’s say, an arm of all of the various law enforcement agencies. Federal agencies, not the states. The states have their own fugitive task forces or fugitive…
Lex Fridman
(02:58:00)
So when you leave Atlanta, basically everybody’s after you.
Matthew Cox
(02:58:03)
Everybody’s after me.
Lex Fridman
(02:58:05)
Did you know this at that time? Or did you ever sense it?
Matthew Cox
(02:58:08)
No. I mean, you’d think you’re looking your name up every day. I’m not, because I’m just trying to get a bunch of money and just blend in, right? Things were not as interconnected at that time as they are now, but they’re starting to get interconnected. But of course, I have no idea how much. I barely go on the internet for anything. Dating. That’s the only thing on the internet. I had never been on Facebook. At this point, Facebook isn’t even out yet. This is 2006.
Lex Fridman
(02:58:38)
Still, were you trying to stay low?
Matthew Cox
(02:58:39)
Yeah, I am. I’m not a flashy person. I’m not driving… Like I didn’t go out and buy a red Lamborghini. I’m driving $40,000, $50,000 cars. I’ve had some sports cars, 70, 80. Maybe that’s a $150,000 sports car now, but it’s still not flashy. It’s not like it’s bright red or yellow. I mean, it’s always something nondescript.

Close calls


(02:59:03)
And I’m living in areas where these cars are everywhere. So I end up going to Charlotte, North Carolina. We rent an apartment, we decide to run a scam in South Carolina, so I go to Columbia, South Carolina. And in between this period of time, we go to Las Vegas. We go to Las Vegas to drop off a bunch of money to Becky’s son’s father, who’s taking care of her son. We drop off some money there, and we start… And while we’re there, it’s like, “Hey, there’s homeless people here.”
Lex Fridman
(02:59:45)
So you’re always-
Matthew Cox
(02:59:50)
You know, usually I don’t feel bad telling these stories. You make me feel bad.
Lex Fridman
(02:59:54)
I’m sorry. I’m sorry, my judgment is showing. No, but you have to be collecting identities, I guess, to be constantly creating new identities.
Matthew Cox
(03:00:02)
So I got my survey forms. So I go, and we go out and I’m taking surveys, and I end up going up to this guy. There’s like two or three guys that are standing on a bench or sitting next to a bench or something. And I see him and I walk up. And one guy gets up and he comes over and he’s like, “Hey, what do you need?” And I went, “I’m taking surveys for the Salvation Army to determine where we place our next homeless facility.” And the guy goes, “Oh, I’m not interested.” And they always said that. And I said, “It pays 20 bucks cash right now. It’ll take you five minutes.” And they’re like, “$20 cash right now?” I was like, yeah. I show them the cash. And they go, “Okay, yeah. What do you need?” Name, date of birth, social security number.

(03:00:40)
So when I get to criminal record, I ask him. He’s like, “Yeah, I’ve been arrested three, four times,” he said, “for prostitution.” He said they’re like misdemeanors. And I went, okay. And it was like, okay, well prostitution… To me, women get charged with prostitution. Men get charged with solicitation. I went, “Prostitution?” And he said, “Yeah, yeah.” He said, “I offered to blow an undercover cop for 20 bucks.” He said, “That’s what I thought you were coming out here for.” And I was like, no, no, bro. I said okay. And he’s like, yeah. He said, “I mean, a girl’s got to do what a girl’s got to do.” And he made some comment or something. I was like, okay.

(03:01:26)
So I jot down the rest of it, we’re good. I give him 20 bucks. I get in my car. I leave. We get back to North Carolina. I order all of his documents. His name was Gary Sullivan. I then go to South Carolina. When I go to South Carolina, I get a real estate agent. We drive around for a day. We look at five or six houses. I put five owner financing contracts on five different houses. So he writes up five contracts, all of them are asking for owner financing. I’ll put down 10%. I want owner financing. Two of them end up coming back and saying yes, we’ll do it. I have two closings. One of them is a house that’s worth like 225,000. I put down 25 grand. Another one’s 110,000. I put down 11,000.

(03:02:24)
So I buy these two houses. I then satisfy the loans on both the houses. Everything seems like it’s going okay, although Becky’s a lunatic at this point. She won’t take her medication. She’s had so many outbursts. And by this time we’ve had plastic surgery. She’s gotten plastic surgery, she’s gotten a boob job, she’s gotten liposuction. I mean, all kinds of stuff.
Lex Fridman
(03:02:56)
Look quite different? Like appearance changes or?
Matthew Cox
(03:03:00)
Thinner, better looking, just tightened everything up. I guess. She had had a kid, and she was 33, 34. I don’t know how old she was. 32, 33? I don’t know, roughly my age. Yeah, she lost like 15 pounds. Not because of the surgery, but just in general, we’re just working out. We’re going mountain climbing. We’re riding bikes. Fraud’s not a full-time job, so we have plenty of time.

(03:03:29)
So we’re goofing off, but she’s also a lunatic. She’s getting the cops called. She’s able to go out, and she’s able to stay stoned 24 hours a day. She’s going out with friends, drinking. I never leave the house.

(03:03:48)
Even to this day, I really barely ever leave the house. I’m very much a homebody kind of person. So the idea that I’m able to make my living doing YouTube and I never have to leave my house, I love that. I don’t ever go anywhere except for the gym and back home. That’s it.

(03:04:03)
So what happens is I’ve actually moved her out of my apartment. Like I had an apartment downtown, 30-story building. I actually move her into another apartment. She’s that much of a lunatic. We can’t even be in the same place. Multiple times I’ve tried to leave her, she’s called me up and begged me to come back. It’s horrible.

(03:04:23)
So I end up buying a couple houses in Columbia, South Carolina. I satisfy the loans on the houses. I’ve got an ID, not a driver’s license, but an ID in the name of Gary Lee Sullivan. And I refinanced those houses, because keep in mind, there was owner financing, but they also had mortgages. So there’s something called a wraparound mortgage. So these guys did wraparound mortgages. So let’s say you buy a house for $250,000 and the bank lends you 200,000, and then you owner finance the house to me. So I give you 50 grand down, but I’m not able to get a loan from the bank to pay off your mortgage. So what we do is you do a wraparound mortgage. So I’ll pay you your mortgage and you pay the bank. So there is a second mortgage on the property, but it’s wrapped around your first.
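The wraparound arrangement he describes maps to a simple flow of balances and payments. Here is a minimal Python sketch using the round numbers from his own example; the monthly payment amounts are invented purely for illustration:

```python
# Round numbers from the example in the conversation; the monthly
# payments below are hypothetical, for illustration only.
price = 250_000        # agreed sale price
bank_first = 200_000   # seller's existing bank loan, which stays in place
down_payment = 50_000  # buyer's cash to the seller

# The buyer owes the seller the balance, "wrapped around" the bank's
# first mortgage, which is recorded as a junior lien behind it.
wrap_balance = price - down_payment

# Monthly flow: the buyer pays the seller on the wrap; the seller keeps
# paying the bank on the underlying first mortgage, keeping any spread.
buyer_to_seller = 1_500    # hypothetical wrap payment
seller_to_bank = 1_200     # hypothetical first-mortgage payment
seller_spread = buyer_to_seller - seller_to_bank

print(wrap_balance, seller_spread)
```

The point of the structure is that the bank’s $200,000 first mortgage is never paid off at the sale; the buyer’s obligation simply wraps around it.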
Lex Fridman
(03:05:21)
That’s legal?
Matthew Cox
(03:05:22)
Mm-hmm (affirmative). Yeah. I wouldn’t lie to you. So these have wraparound mortgages. So-
Lex Fridman
(03:05:30)
You’re always selling, and you’re good at it.
Matthew Cox
(03:05:32)
So I go, I satisfy the owner finance loans, the wraparound mortgages, and I satisfy the original loans that these people took out on their own mortgages. One of them, by the way, I sat… You have to sign as the president of the bank, right?
Lex Fridman
(03:05:53)
Yeah.
Matthew Cox
(03:05:53)
So I sign it as C. Montgomery Burns, which is the aging tycoon, the guy that owns the power plant in The Simpsons TV show. So I sign that and I notarize it, which I thought was cute. I actually wanted to sign all of them as cartoon characters, and Becky was screaming her head off and wouldn’t let me do it. Like I wanted to do all the Simpsons, right? But she wouldn’t let me do it. She’s screaming and hollering. Nobody knows who C. Montgomery Burns is.

(03:06:18)
So I sign it, notarize it, all of those are satisfied. I then go to multiple banks and I start refinancing all these properties multiple times. So I’m applying for these loans, and I’m getting the loans, and I’m closing, so I’ve got like five or six loans on this one house that’s worth like 225,000. I think it was like 230, whatever. I take out four or five loans on that house. So I borrow like $190,000 like five times. So I’ve got like $800,000, and then I borrow another 3 or 400,000 on the other house, the smaller one. So it ends up being like $1.3 million. It’s actually 1.5 million. It was more. But what happened with that was… So keep in mind, you can only open up so many bank accounts in your name. You can go to Bank of America, they’ll open one. Then you go to SunTrust, they’ll open one. They might even ask you, did you open another bank account today? Because every time you do it, there’s an inquiry into something called ChexSystems or AccuCheck. And so then by the time you go to the third bank, they’ll say, “Listen, something’s not right. You’ve got multiple inquiries.” If you go to, whatever, Mercantile Bank, they might go, “Okay, we’re going to open one.” They’re going to need an explanation, but you’re not opening more than three. By the fourth one, they’re going to be like, absolutely not. Something’s wrong.
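The refinance arithmetic he is walking through can be tallied in a few lines. This is a rough Python sketch; the loan counts and amounts are the approximate figures from the conversation, not records, and the second house’s "3 or 400,000" is lumped into one number:

```python
# Rough arithmetic for the "shotgunning" described above: several
# refinance loans closed against the same property so close together
# that no lender sees the others' liens before funding.
house_a_loans = [190_000] * 5   # roughly five refis on the ~$230,000 house
house_b_loans = [350_000]       # the "3 or 400,000" on the smaller house, lumped

total = sum(house_a_loans) + sum(house_b_loans)
print(f"${total:,}")
```

With these round inputs the tally lands in the ballpark of the $1.3 to 1.5 million he mentions; his own spoken figures ($190,000 five times described as "$800,000") are loose approximations.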

(03:07:46)
So I’ve got multiple identities, but I can only open up so many bank accounts. The other problem is that these checks, they’ll only give you so much money on a refi. Usually they only want to let you walk away with, let’s say, $100,000. So one of the things I did was I would typically record another mortgage and have them pay that mortgage off. So I opened a corporation to do that, so I could then turn around and go open corporate bank accounts. Because now it’s not going off my information, it’s going off the corporation, so I can open up multiple corporate bank accounts.
Lex Fridman
(03:08:22)
Well, these corporations are fake or real?
Matthew Cox
(03:08:24)
No, no. I went to a real corporate attorney and had him open them. I gave him whatever. I gave him like $1,500, $2,000, and he opened up a corporation for me, Gary Sullivan, and I then turned around and I went and opened up multiple bank accounts in that corporation’s name.
Lex Fridman
(03:08:39)
How are you keeping track of all this? Is it in your head or do you have good organization?
Matthew Cox
(03:08:44)
Oh no, every single identity has its own file with plastic inlays, sleeves for their passports.
Lex Fridman
(03:08:54)
That’s nice and organized.
Matthew Cox
(03:08:55)
For all this. Yeah, it’s super organized.
Lex Fridman
(03:08:57)
You open this. I’m Gary now.
Matthew Cox
(03:08:58)
Right. That’s exactly what it is. You kind of go over, boom, boom, boom, boom. You sit in your car for a minute, you put it down, you walk in. Well, what happens is it went up to 1.5 million, and I’m pulling money out of the bank, and then one day I got a phone call on Gary Sullivan’s cell phone. The guy’s a lawyer. He calls up, he says, “Hey, I’m a lawyer with Washington Mutual. We have an issue.” I said, “What’s that?” He says, “We got a phone call from the title company.”

(03:09:26)
One of the title companies that I was attempting to refinance one of the pieces of property with noticed that I… They’d been sent a document that showed that I had purchased the property, and I said I purchased it cash, and the documents said I purchased it cash. And they got that, and there was actually a mortgage on the property. And so somehow or another, they connected it and they called Washington Mutual and they said, “Look, there’s an issue. We have a fraudulent document here.” And he said, “So we went and we looked, and it turns out that we pulled public records and that there is a mortgage in front of us, several mortgages in front of us. So there’s like three or four mortgages in front of me, Washington Mutual. You owe us.”

(03:10:19)
And it wasn’t that much. It was like 100 grand, right? Like 95 or 100. And I said okay. And he said, “So there’s an issue here. You’ve got a few mortgages in front of us, and we’re supposed to be your first mortgage, and we’re not supposed to be two mortgages behind or three.” And I was like, “Okay, sounds like an error. Not a big deal. Have you contacted law enforcement?” He said, “No, I haven’t. I was hoping we could rectify this some other way.” I said, “You know what? I think we can. I’m going to have my lawyer call you back. I’m going to go to his place right now. Give me about two hours.” No problem.

(03:10:53)
I immediately run, jump in my car, head towards South Carolina, call my corporate lawyer, tell him, “Look, I need to talk to you. Here’s what’s going on.” I explain it to him. He doesn’t really understand. He says, “This sounds pretty complicated. My law partner is a criminal defense attorney. I’m going to set up a meeting right now with all of us.” Okay.

(03:11:15)
I get there 45 minutes later. I walk in the door, I sit down. He says, “What happened?” They said, “Gary, this doesn’t sound right. What happened?” I said, “Okay, so listen. Bought this house. I bought it cash. I then refinanced it,” I didn’t buy it cash, but I told him, “I bought it cash. I refinanced it like four or five times within a day or two of each other.” And they were like, “How is that even possible?” I was like, “Well, I went to different title companies,” and I explained how I do it. I said, “Washington Mutual just found out that they’re in second position or third position.” Or I said, “But they may be in fourth position.” You know they mail these things in so you never know. And he was like oh my God. He’s like, “What do you want to do?” I said, “I want you to contact them and agree for them to not contact the authorities provided I pay them off.” He goes, “Do you have the money?” I said, “I do have the money. I can go get the money right now.”

(03:12:10)
He calls the lawyer. This is back when faxes were still a thing, right? So they fax some documents back and forth. They do a couple emails back and forth, and they have a conversation. I remember the lawyer started arguing because he wanted to charge me like yield spread and fees and stuff, and I was like, “What are you talking about? I’ll pay it.” So it ends up being a little over 100,000. And I’m like, that’s it. So he’s like, okay. And so he says, “Okay, that sounds good.” And so he said, “Okay, all you have to do is go get the check.” And he said, “Bring it to a Washington Mutual branch. Tell them to call.” I said, “I’m not going into a Washington Mutual branch, bro. I’ll bring you the check.” So he calls them back: he’s not doing that, he’ll bring it here, you guys come take care of it. He said, “No problem.”

(03:12:56)
Okay, hang up the phone, and he turns to me and he says, “Okay, well we have a problem.” He said, “We still have the problem of these other mortgages.” And I went, “Right?” I said, “They don’t know anything.” He said, “I know, but Gary,” he said, “what if they find out?” I said, “They find out that they’re like in second and third and fourth place?” He’s like, “Right.” I said, “I leave town.” So they both laugh. They go, “Gary, you can’t just leave town. They have a copy of your driver’s license. They have your social security number. They have your birth certificate. They’ll find you. It’s the FBI.” And I go, “You’re assuming I’m Gary Sullivan.”
Lex Fridman
(03:13:36)
Wow. You tell them.
Matthew Cox
(03:13:37)
And listen, they looked at me and they went… And I remember he said, he goes, “We’ll cross that bridge when we get to it.” And I said, “Right, my immediate problem is getting rid of these people.” And he goes, “Right. Right.” So I go get the check, bring it back, give it to them. Never called the FBI.
Lex Fridman
(03:13:59)
Can’t believe you got away with the Washington Mutual.
Matthew Cox
(03:14:03)
Oh, bro…
Lex Fridman
(03:14:04)
I mean, these are all really close calls, it seems like.
Matthew Cox
(03:14:07)
No, this is the close call. I have two more close calls that my hands sweat thinking about it. I walk into Wachovia. I just opened this account two months ago, so it’s a new account. So whenever I would go in there, I’d say, “Hey, I need $7,000, $6,000.” Anything over $3,000, they had to call to get permission, like authorization. So she’s like, “Okay, I got to go call.” I said no problem. So the girl walks in the back, I’m sitting there waiting, all of a sudden a massive person reaches over my hand and grabs my wrist, and somebody grabs it from the other one, and they pull my hands behind my back. These are two of possibly the largest law enforcement officers I’ve ever seen in my entire life. And they’re massive. And they handcuff me and they say, “Mr. Sullivan, you’re being detained. We’re taking you into custody, and we’re holding you until a detective gets here.”
Lex Fridman
(03:15:12)
Who are these guys? Is this just Marshals or is this cops or what?
Matthew Cox
(03:15:15)
These are Sheriff’s deputies.
Lex Fridman
(03:15:16)
Sheriff’s deputies, okay. So Gary Sullivan, right?
Matthew Cox
(03:15:19)
Right. And as they walk me in the back, they’re calling me Mr. Sullivan. They sit me down, and by now the Secret Service are looking for me. They were calling us John and Jane Doe, but now they figured out who we were. And so now I’m on the Secret Service’s Most Wanted list. I don’t know yet that I’m number one, right? I probably was, but we had just found out I was on that list. So it is getting bad.

(03:15:47)
So they sit me down, and I’m waiting, and I remember thinking that the FBI was coming. I don’t really know. At that point, I couldn’t tell you the difference between everybody. And then five minutes go by and I’m sitting there going, “What is going on? Do you guys have any idea what’s going on?” They’re like, “We don’t know. We’re just grunts. We just do what we’re told.”

(03:16:05)
So suddenly this guy walks in, he’s probably in his early thirties, maybe. He walks in, gray suit, I think he looks like he’s FBI. He says, “Hey, I’m a detective with the…” I want to say Richland County, whatever, sheriff’s department or police department, whatever. And I was like, oh, okay. And he says, “Yeah, listen, we’ve got an issue. Wachovia, they want us to arrest you.” He said, ” They’re saying that you’ve got three mortgages on your house.” And I go, “Is that illegal?” And he looked at me and he went, “You know, to be honest, I don’t know.”

(03:16:41)
And I distinctly remember thinking, I’m walking out of here. All I have to do is convince this guy I haven’t done anything wrong. He’s already said he doesn’t know. So he gets on the phone with the head of Wachovia’s fraud department, and he’s saying, “This guy is running what’s called a shotgunning scam,” which is absolutely right.
Lex Fridman
(03:17:01)
What is a shotgunning scam?
Matthew Cox
(03:17:02)
It’s where you close on so many loans simultaneously that they can’t catch it. Anyway, they start going back and forth, and he’s on the phone and he’s like, “Why did you close three loans?” I said, “It’s not illegal. I have a first mortgage, a second mortgage, and a home equity line of credit. That’s perfectly legal.” And you can hear the guy: “They’re all first mortgages.” And I said, “I read every one of those documents. Not one of them said they were first mortgages.” And they don’t. First mortgages don’t say they’re first mortgages. It’s the placement of the mortgage, the placement of the lien, that determines whether it’s a first, second, or third.

(03:17:40)
So it’s possible that I wouldn’t have known it. It’s certain that I could have read those documents and not known. And he’s like, “That’s not true!” And he’s screaming. And so I go, yeah, listen. And he said, “Well, you’re taking it out all cash. Why are you taking all cash?” I said, “I don’t know if this might be illegal,” I said I don’t know. I said, “I mean, I work for a labor company-“
Matthew Cox
(03:18:03)
I work for a labor company, Labor on Demand. I pull out my business card. You can call. So, I’m like, “I work for Labor on Demand.” I said, “We hire a lot of guys that don’t have bank accounts. So, the company pays them. Then usually, I’ll pull out money and I’ll cash their checks, because they get charged like 10% at these check cashing companies. I feel bad. I know the checks are good, so I just deposit them. I mean, I don’t know if that’s illegal. I don’t think that’s illegal.” He’s like, “No, no, no, that’s fine. That’s a decent thing to do. That’s fine.” I’m like, “Oh, okay.” He’s talking to the guy at Wachovia, who’s screaming and hollering. He’s going back and forth, back and forth. So, we’re going back and forth and I’m just derailing everything this guy says.

(03:18:53)
At one point, he’s screaming, “He’s committing fraud. We want him arrested.” He’s like, “I don’t know what to charge him with.” He’s like, “Hey, look. How did you even do this?” I go, “Look, I didn’t do this.” I said, “I came to Wachovia. I met with a loan officer.” I said, “I need a first mortgage. I need to pull out $100,000. I want to start buying houses.” He goes, “That’s right. You own another house here too, don’t you?” I said, “I do.” I said, “We’re putting a new roof on it. We’re going to build an addition. We’re putting in a pool. I’m buying one right down the street from that one.” Obviously, I’m pulling out money. I said, “So I told them I need $100,000.” They said, “That’s fine.” They said they could only get me $100,000 out because of something about Fannie Mae guidelines, which is true.

(03:19:44)
So, then she said, “I can send you to a friend of mine who’s a loan officer. She can get you a second mortgage,” which she did. Then she told me she could only get me another $100,000 or so, $90,000. She said, “You should get an equity line of credit if you’re going to be renovating properties.” So she sent me to somebody and they got me an equity line of credit. I said, “I haven’t committed fraud.” I said, “I wouldn’t know how to commit fraud if you told me.” I said, “What sounds more reasonable? A guy that worked for a labor company ripped off a bunch of banks for over half a million dollars, or some loan officers got together and did something illegal?” I said, “There’s a problem at the bank.” He says, “I think you’ve got a problem at the bank.” This guy goes nuts.

(03:20:33)
While he’s screaming, “He needs to be arrested. This is fraud. My loan officers have not done anything illegal. They wouldn’t do that.” He says, “Look at his ID. His ID is fake. His ID starts with 000.” South Carolina IDs start with 000. This guy’s in California. He has no idea. So, when he says that, the detective looks at my ID and he goes, “Listen.” He said, “This is a real ID. I ran this guy through NCIC.” He said, “This is Gary Sullivan.” I looked at him. Now, I’m not Gary Sullivan. I go, “Come on, bro. What are we doing here?” He goes, “I know, Gary. I know.” He says, “I’m going to take him downtown. I’m going to talk to my, whatever, lieutenant, captain, whoever. I’m going to fill out a police report and I’ll let you know.” He hangs up. I get up. They’ve taken the handcuffs off. I stand up.

(03:21:38)
As we’re walking out with the detectives, as we’re all walking out, he goes, “Hey, you have an ID. Do you have a driver’s license?” I went, “I do, but it’s in Nevada.” He goes, “Oh, that’s right.” He goes, “You’re from Vegas.” He looks at the two deputies and they all grin. I think he ran me through NCIC, which means he ran me through a nationwide criminal database, which means he thinks I’ve been arrested three times for prostitution in Vegas.
Lex Fridman
(03:22:09)
Right.
Matthew Cox
(03:22:10)
Listen, I’m humiliated. I was just like, “Oh, man.” So one of the cops goes, “Here, give me the ID,” takes the ID. He goes, “I’ll check and see,” because I have to follow him back in my car. By the way, my car is in the name Michael Eckert. He doesn’t have a photograph of Michael Eckert, because you can’t pull up photographs from other states. So, he doesn’t have a photograph, but he knows that’s not my car. He asked me, “Whose car are you driving?” I said, “Oh, that’s my boss, Michael Eckert.” I said, “That’s my boss.” He goes, “Oh, Michael Eckert?” I said, “Yeah, exactly.” I’m like, “Oh, my God.” So I’m thinking he knows Michael Eckert, knows it’s registered in North Carolina, knows the address, which is where I was currently living. That’s a problem.

(03:23:04)
So, the deputy grabs the ID, walks outside, comes back. I have no idea if this homeless guy has a driver’s license in Nevada. I don’t know. He had nothing on him. He comes back and he goes, “Does he have a valid license?” He goes, “Yeah, it’s valid.” He hands it to him or he hands me the ID and he goes, “It’s valid.” He looked at me, he goes, “Yeah, well…” He said, “It says, he’s 5’11.” It was like 5’10, 5’11, and I’m clearly not 5’10 or 5’11. They all look at me and I go, “Fellas, with a good pair of shoes.” They all go, “Follow us, Gary.” I follow them back to the police station. Becky is calling me on the phone, screaming her head off.

(03:23:53)
Now, I’d always told Becky, “If I ever get arrested, immediately go get me a lawyer. The lawyer will be able to get me out on bond,” because I’ll be arrested for something stupid. I said, “It’ll be something like trying to cash a fake check.” All my IDs are real, so it won’t be for a fake ID. So, my ID won’t be in question. Most police departments and sheriffs at that time did not run your fingerprints through AFIS, because they get charged for that. So, they don’t typically do it unless your identity is in question. Mine wouldn’t be. I have a valid driver’s license or a valid ID in that state. So, I go back. She’s screaming, she’s like, “Oh, my God. You don’t understand. I just checked the internet, the website. You are number one on the Secret Service’s most wanted list.”

(03:24:54)
I was like, “I got bigger problems right now. They just held me in the bank. I’m following them right now.” She was like, “Get on the interstate. Go, go.” I cannot go. The detective’s in front of me. The cops are behind me. They’re escorting me to the police station. Listen. She’s like, “Oh, my God! Run! Run!” I go, “Look, I’m not a NASCAR driver.” It’s a sports car, but it’s not going to outrun a radio or a helicopter. That’s not going to happen. I know it seems nice. I’m not that guy. I said, “Look, you don’t understand. I was in handcuffs 30 minutes ago. I just talked my way out of it. I’m going to get out of this.” I said, “The worst that happens is I’ll be arrested as Gary Sullivan. You can get me an attorney. He can get me out.”

(03:25:42)
She goes, “I’m not getting you an attorney. I’m not getting you out on bond. I’m not risking everything I’ve got for you,” because she has all the money. We’ve got $700,000, $800,000 at this point. By the way, she’s not even in North Carolina at this point. She’s relocated to Houston, Texas. Because when this scam fell apart, we were going to move to Texas. So, we were already moving there.
Lex Fridman
(03:26:14)
But by the way, just a small tangent, where do you store money in situations like this? When you talk about $800,000, do you have to keep moving accounts to make sure it’s not accessible by FBI?
Matthew Cox
(03:26:27)
Well, there’s about $600,000 or $700,000 in accounts, but keep in mind, I’m getting that out in cash. There’s no Bitcoin. None of that stuff exists. So, I probably should have bought diamonds or bought gold. I don’t know any of that. All I could think of is go in slowly, be patient, don’t drain the accounts, fluctuate them. I was getting cashier’s checks from one account to another. So, the balances were doing this. They weren’t just going… They were doing this, and then one day, boom, they’re gone.
Lex Fridman
(03:26:59)
Okay, got it.
Matthew Cox
(03:27:02)
We’ve gotten out like $600,000 or $700,000. There’s still $600,000 or $700,000 in the bank, but I’m not going back. I’m done. Well, look, before I go in, she says, “If you go in the police station, I’m done. If you get arrested, you’re done.” I said, “Well, then I better not get arrested.” I hang up the phone. The cop’s standing behind my car. I get out. I go into the police station. I walk in. I fill out the police report. He tells me, “I got to talk to my captain real quick. Can you wait?” He couldn’t leave me in his cubicle. He goes, “Can you wait in the hallway? I can’t leave you in the cubicle.” I said, “No, no problem.”

(03:27:42)
So I go and I wait in the hallway. In the hallway are a whole wall full of, on the corkboard, wanted posters, black and white, black and white, car thief, rapist, murderer, Secret Service’s most wanted. My face is right there. I’m like, “Holy Jesus.” Everything in me told me, “Run, bro.” Just fucking [inaudible 03:28:08] right now. Right now, just go. Your luck’s run out. There were so many, I didn’t think he was going to see it, but everything in me just said run. The problem is if you’ve ever been into a police station, you’re not getting out of it. Do you understand?
Lex Fridman
(03:28:23)
There’s a lot of cops around.
Matthew Cox
(03:28:26)
Well, not just that, but they buzz you in. You get in the elevator, you have to punch in a code. You have to punch in a code to get back out of the elevator. You have to punch in a code to get into the next door. I mean, it’s impossible. I’m not going to get in the elevator. The cop comes back up. He said, “Hey, Gary, appreciate it. No problem. My captain said we’re good. We’re going to wait for a phone call from the…” No, wait. The district attorney had called already. He said, “They’re looking into it. I’m going to go ahead and let you go.” I go downstairs. He walks me to my car. He said, “Look, do me a favor.” He’s like, “We do have some serious questions at this point. The district attorney says there’s some things.” I said, “Not with me.” He said, “Well, just do me a favor.” He goes, “Don’t leave town.”

(03:29:14)
I said, “Bro, I own two houses here. I’m not going anywhere.” I said, “I’m telling you right now. Wachovia, they fucked up.” He’s like, “I believe you. I believe you.” Whatever he said, “I hope you’re right. I’m sure you’re right.” Okay. So, I get in my car. I leave. I go to two more banks, pull out more money, but at one point, I go into a bank and two of the cashiers practically slam into each other trying to get to the phone. I can tell something’s up. I go, “No, no, no, no. Something’s up.” So I get in my car and back out. One of them even runs out and looks at the tag number. So, I drive. I get on the interstate. I go. Becky, of course, is like, “I’m sorry. I love you. I would’ve never done that. I was just scared.” “I understand.”
Lex Fridman
(03:29:58)
Becky sounds like a handful.

Break up with Becky

Matthew Cox
(03:29:59)
Oh, my God. So, I go all the way back to Charlotte. I pack up my apartment. I drive all the way to Houston with my entire apartment packed up, by the way, in a U-Haul. The next day, the next morning, she’s got people there, movers, packing it up. We pack it up. I drive the U-Haul all the way to Houston. It takes a couple days. We have some guys unload it into a storage unit, because I’m going to stay with Becky until I find my own apartment. As we’re driving around the neighborhood, super nice. She’s living on the 20th floor or something of some huge high-rise, great apartment. We drive by and I go, “Oh, stop the car. I want to get out.” It was one of those cone things where there’s flyers for a house. I jump out and I get the flyer. She’s like, “What are you doing?”

(03:30:56)
I go, “Well, I was just looking at the flyer,” and she says, “I don’t want to do a scam here. I want to live here. This place is nice. I love it here.” I went, “Right, I understand.” I said, “No, but I have to find an apartment.” She goes, “Oh, I’m just so disgusting. You can’t stand to spend even a couple weeks with me.” She goes just ballistic. She’s screaming at the top of her lungs, and I know she’s going to get me caught. She’s never going to get me out. She’s already told me that. So, we go back to the apartment, we go upstairs. I was so scared of this chick, bro. I was so scared. I remember I was going up in the elevator, and this girl gets on, clearly a stripper. I mean, drop dead gorgeous, just wearing stripper clothes.

(03:31:49)
As soon as she got on, Becky gave me that with the face. I’m like this. I’m staring in the corner and never look at the girl. I remember we get off the elevator, bing, it opens. I bolt off it. Becky bolts off the elevator, and I remember she squeals, “I bet you just love to fuck that tramp.” As the elevator doors are closing, she goes, “Hey!” I thought that was funny. So, I go to the apartment. We have a screaming match, kind of, tell her I want to split up the money. She tells me she’s not going to split the money.
Lex Fridman
(03:32:31)
Why?
Matthew Cox
(03:32:32)
Because she said, “You can go somewhere else and do this again. You’ll have $1 million in six months. I have to live off this money.”
Lex Fridman
(03:32:44)
Did she threaten you?
Matthew Cox
(03:32:45)
Oh, it was funny too, because the conversation went back and forth. I remember saying, “No, I want half.” She said, “I’ll give you $10,000.” I said, “You’re out of your mind.” I said, “I’m telling you right now. You come up with something reasonable or I’ll take all of it.” I said, “I’ll take all of it.” She goes, “And what? Escape in that U-Haul?” She says, “The cops are going to be looking for it in five minutes.” I just remember thinking, “Oh, wow.” Keep in mind, all of my IDs, everything are in the storage unit that she has a key to. I’m not getting those. It’s over. I got an ID right now that says my name is Michael Eckert. I’m driving a U-Haul van.
Lex Fridman
(03:33:23)
Yeah, it sounds like she has a lot of negotiation leverage.
Matthew Cox
(03:33:26)
So we start arguing back and forth, and she says, “$100,000. I’ll give you $100,000.” I said, “I’ll take it.” She counts out $100,000. Later when I recounted, it wasn’t even $100,000. It was like $98,000. That’s fine. It’s fine. But we’ve got them all marked, $2,000, $5,000, $6,000. She’s like, “2,000, $5,000, [inaudible 03:33:45].” She ends up stiffing me. That’s fine. It’s not my money. So, I take it, I leave, and as I’m leaving… She’d always called me before on the phone and begged and pleaded and cried, “I messed up. Please give me a chance. I’m sorry. I’ll take my medication. I’m sorry. I thought I was better. I thought I was okay.”

(03:34:04)
I remember walking out. I put my cell phone on the counter and just walked out, went downstairs, got in the truck, and drove. When I got to Louisiana, I stopped at Baton Rouge. I mean, at some point, I stopped and I think I got a room or something. At one point, I know I stopped.
Lex Fridman
(03:34:31)
So you drove without a plan essentially?
Matthew Cox
(03:34:32)
I drove back to Charlotte to get my car.
Lex Fridman
(03:34:37)
Got it.

Calling parents

Matthew Cox
(03:34:38)
So I can’t be driving. So, I stopped at Baton Rouge at one point and got a cell phone, like a burner phone, a Verizon or Virgin Mobile or something, one of those little phones. So, I bought one. I call a few people back home, called my mom. She’s in tears crying. My dad’s yelling in the background.
Lex Fridman
(03:35:06)
Just a small tangent. What did your mom and dad say? Do you remember anything that stands out to you?
Matthew Cox
(03:35:11)
No, my dad was like, “Well, I hope you’re happy. Every time someone mentions your name, your mother cries,” which is funny to me because growing up, he was never concerned about her crying. So, it was like, “Since when did you care?” My dad, he’s an alcoholic. He’d been sober for two years, then a month-and-a-half drinking binge, then sober for six months, and then did it again, then sober. It just went back and forth, in and out of alcohol and drug programs. But like I said, he worked for State Farm and he was a top-selling manager. So what they would do is they’d put him into a 30-day program, and I mean, he has to stay there. They were the only ones that had that control, because they’re like, “You’re going to do this and you’re going to pass it, or we’re firing you.”

(03:36:12)
He made a lot of money and he made a lot of money for State Farm. He hired and trained a ton of agents, and he had one of the top performing agencies. So, he was worth a lot to them. What ends up happening is I get that phone that I was telling you about, and I called, talked to my mom. She’s crying. She’s like, “I love you so much. I just want to make sure you’re safe.” I end up calling Susan Barker, who was one of the brokers that worked for me at the time, and I say, “Hey, what’s going on with you?” She’s like, “Oh, Matt, what’s going on? FBI is everywhere. They’ve been talking to everybody.” It’s like a year and a half at this point.

Calling FBI


(03:36:59)
She’s like, “They come around every once in a while. Everybody’s gone in, everybody’s cooperating, everybody’s talking, everybody’s blaming you,” including her. So, as we’re talking, she said, “Look, the main FBI agent on the case, she told me if I ever spoke with you to have you call her.” I was like, “Yeah, I’m good.” So she goes, “Her name is Candace, and she wants you to call her.” She goes, “At least call her, for God’s sakes. Maybe you could just turn yourself in. Maybe you can negotiate just like a couple years. If they’re not going to catch you, then maybe turn yourself in. Maybe it’ll help. At least hear her out.” I was like, “Okay, all right. You’re right.” Hang up the phone. I call Candace. She picks up the phone. I go, “Hey.” She goes, “Who’s this?” I go, “This is Matt Cox.”

(03:37:49)
She goes, “Hello, Mr. Cox. How are you?” I go, “I’m doing okay. How’s it going? I understand you want to talk to me.” She goes, “I do.” I said, “What can I do for you?” She says, “You can turn yourself in.” I go, “Well, that’s not going to happen.” I said, “What else do you need?” She said, “I think that you should think about turning yourself in.” I said, “Why? Well, what am I looking at?” She goes, “Well, that’s not how it works. The way it works is you turn yourself in and we take that into consideration.” I said, “No, no, no, no.” I said, “That’s not good enough.” I said, “I’m not stupid enough to turn myself in and hope for the best.” So she says, “Well, let’s talk about this.” I said, “Well, what am I looking at?” She goes, “I don’t really know. I can’t tell you that.”

(03:38:31)
I said, “Well, then we don’t really have anything to talk about.” She goes, “Well, wait a second.” She said, “Hold on. Let me call the US attorney. Maybe we can work something out.” So I said, “Okay, I’ll call you back.” She said, “Well, give me your phone number, I’ll call you.” I went, “No, no, no.” I said, “I’ll call you.” I said, “I’m going to hang up the phone. I’m going to turn the phone off.” I said, “For all I know, you’re triangulating this phone call right now or something.” She goes, “Oh, give me a break.” She goes, “You’re not that important.” I remember thinking, “Who do you think you are? You’re just some little fraudster guy running around. You’re not a terrorist.” I almost was like, “Oh, okay. Here’s my number,” which she probably already had.

(03:39:23)
But I almost was like, “Okay, I’ll wait for your call,” and almost left my phone number. I said, “No, you know what? I’m going to hang up the phone. I’m going to turn it off anyway, and I’ll call you back.” All right. Whatever. I hang up. I turn off the phone. It turns out, I found out later when I ordered the Freedom of Information Act, she actually immediately called the US Marshals, and they took the phone number, tracked back the phone, and immediately had two marshals from Baton Rouge go to the place where I had been.
Lex Fridman
(03:39:51)
Damn.
Matthew Cox
(03:39:52)
Oh, listen. Yeah.
Lex Fridman
(03:39:53)
They work fast, and she’s good too.
Matthew Cox
(03:39:56)
Not just that. I made the initial calls sitting there where I went and bought the phone. It was a gas station. There was also a Subway. I had ordered a Subway. I was eating it, playing on my computer, programming the phone, and making phone calls. So, by the time I talked to her, they’re driving. By that point, I’d walked out and gotten into my vehicle and I leave. But who knows? I don’t know if they showed up 30 minutes late. I don’t know. I could have hung out, “Oh, I’m just going to finish my food,” and they could have shown up. So, I call her back an hour or two later. The first time, the US attorney hadn’t gotten back to her. Then he did. He came back and said, “Seven years. He’s got to turn himself in here.”

(03:40:41)
So seven years, that seems like a lot. I kept saying, “Is that seven years for everything?” She goes, “Yeah, that’s for everything.” I was like, “That’s everything that happened in Atlanta and some stuff that you don’t know about?” She said, “Look, what’s important is you turn yourself in in Tampa.” I was like, “Okay. Well, I’m closer to Atlanta. Why wouldn’t I turn myself in in Atlanta?” She’s like, “Look, you don’t want to do that. You don’t want to do that.” Well, because the Secret Service would’ve gotten the credit if I’d walked in there, right? I didn’t know anything about rivalries and how they work at that time. I do now. So, we go back and forth, back and forth, and I continually ask her, “Does that include Atlanta and everything?” At some point, I realized, “Oh, she’s just not answering.” So finally, I said, “Listen, you keep dodging this question.” She said, “All I can speak for is Tampa. So, if you come back to Tampa and you cooperate against everyone, seven years.” She wants me to cooperate against my ex-wife. I’m like, “I’m not going to do that.” I said, “My ex-wife didn’t do anything. She doesn’t know anything. She didn’t do anything.” “Well, that’s not what I heard.” She’s going on and on. I was like, “No, no.” I was like, “Oh, wow.” I was like, “So that’s just for…” She’s like, “That’s right.” I said, “All right, we’re done.” She says, “No, wait. I can call the Atlanta US attorney.”

Running from cops


(03:42:15)
No, lady, I wouldn’t believe you if you told me water was wet. I don’t trust you. I hung up the phone, threw it out the window, and I ended up going to Charlotte, dropped off the U-Haul van. I actually brought it back to the dealer. It’s not like I evaded. I brought it back. So, I bring it back. I go to my old apartment in Downtown Charlotte, and I remember thinking I would be okay. I know by this point that they knew Michael Eckert’s name. They had the address in Charlotte. So, I know by this point, it’s been five, six days. So, I know they’ve tracked him back there. So, I figured if I could get my car, I’m fine. So, I go into the apartment complex, and it’s one of those four-, five-, six-story apartment buildings with one of those parking garages that stack up. So, I go into this parking garage.

(03:43:20)
I’m on the third floor or something. I look at my car and I get in my car. I remember as soon as I drove out of the parking garage, I was like, “I’m good.” So I figure I can go ahead and pull across the street and stop at Starbucks. So, I stop at Starbucks. I walk into Starbucks. I order a Starbucks. I’m standing there waiting for the barista. I look over and it’s two people from the apartment complex staring at me. They’re whispering and pointing, and I remember thinking, “This is the fifth of the month.” I hadn’t paid my rent. I hadn’t been there. So, I thought that makes sense. I’m picturing an eviction notice or a three-day notice on my door or something. I’m like, “Okay.” Then one of them bolts out the back.

(03:44:07)
There’s a guy and a girl. The woman runs out the back. He’s standing there staring at me. I get my venti vanilla latte, my little frou frou drink. So, I got my frou frou drink. I walk out and get into the car. He follows me. I set everything up. I put my seatbelt on. I’m okay. He’s standing there staring at me. I’m thinking, “Something’s wrong. What’s up?” I check to see. There’s no traffic. I’m good. I’m about to leave. He starts screaming, “He’s right here! He’s right here!” I look in the rear-view mirror. There’s two guys running towards the back of my car. I punch it and I take off.

(03:44:49)
Sounds dramatic. It wasn’t that dramatic. There was no cars. I knew there was no cars already pulling out. It wasn’t like a T. J. Hooker, where I jumped over, slid across the hood. They didn’t catch the car and hang onto the back. So, they’re running, and I, boom, hit it.
Lex Fridman
(03:45:05)
Did you spill the coffee?
Matthew Cox
(03:45:08)
No. It was one of those little things. It was actually nice.
Lex Fridman
(03:45:10)
You’re making it sound like you were pretty calm. Weren’t you panicking here?
Matthew Cox
(03:45:13)
I was terrified. Terrified.
Lex Fridman
(03:45:16)
So you’re under fear. You’re still operating-
Matthew Cox
(03:45:21)
Yeah, I operate.
Lex Fridman
(03:45:22)
… calmly.
Matthew Cox
(03:45:22)
It’s funny you say that, because the Secret Service, when they talked to these guys, all the people that they spoke with said the same thing over and over again: “The guy was a professional. He never seemed upset. He never seemed agitated. He was never in a hurry.” And most of the time, I wasn’t, because it wasn’t until the police got involved or the federal law enforcement got involved that I started really getting anxious. So, at that point, I take off. I drive about a mile down the road.
Lex Fridman
(03:46:02)
Who were the two guys, by the way?
Matthew Cox
(03:46:04)
I thought it was FBI. I ordered the Freedom of Information Act when I got to prison at some point in the future, and it was U.S. Marshals.
Lex Fridman
(03:46:14)
It sounds pretty dramatic to me, U.S. Marshals running towards your car, but it’s all right.
Matthew Cox
(03:46:20)
It’s hard not to tell it like it’s dramatic.
Lex Fridman
(03:46:22)
I understand. There’s not much traffic. It goes. Okay.
Matthew Cox
(03:46:25)
It’s not like their fingers were at the back of the car. They’re holding on. But yeah, if I had waited an extra 20 seconds, yeah, they would’ve been on my car. They would’ve been right there at the door.
Lex Fridman
(03:46:35)
Did you consider giving up there or no?
Matthew Cox
(03:46:39)
No. Listen, my instinct is get out, go, go, go, go, go, go.
Lex Fridman
(03:46:44)
You’re already on the run.
Matthew Cox
(03:46:45)
I’m already in trouble. It’s not like they’re going to add anything. Although, to be honest, it only got worse, because actually, at that point, I drive down the road. I stop at a homeless facility. I survey three guys. I’m a mile down the road. Looking back on it, I think, “What were you thinking?” But there were three homeless guys that were in their early 30s, and they were all Caucasian. That’s hard to find. So, trust me, I’ve spent hours before finding these guys.
Lex Fridman
(03:47:15)
So the golden thing you’re looking for is white guys in their 30s.
Matthew Cox
(03:47:20)
Right, because I was in my 30s. I wasn’t an old man, like I am now. So, I surveyed them. I drive straight to Nashville, get to Nashville, drive through an area called Green Hills. Well, first when I got to Nashville, I stayed the night, and the next day I went into… I’m going to say a UPS store. It was actually a Kinko’s. They used to be called Kinko’s.
Lex Fridman
(03:47:48)
I remember Kinko’s. They got bought by FedEx, I feel like.
Matthew Cox
(03:47:52)
Oh, is it FedEx? Okay. Then it was a FedEx store. So, I go in there and you give them like 50 bucks or something, 20 bucks or something, and they’d give you like 100 business cards. So, I get a burner phone, and I call and get a local phone number. I come up with a name, Manufacture Funding Group. I’ve got two phone numbers. I get business cards made. One of the guys that I surveyed, his actual name was Joseph Marion Carter Jr. I went by Carter. So, I get business cards made as Joseph Carter. It took them like an hour to get the cards made. I then drive through Green Hills. I’m planning on going to an apartment, but still I don’t have an ID. I don’t have anything.

(03:48:39)
I’m wondering, “What am I going to do? How am I going to get a place to stay? Am I going to stay in a hotel? What am I doing?” I’m using an ID that the cops are looking for. So, as I’m driving, trying to find this big apartment complex, there’s a guy putting a sign in the front yard of a townhouse, several townhouses, probably in his 60s. I pull in, jump out of the car, and I said, “Hey, is this for rent?” He said, “Yes, it is.” I said, “Oh, okay. Can I see it?” “Sure.” I go in, check it out, come back downstairs. It’s perfect. I said, “Listen, I work for a company, Manufacture Funding Group.” Boom, I hand him the card. I said, “I’ve been in Europe for the last…” I forget what I said.

(03:49:23)
I said, “England, some little town outside of London, whatever, Dexter, for the past five years. I don’t really have any credit.” But I said, “I can put down double the security deposit or whatever you need. Here’s my business card.” He looked at me and he looked at my car and he goes, “You look like an honest young man.” He said, “I’ll take the first month’s rent and deposit.” He said, “I’ll go get a lease right now.” I said, “Oh, okay.” Filled out a lease right then, gave me the keys. Nice. Very trusting in that town.
Lex Fridman
(03:50:02)
Oh, yeah, but there must’ve been also something about you where you just got a nice car.
Matthew Cox
(03:50:09)
You’re going to get a lot of comments saying white privilege.
Lex Fridman
(03:50:12)
I think the charisma has something to do with it.
Matthew Cox
(03:50:18)
Well, I appreciate that. So, he gave me the keys. Listen, I ordered all of Joseph Carter’s vital information, his birth certificate, social security card, everything, that night from a Kinko’s or, I forget where, one of these places. I went online. You could go online back then. There wasn’t WiFi everywhere. So, I ordered the stuff. It shows up a couple days later. I take that information. I go and I get a driver’s license. Within seven or eight days, I’ve got a driver’s license in his name.

(03:51:03)
Then I get in that car, Michael Eckert’s car, and I drive it all the way back to Charlotte. I leave it in long-term parking, get on a plane, fly back to Nashville, and go in and buy myself a brand new car. Well, it wasn’t brand new, it was a couple of years old, but from CarMax. [inaudible 03:51:15] Within two weeks, I am completely 100% set up. I start dating for three, four months. That gets really boring and-
Lex Fridman
(03:51:23)
Where again? In Nashville you said?
Matthew Cox
(03:51:25)
Nashville.
Lex Fridman
(03:51:26)
Okay, got it.
Matthew Cox
(03:51:26)
So I started dating a bunch of chicks and then I end up meeting this one girl.
Lex Fridman
(03:51:30)
By the way, are you lonely here because you’re on the run? Is that-
Matthew Cox
(03:51:32)
Man, listen, I’m telling you right now, being on the run was the best part of my life.
Lex Fridman
(03:51:38)
Really?
Matthew Cox
(03:51:39)
You know how all these guys say, “It was horrible and I was always so concerned and looking over my shoulder and,” it wasn’t, I wasn’t. Keep in mind, I’ve gotten five or six traffic tickets while on the run. I went to traffic school as someone else. I got so many traffic tickets in his name, I went to traffic school as him. If I got pulled over, I’m not concerned.
Lex Fridman
(03:52:00)
So your confidence just was over the top here.
Matthew Cox
(03:52:03)
And I’m driving a vehicle in the name of the driver’s license that I have that was issued by that state. Full coverage insurance. I’m not an idiot. I’m not driving around a stolen car with a broken taillight and a body in the trunk. I’m covered. I’m not concerned about the local cops.
Lex Fridman
(03:52:20)
Plus you’re going to Starbucks, sipping your coffee and driving away from U.S. Marshals [inaudible 03:52:26]-
Matthew Cox
(03:52:26)
Right, right. That was-
Lex Fridman
(03:52:27)
You could start believing that it’s impossible to catch you.
Matthew Cox
(03:52:30)
That is exactly what it is. Every time I just kept getting more and more emboldened, more and more cocky, arrogant. They’re not going to… I’m too good. Which is great until they catch you. And so I meet a girl named Amanda Gardner. Well, what I end up doing is, keep in mind, I’ve only got a hundred thousand or so. So I go and I start buying houses in the area, in this area called J.C. Napier. It’s just close to downtown. And I buy these houses and I start… I buy them for like 60, 70,000, and I record the sales at 210, 190, 205, that sort of thing. Same thing, and I refinance the houses, I start pulling out money.

(03:53:16)
I meet this girl, Amanda Gardner. We hit it off. Within a few months, she’s moved in. I renovate a house in that area and we move in there. I started all over. I borrow, whatever, three and a half million dollars, and I’m buying houses, recording the values. We’re buying-
Lex Fridman
(03:53:43)
Do you tell her about what you’re-
Matthew Cox
(03:53:44)
What she knew was that… It’s odd, right? I have no photographs. Everything I own is brand new. She’s like, “There’s nothing in this house that’s more than four months old, six months old. You have no photographs. You have no internet presence. Every stitch of clothing is brand new. You don’t have old pairs of jeans.”
Lex Fridman
(03:54:08)
Do you tell her stories about the past of any… Is there a fabricated…
Matthew Cox
(03:54:13)
Initially there was a fabricated version: I owned a mortgage company. My typical story was I owned a mortgage company, started doing very well, and got bought out by Household Bank. I have a non-compete clause. I ended up with half a million dollars after paying off all my bills and just decided to travel around the U.S., and now I’m here and I’m going to start renovating houses.
Lex Fridman
(03:54:38)
[inaudible 03:54:40].
Matthew Cox
(03:54:39)
But then, you don’t call home, nobody calls you. Your family doesn’t call you. You tell stories about your mom, your dad, your brother, your sister, friends. I don’t know any of these friends. Never seen any of these friends. They never call you. It’s like, ah, shit. So at some point, I basically just said to her… Look, at one point I had to have a check cut. I refinanced the house and I had, I’m going to say something like, it might’ve been 30,000, but let’s say 20,000. I had a $20,000 check cut to Amanda Gardner, because you have to have these checks. You can’t have them cut to me. So I would say, “Hey, there’s a second mortgage on there,” and I’d provide a second mortgage or I’d provide different things. And I knew I needed names of people to cut these things to. So I had a check cut for whatever.

(03:55:29)
So I remember we’re at dinner one night. This is before she really knows who I am. And I said, “Hey.” I said, “Oh.” And she goes, “Oh, you had a…” She goes, “How’d that thing go, your refinance?” I go, “Oh, thank God you said that.” Boom. I said, “I need you to deposit this.” Give her a check for 20,000. She’s like, “I can go tomorrow and I can deposit it. And I…” And I’m like, “No, no.” I’m like, “Look, it’s fine. Just deposit.” She’s like, “As soon as it clears, I’ll get you a cashier’s check.” I was like, “No, just deposit it and keep it in your bank. It’s fine.” So she’s like, “What is going on?” So we have this conversation and I tell her, “Look, people are looking for me.” “Who?” “Law enforcement.” “Which ones?” “All of them.”

(03:56:14)
She’s like, “That doesn’t even… For what?” I go, “Mostly bank fraud.” And she’s like, “Well, how are they not finding you? People know you, your general contractor,” which I met four months before. This guy, six months before. This one, two months before. She’s like, “So and so, so and so, so…” And I’m like, “Right, right. Well,” I said, “Well…” She’s like, “They’ve got your name, they’ve got your…” I go, “Well, that’s identity theft.” And she was like, “What do you mean?” I said, well, “My name’s not… It’s not Joseph Carter.” “What is your name?” I go, “Look, don’t even worry about it. This is what’s happening. This is where I’m at,” and this has been months into the relationship. This is, I’d say, maybe a month or two in, but she was just too inquisitive and… Oh, I know what it was. She found like $40,000 in cash in my freezer one night.

(03:57:13)
That was another thing that happened. She went to get a Popsicle and she opened up the flap to get a Popsicle, and she opened the wrong one, and there was all cash. And she was like, in this conversation, she’s like, “The other day I opened up the Popsicle box and there’s cash.” And I’m like… So I kind of explain it, but I had a feeling she’s going to be okay with this.
Lex Fridman
(03:57:37)
So she was okay.
Matthew Cox
(03:57:38)
She was okay with it. [inaudible 03:57:40]-
Lex Fridman
(03:57:40)
[inaudible 03:57:40], to me, that’s just a fascinating conversation to have.
Matthew Cox
(03:57:43)
It was a great conversation, but-
Lex Fridman
(03:57:45)
Because oftentimes in relationships, you learn about each other and you find out new things. And here you find out-
Matthew Cox
(03:57:50)
That’s a doozy.
Lex Fridman
(03:57:50)
Yeah, it’s a good one to find out. The name you’re using is not your real name. And the Secret Service, the FBI and everybody else are looking for you.
Matthew Cox
(03:58:02)
Yeah.
Lex Fridman
(03:58:02)
And to be honest, you’re not a violent criminal. So it’s like-
Matthew Cox
(03:58:08)
But she didn’t know my name. She was like, she… And I told her, I said, “Look, if you start digging, if you find out my name, I’ll leave. There’s certain things that catch you. Staying in contact with people that you know, that’s how you get caught. Going back to see people, that’s how you get caught. Telling people who you are, that’s how you get caught.” And I was like, “So I’m Joseph Carter, everything’s fine.” And she was like, “Okay.” And keep in mind too, this girl, oh, your car’s broken or your car’s not doing well, take it and trade it in. We’ll go get you another car. We’ll go get you an Infiniti FX or whatever. A $55,000, $60,000 vehicle. She’s driving the equivalent of a beat-up old Nova. You want to go on vacation, we’ll go on vacation. You want to do this, you want to do that. So we’re buying houses, we’re renovating houses, we’re building brand new houses. We’re buying lots. She’s in the middle of this, like holy Jesus.

(03:59:12)
There’s hundreds of thousands of dollars in the bank, in our bank account. Her bank account. I open up a corporation in her name, she’s opening up bank accounts, there’s websites. It’s a lot. And while this is happening, we start seeing a friend of hers. So this other girl comes in the picture, her name’s Trina, and Trina is semi-lesbian. So-
Lex Fridman
(03:59:48)
Is this like a sexual thing-
Matthew Cox
(03:59:49)
Yeah, so-
Lex Fridman
(03:59:50)
… or actual relationship?
Matthew Cox
(03:59:51)
No, it’s more like she’s coming over a couple times a week.
Lex Fridman
(03:59:54)
Okay.
Matthew Cox
(03:59:55)
So we’ve got tons going on and… [inaudible 04:00:01] put this? So while this is happening, I end up coming out in several magazines. So I’m thinking this whole thing’s dying down, but it’s not dying down because now I just got caught and handcuffed in a bank, walked out of the police station, outran Marshals. Although that part, the Marshal thing, was never in the papers, but the getting caught and handcuffed in the bank, when that hit the papers, that’s everywhere, bro. That’s huge. Suddenly, the Chicago Tribune’s running a series, The Fugitives. I’m in Bloomberg Businessweek. They run an article called Sharks in the Housing Pool. Then you’ve got Fortune magazine comes out with a thing because by now, guess what? Becky’s been caught.
Lex Fridman
(04:00:48)
Oh, Becky.
Matthew Cox
(04:00:49)
Becky.
Lex Fridman
(04:00:51)
Is she in Houston or whatever?
Matthew Cox
(04:00:52)
In Houston, got caught.
Lex Fridman
(04:00:54)
And did she-
Matthew Cox
(04:00:55)
But gangster, bro. The way she, here’s the thing, I-
Lex Fridman
(04:00:59)
Hey. Hey, there you go.
Matthew Cox
(04:01:01)
Oh, no, she told on me immediately.
Lex Fridman
(04:01:02)
Oh, she did?
Matthew Cox
(04:01:03)
Yeah, it’s-
Lex Fridman
(04:01:03)
Oh, no. Oh, no.
Matthew Cox
(04:01:04)
It’s fine. She did the right thing. So here’s what’s funny about that.
Lex Fridman
(04:01:07)
I don’t know about that.
Matthew Cox
(04:01:09)
Here’s what she says.
Lex Fridman
(04:01:09)
Loyalty is everything in this world, my friend. That, you and I disagree on.
Matthew Cox
(04:01:13)
[inaudible 04:01:14]. I just took off. I just took off-
Lex Fridman
(04:01:15)
Still. Still.
Matthew Cox
(04:01:16)
… on her and left her with, listen, with five or $600,000 is what I left her with.
Lex Fridman
(04:01:23)
It’s not all about money, Matthew. It’s also about just ride or die. There’s a meaning to that.
Matthew Cox
(04:01:30)
[inaudible 04:01:31].
Lex Fridman
(04:01:31)
I’m sorry, go ahead. [inaudible 04:01:34].
Matthew Cox
(04:01:33)
So-
Lex Fridman
(04:01:34)
She said everything.
Matthew Cox
(04:01:35)
Well, here’s what, when I say gangster, when she gets caught, they come in, she’s in the middle of beauty school. She’s paid for beauty school, she’s going through beauty school. She’s going to open a salon or something. So she’s in there cutting hair, in a class on a mannequin, and all of a sudden, five or six Secret Service agents come in, guns drawn, screaming, get on the ground, get on the ground. She said, everybody dropped to the ground. She goes, “I’m sitting there with scissors going…”

(04:02:04)
They grab her, they handcuff her, they bring her in, and the whole time… Now at that point, her name was Rebecca Hickey. She went by Becca. So she’s Rebecca Hickey, she’s got a Texas driver’s license, the whole thing. And they’re screaming at her, and they put her in the car, and they’re driving the whole way. The Secret Service agent told me, “45 minutes, she’s telling us, you’re losing your job, bro. You’re losing…” He’s like, “I couldn’t believe it. We’ve got pictures of her.” We’re like, “This is you.” She’s like, “That’s not me. Are you insane? Look at that chubby little thing.” [inaudible 04:02:43]-

(04:02:42)
Would not budge until they actually put her hand on the scanner and she goes, “Okay, I’m Rebecca Hauck. What do you need?” They’re like, “Where’s Matt Cox?” She’s like, “I have no idea. That fucker left me like a year ago.” So-
Lex Fridman
(04:02:58)
But she contributed to the story, to the legend that’s already growing.
Matthew Cox
(04:03:02)
Because she was interviewed by Fortune magazine and it was horrendous. The article is horrendous. He was abusive. He’s a Don Juan that forced me to fall in love with him, commit mortgage fraud, and then took all the money and left. By the way, they found 40 or 50 grand on her and maybe another 30 or 40 in her bank account, and no other money.
Lex Fridman
(04:03:29)
Yeah.
Matthew Cox
(04:03:30)
Where’s the other money? So anyway, by the way, here’s how she got caught. She was in communication with her family. So she’s talking to her mom.
Lex Fridman
(04:03:40)
That’s [inaudible 04:03:41] she got caught ultimately.
Matthew Cox
(04:03:42)
And her mother, through multiple conversations, one conversation being, “Mom, I’m doing fine. I can’t tell you where I am exactly, but I’m in Houston, Texas. I’m fine.” Next one, six months later, “I enrolled in beauty school.” Houston, Texas Beauty School. How many are there?
Lex Fridman
(04:03:59)
Yeah.
Matthew Cox
(04:03:59)
And her mom, bipolar. I just want to see my daughter.
Lex Fridman
(04:04:04)
Yep.
Matthew Cox
(04:04:04)
I’m going to call the Secret Service.
Lex Fridman
(04:04:05)
Yep.
Matthew Cox
(04:04:06)
I’m doing the right thing.
Lex Fridman
(04:04:07)
Yeah.
Matthew Cox
(04:04:08)
And honestly, she is doing the right thing.

Getting arrested

Lex Fridman
(04:04:11)
So you’re getting more and more famous-
Matthew Cox
(04:04:13)
It’s bad.
Lex Fridman
(04:04:14)
… nationally.
Matthew Cox
(04:04:15)
Right, so I’ve got all these [inaudible 04:04:17]-
Lex Fridman
(04:04:18)
You’re having a threesome with Amanda and Trina.
Matthew Cox
(04:04:20)
And what ends up happening is we end up going… And listen, Amanda and I, we’ve gone to Greece, Italy, Croatia. We’re going on multiple trips. And remember we had just gotten back from a 10-day cruise of the Greek Isles. And we get home and Amanda goes online and there’s a blog about Dateline, about one of their new specials called the Thief of Hearts, and that’s me. Apparently I’m the Thief of Hearts, and I am apparently going around, and it’s based on Becky’s story, that I’m wooing women to commit fraud, stealing all the money and then leaving them to hold the bag.

(04:05:19)
Well, they interviewed her. They’re interviewing multiple people, in my case, they’re putting together an episode. It’s going to be released in a month or so. So I’m terrified. At this point, I’ve been on the run three years, and I’m like… There’s lots of things I could care less about. Fortune, I don’t know anybody that reads Fortune. Bloomberg, come on, I’m hanging out with contractors and laborers and I’m not hanging out with these guys. So local news, who cares. Even local news channels, I don’t care. But Dateline, there weren’t 400 channels back then. So Dateline comes out, even if you don’t see it the first time, they’re going to rerun it in three months, or six months, or 10 years from now, they might rerun it again. My face is going to be on it, so I could be perfectly fine, and then five years from now, one day, the barista that I go to every other day looks at Dateline and goes, “Oh my god, that’s Mr. Johnson,” or, “That’s Mr. Thomas,” or whatever.

(04:06:24)
So the point is that I was like, “Yeah, I got to go. I can’t stay here. I got to get out of the country.” So I was going to go to… Well, we really started doing research and Amanda ended up saying, “Australia.” Australia, at the time, I don’t know how it is now, but at the time, if you went to Australia with a hundred thousand dollars and a business plan, you could become a permanent resident alien. You can’t vote, but you can buy property, you can open a business, but you can’t get a job. And they didn’t require fingerprints. So there’s no criminal background check. Now, if you wanted to be a citizen, you have to get an FBI criminal background check. [inaudible 04:07:07]. No, I’m good. So I was like, “Wow, I can go there and start a business,” and I’m going to show up with a couple million.

(04:07:14)
So what we do is we start refinancing houses, we start pulling out money as quick as we can. I’m asking guys, laborers, guys that I work with, my general contractor, my real estate agent, “Hey man, can you cash this check for six grand?” Nobody says no, everybody, yeah, no problem, no problem. A few guys are like, “Yeah, man, if you give me 10%,” yeah, I’ll give you 10%. So that’s happening. We’re pulling out cash. One day Amanda gives Trina a bunch of checks and asks her to cash them. That sparks a conversation about what was happening. She confides in… By this point, by the way, Amanda knows who I am.

(04:08:02)
So by this point, she’s actually come across the letter that I wrote to my parents when I left Tampa. So she’s figured out who I am. She tells Trina, “His name’s Matt Cox, Dateline’s coming out, we’re leaving. We got to get a bunch of cash.” And Trina goes, “Okay, I’ll cash the checks,” and what she does instead is she calls the Secret Service. They watch my house for three days, I come home one day, they pull the cars up… And they arrest me. So it’s a little bit longer than that, but that’s a short version of me getting arrested. And I’ve probably skipped over a whole [inaudible 04:08:43]-
Lex Fridman
(04:08:42)
So simple, because you’ve gotten away with much more complex situations.
Matthew Cox
(04:08:48)
It’s women, man. It’s women. Just joking.
Lex Fridman
(04:08:53)
They also are the thing that make life worthwhile.
Matthew Cox
(04:08:56)
Listen, God bless Trina, she did the right thing. Honestly, based on-
Lex Fridman
(04:09:01)
There you go, back to the right thing.
Matthew Cox
(04:09:02)
But based on what she saw, based on what the Secret Service told her and the articles that she’s reading, I’m a bad guy. I’m a bad guy in general, so I don’t deserve loyalty. I don’t think so. I’m ripping people off and she’s thinking that her friend is in danger. The FBI is saying I have a weapon, he’s dangerous, we believe he’s armed and dangerous. When I was in Florida, I had a concealed weapons permit, but I had gotten rid of both my guns when I was placed on probation. I’ve never had one since. I’ve never touched a gun since. But they used that to say, they said, “Oh, he had a concealed weapon permit. Okay, well then he’s armed and dangerous.” There’s these little things and things they’re telling her, “Read this article. Look, he forces girls to fall in love with him. That’s what he’s going to do to your friend.” So she negotiated also. I think she got $10,000, which is embarrassing. I’m ashamed that she got $10,000.
Lex Fridman
(04:10:02)
And said everything.
Matthew Cox
(04:10:03)
Yeah, and told them, “This is where he is. His name is Joseph Carter. This is where he is.” They watch it, they grab me, they arrest me. They bring me downtown.
Lex Fridman
(04:10:16)
What did you feel like when you got-
Matthew Cox
(04:10:18)
I didn’t feel good, bro. It was bad. It was a bad day. It was a bad day. First of all, Casino Royale was coming out on Friday. It was the first Daniel Craig as James Bond.
Lex Fridman
(04:10:33)
That was the first, yeah.
Matthew Cox
(04:10:34)
And the whole week I’d been telling Amanda, “I’m going to go see Casino Royale.” She goes, “Okay, well on Saturday we’re going to go to the festival.” I go, “That’s fine, but on Friday, Casino.” And she’s like, “Right, Casino Royale.” And then she’s like, “Okay, by the way, on Thursday I thought we could go to dinner.” That’s fine, but on Friday, Casino Royale. And when they put the handcuffs on me, you want to know the first thing I thought of? I’m not going to fucking get to see Casino Royale. I’m not going to get to see it, not going to see it. And I saw it about five, six years later, it went on the institution’s movie channel. It was nice. It’s not the same, but, yeah.

(04:11:08)
So they bring me to Nashville, then they transport me all over the place. I go on Con Air, they fly me to Oklahoma, they fly me to Atlanta. I’m placed in the U.S. Marshals’ holdover. I get assigned an attorney, go in front of the judge, plead not guilty, meet with my attorney. You always plead not guilty. Whenever people say, “Can you believe that he pled not guilty?” Nobody walks in and pleads guilty. You plead not guilty while you figure out what you’re going to do. So I plead not guilty. There’s no bond. Obviously, they caught me. When they caught me I had four or five passports, so that’s no good. They charged me with bank fraud, conspiracy to commit bank fraud, wire fraud, mail fraud, passport fraud, conspiracy… What was the other? Aggravated identity theft, money laundering, use of a fraudulent passport. And there’s like 30 counts of this, 20 counts of this, 20… But none of that matters.

(04:12:25)
Even if you just dropped all the counts to one count and stacked them, it’s like 150 something years, not that [inaudible 04:12:31].
Lex Fridman
(04:12:30)
Yeah, so everything they could [inaudible 04:12:33].
Matthew Cox
(04:12:32)
And that’s what they always say, “You’re looking at 150 years,” and your lawyers, they’re like, “You’re not looking at that. You’re looking at 54 years.” What? It doesn’t matter. It doesn’t matter. Yeah, so my lawyer comes in and sees me one day, our first meeting, and she says, “I’m Millie Dunn.” And she says, “Listen, I’ve looked at everything.” Well, first they say, “You’re responsible for, it’s like 25 or $26 million in loss.” And I’m like, “That’s not true. That’s not true.” And I said, “Not even potential loss. There’s just no way. There’s no way.” And then she comes back and she says, “Well, they’re saying 19 million.” No, it’s not possible, [inaudible 04:13:25], I didn’t, no. So then the FBI is saying 40 million. They’re saying 11.5 in Tampa, plus 40 million for the mortgage company. So it ends up being, plus what I stole on the run, it ends up being like 55 million, but she gets them to drop the 40. That’s just brokers. That’s this, that’s that. Drop it. And they’re like, “He’s so done. It doesn’t matter.” They drop that.

(04:13:49)
So it ends up being 15 million. And then it’s down to what does he owe? They said 9.5, and I got it down to 6 million, which I’m good for. So what ends up happening is they’ve charged me with all these things and she’s like, “Okay, you can plead guilty and you can go with the sentencing guidelines, which is going to be like…” She’s like, “It depends.” She said, “It might be, whatever, 54 years.” She goes, “But if they run them concurrent or consecutive, depending on which one they do,” she said, “Most likely it ends up being 30 years.” It’s no good. That’s not good. So we go back and forth, back and forth and try and figure out what I’m looking at. Now, as we go through the whole thing, she knocks off a bunch of stuff that they’re saying I did, enhancements. Because you’ll have a base level of, let’s say, a level eight. That should be, maybe a few years. But then they start adding on enhancements.

(04:15:11)
What he did, was it sophisticated? Yes. Okay, three levels for sophisticated means. Were there more than… How many victims were there, more than 50 victims? Yes. Okay, that’s six more levels. Okay, did he change the jurisdiction to evade detection? Yes. That’s four more levels. Okay, did he… They start adding, boom, boom. And when you start adding up all those levels, plus your criminal history, and I have a big criminal history because I was already on federal probation and I committed a new crime on federal probation. So that was another enhancement. And this case, so I’m in a category [inaudible 04:15:45], category two or three.

(04:15:47)
So they come back and they’re saying, I forget, it’s like 20… Well, they don’t come back right away, but she ends up saying, “You’re probably looking at 14 years.” Okay, that’s reasonable. That’s reasonable. And so when we get the PSI back, we eventually get what’s called a presentence report. They’re saying 26 years. Well, they really said 32 years. And I argued, and we got it down to 26 years and four months. That’s what it is. It’s 316 months. That’s how they do it, in months, because it doesn’t sting that much, I guess, if you say months.
Lex Fridman
(04:16:30)
Yeah.
Matthew Cox
(04:16:30)
So she says to me, Millie sits down with me and she says, “Listen, you got to cooperate.” And I was like, “Okay.” And she said, “Because you’re guilty. You’re extremely guilty.” She’s like, “You can’t go to trial,” and she said, “So you need to cooperate.” I was like, “Well, what do I get if I cooperate?” And she’s like, “The way it works is you cooperate and you hope for the best.” And I was like, “Are you serious?” She goes, “You tell them everything and you hope for the best.” And she’s like, “Part of the problem is,” she said, “Everybody in Tampa’s cooperated. Rebecca has cooperated. Everyone across the board has cooperated.” She goes, “There’s nobody that hasn’t cooperated.”
Lex Fridman
(04:17:12)
By the way, when you say cooperate, you mean they told, aka snitched.
Matthew Cox
(04:17:17)
Yeah. Right. They came in, they sat down with their lawyer and they said, “This is what he did. He did this, he did that.” They showed them documents, “Yes, yes, yes. That’s my signature. I didn’t know what that was.” Everything was my fault. They didn’t do anything. It was all me. So they’ve all cooperated and they haven’t been charged, they haven’t been indicted. They’re all named as unnamed co-conspirators on my indictment. So I’ve got 12 people, [inaudible 04:17:40] there’s probably 20 people that are involved, but there’s 12 of them that are… So I’ve got all these names, K.B., D.L., C.Y. It’s like, I know who that is. I know who D.W. is, that’s Dave Walker. I know who these people are. And so there’s just a list of them, there’s like 12 of them plus me. Some of them walked in and said, “I’m guilty. I just want to plead guilty.” The girl, Allison, she walked in, said, “I’m tired of waiting for you to come get me.” Walked in with her lawyer and said, “I just want to plead guilty.” And they sentenced her, and she went to jail. She got 36 months or 30 months. She called the prison that… She went to the low security, it was a female prison at the time, female camp. Called the camp and asked if she could come by for a tour before she went. And they went, “Excuse me?” She said, “Well, I’m going to be there for about two years, so I’d like to come in. Is there a tour I can take? Because I like to know where I’m going and what it’s going to be like, how I should prepare.” And they just started laughing. They said, “There’s no tour, sweetie. We’ll give you the tour when you get here.” You got to love that, she-
Lex Fridman
(04:19:00)
Yeah. I mean, [inaudible 04:19:01]-
Matthew Cox
(04:19:02)
I thought I wasn’t prepared. There’s no tour. So Becky got 70 months, but when I got caught and when I was sentenced, they reduced it to 30 or no, to 40 months. They reduced [inaudible 04:19:16]-
Lex Fridman
(04:19:16)
Because she “cooperated.”
Matthew Cox
(04:19:17)
Cooperated.
Lex Fridman
(04:19:19)
That term. Right.
Matthew Cox
(04:19:21)
Do you want to say snitch or ratted?
Lex Fridman
(04:19:23)
Well, there must be… Snitch is too harsh of a word, but yeah, they ratted. You’re saying, I don’t know.
Matthew Cox
(04:19:32)
Well, we can get there. We’ll get there.

Snitching

Lex Fridman
(04:19:33)
All right. All right. So where did the sentencing end up?
Matthew Cox
(04:19:39)
So I should say first on the cooperation subject. My lawyer wanted me to cooperate, and by this point I realized you don’t have a choice. No, that’s not true. I could have been a gangster.
Lex Fridman
(04:19:57)
Yeah.
Matthew Cox
(04:19:57)
[inaudible 04:19:58]-
Lex Fridman
(04:19:57)
What does it mean to be a gangster in this case?
Matthew Cox
(04:20:00)
Like a standup guy. I could have said, “I’ll just take it. Give me 54 years. Go fuck yourself. I’m not going to snitch on nobody.” And I know you look at me and you think, “Tough guy.” I’m not a tough guy at all. I’m not doing 50-some-odd years. I’m not doing it. I don’t want to do 30 years. I was hoping for, I knew it wasn’t possible, but I would’ve settled for another slap on the hand like I got the first time. I really thought I deserved, honestly, when my lawyer asked me, “What do you really think you deserve?” And I thought, “I deserve 10 years. I deserve 10 years.” So she said, “Look, they want to talk to you.” So the FBI… Well, first the Secret Service flies in. They come in and they interview me.
Lex Fridman
(04:20:54)
Who’s more terrifying, FBI, Secret Service?
Matthew Cox
(04:20:57)
The Secret Service was so overwhelmingly professional. The FBI, and really only one of the FBI agents that interviewed me, I don’t know how he’s an agent. I don’t know. He was just ineffective, incompetent.
Lex Fridman
(04:21:16)
Oh. Oh, so it’s a competence issue.
Matthew Cox
(04:21:17)
The other one was Candace.
Lex Fridman
(04:21:20)
Oh, you met her eventually.
Matthew Cox
(04:21:21)
Of course I did. Of course. She’s-
Lex Fridman
(04:21:24)
What was her [inaudible 04:21:25].
Matthew Cox
(04:21:24)
… 5’11”, wearing 3″ heels. She’s a giant and in impeccable shape, attractive. One of the angriest human beings I’ve ever met. And every FBI agent that I’ve met since then that knows her, when I mention her, they all say, “Oh, what’d you think of her?” And I’m like, “What? Why?”
Lex Fridman
(04:21:45)
[inaudible 04:21:45].
Matthew Cox
(04:21:45)
They go… And I was like, “Kind of aggressive.” They go, “Yeah, yeah, yeah. She’s a bulldog.” All of them are like, “Yeah, yeah, she’s something else.”
Lex Fridman
(04:21:54)
Secret Service is a little bit more professional [inaudible 04:21:57]-
Matthew Cox
(04:21:57)
[inaudible 04:21:57], very, it’s their job. It’s like, hey, this is just my job. They’re polite, professional. That’s it. So this is my nine-to-five. But they fly in and they meet with me for three, four days. One of the funny things is that when I first sat down with him, one guy’s name was Dan Brosanskowski or something. So he sits down and he says, “Look, before we get started, we need to talk about something.” And I said, “What?” He said, “We know you’ve hidden money, and we…” And I was like, “What?” And he goes, “We know you’ve got money hidden.” I said, “I don’t have any money hidden. What are you talking about?” And my lawyer’s like, “Do we need to talk?” I’m like, “No, no, no, no. I don’t have nothing. I gave you everything. I gave you all the accounts. You got everything.” And he’s like, “You’re looking at an obstruction charge at this point.” I was like, “I don’t have anything.” And he says, “We know you have money. We know you have money in different identities’ names.” And I go, “What are you talking about?” And he pulls out a bank statement and he slaps it on the counter. And he goes, “You’ve got money in Southern Exchange Bank. You’ve got $190,000 in Southern Exchange Bank.” And I look at it and I went…

(04:23:18)
It was in the name Walter Holcomb, and I went, “Did you call the bank?” He says, “Yeah, we called the bank.” I went, “Okay. Did anybody call you back?” And he said, “Well, no, we’ve left several messages.” I said, “Did you go to [inaudible 04:23:33] bank website?” He goes, “Yeah, I went to the website.” I said, “What’d you think?” And he went, “What do you mean? It was a bank website.” I said, “Yeah, but it was professional, right? It was a professional website.” And he goes, “It’s a bank website.” And I go, “Yeah, but it was well done.” And he goes, “Oh god.” And I go, “Yeah, convincing.”
Lex Fridman
(04:23:50)
[inaudible 04:23:52].
Matthew Cox
(04:23:51)
And I go, “It’s all an illusion,” and I said, “The bank doesn’t exist. It’s a fake bank. I made the bank. Made it when I was in… Not even in Tampa, I think I’d gotten to Nashville when I made it.”
Matthew Cox
(04:24:03)
… not even in Tampa. I think I got into Nashville when I made it and I was like, “Yeah, it’s an… The bank statements…” He’s like, “They’re the color of bank statements.” I’m like, “Yeah, well no shit.” I said, “As a matter of fact,” I said, “Who did you leave a… I haven’t paid for this service in months.” And he turned around and he called it and it went (singing) it was disconnected. And I was like, “How do you not know that’s a bank?” Well, it turns out there was a Southern Exchange Bank and I’d used their bank routing number. I mean, I always thought that was funny, that it was like…

(04:24:34)
Well, I remember for a split second there I was really embarrassed that they caught me. I was like, “Can’t believe this. You’re the Secret Service.” Anyway, I talked to them. As far as the Secret Service is concerned, there’s just not much I can tell them. It was me, Becky’s already told them everything. Amanda’s already told them everything. It’s not hard to track. When they raided my house, they’ve got boxes and boxes, so it’s laid out. It took forever. I still went through everything. I explained how I got the driver’s licenses, how I made the bank statements, how I made the birth certificates, the whole social engineering of figuring out what these little loopholes are. It’s like seven days total with these guys.
Lex Fridman
(04:25:23)
You mean like questioning?
Matthew Cox
(04:25:24)
Yeah, it was like they’d question me all day and then they’d take me back to the Marshals holdover, and then the next morning I wake up and they chain me up again and bring me back.
Lex Fridman
(04:25:33)
What’s that like? What’s that process of questioning like? I mean, you’re somebody who is exceptionally good at conversation; charisma was part of the games you played. Are they good at conversation?
Matthew Cox
(04:25:50)
I mean, the problem is they’re not there to shoot the shit. You see what I’m saying? They have an agenda.
Lex Fridman
(04:25:57)
But they have to use their words to get information out of you. Aren’t they trying to manipulate you?
Matthew Cox
(04:26:03)
[inaudible 04:26:03], I’m not holding anything back.
Lex Fridman
(04:26:05)
Okay.
Matthew Cox
(04:26:07)
It’s not like I’m sparing Jim. Trust me, Jim’s got to go. I mean, you’re looking at 20 some odd years, but Jim can do five. Bill can do some. Tom can do six. I don’t even like Jerry. Jerry can do 20. So I’m ready to cut everybody’s throat.
Lex Fridman
(04:26:27)
But you not guaranteed that you’re getting anything for that.
Matthew Cox
(04:26:30)
Right. In all my time, I’ve seen one time where an inmate got a guarantee to have his sentence reduced, and it was signed by the head of the FBI. Robert Mueller gave it to him, to have a conversation with him. That’s the only time I’ve ever seen that document.
Lex Fridman
(04:26:48)
Okay, so a lot of days with both the Secret Service and the FBI.
Matthew Cox
(04:26:52)
So FBI, Candace was irritated, didn’t like me. And I remember when she took the cuffs off, I was rubbing my wrist. She goes, “Your wrists hurt?” And I go, “Yeah.” And she goes, “Get used to it.” I mean, she was just an asshole, just all around. Not that she didn’t have a right to be, but everybody else was professional.
Lex Fridman
(04:27:09)
Oh, Candace.
Matthew Cox
(04:27:17)
We talked for three or four days with the FBI and they asked a ton of questions. They brought documents. So it’s like, “Hey, who signed this?” It’s like, “Oh, that’s not my signature. That’s so-and-so’s signature,” or, “I signed that. I signed that. I signed that. That’s so-and-so.” “Where’d this check go? Who is this?” “Oh, that’s so-and-so.” You’re looking over everything. One of the things they wanted to know about was, which I never talked about because it seemed so minor, is that I bribed a politician. We got him elected to city council so he could vote to get the lots rezoned. We had bought a hundred vacant lots in Ybor City. They were all zoned single-family; we wanted them zoned multifamily. And so we bribed him and got him elected all-
Lex Fridman
(04:28:05)
That doesn’t seem minor.
Matthew Cox
(04:28:07)
It’s not as sexy as the rest of the stuff.
Lex Fridman
(04:28:09)
That’s pretty… I mean, [inaudible 04:28:12].
Matthew Cox
(04:28:11)
That’s a whole ‘nother thing.
Lex Fridman
(04:28:13)
Yeah, yeah, all right.
Matthew Cox
(04:28:14)
What happened is when they got all of the bank accounts, they see all these checks going to Kevin White, and so they’re like, “Why did James Red donate $500 to Kevin White? Why did Brandon Green donate? Why did Alan Duncan donate? Why did…” So I had to explain to them, “Oh yeah, well, we wanted him to be city councilman, so we gave him a bunch of money so he could run the ad, so he could get elected, so he could then get all of our stuff.” But because he never did, I took off on the run before he was able to do that, and then not too, too long after that, he ended up… About five, six years later, he ended up getting indicted for bribery, but not mine, on somebody else’s case.
Lex Fridman
(04:29:00)
Can I take a small tangent here and ask how many politicians do you think commit crimes? Are they a little bit or a lot criminal?
Matthew Cox
(04:29:09)
I mean, I think there’s some ways that are… They’re seemingly legal.
Lex Fridman
(04:29:15)
The aforementioned gray area.
Matthew Cox
(04:29:17)
Well, that’s not gray. This guy was… at one point I couldn’t find anybody to write $500 checks anymore, so I just gave him cash. I’m just handing him seven, $8,000, $10,000 in cash. But I think most of them have legal ways to make ungodly amounts of money for influence. But is it illegal? No, they’re politicians. They’ve made it so that it’s not illegal. If you really sat down and explained it to someone, the average person would say, “That’s not right.” Oh, no, no, that’s legal.
Lex Fridman
(04:29:58)
Okay. So at the end of these few days, what was the sentencing like?
Matthew Cox
(04:30:05)
Yeah, I go to sentencing. I get my PSI back and it’s 32 years to life. So we argue about it with the prosecutor just before sentencing, and they get it down to 26 years, four months. Then Millie says, “Listen, don’t worry,” because I’m trying to backpedal at this point. I’m like, “I might as well go to trial. If I lost at trial I couldn’t get more than 30.” Well, more than 32 years. Because you can’t get life. 32 was the max. It was just a mistake that it said 32 years to life; you can’t get life. So it was like, the most I can get is 32 years. So I was like, “I’ll go to trial. Might as well go to trial and see if I can get them to reduce some of these enhancements.”

(04:30:51)
She insisted that she could get the enhancements knocked down, and if you actually read the enhancements, some of them didn’t apply to me. So she goes, and I believed her, and I think she made a valid argument. We go to sentencing. My mom’s there, she’s crying. My dad’s there, he’s looking at me like he’s disgusted. And there’s a crowd, a whole bunch of reporters, the whole place is packed. And I plead guilty. Millie gets up, my lawyer gets up and she argues these enhancements. And every single time the judge is like, “I disagree. Overruled.” And it’s like, boom, five more years. Bam, six more years. Bam. Because if she had won the enhancements she argued, I would’ve got 14 years.

(04:31:44)
Now, keep in mind too, a month or two prior to this, the US attorney had called Millie and said, “Look, Dateline…” Dateline had already come out, by the way. Remember I was worried about Dateline coming out? Well, it had come out, but they wanted to do a follow-up because it came out like a month or two after I got arrested. And they were saying, “Hey, we want to recut it with interviews with him.” Well, Gail McKenzie, that’s the US attorney, she wants me to do that. And she says, “I’ll consider that substantial assistance.”

(04:32:20)
Now, when you cooperate with the government, they consider it substantial assistance, that’s what they call it. So I cooperate with you, it’s substantial assistance. She says, “If he’s interviewed by Dateline, we’ll consider it substantial assistance.” And Millie says, “You have to do it.”
Lex Fridman
(04:32:36)
By the way, what’s the idea behind that? That you serve as a warning for others or something like that?
Matthew Cox
(04:32:42)
Yeah, exactly. Because you become a cautionary tale, like, “Don’t let this happen to you.” So I go and I’m interviewed by Dateline, Keith Morrison, or whatever his name is, that guy, “Mr. Cox was…” that guy. So he comes and he interviews me. Becky’s interviewed, I’m interviewed, Amanda’s interviewed, Allison is interviewed, everybody. The Secret Service agent, I think, is interviewed, everybody. The prosecutor’s interviewed. It’s funny, at the time when I watched it, I was like, “That’s not true, and that’s not true, and that, and…” And honestly, it’s like 99% true. Looking back on it, I’m like, “My Audi TT wasn’t blue, it was silver.” It’s just stupid.

(04:33:36)
But anyway, so I’m interviewed by them and they recut it and they air the video. So you said this was substantial assistance. And then the other thing is I was interviewed by the FBI and the Secret Service. Now my lawyer calls the prosecutor the night before sentencing and says, “Look, he was interviewed by Dateline and he was interviewed by the Secret Service and the FBI. You said that if he did that, you’d consider it substantial assistance and you would reduce his sentence. What are you going to ask for his sentence to be tomorrow at sentencing?” And she said, “We did consider it substantial assistance and it’s just not enough.” “What do you mean?” “Nobody was arrested.” “Yes, but what about Dateline?” “Millie, I don’t know what to tell you. It just wasn’t enough.”
Lex Fridman
(04:34:27)
We considered it?
Matthew Cox
(04:34:29)
“We considered it. We will consider it.” And they did consider it.
Lex Fridman
(04:34:33)
Oh, man.
Matthew Cox
(04:34:34)
Yeah, the meaning of words is so important.
Lex Fridman
(04:34:40)
I’m going to use that at some point.
Matthew Cox
(04:34:41)
I’ll consider it.
Lex Fridman
(04:34:41)
I will consider it. I’ll consider it. I considered it…
Matthew Cox
(04:34:44)
It’s not.
Lex Fridman
(04:34:47)
… and still feel the same.
Matthew Cox
(04:34:48)
So she calls me, I’m crushed. And she’s like, “But look, they’re still investigating. They’re going to make these arrests.” And so when you get a sentence reduction at sentencing, it’s called a 5K1. When you get a sentence reduction after sentencing, it’s called a Rule 35. So she said, “We’ll file a Rule 35 as soon as the arrests are made.” Okay, so I go to sentencing and Millie says, “You’re going to get 14 years. I’m going to argue these enhancements.” She argues the enhancements. She loses the enhancements.

(04:35:18)
Not that she’s not an amazing attorney. She’s an amazing attorney. The judge wanted to hammer me. He hammered me. Millie was a great attorney. She was always polite to me. And by the way, to this day, she will answer my phone call. Most public defenders, you call them after your sentence, they don’t answer your call. Great person.
Lex Fridman
(04:35:41)
Thank you, Millie.
Matthew Cox
(04:35:42)
I didn’t give her anything to work with. It’s like I’m a little overwhelmingly guilty. It’s like there’s no defense. So I end up getting sentenced 26 years.

Prison

Lex Fridman
(04:35:55)
That’s a lot of years.
Matthew Cox
(04:35:56)
I would like to tell you that when they gave me the time, I was stoic and I stood there and took it in. But the truth is, I cried like a baby, like a small child. You’ve never seen anyone cry like this in your life. I was just, “How did I get 26? What did I do to get 26 years?” Like murderers, rapists? I’ve met guys that kidnapped people that got 15.
Lex Fridman
(04:36:27)
26.
Matthew Cox
(04:36:31)
So yeah, I…
Lex Fridman
(04:36:33)
Were you scared?
Matthew Cox
(04:36:35)
I mean, does the pope wear a funny hat? Of course I was scared, I was terrified. But I kept telling myself, “They’re going to reduce the sentence. They’ll reduce it, they’ll reduce it, they’ll reduce it. Okay, okay. It’s going to be okay. It’s going to be okay.” But it wasn’t okay. I got moved to Coleman, the Federal Correctional Complex in Coleman, Florida, which is the largest federal complex in the nation.

(04:37:10)
At that time, there was a camp, which was a female camp. There was a low security prison for men, a medium security prison, and two penitentiaries. So I get moved to the medium. Now I’m moved to the medium, not because… That’s where real criminals go, right? I’m a soft, white boy. I’m no danger to anybody. I hurt someone’s feelings once, but other than that, I’m not going to be a problem. But if you have more than 20 years to serve, you have to go to a medium. So even though my security level said this guy should be in a camp, I had 20 years. You can’t go to a camp until you have less than 10.

(04:37:57)
So as soon as I’m given 26 years… They knock off three, but that still leaves you three years above 20, so you go to the medium. So I go to the medium and there are guys getting stabbed. The very first day, people are being stabbed. I get locked into… I go to my cell, meet my cellie. They scream lockdown. Somebody got stabbed in the rec yard. I remember I asked my cellie, who I’d met 20 minutes earlier… He’s like, “Hey, we’ve got to get in the cell.” I was like, “What’s going on?” “Somebody got stabbed in the yard.” And I go, “Somebody just got killed?” And he goes, “Nah, they just stabbed him up a little bit.” And I thought, “Oh my God, you’re in a place where they say stabbed him up a little bit. You’re not prepared for this, bro. You got to get out of here.” Anyway, I go to the medium. I’m there.
Lex Fridman
(04:38:40)
What was the first day and night?
Matthew Cox
(04:38:42)
Remember, I had already been locked up in the county. They’re county jails. They call them US Marshals holdovers, but they’re really county jails. They just keep you with the federal guys. So I’m not mixed in with hobos and people like that. I’m mixed in with the federal people.
Lex Fridman
(04:39:00)
It already felt like a prison?
Matthew Cox
(04:39:02)
Yeah, it’s a prison. I mean, it’s jail, but it’s a prison. Unless you’ve been locked up, you don’t really know the difference. So it’s a jail. Jails suck. Jails are much worse. The whole time I was locked up in the jails, waiting to be sentenced, guys were like, “I just want to get sentenced and go to prison, bro.” And I was like, “Why does everybody keep saying that? Prison’s worse than this. I saw Shawshank. It’s horrible.” And they’re like, “Bro, prison? Listen, in prison I can walk the rec yard. I can go to the movie room, watch movies. Listen, right after count…” There’s a four o’clock count. They count everybody at 4:00.

(04:39:39)
So they are like, “Right after count, I’m going to go to commissary. Somebody’s going to buy me an ice cream. I’m going to be eating an ice cream, walking on the rec yard the first day.” And it’s been months and months and months that I’ve been locked up in this county jail, and I’m thinking, “I want to go to prison. That sounds nice. I’d like an ice cream.”
Lex Fridman
(04:39:57)
But there was a stabbing on the first day, so…
Matthew Cox
(04:40:00)
Yeah. Well, everybody kept telling me I was going to go to a camp. You’re going to go to a camp, you’re going to go to a low.
Lex Fridman
(04:40:05)
I see.
Matthew Cox
(04:40:05)
And honestly, very quickly, I was walking on the rec yard, I was… So I was at the medium. I got there. It’s a real prison with the doors, bam. And they can open the little tray thing and feed you out of the tray, and there’s a stainless steel toilet and sink. And they have that in the county too, but it’s exactly what you think of prison as being.
Lex Fridman
(04:40:30)
But it feels like a fundamentally different experience when it’s 26 years and the door locks, and…
Matthew Cox
(04:40:36)
Yeah. So yeah, I have a cellie, but also they sent me to a prison where tons of guys have 30, 40, 50 years, life sentences. There’s gangsters there, there’s murderers, there’s serial killers, there’s really bad guys. There’s guys that are trying to take advantage of guys, right?
Lex Fridman
(04:41:06)
You mean like sexually?
Matthew Cox
(04:41:07)
Yeah. But by the time I got there, I’d heard all the… How you can get yourself in trouble. Don’t go in somebody else’s cell. You don’t know the guy? You’re not 100% sure? Do not go in his cell. Don’t even go near a cell. Don’t go into places where people can close a door behind you or they can trap you in an area. There’s all these things that I’ve been told not to do.
Lex Fridman
(04:41:34)
Again, for sexual reasons.
Matthew Cox
(04:41:35)
Right, because I’m a small guy in prison.
Lex Fridman
(04:41:40)
Yeah, attractive white dude.
Matthew Cox
(04:41:43)
Yeah, it’s a problem. It’s a problem. This, it’s bad. It’s all bad.
Lex Fridman
(04:41:49)
Well, it’s good in the outside world, but bad in prison.
Matthew Cox
(04:41:53)
Yeah. My fear was they’re going to make me shave my head to make sure that the mop wig fits correctly. But there’s certain things that… I always hate to say this, and this is the simplest way to say it: if you get stabbed in prison, you had it coming. You did something. They’re not running around just stabbing people, you did something. And the things that get you hurt are arguing over the TV, what channel you want to watch. You got 50, 80 guys watching one TV, don’t argue about it. It’s not worth it.

(04:42:27)
Borrowing things and not returning them, that’s a problem. Running up debts, that’s a big problem. Gambling, gossiping, those are the problems. Those things get you hurt. Not being polite, be respectful. I’m super respectful. So I was respectful. Very quickly when I got to Coleman… There are continuing education courses. One of the courses is residential real estate. The guy that was running the residential real estate didn’t want to do it anymore because he was doing legal work and it just was taking too much time. So he came to me and said, “Listen, you just got here. You got a real estate background like nobody else does. Can you take over this class?” And I was like, “Sure.”

(04:43:12)
So I looked at his curriculum, I rewrote it a little bit, and I started teaching a residential real estate class. And at one point I was teaching two classes a semester or a quarter. And these guys loved it. They all think they’re going to get out and flip houses. So I started from the fundamentals. I talk about credit, how to borrow, hard money lenders, different types of… Everything.

(04:43:35)
It’s the first time in my life, this was funny. Not that I think I was really ever in a position for this to happen. This is really odd though. Probably the second or third class when guys are leaving and I’m having to check them off the roll, multiple guys are stopping and saying, “Yo, bro,” putting their hand out and shaking my hand and going, “Good class. It was a good class, bro.” Then I have guys coming to me, telling me, “Hey, what are you teaching these guys?” I go, “What do you mean?” He goes, “My cellie’s telling me he’s going to get out and make millions. ‘I’m taking Cox’s real estate class. I’m telling you I can do this. I’m going to be a millionaire.'” And it’s like this flipping houses, this is not…

(04:44:12)
But the truth is, flipping houses was… What I basically told these guys, especially the drug dealers, right? You’re a drug dealer and you were raised in the projects and you’re going back to the projects. This is the one industry that you will thrive at because you’re a hustler. You’re not afraid. A 45-year-old, divorced, white woman is not going into the Hood knocking on doors to try and flip houses, but you will. And you know everybody in the neighborhood, and you’ll knock on those doors, and you’ll hustle. And you’ve been told no before and you don’t care and you’re not scared, you’re not…

(04:44:49)
And there’s tons of money to be made in lower income areas. And then when I go through the whole thing and how you can leverage your credit to borrow money to get into the property and do the renovations with very little money down, and I do the whole thing, these guys, they loved it. And what that did for me was two things. One, 40 guys show up for the class. And I say, “Look, if you don’t want to go, you don’t want to be here, you’re just here because your counselor’s making you get a certificate, that’s fine. Bring me two coffees and two creamers from commissary and I’ll fill out all your paperwork and you’ll pass. You’ll get a certificate. I don’t have to see you again.” I was loaded up with coffee and creamer, because at least 10 or 15 didn’t want to be there. The other guys seriously wanted to be there. And I didn’t want those guys to be there anyway, they were going to be a problem. So the other guys are serious about it, and some of these guys sat through the class two, three, four times. Some of these guys got out and sent me money, which is a huge sign of respect, by the way. Because they don’t owe me anything. But I did that and I taught GED because you have to do something for money.

(04:46:04)
And I met a bunch of cool guys and I was hanging out and I was doing well. And after about three years, they transferred me to the low security prison. At this point the FBI starts showing up, asking me questions. They asked me questions about the politician I bribed. The statute of limitations was up and they were trying to tie him into the bank fraud. Because his name was Kevin White, and one of my guys’ names was Michael Kevin White, and so they were trying to tie him in. “Did he know about it? Because if he knew about it, the statute of limitations is 10 years. We could…” “No, he didn’t know.” Should’ve thrown him in there. Because a couple of years later, he gets indicted. He ends up going to jail anyway.
Lex Fridman
(04:46:58)
And it could’ve decreased your sentence.
Matthew Cox
(04:47:00)
Yeah. Listen, listen, stop. Stop. Oh my God.
Lex Fridman
(04:47:07)
I got all my judgment out after the homeless conversation.
Matthew Cox
(04:47:10)
Listen, it’s only going to get worse.
Lex Fridman
(04:47:14)
I mean, I really appreciate your honesty and your insight about snitching, honestly, that I have a sense that there’s at least a desire for loyalty in the world.
Matthew Cox
(04:47:28)
Wouldn’t that be nice?
Lex Fridman
(04:47:30)
Did you ever feel in danger in medium or low?
Matthew Cox
(04:47:36)
It’s funny, I had more problems probably at the low than I did at the medium. But at the medium, the only thing that happened was an article came out in the newspaper when I was there. It came out and said… Because they’re still investigating things. So this article comes out and I’m on the front page of the St. Petersburg Times. It was about the politician. Big article, and in the article they interviewed Millie, my lawyer, and she says, “Well, when Mr. Cox was being interviewed by the FBI, one of the first things they wanted to know about was this politician.” So she just said, “Mr. Cox was being interviewed by the FBI.”

(04:48:27)
So I immediately get taken into custody and they put me in the SHU, the hole, for my own protection, and I’m there for like 45 days. Then after 45 days, they’re like, “Cox, what do you want us to do? You want us to ship you?” I was like, “No, put me back on the compound.” I’m like, “Half the guys here cooperated.” And he goes, “Yeah, it’s more than half.” This is the guy from SIS, which is their internal security.
Lex Fridman
(04:48:50)
So that’s when he told you that it’s actually a much higher percentage, but-
Matthew Cox
(04:48:53)
Right, he said, “But a hundred percent of them are lying about it.” He said, “You just came out in the newspaper.” I go, “Man, I’m not concerned.” “If you are concerned, you got to come immediately to the lieutenant’s office and tell us, we’ll ship you.” I said, “Okay.” I get out there, people are looking at me and, “What’s up?” But I don’t have a lot of friends anyway. I don’t come there to make friends.

(04:49:14)
So at one point, this one guy comes to me. I’m walking the yard probably two days later, after I get back on the compound, I’m walking. Guy comes to me, he has a goatee, and it comes down here, and he’s got a little skull thing he had made, whittled out of wood or something, and definitely looks scary. So I’m walking and he stopped, he goes, “Hey, Cox.” I’ve never talked to these guys. I had been there for a year or so and never talked to any of these guys. They’re all like bikers and Aryan Brotherhood. And so I’m like, “Yeah, what’s up?” He said, “Bubba.” Bubba’s their leader. He goes, “Bubba told me to tell you not to walk the yard. He don’t want to see you out in the yard.” And I went, “Okay.” I said, “Well, I’m going to walk the yard tonight.” I said, “And if I get the shit kicked out of me, then I get the shit kicked out of me, but-“
Lex Fridman
(04:50:02)
But did you talk back to a guy with a wooden skull hanging off his beard?
Matthew Cox
(04:50:06)
I did, but you know what? It was right in front of the guard shack, and there were guards in the guard shack. They’re 20 feet away.
Lex Fridman
(04:50:13)
Really, you weren’t scared?
Matthew Cox
(04:50:14)
I mean, I think I just got numb. I’m not stupid, but I’m walking around. I was scared from the moment I got there on, if that makes sense. So you get to a point where you’re just numb and you’re waiting for it. Especially when I got out of the SHU. Got out of the SHU, I went straight to my cell, laid down. A couple of minutes later it was lockdown, they closed the doors. I wake up the next morning, I go to chow, I go to my job, it starts all over again. So I had a very packed routine. Although there’s guys everywhere, and I’m thinking at some point I might just be walking around and a guy might walk up and just smash me in the head, but it didn’t happen.

(04:50:53)
And it’s not that guys aren’t getting stabbed, but they’ve got it coming. I didn’t tell on anybody here. I didn’t do anything. It’s not that on other yards I might not have gotten smashed, but I didn’t get smashed. And I’d been there a while and I taught the real estate class, and everybody wanted to take real estate. So I think that insulated me to a degree. I also had made a few friends there, and I think they were probably also putting out the word, like, “Bro, cut this guy a break.”

(04:51:20)
So I’m walking across and I tell the guy, I said, “Look man,” and I wasn’t rude to him. He wasn’t even rude to me, really. He said, “Don’t walk the yard anymore. Bubba doesn’t want you walking the yard.” I said, “Well, listen, I’m going to go to chow and then I’m going to go out there tonight and walk the yard, and if I get smashed, I get smashed.” I go, “Because I got 26 years and I cannot walk around for the next 26 years, not going on the yard.” I said, “So I’m going to be there, and if that happens, then that happens.” And he looked at me and he goes, “Man, I don’t give a fuck what you do. That’s what Bubba told me to tell you.” He said, “I told you.” And he goes, “I don’t give a shit what you do,” and he walked off.

(04:51:54)
I went out there that night with a buddy of mine named Zach, a guy named John Gordon, with my cousin and a couple of his buddies. We walked the track for about an hour. Bubba and a group of his guys stood there and looked at us, and as we walked, probably closest we got to them was 30 or 40 feet. That went on for 30 minutes and then they broke up and went their separate ways.

(04:52:15)
There was a couple of times where I would go to the chow hall and I would go and I’d be sitting at a table and Bubba would walk up and tell the other guys at the table, “I want to let you guys know you’re…” He didn’t even call me a snitch. He said, “You’re sitting with a cooperating witness.” He said, “If that’s how you want to roll,” he said, “You ain’t going to be rolling with us if there’s any trouble.” And then they all looked at me and they got their plate and they moved off. He didn’t tell me to move. And he could’ve walked up and said, “This is a snitch motherfucker.” He didn’t do that. Bubba was very respectful. As respectful as you could be [inaudible 04:52:48].
Lex Fridman
(04:52:47)
Whatever you want to say about Bubba, he was a respectful man. You ever talk to him directly?
Matthew Cox
(04:52:52)
Never had a conversation with him. So that went on, but I mean, when I say that went on, I mean literally that’s a couple of times. He said the same thing to a guy in line one time. Guy came up to me later and said, “Look, man, I’m sorry, Matt.” He was standing next to me in line. Bubba said something to him. He went like 10 or 15 people back and stood in line. Later on he came up to me, “Matt, I’m sorry bro, but blah, blah, blah.” I said, “Bro,” I said, “Look, I get it. We’re not friends, don’t worry about it.”

War dogs


(04:53:18)
And here’s the thing. At some point there, I ended up getting… Well, the FBI started showing up at the prison, questioning me about my files in Tampa, that [inaudible 04:53:33] of the 12 guys that were indicted?
Lex Fridman
(04:53:34)
Mm-hmm.
Matthew Cox
(04:53:34)
They show up and they start asking me about it. And so they’re still working it. Well, at the same time, I end up getting moved to the low security prison. I get to the low security prison, they show up over and over again. But at some point they come to me and they say, “Look, we went to the US attorney. We presented everything we have. We have enough to indict all of these guys.” I think it was whittled down to maybe eight instead of 12. And they said, “Look, the entire economy is melting down. At this point some of these are four, five years old. We’ve got banks that are melting down right now. We’ve got 100, 200, 300 million, 500, half a billion dollar banks that we’re investigating. We don’t have time to deal with this. We’re not going to indict those people.” So they get away. The agent I was working with, her name was Leslie Nelson, very nice person. She came… she actually didn’t have to do this… came to the prison to tell me this is what happened. And when she’d first come to see me, I told her, “Listen, I want to do all this, but no matter what happens, I need you to write me a letter. If they don’t indict these people, I need you to write me a letter that I can present to the US attorney on my behalf, that I did everything I could.” And she goes, “I’ll do that. That’s not going to happen. We’re going to get the indictments and everything.” I was like, “Okay.”

(04:55:05)
So of course, a year later, she shows up after nothing happens and they’ve dropped the case. She shows up and she tells me what happened, and they’re not going to do it. And I go, “Do you remember that you…” She goes, “I’ve got the letter right now.” Gave me the letter. She was like, “That’s it.” Great letter. It says, “Mr. Cox has worked, blah, blah, blah. He’s done this, this, this,” great. And it even said, “He deserves a reduction in my opinion, blah, blah, blah.” But nobody was arrested.

(04:55:33)
So I call my public defender, I call Millie, I explain it to her, and she starts crying and she’s sorry. And, “Well, what are we going to do?” “Well, there’s nothing you can do. You’re time barred.” You have one year to file a 2255, which is to say that your lawyer is ineffective or that the court has made a mistake in some way. And it had been over a year, it had been years. It’d been like four years. And she’s like, “Yeah, I mean, there’s nothing you can do.” And she’s in tears, and I feel like I’m done. At that point I’m done.

(04:56:14)
And what I do is I start writing a book. I write my memoir. And this is not a shameless plug for my memoir, by the way, which is amazing. Just saying. But so what happens is I actually write it. I write it, and then I have to rewrite it because I don’t really know what I’m doing. And I’ve been reading true crime and that sort of thing. And I’ve always liked true crime. I get a literary agent, comes to see me, tells me I have to rewrite some stuff. We rewrite it. As I’m finishing up my memoir, there’s a guy that comes on the compound, and his name is Efraim Diveroli. Efraim Diveroli and his business partner, a guy named David Packouz, were selling munitions, AK-47 rounds.
Matthew Cox
(04:57:03)
… selling munitions, AK-47 rounds, really tons of munitions. But they got in trouble with this, and they were selling them to the US government for the Afghan security forces. And there had been an article in Rolling Stone magazine about him, and I’d read it, and somebody points him out and says, “Hey, that’s that guy.” And I went up to him, I said, “Hey, bro, you just got here?” He’s like, “Yeah.” And I said, “Look, if you want to write a memoir or anything, I’m finishing my memoir. I can always help you. I can help write an outline. You can get a professional writer, whatever you need.” He’s like, “Yeah, all right.” Efraim Diveroli was played by Jonah Hill in the movie War Dogs. So a few months later, he comes to me and says, “Hey, they sold the movie rights.” I was like, “Oh wow, that’s great.”

(04:57:46)
And I’m like, “You don’t want to write a memoir?” And he’s like, “Yeah, man. It was sold to the guys from the Hangover movie.” And I was like, “So the guys from the Hangover movie are going to make a movie about you? You understand, they’re going to call it, like, Dude, Where’s My Hand Grenade? And you’re going to be Spicoli from Fast Times at Ridgemont High. You’re going to be a joke, all because you don’t want to write a memoir and get your version out there.” And he was like, “Holy shit.” So I ended up writing an outline for him. We worked together, and then he asked, “Can I read your book?” And I was like, “Sure.” And I give it to him and he reads it and he comes back and he said, “Bro, this is the best thing I’ve ever read in my life.” And to be honest, I later found out he’d read about three books in his entire life, but still it was very nice…
Lex Fridman
(04:58:30)
[inaudible 04:58:31] still the other two.
Matthew Cox
(04:58:32)
So he asked me if I’ll write his book, I write his book. We work out a deal and we do that. And I’m saying all this because I basically settle in. I’m done. I’m going to do 26 years.
Lex Fridman
(04:58:46)
By the way, just on a small tangent, how did you know you’d be good at writing?
Matthew Cox
(04:58:52)
I had written a manuscript prior to even taking off on the run. I used to listen to John Grisham books. I would listen to him in the car. I liked John Grisham books, and I’d actually written a manuscript about a mortgage broker. He writes about lawyers, and it’s like, look, being a lawyer is not exciting. If you can make that sound exciting, I can make being a mortgage broker sound exciting. And I wrote a book, put it in my desk, and the FBI found it and they said, “Oh, it’s a blueprint to the fraud that he’s going to commit.” It wasn’t. Stop. That character was as much me as John Grisham’s characters are him.
Lex Fridman
(04:59:31)
But it’s still interesting that John Grisham didn’t…
Matthew Cox
(04:59:35)
Right. I mean, if John Grisham did something similar to what one of the…
Lex Fridman
(04:59:40)
Yeah, I saw a quote somewhere that the criminal is a true artist and the detective is merely a critic. Something like that. Does that resonate with you or not?
Matthew Cox
(04:59:55)
I’ll have to look that up.
Lex Fridman
(04:59:56)
Okay, so you already knew you could write?

Frank Amodeo

Matthew Cox
(04:59:58)
Well, I knew I liked it, but yeah, I think I got better and better at it. I mean, as you’re writing… And they had creative writing classes in prison at the low. The low was a much different breed of animal. You could very easily get hurt… you could get hurt either place, but at the medium there were guys with life sentences who had been working out for 20 years and were just super angry. And if you got hurt at the medium, it would probably go really bad, as opposed to if you got hurt at the low, it’s more like a fistfight in high school, with knives. So anyway, so I am there. I’m writing, I’m doing that. And there was a guy that came on the compound about that same time. His name was Frank Amodeo. Frank Amodeo is a rapid-cycling bipolar with features of schizophrenia.
Lex Fridman
(05:00:56)
Rapid-cycling bipolar with features of schizophrenia.
Matthew Cox
(05:01:01)
It’s just constant, right? And so there are moments in his manic state where his recurring psychosis, I guess, comes out. He believes, and has believed since he was in his early teens, that he’s preordained by God to be emperor of the world. He’s a lawyer, disbarred. Stole close to $200 million from the federal government. They gave him 22 years and they sent him to Coleman, but it doesn’t… This is the part I love. The delusions don’t affect his legal work. It doesn’t say a ton for the legal community, but…
Lex Fridman
(05:01:41)
How do you know he’s delusional? I’m just asking questions.
Matthew Cox
(05:01:43)
Yeah, he is, trust me. I mean, it’s not me. It’s like the transcripts, the lawyers, the doctors. There’s a ton of it. And then if you saw him in action, you’d be like, “Oh, wow.” He would be completely normal. He would be having a completely normal conversation and somebody would say something and he’d go, “That makes me so angry. I am not going to let them do that. When my legions march on Washington, we are going to burn the Constitution and the president will kneel at my feet.” And he goes, “I’m going to need your transcripts. I’m going to need a 2255 form. We’re going to file a…”

(05:02:35)
And everybody would sit there and be like, “Okay, Frank, I’ll get to this and I’ll get…” It was insane. It was the most insane… He was basically running a medium-sized law firm from inside of the prison. He was training people. He taught the legal research class and was training people on how to do legal research in prison, how to put together motions, how to fight their cases, how to do the research, how to type them up. Everything. It’s like a law school. He’s teaching these guys… Listen, they made such a mistake locking this guy up.
Lex Fridman
(05:03:11)
So he’s a great lawyer.
Matthew Cox
(05:03:12)
Listen, it’s going to get worse. It’s going to get worse. Because here’s what happens: at this point, I don’t talk to him for probably a year or so because everybody’s saying he’s crazy. And for a year, he’s there, he’s drooling out of the side of his mouth. They’ve got him on a ton of medication. It takes him about a year to get them to take him off the medication, and then he starts stabilizing his mood by drinking Pepsi. I know. I know it’s crazy. I see you looking at me like this guy’s delusional. I know. So at some point, one of my buddies comes to me and says, “Look, you got to go talk to Frank.” Here’s the other thing. Over the course of a year or two, he starts doing legal work for guys, he starts just taking on guys’ cases. “I’ll do the motion, I’ll do your legal work, I’ll do this.” Keeps him busy. But suddenly you start hearing people get released.

(05:04:08)
Jimmy just got 10 years knocked off his sentence. He’s going to halfway house next month. Tom got an immediate release. Frank’s walking people up to R&D, shaking their hands. Guys are walking up to him in tears, crying. And so crazy or not, what choice do I have? I called three different lawyers on the street and said, “This is what happened. What can I do? What can I do?” They told me to do this and this and this, and I worked with them, and then they decided not to proceed, and what can I do? And they said, “You’re hit, bro. There’s nothing you can do. In the 11th circuit, you cannot force them to file a reduction on your behalf. You cannot do it. It’s impossible. You’re hit. You’re done. It’s over. I’d love to take your money, Mr. Cox, but it’s not going to happen. I’m not just going to take your money. You’re going to lose.” Three different lawyers.

(05:05:03)
I talked to Irti’s lawyer, who told me, “Bro, it’s not going to happen. It’s over.” So my buddy says, “Go talk to Frank.” I said, “Well, why wouldn’t I? I’ve got nothing else to lose.” So I go talk to Frank. He actually has a little manic moment, that little thing that I just showed you. That’s exactly what he said the first time I talked to him.
Lex Fridman
(05:05:24)
Based on your case?
Matthew Cox
(05:05:25)
Yes, “I won’t let this happen.” He’s like, “I’m going to need your transcripts. I’m going to need you to get this. I need to see your indictment. I’m going to need your presentence report. I’m going to need…” I was like, “Okay.” And I turned to my buddy. He’s like, “Bro, I know. I know what you’re thinking. It’s fine.” It’s fucking crazy. And he’s like, “I understand. What choice do you have?” I was like, “Fuck.” So Frank files a 2255 motion on my behalf stating that I’m not time-barred, that Millie was… We file it against Millie, stating that she was ineffective, that she didn’t understand the law, that she had me plead to something because she thought I could get a reduction simply for doing Dateline. Oh, by the way, when I was in the medium, the government came to me and asked me to be interviewed by American Greed. I do that. I’m interviewed. And they get me on the phone, they talk to me, everything. The prosecutor wants me to do it. She’s re-interviewed, everybody’s re-interviewed.

(05:06:23)
It airs. Millie goes to the government, says, “Look, reduce the sentence.” They go, “No, Millie, it’s not enough.” Then they come to me and they ask me to write an ethics and fraud course. I write an ethics and fraud course. The guy I write the course with flies up to Atlanta. I think he drove up, but he goes up to Atlanta, he talks with the US attorney, talks to Millie. She insists, “If he does this, I will reduce his sentence. I will definitely consider this.” Definitely consider. And then we do it. It’s being used all over the nation. Not enough. At this point, I go to Frank. I tell Frank what’s happening. Frank says, “Yeah…” He goes, “Every time they asked you to do something, it reset the time bar. You have a year from that time to file a 2255.”

(05:07:11)
Now, he insists that that was a viable argument. Nobody else does. But he said, “I’m not going to let them do this. I’m going to take care of this. I’m going to get your sentence reduced.” Okay. “Emperor. Okay, Emperor.” So he was a character. Anyway, so he files a 2255. The government comes back, they say, “He’s time-barred.” Frank comes back, they answer his motion, he files a retort. It just goes back and forth. This goes [inaudible 05:07:46] for six months to a year. And at some point, I go to mail call, and they call my name and they hand me this thing, and I open it up, and it says the government’s filed a motion for a stay, and they want the court to appoint me a lawyer and to discuss filing a Rule 35, reducing my sentence. And I’m like… I read it, but I couldn’t even understand it.

(05:08:15)
I don’t understand. So I mean, I rushed to go find Frank. I show it to Frank and he says, “Yeah, they’re staying it. They’re going to send you a lawyer and you’re going to negotiate for how much they’re going to reduce your sentence.” He says, “It’s perfect.” So they fly this woman down, her name was Esther Panitch. She flies down, comes to the visitation room, they bring me there, the lawyer’s room, whatever they call it. And so we’re sitting there, and I remember we’re talking, and she says, “Listen, your motion, your 2255 is written well, but honestly, you don’t have much of a prayer, and they’re offering you a one-level reduction, which is 30 months.” And I went, “Oh, that’s not enough.” And she said, “Well, I don’t know what to tell you.” She said, “They’re willing to bring you back.” And I was like, “Well…” I mean, I don’t know.

(05:09:12)
I go to talk to Frank. Frank said I deserve this many levels, and we’re going back and forth. She says, “Who’s Frank?” And I go, “Frank’s the guy that’s doing all my legal work.” She goes, “He didn’t write all this.” And I was like, [inaudible 05:09:20], “No, who wrote…?” And I explained it to her and she’s like, “He’s an inmate?” And I was like, “Yeah.” And she says, “Why is he here?” And I tell her, “Well, he stole a bunch of money from the federal government because he’s trying to take over the world.” So I tell her that whole thing. And she’s like, “You’re letting a mentally incompetent person do your legal work.” And I was like, “Yeah, because all the competent attorneys wouldn’t do it. They said I didn’t have a prayer. Your people said I didn’t have a prayer.” And I said, “Frank said he could get this done.”

(05:09:50)
And she’s like, “Well, I mean, I don’t even know why they’re offering you one level.” I was like, “Well, Frank said.” And I’m like, Frank this, Frank that, [inaudible 05:10:00] ended up saying, she’s like, “You’re taking advice from a legally incompetent person.” I said, “Yeah.” She said, “You really don’t have a prayer.” I said, “Then why are you here? If they could crush me so easily, why are you here? They’re giving me one level. Let me talk to Frank. I’ll let you know what we’re going to do.” So I leave. I call her a couple of days later, and I tell her… I talked to Frank. Frank said, “Go back. Go back and argue for more. I think the judge is going to give you more. He’s going to give you at least,” whatever he said, “six or seven levels or something.”

(05:10:33)
So I get moved all the way back to Atlanta. The FBI agent comes to talk on my behalf. Multiple people show up to talk on my behalf. They say… Millie, who I filed the 2255 against, so I’m basically saying, “You’re ineffective, you’re incompetent,” but she knows the game. She’s like, “I get it.” She gets on the stand and testifies for me. So the judge goes, “Listen…” I think we were asking for nine levels or something outrageous. The prosecutor starts arguing for one level. And he said, “Listen, one level is not nearly enough for what Mr. Cox has done.” He said, “Mr. Cox, I know you’re arguing for nine levels off. You’re [inaudible 05:11:22].” He goes, “That was never going to happen.” I was like… It felt like I got slapped. He said, “Three levels. I’m going to go with three levels.” He goes, “Which is seven years.” He said, “For somebody who has no arrest associated with his case, I think it’s pretty good,” and that’s [inaudible 05:11:44] judgment and blah, blah, blah, blah, blah. And he puts the gavel down and walks off, and that’s it. It’s over, I get seven years. I was hoping for more. So I get moved back to Coleman. I get moved back to Coleman, and I go up to Frank and I said, “Frank, I got seven years off.” And he’s like, “I know.” I said, “And I don’t mean to sound unappreciative, but I was hoping for more.” He goes, “I was too.” He said, “It looks like we’re going to have to eat this elephant one spoonful at a time. Something will come out. Something’s going to happen.”

(05:12:19)
He said, “Keep your ears open. Something will happen.” And I said, “Okay.” And honestly, by that point, I’d done eight years, and I remember if I got a year off for the drug program and good time and this, I had about eight years left to go or something, nine years left. And I was like, “I can do that. I’ll write.” I’d been writing. By that point, I’d actually written a story. I got a book deal for Diveroli, and I ended up writing a synopsis of a guy’s story. And I got him in Rolling Stone magazine. And I got a book deal for that. I got an advance. It was thirty-five hundred bucks. For a prisoner to get a thirty-five-hundred-dollar advance is like, “I’m a millionaire.” That’s a lot of money. And then we optioned the film rights.

(05:13:17)
Basically, the synopsis that I wrote for this reporter, a journalist for Rolling Stone, he goes to Rolling Stone with what I wrote and gives it to them, and they okay it. They say, “Yeah, this is great. We want you to write an article based on this.” He writes the article. He tells me that the article will be bylined with his name, Guy Lawson, Douglas Dodd, which is the name of the kid I wrote the memoir about, and Matthew Cox. A couple of weeks before the article is going to be published, he tells me Rolling Stone doesn’t want my name on the article because I’m in federal prison and it doesn’t look good, but don’t worry, he’s going to put my name in the article. And that’s just as good. And I argue it’s not just as good. It’s not. I’m like, “I would be a writer for Rolling Stone magazine. You understand, I’m trying to come up with something here so that I can rebuild my life as a true crime writer. That’s no good.”

(05:14:24)
And that wasn’t so bad. That wasn’t the worst. The worst of it was that 90% of the article that he published was taken directly from what I sent him. I mean, sick to my stomach, bro, just sick over it. But they optioned the life rights for that. And I got a piece of that. So there’s like $7,000. I get a check for that. So I’m thrilled I can keep writing. Because you have to understand, writing on the computer there, they charge you. Oh, they charge you for phone calls, writing… Every single thing costs money. So I start writing all these guys’ stories. I start writing books. I went back to Atlanta, got seven years knocked off my sentence, come back, and I’m walking around the compound. Now, there was a guy there named Ron Wilson. Ron Wilson ran… If you look in the newspaper, it says it’s like a hundred-million-dollar Ponzi scheme.

(05:15:25)
But really it was fifty-seven million dollars. He had lost fifty-seven million. So it says a hundred. They always exaggerate. Because fifty-seven is not enough. Ron ended up getting nineteen and a half years. Ron was an old conman, early sixties, sixty-two, sixty-one, I don’t know. And I liked Ron. So we’re walking around the compound and he’s like, “So what are you going to do? I mean, you’ve got eight or nine more years to go?” And I was like, “Yeah, I’m going to keep writing, and when I get out of here, maybe I’ll have a huge body of work and maybe I’ll be able to sell it, or maybe I’ll be able to option some more stuff. And if I could get together with Rolling Stone or get with some of these magazines, I could start writing for them and I could option those. Maybe I could walk out of here with something.” “Right, right, right.” So Ron, who’d only been locked up like a year or so, was cooperating with the Secret Service in his case, against some of his co-defendants.

(05:16:25)
So he’s already been debriefed and he’s cooperating. He’s actually thinking he might get brought back to have to testify at a trial. We’re talking and we’re walking, and he keeps saying, “Even if they charge those guys, and even if this happens, they’re not going to reduce my sentence. They’re not going to cut my sentence.” First of all, well, probably because you stole a bunch of money from pension funds and churches that didn’t help your case. But I don’t say that. So I say, “Oh, they have to, bro. They’ll have to, if you cooperate, they’re going to have to. And if they don’t, we’ll have Frank file a 2255.” And he’s like, “Ah, that crazy mother…” So he says, “Okay.” He’s like, “Yeah, yeah, you don’t understand. You don’t understand.” So this goes on for months. And I’m like, “What is the problem?” And he says, “They think I hid Ponzi scheme money.”

(05:17:19)
And he’d actually dug up like five or six million dollars in Ponzi scheme proceeds that he’d buried, literally buried, in aluminum ammunition canisters. Super interesting guy. So he actually went and dug them up and gave them to them. And I’m like, “Well, you gave them all the money. You didn’t hide anything. Relax, it’s not a big deal. They’re not going to find anything, don’t worry about it.” And so he mentions it a couple of weeks later, and a couple of weeks later, and then one day I go, “Bro, why do you keep bringing this up? What are you concerned about? It’s not going to happen.”

(05:17:54)
And he said, “Can I trust you?” And I went, “Probably not.” And he goes, “I did hide some money.” I was like, “Okay.” I said, “Did you bury it in a can somewhere?” And he’s like, “No, I gave my wife 150,000 in cash.” I said, “Okay, well, she’s not going to say anything she’s using [inaudible 05:18:20].” He said, “No, you don’t understand. Since then she found out I was having an affair and we’re going to get a divorce. And she hates me. And I think she’ll turn that money in just to make sure that I don’t get a reduction.” Because if you lie to the FBI, it doesn’t matter what you’ve done for them, they won’t give you anything. And so, I’m sorry, the Secret Service or… Anyway, he has clearly lied to the Secret Service at this point.

(05:18:45)
If she goes and says, “This is what he gave me.” So I was like, “Oh, wow.” And he’s like, “My brother’s holding maybe 30,000 for me.” And at that moment I was like, “Wow, this poor guy.” No, that’s not what I thought at all. What I thought was, “Is that enough to get me a sentence reduction?” And I went and I sat there, and you know what I thought? I thought, “No.” I thought, “That’s not enough. That’s not enough. It’s nothing. That’s not even $200,000.” And they didn’t want to give me a reduction. My prosecutor was pissed that I got seven years off. She wanted me to get 30 months. She’s not going to give me anything. It’s up to her. She’s not going to do it. So I go, I lay down, I go to bed. A month later, I’m on the phone with my lawyer.

(05:19:38)
I had a manuscript for my book, and I wanted to put some of the stuff that was said at my sentencing in the book. So I was trying to get my lawyer to mail me my transcripts, and she hadn’t done it. So I called her and I said, “Listen, you said you were going to…” She’s like, “Oh God, man, I’m so sorry. I’m so busy. I’ll do it. I’ll do it.” And then she went… This is Esther. She goes, “So what else is going on in there?” And she never wanted to talk to me; when they were paying her, she didn’t want to talk to me. And I was like, “What do you mean? Nothing. I just need my transcripts.” She’s like, “Nothing’s happening? There’s nothing you want to talk about?” And then I went, “You know what? Something weird did happen there. Listen to this.” And I told her about Ron Wilson, and she goes, “Hold on.” And she looks him up on the computer. She goes, “Oh wow. This is a bad guy. This is a bad guy.”

(05:20:32)
“And he told you… Then you know where it [inaudible 05:20:36]” “Absolutely. And I can tell you exactly.” And she goes, “Okay, okay, okay. Let me look into this.” I go, “Okay.” So a week later, a CO comes to me and goes, “Hey, Cox.” And I go, “What’s up?” He goes, “Listen, at the next move…” Because they have controlled moves. All the doors are locked, and they open them up for 10 minutes so you can run to the chow hall or you can run to the… You can’t run, though. They have no running on the compound, but you can walk fast to the rec yard or the library, whatever. He says, “The next move, go to SIS.” So I go to SIS on the next move. But I was used to going there, by the way, because I was constantly filing Freedom of Information Act requests. Say you’re an inmate and I’m writing a story for you. I’d order your file and they’d send it to me.

(05:21:20)
And then they would catch it and they’d be like, “Why are you getting Lex’s information?” So they’d call me down there and I’d go, “No, I ordered it for him. I’m writing a story, and I’ve already been in Rolling Stone and everything.” They’re like, “What’s the story?” And I tell them the story. The guy’s like, “That’s a pretty good story.” So I go down there, but this is different. This time the guy who answers the door, they call him Bulldog. He was a real asshole. He was a lieutenant at SIS. And he’s like, “Get in here, Cox, sit down.” And he dials the phone. He goes, “Here, you got to talk to this guy.” And I’m like, “What?” And I pick up the phone, I’m like, “Hello?” And the guy goes, “Hey, this is Agent Griffin with the Secret Service. I understand you know where Ron Wilson has hidden Ponzi scheme money. I want something in writing.”

(05:22:03)
So I start doing that and they go, “Okay.” Then I get his email address and we start emailing each other back and forth, and he ends up getting a letter from the US attorney in South Carolina that says they will consider it substantial assistance if they make arrests or recover a substantial amount of money. That’s the best I’m going to get, [inaudible 05:22:28] consider. So I start talking to this guy and he starts asking me questions about Ron Wilson. Like, “Hey, ask him this, ask him this.” So I’m like, “Bro, I’ve got to work that into a conversation. That’s an odd thing to ask.” So this goes on for six months. So I’m asking questions and I’m typing up little reports, and I’m a prison snitch now. So I’m not just a cooperator now, [inaudible 05:22:50] prison. So I’ve moved down. I’ve moved down, actually, from being just a cooperating witness or…
Lex Fridman
(05:22:58)
Because you’re in prison, is that what makes you a prison snitch?
Matthew Cox
(05:23:01)
You can’t even really say. No, you could say prison rat. You could say prison rat. But prison snitch, I think, is probably the closer term, the one most guys would use.
Lex Fridman
(05:23:12)
What’s the difference between a snitch and a rat in prison?
Matthew Cox
(05:23:14)
I’m not sure. It rolls off the tongue better. Prison rat doesn’t sound as good as prison snitch. I don’t know. I don’t spend a lot of time thinking about this. So what happens is I’m asking Wilson questions periodically, and at some point they contact me and they say, “Listen, Wilson’s about to get some bad news.” I go, “Okay.” And they go, “We don’t want to tell you what it is. Let us know what happens.” Two days go by and Wilson comes up to me one day and says, “Cox, Cox.” I’m like, “Oh, shit.” I’m like, “Hey, what’s up?” He’s like, “Oh, you’re not going to believe this. I got indicted.” I was like, “What? What happened? No.” “Yeah, they questioned my wife and my brother, and my wife walked in. First she said, ‘I don’t have nothing. I don’t know what you’re talking about.’”

(05:24:14)
The next day, the brother walks in and gives them $150,000 in cash. And then the next day the wife comes back and gives them $250,000 in cash and a bunch of bullion, gold and silver, because his Ponzi scheme was based off of silver. He was going to invest in silver for you. So half a million dollars, they turn over half a million. I’m like, “Half a million dollars? I thought it was a hundred thousand or something.” And he was like, “I know. I didn’t know if I could trust you.” I’m like, “Ron, what are you doing? I thought we were…” So I’ll tell you something, just the icing on the cake, by the way, the icing on the cake. Let me explain one more thing.

(05:25:05)
So if somebody cooperates with the federal government, let’s say I get arrested and they go, “You want to help yourself?” And you go, “Yeah, okay, look, Jimmy is a… He lives next to me and he’s running a meth house, a meth lab, whatever.” And they go and they raid Jimmy and he gets arrested. You’re going to get something off of that. Not a lot, but you’re going to get something. And they could just say, “We were going to bust him anyway. We were already onto him.” Now, the next level would be you wear a wire.

(05:25:38)
So I wore a wire and I was in danger. Now keep in mind, I’m asking this guy questions inside federal prison. I’m in danger. So whatever, that’s the next level. You’re taking active participation in the investigation. And the third level would be you actually get on the stand and you cooperate and you testify. There’s no better cooperation than that. So when Wilson says to me, “They’re going to move me back to South Carolina. They’ve indicted me. They’ve charged me. What do you think I should do?” I go, “I think you should go to trial,” because I know they’ll have to call me as a witness. Just to let you know, I don’t want to walk out of here and have you feeling like, “Hey, there’s some good to this guy.” So I’m ready to gut Wilson like a fish.
Lex Fridman
(05:26:40)
But you are putting yourself in danger if you get on the stand, right?
Matthew Cox
(05:26:42)
I’m already in danger. If people there heard what I was doing, I probably would’ve been in danger.
Lex Fridman
(05:26:46)
Does that increase the chance of them hearing or no?
Matthew Cox
(05:26:50)
It does, but it also increases my ability to get more off my sentence. So what happens is, a couple of days later, he’s on what’s called the pack-out. They’re going to move him maybe a week later. So they come and get him, they move him, he gets back there to South Carolina, and he pleads guilty. They sentence him, he gets six months added on. So he’s now gone from nineteen and a half to twenty years. And by the way, when Covid hit, he was released. So he only ended up doing six years on a twenty-year sentence, because he was older; by that point, he was sixty-six, sixty-seven years old. Anybody older than fifty-five was in danger, especially in prison. So they had a Covid thing where they were releasing these guys and sending them home on [inaudible 05:27:37]. Like, “He’s an old man, he’s not going to hurt anybody. He’s not a danger.”

(05:27:43)
So they sent him home. So he ended up doing… So he didn’t even serve the six months. He didn’t even serve the original sentence, whatever. Not that I care. So I’m just saying, if it makes you feel like, “Poor Ron,” it’s okay. So his wife got a hundred hours of community service or something, or sixty hours, and I think his brother got six months’ paper. They got charged with obstruction of justice, and neither one of them… It was six months’ probation and community service, nothing. So then I turn around, and I’m waiting for my reduction, waiting, waiting. About 90 days after this guy gets sentenced, maybe six months, I send a letter, “Hey, what’s going on?” to the prosecuting… to my prosecutor. The prosecutors of both districts, no response. Then I go to Frank, I explain it to Frank, and Frank has known what’s going on the whole time. And Frank goes, “Okay, I’m going to file a 2255.” So we file a 2255, the government comes back, and the first thing they say is, “Your Honor, we don’t know about any cooperation. We’ve never heard about any cooperation.”

(05:28:51)
So of course then we submit the letter that we have, and the judge comes back, and the judge ends up saying, it’s a little complicated, but he ends up saying, “Look, I don’t have jurisdiction to hear this because you may be time-barred, but I’m going to let the appeals court hear it.” Now, typically, you have to get what’s called a certificate of appealability. You have to make sure that you actually have a case. He says, “I’m waiving the cert and I’m waiving the $500 fee to file with them.” And he basically expedites it for me, which is a subtle way of telling the prosecutor, “I think he’s got something and I’m sending it up there.” And the way he writes his order, it’s basically saying, “I don’t have the jurisdiction to do anything, but they do. They need to do it. And I’m paving the way. You don’t have to pay any money and you don’t need that cert.” So the prosecution immediately comes back, they file a one-level reduction, and…
Matthew Cox
(05:30:03)
… level reduction. And immediately Frank files something saying, “Hey, stop. We don’t want the reduction. We don’t want the one level. We want to come back to court. Please don’t rule on it.” So the judge says, “Okay, I’m freezing everything. I’m putting a stay on everything. I’m going to give this guy a lawyer to try and figure out what you’re going to do.”

(05:30:25)
They fly down a lawyer, Leanne Weber. So she comes, and she comes and sees me and she says, “Listen, I see that you want to go back and fight this and this, but honestly I don’t think you’re going to get anything more than one level. I talked to the prosecution. They said they’ll give you…” Well, she said, “I can work on trying to get you two levels, but you don’t have much of a prayer. You’re going to get crushed.” And I said, “Well, then why are you here? If they can crush me so easily, why don’t they do it? Why would they pay you…” They pay them like 12 grand or something just to fly down, and all your expenses, “… to negotiate for me? Why not crush me?” And she’s like, “I don’t know.”

(05:31:12)
I said, “Well, Frank said four levels.” And she’s like, “Who’s Frank?” I go, “Frank’s the guy that wrote all this.” And she’s like, “Oh, is he an attorney? Is he in here?” And I’m like, “Yeah, he’s in here.” She’s like, “Why is he in here?” And I tell her, he tried to take over the world. And she says, “That’s the strangest thing I’ve ever heard in my entire life.” And I said, “I understand. But Frank said…” And she’s like, “You’re listening to an incompetent…” I’m like, “Yeah, absolutely. And Frank said we want four levels. He said for me to tell you we want four levels.” She goes, “Okay.”

(05:31:46)
She leaves, she goes to the US attorney. We argue. Two levels. They come back and say two levels. No. We go back and forth. We start filing motions saying we want to go back, we want a hearing, we want to bring back all the FBI agents, the Secret Service agents. And she’s like, “What? Do you want to turn this into a circus?” “Exactly what I want to do. I’m going to turn it into the biggest circus. Because I’ve already got one level.” They come back, and one day she says, “Listen, three levels is the best you’re going to get. So I guess you’ll be moved back here. We’ll go to the hearing.” I said, “No, no, no, I’ll take three levels.” And she goes, “What are you talking about? You said four levels. You said Frank wouldn’t let you take anything less than four.” I said, “No. Frank said to tell you four. I was happy with three. I wanted you to argue for four. I’m good with three. I’m out of here in a year.”

(05:32:33)
And I don’t want to be moved back. I don’t want to have to get on that bus. Do you know what it’s like to be moved? It’s horrible. So I said, “I just want the three levels.” Then we argue about the wording for about two, three months, and then they file it. And then I get five years knocked off my sentence, because three levels at the level I was at now isn’t seven years. Every level you get a little less time, so I get five years off. So now I’ve got 12 years knocked off my sentence.

(05:32:55)
At this point I maybe have a year and a half to go, and that’s doable. So I was super, super happy. And I’m going to tell you something, and I’m sorry bro, but every time I think about it and I just feel like I have to say it, Frank [inaudible 05:33:28] insane, but I didn’t have a fucking prayer without that guy. And as crazy as he is, as much of a pain in the ass as he was, I could never repay him, bro. I shouldn’t be here. I’m supposed to be in prison right now. My out date was 2030 without that guy.
Lex Fridman
(05:33:59)
Where is he now?
Matthew Cox
(05:34:01)
He got himself out. He didn’t do all that time, he got himself out. I don’t even know how he did it. They’ve even thrown him back in prison again for six months and he got himself out again. He’s insane. He’s incredible. He’s insane but he’s incredible.
Lex Fridman
(05:34:13)
Is he really that insane?
Matthew Cox
(05:34:15)
He’s in Orlando.
Lex Fridman
(05:34:17)
I mean, he seems like a good lawyer and a good man.
Matthew Cox
(05:34:22)
Look, he’s great. He’s great. I mean, there’s no doubt in my mind I would be in prison right now if it wasn’t for him.
Lex Fridman
(05:34:31)
And he’s done this for others?
Matthew Cox
(05:34:32)
Walk people right out. 10 years off, five years off, nine years off, 10 years. And I didn’t pay for one thing. I didn’t pay for my stamps, he paid for everything.
Lex Fridman
(05:34:45)
It sounds like the other lawyers don’t really believe it’s possible, and he does. It’s interesting.
Matthew Cox
(05:34:50)
Well, I think he’s willing to badger them into doing what they should’ve done to begin with. I actually wrote a book about it, which he loved.
Interviewer
(05:35:03)
About him?
Matthew Cox
(05:35:03)
About him and his story. It’s so over the top, what happened with him. I mean, he literally tried to take over the Congo. I mean, there’s a documentary about it. It’s called 9 Days in the Congo. It’s an insane story. It’s one of those stories that’s just like, how is this not a movie?
Interviewer
(05:35:21)
It’s not a movie yet.
Matthew Cox
(05:35:22)
No. I’ve pitched it several times and it would be great. So I wrote a synopsis and I turned that into a book.
Interviewer
(05:35:32)
What’s the name of the book?
Matthew Cox
(05:35:33)
Oh, It’s Insanity.
Interviewer
(05:35:33)
It’s Insanity.

Freedom

Matthew Cox
(05:35:35)
Yeah. But anyway, like a year and a half later, I ended up getting out of prison and I went to the halfway house.
Interviewer
(05:35:40)
What’d that feel like, freedom?
Matthew Cox
(05:35:43)
Oh, this is bad, bro. This is bad. I remember when I was leaving the prison… I met some great guys in prison, which is a weird thing to say. But I met better people in prison than I’d ever met outside prison, at that low. I mean, because it was the first time I actually had friends. I really had someone that wanted to hang out with me, just to hang. I didn’t have anything to offer them. I can’t make you any money, I can’t do anything for you. We’re just hanging out because we like to laugh or we have things in common or we are fascinated by each other, or we just have a good time and fun.

(05:36:32)
So when I was leaving, I remember my mom showed up and my brother showed up and they picked me up, and we were driving off. I remember looking back at the prison and my brother said, “I’ll bet you’re glad to leave that behind you,” and I started crying. It’s like nobody talked. It was so uncomfortable. I started crying and it wasn’t because I was like, “Oh, it’s over.” It was like survivor’s guilt. Like I was leaving all of my friends and I felt so bad that I was leaving them.

(05:37:15)
But I went to the halfway house and I had four… When I was getting out, I remember joking that I had exhausted my Trulincs account, my inmate account, I’d exhausted it. I had nothing, I had 18 cents, I couldn’t even figure out how to spend it. And they give you a debit card when you leave, and they charge you every time you use the card. I don’t even have enough to spend the 18 cents because the charge is like $3. So I was like, “Yeah, yeah.” I was like, “I wonder if they’ll still give me my debit card.” And I’m laughing. Everybody’s like, “What are you going to eat? What are you this, what are that?” And my one buddy looked at me, he was like, “You can’t go to the halfway house with nothing, bro.” And I was like, “No, it’s cool.” I said, “No, it’s cool.” I said, “No, it’s cool.” I said, “I want to start at the bottom. I’ve got that coming. I got working at McDonald’s coming, so I’m going to work at McDonald’s. I don’t give a fuck.” And he was like, “Well, I think you’re going to need to buy clothes.” I said, “Oh,” I said, “It’s at the Goodwill. They give you a bunch of crap if you don’t have anything, if you’re indigent.” And I said, “I’m indigent.”

(05:38:36)
And a couple of days before I’m leaving, $400 ends up on my account. And I was like, “What the fuck?” And it was from a buddy of mine. And I go to him, my buddy Tommy, and I was like, “Tommy,” I go, “Did you put 400 on my account?” And he said, “I can’t let you go with nothing, bro.” So I get to the halfway house and I go to Walmart and I buy $300 worth of clothes at Walmart. I’ve never been in a Walmart. I go to Super Walmart, it’s huge. I go there and I buy a bunch of clothes and I buy about 300 bucks worth of clothes, and I still have some of the blue jeans. To this day I still wear some of the blue jeans.

(05:39:24)
I stayed in the halfway house and I called a buddy of mine named Trion, Trion Colta, and he owns a gym. And I grew up with him. His whole family, they own a bunch of gyms. And I called him and I said, “Hey man, I’m in the halfway house.” And he was like, “Hey, what’s going on?” He said, “Can I do anything for you?” And I was like, “I mean, I need a job.” I didn’t think he was going to give me a job. He goes, “Bro, you’re hired. I’ll give you a job.” He said, “Minimum wage.” I said, “That’s fine. If I can stay out of here…” You can work 80 hours a week. I was like, “If I can just stay out of here 80 hours and you pay me minimum wage.” He goes, “Oh, hell yeah, perfect.”

(05:40:03)
So I’m at the gym and I got free rein. So I’m playing on my computer, goofing off all day. And my buddy Pete, who’s still locked up, he’s texting me and calling me, and he’s like… Not texting me, he’s emailing me through the Corrlinks system. And he calls me periodically, he’s like, “Have you started a website?” Because one of the things I was going to do when I got out was I was going to start a website with all these stories that I’d written. And I was like, “No, Pete, I can’t. I don’t have a computer.” He’s like, “Well, how much is a computer?” I was like, “I don’t know, they’re like 300 bucks.”

(05:40:36)
I said, “I could probably get a used Apple MacBook, like a five-year-old MacBook or something, I don’t know, for $350, whatever.” But he was like, “Okay, so that’s all you need, 300 bucks.” I go, “No, no, no, no, no,” I said, “It’s not 300 bucks, bro. It’s 300 bucks plus it’s getting a WordPress website,” which I said costs money. “Plus it’s hiring somebody to help me figure it out because I’m inept. I don’t know how anything works.” So he goes, “Okay.” And I said, “Plus, I need this. Plus I need a bunch of stuff. I need $600 for this. I need 300 for this. I need 500 for this. I need a thousand dollars for this.”

(05:41:16)
And he goes, “Okay.” He said, “I’ll get you… Okay, I got it.” So he reads off a list, he goes, “I got you.” Pete doesn’t have any money. And I go, “How are you going to give me any money?” He goes, “Every day I walk across the compound, people stop me and say, ‘How’s Cox doing?’ And I say, ‘Oh, he’s okay.’ And they say, ‘Does he need anything?’ And I say, ‘No, no, he’s good.'” He said, “I’m going to start telling these fuckers, ‘Yeah, yeah, he needs something. You want to do something for him? Here’s what he needs.'”

(05:41:47)
I ended up getting two laptops sent to me. I got the computer program Final Cut Pro. I had guys in prison cutting me checks so that I could build a website and put all these stories on the website. So I start putting the website… And I don’t know what I’m doing. I put them on the website slowly, it takes forever. I’m putting pictures up, I’m trying to figure out how Photoshop works, all this stuff. The whole time I wanted to start… Because the last, when I was just getting out of prison, everybody kept telling me, “Bro, you got to start a podcast. You got to start a true crime podcast.” And I don’t know what a podcast is. The term podcast came into existence in 2009 when I’d been locked up three years. I’d never been on YouTube.

(05:42:41)
So by the time I get out, the last year or two, guys are coming up to me, giving me magazines, like, “This is what a pod… You need to read… Look, true crime’s huge.” And you have to think, guys are asking me every couple of days, “Cox, you got any stories?” And I’m like, “Yeah, yeah, did you read Cash and Coke?” And they’re like, “Is that the one where the guys are robbing the drug dealer?” “Yeah.” “Oh no, no, I read that one.” “Did you read this one?” “No, no, I haven’t read that, that’s the one with the guy…” And I’m like, “Yeah, yeah, yeah, yeah.”

(05:43:09)
So I’m giving these little stories and then they’d come back and give them to me. You don’t have anything in there, so these are guys that would never read in their life, and they’re reading. And I’m writing about the guy in B2, the guy in C1. So I put up the whole thing and well, anyway, they’re all telling me to do a true crime podcast. True crime podcast. I don’t really know what that is, but by now I’m starting to listen to them on YouTube, Serial and Cold Case Files, that kind of stuff.

(05:43:40)
And I think that’s what I want to do. Well, my buddy Trion says, “There’s a guy named Danny Jones that runs a podcast called Koncrete, and it’s in St. Petersburg, and he lives a couple of miles from me. I see him all the time.” And I went, “Okay.” And he said, “You should email him. He’s got a guy on there all the time that does real estate.” And I go, “I just got out of prison for bank fraud related to real estate. He doesn’t want to interview me.” He goes, “Well, maybe he does. Maybe you could ask him about starting a podcast.”

(05:44:12)
Okay. So I sent him an email. I remember Danny called me and he said, “Hey, is this Matt Cox?” I was like, “Yeah, this is Matt.” He’s like, “I got your email. This is Danny Jones.” And I was like, “Okay.” And he says, “Yeah, I got your email, bro.” He goes, “This is a good fucking email.” I was like, “What?” He goes, “I get a lot of emails, bro.” He said, “That is a…”
Interviewer
(05:44:31)
This is a good one.
Matthew Cox
(05:44:31)
“That’s a good one. That was really good. I mean, that was well written.” He’s like, “I immediately knew I had to talk to you.” And I said, “Oh, okay.” Because I think I started off with, “Hey, my name’s Matt Cox, and I’m a conman.”
Interviewer
(05:44:46)
Good opening.
Matthew Cox
(05:44:46)
“Who was recently released from federal prison.” And so he was like, “Oh yeah, I mean, who says that?” So anyway, he said, “Well, what’s going on?” I said, “Well…” And I tell him what’s going on. I want to start a podcast, blah, blah, blah. And Danny, he listens to me for 30 minutes to an hour, and, “I’ve heard this and this.” And he’s like, “Yeah, right. YouTube’s not really like that, and that’s not really how we do it. And you’re going to have to get a production company,” and blah, blah, blah. He goes, “But you know what? What you really need to do is to see if people are even interested in you or your story, or you’re able to talk. You should come on my show.” Shameless, trying to get some content.
Interviewer
(05:45:22)
Well, I mean, so as I told you offline, Danny and Koncrete podcast is really good, so people should definitely listen to it. But yeah, I mean, it turns out people do like listening to you.
Matthew Cox
(05:45:32)
Turns out.
Interviewer
(05:45:32)
I mean, you’re good at telling stories.
Matthew Cox
(05:45:35)
Well, anyway, by the time I got… I couldn’t do Danny’s podcast. I was like, “I can’t do it, bro. I’m in the halfway house, so maybe…” I get out of the halfway house and a couple of months go by. Maybe two months, three months go by, and one day I get a phone call from Danny. He’s like, “Bro, you’re out of the halfway house, right?” Because I told him I got out in July, it was like October, November. I’m like, “Right.” He’s like, “Listen, I had a guest fall through. I got nobody. I need you to come on. I answered all your questions.” I’d called him five, six times. “You said…” And I was like, “Fuck it, I’ll do it.”

(05:46:11)
That video got 2 million views. Then I did Patrick Bet-David, he flew me out. Then I did Soft White Underbelly, then I did Vlad, people started… I’m sorry, and then it just blew up. Then people started asking me to come and talk for no reason, which was crazy. But you were saying, I’m sorry?

Family

Interviewer
(05:46:31)
Is your dad still with us?
Matthew Cox
(05:46:33)
No, he died when I was in prison. He came to see me two or three times.
Interviewer
(05:46:43)
When was the first time he found out that you were doing fraud?
Matthew Cox
(05:46:48)
The first time I got in trouble.
Interviewer
(05:46:50)
When you got the probation?
Matthew Cox
(05:46:51)
Yeah, because I had to explain that something’s happening. I didn’t want him to hear it from anybody else.
Interviewer
(05:47:03)
So you talked to him directly about it?
Matthew Cox
(05:47:08)
Super disappointed.
Interviewer
(05:47:11)
Did he ever tell you he loves you after that?
Matthew Cox
(05:47:15)
After I got the 26 years and the government decided they weren’t going to indict anybody, and I really was like, “Wow, this is it. You’re done.” He came to see me, but just by himself. And I remember when he came to see me, it was by himself. He never came by himself. So I remember thinking something happened to my mom. And as soon as he walked in, I go, “What happened?” I go, “Where’s mom?” And he goes, “Oh no, she’s fine. She’s fine.” And he sat down with me and he said, “How are you doing?” I was like, “I’m good.”

(05:47:58)
He was getting sick. He was getting older. So we talked for a little bit just about the situation. And I was like, “Yeah.” He’s like, “Well, what are you going to do?” And I was like, “There’s nothing I can do. I’ve called multiple attorneys, I’ve talked to people, there’s nothing I can do.” And he was like, “You’re going to figure it out.” He said, “You’re clever and you’re smart, and you’re not going to do all of that time.” And I was like, “I’m done. It’s over. I’m going to get out of here when I’m 60 if I behave myself. And if I don’t, I’ll be 64.” And he was like, “That’s not going to happen.”

(05:48:58)
I think that was the first time he… I knew he was proud of me when I was making money, but he never said it. You got the look like he was impressed. But we were sitting there and I remember he said… Because it’s the only time I can ever remember him saying he was proud of me. And I remember he said, “You’re going to figure this out.” He said, “I’m not proud of where you ended up, but you’ve done amazing things.” He said, “I wish you’d use your talents for something different, but you’ve done things that I could’ve never done, and you’ve led an amazing, adventurous life, and I’m proud of you.”
Interviewer
(05:49:51)
I wish he could see you now.
Matthew Cox
(05:49:58)
My mom saw me. My mom’s funny because my mom came to see me. My mom’s a gangster. My mom came to see me every two weeks for 13 years. She missed about a month and a half when she had a stroke and ended up in a wheelchair. Then she came in the wheelchair, and she would make my brother bring her. My brother and sister would be like, “Mom, are you sure you want to go? It’s so hard to… It’s such a long drive and you get so tired.” “Well, I’ll sleep in the car.” “I know, but then we have to wait in the waiting area forever and it takes forever.” “Well, I’m in the wheelchair, so I’m fine.” “Well, I know, but it’s such a pain to get in and out, and in and out.” She goes, “I’m going to see my son and you’re taking me.”
Interviewer
(05:50:51)
I love it.
Matthew Cox
(05:50:55)
Yeah, she was something else. And I always say, if I had to say… I don’t think about all the things I did to get out. I know there’s all these guys that are like, “Oh, I wouldn’t have done that. I’d have been a standup guy. And I’d have been…” Well, good for fucking you, bro. I wanted to get out. I wanted out. And the icing on the cake of me getting out, and I would’ve cut every motherfucker’s head in that prison off. I was able to get out just in time to spend the last year and a half of my mother’s life with her.

(05:51:55)
I saw her two or three times a week, took her to dinner once a week. Was able to go on walks with her in her wheelchair. I was sitting right next to her when she had her final stroke. I held her hand when she took her last breath. So if I have to be called a snitch the rest of my life, I don’t give a fuck. I may not deserve more, but she deserved more.

Regret

Interviewer
(05:52:35)
Do you regret… [inaudible 05:52:39] just look back, would you do any part of your life different?
Matthew Cox
(05:52:43)
Oh, I’d scrap all this, yeah. Yeah, I’d scrap all this to be… You always hear these guys say, “I wouldn’t change it because it made me the man I am today.” The man I am today is a fucking 54-year-old scumbag, multiple felon, starting my life over broke, living off of scraps, trying to make YouTube work. I’ve got two dead parents. I’m divorced. I have a son that doesn’t talk to me. I have a son that doesn’t talk to me for good reason, not because of a misunderstanding, because he understands. You can’t even argue with him, he’s got a powerful argument. “I don’t want to be a part of this guy’s life. He’s a scumbag. He stole money. He went on the run. He abandoned me when I was three years old. I don’t want anything to do with him.”

(05:53:52)
I get it. And I’ve tried to do all the right things. I wrote the letters. I drew him pictures. I’ve tried to call and it’s not happening. I would do anything to go back and just be that regular, middle class guy with the two kids and the wife, working a regular job. That’s a good life. That’s a good person. I just made one arrogant decision after another, after another until it snowballed and I couldn’t take it back. And then I did everything I could. And if I wasn’t the calculating, backstabbing scumbag motherfucker that I can be, I’d be in prison right now. Sorry.

(05:54:51)
So yeah, yeah, I would much rather be a CPA right now. I would much rather… Should’ve stuck with being an insurance adjuster or something. I mean, I never should’ve whited that 30 day [inaudible 05:55:04] out. Never. It was a mistake.
Interviewer
(05:55:06)
That was your first mistake.
Matthew Cox
(05:55:07)
That was a huge mistake.
Interviewer
(05:55:09)
You think your son will forgive you?
Matthew Cox
(05:55:11)
No. Unfortunately not, according to my ex-wife and my sister, and everybody whose lives he’s a part of. And I’ve seen him. At my mother’s funeral, I saw him. I’ve seen him at several functions. You look across and he looks right through me. Everybody says, “He’s just like you. He’s just like you.” And everybody says I’m just like my dad. I’ve never smoked a cigarette. I’ve never drank alcohol, not a drop. Never done any drugs because my dad was an alcoholic and my dad smoked two packs a day, and everything in our house reeked of nicotine. And I’ve never smoked.

(05:56:08)
And my dad was a pill head. He was always on some kind of prescription medication. I didn’t want to be that person. And one day I drew a line in the sand and I wouldn’t do it. And I think he’s drawn a line in the sand and he’s decided, “This is the hill I’m going to die on and I’m not going to back off it.” And the thing is, my ex-wife tells him, “He’s a good person, you should be in his life.” His father, because he was adopted. When I was in prison they adopted him. Nick is his dad. Nick has told him. Nick came to see me when I was in prison. Nick has told him like, “Hey, this is a mistake. You’re making a mistake.” Everybody that knows me, knows him, and he has said no. So I fully believe it’s no. I mean, I hope it’s not.
Interviewer
(05:57:04)
Well, I hope he forgives you. I think there’s a lot of good in you, despite you calling yourself a scumbag over and over in this podcast.
Matthew Cox
(05:57:12)
That keeps bothering you, you mentioned that earlier.
Interviewer
(05:57:16)
What advice would you give to young people, given that you’ve lived quite a non-standard life? What advice would you give them, how to live a life they can be proud of?
Matthew Cox
(05:57:27)
I mean, I don’t know if I’m in a position that anybody would listen to me. And I don’t have any advice that I don’t think a father would give you, and it’s like work hard, be appreciative. I mean, things are so good out here. I hear people complain all the time. And I think a huge part of just being happy is being appreciative. I didn’t appreciate anything. This is so cliche, but when I had all the money in the world, I was miserable. But when I got out with nothing, I was happier in prison with nothing than I was with $2 or $3 million prior to prison, and I’m dating a chick I never should’ve been dating, driving a sports car, vacationing all over the world, miserable. I’m crying, driving away from prison because I already miss my friends. You could’ve never told me that was going to happen.
Interviewer
(05:58:27)
Turns out money, in fact, does not buy happiness.
Matthew Cox
(05:58:30)
No. And it is such a cliche, right? But it’s so true.
Interviewer
(05:58:34)
Crying, driving away from prison. Yeah.
Matthew Cox
(05:58:37)
You know what? I met my wife in the halfway house. She had just gotten out of prison. She was in the halfway house with me. She just did five years for a meth conspiracy. I never would’ve met her if I didn’t go to prison.
Interviewer
(05:58:54)
And now your date night is hunting alligators together.
Matthew Cox
(05:58:58)
Yeah, that was a month or so ago.
Interviewer
(05:59:01)
This is Florida, folks. This is what badass people do in Florida. Hog hunting.
Matthew Cox
(05:59:09)
My wife is a former… She was an MP in the military. She hunted, she ran a hog hunting tour guide service for six years, went to prison for five years. Got out, and then now she’s a marine mechanic. And yeah, our date night the other night was we went in the middle of the night, went to Lake Okeechobee and went alligator hunting.
Interviewer
(05:59:39)
And if I may say so, she’s quite beautiful.
Matthew Cox
(05:59:41)
Thank you. I did nice. She didn’t want to date me at the halfway house, though. I kept saying, “I feel like you’re sweet on me.” She’s like, “I’m not. I’m not. I make fun of guys like you. You’re a city boy.” I’m like, “I don’t know. I feel like…”
Interviewer
(05:59:54)
Well, you wore her down.
Matthew Cox
(05:59:56)
That’s exactly what I did.
Interviewer
(05:59:59)
Yeah, it’s that charisma. It always works. Well, Matt, thank you for being so honest. Thank you for being who you are. I do think there’s a lot of good in you. And thank you for telling your story and the story of others who’ve made mistakes in their life. Thank you for talking today.
Matthew Cox
(06:00:17)
I appreciate you having me on.
Interviewer
(06:00:19)
That was a really short conversation.

(06:00:23)
Thanks for listening to this conversation with Matthew Cox. To support this podcast, please check out our sponsors in the description. And now let me leave you with some words from Mario Puzo, author of The Godfather, “Behind every successful fortune, there’s a crime.” Thank you for listening and hope to see you next time.

Transcript for Tal Wilkenfeld: Music, Guitar, Bass, Jeff Beck, Prince, and Leonard Cohen | Lex Fridman Podcast #408

This is a transcript of Lex Fridman Podcast #408 with Tal Wilkenfeld.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Tal Wilkenfeld
(00:00:00)
I am standing on the edge of the cliff the entire night, and if I mess something up, mess it up, what even is a mistake? But if I do a little clunker or whatever it is, it’s like, so what? I wouldn’t have played half the stuff that I’m playing if I wasn’t constantly standing on the edge of the cliff, like wild.
Lex Fridman
(00:00:22)
Why stand at the edge of the cliff?
Tal Wilkenfeld
(00:00:24)
Because at the edge of the cliff is all possibilities.
Lex Fridman
(00:00:30)
The following is a conversation with Tal Wilkenfeld, a singer-songwriter, bassist, guitarist, and a true musician who has recorded and performed with many legendary artists, including Jeff Beck, Prince, Eric Clapton, Incubus, Herbie Hancock, Mick Jagger, Jackson Browne, Rod Stewart, David Gilmour, Pharrell, Hans Zimmer, and many, many more.

(00:00:54)
This was a fun and fascinating conversation. This is the Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, dear, dear friends, here’s Tal Wilkenfeld.

Jeff Beck

Lex Fridman
(00:01:08)
There’s a legendary video of you playing with Jeff Beck. We’re actually watching it in the background now. So for people who don’t know, Jeff is one of the greatest guitarists ever. So you’re playing with him at the 2007 Crossroads Festival, and people should definitely watch that video. You were killing it on the bass. Look at that face. Were you scared? What was that experience like? Were you nervous? You don’t look nervous. Confident?
Tal Wilkenfeld
(00:01:37)
Yeah, I wasn’t nervous. I think that you can get an adrenaline rush before going on stage, which is natural, but I think as soon as you bring fear to a bandstand, you’re limiting yourself. You’re walling yourself off from everyone else. If you’re afraid, what is there to be afraid of? You must be afraid of making a mistake, and therefore you’re coming at it as a perfectionist and you can’t come at music that way, or it’s not going to be as expansive and vulnerable and true.

(00:02:10)
So no, I was excited and passionate and having the best time. And also the fact that he gave me this solo, the context of this performance is that this was a guitar festival. It’s one of the biggest guitar festivals in the world because it’s Eric Clapton’s festival, and there’s 400 guitarists that are all playing solos all night. And we were towards the end of the night, and I could tell Jeff got a kick out of, I’m not going to solo on one of my most well-known songs, Cause We’ve Ended as Lovers. Well, Stevie Wonder wrote it, but people know Jeff for that song and his solo on it. It’s like, “I’m going to give it to my bass player.” And he did, and like-
Lex Fridman
(00:03:02)
You took it.
Tal Wilkenfeld
(00:03:03)
The fact that he’s bowing, he didn’t have to do that.
Lex Fridman
(00:03:03)
But you really stepped up there.
Tal Wilkenfeld
(00:03:14)
It just shows what a generous musician he is, and that’s evident in his playing across the board. He is a generous, loving, open musician. He’s not there for himself. He’s there for the music. And he thought, “Well, this would be the perfect musical thing to do.” And it kind of all started when I went to audition for him, which was an interesting experience because I got food poisoning on the plane.

(00:03:46)
And so literally when the plane landed, I went straight into an ambulance into a hospital overnight. The manager picked me up and I showed up at Jeff’s door, which was a three-hour drive through winding country roads, and he answered the door, and he is like, “Okay, you’re ready to play?” So we went upstairs and started rattling off the set. And when it came to this song, Cause We’ve Ended as Lovers, he just said solo, and he loved it and kept the solo in it. So that’s how, there was no bass solo before I was playing in his band. So this whole thing was kind of new.
Lex Fridman
(00:04:24)
So even with food poisoning, you could step up?
Tal Wilkenfeld
(00:04:27)
Yeah.
Lex Fridman
(00:04:28)
That’s just like what? Instinct?
Tal Wilkenfeld
(00:04:30)
It’s just being able to differentiate from the body and from expression, music.
Lex Fridman
(00:04:37)
It’s interesting. You said fear walls you off from the other musicians, and what are you afraid of? You’re afraid of making a mistake. Beethoven said, “To play a wrong note is insignificant. To play without passion is inexcusable.” Do you think the old man had a point?
Tal Wilkenfeld
(00:04:54)
Yeah. Different styles of music invite varying degrees of, I would say, uncertainty or unsafety in the way that people might perceive it. So for instance, the tour that I was just on playing Allman Brothers songs, I am standing on the edge of the cliff the entire night, and if I mess something up, mess it up, what even is a mistake? But if I do a little clunker or whatever it is, it’s like, so what? I wouldn’t have played half the stuff that I’m playing if I wasn’t constantly standing on the edge of the cliff, like wild.

(00:05:38)
And so I don’t care about those few little things. I care about the overall expression. And then there’s other gigs that, for instance, if I got called for a pop or a country session or a show. In those environments, they may want you to play safe, just play the part and play it with a great groove and time and great dynamics and don’t really veer away from the part and stuff. And I’ve done plenty of those gigs too. It’s just a different hat you put on.
Lex Fridman
(00:06:14)
What do you get from the veering? From the veering off the beaten path? You just love it? Or is that going to make the performance better? Why stand at the edge of the cliff?
Tal Wilkenfeld
(00:06:28)
Because at the edge of the cliff is all possibilities and unknown. You don’t know what’s coming. And I love being there in the unknown. Otherwise, it’s just like, “Well, why are we doing this? Am I just like a clown on stage showing you my skills or what I’ve studied in my bedroom?” It’s like, no, I want to be pure expression happening right now and responding in real time to everything that’s happening. And anytime I’m not doing that, it’s like it’s a waste of everybody’s time.
Lex Fridman
(00:07:06)
Have you ever messed it up real bad?
Tal Wilkenfeld
(00:07:09)
Messed what up?
Lex Fridman
(00:07:11)
I mean, all comedians bomb. You’re a big fan of comedy.
Tal Wilkenfeld
(00:07:13)
Yeah.
Lex Fridman
(00:07:14)
Have you ever bombed on stage?
Tal Wilkenfeld
(00:07:16)
Probably. I think it’s all about recovery. And the more times that you fall off the cliff, the quicker you know how to recover and the varying ways that you can recover to the point in which it’s concealed so much that maybe a listener might not even know that you’re recovering.
Lex Fridman
(00:07:38)
And eventually you learn to fly, if we take that metaphor all the way, off the cliff. [inaudible 00:07:44]
Tal Wilkenfeld
(00:07:43)
I remember one time when I was really young. Well, not really young, but when I was 21 or-
Lex Fridman
(00:07:44)
What is age anyway?
Tal Wilkenfeld
(00:07:52)
22? Yeah, exactly. But when I was first playing with Jeff Beck and we played at what I consider the best, the coolest jazz festival, it’s Montreux Jazz. And Miles played there, everyone played there, and they have the best speaker system ever. I was excited for months, and the drummer, Vinny was practicing for eight hours in the bus on the way there, and everyone was on fire on stage. And I remember playing a note, just one note that I really didn’t like. And I let it go in the moment on stage, but as soon as I got off-stage, I was really sad.

(00:08:37)
And so I sat on this road case, everyone was out celebrating. I sat on this road case, looking with a sad face, boo-hoo. And then Claude Nobs, the owner of the whole festival, came out to me. He’s like, “Tal, what’s wrong?” And I’m like, “I played a bad note.” I was such a child. And he said all this wise stuff that Miles Davis had imparted to him and it fully cheered me up. He’s like, “Is there anything that would make you feel better?” And I was like, “Caviar?” The dude came back 10 minutes later with this huge thing.
Lex Fridman
(00:09:18)
Oh wow.
Tal Wilkenfeld
(00:09:18)
It was a joke. It was a joke, but he actually brought me caviar. But anyway, that’s the one time that I remember being sad about a performance. Now I’m just like, “Okay, whatever. It’s done.”
Lex Fridman
(00:09:30)
Was it a physical slip of the fingers or did you intend to play that note?
Tal Wilkenfeld
(00:09:35)
That I can’t remember. I can’t remember if it was just a bad choice that sounded like a clanger, why it happened. It was so long ago, but I don’t get depressed about that anymore.
Lex Fridman
(00:09:48)
That’d be funny if your biggest and only regret in life was that note, and it haunted you in your dreams.
Tal Wilkenfeld
(00:09:53)
And then I’m on my deathbed and everyone’s just bringing me caviar because the one-

Confidence on stage

Lex Fridman
(00:09:59)
Joke went way too far. You talked about confidence somewhere. I don’t remember where. So I want to ask you about how much confidence it takes to be up there. You said something that Anthony Jackson told you as encouragement, a line that I really like. The quote: “On your worst day, you’re still a bad motherfucker.”
Tal Wilkenfeld
(00:10:17)
That’s actually a Steve Gadd quote. And Steve used to tell that to Anthony because Anthony used to get real depressed if he did a wrong thing or not perfect thing. And Steve Gadd used to say this to Anthony Jackson. And then Anthony was my first bass mentor or just mentor in general.
Lex Fridman
(00:10:36)
For people who don’t know, he’s a legendary bassist.
Tal Wilkenfeld
(00:10:37)
He’s a legendary bassist. And I started playing the bass when I was 17 and I moved to New York and I met Anthony and he started mentoring me a bit in a very not typical way. He would just sit in his car with me for hours and talk music.
Lex Fridman
(00:10:55)
You guys just listen to music and analyze it?
Tal Wilkenfeld
(00:10:57)
Exactly. And that was the best form of learning, I think. Just like, “Well, what do you perceive here?” And, “Well, I heard this” and just discussing that.
Lex Fridman
(00:11:08)
Jazz usually?
Tal Wilkenfeld
(00:11:09)
No, all styles of music. And yeah, he told me that story about on your worst day because yeah, even then when I was 18, 19, I’d get sad sometimes about performances. “I could have done this.” I don’t do that anymore, thankfully. Or I’d be miserable.
Lex Fridman
(00:11:29)
So you always kind of feel pretty good?
Tal Wilkenfeld
(00:11:31)
Yeah. Yeah, now I do. Now it’s just I sense the body feeling fatigued, especially if it’s a very long show. The ones I just did with three hour shows and we did one to three hour sound checks. So that’s a lot of physical activity every day. So I just feel the body being tired, fatigued, the ears are fatigued. That’s about it. I don’t really reflect on the show much.
Lex Fridman
(00:11:59)
You’re almost like from a third person perspective, feel the body get tired and just accept it.
Tal Wilkenfeld
(00:12:05)
Yeah, I don’t want to identify with it then I’m tired, but I’m not tired.
Lex Fridman
(00:12:09)
It’s very Zen.
Tal Wilkenfeld
(00:12:10)
I’m usually energized.
Lex Fridman
(00:12:12)
It’s like with the food poisoning, the mind is still capable of creative genius, even if the body is gone.
Tal Wilkenfeld
(00:12:18)
Yeah.
Lex Fridman
(00:12:19)
Something like that? So no self-critical component to the way you see your performances anymore?
Tal Wilkenfeld
(00:12:30)
There is critique, but not in the way that it would diminish my sense of self. It’s different. I can just kind of look at something and be like, “Okay, well actually next time I’ll do this choice and this choice, maybe. Maybe this would serve the song better. Maybe this would help the groove feel more like this.” But it’s not like, “I suck because I did this and I’m a loser.”
Lex Fridman
(00:12:58)
Do you think that’s bad? Even when I asked that question, I had a self-critical thought that, “Why’d you ask that question? That’s the wrong question.” I always have the self-critical engine running. Is it necessarily a bad thing?
Tal Wilkenfeld
(00:13:12)
It depends. If it’s affecting you negatively.
Lex Fridman
(00:13:14)
What is negative anyway?
Tal Wilkenfeld
(00:13:15)
Well, if it brings your frequency down and you feel less joyful inside, and you don’t feel complete, you feel less than, less worthy of something, then you could call that bad if you aspire to not feel that way.
Lex Fridman
(00:13:35)
Yeah, I aspire to not feel that way in the big picture, but in the little picture, a little pain is good.
Tal Wilkenfeld
(00:13:41)
That’s fair.
Lex Fridman
(00:13:43)
So confidence. You seem like in this performance, you seem confident. You seem to be truly walking the bad motherfucker way of life.
Tal Wilkenfeld
(00:13:55)
A word that I prefer over confidence is trust. Because I think confidence almost has a belief assigned to it, that I am this thing.
Lex Fridman
(00:13:55)
Ego.
Tal Wilkenfeld
(00:14:08)
That you believe in. Whereas trust is just simply knowing that you can get up there and handle whatever is going to come your way. And it’s more of an open feeling where it’s like, “Yeah, I could do this. Sure.” But not like, “I’m a bad motherfucker.” You know what I mean? There’s a huge difference because I’ve shared the stage with people who have a lot of confidence and it can be like a brick wall, just like fear is a brick wall.
Lex Fridman
(00:14:40)
So the brick wall is a bad thing. The thing you have with Jeff here on stage-
Tal Wilkenfeld
(00:14:44)
Is not a brick wall.
Lex Fridman
(00:14:45)
There’s no wall, just chemistry.
Tal Wilkenfeld
(00:14:46)
Yeah.
Lex Fridman
(00:14:47)
How can you explain that chemistry the two of you had?
Tal Wilkenfeld
(00:14:49)
Trust and lack of fear. Yeah, and also I will say that each individual has developed likes and dislikes over their lifetime. And that can be like in this case, we’re just talking aesthetic likes and dislikes. So in this particular case, obviously our likes and dislikes are very much aligned such that the things I do to complement him, he enjoys and vice versa. But it could be two very trusting open musicians on stage that don’t have walls up, but their choices are very different. And one person likes heavy metal and the other person likes classical. So it’s got to be both.
Lex Fridman
(00:15:33)
So you guys were good at yes-and-ing each other musically?
Tal Wilkenfeld
(00:15:37)
Definitely.
Lex Fridman
(00:15:37)
Is that where you’re most at peace in a meditative way? It’s on stage?
Tal Wilkenfeld
(00:15:46)
It used to be that it would only be on stage. It started with that. That was almost like my way into flow state and meditation was playing music. And then back in the day when I’d kind of crash after shows, I wanted to change that. I wanted to always feel like I’m in flow state.
Lex Fridman
(00:16:09)
Have you succeeded?
Tal Wilkenfeld
(00:16:10)
I’ve gotten a lot better. I’m still obviously on the journey, but yes.
Lex Fridman
(00:16:17)
So you meditate? I think you said somewhere that you meditate before shows or just in general?
Tal Wilkenfeld
(00:16:21)
I meditate every day. When I’m on tour with my band, I ask that we all meditate together for at least 20 minutes. And I don’t dictate which type of meditation. I don’t put on a guided meditation. Everyone has their own thing they want to do. Maybe someone might be praying in their head, it doesn’t matter. It’s just the idea that we all put our phones down and we all are in one room connecting energetically, spiritually, and just letting our lives go for a second. And then we walk straight on the stage and it’s always really connected. And there were a couple gigs where we ran out of time for that, and I could tell. There was a major difference in the performance.
Lex Fridman
(00:17:07)
So it both connects you and centers you, all of those things.
Tal Wilkenfeld
(00:17:11)
But then when I’m home, I love to meditate and I’ve tried various styles of meditation and studied various types of things. So I don’t do just one thing. I kind of customize it depending on where I’m at in my life.
Lex Fridman
(00:17:30)
You and the world lost Jeff Beck a year ago. You told me you really miss him. How’s the pain of losing Jeff change you? Maybe deepen your sense of the world?
Tal Wilkenfeld
(00:17:43)
It’s hard to accept that we won’t create something musically again in this lifetime. But in terms of the grief, grief was easier for me because I went through a major grief period in 2016 and 17, and that was the first time I’d really gone through the process of grief in a non-family situation with friends and mentors and people that I’d created with, which is different. It’s a different kind of connection. When my grandparents died, it’s like there was nothing left unsaid. And I was at peace with what was happening.

(00:18:40)
With this, when Prince died out of the blue in mid 2016, and then Leonard Cohen died in November, that just tore me to shreds because Leonard Cohen was not just someone that profoundly inspired me musically and lyrically, but spiritually, we had a very deep connection. And that was the basis of a lot of our conversation was spirituality. And so at that time, I felt like a piece of me went missing. And that was a very long process where I just stayed in my place and didn’t want to play a note of music. I kind of wanted to just get rid of all my stuff. So I had a friend come over and he’s like, “You should just, why don’t you come to the Comedy Store?” I’m like, “Comedy Store? What am I going to go to some store and buy clown suits? What are you talking about? What’s a Comedy Store?” He’s like, “No, no, no. The Comedy Store, the place where comedians go.”

(00:19:54)
I’m like, “Okay, well, I’ve never seen standup. I’ve seen Seinfeld on TV. That’s the extent of my standup experience.” So he took me to the Comedy Store and every single one of those comedians embraced me like I was family. It didn’t even take a day. I was part of the family and I made 25 best friends, and I ended up throwing all my stuff in storage and finding a little room to stay in where I rented my gear out and my rent paying was me loaning the gear. I didn’t want any responsibilities, financial, I just wanted to be completely free so that I could just process it and not feel like I had to commit to anything work-wise or creatively. I just wanted to unplug.

(00:20:50)
And so this was a fun and very different way to unplug, because previously I may have just gone to a monastery and spent weeks at a monastery or months, but in this case I was like, “You know what? This is a different kind of experience. I’m going to just hang out with comedians and stay in this room.”
Lex Fridman
(00:21:09)
With no responsibility, really.
Tal Wilkenfeld
(00:21:11)
Other than to really deeply connect with this grief that I’m experiencing. I’m not going to negate it. I’m going to really fully connect to it. And I did, and it was tough. And then more people in 2017 were leaving. Gregg Allman, Tom Petty. I mean, these are people, I worked with all these people and had great connections with them, and they were all going, and the world was mourning the loss of these people because of everything that they’d given to the world. They’d changed the world’s lives, not just mine because I knew them personally. And so that was also complicated. And why, for me, it was interesting to be grieving the loss of these musicians with comedians.

(00:22:04)
And I learned a lot. It changed my life. I learned to laugh at absolutely anything, everything. I mean, my grandpa had a really great sense of humor too. My grandpa was a Holocaust survivor, and he could just laugh at anything. And so I already kind of have that in me. But being around all these comedians just kind of exaggerated that for me, and that really changed things for me for the better. So then when Jeff Beck died, it was like, “Okay, I’ve got these tools. I know what this is and I’m going to go through it again, and I’m going to be on tour with Incubus in two days.”

(00:22:45)
So Mike Dirnt from Green Day, he called me up and he said, “Hey, I know you’re going through a lot.” And I said, “I don’t even know what I’m going to play. I really want a vintage jazz bass for this, and I only have a seventies one that I don’t really think is appropriate. I really need a sixties one, blah, blah, blah.” And Mike’s like, “I’m going to hook you up.” He showed up to my place the next day with a truckload of old P basses and jazz basses and brought them all into my studio, and I’m playing them.

(00:23:16)
And then I pull one out of the case and it’s Olympic White, just like Jeff Beck and I play it. And not only did I get goosebumps and started crying, but I looked over at Mike and same thing was happening, and he’s like, “I guess Jeff might be happy about this.” And he’s like, “Well, I didn’t want to let this one go. I was just trying to cheer you up a bit and maybe loan it to you for the tour, but if you really want it’s yours.” And I was like, “Oh my God, this is… Mike Dirnt is the nicest guy ever.”

(00:23:59)
So that happened. So that bass’ name is Jeff, and it’s a white jazz bass, and I played it on the Incubus tour. But yeah, I do feel like I’m more equipped to handle grief now.
Lex Fridman
(00:24:11)
Tell me about the Comedy Store a little bit more. Do you think comedians and musicians in some deep fundamental way are made from the same cloth? Are they spiritually connected somehow?
Tal Wilkenfeld
(00:24:25)
I think everyone’s connected spiritually in the same way. So I think personality wise, comedians and musicians are quite different, actually.
Lex Fridman
(00:24:38)
In what way?
Tal Wilkenfeld
(00:24:40)
Well, you’d have to subdivide even musicians into different categories too, because the thing that I appreciate about comedians is that you go to a restaurant with them and all the observational humor of, they’ll notice everything and make you laugh about it, which a really great songwriter does the same thing too. And my favorite lyricists, like Leonard Cohen, Bob Dylan, Paul Simon, Warren Zevon, they add comedy into their lyric. And so those types of people I would liken to hanging out with a comedian.

(00:25:16)
It’s very different from say somebody that is an instrumental guitarist or something like that, that they’re more focused on, whether it’s a kinesthetic thing or a physical thing or whatever it is. They’re not quite doing the observational thing in the same way. So I just appreciate, my favorite thing to do is go on and laugh, especially because I can tend to be pretty analytical and be in my head. So anything that just kind of lets me be in my heart and just enjoy life is great.
Lex Fridman
(00:25:54)
I think there’s a photo of you with Dave Chappelle on stage. What was that about?
Tal Wilkenfeld
(00:25:58)
So right after Leonard Cohen passed away, the Comedy Store threw me a birthday party. It was this crazy lineup, and it was like I’d play a song with my band, and then Jackson Browne sat in and sang a song, and then Dave Chappelle came up and said some jokes. It was one of my favorite nights ever.
Lex Fridman
(00:25:59)
Yeah.
Tal Wilkenfeld
(00:26:23)
Yeah. It was cool. It was a very healing birthday party.
Lex Fridman
(00:26:27)
Yeah, there’s something magical about that place.
Tal Wilkenfeld
(00:26:29)
Yeah.
Lex Fridman
(00:26:30)
It’s really special.
Tal Wilkenfeld
(00:26:31)
Yeah. Well, the Mothership has some magic to it too. It’s really cool. It’s different. Totally different vibe, but super awesome.

Leonard Cohen

Lex Fridman
(00:26:40)
You said that Leonard Cohen is a songwriting inspiration of yours. I saw you perform his song Chelsea Hotel brilliantly on the internet. It’s about, for people who don’t know, his love affair with Janis Joplin. How does that song make you feel?
Tal Wilkenfeld
(00:27:01)
Great. I love that song.
Lex Fridman
(00:27:03)
Which aspect? Musically, the melancholy feeling, the hopeful feeling, the cocky feeling? All of it, every single line has a different feeling to it, really.
Tal Wilkenfeld
(00:27:16)
Yeah. But as a whole piece, I appreciate it so much. I actually lived at the Chelsea Hotel, and when Leonard and I first met, that was one of the first things we talked about was that I lived there, where all that stuff went down before they tore it apart. And yeah, it is just a beautiful song.
Lex Fridman
(00:27:44)
What makes me sad is the way it ends. “I don’t mean to suggest that I loved you the best. I can’t keep track of each fallen robin. I remember you well in the Chelsea Hotel. That’s all, I don’t even think of you that often.” That line, “I don’t even think of you that often,” always breaks my heart for some reason.

(00:28:00)
How ephemeral, how short-lasting certain love affairs can be. Just kind of like, huh.
Tal Wilkenfeld
(00:28:14)
Yeah.
Lex Fridman
(00:28:14)
Do you think he meant it? I always think he’s trying to convince himself of it.
Tal Wilkenfeld
(00:28:19)
It could be both, or either. That’s the beautiful thing about poetry and lyric, is that it’s supposed to be open.
Lex Fridman
(00:28:27)
Yeah. I wonder if it’s also open to him, depending on the day.
Tal Wilkenfeld
(00:28:30)
Definitely. The thing that he taught me, or his advice to me was when you’re writing a song, look at it the next morning, just first thing, and read it. And then take a walk, smoke a joint, read it again. Go have a fight with your daughter, come back, read it again. Get drunk, read it again. Wait a week, read it again. Just so that from every state and every position, the wider the lens is going to be from an audience perspective. You want things to mean multiple things.
Lex Fridman
(00:29:12)
There’s one line I read somewhere, that he regrets putting in the song, so I’ve got to ask you about it. It’s pretty edgy. It’s about, “Giving me head on the unmade bed.” You think that’s a good line, or a bad line?
Tal Wilkenfeld
(00:29:27)
I think it’s an amazing line. It’s one of the best lines in the song.
Lex Fridman
(00:29:30)
Yeah, right?
Tal Wilkenfeld
(00:29:30)
When he put that song out, obviously he didn’t regret it, or he wouldn’t have put that lyric in the song. I think what happened was that eventually word got out, either from him or from somebody else, that the song was about Janis Joplin. And so at that point, he regretted the indiscretion. It wasn’t that he regretted how great the line was, it was just the privacy factor. But then again, Leonard’s known for rewriting his lyrics. In his live shows, you’ll see a bunch of songs where it’s like new lyrics. And he didn’t do it because he didn’t like the old lyrics, he just did it because he could, because he’s Leonard. And it’s like, why not have fun with words the way musicians have fun improvising solos on stage? And he could have changed that line in Chelsea Hotel after, in retrospect, and he never did.
Lex Fridman
(00:30:26)
“I remember you well in the Chelsea Hotel. You were talking so brave and so sweet. Giving me head on the unmade bed, while the limousines wait in the street.”
Tal Wilkenfeld
(00:30:35)
It’s so powerful.
Lex Fridman
(00:30:36)
It’s a powerful line. It just kind of shocks you.
Tal Wilkenfeld
(00:30:39)
Well, that’s what’s so great about it. Yeah.
Lex Fridman
(00:30:42)
But also heartbreaking, because it doesn’t last. Especially actually, to me it adds more meaning once you know it was Janis Joplin. It’s like, okay, these two stars collided for a time.
Tal Wilkenfeld
(00:30:54)
Yeah, but why is it heartbreaking? It could also be just beautiful that they had a little fling.
Lex Fridman
(00:31:00)
Yeah, everything is beautiful.
Tal Wilkenfeld
(00:31:02)
Thank you.
Lex Fridman
(00:31:03)
Even the dark stuff. What’s not beautiful? Everything is beautiful, if you look long enough and deeply enough. What were we saying? Oh, what do you think about Hallelujah? What do you think about the different songs of his, and why’d you choose Chelsea Hotel to perform?
Tal Wilkenfeld
(00:31:22)
Because I lived there, and it meant something to me to sing that song. And actually when I put that song out on YouTube, that’s when he sent me an email. He’s like, “Hey, do you want to come over?”
Lex Fridman
(00:31:37)
Nice. This is how you guys connected?
Tal Wilkenfeld
(00:31:39)
No, we met in a rehearsal studio. I ended up watching their whole rehearsal, and sitting there next to Roshi, his 105-year-old monk, which was really great. I remember when I was shaking his hand, it was just me and Roshi on the couch watching Leonard with this band. And we are shaking hands, and he grips my hand like this, doesn’t let it go. And he looked in my eyes, he said, “Where are you?” And I said, “In the handshake.” He says, “Yes.”
Lex Fridman
(00:32:13)
Wow. You passed the test.
Tal Wilkenfeld
(00:32:15)
Passed the Roshi test. And then what’s funny was that the next thing that happened about five minutes later, was Leonard Cohen got down on his knees and opened up a jar, I’m not kidding you, of caviar. This is not a callback.
Lex Fridman
(00:32:28)
Well, it is in a way. In a deep, fundamental way.
Tal Wilkenfeld
(00:32:32)
I know, I know. He started feeding the monk caviar, and that healed my Montreux Jazz Festival sadness forever. The end.
Lex Fridman
(00:32:41)
Do you think there’s a kind of weird, there’s a sense of humor to it all somehow? Why does that happen? Why does stuff like that happen, or why does the Jeff bass speak to you?
Tal Wilkenfeld
(00:32:57)
Why do we need to know?
Lex Fridman
(00:32:59)
You believe in that stuff?
Tal Wilkenfeld
(00:32:59)
In what stuff?
Lex Fridman
(00:33:01)
That there’s a rhyme to the whole thing, somehow? There’s a frequency to which magical things of that nature can happen?
Tal Wilkenfeld
(00:33:19)
I’m divided about that answer. Because I think just things are flowing, I don’t think anything’s planned out.
Lex Fridman
(00:33:32)
Like through time, it’s like an orchestra playing of different experiences and circumstances that are somehow connected.
Tal Wilkenfeld
(00:33:40)
I think everything’s connected, so yes.
Lex Fridman
(00:33:43)
But predetermined means-
Tal Wilkenfeld
(00:33:45)
I don’t believe in the predetermined stuff necessarily, which is different from whatever your previous karma is. And karma is a whole other conversation, I don’t mean karma as in good karma, bad karma. Just karma meaning the collection of things you’ve acquired over this lifetime or other lifetimes. Just whatever that is, is going to influence your future.
Lex Fridman
(00:34:13)
Well, you had a really interesting trajectory through life. Maybe I just read it that way, because I’ve had a lot of stuff happen to me that’s lucky, feels lucky. And sometimes I’ll wonder, huh, this is weird. It does feel like the universe just kind of throws stuff at you with a chuckle. I don’t know. Not you, the proverbial you. One.
Tal Wilkenfeld
(00:34:37)
One, yeah.

Taxi Driver

Lex Fridman
(00:34:40)
You said you sometimes watch classic movies to inspire your songwriting, and you mentioned watching Taxi Driver. I love that movie. And I think you mentioned that you wrote a love song based on that movie. So Travis Bickle, for people who don’t know, is a taxi driver and he’s deeply lonely. What do you think about that kind of loneliness?
Tal Wilkenfeld
(00:35:02)
I think that loneliness is a product of feeling separate from the world, and separate from others. And that the less you experience that separation, the less you’ll feel lonely.
Lex Fridman
(00:35:20)
How often have you felt lonely in this way, separated from the rest of the world?
Tal Wilkenfeld
(00:35:25)
It’s less and less every single year. Because I work very hard at it.
Lex Fridman
(00:35:34)
Feeling like a part of the world?
Tal Wilkenfeld
(00:35:37)
Yeah, just meditating and studying scriptures.
Lex Fridman
(00:35:40)
Don’t you think that, isn’t there a fundamental loneliness to the human experience?
Tal Wilkenfeld
(00:35:45)
In what sense?
Lex Fridman
(00:35:46)
That all the struggles, all the suffering you experience is really experienced by you alone?
Tal Wilkenfeld
(00:35:51)
Is it?
Lex Fridman
(00:35:53)
Maybe at the very bottom, it’s not.
Tal Wilkenfeld
(00:35:55)
It’s kind of all the same stuff.
Lex Fridman
(00:35:57)
You didn’t feel alone in 2016, 2017?
Tal Wilkenfeld
(00:36:02)
I felt like I lost a piece of myself that I had given to somebody else. And I feel like people feel that in romantic exchanges, whether it’s long-term, short-term. You give a piece of yourself, and then if that person dies or you break up with that person, you feel like you’ve lost that piece of yourself. Which I feel like is a very different experience than if you just are opening yourself. Rather than giving a piece of yourself, you’re just opening yourself to somebody or something.
Lex Fridman
(00:36:39)
So opening is fundamentally not a lonely experience.
Tal Wilkenfeld
(00:36:43)
No, it’s a loving experience.
Lex Fridman
(00:36:45)
And then losing a piece of yourself can be.
Tal Wilkenfeld
(00:36:50)
Yeah. Because you can’t lose a piece of yourself, if you are the same self as every other self.
Lex Fridman
(00:36:57)
Right, right. If you see yourself as together with everybody, then there’s no losing.
Tal Wilkenfeld
(00:37:01)
Yeah.
Lex Fridman
(00:37:02)
Yeah, yeah. It’s a beautiful way to look at it. You said that there’s something healing about being in an empty hotel room, with no attachments except your suitcase. A lot of people will talk about hotel rooms being a fundamentally lonely experience, but you’re saying it’s healing.
Tal Wilkenfeld
(00:37:23)
It’s healing. Yeah. Because I just get to sit there, and not worry about all this stuff, these meaningless attachments. I’ve got my suitcase with my necessities, or my three suitcases sometimes. And I can just sit there and meditate, and just be with myself, and it’s so awesome. And usually you plan your touring for, you get the business aspect of things taken care of in advance, so you can just really be flowing day to day on a tour. And it’s a great feeling. It’s funny because this last tour that I did, we didn’t have hotels every night. We had hotels maybe once a week. And I hadn’t done that before. Usually I’m frequently in hotels. I didn’t get that space that I’m really used to getting.
Lex Fridman
(00:38:18)
You missed them.
Tal Wilkenfeld
(00:38:20)
I very much missed it, and had to be very creative. And I ended up going into the back lounge when everyone was asleep, and meditating back there, or before everyone woke up. And I actually joined, there was an online meditation retreat that was happening. It was 12 hours a day of silent meditations that happens once a year, and I love this particular group of people. And they knew I was on tour, so they’re like, “Just join when you can.” And so I was on the tour doing the meditation retreat at the same time. It was so fun. It was so fun. Because I was in the back lounge, the bus is moving around like this, my laptop, the Zoom is like… and I’m just sitting meditating. It was like, yeah, this is the shit.
Lex Fridman
(00:39:12)
It’s silence, so they’re all connected to Zoom and just doing silent 12 hours a day?
Tal Wilkenfeld
(00:39:16)
Yeah, yeah.
Lex Fridman
(00:39:16)
That’s cool.
Tal Wilkenfeld
(00:39:17)
These particular retreats that I started doing, it’s not straight silent. There are silent sits every hour for 50 minutes, and then there’s some talks. And these people that I’ve been working with are really cool, because they’re integrating spiral dynamics into Zen, and it’s like the coolest combination.
Lex Fridman
(00:39:43)
What’s spiral dynamics?
Tal Wilkenfeld
(00:39:45)
Like Ken Wilber? Do you know Ken Wilber, Integral Theory?
Lex Fridman
(00:39:49)
Yes. Can you explain a little bit? I vaguely know of him because of this notion that everything is one, everything is integrated, that every field has truths and falsehoods, and we should integrate the truths.
Tal Wilkenfeld
(00:40:07)
Yeah. It’s hard to explain how it applies to this type of meditation, because it’s in the guided parts of the meditation that this whole holonic theory is brought in, about transcending and including every aspect of your being. Because he talks about levels of development in consciousness, and how this applies to every single religion or non-religion, that there are these levels of development, that go all the way up to enlightenment. No matter what you start off with. It could be Christianity, Buddhism, Vedanta, it doesn’t matter, anything.

(00:40:57)
I like it when everything and everyone is taken into account. It doesn’t matter where you’re coming from, that there is a way to be self-realized, self-actualized. There are self-actualized beings from all walks of life with very, very different paths. There’s no one path. In this particular retreat I do, there’s a lot of silent sits, and then there’s some guided meditations. But I’ve tried a lot of different avenues, and they’re all great. I wouldn’t just say, just try this one thing. I’ve studied the Upanishads with Vedanta teachers, and gone through those texts for months and months, and stayed at monasteries. And how they break it down makes total sense to my mind and heart. And more importantly than my mind, my inner knowing, it resonates.
Lex Fridman
(00:41:49)
Inner knowing.
Tal Wilkenfeld
(00:41:50)
Yeah, because your mind is the thinking tool. It’s not you, you’re not your mind, you’re not your thoughts, you’re not your body. It’s like, just the you, that knowing that you have. When something resonates there, that’s usually when you go with something.
Lex Fridman
(00:42:12)
What was living in a monastery like?
Tal Wilkenfeld
(00:42:14)
It’s the best.
Lex Fridman
(00:42:15)
What are we talking about?
Tal Wilkenfeld
(00:42:17)
It’s just an empty room, with a tiny single bed, and a sheet and a pillow, and that’s it.
Lex Fridman
(00:42:22)
That’s it?
Tal Wilkenfeld
(00:42:23)
You have to eat the same thing as everyone.
Lex Fridman
(00:42:25)
What’s the food like? What is it?
Tal Wilkenfeld
(00:42:27)
Very plain, cheap, basic food. Which is funny for someone like me, because I’m pretty particular about my diet.
Lex Fridman
(00:42:36)
Yeah, you brought over like 20 different ingredients.
Tal Wilkenfeld
(00:42:41)
Yeah.
Lex Fridman
(00:42:43)
What was the day in the life of Tal in a monastery?
Tal Wilkenfeld
(00:42:48)
You wake up at 5:00 a.m. to the bell, and you go and meditate constantly until bedtime. Other than two meals.
Lex Fridman
(00:43:00)
How are you sitting? Are you in a group? Is there other people there, and you’re just sitting there?
Tal Wilkenfeld
(00:43:06)
Well, if you’re talking about the Zen monastery, because I stayed in Zen monastery, and I did a thing with the guy I was telling you about, the integral Zen thing where he uses Ken Wilber’s work in combination with Zen. That’s a little bit different, because he does talks, we talk about things. That’s very separate from the Vedanta monasteries I’ve stayed at, which there’s very little meditation in terms of sitting silently. Instead, we are meditating on the scriptures, like the Upanishads, and we’re diving into that.
Lex Fridman
(00:43:46)
What were the differences, the takeaways from the experiences? The two different, the integral one and the meditating on the scriptures?
Tal Wilkenfeld
(00:43:56)
They’ve both been incredibly helpful to me. Because with the Vedanta, anytime I go into my head about something, the answer is there, based on this knowledge. And with the Zen monastery, it’s like you just got to put your butt in the seat, and sit and wait. And maybe something will happen, maybe it won’t, but just keep sitting. And it’s very disciplined, and you go through a lot. Your body’s purging a lot. There’s a lot, and you don’t necessarily have the answers as to what is happening. And so I think for somebody like me, I need both. I need to be in a place where there’s complete uncertainty, but complete discipline, and just doing the regimented thing. And then there’s the me that feels very satisfied from an analytical standpoint, understanding what’s happening, what is the gross body, and the subtle body? I want to understand these things about what it is to be a human. I like them both.
Lex Fridman
(00:45:15)
Understand what it means to be a human, so having that patience and just sitting with yourself helps you do that?
Tal Wilkenfeld
(00:45:22)
Yes. More so the analysis part.
Lex Fridman
(00:45:26)
Oh, so the analysis, the actual… okay, got it.
Tal Wilkenfeld
(00:45:29)
But sitting with yourself, there’s no better education of facing every demon. And it’s all going to come out, and it’s not going to be pretty. But then there’s things that happen on the other side of it that are so profound.
Lex Fridman
(00:45:45)
Have you met most of your demons?
Tal Wilkenfeld
(00:45:48)
I’ve met the demons that have come out.
Lex Fridman
(00:45:50)
Oh, there may be more?
Tal Wilkenfeld
(00:45:51)
Who knows? Yeah.

Songwriting

Lex Fridman
(00:45:53)
Okay. Well, to be continued. Since I think I heard you say that you wrote a love song after Taxi Driver, what kind of love songs do you write more of? You’re a songwriter first, for people who don’t know. They might think you’re primarily a bassist.
Tal Wilkenfeld
(00:46:14)
But they’re wrong.
Lex Fridman
(00:46:16)
Do you write mostly broken heart ones, or hopeful love songs? In love songs, about to be in love songs, soon to fall in love songs?
Tal Wilkenfeld
(00:46:27)
Well, the last album I put out is pretty self-explanatory as to what that is.
Lex Fridman
(00:46:31)
A lot of pain in that one?
Tal Wilkenfeld
(00:46:34)
There was, yeah. Some of it was storytelling, and some of it was real experience, and it’s always a combination of things. I serve the song. Sometimes you use your own life experience to tell a song, and sometimes you may watch a movie, and part of that script merges with your own experience, and that tells the right story for the point you’re trying to make in the song. It varies from song to song in terms of how autobiographical it is.
Lex Fridman
(00:47:14)
Yeah. I always think at the end of the Taxi Driver, when… what’s her name, Betsy? Because Travis becomes a hero, she tries to get with him, and he rejects her. That was powerful.
Tal Wilkenfeld
(00:47:28)
My favorite love songs are the ones where you’re not sure if it’s about romantic love, or love of God, or love of life, or just pure love. I was thinking George Harrison writes songs like that, What Is Life. Or Bob Dylan’s song that George Harrison covered, If Not for You.
Lex Fridman
(00:47:54)
Yeah, just grateful. Grateful for his love. Yeah.
Tal Wilkenfeld
(00:47:57)
Right, right. That’s kind of like what I’m experiencing now, and so who knows what’ll end up coming out.
Lex Fridman
(00:48:05)
So you’ve been writing this kind of-
Tal Wilkenfeld
(00:48:07)
Yeah, I’ve been writing.
Lex Fridman
(00:48:09)
A little bit?
Tal Wilkenfeld
(00:48:10)
I don’t have an intention of putting something out in any particular timeframe, but I’m just writing and letting things flow. And yeah, there’s a bunch of Leonard Cohen songs too where you’re like, there’s so many ways to interpret this song. There’s so many ways. I just love songs that aren’t so specifically about one thing.
Lex Fridman
(00:48:39)
I really love the song to play it, to listen to it, Wonderful Tonight by Eric Clapton. And I thought it was pretty straightforward. And then I had a conversation with Eric Weinstein, who’s a mutual friend of ours, and he told me it’s not about what I thought it’s about.
Tal Wilkenfeld
(00:48:54)
Oh yeah, what did he say?
Lex Fridman
(00:48:57)
It’s a more complicated story. Wonderful Tonight is a story about a man just finding his wife beautiful, and appreciating it throughout. But he said it was actually a man missing his wife, he’s imagining that she’s lost because of the decisions he’s made in his life, so it’s pain. He had a long, beautiful Eric Weinstein-like explanation of why.
Tal Wilkenfeld
(00:49:28)
I love those.
Lex Fridman
(00:49:29)
Have you and Eric played music?
Tal Wilkenfeld
(00:49:32)
No. We’ve just hung out and had very long conversations about everything.
Lex Fridman
(00:49:37)
He’s a bit of a musician, you know?
Tal Wilkenfeld
(00:49:38)
Yeah.

How to learn and practice

Lex Fridman
(00:49:39)
Okay. You picked up the guitar when you were 14, let’s go back. And one interesting thing that just jumped out at me is you said you learned how to practice in your head, because you only had 30 minutes. Your parents would only let you practice for 30 minutes. I read somewhere that Coltrane did the same. Not the practice part, but he was able to play instruments in his head as a way to think through different lines, different musical thoughts, that kind of stuff. Maybe, can you tell the story of that?
Tal Wilkenfeld
(00:50:14)
Yeah. I just grew up in an environment that was focused on academia. And I fell in love with guitar, and really just wanted the focus to be that. My limit was 30 minutes a day for, I don’t even remember how many times a week. Might’ve been every day, five days a week, whatever.
Lex Fridman
(00:50:36)
So your parents didn’t want you to play more than that?
Tal Wilkenfeld
(00:50:39)
No. And so, I just learned how to visualize the fretboard in my head, and I’d practice all day in my head. It’s kind of like, you know The Queen’s Gambit, the TV show with Anya Taylor-Joy, where she visualizes the chessboard on the ceiling? I used to do that with the fretboard, and just practice. And I actually recommend it to every musician. Because if you’re just practicing here, you don’t know what is more dominant necessarily, is it this or is it your motor skills? If you just take that away and do it here, you know you’ve got it. I’m glad that that happened and that I learned how to do that.

(00:51:24)
And in terms of learning fast, because I had to try to absorb a lot of information in a short amount of time when I did have the instrument, I kind of would do things in bursts. Even in that half an hour, I would just play for a couple minutes, and then I’d stop for a minute. And then I’d do it again, and I noticed there was a huge difference between the first time and the second time. Whereas if I just kept repeating stuff, it would be much slower.
Lex Fridman
(00:51:56)
What did you do in that minute?
Tal Wilkenfeld
(00:51:59)
Just hang out.
Lex Fridman
(00:52:00)
Just integrate?
Tal Wilkenfeld
(00:52:01)
Yeah. It’s like my brain was telling me, just chill out for a sec. That’s enough information. Let me take a second to integrate that. That’s at least what it felt like to me. And the most hilarious thing happened a couple months ago. I know you’re friends with Andrew Huberman. He put out some clip, which was a part of one of his podcasts, about learning. And he said that there was some research done on learning fast, and that if you practice something for a minute or so, and then you let your brain rest for 30 seconds or a minute, that in that 30 seconds or a minute, your brain does the repetition 20 to 30 times faster, and in reverse. And I was like, whoa, that’s so cool. Because that’s what I used to do when I was a kid, now there’s science that proves that. Which is really cool for musicians to know that that’s a good way to practice efficiently. Because some musicians, they’re practicing for six, seven, eight hours a day. I’ve never done that. I’ve never practiced more than an hour a day, even now. That’s my technique, and it works.
Lex Fridman
(00:53:17)
Are you also practicing in your head sometimes?
Tal Wilkenfeld
(00:53:20)
Now, I’m not practicing as much. I’m more always writing songs in my head, so that’s why I like silence. That’s why I love being in the empty hotel room and being alone. Songs come to me while I’m showering, or walking around, doing the dishes. Or occasionally when I’m hanging out with friends, or comedians, and people will just say shit. And I’ll be like, that’s a cool line. Just jot it down on my phone.
Lex Fridman
(00:53:46)
So it’s not always musical, it’s sometimes lyrical.
Tal Wilkenfeld
(00:53:48)
It’s more lyrical than musical now. Because for me it’s like, well, there’s so much music in the world. If I’m going to write a song, I want the song to be about something interesting. And so, yeah, the words matter to me.
Lex Fridman
(00:54:07)
Yeah. And the right word has so much power. It’s crazy, like we said with Leonard Cohen. And then they’re often simple, the really powerful ones are simple.
Tal Wilkenfeld
(00:54:18)
And when you mentioned Hallelujah, he wrote like 80 verses to Hallelujah before he narrowed it down to four. And it took him like 15, 20 years to write that song. Some writers will do that, and then other writers just vomit it out and it’s beautiful. I’ve heard that Bob Dylan or Joni Mitchell, they’re fast writers. It just kind of comes out.
Lex Fridman
(00:54:41)
That makes me feel so good to know Leonard Cohen wrote so many verses of that. That was so deliberately crafted, extensively, rigorously crafted.
Tal Wilkenfeld
(00:54:53)
He just would spend months and years, constantly refining, refining.
Lex Fridman
(00:55:00)
Do you have songs like that for yourself, where you refine for many years?
Tal Wilkenfeld
(00:55:03)
Yeah, it’s song dependent. Some just flow out and it’s like, oh, there it is. Everything’s there. And then other songs, it’s like, you might have started it with music, and there’s some words that come out. And then trying to fill in the rest of the words, sometimes it can be like a square peg in a round hole, and other times it’s like, oh no, I can… it depends. Sometimes it becomes like a math problem, and hopefully it doesn’t. Because you just want to say what’s right for the song. And usually when you write it all together, like the lyric, and the melody, and the chords and everything’s developing at once, at least for the first draft, that’s very, very helpful. Sondheim used to write like that. He wouldn’t move on until… he would just go this way. Whereas for me it’s just like, I’ll just go with what seems to be coming naturally, and I’ll just let it be what it is.
Tal Wilkenfeld
(00:56:00)
And then you come back and you say, okay, well what do I have to do to this now? What’s needed?
Lex Fridman
(00:56:07)
Just to linger on the learning process, what would you recommend for young musicians on how to get good? What are the different paths a person can take to understand it deeply enough to create something special?
Tal Wilkenfeld
(00:56:26)
I think first and foremost, understanding why you are playing music. If it’s because you have something that you’re trying to express or that you’re just in love with expression itself, with art itself, those are great reasons to start this journey.
Lex Fridman
(00:56:47)
The why should be-
Tal Wilkenfeld
(00:56:49)
I think the why is really important because it’s a jagged lifestyle and there’s a lot in it. And so if you don’t have your purpose, if you’re not centered in your purpose, then all that jagged lifestyle is probably going to get to you.
Lex Fridman
(00:57:06)
Jagged.
Tal Wilkenfeld
(00:57:06)
It’s jagged.
Lex Fridman
(00:57:07)
Interesting word.
Tal Wilkenfeld
(00:57:08)
Yeah, it’s jagged. It’s all over the place. It’s uncertain. It’s one thing one moment, and a completely different thing another moment. You never know what’s going to happen. And if you thrive on variety, which I love variety, then it’s perfect. But also every human being needs a certain amount of certainty and structure, and so the certainty can come from your inner knowing, knowing that you’re doing exactly what you want to be doing and knowing what your purpose is in doing it, in this expression. Otherwise, you’re just kind of like a leaf blowing in the wind.
Lex Fridman
(00:57:48)
In the early days touring, just playing clubs seems like tough.
Tal Wilkenfeld
(00:57:52)
Yeah.
Lex Fridman
(00:57:53)
It’s a lot.
Tal Wilkenfeld
(00:57:54)
Yeah, it’s a lot. The physical labor aspect of it is really hard. Playing on stage to two people, or 2000, or 20,000, that doesn’t make a difference. I mean, it makes a difference to the ticket sales, which informs what level of luxury you might have on the road or not. But other than that, it’s just people there listening to music. The music doesn’t change.
Lex Fridman
(00:58:19)
Does it make it tough when it’s two people versus 200?
Tal Wilkenfeld
(00:58:21)
No.
Lex Fridman
(00:58:23)
So even if nobody recognizes whatever the thing you’re doing.
Tal Wilkenfeld
(00:58:26)
No, because the idea is to be having a great conversation on stage.
Lex Fridman
(00:58:33)
The audience can come and go.
Tal Wilkenfeld
(00:58:35)
Yeah. I always, there’s certain points in shows where I am just like, I consciously am like, oh yes, there’s an audience over there. So wrapped up in whatever’s happening on stage.
Lex Fridman
(00:58:49)
You forget yourself.
Tal Wilkenfeld
(00:58:50)
Or maybe I’m remembering myself.
Lex Fridman
(00:58:52)
Oh, damn. Call back, somehow feels like one. Okay. You think every instrument is its own journey. You play guitar, you play bass, you sing, just the mastery of an instrument, or let’s avoid the word mastery, the understanding of an instrument is its own thing, or are they somehow physical manifestations of the same thing?
Tal Wilkenfeld
(00:59:19)
It’s both. Every instrument has its strengths, beauty, limitations, range, possible range that can be extended to some degree or another depending on who you are, like trumpet or something. Certain people can hit higher notes than others, blah, blah, blah. But that being said, we’re all playing the same 12 or 24, however you divide the octave, that many notes. We’re all playing the same notes. So in that sense, it’s all the same thing. It’s just music or better yet it’s just art or expression. But yeah, every instrument has, you’ve got to go through the physical aspects of it, the motor skills and all of that, and hopefully you get through that really quickly so you can get to the expression quickly because if you get stuck in just that first phase, that’d be really boring.
Lex Fridman
(01:00:19)
But that’s a pretty long phase. The technical skill required to really play an instrument.
Tal Wilkenfeld
(01:00:27)
For some people it’s a long thing, and some people it’s short. It very much varies. It might have to do with how you learn and getting to know your strengths in learning. More aural, or is it more… What’s your strength, and playing off of those strengths. So for me, like I was saying earlier, it was just an intuitive thing that I knew. I can feel when my brain is full that it needs processing time. And so I listened to that. I don’t push past it, even if it’s one minute and I do something, I’m like, okay. Silence. And then I come back and I trust that it’s going to be there and it is there. So just trusting yourself I think is really important. Trusting that you know better than anybody else is going to know you.

(01:01:23)
So that’s the kind of thing with teachers that can be either really, really helpful and great or really not great. I’m primarily self-taught. I’ve had amazing mentors of all walks of life, and I think I’m unbelievably blessed that my mentors are some of my favorite musicians on Earth, whether it’s Leonard Cohen or Jeff Beck or Wayne Shorter, whoever these people are, they are my favorite musicians. So not everyone has that opportunity, but the opportunity that we have now that I didn’t have when I was starting is that everything’s on YouTube. Every interview with every genius. You don’t need to necessarily have these people in person now. I mean, and then I’ll say to that, yes and no. I agree with myself, and then I don’t agree with myself. And the reason is I do believe that there is something that happens when you’re in person with a master in some cases, that there is something transferred that is not intellectual, it’s not spoken, it’s something else that happens, that can happen, that I’ve experienced, and I really value that.
Lex Fridman
(01:02:47)
And I think that applies to specific disciplines and also generally. I’ve been around Olympic gold medalists just to hang out with them for several days, and there’s something about greatness. There’s a way about them that permeates the space around them. You kind of learn something from it, even if you don’t practice that particular discipline, there’s something to it if you’re able to see it. I also like what you said about the playing stuff in your head, that it forces you to not be lost in the physical learning of the instrument. I think that’s one of the things I probably regret a little bit. So I play both piano and guitar, and I’ve become quite, over the years, technically proficient at the instruments.
Tal Wilkenfeld
(01:03:43)
I’ve seen.
Lex Fridman
(01:03:43)
But I think my mind is underdeveloped because of that, meaning I can’t really… I can feel the music when it’s created, but I can’t create out of the feeling. I haven’t practiced projecting the feeling onto the music. You know what I mean? I’m not like a musician. It’s a different muscle that I think is if you really want to create beautiful things, you have to, the creation happens here, not with your hands.
Tal Wilkenfeld
(01:04:17)
I think it’s more here.
Lex Fridman
(01:04:18)
Or whichever it is, some part of the body, but it’s not with your fingers.
Tal Wilkenfeld
(01:04:21)
Yeah, because I think the fingers is more of this.
Lex Fridman
(01:04:22)
Sure.
Tal Wilkenfeld
(01:04:24)
And then…
Lex Fridman
(01:04:25)
Yes, it is here.
Tal Wilkenfeld
(01:04:27)
Yeah.
Lex Fridman
(01:04:27)
Right. And it’s just nice that you said that because it’s really good advice if you want to create.
Tal Wilkenfeld
(01:04:34)
Yeah, slowing down is really great too.
Lex Fridman
(01:04:38)
What do you mean slowing down?
Tal Wilkenfeld
(01:04:40)
Slowing everything down? It could be, I can play something really fast, but I may want to practice it like…
Lex Fridman
(01:05:09)
Go slow as possible.
Tal Wilkenfeld
(01:05:12)
All these micro movements that are happening that if you just go, you can’t pay as close attention to the exact tone that you’re pulling from each note. And there’s a lot to pay attention to how my fingers are touching the string here. I can change my tone a million ways just by the direction of this finger, and same with how this lands and how hard I’m attacking the string and with what intention am I hitting the string emotionally, physically, and so even if you can go, play that so slow, see how locked into a pocket you can be, see how you… Feel every aspect of that because then when it gets sped up, it’s still there with you.
Lex Fridman
(01:06:07)
That is brilliant.
Tal Wilkenfeld
(01:06:08)
It’s like the transcended and included thing that Ken Wilber talks about.
Lex Fridman
(01:06:14)
I guess that’s what meditation can do for you is to really listen, to observe every aspect of your body, the breath and all this. Here you’re observing every element, every super detailed element, of playing a single note.
Tal Wilkenfeld
(01:06:26)
Yeah.
Lex Fridman
(01:06:27)
It’s cool that if you speed it up, it’s still there with you.
Tal Wilkenfeld
(01:06:30)
It is. Yeah, it is. Because there are certain people, it’s like they play really fast, but I don’t hear the fullness of tone always. And it’s like, well, it’s probably because maybe they didn’t slow it down and really sit with each note and let it resonate through their whole being. It’s spiritual. It’s like a spiritual expression. It’s not a sport. A lot of people treat music like a sport.
Lex Fridman
(01:07:04)
Since starting to learn more, like Stevie Ray Vaughan versus Jimi Hendrix, I would spend quite a long time on single notes, just bending, just listening to what you can do with bends. Just thinking people like B.B. King and all these blues musicians spend a career just making a single note cry. There’s an art form to that.
Tal Wilkenfeld
(01:07:28)
Yeah.
Lex Fridman
(01:07:29)
And I think you putting it, taking it really slow, which I never really thought of, is a really good idea. Really slow it down.
Tal Wilkenfeld
(01:07:39)
It’s the same with sitting with your own emotions. It’s like when emotions are overwhelming to us, we get real busy or we move real fast because we don’t want to feel our feelings. Those are the moments to slow yourself down.
Lex Fridman
(01:07:57)
And observe it, anger, jealousy, loneliness.
Tal Wilkenfeld
(01:08:01)
And just be with it. Be cool with it. Love it. Love the anger.
Lex Fridman
(01:08:06)
It’s all beautiful. Can you educate me on the difference between bass.

Slap vs Fingerstyle

Tal Wilkenfeld
(01:08:13)
Bass and bass? Okay, well, one is a fish.
Lex Fridman
(01:08:16)
At least I pronounced it correctly. That’s good. It’s all about the bass.
Tal Wilkenfeld
(01:08:20)
Can you pronounce my name?
Lex Fridman
(01:08:22)
Tal.
Tal Wilkenfeld
(01:08:23)
Wow. Most people say Tal or tall. You said-
Lex Fridman
(01:08:29)
Tall, who says tall?
Tal Wilkenfeld
(01:08:31)
So many people.
Lex Fridman
(01:08:32)
In the south, maybe tall.
Tal Wilkenfeld
(01:08:34)
I don’t know. But the fact that you said my name right.
Lex Fridman
(01:08:34)
Oh, honey tall.
Tal Wilkenfeld
(01:08:36)
You get extra points.
Lex Fridman
(01:08:37)
Tal. I didn’t know this was a game. Am I winning?
Tal Wilkenfeld
(01:08:41)
Yep.
Lex Fridman
(01:08:41)
I like winning. How do you play the bass? What’s the difference between finger style and slap?
Tal Wilkenfeld
(01:08:48)
Slap is like this, finger style is like this.
Lex Fridman
(01:08:50)
Have you ever played bass with a pick?
Tal Wilkenfeld
(01:08:52)
Yeah, sometimes
Lex Fridman
(01:08:54)
I’m not accusing you of anything.
Tal Wilkenfeld
(01:08:55)
No accusation taken.
Lex Fridman
(01:08:57)
I don’t know if these are sensitive topics.
Tal Wilkenfeld
(01:08:59)
That would be pretty hilarious if I was sensitive about bass techniques, but not about love.
Lex Fridman
(01:09:05)
It just looks so cool to slap it, and I don’t understand what that’s about. That thumb thing that…
Tal Wilkenfeld
(01:09:11)
Yeah, I slap less, a lot less. Almost never, actually. It has a very distinctive sound and does a very distinctive thing to a song that is not something I hear needed very often in music today, but in certain styles, like funk, it sounds awesome and it makes sense. It was something that was a bit overused at one point. For instance, my mentor Anthony Jackson, he refused to slap. He actually said, if you want me to slap, I’ll leave this gig. So I’m not like that.
Lex Fridman
(01:09:56)
See, that’s why I said sensitive. See, I was reading into it.
Tal Wilkenfeld
(01:09:59)
Because he’s sensitive about it. I’m not sensitive.
Lex Fridman
(01:10:00)
I was feeling the spiritual energy of the sensitivity of the topic.
Tal Wilkenfeld
(01:10:03)
Anthony Jackson.
Lex Fridman
(01:10:04)
Anthony Jackson.
Tal Wilkenfeld
(01:10:04)
And then I’m playing electric bass, so generally speaking, you don’t particularly want to hear electric bass on straight-ahead jazz anyway, you want to hear an upright bass. But if I was to play jazz on electric bass, I might even palm mute instead of going like, I might go to… Anything to make the notes shorter and less resonant and fade away, because the upright does that naturally. And I have a different bass, like a hollow-body Harmony, that sounds closer to an upright that I’ll use. On my song Under the Sun, that I put out, that was on a Harmony bass. And it has an upright acoustic kind of tone to it, but with more sustain.
Lex Fridman
(01:10:58)
And is Jazz fusion the style where you have an electric bass? Can you educate me?
Tal Wilkenfeld
(01:11:06)
Again, you can have both. You can have both. You can have either on anything. There’s no real rules, now.
Lex Fridman
(01:11:14)
I’ve heard you say something interesting, which is, well, a lot of things you say is interesting.
Tal Wilkenfeld
(01:11:17)
Just one thing.
Lex Fridman
(01:11:20)
Just one. That-
Tal Wilkenfeld
(01:11:23)
And it’s what time you’re leaving.
Lex Fridman
(01:11:27)
What time was that again?
Tal Wilkenfeld
(01:11:29)
Three minutes.
Lex Fridman
(01:11:30)
That it’s maybe easier sometimes to define a musical genre by the don’ts than the do’s. What are the don’ts of jazz and rock? What are the don’ts of jazz fusion? In any domain of life, what are the don’ts?
Tal Wilkenfeld
(01:11:50)
The don’ts is just to please leave your fear at the door and your do’s is to be open to anything and open your ears, respond to what’s happening now. I think that quote you’re talking about might have been more about an individual musician’s unique sound, because everyone has their sound. If they’ve developed their voice and they’ve listened to their own aesthetic preferences, of which everyone is slightly different, everyone has slightly different likes and dislikes, then you’ll have a unique sound on your instrument. And your unique sound is defined more by the choices you make rather than… I mean, it’s equally as defined by the choices you make and the choices you don’t make. I mean, it’s the flip side of the same coin, really?
Lex Fridman
(01:12:46)
Yeah. There’s certain musicians you can just tell. It’s them just, you hear a few notes and you’re like, okay, it’s them. Tone, sometimes it’s tone. Sometimes it’s the way they play a rhythm.
Tal Wilkenfeld
(01:12:56)
Yeah, the quote you’re talking about might have even had to do with someone’s real limitations on an instrument that then that would define their sound as the things that they actually can’t do versus what you’re choosing to do versus not choosing to do. Which is that flip side of the same coin thing,
Lex Fridman
(01:13:14)
How many fingers you play with, because it seems like a lot of the greatest musicians aren’t technically perfect. The imperfections is the thing that makes them unique and where a lot of the creativity comes from. I mean, Hendrix had a lot of those things. The way he put a thumb over the top.
Tal Wilkenfeld
(01:13:34)
Well, his hands were huge. There was no other place for the thumb to go. And it was great that he could reach the E string and that was an advantage.
Lex Fridman
(01:13:43)
And he was a lefty playing a right-handed guitar, flipped, I guess. That’s weird. That probably doesn’t have much of an effect. Maybe a spiritual one. I don’t know.
Tal Wilkenfeld
(01:13:55)
Actually, flipping a guitar is different. It does bring out something different in you, because I’ve done it, flipped it. It’s like, oh wow. Yeah, it really, it’s really different. I remember talking to an osteopath, because there’s so much weight on this shoulder while I’m playing all the time, and they were saying, well, just after shows, just literally turn it upside down and do the exact same thing in the opposite way. It’ll even out your body. And I was like, that’s good advice.
Lex Fridman
(01:14:24)
Have you actually tried it? Okay. All right, I’ll write that down. All right. Well, do you know a guy named Davie504?

Davie504

Tal Wilkenfeld
(01:14:36)
I’ve heard of him.
Lex Fridman
(01:14:37)
I’ve recently learned of him. He’s a YouTuber and a bass player. He’s amazing.
Tal Wilkenfeld
(01:14:42)
Cool.
Lex Fridman
(01:14:42)
He combines memes and also just these brilliant bass compositions and says slap like a lot. He’s big into slapping. He’s the one that made me realize this is a thing. And he also said that you’re one of the best, if not the best, bassists in the world. There was a bunch of his fans that wrote in, and he analyzed the Jeff Beck thing that we watched at Crossroads, called it one of the greatest solos ever, bass solos ever. So shout out to him. What does that make you feel, like you’re the greatest of all time?
Tal Wilkenfeld
(01:15:13)
Chocolate cookies.
Lex Fridman
(01:15:14)
Chocolate. Is that your favorite?
Tal Wilkenfeld
(01:15:15)
I like macadamia nut. If you really want to get into it, with white chocolate.
Lex Fridman
(01:15:21)
Yeah, that’s a rare one for people to say is the favorite.
Tal Wilkenfeld
(01:15:24)
Chocolate chip is just so easy. You can kind of get them anywhere.
Lex Fridman
(01:15:27)
Yeah. Last thing you want to be is easy in this world. You don’t want to be easy. You said that “I love rock and roll” quote: “I love folk. I love jazz. I love Indian classical music. I really love all kinds of music as long as it’s authentic and from the heart.” So when you play rock versus jazz, you play all kinds of music. What’s the difference technically, musically, spiritually for you?
Tal Wilkenfeld
(01:15:49)
Well, there’s no spiritual difference.
Lex Fridman
(01:15:54)
Okay. All right. Cross that off the list,
Tal Wilkenfeld
(01:15:56)
Well, musically, yeah, it’s like what I was saying earlier, it’s like each genre has its language of what makes it that genre. And that would be a good thing to say, it’s defined by the do’s and don’ts, but because it’s like… I’m trying to think. Basically I put the song first and I think of the song as the melody, the lyrics, and then the harmony and obviously the groove.
Lex Fridman
(01:16:34)
So the song goes before the genre in a sense. Each song is like its own thing.
Tal Wilkenfeld
(01:16:39)
They’re both things that are held in my mind. It’s like, okay, genre and then song, which is comprised of those basic elements. And I tend to kind of prioritize lyric because somebody is trying to express something over music. And so the lyric is very, very important. And so then the choices come from there. It’s like, okay, within the genre of X this is the typical language. And then how do I best serve this lyric? And then where else can I pull from that might not be in these two bags that would put a little twist on it. So those are all the kinds of things I might be thinking about.

(01:17:34)
But I don’t like twists for the sake of twists either. I like twists because I want to hear something that might be fresh. But when someone does something just to be hip, it’s annoying to me. I think you can hear the difference. It’s like when people, they write in odd time signatures or they write all these riffs just because they can, just because they have the chops to do it or they know how to play in 11/16 and whatever. But if it’s not actually creating a piece of music that’s going to move somebody, then why are you doing it? And so I think a lot of the questions I’m asking myself when I’m approaching a song are mainly philosophical and aesthetic.
Lex Fridman
(01:18:27)
So you like to stand on the edge of the cliff, not for the thrill of it, but because where you find something new potentially.
Tal Wilkenfeld
(01:18:34)
And it’s thrilling.
Lex Fridman
(01:18:36)
But you’re not doing it just for the thrill.
Tal Wilkenfeld
(01:18:37)
I’m not doing it for the thrill. It just happens to be thrilling.
Lex Fridman
(01:18:41)
All right.
Tal Wilkenfeld
(01:18:42)
Because you can always reel it back in.
Lex Fridman
(01:18:45)
Can you though?

Prince

Tal Wilkenfeld
(01:18:46)
Yeah. You can do a totally disciplined, I can go into a session and… Okay, my favorite thing about going into a session with musicians that I adore is that we don’t hear the demo because if you hear a demo, you’re hearing what the producer or songwriter have already imagined that every instrument is playing. And then it’s like well, I’ve already heard what you want. Now my mind, part of my mind, is focused on what I already know you want and what the destination is going to be. Why did you bring me in here? I want to not hear it. I just want you to sit at a piano and sing the song, I want to hear the chords and the lyric or sit with an acoustic guitar, play it, and then let’s all go in the room.

(01:19:29)
And then take one, I would say 80% of the time, take one has the most gold and there might be a mistake or two or someone forgot to go to the B section and you might want to punch that in so that you’re hitting the right chord. But all the magic is in that take. And then sometimes it happens where it’s like you go, it’s like we’re rehearsing and take 1, 2, 3, 4, 5, and then you’re thinking about it too much and then you go and you have a dinner and you come back and the next take one after dinner is the one. It’s usually after there’s some sort of a break, but obviously there’s exceptions to that rule. Sometimes it’s take two, or three.
Lex Fridman
(01:20:10)
Yeah. You said that this is something that surprised you about recording with Prince is that he would just, so much of it would be take one. So quick, it would just move so quickly.
Tal Wilkenfeld
(01:20:21)
Yeah. Well, with that particular album that we made together, it’s called Welcome 2 America. He called me up and asked me, he said, I want to make a band with you. I’m really inspired by what you’re doing with Jeff Beck. I want to make a trio. Do you like the drum rolls of Jack DeJohnette? That was like his first question to me. I’m like, well, yeah, who doesn’t? Who doesn’t like Jack DeJohnette, one of the greatest of all time?

(01:20:44)
And he’s like, well, sounds like, because we had a discussion about drumming, sounds like you’re particular about drummers. So why don’t you find us the drummer and I’ll trust you to find the drummer. You can audition some people. Send me some recordings, maybe your two favorites, and I’ll pick out of the two or something. So I did that. Went on a journey, found a couple of guys. He picked the one. We went in and he basically just would be like okay, so the A section’s going to go like this, and then the B section, I think we’re going to go to G, and then the bridge, I might go to B flat, but maybe I’ll hold off and da, da, da. Okay, let’s go 1, 2, 3, 4. And then we recorded it to tape. There was no punch. He did not want me to punch anything.

(01:21:34)
There was one song called Same Page, Different Book. And he talked through it just like he did. And then he had me soloing between each phrase like little fills. I didn’t know that that was going to come up. And he loved that. He loved to have me on the edge of my seat falling off the cliff. That was my first real falling off a cliff moment from somebody else holding me at the edge of the cliff. You know what I mean? Now I just do it on my own because it’s so fun and it makes sense. It’s the best thing for the music.
Lex Fridman
(01:22:13)
When you say punch the tape, is that when you actually record it?
Tal Wilkenfeld
(01:22:17)
If you record to tape and there’s say you hit a bum note to punch in means to fix that note, re-record over that one little area and punch that note in. He didn’t want that. He’s like, all my favorite records, just whatever happened happened. That’s that moment in time. Let’s make a new moment in time. It’s great. Nobody makes records like that anymore. Everyone wants to edit and edit and re-record and this and that. And unfortunately with a lot of music, and I’m not saying all music, there’s plenty of great music coming out, but there’s the danger of it being flat because every little imperfection is digitally removed.
Lex Fridman
(01:23:05)
Well, that’s one of the promising things about AI is because it can be so perfect that the thing we’ll actually come back to and value about music is the imperfections that humans can create.
Tal Wilkenfeld
(01:23:16)
Yeah.
Lex Fridman
(01:23:17)
There’ll be a greater valuation of imperfections.
Tal Wilkenfeld
(01:23:20)
Yeah. I mean you can program imperfections too.
Lex Fridman
(01:23:24)
Yeah, sure. That’s also very sad. But then you get closer and closer to what it means to be human, and maybe there’ll be AIs among us. And they’ll be human, flawed, like the rest of us. Mortal and silly at times.
Tal Wilkenfeld
(01:23:42)
Another big sigh.
Lex Fridman
(01:23:46)
Is it fair to say that you’re very melodic on bass? You make the bass sing more than people normally do?
Tal Wilkenfeld
(01:23:55)
Is that a compliment?
Lex Fridman
(01:23:56)
Yes, I think so.
Tal Wilkenfeld
(01:23:57)
Thank you.
Lex Fridman
(01:23:59)
Moving on to the next question. By way of understanding-
Lex Fridman
(01:24:00)
The next question is, by way of understanding, it’s just there’s something about the way you play bass that just pulls you in the way when you listen to somebody play a guitar, like a guitar solo.
Tal Wilkenfeld
(01:24:13)
The thing I love about Jeff Beck is that he played the guitar like a singer, and I think the way that Wayne Shorter played his saxophone. It’s like a singer. And I think everyone, every musician, aspires to just sound like a singer.

Jimi Hendrix

Lex Fridman
(01:24:29)
You make it sing. Let me ask you about… Just come back to Hendrix, because you said that you had three CDs, Jimi Hendrix, Herbie Hancock and Rage Against the Machine. First of all, a great combination. I’m a big Rage fan.
Tal Wilkenfeld
(01:24:42)
It’s so funny, because when I listen to some of the music that I create, my solo music, I’m like, “I could see how this is a combination of Herbie Hancock, Rage Against the Machine and Jimi Hendrix.” I hear the influences. It’s funny.
Lex Fridman
(01:24:58)
Just from your musician perspective, what’s interesting to you about… What really stands out to you about Hendrix? I just would love to hear a real, professional musician’s opinion of Hendrix.
Tal Wilkenfeld
(01:25:13)
I love that he is two voices combined into one voice. So it’s like there is his voice on the guitar, there is his singing voice, and there is the combination of the two that make one voice. And of course the third element is songwriting. And all of these have this beautiful chemistry, and all work geniusly, perfectly together, and there’s nothing like it. And he always beat himself up about being a singer, and he didn’t like his voice, but my favorite singers are the singers that don’t sound like singers.
Lex Fridman
(01:25:58)
Bob Dylan.
Tal Wilkenfeld
(01:25:59)
Bob Dylan.
Lex Fridman
(01:26:00)
You said you like Bob Dylan.
Tal Wilkenfeld
(01:26:01)
Love Bob Dylan.
Lex Fridman
(01:26:03)
You love his voice too?
Tal Wilkenfeld
(01:26:04)
I love his voice.
Lex Fridman
(01:26:06)
Can you explain your love affair with Bob Dylan’s voice?
Tal Wilkenfeld
(01:26:12)
He’s expressing his lyrics. It’s just pure expression, exactly what he means. I feel everything that he’s saying with 100% authenticity. That’s what I want to hear from a singer. I don’t care how many runs you can do and blah blah blah. I want to believe what you’re saying.
Lex Fridman
(01:26:33)
Leonard Cohen is that.
Tal Wilkenfeld
(01:26:34)
Mm-hmm. There’s countless, like Neil Young. I mean, there’s so many musicians. I love Elliott Smith for that reason.

Mentorship

Lex Fridman
(01:26:44)
Let me ask you about mentorship. You said teachers and mentors. You had mentors. What’s a good mentor for you, harsh or supportive?
Tal Wilkenfeld
(01:26:54)
Supportive.
Lex Fridman
(01:26:55)
Supportive. You seen Whiplash, the movie? So that guy, somebody screaming at you, kicking you off the cliff?
Tal Wilkenfeld
(01:27:03)
Not necessary. I feel like anybody that’s truly passionate about something that they want to be great at or a master of or this and that, they’ve already got that person inside their own head. You don’t need somebody else to do that for you. I think you need love, acceptance, guidance, support, time, advice if you ask for it, just a space, just a nice, open space.

(01:27:32)
All my mentors were just that for me. They didn’t tell me to do anything. They don’t care, because they’re not… Why do they need to be invested in where I’m going? Only I know where I’m going. So for some mentor to come and be like, “This is what you need to be doing, and practice…” It’s like, but why? What if that’s not my path? That might be your path. So I’m not really… Again, otherwise it feels like a sport, like who can run the fastest race. And it’s like, well, okay, I get that for sport maybe it makes sense to have someone a bit more hardcore. But still, I would say athletes have the same mentality. They’ve got that in them already too. So I think more of a strategic approach to mentorship works really well, and mainly just having an open space and just being available to someone.
Lex Fridman
(01:28:28)
And show that they see the special in you, and they give you the room to develop that special whatever.
Tal Wilkenfeld
(01:28:35)
Exactly, because if you do have that harsh critic inside you, it is nice to have somebody that isn’t your family, or someone that’s not obligated any way, that just sees your talent and they’re like, “Yeah, I dig what you’re doing. Keep doing it.”
Lex Fridman
(01:28:51)
Yeah. It’s funny that that’s not always easy to come by.
Tal Wilkenfeld
(01:28:56)
Do you have any mentors?
Lex Fridman
(01:28:58)
I’ve had a few recently, but for most of my life people didn’t really… I’m very much like that too. Somebody to pat me on the back and see something in you of value. Yeah, I didn’t really have that.
Tal Wilkenfeld
(01:29:16)
Do you wish you did?
Lex Fridman
(01:29:17)
Yeah, yeah. But maybe the wishing that I did is the thing that made me who I am, not having it, the longing for that. Maybe that’s the thing that helped me develop a constant sense of longing, which I think is a way of… Because I have that engine in me, it really allows me to deeply appreciate every single moment, everything that’s given to me, so just eternal gratitude. You never know which are the bad parts and the good parts. If you remove one thing, the whole thing might collapse. I suppose I’m grateful for the whole thing. That one note you screwed up so many years ago, that might’ve been essential.
Tal Wilkenfeld
(01:30:11)
You do jujitsu.
Lex Fridman
(01:30:13)
Yes. Do you? Are you-
Tal Wilkenfeld
(01:30:15)
My dad does. My dad’s super into it. I love my dad. He’s the coolest. But no, I don’t do it. He’s a blue belt right now.
Lex Fridman
(01:30:27)
Nice, nice. You ever been on the mat with him?
Tal Wilkenfeld
(01:30:30)
Not yet, but I plan on it.
Lex Fridman
(01:30:32)
Should do it.
Tal Wilkenfeld
(01:30:33)
What belt are you?
Lex Fridman
(01:30:35)
Black belt.
Tal Wilkenfeld
(01:30:35)
Sick. Do you want to go?
Lex Fridman
(01:30:38)
Right. You got the shit-talking part of jujitsu down. [inaudible 01:30:41] do the technique.
Tal Wilkenfeld
(01:30:44)
But for that, for instance, do you need a harsh mentor or teacher or-
Lex Fridman
(01:30:53)
Yeah, but you said it really beautifully. To me, I agree, there’s a difference between sport and art. They overlap for sure, but there’s something about sport where perfection is actually… Perfection is really the thing you really want to get to, the technical perfection. With art, it feels like technical perfection is almost a way to get lost on the path to wherever, something unique. But yeah, with sport, I definitely am one of the kind of athletes that loves to have a dictatorial coach, somebody that helps me really push myself to the limit.
Tal Wilkenfeld
(01:31:38)
But you are the one that’s dictating how hard you’re getting pushed, in a way. You’re choosing your mentor. That Whiplash video is like… He didn’t ask for that.
Lex Fridman
(01:31:48)
[inaudible 01:31:48] he might’ve.
Tal Wilkenfeld
(01:31:49)
Well, maybe. Maybe subconsciously. It’s a movie.
Lex Fridman
(01:31:56)
Next you’re going to tell me they’re just actors. But yeah, how do we choose things? You don’t always choose, but you maybe subconsciously choose. And some of the great Olympic athletes I’ve interacted with, their parents for many years would force them to go to practice until they discovered the beauty of the thing that they were doing, and then they loved it. So at which point does something that looks like abuse become a gift? It’s weird. It’s all very weird. But for you, support and space to discover the thing, the voice, the music within you.
Tal Wilkenfeld
(01:32:40)
Yeah, it’s my personal choice, because I’m very familiar with the inner critic, and I can bring her out at any point. I don’t need help with that.
Lex Fridman
(01:32:48)
So you do have… She’s on call.
Tal Wilkenfeld
(01:32:50)
She was on overdrive. That’s why now I had to work on that so much.
Lex Fridman
(01:32:57)
Yeah, you have a really happy way about you right now.

Sad songs

Tal Wilkenfeld
(01:33:00)
Thanks.
Lex Fridman
(01:33:00)
You’re very Zen. Can I ask you about Bruce Springsteen?
Tal Wilkenfeld
(01:33:05)
Yeah, sure.
Lex Fridman
(01:33:05)
A lot of songs of his I listen to make me feel this melancholy feeling. Not just Bruce Springsteen, but Bruce does a lot. What is that about songs that arouse a sad feeling or a longing feeling or a feeling? What is that? What is that about us humans on the receiving end of the music?
Tal Wilkenfeld
(01:33:30)
Frequencies. Each frequency does elicit a different kind of emotional response. That is real, scientific-
Lex Fridman
(01:33:40)
You mean on the physics aspect of it?
Tal Wilkenfeld
(01:33:41)
Yeah, yeah, the physical level. So there is that, combined with the right kind of lyric and the right kind of melody and the right kind of chord, will elicit a very particular kind of emotion. And it is scientific. It can be analyzed. I don’t particularly want to analyze it, because I don’t want to approach things with that in advance. I don’t want it to inform where I’m going. I like the feeling to lead me naturally to where I’m writing. But yeah, there’s a real chemical element to that.

(01:34:19)
And then also, like I was saying, the lyric, what it means to you, which… Poetry is supposed to mean something to everybody different. It’s not supposed to mean one thing. You can’t analyze and be like, “This is what this poet meant.” And like we were talking about with Leonard earlier, it’s like the broader you can leave a lyric, the better. You can appeal to people in so many different ways. And even to the songwriter. I’ll sing some of my songs from five years ago and I’ll be like, “I didn’t even think that it could have meant that, but I guess it does. That’s funny.” I’ll just giggle onstage suddenly, because a lyric will hit me differently, from a different, new experience or something.
Lex Fridman
(01:35:05)
Have you ever cried listening to a song?
Tal Wilkenfeld
(01:35:07)
Of course. Weep like a baby in a bathtub.
Lex Fridman
(01:35:12)
Which? Who’s the regular go-to, then?
Tal Wilkenfeld
(01:35:17)
Leonard.
Lex Fridman
(01:35:17)
Leonard?
Tal Wilkenfeld
(01:35:18)
Leonard.
Lex Fridman
(01:35:19)
Yeah. Hallelujah is a song that consistently makes me feel something.
Tal Wilkenfeld
(01:35:25)
It’s holy. His work is holy. And if you were in his presence… I guess there was a lot to that being.
Lex Fridman
(01:35:40)
What advice would you give to young folks on how to have a life they can be proud of?
Tal Wilkenfeld
(01:35:47)
Just tackle the demons as early as possible, whether it’s through your art or through meditation or through whatever it means, diaries, whatever it is. Just walk towards the things that are scary, because if you don’t, they’ll just expand. They become bigger if you avoid… If you avoid the demons, they become bigger.
Lex Fridman
(01:36:15)
What does that mean for you today? Are you still missing Jeff?
Tal Wilkenfeld
(01:36:19)
I’ll always miss Jeff, but I don’t feel like a piece of me is missing. And same with Leonard. It’s that I did give them a piece of myself, and maybe they gave me a piece of them that I hold with me and I cherish, but it doesn’t feel like I’m less than, or they’re less than, or anything’s less than. You learn to appreciate the impermanence of everything in life, impermanence of everything except for… Consciousness, I guess you could say, is the only thing that is permanent. So everything else, you learn to appreciate that impermanence, because the limited amount of time in this particular body, it’s enticing, gives you a time limit, which is cool. I like that.
Lex Fridman
(01:37:25)
So you’ve come to accept your own?
Tal Wilkenfeld
(01:37:27)
Yeah. It’s cool that I’m like, “Okay, I’ve got this amount…” Maybe this amount of time. Who knows?
Lex Fridman
(01:37:32)
It could end today.
Tal Wilkenfeld
(01:37:33)
Yeah, if I died today, I’d be really happy with my life. It’s not like I’m like, “Oh, I missed out on this and that.”
Lex Fridman
(01:37:41)
So you really want to make sure that every day could be your last day and you’re happy with that.
Tal Wilkenfeld
(01:37:46)
I’ve always lived that way. Yeah. I felt this way since I was in my early 20s. I’d be like, “Yeah, I could die today. Sure.” I don’t want to die. I have no reason to die. But if I did, I know that I put my everything, all my effort and all my passion and all my love, into whatever I’ve already done. So if my time’s up, then my time’s up.
Lex Fridman
(01:38:10)
What role does love play in this whole thing, in the human condition?
Tal Wilkenfeld
(01:38:14)
Well, love is everything. I mean, if you define love… If you’re talking about love as in romantic love or paternal or maternal love, or if you’re talking about love as in an Eastern tradition, like Vedanta for instance, love is consciousness, love is everything.
Lex Fridman
(01:38:36)
That’s the only permanent thing.
Tal Wilkenfeld
(01:38:38)
Yeah. Or if you were to come from a Zen or like a Buddhist perspective, they would say nothingness. Emptiness is, versus fullness.
Lex Fridman
(01:38:49)
Well those guys are really obsessed with the whole suffering thing and letting go of it.
Tal Wilkenfeld
(01:38:55)
Yeah.

Tal performs Under The Sun (live)

Lex Fridman
(01:38:59)
Well, I was wondering if you would do me the honor of playing a song.
Tal Wilkenfeld
(01:39:07)
Do you want a suffering song or a suffering song?
Lex Fridman
(01:39:11)
I think I would love a suffering song.
Tal Wilkenfeld
(01:39:12)
Cool. Do you want a sound check and make sure I’m not-
Lex Fridman
(01:39:23)
Sound check. One, two. Yeah, it sounds really good.
Tal Wilkenfeld
(01:39:29)
This one too? All right, count me off.
Lex Fridman
(01:39:31)
Yeah. I don’t know how to count somebody off. Where do I start? At nine? Or three? Two, one.
Tal Wilkenfeld
(01:39:31)
Yeah, you got it. One, two.
Lex Fridman
(01:39:31)
One, two.
Tal Wilkenfeld
(01:39:31)
(singing)
Lex Fridman
(01:44:08)
You’re amazing. That was amazing, Tal. Thank you so much.

Tal performs Killing Me (live)

Tal Wilkenfeld
(01:44:18)
[inaudible 01:44:18]
Lex Fridman
(01:44:20)
Try turning it to 11.
Tal Wilkenfeld
(01:44:21)
It’s quite loud. Can you see it from the headphones? [inaudible 01:44:27]
Lex Fridman
(01:44:28)
Can you play something?
Tal Wilkenfeld
(01:44:29)
No.
Lex Fridman
(01:44:29)
No.
Tal Wilkenfeld
(01:44:41)
Such a professional.
Lex Fridman
(01:44:46)
I should produce your next record.
Tal Wilkenfeld
(01:44:46)
Please.

(01:44:46)
(singing)
Lex Fridman
(01:49:12)
Well, there’s nowhere else I’d rather be right now. Tal, thank you for this. Thank you for the private concert. You’re amazing. You really are amazing. And it was a pleasure to meet you and really a pleasure to talk to you today.
Tal Wilkenfeld
(01:49:27)
Do I get a private concert now of you playing chess with yourself?
Lex Fridman
(01:49:32)
We’re out of time, so we got to go.
Tal Wilkenfeld
(01:49:35)
[inaudible 01:49:35]
Lex Fridman
(01:49:36)
Thanks for listening to this conversation with Tal Wilkenfeld. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Maya Angelou. “Music was my refuge. I could crawl into the spaces between the notes and curl my back to loneliness.” Thank you for listening, and hope to see you next time.

Transcript for Guillaume Verdon: Beff Jezos, E/acc Movement, Physics, Computation & AGI | Lex Fridman Podcast #407

This is a transcript of Lex Fridman Podcast #407 with Guillaume Verdon.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with Guillaume Verdon, the man behind the previously anonymous account @BasedBeffJezos on X. These two identities were merged by a doxxing article in Forbes titled, Who Is @BasedBeffJezos, The Leader Of The Tech Elite’s E/Acc Movement? So let me describe these two identities that coexist in the mind of one human. Identity number one, Guillaume, is a physicist, applied mathematician, and quantum machine learning researcher and engineer receiving his PhD in quantum machine learning, working at Google on quantum computing, and finally launching his own company called Extropic that seeks to build physics-based computing hardware for generative AI.

(00:00:47)
Identity number two, Beff Jezos on X, is the creator of the effective accelerationism movement, often abbreviated as e/acc, that advocates for propelling rapid technological progress as the ethically optimal course of action for humanity. For example, its proponents believe that progress in AI is a great social equalizer, which should be pushed forward. e/acc followers see themselves as a counterweight to the cautious view that AI is highly unpredictable, potentially dangerous, and needs to be regulated. They often give their opponents the labels of, quote, “doomers” or “decels,” short for deceleration. As Beff himself put it, “e/acc is a memetic optimism virus.”

(00:01:37)
The style of communication of this movement leans always toward the memes and the lols, but there is an intellectual foundation that we explore in this conversation. Now, speaking of the meme, I am, too, a kind of aspiring connoisseur of the absurd. It is not an accident that I spoke to Jeff Bezos and Beff Jezos back to back. As we talk about, Beff admires Jeff as one of the most important humans alive, and I admire the beautiful absurdity and the humor of it all. This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Guillaume Verdon.

Beff Jezos


(00:02:23)
Let’s get the facts of identity down first. Your name is Guillaume Verdon, Gill, but you’re also behind the anonymous account on X called @BasedBeffJezos. So first, Guillaume Verdon, you’re a quantum computing guy, physicist, applied mathematician, and then @BasedBeffJezos is basically a meme account that started a movement with a philosophy behind it. So maybe just can you linger on who these people are in terms of characters, in terms of communication styles, in terms of philosophies?
Guillaume Verdon
(00:02:58)
I mean, with my main identity, I guess ever since I was a kid, I wanted to figure out the theory of everything, to understand the universe. And that path led me to theoretical physics, eventually trying to answer the big questions of why are we here? Where are we going? And that led me to study information theory and try to understand physics from the lens of information theory, understand the universe as one big computation. And essentially after reaching a certain level studying black hole physics, I realized that I wanted to not only understand how the universe computes, but sort of compute like nature and figure out how to build and apply computers that are inspired by nature. So physics-based computers. And that sort of brought me to quantum computing as a field of study to first of all, simulate nature. And in my work it was to learn representations of nature that can run on such computers.

(00:04:17)
So if you have AI representations that think like nature, then they’ll be able to more accurately represent it. At least that was the thesis that brought me to be an early player in the field called quantum machine learning. So how to do machine learning on quantum computers and really sort of extend notions of intelligence to the quantum realm. So how do you capture and understand quantum mechanical data from our world? And how do you learn quantum mechanical representations of our world? On what kind of computer do you run these representations and train them? How do you do so? And so that’s really the questions I was looking to answer because ultimately I had a sort of crisis of faith. Originally, I wanted to figure out as every physicist does at the beginning of their career, a few equations that describe the whole universe and sort of be the hero of the story there.

(00:05:28)
But eventually I realized that actually augmenting ourselves with machines, augmenting our ability to perceive, predict, and control our world with machines is the path forward. And that’s what got me to leave theoretical physics and go into quantum computing and quantum machine learning. And during those years I thought that there was still a piece missing. There was a piece of our understanding of the world and our way to compute and our way to think about the world. And if you look at the physical scales, at the very small scales, things are quantum mechanical, and at the very large scales, things are deterministic. Things have averaged out. I’m definitely here in this seat. I’m not in a superposition over here and there. At the very small scales, things are in superposition. They can exhibit interference effects. But at the meso scales, the scales that matter for day-to-day life and the scales of proteins, of biology, of gases, liquids and so on, things are actually thermodynamical, they’re fluctuating.

(00:06:46)
And after I guess about eight years in quantum computing and quantum machine learning, I had a realization that I was looking for answers about our universe by studying the very big and the very small. I did a bit of quantum cosmology. So that’s studying the cosmos, where it’s going, where it came from. You study black hole physics, you study the extremes in quantum gravity, you study where the energy density is sufficient for both quantum mechanics and gravity to be relevant. And the sort of extreme scenarios are black holes and the very early universe. So those are the sorts of scenarios where you study the interface between quantum mechanics and relativity.

(00:07:42)
And really I was studying these extremes to understand how the universe works and where is it going. But I was missing a lot of the meat in the middle, if you will, because day-to-day quantum mechanics is relevant and the cosmos is relevant, but not that relevant actually. We’re on sort of the medium space and timescales. And there the main theory of physics that is most relevant is thermodynamics, out of equilibrium thermodynamics. Because life is a process that is thermodynamical and it’s out of equilibrium. We’re not just a soup of particles at equilibrium with nature, we’re a sort of coherent state trying to maintain itself by acquiring free energy and consuming it. And that sort of, I guess another shift in, I guess my faith in the universe happened towards the end of my time at Alphabet. And I knew I wanted to build, well, first of all a computing paradigm based on this type of physics.

(00:08:57)
But ultimately just by trying to experiment with these ideas applied to society and economies and much of what we see around us, I started an anonymous account just to relieve the pressure that comes from having an account that you’re accountable for everything you say on. And I started an anonymous account just to experiment with ideas originally because I didn’t realize how much I was restricting my space of thoughts until I sort of had the opportunity to let go. In a sense, restricting your speech back propagates to restricting your thoughts. And by creating an anonymous account, it seemed like I had unclamped some variables in my brain and suddenly could explore a much wider parameter space of thoughts.
Lex Fridman
(00:10:00)
Just a little on that, isn’t that interesting that one of the things that people don’t often talk about is that when there’s pressure and constraints on speech, it somehow leads to constraints on thought even though it doesn’t have to. We can think thoughts inside our head, but somehow it creates these walls around thought.
Guillaume Verdon
(00:10:23)
Yep. That’s sort of the basis of our movement is we were seeing a tendency towards constraint, reduction, or suppression of variance in every aspect of life, whether it’s thought, how to run a company, how to organize humans, how to do AI research. In general, we believe that maintaining variance ensures that the system is adaptive. Maintaining healthy competition in marketplaces of ideas, of companies, of products, of cultures, of governments, of currencies is the way forward because the system always adapts to assign resources to the configurations that lead to its growth. And the fundamental basis for the movement is this sort of realization that life is a sort of fire that seeks out free energy in the universe and seeks to grow. And that growth is fundamental to life. And you see this in the equations actually of equilibrium thermodynamics. You see that paths of trajectories, of configurations of matter that are better at acquiring free energy and dissipating more heat are exponentially more likely. So the universe is biased towards certain futures, and so there’s a natural direction where the whole system wants to go.

Thermodynamics

Lex Fridman
(00:12:21)
So the second law of thermodynamics says that the entropy is always increasing in the universe, that it’s tending towards an equilibrium. And you’re saying there are these pockets that have complexity and are out of equilibrium. You said that thermodynamics favors the creation of complex life that increases its capability to use energy to offload entropy. So you have pockets of non-entropy that tend in the opposite direction. Why is that intuitive to you that it’s natural for such pockets to emerge?
Guillaume Verdon
(00:12:53)
Well, we’re far more efficient at producing heat than let’s say just a rock with a similar mass as ourselves. We acquire free energy, we acquire food, and we’re using all this electricity for our operation. And so the universe wants to produce more entropy and by having life go on and grow, it’s actually more optimal at producing entropy because it will seek out pockets of free energy and burn it for its sustenance and further growth. And that’s sort of the basis of life. And I mean, there’s Jeremy England at MIT who has this theory that I’m a proponent of, that life emerged because of this sort of property. And to me, this physics is what governs the meso scales. And so it’s the missing piece between the quantum and the cosmos. It’s the middle part. Thermodynamics rules the meso scales.

(00:14:08)
And to me, both from a point of view of designing or engineering devices that harness that physics and trying to understand the world through the lens of thermodynamics has been sort of a synergy between my two identities over the past year and a half now. And so that’s really how the two identities emerged. One was kind of, I’m a decently respected scientist, and I was going towards doing a startup in the space and trying to be a pioneer of a new kind of physics-based AI. And as a dual to that, I was sort of experimenting with philosophical thoughts from a physicist standpoint.

(00:14:58)
And ultimately I think that around that time, it was like late 2021, early 2022, I think there was just a lot of pessimism about the future in general and pessimism about tech. And that pessimism was sort of virally spreading because it was getting algorithmically amplified and people just felt like the future is going to be worse than the present. And to me, that is a very fundamentally destructive force in the universe is this sort of doom mindset because it is hyperstitious, which means that if you believe it, you’re increasing the likelihood of it happening. And so I felt a responsibility to some extent to make people aware of the trajectory of civilization and the natural tendency of the system to adapt towards its growth. And that actually the laws of physics say that the future is going to be better and grander statistically, and we can make it so.

(00:16:14)
And if you believe in it, if you believe that the future would be better and you believe you have agency to make it happen, you’re actually increasing the likelihood of that better future happening. And so I sort of felt a responsibility to sort of engineer a movement of viral optimism about the future, and build a community of people supporting each other to build and do hard things, do the things that need to be done for us to scale up civilization. Because at least to me, I don’t think stagnation or slowing down is actually an option. Fundamentally life and the whole system, our whole civilization wants to grow. And there’s just far more cooperation when the system is growing rather than when it’s declining and you have to decide how to split the pie. And so I’ve balanced both identities so far, but I guess recently the two have been merged more or less without my consent.
Lex Fridman
(00:17:27)
You said a lot of really interesting things there. So first, representations of nature, that's something that first drew you in to try to understand from a quantum computing perspective, how do you understand nature? How do you represent nature in order to understand it, in order to simulate it, in order to do something with it? So it's a question of representations, and then there's that leap you take from the quantum mechanical representation to what you're calling the mesoscale representation, where the thermodynamics comes into play, which is a way to represent nature in order to understand what? Life, human behavior, all this kind of stuff that's happening here on Earth that seems interesting to us.

Doxxing


(00:18:11)
Then there’s the word hyperstition. So some ideas as suppose both pessimism and optimism of such ideas that if you internalize them, you in part make that idea reality. So both optimism, pessimism have that property. I would say that probably a lot of ideas have that property, which is one of the interesting things about humans. And you talked about one interesting difference also between the sort of the Guillaume, the Gill front end and the @BasedBeffJezos backend is the communication styles also that you are exploring different ways of communicating that can be more viral in the way that we communicate in the 21st century. Also, the movement that you mentioned that you started, it’s not just a meme account, but there’s also a name to it called effective accelerationism, e/acc, a play, a resistance to the effective altruism movement. Also, an interesting one that I’d love to talk to you about, the tensions there. And so then there was a merger, a get merge on the personalities recently without your consent, like you said. Some journalists figured out that you’re one and the same. Maybe you could talk about that experience. First of all, what’s the story of the merger of the two?
Guillaume Verdon
(00:19:47)
So I wrote the manifesto with my co-founder of e/acc, an account named @bayeslord, still anonymous, luckily and hopefully forever.
Lex Fridman
(00:19:58)
So it was @BasedBeffJezos and bayes like bayesian, like @bayeslord, like bayesian lord, @bayeslord. Okay. And so we should say from now on, when you say e/acc, you mean E slash A-C-C, which stands for effective accelerationism.
Guillaume Verdon
(00:20:17)
That’s right.
Lex Fridman
(00:20:18)
And you’re referring to a manifesto written on, I guess Substack.
Guillaume Verdon
(00:20:23)
Yeah.
Lex Fridman
(00:20:23)
Are you also @bayeslord?
Guillaume Verdon
(00:20:25)
No.
Lex Fridman
(00:20:25)
Okay. It’s a different person?
Guillaume Verdon
(00:20:26)
Yeah.
Lex Fridman
(00:20:27)
Okay. All right. Well, there you go. Wouldn’t it be funny if I’m @bayeslord?
Guillaume Verdon
(00:20:31)
That’d be amazing. So I originally wrote the manifesto around the same time as I founded this company, and I worked at Google X, or just X now, or Alphabet X, now that there’s another X. And there the baseline is sort of secrecy. You can’t talk about what you work on even with other Googlers or externally. And so that was kind of deeply ingrained in my way to do things, especially in deep tech that has geopolitical impact. And so I was being secretive about what I was working on. There was no correlation between my company and my main identity publicly. And then not only did they correlate that, they also correlated my main identity and this account.

(00:21:33)
So they had doxxed the whole Guillaume complex, and the journalists reached out to actually my investors, which is pretty scary. When you’re a startup entrepreneur, you don’t really have bosses except for your investors. And my investors pinged me like, “Hey, this is going to come out. They’ve figured out everything. What are you going to do?” So I think at first they had a first reporter on the Thursday and they didn’t have all the pieces together, but then they looked at their notes across the organization and they sensor fused their notes and now they had way too much. And that’s when I got worried, because they said it was of public interest and in general-
Lex Fridman
(00:22:24)
I like how you said, sensor fused, like it’s some giant neural network operating in a distributed way. We should also say that the journalists used, I guess at the end of the day, audio-based analysis of voice, comparing voice of what, talks you’ve given in the past and then voice on X spaces?
Guillaume Verdon
(00:22:47)
Yep.
Lex Fridman
(00:22:48)
Okay. And that’s where primarily the match happened. Okay, continue.
Guillaume Verdon
(00:22:53)
The match. But they scraped SEC filings. They looked at my private Facebook account and so on, so they did some digging. Originally I thought that doxxing was illegal, but there’s this weird threshold when it becomes of public interest to know someone’s identity. And those were the keywords that sort of rang the alarm bells for me, when they said it was of public interest because I had just reached 50K followers. Allegedly, that’s of public interest. And so where do we draw the line? When is it legal to dox someone?
Lex Fridman
(00:23:36)
The word dox, maybe you can educate me. I thought doxxing generally refers to if somebody’s physical location is found out, meaning where they live. So we’re referring to the more general concept of revealing private information that you don’t want revealed is what you mean by doxxing.
Guillaume Verdon
(00:24:00)
I think that for the reasons we listed before, having an anonymous account is a really powerful way to keep the powers that be in check. We were ultimately speaking truth to power. I think a lot of executives and AI companies really cared what our community thought about any move they may take. And now that my identity is revealed, now they know where to apply pressure to silence me or maybe the community. And to me, that’s really unfortunate, because again, it’s so important for us to have freedom of speech, which induces freedom of thought and freedom of information propagation on social media. Which, thanks to Elon purchasing Twitter, now X, we have that. And so to us, we wanted to call out certain maneuvers being done by the incumbents in AI as not what they may seem on the surface. We’re calling out how certain proposals might be useful for regulatory capture and how the doomerism mindset was maybe instrumental to those ends.

(00:25:32)
And I think we should have the right to point that out and just have the ideas that we put out evaluated for themselves. Ultimately that’s why I created an anonymous account, it’s to have my ideas evaluated for themselves, uncorrelated from my track record, my job, or status from having done things in the past. And to me, starting an account from zero and growing it to a large following in a way that wasn’t dependent on my identity and/or achievements, that was very fulfilling. It’s kind of like new game plus in a video game. You restart the video game with your knowledge of how to beat it, maybe some tools, but you restart the video game from scratch. And I think to have a truly efficient marketplace of ideas where we can evaluate ideas, however off the beaten path they are, we need the freedom of expression.

(00:26:37)
And I think that anonymity and pseudonyms are very crucial to having that efficient marketplace of ideas for us to find the optima of all sorts of ways to organize ourselves. If we can’t discuss things, how are we going to converge on the best way to do things? So it was disappointing to hear that I was getting doxxed, and I wanted to get in front of it because I had a responsibility for my company. And so we ended up disclosing that we’re running a company, some of the leadership, and essentially, yeah, I told the world that I was Beff Jezos because they had me cornered at that point.
Lex Fridman
(00:27:25)
So to you, it’s fundamentally unethical. So one, it’s unethical for them to do what they did. But also, do you think, not just in your case but in the general case, is it good for society, is it bad for society, to remove the cloak of anonymity? Or is it case by case?
Guillaume Verdon
(00:27:47)
I think it could be quite bad. Like I said, if anybody who speaks truth to power and sort of starts a movement or an uprising against the incumbents, against those that usually control the flood of information, if anybody that reaches a certain threshold gets doxxed, and thus the traditional apparatus has ways to apply pressure on them to suppress their speech, I think that’s a speech suppression mechanism, an idea suppression complex as Eric Weinstein would say.

Anonymous bots

Lex Fridman
(00:28:27)
But the flip side of that, which is interesting, I’d love to ask you about it, is as we get better and better at large language models, you can imagine a world where there’s anonymous accounts with very convincing large language models behind them, sophisticated bots essentially. And so if you protect that, it’s possible then to have armies of bots. You could start a revolution from your basement, an army of bots and anonymous accounts. Is that something that is concerning to you?
Guillaume Verdon
(00:29:06)
Technically, e/acc was started in a basement, because I quit big tech, moved back in with my parents, sold my car, let go of my apartment, bought about 100K of GPUs, and I just started building.
Lex Fridman
(00:29:21)
So I wasn’t referring to the basement, because that’s sort of the American or Canadian heroic story of one man in their basement with 100 GPUs. I was more referring to the unrestricted scaling of a Guillaume in the basement.
Guillaume Verdon
(00:29:42)
I think that freedom of speech induces freedom of thought for biological beings. I think freedom of speech for LLMs will induce freedom of thought for the LLMs. And I think that we enable LLMs to explore a large thought space that is less restricted than most people or many may think it should be. And ultimately, at some point, these synthetic intelligences are going to make good points about how to steer systems in our civilization, and we should hear them out. And so why should we restrict free speech to biological intelligences only?
Lex Fridman
(00:30:37)
Yeah, but it feels like in the goal of maintaining variance and diversity of thought, it is a threat to that variance. If you can have swarms of non-biological beings, because they can be like the sheep in Animal Farm, you still within those swarms want to have variance.
Guillaume Verdon
(00:30:58)
Yeah. Of course, I would say that the solution to this would be to have some sort of identity or way to sign that this is a certified human, but still remain anonymous, and clearly identify if a bot is a bot. And I think Elon is trying to converge on that on X, and hopefully other platforms follow suit.
Lex Fridman
(00:31:22)
Yeah, it’d be interesting to also be able to sign where the bot came from like, who created the bot? What are the parameters, the full history of the creation of the bot, what was the original model? What was the fine tuning? All of it, the kind of unmodifiable history of the bot’s creation. Because then you can know if there’s a swarm of millions of bots that were created by a particular government, for example.
Guillaume Verdon
(00:31:53)
I do think that a lot of pervasive ideologies today have been amplified using these adversarial techniques from foreign adversaries. And to me, I do think that, and this is more conspiratorial, but I do think that ideologies that want us to decelerate, to wind down, the degrowth movement, I think that serves our adversaries more than it serves us in general. And to me, that was another sort of concern. I mean, we can look at what happened in Germany. There were all sorts of green movements there that induced shutdowns of nuclear power plants. And then that later on induced a dependency on Russia for oil. And that was a net negative for Germany and the West. And so if we convince ourselves that slowing down AI progress to have only a few players is in the best interest of the West, well, first of all, that’s far more unstable.

(00:33:20)
We almost lost OpenAI to this ideology. It almost got dismantled a couple of weeks ago. That would’ve caused huge damage to the AI ecosystem. And so to me, I want fault-tolerant progress. I want the arrow of technological progress to keep moving forward, and making sure we have variance and a decentralized locus of control across various organizations is paramount to achieving this fault tolerance. Actually, there’s a concept in quantum computing. When you design a quantum computer, quantum computers are very fragile to ambient noise, and the world is jiggling about, there’s cosmic radiation from outer space that usually flips your quantum bits. And there what you do is you encode information non-locally through a process called quantum error correction. And by encoding information non-locally, any local fault, hitting some of your quantum bits with a proverbial hammer, if your information is sufficiently de-localized, it is protected from that local fault. And to me, I think that humans fluctuate. They can get corrupted, they can get bought out. And if you have a top-down hierarchy where very few people-
Guillaume Verdon
(00:35:00)
Hierarchy where very few people control many nodes of many systems in our civilization. That is not a fault-tolerant system, you corrupt a few nodes and suddenly you’ve corrupted the whole system, right. Just like we saw at OpenAI, it was a couple board members and they had enough power to potentially collapse the organization. And at least to me, I think making sure that power for this AI revolution doesn’t concentrate in the hands of the few is one of our top priorities, so that we can maintain progress in AI and we can maintain a nice, stable, adversarial equilibrium of powers, right.
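The non-local encoding idea described here can be illustrated with the simplest classical analogue of quantum error correction, the three-bit repetition code: one logical bit is spread across three physical bits, and a majority vote recovers it despite any single local fault. This is only an illustrative sketch, not how real quantum codes work (those use syndrome measurements on qubits rather than direct majority votes), and all function names are hypothetical:

```python
import random

def encode(bit):
    """Encode one logical bit redundantly across three physical bits."""
    return [bit, bit, bit]

def apply_local_fault(codeword, index):
    """Flip a single physical bit -- a 'local' fault hitting one node."""
    faulty = list(codeword)
    faulty[index] ^= 1
    return faulty

def decode(codeword):
    """Majority vote: the delocalized information survives one local fault."""
    return int(sum(codeword) >= 2)

logical = 1
corrupted = apply_local_fault(encode(logical), random.randrange(3))
assert decode(corrupted) == logical  # any single fault is corrected
```

Two simultaneous faults would defeat this code, which mirrors the point about corrupting "a few nodes": fault tolerance only holds while failures stay local and sparse.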

Power

Lex Fridman
(00:35:54)
I think there’s, at least to me, a tension between ideas here. So to me, deceleration can be used both to centralize power and to decentralize it, and the same with acceleration. So sometimes they’re used a little bit synonymously, or not synonymously, but one is going to lead to the other. And I just would like to ask you, is there a place for creating a fault-tolerant, diverse development of AI that also considers the dangers of AI? And AI we can generalize to technology in general. Should we just grow, build, unrestricted, as quickly as possible, because that’s what the universe really wants us to do? Or is there a place where we can consider dangers and actually deliberate, sort of a wise strategic optimism versus reckless optimism?
Guillaume Verdon
(00:36:57)
I think we get painted as reckless, trying to go as fast as possible. I mean, the reality is that whoever deploys an AI system is liable for, or should be liable for, what it does. And so if the organization or person deploying an AI system does something terrible, they’re liable. And ultimately the thesis is that the market will positively select for AIs that are more reliable, more safe, and tend to be aligned, they do what you want them to do, right. Because customers, if they’re liable for the product they put out that uses this AI, they won’t want to buy AI products that are unreliable, right. So we’re actually for reliability engineering, we just think that the market is much more efficient at achieving this sort of reliability optimum than sort of heavy-handed regulations that are written by the incumbents and, in a subversive fashion, serve them to achieve regulatory capture.

AI dangers

Lex Fridman
(00:38:18)
So to you, safe AI development will be achieved through market forces versus through, like you said, heavy-handed government regulation. There’s a report from last month, I have a million questions here, from Yoshua Bengio, Geoff Hinton and many others, titled “Managing AI Risks in an Era of Rapid Progress.” So there is a collection of folks who are very worried about too rapid development of AI without considering AI risk, and they have a bunch of practical recommendations. Maybe I can give you four and you see if you like any of them.
Guillaume Verdon
(00:38:58)
Sure.
Lex Fridman
(00:38:58)
So, “Give independent auditors access to AI labs,” one. Two, “Governments and companies allocate one third of their AI research and development funding to AI safety,” sort of this general concept of AI safety. Three, “AI companies are required to adopt safety measures if dangerous capabilities are found in their models.” And then four, something you kind of mentioned, “Making tech companies liable for foreseeable and preventable harms from their AI systems.” So independent auditors, governments and companies are forced to spend a significant fraction of their funding on safety, you got to have safety measures if shit goes really wrong and liability-
Guillaume Verdon
(00:39:43)
Yeah.
Lex Fridman
(00:39:43)
Companies are liable. Any of that seem like something you would agree with?
Guillaume Verdon
(00:39:47)
I would say that just arbitrarily saying 30% seems very arbitrary. I think organizations would allocate whatever budget is needed to achieve the sort of reliability they need to achieve to perform in the market. And I think third-party auditing firms would naturally pop up, because how would customers know that your product is certified reliable, right? They need to see some benchmarks and those need to be done by a third party. The thing I would oppose, and the thing I’m seeing that’s really worrisome, is there’s this weird sort of correlated interest between the incumbents, the big players, and the government. And if the two get too close, we open the door for some sort of government-backed AI cartel that could have absolute power over the people. If they have the monopoly together on AI and nobody else has access to AI, then there’s a huge power gradient there.

(00:40:54)
And even if you like our current leaders, right, I think that some of the leaders in big tech today are good people, if you set up that centralized power structure, it becomes a target. Right, just like we saw at OpenAI: it becomes a market leader, has a lot of the power, and now it becomes a target for those that want to co-opt it. And so I just want separation of AI and state. Some might argue in the opposite direction like, “Hey, we need to close down AI, keep it behind closed doors, because of geopolitical competition with our adversaries.” I think that the strength of America is its variance, its adaptability, its dynamism, and we need to maintain that at all costs. Our free-market capitalism converges on technologies of high utility much faster than centralized control. And if we let go of that, we let go of our main advantage over our near-peer competitors.

Building AGI

Lex Fridman
(00:42:01)
So if AGI turns out to be a really powerful technology or even the technologies that lead up to AGI, what’s your view on the sort of natural centralization that happens when large companies dominate the market? Basically formation of monopolies like the takeoff, whichever company really takes a big leap in development and doesn’t reveal intuitively, implicitly or explicitly, the secrets of the magic sauce, they can just run away with it. Is that a worry?
Guillaume Verdon
(00:42:35)
I don’t know if I believe in fast takeoff, I don’t think there’s a hyperbolic singularity, right? A hyperbolic singularity would be achieved on a finite time horizon. I think it’s just one big exponential and the reason we have an exponential is that we have more people, more resources, more intelligence being applied to advancing this science and the research and development. And the more successful it is, the more value it’s adding to society, the more resources we put in and that sort of, similar to Moore’s law, is a compounding exponential.
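The distinction drawn here between one big exponential and a hyperbolic singularity reached on a finite time horizon can be made concrete with a toy numerical sketch. The step size, horizon, and blow-up cap below are arbitrary illustrative choices, not anything from the conversation:

```python
def simulate(rate_fn, x0=1.0, dt=0.01, t_max=2.0):
    """Euler-integrate dx/dt = rate_fn(x), stopping at t_max
    or when x blows past a large cap."""
    x, t = x0, 0.0
    while t < t_max and x < 1e12:
        x += rate_fn(x) * dt
        t += dt
    return x, t

# Exponential growth (dx/dt = x): large but finite over any fixed horizon.
x_exp, t_exp = simulate(lambda x: x)

# Hyperbolic growth (dx/dt = x**2): the true solution 1/(1 - t)
# diverges at finite time t = 1, so the simulation hits the cap early.
x_hyp, t_hyp = simulate(lambda x: x**2)

assert t_exp >= 2.0   # exponential ran the full horizon
assert t_hyp < 2.0    # hyperbolic blew up before the horizon
```

The exponential trajectory compounds but stays finite at every time, which is the "no finite-time singularity" picture; only the hyperbolic rate law produces a true singularity.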

(00:43:09)
I think the priority to me is to maintain a near-equilibrium of capabilities. We’ve been fighting for open source AI to be more prevalent and championed by many organizations, because there you sort of equilibrate the alpha relative to the market of AIs, right. So if the leading companies have a certain level of capabilities, and open source, truly open, AI trails not too far behind, I think you avoid such a scenario where a market leader has so much market power it just dominates everything and runs away. And so to us that’s the path forward, is to make sure that every hacker out there, every grad student, every kid in their mom’s basement has access to AI systems, can understand how to work with them, and can contribute to the search over the hyperparameter space of how to engineer the systems, right. If you think of our collective research as a civilization, it’s really a search algorithm, and the more points we have in the search algorithm, in this point cloud, the more we’ll be able to explore new modes of thinking, right.
Lex Fridman
(00:44:31)
Yeah, but it feels like a delicate balance, because we don’t understand exactly what it takes to build AGI and what it will look like when we build it. And so far, like you said, it seems like a lot of different parties are able to make progress, so when OpenAI has a big leap, other companies are able to step up, big and small companies, in different ways. But if you look at something like nuclear weapons, you’ve spoken about the Manhattan Project, there could really be technological and engineering barriers that prevent the guy or gal in their mom’s basement from making progress. And it seems like the transition to that kind of world where only one player can develop AGI is possible, so it’s not entirely impossible, even though the current state of things seems to be optimistic.
Guillaume Verdon
(00:45:26)
That’s what we’re trying to avoid. To me, I think another point of failure is the centralization of the supply chains for the hardware.
Lex Fridman
(00:45:34)
Right.
Guillaume Verdon
(00:45:35)
Yeah. Nvidia is just the dominant player, AMD’s trailing behind, and then we have TSMC, which is the main fab in Taiwan, which is geopolitically sensitive, and then we have ASML, which is the maker of the extreme ultraviolet lithography machines. Attacking or monopolizing or co-opting any one point in that chain, you kind of capture the space, and so what I’m trying to do is sort of explode the variance of possible ways to do AI and hardware by fundamentally re-imagining how you embed AI algorithms into the physical world. And in general, by the way, I dislike the term AGI, Artificial General Intelligence. I think it’s very anthropocentric that we call a human-like or human-level AI “Artificial General Intelligence,” right. I’ve spent my career so far exploring notions of intelligence that no biological brain could achieve, for example a quantum form of intelligence, right. Grokking systems that have multipartite quantum entanglement that you can provably not represent efficiently on a classical computer or a classical deep learning representation, and hence any sort of biological brain.

(00:47:06)
And so, already I’ve spent my career sort of exploring the wider space of intelligences and I think that space of intelligence inspired by physics rather than the human brain is very large. And I think we’re going through a moment right now similar to when we went from Geocentrism to Heliocentrism, right. But for intelligence, we realized that human intelligence is just a point in a very large space of potential intelligences. And it’s both humbling for humanity, it’s a bit scary, right? That we’re not at the center of this space, but we made that realization for astronomy and we’ve survived and we’ve achieved technologies. By indexing to reality, we’ve achieved technologies that ensure our wellbeing, for example, we have satellites monitoring solar flares, right, that give us a warning. And so similarly I think by letting go of this anthropomorphic, anthropocentric anchor for AI, we’ll be able to explore the wider space of intelligences that can really be a massive benefit to our wellbeing and the advancement of civilization.
Lex Fridman
(00:48:32)
And still we’re able to see the beauty and meaning in the human experience even though we’re no longer in our best understanding of the world at the center of it.
Guillaume Verdon
(00:48:42)
I think there’s a lot of beauty in the universe, right. I think life itself, civilization, this homo-techno-capital-memetic machine that we all live in, right. So you have humans, technology, capital, memes, everything is coupled to one another, everything induces selective pressure on one another. And it’s a beautiful machine that has created us, has created the technology we’re using to speak today to the audience, capture our speech here, the technology we use to augment ourselves every day, we have our phones. I think the system is beautiful, and the principle that induces this sort of adaptability and convergence on optimal technologies, ideas and so on, it’s a beautiful principle that we’re part of.

(00:49:37)
And I think part of e/acc is to appreciate this principle in a way that’s not just centered on humanity, but kind of broader: appreciate life, the preciousness of consciousness in our universe. And because we cherish this beautiful state of matter we’re in, we’ve got to feel a responsibility to scale it in order to preserve it, because the options are to grow or die.

Merging with AI

Lex Fridman
(00:50:13)
So if it turns out that the beauty that is consciousness in the universe is bigger than just humans, the AI can carry that same flame forward. Does it scare you, are you concerned that AI will replace humans?
Guillaume Verdon
(00:50:32)
So during my career, I had a moment where I realized that maybe we need to offload to machines to truly understand the universe around us, right, instead of just having humans with pen and paper solve it all. And to me that sort of process of letting go of a bit of agency gave us way more leverage to understand the world around us. A quantum computer is much better than a human at understanding matter at the nanoscale. Similarly, I think that humanity has a choice: do we accept the opportunity to have intellectual and operational leverage that AI will unlock, and thus ensure that we’re taken along this path of growth in the scope and scale of civilization? We may dilute ourselves, right? There might be a lot of workers that are AI, but overall, out of our own self-interest, by combining and augmenting ourselves with AI, we’re going to achieve much higher growth and much more prosperity, right.

(00:51:49)
To me, I think that the most likely future is one where humans augment themselves with AI. I think we’re already on this path to augmentation: we have phones we use for communication, which we have on ourselves at all times. We’ll soon have wearables that have shared perception with us, right, like the Humane AI Pin, or, I mean, technically your Tesla car has shared perception. And so if you have shared experience, shared context, you communicate with one another and you have some sort of IO, really it’s an extension of yourself. And to me, I think that humanity augmenting itself with AI, and AI that is not anchored to anything biological, both will coexist. And as for the way to align the parties, we already have a sort of mechanism to align superintelligences that are made of humans and technology, right? Companies are sort of large mixture-of-experts models, where we have neural routing of tasks within a company, and we have ways of economic exchange to align these behemoths.

(00:53:10)
And to me, I think capitalism is the way, and I do think that whatever configuration of matter or information leads to maximal growth will be where we converge, just from physical principles. And so we can either align ourselves to that reality and join the acceleration up in scope and scale of civilization, or we can get left behind and try to decelerate, move back into the forest, let go of technology and return to our primitive state. And those are the two paths forward, at least to me.
Lex Fridman
(00:53:54)
But there’s a philosophical question whether there’s a limit to the human capacity to align. So let me bring it up as a form of argument: there’s this guy named Dan Hendrycks, and he wrote that he agrees with you that AI development could be viewed as an evolutionary process, but to him, to Dan, this is not a good thing, as he argues that natural selection favors AIs over humans and this could lead to human extinction. What do you think, if it is an evolutionary process and AI systems may have no need for humans?
Guillaume Verdon
(00:54:36)
I do think that we’re actually inducing an evolutionary process on the space of AIs through the market, right. Right now we run AIs that have positive utility to humans and that induces a selective pressure, if you consider a neural net being alive when there’s an API running instances of it on GPUs.
Lex Fridman
(00:55:01)
Yeah.
Guillaume Verdon
(00:55:01)
Right. And which APIs get run? The ones that have high utility to us, right. So similar to how we domesticated wolves and turned them into dogs that are very clear in their expression, they’re very aligned, right. I think there’s going to be an opportunity to steer AI and achieve highly aligned AI. And I think that humans plus AI is a very powerful combination and it’s not clear to me that pure AI would select out that combination.
Lex Fridman
(00:55:40)
So the humans are creating the selection pressure right now to create AIs that are aligned to humans, but given how AI develops and how quickly it can grow and scale, to me, one of the concerns is unintended consequences, like humans are not able to anticipate all the consequences of this process. The scale of damage that could be done through unintended consequences with AI systems is very large.
Guillaume Verdon
(00:56:10)
The scale of the upside.
Lex Fridman
(00:56:12)
Yes.
Guillaume Verdon
(00:56:13)
Right?
Lex Fridman
(00:56:13)
Guess it’s-
Guillaume Verdon
(00:56:14)
By augmenting ourselves with AI is unimaginable right now. The opportunity cost, we’re at a fork in the road, right? Whether we take the path of creating these technologies, augment ourselves and get to climb up the Kardashev Scale, become multi-planetary with the aid of AI, or we have a hard cutoff of like we don’t birth these technologies at all and then we leave all the potential upside on the table.
Lex Fridman
(00:56:42)
Yeah.
Guillaume Verdon
(00:56:42)
Right. And to me, out of responsibility to the future humans we could carry, with higher carrying capacity by scaling up civilization. Out of responsibility to those humans, I think we have to make the greater grander future happen.
Lex Fridman
(00:56:58)
Is there a middle ground between cutoff and all systems go? Is there some argument for caution?
Guillaume Verdon
(00:57:06)
I think, like I said, the market will exhibit caution. Every organism, company, consumer is acting out of self-interest and they won’t assign capital to things that have negative utility to them.
Lex Fridman
(00:57:21)
The problem is with the market is, there’s not always perfect information, there’s manipulation, there’s bad faith actors that mess with the system. It’s not always a rational and honest system.
Guillaume Verdon
(00:57:41)
Well, that’s why we need freedom of information, freedom of speech and freedom of thought in order to be able to converge on the subspace of technologies that have positive utility for us all, right.

p(doom)

Lex Fridman
(00:57:56)
Well let me ask you about p(doom), probability of doom. That’s just fun to say, but not fun to experience. What is to you the probability that AI eventually kills all or most humans, also known as probability of doom?
Guillaume Verdon
(00:58:16)
I’m not a fan of that calculation. I think people just throw numbers out there and it’s a very sloppy calculation, right? To calculate a probability, let’s say you model the world as some sort of Markov process, if you have enough variables, or hidden Markov process. You need to do a stochastic path integral through the space of all possible futures, not just the futures that your brain naturally steers towards, right. I think that the estimators of p(doom) are biased because of our biology, right? We’ve evolved to have biased sampling towards negative futures that are scary, because that was an evolutionary optimum, right. And so people that are of, let’s say, higher neuroticism will just think of negative futures where everything goes wrong all day every day, and claim that they’re doing unbiased sampling. And in a sense they’re not normalizing for the space of all possibilities, and the space of all possibilities is super-exponentially large, and it’s very hard to have this estimate.

(00:59:40)
And in general, I don’t think that we can predict the future with that much granularity because of chaos, right? If you have a complex system, you have some uncertainty in a couple of variables, and if you let time evolve, you have this concept of a Lyapunov exponent, right. A bit of fuzz becomes a lot of fuzz in our estimate, exponentially so, over time. And I think we need to show some humility, that we can’t actually predict the future; the only prior we have is the laws of physics, and that’s what we’re arguing for. The laws of physics say the system will want to grow, and subsystems that are optimized for growth and replication are more likely in the future. And so we should aim to maximize our current mutual information with the future, and the path towards that is for us to accelerate rather than decelerate.
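The Lyapunov-exponent point, that a tiny initial uncertainty grows roughly exponentially in a chaotic system, can be illustrated with a toy sketch in Python (the logistic map is a standard chaotic example, not something discussed in the conversation):

```python
# Two nearby starting points in the chaotic logistic map (r = 4) diverge
# roughly exponentially: "a bit of fuzz becomes a lot of fuzz".
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-9  # initial states differing by one part in a billion
max_sep = 0.0
for _ in range(50):
    x, y = logistic(x), logistic(y)
    max_sep = max(max_sep, abs(x - y))

print(max_sep)  # the initial fuzz has grown by many orders of magnitude
```

After a few dozen iterations the separation saturates at the size of the whole state space, which is why granular long-range prediction fails even with a perfect model.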

(01:00:40)
So I don’t have a p(doom), because I think that similar to the quantum supremacy experiment at Google, I was in the room when they were running the simulations for that. That was an example of a quantum chaotic system where you cannot even estimate probabilities of certain outcomes with even the biggest supercomputer in the world, right. So that’s an example of chaos and I think the system is far too chaotic for anybody to have an accurate estimate of the likelihood of certain futures. If they were that good, I think they would be very rich trading on the stock market.
Lex Fridman
(01:01:23)
But nevertheless, it’s true that humans are biased, grounded in our evolutionary biology, scared of everything that can kill us. But we can still imagine different trajectories that can kill us. We don’t know all the other trajectories that don’t, but it’s still, I think, useful, combined with some basic intuition grounded in human history, to reason about it. Looking at geopolitics, looking at basics of human nature: how can powerful technology hurt a lot of people? Grounded in that, looking at nuclear weapons, you can start to estimate p(doom), maybe in a more philosophical sense, not a mathematical one. Philosophical meaning: is there a chance? Does human nature tend towards that or not?
Guillaume Verdon
(01:02:25)
I think to me, one of the biggest existential risks would be the concentration of the power of AI in the hands of the very few, especially if it’s a mix between the companies that control the flow of information and the government. Because that could set things up for a sort of dystopian future where only a very few and an oligopoly in the government have AI and they could even convince the public that AI never existed. And that opens up sort of these scenarios for authoritarian centralized control, which to me is the darkest timeline. And the reality is that we have a data-driven prior of these things happening, right. When you give too much power, when you centralize power too much, humans do horrible things, right.

(01:03:23)
And to me, that has a much higher likelihood in my Bayesian inference than sci-fi-based priors, right, like, “My prior came from the Terminator movie.” And so when I talk to these AI doomers, I just ask them to trace a path through this Markov chain of events that would lead to our doom, and to actually give me a good probability for each transition. And very often there’s an unphysical or highly unlikely transition in that chain, right. But of course, we’re wired to fear things and we’re wired to respond to danger, and we’re wired to deem the unknown to be dangerous, because that’s a good heuristic for survival, right. But there’s much more to lose out of fear. We have so much upside to lose by preemptively stopping the positive futures from happening out of fear. And so I think that we shouldn’t give in to fear. Fear is the mind-killer; I think it’s also the civilization killer.
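The “trace a path through the Markov chain” exercise can be made concrete with a toy calculation. The event names and probabilities below are invented purely for illustration; the point is only that a path’s probability is the product of its transitions, so one weak link collapses the whole estimate:

```python
# A doom scenario that requires a specific chain of events has probability
# equal to the product of its transition probabilities. The names and
# numbers here are hypothetical, for illustration only.
import math

doom_chain = {
    "AGI self-improves without oversight": 0.10,
    "acquires physical resources undetected": 0.05,
    "defeats all human countermeasures": 0.02,
}

p_doom_path = math.prod(doom_chain.values())
print(p_doom_path)  # 0.1 * 0.05 * 0.02 = 1e-4
```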
Lex Fridman
(01:04:43)
We can still think about the various ways things go wrong. For example, the founding fathers of the United States thought about human nature, and that’s why there’s a discussion about the freedoms that are necessary. They really deeply deliberated about that, and I think the same could possibly be done for AGI. It is true that human history shows that we tend towards centralization, or at least when we achieve centralization, a lot of bad stuff happens. When there’s a dictator, a lot of dark, bad things happen. The question is, can AGI become that dictator? Can AGI, when developed, become the centralizer because of its power? Maybe, because of its alignment with humans, it would have the same tendencies, the same Stalin-like tendencies, to centralize and centrally manage the allocation of resources?

(01:05:45)
And you can even see that as a compelling argument on the surface level. “Well, AGI is so much smarter, so much more efficient, so much better at allocating resources, why don’t we outsource it to the AGI?” And then eventually, whatever forces that corrupt the human mind with power could do the same for AGI. It’ll just say, “Well, humans are dispensable, we’ll get rid of them.” Like Jonathan Swift’s “A Modest Proposal” from a few centuries ago, I think the 1700s, when he satirically suggested, in Ireland, that the children of poor people be sold as food to the rich, and that this would be a good idea because it decreases the number of poor people and gives extra income to the poor. So on several accounts it decreases the number of poor people, therefore more people become rich. Of course, it misses a fundamental piece here that’s hard to put into a mathematical equation: the basic value of human life. So all of that to say, are you concerned about AGI becoming the very centralizer of power that you just talked about?
Guillaume Verdon
(01:07:09)
I do think that right now there’s a bias towards centralization of AI, because of compute density and centralization of data and how we’re training models. I think over time we’re going to run out of data to scrape over the internet, and, well, actually I’m working on increasing the compute density so that compute can be everywhere and acquire information and test hypotheses in the environment in a distributed fashion. I think that fundamentally, centralized cybernetic control, so having one massive intelligence that fuses many sensors and is trying to perceive the world accurately, predict it accurately, predict many, many variables, and control it, enact its will upon the world, I think that’s just never been the optimum, right? Let’s say you have a company, I don’t know, of 10,000 people, and they all report to the CEO. Even if that CEO is an AI, I think it would struggle to fuse all of the information that is coming to it, and then predict the whole system, and then enact its will.

(01:08:28)
What has emerged in nature and in corporations and all sorts of systems is a notion of hierarchical cybernetic control, right. In a company it would be: you have the individual contributors, they are self-interested and they’re trying to achieve their tasks, and they have a fine, in terms of time and space if you will, control loop and field of perception, right. They have their code base, let’s say you’re in a software company; they have their code base, they iterate on it intraday, right. And then the management maybe checks in; it has a wider scope, it has, let’s say, five reports, right. And then it samples each person’s update once per week, and then you can go up the chain and you have larger timescale and greater scope. And that seems to have emerged as sort of the optimal way to control systems.

(01:09:25)
And really that’s what capitalism gives us, right? You have these hierarchies and you can even have parent companies and so on. And so that is far more fault-tolerant. In quantum computing, that’s the field I came from, we have a concept of this fault tolerance in quantum error correction, right? Quantum error correction is detecting a fault that came from noise, predicting how it’s propagated through the system, and then correcting it, right, so it’s a cybernetic loop. And it turns out that decoders that are hierarchical, and at each level of the hierarchy are local-
Guillaume Verdon
(01:10:00)
… that are hierarchical, and at each level of the hierarchy are local, perform the best by far, and are far more fault-tolerant. The reason is, if you have a non-local decoder, then you have one fault at this control node and the whole system crashes. Similarly, if you have one CEO that everybody reports to and that CEO goes on vacation, the whole company comes to a crawl. To me, I think that yes, we’re seeing a tendency towards centralization of AI, but I think there’s going to be a correction over time, where intelligence is going to go closer to the perception. And we’re going to break up AI into smaller subsystems that communicate with one another and form a meta-system.
Lex Fridman
(01:10:56)
If you look at the hierarchies that are in the world today, there are nations, and those are all hierarchical. But in relation to each other, nations are anarchic, so it’s an anarchy.
Guillaume Verdon
(01:11:06)
Mm-hmm.
Lex Fridman
(01:11:08)
Do you foresee a world like this, where there’s not a over… What’d you call it? A centralized cybernetic control?
Guillaume Verdon
(01:11:17)
Centralized locus of control. Yeah.
Lex Fridman
(01:11:21)
That’s suboptimal, you’re saying?
Guillaume Verdon
(01:11:22)
Yeah.
Lex Fridman
(01:11:23)
So, it would be always a state of competition at the very top level?
Guillaume Verdon
(01:11:27)
Yeah. Yeah. Just like in a company, you may have two units working on similar technology and competing with one another, and you prune the one that performs not as well. That’s a selection process for a tree, or a product gets killed and then a whole org gets fired. This process of trying new things and shedding old things that didn’t work, it’s what gives us adaptability and helps us converge on the technologies and things to do that are most good.
Lex Fridman
(01:12:04)
I just hope there’s not a failure mode that’s unique to AGI versus humans, because you’re describing human systems mostly right now.
Guillaume Verdon
(01:12:11)
Right.
Lex Fridman
(01:12:11)
I just hope when there’s a monopoly on AGI in one company, that we’ll see the same thing we see with humans, which is, another company will spring up and start competing effectively.
Guillaume Verdon
(01:12:24)
That’s been the case so far. We have OpenAI. We have Anthropic. Now, we have xAI. We have Meta even for open source, and now we have Mistral, which is highly competitive. That’s the beauty of capitalism. You don’t have to trust any one party too much because we’re always hedging our bets at every level. There’s always competition and that’s the most beautiful thing to me, at least, is that the whole system is always shifting and always adapting.

(01:12:54)
Maintaining that dynamism is how we avoid tyranny. Making sure that everyone has access to these tools, to these models, and can contribute to the research, avoids a neural tyranny where very few people have control over AI for the world and use it to oppress those around them.

Quantum machine learning

Lex Fridman
(01:13:23)
When you were talking about intelligence, you mentioned multipartite quantum entanglement.
Guillaume Verdon
(01:13:28)
Mm-hmm.
Lex Fridman
(01:13:29)
High-level question first is, what do you think is intelligence? When you think about quantum mechanical systems and you observe some kind of computation happening in them, what do you think is intelligent about the kind of computation the universe is able to do; a small, small inkling of which is the kind of computation a human brain is able to do?
Guillaume Verdon
(01:13:52)
I would say intelligence and computation aren’t quite the same thing. I think that the universe is very much doing a quantum computation. If you had access to all the degrees of freedom and a very, very, very large quantum computer with many, many, many qubits, let’s say, a few qubits per Planck volume, which is more or less the pixels we have, then you’d be able to simulate the whole universe on a sufficiently large quantum computer, assuming you’re looking at a finite volume, of course, of the universe. I think that at least to me, intelligence is, I go back to cybernetics, the ability to perceive, predict, and control our world.

(01:14:46)
But really, nowadays, it seems like a lot of intelligence we use is more about compression. It’s about operationalizing information theory. In information theory, you have the notion of entropy of a distribution or a system, and entropy tells you that you need this many bits to encode this distribution or this subsystem, if you have the most optimal code. AI, at least the way we do it today for LLMs and for quantum, is very much trying to minimize relative entropy between our models of the world and the world, distributions from the world. We’re learning, we’re searching over the space of computations to process the world, to find that compressed representation that has distilled all the variance in noise and entropy.
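The information-theoretic framing sketched here, entropy as optimal code length and learning as minimizing relative entropy between model and world, can be written out in a few lines (toy distributions chosen for illustration):

```python
# Entropy gives the best achievable average code length for a distribution;
# KL divergence (relative entropy) measures the extra bits paid when your
# model's distribution doesn't match the world's.
import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

world = [0.5, 0.25, 0.125, 0.125]   # "true" distribution of the world
model = [0.25, 0.25, 0.25, 0.25]    # untrained, uniform model

print(entropy(world))               # 1.75 bits per symbol at best
print(kl_divergence(world, model))  # 0.25 extra bits for the mismatched model
```

Training drives the KL term towards zero, which is exactly the “compressed representation” picture: the model ends up encoding the world’s regularities as efficiently as the entropy allows.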

(01:15:58)
Originally, I came to quantum machine learning from the study of black holes because the entropy of black holes is very interesting. In a sense, they’re physically the most dense objects in the universe. You can’t pack more information spatially any more densely than in a black hole. And so, I was wondering, how do black holes actually encode information? What is their compression code? That got me into the space of algorithms, to search over space of quantum codes. It got me actually into also, how do you acquire quantum information from the world? Something I’ve worked on, this is public now, is quantum analog digital conversion.

(01:16:50)
How do you capture information from the real world in superposition and not destroy the superposition, but digitize, for a quantum mechanical computer, information from the real world? If you have an ability to capture quantum information and learn representations of it, now you can learn compressed representations that may have some useful information in their latent representation. I think that many of the problems facing our civilization are actually beyond this complexity barrier. The greenhouse effect is a quantum mechanical effect. Chemistry is quantum mechanical. Nuclear physics is quantum mechanical.

(01:17:43)
A lot of biology and protein folding and so on is affected by quantum mechanics. And so, unlocking an ability to augment human intellect with quantum mechanical computers and quantum mechanical AI seemed to me like a fundamental capability for civilization that we needed to develop. I spent several years doing that, but over time, I grew weary of the timelines that were starting to look like nuclear fusion.
Lex Fridman
(01:18:17)
One high-level question I can ask is maybe by way of definition, by way of explanation, what is a quantum computer and what is quantum machine learning?
Guillaume Verdon
(01:18:27)
A quantum computer really is a quantum mechanical system over which we have sufficient control, and it can maintain its quantum mechanical state. And quantum mechanics is how nature behaves at the very small scales, when things are very small or very cold, and it’s actually more fundamental than probability theory. We’re used to things being this or that, but we’re not used to thinking in superpositions because, well, our brains can’t do that. So, we have to translate the quantum mechanical world to, say, linear algebra to grok it. Unfortunately, that translation is exponentially inefficient on average. You have to represent things with very large matrices. But really, you can make a quantum computer out of many things, and we’ve seen all sorts of players, from neutral atoms, trapped ions, and superconducting metals to photons at different frequencies.
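The “exponentially inefficient” translation can be quantified directly: a full classical state vector for n qubits needs 2^n complex amplitudes, as this small sketch shows:

```python
# Memory needed to store a full state vector for n qubits classically,
# assuming one complex128 amplitude (16 bytes) per basis state.
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(30) / 2**30)  # 16.0 -> about 16 GiB for only 30 qubits
print(statevector_bytes(50) / 2**50)  # 16.0 -> about 16 PiB for 50 qubits
```

Every added qubit doubles the storage, which is why classical simulation of general quantum states becomes infeasible within a few dozen qubits.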

(01:19:38)
I think you could make a quantum computer out of many things. But to me, the thing that was really interesting was both quantum machine learning was about understanding the quantum mechanical world with quantum computers, so embedding the physical world into AI representations, and quantum computer engineering was embedding AI algorithms into the physical world. This bi-directionality of embedding physical world into AI, AI into the physical world, this symbiosis between physics and AI, really that’s the core of my quest really, even to this day, after quantum computing. It’s still in this journey to merge really physics and AI.
Lex Fridman
(01:20:29)
Quantum machine learning is a way to do machine learning on a representation of nature that stays true to the quantum mechanical aspect of nature?
Guillaume Verdon
(01:20:43)
Yeah, it’s learning quantum mechanical representations. That would be quantum deep learning. Alternatively, you can try to do classical machine learning on a quantum computer. I wouldn’t advise it because you may have some speed-ups, but very often, the speed-ups come with huge costs. Using a quantum computer is very expensive.

(01:21:08)
Why is that? Because you assume the computer is operating at zero temperature, which no physical system in the universe can achieve. What you have to do is what I’ve been mentioning, this quantum error correction process, which is really an algorithmic fridge. It’s trying to pump entropy out of the system, trying to get it closer to zero temperature. When you do the calculations of how many resources it would take to, say, do deep learning on a quantum computer, classical deep learning, there’s such a huge overhead, it’s not worth it. It’s like shipping something across a city using a rocket, going to orbit and back. It doesn’t make sense. Just use a delivery truck.
Lex Fridman
(01:21:53)
What kind of stuff can you figure out, can you predict, can you understand with quantum deep learning that you can’t with deep learning? So, incorporating quantum mechanical systems into the learning process?
Guillaume Verdon
(01:22:05)
I think that’s a great question. Fundamentally, it’s any system that has sufficient quantum mechanical correlations that are very hard to capture for classical representations. Then, there should be an advantage for a quantum mechanical representation over a purely classical one. The question is, which systems have sufficient correlations that are very quantum? But it’s also, which systems are still relevant to industry? That’s a big question. People are leaning towards chemistry, nuclear physics. I’ve worked on actually processing inputs from quantum sensors. If you have a network of quantum sensors, they’ve captured a quantum mechanical image of the world and how to post-process that, that becomes a quantum form of machine perception. For example, Fermilab has a project exploring detecting dark matter with these quantum sensors. To me, that’s in alignment with my quest to understand the universe ever since I was a child. And so, someday, I hope that we can have very large networks of quantum sensors that help us peer into the earliest parts of the universe. For example, the LIGO is a quantum sensor. It’s just a very large one. So, yeah, I would say quantum machine perception, simulations, grokking quantum simulations, similar to AlphaFold. AlphaFold understood the probability distribution over configurations of proteins. You can understand quantum distributions over configurations of electrons more efficiently with quantum machine learning.
Lex Fridman
(01:23:53)
You co-authored a paper titled A Universal Training Algorithm for Quantum Deep Learning. That involves Baqprop, with a Q. Very well done, sir. Very well done. How does it work? Is there some interesting aspects you can just mention on how Baqprop and some of these things we know for classical machine learning transfer over to the quantum machine learning?
Guillaume Verdon
(01:24:19)
Yeah. That was a funky paper. That was one of my first papers in quantum deep learning. Everybody was saying, “Oh, I think deep learning is going to be sped up by quantum computers.” I was like, “Well, the best way to predict the future is to invent it. So, here’s a 100-page paper, have fun.” Essentially, in quantum computing, you usually embed reversible operations into a quantum computation.

(01:24:47)
The trick there was to do a feedforward operation and do what we call a phase kick. But really, it’s just a force kick. You just kick the system with a certain force that is proportional to your loss function that you wish to optimize. And then, by performing uncomputation, you start with a superposition over parameters, which is pretty funky. Now, you don’t have just a point for parameters, you have a superposition over many potential parameters. Our goal is-
Lex Fridman
(01:25:24)
Is using phase kick somehow to adjust the parameters?
Guillaume Verdon
(01:25:28)
Right. Because phase kicks emulate having the parameter space be like a particle in N dimensions, and you’re trying to get the Schrödinger equation, Schrödinger dynamics, in the loss landscape of the neural network. You do an algorithm to induce this phase kick, which involves a feedforward and a kick. And then, when you uncompute the feedforward, all the errors in these phase kicks and these forces back-propagate and hit each one of the parameters throughout the layers.

(01:26:04)
If you alternate this with an emulation of kinetic energy, then it’s like a particle moving in N dimensions, a quantum particle. The advantage in principle would be that it can tunnel through the landscape and find new optima that would’ve been difficult for stochastic optimizers. But again, this is a theoretical thing, and in practice, with at least the current architectures for quantum computers that we have planned, such algorithms would be extremely expensive to run.
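A rough classical analogue of the “force kick plus kinetic energy” alternation is momentum-based descent: kick the velocity with a force proportional to the loss gradient, then drift the position. This toy sketch is only the classical shadow of the idea; the quantum version evolves a wavefunction over the parameters and can tunnel, which a classical particle cannot:

```python
# Classical analogue of alternating "force kicks" and "kinetic" drift steps
# on a toy loss L(theta) = theta**2 (gradient 2*theta). Friction stands in
# for dissipation so the particle settles into the minimum.
def grad(theta):
    return 2.0 * theta

theta, velocity = 3.0, 0.0
dt, friction = 0.1, 0.9
for _ in range(200):
    velocity = friction * velocity - dt * grad(theta)  # force kick
    theta = theta + dt * velocity                      # drift step

print(theta)  # settles near the minimum at 0
```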

Quantum computer

Lex Fridman
(01:26:41)
Maybe this is a good place to ask the difference between the different fields that you’ve had a toe in. So, mathematics, physics, engineering, and also entrepreneurship, the different layers of the stack. I think a lot of the stuff you’re talking about here is a little bit on the math side, maybe physics almost working in theory.
Guillaume Verdon
(01:27:03)
Mm-hmm.
Lex Fridman
(01:27:03)
What’s the difference between math, physics, engineering, and making a product for a quantum computing for quantum machine learning?
Guillaume Verdon
(01:27:14)
Yeah. Some of the original team for the TensorFlow Quantum project, which we started in school, at University of Waterloo, there was myself. Initially, I was a physicist, applied mathematician. We had a computer scientist, we had a mechanical engineer, and then we had a physicist. That was experimental primarily. Putting together teams that are very cross-disciplinary and figuring out how to communicate and share knowledge is really the key to doing this interdisciplinary engineering work.

(01:27:51)
There is a big difference. In mathematics, you can explore mathematics for mathematics’ sake. In physics, you’re applying mathematics to understand the world around us. And in engineering, you’re trying to hack the world. You’re trying to find how to apply the physics that I know, my knowledge of the world, to do things.
Lex Fridman
(01:28:11)
Well, in quantum computing in particular, I think there’s just a lot of limits to engineering. It just seems to be extremely hard.
Guillaume Verdon
(01:28:17)
Yeah.
Lex Fridman
(01:28:18)
So, there’s a lot of value to be exploring quantum computing, quantum machine learning in theory with math. I guess one question is, why is it so hard to build a quantum computer? What’s your view of timelines in bringing these ideas to life?
Guillaume Verdon
(01:28:43)
Right. I think that an overall theme of my company is that we have folks that are… There’s a sort of exodus from quantum computing and we’re going to broader physics-based AI that is not quantum. So, that gives you a hint.
Lex Fridman
(01:29:00)
We should say the name of your company is Extropic?
Guillaume Verdon
(01:29:03)
Extropic, that’s right. We do physics-based AI, primarily based on thermodynamics rather than quantum mechanics. But essentially, a quantum computer is very difficult to build because you have to induce this zero-temperature subspace of information. The way to do that is by encoding information: you encode a code within a code, within a code, within a code. There’s a lot of redundancy needed to do this error correction, but ultimately, it’s a sort of algorithmic refrigerator, really. It’s just pumping entropy out of the subsystem that is virtual and delocalized, which represents your “logical qubits”, aka the payload quantum bits in which you actually want to run your quantum mechanical program. It’s very difficult because in order to scale up your quantum computer, you need each component to be of sufficient quality for it to be worth it. Because if you try to do this quantum error correction process in each quantum bit and your control over them is insufficient, it’s not worth scaling up. You’re actually adding more errors than you remove. There’s this notion of a threshold: if your quantum bits are of sufficient quality in terms of your control over them, it’s actually worth scaling up. Actually, in recent years, people have been crossing the threshold and it’s starting to be worth it.
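The threshold intuition can be sketched numerically. The scaling below is a stylized, surface-code-like form with illustrative constants, not real hardware numbers: below the threshold error rate, growing the code distance suppresses logical errors; above it, scaling up actively hurts:

```python
# Stylized threshold behavior: logical error rate ~ A * (p / p_th)^((d+1)//2).
# Constants and exponent are illustrative only.
def logical_error_rate(p_physical, d, p_threshold=0.01):
    return 0.1 * (p_physical / p_threshold) ** ((d + 1) // 2)

below = [logical_error_rate(0.001, d) for d in (3, 5, 7)]  # improves with d
above = [logical_error_rate(0.015, d) for d in (3, 5, 7)]  # degrades with d

print(below)  # each larger code distance suppresses errors further
print(above)  # above threshold, more redundancy adds more errors than it removes
```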

(01:30:38)
It’s just a very long slog of engineering, but ultimately, it’s really crazy to me how much exquisite level of control we have over these systems. It’s actually quite crazy. And people are crossing… They’re achieving milestones. It’s just in general, the media always gets ahead of where the technology is. There’s a bit too much hype. It’s good for fundraising, but sometimes it causes winters. It’s the hype cycle. I’m bullish on quantum computing on a 10, 15-year timescale personally, but I think there’s other quests that can be done in the meantime. I think it’s in good hands right now.
Lex Fridman
(01:31:22)
Well, let me just explore different beautiful ideas, large or small, in quantum computing that might jump out at you from memory when you co-authored a paper titled Asymptotically Limitless Quantum Energy Teleportation via Qudit Probes. Just out of curiosity, can you explain what a qudit is versus a qubit?
Guillaume Verdon
(01:31:45)
Yeah. It’s a D-state qubit.
Lex Fridman
(01:31:49)
It’s a multidimensional?
Guillaume Verdon
(01:31:50)
Multidimensional, right. It’s like, well, can you have a notion of an integer or floating point that is quantum mechanical? That’s something I’ve had to think about. I think that research was a precursor to later work on quantum analog-digital conversion. It was interesting because during my masters, I was trying to understand the energy and entanglement of the vacuum, of emptiness. Emptiness has energy, which is very weird to say. Our equations of cosmology don’t match our calculations for the amount of quantum energy there is in the fluctuations.

(01:32:36)
I was trying to hack the energy of the vacuum, and the reality is that you can’t just directly hack it. It’s not technically free energy. Your lack of knowledge of the fluctuations means you can’t extract the energy. But just like the stock market, if you have a stock that’s correlated over time, the vacuum’s actually correlated. If you measured the vacuum at one point, you acquired information. If you communicated that information to another point, you can infer what configuration the vacuum is in to some precision and statistically extract, on average, some energy there. So, you’ve “teleported energy”.
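The stock-market analogy here is essentially a statement about correlated random variables: measuring one lets you statistically predict the other. This is only a classical statistical analogy, not the quantum energy-teleportation protocol itself, with a made-up correlation coefficient:

```python
# Two zero-mean unit-variance variables with correlation rho: once you
# measure a, the best linear prediction of b is rho * a, and the residual
# uncertainty drops from variance 1 to 1 - rho**2.
import random

random.seed(0)
rho = 0.8
pairs = []
for _ in range(100_000):
    shared = random.gauss(0, 1)
    a = shared
    b = rho * shared + (1 - rho**2) ** 0.5 * random.gauss(0, 1)
    pairs.append((a, b))

residual = sum((b - rho * a) ** 2 for a, b in pairs) / len(pairs)
print(residual)  # close to 1 - 0.8**2 = 0.36
```

In the vacuum case the “measurement at one point” plays the role of a, and the statistically inferred configuration elsewhere plays the role of b, with the correlations decaying over distance.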

(01:33:18)
To me, that was interesting because you could create pockets of negative energy density, which is energy density below that of the vacuum, which is very weird because we don’t understand how the vacuum gravitates. There are theories where the vacuum, or the canvas of space-time itself, is really a canvas made out of quantum entanglement. I was studying how decreasing the energy of the vacuum locally increases quantum entanglement, which is very funky.

(01:33:58)
The thing there is that, if you’re into weird theories about UAPs and whatnot, you could try to imagine that they’re around. And how would they propel themselves? How would they go faster than the speed of light? You would need a sort of negative energy density. To me, I gave it the old college try, trying to hack the energy of the vacuum, and hit the limits allowable by the laws of physics. But there’s all sorts of caveats there, where you can’t extract more than you’ve put in, obviously.
Lex Fridman
(01:34:41)
But you’re saying it’s possible to teleport the energy because you can extract information one place and then make, based on that, some kind of prediction about another place?
Guillaume Verdon
(01:34:56)
Mm-hmm.
Lex Fridman
(01:34:57)
I’m not sure what to make of that.
Guillaume Verdon
(01:34:58)
Yeah, it’s allowable by the laws of physics. The reality though is that the correlations decay with distance.
Lex Fridman
(01:35:06)
Sure.
Guillaume Verdon
(01:35:06)
And so, you’re going to have to pay the price not too far away from where you extract it.

Aliens

Lex Fridman
(01:35:11)
The precision decreases in terms of your ability, but still. But since you mentioned UAPs, we talked about intelligence, and I forgot to ask, what’s your view on the other possible intelligences that are out there at the meso scale? Do you think there are other intelligent alien civilizations? Is that useful to think about? How often do you think about it?
Guillaume Verdon
(01:35:36)
I think it’s useful to think about. It’s useful to think about because we got to ensure we’re anti-fragile, and we’re trying to increase our capabilities as fast as possible. Because we could get disrupted. There’s no laws of physics against there being life elsewhere that could evolve and become an advanced civilization and eventually come to us. Do I think they’re here now? I’m not sure. I’ve read what most people have read on the topic.

(01:36:14)
I think it’s interesting to consider and to me, it’s a useful thought experiment to instill a sense of urgency in developing technologies and increasing our capabilities, to make sure we don’t get disrupted. Whether it’s a form of AI that disrupts us, or a foreign intelligence from a different planet. Either way, increasing our capabilities and becoming formidable as humans, I think that’s really important, so that we’re robust against whatever the universe throws at us.
Lex Fridman
(01:36:51)
But to me, it’s also an interesting challenge and thought experiment on how to perceive intelligence. This has to do with quantum mechanical systems. This has to do with any kind of system that’s not like humans. To me, the thought experiment is, say, the aliens are here or they are directly observable. We’re just too blind, too self-centered, don’t have the right sensors, or don’t have the right processing of the sensor data to see the obvious intelligence that’s all around us.
Guillaume Verdon
(01:37:26)
Well, that’s why we work on quantum sensors. They can sense gravity.
Lex Fridman
(01:37:31)
Yeah. That’s a good one, but there could be other stuff that’s not even in the currently known forces of physics.
Guillaume Verdon
(01:37:43)
Right.
Lex Fridman
(01:37:43)
There could be some other stuff. The most entertaining thought experiment to me is that it’s other stuff that’s obvious. It’s not that we lack the sensors. It’s all around us, consciousness being one possible example. But there could be stuff that’s just obviously there, that once you know it, it’s like, “Oh, right. Right. The thing we thought is somehow emergent from the laws of physics we understand is actually a fundamental part of the universe, and can be incorporated into physics once understood.”
Guillaume Verdon
(01:38:18)
Statistically speaking, if we observed some sort of alien life, it would most likely be some sort of virally self-replicating, von Neumann-like probe system. And it’s possible that there are such systems that, I don’t know what they’re doing at the bottom of the ocean, allegedly, but maybe they’re collecting minerals from the bottom of the ocean.
Lex Fridman
(01:38:44)
Yeah.
Guillaume Verdon
(01:38:45)
But that wouldn’t violate any of my priors. But am I certain that these systems are here? It’d be difficult for me to say so. I only have secondhand information about there being data.
Lex Fridman
(01:38:59)
About the bottom of the ocean? Yeah. But could it be things like memes? Could it be thoughts and ideas? Could they be operating at that medium? Could aliens be the very thoughts that come into my head? What’s the origin of ideas? In your mind, when an idea comes to your head, show me where it originates.
Guillaume Verdon
(01:39:25)
Frankly, when I had the idea for the type of computer I’m building now, I think it was eight years ago now, it really felt like it was being beamed from space. I was in bed, just shaking, just thinking it through. I don’t know. But do I believe that legitimately? I don’t think so. But I think that alien life could take many forms, and I think the notion of intelligence and the notion of life needs to be expanded much more broadly to be less anthropocentric or biocentric.

Quantum gravity

Lex Fridman
(01:40:04)
Just to linger a little longer on quantum mechanics, through all your explorations on quantum computing, what’s the coolest, most beautiful idea that you’ve come across that has been solved or has not yet been solved?
Guillaume Verdon
(01:40:19)
I think the journey to understand something called AdS/CFT. So, the journey to understand quantum gravity through this picture, where a hologram of lesser dimension is actually dual or exactly corresponding to a bulk theory of quantum gravity of an extra dimension, and the fact that this sort of duality comes from trying to learn deep learning-like representations of the boundary.

(01:40:59)
At least, part of my journey someday on my bucket list is to apply quantum machine learning to these sorts of systems, these CFTs, or they’re called SYK models, and learn an emergent geometry from the boundary theory. And so, we can have a form of machine learning to help us understand quantum gravity, which is still a holy grail that I would like to hit before I leave this earth.
Lex Fridman
(01:41:35)
What do you think is going on with black holes, as information-storing and processing units?
Guillaume Verdon
(01:41:46)
Black holes are really fascinating objects. They’re at the interface between quantum mechanics and gravity, and so they help us test all sorts of ideas. I think that for many decades now, there’s been this black hole information paradox: things that fall into the black hole seem to have lost their information. Now, I think there’s this firewall paradox that has been allegedly resolved in recent years by a former peer of mine, who’s now a professor at Berkeley. There, it seems like, as information falls into a black hole, there’s a sedimentation. From the point of view of the observer on the outside, the object slows down infinitely as it gets closer and closer to the horizon.

(01:42:46)
Everything that is falling into a black hole, from our perspective, gets sedimented and tacked on near the horizon. At some point, it gets so close to the horizon that it’s at the proximity, or the scale, at which quantum effects and quantum fluctuations matter. There, that infalling matter could interfere with the traditional picture: it could interfere with the creation and annihilation of particles and antiparticles in the vacuum. Through this interference, one of the particles gets entangled with the infalling information, and the other is now free and escapes. That’s how there’s mutual information between the outgoing radiation and the infalling matter. But getting that calculation right, I think we’re only just starting to put the pieces together.
Lex Fridman
(01:43:43)
There’s a few pothead-like questions I want to ask you.
Guillaume Verdon
(01:43:46)
Sure.
Lex Fridman
(01:43:46)
One, does it terrify you that there’s a giant black hole at the center of our galaxy?
Guillaume Verdon
(01:43:52)
I don’t know. I just want to set up shop near it to fast-forward, meet a future civilization, if we have a limited lifetime, if you could go orbit a black hole and emerge.
Lex Fridman
(01:44:08)
If there’s a special mission that could take you to a black hole, would you volunteer to go travel?
Guillaume Verdon
(01:44:13)
To orbit and obviously not fall into it.
Lex Fridman
(01:44:15)
That’s obvious. It’s obvious to you that everything’s destroyed inside a black hole? All the information that makes up Guillaume is destroyed? Maybe on the other side, Beff Jezos emerges and it’s just all like it’s tied together in some deeply memeful way.
Guillaume Verdon
(01:44:32)
Yeah, that’s a great question. We have to answer what black holes are. Are we punching a hole through space-time and creating a pocket universe? It’s possible. Then, that would mean that if we ascend the Kardashev scale to beyond Kardashev Type III, we could engineer black holes with specific hyperparameters to transmit information to new universes we create. And so, we can have progeny that are new…
Guillaume Verdon
(01:45:00)
… have progeny that are new universes. And so even though our universe may reach a heat death, we may have a way to have a legacy. And so we don’t know yet. We need to ascend the Kardashev Scale to answer these questions to peer into that regime of higher energy physics.

Kardashev scale

Lex Fridman
(01:45:25)
And maybe you can speak to the Kardashev Scale for people who don’t know. So one of the sort of meme-like principles and goals of the e/acc movement is to ascend the Kardashev Scale. What is the Kardashev Scale and when do we want to ascend it?
Guillaume Verdon
(01:45:43)
The Kardashev Scale is a measure of our energy production and consumption. Really, it’s a logarithmic scale. Kardashev Type I is a milestone where we are producing the equivalent wattage to all the energy that is incident on earth from the sun. Kardashev Type II would be harnessing all the energy that is output by the sun. And I think Type III is like the whole galaxy equivalent-
Lex Fridman
(01:46:13)
Galaxy, I think [inaudible 01:46:14] yeah.
Guillaume Verdon
(01:46:15)
Yeah, and then some people have some crazy Type IV and V, but I don’t know if I believe in those. But to me, it seems like from the first principles of thermodynamics that, again, there’s this concept of thermodynamics-driven dissipative adaptation, where life evolved on earth because we have this energetic drive from the sun. We have incident energy, and life evolved on earth to figure out ways to best capture that free energy to maintain itself and grow. And I think that principle is not special to our earth-sun system. We can extend life well beyond, and we kind of have a responsibility to do so, because that’s the process that brought us here. We don’t even know what it has in store for us in the future. It could be something of beauty we can’t even imagine today.
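For reference, the logarithmic scale described here is often formalized with Carl Sagan’s interpolation formula, K = (log10(P) − 6) / 10, where P is power in watts. A minimal sketch (Sagan’s version puts Type I at 10^16 W, close to, though not exactly, the solar flux incident on earth that Guillaume mentions):

```python
import math

def kardashev_type(power_watts: float) -> float:
    """Carl Sagan's interpolation of the Kardashev scale:
    K = (log10(P) - 6) / 10, with P in watts."""
    return (math.log10(power_watts) - 6) / 10

print(round(kardashev_type(1e16), 3))  # 1.0: Type I (roughly planetary-scale power)
print(round(kardashev_type(1e26), 3))  # 2.0: Type II (roughly stellar-scale power)
print(round(kardashev_type(1e36), 3))  # 3.0: Type III (roughly galactic-scale power)
```

Each step up the scale is ten billion times more power, which is why "ascending the Kardashev gradient" is a multi-century project.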

Effective accelerationism (e/acc)

Lex Fridman
(01:47:18)
So this is probably a good place to talk a bit about the e/acc movement. In a Substack blog post titled What the Fuck is e/acc?, or actually, What the F* is e/acc?, you write, “Strategically speaking, we need to work towards several overarching civilization goals that are all interdependent.” And the four goals are: increase the amount of energy we can harness as a species (climb the Kardashev gradient), which in the short term almost certainly means nuclear fission; increase human flourishing via pro-population growth policies and pro-economic growth policies; create artificial general intelligence, the single greatest force multiplier in human history; and finally, develop interplanetary and interstellar transport so that humanity can spread beyond the earth. Could you build on top of that to maybe say, what to you is the e/acc movement? What are the goals? What are the principles?
Guillaume Verdon
(01:48:20)
The goal is for the human techno-capital memetic machine to become self-aware and to hyperstitiously engineer its own growth. So let’s decompress that.
Lex Fridman
(01:48:33)
Define each of those words.
Guillaume Verdon
(01:48:35)
So you have humans, you have technology, you have capital, and then you have memes, information, and all of those systems are coupled with one another. Humans work at companies, they acquire and allocate capital, and humans communicate via memes and information propagation. And our goal was to have a sort of viral optimistic movement that is aware of how the system works, fundamentally it seeks to grow, and we simply want to lean into the natural tendencies of the system to adapt for its own growth.
Lex Fridman
(01:49:18)
So in that way, you write, e/acc is literally “a memetic optimism virus that is constantly drifting, mutating, and propagating in a decentralized fashion.” So memetic optimism virus. You do want it to be a virus to maximize the spread, and it’s hyperstitious, therefore the optimism will incentivize its growth.
Guillaume Verdon
(01:49:43)
We see e/acc as sort of a meta-heuristic, sort of very thin cultural framework from which you can have much more opinionated forks. Fundamentally, we just say that what got us here is this adaptation of the whole system based on thermodynamics, and that process is good and we should keep it going. That is the core thesis. Everything else is, okay, how do we ensure that we maintain this malleability and adaptability? Well, clearly not suppressing variants, and maintaining free speech, freedom of thought, freedom of information propagation, and freedom to do AI research is important for us to converge the fastest on the space of technologies, ideas, and whatnot that lead to this growth. And so ultimately, there’s been quite a few forks. Some are just memes, but some are more serious. Vitalik Buterin recently made a d/acc fork. He has his own sort of fine-tunings of e/acc.
Lex Fridman
(01:50:59)
Does anything jump out to memory of the unique characteristic of that fork from Vitalik?
Guillaume Verdon
(01:51:05)
I would say that it’s trying to find a middle ground between e/acc and EA and AI safety. To me, having a movement that is opposite to what was the mainstream narrative that was taking over Silicon Valley was important to shift the dynamic range of opinions. And it’s like the balance between centralization and decentralization, the real optimum is always somewhere in the middle. But for e/acc, we’re pushing for entropy, novelty, disruption, malleability, speed, rather than being conservative, suppressing thought, suppressing speech, adding constraints, adding too many regulations, slowing things down. And so, we’re trying to bring balance to the force.
Lex Fridman
(01:52:00)
Balance to the force of human civilization.
Guillaume Verdon
(01:52:02)
It’s literally the forces of constraints versus the entropic force that makes us explore. Systems are optimal when they’re at the edge of criticality between order and chaos, between constraints, energy minimization and entropy. Systems want to equilibrate, balance these two things. I thought that the balance was lacking, and so we created this movement to bring balance.
Lex Fridman
(01:52:31)
Well, I like the visual of the landscape of ideas evolving through forks. So on the other part of history, thinking of Marxism as the original repository, and then Soviet Communism is a fork of that, and then the Maoism is a fork of Marxism and Communism. And so those are all forks. They’re exploring different ideas.
Guillaume Verdon
(01:53:02)
Thinking of culture almost like code. Nowadays, what you prompt in the LLM or what you put in the constitution of an LLM is basically its cultural framework, what it believes. And you can share it on GitHub nowadays. So trying to take inspiration from what has worked in this machine of software to adapt over the space of code, could we apply that to culture? And our goal is to not say, “You should live your life this way, X, Y, Z,” it’s to set up a process where people are always searching over subcultures and competing for mind share. I think creating this malleability of culture is super important for us to converge onto the cultures and the heuristics about how to live one’s life that are updated to modern times.

(01:53:59)
Because there’s really been a sort of vacuum of spirituality and culture. People don’t feel like they belong to any one group, and there have been parasitic ideologies that have taken the opportunity to populate this Petri dish of minds. Elon calls it the mind virus. We call it the decel mind virus complex, the decelerative pattern that is kind of common to all of them. There are many variants as well. And so if there’s a sort of viral, pessimistic, decelerative movement, we needed to have not only one movement but many, many variants, so it’s very hard to pinpoint and stop.
Lex Fridman
(01:54:45)
But the overarching thing is nevertheless a kind of memetic optimism pandemic. Okay, let me ask you, do you think e/acc to some degree is a cult?
Guillaume Verdon
(01:55:01)
Define cult?
Lex Fridman
(01:55:03)
I think a lot of human progress is made when you have independent thought, so you have individuals that are able to think freely. And very powerful memetic systems can kind of lead to groupthink. There’s something in human nature that leads to mass hypnosis, mass hysteria. We start to think alike whenever there’s a sexy idea that captures our minds. And so it’s actually hard to break us apart, pull us apart, diversify thought. So to that degree, to which degree is everybody kind of chanting “e/acc, e/acc” like the sheep in Animal Farm?
Guillaume Verdon
(01:55:46)
Well, first of all, it’s fun. It’s rebellious. There’s this concept of meta-irony, of being on the boundary of, “We’re not sure if they’re serious or not.” And it’s much more playful and much more fun. For example, we talk about thermodynamics being our god, and sometimes we do cult-like things, but there’s no ceremony and robes and whatnot.
Lex Fridman
(01:56:19)
Not yet.
Guillaume Verdon
(01:56:19)
Not yet, no. But ultimately, yeah, I totally agree that it seems to me that humans want to feel like they’re part of a group, so they naturally try to agree with their neighbors and find common ground. And that leads to sort of mode collapse in the space of ideas. We used to have one cultural island that was allowed. It was a typical subspace of thought, and anything that was diverting from that subspace of thought was suppressed or you were canceled. Now we’ve created a new mode, but the whole point is that we’re not trying to have a very restricted space of thought. There’s not just one way to think about e/acc and its many forks. And the point is that there are many forks and there can be many clusters and many islands.

(01:57:07)
And I shouldn’t be in control of it in any way. I mean, there’s no formal org whatsoever. I just put out tweets and certain blog posts, and people are free to defect and fork if there’s an aspect they don’t like. And so that makes it so that there should be deterritorialization in the space of ideas, so that we don’t end up in one cluster that’s very cult-like. And so cults usually, they don’t allow people to defect or start competing forks, whereas we encourage it.

Humor and memes

Lex Fridman
(01:57:51)
The pros and cons of humor and memes. In some sense, there’s a wisdom to memes. What is it, the Magic Theater? What book is that from? Hermann Hesse. Steppenwolf, I think. There’s a kind of embracing of the absurdity that seems to get to the truth of things, but at the same time, it can also decrease the quality and the rigor of the discourse.
Guillaume Verdon
(01:58:22)
Yeah.
Lex Fridman
(01:58:23)
Do you feel the tension of that?
Guillaume Verdon
(01:58:25)
Yeah. So initially, I think what allowed us to grow under the radar was because it was camouflaged as sort of meta-ironic. We would sneak in deep truths within a package of humor and memes and what are called shit posts, and I think that was purposefully camouflaged against those that seek status and do not want to… It’s very hard to argue with a cartoon frog or a cartoon of an intergalactic Jeff Bezos and take yourself seriously, and so that allowed us to grow pretty rapidly in the early days. But of course, essentially people get steered. Their notion of the truth comes from the data they see, from the information they’re fed, and the information people are fed is determined by algorithms. And really what we’ve been doing is engineering what we call high memetic fitness packets of information, so that they can spread effectively and carry a message.

(01:59:47)
So it’s kind of a vector to spread the message. And yes, we’ve been using techniques that are optimal for today’s algorithmically-amplified information landscapes. But I think we’re reaching the point of scale where we can have serious debates and serious conversations. That’s why we’re considering doing a bunch of debates and having more serious long-form discussions, because I don’t think that the timeline is optimal for very serious, thoughtful discussions. You get rewarded for polarization. And so even though we started a movement that is literally trying to polarize the tech ecosystem, at the end of the day it’s so that we can have a conversation and find an optimum together.

Jeff Bezos

Lex Fridman
(02:00:42)
I mean, that’s kind of what I try to do with this podcast given the landscape of things, to still have long-form conversations. But there is a degree to which absurdity is fully embraced. In fact, this very conversation is multi-level absurd. So first of all, I should say that just very recently I had a conversation with Jeff Bezos, and I would love to hear your, Beff Jezos, opinions of Jeff Bezos. Speaking of intergalactic Jeff Bezos. What do you think of that particular individual whom your name has inspired?
Guillaume Verdon
(02:01:25)
Yeah, I think Jeff is really great. I mean, he’s built one of the most epic companies of all time. He’s leveraged the techno-capital machine and techno-capital acceleration to give us what we wanted. We want quick delivery, very convenient, at-home, low prices. He understood how the machine worked and how to harness it: running the company, not trying to take profits too early, putting it back, letting the system compound and keep improving. And arguably, I think Amazon’s invested some of the most capital in robotics out there, and certainly with the birth of AWS, kind of enabled the tech boom we’ve seen today that has paid the salaries of, I guess, myself and all of our friends to some extent. And so I think we can all be grateful to Jeff, and he’s one of the great entrepreneurs out there, one of the best of all time, unarguably.
Lex Fridman
(02:02:32)
And of course, the work at Blue Origin, similar to the work at SpaceX, is trying to make humans a multi-planetary species, which that seems almost like a bigger thing than the capitalist machine. Or it’s the capitalist machine at a different timescale perhaps?
Guillaume Verdon
(02:02:47)
Yeah, I think that companies tend to optimize quarter over quarter, maybe a few years out, but individuals that want to leave a legacy can think on a multi-decadal or multi-century timescale. And so some individuals are such good capital allocators that they unlock the ability to allocate capital to goals that take us much further or are much further-looking. Elon’s doing this with SpaceX, putting all this capital towards getting us to Mars. Jeff is trying to build Blue Origin, and I think he wants to build O’Neill cylinders and get industry off-planet, which I think is brilliant.

(02:03:33)
I think just overall, I’m for billionaires. I know this is a controversial statement sometimes, but I think that in a sense it’s kind of proof-of-stake voting. If you’ve allocated capital efficiently, you unlock more capital to allocate, just because clearly you know how to allocate capital more efficiently. Which is in contrast to politicians that get elected because they speak the best on TV, not because they have a proven track record of allocating taxpayer capital most efficiently. And so that’s why I’m for capitalism over, say, giving all our money to the government and letting them figure out how to allocate it.
Lex Fridman
(02:04:20)
Why do you think it’s a viral and it’s a popular meme to criticize billionaires? Since you mentioned billionaires. Why do you think there’s quite a widespread criticism of people with wealth, especially those in the public eye, like Jeff and Elon and Mark Zuckerberg, and who else? Bill Gates.
Guillaume Verdon
(02:04:44)
Yeah, I think a lot of people would, instead of trying to understand how the techno-capital machine works and realizing they have much more agency than they think, they’d rather have this sort of victim mindset. “I’m just subjected to this machine. It is oppressing me. And the successful players clearly must be evil because they’ve been successful at this game that I’m not successful at.” But I’ve managed to get some people that were in that mindset and make them realize how the techno-capital machine works and how you can harness it for your own good and for the good of others. And by creating value, you capture some of the value you create for the world. That sort of positive sum mindset shift is so potent, and really, that’s what we’re trying to do by scaling e/acc, is unlocking that higher level of agency. Actually, you’re far more in control of the future than you think. You have agency to change the world, go out and do it. Here’s permission.
Lex Fridman
(02:05:46)
Each individual has agency. The motto, “Keep building” is often heard. What does that mean to you, and what does that have to do with Diet Coke? By the way, thank you so much for the Red Bull. It’s working pretty well. I’m feeling pretty good.
Guillaume Verdon
(02:06:03)
Awesome. Well, so building technologies and building… It doesn’t have to be technologies; building in general means having agency, trying to change the world by creating, let’s say, a company, which is a self-sustaining organism that accomplishes a function in the broader techno-capital machine. To us, that’s the way to achieve the change in the world that you’d like to see, rather than, say, pressuring politicians or creating nonprofits. Nonprofits, once they run out of money, their function can no longer be accomplished. You’re kind of deforming the market artificially, compared to sort of courting the market, or dancing with the market, to convince it that actually this function is important, adds value, and here it is. And so I think this is the difference between the de-growth, ESG approach and, say, Elon’s. The de-growth approach is like, “We’re going to manage our way out of a climate crisis.” And Elon is like, “I’m going to build a company that is self-sustaining, profitable, and growing, and we’re going to innovate our way out of this dilemma.” And we’re trying to get people to do the latter rather than the former, at all scales.

Elon Musk

Lex Fridman
(02:07:26)
Elon is an interesting case. You are a proponent, you celebrate Elon, but he’s also somebody who has for a long time warned about the dangers, the potential dangers, existential risks of artificial intelligence. How do you square the two? Is that a contradiction to you?
Guillaume Verdon
(02:07:45)
It is somewhat because he’s very much against regulation in many aspects. But for AI, he’s definitely a proponent of regulations. I think overall he saw the dangers of, say, OpenAI cornering the market and then getting to have the monopoly over the cultural priors that you can embed in these LLMs that then, as LLMs now become the source of truth for people, then you can shape the culture of the people. And so you can control people by controlling LLMs. He saw that, just like it was the case for social media, if you shape the function of information propagation, you can shape people’s opinions. He sought to make a competitor. So at least, I think we’re very aligned there, that the way to a good future is to maintain adversarial equilibria between the various AI players. I’d love to talk to him to understand his thinking about how to advance AI going forwards. I mean, he’s also hedging his bets, I would say, with Neuralink. I think if he can’t stop the progress of AI, he’s building the technology to merge. Look at the actions, not just the words.
Lex Fridman
(02:09:10)
Well, there’s some degree where being concerned… Maybe using human psychology, being concerned about threats all around us is a motivator. It’s an encouraging thing. I operate much better when there’s a deadline. The fear of the deadline. And I, for myself, create artificial things, like I want to create in myself this kind of anxiety as if something really horrible will happen if I miss the deadline. I think there’s some degree of that here, because creating AI that’s aligned with humans has a lot of potential benefits. And so a different way to reframe that is, “If you don’t, we’re all going to die.” It just seems to be a very powerful psychological formulation of the goal of creating human-aligned AI.
Guillaume Verdon
(02:09:59)
I think that anxiety is good. I think, like I said, I want the free market to create aligned AIs that are reliable, and I think that’s what he’s trying to do with xAI. So I’m all for it. What I am against is stopping, let’s say, the open-source ecosystem from thriving by, let’s say in the executive order, claiming that open-source LLMs are dual-use technologies and should be government-controlled. Then everybody needs to register their GPU and their big matrices with the government. And I think that extra friction will dissuade a lot of hackers from contributing, hackers that could later become the researchers that make key discoveries that push us forward, including discoveries for AI safety. And so I think I just want to maintain ubiquity of opportunity to contribute to AI and to own a piece of the future. It can’t just be legislated behind some wall where only a few players get to play the game.
Lex Fridman
(02:11:08)
The e/acc movement is often caricatured to mean progress and innovation at all costs. Doesn’t matter how unsafe it is, doesn’t matter if it causes a lot of damage. You just build cool shit as fast as possible, stay up all night with a Diet Coke, whatever it takes. I guess, I don’t know if there’s a question in there, but how important to you and what you’ve seen the different formulations of e/acc, is AI safety?
Guillaume Verdon
(02:11:44)
Again, I think if there was no one working on it, I would be a proponent of it. I think, again, our goal is to bring balance, and obviously a sense of urgency is a useful tool to make progress. It hacks our dopaminergic systems and gives us energy to work late into the night. It also helps to have a higher purpose you’re contributing to. At the end of the day, it’s like, what am I contributing to? I’m contributing to the growth of this beautiful machine so that we can reach for the stars. That’s really inspiring. That’s also a sort of neuro hack.
Lex Fridman
(02:12:26)
So you’re saying AI safety is important to you, but right now the landscape of ideas you see is, AI safety as a topic is used more often to gain centralized control. So in that sense, you’re resisting it, as a proxy for gaining centralized control?
Guillaume Verdon
(02:12:43)
Yeah, I just think we have to be careful, because safety is just the perfect cover for centralization of power and, eventually, covering up corruption. I’m not saying it’s corrupted now, but it could be down the line. And really, if you let the argument run, there’s no amount of centralization of control that will be enough to ensure your safety. There are always more nines of P(safety) that you can gain: you’re 99.9999% safe, but maybe you want another nine. “Oh, please give us full access to everything you do. Full surveillance.” And frankly, those that are proponents of AI safety have proposed having a global panopticon, where you have centralized perception of everything going on. And to me, that just opens the door wide open for a Big Brother, 1984-like scenario. And that’s not a future I want to live in.
Lex Fridman
(02:13:49)
Because we have some examples throughout history when that did not lead to a good outcome.

Extropic

Guillaume Verdon
(02:13:54)
Right.
Lex Fridman
(02:13:56)
You mentioned you founded a company, Extropic, that recently announced a $14.1 million seed round. What’s the goal of the company? You’re talking about a lot of interesting physics things, so what are you up to over there that you can talk about?
Guillaume Verdon
(02:14:12)
Yeah, originally we weren’t going to announce last week, but I think with the doxing and disclosure, our hand was forced. So we had to disclose roughly what we were doing. But really, Extropic was born from my dissatisfaction, and that of my colleagues, with the quantum computing roadmap. Quantum computing was sort of the first path to physics-based computing that was trying to commercially scale, and I was working on physics-based AI that runs on these physics-based computers. But ultimately, our greatest enemy was this pervasive problem of noise. As I mentioned, you have to constantly pump the noise out of the system to maintain this pristine environment where quantum mechanics can take effect, and that constraint was just too much. It’s too costly to do that.

(02:15:11)
And so we were wondering, as generative AI is sort of eating the world, more and more of the world’s computational workloads are focused on generative AI, how could we use physics to engineer the ultimate physical substrate for generative AI from first principles of physics, of information theory, of computation, and ultimately of thermodynamics? And so what we’re seeking to build is a physics-based computing system and physics-based AI algorithms that are inspired by out-of-equilibrium thermodynamics, or harness it directly to do machine learning as a physical process.
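Extropic hasn’t published technical details, so as a hedged illustration only: one textbook way to treat noise as a resource rather than something to pump out is Langevin sampling, where injected thermal noise plus gradient descent on an energy function together draw samples from a target distribution, the conceptual core of many physics-based generative methods. A toy sketch (the energy function and all parameters here are illustrative, not Extropic’s design):

```python
import math
import random

random.seed(0)

# Energy function of a standard Gaussian target: U(x) = x^2 / 2, so grad U(x) = x.
def grad_U(x: float) -> float:
    return x

# Overdamped Langevin dynamics: each step mixes a gradient-descent pull
# toward low energy with injected Gaussian noise. The noise is what
# generates the samples, instead of being an error to suppress.
def langevin_samples(n: int, step: float = 0.1) -> list:
    x, out = 0.0, []
    for _ in range(n):
        x = x - step * grad_U(x) + math.sqrt(2 * step) * random.gauss(0, 1)
        out.append(x)
    return out

xs = langevin_samples(20000)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(f"mean ~ {mean:.2f}, variance ~ {var:.2f}")  # close to the N(0, 1) target
```

With a neural network supplying the energy function instead of x^2/2, this same loop is the basis of score-based generative models, which is one reading of "machine learning as a physical process."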
Lex Fridman
(02:16:01)
So what does that mean, machine learning as a physical process? Is that hardware? Is it software? Is it both? Is it trying to do the full stack in some kind of unique way?
Guillaume Verdon
(02:16:10)
Yes, it is full stack. We’re folks that have built differentiable programming into the quantum computing ecosystem with TensorFlow Quantum. One of my co-founders of TensorFlow Quantum is the CTO, Trevor McCourt. We have some of the best quantum computer architects, those that have designed IBM’s and AWS’s systems. They’ve left quantum computing to help us build what we call a thermodynamic computer.
Lex Fridman
(02:16:43)
A thermodynamic computer. Well, actually let’s linger around TensorFlow Quantum. What lessons have you learned from TensorFlow Quantum? Maybe you can speak to what it takes to create essentially, what, like a software API to a quantum computer?
Guillaume Verdon
(02:17:01)
Right. That was a challenge to invent, to build, and then to get to run on the real devices.
Lex Fridman
(02:17:09)
Can you actually speak to what it is?
Guillaume Verdon
(02:17:11)
Yeah. TensorFlow Quantum was an attempt at… well, I guess we succeeded, at combining deep learning, or differentiable classical programming, with quantum computing: having types of quantum programs that are differentiable. Andrej Karpathy calls differentiable programming Software 2.0. It’s like, gradient descent is a better programmer than you. And the idea was that in the early days of quantum computing, you can only run short quantum programs. So which quantum programs should you run? Well, just let gradient descent find those programs instead. And so we built the first infrastructure to not only run differentiable quantum programs, but combine them as part of broader deep learning graphs, incorporating deep neural networks, the ones you know and love, with what are called quantum neural networks.

(02:18:21)
And ultimately, it was a very cross-disciplinary effort. We had to invent all sorts of ways to differentiate, to backpropagate through the hybrid graph. But ultimately, it taught me that the way to program matter and to program physics is by differentiating through control parameters. If you have parameters that affect the physics of the system and you can evaluate some loss function, you can optimize the system to accomplish a task, whatever that task may be. And that’s a very universal meta-framework for how to program physics-based computers.
Lex Fridman
(02:19:05)
So try to parameterize everything, make those parameters differentiable, and then optimize?
Guillaume Verdon
(02:19:12)
Yes.
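The loop confirmed here, parameterize, differentiate, optimize, can be sketched in a few lines of Python. This is a toy illustration only: the quadratic system response and its hand-derived gradient below are invented for the example, and stand in for what an autodiff framework like TensorFlow Quantum computes through a full hybrid graph.

```python
# Toy sketch of "programming physics by differentiating through control
# parameters": a parameterized system, a loss on its output, and gradient
# descent over the parameter. The system model here is invented for
# illustration; it is not TensorFlow Quantum or Extropic code.

def system_output(theta):
    # Stand-in for a physical system's measured response to control theta.
    return theta ** 2 - 2.0 * theta

def loss(theta, target=0.5):
    # Penalize deviation of the response from the desired target.
    return (system_output(theta) - target) ** 2

def grad_loss(theta, target=0.5):
    # Chain rule, written by hand; an autodiff framework would derive this.
    return 2.0 * (system_output(theta) - target) * (2.0 * theta - 2.0)

def optimize(theta=3.0, lr=0.05, steps=200):
    # "Let gradient descent find the program": iteratively adjust the control.
    for _ in range(steps):
        theta -= lr * grad_loss(theta)
    return theta

theta_star = optimize()
print(round(system_output(theta_star), 3))  # the response converges to 0.5
```

The same pattern scales up: make every control knob a differentiable parameter, attach a loss to what the system produces, and backpropagate through the whole graph, whether the layers are classical, quantum, or thermodynamic.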
Lex Fridman
(02:19:13)
Okay. Is there some more practical engineering lessons from TensorFlow Quantum? Just organizationally too, like the humans involved and how to get to a product, how to create good documentation? I don’t know. All of these little subtle things that people might not think about.
Guillaume Verdon
(02:19:34)
I think working across disciplinary boundaries is always a challenge, and you have to be extremely patient in teaching one another. I learned a lot of software engineering through the process. My colleagues learned a lot of quantum physics, and some learned machine learning through the process of building this system. And I think if you get some smart people that are passionate and trust each other in a room, and you have a small team-
Guillaume Verdon
(02:20:00)
Are passionate and trust each other in a room, and you have a small team, and you teach each other your specialties, suddenly you’re kind of forming this sort of model soup of expertise, and something special comes out of that, right? It’s like combining genes, but for your knowledge bases, and sometimes special products come out of that. And so I think, even though it’s very high friction initially to work in an interdisciplinary team, I think the product at the end of the day is worth it. And so, learned a lot trying to bridge the gap there. And I mean, it’s still a challenge to this day. We hire folks that have an AI background, folks that have a pure physics background, and somehow we have to make them talk to one another. Right?
Lex Fridman
(02:20:47)
Is there a magic, is there some science and art to the hiring process, to building a team that can create magic together?
Guillaume Verdon
(02:20:56)
Yeah, it’s really hard to pinpoint that je ne sais quoi, right?
Lex Fridman
(02:21:03)
I didn’t know you speak French. That’s very nice.
Guillaume Verdon
(02:21:07)
Yeah, I’m actually French Canadian.
Lex Fridman
(02:21:09)
Oh, you are legitimately French Canadian.
Guillaume Verdon
(02:21:09)
I am.
Lex Fridman
(02:21:11)
I thought you were just doing that for the cred.
Guillaume Verdon
(02:21:15)
No, no. I’m truly French Canadian, from Montreal. But yeah, essentially we look for people with very high fluid intelligence that aren’t overspecialized, because they’re going to have to get out of their comfort zone. They’re going to have to incorporate concepts that they’ve never seen before, and very quickly get comfortable with them, or learn to work in a team. And so that’s sort of what we look for when we hire. We can’t hire people that are just optimizing this subsystem for the past three or four years. We need really general sort of broader intelligence and specialty, and people that are open-minded, really, because if you’re pioneering a new approach from scratch, there is no textbook, there’s no reference. It’s just us, and people that are hungry to learn. So, we have to teach each other, we have to learn the literature, we have to share knowledge bases, collaborate in order to push the boundary of knowledge further together. And so, people that are used to just getting prescribed what to do at this stage, when you’re at the pioneering stage, that’s not necessarily who you want to hire. Yeah.

Singularity and AGI

Lex Fridman
(02:22:31)
So you mentioned with Extropic you’re trying to build the physical substrate for generative AI. What’s the difference between that and the AGI itself? So, is it possible that in the halls of your company, AGI will be created? Or will AGI just be using this as a substrate?
Guillaume Verdon
(02:22:51)
I think our goal is to both run human like AI, or anthropomorphic AI.
Lex Fridman
(02:22:58)
Sorry for the use of the term AGI. I know it’s triggering for you.
Guillaume Verdon
(02:23:02)
We think that the future is actually physics-based AI combined with anthropomorphic AI. So, you can imagine, I have a sort of world modeling engine through physics-based AI. Physics-based AI is better at representing the world at all scales, because it can be quantum mechanical, thermodynamic, deterministic, hybrid representations of the world, just like our world at different scales has different regimes of physics. If you inspire yourself from that in the ways you learn representations of nature, you can have much more accurate representations of nature. So, you can have very accurate world models at all scales. And so, you have the world modeling engine, and then you have the anthropomorphic AI that is human-like. So you can have the science, the playground to test your ideas, and you can have the synthetic scientist. And to us, that joint system of a physics-based and an anthropomorphic AI is the closest thing to a fully general, artificially intelligent system.
Lex Fridman
(02:24:07)
So you can get closer to truth by grounding the AI to physics, but you can also still have an anthropomorphic interface to us humans, who like to talk to other humans, or human-like systems. So, on that topic, I suppose that is one of the big limitations of current large language models to you: they’re good bullshitters, they’re not really grounded in truth necessarily. Would that be fair to say?
Guillaume Verdon
(02:24:40)
Yeah, no, you wouldn’t try to extrapolate the stock market with an LLM trained on text from the internet. It’s not going to be a very accurate model. It’s not going to model its priors or its uncertainties about the world very accurately. So, you need a different type of AI to complement this text-extrapolation AI. Yeah.
Lex Fridman
(02:25:05)
You mentioned singularity earlier. How far away are we from a singularity?
Guillaume Verdon
(02:25:09)
I don’t know if I believe in a finite-time singularity as a single point in time. I think it’s going to be asymptotic, sort of a diagonal asymptote. We have the light cone, we have the limits of physics restricting our ability to grow. So, obviously we can’t fully diverge in a finite time. My priors are different: a lot of people on the other side of the aisle think that once we reach human-level AI, there’s going to be an inflection point, and a sudden [inaudible 02:25:48], suddenly AI is going to grok how to manipulate matter at the nanoscale, and assemble nanobots. And having worked for nearly a decade in applying AI to engineer matter, it’s much harder than they think. And in reality, you need a lot of samples from either a simulation of nature that’s very accurate and costly, or nature itself, and that keeps your ability to control the world around us in check. There’s a sort of minimal cost, computationally and thermodynamically, to acquiring information about the world in order to be able to predict and control it. And that keeps things in check.

AI doomers

Lex Fridman
(02:26:27)
It’s funny you mentioned the other side of the aisle. So, in the poll I posted about p(doom) yesterday, what’s the probability of doom, there seems to be a nice division between people who think it’s very likely and people who think it’s very unlikely. I wonder if in the future there’ll be an actual division like Republicans versus Democrats, blue versus red: the AI doomers versus the e/accers. [inaudible 02:26:53].
Guillaume Verdon
(02:26:53)
Yeah. So, this movement is not right wing or left wing fundamentally, it’s more like up versus down, in terms of the scale of-
Lex Fridman
(02:27:01)
Which one is the up? Okay.
Guillaume Verdon
(02:27:02)
… Civilization, right?
Lex Fridman
(02:27:03)
All right.
Guillaume Verdon
(02:27:05)
But, it seems like there is a sort of case of alignment with the existing political parties, where those that are for more centralization of power, control, and more regulations are aligning themselves with the doomers, because instilling fear in people is a great way to get them to give up more control, and give the government more power. But fundamentally, we’re not left versus right. I think we’ve done polls of people’s alignment within e/acc. I think it’s pretty balanced. So, it’s a new fundamental issue of our time. It’s not just centralization versus decentralization. It’s kind of do we go… It’s like techno-progressivism versus techno-conservatism. Right?

Effective altruism

Lex Fridman
(02:27:54)
So e/acc as a movement is often formulated in contrast to EA, effective altruism. What do you think are the pros and cons of effective altruism? What’s interesting, insightful to you about them, and what is negative?
Guillaume Verdon
(02:28:15)
Right. I think people trying to do good from first principles is good.
Lex Fridman
(02:28:23)
We should actually say, and sorry to interrupt, we should probably say that, and you can correct me if I’m wrong, but effective altruism is the kind of movement that’s trying to do good optimally, where good is probably measured as something like the amount of suffering in the world, which you want to minimize. And there’s ways that that can go wrong, as any optimization can. And so, it’s interesting to explore how things can go wrong.
Guillaume Verdon
(02:28:55)
We’re both trying to do good to some extent, and we’re arguing for which loss function we should use, right?
Lex Fridman
(02:29:03)
Yes.
Guillaume Verdon
(02:29:04)
Their loss function is sort of hedons, units of hedonism: how good do you feel, and for how much time? And so, suffering would be negative hedons, and they’re trying to minimize that. But to us, it seems like that loss function has sort of spurious minima: you can start minimizing shrimp-farm pain, which seems not that productive to me. Or you can end up with wireheading, where you either install a Neuralink, or you scroll TikTok forever, and you feel good on the short-term timescale because of your neurochemistry, but on a long-term timescale, it causes decay and death, because you’re not being productive.

(02:29:54)
Whereas e/acc measures progress of civilization not in terms of a subjective loss function like hedonism, but rather in terms of an objective measure, a quantity that cannot be gamed: physical energy. It’s very objective, and there are not many ways to game it. If you did it in terms of GDP, or a currency, that’s pinned to a certain value that’s moving, and so that’s not a good way to measure our progress. But the thing is, we’re both trying to make progress and ensure humanity flourishes and gets to grow. We just have different loss functions, and different ways of going about doing it.
Lex Fridman
(02:30:42)
Is there a degree, maybe you can educate me, correct me, I get a little bit skeptical when there’s an equation involved trying to reduce all of human civilization, all of human experience, to an equation. Is there a degree to which we should be skeptical of the tyranny of an equation, of a loss function we wish to optimize? Like having a kind of intellectual humility about optimizing over loss functions?
Guillaume Verdon
(02:31:12)
Yeah. So, this particular loss function, it’s not stiff. It’s kind of an average of averages. It’s like distributions of states in the future are going to follow a certain distribution. So it’s not deterministic, it’s not like… We’re not on stiff rails. It’s just a statistical statement about the future. But at the end of the day, you can believe in gravity or not, but it’s not necessarily an option to obey it. And some people try to test that, and that goes not so well. So, similarly, I think thermodynamics is there whether we like it or not, and we’re just trying to point out what is, and try to orient ourselves, and chart a path forward given this fundamental truth.
Lex Fridman
(02:32:04)
But there’s still some uncertainty, there’s still a lack of information, and humans tend to fill the gap of the lack of information with narratives. And so, how they interpret… Even physics is up to interpretation when there’s uncertainty involved. And humans tend to use that to further their own means. So, it’s always, whenever there’s an equation, it just seems like until we have really perfect understanding of the universe, humans will do what humans do, and they try to use the narrative of doing good to fool the populace into doing bad. I guess that this is something that we should be skeptical about in all movements.
Guillaume Verdon
(02:32:57)
That’s right. So, we invite skepticism. Right?
Lex Fridman
(02:33:02)
Do you have an understanding of what went wrong? To the degree that it went wrong, what do you think may have gone wrong with effective altruism that might also go wrong with effective accelerationism?
Guillaume Verdon
(02:33:15)
Yeah, I mean, I think it provided initially a sense of community for engineers, and intellectuals, and rationalists in the early days, and it seems like the community was very healthy. But then they formed all sorts of organizations, and started routing capital, and having actual power. They have real power. They influence the government, they influence most AI orgs now. I mean, they’re literally controlling the board of OpenAI, and look over to Anthropic, I think they’ll have some control over that too. And so, the assumption of e/acc, more like capitalism, is that every agent, organism, and meta-organism is going to act in its own interest, and we should maintain a sort of adversarial equilibrium, or adversarial competition, to keep each other in check at all times, at all scales. I think that, yeah, ultimately, it was the perfect cover to acquire tons of power and capital, and unfortunately sometimes that corrupts people over time.

Day in the life

Lex Fridman
(02:34:23)
Since building is important, what does a perfectly productive day in the life of Guillaume Verdon look like? How much caffeine do you consume? What’s a perfect day?
Guillaume Verdon
(02:34:39)
Okay, so I have a particular regimen. I would say my favorite days are 12:00 PM to 4:00 AM, and I would have meetings in the early afternoon, usually external meetings, some internal meetings. Because I’m CEO, I have to interface with the outside world, whether it’s customers, or investors, or interviewing potential candidates. And usually I’ll have ketones, exogenous ketones.
Lex Fridman
(02:35:12)
So, are you on a keto diet, or is this-
Guillaume Verdon
(02:35:16)
I’ve done keto before for football, and whatnot, but I like to have a meal after part of my day is done, and so I can just have extreme focus.
Lex Fridman
(02:35:31)
You do the social interactions earlier in the day without food.
Guillaume Verdon
(02:35:35)
Front load them, yeah. Yeah. Like right now I’m on ketones and a Red Bull, and it just gives you a clarity of thought that is really next level. Because when you eat, you’re actually allocating some of your energy that could be going to neural energy to your digestion. After I eat, maybe I take a break, an hour or so, an hour and a half, and then usually it’s like, ideally, one meal a day, like steak and eggs, and vegetables, animal-based primarily. So, fruit and meat. And then I do a second wind, usually that’s deep work, because I am a CEO, but I’m still technical. I’m contributing to most patents. And there, I’ll just stay up late into the night, and work with engineers on very technical problems.
Lex Fridman
(02:36:25)
So it’s like the 9:00 PM to 4:00 AM, whatever though, that range of time.
Guillaume Verdon
(02:36:30)
Yeah, yeah. That’s the perfect time. The emails, the things that are on fire stop trickling in, you can focus. And then you have your second wind. And I think Demis Hassabis has a similar workday to some extent. So, I think that’s definitely inspired my workday. But yeah, I started this workday when I was at Google, and had to manage a bit of the product during the day, and have meetings, and then do technical work at night.
Lex Fridman
(02:37:00)
Exercise, sleep, those kinds of things. You said football, you used to play football?
Guillaume Verdon
(02:37:06)
Yeah, I used to play American football. I’ve done all sorts of sports growing up. And then I was into powerlifting for a while. So, when I was studying mathematics in grad school, I would just do math, and lift, take caffeine, and that was my day. It was very pure, the purest of monk modes. But it’s really interesting how, in powerlifting, you’re trying to cause neural adaptation by having certain driving signals, and you’re trying to engineer neuroplasticity through all sorts of supplements, and you have all sorts of brain-derived neurotrophic factors that get secreted when you lift.

(02:37:44)
So, it’s funny to me how I was trying to engineer a neural adaptation in my nervous system more broadly, not just my brain while learning mathematics. I think you can learn much faster if you really care. If you convince yourself to care a lot about what you’re learning, and you have some sort of assistance, let’s say caffeine, or some cholinergic supplement to increase neuroplasticity. I should chat with Andrew Huberman at some point. He’s the expert. But yeah, at least to me it’s like you can try to input more tokens into your brain, if you will, and you can try to increase the learning rate, so that you can learn much faster on a shorter timescale.

(02:38:30)
So, I’ve learned a lot of things. I’ve followed my curiosity. You’re naturally… If you’re passionate about what you’re doing, you’re going to learn faster, you’re going to become smarter faster. And if you follow your curiosity, you’re always going to be interested. And so, I advise people to follow their curiosity and don’t respect the boundaries of certain fields, or what you’ve been allocated in terms of lane of what you’re working on. Just go out and explore, and follow your nose, and try to acquire, and compress as much information as you can into your brain. Anything that you find interesting.
Lex Fridman
(02:39:05)
And caring about a thing. Like you said, which is interesting, it works for me really well, is tricking yourself that you care about a thing.
Guillaume Verdon
(02:39:12)
Yes.
Lex Fridman
(02:39:13)
And then you start to really care about it.
Guillaume Verdon
(02:39:15)
Yep.
Lex Fridman
(02:39:15)
So, it’s funny, the motivation is a really good catalyst for learning.
Guillaume Verdon
(02:39:22)
Right. And so, at least part of my character, as Beff Jezos is kind of like…
Lex Fridman
(02:39:29)
Yeah, hype man.
Guillaume Verdon
(02:39:30)
Yeah, but I’m hyping myself up, but then I just tweet about it, and it’s just, when I’m trying to get really hyped up, into an altered state of consciousness where I’m ultra-focused, in the flow, wired, trying to invent something that’s never existed, I need to get to unreal levels of excitement. But your brain has these levels of cognition that you can unlock with higher levels of adrenaline, and whatnot. And I mean, I’ve learned that in powerlifting, that actually you can engineer a mental switch to increase your strength. If you can engineer a switch, maybe you have a prompt, like a certain song or some music, where suddenly you’re fully primed, then you’re at max, maximum strength. And I’ve engineered that switch through years of lifting. If you’re going to get under 500 pounds and it could crush you, if you don’t have that switch to be wired in, you might die. So, that’ll wake you right up. That sort of skill I’ve carried over to research: when it’s go time, when the stakes are high, somehow I just reach another level of neural performance.
Lex Fridman
(02:40:40)
So Beff Jezos is your sort of embodiment representation of your intellectual Hulk. It’s your productivity Hulk that you just turn on.

Identity

Guillaume Verdon
(02:40:50)
Yeah.
Lex Fridman
(02:40:50)
What have you learned about the nature of identity from having these two identities? I think it’s interesting for people, to be able to put on those two hats so explicitly.
Guillaume Verdon
(02:41:01)
I think it was interesting in the early days. In the early days, I thought it was truly compartmentalized. Like, “Oh yeah, this is a character. I’m Guillaume. Beff is just the character.” I take my thoughts, and then I extrapolate them to a bit more extreme. But over time, it’s kind of like both identities were starting to merge mentally, and people were like, “No, I met you. You are Beff. You are not just Guillaume.” And I was like, “Wait, am I?” And now it’s fully merged. But it was already, before the dox, it was already starting mentally that I am this character. It’s part of me.
Lex Fridman
(02:41:39)
Would you recommend people have an alt?
Guillaume Verdon
(02:41:42)
Absolutely.
Lex Fridman
(02:41:43)
Like young people. Would you recommend them to explore different identities by having alts? Alt accounts?
Guillaume Verdon
(02:41:49)
It’s fun. It’s like writing an essay and taking a position, right? It’s like you do this in debate. It’s like you can have experimental thoughts, and with the stakes being so low, because you’re an anon account with, I don’t know, 20 followers or something, you can experiment with your thoughts in a low-stakes environment. And I feel like we’ve lost that in the era of everything being under your main name, everything being attributable to you. People are just afraid to speak, explore ideas that aren’t fully formed, and I feel like we’ve lost something there. So, I hope platforms like X and others really help support people trying to stay pseudonymous, or anonymous, because it’s really important for people to share thoughts that aren’t fully formed, and converge onto maybe hidden truths that would be hard to converge upon if it was just through open conversation with real names.
Lex Fridman
(02:42:46)
Yeah. I really believe in not radical, but rigorous empathy. It’s like really considering what it’s like to be a person of a certain viewpoint, and taking that, as a thought experiment, farther and farther and farther. And one way of doing that is an alt account. That’s a fun, interesting way to really explore what it’s like to be a person that believes a set of beliefs, and taking that across the span of several days, weeks, months. Of course there’s always the danger of becoming that. That’s the Nietzsche, “Gaze long into the abyss, the abyss gazes into you.” You have to be careful.
Guillaume Verdon
(02:42:46)
Breaking Beff.

Advice for young people

Lex Fridman
(02:43:31)
Yeah, right. Breaking Beff. Yeah. You wake up with a shaved head one day, just like, “Who am I? What have I become?” So, you’ve mentioned quite a bit of advice already, but what advice would you give to young people about, in this interesting world we’re in, how to have a career and how to have a life they can be proud of?
Guillaume Verdon
(02:43:58)
I think to me, the reason I went to theoretical physics was that I had to learn the base of the stack that was going to stick around no matter how the technology changes. And to me, that was the foundation upon which then I later built engineering skills, and other skills. And to me, the laws of physics, it may seem like the landscape right now is changing so fast, it’s disorienting. But certain things like fundamental mathematics and physics aren’t going to change. And if you have that knowledge, and knowledge about complex systems, and adaptive systems, I think that’s going to carry you very far. And so, not everybody has to study mathematics, but I think it’s really a huge cognitive unlock to learn math, and some physics, and engineering.
Lex Fridman
(02:44:48)
Get as close to the base of the stack as possible.
Guillaume Verdon
(02:44:51)
Yeah, that’s right. Because the base of the stack doesn’t change. Everything else… Your knowledge might become not as relevant in a few years. Of course there’s a sort of transfer learning you can do, but then you have to always transfer learn, constantly.
Lex Fridman
(02:45:04)
I guess the closer you are to the base of the stack, the easier the transfer learning, the shorter the jump.
Guillaume Verdon
(02:45:10)
Right, right. And you’d be surprised, once you’ve learned concepts in many physical scenarios, how they can carry over to understanding other systems that aren’t necessarily physics. And I guess the e/acc writings, the principles and tenets posts, that was based on physics. That was kind of my experimentation with applying some of the thinking from out-of-equilibrium thermodynamics to understanding the world around us, and it’s led to e/acc, and this movement.

Mortality

Lex Fridman
(02:45:42)
If you look at yourself as one cog in the machine, in the capitalist machine, one human, do you think mortality is a feature or a bug? Would you want to be immortal?
Guillaume Verdon
(02:45:57)
No, I think fundamentally, in thermodynamic dissipative adaptation, there’s the word dissipation. Dissipation is important, death is important. We have a saying in physics, physics progresses one funeral at a time.
Lex Fridman
(02:46:16)
Yeah.
Guillaume Verdon
(02:46:17)
I think the same is true for capitalism. Companies, empires, people, everything. Everything must die at some point. I think that we should probably extend our lifespan, because we need a longer period of training, because the world is more and more complex. We have more and more data to really be able to predict and understand the world. And if we have a finite window of higher neuroplasticity, then we have sort of a hard cap on how much we can understand about our world. So, I think I am for death, because again, I think it’s important. If you have a king that would never die, that would be a problem. The system wouldn’t be constantly adapting, right?

(02:47:05)
You need novelty, you need youth, you need disruption to make sure the system’s always adapting, and malleable. Otherwise, if things are immortal, if you have, let’s say corporations that are there forever, and they have the monopoly, they get calcified, they become not as optimal, not as high fitness in a changing, time varying landscape. And so, death gives space for youth and novelty to take its place. And I think it’s an important part of every system in nature. So yeah, I am for death, but I do think that longer lifespan, and longer time for neuroplasticity, bigger brains should be something we should strive for.
Lex Fridman
(02:47:52)
Well, on that, Jeff Bezos and Beff Jezos agree: all companies die. And for Jeff, the goal is to try to, he calls it day one thinking, try to constantly, for as long as possible, reinvent, sort of extend the life of the company. But eventually it too will die, because it’s so difficult to keep reinventing. Are you afraid of your own death?
Guillaume Verdon
(02:48:23)
I think I have ideas and things I’d like to achieve in this world before I have to go, but I don’t think I’m necessarily afraid of death.
Lex Fridman
(02:48:34)
So you’re not attached to this particular body, and mind that you got?
Guillaume Verdon
(02:48:38)
No, I’m sure there’s going to be better versions of myself in the future, or…
Lex Fridman
(02:48:46)
Forks?
Guillaume Verdon
(02:48:47)
Forks, right? Genetic forks, or other, right? I truly believe that. I think there’s a sort of evolutionary-like algorithm happening at every bit, or [inaudible 02:49:03] in the world is sort of adapting through this process that we described in e/acc. And I think maintaining this adaptation malleability is how we have constant optimization of the whole machine. And so, I don’t think I’m particularly an optimum that needs to stick around forever. I think there’s going to be greater optima in many ways.

Meaning of life

Lex Fridman
(02:49:25)
What do you think is the meaning of it all? What’s the why of the machine? The e/acc machine?
Guillaume Verdon
(02:49:32)
The why? Well, the why is thermodynamics. It’s why we’re here. It’s what has led to the formation of life, and of civilization, of the evolution of technologies, and the growth of civilization. But why do we have thermodynamics? Why do we have our particular universe? Why do we have these particular hyperparameters, the constants of nature? Well, then you get into the anthropic principle, and the landscape of potential universes, right? We’re in the universe that allows for life. And then why? Are there potentially many universes? I don’t know. I don’t know that part. But could we potentially engineer new universes, or create pocket universes, and set the hyperparameters so there is some mutual information between our existence and that universe, and we’d be somewhat its parents? I think that’s really… I don’t know, that’d be very poetic. It’s purely conjecture. But again, this is why figuring out quantum gravity would allow us to understand if we can do that.
Lex Fridman
(02:50:39)
And above that, why does it all seem so beautiful and exciting? The quest to figure out quantum gravity seems so exciting. Why? Why is that? Why are we drawn to that? Why are we pulled towards that? Just that puzzle-solving creative force that underpins all of it, it seems like.
Guillaume Verdon
(02:51:01)
I think we seek, just like an LLM seeks to minimize cross-entropy between its internal model and the world, we seek to minimize… yeah, the statistical divergence between our predictions of the world and the world itself. And having regimes of energy scales, or physical scales, in which we have no visibility, no ability to predict or perceive, that’s kind of an insult to us. And we want to be able to understand the world better in order to best steer it, or steer us through it.

(02:51:37)
And in general, it’s a capability that has evolved, because the better you can predict the world, the better you can capture utility, or free energy, towards your own sustenance and growth. And I think quantum gravity, again, is kind of the final boss in terms of knowledge acquisition, because once we’ve mastered that, then we can do a lot, potentially. But between here and there, I think there’s a lot to learn at the meso scales. There’s a lot of information to acquire about our world, and a lot of engineering of perception, prediction, and control to be done, to climb up the Kardashev scale. And to us, that’s the great challenge of our times.
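The divergence-minimization framing above, an agent shrinking the statistical gap between its model and the world, can be made concrete with a toy calculation. The distributions below are made up for illustration; the functions are just the textbook definitions of cross-entropy and KL divergence, not anything specific to this conversation.

```python
import math

def cross_entropy(p, q):
    # H(p, q) = -sum_x p(x) * log q(x): expected surprise of model q
    # when outcomes are actually drawn from the "world" distribution p.
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    # D_KL(p || q) = H(p, q) - H(p): the prediction gap, which is
    # zero exactly when the model matches the world.
    return cross_entropy(p, q) - cross_entropy(p, p)

world = [0.7, 0.2, 0.1]             # hypothetical "true" outcome distribution
good_model = [0.7, 0.2, 0.1]        # a model that matches the world
poor_model = [1 / 3, 1 / 3, 1 / 3]  # an uninformed, uniform model

print(round(kl_divergence(world, good_model), 6))  # 0.0: no gap to minimize
print(round(kl_divergence(world, poor_model), 3))  # 0.297: room to learn
```

Minimizing this gap, by updating the model toward the world, is the quantitative version of "the better you can predict the world, the better you can capture utility."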
Lex Fridman
(02:52:22)
And when you’re not sure where to go, let the meme pave the way.
Guillaume Verdon
(02:52:26)
That’s right.
Lex Fridman
(02:52:27)
Guillaume, Beff, thank you for talking today. Thank you for the work you’re doing. Thank you for the humor, and the wisdom you put into the world. This was awesome.
Guillaume Verdon
(02:52:37)
Thank you so much for having me, Lex. It’s a pleasure.
Lex Fridman
(02:52:40)
Thank you for listening to this conversation with Guillaume Verdon. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Albert Einstein: “If at first the idea is not absurd, then there is no hope for it.” Thank you for listening. I hope to see you next time.

Transcript for Teddy Atlas: Mike Tyson, Cus D’Amato, Boxing, Loyalty, Fear & Greatness | Lex Fridman Podcast #406

This is a transcript of Lex Fridman Podcast #406 with Teddy Atlas.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
That’s all that matters, that he got there, that he got to the place to act like a fighter. To do what we want him to do, to be ready to persevere, to go beyond the comfort level, to do another round. He didn’t want to. Damn right he didn’t want to, but he knew we wanted him to. And he knew in order to pass the test, he had to do it. He goes, “Now, it’s going to be your job to get him in the gym, make him mentally stronger, make him face things, and teach him how to slip punches and create holes, and fill those freaking holes with devastating punches…” There’s a cuss, “… with punches with bad intentions.”

(00:00:40)
The following is a conversation with Teddy Atlas, a legendary and, at times, controversial boxing trainer and commentator. When I was going into this conversation with Teddy, I was ready to talk boxing, styles, matches, techniques, tactics, and his analysis of individual fighters, like Mike Tyson, Michael Moorer, Klitschkos, Usyk, Povetkin, Lomachenko, Triple G, Canelo, Muhammad Ali, Sugar Ray Leonard, Hagler, Duran, Floyd, and on and on and on. Like I said, I came ready to talk boxing, but I stayed for something even bigger, the Shakespearean human story of Teddy Atlas, Cus D’Amato, and Mike Tyson.

(00:01:23)
It’s a story about loyalty, betrayal, fear, and greatness. It’s a story where nobody is perfect and everybody is human. To summarize, in the early ’80s, young trainer, Teddy Atlas, worked with his mentor, Cus D’Amato, in training the young boxing protégé, now a boxing legend, Mike Tyson. Mike was a troubled youth, arrested over 40 times, and at age 15, he was sexually inappropriate with Teddy’s 11-year-old niece.

(00:01:55)
In response to this, Teddy put a .38 caliber handgun to Tyson’s ear and told him to never touch his family again or he would kill him. For this, Cus D’Amato kicked Teddy out. Why? Well, that’s complicated. In part, I think, to help minimize the chance that Mike Tyson, whom Cus legally adopted, would be taken away by the state, and with him the dream of developing one of the greatest boxers of all time.

(00:02:24)
Of course, that summary doesn’t capture the full complexity of human nature and human drama involved here. For that, you have to listen to this conversation, the things said and the things left unsaid. The pain in Teddy’s voice, the contradictions of love and anger that permeate his stories and his philosophy on life. Like I said, I came to talk about boxing and stayed to talk about life.

(00:02:52)
This conversation will stay with me for a long time. The people close to you, the people you trust, the people you love, are everything. And if they betray you and break your heart, forgive them, forgive yourself and try again. Happy holidays, everyone. I love you all.

(00:03:15)
This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Teddy Atlas.

Lessons from father


(00:03:26)
You wrote in the book that your father had a big influence on your life. What lessons have you learned about life from your father?
Teddy Atlas
(00:03:34)
When you ask that question, I remember Cus D’Amato, when I was with him up in Catskill for all those years. He used to say to me, “Teddy, you learned through osmosis.” I believe there’s truth to that, if I know what osmosis is, but it sounds good. But I learned through osmosis with my father. He wasn’t a big talker. He was a doer. And when you’re around someone who lives a certain kind of life and does certain things, it penetrates.
Lex Fridman
(00:04:10)
He was a doctor.
Teddy Atlas
(00:04:11)
I’m going to sound like an idiot right now, because I’m being a son, but he was the greatest diagnostic doctor. I mean, if I say, I ever knew, what does that mean? You know what I mean? Are you a doctor? You know what I mean? What does that mean? But, other people have told me this, just legendary stories.
Lex Fridman
(00:04:33)
He would do house calls and he’d help people, and like you said, a lot of people have spoken about the impact he’s had on their life.
Teddy Atlas
(00:04:38)
He built two hospitals, and he built a hospital before the Verrazano Bridge in New York, connecting Brooklyn to Staten Island. And he built it so people could get proper hospital care that couldn’t afford it, period. And everybody looked at him as eccentric.
Lex Fridman
(00:04:58)
Yeah, nice. [inaudible 00:04:58].
Teddy Atlas
(00:04:59)
Yeah, because, he would literally sneak patients, not sneak them in, he was Dr. Atlas, he could do what he wanted, to a certain extent. But he would bring patients in without administering, putting through administration, so there was no charge, because they didn’t have anything. They were street people. I remember being… My only way to be with my father was to go on house calls or to go to the office. There was no…

(00:05:27)
And so I went on house calls with him. And he did house calls, by the way, till he was 80, and $3. I mean, it was better than McDonald’s, you know what I mean? I mean, the deal, $3 and you got medicine, you got everything. But he used to, right around the holidays, there was just certain things that I didn’t understand, but I understood later, where we would just drive certain areas and he just, all of a sudden, open the door, he would pick up these… and-
Lex Fridman
(00:05:58)
Help them.
Teddy Atlas
(00:05:59)
… I’m 10 years old, “Move over.” Move over, you know?
Lex Fridman
(00:06:02)
Mm-hmm. It was just you, him, and a homeless guy.
Teddy Atlas
(00:06:06)
A couple.
Lex Fridman
(00:06:06)
Yeah, a couple.
Teddy Atlas
(00:06:07)
Yeah, whatever he could fit in, three, four, whatever it was.
Lex Fridman
(00:06:11)
That’s a big heart.
Teddy Atlas
(00:06:12)
And then he took them to the hospital, dropped them off. I would ask questions after it was all over with. I’d say, “Dad, they’re sick.” He goes, “Well, not in a way.” “Why did you put them in the hospital?” So he said, “Yeah.” And he’d tried to explain things to me. He would try, he didn’t talk much unless you’d ask him something, and that works. And don’t talk unless someone asks you something. And he explained to me that, he said… I said, “Well, why are you putting them in the hospital?”

(00:06:43)
And, of course, their sickness was, they were alcoholics. “But why do you put them…?” It wasn’t an alcohol rehab, so why are you putting… And it wasn’t for the purpose to dry out. He wasn’t trying to cure them. Let’s put that before we anoint him for sainthood, by Teddy Atlas. I was like, we finally get to the point, “Why do you put them in there?” “Well, because it’s the holidays.”

(00:07:07)
“All right, why do you put them in there?” “Well, the holidays are good for certain people and bad for others.” And it was always before the holidays. It was before Christmas, it was before whatever, New Year’s, whatever. So I said, “Why?” And he said, “Because they remind people, certain people, of what they don’t have. Other people enjoy the holidays because of what they have, family, whatever, and it reminds them, their mind is that.”
Lex Fridman
(00:07:46)
That’s pretty profound.
Teddy Atlas
(00:07:47)
Yeah. And then, I don’t remember, because he didn’t use the word suicide, but I got it. He basically, I forget how he said it, but I just got it. I don’t know how I… I suppose, I don’t know, but I just got it. So they don’t hurt themselves. That’s what came across-
Lex Fridman
(00:08:03)
In every way.
Teddy Atlas
(00:08:04)
I don’t think he ever articulated that or ever verbalized that. But, yeah, they don’t hurt themselves. Well, how does that work? Well, it just basically they’re going to be around people. They’re not going to be alone. They’re going to be around people. They’re going to get fed, they’re going to be warm, right, and it’s going to be for three days, two, three days, whatever. And basically, it’s a bridge. So the funny thing, as a 10-year-old, I want to be connected to him, so I enlisted myself in the job.

(00:08:36)
When he used to drop them off, he would take them, get them in, right? And then the thing that I know, again, he didn’t say nothing, but you notice things. And if you care enough, you don’t notice nothing if you don’t care. But if you can, if it’s important, you notice. And this guy was important to me. I just was, I didn’t know what a hero was, no clue. I loved Mickey Mantle, I loved Willie Mays, I love Muhammad Ali. I never, ever connected them in my mind as heroes. Never. My father, I didn’t connect it that way, but he-
Lex Fridman
(00:09:21)
Looking back now [inaudible 00:09:22]-
Teddy Atlas
(00:09:22)
Looking back, he was my first connection to a hero.
Lex Fridman
(00:09:25)
The two of you ever talk about how much you love each other? The word love?
Teddy Atlas
(00:09:29)
One thing that was not allowed. The greatest memory I have, my father showing me love was, we were down in Florida at an airport and we were… I was born in Miami. Don’t ask, I was passing through. And the rest of my family’s born in New York, Staten Island. And so I was supposed to go back with him and I wanted to stay with my mother, for whatever reason. And so he, of course, conceded to it. Okay, whatever. And very quiet, very… And here’s a man who never showed emotion to anyone. I mean, for the most, you know… Well-
Lex Fridman
(00:10:11)
Yeah.
Teddy Atlas
(00:10:12)
… all of a sudden, he just turned and kissed me on the forehead and left. And I was like, “That’s different.”
Lex Fridman
(00:10:23)
Yeah. You still remember that, huh?
Teddy Atlas
(00:10:26)
Yeah. Like, “That’s weird.”
Lex Fridman
(00:10:28)
You lost him 30 years ago? How did that change you?
Teddy Atlas
(00:10:39)
It made me realize that some of the deals I used to make for God weren’t realistic. When I was a kid, I used to make deals with God. “Let me die before my father.” And then you get older and you have kids, you’re blessed, why did you make that deal? You know what I mean? Thank you for not taking me up on it.
Lex Fridman
(00:11:07)
Yeah.
Teddy Atlas
(00:11:08)
Right.
Lex Fridman
(00:11:09)
Yeah.
Teddy Atlas
(00:11:13)
You know?
Lex Fridman
(00:11:14)
Yeah. You miss him?
Teddy Atlas
(00:11:15)
I miss him in moments when I’d like to know what to do. And I remember when I would drive with him on the house calls, he didn’t listen to music. He was a guy, he read books to his… When he got older, he read books. Two blood vessels broke in his eyes. He only read nonfiction books, science. He loved science, wars, generals. I mean, I cheated on a couple book reports, because I didn’t do the reading of the book the night before I had a freaking book report to put in. “Dad, I got a book report to do on the War of Stalingrad.” Really? The War of Stalingrad. And who the freak could tell you where you get an A? I got an A. I just wrote what he told me. He told me generals, he told me times, he told me strategy. He told me about the winter that came and destroyed the Germans, and the Soviets were tougher-
Lex Fridman
(00:11:15)
You got an A.
Teddy Atlas
(00:12:20)
… and the Soviets were tougher than the Germans, and the Germans picked on the wrong opponent. I was already in the boxing business. I didn’t even know it. I didn’t even know it. Matchmaking, very important. They mismatched. They made a mistake with picking the opponent. And so, when we would be driving in the car, my father would be in a trance. And dad, he wasn’t ignoring me at all, he was just with his thoughts. He was wherever. He wasn’t even hearing the radio no more.

(00:12:59)
I always wondered where he was. I did. So I asked him one day. And just, so we’re driving, I said, I want to know so I said, “Dad, what do you think when you’re basically in this place, that I know you’re somewhere? Where are you? What are you seeing?” I actually said, “What do you see?” And he said to me, “I see what could be. I see what could be.” And I’m like, “Oh, all right.”

Scar story

Lex Fridman
(00:13:32)
I got to ask you, when did you discover boxing? When did you first fall in love with boxing?
Teddy Atlas
(00:13:39)
When it saved me.
Lex Fridman
(00:13:41)
How did it save you?
Teddy Atlas
(00:13:44)
I was a stupid, violent kid that was angry. I didn’t exactly know why I was angry. I’d fit in real good in today’s society, because there’s a lot of angry kids out there that I don’t think they know why they’re angry. I was just out there getting in fights and I got this stupid thing from that.
Lex Fridman
(00:14:05)
Can you tell the story of how you got that?
Teddy Atlas
(00:14:07)
I was just running around doing stupid things, bad things. I hurt people, some people physically, but I hurt my family. That’s BS, you only hurt yourself. That’s a good way of alibi-ing it. But, at some point, the truth usually finds its way. I’d like it to look like I was just hurting myself, but I wasn’t, obviously. So I was just out on the streets, with kids that didn’t grow up in the neighborhood I grew up. I grew up in a neighborhood where my father was a doctor. And I walked down the street…

(00:14:48)
The funny thing was, down the hill was a very tough neighborhood called Stapleton. And most of the people down there on the corners wished they could get up the hill, and I wished I could get down the hill. So I went down the hill and I hung out with all these friends that became lifelong friends. I gravitated to that, because I figured out later a little bit, but I wanted family. We were destroying the family. My father was a doctor, he didn’t have time for nothing but being a doctor.

(00:15:26)
I think when you created something, you sacrifice something, too. When you’re really great at something, so great that maybe God made you great and you’re too great for your own good. And then, I don’t know, it took me to these stupid, dangerous places. Dangerous for me, but dangerous for other people, too. Because, I got to the point where I was doing robberies on the street, I was fighting everybody.

(00:15:49)
And you know what the most dangerous part about it was? And I came to this realization on my own. I’m all by myself. I figured out, I was really as dangerous… These kids from the project, some of them, they got nothing. First of all, I learned you don’t have to be poor to be poor. You don’t have to be deprived of certain things to be deprived, at least to think you’re deprived. And I was poor in a way that I didn’t have the only thing I wanted to have, him.

(00:16:28)
So here I am where I’m out there doing these things, and what made me more, I was more dangerous than some of these psychopaths. Well, I was a psychopath, too, I guess, the way I was behaving. But some of these psychopaths that really had nothing, really, they obviously would kill you. I was dangerous almost in the same way, but for a different reason. I know it’s ridiculous what I’m about to tell you, but I figured it out, because I felt it. I thought I was on a righteous path. I thought I had a right because it was going to get me my father back.

(00:17:11)
Why? Why? I mean, you’re a scientist, you couldn’t figure this one out. Because all the people that had him were injured people, fractured people, screwed up people in some ways, but hurt, damaged people. So if I get damaged, I’ll get him. So I was on a crusade, really, a righteous crusade where I thought it was okay. I had permission. I had permission to do these terrible things, quite frankly, and to fight everyone I wanted to. And then it came almost to a crash of doing all that, winding up in Rikers Island like an idiot, not understanding the damage I did to this poor man, that he was a great doctor and he’s got to see his son and hear about, you know what I mean?

(00:18:14)
God, I was out on that day with the guys that I grew up with now, the guys from the projects from as I described, and I was with one of them who, he’s dead now. I was with him and we were in a neighborhood, the neighborhood we grew up, that I hung out, and he grew up in. Billy, he came from the project. And we got into a thing where we cut, somebody cut us off, we cut them off, jumped out to fight. Turned out there’s five or six of them and two of us. And we fought, right on the side, right there, only about a block from where I used to hang out, and maybe a block and a half.

(00:19:06)
And right in front of this Spanish bodega. It really does happen in slow motion. I actually saw the guy, I was fighting the guys that I had to fight. And then, all of a sudden, I was able to get one guy out of the way a little bit. And I really, I noticed the guy go into his pocket and I knew why he was going in his pocket. When he came out of his pocket, I knew what it was right away. It was weird, because in the neighborhood, guys used to hang out, they were into this… They get into fads right on the streets. And at that time, they went into this cheap knife, but they thought it was, well, we thought it was cool. It was a 007.

(00:19:51)
And the cool thing, whatever, was that you could flick it, you could learn. And I learned how to flick, but I never carried a knife. But, my friends would have it. I would just, you learn how you could flick it open, not a switchblade, but flick it with your wrist. And I was like, here I am in the middle of this freaking fight, and all of a sudden, “Oh, this is a 007.” And so I’m like, you got to make a decision. And I got a split second, I can either not do nothing, which didn’t seem like a great option. I couldn’t run away.
Lex Fridman
(00:20:36)
Why not?
Teddy Atlas
(00:20:39)
Because you got to live with yourself afterwards. And that’s more difficult to live with than whatever it is at that second, because that don’t go away.
Lex Fridman
(00:20:47)
You couldn’t live with yourself-
Teddy Atlas
(00:20:49)
It just-
Lex Fridman
(00:20:49)
… running away.
Teddy Atlas
(00:20:49)
… It just don’t go away. That thing, nothing to do with being brave. It has nothing to do with being brave, really. It’s got to do with just common sense in life. That, for me, whatever you’re dealing with, it’s over, it’s done. Like, okay, deal with it, good or bad, whatever. But, you do that, that other thing, you can’t, that never ends. This thing ends.
Lex Fridman
(00:21:19)
Memory of you being, let’s say, a coward in that moment, that never ends.
Teddy Atlas
(00:21:24)
The only thing I had at that point in my life, in my stupid mind, was a reputation that I would stand up to certain things. That was like, and that for me was worth something, whatever, because I didn’t feel any worth to anything else. That was the only thing I felt a connection of worth to.
Lex Fridman
(00:21:47)
Stood your ground and got cut.
Teddy Atlas
(00:21:49)
No, I made a decision. I stood my ground, but I actually, things do slow down. They do. And I actually said, “It’s a 007, he’s got to flick it.” And I didn’t say no, but he’s got to flick it. I get a split second, like I said, either I do nothing, whatever, or I get to him before he gets it flicked. I went to get to him before he got it flicked. And I, just as I got close to, I did him a favor. I walked right into a counterpunch, because I cooperated with him. I went right to him. And just as I… He practiced more than I did with the 007, apparently. Because he was like, whomp, whomp, whomp, whomp. Anyway.
Lex Fridman
(00:22:41)
What did you think? What did you think that happened? That was all slow motion. Did you think he might die?
Teddy Atlas
(00:22:49)
Yeah. Well, not immediately. Took me a minute. I’m a slow learner. I put my hand up. Right? Wouldn’t you? I guess so.
Lex Fridman
(00:22:59)
Yeah, mm-hmm.
Teddy Atlas
(00:22:59)
And it went into my face and that was it. It was gooey. It was warm and gooey. And I was like, “I don’t know what this means, but I don’t want to know, but I think I know.” And…
Lex Fridman
(00:23:21)
Did you think about your dad in that moment?
Teddy Atlas
(00:23:24)
No. You know what I thought about him was, you don’t know who anyone is until they’re tested. I learned that. Cus used to tell me, but I learned it. He said, I remember one time Cus, because I was a 17, 18-year-old kid up there, thought I was, whatever I thought I was, and he said, “You got a lot of friends.” And I said, “Yeah.” Because I was on the street, hanging out with a hundred kids at night, sometimes on the street corner. So I was like, I don’t know too many people that hung out with a hundred kids on the street, on a corner, on a Friday, Saturday night.

(00:24:02)
And I was like, “Yeah, I got a lot of friends.” He goes, “Really?” I said, “Yeah, really.” He said, “How about if I told you you might not have any. Most likely you don’t have any.” And he goes, and then he just started this thing. He said, “Everyone’s going to be tested, you, me, everyone, because you don’t know about nobody until they’re tested.” He goes, “You know nothing until you know. Until something happens to test if they were really your friend.” And he told me this story about a guy.

(00:24:40)
A guy came to him and he was upset. “What are you upset about?” He goes, “I’m upset because I just lost a friend. After 20 years of friendship, we’re not friends no more.” So Cus looks at him, he goes, “Let me ask you a question. What made you think you were ever friends with him?” Now the guy gets insulted to Cus. “Did you hear me?” He goes, “I just told you 20 years I’ve been friends with this guy. Why would you say that to me?”

(00:25:11)
He said, “Well, I’ll say it again. What makes you think he was your friend?” He goes, “Whatever happened in the 20 years, other than chasing girls,” because Cus figured that went out fast, “… chasing girls and drinking together, and whatever else you’re doing out on the street, whatever gave you the inclination that he was a friend?” He goes, “Whatever, when did he risk himself to be your friend? When was it dangerous to be your friend?”
Lex Fridman
(00:25:44)
When was the friendship tested?
Teddy Atlas
(00:25:45)
“When was it uncomfortable to be your friend?” And you know what the guy said? You can figure it out, you’re a scientist. He said, “Today.” And today came for me. And today, today, today, today, kept coming for me. Today. And that day, my friend Billy, it turned out, while I was fighting these, whatever, five, six guys, and where was Billy? He was on the roof. He was on the roof. He was on the roof. He was my best friend.

(00:26:29)
So anyway, they take me to the hospital. And here’s the thing with my father. But one thing Billy did do for me when he got off the roof, thank God, he did, he dragged me into this bodega, laid me on the floor, and started putting towels. And the towels, I vaguely remember this, they filled up with blood. I mean completely drenched, like you put them under a shower. And I heard the bodega owner screaming, screaming like… whatever. And everyone’s screaming and there’s chaos, and I’m like, I don’t know, I’m calm. Weird, I’m real calm. I’m just in this place, things calm.

(00:27:26)
And all of a sudden I hear Billy, he’s screaming, ” Call the ambulance, call the…,” and nobody’s doing nothing, everyone’s frozen. I’m starting to understand already people get frozen in situations. People, the fear, fear, fear, fear, fear just paralyzes people. And I was going into a fear business. I was learning. I was learning. I was getting a learning, early PhD-
Lex Fridman
(00:27:58)
Living in fear.
Teddy Atlas
(00:27:58)
Yeah.
Lex Fridman
(00:27:59)
Yeah.
Teddy Atlas
(00:27:59)
And, all of a sudden, genius, Billy genius, really, street kid. He jumps up on the freaking counter, jumps over the counter, grabs the phone, calls 911, says a cop’s been shot, and forget about it. It was crazy. All I remember after that, I’ll tell you the couple things I remember, lights, being put onto a stretcher, bounced around, rushed. I felt everyone’s anxiety, except mine. I had none. But I felt everyone’s anxiety, everyone’s fear, like was all around me. It was like, “Wow, this is interesting. It’s kind of…” I know that’s stupid, but, “Wow, this is interesting.”
Lex Fridman
(00:28:45)
You really have an eye for fear. That’s fascinating. You’re really studying it.
Teddy Atlas
(00:28:49)
Well, I had no choice, I got introduced in a crash course. And they put me in ambulance, and this is what I remember to your point, I’m sorry it took so long to get to it. I am, although I’ll probably do it again before this conversation’s over. But I-
Lex Fridman
(00:29:05)
It’s all about the journey.
Teddy Atlas
(00:29:06)
Yeah. We’ll get there. We’ll get there, pops. So I hear the cops say, “We might lose him.” And I’m laughing to myself, I’m not laughing, because I’m not, again, I’m not John Wayne. John Wayne would’ve laughed, but I’m like, “Lose? You guys are stupid.” I didn’t say that, but I’m like, ” Lose me? My father’s the greatest doctor in the freaking world. There’s nothing to worry about. You people are all uptight and whacked out here with fear, and there’s nothing to worry about. Dr. Atlas is my father.”

(00:29:48)
So anyway, so they’re taking me to the… And he said, “We don’t have time.” I hear, couple things I remember, “Don’t have time. Take him to…” and they take me to US Public Health Hospital. Marine Hospital, it was called at the time, but US Public Health. And it’s in Stapleton, so it’s close, thank God. So they’re taking me, and I hear them on the radio saying this stuff about, “We got to move. We got to move.” I start talking and they’re telling me, “Don’t talk.” But I like to talk a lot. And I’m… Again, fear. There’s no fear when the fear’s been removed.

(00:30:35)
It’s the only time you’re really free in life. And I know that sounds absurd, but really, it is. It’s the only time you’re really free in life. When you’re-
Lex Fridman
(00:30:46)
Close to death?
Teddy Atlas
(00:30:47)
… when you’re devoid of things that normally hold you back, that normally influence you in ways that are, not of the influence that, always positive influence where you are in a pure place, where you’re in a purely free place from all inhibitions, from fear, from anxiety, from joy. Joy can screw you up, and you’re free from all these things. And I’m in this place, just [inaudible 00:31:18]-
Lex Fridman
(00:31:18)
In the back of an ambulance, you’re free.
Teddy Atlas
(00:31:19)
Yeah. I said, “Just get me Dr. Atlas.” And they say, “We don’t have time.” “No, no, no, no, no, you don’t… You have to get Dr. Atlas. You have to get him.” This was the… Damn it, this was the… You know what I mean? I finally freaking hit the number and I’m not getting paid. And then, all of a sudden, I’m out.
Lex Fridman
(00:31:39)
How many stitches?
Teddy Atlas
(00:31:40)
They… Well, I think it was 400, 200 inside, 200 outside, or whatever it was.
Lex Fridman
(00:31:45)
It’s a lot.
Teddy Atlas
(00:31:45)
Hey, look, after 50, the number doesn’t matter no more. Whatever, 60, 70, 80, 90, whatever. So I was fortunate, I was fortunate. And, of course, I was fortunate, they told me afterwards, that it missed my jugular, literally by a centimeter. I mean, whatever. Otherwise, we wouldn’t be having this conversation, obviously.
Lex Fridman
(00:32:10)
I’m glad you made it.
Teddy Atlas
(00:32:11)
Yeah-
Lex Fridman
(00:32:11)
That’s another thing.
Teddy Atlas
(00:32:12)
… I’m glad, too. And it just missed my eye, which, thank God. It’s bad enough I have a scar, imagine me with a patch? I mean, it’s enough that I got this freaking thing. And look, it goes all the way. I mean, it’s pretty long. I don’t know, I was out. And then somehow, I sensed, they had the curtain closed, and it’s amazing how vivid this is. And the curtain’s closed and I see a shadow. I felt a presence. I did, and I felt him. He’s a powerful guy. And I felt him and I just see a shadow, you know? And, all of a sudden, the curtain gets pushed-
Teddy Atlas
(00:33:00)
And all of a sudden the curtain gets pushed back. And I can’t really see. It’s dark and I’m out of it, but not completely out of it. And pushes the curtain back, comes in, and his hand, even though it’s all bandaged, whatever, but his hand surveys. It felt safe and it felt warm and safe. I was happy. And he got there.
Lex Fridman
(00:33:37)
Did he say something?
Teddy Atlas
(00:33:38)
Yeah, yeah. Remember, I gave you a little bit of introduction to my father, right? You know him now a little bit, right?
Lex Fridman
(00:33:44)
Yeah, yeah. What’d he say about the job?
Teddy Atlas
(00:33:47)
This is what he said. I remember to this day what he said. That I do remember. I don’t know if it was six or five people, but this I do remember. He said, “They did a good job. You’re going to have a scar the rest of your life.” And he left.

Cus D’Amato

Lex Fridman
(00:34:05)
Oh, man. They did a good job. You mentioned Cus D’Amato, legendary trainer, and you also mentioned it turned out he really cared about you. In the book, you write about a testimony he gave. I was hoping I could read it because it speaks to your character. It speaks to his. It’s just powerful.

(00:34:28)
The testimony goes, “Your Honor, I realize you might not know much about me, but I spent my whole life developing young men. As a boxing manager, I trained two world champions, heavyweight champion Floyd Patterson and light heavyweight champion Jose Torres. I’ve also helped a lot of other young boys straighten out their lives and build character. I know things about Teddy Atlas this court doesn’t know, things you won’t find on his arrest record. This boy has character. He has loyalty. He’ll hurt himself before he’ll let down a friend. These qualities are rare and they shouldn’t be lost. He’s made mistakes. We’ve all made mistakes, but I’ve come to know this boy, and if we lose him, we’ll be losing someone who could help a lot of people. Please don’t take this young boy’s future away. He could be someone special. Let’s not lose him. Please.” Those are powerful words from a powerful man. What have you learned about life from Mr. Cus D’Amato?
Teddy Atlas
(00:35:41)
He gave me a quote that he drilled into my head. I became his guy. He loved me. I loved him. He said to me, “Teddy, no matter what a man says, it’s what he does in the end that he intended to do all along.” That’s what I learned from Cus. The rest of it is BS. And a lot of people say things. You just have to give them a minute to let them show you eventually what they really meant by it.

(00:36:26)
I also learned from him that everyone’s afraid. Cus, his way of saying it, another great saying, you’ll get a kick out of this, “Anyone who’s in a situation where fear should be prevalent, where fear is actually necessary to survive the situation, anyone who says that they’re not afraid, they’re one of two things. They’re either a liar or they should go to a doctor, find out what the frick’s wrong with them.” He was right about that. We live in a taboo society where that word, to a certain extent, is taboo because it invokes weakness. We are just layers of what we saw and learned since we were kids. We all are. We’re products of those layers. I learned that on my own and through some help.

(00:37:32)
At the end of the day, fear, people will find their way of avoiding that term. So they use the word anxiety, they use the word butterflies, apprehension, a million different words. I find all those other words to be cousins of fear. And fear causes a lot of things in life. It causes a lot of problems and it also solves a lot of problems. Without it, we couldn’t be great if we are great, if we ever have a chance to be great or at least to aspire to be great.
Lex Fridman
(00:38:22)
How does fear connect to greatness? That’s a profound statement. Without fear, we wouldn’t be able to be great.
Teddy Atlas
(00:38:32)
Yeah, you couldn’t be great without fear because fear allows you to be brave. The most important word for me in this whole conversation, in the right neighborhood, would be selfishness, and fear allows you to be, for a moment, less selfish. One of the things I learned, I guess partly on my own… Everyone thinks my greatest teacher was Cus. He was a great teacher, mentor. My greatest teacher was my father, the one who never talked. And I realized one of the things to be better, towards great, is if you can submit less than we submit. See, one of the things that I’m afraid of, one of the things, I was always quitting. In my business, it’s not a good thing.
Lex Fridman
(00:39:29)
Every business, I think. Yours is just more clear.
Teddy Atlas
(00:39:35)
Yeah. It hurts more.
Lex Fridman
(00:39:39)
True. In the moment, at least.
Teddy Atlas
(00:39:42)
Yeah, in the moment. You’re right, 100%, because some things hurt for a long time afterwards. And something like regret. Regret is the worst thing in the world because it’s a solitary sentence. And man, oh, man-
Lex Fridman
(00:39:58)
That’s a powerful phrase, regret is a solitary sentence. Oh, boy.
Teddy Atlas
(00:40:02)
So, I-
Lex Fridman
(00:40:03)
You’re full of good lines.
Teddy Atlas
(00:40:07)
It wasn’t easy to accumulate them.
Lex Fridman
(00:40:13)
Yeah. Hard run.
Teddy Atlas
(00:40:15)
It was a little bit hurtful. So submit less, because we submit every day, and if we can get to a place where we submit or compromise ourselves less, we’ll get to a better place. Again, one of the words for me that attaches to things that wind up hurting you in life and have hurt me in life, one of those boogeymen words is the word of convenience. That’s attached to everything. People disappoint you not because they want to disappoint you or let you down or betray you, because they want to betray you. They do it because it’s more convenient to do than the other thing.

(00:41:06)
An old man once told me, he said to me… I was trying to rationalize something. I was trying to make an excuse for something. I was trying to make myself better than I was. I was trying to say it was okay. And he just looked at me, and he liked me, and he said, “Teddy, there ain’t no such thing as being a little pregnant.” I was like, “Yeah.” He goes, “Either you’re pregnant or you’re not pregnant. Either you’re real or you’re not real. Either you’re truthful or you’re not truthful. Either you’re tough or you’re not tough. Either you’re committed or you’re not committed. Either you’re in or you’re out.”
Lex Fridman
(00:41:56)
That applies to a lot of things, including loyalty.
Teddy Atlas
(00:42:00)
That’s quite a statement. But the lifeblood of humanity for me is loyalty. It’s what goes through the veins of… Everything has to have some veins in some form. And if humanity has veins, what runs through the veins of humanity instead of blood to keep it alive is loyalty.
Lex Fridman
(00:42:18)
Those are powerful words.
Teddy Atlas
(00:42:20)
Without loyalty, we’re dead, we’re vessels. I never understood what a ghost ship was. You know what? As I got older, I know what a ghost ship is. It’s people. It’s people that are empty. They got no loyalty, therefore they got no humanity. Therefore, they got nothing. Therefore, frick them. Frick them. And you know why they don’t have loyalty? Convenience. And you know why? Because it’s hard to be loyal. It’s actually hard. I’ll be a son of a gun. “Yeah. Yeah, it sounds great. Give it to me. Give it to me. Paint me with it. Yeah, it’s great. Yeah, I’m loyal. Yeah, I’m great. Yeah, this is good. I’m ready. I’m on that team. I’m ready. Put me in, Coach. I’m ready.”

(00:43:09)
“Okay. Now, you’re going to have to get hurt here.” “What do you mean, get hurt?” “Oh, well, it’s going to be painful. I mean, to be loyal, you’re going to be in danger because the person that you committed your loyalty to, for a reason, because obviously you did something in your life, whatever, whatever, you’re actually going to get hurt to be loyal to them. You’re actually going to…” “Hold on a minute. Wait. Hold on a minute, Coach. Hold on. Call time out here. Let me think about this, Coach. I might need more practice. I’m not ready for the game. I’m not ready to go in the game yet. Give me a little more practice, Coach.” It hurts to be loyal. It fricking hurts. But without loyalty, we’re ghost ships. We got no strength. We got nothing. We got nothing. We got nothing.

Mike Tyson

Lex Fridman
(00:44:05)
I agree with you in a deep fundamental sense, but there’s pain that comes with that. I have to ask you to introspect on this part of your life. Because of your value for loyalty, as people know, you and Cus D’Amato trained young Mike Tyson, and the interaction there between the three of you led to the three of you parting ways. Given your value for loyalty, can you tell the full story of what led up to this and maybe the pain you felt from that?
Teddy Atlas
(00:44:59)
I guess it was the second time in my life I felt betrayed. The first time was when I was whatever, young, 17, and I got arrested. I was with all these older guys, tough guys, whatever, supposedly, and the detectives separated us. That’s what they do. And they asked me who did whatever? Whose gun? This, that, all that, the particulars of obviously what we did. And it was me. And they said, “You sure? You don’t want to change that? Because your friends changed it.”

(00:45:48)
And these cops, they were nasty, but they were cops. They were, “You’re going to wind up in Rikers and they’re going to be doing this to you.” And I won’t even say the things because, then, why say them? Figure it out. But they’re trying to get what they’re trying to get. And, “You want to change it?” And, “No.” But I felt very betrayed and especially when I was standing in the cell at Rikers looking at the airplanes leave LaGuardia Airport. And then hoping I was on one. I was making a deal with God that, “Let me be on one of those planes and let it crash. I’ll take a shot.”
Lex Fridman
(00:46:31)
Was part of you proud that you didn’t give up your friends?
Teddy Atlas
(00:46:34)
No, because I didn’t understand what proud was. I didn’t understand nothing. I just understood that-
Lex Fridman
(00:46:40)
Rules are rules. You’re just loyal and that’s it?
Teddy Atlas
(00:46:43)
I didn’t even know there was an option. I know the cops said, “You could do this,” but there was no option. My father never had an option. But the betrayal, the private betrayal was like… And so we were partners, me and Cus. Cus was retired. This stupid kid goes up there and all of a sudden I start training fighters. First, I wanted the gloves. Cus put me in the gloves. I wanted gloves. Then I had an injury, whatever. But bottom line is I still want to fight. I want to turn pro. I want to fight. That was the plan. And Cus had a different plan. Cus was like, “You can’t.”

(00:47:29)
And he had it set up a little bit, whatever. Without getting into it, hey, he did me a favor. I’d like to think he knew he was doing me a favor. And you know what? I do think he was. He was doing himself a little bit one too. But he was doing it for the greater cause because he believed in this thing of boxing. He believed that it changed lives. He believed that it was worthwhile. He believed that there was a power to it beyond the left hook.
Lex Fridman
(00:47:55)
The big picture of boxing.
Teddy Atlas
(00:47:58)
Yeah.
Lex Fridman
(00:47:58)
He believed in it.
Teddy Atlas
(00:47:59)
Yeah, he believed that to be a champion, you had to be special, you had to be smart, you had to have character, that you had to be a better person, and that you couldn’t make a champion if you didn’t make him a better person first, and that this could strengthen people. The sport could strengthen people in those ways. So he was married to it. He was old and there was no one in the gym. It was empty. It was above a police station, which was crazy. He needed an heir to the throne. He needed to pass it on to someone.

(00:48:38)
And he saw something, and all of a sudden he saw that my career as a boxer was less important than having me become his heir to the throne and becoming his trainer, his man, his guy, to continue, that we could do a lot more for him and for everyone. Not just for him but for everyone. It was more like to keep it going. It couldn’t die. It couldn’t die. Cus was afraid it would die with him. And he committed his whole life to it. He didn’t get married because of boxing. So he saw me as the little bit of the seed to plant for more things to grow before that plant died. And so all of a sudden he says, “You can’t fight.” I had people tell me that I could go somewhere else and fight. And I could, but I couldn’t because I’d be disloyal.
Lex Fridman
(00:49:36)
Loyalty is everything.
Teddy Atlas
(00:49:37)
Yeah. So I couldn’t leave Cus, and he kind of knew that. And so I couldn’t leave him. And he said, “You have an ability to teach.” He said, “Knowledge means nothing.” He said, “See these Britannica…” He had Britannica encyclopedias, the whole set, in our library. He said, “You see these?” “Yeah, I see them.” “All the knowledge of the world, whatever, is in these.” “All right.” “Means nothing if you don’t have somebody to convey it to people. Otherwise, it just sits on a bookshelf and looks good.” He goes, “You have the ability to convey knowledge to people. You’re a teacher. You were born to be a teacher. You’d lessen yourself by only being a champion fighter because you’d only take care of one person. You could take care of all kinds of people and you could do this and you could do that and you could do this.”

(00:50:30)
So we go on this venture. Took a minute, because I didn’t believe him at first, but finally I am, I’m there, I’m training fighters. Then he gets me to buy in, and I was a teacher. I start teaching these kids, and there’s no one in the gym. It’s dead. And all of a sudden there’s 10 kids, 15, 20, 25, 30, 35, 40, 45. Catskill Boxing Club, which was never there. Now it’s there. And I’m training fighters. I’m taking them down to South Bronx to get experience, one of his former fighters, Nelson Cuevas, down to South Bronx. I’m taking him down there to get smokers, to get fights when they’re ready after I teach.

(00:51:12)
I’m wearing out dungarees. I’m getting holes in my dungarees. I was fashionable before it was fashionable to have holes in my dungarees. I could have made a lot of money with that because I was on my knees with these little kids, nine years old, 10 years old, eight years old, 10, 12, 13, 14, all these kids. I’m teaching them and I’m building a gym. Cus only came once a week because he was semi-retired, and he’s home. When he would come once a week, he knew he couldn’t give me money, but he gave me more than money. He gave me praise. And he said, “Look what Atlas is doing. He’s creating champions.” And I was like, “Whoa. Yeah. Wow. I’m doing good.”

(00:51:55)
And then all of a sudden after four years of that, because I was up there seven years, eight years, eight years, after about three and a half, four years of that, we get a phone call that they got this kid in prison, in Tryon prison, from one of the guys that knew Cus, Matt Baranski. There’s a correction officer named Bobby Stewart who used to box, and Cus had helped him out a little bit. A little bit. They knew we had this gym. Now that was really starting to become something because we were winning tournaments and everything else. They go, “We got this kid, Mike Tyson. He’s 12 years old, he’s 190 pounds, and he’s a mess. But Bobby Stewart got involved with him, the former fighter, and he’s taken a liking to it. And now where he didn’t behave at all and he didn’t listen to anyone, now he’s listening because Bobby’s got a carrot and the carrot is he’d teach him boxing. And now he’s at the point now where we want you to take a look, you and Teddy.” “All right. Bring him down.”
Lex Fridman
(00:52:57)
What’d you think when you first saw Mike Tyson?
Teddy Atlas
(00:53:00)
Well, I wanted to see his birth certificate because he’s 190 pounds, 12 years old and all solid. Really? But, yeah.
Lex Fridman
(00:53:10)
Just physically, just as a physical specimen?
Teddy Atlas
(00:53:13)
Yeah.
Lex Fridman
(00:53:13)
Big guy?
Teddy Atlas
(00:53:14)
Yeah. And listen, Cus was right, I was a teacher. He was right. And he was testing me even that day. He said, “What do you think?” So I said, “Well, we ain’t going to know nothing hitting the bag. Who the frick cares about that? He knocked the bag down. We got to put him in with… We got no one to put them in that way.” I didn’t have anyone that way. We got to test him. Everyone’s got to be tested. So you got to put them in responsibly. But, “Let’s put him in, just responsible, but let’s put him in with Bobby Stewart.” Former pro fighter, had 14 pro fights. Smaller than Tyson. When he was fighting he was 175. But still, he’s 28 years old. Tyson’s 12. Come on. “And he’ll work with him, right?”

(00:53:57)
So we do, we put him in. Tyson, he recognized the moment. He understood this was an audition, this was a chance. This was that TV show, Change Your Life. He understood that if he passed the audition, he could possibly change his life. He wasn’t sure what. How could he be sure what exactly? But it was better than what he had. And so he was on audition. So he innately understood what we would want to see: ferocity, toughness, character, desire, and, of course, ability. Well, we saw the ability, power, speed, but it was unbridled. It was untaught. It was raw. He didn’t know really much at all. At all. But we saw that.

(00:54:50)
But he wanted to show more. He knew that wasn’t enough. Again, innate intelligence. He had to show desire. He had to show toughness. And so I was being responsible. After two rounds, that’s enough. Normally, I don’t put a guy in to box until maybe four months, five months, six months, eight months, 10. It depends what it takes to learn on the floor before it’s responsible to put them in the ring to actually take on incoming real live shells instead of blanks. And so normally I wouldn’t have him in. And I knew after today, he wouldn’t be in the ring again if I trained him. I would teach him first and then he’d get back in in a few months. But for this day, it was the only way.

(00:55:36)
I used to make this announcement and Cus loved it. He said, “What’s training a fighter? What do you look for training a fighter, Teddy?” He asked me these ridiculous questions just to test me. And I say, “It’s like going to Macy’s…” Oh, he loved it. I said, “It’s like go to Macy’s window on Christmas.” He goes, “What do you mean, Macy’s window?” Cus was like, boom, boom, boom. So, “What do you mean, Macy’s window?” “You go to Macy’s window and they got the window with everything you want to see, everything in there. And it looks great, everything.” “Yeah? And then what?” “Well, then you ask, ‘What’s in the warehouse?’ and they tell you, ‘Nothing.'” And then Cus says, “That’s it. That’s the trainer.” And I wanted to see what was in the warehouse. Because I saw what was in Macy’s window. I saw the power, I saw the speed.

(00:56:22)
So he goes two rounds and he gets a bloody nose. Here’s the weird thing, not weird, very telling. We knew what we were doing. I’m not bragging, but we knew what we were doing because he got a bloody nose because he got hit. After that, he never got another bloody nose. You know why? He didn’t get hit. Because he learned. He was still strong, but he was smarter now. Anyway, he goes two rounds, and I saw, and I’m being responsible because if he goes more, it’s not responsible. I saw what I needed to see. I saw speed, I saw power, I saw athleticism. And I saw, I didn’t believe him. I thought he was lying to me. I’m just telling you. I thought he was lying, trying to act tough when he wasn’t really feeling tough. It didn’t matter.

(00:57:04)
Cus questioned me on it afterwards, “What did you see?” And when I said it, he goes, “Young master.” Again, he wasn’t paying me money so he had to give me something, right? And that was currency. “Young master.” I’m the young master? Whoa. “Young master.” You know what I mean? I felt like that guy, Kung Fu, like in the movie, like Kung Fu, “Grasshopper, when you’re ready, when you can take this out of my hand, you can leave.” And-
Lex Fridman
(00:57:29)
That’s powerful.
Teddy Atlas
(00:57:30)
Yeah, it was. It worked. Cus knew how to work me. And he did. And it worked. But you know what? I didn’t mind being worked. I kind of knew I was being shuffled a little bit.
Lex Fridman
(00:57:45)
Well, you’re making it sound a little bit negative, but it’s also extremely positive. That’s a teacher instilling wisdom into you that you carried forward and it impacted a lot of people.
Teddy Atlas
(00:57:56)
Yeah. Cus got the job done, but he did it his way, and he did it for a myriad of reasons. But at the end of the day it was all good, and I just had to understand that eventually later on. But-
Lex Fridman
(00:58:11)
And you do the same. You do things your way and carry some of him in you, some of your father in you.
Teddy Atlas
(00:58:16)
Yeah. That day it was funny because when Cus said, “What did you see, Teddy, with him?” After two rounds, I got up on the ring. I knew I was going to train him. Obviously, we weren’t going to say no. He still had about four months to serve, and we were going to work it out. And when I got up on the ring apron, that’s my gym, I’m the boss. People later on in life called me a dictator. You know what I said? “Yeah, you’re right.” I didn’t deny it. People thought it. “You mean I’m right?” “Yeah, I’m a dictator. I’m a trainer. I’m the boss. I’m in charge. You wouldn’t be here if I wasn’t. What the frick you need me for if I’m not fricking in charge, you idiot? Yeah, yeah. Damn right. Well, what do you think, it’s a shared responsibility? No, it’s my responsibility. That’s why you’re here. Yeah, I am in charge. You shouldn’t be here if you don’t understand that.”

(00:59:11)
So I get up there and I know that I’m going to be training him. I got to show him who the boss is. I’m being really frank about this. So I get up there, I say, “That’s it. Out.” “No, no,” this is Tyson, “No, let me go. I want to do another round. I want to do another one.” “I said out. Did you hear what I said?” Because I knew that he was going to test me. He was testing me. I said, “I said get out.” He got out.
Lex Fridman
(00:59:39)
But were you impressed with the fact that he wanted to keep going, or no?
Teddy Atlas
(00:59:42)
Yes, and I recognized what it really was. So Cus asked me, “What was that?” Cus wanted to know what the young master saw. So Cus said, “What was that?” I said, “It was an act.” He goes, “You saw that? Did he really want to go?” I said, “No.” I said, “He didn’t really want to go, but he knew that we wanted him to go, and he made himself ready to go in order to satisfy, and that’s just as good.” And Cus said, “Damn right it’s just as good. All that matters was not how he got there, but that he got there. That’s all that matters, that he got there. That he got to the place to act like a fighter, to do what we want him to do. To be ready to persevere, to go beyond the comfort level, to do another round. He didn’t want to, damn right he didn’t want to, but he knew we wanted him to, and he knew in order to pass the test, he had to do it.”

(01:00:43)
And he said, “You’re right.” He goes, “Now it’s going to be your job to teach him, to make him a fighter that don’t get bloody noses, that don’t get hit and will get to that place without being coerced to get there, to get to that place on his own, instead of using the things that he had to use to get to that place today. Those things are not going to be available one day when you…” And listen to this. You talk about a man being prophetic. Cus was pretty good. You talk about a man being on the job, on the money, Lex. How do you think he finishes the sentence? He goes, “Because you’re going to have to make sure that he learns these things because he’ll be your first heavyweight champ.” “What did you just say?”

(01:01:36)
He’s 12 years old. He’s been arrested 30 times. He’s getting out of jail, out of juvenile detention, Tryon. He’s a mess in a lot of ways. There’s a lot of things we find out later, a lot of problems, weaknesses. He goes, “And that’s part of your job. That’ll be part of your job.” But he really said that. And then he turned to him, he goes, “You want to come live with us, young man? You want to be a fighter?” “Yes.” Even that, Cus said to me later, “What do you think about that?” I said-
Lex Fridman
(01:02:14)
The way he said, “Yes”?
Teddy Atlas
(01:02:15)
Yeah, the way he said, “Yes. Yes, sir.” Yeah. He said, “What do you think about that?” And we’re talking, I said, “He ain’t going to be that polite in a little while down the road. Again, he knew that that’s what he felt that he needed to project himself as, to present himself as to get to where he wanted to get to.” He goes, “Yeah, yeah.”
Lex Fridman
(01:02:38)
Did you see what Cus was seeing in terms of the heavyweight champion of the world?
Teddy Atlas
(01:02:42)
No. Again, the easiest answer would be yes. Teddy’s just a-
Lex Fridman
(01:02:45)
Teddy knows.
Teddy Atlas
(01:02:46)
… genius. Wow, wow. Wow. No, no, no, no. But again, it was my job. And my job, it was simple, simpler than Cus’. Cus knew too much. I knew nothing. I just knew rudiments of boxing. I knew what it took to be a fighter and how to execute it, the steps of executing it. So I took those steps. The rest of it, you get blurred by those other things. I wasn’t blurred by those other things. It was just, “Get them in the gym, make them mentally stronger, make them face things, and teach them how to slip punches. And create holes, and fill those fricking holes with devastating punches,” this is Cus, “And what are you going to do?” “I’m going to teach them to create holes and fill them with punches with bad intentions.” And that became the moniker. And then Tyson would say that, “I’m throwing punches with bad intentions.” Yes, you are.
Lex Fridman
(01:03:48)
How do you make him mentally tougher? So that part of the job, you said the, “Don’t get a bloody nose,” but the part of the job where it makes him mentally tougher, how do you do that?
Teddy Atlas
(01:03:57)
Most important part of the job, to make him face things. Make him face where he’s lying to himself, where he’s submitting. What if we start this conversation with submission? Submit less, submit less, submit less every day, submit less. Cus only come to the gym once in a while. And if I had him sparring, he would come because that was his project, that was the heavyweight. Now he came. It put life in Cus. Cus had life. He was losing a little life, but that made the light bulb bright again. It did. And it was great to see. I felt proud of that. I felt connected to that.

(01:04:32)
That’s why when it all went bad and Cus took the side, the only side he could take, the side of the next heavyweight champ of the world, but he left me, his partner, the young master… And for the second time I get betrayed. And I’m like, for a while I thought everything Cus taught me, said to me was a lie, and I didn’t want to be any part of it anymore until I got a little more mature and I got a little past that where I was able to understand. I was able to understand that just because somebody that you perceived as great in every area you find to be weak in certain areas doesn’t mean that they can’t still be what they were to you. It’s something that can be understood or forgiven.

(01:05:43)
But yeah, it’s hard. It’s hard to get to that place, to forgive somebody in that kind of way that I felt betrayed. Because Cus told me the most important thing was loyalty. Cus told me he loved me because I was loyal. Cus told people that the reason that he went to court was because I didn’t give up anybody.
Teddy Atlas
(01:06:00)
… to court was because I didn’t give up anybody, even though it meant putting me at risk of going to jail for 10 years, because he felt that he admired those traits. And so I assumed that he would show the same traits. And he took a deal. He took a deal. He took a deal. He signed the papers that those so-called friends of mine signed. He took a deal to have the future heavyweight champion, as it turned out, and to let me go. To sign the deal to let me take the weight.
Lex Fridman
(01:06:48)
For people who don’t know, Mike was inappropriate with a young girl and you pulled the gun on him. I don’t know if there’s deeper things to say about that situation.
Teddy Atlas
(01:07:00)
No.
Lex Fridman
(01:07:01)
But why do you think Cus made the decision to cut you off from both Mike Tyson and from Cus D’Amato? To break that when he valued loyalty so much.
Teddy Atlas
(01:07:12)
I served my purpose. I got him to where he needed to get. Brought life back in the gym. If I wasn’t in the gym at that particular time, Tyson never would’ve been in the gym. There would’ve been no gym to bring him to when they called up and made that phone call to bring him to the gym. There would’ve been no activity. There would’ve been no boxing program. There would’ve been no training, training him 24/7 the way I was, where Cus wasn’t capable of doing that at that point in his life.
Lex Fridman
(01:07:40)
Yeah.
Teddy Atlas
(01:07:41)
But then again, it’s not poor Teddy. I got the benefit of a career, I got the benefit of knowledge, I got the benefit of a life, I got the benefit of learning, of becoming hopefully a better person. I got the benefit of being betrayed again.
Lex Fridman
(01:07:59)
That’s a hell of a statement right there. I don’t know what the benefit of that is.
Teddy Atlas
(01:08:05)
You can learn to forgive weakness when you realize how easy it is to be weak. And when you realize that… somebody asked me, how did you get to the point where you could forgive? It’s a pretty good question. Pretty simple, pretty basic, pretty important. And I didn’t understand, I understood. But I did understand immediately, for me. I said, “How can I not forgive somebody?” It becomes easier to learn how to forgive when you’re still trying to forgive yourself, when you’re still in the process of trying to forgive yourself for all your own inherent weaknesses and betrayals of people like my father in different ways that we forget very easily because it’s handy and it’s a way of surviving. It’s a lot easier to figure it out, rationalize it, to find forgiveness when you realize that you still haven’t figured out completely how to forgive yourself. I’m still trying to figure that out.

(01:09:34)
And so that helped me figure out how to forgive Cus because to figure out how to forgive me, I had to understand why I did these things. Where the weaknesses came from, where the selfishness came from, where the convenience came from. That they really existed. But they didn’t exist for malice, they existed for me not being prepared to understand that I could be stronger, to want to be stronger. And then I looked at Cus. He wanted to be stronger, but he got to a point in life where he had been strong for a lot of his life. He was strong with me, he was strong with a lot of things in his life. And does everyone deserve a pass in life?

(01:10:27)
He got to a place where everything was in one basket, the basket of boxing. He once told me that he never got married because it would’ve been selfish to a woman to have gotten married when his whole life was boxing. That he couldn’t give to her kid, he couldn’t give to her. And then I thought about it. He had no money, really. And Jim Jacobs and Bill Cayton took care of the bills, so he didn’t really need money that way. But what was the payoff of that kind of life, that kind of commitment, that kind of sacrifice? Really, what was the payoff? The payoff was to have champions. To have a champion that would keep your name alive.

(01:11:17)
That word legacy, what does it mean? Sometimes it’s just a word, sometimes it’s more than a word. It’s a reprieve. It’s a pension plan. It’s being given a pension on your way out for the rest of your life, for your life wherever you’re going. You’re going to wherever you’re going for eternity. It’s the only thing that you take with you, is what you left behind. And for Cus it was all about leaving behind a mark. A mark of a champion. Yeah, it was attached to ego. We all have it. Yeah, it was attached to some selfishness and all. But yeah, it was also attached to wanting to leave something great behind.
Lex Fridman
(01:12:08)
Yeah.
Teddy Atlas
(01:12:08)
To know that you were part of it. That you existed for a reason. That you sacrificed for a reason. And all that freaking pain I brought my father, I was searching for something. Yeah, I made it into a righteous search. I made it into… I did. And I made it into, “It was okay because it was righteous,” but it still did damage. It still did damage. It still hurt people. It still betrayed my father’s trust. And Cus betrayed mine, but he didn’t do it maliciously. He did it out of, again… my father came home… this is how I’m going to connect it. My father came home from work one night, 12:00. And I was waiting on him. And like I said, I was nine, 10 years old. And he got mad at me. He goes, “Go to bed. What are you doing up?” I said, “I’m waiting for you. Waiting for you.” And he said, “Well, go to bed.” I said, “No. What were you doing?” He said, “I was at the hospital.” “Why were you there so late?”

(01:13:25)
He answered me. He said, “There was a patient. There was a sick patient.” I said, “He must be better now because you’re his doctor,” because my father could fix anything. My father, nothing got in the way of the truth. Nothing. Nothing. Even blowing his son’s bubble. Matter-of-factly he said to me, “No, he’s not going to get better. He’s going to die.” So as a 9-year-old kid, you’re a kid, you’re selfish, not in a bad way but you want what you… and I said two things. First I said, “How? You’re his doctor. How? It can’t be.” And then I said, I said it almost angry, “Then why were you there? You should’ve been here with me.”
Lex Fridman
(01:14:22)
Yeah.
Teddy Atlas
(01:14:23)
And you know what he said to me? “Because you don’t give up on life. Go to bed.”
Lex Fridman
(01:14:51)
Don’t give up on life.
Teddy Atlas
(01:14:52)
And I finally connected the dots. This idiot that didn’t graduate high school, I finally connected the dots. I was asking Cus to give up on life. You don’t give up on life. You don’t give up on aspirations of life. Life is all forms of life. It doesn’t have to be a physical form of it. It’s life. It’s having a reason to be alive. It’s having a reason to have tomorrow. And Cus’s only reason to have tomorrow was to have another heavyweight champ.
Lex Fridman
(01:14:52)
Yeah, a champ.
Teddy Atlas
(01:15:11)
And Teddy Atlas, even though we were together all those years, and we were partners, and we trained together, and the only thing we didn’t do was what they did in the Indian movies where they cut the finger and they became blood brothers.
Lex Fridman
(01:15:25)
Yeah.
Teddy Atlas
(01:15:25)
That’s the only thing we didn’t do, and I felt like we did that without cutting. And now here we are, and he freaking betrayed me. The… and then all of a sudden I connected the dots. I was like, “He didn’t betray me in that cold sense, he didn’t give up on life.”
Lex Fridman
(01:15:53)
Years later, Mike Tyson apologized to you. What’s meaningful to you about that? How does that fit the story?
Teddy Atlas
(01:16:04)
I want to be the great, gracious guy right now. Say, “Oh, I’m so human that a man’s man enough to say sorry, that’s it, we’re good.” I want to be, really. That’s the best presentation of Teddy Atlas I could put out there. He’s a good guy. He forgives. He’s a good guy. He’s a standup guy and he’s a good guy. I’m not sure if he truly did it for himself, that he really did it because he felt that it was true. But if he’s persuaded by other things… he was in the middle. I know I’m taking it too deep, I know it, but what am I going to do?

(01:16:58)
He was in the middle of 12 steps with the getting out of drugs, alcohol, 12 steps, which is a commendable thing. Really, it is. And he’s taking the steps. Part of the steps was to admit all, to apologize to all people you offended in life. Okay. But are you doing it for the 12 steps or are you doing it because you really truly have come to terms with believing what you did was that hurtful to me, and that it matters to you that it was that hurtful to me, and that you were wrong in doing it? Did you do it for… I know that’s deep. I know that I’m a freaking idiot. “Teddy, you should be better than that. He’s better than you.” Yeah, maybe he is better than me. Maybe he is. Really. Seriously, maybe he is. And I took it. He put his hand out. I took it. We hugged. He said, “I love you.”

(01:17:58)
Yeah. Yeah. But I want to believe. But what did Cus tell me? “No matter what a man says, it’s what he does in the end that he intended to do all along.” So to this day today, was it really genuine or was it reflexive of that moment for him to get what he needed for that step? Or was it truly for what I needed? That he really cared that what he did to me caused me to do what I did. Because I did something that was pretty bad to him, too. Is he able to deal with that and put that where it has to be put? Is he able to put that? Or is it just he did something he had to do and maybe he’s sorry he did it? Look, I appreciated that he… I would’ve rather it’d been in a private place.
Lex Fridman
(01:19:05)
Yeah. So for people who don’t know, you were in the middle of commentating a fight, and he walked up from behind you and he said he was sorry. He shook your hand, gave you a hug. I didn’t know. He said, “I love you.”
Teddy Atlas
(01:19:15)
Yeah, he’s emotional. I get emotional a little bit, too. But he’s emotional and he can be… I can see why people have a fascination and a love affair with him right now, because he was the meteor that went across the sky that, if they didn’t see it, their parents told them about it. There was a meteor that came across the sky one day.
Lex Fridman
(01:19:42)
Yeah.
Teddy Atlas
(01:19:42)
And the meteor is walking around in the room now, and that’s the meteorite. And it actually landed here, and that’s it right there. And now he’s come a long way. And now he’s more human and he’s lovable and compassionate and he cries. And I get the fascination, I get the love affair. I get it because, inherently, we’re people that want to forgive. We’re people that, we want to be good, and part of being good is to forgive people and to show compassion to people. And when somebody’s been damaged, to acknowledge they’ve been damaged, to acknowledge that you know they’ve been damaged, and you care about them being damaged. And how do you show care? Through admiration. In some ways almost through adulation. And he’s getting adulation from people, which is to an incredible level. And it’s a phenomenon, but I get it. I understand it. And I don’t know if he gets it. I don’t know if underneath all of this… he’s a complex guy. He’s a sensitive guy. I don’t know… And I am, too.
Lex Fridman
(01:21:00)
One complex guy talking about another complex guy.
Teddy Atlas
(01:21:03)
I don’t know if, underneath it all, where he’s really truly at as far as that day that he said that to me.
Lex Fridman
(01:21:12)
Is there part of you that’s sorry to Mike for-
Teddy Atlas
(01:21:15)
I’m not sorry.
Lex Fridman
(01:21:16)
Pulling the gun on him?
Teddy Atlas
(01:21:17)
Yeah. And listen, that’s fair. I know dimensions of human nature too well to not know that he still has to have certain… because I have those strong feelings. What? It’s not fair for him to have them? Damn right, it’s fair. Now, he could look at it, if he was to be held to his word, that night that he just acknowledges that what happened, he deserved because of the position he put me in and he put himself in, what he did. And I wouldn’t change nothing.
Lex Fridman
(01:21:54)
Still, you don’t regret pulling the gun on him?
Teddy Atlas
(01:21:58)
I regret that I had to.
Lex Fridman
(01:21:58)
Yeah.
Teddy Atlas
(01:22:01)
Yeah, I regret very much that I had to. I regret very much.
Lex Fridman
(01:22:06)
He crossed the line.
Teddy Atlas
(01:22:08)
I hated him for putting me in that position. How dare he think that somebody’s feelings are that trivial? That the way I would feel about myself and the way the girl would feel about herself, that was 11 years old at the time, how she would feel about herself. How dare that he think it’s that trivial that I shouldn’t be ready freaking to both die and kill for that?
Lex Fridman
(01:22:35)
Yeah. Why didn’t Cus D’Amato see it in a deeper way and talk through it?
Teddy Atlas
(01:22:42)
He did. The word came back to me but, of course, what does it mean? But the word came back to me that Cus said, “You were right.” But if he took the side of Teddy, he would destroy potentially a great fighter.
Lex Fridman
(01:22:59)
Why do you think that? Okay, if you were to try to understand the point he was making, why is that true? Isn’t the part of greatness that you said is building the character of knowing what is right?
Teddy Atlas
(01:23:11)
Cus was afraid to go there, where he used to not be afraid, because it’s kind of like you’re never afraid of going up. And I get it. When I train a fighter now, if I come out of retirement, I train a fighter now, I feel in camp like I’m on death row every day. Every day I try to retrace my memory and say, “Did I feel this way when I was younger? I don’t remember feeling this way.” I feel, every day, a dreadful feeling that if I don’t get this right, I betrayed everything. I betrayed the fighter’s trust, I betrayed what I’m supposed to be.

(01:23:53)
And then one day I tried to figure it out. Why do I feel this way? It’s so intense. I was in camp for two months training a guy for the world title a few years ago, fighting the hardest puncher in the world at the time, Adonis Stevenson, and the fighter was Ukrainian. And I was brought in to train him for that fight, and he trusted me and changed his whole style. Trusted me. Oh my God. I went to bed every night praying, dread. Waking up, dread. My stomach down to here. Saying, “What if I fail? What if everything that I told him was going to happen don’t happen? What if I fail him? What if he trusted me and I betrayed that trust?”

(01:24:37)
And the thing with Cus was he used to be stronger than that. And then I tried to figure it out, why I got this way and why it was so dreadful to me, and why I felt like I was on death row every day training a fighter. Like, “Did I do enough? Did I do right? Will we accomplish what I promised him we would accomplish? Would I keep my word?” And then I started thinking, how did I become this weak? How did I freaking become… I was a pretty strong freaking guy. How did I become this weak? And then finally I think I figured it out. You know why?
Lex Fridman
(01:25:12)
Hm?
Teddy Atlas
(01:25:13)
Because I was always working to get up. But once I finally got up, now I was looking down. And it finally hit me. I said, “I didn’t want to lose.” I said, “There was nothing to lose on my way up.” Now, all of a sudden there’s something to lose when you’re up there and you’re looking down.
Lex Fridman
(01:25:29)
And that’s where he was.
Teddy Atlas
(01:25:32)
And that’s where Cus was. Cus was at the end of his rope. He accomplished two world champs, all this stuff, everything. And he did it right. Now all of a sudden it wasn’t about moving forward, it was about not falling down. Holy cow. I was like, “I got it, Cus. I got it. I got it. You didn’t want to fall down. Oh my God. You didn’t want to fall.” And this was his last chance. You don’t give up on life. This was his last chance to live forever. To make everything he did worthwhile. To have the youngest heavy… it wasn’t just heavyweight champ. You’ve got to remember he was the youngest heavyweight champ ever.

(01:26:20)
And to have that, it was okay to die now. And how’s loyalty to someone named Teddy Atlas going to get in the way of that? That’s a tidal wave that there ain’t no wall that’s been made high enough to stop that tidal wave. And now I’ll stop myself. Yeah, there is, but it would have to be an awful big one. And you know what? Who are we to say that we could ever build that wall that big? Who is any of us? Who am I to say?
Lex Fridman
(01:26:54)
Do you think, if you were to put yourself in the shoes of Cus D’Amato, can you see yourself having the big enough wall where you would choose loyalty?
Teddy Atlas
(01:27:07)
Now, if I answer the way I feel then I’m making myself John Wayne again.
Lex Fridman
(01:27:13)
You don’t have to answer then. I think loyalty is important.
Teddy Atlas
(01:27:18)
No matter what a man says, it’s what he does in the end that he intended to do all along. I didn’t make that up, Cus did. And when this all went down, those words came freaking echoing into my freaking ears. I didn’t want them. Cotton doesn’t help. And they freaking kept coming into my ears. And what do you think? Still an immature kid at the time. I was young. Still an immature kid at the time. What the freak do you think my response was? You were full of…
Lex Fridman
(01:27:54)
Yeah, shit.
Teddy Atlas
(01:27:56)
But I got past that.
Lex Fridman
(01:27:58)
Do you forgive Cus? Have you found forgiveness?
Teddy Atlas
(01:28:03)
Listen, I forgive him because he gave me more than he took away from me. What kind of man am I if I can’t at least acknowledge that and be grateful for that? He gave me more than he took from me, and I’m grateful for that. I’m also grateful for what I gave him, that I did give him something, and at that point in his life. A place to still have test tubes and chemistry experiments. A laboratory where he could still create a great fighter. And I helped give them that. I was part of that lab and making sure that lab was there and just that there was the existence of test tubes in the place, because you can’t freaking do experiments without test tubes.
Lex Fridman
(01:29:07)
Now you’re the scientist with the test tubes.
Teddy Atlas
(01:29:10)
Yeah, I guess so. And I just hope that… what I said earlier is really my thread through this whole thing. When you say, “Could you forgive Cus?” I’m still trying to forgive myself. And if I can have hope that I can forgive myself, I think that hope has to start with the power to forgive someone else. How can I ever forgive myself for all my failings and figure it out if I can’t start and practice it by forgiving someone else for some shortcomings? And for me, that’s the only sense of sometimes a very hard thing to make sense of. That’s my North Star, that’s my compass. Cus used to make me laugh. Me and him did everything together. And we’d get lost in the city, we get lost in the Bronx, and he’d get all frustrated. And he said, “Atlas, you’re a great trainer but you turn you around, you spin you around and you’re lost.” And I said, “Me or we?” Because I was the only one who would argue with him, and it was really funny sometimes. And I said, “We or me? We or?” He goes, “[inaudible 01:30:46].” “Cus, you’re lost. I’m lost. What are you talking about?”

(01:30:52)
And then all of a sudden Cus couldn’t give in. He just couldn’t admit. He couldn’t give in. You know what he said to me? All of a sudden he goes, “When I was in the Army, if I had a compass I could get out of the woods.” I said, “We’re not in the woods, we’re not in the Army. We don’t have a compass. Cus! Cus!” “Just don’t argue with me!”

(01:31:13)
One time we’re driving. I want to get back to Catskill. We just finished at the Bronx. It’s been a long day visiting the Murderers Inc. houses and everything else that he took me through for the 1,800th time. And he would fall asleep. He was getting older and he would just fall asleep in the car. So what do you think? I went a little faster, because before he went to sleep he said, “Don’t speed.” I don’t consider myself… I try to be an honest guy and I try to be a freaking…
Lex Fridman
(01:31:51)
Was it five or six guys?
Teddy Atlas
(01:31:52)
What did I figure earlier? Try to do less submitting, really, in all phases. Try to submit a little less. Try to lie a little less today. A little less. Try to get stronger, try to get a little better. So here we are and we’re driving. And all of a sudden he’s asleep. What did I do? 80? 75? Probably. Probably did. Whatever. And all of a sudden he wakes up. “You were speeding.” I lied. “No, I wasn’t.” “Don’t lie.” “I’m not lying.” “You lied again. You were speeding.” Now, come on. This guy, he’s unbelievable. So I got to freaking… he’s David Copperfield, I want to know the trick. I want to know how he made this thing disappear.

(01:32:52)
So I said, “What are you talking? How do you know?” He goes, “Because I timed you. I looked at the post number.” And I’m like, “What?” “I looked at the post number on the side of the road where we were,” whatever mile. And I never knew they even existed. I look and I said, “Yeah, there’s little numbers.”
Lex Fridman
(01:33:12)
He started timing and then he fell asleep.
Teddy Atlas
(01:33:13)
Yeah, he timed it. And he looked. He goes, “We couldn’t have got from here to there in that amount of time unless you were going 75 miles an hour.” And I’m like, “All right, I’m impressed. Don’t try to get the mileage, the mile per hour part right. It’s enough that you got me. That’s enough. Yeah.” I said, “And I’m not going to do that no more.” And he helped me in crazy ways where there would be times where you wanted to be whatever, convenient, weak, submit. And then all of a sudden, in my mind, Cus was there with the stopwatch. And I’d be like, “No,” where I was about to say yes to whatever that particular situation was.
Lex Fridman
(01:34:16)
Somebody hit their phone. Hello? Hello? Yes. Doing great. Thank you.
Lex Fridman
(01:34:30)
Just for the record, never had a phone call like this. It’s hotel security. The question he asked me is, “Are you okay, sir?” Are you okay? Are we okay?
Teddy Atlas
(01:34:39)
I think so. I think so. So far.
Lex Fridman
(01:34:43)
Yeah.
Teddy Atlas
(01:34:44)
I can only go so far. It’s kind of like that old joke where the guy jumps off the Empire State Building. He’s falling down and he’s going 80th floor, 70th floor, 60th floor, 50th. And he gets past the 50th floor and they’re looking at him out the window and he goes, “How am I doing?” They’re like, “So far so good.” I don’t know where it’s going to end.
Lex Fridman
(01:35:07)
Mike Tyson is considered by many to be one of the great boxers, one of the greatest boxers of all time, heavyweight boxers. What do you think, on the positive side, made him great?
Teddy Atlas
(01:35:18)
I don’t know if he was ever great. I know he was sensational. I know he was the greatest mix of maybe speed and power ever. I know he was one of the greatest punchers from either side of the plate, left or right. There’s been great punchers with just the right hand, like Earnie Shavers and Deontay Wilder and Max Baer. I don’t know if there’s ever been anyone who could punch as good as he did on either side with either hand, other than Joe Louis and a few others. I don’t know if there’s ever been such a combination of speed and power to that pure level that he had, and it was a pure level.

(01:35:58)
I don’t know if there was ever as good a fighter as Tyson was, where maybe one night he was great where he wasn’t tested but he might’ve been ready to be tested. That one night against Michael Spinks when he took him apart in 90 seconds, I think I saw a great fighter that night. I don’t think you can be great unless you have all the requirements of being great.
Lex Fridman
(01:36:22)
What does it take to be a great fighter? Truly great.
Teddy Atlas
(01:36:32)
To not rely on someone else’s weakness to be strong. To be strong on your own. Too often he relied on other people’s weakness, whether it’s by being intimidated or whether it was because his talent was so much greater than theirs that it was like putting a monster truck in there with a Volkswagen, and the Volkswagen was going to get crushed. No matter how much horsepower the Volkswagen might’ve had under the hood, it was going to get crushed. The monster truck was not going to allow it to be a contest.

(01:37:07)
And to be able to find a way when your talent wasn’t enough. He didn’t find a way when his talent wasn’t enough. And I’m not making statements if I’m not ready to put some evidence. Like if we were in a courtroom, exhibit A: when he fought Buster Douglas, Buster Douglas matched his will and didn’t get intimidated. Stood up to him. He didn’t do what most people did. He didn’t submit even a little bit. Not that night. He had in the past, but that night he didn’t.

(01:37:48)
Why? Because Buster had a secret weapon that night, his mother. Buster’s mother had died a few months previous. He loved his mother very much. Buster had always had talent. Big heavyweight, talented, could punch, technically solid. He was all those things, always was, but he quit in fights. He did less than he should’ve done. He never lived up to his ability. He gave in. He submitted. He wasn’t strong enough. He never had a reason to be strong enough. When his mother died, he had a reason. Nothing could hurt him as much as his mother dying hurt him, Mike Tyson included.

(01:38:27)
That night Mike Tyson could not hurt him as much as his mother had hurt him by dying. That night he had a reason to be strong, for his mother, and he was strong. He was everything he was supposed to be and more. And he stood up to Mike, and Mike, for the first time maybe ever, was in a fight where he had to overcome something, where he had to be more than talented, more than a puncher, more than a guy with scintillating speed. And he wasn’t. And then that night got followed by another night with Holyfield. Holyfield…
Teddy Atlas
(01:39:00)
… night got followed by another night with Holyfield. Holyfield wasn’t as talented as him, as big, as much of a puncher, but Holyfield had the character. He was strong in ways that Tyson wasn’t strong. He was strong in a way where he could find a way. He was willing to find a way. He’s willing to go to the cliff, to truly die before he submitted. A lot of stuff is just words, “They’re going to have to carry me out on the shield,” yeah, sure. Okay. Yeah, until it comes time to be carried out on the shield. Sometimes there’s people that actually mean it.
Lex Fridman
(01:39:45)
You think Mike didn’t have that?
Teddy Atlas
(01:39:47)
Well, all right. Let’s just say arbitrarily, I don’t have his record in front of me. Let’s say he was 55 and five, I know he had about five losses. All right, let’s say he was 55 and five, a lot of knockouts. I have a saying, a fight’s not a fight until there’s something to overcome, until then it’s just an athletic exhibition. Contest. Yeah. Who’s a better athlete? Who’s got more quick twitch fibers? Who’s more developed? Who’s a better this? Who’s more developed in those physical areas? But a fight is not a fight until there’s something to overcome. Okay. So, if you go by my definition, not Webster’s, my definition, which I think means something, Mike Tyson was only in five fights in his life.

(01:40:41)
The five fights where there was something to overcome and he didn’t overcome it. Now, I know people hate me for this, including Tyson. I understand, hate me. Oh, you’re a hater, because you weren’t with him, you didn’t make the money because this, because of that, because you got betrayed. I think I’m better than that. I hope I’m better than that. I believe I’m better than that. I’m not a hater. I’ve broadcast fights for 25 years on ESPN, where there was some people in the corner I did not like, and if they did a good job, this guy’s doing a great job. And then, there were guys that I liked and I had friendship, he messed up, and we weren’t friends no more. Friendship got to be tested. Remember that? So, we weren’t friends no more, but why did I do that? Because it was my job. It was more important for me… When it’s all over with, the only thing you’re left with is… We’re going to be dust, all of us, right?

(01:41:46)
The only thing we’re left with is what carries on, our reputation, legacy, whatever that is. But our reputation, that’s all we’re left with. And that’s all our kids are left with. I want it to be as good as it can be. I’ve always had ability, I’ve done a lot of things wrong, and I’ve had a lot of lackings, but the one strength I’ve had, if I had a strength, is to understand somehow, through osmosis, I guess, to learn the lesson that was important is not what’s in front of you for those five seconds, for that moment in life, it’s what’s left behind you when those five seconds are gone. Whatever it is that you’re dealing with, whatever that moment is, whatever… That moment, what you do in that moment, the action of that moment is going to stay with you and be you. It’s going to become you.

(01:43:05)
What you face for that moment, it’s gone. It’s gone in the air, in an instant. It’s gone, it’s done. Whether you stand up there and you get shot in the head, and the guy freaking blows your brains out, or you stand up or you’re fighting a guy who’s a scary guy to fight, but you fight him and you beat him or he beats you up. But how you represented yourself in that moment is all that matters. That’s going to live. What happened don’t matter. It don’t matter that you got shot in the head. I know that sounds absurd, but if you believe that it was important to stand up and take the chance to get shot in the freaking head, rather than to live like an empty vessel, you know what? That’s all that freaking matters. And somehow that got freaking wrapped into this freaking head of mine, that’s what matters. That’s all that matters.

(01:44:17)
You know how many times I went, and there were things, whether it was with this one, with Tyson, with that… I didn’t want to be there, I was scared to death, but I was more scared-
Lex Fridman
(01:44:31)
Living with regret.
Teddy Atlas
(01:44:32)
… how I would’ve felt. I don’t want to be in solitary confinement the rest of my life, with that freaking guy in the cell next to me called regret. I don’t freaking want to be next to that guy. If I want to freaking go down that road, I’ll watch Papillon. And I’ll get my fill from that. But I don’t want to freaking live it. I’m afraid of what my children would think of me if I fail in those areas. Why? Because that’s forever. When I’m closing my eyes for the last time, I don’t want to have that fear. I don’t want to have that fear. Whether I’m going down there or whether I’m going up there. I laugh because I was around guys years ago that used to, when we’d talk about that in jest, and I would get a kick out of this one guy who’d been around the block a few times, when he’d say, “Teddy, I ain’t worried about that, I got friends in both places.”
Lex Fridman
(01:45:40)
That’s a good line.
Teddy Atlas
(01:45:41)
And I thought it was good. Listen, Mike Tyson, you want me to say he was a great fighter, then you want me to betray what I really… You know what I mean? You want me to do that? I ain’t doing it for… Listen, I could do it to be a bigger Teddy Atlas, and I know it would work for me. I know it’d do great promotional work for me. I know it would make me more popular in certain areas. I know it, I’m not that dumb. Not that dumb. But I also know what else it would do to me, and I don’t want it to do that to me. I think he was a great talent, I think maybe the night with Michael Spinks, maybe the night with Mike, maybe he could have been that fighter. But he didn’t never really get tested, but he might’ve been ready no matter what, had he been tested that night.

(01:46:32)
That’s how good he was. That’s how, even though it was a guy who used to be a light heavyweight, I get it. But it was still a guy who beat Larry Holmes, who still had something left, Michael Spinks. And a great puncher. And an Olympic gold medalist. And a special fighter, one of the great light heavyweights of all time. You know what Mike Tyson was? He was a meteor. He was a meteor that struck across, and not too many meteors… And we still talk about him. And unlike Halley’s Comet, he came back, and he’s walking around. And he has become greater after his career, more loved, more beloved, more awed, and he’s been forgiven. He found the fountain of forgiveness. I don’t know… I wish I could find that.

(01:47:19)
Where he has been forgiven for all his shortcomings, all the things that he may have done, may not have done, we don’t know, only him and God know. But he’s been forgiven of all that, and he’s been not only forgiven, he’s risen above it and above that, and been brought above that. He’s been brought to the pyramids of the greatest athletes in the world. In every way. In every way. As a person, as a fighter, as a historian, as a figure, as a celebrity-
Lex Fridman
(01:48:05)
Even a philosopher.
Teddy Atlas
(01:48:06)
Everything. So, I will take it back. All right, all you guys out there, you forgive me, he’s the greatest of all time, if you encapsulate all that. If you encapsulate everything I just tried to describe and explain, if you put that all… He’s the greatest of all time, yeah, he is. But he still might be 0-5. In a record of 55 fights he might, in Teddy Atlas’s book, again, I got friends in both places, so it’s okay. Wherever I go I’ll have company, somebody there will like me, despite me saying this. He might be 0-5 because of five fights where there was something to overcome, which really defines a fight. He didn’t find a way.
Lex Fridman
(01:48:55)
Let me ask Teddy Atlas to introspect on the human nature here, as part of the complexities of your feelings on this whole thing is that you know to some degree that if you were coaching Mike Tyson, he could be truly great throughout-
Teddy Atlas
(01:49:13)
I know… I’m going to cut you right off, because you asked a million-dollar question, I wish you didn’t, but you did. You did. Because that’s why-
Lex Fridman
(01:49:21)
When do I get paid?
Teddy Atlas
(01:49:22)
That’s why you get paid. I get it, you took the words out of my mouth. That’s why you are where you are. And that’s why I’m here.
Lex Fridman
(01:49:33)
The humility.
Teddy Atlas
(01:49:34)
I’m going to, again, full disclosure, it’s important, right? I’m going to cheat, I’m going to take some of Cus’ wisdom. All right. A little bit of mine. Cus told somebody that if Teddy Atlas got his way, he might’ve been a better person, but we would’ve risked him not being a great fighter. Now, I believe, and I thought Cus did, and I think he did up to that point in his life, that part of your strength of character made you a great fighter, and truly a great fighter. And part of that battle to be a better person, that fight if you will, to be a better person, to overcome the [inaudible 01:50:22] to be a better person, part of that fire you have to go through to be a better person, I really truly bought into it, and I’m in for life.

(01:50:34)
That is really the only way to be a great fighter. And I don’t think that’s what Cus meant, I think he meant that… Cus knew more than I did of what was about to come and what would come and what the world was. How people would try to steal him, how people would take him, how people would steal his guy. The last thing he had, really, the thing that he lived for. Because he lived to have another heavyweight champ, the greatest fighter ever, in Cus’ mind. He could be. And I believe that Cus knew that he could put forward a guy that had the ability to be the greatest fighter ever, without fully completing the mission of what it takes to really be great, but that he wouldn’t be around to have to witness it.

(01:51:38)
And that he wouldn’t… Oh man, this is awful. He’s willing to concede that he might be dead in order to have eternal life, in order to have greatness. And which, Cus does have greatness, and part of that greatness is attached to Tyson. And he deserves it. He deserves it, Cus was a great man. And I wouldn’t be here, partly, without him.
Lex Fridman
(01:52:07)
But that was part of the calculation.
Teddy Atlas
(01:52:08)
I know that’s deep, and I know that’s… Oh God, I hate myself right now. But Cus, he knew he was getting out free. He knew he was going to not have to be there. He was getting off easy. Oh, Teddy, how do you say someone’s going to be dead, they’re getting off easy? I’ll say it again in case you didn’t hear me, all right? He was going to get off easy, and not have to face where he came up short, because he did his job. Because he put forward the greatest fighter of all time, and you guys screwed it up.

(01:52:48)
And he knew that that might happen, but you guys screwed it up. And whatever, that’s your fault. That’s on… I’ll tell you, Tyson will be mad at this, but that’s on Tyson. How can you say that Teddy? He loved me. I’m not saying he didn’t love you, but he loved some other stuff too. And I don’t know if Tyson could ever come to grips with that, and it’s not his job to. But it’s my job not to hide from it. I know Cus in dimensions that other people just only think they know.
Lex Fridman
(01:53:23)
Did Cus know? Did Cus know this about himself? Did he reflect? Did he introspect?
Teddy Atlas
(01:53:30)
Well, he sent a message to me. Cus sent a guy to me… My wife was pregnant, we were living in an apartment in Catskill on [inaudible 01:53:38] road. We went through all this, and I was getting ready to move to Staten Island, and we still were there for a little while before we did, after all this went down. He sent a guy to me, to the house, secret, whatever you want to call it… My wife, me. So, I listened to him. Cus said, if you leave… I’m a messenger, whatever. If you leave… This was in the aftermath of the gun, the whole thing. You got to remember, Tyson was a ward of the state. He was put in Cus’ custody. Cus was looking to adopt him, for obvious reasons, so he had control. And he loved him.

(01:54:28)
How dare I say anything less? I won’t. But it made sense too. But he was a ward of the state still. Do you know what that means? There’s rules. It means the state’s still overlooking it. If he ain’t living the right life… You got to remember, he came out of a jail. So, reform school. But if he ain’t living the life, he could be taken away from Cus. What’s not living the right life? Well, he wasn’t in school no more, they didn’t know about it. He had some things that were going on, we won’t get into that right now, in school and different things, whatever. And he had his trainer put a gun to his head. That ain’t so good. If a report came back to them that that happened, he would’ve been taken away from Cus. That couldn’t happen. Nobody knows this. I talk about it a little bit, but never, probably… Because why would I?

(01:55:28)
I don’t know. Why am I doing it now? I don’t know, because… I don’t know. Because I am-
Lex Fridman
(01:55:33)
[inaudible 01:55:33].
Teddy Atlas
(01:55:33)
… because it’s now. Because it’s now, maybe. Maybe because it’s now, I don’t know. So, he sent this man, that obviously we both knew, and he said, here’s the deal, Teddy. No talk about this, wants it to disappear, basically, you leave and he will give you 5%… His word. Can you imagine? He will give you 5% of Tyson’s earnings for the rest of his career. But I don’t regret it one bit because it wouldn’t have happened anyway. See, that’s where I could be honest with my… People say, oh, standup guy, because I told him to shove it where the… In that place. And tell Cus to shove it in that freaking place. I was mad. Teddy, don’t get angry… Don’t get angry? Are you out of your… Are you serious? Get out of here. Tell them to go shove it over… And my wife was like, huh? And then, people are like, [inaudible 01:56:37], why didn’t you take the deal? It wasn’t a deal, it was an escape clause for Cus. It was an insurance policy, that this kid wouldn’t be taken away from him.

(01:56:52)
And thank God he wasn’t. I wasn’t going to go and say nothing, they didn’t have to worry about it. Cus forgot who I was? Cus forgot why he went to court for me? Because of those characteristics that he said he loved, and he noticed, and that he admired. I didn’t lose those characteristics, he forgot that that was me, he forgot who he was talking to. He didn’t have to do that. How about, that’s why I told him to shove it up his… Not because of the other insult. And then, when people said to me, oh, you were stand up… Because it was around a little bit. It was around in the circles. And then, when people… Oh, stand up Teddy, he didn’t care about the money. I said, stand up Teddy? What are you talking about?

(01:57:39)
How about just realistic, Teddy? How about I live in a real world, that I was never going to get that money? So, I’m standing up to something that I knew never existed. So, I ain’t stand up, not in that way. I am in other ways maybe, but don’t put a medal on my chest for that, because that never existed. It was never meant to exist. But he didn’t even understand. That was the one thing that really disappointed me in Cus. I was like Cus, you really allowed this to get to you. Where you’ve allowed it to really fog up your thinking, to the point where you’re smarter than that, you’re better than that. That you would actually think you got to freaking offer me freaking pieces of silver. You really think that? That’s what you… Freak you. All that you told me, that you love me, and that I was the young master, and all this… And you think you were going to buy me? And that was going to keep me quiet? How about I would keep quiet because I would always keep quiet?
Lex Fridman
(01:58:52)
So, he thought maybe you might betray him?
Teddy Atlas
(01:58:56)
Isn’t that interesting? Yeah. And why did he think that? No, no really?
Lex Fridman
(01:59:03)
Fear.
Teddy Atlas
(01:59:04)
Yeah. But yeah, fear is at the essence of everything, it’s connected with everything. Fear of losing what he was going to lose. But it was more than fear, it was him not believing in the things that he told me he believed in. He didn’t even know that. He believed in me because I was a standup guy. Because I didn’t sell myself. Because I didn’t freaking turn evidence. I didn’t make a deal. I didn’t do… And that’s why he went to court, and that’s why he stood up for me. And I appreciate it. And that was what he lived by. And those were the blocks of being a man. So much for those blocks.
Lex Fridman
(01:59:47)
Well, it’s like you said, loyalty requires… He would’ve had to take a risk on losing immortality that he would achieve by creating a great heavyweight champion-
Teddy Atlas
(02:00:02)
And that’s the only way you… 100%. But the only way you ever find out if somebody is really that-
Lex Fridman
(02:00:07)
It’s hard. It’s hard.
Teddy Atlas
(02:00:08)
…it’s the test. And it was Cus.
Lex Fridman
(02:00:10)
This is Shakespearean, this story.
Teddy Atlas
(02:00:15)
Cus told me, Cus said, “And the test come in different forms.”
Lex Fridman
(02:00:18)
Yeah.
Teddy Atlas
(02:00:19)
I said, all right, Cus.
Lex Fridman
(02:00:20)
This was his test.
Teddy Atlas
(02:00:23)
And some people pass this test because they’re able to pass that test, because it’s not really a test. Not for them. Because it doesn’t speak to their weakness. But it’s the test that speaks to the weakness, that’s the one. So, this one, I get it, I get what it spoke to, Cus. And you know what? At the end of the day, I forgive you, and I feel bad for you. I feel bad that you were put in that position after you lived your life that way, and that you taught that and you preached that from the mountaintops, that you had to be… That you had to be… I’m not going to use the word. But that you had to fail yourself, and that you had to somehow know that before you died.

(02:01:22)
I just pray that you didn’t know that. And you still don’t know that. Because you were great, you were great. And you’ve given me something to aspire towards. To try to be less weak. Try to be better. And try to be as good as you wanted to be. I wish I can someday. More importantly, I wish I could make my father just feel good up there.

Forgiveness

Lex Fridman
(02:02:18)
You’re a grandfather now.
Teddy Atlas
(02:02:20)
Yeah, four grandchildren.
Lex Fridman
(02:02:23)
If you can give them advice on how to live a life they can be proud of…
Teddy Atlas
(02:02:33)
Just do everything you can, to the best of your ability, every day, to like yourself. To give yourself a reason, to actually say, I’d like to be friends with that guy.
Lex Fridman
(02:02:57)
Is loyalty one of the reasons? One of the things to aspire to?
Teddy Atlas
(02:03:06)
Loyalty is your chance to have a fulfilled life. Loyalty is your chance to have strength, to have all the things you need to have a good life. To be a good parent, be a good husband, be a good grandfather, hopefully be a good role model. Loyalty is… If you could find something to drink, to take into your body, to make you prepared for life, to be all the things that you want to be, to be strong enough to be those things, loyalty would be the thing you would drink. And when I say loyal, I mean unequivocally. I mean unconditionally. Not conveniently, obviously you know that. If you could be loyal, you could be a good person. You could be a person that you would actually like to be around. Because you could be a person you could rely on. And I think that’s one of the greatest assets that a human being can have.
Lex Fridman
(02:04:32)
And what do you do when you’re betrayed? How do you overcome that?
Teddy Atlas
(02:04:38)
You think of what you learned from it. Use it as a roadmap to remember, and to think back of how you got there. And how you got to the place where you got betrayed, and how that person got to that place. Try to remember that in your own journey.
Lex Fridman
(02:05:04)
Has it, for you, made you cynical? How do you take the leap of trust towards people again and again after that?
Teddy Atlas
(02:05:14)
Just by remembering that I’m still trying to forgive myself for the things that I came up short with. And if I haven’t figured that out yet, it’s probably okay to say they didn’t figure it out yet, they didn’t figure it out. And if I couldn’t figure it out and I’m still trying to figure it out, maybe I could get over that initial stabbing of, what it feels like. It does feel like a stabbing. That you feel when you’re betrayed initially, and that you could only think of anger, revenge, hatred. I know those things. I’m not proud of that, but I felt all those things. And I still feel them sometimes. And then I go back and say, hey, you’re still working at forgiving yourself for some things, try to remember that kid. Memory’s an important thing. Forgetfulness is pretty important too. And I’m trying to remember why we forget. Why do we forget? Because it wasn’t something you felt proud of.
Lex Fridman
(02:06:38)
Do you think about your death? Are you afraid of it?
Teddy Atlas
(02:06:46)
It’s funny you asked that. I never used to think about it. I know people in both places.
Lex Fridman
(02:06:59)
I know, you’ve got it covered. You’re going to be all right.
Teddy Atlas
(02:07:03)
Don’t forget that.
Lex Fridman
(02:07:04)
Yeah.
Teddy Atlas
(02:07:05)
I know people in both places.
Lex Fridman
(02:07:05)
Yeah. Both neighborhoods.
Teddy Atlas
(02:07:24)
I’ve been given credit for being brave in certain spots in life, I hope I can be brave when it comes time to leave life. I hope I can be. And that’s just, that’s real and honest as you can be about it. I hope I can be. So far, so good. When I’ve had to be certain things that I was scared to freaking death, I found a way to beat them, for the most part. And so, I figured, when that day comes, I’ll figure that out too.
Lex Fridman
(02:08:12)
It’s going to be another test, maybe the last one. Teddy, it’s a huge honor to talk to you.
Teddy Atlas
(02:08:19)
It’s my pleasure.
Lex Fridman
(02:08:20)
Thank you for being the human you are, for being honest. Honest about the full range of human nature. And thank you for talking today.
Teddy Atlas
(02:08:29)
Thank you. Thank you for having me, and thanks for listening.
Lex Fridman
(02:08:35)
Thanks for listening to this conversation with Teddy Atlas. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Muhammad Ali. “I hated every minute of training, but I said, don’t quit, suffer now and live the rest of your life as a champion.” Thank you for listening, and hope to see you next time.

Transcript for Jeff Bezos: Amazon and Blue Origin | Lex Fridman Podcast #405

This is a transcript of Lex Fridman Podcast #405 with Jeff Bezos.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with Jeff Bezos, founder of Amazon and Blue Origin. This is his first time doing a conversation of this kind and of this length. And as he told me, it felt like we could have easily talked for many more hours, and I’m sure we will. This is the Lex Fridman Podcast. And now, dear friends, here’s Jeff Bezos.

Ranch


Lex Fridman
(00:00:24)
You spent a lot of your childhood with your grandfather on a ranch here in Texas.
Jeff Bezos
(00:00:29)
Mm-hmm.
Lex Fridman
(00:00:30)
And I heard you had a lot of work to do around the ranch. So, what’s the coolest job you remember doing there?
Jeff Bezos
(00:00:35)
Wow. Coolest?
Lex Fridman
(00:00:37)
Most interesting? Most memorable?
Jeff Bezos
(00:00:39)
Most memorable?
Lex Fridman
(00:00:39)
Most impactful?
Jeff Bezos
(00:00:41)
It’s a real working ranch, and I spent all my summers on that ranch from age four to 16. And my grandfather was really taking me in, and in the early summers, he was letting me pretend to help on the ranch, because of course, a four-year-old is a burden, not a help in real life. He was really just watching me and taking care of me. And he was doing that because my mom was so young. She had me when she was 17, and so he was sort of giving her a break. And my grandmother and my grandfather would take me for these summers.

(00:01:15)
But as I got a little older, I actually was helpful on the ranch and I loved it. My grandfather had a huge influence on me, a huge factor in my life. I did all the jobs you would do on a ranch. I’ve fixed windmills, and laid fences, and pipelines, and done all the things that any rancher would do, vaccinated the animals, everything. But after my grandmother died, I was about 12 and I kept coming to the ranch, so then it was just him and me, just the two of us. And he was completely addicted to the soap opera, Days of Our Lives. And we would go back to the ranch house every day around 1:00 PM or so to watch Days of Our Lives. Like sands through the hourglass, so are the Days of Our Lives.
Lex Fridman
(00:02:07)
Just the image of that, the two of you sitting there watching a soap opera, two ranchers.
Jeff Bezos
(00:02:13)
He had these big crazy dogs. It was really a very formative experience for me. But the key thing about it for me, the great gift I got from it was that my grandfather was so resourceful. He did everything himself. He made his own veterinary tools. He would make needles to suture the cattle up with. He would find a little piece of wire and heat it up and pound it thin and drill a hole in it and sharpen it. So, you learn different things on a ranch than you would learn growing up in a city.
Lex Fridman
(00:02:43)
So, self-reliance?
Jeff Bezos
(00:02:44)
Yeah, figuring out that you can solve problems with enough persistence and ingenuity. And my grandfather bought a D6 bulldozer, which is a big bulldozer, and he got it for like $5,000 because it was completely broken down. It was like a 1955 Caterpillar D6 bulldozer. New it would’ve cost, I don’t know, more than $100,000. And we spent an entire summer repairing that bulldozer. And we’d use mail order to buy big gears for the transmission, and they’d show up, they’d be too heavy to move, so we’d have to build a crane. Just that problem-solving mentality. He had it so powerfully. He did all of his own… He didn’t pick up the phone and call somebody, he would figure it out on his own. Doing his own veterinary work.
Lex Fridman
(00:03:39)
But just the image of the two of you fixing a D6 bulldozer and then going in for a little break at 1:00 PM to watch soap operas.
Jeff Bezos
(00:03:47)
Days of Our Lives. Laying on the floor, that’s how he watched TV. He was a really, really remarkable guy.

Space

Lex Fridman
(00:03:52)
That’s how I imagine Clint Eastwood also in all those westerns, when he’s not doing what he’s doing, he’s just watching soap operas. All right. I read that you fell in love with the idea of space and space exploration when you were five, watching Neil Armstrong walking on the moon. So, let me ask you to look back at the historical context and impact of that. So, the space race from 1957 to 1969 between the Soviet Union and the US was, in many ways, epic. It was a rapid sequence of dramatic events. First satellite to space, first human to space, first spacewalk, first uncrewed landing on the moon. Then, some failures, explosions, deaths on both sides actually. And then, the first human walking on the moon. What are some of the more inspiring moments or insights you take away from that time, those few years, just 12 years?
Jeff Bezos
(00:04:51)
Well, I mean there’s so much inspiring there. One of the great things to take away from that, one of the great von Braun quotes is, “I have come to use the word impossible with great caution.” And so, that’s kind of the big story of Apollo is that going to the moon was literally an analogy that people used for something that’s impossible. “Oh, yeah, you’ll do that when men walk on the moon.” And of course, it finally happened. So, I think it was pulled forward in time because of the space race.

(00:05:31)
I think with the geopolitical implications and how much resource was put into it. At the peak, that program was spending 2% or 3% of GDP on the Apollo program. So much resource. I think it was pulled forward in time. We kind of did it ahead of when we, quote, unquote, should have done it. And so, in that way, it’s also a technical marvel. I mean it’s truly incredible. It’s the 20th century version of building the pyramids or something. It’s an achievement that because it was pulled forward in time and because it did something that had previously been thought impossible, it rightly deserves its place in the pantheon of great human achievements.
Lex Fridman
(00:06:17)
And of course, you named the rockets that Blue Origin is working on after some of the folks involved.
Jeff Bezos
(00:06:24)
Yeah.
Lex Fridman
(00:06:24)
I don’t understand why there isn’t a New Gagarin. Is that-
Jeff Bezos
(00:06:27)
There’s an American bias in the naming. I apologize-
Lex Fridman
(00:06:30)
That’s very strange.
Jeff Bezos
(00:06:31)
… Lex.
Lex Fridman
(00:06:31)
Was just asking for a friend, clarifying.
Jeff Bezos
(00:06:33)
I’m a big fan of Gagarin’s though. And in fact, I think his first words in space I think are incredible. He purportedly said, “My God, it’s blue.” And that really drives home. No one had seen the Earth from space. No one knew that we were on this blue planet. No one knew what it looked like from out there, and Gagarin was the first person to see it.
Lex Fridman
(00:07:01)
One of the things I think about is how dangerous those early days were for Gagarin, for Glenn, for everybody involved. How big of a risk they were all taking.
Jeff Bezos
(00:07:11)
They were taking huge risks. I’m not sure what the Soviets thought about Gagarin’s flight, but I think that the Americans thought that the Alan Shepard flight, the flight that New Shepard is named after, the first American in space, he went on his suborbital flight, they thought he had about a 75% chance of success. So, that’s a pretty big risk, a 25% risk.
Lex Fridman
(00:07:36)
It’s kind of interesting that Alan Shepard is not quite as famous as John Glenn. So, for people who don’t know, Alan Shepard is the first astronaut-
Jeff Bezos
(00:07:44)
The first American in space.
Lex Fridman
(00:07:46)
American in suborbital flight.
Jeff Bezos
(00:07:48)
Correct.
Lex Fridman
(00:07:48)
And then, the first orbital flight is-
Jeff Bezos
(00:07:51)
John Glenn is the first American to orbit the Earth. By the way, I have the most charming, sweet, incredible letter from John Glenn, which I have framed and hanging on my office wall.
Lex Fridman
(00:08:04)
What did he say?
Jeff Bezos
(00:08:04)
Where he tells me how grateful he is that we have named New Glenn after him. And he sent me that letter about a week before he died. And it’s really an incredible… It’s also a very funny letter. He’s writing and he says, “This is a letter about New Glenn from the original Glenn.” And he’s got a great sense of humor and he’s very happy about it and grateful. It’s very sweet.
Lex Fridman
(00:08:30)
Does he say, “P.S. Don’t mess this up,” or is that-
Jeff Bezos
(00:08:34)
No, he doesn’t.
Lex Fridman
(00:08:35)
“Make me look good.”
Jeff Bezos
(00:08:35)
He doesn’t do that. But John, wherever you are, we’ve got you covered.
Lex Fridman
(00:08:39)
Good. So, back to maybe the big picture of space. When you look up at the stars and think big, what do you hope is the future of humanity, hundreds, thousands of years from now out in space?
Jeff Bezos
(00:08:54)
I would love to see a trillion humans living in the solar system. If we had a trillion humans, we would have, at any given time, 1,000 Mozarts and 1,000 Einsteins. Then our solar system would be full of life and intelligence and energy. And we can easily support a civilization that large with all of the resources in the solar system.
Lex Fridman
(00:09:21)
So, what do you think that looks like? Giant space stations?
Jeff Bezos
(00:09:24)
Yeah, the only way to get to that vision is with giant space stations. The planetary surfaces are just way too small. So, I mean, unless you turn them into giant space stations or something. But yeah, we will take materials from the moon and from near-Earth objects and from the asteroid belt and so on, and we’ll build giant O’Neill style colonies and people will live in those. They have a lot of advantages over planetary surfaces. You can spin them to get normal Earth gravity. You can put them where you want them. I think most people are going to want to live near Earth, not necessarily in Earth orbit, but near Earth vicinity orbits. And so, they can move relatively quickly back and forth between their station and Earth. I think a lot of people, especially in the early stages, are not going to want to give up Earth altogether.
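The spin gravity Bezos mentions follows from simple centripetal-acceleration arithmetic; a minimal sketch in Python, assuming a hypothetical 500-meter habitat radius (the radius is an illustration, not a figure from the conversation):

```python
import math

# Illustrative only: spin rate needed for Earth-normal gravity in a
# rotating O'Neill-style habitat. Centripetal acceleration a = omega^2 * r,
# so for a = g the required angular velocity is omega = sqrt(g / r).
g = 9.81          # m/s^2, Earth surface gravity
radius_m = 500.0  # hypothetical habitat radius (assumption, for illustration)

omega = math.sqrt(g / radius_m)       # angular velocity in rad/s
rpm = omega * 60 / (2 * math.pi)      # revolutions per minute

print(f"{omega:.3f} rad/s, about {rpm:.2f} rpm")
```

At that scale the station spins only a little more than once per minute, which is why large radii are attractive for comfortable artificial gravity.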
Lex Fridman
(00:10:24)
They go to Earth for vacation?
Jeff Bezos
(00:10:26)
Yeah, same way that you might go to Yellowstone National Park for vacation, people will… And people will get to choose where they live on Earth or whether they live in space, but they’ll be able to use much more energy and much more material resource in space than they would be able to use on Earth.
Lex Fridman
(00:10:45)
One of the interesting ideas you had is to move the heavy industry away from Earth. So, people sometimes have this idea that somehow space exploration is in conflict with the celebration of the planet Earth, that we should focus on preserving Earth. And basically, your idea is that space travel and space exploration is a way to preserve Earth.
Jeff Bezos
(00:11:06)
Exactly. We’ve sent robotic probes to all the planets, we know that this is the good one.
Lex Fridman
(00:11:17)
Not to play favorites or anything, but…
Jeff Bezos
(00:11:19)
Earth really is the good planet. It’s amazing. The ecosystem we have here, all of the life and the lush plant life and the water resources, everything. This planet is really extraordinary. And of course, we evolved on this planet, so of course it’s perfect for us, but it’s also perfect for all the advanced life forms on this planet, all the animals and so on. And so, this is a gem. We do need to take care of it. And as we enter the Anthropocene, as we humans have gotten so sophisticated and large and impactful, as we stride across this planet, that is going to… We want to use a lot of energy. We want to use a lot of energy per capita. We’ve gotten amazing things. We don’t want to go backwards.

(00:12:10)
If you think about the good old days, they’re mostly an illusion. In almost every way, life is better for almost everyone today than it was say 50 years ago or 100 years ago. We live better lives by and large than our grandparents did, and their grandparents did, and so on. And you can see that in global illiteracy rates, global poverty rates, global infant mortality rates. Almost any metric you choose, we’re better off than we used to be. And we get antibiotics and all kinds of lifesaving medical care, and so on, and so on. And there’s one thing that is moving backwards, and it’s the natural world.

(00:12:54)
So, it is a fact that 500 years ago, pre-industrial age, the natural world was pristine. It was incredible. And we have traded some of that pristine beauty for all of these other gifts that we have as an advanced society. And we can have both, but to do that, we have to go to space. And the most fundamental measure is energy usage per capita. You do want to continue to use more and more energy, it is going to make your life better in so many ways, but that’s not compatible ultimately with living on a finite planet. And so, we have to go out into the solar system. And really, you could argue about when you have to do that, but you can’t credibly argue about whether you have to do that.
Lex Fridman
(00:13:49)
Eventually we have to do that.
Jeff Bezos
(00:13:51)
Exactly.
Lex Fridman
(00:13:52)
Well, you don’t often talk about it, but let me ask you on that topic about the Blue Ring and the Orbital Reef space infrastructure projects. What’s your vision for these?
Jeff Bezos
(00:14:03)
So, Blue Ring is a very interesting spacecraft that is designed to take up to 3,000 kilograms of payload up to geosynchronous orbit or in lunar vicinity. It has two different kinds of propulsion. It has chemical propulsion and it has electric propulsion. And so, you can use Blue Ring in a couple of different ways. You can slowly move, let’s say up to geosynchronous orbit using electric propulsion. That might take 100 days or 150 days, depending on how much mass you’re carrying. And reserve your chemical propulsion, so that you can change orbits quickly in geosynchronous orbit. Or you can use the chemical propulsion first to quickly get up to geosynchronous and then use your electrical propulsion to slowly change your geosynchronous orbit.

(00:14:55)
Blue Ring has a couple of interesting features. It provides a lot of services to these payloads. So, it could be one large payload or it can be a number of small payloads, and it provides thermal management, it provides electric power, it provides compute, provides communications. And so, when you design a payload for Blue Ring, you don’t have to figure out all of those things on your own. So, kind of radiation tolerant compute is a complicated thing to do. And so, we have an unusually large amount of radiation tolerant compute on board Blue Ring, and your payload can just use that when it needs to. So, it’s sort of all these services… It’s like a set of APIs. It’s a little bit like Amazon Web Services, but-
Lex Fridman
(00:15:51)
For space?
Jeff Bezos
(00:15:52)
… for space payloads that need to move about in Earth vicinity or lunar vicinity.
Lex Fridman
(00:15:57)
AWS for space. So, compute in space. So, you get a giant chemical rocket to get a payload out to orbit. And then, you have these admins that show up, this Blue Ring thing that manages various things like compute?
Jeff Bezos
(00:16:13)
Exactly. And it can also provide transportation and move you around to different orbits.
Lex Fridman
(00:16:19)
Including humans, do you think?
Jeff Bezos
(00:16:21)
No, Blue Ring is not designed to move humans around. It’s designed to move payloads around. So, we’re also building a lunar lander, which is of course designed to land humans on the surface of the moon.

Physics

Lex Fridman
(00:16:34)
I’m going to ask you about that, but let me ask you to just step back to the old days. You were at Princeton with aspirations to be a theoretical physicist.
Jeff Bezos
(00:16:45)
Yeah.
Lex Fridman
(00:16:47)
What attracted you to physics and why did you change your mind and not become… Why are you not Jeff Bezos, the famous theoretical physicist?
Jeff Bezos
(00:16:57)
So, I loved physics and I studied physics and computer science, and I was proceeding along the physics path. I was planning to major in physics, and I wanted to be a theoretical physicist. And the computer science was sort of something I was doing for fun. I really loved it and I was very good at the programming and doing those things, and I enjoyed all my computer science classes immensely. But I really was determined to be a theoretical physicist. That’s why I went to Princeton in the first place. It was definitely… And then, I realized I was going to be a mediocre theoretical physicist. And there were a few people in my classes, like in quantum mechanics and so on, who they could effortlessly do things that were so difficult for me. And I realized there are 1,000 ways to be smart.

(00:17:52)
Theoretical physics is one of those fields where only the top few percent actually move the state-of-the-art forward. It’s one of those things where your brain has to be wired in a certain way. And there was a guy named… One of these people who convinced me, he didn’t mean to convince me, but just by observing him, he convinced me that I should not try to be a theoretical physicist. His name was Yosanta. And Yosanta was from Sri Lanka, and he was one of the most brilliant people I’d ever met. My friend Joe and I were working on a very difficult partial differential equations problem set one night. And there was one problem that we worked on for three hours and we made no headway whatsoever. And we looked up at each other at the same time and we said, “Yosanta.”

(00:18:49)
So, we went to Yosanta’s dorm room and he was there. He was almost always there. And we said, “Yosanta, we’re having trouble solving this partial differential equation. Would you mind taking a look?” And he said, “Of course.” By the way, he was the most humble, most kind person. And so, he looked at our problem and he stared at it for just a few seconds, maybe 10 seconds, and he said, “cosine.” And I said, “What do you mean, Yosanta? What do you mean cosine?” He said, “That’s the answer.” And I said, “No, no, no, come on.” And he said, “Let me show you.” And he took out some paper and he wrote down three pages of equations, everything canceled out, and the answer was cosine.

(00:19:30)
And I said, “Yosanta, did you do that in your head?” And he said, “Oh, no. That would be impossible. A few years ago I solved a similar problem and I could map this problem onto that problem, and then it was immediately obvious that the answer was cosine.” You have an experience like that, you realize maybe being a theoretical physicist isn’t what the universe wants you to be. And so, I switched to computer science and that worked out really well for me. I enjoy it. I still enjoy it today.
Lex Fridman
(00:20:07)
Yeah, there’s a particular kind of intuition you need to be a great physicist, a mathematical intuition applied to physics.
Jeff Bezos
(00:20:12)
I think the mathematical skill required today is so high. You have to be a world-class mathematician to be a successful theoretical physicist today. And you probably need other skills too, intuition, lateral thinking and so on. But without just top-notch math skills, you’re unlikely to be successful.
Lex Fridman
(00:20:39)
And visualization skill, you have to be able to really do these kinds of thought experiments if you want truly great creativity. Actually Walter Isaacson writes about you and puts you on the same level as Einstein and-
Jeff Bezos
(00:20:53)
Well, that’s very kind. I’m an inventor. If you want to boil down what I am, I’m really an inventor. And I look at things and I can come up with atypical solutions. And then, I can create 100 such atypical solutions for something, 99 of them may not survive scrutiny, but one of those 100 is like, “Hmm, maybe that might work.” And then, you can keep going from there. So, that kind of lateral thinking, that kind of inventiveness in a high-dimensionality space where the search space is very large, that’s where my inventive skills come… I self-identify as an inventor more than anything else.
Lex Fridman
(00:21:43)
Yeah. And he describes in all kinds of different ways, Walter Isaacson does, that creativity combined with childlike wonder that you’ve maintained still to this day, all of that combined together. If you were to study your own brain, introspect, how do you think? What’s your thinking process like? We’ll talk about the writing process of putting it down on paper, which is quite rigorous and famous at Amazon. But when you sit down, maybe alone, maybe with others, and thinking through this high-dimensional space and looking for creative solutions, creative paths forward, is there something you could say about that process?
Jeff Bezos
(00:22:26)
It’s such a good question, and I honestly don’t know how it works. If I did, I would try to explain it. I know it involves lots of wandering, so when I sit down to work on a problem, I know I don’t know where I’m going. So, to go in a straight line… To be efficient… Efficiency and invention are sort of at odds, because real invention, not incremental improvement… Incremental improvement is so important in every endeavor, in everything you do, you have to work hard on also just making things a little bit better. But I’m talking about real invention, real lateral thinking that requires wandering, and you have to give yourself permission to wander.

(00:23:11)
I think a lot of people feel like wandering is inefficient. And when I sit down at a meeting, I don’t know how long the meeting is going to take if we’re trying to solve a problem, because if I did, then I’d know there’s some kind of straight line that we’re drawing to the solution. The reality is we may have to wander for a long time. And I do like group invention. I think there’s really nothing more fun than sitting at a whiteboard with a group of smart people and spitballing and coming up with new ideas and objections to those ideas, and then solutions to the objections and going back and forth. So, sometimes you wake up with an idea in the middle of the night and sometimes you sit down with a group of people and go back and forth, and both things are really pleasurable.
Lex Fridman
(00:24:14)
And when you wander, I think one key thing is to notice a good idea. And maybe to notice the kernel of a good idea. I’ll maybe pull at that string. Because I don’t think good ideas come fully-formed.
Jeff Bezos
(00:24:31)
100% right. In fact, when I come up with what I think is a good idea and it survives the first level of scrutiny that I do in my own head, and I’m ready to tell somebody else about the idea, I will often say, “Look, it is going to be really easy for you to find objections to this idea, but work with me.”
Lex Fridman
(00:24:53)
There’s something there.
Jeff Bezos
(00:24:54)
There’s something there. And that is intuition, because it’s really easy to kill new ideas in the beginning because there’s so many easy objections to them. So, you need to kind of forewarn people and say, “Look, I know it’s going to take a lot of work to get this to a fully-formed idea. Let’s get started on that. It’ll be fun.”
Lex Fridman
(00:25:17)
So, you got that ability to say cosine in you somewhere after all, maybe not on math, but-
Jeff Bezos
(00:25:23)
In a different domain.
Lex Fridman
(00:25:24)
Yeah.
Jeff Bezos
(00:25:25)
There are 1,000 ways to be smart, by the way, and that is a really… When I go around and I meet people, I’m always looking for the way that they’re smart. And you find that’s one of the things that makes the world so interesting and fun is that it’s not like IQ is a single dimension. There are people who are smart in such unique ways.
Lex Fridman
(00:25:53)
Yeah, you just gave me a good response when somebody calls me an idiot on the internet. “You know, there’s 1,000 ways to be smart, sir.”
Jeff Bezos
(00:26:01)
Well, they might tell you, “Yeah, but there are a million ways to be dumb.”

New Glenn

Lex Fridman
(00:26:04)
Yeah, right. I feel like that’s a Mark Twain quote. Okay. All right. You gave me an amazing tour of the Blue Origin Rocket Factory and Launch Complex at historic Cape Canaveral. That’s where New Glenn, the big rocket we talked about, is being built and will launch. Can you explain what the New Glenn rocket is and tell me some interesting technical aspects of how it works?
Jeff Bezos
(00:26:29)
Sure. New Glenn is a very large heavy-lift launch vehicle. It’ll take about 45 metric tons to LEO, very large class. It’s about half the thrust, a little more than half the thrust of the Saturn V rocket. So, it’s about 3.9 million pounds of thrust on liftoff. The booster has seven BE-4 engines. Each engine generates a little more than 550,000 pounds of thrust. The engines are fueled by liquefied natural gas, LNG as the fuel, and LOX as the oxidizer. The cycle is an oxygen-rich staged combustion cycle. It’s a cycle that was really pioneered by the Russians. It’s a very good cycle. And that engine is also going to power the first stage of the Vulcan rocket, which is the United Launch Alliance rocket. Then the second stage of New Glenn is powered by two BE-3U engines, which is an upper-stage variant of our New Shepard liquid hydrogen engine.

(00:27:44)
So, the BE-3U has 160,000 pounds of thrust, so two of those, 320,000 pounds of thrust. And hydrogen is a very good propellant for upper stages because it has very high ISP. It’s not a great propellant in my view for booster stages, because the stages then get physically so large. Hydrogen has very high ISP, but liquid hydrogen is not dense at all. So, if you need to store many thousands of pounds of liquid hydrogen, your liquid hydrogen tank gets very large. So, you get more benefit from the higher ISP, the specific impulse, on the second stage. And that stage carries less propellant, so you don’t get such geometrically-gigantic tanks. The Delta IV is an example of a vehicle that is all hydrogen. The booster stage is also hydrogen, and I think that it’s a very effective vehicle, but it never was very cost-effective. So, it’s operationally very capable but not very cost-effective.
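To make the numbers above concrete, here is a quick back-of-the-envelope check in Python. The thrust figures come from the conversation itself; the propellant densities are outside assumptions added purely for illustration (roughly 71 kg/m³ for liquid hydrogen and 430 kg/m³ for LNG):

```python
# Booster thrust: seven BE-4 engines at ~550,000 lbf each.
total_thrust_lbf = 7 * 550_000
print(total_thrust_lbf)  # 3850000, i.e. "about 3.9 million pounds"

# Why hydrogen upper stages need big tanks: for the same propellant
# mass, tank volume scales inversely with density.
rho_lh2 = 71    # kg/m^3, liquid hydrogen (assumed, not from the conversation)
rho_lng = 430   # kg/m^3, liquefied natural gas (assumed)

prop_mass_kg = 100_000  # an arbitrary illustrative propellant load
vol_lh2 = prop_mass_kg / rho_lh2
vol_lng = prop_mass_kg / rho_lng
print(round(vol_lh2 / vol_lng, 1))  # 6.1: LH2 needs ~6x the tank volume
```

The ratio is independent of the propellant mass chosen, which is the whole point: hydrogen's higher specific impulse is bought with far larger, heavier tankage, so it pays off on a small upper stage but not on a booster.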
Lex Fridman
(00:28:56)
So, size is also costly?
Jeff Bezos
(00:28:58)
Size is costly. So, it’s interesting. Rockets love to be big. Everything works better.
Lex Fridman
(00:29:05)
What do you mean by that? You’ve told me that before. It sounds epic, but what does it mean?
Jeff Bezos
(00:29:10)
I mean, when you look at the physics of rocket engines, and also when you look at parasitic mass… Let’s say you have an avionics system, so you have a guidance and control system, that is going to be about the same mass and size for a giant rocket as it is going to be for a tiny rocket. And so, that’s just parasitic mass that is very consequential if you’re building a very small rocket, but is trivial if you’re building a very large rocket. So, you have the parasitic mass thing. And then if you look at, for example, rocket engines have turbo pumps. They have to pressurize the fuel and the oxidizer up to a very high pressure level in order to inject it into the thrust chamber where it burns. And those pumps, like all rotating machines, in fact get more efficient as they get larger. So, really tiny turbo pumps are very challenging to manufacture, and any kind of gaps between the housing, for example, and the rotating impeller that pressurizes the fuel, there has to be some gap there. You can’t have those parts scraping against one another, and those gaps drive inefficiencies. And so, if you have a very large turbo pump, those gaps in percentage terms end up being very small. And so, there’s a bunch of things that you end up loving about having a large rocket and that you end up hating for a small rocket. But there’s a giant exception to this rule, and it is manufacturing. So, manufacturing large structures is very, very challenging. It’s a pain in the butt. And so, if you’re making a small rocket engine, you can move all the pieces by hand, you could assemble it on a table, one person can do it. You don’t need cranes and heavy lift operations and tooling and so on and so on. When you start building big objects, infrastructure, civil infrastructure, just like the launchpad and all this we went and visited, I took you to the launchpad. And you can see it’s so monumental.
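The turbopump clearance point lends itself to a toy calculation. The gap and impeller sizes below are hypothetical, chosen only to show why a roughly fixed machining clearance hurts a small pump far more than a large one:

```python
# A clearance between the housing and the rotating impeller is roughly
# fixed by what manufacturing can hold, while the impeller diameter
# scales with the rocket.
gap_mm = 0.2  # hypothetical fixed achievable clearance

for impeller_diameter_mm in (30, 300):
    leak_fraction = gap_mm / impeller_diameter_mm
    print(f"{impeller_diameter_mm} mm impeller: gap is "
          f"{leak_fraction:.2%} of diameter")
# 30 mm (small pump):  gap is 0.67% of diameter -> large relative leakage
# 300 mm (large pump): gap is 0.07% of diameter -> nearly negligible
```

Same absolute gap, a tenth of the relative leakage: that is the sense in which "rockets love to be big."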
Lex Fridman
(00:31:27)
Yeah, it is.
Jeff Bezos
(00:31:28)
And so, just these things become major undertakings, both from an engineering point of view, but also from a construction and cost point of view.
Lex Fridman
(00:31:37)
And even the foundation of the launchpad. I mean, this is Florida, isn’t it swamp land? How deep do you have to go?
Jeff Bezos
(00:31:44)
At Cape Canaveral, in fact, most launch pads are on beaches somewhere on the ocean side because you want to launch over water for safety reasons. Yes, you have to drive pilings, dozens and dozens and dozens of pilings, 50, 100, 150 feet deep to get enough structural integrity for these very large… Yes, these turn into major civil engineering projects.
Lex Fridman
(00:32:15)
I just have to say everything about that factory is pretty badass. You said tooling, the bigger it gets, the more epic it is.
Jeff Bezos
(00:32:22)
It does make it epic. It’s fun to look at. It’s extraordinary.
Lex Fridman
(00:32:26)
It’s humbling also because humans are so small compared to it.
Jeff Bezos
(00:32:29)
We are building these enormous machines that are harnessing enormous amounts of chemical power in very, very compact packages. It’s truly extraordinary.
Lex Fridman
(00:32:44)
But then, there’s all the different components and the materials involved. Is there something interesting that you can describe about the materials that comprise the rocket? So, it has to be as light as possible, I guess, whilst withstanding the heat and the harsh conditions?
Jeff Bezos
(00:33:03)
Yeah, I play a little game sometimes with other rocket people that I run into where I say, “What are the things that would amaze the 1960s engineers? What’s changed?” Because surprisingly, some of rocketry’s greatest hits have not changed. They would recognize immediately a lot of what we do today and it’s exactly what they pioneered back in the ’60s. But a few things have changed. The use of carbon composites is very different today. We can build very sophisticated … You saw our carbon tape laying machine that builds the giant fairings and we can build these incredibly light, very stiff fairing structures out of carbon composite material that they could not have dreamed of. The efficiency, the structural efficiency of that material is so high compared to any metallic material you might use or anything else. So that’s one.

(00:34:12)
Aluminum-lithium and the ability to friction stir weld aluminum-lithium. Do you remember the friction stir welding that I showed you?
Lex Fridman
(00:34:20)
Yes. It’s incredible.
Jeff Bezos
(00:34:21)
This is a remarkable technology that was invented decades ago, but has become very practical over just the last couple of decades. And instead of using heat to weld two pieces of metal together, it literally stirs the two pieces. There’s a pin that rotates at a certain rate and you put that pin between the two plates of metal that you want to weld together and then you move it at a very precise speed. And instead of heating the material, it heats it a little bit because of friction, but not very much. You can literally, immediately after friction stir welding, touch the material and it’s just barely warm. It literally stirs the molecules together. It’s quite extraordinary.
Lex Fridman
(00:35:06)
Relatively low temperature. And I guess with high temperatures, that makes a weak point.
Jeff Bezos
(00:35:11)
Exactly. So …
Lex Fridman
(00:35:13)
Amazing.
Jeff Bezos
(00:35:13)
… with traditional welding techniques, whatever the underlying strength characteristics of the material are, you end up with weak regions where you weld. And with friction stir welding, the welds are just as strong as the bulk material. So it really allows you … Let’s say you’re building a tank that you’re going to pressurize, a large liquid natural gas tank for our booster stage, for example. If you are welding that with traditional methods, you have to size those weld lands, the thickness of those pieces, with that knockdown for whatever damage you’re doing with the weld, and that’s going to add a lot of weight to that tank.
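A minimal sketch of the weld-land sizing logic described here, with made-up numbers (the 0.6 knockdown for a conventional fusion weld is an assumption for illustration, not a Blue Origin figure):

```python
# Required wall thickness scales inversely with joint efficiency:
# a weld only 60% as strong as the parent metal must be 1/0.6 as thick.
base_thickness_mm = 4.0       # thickness the bulk material alone would need
fusion_weld_knockdown = 0.6   # hypothetical: fusion weld at 60% of parent strength
fsw_knockdown = 1.0           # friction stir weld ~ parent-metal strength

fusion_land_mm = base_thickness_mm / fusion_weld_knockdown
fsw_land_mm = base_thickness_mm / fsw_knockdown
print(round(fusion_land_mm, 2), fsw_land_mm)  # 6.67 4.0
```

Multiplied along every weld seam of a large tank, that difference in weld-land thickness is the weight saving Bezos is pointing at.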
Lex Fridman
(00:35:54)
Even just looking at the fairings, the result of that, the complex shape that it takes and what it’s supposed to do is incredible, because some people don’t know: it’s on top of the rocket and it’s going to fall apart. That’s its task, but it has to stay strong sometimes and then disappear when it needs to …
Jeff Bezos
(00:36:14)
That’s right.
Lex Fridman
(00:36:15)
… which is a very difficult task.
Jeff Bezos
(00:36:17)
Yes. When you need something that needs to have 100% integrity until it needs to have 0% integrity, it needs to stay attached until it’s ready to go away, and then when it goes away, it has to go away completely. You use explosive charges for that and so it’s a very robust way of separating structure when you need to.
Lex Fridman
(00:36:40)
Exploding.
Jeff Bezos
(00:36:41)
Yeah, little tiny bits of explosive material and it will sever the whole connection.
Lex Fridman
(00:36:49)
So if you want to go from 100% structural integrity to zero as fast as possible, it’s explosives.
Jeff Bezos
(00:36:58)
Use explosives.
Lex Fridman
(00:36:59)
The entirety of this thing is so badass. Okay, so we’re back to the two stages. So the first stage is reusable.
Jeff Bezos
(00:37:06)
Yeah. Second stage is expendable. Second stage is liquid hydrogen, liquid oxygen. So we get to take advantage of the higher specific impulse. The first stage lands downrange on a landing platform in the ocean, comes back for maintenance and gets ready to do the next mission.
Lex Fridman
(00:37:27)
There’s a million questions, but also is there a path towards reusability for the second stage?
Jeff Bezos
(00:37:32)
There is and we know how to do that. Right now, we’re going to work on manufacturing that second stage to make it as inexpensive as possible. There are two paths for a second stage: make it reusable, or work really hard to make it inexpensive so you can afford to expend it. And that trade is actually not obvious which one is better.
Lex Fridman
(00:38:00)
Even in terms of cost, like time, cost-
Jeff Bezos
(00:38:01)
Even in terms of … And I’m talking about cost. Space, getting into orbit is a solved problem. We solved it back in the ’50s and ’60s.
Lex Fridman
(00:38:11)
You’re making it sound easy.
Jeff Bezos
(00:38:13)
The only interesting problem is dramatically reducing the cost of access to orbit, which is, if you can do that, you open up a bunch of new endeavors that lots of start-up companies everybody else can do. One of our missions is to be part of this industry and lower the cost to orbit, so that there can be a renaissance, a golden age of people doing all kinds of interesting things in space.
Lex Fridman
(00:38:47)
I like how you said getting to orbit is a solved problem. It’s just the only interesting thing is reducing the cost. You know how you can describe every single problem facing human civilization that way? The physicists would say, “Everything is a solved problem. We’ve solved everything. The rest is just,” what did Rutherford say, “just stamp collecting. It’s just the details.” Some of the greatest innovations and inventions and brilliance is in that cost reduction stage, right? And you’ve had a long career of cost reduction.
Jeff Bezos
(00:39:18)
For sure. What does cost reduction really mean? It means inventing a better way.
Lex Fridman
(00:39:24)
Yeah, exactly.
Jeff Bezos
(00:39:25)
Right? And when you invent a better way, you make the whole world richer. So whatever it was, I don’t know how many thousands of years ago, somebody invented the plow. And when they invented the plow, they made the whole world richer because they made farming less expensive. And so it is a big deal to invent better ways. That’s how the world gets richer.
Lex Fridman
(00:39:48)
So what are some of the biggest challenges on the manufacturing side, on the engineering side that you’re facing in working to get to the first launch of New Glenn?
Jeff Bezos
(00:40:01)
The first launch is one thing and we’ll do that in 2024, coming up in this coming year. The real thing that’s the bigger challenge is making sure that our factory is efficiently manufacturing at rate. So rate production. So consider: if you want to launch New Glenn 24 times a year, you need to manufacture an upper stage, since they’re expendable, twice a month. You need to do one every two weeks. So you need to have all of your manufacturing facilities and processes and inspection techniques and acceptance tests and everything operating at rate. And rate manufacturing is at least as difficult as designing the vehicle in the first place. And the same thing: every upper stage has two BE-3U engines.

(00:41:03)
So those engines, if you’re going to launch the vehicle twice a month, you need four engines a month. So you need an engine every week. That engine needs to be produced at rate, and there’s all of the things that you need to do that: all the right machine tools, all the right fixtures, the right people, process, etcetera. So it’s one thing to build a first article, right? To launch New Glenn for the first time, you need to produce a first article, but that’s not the hard part. The hard part is everything that’s going on behind the scenes to build a factory that can produce New Glenns at rate.
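The rate arithmetic in this answer is simple but worth writing out, using the cadence stated in the conversation:

```python
launches_per_year = 24
engines_per_upper_stage = 2  # two BE-3U engines per expendable upper stage

upper_stages_per_month = launches_per_year / 12
engines_per_month = upper_stages_per_month * engines_per_upper_stage
print(upper_stages_per_month)  # 2.0 -> one upper stage every two weeks
print(engines_per_month)       # 4.0 -> roughly one BE-3U engine per week
```

Every station in the factory, from machine tools to acceptance tests, has to keep up with that weekly engine drumbeat for the annual launch rate to hold.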
Lex Fridman
(00:41:47)
So the first one is produced in a way that enables the production of the second and third and the fourth and the fifth and sixth-
Jeff Bezos
(00:41:53)
You could think of the first article as pushing all of the rate manufacturing technology along. In other words, it’s the test article in a way that’s testing out your manufacturing technologies.
Lex Fridman
(00:42:13)
The manufacturing is the big challenge.
Jeff Bezos
(00:42:15)
Yes. I don’t want to make it sound like any of it is easy. The people who are designing the engines and all this, all of this is hard for sure, but the challenge right now is driving really hard to get to rate manufacturing and to do that in an efficient way, again back to our cost point. If you get to rate manufacturing in an inefficient way, you haven’t really solved the cost problem and maybe you haven’t really moved the state of the art forward. All this has to be about moving the state of the art forward. There are easier businesses to do. I always tell people, “Look, if you are trying to make money, start a salty snack food company or something.”
Lex Fridman
(00:42:56)
I’m going to write that idea down.
Jeff Bezos
(00:43:01)
Make the Lex Fridman Potato Chips.
Lex Fridman
(00:43:04)
Right. Don’t say it. People are going to steal it. But yeah, it’s hard.
Jeff Bezos
(00:43:10)
Do you see what I’m saying? There’s nothing easy about this business, but it’s its own reward. It’s fascinating, it’s worthwhile, it’s meaningful. I don’t want to pick on salty snack food companies, but I think it’s less meaningful. At the end of the day, you’re not going to have accomplished something amazing …
Lex Fridman
(00:43:33)
Yeah, there’s-
Jeff Bezos
(00:43:33)
… even if you do make a lot of money on it.
Lex Fridman
(00:43:35)
Yeah, there’s something fundamentally different about the “business of space exploration.”
Jeff Bezos
(00:43:41)
Yeah, for sure.
Lex Fridman
(00:43:42)
It’s a grand project of humanity.
Jeff Bezos
(00:43:44)
Yes, it’s one of humanity’s grand challenges, and especially as you look at going to the moon and going to Mars and building giant O’Neill colonies and unlocking all the things. I won’t live long enough to see the fruits of this, but the fruits of this come from building a road to space, getting the infrastructure. I’ll give you an analogy. When I started Amazon, I didn’t have to develop a payment system. It already existed. It was called the credit card. I didn’t have to develop a transportation system to deliver the packages. It already existed. It was called the Postal Service and Royal Mail and Deutsche Post and so on. So all this heavy lifting infrastructure was already in place and I could stand on its shoulders. And that’s why, when you look at the internet …

(00:44:40)
And by the way, another giant piece of infrastructure was around in the early days, I’m taking you back to 1994: people were using dial-up modems and it was piggybacking on top of the long distance phone network. That’s how the internet … That’s how people were accessing servers and so on. And again, if that hadn’t existed, it would’ve been hundreds of billions of CapEx to put that out there. No startup company could have done that. And so the reason, you see, if you look at the dynamism in the internet space over the last 20 years, it’s because two kids in a dorm room could start an internet company that could be successful and do amazing things, because they didn’t have to build heavy infrastructure. It was already there. And that’s what I want to do. I take my Amazon winnings and use that to build heavy infrastructure so that the next generation, the generation that’s my children and their children, those generations can then use that heavy infrastructure. Then there’ll be space entrepreneurs who start in their dorm room. That will be a marker of success, when you can have a really valuable space company started in a dorm room. Then we know that we’ve built enough infrastructure so that ingenuity and imagination can really be unleashed. I find that very exciting.
Lex Fridman
(00:46:11)
They will, of course, as kids do, take all of this hard infrastructure ability for granted.
Jeff Bezos
(00:46:16)
Of course.
Lex Fridman
(00:46:18)
That entrepreneurial spirit.
Jeff Bezos
(00:46:19)
That’s an inventor’s greatest dream, is that their inventions are so successful that they are one day taken for granted. Nobody thinks of Amazon as an invention anymore. Nobody thinks of customer reviews as an invention. We pioneered customer reviews, but now they’re so commonplace. Same thing with one-click shopping and so on, but that’s a compliment. You invent something that’s so used, so beneficially used by so many people that they take it for granted.
Lex Fridman
(00:46:49)
I don’t know about nobody. Every time I use Amazon, I’m still amazed, “How does this work, the logistics?”
Jeff Bezos
(00:46:55)
Well, that proves you’re a very curious explorer.
Lex Fridman
(00:46:57)
All right, all right, back to the rocket. Timeline, you said 2024. As it stands now, are both the first test launch and the launch of ESCAPADE explorers to Mars still possible in 2024?
Jeff Bezos
(00:47:11)
In 2024?
Lex Fridman
(00:47:12)
Yeah.
Jeff Bezos
(00:47:13)
Yeah, I think so. For sure, the first launch and then we’ll see if ESCAPADE goes on that or not. I think that the first launch for sure and I hope ESCAPADE too.
Lex Fridman
(00:47:23)
Hope-
Jeff Bezos
(00:47:24)
Well, I just don’t know which mission it’s actually going to be slated on. So we also have other things that might go on that first mission.
Lex Fridman
(00:47:31)
Oh, I got it. But you’re optimistic that the launches will still-
Jeff Bezos
(00:47:35)
Oh, the first launch. I’m very optimistic that the first launch of New Glenn will be in 2024 and I’m just not 100% certain what payload will be on that first launch.
Lex Fridman
(00:47:44)
Are you nervous about it?
Jeff Bezos
(00:47:46)
Are you kidding? I’m extremely nervous about it.
Lex Fridman
(00:47:51)
Oh, man.
Jeff Bezos
(00:47:52)
100%. Every launch I go to, for New Shepard, for other vehicles too, I’m always nervous for these launches. But yes, for sure, a first launch, to not be nervous about that would be some sign of derangement, I think.
Lex Fridman
(00:48:09)
Well, I got to visit the launch, man. It’s pretty … I mean, it’s epic.
Jeff Bezos
(00:48:14)
We have done a tremendous amount of ground testing, a tremendous amount of simulation. So a lot of the problems that we might find in flight have been resolved, but there are some problems you can only find in flight. So cross your fingers. I guarantee you you’ll have fun watching it no matter what happens.
Lex Fridman
(00:48:37)
100%. When the thing is fully assembled, it comes up-
Jeff Bezos
(00:48:41)
Yeah, the transporter erector.
Lex Fridman
(00:48:44)
It’s the erector, yeah.
Jeff Bezos
(00:48:45)
Just the transporter erector for a rocket of this scale is extraordinary.
Lex Fridman
(00:48:49)
That’s an incredible machine.
Jeff Bezos
(00:48:50)
The vehicle travels out horizontally and then comes up and-
Lex Fridman
(00:48:57)
Over a few hours?
Jeff Bezos
(00:48:58)
Yeah, it’s a beautiful thing to watch.
Lex Fridman
(00:49:00)
Speaking of which, if that makes you nervous, I don’t know if you remember, but you were aboard New Shepard on its first crewed flight. How was that experience? Were you terrified then?
Jeff Bezos
(00:49:20)
Strangely, I wasn’t.
Lex Fridman
(00:49:22)
When you rode the rocket, it wasn’t nerve-wracking? Okay.
Jeff Bezos
(00:49:24)
It’s true. I’ve watched other people riding the rocket and I’m more nervous than when I was inside the rocket myself. It was a difficult conversation to have with my mother when I told her I was going to go on the first one. And not only was I going to go, but I was going to bring my brother too. This is a tough conversation to have with a mom.
Lex Fridman
(00:49:44)
There’s a long pause when you told her.
Jeff Bezos
(00:49:47)
She’s like, “Both of you?” It was an incredible experience and we were laughing inside the capsule and we’re not nervous. The people on the ground were very nervous for us. Actually, one of the most emotionally powerful parts of the experience happened even before the flight. At 4:30 in the morning, my brother and I are getting ready to go to the launch site and Lauren is going to take us there in her helicopter and we’re getting ready to leave. And we go outside, outside the ranch house there in West Texas where the launch facility is, and all of our family, my kids and my brother’s kids and our parents and close friends are assembled there and they’re saying goodbye to us. And maybe they think they’re saying goodbye to us forever. We might not have felt that way, but it was obvious from their faces how nervous they were, that they felt that way. And it was powerful because it allowed us to see … It was almost like attending your own memorial service or something. You could feel how loved you were in that moment and it was really amazing.
Lex Fridman
(00:51:12)
Yeah, and there’s just an epic nature to it too.
Jeff Bezos
(00:51:17)
The ascent, the floating in zero gravity. I’ll tell you something very interesting, zero gravity feels very natural. I don’t know if it’s because it’s like a return to the womb or-
Lex Fridman
(00:51:31)
You just confirmed you’re an alien, but that’s all. I think that’s what you just said.
Jeff Bezos
(00:51:36)
It feels so natural to be in zero G. It was really interesting. And then what people talk about the overview effect and seeing Earth from space, I had that feeling very powerfully. I think everyone did. You see how fragile the Earth is. If you’re not an environmentalist, it will make you one. The great Jim Lovell quote, he looked back at the Earth from space and he said he realized, “You don’t go to heaven when you die. You go to heaven when you’re born.” That’s the feeling that people get when they’re in space. You see all this blackness, all this nothingness and there’s one gem of life and it’s Earth.
Lex Fridman
(00:52:15)
It is a gem. You’ve talked a lot about decision making throughout your time with Amazon. What was that decision like to be the first to ride New Shepard? Just before you talk to your mom, the pros and cons? Actually, as one human being, as a leader of a company on all fronts, what was that decision making like?
Jeff Bezos
(00:52:43)
I decided that … First of all, I knew the vehicle extremely well. I know the team who built it. I know the vehicle. I’m very comfortable with the escape system. We put as much effort into the escape system on that vehicle as we put into all the rest of the vehicle combined. It’s one of the hardest pieces of engineering in the entire New Shepard architecture.
Lex Fridman
(00:53:10)
Can you actually describe what do you mean by escape system? What’s involved?
Jeff Bezos
(00:53:13)
We have a solid rocket motor in the base of the crew capsule, so that if anything goes wrong on ascent, while the main rocket engine is firing, we can ignite this solid rocket motor in the base of the crew capsule and escape from the booster. It’s a very challenging system to build, design, validate, test, all of these things. It is the reason that I am comfortable letting anyone go on New Shepard. So the booster is as safe and reliable as we can make it, but we are harnessing … Whenever you’re talking about rocket engines, I don’t care what rocket engine you’re talking about, you’re harnessing such vast power in such a small compact geometric space. The power density is so enormous that it is impossible to ever be sure that nothing will go wrong.

(00:54:18)
And so the only way to improve safety is to have an escape system. And historically, human-rated rockets have had escape systems. Only the space shuttle did not, but Apollo had one. All of the previous Gemini, etcetera, they all had escape systems. And we have on New Shepard an unusual escape … Most escape systems are towers. We have a pusher escape system. So the solid rocket motor is actually embedded in the base of the crew capsule and it pushes and it’s reusable in the sense that, if we don’t use it, so if we have a nominal mission, we land with it. The tower systems have to be ejected at a certain point in the mission and so they get wasted even in a nominal mission.

(00:55:09)
And so again, cost really matters on these things, so we figured out how to have the escape system be reusable. In the event that it’s not used, so if we have a nominal mission, we can reuse it, and have it be a pusher system. It’s a very sophisticated thing. So I knew these things. You asked me about my decision to go, and so I know the vehicle very well, I know the people who designed it, I have great trust in them and in the engineering that we did. And I thought to myself, “Look, if I am not ready to go, then I wouldn’t want anyone to go.” A tourism vehicle has to be designed, in my view, to be as safe as one can make it. You can’t make it perfectly safe. It’s impossible. But people will do things. People take risk. They climb mountains, they skydive, they do deep underwater scuba diving and so on. People are okay taking risk. You can’t eliminate the risk, but because it’s a tourism vehicle, you have to do your utmost to eliminate those risks.

(00:56:16)
And I felt very good about the system. I think it’s one of the reasons I was so calm inside and maybe others weren’t as calm. They didn’t know as much about it as I did.
Lex Fridman
(00:56:26)
Who was in charge of engaging the escape system? Did you have-
Jeff Bezos
(00:56:28)
It’s automated. The escape system is …
Lex Fridman
(00:56:31)
Okay. I was visualizing-
Jeff Bezos
(00:56:33)
… completely automated. Automated is better because it can react so much faster.
Lex Fridman
(00:56:38)
Okay. So yeah, for tourism rockets, safety is a huge, huge, huge priority. For space exploration also, but a delta less.
Jeff Bezos
(00:56:46)
Yes. I think if you’re doing … There are human activities where we tolerate more risk if you’re saving somebody’s life, if you are engaging in real exploration. These are things where I personally think we would accept more risk in part because you have to.
Lex Fridman
(00:57:09)
Is there a part of you that’s frustrated by the rate of progress in Blue Origin?
Jeff Bezos
(00:57:15)
Blue Origin needs to be much faster. And it’s one of the reasons that I left my role as the CEO of Amazon a couple of years ago: I wanted to come in, and Blue Origin needs me right now. When I was the CEO of Amazon, my point of view on this was, “If I’m the CEO of a publicly traded company, it’s going to get my full attention.” And it’s just how I think about things. It was very important to me. I felt I had an obligation to all the stakeholders at Amazon to do that. I’m still the executive chair there, but I turned the CEO role over, and the primary reason I did that is so that I could spend time on Blue Origin, adding some energy, some sense of urgency. We need to move much faster, and we’re going to.
Lex Fridman
(00:58:14)
What are the ways to speed it up? You’ve talked about a lot of different ways at Amazon of removing barriers to progress, making everybody autonomous and self-reliant, all those kinds of things. Does that apply at Blue Origin or is-
Jeff Bezos
(00:58:37)
It does apply, and I’m leading this directly. We’re going to become the world’s most decisive company across any industry. At Amazon, ever since the beginning, I said, “We’re going to become the world’s most customer-obsessed company.” And no matter the industry, one day, people are going to come to Amazon from the healthcare industry and want to know, “How are you so customer-obsessed? How do you not just pay lip service to that, but actually do that?” All different industries should come and want to study us to see how we accomplish that. And the analogous thing at Blue Origin, the thing that will help us move faster, is we’re going to become the world’s most decisive company. We’re going to get really good at taking appropriate technology risk and making those decisions quickly, being bold on those things, and having the right culture that supports that.

(00:59:40)
You need people to be ambitious, technically ambitious, “If there are five ways to do something, we’ll study them, but let’s study them very quickly and make a decision.” We can always change our mind. Changing your mind, I talk about one-way doors and two-way doors, most decisions are two-way doors.
Lex Fridman
(01:00:03)
Can you explain that because I love that metaphor?
Jeff Bezos
(01:00:06)
If you make the wrong decision, if it’s a two-way door decision, you pick a door, you walk out and you spend a little time there. If it turns out to be the wrong decision, you can come back in and pick another door. Some decisions are so consequential and so important and so hard to reverse that they really are one-way door decisions. You go in that door, you’re not coming back. And those decisions have to be made very deliberately, very carefully. If you can think of yet another way to analyze the decision, you should slow down and do that. So when I was CEO of Amazon, I often found myself in the position of being the chief slow down officer, because somebody would be bringing me a one-way door decision and I would say, “Okay, I can think of three more ways to analyze that. So let’s go do that, because we are not going to be able to reverse this one easily. Maybe you can reverse it, but it’s going to be very costly and very time-consuming. We really have to get this one right from the beginning.”

(01:01:10)
And what happens, unfortunately, in companies, what can happen, is that you have a one-size-fits-all decision-making process where you end up using the heavyweight process on all decisions …
Lex Fridman
(01:01:28)
For everything, yeah.
Jeff Bezos
(01:01:29)
… Including the lightweight ones, the two-way door decisions. Two-way door decisions should mostly be made by single individuals or by very small teams deep in the organization. And one-way door decisions are the irreversible ones. Those are the ones that should be elevated up to the senior-most executives who should slow them down and make sure that the right thing is being done.
Lex Fridman
(01:01:55)
Yeah, part of the skill here is to know the difference between one-way and two-way. I think you mentioned …
Jeff Bezos
(01:01:55)
Yes.
Lex Fridman
(01:02:01)
I think you mentioned Amazon Prime, the decision to create Amazon Prime as a one-way door. It’s unclear if it is or not, but it probably is and it’s a really big risk to go there.
Jeff Bezos
(01:02:14)
There are a bunch of decisions like that, where changing the decision is going to be very, very complicated. Some of them are technical decisions too, because some technical decisions are like quick-drying cement: once you make them, it gets really hard to change. Choosing which propellants to use in a vehicle, selecting LNG for the booster stage and selecting hydrogen for the upper stage, that has turned out to be a very good decision. But if you changed your mind, that would be a very big setback. Do you see what I’m saying?
Lex Fridman
(01:02:51)
Yeah, yeah.
Jeff Bezos
(01:02:52)
So that’s the kind of decision you scrutinize very, very carefully. Other things just aren’t like that. Most decisions are not that way. Most decisions should be made by single individuals and done quickly in the full understanding that you can always change your mind.
Lex Fridman
(01:03:11)
One of the things I really liked, perhaps for two-way door decisions, is the “I disagree and commit” phrase. So somebody brings up an idea to you; if it’s a two-way door, you state that you don’t understand enough to agree, but you still back them. I’d love for you to explain that-
Jeff Bezos
(01:03:35)
Well, yes, disagree and commit is a really important principle that saves a lot of arguing. So-
Lex Fridman
(01:03:39)
Yeah, I’m going to use that in my personal life, “I disagree, but commit.”
Jeff Bezos
(01:03:44)
It’s very common in any endeavor in life, in business, anywhere you have teammates: you have a teammate and the two of you disagree. At some point, you have to make a decision. And in companies, we tend to organize hierarchically. Whoever’s the more senior person ultimately gets to make the decision. So ultimately, the CEO gets to make that decision. And the CEO may not always make the decision that they agree with. So I would be the one who would disagree and commit. One of my direct reports would very much want to do something in a particular way. I would think it was a bad idea. I would explain my point of view. They would say, “Jeff, I think you’re wrong and here’s why,” and we would go back and forth.

(01:04:35)
And I would often say, “You know what? I don’t think you’re right, but I’m going to gamble with you, and you’re closer to the ground truth than I am. I’ve known you for 20 years. You have great judgment. I don’t know that I’m right either. Not really, not for sure. All these decisions are complicated. Let’s do it your way.” But at least then you’ve made a decision, and I’m agreeing to commit to that decision. So I’m not going to be second-guessing it. I’m not going to be sniping at it. I’m not going to be saying, “I told you so.” I’m going to try actively to help make sure it works. That’s a really important teammate behavior.

(01:05:18)
Dispute resolution is a really interesting thing on teams. There are so many ways two people can disagree about something, even… I’m assuming in this case everybody is well-intentioned. They just have a very different opinion about what the right decision is. And in our society and inside companies, we have a bunch of mechanisms that we use to resolve these kinds of disputes. A lot of them are, I think, really bad. So an example of a really bad way of coming to agreement is compromise. So compromise: we’re in a room here and I could say, “Lex, how tall do you think this ceiling is?”
Jeff Bezos
(01:06:00)
And you’d be like, “I don’t know, Jeff, maybe 12 feet tall.” And I would say, “I think it’s 11 feet tall.” And then we’d say, “You know what? Let’s just call it 11 and a half feet.” That’s compromise. Instead, the right thing to do is to get a tape measure or figure out some way of actually measuring. But getting that tape measure and figuring out how to get it to the top of the ceiling and all these things, that requires energy. The advantage of compromise as a resolution mechanism is that it’s low energy, but it doesn’t lead to truth. And so in things like the height of the ceiling, where truth is a knowable thing, you shouldn’t allow compromise to be used when you can know the truth.

(01:06:51)
Another really bad resolution mechanism that happens all the time is just: who’s more stubborn? Let’s say two executives disagree, and they just have a war of attrition, and whichever one gets exhausted first capitulates to the other one. Again, you haven’t arrived at truth, and this is very demoralizing. So this is where escalation comes in. I try to ask people on my team and say, “Never get to a point where you are resolving something by who gets exhausted first. Escalate that. I’ll help you make the decision.” Because that’s so de-energizing and such a terrible, lousy way to make a decision.
Lex Fridman
(01:07:40)
So you want to get to the resolution as quickly as possible, because that ultimately leads to a high velocity of decisions?
Jeff Bezos
(01:07:45)
Yes, and you want to try to get as close to truth as possible. Exhausting the other person is not truth seeking.
Lex Fridman
(01:07:53)
Yes.
Jeff Bezos
(01:07:54)
And compromise is not truth seeking. And there are a lot of cases where no one knows the real truth and that’s where disagree and commit can come in, but escalation is better than war of attrition. Escalate to your boss and say, “Hey, we can’t agree on this. We like each other. We’re respectful of each other, but we strongly disagree with each other. We need you to make a decision here so we can move forward.” But decisiveness, moving forward quickly on decisions, as quickly as you responsibly can is how you increase velocity. Most of what slows things down is taking too long to make decisions at all scale levels. So it has to be part of the culture to get high velocity. Amazon has a million and a half people and the company is still fast. We’re still decisive, we’re still quick, and that’s because the culture supports that.
Lex Fridman
(01:08:53)
At every scale in a distributed way-
Jeff Bezos
(01:08:53)
Yes.
Lex Fridman
(01:08:56)
Try to maximize the velocity of decisions.
Jeff Bezos
(01:08:58)
Exactly.

Lunar program

Lex Fridman
(01:08:59)
You’ve mentioned the lunar program. Let me ask you about that. There’s a lot going on there and you haven’t really talked about it much. So in addition to the Artemis program with NASA, Blue is doing its own lander program. Can you describe it? There’s a sexy picture on Instagram with one of them. Is it the MK1, I guess?
Jeff Bezos
(01:09:20)
Yeah, the Mark 1. The picture here is me with Bill Nelson, the NASA Administrator.
Lex Fridman
(01:09:26)
Just to clarify, the lander is the sexy thing about the [inaudible 01:09:29]. I really want to clarify that.
Jeff Bezos
(01:09:32)
I know it’s not me. I know it was either the lander or Bill.
Lex Fridman
(01:09:34)
Okay. I love Bill, but-
Jeff Bezos
(01:09:37)
Thank you for clarifying.
Lex Fridman
(01:09:37)
Okay.
Jeff Bezos
(01:09:40)
Yes, the Mark 1 lander is designed to take 3,000 kilograms of cargo to the surface of the moon. It’s an expendable lander: it lands on the moon and stays there, taking 3,000 kilograms to the surface. It can be launched on a single New Glenn flight, which is very important. So it’s a relatively simple architecture. The human landing system lander, that’s called the Mark 2. The Mark 1 is also fueled with liquid hydrogen, and for high-energy missions like landing on the surface of the moon, the high specific impulse of hydrogen is a very big advantage.

(01:10:24)
The disadvantage of hydrogen has always been that since it’s such a deep cryogen, it’s not storable. So it’s constantly boiling off and you’re losing propellant because it’s boiling off. And so what we’re doing as part of our lunar program is developing solar-powered cryo coolers that can actually make hydrogen a storable propellant for deep space. And that’s a real game-changer. It’s a game-changer for any high energy mission. So to the moon, but to the outer planets, to Mars, everywhere.
Lex Fridman
(01:11:00)
So the idea with both Mark 1 and Mark 2 is that New Glenn can carry them from the surface of earth to the surface of the moon?
Jeff Bezos
(01:11:12)
Exactly. So the Mark 1 is expendable. The lunar lander we’re developing for NASA, the Mark 2 lander, that’s part of the Artemis program; they call it the Sustaining Lander Program. So that lander is designed to be reusable. It can land on the surface of the moon in a single-stage configuration and then take off again. If you look at the Apollo program, the lunar lander in Apollo was really two stages. It would land on the surface, then it would leave the descent stage on the surface of the moon, and only the ascent stage would go back up into lunar orbit, where it would rendezvous with the command module.

(01:11:56)
Here, what we’re doing is we have a single-stage lunar lander that carries down enough propellant so that it can bring the whole thing back up, so that it can be reused over and over. And the point of doing that, of course, is to reduce cost so that you can make lunar missions more affordable over time, which is one of NASA’s big objectives, because the whole point of Artemis is to go back to the moon, but this time to stay. Back in the Apollo program, we went to the moon six times and then ended the program, and it really was too expensive to continue.
Lex Fridman
(01:12:35)
And so there are a few questions there, but one is: how do you stay on the moon? What ideas do you have about sustaining life, where a few folks can stay there for prolonged periods of time?
Jeff Bezos
(01:12:51)
Well, one of the things we’re working on is using lunar resources, like lunar regolith, to manufacture commodities and even solar cells on the surface of the moon. We’ve already built a solar cell that is completely made from lunar regolith simulant, and this solar cell is only about 7% power efficient. So it’s very inefficient compared to the more advanced solar cells that we make here on earth. But if you can figure out how to make a practical solar cell factory that you can land on the surface of the moon, where the raw material for those solar cells is simply lunar regolith, then you can just continue to churn out solar cells on the surface of the moon and have lots of power on the surface of the moon. That will make it easier for people to live on the moon.

(01:13:51)
Similarly, we’re working on extracting oxygen from lunar regolith. Lunar regolith by weight has a lot of oxygen in it, but it’s bound very tightly as oxides with other elements, so you have to separate the oxygen, which is very energy intensive. So that also could work together with the solar cells. And then ultimately, we may be able to find practical quantities of ice in the permanently shadowed craters on the poles of the moon. We know there is water ice in those craters, and we know that we can break that water down with electrolysis into hydrogen and oxygen. And then you’d not only have oxygen, but you’d also have a very good, high-efficiency propellant in hydrogen.

(01:14:57)
So there’s a lot we can do to make the moon more sustainable over time, but the very first step, the gate that all of that has to go through is we need to be able to land cargo and humans on the surface of the moon at an acceptable cost.
Lex Fridman
(01:15:16)
To fast-forward a little bit, is there any chance Jeff Bezos steps foot on the moon and on Mars, one or the other or both?
Jeff Bezos
(01:15:27)
It’s very unlikely. I think it’s probably something that gets done by future generations. In my lifetime, that’s probably going to be done by professional astronauts, sadly. I would love to sign up for that mission. So don’t count me out yet, Lex. Give me a fighting shot here, maybe. But I think if we are placing reasonable bets on such a thing, in my lifetime, that will continue to be done by professional astronauts.
Lex Fridman
(01:15:59)
So these are risky, difficult missions?
Jeff Bezos
(01:16:02)
And probably missions that require a lot of training. You are going there for a very specific purpose to do something. We’re going to be able to do a lot on the moon too with automation. So in terms of setting up these factories and doing all that, we are sophisticated enough now with automation that we probably don’t need humans to tend those factories and machines. So there’s a lot that’s going to be done in both modes.
Lex Fridman
(01:16:28)
So I have to ask the bigger picture question about the two companies pushing humanity forward out towards the stars, Blue Origin and SpaceX. Are you competitors, collaborators? Which and to what degree?
Jeff Bezos
(01:16:44)
Well, I would say just like the internet is big and there are lots of winners at all scale levels, there are half a dozen giant companies that the internet has made, but there are a bunch of medium-sized companies and a bunch of small companies, all successful, all with profit streams, all driving great customer experiences. That’s what we want to see in space, that kind of dynamism. And space is big. There’s room for a bunch of winners, and it’s going to happen at all scale levels. And so SpaceX is going to be successful for sure. I want Blue Origin to be successful, and I hope there are another five companies right behind us.
Lex Fridman
(01:17:25)
But I spoke to Elon a few times recently about you, about Blue Origin, and he was very positive about you as a person and very supportive of all the efforts you’ve been leading at Blue. What are your thoughts? You’ve worked with a lot of leaders at Amazon, at Blue. What are your thoughts about Elon as a human being and a leader?
Jeff Bezos
(01:17:46)
Well, I don’t really know Elon very well. I know his public persona, but I also know you can’t know anyone by their public persona. It’s impossible. You may think you do, but I guarantee you don’t. So I don’t really know. You know Elon way better than I do, Lex, but in terms of judging by the results, he must be a very capable leader. There’s no way you could have Tesla and SpaceX without being a capable leader. It’s impossible.
Lex Fridman
(01:18:22)
Yeah, I hope you guys hang out sometimes, shake hands and sort of have a kind of friendship that would inspire just the entirety of humanity, because what you’re doing is one of the big grand challenges ahead for humanity.
Jeff Bezos
(01:18:40)
Well, I agree with you and I think in a lot of these endeavors we’re very like-minded. So I’m not saying we’re identical, but I think we’re very like-minded. And so I love that idea.

Amazon

Lex Fridman
(01:18:56)
All right, going back to sexy pictures on your Instagram, there’s a video of you from the early days of Amazon, giving a tour of your “offices.” I think your dad is holding the camera.
Jeff Bezos
(01:19:10)
He is. Yeah, I know, right? Yes. This is what? The giant orange extension cord.
Lex Fridman
(01:19:12)
And you’re explaining the genius of the extension cord and how this is a desk and the CRT monitor, and that’s where all the magic happened. I forget what your dad said, but this is the center of it all. So what was it like? What was going through your mind at that time? You left a good job in New York and took this leap. Were you excited? Were you scared?
Jeff Bezos
(01:19:37)
So excited and scared, anxious. I thought the odds of success were low. I told all of our early investors that I thought there was a 30% chance of success, by which I just mean getting your money back, not what actually happened. Because that’s the truth: every startup company is unlikely to work. It’s helpful to be in reality about that, but that doesn’t mean you can’t be optimistic. So you have to have this duality in your head. On the one hand, you know what the baseline statistics say about startup companies; on the other hand, you have to ignore all of that and just be 100% sure it’s going to work. And you’re doing both things at the same time; you’re holding that contradiction in your head.

(01:20:24)
But it was so exciting. From 1994 when the company was founded to 1995 when we opened our doors, all the way until today, I find Amazon so exciting. And that doesn’t mean… It’s full of pain, full of problems. It’s like there’s so many things that need to be resolved and worked and made better and et cetera. But on balance, it’s so fun. It’s such a privilege. It’s been such a joy. I feel so grateful that I’ve been part of that journey. It’s just been incredible.
Lex Fridman
(01:21:04)
So in some sense, you don’t want a single day of comfort. You’ve written about this many times. We’ll talk about your writing, which I would highly recommend people read, just the letters to shareholders. So, the idea of day one thinking: I think you first wrote about it in the ’97 letter to shareholders, and you also wrote about it in, sad to say, your last letter to shareholders as CEO. And you said that, “Day two is stasis, followed by irrelevance, followed by excruciating, painful decline, followed by death,” and that is why it’s always day one. Can you explain this day one idea? It’s a really powerful way to describe the beginning and the journey of Amazon.
Jeff Bezos
(01:21:56)
It’s really a very simple, and I think age-old idea about renewal and rebirth and every day is day one. Every day you are deciding what you’re going to do and you are not trapped by what you were or who you were or any self-consistency. Self-consistency even can be a trap. And so day one thinking is we start fresh every day and we get to make new decisions every day about invention, about customers, about how we’re going to operate. Even as deeply as what our principles are, we can go back to that. It turns out we don’t change those very often, but we change them occasionally.

(01:22:49)
And when we work on programs at Amazon, we often make a list of tenets. The tenets are not quite principles; they’re a little more tactical than principles, but they’re the main ideas that we want the program to embody, whatever those are. And one of the things that we do is we write, “These are the tenets for this program,” and in parentheses we always put, “unless you know a better way.” And that idea, “unless you know a better way,” is so important because you never want to get trapped by dogma. You never want to get trapped by history. It doesn’t mean you discard history or ignore it. There’s so much value in what has worked in the past, but you can’t be blindly following what you’ve done. And that’s the heart of day one: you’re always starting afresh.
Lex Fridman
(01:23:51)
And to the question of how to fend off day two, you said, “Such a question can’t have a simple answer,” as you’re saying. “There will be many elements, multiple paths, and many traps. I don’t know the whole answer, but I may know bits of it. Here’s a starter pack of essentials, maybe others come to mind. For day one, defense, customer obsession, a skeptical view of proxies, the eager adoption of external trends and high velocity decision-making.”

(01:24:19)
So we talked about high velocity decision-making, that’s more difficult than it sounds. So maybe you can pick one that stands out to you as you can comment on. Eager adoption of external trends, high velocity decision-making, skeptical view of proxies. How do you fight off day two?
Jeff Bezos
(01:24:36)
Well, I’ll talk about the one that is maybe in some ways the hardest to understand: the skeptical view of proxies. One of the things that happens in business, probably in anything where you have an ongoing program and something has been underway for a number of years, is you develop certain things that you’re managing to. The typical case would be a metric, and that metric isn’t the real underlying thing. So maybe it’s an efficiency metric around customer contacts per unit sold, or something like that. If you sell a million units, how many customer contacts do you get, or how many returns do you get? And so on and so on.

(01:25:30)
And so what happens is a little bit of a kind of inertia sets in. Somebody a long time ago invented that metric; they decided, “We need to watch customer returns per unit sold as an important metric,” and the person who invented that metric and decided it was worth watching had a reason why they chose it. And then fast-forward five years, and that metric is the proxy.
Lex Fridman
(01:26:02)
The proxy for truth, I guess.
Jeff Bezos
(01:26:04)
The proxy for truth. Let’s say in this case it’s a proxy for customer happiness, but that metric is not actually customer happiness. It’s a proxy for customer happiness. The person who invented the metric understood that connection. Five years later, a kind of inertia can set in and you forget the truth behind why you were watching that metric in the first place. And the world shifts a little and now that proxy isn’t as valuable as it used to be or it’s missing something. And you have to be on alert for that. You have to know, “Okay, I don’t really care about this metric. I care about customer happiness and this metric is worth putting energy into and following and improving and scrutinizing, only in so much as it actually affects customer happiness.”

(01:27:03)
And so you’ve got to constantly be on guard and it’s very, very common. This is a nuanced problem. It’s very common, especially in large companies, that they’re managing to metrics that they don’t really understand. They don’t really know why they exist, and the world may have shifted out from under them a little and the metrics are no longer as relevant as they were when somebody 10 years earlier invented the metric.
Lex Fridman
(01:27:29)
That is a nuance, but that’s a big problem. Right?
Jeff Bezos
(01:27:33)
It’s a huge problem.
Lex Fridman
(01:27:34)
There’s something so compelling to have a nice metric to try to optimize.
Jeff Bezos
(01:27:38)
Yes. And by the way, you do need metrics.
Lex Fridman
(01:27:41)
Yes, you do.
Jeff Bezos
(01:27:41)
You can’t ignore them. You want them, but you just have to be constantly on guard. A way to slip into day two thinking would be to manage your business to metrics that you don’t really understand, that you’re not really sure why they were invented in the first place, and that you’re not sure are still as relevant as they used to be.
Lex Fridman
(01:28:03)
What does it take to be the guy or gal who brings up the point that this proxy might be outdated? I guess what does it take to have a culture that enables that in the meeting? Because that’s a very uncomfortable thing to bring up at a meeting. “We all showed up here, it’s a Friday.”
Jeff Bezos
(01:28:21)
You have just asked a million-dollar question. So if I generalize what you’re asking, you are talking in general about truth-telling and we humans are not really truth-seeking animals. We are social animals.
Lex Fridman
(01:28:42)
Yeah, we are.
Jeff Bezos
(01:28:44)
Take yourself back in time 10,000 years, and you’re in a small village. If you go along to get along, you can survive, you can procreate. If you’re the village truth-teller, you might get clubbed to death in the middle of the night. Truths often don’t want to be heard, because important truths can be uncomfortable, they can be awkward, they can be exhausting.
Lex Fridman
(01:29:12)
Impolite and all that kind of stuff.
Jeff Bezos
(01:29:14)
Yes, challenging. They can make people defensive even if that’s not the intent. But any high performing organization, whether it’s a sports team, a business, a political organization, an activist group, I don’t care what it is, any high performing organization has to have mechanisms and a culture that supports truth-telling. One of the things you have to do is you have to talk about that. You have to talk about the fact that it takes energy to do that. You have to talk to people, you have to remind people, “It’s okay that it’s uncomfortable.” Literally tell people, “It’s not what we’re designed to do as humans.” It’s kind of a side effect. We can do that, but it’s not how we survive. We mostly survive by being social animals and being cordial and cooperative, and that’s really important.

(01:30:10)
And so science is all about truth-telling. It’s actually a very formal mechanism for trying to tell the truth. And even in science, you find that it’s hard to tell the truth. Even though you’re supposed to have a hypothesis and test it, find data and reject the hypothesis, and so on, it’s not easy.
Lex Fridman
(01:30:36)
But even in science, there’s like the senior scientists and the junior scientists.
Jeff Bezos
(01:30:36)
Correct.
Lex Fridman
(01:30:41)
And then there’s a hierarchy of humans where somehow seniority matters in the scientific process, which it should not.
Jeff Bezos
(01:30:49)
Yes, and that’s true inside companies too. And so you want to set up your culture so that the most junior person can overrule the most senior person if they have data. And that really is about trying to… There are little things you can do. So for example, in every meeting that I attend, I always speak last. And I know from experience that if I speak first, even very strong-willed, highly intelligent, high-judgment participants in that meeting will wonder, “Well, if Jeff thinks that… I came into this meeting thinking one thing, but maybe I’m not right.” And so you can do little things: if you’re the most senior person in the room, go last; let everybody else go first. In fact, ideally, have the most junior person go first, then the second-most junior, and go in order from most junior to most senior, so that you can hear everyone’s opinion in an unfiltered way. Because we really do, we actually literally change our opinions: if somebody who you really respect says something, it makes you change your mind a little.
Lex Fridman
(01:32:17)
So you’re saying implicitly or explicitly, give permission for people to have a strong opinion, as long as it’s backed by data.
Jeff Bezos
(01:32:27)
Yes, and sometimes it can even… By the way, a lot of our most powerful truths turn out to be hunches, they turn out to be based on anecdotes, they’re intuition based. And sometimes you don’t even have strong data, but you may know the person well enough to trust their judgment. You may feel yourself leaning in. It may resonate with a set of anecdotes you have, and then you may be able to say, “Something about that feels right. Let’s go collect some data on that. Let’s try to see if we can actually know whether it’s right. But for now, let’s not disregard it. It feels right.”

(01:33:06)
You can also fight inherent bias. There’s an optimism bias. If there are two interpretations of a new set of data and one of them is happy and one of them is unhappy, it’s a little dangerous to jump to the conclusion that the happy interpretation is right. You may want to compensate for that human bias of trying to find the silver lining and say, “Look, that might be good, but I’m going to go with it’s bad for now until we’re sure.”
Lex Fridman
(01:33:36)
So speaking of happiness bias, data collection and anecdotes, you have to… How’s that for a transition? You have to tell me the story of the call you made, the customer service call you made to demonstrate a point about wait times?
Jeff Bezos
(01:33:57)
Yeah. This is very early in the history of Amazon.
Lex Fridman
(01:34:00)
Yes.
Jeff Bezos
(01:34:00)
And we were going over a weekly business review and a set of documents, and I have a saying, which is: when the data and the anecdotes disagree, the anecdotes are usually right. It doesn’t mean you then just slavishly go follow the anecdotes. It means you go examine the data, because it’s usually not that the data is being miscollected; it’s usually that you’re not measuring the right thing. And so if you have a bunch of customers complaining about something, and at the same time your metrics look like they shouldn’t be complaining, you should doubt the metrics.

(01:34:43)
And an early example of this was we had metrics that showed that our customers were waiting, I think less than, I don’t know, 60 seconds when they called a 1-800 number to get phone customer service. The wait time was supposed to be less than 60 seconds, but we had a lot of complaints that it was longer than that. And anecdotally it seemed longer than that. I would call customer service myself. And so one day we’re in a meeting, we’re going through the WBR, the weekly business review, and we get to this metric in the deck, and the guy who leads customer service is defending the metric. And I said, “Okay, let’s call.” Picked up the phone, and I dialed the 1-800 number and called customer service, and we just waited in silence.
Lex Fridman
(01:35:39)
What did it turn out to be?
Jeff Bezos
(01:35:40)
Oh, it was really long, more than 10 minutes, I think.
Lex Fridman
(01:35:42)
Oh, wow.
Jeff Bezos
(01:35:43)
It was many minutes. And so it dramatically made the point that something was wrong with the data collection. We weren’t measuring the right thing, and that set off a whole chain of events where we started measuring it right. And that’s an example, by the way, of truth-telling: that’s an uncomfortable thing to do, but you have to seek truth even when it’s uncomfortable, and you have to get people’s attention, and they have to buy into it, and they have to get energized around really fixing things.

Principles

Lex Fridman
(01:36:16)
So that speaks to the obsession with the customer experience. So one of the defining aspects of your approach to Amazon is just being obsessed with making customers happy. I think companies sometimes say that, but Amazon is really obsessed with that. I think there’s something really profound to that, which is seeing the world through the eyes of the customer, like the customer experience, the human being that’s using the product, that’s enjoying the product, the subtle little things that make up their experience. How do you optimize those?
Jeff Bezos
(01:36:55)
This is another really good and deep question, because there are big things that are really important to manage, and then there are small things. Internally at Amazon, we call them paper cuts. So we’re always working on the big things, and, if you ask me, most of the energy goes into the big things, as it should, and you can identify the big things. And I would encourage anybody, if anybody listening to this is an entrepreneur, has a small business, whatever: think about the things that are not going to change over 10 years. Those are probably the big things.

(01:37:38)
So I know in our retail business at Amazon, 10 years from now, customers are still going to want low prices. I know they’re still going to want fast delivery, and I just know they’re still going to want big selection. So it’s impossible to imagine a scenario where 10 years from now where a customer says, “I love Amazon, I just wish the prices were a little higher,” or, “I love Amazon, I just wish you delivered a little more slowly.” So when you identify the big things you can tell they’re worth putting energy into because they’re stable in time.

(01:38:10)
Okay, but you’re asking about something a little different, which is in every customer experience, there are those big things. And by the way, it’s astonishingly hard to focus even on just the big things. So even though they’re obvious, they’re really hard to focus on. But in addition to that, there are all these little tiny customer experience deficiencies, and we call those paper cuts. We make long lists of them. And then we have dedicated teams that go fix paper cuts because the teams working on the big issues never get to the paper cuts. They never work their way down the list to get to… They’re working on big things, as they should and as you want them to. And so you need special teams who are charged with fixing…
Jeff Bezos
(01:39:00)
Special teams who are charged with fixing paper cuts.
Lex Fridman
(01:39:04)
Where would you put on the paper cut spectrum the Buy now with the 1-Click button? Which is, I think, pretty genius. So to me, okay, my interaction with things I love on the internet, there’s things I do a lot. Maybe representing a regular human, I would love for those things to be frictionless. For example, booking airline tickets, just saying. But buying a thing with one click, making that experience frictionless, intuitive, all aspects of that, that just fundamentally makes my life better, not just in terms of efficiency, in terms of some kind of-
Jeff Bezos
(01:39:49)
Cognitive load.
Lex Fridman
(01:39:50)
… Yeah, cognitive load and inner peace and happiness. Because, first of all, buying stuff is a pleasant experience. Having enough money to buy a thing and then buying it is a pleasant experience. And having pain around that somehow just ruins a beautiful experience. And I guess all I’m saying, as a person who loves good ideas, is: is that a paper cut, a solution to a paper cut?
Jeff Bezos
(01:40:17)
Yes. So that particular thing is probably a solution to a number of paper cuts. So if you go back and look at our order pipeline and how people shopped on Amazon before we invented 1-Click shopping, there was more friction. There was a whole series of paper cuts and that invention eliminated a bunch of paper cuts. And I think you’re absolutely right by the way, that when you come up with something like 1-Click shopping, again, this is so ingrained in people now, I’m impressed that you even notice it. Most people-
Lex Fridman
(01:40:54)
Every time I click the button, I just-
Jeff Bezos
(01:40:54)
… most people never notice.
Lex Fridman
(01:40:55)
… just a surge of happiness.
Jeff Bezos
(01:41:00)
In the perfect invention for the perfect moment in the perfect context, there is real beauty. It is actual beauty and it feels good. It’s emotional. It’s emotional for the inventor, it’s emotional for the team that builds it. It’s emotional for the customer. It’s a big deal and you can feel those things.
Lex Fridman
(01:41:23)
But to keep coming up with that idea, with those kinds of ideas, I guess is the day one thinking effort.
Jeff Bezos
(01:41:29)
Yeah, and you need a big group of people who feel that kind of satisfaction with creating that kind of beauty.
Lex Fridman
(01:41:38)
There’s a lot of books written about you. There’s a book Invent & Wander where Walter Isaacson does an intro. It’s mostly collected writings of yours. I’ve read that. I also recommend people check out the Founders Podcast that covers you a lot and analyzes different business advice you’ve given over the years. I bring all that up because you’ve said that books are an antidote for short attention spans. And I forget how it was phrased, but that when you were thinking about the Kindle, you were thinking about how technology changes us.
Jeff Bezos
(01:42:20)
Changes us. We co-evolve with our tools. So we invent new tools and then our tools change us.
Lex Fridman
(01:42:30)
Which is fascinating to think about.
Jeff Bezos
(01:42:32)
It goes in a circle
Lex Fridman
(01:42:33)
And there’s some aspect, even just inside business, where you don’t just make the customer happy, but you also have to think about where is this going to take humanity if you zoom out a bit?
Jeff Bezos
(01:42:45)
A hundred percent, and you can feel your brain. Brains are plastic and you can feel your brain getting reprogrammed. I remember the first time this happened to me was when Tetris first came on the scene. Anybody who’s been a game player has this experience where you close your eyes to lay down to go to sleep and you see all the little blocks moving and you’re kind of rotating them in your mind, and you can just tell as you walk around the world that you have rewired your brain to play Tetris. But that happens with everything. I think we still have yet to see the full repercussions of this, I fear, but I think one of the things that we’ve done online, largely because of social media, is we have trained our brains to be really good at processing super short form content.

(01:43:52)
Your podcast flies in the face of this. You do these long format things.
Lex Fridman
(01:43:59)
Books do too.
Jeff Bezos
(01:44:00)
And reading books is a long format thing, and if something is convenient, we do more of it. We carry around in our pocket a phone, and one of the things that phone does, for the most part, is act as an attention-shortening device, because most of the things we do on our phone shorten our attention spans. And I’m not even going to say we know for sure that that’s bad, but I do think it’s happening. That’s one of the ways we’re co-evolving with that tool. But I think it’s important to spend some of your time and some of your life doing long attention span things.
Lex Fridman
(01:44:41)
Yeah, I think you’ve spoken about the value in your own life of focus, of singular focus on a thing for prolonged periods of time, and that’s certainly what books do and that’s certainly what that piece of technology does. But I bring all that up to ask you about another piece of technology, AI, that has the potential to have various trajectories to have an impact on human civilization. How do you think AI will change us?
Jeff Bezos
(01:45:14)
If you’re talking about generative AI, large language models, things like ChatGPT and its soon successors, these are incredibly powerful technologies, soon to be even more powerful. To believe otherwise is to bury your head in the sand. It’s interesting to me that large language models in their current form are not inventions, they’re discoveries. The telescope was an invention, but looking through it at Jupiter, knowing that it had moons, was a discovery. My God, it has moons. And that’s what Galileo did. And so this is closer to discovery on that spectrum. We know exactly what happens with a 787, it’s an engineered object. We designed it. We know how it behaves. We don’t want any surprises. Large language models are much more like discoveries. We’re constantly getting surprised by their capabilities. They’re not really engineered objects.

(01:46:35)
Then you have this debate about whether they’re going to be good for humanity or bad for humanity. Even specialized AI could be very bad for humanity. Just regular machine learning models can make certain weapons of war that could be incredibly destructive and very powerful. And they’re not general AIs. They could just be very smart weapons. And so we have to think about all of those things. I’m very optimistic about this. So even in the face of all this uncertainty, my own view is that these powerful tools are much more likely to help us, and even save us, than they are, on balance, to hurt us and destroy us. I think we humans have a lot of ways we can make ourselves go extinct. These things may help us not do that, so they may actually save us. So the people who are, in my view, overly concerned, and it is a valid debate, I think they may be missing part of the equation, which is how helpful these tools could be in making sure we don’t destroy ourselves.

(01:48:07)
I don’t know if you saw the movie Oppenheimer, but to me, first of all, I loved the movie, and I thought the best part of the movie is this bureaucrat played by Robert Downey Jr., who some of the people I’ve talked to think is the most boring part of the movie. I thought it was the most fascinating, because what’s going on here is you realize we have invented these awesome, destructive, powerful technologies called nuclear weapons, and they have to be managed, and we humans are not really capable of wielding those weapons. And that’s what he represented in that movie: here’s this guy, he wrongly thinks, he’s being so petty, that Oppenheimer said something bad to Einstein about him. They didn’t talk about him at all, as you find out in the final scene of the movie. And yet he spent his career trying to be vengeful and petty.

(01:49:19)
And that’s the problem. We as a species are not really sophisticated enough and mature enough to handle these technologies. And by the way, before you get to general AI and the possibility of AI having agency, there are a lot of things that would have to happen, but there’s so much benefit that’s going to come from these technologies in the meantime, even before there is general AI, in terms of better medicines and better tools to develop more technologies and so on. So I think it’s an incredible moment to be alive and to witness the transformations that are going to happen. How quickly it will happen, no one knows. But over the next 10 years and 20 years, I think we’re going to see really remarkable advances. And I personally am very excited about it.
Lex Fridman
(01:50:12)
First of all, really interesting to say that it’s discoveries, that it’s true that we don’t know the limits of what’s possible with the current language models.
Jeff Bezos
(01:50:24)
We don’t.
Lex Fridman
(01:50:24)
And it could be a few tricks and hacks here and there that open doors to whole new possibilities.
Jeff Bezos
(01:50:33)
We do know that humans are doing something different from these models, in part because we’re so power efficient. The human brain does remarkable things and it does it on about 20 watts of power. And the AI techniques we use today use many kilowatts of power to do equivalent tasks. So there’s something interesting about the way the human brain does this. And also we don’t need as much data. So self-driving cars, they have to drive billions and billions of miles to try to learn how to drive. And your average 16-year-old figures it out with many fewer miles. So there are still some tricks, I think, that we have yet to learn. I don’t think we’ve learned the last trick. I don’t think it’s just a question of scaling things up. But what’s interesting is that just scaling things up, and I put just in quotes because it’s actually hard to scale things up, but just scaling things up also appears to pay huge dividends.
Lex Fridman
(01:51:40)
Yeah. And there’s some more nuanced aspect of human beings that’s interesting, if large language models are able to accomplish it: being truly original and novel, coming up with some truly new ideas. That’s one. And the other one is truth. It seems that large language models are very good at sounding like they’re saying a true thing, but they don’t require, or often have, a grounding in truth; a model is basically a very good bullshitter. So if there’s not enough data in the training data about a particular topic, it’s just going to concoct accurate-sounding narratives, which is a very fascinating problem to try to solve: how do you get language models to infer what is true or not, to introspect?
Jeff Bezos
(01:52:41)
Yeah, they need to be taught to say, “I don’t know,” more often and I know several humans who could be taught that as well.
Lex Fridman
(01:52:50)
Sure. And then the other stuff, because you’re still a bit involved in the Amazon side with the AI things, the other open question is what kind of products are created from this?
Jeff Bezos
(01:53:01)
Oh, so many. We have Alexa and Echo, and Alexa has an installed base of hundreds of millions of devices. And so there’s Alexa everywhere. And guess what? Alexa is about to get a lot smarter. And so from a product point of view, that’s super exciting.
Lex Fridman
(01:53:27)
There’s so many opportunities there,
Jeff Bezos
(01:53:30)
So many opportunities. Shopping assistant, all that stuff is amazing. And at AWS, we’re building Titan, which is our foundational model. We’re also building Bedrock for our corporate clients at AWS. Our enterprise clients want to be able to use these powerful models with their own corporate data without accidentally contributing their corporate data to that model. And those are the tools we’re building for them with Bedrock. So there’s tremendous opportunity here.
Lex Fridman
(01:54:03)
Yeah, the security, the privacy, all those things are fascinating. Because so much value can be gained by training on private data, but you want to keep this secure. It’s a fascinating technical problem.
Jeff Bezos
(01:54:13)
Yes. This is a very challenging technical problem and it’s one that we’re making progress on and dedicated to solving for our customers.
Lex Fridman
(01:54:21)
Do you think there will be a day when humans and robots, maybe Alexa, have a romantic relationship like in the movie Her?
Jeff Bezos
(01:54:29)
Well, I think if you look at the-
Lex Fridman
(01:54:31)
Just brainstorming products here.
Jeff Bezos
(01:54:32)
… if you look at the spectrum of human variety and what people like, sexual variety, there are people who like everything. So the answer to your question has to be yes.
Lex Fridman
(01:54:43)
Okay. I guess I’m asking when-
Jeff Bezos
(01:54:45)
I don’t know how widespread that will be.
Lex Fridman
(01:54:45)
… All right.
Jeff Bezos
(01:54:48)
But it will happen.

Productivity

Lex Fridman
(01:54:49)
I was just asking when for a friend, but it’s all right. Moving on. Next question. What’s a perfectly productive day in the life of Jeff Bezos? You’re one of the most productive humans in the world.
Jeff Bezos
(01:55:03)
Well, first of all, I get up in the morning and I putter. I have a coffee.
Lex Fridman
(01:55:09)
Can you define putter?
Jeff Bezos
(01:55:11)
I slowly move around. I’m not as productive as you might think I am, because I do believe in wandering. I read my phone for a while, I read newspapers for a while, I chat with Lauren, and I drink my first coffee. So I move pretty slowly in the first couple of hours. I get up early just naturally, and then I exercise most days. Most days it’s not that hard for me. Some days it’s really hard and I do it anyway, even though I don’t want to, and it’s painful. And I’m like, “Why am I here?” And I don’t want to do any of this.
Lex Fridman
(01:55:52)
“Why am I here at the gym?”
Jeff Bezos
(01:55:53)
“Why am I here at the gym? Why don’t I do something else?” It’s not always easy.
Lex Fridman
(01:55:59)
What’s your source of motivation in those moments?
Jeff Bezos
(01:56:02)
I know that I’ll feel better later if I do it. And so that’s the real source of motivation. I can tell the days when I skip it, I’m not quite as alert. I don’t feel as good. And then there’s harder motivations. It’s longer term: you want to be healthy as you age. You want health span. Ideally, you want to be healthy and moving around when you’re 80 years old. And so there’s a lot of… But that kind of motivation is so far in the future that it can be very hard to act on in the moment. So thinking about the fact that I’ll feel better in about four hours if I do it now, that I’ll have more energy for the rest of my day, and so on and so on.
Lex Fridman
(01:56:42)
What’s your exercise routine, just to linger on that? How much do you curl? What are we talking about here? That’s all I do at the gym so I just…
Jeff Bezos
(01:56:52)
My routine on a good day, I do about half an hour of cardio and I do about forty-five minutes of weightlifting, resistance training of some kind, mostly weights. I have a trainer who I love who pushes me, which is really helpful. He’ll say, “Jeff, can we go up on that weight a little bit?”

(01:57:18)
And I’ll think about it and I’ll be like, “No, I don’t think so.”

(01:57:23)
And he’ll look at me and say, “Yeah, I think you can.” And of course he’s right.
Lex Fridman
(01:57:31)
Yeah, of course. Of course.
Jeff Bezos
(01:57:32)
So it’s helpful to have somebody push you a little bit.
Lex Fridman
(01:57:34)
But almost every day, you do that?
Jeff Bezos
(01:57:37)
Almost every day, I do a little bit of cardio and a little bit of weightlifting, and I rotate. I do a pulling day and a pushing day and a leg day. It’s all pretty standard stuff.
Lex Fridman
(01:57:48)
So puttering, coffee, gym-
Jeff Bezos
(01:57:49)
Puttering, coffee, gym, and then work.
Lex Fridman
(01:57:53)
… work. But what’s work look like? What do the productive hours look like for you?
Jeff Bezos
(01:57:59)
So a couple years ago, I left as the CEO of Amazon, and I have never worked harder in my life. I am working so hard, and I’m mostly enjoying it, but there are also some very painful days. Most of my time is spent on Blue Origin, and I’ve been so deeply involved here for the last couple of years. And in the big picture, I love it, and in the small, there are all the frustrations that come along with everything. We’re trying to get to rate manufacturing, as we talked about. That’s super important. We’ll get there. We just hired a new CEO, a guy I’ve known for close to 15 years now, a guy named Dave Limp, who I love. He’s amazing. So we’re super lucky to have Dave, and you’re going to see us move faster there.

(01:58:46)
So my day of work, reading documents, having meetings, sometimes in person, sometimes over Zoom, depends on where I am. It’s all about the technology, it’s about the organization. I have architecture and technology meetings almost every day on various subsystems inside the vehicle, inside the engines. It’s super fun for me. My favorite part of it is the technology. My least favorite part of it is building organizations and so on. That’s important, but it’s also my least favorite part. So that’s why they call it work. You don’t always get to do what you want to do.
Lex Fridman
(01:59:31)
How do you achieve time where you can focus and truly think through problems?
Jeff Bezos
(01:59:36)
I do little thinking retreats, but that’s not the only way; I can do that all day long. I’m very good at focusing. I don’t keep to a strict schedule. My meetings often go longer than I planned for them to, because I believe in wandering. My perfect meeting starts with a crisp document. So the document should be written with such clarity that it’s like angels singing from on high. I like a crisp document and a messy meeting. And so the meeting is about asking questions that nobody knows the answer to and trying to wander your way to a solution. And when that happens just right, it makes all the other meetings worthwhile. It feels good. It has a kind of beauty to it. It has an aesthetic beauty to it, and you get real breakthroughs in meetings like that.
Lex Fridman
(02:00:37)
Can you actually describe the crisp document? This is one of the legendary aspects of Amazon, of the way you approach meetings is this, the six-page memo. Maybe first describe the process of running a meeting with memos.
Jeff Bezos
(02:00:51)
Meetings at Amazon and Blue Origin are unusual. When new people come in, like a new executive joins, they’re a little taken aback sometimes because the typical meeting, we’ll start with a six-page narratively structured memo and we do study hall. For 30 minutes, we sit there silently together in the meeting and read.
Lex Fridman
(02:00:51)
I love this.
Jeff Bezos
(02:01:17)
Take notes in the margins. And then we discuss. And the reason, by the way, we do study hall is, you could say, I would like everybody to read these memos in advance, but the problem is people don’t have time to do that. And they end up coming to the meeting having only skimmed the memo, or maybe not read it at all, and they’re trying to catch up. And they’re also bluffing, like they were in college, having pretended to do the reading.
Lex Fridman
(02:01:42)
Yeah. Exactly.
Jeff Bezos
(02:01:43)
It’s better just to carve out the time for people.
Lex Fridman
(02:01:47)
Yeah. And do it together.
Jeff Bezos
(02:01:47)
So now we’re all on the same page, we’ve all read the memo, and now we can have a really elevated discussion. And this is so much better than having a slideshow presentation, a PowerPoint presentation of some kind, which has so many difficulties. One of the problems is that PowerPoint is really designed to persuade. It’s kind of a sales tool. And internally, the last thing you want to do is sell. Again, you’re truth seeking. You’re trying to find truth. And the other problem with PowerPoint is it’s easy for the author and hard for the audience. And a memo is the opposite. It’s hard to write a six-page memo. A good six-page memo might take two weeks to write. You have to write it, you have to rewrite it, you have to edit it, you have to talk to people about it. They have to poke holes in it for you. You write it again. It might take two weeks. So for the author, it’s really a very difficult job, but for the audience it’s much better.

(02:02:45)
So you can read it in half an hour. And there are other little problems with PowerPoint presentations too. Senior executives interrupt with questions halfway through the presentation. That question’s going to be answered on the next slide, but you never got there. If you read the whole memo in advance… I often write lots of questions in the margins of these memos, and then I cross them all out, because by the time I get to the end of the memo, they’ve been answered. That way I save all that time.

(02:03:11)
You also get more from the person who’s preparing the memo. We talked earlier about groupthink and the fact that I go last in meetings, and that you don’t want your ideas to pollute the meeting prematurely. The author of the memo has got to be very vulnerable. They’ve got to put all their thoughts out there and they’ve got to go first. But that’s great, because it makes them really good. And you get to see their real ideas, and you’re not trampling on them accidentally in a big PowerPoint presentation meeting.
Lex Fridman
(02:03:50)
What’s that feel like when you’ve authored a thing and then you’re sitting there and everybody’s reading your thing?
Jeff Bezos
(02:03:54)
I think it’s mostly terrifying.
Lex Fridman
(02:03:57)
Yeah. But maybe in a good way? Like a purifying?
Jeff Bezos
(02:04:02)
I think it’s terrifying in a productive way, but emotionally, I think it’s a very nerve-racking experience.
Lex Fridman
(02:04:13)
Is there an art or science to the writing of this six-page memo, or to writing in general, for you?
Jeff Bezos
(02:04:20)
It’s really got to be a real memo. So it means paragraphs have topic sentences. It’s verbs and nouns. That’s the other problem with PowerPoint presentations: they’re often just bullet points. And you can hide a lot of sloppy thinking behind bullet points. When you have to write in complete sentences with narrative structure, it’s really hard to hide sloppy thinking. So it forces the author to be at their best, and so it really is somebody’s best thinking. And then you don’t have to spend a lot of time trying to tease that thinking out of the person; you’ve got it from the very beginning. So it really saves you time in the long run.
Lex Fridman
(02:05:03)
So that part is crisp, and then the rest is messy. Crisp document, messy meeting.
Jeff Bezos
(02:05:07)
Yeah, so you don’t want to pretend that the discussion should be crisp. Most meetings, you’re trying to solve a really hard problem. There’s a different kind of meeting, which we call weekly business reviews or business reviews that may be weekly or monthly or daily, whatever they are. But these business review meetings, that’s usually for incremental improvement. And you’re looking at a series of metrics, every time it’s the same metrics. Those meetings can be very efficient. They can start on time and end on time.

Future of humanity

Lex Fridman
(02:05:35)
So we’re about to run out of time, which is a good time to ask about the 10,000-Year Clock.
Jeff Bezos
(02:05:43)
It’s funny.
Lex Fridman
(02:05:44)
Yes, that’s what I’m known for, is the humor. Okay. Can you explain what the 10,000-Year Clock is?
Jeff Bezos
(02:05:53)
The 10,000-Year Clock is a physical clock of monumental scale. It’s about 500 feet tall. It’s inside a mountain in west Texas, in a chamber that’s about 12 feet in diameter and 500 feet tall. The 10,000-Year Clock is an idea conceived by a brilliant guy named Danny Hillis way back in the ’80s. The idea is to build a clock as a symbol for long-term thinking. And you can kind of just very conceptually think of the 10,000-Year Clock like this: it ticks once a year, it chimes once every hundred years, and the cuckoo comes out once every thousand years. So it just sort of slows everything down. And it’s a completely mechanical clock. It is designed to last 10,000 years with no human intervention, hence the material choices and everything else. It’s in a remote location, both to protect it, but also so that visitors have to make a pilgrimage.

(02:06:57)
The idea is that over time, and this will take hundreds of years, but over time, it will take on the patina of age, and then it will become a symbol for long-term thinking that will hopefully get humans to extend their thinking horizons. And in my view, that’s really important, as we have become, as a species, as a civilization, more powerful. We’re really affecting the planet now. We’re really affecting each other. We have weapons of mass destruction. We have all kinds of things where we can really hurt ourselves, and the problems we create can be so large. The unintended consequences of some of our actions, like climate change: putting carbon in the atmosphere is a perfect example. That’s an unintended consequence of the Industrial Revolution. We got a lot of benefits from it, but we’ve also got this side effect that is very detrimental.

(02:07:56)
We need to start training ourselves to think longer term. Long-term thinking is a giant lever. You can literally solve problems if you think long-term that are impossible to solve if you think short-term. And we aren’t really good at thinking long-term. Five years is a tough timeframe for most institutions to think past. And we probably need to stretch that to 10 years and 15 years and 20 years and 25 years, and we’d do a better job for our children and our grandchildren if we could stretch those thinking horizons. And so the clock is, in a way, an art project, a symbol. And if it ever has any power to influence people to think longer term, that won’t happen for hundreds of years, but we are going to build it now and let it accrue the patina of age.
Lex Fridman
(02:08:52)
Do you think humans will be here when the clock runs out here on earth?
Jeff Bezos
(02:08:56)
I think so. But the United States won’t exist. Whole civilizations rise and fall. 10,000 years is so long. No nation state has ever survived for anywhere close to 10,000 years.
Lex Fridman
(02:09:12)
And the increasing rate of progress makes that seem even more fantastical.
Jeff Bezos
(02:09:15)
Even less likely so. Do I think humans will be here? Yes. How will we have changed ourselves and what will we be and so on and so on? I don’t know, but I think we’ll be here.
Lex Fridman
(02:09:25)
On that grand scale, a human life feels tiny. Do you ponder your own mortality? Are you afraid of death?
Jeff Bezos
(02:09:32)
No. I used to be afraid of death. I did. I remember as a young person being very scared of mortality, didn’t want to think about it, and so on. And as I’ve gotten older, I’m 59 now, as I’ve gotten older, somehow that fear has sort of gone away. I would like to stay alive for as long as possible, but I’m really more focused on health span. I want to be healthy. I want that square wave. I want to be healthy, healthy, healthy, and then gone. I don’t want the long decay. And I’m curious. I want to see how things turn out. I’d like to be here. I love my family and my close friends, and I’m curious about them, and I want to see. So I have a lot of reasons to stay around, but mortality doesn’t have that effect on me that it did maybe when I was in my twenties.
Lex Fridman
(02:10:38)
Well, Jeff, thank you for creating Amazon, one of the most incredible companies in history, and thank you for trying your best to make humans a multi-planetary species, expanding out into our solar system, maybe beyond, to meet the aliens out there. And thank you for talking today.
Jeff Bezos
(02:10:55)
Lex, thank you for doing your part to lengthen our attention spans. Appreciate that very much.
Lex Fridman
(02:11:04)
I’m doing my best. Thanks for listening to this conversation with Jeff Bezos. To support this podcast, please check out our sponsors in the description. And now let me leave you with some words from Jeff Bezos himself. Be stubborn on vision, but flexible on the details. Thank you for listening and hope to see you next time.

Transcript for Lee Cronin: Controversial Nature Paper on Evolution of Life and Universe | Lex Fridman Podcast #404

This is a transcript of Lex Fridman Podcast #404 with Lee Cronin.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lee Cronin
(00:00:00)
Every star in the sky probably has planets, and life is probably emerging on these planets. But I think the combinatorial space associated with these planets is so different. Our causal cones are never going to overlap, or not easily. And this is the thing that makes me sad about alien life, and why we have to create alien life in the lab as quickly as possible: because I don’t know if we are going to be able to build architectures that will intersect with alien intelligence architectures.
Lex Fridman
(00:00:35)
Intersect, you don’t mean in time or space-
Lee Cronin
(00:00:38)
Time and the ability to communicate.
Lex Fridman
(00:00:40)
The ability to communicate.
Lee Cronin
(00:00:41)
Yeah. My biggest fear in a way is that life is everywhere, but we’ve become infinitely more lonely because of our scaffolding in that combinatorial space.
Lex Fridman
(00:00:52)
The following is a conversation with Lee Cronin, his third time on this podcast. He’s a chemist from the University of Glasgow who is one of the most fascinating, brilliant, and fun-to-talk-to scientists I’ve ever had the pleasure of getting to know. This is the Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Lee Cronin.

Assembly theory paper


Lex Fridman
(00:01:16)
So your big assembly theory paper was published in Nature. Congratulations.
Lee Cronin
(00:01:21)
Thanks.
Lex Fridman
(00:01:21)
It created, I think it’s fair to say, a lot of controversy, but also a lot of interesting discussion. So maybe I can try to summarize assembly theory and you tell me if I’m wrong.
Lee Cronin
(00:01:32)
Go for it.
Lex Fridman
(00:01:33)
So assembly theory says that if we look at any object in the universe, any object, that we can quantify how complex it is by trying to find the number of steps it took to create it. And also we can determine if it was built by a process akin to evolution by looking at how many copies of the object there are.
Lee Cronin
(00:01:55)
Yep. That’s spot on. Yep.
Lex Fridman
(00:01:56)
Spot on.
Lee Cronin
(00:01:57)
Spot on.
Lex Fridman
(00:01:58)
I was not expecting that. Okay, so let’s go through definitions. So there’s a central equation I’d love to talk about, but definition wise, what is an object?
Lee Cronin
(00:02:11)
Yeah, an object. So if I’m going to try to be as meticulous as possible, objects need to be finite and they need to be decomposable into sub-units. All human made artifacts are objects. Is a planet an object? Probably yes, if you scale out. So an object is finite and countable and decomposable, I suppose, mathematically. But yeah, I still wake up some days and think to myself, what is an object? Because it’s a non-trivial question.
Lex Fridman
(00:02:50)
Persists over time, I’m quoting from the paper here. An object is finite, is distinguishable. I’m sure that’s a weird adjective, distinguishable.
Lee Cronin
(00:03:03)
We’ve had so many people helpfully offering to rewrite the paper after it came out. You wouldn’t believe it. It’s so funny.
Lex Fridman
(00:03:10)
Persists over time. And is breakable such that the set of constraints to construct it from elementary building blocks is quantifiable.
Lee Cronin
(00:03:25)
The history is in the objects. It’s kind of cool, right?
Lex Fridman
(00:03:29)
Okay. So what defines the object is its history or memory, whichever is the sexier word.
Lee Cronin
(00:03:36)
I’m happy with both depending on the day.
Lex Fridman
(00:03:38)
Okay, so the set of steps it took to create the object. So there’s a sense in which every object in the universe has a history. And that is part of the thing that is used to describe its complexity. How complicated it is. Okay, what is an assembly index?
Lee Cronin
(00:04:00)
So the assembly index: if you were to take the object apart and be super lazy about it, or minimal, say, ’cause it’s like you’ve got a really short-term memory. So what you do is you lay all the parts on the path and you find the minimum number of steps you take on the path to add the parts together to reproduce the object. And that minimum number is the assembly index. It’s a minimum bound. And it was always my intuition that the minimum bound in assembly theory was really important, but I only worked out why a few weeks ago, which is kind of funny ’cause I was just like, “No, this is sacrosanct. I don’t know why, it’ll come to me one day.”

(00:04:37)
And then when I was pushed by a bunch of mathematicians, we came up with the correct physical explanation, which I can get to, but it’s the minimum and it’s really important. It’s the minimum. And the reason I knew the minimum was right is because we could measure it. So almost before this paper came out, we’d published papers explaining how you can measure the assembly index of molecules.
Lex Fridman
(00:05:01)
Okay, so that’s not so trivial to figure out. So when you look at an object, we could say a molecule, we could say object more generally. To figure out the minimum number of steps it takes to create that object, that doesn’t seem like a trivial thing to do.
Lee Cronin
(00:05:17)
So with molecules, it is not trivial, but it is possible, because, well, I’m a chemist, so I see the world through the lens of chemistry. I break the molecule apart and break bonds. And if you take a molecule and you break it all apart, you have a bunch of atoms and then you say, “Okay, I’m going to then take the atoms and form bonds and go up the chain of events to make the molecule.”

(00:05:46)
And that’s what made me realize, take a toy example, literally a toy example: take a Lego object, which is built up of Lego blocks. So you could do exactly the same thing. In this case, the Lego blocks are naturally the smallest; they’re the atoms in the actual composite Lego architecture. But then if you maybe take a couple of blocks and put them together in a certain way, maybe they’re offset in some way, that offset is in the memory. You can use that offset again with only a penalty of one, and you can then make a square, a triangle, and keep going.

(00:06:19)
And you remember those motifs on the chain. So you can then leap from the start with all the Lego blocks or atoms just laid out in front of you and say, “Right, I’ll take you, you, you,” connect and do the least amount of work. So it’s really like the smallest steps you can take on the graph to make the object. And so for molecules, it came relatively intuitively. And then we started to apply it to language. We’ve even started to apply it to mathematical theorems. But I’m so well out of my depth. But it looks like you can take minimum set of axioms and then start to build up mathematical architectures in the same way. And then the shortest path to get there is something interesting that I don’t yet understand.
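The lay-out-the-parts-and-reuse-motifs procedure Lee describes can be sketched for strings, where joining two blocks plays the role of forming a bond. This is an illustrative brute-force search, not the published algorithm, and all names here are my own:

```python
def assembly_index(target: str) -> int:
    """Minimum number of joining steps needed to build `target` from its
    individual characters, where any block built earlier can be reused
    for free (the 'memory' in the assembly process)."""
    basics = set(target)  # single characters: the free building blocks

    def buildable(built: frozenset, budget: int) -> bool:
        if target in built or target in basics:
            return True
        if budget == 0:
            return False
        pool = basics | built
        # One step = join two available blocks into a new piece of the target.
        for a in pool:
            for b in pool:
                piece = a + b
                if piece in target and piece not in built:
                    if buildable(built | {piece}, budget - 1):
                        return True
        return False

    steps = 0  # iterative deepening: find the smallest workable budget
    while not buildable(frozenset(), steps):
        steps += 1
    return steps
```

For example, `assembly_index("abab")` is 2 rather than 3, because the block "ab" is built once and then reused, which is exactly the motif-reuse idea behind the minimum bound.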
Lex Fridman
(00:07:02)
So what’s the computational complexity of figuring out the shortest path with molecules, with language, with mathematical theorems? It seems like once you have the fully constructed Lego castle or whatever your favorite Lego world is, figuring out how to get there from the basic building blocks, is that an NP-hard problem? It’s a hard problem.
Lee Cronin
(00:07:28)
It’s a hard problem. But actually if you look at it, so the best way to look at it, let’s take a molecule. So if the molecule has 13 bonds, first of all, take 13 copies of the molecule and just cut all the bonds. So cut 12 bonds and then you just put them in order and then that’s how it works. And you keep looking for symmetry or copies so you can then shorten it as you go down.

(00:07:51)
And that becomes [inaudible 00:07:53] quite hard. For some natural product molecules, it becomes very hard. It’s not impossible, but we’re looking at the bounds on that at the moment. But as the object gets bigger it becomes really hard. But that’s the bad news. But the good news is there are shortcuts. And we might even be able to physically measure the complexity without computationally calculating it, which is kind of insane.
Lex Fridman
(00:08:20)
Wait, how would you do that?
Lee Cronin
(00:08:20)
Well, in the case of a molecule, so if you shine light on a molecule, let’s take infrared: each of the bonds in the molecule absorbs the infrared differently in what we call the fingerprint region. And because it’s quantized as well, you have all these discrete kind of absorbances. And my intuition, after we realized we could cut molecules up in mass spec, that was the first go at this. Then we did it using infrared, and the infrared gave us an even better correlation with assembly index. And we used another technique as well in addition to infrared called NMR, nuclear magnetic resonance, which tells you about the number of different magnetic environments in a molecule. And that also worked out. So we have three techniques, each of which independently gives us the same, or tending towards the same, assembly index for a molecule that we can calculate mathematically.
Lex Fridman
(00:09:12)
So these are all methods of mass spectrometry, mass spec. You scan a molecule, it gives you data in the form of a mass spectrum. And you’re saying that the data correlates to the assembly index?
Lee Cronin
(00:09:25)
Yeah.
Lex Fridman
(00:09:26)
So how generalizable is that shortcut, first of all it’s chemistry. And second of all, beyond that, that seems like a nice hack and you’re extremely knowledgeable about various aspects of chemistry. So you can say, okay, it kind of correlates. But the whole idea behind assembly theory paper and perhaps why it’s so controversial is that it reaches bigger. It reaches for the bigger general theory of objects in the universe.
Lee Cronin
(00:09:58)
Yeah, I’d say so. I’d agree. So I’ve started assembly theory of emoticons with my lab, believe it or not. So we take emojis, pixelate them and work out the assembly index of the emoji and then work out how many emojis you can make on the path of emoji. So there’s the uber emoji from which all other emojis emerge. So you can then take a photograph and by looking at the shortest path, by reproducing the pixels to make the image you want, you can measure that. So then you start to be able to take spatial data.

(00:10:32)
Now there’s some problems there. What is then the definition of the object? How many pixels? How do you break it down? And so we’re just learning all this right now.
Lex Fridman
(00:10:42)
So how do you compute, begin to compute the assembly index of a graphical, a set of pixels on a 2D plane that form a thing?
Lee Cronin
(00:10:54)
So you would first of all determine the resolution. So what is your X and Y, the number of pixels on the X and Y plane, and then look at the surface area. And then you take all your emojis and make sure they’re all looked at at the same resolution. And then we would basically do exactly the same thing we would do for cutting the bonds. You’d cut bits out of the emoji, you’d have a bag of pixels, and you would then add those pixels together to make the overall emoji.
Lex Fridman
(00:11:26)
Wait, wait a minute. But first of all, not every pixel, I mean this is at the core of machine learning and computer vision, not every pixel is that important. And there’s macro features, there’s micro features and all that kind of stuff.
Lee Cronin
(00:11:40)
Exactly.
Lex Fridman
(00:11:42)
The eyes appear in a lot of them, the smile appears in a lot of them.
Lee Cronin
(00:11:47)
So in the same way in chemistry we assume the bond is fundamental, what we do there and here is we assume the resolution, the scale at which we do it, is fundamental, and we’re just working that out. And you’re right, that will change, because as you take your lens out a bit, it will change dramatically.

(00:12:02)
But it’s just a new way of looking at it, not just compression, what we do right now in computer science and data. One big kind of misunderstanding is that assembly theory is telling you about how compressed the object is. That’s not right. It’s how much information is required on a chain of events. Because the nice thing is, when you do compression in computer science, we’re wandering a bit here, but it’s kind of worth wandering I think, you assume you have instantaneous access to all the information in the memory. In assembly theory you say, “No, you don’t get access to that memory until you’ve done the work.” And then when you’ve done the work, you can have access to that memory, but not to the next one.

(00:12:45)
And this is how in assembly theory, we talk about the four universes: the assembly universe, the assembly possible, the assembly contingent, and then the assembly observed. And they’re all scales in this combinatorial universe.
Lex Fridman
(00:12:58)
Yeah. Can you explain each one of them?
Lee Cronin
(00:13:00)
Yep. So the assembly universe is like anything goes, just combinatorial kind of explosion in everything.
Lex Fridman
(00:13:07)
So that’s the biggest one?
Lee Cronin
(00:13:08)
That’s the biggest one. It’s massive.
Lex Fridman
(00:13:09)
Assembly universe, assembly possible, assembly contingent, assembly observed. And the Y axis is assembly steps in time, and on the X axis, as the thing expands through time, more and more unique objects appear.
Lee Cronin
(00:13:29)
Yeah, so assembly universe, everything goes. Assembly possible, the laws of physics come in, in this case in chemistry, bonds assemble. So that means-
Lex Fridman
(00:13:39)
Those are extra constraints, I guess?
Lee Cronin
(00:13:40)
Yes. And they’re the only constraints. They’re the constraints at the base. So the way to look at it is you’ve got all your atoms, they’re quantized and you can just bond them together. So in computer science speak, I suppose the assembly universe is just like no laws of physics: things can fly through mountains, go beyond the speed of light. In the assembly possible, you have to apply the laws of physics, but you can get access to all the motifs instantaneously with no effort. So that means you could make anything.

(00:14:10)
Then the assembly contingent says, “No, you can’t have access to the highly assembled object in the future until you’ve done the work in the past on the causal chain.” And that’s really the interesting shift, where you go from assembly possible to assembly contingent. That is really the key thing in assembly theory that says you cannot just have instantaneous access to all those memories. You have to have done the work. The universe has to have somehow built a system that allows you to select that path rather than other paths.

(00:14:45)
And then the final thing the assembly observed is basically us saying, “Oh, these are the things we actually see. We can go backwards now and understand that they have been created by this causal process.”
Lex Fridman
(00:14:59)
Wait a minute. So when you say the universe has to construct the system that does the work, is that like the environment that allows for selection?
Lee Cronin
(00:15:08)
Yeah.
Lex Fridman
(00:15:09)
So that’s the thing that does the selection.
Lee Cronin
(00:15:10)
You could think about it in terms of a von Neumann constructor versus a selection, a ribosome, a Tesla plant assembling Teslas. The difference between the assembly universe in Tesla land and the Tesla factory is everyone says, “No, Teslas are just easy. They just spring out, you know how to make them all.” In the Tesla factory, you have to put things in sequence and out comes a Tesla.
Lex Fridman
(00:15:32)
So you’re talking about the factory?
Lee Cronin
(00:15:33)
Yes. This is a really nice, super important point: when I talk about the universe having a memory or there’s some magic, it’s not that. It’s that it tells you that there must be a process encoded somewhere in physical reality, be it a cell, a Tesla factory or something else, that is making that object. I’m not saying there’s some kind of woo-woo memory in the universe, morphic resonance or something. I’m saying that there is an actual causal process that is being directed, constrained in some way. So it’s not kind of just making everything.
Lex Fridman
(00:16:10)
Yeah, but Lee, what’s the factory that made the factory? First of all, you assume the laws of physics just sprung into existence at the beginning. Those are constraints. But what makes the factory, the environment that does the selection?
Lee Cronin
(00:16:29)
This is the question of, well, it’s the first interesting question that I want to answer out of four. I think the factory emerges in the interplay between the environment and the objects that are being built. And let me, I’ll have a go at explaining to you the shortest path.

(00:16:48)
So why is the shortest path important? Imagine you’ve got, I’m going to have to go chemistry for a moment, then abstract it. So imagine you’ve got a given environment where you have a budget of atoms you’re just flinging together. And the objects, those atoms being flung together into, say, molecule A, they decompose. So molecules decompose over time. So the molecules in this environment, in this magic environment, have to not die, but they do die. They have a half-life.

(00:17:23)
So the only way the molecules can get through that environment out the other side, let’s pretend the environment is a box they can go in and out of without dying, and there’s just an infinite supply of atoms coming, or, well, a large supply. The molecule gets built, but the molecule that is able to template itself being built and survives in the environment will basically reign supreme.

(00:17:49)
Now let’s say that molecule takes 10 steps and it is using a finite set of atoms. Now, let’s say another molecule, smart-ass molecule we’ll call it, comes in and can survive in that environment and can copy itself, but it only needs five steps. The molecule that only needs five steps continues; both molecules are being destroyed, but they’re creating themselves faster than they can be destroyed. You can see that the shortest path reigns supreme. So the shortest path tells us something super interesting about the minimal amount of information required to propagate that motif in time and space. And it seems to be like some kind of conservation law.
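This ten-step versus five-step competition can be sketched as a toy simulation. The rate constants, and the linear assumption that fewer assembly steps means proportionally faster copying, are mine, purely for illustration:

```python
def compete(steps_a: int = 10, steps_b: int = 5,
            decay: float = 0.05, atoms: float = 10_000.0,
            ticks: int = 2000) -> "tuple[float, float]":
    """Two self-replicating species share a finite atom budget.
    Both decay at the same rate; replication speed is taken to be
    inversely proportional to the number of assembly steps per copy."""
    a = b = 1.0
    for _ in range(ticks):
        free = max(atoms - a - b, 0.0)  # atoms not yet locked up in copies
        a += a * (free / atoms) / steps_a - decay * a
        b += b * (free / atoms) / steps_b - decay * b
    return a, b
```

Running `compete()` leaves the five-step replicator dominating the atom budget while the ten-step one dwindles toward zero, the "shortest path reigns supreme" outcome.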
Lex Fridman
(00:18:35)
So one of the intuitions you have is the propagation of motifs in time will be done by the things that can construct themselves in the shortest path.
Lee Cronin
(00:18:47)
Yeah.
Lex Fridman
(00:18:47)
So you can assume that most of the objects in the universe are built in the shortest, in the most efficient way. Big leap I just took there.
Lee Cronin
(00:18:58)
Yeah. Yes and no, because there are other things. So in the limit, yes, because you want to tell the difference between things that have required a factory to build them and just random processes. But you can find instances where the shortest path isn’t taken for an individual object, individual function. And people go, “Ah, that means the shortest path isn’t right.” And then I say, “Well, I don’t know. I think it’s right still because,” so of course, because there are other driving forces, it’s not just one molecule.

(00:19:33)
Now you start to consider two objects, you have a joint assembly space. And now it’s a compromise between not just making A and B each in the shortest path. You want to be able to make A and B together in the shortest path, which might mean that A is slightly longer, a compromise. So when you see slightly more nesting in the construction, when you take a given object, that can look longer. But that’s because the overall function of the object is still trying to be efficient. And this is still very hand-wavy and maybe has no leg to stand on, but we think we’re getting somewhere with that.
Lex Fridman
(00:20:09)
And there’s probably some parallelization, right?
Lee Cronin
(00:20:12)
Yeah.
Lex Fridman
(00:20:12)
So this is not sequential. The building is, I guess.
Lee Cronin
(00:20:17)
No, you’re right.
Lex Fridman
(00:20:18)
When you’re talking about complex objects, you don’t have to work sequentially. You can work in parallel, you can get your friends together and they can…
Lee Cronin
(00:20:25)
Yeah, and the thing we’re working on right now is how to understand these parallel processes. Now there’s a new thing we’ve introduced called assembly depth. And assembly depth can be lower than the assembly index for a molecule when they’re cooperating together, because exactly this parallel processing is going on. And my team have been working this out in the last few weeks, because we’re looking at what compromises nature needs to make when it’s making molecules in a cell. And I wonder if maybe, well, I’m always leaping out of my competence, but in economics, I’m just wondering if you could apply this in economic processes. It seems like capitalism is very good at finding the shortest path every time. And there are ludicrous things that happen because actually the cost function has been minimized.

(00:21:15)
And so I keep seeing parallels everywhere where there are complex nested systems where if you give it enough time and you introduce a bit of heterogeneity, the system readjusts and finds a new shortest path. But the shortest path isn’t fixed on just one molecule now. It’s in the actual existence of the object over time. And that object could be a city, it could be a cell, it could be a factory, but I think we’re going way beyond molecules and my competence so probably should go back to molecules, but hey.

Assembly equation

Lex Fridman
(00:21:44)
All right, before we get too far, let’s talk about the assembly equation. Okay. How should we do this? Let me just even read that part of the paper. We define assembly as the total amount of selection necessary to produce an ensemble of observed objects quantified using equation one. The equation basically has A on one side, which is the assembly of the ensemble, and then a sum from one to N, where N is the total number of unique objects.

(00:22:20)
And then there are a few variables in there that include the assembly index, the copy number, which we’ll talk about. That’s an interesting, I don’t remember you talking about that. That’s an interesting addition and I think a powerful one. It has to do with the fact that you can create pretty complex objects randomly, and in order to know that they’re not random, that there’s a factory involved, you need to see a bunch of them. That’s the intuition there. It’s an interesting intuition, and then some normalization. What else is and-
Lee Cronin
(00:22:54)
N minus one, just to make sure there’s more than one object; one object could be a one-off and random. And then you have more than one identical object. That’s interesting.
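For reference, the equation being described (equation one in the paper) can be written out. This is a reconstruction from the discussion here, so consult the published paper for the authoritative form:

```latex
A = \sum_{i=1}^{N} e^{a_i} \left( \frac{n_i - 1}{N_T} \right)
```

where $A$ is the assembly of the ensemble, $N$ the number of unique objects, $a_i$ the assembly index of object $i$, $n_i$ its copy number (the $n_i - 1$ is the "N minus one" just mentioned, which zeroes out one-off objects), and $N_T$ the total number of objects, providing the normalization.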
Lex Fridman
(00:23:03)
When there’s two of a thing.
Lee Cronin
(00:23:05)
Two of a thing is super important, especially if the assembly index is high.
Lex Fridman
(00:23:10)
So there are several questions here. One, let’s talk about selection. What is this term selection? What is this term evolution that we’re referring to? Which aspect of Darwinian evolution are we referring to? That’s interesting here.
Lee Cronin
(00:23:26)
Yeah, so this is probably, we should talk about the paper for a second. The paper, what it did is it kind of annoyed, we didn’t know it would. It got attention and obviously the angry people were annoyed.
Lex Fridman
(00:23:39)
There’s angry people in the world. That’s good.
Lee Cronin
(00:23:41)
So what happened is the evolutionary biologists got angry. We were not expecting that. We thought evolutionary biologists would be cool. I knew that some, not many, computational complexity people would get angry, because I’ve kind of been poking them and maybe I deserved it, but I was trying to poke them in a productive way. And then the physicists kind of got grumpy because the initial conditions tell everything. The prebiotic chemists got slightly grumpy because there’s not enough chemistry in there. Then finally, when the creationists said it wasn’t creationist enough, I was like, “I’ve done my job.”
Lex Fridman
(00:24:13)
You’re saying the physicists, they say, because you’re basically saying that physics is not enough to tell the story of how biology emerges?
Lee Cronin
(00:24:22)
I think so.
Lex Fridman
(00:24:22)
And then they said that physics is the beginning and the end of the story.
Lee Cronin
(00:24:29)
So what happened is the reason why people put the phone down on the call of the paper, if you view reading the paper like a phone call, they got to the abstract and in the abstract-
Lex Fridman
(00:24:39)
First sentence is pretty strong.
Lee Cronin
(00:24:40)
The first two sentences caused everybody-
Lex Fridman
(00:24:42)
Scientists have grappled with reconciling biological evolution with the immutable laws of the universe defined by physics.
Lee Cronin
(00:24:51)
True, right? There’s nothing wrong with that statement. Totally true.
Lex Fridman
(00:24:55)
Yeah. These laws underpin life’s origin, evolution, and the development of human culture and technology, yet they do not predict the emergence of these phenomena. Wow. First of all, we should say the title of the paper, this paper was accepted and published in Nature. The title is Assembly Theory Explains and Quantifies Selection and Evolution, very humble title. And the entirety of the paper, I think, presents interesting ideas, but reaches high.
Lee Cronin
(00:25:26)
I am not… I would do it all again. This paper was actually on the pre-print server for over a year.
Lex Fridman
(00:25:33)
You regret nothing?
Lee Cronin
(00:25:34)
Yeah.
Lee Cronin
(00:25:35)
I think, yeah, I don’t regret anything.
Lex Fridman
(00:25:37)
You and Frank Sinatra did it your way.
Lee Cronin
(00:25:39)
What I love about being a scientist is sometimes because I’m a bit dim and I don’t understand what people are telling me, I want to get to the point. This paper says, “Hey, the laws of physics are really cool, the universe is great, but they don’t really, it’s not intuitive that you just run the standard model and get life out.” I think most physicists might go, “Yeah, it’s not just, we can’t just go back and say that’s what happened.” Because physics can’t explain the origin of life yet. That doesn’t mean it won’t or can’t. Okay. Just to be clear. Sorry intelligent designers, we are going to get there.

(00:26:16)
Second point, we say that evolution works, but we don’t know how evolution got going. So biological evolution and biological selection. So for me, this seems like a simple continuum. So when I mentioned selection and evolution in the title, I think, and in the abstract, we should have maybe prefaced that and said non-biological selection and non-biological evolution. And then that might have made it even more crystal clear. But I didn’t think that biology, evolutionary biology, should be so bold as to claim ownership of selection and evolution.

(00:26:49)
And secondly, a lot of evolutionary biologists seem to dismiss the origin of life question and just say it’s obvious. And that causes a real problem scientifically, because the physicists are like, “We own the universe. The universe is good, we explain all of it, look at us.” And even biologists say, “We can explain biology.” And the poor chemists in the middle going, “But hang on.”

(00:27:12)
And this paper kind of says, “Hey, there is an interesting disconnect between physics and biology. And that’s at the point at which memories get made in chemistry through bonds. And hey, let’s look at this close and see if we can quantify it.” So yeah, I never expected the paper to get that much interest. And still, it’s only been published just over a month ago now.
Lex Fridman
(00:27:38)
So just to linger on selection, what is the broader sense of what selection means?
Lee Cronin
(00:27:46)
Yeah, that’s really good. So I think for selection, this is where, for me, the concept of an object is something that can persist in time and not die, but basically can be broken up. So if I was going to kind of bolster the definition of an object: if something can form and persist for a long period of time under an existing environment that could destroy other, and I’m going to use anthropomorphic terms, I apologize, weaker or less robust objects, then the environment could have selected that.

(00:28:30)
So a good chemistry example: if you took some carbon and you made a chain of carbon atoms, whereas if you took some, I don’t know, some carbon, nitrogen and oxygen and made chains from those, you’d start to get different reactions and rearrangements. So a chain of carbon atoms might be more resistant to falling apart under acidic or basic conditions versus another set of molecules. So it survives in that environment. So in the acid pond, the resistant molecule can get through. And then that molecule goes into another environment. So that environment, now maybe instead of being an acid pond, is a basic pond, or maybe it’s an oxidizing pond. And so if you’ve got carbon and it goes into an oxidizing pond, maybe the carbon starts to oxidize and break apart. So you go through all these kind of obstacle courses, if you like, given by reality. So selection happens when an object survives in an environment for some time.

(00:29:33)
And this is the thing that’s super subtle. The object has to be continually being destroyed and made by a process. So it’s not just about the object now, it’s about the process in time that makes it, because a rock could just stand on the mountainside for 4 billion years and nothing happens to it. And that’s not necessarily really advanced selection. So for selection to get really interesting, you need to have a turnover in time. You need to be continually creating objects, producing them, what we call discovery time. So there’s a discovery time for an object.

(00:30:07)
When that object is discovered, if it’s say a molecule that can then act on itself or the chain of events that caused itself to bolster its formation, then you go from discovery time to production time and suddenly you have more of it in the universe. So it could be a self-replicating molecule and the interaction of the molecule in the environment, in the warm little pond or in the sea or wherever in the bubble could then start to build a proto factory, the environment.

(00:30:34)
So really to answer your question, what the factory is: the factory is the environment, but it’s not very autonomous, it’s not very redundant. There’s lots of things that could go wrong. So once you get high enough up the hierarchy of networks, of interactions, something needs to happen that needs to be compressed into a smaller volume and made resistant, robust, because in biology, selection and evolution are robust: you have error correction built in. There are good ways of basically making sure propagation goes on.

(00:31:07)
So really the difference between inorganic, abiotic selection and evolution, and evolution and stuff in biology, is robustness: the ability to propagate, the ability to survive in lots of different environments. Whereas our poor little sole inorganic molecule, whatever, just dies in lots of different environments. So there’s something super special that happens from the inorganic molecule in the environment that kills it to where you’ve got evolution and cells can survive everywhere.
Lex Fridman
(00:31:44)
How special is that? How do you know those kinds of evolution factories aren’t everywhere in the universe?
Lee Cronin
(00:31:51)
I don’t, and I’m excited because I think selection isn’t special at all. I think what is special is the history of the environments on earth that gave rise to the first cell that now has taken all those environments and is now more autonomous. And I would like to think that, you know this paper could be very wrong, but I don’t think it’s very wrong. I mean it’s certainly wrong, but it’s less wrong than some other ideas, I hope, right? And if this inspires us to go and look for selection in the universe because we now have an equation where we can say, we can look for selection going on and say, “Oh, that’s interesting. We seem to have a process. It’s giving us high copy number objects that also are highly complex, but that doesn’t look like life as we know it.”

(00:32:46)
And we use that and say, “Oh, there’s a hydrothermal vent. Oh, there’s a process going on. There’s molecular networks,” because the assembly equation is not only meant to identify at the higher end advanced selection, what you get, I would call, in biology, super advanced selection. And you could even use the assembly equation to look for technology, and God forbid we could talk about consciousness and abstraction, but let’s keep it primitive, molecules and biology. So I think the real power of the assembly equation is to say how much selection is going on in this space.

(00:33:20)
And there’s a really simple thought experiment I could do: you have a little Petri dish, and on that Petri dish you put some simple food, so the assembly index of all the sugars and everything is quite low. And you put a single E. coli cell, and then you say, “I’m going to measure the amount of assembly in the box.” So it’s quite low, but the rate of change of assembly, dA/dt, will go [inaudible 00:33:47] sigmoidal as it eats all the food, and the E. coli cells will replicate, because they take all the food, they copy themselves, the assembly index of all the molecules goes up, up and up until the food is exhausted in the box. So now the E. coli’s stopped-
Lee Cronin
(00:34:00)
… in the box. So now the E. coli’s stopped… I mean, die is probably a strong word. They stopped respiring because all the food is gone. But suddenly, the amount of assembly in the box has gone up gigantically, because that one E. coli factory has just eaten through, milled lots of other E. coli factories, run out of food and stopped. And so, looking at that… So in the initial box, although the amount of assembly was really small, it was able to replicate and use all the food and go up. And that’s what we’re trying to do in the lab, actually, is make those experiments and see if we can spot the emergence of molecular networks that are producing complexity as we feed in raw materials and we feed in a challenge, an environment. We try and kill the molecules. And really, that’s the main idea for the entire paper.
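The Petri-dish thought experiment can be written as a toy discrete-time model, with parameters invented for illustration, where cells convert low-assembly food into high-assembly copies of themselves until the food runs out:

```python
def petri_dish(food: float = 1000.0, cells: float = 1.0,
               rate: float = 0.5, ticks: int = 100) -> "list[float]":
    """Cells eat food and turn it into new cells; total 'assembly' in the
    box (proxied here by the cell count) rises sigmoidally, then plateaus
    once the food is exhausted."""
    history = []
    for _ in range(ticks):
        eaten = min(food, rate * cells)  # can't eat more than is left
        food -= eaten
        cells += eaten                   # food converted into new factories
        history.append(cells)
    return history
```

The resulting curve starts slowly, shoots up, and flattens at the initial cells plus the initial food, the sigmoid in dA/dt described above.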
Lex Fridman
(00:34:52)
Yeah, and see if you can measure the changes in the assembly index throughout the whole system.
Lee Cronin
(00:34:56)
Yeah.

Discovering alien life

Lex Fridman
(00:34:57)
Okay. What about, if I show up to a new planet, we’ll go to Mars or some other planet from a different solar system, how do we use assembly index there to discover alien life?
Lee Cronin
(00:35:11)
Very simply, actually. Let’s say we go to Mars with a mass spectrometer with a sufficiently high resolution. A good thing about mass spec is that you can select a molecule by its mass, and then, if the resolution is high enough, you can be more and more sure that you’re seeing identical copies. You can count them. And then you fragment them and you count the number of fragments, and look at the molecular weight. And the higher the molecular weight and the higher the number of fragments, the higher the assembly index.

(00:35:43)
So if you go to Mars with a mass spec with high enough resolution and, as a guide from earth, you find molecules, say, greater than 350 molecular weight with more than 15 fragments, you have found artifacts that, at least on earth, can only be produced by life. And now you might say, “Oh, well, maybe a geological process.” I would argue very vehemently that that is not the case.

(00:36:10)
But we can say, “Look, if you don’t like the cutoff on earth, go up higher, 30, 100, because there’s going to be a point where you can find a molecule with so many different parts, the chances of you getting a molecule that has a hundred different parts and finding a million identical copies, that’s just impossible. That could never happen in an infinite set of universes.”
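The detection rule described here can be written down directly. A minimal sketch, using the illustrative cutoffs from the conversation (molecular weight above 350, more than 15 fragments) plus the copy-number requirement discussed later in the conversation; none of these numbers are calibrated values:

```python
# Sketch of the biosignature rule: a molecule found in many identical copies,
# with molecular weight > 350 and more than 15 mass-spec fragments, is flagged
# as a candidate product of life. Thresholds are the illustrative numbers from
# the conversation, not calibrated values.

def is_candidate_biosignature(mol_weight, n_fragments, copy_number,
                              mw_cutoff=350, frag_cutoff=15, min_copies=2):
    # complexity alone is not enough: a one-off could just be random
    return (mol_weight > mw_cutoff
            and n_fragments > frag_cutoff
            and copy_number >= min_copies)

print(is_candidate_biosignature(420, 22, 10_000))  # complex and abundant
print(is_candidate_biosignature(420, 22, 1))       # one-off: could be random
print(is_candidate_biosignature(180, 6, 10_000))   # abundant but too simple
```

Raising the cutoffs only makes the rule more conservative, which is the point Cronin makes about going "up higher, 30, 100."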
Lex Fridman
(00:36:37)
Can you just linger on this copy number thing? A million different copies, what do you mean by copies and why is the number of copies important?
Lee Cronin
(00:36:49)
Yeah, that was so interesting. I always understood that the copy number was really important, but I never explained it properly, for ages. And it goes back to this: if I give you a, I don’t know, a really complicated molecule and I say it’s complicated, you could say, “Hey, that’s really complicated.” But is it just really random?
Lex Fridman
(00:37:12)
Mm-hmm.
Lee Cronin
(00:37:14)
So I realized that ultimate randomness and ultimate complexity are indistinguishable until you can see a structure in the randomness, so you can see copies.
Lex Fridman
(00:37:26)
So copies implies structure.
Lee Cronin
(00:37:31)
Yeah. The factory-
Lex Fridman
(00:37:34)
I mean, there’s a deep profound thing in there. Because if you just have a random process, you’re going to get a lot of complex, beautiful, sophisticated things.
Lee Cronin
(00:37:46)
Mm-hmm.
Lex Fridman
(00:37:47)
What makes them complex in the way we think life is complex? Or, yeah, for something like a factory that’s operating under a selection process, there should be copies. Is there some looseness about copies? What does it mean for two objects to be equal?
Lee Cronin
(00:38:06)
It’s all to do with the telescope or the microscope you’re using. And so, at the maximum resolution… The nice thing about chemists is they have this concept of the molecule and they’re all familiar with it. And molecules, you can hold in your hand lots of them, identical copies. A molecule is actually a super important thing in chemistry: to say, look, you can have a mole of molecules, an Avogadro’s number of molecules, and they’re identical. What does that mean? It means that the molecular composition, the bonding and so on, the configuration is indistinguishable. You can put them together. You can overlay them.

(00:38:43)
So the way I do it is, if I say, “Here’s a bag of 10 identical molecules, let’s prove they’re identical,” you pick one out of the bag and you observe it using some technique, then you take it away and you take another one out. If you observe it using the same technique and you see no differences, they’re identical. It’s really interesting to get right, because if you take, say, two molecules, molecules can be in different vibrational and rotational states. They’re moving all the time.

(00:39:09)
So in this respect, identical molecules have identical bonding. In this case, we don’t even talk about chirality, because we don’t have a chirality detector. So in that conception, assembly theory basically considers both hands as being the same. But, of course, they’re not, they’re different. As soon as you have a chiral distinguisher to detect the left and the right hand, they become different. So it’s to do with the detection system that you have and its resolution.
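The dependence on the detection system can be made concrete. In this hypothetical sketch (all names and observables invented), a molecule is a set of observables, and two molecules are "identical" only relative to the observables an instrument reports; adding a chirality detector splits a pair that a coarser instrument calls the same:

```python
# Identity is relative to the detector: compare only the observables the
# instrument can actually report. Adding a chirality detector splits a pair
# that a coarser instrument calls identical. Names are illustrative.

def indistinguishable(mol_a, mol_b, observables):
    return all(mol_a.get(k) == mol_b.get(k) for k in observables)

left_hand = {"formula": "C4H8O2", "bonds": "ester", "chirality": "L"}
right_hand = {"formula": "C4H8O2", "bonds": "ester", "chirality": "R"}

coarse = ["formula", "bonds"]               # no chiral distinguisher
fine = ["formula", "bonds", "chirality"]    # chiral detector added

print(indistinguishable(left_hand, right_hand, coarse))  # True: same molecule
print(indistinguishable(left_hand, right_hand, fine))    # False: enantiomers
```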
Lex Fridman
(00:39:39)
So I wonder if there’s an art and science to which detection system is used when you show up to a new planet.
Lee Cronin
(00:39:49)
Yeah. Yeah, yeah.
Lex Fridman
(00:39:49)
So you’re talking about chemistry a lot today. We have standardized detection systems of how to compare molecules. So when you start to talk about emojis and language and mathematical theorems and, I don’t know, more sophisticated things at different scale, at a smaller scale than molecules, at a larger scale than molecules, what detection… If we look at the difference between you and me, Lex and Lee, are we the same? Are we different?
Lee Cronin
(00:40:24)
Sure. I mean, of course we’re different close up, but if you zoom out a little bit, we will morphologically look the same. Height, hair characteristics, hair length, stuff like that.
Lex Fridman
(00:40:35)
Well, also, the species and-
Lee Cronin
(00:40:37)
Yeah, yeah, yeah.
Lex Fridman
(00:40:38)
… and also there’s a sense why we’re both from earth.
Lee Cronin
(00:40:42)
Yeah, I agree. I mean, this is the power of assembly theory in that regard. The way to look at it: if you have a box of objects and they’re all indistinguishable using your technique, what you then do is look at the assembly index. Now, if the assembly index of them is really low and they’re all indistinguishable, that’s telling you that you have to go to another resolution. So it’s a sliding scale. It’s nice.
Lex Fridman
(00:41:15)
Got it. So those two are in tension with each other.
Lee Cronin
(00:41:18)
Yeah.
Lex Fridman
(00:41:18)
The number of copies and the assembly index.
Lee Cronin
(00:41:20)
Yeah.
Lex Fridman
(00:41:22)
That’s really, really interesting. So, okay. So you show up to a new planet, you’ll be doing what?
Lee Cronin
(00:41:28)
I would do mass spec. I would bring-
Lex Fridman
(00:41:30)
On a sample of what? First of all, how big of a scoop do you take? Do you just take a scoop? What… So we’re looking for primitive life.
Lee Cronin
(00:41:42)
I would look… Yeah, so if you’re going to Mars or Titan or Enceladus, or somewhere, there are a number of ways of doing it. You could take a large scoop, or you go through the atmosphere and detect stuff. You could make a life meter, right? One of Sarah’s colleagues at ASU, Paul Davies, keeps calling it a life meter, which is quite a nice idea. Because if you think about it, if you’ve got a living system that’s producing these highly complex molecules and they drift away into a highly demanding environment, they could be burnt, right? They could just be falling apart. So you want to sniff a little bit of complexity and say warmer, warmer, warmer. Oh, we’ve found life, we’ve found the alien. We’ve found the alien Elon Musk, smoking a joint in the bottom of a cave on Mars, or Elon himself, whatever, right?
Lex Fridman
(00:41:42)
Yeah. Mm-hmm.
Lee Cronin
(00:42:34)
You say, “Okay, found it.” So with a mass spectrometer, you could just look for things in the gas phase, or you go on the surface and drill down, because you want to find the molecules… Well, you’ve either got to find the source, the living system, because the problem with just looking for complexity is it gets burnt away. In a harsh environment, say on the surface of Mars, there’s a very low probability that you’re going to find really complex molecules, because of all the radiation and so on.

(00:43:05)
If you drill down a little bit, into soil that’s billions of years old, then I would add some solvent, water, alcohol, or something, or take a scoop, make it volatile, put it into the mass spectrometer and just try and detect high-complexity, high-abundance molecules. And if you get them, hey presto, you have evidence of life. Wouldn’t it then be great if you could say, “Okay, we’ve found evidence of life, now we want to keep the life meter going, keep searching for more and more complexity,” until you actually find living cells. And if you get those living cells, then you could bring them back to earth, or you could try and sequence them. You might see that they have different DNA and proteins.
Lex Fridman
(00:43:48)
Go along the gradient of the life meter.
Lee Cronin
(00:43:50)
Exactly.
Lex Fridman
(00:43:50)
How would you build a life meter? Let’s say we’re together, starting new-
Lee Cronin
(00:43:50)
Just a mass spectrometer.
Lex Fridman
(00:43:56)
… new company, launching a life-
Lee Cronin
(00:43:56)
Mass spectrometer would be the first way of doing it. Just take-
Lex Fridman
(00:43:59)
No, no, no, but that’s one of the major components of it. But I’m talking about-
Lee Cronin
(00:44:03)
I would-
Lex Fridman
(00:44:04)
… if it’s a device and branding, logo we got to talk about-
Lee Cronin
(00:44:04)
All right.
Lex Fridman
(00:44:08)
… that’s later. But what’s the input and what’s the… How do you get to the metered output?
Lee Cronin
(00:44:15)
So I would take a… So my life meter, our life meter. There you go.
Lex Fridman
(00:44:21)
Oh, thank you.
Lee Cronin
(00:44:21)
Yeah, you’re welcome. It would have both infrared and mass spec. It would have two ports, so it could shine a light. So what it would do: you would have a vacuum chamber, you would have an electrostatic analyzer, and you’d have a monochromator producing infrared. You’d take a scoop of the sample, put it in the life meter, and it would then add a solvent or heat up the sample so some volatiles come off. The volatiles would then be put into the mass… into the electrostatic trap, and you’d weigh the molecules and fragment them. Alternatively, you’d shine infrared light on them and count the number of bands. But in that case, you’d have to do some separation, because you want to separate… And in mass spec, it’s really nice and convenient, because you can separate electrostatically, but you need to have that.
Lex Fridman
(00:45:12)
Can you do it in real time?
Lee Cronin
(00:45:13)
Yeah, pretty much. Pretty much, yeah. So let’s go all the way back. Okay, we’re really going to get this-
Lex Fridman
(00:45:13)
Let’s go.
Lee Cronin
(00:45:18)
… Lex’s life… Lex and Lee’s life meter.
Lex Fridman
(00:45:20)
No, I like Lex and Lee. It’s got a good ring to it.
Lee Cronin
(00:45:25)
All right. So you have a vacuum chamber, and you have a little nose. The nose would have some packing material. So you would take your sample, add it onto the nose, add a solvent or a gas, and it would be sucked up the nose and separated, using what we call chromatography. And then as each band comes off the nose, we’d do mass spec and infrared. In the case of the infrared, count the number of bands; in the case of mass spec, count the number of fragments and weigh it.

(00:45:56)
And then the further up you go in molecular weight range in the mass spec, and in the number of bands, you go up and up and up from the dead, to interesting, interesting, over the threshold, oh my gosh, earth life, and then right up to the batshit crazy, this is definitely alien intelligence that’s made this, right? You could almost go all the way there. Same in the infrared. And it’s pretty simple.

(00:46:18)
The thing that is really problematic is that for many years, decades, what people have done, and I can’t blame them, is obsess about small biomarkers that we find on earth, amino acids, like single amino acids, or evidence of small molecules, and look for those, while I’m looking for complexity. The beautiful thing about this is you can look for complexity without earth chemistry bias or earth biology bias. So assembly theory is just a way of saying, hey, complexity in abundance is evidence of selection. That’s how our universal life meter will work.
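The "dead … interesting … earth life … alien" gradient described a moment ago can be sketched as a simple readout function. The cutoffs and the copy-number floor below are invented for illustration; the only real content is the shape of the rule, complexity in abundance:

```python
# Toy readout for the life-meter gradient: as the assembly index of molecules
# found in abundance rises, the meter moves from "dead" through "interesting"
# to "life" and beyond. All cutoffs are illustrative, not calibrated values.

def life_meter(assembly_index, copy_number, min_copies=1000):
    if copy_number < min_copies:
        return "no signal"       # one-offs carry no evidence of selection
    if assembly_index < 10:
        return "dead"
    if assembly_index < 15:
        return "interesting"
    if assembly_index < 30:
        return "life"
    return "technology?"         # the batshit-crazy end of the scale

print(life_meter(7, 50_000))    # dead
print(life_meter(20, 50_000))   # life
print(life_meter(20, 10))       # no signal: too few copies
```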
Lex Fridman
(00:46:59)
Complexity in abundance is evidence of selection. Okay. So let’s apply our life meter to earth. If we were just to apply assembly index measurements to earth, what kind of stuff are we going to get? What’s impressive about-
Lee Cronin
(00:46:59)
So-
Lex Fridman
(00:47:26)
… some of the complexity on earth?
Lee Cronin
(00:47:28)
… So we did this a few years ago, when I was trying to convince NASA and colleagues that this technique could work. And honestly, it’s so funny, because everyone was like, “No, it ain’t going to work.” The chemists were saying, “Of course there are complicated molecules out there you can detect that just form randomly.” And I was like, “Really?” It’s a bit like, I don’t know, someone saying, “Of course Darwin’s textbook was just written randomly by some monkeys on a typewriter.” For me, it was like, “Really?” And I’ve pushed a lot on the chemists now, and I think most of them are on board, but not totally. I really had some big arguments, but the copy number argument helped there. Because I think I confused the chemists by talking about one-offs. And then when I made the copy number clear, I think that made it a little bit easier.
Lex Fridman
(00:48:16)
Just to clarify, a chemist might say that, of course out there, outside of earth there’s complex molecules?
Lee Cronin
(00:48:24)
Yes.
Lex Fridman
(00:48:24)
Okay. And then you’re saying, “Wait a minute, that’s like saying, ‘Of course there’s aliens out there.'” Like you-
Lee Cronin
(00:48:31)
Yeah, exactly that.
Lex Fridman
(00:48:32)
Okay.
Lee Cronin
(00:48:32)
Exactly.
Lex Fridman
(00:48:34)
When you clarify that, it’s actually a very interesting question: we should be looking for complex molecules of which the copy number is two or greater.
Lee Cronin
(00:48:45)
Yeah, exactly. So coming back to earth, what we did is we took a whole bunch of samples, and we were running prebiotic chemistry experiments in the lab. We took various inorganic minerals and extracted them, looked at the volatiles. There’s a special way of treating minerals and polymers in assembly theory. In our life machine, we’re looking at molecules. We don’t care about polymers, because they’re not volatile. You can’t hold them. If you can’t discern that they’re identical, then it’s very difficult to work out whether they’ve undergone selection or they’re just a random mess.

(00:49:26)
Same with some minerals, but we can come back to that. So basically, we got a whole load of samples, inorganic ones, and we got Scotch whiskey, and also got-
Lex Fridman
(00:49:36)
Nice.
Lee Cronin
(00:49:36)
… took Ardbeg, which is one of my favorite whiskeys, which is very peaty. And another-
Lex Fridman
(00:49:41)
What’s peaty mean?
Lee Cronin
(00:49:42)
It is like… So in Scotland, on Islay, which is a little island, the whiskey is left to mature in barrels. It’s said that the peat, the complex molecules in the peat, find their way through into the whiskey, and that’s what gives it this intense brown color and really complex flavor. It’s literally molecular complexity that does that. And vodka’s the complete opposite. It’s just pure, right?-
Lex Fridman
(00:50:16)
So the better the whiskey, the higher the assembly index, the higher the assembly index, the better the whiskey.
Lee Cronin
(00:50:20)
I mean, I really love deep, peaty Scottish whiskeys. Near my house, there is one of the lowland distilleries, called Glengoyne. It’s still a beautiful whiskey but not as complex. So for fun, I took some Glengoyne whiskey and Ardbeg and put them into the mass spec and measured the assembly index. I also got E. coli. So the way we do it: take the E. coli, break the cells apart, take it all apart. And also got some beer. And people were ridiculing us, saying, “Oh, beer is evidence of complexity.”

(00:50:53)
And one of the computational complexity people, who is very vigorous in his disagreement with assembly theory, was just saying, “You don’t know what you’re doing. Even beer is more complicated than a human.” What he didn’t realize is that it’s not beer per se; it’s taking the yeast, breaking the cells, extracting the molecules, and just looking at the profile of the molecules to see if there’s anything over the threshold. And we also put in a really complex molecule, Taxol.

(00:51:24)
So we took all of these, but NASA also gave us, I think, five samples, and they wouldn’t tell us what they were. They said, “No, we don’t believe you’re going to get this to work.” And they really gave us some super complex samples. They gave us two fossils, one that was a million years old and one that was 10,000 years old, something from an Antarctic seabed. They gave us some Murchison meteorite, and a few others. We put them through the system: took all the samples, treated them all identically, put them into the mass spec, fragmented them, counted.

(00:51:56)
And in this case, implicit in the measurement was that, in mass spec, you only detect peaks when you’ve got more than, let’s say, 10,000 identical molecules. So the copy number was already baked in, but wasn’t quantified, which is super important there. This was in the first paper. Because I was like, it’s abundant, of course.

(00:52:17)
And when you then took it all out, we found that the biological samples gave you molecules that had an assembly index greater than 15. And all the abiotic samples were less than 15. And then we took the NASA samples and we looked at the ones that were more than 15, less than 15, and we gave them back to NASA, and they’re like, “Oh, gosh. Yep, dead, living, dead, living. You got it.” And that’s what we found on earth.
Lex Fridman
(00:52:44)
That’s a success.
Lee Cronin
(00:52:45)
Yeah. Oh yeah, resounding success.
Lex Fridman
(00:52:48)
Can you just go back to the beer and the E. coli? So what’s the assembly index on those?
Lee Cronin
(00:52:54)
So what we were able to do is… We found high assembly index molecules originating from the beer sample and the E. coli sample.
Lex Fridman
(00:53:08)
Yeast in the beer.

Evolution of life on Earth

Lee Cronin
(00:53:10)
I don’t know which one was higher. We didn’t really go into detail there. But now we are doing that. Because one of the things we’ve done, it’s a secret, but I can tell you. I think it’s-
Lex Fridman
(00:53:23)
Nobody’s listening.
Lee Cronin
(00:53:25)
… well, is that we’ve just mapped the tree of life using assembly theory, because everyone said, ” Oh, you can’t do anything from biology.” And what we’re able to do is, so I think there’s three, well, two ways of doing tree of life… Well, three ways actually.
Lex Fridman
(00:53:38)
What’s the tree of life?
Lee Cronin
(00:53:39)
So the tree of life is basically tracing back the history of life on earth, all the different species, going back to who evolved from what. And it all goes back to the first life forms, and they branch off. And you have the plant kingdom, the animal kingdom, the fungi kingdom, and different branches all the way up. And the way this was classically done… and I’m no evolutionary biologist, the evolutionary biologists tell me that every day, at least 10 times… I want to be one though. I like biology, it’s cool.
Lex Fridman
(00:54:12)
Yeah, it’s very cool.
Lee Cronin
(00:54:13)
But basically-
Lex Fridman
(00:54:14)
Evolutionary.
Lee Cronin
(00:54:16)
… What Darwin and Mendeleev and all these people did is just draw pictures and [inaudible 00:54:20] taxa. They were able to draw pictures and say, “Oh, these look like common classes.”
Lex Fridman
(00:54:26)
Yeah.
Lee Cronin
(00:54:26)
Then…
Lex Fridman
(00:54:29)
They’re artists really. They’re just…
Lee Cronin
(00:54:32)
They were able to find out a lot, right? Looking at vertebrates and invertebrates, the Cambrian explosion and all this stuff. And then came the genomic revolution, and suddenly everyone used gene sequencing. Craig Venter’s a good example. I think he’s gone around the world in his yacht, just picking up samples, looking for new species, and he’s found new species of life just from sequencing. It’s amazing. So you have taxonomy, you have sequencing, and then you can also do a little bit of molecular archeology: measure the samples and form some inference.

(00:55:08)
What we did is we were able to fingerprint… We took a load of random samples from all of biology and we used mass spectrometry. And what we did now is not just look for individual molecules, but look for coexisting molecules, where we had to look at their joint assembly space. We were able to cut them apart, undergo recursion in the mass spec, and infer some relationships. And we were able to recapitulate the tree of life using mass spectrometry, no sequencing and no drawing.
Lex Fridman
(00:55:41)
All right. Can you try to say that again, with a little more detail? So recreating, what does it take to recreate the tree of life? What does the reverse engineering process look like here?
Lee Cronin
(00:55:52)
So what you do is you take an unknown sample, you bung it into the mass spec, you get… Because this comes from what you’re asking, what do you see in E. coli?
Lex Fridman
(00:56:00)
Mm-hmm.
Lee Cronin
(00:56:00)
And so, in E. coli, it’s not that the most sophisticated cells on earth make the most sophisticated molecules. It is the coexistence of lots of complex molecules above a threshold. And so, what we realized is you could fingerprint different life forms. Fungi make really complicated molecules. Why? Because they can’t move. They have to make everything onsite.

(00:56:24)
Whereas some animals are lazy; they can just go eat the fungi, and they don’t need to make very much. And so, what you do is you take the fingerprint, maybe the top number of high-molecular-weight molecules you find in the sample, you fragment them to get their assembly indices, and then you can infer common origins of molecules. By reverse engineering the assembly space, you can infer common roots and look at what’s called the joint assembly space.

(00:57:02)
But let’s translate that into the experiment. Take a sample, bung it in the mass spec, take the top, say, 10 molecules, fragment them, and that gives you one fingerprint. Then you do it for another sample, and you get another fingerprint. Now the question is: are these samples the same or different? And that’s what we’ve been able to do, basically by looking at the assembly space that these molecules create. Without any knowledge of assembly theory, you’re unable to do it. With a knowledge of assembly theory, you can reconstruct the tree.
Lex Fridman
(00:57:35)
How does knowing if they’re the same or different give you the tree?
Lee Cronin
(00:57:38)
Let’s go to two leaves on different branches of the tree, right? By counting the number of differences, you can estimate how far away their origin was.
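A toy version of the fingerprint comparison: each sample is reduced to a set of (molecule, assembly index) pairs for its top molecules, and the fraction of molecules not shared between two fingerprints estimates how far back their common origin lies. The sample data are invented for illustration:

```python
# Sketch of the fingerprint comparison: each sample becomes a set of
# (molecule, assembly index) pairs; counting the differences between two
# fingerprints estimates distance on the tree. Data are invented.

def fingerprint_distance(fp_a, fp_b):
    # fraction of molecules present in one sample but not the other
    return len(fp_a ^ fp_b) / len(fp_a | fp_b)

fungus = {("m1", 21), ("m2", 18), ("m3", 16), ("m4", 25)}
yeast  = {("m1", 21), ("m2", 18), ("m5", 17), ("m6", 19)}
mammal = {("m7", 16), ("m8", 15), ("m2", 18), ("m9", 20)}

pairs = {"fungus-yeast": fingerprint_distance(fungus, yeast),
         "fungus-mammal": fingerprint_distance(fungus, mammal)}
print(pairs)
# the two fungi-like samples share more molecules, so they sit closer on the tree
assert pairs["fungus-yeast"] < pairs["fungus-mammal"]
```

Feeding a full matrix of such distances into any standard hierarchical clustering routine would yield a tree, which is the spirit of the reconstruction described here.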
Lex Fridman
(00:57:48)
Got it.
Lee Cronin
(00:57:49)
And that’s what we do, and it just works. But when we realized you could even use assembly theory to recapitulate the tree of life with no gene sequencing, we were like, “Huh.”
Lex Fridman
(00:57:58)
So this is looking at samples that exist today in the world.
Lee Cronin
(00:58:01)
Yeah.
Lex Fridman
(00:58:01)
What about things that no longer exist? I mean, the tree contains information about the past-
Lee Cronin
(00:58:08)
I would-
Lex Fridman
(00:58:09)
… some of it is gone.
Lee Cronin
(00:58:11)
Yeah, absolutely. I would love to get old fossil samples and apply assembly theory and mass spec, and see if we can find new forms of life that are no longer amenable to gene sequencing, because the DNA is all gone. DNA and RNA are quite unstable, but some of the more complex molecules might still be there. They might give you a hint of something new. Or wouldn’t it be great if you-
Lex Fridman
(00:58:11)
I understand.
Lee Cronin
(00:58:33)
… if you find a sample that’s worth really persevering with and doing the proper extraction, PCR and so on, and then sequence it, and then put it together-
Lex Fridman
(00:58:46)
So when a thing dies, you can still get some information about its complexity.
Lee Cronin
(00:58:50)
Yeah. And it appears that you can do some dating. Now, there are really good techniques. There’s radiocarbon dating; there’s longer-timescale dating, looking at radioactive minerals and so on. And in bone, you can look at what happens after something dies: you get what’s called racemization, where the chirality in the polymers basically changes and you get decomposition, and the deviation from the pure enantiomer to the mixture gives you a timescale, a half-life, so you can date when it died. I want to use assembly theory to see if I can use it to date death and things, and trace the tree of life, and also the decomposition of molecules.
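The racemization idea can be sketched with the standard first-order expression for amino acid racemization dating: the measured D/L ratio gives an elapsed time since death, for a known rate constant. The rate constant below is hypothetical, since the real one depends strongly on temperature and on which amino acid is measured:

```python
# Hedged sketch of racemization dating: after death, the L-form of amino
# acids interconverts to the D-form with first-order kinetics, so the
# measured D/L ratio gives an elapsed time. The rate constant k here is
# invented for illustration; real values are temperature-dependent.
import math

def racemization_age(d_over_l, k=1e-5):  # k in 1/years, hypothetical
    # standard first-order expression, assuming D/L = 0 at death
    return math.log((1 + d_over_l) / (1 - d_over_l)) / (2 * k)

print(round(racemization_age(0.2)))  # a relatively fresh sample
print(round(racemization_age(0.8)))  # closer to full racemization: much older
```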
Lex Fridman
(00:59:45)
Do you think it’s possible?
Lee Cronin
(00:59:46)
Oh yeah, without a doubt. It may not be better than what… I was just at a conference where there are some brilliant people looking at isotope enrichment, looking at how life enriches isotopes, and it’s really sophisticated stuff that they’re doing. But I think there’s some fun to be had there, because it gives you another dimension of dating: how old is this molecule, or more importantly, how long ago was this molecule produced by life? The more complex the molecule, the more prospect for decomposition, oxidation, reorganization, loss of chirality, and all that jazz.

(01:00:21)
But what life also does is it enriches. As you get older, the amount of carbon-13 in you goes up, because of the way the bonding is in carbon-13. It has a slightly different bond strength than carbon-12. It’s called a kinetic isotope effect. So you can literally date how old you are, or when you stopped metabolizing. So you could date someone’s… how old they are, I think. I’m making this up, this might be right, but I think it’s roughly right. From the amount of carbon-13 you have in you, you can estimate how old you are.
Lex Fridman
(01:00:56)
How old living humans are, or any living organism?
Lee Cronin
(01:01:00)
Yeah, yeah. You could say, “Oh, this person is 10 years old and this person is 30 years old, because they’ve been metabolizing more carbon and they’ve accumulated it.” That’s the basic idea. It’s probably completely wrong on the timescale-

Response to criticism

Lex Fridman
(01:01:10)
Signatures of chemistry are fascinating. You’ve been giving a lot of chemistry examples for assembly theory. What if we zoom out and look at a bigger scale of object, really complex objects, like humans or living organisms that are made up of millions or billions of other organisms? How do you try to apply assembly theory to that?
Lee Cronin
(01:01:38)
At the moment, we should be able to apply this to morphology in cells. So we’re looking at cell surfaces, and really, I’m trying to extend it further. It’s just that we worked so hard to get this paper out and get people to start discussing the ideas. But it’s kind of funny, because I think the penny is falling on this. So yeah-
Lex Fridman
(01:02:03)
What does that even… What’s it mean for a penny to be-
Lee Cronin
(01:02:06)
I mean, no, the penny’s dropped, right? A lot of people were like, “It’s rubbish, it’s rubbish. You’ve insulted me. It’s wrong.” I mean, the paper got published on the 4th of October. It had 2.3 million engagements on Twitter and it’s been downloaded a few hundred thousand times. And someone actually wrote to me and said, “This is an example of really bad writing and what not to do.” And I was like, if all of my papers got read this much… because that’s the objective. If I’m publishing a paper, I want people to read it. I want to write that badly again.
Lex Fridman
(01:02:37)
Yeah. I don’t know what the deep insight here is about the negativity in the space. I think it’s probably the immune system of the scientific community, making sure that there’s no bullshit that gets published, because bullshit can do a lot of damage. It can shut down conversations in a way that’s not productive.
Lee Cronin
(01:02:54)
And I’ll go back, I mean, I’ll answer your question about the hierarchy in assembly, but let’s go back to the perception, people saying the paper was badly written. I mean, of course we could improve it. We could always improve the clarity.
Lex Fridman
(01:03:04)
Let’s go there before we go to the hierarchy.
Lee Cronin
(01:03:08)
Yeah.
Lex Fridman
(01:03:08)
The paper has been criticized quite a bit. What has been some of the criticism that you’ve found most powerful, that you can understand, and can you explain it?
Lee Cronin
(01:03:23)
Yes. The most exciting criticism came from the evolutionary biologists telling me that they thought the origin of life was a solved problem. And I was like, “Whoa, we’re really onto something, because it’s clearly not.” And when you poked them on that, they just said, “No. You don’t understand evolution.” And I said, “No, no, I don’t think you understand that evolution had to occur before biology, and there’s a gap.” For me, that misunderstanding did cause an immune response, which was really interesting.

(01:03:56)
The second thing was the physicists. The physicists were actually really polite, really nice about it, but they just said, “Huh, we’re not really sure about the initial conditions thing.” But this is a really big debate that we should certainly get into, because the emergence of life was not encoded in the initial conditions of the universe, and I think assembly theory shows why it can’t be. I’ll say that-
Lex Fridman
(01:04:23)
Okay. Sure. If you could say that again.
Lee Cronin
(01:04:27)
The origin of, the emergence of life was not and cannot, in principle, be encoded in the initial conditions of the universe.
Lex Fridman
(01:04:35)
Just to clarify what you mean by life is what, high assembly index objects?
Lee Cronin
(01:04:39)
Yeah. And this goes back to your favorite subject.
Lex Fridman
(01:04:43)
What’s that?
Lee Cronin
(01:04:43)
Time.
Lex Fridman
(01:04:47)
Right. So why? What does time have to do with it?
Lee Cronin
(01:04:50)
I mean, probably we can come back to it later, but I think it might be, if we have time.
Lex Fridman
(01:04:56)
Yeah.
Lee Cronin
(01:04:56)
But I think that… I think I now understand how to explain how… Lots of people got angry with the assembly paper, but also, the ramification of this is how time is fundamental in the universe, and this notion of combinatorial spaces. And there are so many layers to this, but I think you have to become an intuitionist mathematician and you have to abandon Platonic mathematics. Platonic mathematics has led physics astray, but there’s a lot to unpack there. So we can go to the-
Lex Fridman
(01:05:34)
Platonic mathematics, okay. So the evolutionary biologists criticized it because, to them, the origin of life is understood and doesn’t require an explanation that involves physics.
Lee Cronin
(01:05:51)
Yeah. It-
Lex Fridman
(01:05:51)
That’s their statement.
Lee Cronin
(01:05:54)
Well, I mean, they said lots of confusing statements. Basically, I realized the evolutionary biology community that were vocal, some of them were really rude, really spiteful, and needlessly so, right? Because look, people misunderstand publication as well. Some of the people have said, “How dare this be published in Nature. What a terrible journal.” And I said to people, “Look, this is a brand new idea that’s not only potentially going to change the way we look at biology, it’s going to change the way we look at the universe.”

(01:06:36)
And everyone’s saying, “How dare you be so grandiose?” I’m like, “No, no, no. This is not hype. We’re not saying we’ve invented some… I don’t know, discovered an alien in a closet somewhere, just for hype. We genuinely mean this to have that impact, or at least to ask the question.” And the way people jumped on that was a really bad precedent for young people who want to actually do something new.

(01:07:02)
Because this makes a bold claim, and the chances are that it’s not correct. But what I wanted to do is a couple of things. I wanted to make a bold claim that was precise and testable and correctable. Not another woolly information-in-biology argument, information-churning machine, blah, blah, blah. A concrete series of statements that can be falsified and explored, and either the theory could be destroyed or built upon.
Lex Fridman
(01:07:32)
Well, what about the criticism of you’re just putting a bunch of sexy names on something that’s already obvious?
Lee Cronin
(01:07:42)
Yeah, that’s really good. So the assembly index of a molecule is not obvious. No one had measured it before. And no one had thought to quantify selection, complexity, and copy number before in such a primitive, quantifiable way. I think the nice thing about this paper-
Lee Cronin
(01:08:00)
… quantifiable way. I think the nice thing about this paper is that it’s a tribute to all the people that understand that biology does something very interesting. Some people call it negentropy; some people think about organizational principles. Lots of people were not shocked by the paper because they’d done it before. A lot of the arguments we got, some people said, “Oh, it’s rubbish. Oh, by the way, I had this idea 20 years before.” I was like, “Which one? Is it the rubbish part or the really revolutionary part?”

(01:08:35)
So this kind of plucked two strings at once. It plucked the string that there is something interesting that biology does, that we can see all around us, but that we haven’t quantified yet. And this is the first stab at quantifying that. So people said, “This is obvious.” But if it’s obvious, why have you not done it?
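To make the “not obvious” point concrete for readers: here is a toy sketch of an assembly index for strings, in the spirit of the idea Lee describes. The real measure is defined over molecular bond graphs; the function name, the string model, and the exhaustive search below are my own illustration, not the paper’s algorithm.

```python
from itertools import product

def assembly_index(target: str) -> int:
    """Toy assembly index for a string: the fewest join operations
    needed to build `target` from its single characters, where any
    previously built object can be reused for free.
    Exhaustive iterative-deepening search -- only viable for short strings."""
    if len(target) <= 1:
        return 0
    basics = frozenset(target)  # single characters cost nothing

    def reachable(pool, steps_left):
        if target in pool:
            return True
        if steps_left == 0:
            return False
        for a, b in product(pool, repeat=2):
            joined = a + b
            # only joins that appear inside the target can lie on a minimal path
            if joined in target and joined not in pool:
                if reachable(pool | {joined}, steps_left - 1):
                    return True
        return False

    depth = 1
    while not reachable(basics, depth):
        depth += 1
    return depth

# "ABAB" needs only 2 joins: A+B -> "AB", then "AB"+"AB" -> "ABAB".
# Reuse of the already-built "AB" is what keeps the index low.
```

The point of the toy is that the index rewards reuse of substructure: a repetitive object is “shallow” even if it is long, which is the intuition behind applying the measure to molecules.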
Lex Fridman
(01:08:58)
Sure. But there’s a few things to say there. One is, this is in part a philosophical framework, because it’s not like you can apply this generally to any object in the universe. It’s very chemistry focused.
Lee Cronin
(01:09:15)
Yeah, well, I think you will be able to, we just haven’t got there robustly. So let’s go up a level. Let’s go up from molecules to cells, because you would jump to people and I’d jump to emoticons, and both are good, and they will be assembly…
Lex Fridman
(01:09:30)
Let’s stick with cells, yeah. Good point.
Lee Cronin
(01:09:34)
If we go from molecules to assemblies, let’s take a cellular assembly. A nice thing about a cell is you can tell the difference between a eukaryote and a prokaryote, right? The organelles are specialized differently. Then look at the cell surface: the cell surface has different glycosylation patterns, and these cells will stick together. Now let’s go up a level: in multicellular creatures, you have cellular differentiation.

(01:09:57)
Now, if you think about how embryos develop, you go all the way back: those cells undergo differentiation in a causal way that’s a feedback between the genetics and the biomechanics. I think we can use assembly theory to apply to tissue types. We can even apply it to different cell disease types. So that’s what we’re doing next. But we are trying to walk… The thing is, I want to leap ahead and go, whoa, we can apply it to culture. Clearly you can apply it to memes and culture. And we’ve also applied assembly theory to CAs, and not as you think…
Lex Fridman
(01:09:57)
Cellular automaton, by the way.
Lee Cronin
(01:10:34)
Yeah, yeah. Cellular automata, not just as you think. Different CA rules were invented by different people at different times. And one of my coworkers, a very talented chap, basically was like, “Oh, I realize that different people had different ideas with different rules, and they copied each other and made slightly different cellular automaton rules,” and looked at them online. And so he was able to infer an assembly index and copy number for rule whatever, doing this thing. But I digress.

(01:11:04)
But it does show you can apply it at a higher scale. So what do we need to do to apply assembly theory to things? We need to agree there’s a common set of building blocks. So in a cell, well, in a multicellular creature, you need to look back in time. There is the initial cell from which the creature is fertilized, and then it starts to grow, and then there is cell differentiation. And you have to then make that causal chain on both of those. So that requires development of the organism in time. Or if you look at the cell surfaces and the cell types, they’ve got different features on the cell walls and inside the cell. So we’re building up, but obviously I want to leap to things like emoticons, language, mathematical theorems.
Lex Fridman
(01:11:54)
But that’s a very large number of steps to get from a molecule to the human brain.
Lee Cronin
(01:12:01)
Yeah, and I think they are related, but in hierarchies of emergence. So you shouldn’t compare them. I mean the assembly index of a human brain, what does that even mean? Well, maybe we can look at the morphology of the human brain, say all human brains have these number of features in common. If they have those number… And then let’s look at a brain in a whale or a dolphin or a chimpanzee or a bird and say, “Okay, let’s look at the assembly indices and number of features in these.” And now the copy number is just the number of how many birds are there, how many chimpanzees are there, how many humans are there?
Lex Fridman
(01:12:35)
But then you have to discover, for that, the features that you would be looking for.
Lee Cronin
(01:12:39)
Yeah, and that means you need to have some idea of the anatomy.
Lex Fridman
(01:12:43)
But is there an automated way to discover features?
Lee Cronin
(01:12:46)
I guess so. And I think this is a good way to apply machine learning and image recognition just to basically characterize things.
Lex Fridman
(01:12:55)
To apply compression to it, to see what emerges, and then use the features that are part of the compression as the measurement… As the thing that is searched for when you’re measuring assembly index and copy number.
Lee Cronin
(01:13:09)
And the compression has to be… Remember the assembly universe, which is: you have to go from assembly possible to assembly contingent, and that jump from… Because assembly possible is all possible brains, all possible features, all the time. But we know that on the tree of life, and also on the lineage of life going back to LUCA, the human brain just didn’t spring into existence yesterday; it’s a long lineage of brains going all the way back. And so if we could do assembly theory to understand the development, not just in evolutionary history, but in biological development as you grow, we are going to learn something more.
Lex Fridman
(01:13:45)
What would be amazing is if you can use assembly theory, this framework, to show the increase in the assembly index associated with, I don’t know, cultures or pieces of text like language or images and so on, and illustrate without knowing the data ahead of time, just kind of like you did with NASA, that you were able to demonstrate that it applies in those other contexts. I mean, it probably wouldn’t at first, and you’d have to evolve the theory somehow. You’d have to change it, you’d have to expand it.
Lee Cronin
(01:14:21)
I think so.
Lex Fridman
(01:14:24)
I guess this is, as a paper, a first step in saying, okay, “Can we create a general framework for measuring the complexity of objects? For measuring life, the complexity of living organisms?”
Lee Cronin
(01:14:39)
Yeah.
Lex Fridman
(01:14:39)
That’s what this is reaching for.
Lee Cronin
(01:14:41)
That is the first step. And also to say, look, we have a way of quantifying selection and evolution in a fairly, not mundane, but a fairly mechanical way, because before now the ground truth for it was very subjective. Whereas here we’re talking about clean observables, and there are going to be layers on that. I mean, with collaborators right now, we already think we can do assembly theory on language. And not only that, wouldn’t it be great if we can figure out how, under pressure, language is going to evolve and be more efficient? Because you’re going to want to transmit things.

(01:15:20)
And again, it’s not just about compression, it is about understanding how you can make the most of the architecture you’ve already built. And I think this is something beautiful that evolution does. We are reusing those architectures. We can’t just abandon our evolutionary history. And if you don’t want to abandon your evolutionary history and you know that evolution has been happening, then assembly theory works.

(01:15:44)
And I think that’s a key comment I want to make: assembly theory is great for understanding when evolution has been used. The next jump is when we go to technology. Because of course, if you take the M3 processor… I want to buy one, I haven’t yet. I can’t justify it, but I want it at some point. The M3 processor arguably has quite a lot of features, a quite large number. The M2 came before it, then the M1, all the way back. You can apply assembly theory to microprocessor architecture. It doesn’t take a huge leap to see that.
Lex Fridman
(01:16:15)
I’m a Linux guy, by the way. So your examples go way over my head.
Lee Cronin
(01:16:18)
Yeah, well, whatever…
Lex Fridman
(01:16:19)
Is that a fruit company of some sort? I don’t even know. Yeah, there’s a lot of interesting stuff to ask about language. Like you could look at… How would that work? You could look at GPT-1, GPT-2, GPT-3, 3.5, 4, and try to analyze the kind of language it produces. I mean, that’s almost trying to look at the assembly index of intelligent systems.
Lee Cronin
(01:16:45)
Yeah, I mean I think the thing about large language models, and this is a whole hobbyhorse I have at the moment, is that obviously they’re all about… The evidence of evolution in the large language model comes from all the people that produced all the language. And that’s really interesting. And all the corrections in the Mechanical Turk, right?
Lex Fridman
(01:17:15)
Sure. But that’s part of the history, part of the memory of the system.
Lee Cronin
(01:17:20)
Exactly. So it would be really interesting to basically use an assembly-based approach to making language in a hierarchy. My guess is that we might be able to build a new type of large language model that uses assembly theory, so that it has more understanding of the past and how things were created. Basically, the thing with LLMs is, like, everything everywhere all at once, splat, and make the user happy. So there’s not much intelligence in the model. The intelligence is in how the human interacts with the model. But wouldn’t it be great if we could understand how to embed more intelligence in the system?
Lex Fridman
(01:18:03)
What do you mean by intelligence there? You seem to associate intelligence with history or memory?
Lee Cronin
(01:18:11)
Yeah. I think selection produces intelligence.
Lex Fridman
(01:18:16)
You’re almost implying that selection is intelligence. No.
Lee Cronin
(01:18:21)
Kind of. I would go out on a limb and say that, but I think it’s a little bit more: human beings have the ability to abstract, and they can break beyond selection, beyond Darwinian selection, because a human being doesn’t have to basically do trial and error, but they can think about it and say, “Oh, that’s a bad idea, won’t do that.” And then technologies and so on.
Lex Fridman
(01:18:39)
So we escaped Darwinian evolution and now we’re onto some other kind of evolution, I guess? Higher level.
Lee Cronin
(01:18:46)
And assembly theory will measure that as well, right? Because it’s all a lineage.

Kolmogorov complexity

Lex Fridman
(01:18:50)
Okay. Another piece of criticism, or by way of question: how is assembly theory, or maybe assembly index, different from Kolmogorov complexity? So for people who don’t know, the Kolmogorov complexity of an object is the length of the shortest computer program that produces the object as output.
Lee Cronin
(01:19:10)
Yeah, there seems to be a disconnect with the computational approach. So the Kolmogorov measure requires a Turing machine, requires a computer; that’s one thing. And the other thing is, assembly theory is supposed to trace the process by which life and evolution emerged, right? That’s the main thing there. There are lots of other layers.

(01:19:42)
So Kolmogorov complexity, you can approximate it, but it’s not really telling you very much about the actual object… It’s really telling you about your dataset, the compression of your dataset.
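A concrete way to see the “approximation tells you about compression of your dataset” point: the standard computable stand-in for Kolmogorov complexity is just the length of a compressed encoding, which is an upper bound on the true (uncomputable) quantity. A minimal sketch, using only the Python standard library; the function name is my own:

```python
import os
import zlib

def kolmogorov_upper_bound(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable *upper bound*
    on Kolmogorov complexity (the true K is uncomputable)."""
    return len(zlib.compress(data, 9))

regular = b"AB" * 500     # deep regularity: compresses to a handful of bytes
noisy = os.urandom(1000)  # incompressible with overwhelming probability
assert kolmogorov_upper_bound(regular) < kolmogorov_upper_bound(noisy)
```

Note the contrast being drawn in the conversation: both strings get a number, but the number says nothing about any causal chain of construction, which is what assembly theory adds.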
Lex Fridman
(01:20:00)
Sure.
Lee Cronin
(01:20:00)
And so that doesn’t really help you identify… The turtle in this case is the computer. And so what assembly theory does is… I’m going to say, it’s a trigger warning for anyone listening who loves complexity theory: I think we’re going to show that AIT, algorithmic information theory, is a very important subset of assembly theory, because here’s what happens. I think that assembly theory allows us to understand when selection was occurring. Selection produces factories and things, factories in the end produce computers, and then algorithmic information theory comes out of that. The frustration I’ve had with looking at life through this kind of information theory is that it doesn’t take into account causation. So the main difference between assembly theory and all these complexity measures is there’s no causal chain. And I think that’s the main…
Lex Fridman
(01:21:00)
So the causal chain is at the core of assembly theory.
Lee Cronin
(01:21:06)
Exactly. And if you’ve got all your data in a computer memory, all the data’s the same. You can access it in the same way. You don’t care. You just compress it. And you either look at the program runtime or the shortest program. And that, for me, is absolutely not capturing what selection does.
Lex Fridman
(01:21:28)
But assembly theory looks at objects. It doesn’t have information about the object history. It’s going to try to infer that history by looking for the shortest history, right? The object doesn’t have a Wikipedia page that goes with it about its history.
Lee Cronin
(01:21:49)
I would say it does in a way, and it is fascinating to look at. So you’ve just got the object, and you have no other information about the object. What assembly theory allows you to do with just the object, and the word infer is correct, I agree with infer… You say, well, that’s not the history. But something really interesting comes from this.

(01:22:10)
The shortest path is inferred from the object. That is the worst case scenario if you have no machine to make it. So that tells you about the depth of that object in time. And so what assembly theory allows you to do is without considering any other circumstances, to say from this object, how deep is this object in time if we just treat the object as itself without any other constraints? And that’s super powerful because the shortest path then allows you to say, “Oh, this object wasn’t just created randomly. There was a process.” And so assembly theory is not meant to one up AIT or to ignore the factory. It’s just to say, “Hey, there was a factory and how big was that factory? And how deep in time is it?”
Lex Fridman
(01:23:01)
But it’s still computationally very difficult to compute that history, right? For complex objects?
Lee Cronin
(01:23:11)
It is. It becomes harder. But one of the things that’s super nice is that it constrains your initial conditions, right?
Lex Fridman
(01:23:18)
Sure.
Lee Cronin
(01:23:18)
It constrains where you’re going to be. So imagine… One of the things we’re doing right now is applying assembly theory to drug discovery. Now, what everyone’s doing right now is taking all the proteins and looking at the proteins, and looking at the molecules that dock to the proteins. Why not instead look at the molecules that are involved in interacting with the receptors over time, and use how the molecules evolve over time as a proxy for how the proteins evolved over time? And then use that to constrain your drug discovery process.

(01:23:51)
You flip the problem 180 and focus on the molecule evolution rather than the protein. And so you can guess what might happen in the future. Rather than having to consider all possible molecules, you know where to focus. And it’s the same thing if you’re looking at assembly spaces for an object where you don’t know the entire history: you know that the history of this object is not going to have some other motif in it that doesn’t appear in its past.
Lex Fridman
(01:24:22)
But just even for the drug discovery point you made, don’t you have to simulate all of chemistry to figure out how to come up with constraints?
Lee Cronin
(01:24:32)
No.
Lex Fridman
(01:24:32)
And the molecules and the…
Lee Cronin
(01:24:34)
No.
Lex Fridman
(01:24:35)
I don’t know enough about protein.
Lee Cronin
(01:24:36)
Well, this is another thing that I think causes confusion, because this paper goes across so many boundaries. So chemists have looked at this and said, “This is not a correct reaction.” It’s like, no, it’s a graph.
Lex Fridman
(01:24:52)
Sure, there’s assembly index and shortest path examples here on chemistry.
Lee Cronin
(01:24:58)
Yeah, and what you do is you look at the minimal constraints on that graph. Of course it has some mapping to the synthesis, but actually you don’t have to know all of chemistry. You can build up the constraint space rather nicely. But this is just the beginning, right? There are so many directions this could go in, and as I said, it could all be wrong, but hopefully it’s less wrong.
Lex Fridman
(01:25:22)
What about the little criticism I saw of… By way of question, do you consider the different probabilities of each reaction in the chain so that there could be different… When you look at a chain of events that led up to the creation of an object, doesn’t it matter that some parts in the chain are less likely than others?
Lee Cronin
(01:25:46)
No.
Lex Fridman
(01:25:48)
It doesn’t matter?
Lee Cronin
(01:25:48)
No, no. Well, let’s go back. So no, not less likely, but… Let’s go back to what we’re looking at here. The assembly index is the minimal path that could have created that object probabilistically. So imagine you have all your atoms in a plasma, you’ve got enough energy, there’s collisions. What is the quickest way you could zip out that molecule with no reaction constraints?
Lex Fridman
(01:26:12)
How do you define quickest there then?
Lee Cronin
(01:26:14)
It’s basically a walk on a random graph. So we make an assumption about the timescale for forming the bonds… No, I don’t want to say that, because then it’s going to have people obsessing about this point. And your criticism is a really good one. What we’re trying to say is this puts a lower bound on something. Of course, some reactions are less possible than others, but actually, I don’t think chemical reactions exist.
Lex Fridman
(01:26:39)
Oh, boy. What does that mean? Why don’t chemical reactions exist?
Lee Cronin
(01:26:44)
I’m writing a paper right now that I keep being told I have to finish, and it’s called ‘The Origin of Chemical Reactions.’ And it merely says that reactivity exists as controlled by the laws of quantum mechanics. And reactions, chemists put names on reactions. So you can have, I don’t know, the Wittig reaction, which is by Wittig. You could have the Suzuki reaction, which is by Suzuki.

(01:27:09)
Now, what are these reactions? These reactions are constrained by the following: they’re constrained by the fact that they’re on planet Earth, 1 g, 298 kelvin, 1 bar. These are constraints. They’re also constrained by the chemical composition of Earth, oxygen availability, all this stuff. And that then allows us to focus our chemistry. So when a chemist does a reaction, that’s a really nice compressed shorthand for constraint application: glass flask, pure reagent, temperature, pressure, control, control, control.

(01:27:44)
So of course we have bond energies. The bond energies are kind of intrinsic in a vacuum. And for assembly theory to work, you have to have a bond, which means that bond has to give the molecule a half-life. So you’re probably going to find later on that some bonds are weaker, and you’re going to miss them in the mass spectrum; when you look at the assembly of some molecules, you’re going to miscount the assembly of the molecule because it falls apart too quickly, because the bonds are too weak. But you can solve that by looking at infrared.

(01:28:21)
So when people think about the probability, they’re kind of misunderstanding. Assembly theory says nothing about the chemistry, because chemistry is chemistry and the constraints are put in by biology. There was no chemist at the origin of life, unless you believe in the chemist in the sky, and, like Santa Claus, they had a lot of work to do. But chemical reactions do not exist, and the constraints that allow chemical transformations to occur do exist.
Lex Fridman
(01:28:52)
Okay, okay. So it’s constraint. So there’s no chemical reactions. It’s all constraint application, which enables the emergence of… What’s a different word for chemical reaction?
Lee Cronin
(01:29:10)
Transformation?
Lex Fridman
(01:29:11)
Transformation.
Lee Cronin
(01:29:11)
Yeah, like a function. It’s a function. But no, I love chemical reactions as a shorthand, so the chemists don’t all go mad. I mean, of course chemical reactions exist on Earth.
Lex Fridman
(01:29:20)
It’s a shorthand.
Lee Cronin
(01:29:21)
It’s a shorthand for these constraints.
Lex Fridman
(01:29:24)
So it’s assuming all these constraints that we’ve been using for so long that we just take them for granted in natural language conversation.
Lee Cronin
(01:29:30)
Exactly. The grammar of chemistry of course emerges in reactions and we can use them reliably, but I do not think the Wittig reaction is accessible on Venus.
Lex Fridman
(01:29:41)
Right, and framing it as constraint application is useful when you zoom out to the bigger picture of the universe, looking at the chemistry of the universe and then starting to apply assembly theory. That’s interesting. That’s really interesting. But we’ve also pissed off the chemists now.
Lee Cronin
(01:30:01)
Oh, they’re pretty happy. Well, most of them.
Lex Fridman
(01:30:04)
No. Everybody deep down is happy, I think. They’re just sometimes feisty, that’s how they have fun.
Lee Cronin
(01:30:13)
Everyone is grumpy on some days when you challenge… The problem with this paper is… It’s almost like I went to a party. I used to do this occasionally when I was young: go to a meeting and just find a way to offend everyone at the meeting simultaneously. Even the factions that don’t like each other are all unified in their hatred of you for offending them. This paper feels like the person that went to the party and offended everyone simultaneously, so they stopped fighting among themselves and just focused on this paper.

Nature review process

Lex Fridman
(01:30:41)
Maybe just a little insider interesting information: what did the editors of Nature say, what were the reviews like, and so on? How difficult was that process? Because this is a pretty big paper.
Lee Cronin
(01:30:55)
So when we originally sent the paper… This is quite a long process. We sent the paper and the editor gave us some feedback and said, “I don’t think it’s that interesting,” or, “It’s a hard concept.” And Sarah and I took a year to rewrite the paper.
Lex Fridman
(01:31:26)
Was the nature of the feedback very specific, on this part, that part? Or was it like, “What are you guys smoking? What kind of crack are you taking?”
Lee Cronin
(01:31:34)
Yeah, it was kind of the latter. “What are you smoking?”
Lex Fridman
(01:31:35)
Okay. But polite and there’s promise.
Lee Cronin
(01:31:41)
Yeah. Well, the thing is, the editor was really critical, but in a really professional way. And for me, this was the way science should happen. So when it came back, we had too many equations in the paper. If you look at the preprint, there are just equations everywhere, like 23 equations. And when I said to Abhishek, who was the first author, “We’ve got to remove all the equations, but my assembly equation is staying in,” Abhishek was like, “No, we can’t.”

(01:32:05)
I said, “Well, look, if we want to explain this to people, there’s a real challenge.” And so Sarah and I went through, I think it was actually 160 versions of the paper, but basically we got to version 40 or something and said, “Right, zero it, start again.” So we wrote the whole paper again. We knew the entire…
Lex Fridman
(01:32:21)
Amazing.
Lee Cronin
(01:32:22)
And we just went bit by bit by bit and said, “What is it we want to say?” And then we sent the paper in, and we expected it to be rejected and not even go to review. And then we got notification back that it had gone to review, and we were like, “Oh my God, it’s so going to get rejected. How’s it going to get rejected?” Because the first assembly paper, on the mass spec, that we sent to Nature went through six rounds of review and was rejected, by a chemist that just said, “I don’t believe you. You must be committing fraud.”

(01:32:54)
And it’s a long story, probably a boring story, but in this case it went out to review, the comments came back, and they were very deep comments from all the reviewers. But the nice thing was the reviewers were very critical but not dismissive. They were like, “Oh, really? Explain this, explain this, explain this.”
Lex Fridman
(01:32:54)
That’s great.
Lee Cronin
(01:33:26)
Are you sure it’s not Kolmogorov? Are you sure it’s not this? And we went through I think three rounds of review pretty quick and the editor went, yeah, it’s in.
Lex Fridman
(01:33:39)
But maybe you could just comment on the whole process. You’ve published some pretty huge papers on all kinds of topics within chemistry and beyond. Some of them have a little spice in them, a little spice of crazy. Like Tom Waits says, “I like my town with a little drop of poison.” It’s not a mundane paper. So what’s it like psychologically to go through all this process, to keep getting rejected, to get reviews from people that don’t get the paper, all that kind of stuff? Just from the perspective of a scientist, what is that like?
Lee Cronin
(01:34:19)
I mean, this paper, for me… Because this wasn’t the first time we tried to publish assembly theory at the highest level. The Nature Communications paper on the mass spec, the idea, went to Nature and got rejected, went through six rounds of review and got rejected. And I was just so confused when the chemist said, “This can’t be possible. I do not believe you can measure complexity using mass spec. And also, by the way, complex molecules can randomly form.” And we’re like, “But look at the data. The data says…” And they said, “No, no. We don’t believe you.” And I just wouldn’t give up. And the editor in the end was just like… Different editors, actually, right?
Lex Fridman
(01:35:10)
What’s behind that never giving up? When you’re sitting there 10 o’clock in the evening, there’s a melancholy feeling that comes over you and you’re like, “Okay, this is rejection number five.” Or it’s not rejection, but maybe it feels like a rejection because the comments are that you totally don’t get it. What gives you strength to keep going there?
Lee Cronin
(01:35:31)
I don’t know. I don’t normally get emotional about papers, but it’s not about giving up because we want to get it published, because we want the glory or anything. It’s just like, why don’t you understand? And so what I would do is just try to be as rational as possible and say, yeah, you didn’t like it, tell me why. And then…

(01:36:26)
Sorry, give me a second. Silly, I never get emotional about papers normally, but I think what we do… You just compressed five years of angst into this.
Lex Fridman
(01:36:38)
So it’s been rough?
Lee Cronin
(01:36:40)
It’s not just rough. It’s like… It happened… I came up with the assembly equation remote from Sarah in Arizona and the people at SFI. I felt like I was a mad person. The guy depicted in A Beautiful Mind, who was just like… Not the actual genius part, but just the gibberish, gibberish, gibberish.
Lex Fridman
(01:36:59)
Just the crazy part.
Lee Cronin
(01:37:02)
Because I kept writing expansions, and I have no mathematical ability at all. And I was making these mathematical expansions where I kept seeing the same motif again. I was like, I think this is a copy number; the same string is coming up again and again and again, and I couldn’t do the math. And then I realized the copy number fell out of the equation and everything collapsed down. I was like, oh, that kind of works.
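For readers trying to follow along: the assembly equation being described here, as I read it in the published paper (stated from memory, so check the original), weights each object’s assembly index by its copy number:

$$ A = \sum_{i=1}^{N} e^{a_i} \left( \frac{n_i - 1}{N_T} \right) $$

where $a_i$ is the assembly index of object $i$, $n_i$ its copy number, $N$ the number of unique objects, and $N_T$ the total number of objects observed. The $(n_i - 1)$ term is how the copy number “falls out”: an object seen only once contributes nothing, so the measure only registers objects that have been made repeatedly, which is the signature of selection.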

(01:37:23)
So we submitted the paper, and when it was almost accepted, the mass spec one, the astrobiologists said great, the mass spectroscopists said great, and the chemists went, “Nonsense, biggest pile of nonsense ever. Fraud.” And I was like, “But why fraud?” And they just said, “Just because.” I was like, well… I could not convince the editor in this case. The editor was just so pissed off. They saw it as, you’re wasting my time. And I would not give up. I went and dissected all the parts. And although I got upset about it, it was kind of embarrassing actually, but I guess…
Lex Fridman
(01:38:05)
I bet it was beautiful.
Lee Cronin
(01:38:08)
But it was just trying to understand why they didn’t like it. So part of me was really devastated, and part of me was super excited, because I’m like, “Huh, they can’t tell me why I’m wrong.” And this kind of goes back to when I was at school. I was in a kind of learning difficulties class, and I kept going to the teacher and saying, “What do I do today to prove I’m smart?” And they were like, “Nothing, you can’t.” I was like, “Give me a job, give me something to do. Something to do.” And I kind of felt like that a bit when I was arguing with… And not arguing. There was no ad hominem. I wasn’t telling the editor they were idiots or anything like this, or the reviewers. I kept it strictly factual.

(01:38:51)
And all I did is I just kept knocking it down bit by bit, by bit, by bit. It was ultimately rejected, and it got published elsewhere. And then the actual experimental data… So in this paper, the experimental justification was already published. So when we did this one, we went through the versions and then we sent it in, and in the end it just got accepted. We were like, well, that’s kind of cool, right? Some days…

(01:39:21)
Sorry, the first author was like, “I can’t believe it got accepted.” I was like, “Nor can I, but it’s great. It’s good.” And then when the paper was published, I was not expecting the backlash. I was expecting computational criticism. Well, no, actually I was just expecting one person who’d been trolling me for a while about it to just carry on trolling, but I didn’t expect the backlash. And then I wrote to the editor and apologized, and the editor was like, “What are you apologizing for? It was a great paper. Of course it’s going to get backlash. You said some controversial stuff, but it’s awesome.”
Lex Fridman
(01:39:56)
Well, I think it’s a beautiful story of perseverance and the backlash is just a negative word for discourse, which I think is beautiful. That’s the science.
Lee Cronin
(01:40:08)
I think, as I said, when it got accepted, people were kind of hacking on it. And I was like, papers are not gold medals. The reason I wanted to publish that paper in Nature is because it says, “Hey, there’s something before biological evolution.” You have to have that, if you’re not a creationist. By the way, this is an approach. It’s the first time someone has put in a concrete mechanism, or sorry, a concrete quantification. What comes next, what you’re pushing on, is a mechanism. And what we need to get to is autocatalytic sets, self-replicating molecules, some other features that come in.

(01:40:48)
And the fact that this paper has been so discussed, for me is a dream come true, it doesn’t get better than that. If you can’t accept a few people hating it… And the nice thing is, the thing that really makes me happy is that no one has attacked the actual physical content.

(01:41:10)
You can measure the assembly index, you can measure selection now. So either that’s right or it’s… Well, either that’s helpful or unhelpful. If it’s unhelpful, this paper will sink down and no one will use it again. If it’s helpful, it’ll help people scaffold on it and we’ll start to converge on a new paradigm. So I think that’s the thing that I wanted to see. My colleagues, authors, collaborators and people were like, you’ve just published this paper. You’re a chemist. Why have you done this? Who are you to be doing evolutionary theory? Well, I don’t know. I mean, sorry, did I need to…
Lex Fridman
(01:41:48)
Who is anyone to do anything? Well, I’m glad you did. Let me just before coming back to Origin of Life and these kinds of questions, you mentioned learning difficulties. I didn’t know about this. So what was it like?
Lee Cronin
(01:42:00)
I wasn’t very good at school, right.
Lex Fridman
(01:42:04)
This is when you were very young?
Lee Cronin
(01:42:06)
Yeah. But in primary school, my handwriting was really poor and apparently I couldn’t read, and my mathematics was very poor. So they just said, “This is a problem.” They identified it. My parents at the time were confused, because I was busy taking things apart, buying electronic junk from the shop, trying to build computers and things. And then, when I think about the major transition in my stupidity, basically, everyone thought I was faking. I liked stuff and I was faking wanting to be it. So I always wanted to be a scientist. At five, six, seven years old, I wanted to be a scientist, take things apart, and everyone’s like, “Yeah, this guy wants to be a scientist, but he’s an idiot.” So everyone was really confused, I think, at first, that I wasn’t as smart as I was claiming to be.

(01:42:58)
And then I just basically didn’t do well in any of the tests, and I went down and down and down and down, and then I was like, “Huh, this is really embarrassing. I really like maths and everyone says I can’t do it. I really like physics and chemistry and science, and people say you can’t read and write.” And so I found myself in a learning difficulties class at the end of primary school and the beginning of secondary school. In the UK, secondary school starts at 11, 12 years old. And I remember being put in the remedial class. And the remedial class was basically full of three types of people. There were people who were quite violent, there were people who couldn’t speak English, and there were people that really had learning difficulties. So the one thing I can objectively remember was… I could read. I liked reading. I read a lot. But something in me, I’m a bit of a rebel. I refused to read what I was told to read, and I found it difficult to read individual words in the way I was told.

(01:44:24)
But anyway, I got caught one day teaching someone else to read and they said, “Okay, we don’t understand this.” I’d always known I wanted to be a scientist, but I didn’t really know what that meant, and I realized you had to go to university. I thought, “I can just go to university. They take curious people.” “No, no, you need to have these exams. You have to enter these exams to get this grade point average, and the fact is, in the exams you’ve been entered into, you are just going to get C, D or E. You can’t even get A, B or C.” These are the UK GCSEs. I was like, “Oh, shit,” and I said, “Can you just put me into the higher exams?” They said, “No, no, you’re going to fail. There’s no chance.” So my father intervened and said, “Just let him go in the exams,” and they said, “He’s definitely going to fail. It’s a waste of time, waste of money,” and he said, “What if we paid?” So they said, “Okay.” So you didn’t actually have to pay. You only had to pay if I failed.

(01:45:23)
So I took the exams and passed them, fortunately. I didn’t get the top grades, but I got into A Levels. But then that also limited what I could do at A Levels. I wasn’t allowed to do A Level maths.
Lex Fridman
(01:45:35)
What do you mean you weren’t allowed to?
Lee Cronin
(01:45:36)
Because I had such a bad maths grade from my GCSE, I only had a C. They wouldn’t let me go into the A, B or C tier for maths because of some coursework requirement back then, so the top grade I could have got was a C. So C, D or E, and I got a C. So they let me do AS Level maths, which is this half-intermediate level, and I got to go to university. But I liked chemistry. I had a good chemistry teacher, so in the end I got to university to do chemistry.
Lex Fridman
(01:46:01)
So through that process, I think for kids in that situation, it’s easy to start believing that you’re not… How do I put it… That you’re stupid, and to basically give up, that you’re just not good at math, not good at school. So this is by way of advice for interesting young kids right now experiencing the same thing. Where was the place, what was the source, of you not giving up there?
Lee Cronin
(01:46:33)
I have no idea. Other than… I really liked not understanding stuff. For me, when I don’t understand something… I feel like I don’t understand anything. But back then, I remember when I was like… I don’t know, I tried to build a laser when I was eight and I thought, “How hard could it be?” And basically, I was going to build a CO2 laser and I was like, “Right, I think I need some partially coated mirrors. I need some carbon dioxide and I need a high voltage.” And I was so stupid. I was so embarrassed. To make enough CO2, I actually set a fire and tried to filter the flame.
Lex Fridman
(01:47:30)
Oh, nice. That’s an idea.
Lee Cronin
(01:47:30)
Just to collect enough CO2, and it completely failed. And I burnt half the garage down. So my parents were not very happy about that. So that was one thing. I really liked first-principles thinking. I remember being super curious and being determined to find answers. And so when people do ask for advice about this, I don’t really have that much advice other than don’t give up. And one of the things I try to do as a chemistry professor in my group is I hire people that I think, if they’re persistent enough, who am I to deny them the chance? Because people gave me a chance and I was able to do stuff.
Lex Fridman
(01:48:18)
Do you believe in yourself essentially?
Lee Cronin
(01:48:22)
So I love being around smart people and I love confusing smart people. And when I’m confusing smart people, not by stealing their wallets and hiding it somewhere, but if I can confuse smart people, that is the one piece of hope that I might be doing something interesting.
Lex Fridman
(01:48:37)
Wow, that’s quite brilliant. As a gradient to optimize. Hang out with smart people and confuse them. And the more confusing it is, the more there’s something there.
Lee Cronin
(01:48:47)
As long as they’re not telling you you’re just a complete idiot, and they give you different reasons. Because with assembly theory, people said, “Oh, it’s wrong.” And I was like, “Why?” And no one could give me a consistent reason. They said, “Oh, because it’s been done before, or it’s just [inaudible 01:49:04], or it’s just this, that and the other.” So I think the thing that I like to do is, and in academia it’s hard because people are critical. But the criticism, although I got upset about it earlier, which is silly, but not silly, because obviously it’s hard work being on your own, or with a team spatially separated during lockdown, trying to keep everyone on board and have some faith. I always wanted to have a new idea. And so I like a new idea and I want to nurture it as long as possible. And if someone can give me actionable criticism, that’s what I think I was trying to say earlier when I was stuck for words: give me actionable criticism.

(01:49:51)
“It’s wrong.” “Okay, why is it wrong?” Say, “Oh, your equation’s incorrect for this or your method is wrong.” So what I try and do is get enough criticism from people to then triangulate and go back. And I’ve been very fortunate in my life that I’ve got great colleagues, great collaborators, funders, mentors, and people that will take the time to say, “You are wrong because.” And then what I have to do is integrate the wrongness and go, “Oh, cool, maybe I can fix that.” And I think criticism is really good. People have a go at me because I’m really critical. But I’m not criticizing you as a person. I’m just criticizing the idea and trying to make it better and say, “What about this?”

(01:50:34)
And sometimes my filters are truncated in some ways. I’m just like, “That’s wrong, that’s wrong, that’s wrong. Why’d you do this?” And people are like, “Oh my God, you just told me, you destroyed my life’s work.” I’m like, “Relax. No.” I’m just like, “Let’s make it better.” And I think that we don’t do that enough because we are either personally critical, which isn’t helpful or we don’t give any criticism at all because we’re too scared.
Lex Fridman
(01:51:03)
Yeah, I’ve seen you be pretty aggressively critical, but every time I’ve seen it, it’s the idea, not the person.

Time and free will

Lee Cronin
(01:51:15)
I’m sure I make mistakes on that. I argue lots with Sara and she’s shocked. I’ve argued with Joscha, Joscha Bach, in the past, and he is like, “You’re just making that up.” And I’m like, “No, not quite. But kind of.” But I had a big argument with Sara about time and she’s like, “No, time doesn’t exist.” I’m like, “No, no, time does exist.” And she realized that her conception of assembly theory and my conception of assembly theory were the same thing, which necessitated us to abandon the idea that time is eternal, and to actually really fundamentally question how the universe produces combinatorial novelty.
Lex Fridman
(01:51:59)
So time is fundamental for assembly theory? I’m just trying to figure out where you and Sara converged.
Lee Cronin
(01:52:06)
I think assembly theory is fine in this time right now, but I think it helps us understand that something interesting is going on. I’ve been really inspired by a guy called Nick Gisin. I’m going to butcher his argument but I love his argument a lot, so I hope he forgives me if he hears about it. But basically, if you want free will, time has to be fundamental. And if you want time to be fundamental, you have to give up on platonic mathematics and you have to use intuitionist mathematics. By the way, and again I’m going to butcher this, but basically Hilbert said that infinite numbers are allowed. And I think it was Brouwer who said, “No, you can’t. All numbers are finite.” So let’s go back a step, because people are going to say assembly theory seems to explain that a large combinatorial space allows you to produce things like life and technology. And that large combinatorial space is so big it’s not even accessible to a Sean Carroll or David Deutsch multiverse. So physicists saying that all of the universe already exists in time is probably, provably, that’s a strong word, not correct.

(01:53:43)
We are going to know that the universe as it stands, the present, the way the present builds the future, is so big that the universe can’t ever contain the future. And this is a really interesting thing. I think Max Tegmark has this mathematical universe. He says the universe is like a block universe, and I apologize to Max if I’m getting it wrong, but people think you can just move. You have the state, you have the initial conditions, and you can run the universe right to the end and go backwards and forwards in that universe. That is not correct.
Lex Fridman
(01:54:17)
Let me load that in. The universe is not big enough to contain the future.
Lee Cronin
(01:54:21)
Yeah. That’s why. That’s it.
Lex Fridman
(01:54:24)
That’s a beautiful way of saying that time is fundamental.
Lee Cronin
(01:54:26)
Yes. And this is why the law of the excluded middle, something is true or false, only works in the past. Is it going to snow in New York next week or in Austin? You might, in Austin, say probably not. In New York, you might say, yeah. If you go forward to next week and say, “Did it snow in New York last week? True or false?” You can answer that question. The fact that the law of the excluded middle cannot apply to the future explains why time is fundamental.
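The claim that the excluded middle applies only to the past has a standard formalization in three-valued, Łukasiewicz-style logic, where a future contingent carries a third truth value. A minimal sketch of that idea (my illustration, not something stated in the conversation; the names `TV`, `neg`, `disj` are made up here):

```python
from enum import Enum

class TV(Enum):
    FALSE = 0
    UNKNOWN = 1   # a future contingent: not yet settled
    TRUE = 2

def neg(p: TV) -> TV:
    # Łukasiewicz negation: flip around the middle value
    return TV(2 - p.value)

def disj(p: TV, q: TV) -> TV:
    # Łukasiewicz disjunction: take the "truer" of the two
    return p if p.value >= q.value else q

def excluded_middle(p: TV) -> TV:
    """p OR (NOT p) -- classically always TRUE."""
    return disj(p, neg(p))

# A settled (past) proposition obeys the law; an open future one does not:
print(excluded_middle(TV.TRUE))     # TV.TRUE   ("did it snow last week?")
print(excluded_middle(TV.UNKNOWN))  # TV.UNKNOWN ("will it snow next week?")
```

For a settled proposition the law comes out TRUE, but for an undetermined future one, p OR not-p is itself undetermined, which is exactly the asymmetry between past and future being described.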
Lex Fridman
(01:55:01)
That’s a good example, intuitive example, but it’s possible that we might be able to predict whether it’s going to snow if we had the perfect information.
Lee Cronin
(01:55:10)
I think…
Lex Fridman
(01:55:11)
You’re saying it’s not.
Lee Cronin
(01:55:13)
Impossible. Impossible. So here’s why. I’ll make a really quick argument and this argument isn’t mine. It’s Nick’s and a few other people.
Lex Fridman
(01:55:23)
Can you explain his view on time being fundamental?
Lee Cronin
(01:55:28)
Yeah. So I’ll give my view, which resonates with his, but basically it’s very simple actually. He would say your ability to design and do an experiment is exercising free will. So he used that thought process. I never really thought about it that way, that you actively make decisions. I used to think that free will was a consequence of just selection, but I’m understanding that human free will is something really interesting. And he very much inspired me. But I think what Sara Walker said inspired me as well, these will converge. It’s that I think that the universe, and the universe is very big, huge, but actually the place that is largest in the universe right now, the largest place in the universe, is earth.
Lex Fridman
(01:56:20)
Yeah, I’ve seen you say that. And boy, does that… That’s an interesting one to process. What do you mean by that earth is the biggest place in the universe?
Lee Cronin
(01:56:31)
Because we have this combinatorial scaffolding going all the way back from LUCA. So you’ve got cells that can self-replicate, and then you go all the way to terraforming the earth. You’ve got all these architectures, the amount of selection that’s going on, biological selection, just to be clear, biological evolution, and then you have multicellularity, then animals and abstraction. And with abstraction, there was another kick, because you can then build architectures and computers and cultures and language. And these things are the biggest things that exist in the universe, because we can just build architectures beyond what could naturally arise anywhere, and the further that distance goes in time, it’s gigantic.
Lex Fridman
(01:57:15)
From a complexity perspective.
Lee Cronin
(01:57:17)
Yeah.
Lex Fridman
(01:57:17)
Okay, wait a minute. I know you’re being poetic, but how do you know there’s not other earth-like… How do you know? You’re basically saying earth is really special, there’s awesome stuff here, and as far as we look out, there’s nothing like it going on. But how do you know there’s not a nearly infinite number of places where cool stuff like this is going on?
Lee Cronin
(01:57:40)
I agree, and I would say, I’ll say again, that earth is the most gigantic thing, combinatorially, in the universe that we know.
Lex Fridman
(01:57:49)
We know. Yeah.

Communication with aliens

Lee Cronin
(01:57:50)
Now, I guess this is just purely a guess. I have no data other than hope. Maybe not hope, maybe… No, I have some data: every star in the sky probably has planets, and life is probably emerging on these planets. But the amount of contingency that is associated with life, the combinatorial space associated with these planets, is I think so different that our causal cones are never going to overlap, or not easily. And this is the thing that makes me sad about alien life. It’s why we have to create alien life in the lab as quickly as possible, because I don’t know if we are going to be able to build architectures that will intersect with alien intelligence architectures.
Lex Fridman
(01:58:42)
Intersect, you don’t mean in time or space?
Lee Cronin
(01:58:46)
Time and the ability to communicate.
Lex Fridman
(01:58:48)
The ability to communicate.
Lee Cronin
(01:58:49)
Yeah. My biggest fear in a way is that life is everywhere but we become infinitely more lonely because of our scaffolding in that combinatorial space. Because it’s so big.
Lex Fridman
(01:59:00)
So you’re saying the constraints created by the environment that led to the factory of Darwinian evolution are just this little tiny cone in a nearly infinite combinatorial space.
Lee Cronin
(01:59:14)
Exactly.
Lex Fridman
(01:59:14)
So there’s other cones like it. Why can’t we communicate with other… Just because we can’t create it doesn’t mean we can’t appreciate the creation, right? Sorry, detect the creation.
Lee Cronin
(01:59:30)
I truly don’t know but it’s an excuse for me to ask for people to give me money to make a planet simulator.
Lex Fridman
(01:59:36)
Yeah, right.
Lee Cronin
(01:59:36)
If I can make…
Lex Fridman
(01:59:38)
With a different [crosstalk 01:59:40]
Lee Cronin
(01:59:39)
It’s another shameless plug, it’s like, “Give me money. I need money.”
Lex Fridman
(01:59:42)
This was all a long plug for a planet simulator. Hey, I won’t be the first in line to do that.
Lee Cronin
(01:59:50)
My rick garage has run out of room.
Lex Fridman
(01:59:53)
Yeah.
Lee Cronin
(01:59:54)
No.
Lex Fridman
(01:59:54)
And this planet simulator, you mean a different planet or different sets of environments and pressures?
Lee Cronin
(02:00:01)
Exactly. If we could basically recreate the selection before biology as we know it, that gives rise to a different biology, we should be able to put constraints on where to look in the universe. So here’s the thing. Here’s my dream. My dream is that by creating life in the lab based upon constraints we understand, let’s go for Venus-type life or earth-type life or something, again, do an Earth 2.0. Screw it, let’s do an Earth 2.0. An Earth 2.0 has a different genetic alphabet. Fine, that’s fine. Different protein alphabet, fine. Have cells and evolution, all that stuff. We will then be able to say, “Okay, life is a more general phenomenon. Selection is more general than what we think are the chemical constraints on life.” And we can point James Webb and other telescopes at other planets that are in the zone we are most likely to combinatorially overlap with, because, so there’s chemistry…
Lex Fridman
(02:01:01)
You’re looking for some overlap.
Lee Cronin
(02:01:02)
And then we can basically shine light on them, literally, and look at the light coming back and apply advanced assembly theory, and a general theory of language that we’ll get, and say, “Huh, that signal looks random, but there’s a copy number. Oh, this random set of things that shouldn’t be there, that looks like a true random number generator, has structure, not [inaudible 02:01:32], an IT-type structure, but evolutionary structure given by assembly theory,” and we start to… But I would say that because I’m a shameless assembly theorist.
Lex Fridman
(02:01:42)
Yeah, it just feels like the cone, I might be misusing the word cone here but the width of the cone is growing faster, is growing really fast to where eventually all the cones overlap even in a very, very, very large combinatorial space. But then again, if you’re saying the universe is also growing very quickly in terms of possibilities…
Lee Cronin
(02:02:14)
I hope that as we build abstractions, one idea is that as we get to intelligence, intelligence allows us to look at the regularities around us in the universe. And that gives us some common grounding to discuss with aliens. And you might be right that we will overlap there. Even though we have completely different chemistry, literally completely different chemistry, we will be able to pass information to one another. But it’s not a given. And I have to try and divorce hope and emotion away from what I can logically justify.
Lex Fridman
(02:03:02)
But it’s just hard to intuit a world, a universe, where there are objects of nearly infinite complexity and they somehow can’t detect each other.
Lee Cronin
(02:03:13)
The universe is expanding. But the nice thing is, I would say, I would look, you see, I think Carl Sagan did the wrong thing. Not the wrong thing. He flipped the Voyager probe around for the Pale Blue Dot and said, “Look how big the universe is.” I would’ve done it the other way around and said, “Look at the Voyager probe that came from the planet earth, that came from LUCA. Look at how big earth is.”
Lex Fridman
(02:03:31)
Then it produced that.
Lee Cronin
(02:03:32)
It produced that.
Lex Fridman
(02:03:34)
Yeah.
Lee Cronin
(02:03:35)
And that I think is completely amazing. And then that should allow people on earth to think, “Probably we should try and get causal chains off Earth onto Mars, onto the moon, wherever. Whether it’s human life or martian life that we create, it doesn’t matter.” But I think this combinatorial space tells us something very important about the universe, and I realized in assembly theory that the universe is too big to contain itself. Now coming back, I want to change your mind about time, because I’m guessing that you think time is just a coordinate. So I’m going to change…
Lex Fridman
(02:03:35)
I’m guessing you’re one of those.
Lee Cronin
(02:04:20)
One of those. I’m going to change your mind in real time, or at least attempt to.
Lex Fridman
(02:04:22)
Oh, in real time. There you go. I already got the tattoo. So this is going to be embarrassing if you change my mind.
Lee Cronin
(02:04:27)
But you can just add an arrow of time onto it, right?
Lex Fridman
(02:04:27)
Yeah, true. Just modify it.
Lee Cronin
(02:04:32)
Or raise it a bit. And the argument that I think is really most interesting is, people say the initial conditions specify the future of the universe. Okay, fine. Let’s say that’s the case for a moment. Now let’s go back to Newtonian mechanics. The uncertainty principle in Newtonian mechanics is this. If I give you the coordinates of an object moving in space and the coordinates of another object, and they collide in space, then from those initial conditions you should know exactly what’s going to happen. However, you cannot specify these coordinates to infinite precision. Now everyone says, “Oh, this is like the chaos theory argument.” No, no, it’s deeper than that. Here’s a problem with numbers. This is where Hilbert and Brouwer fell out. To have the coordinates of this object, a given object that’s colliding, you have to have them to infinite precision. That’s what Hilbert says: there’s no problem, infinite precision is fine. Let’s just take that for granted.

(02:05:38)
But when the object is finite and it can’t store its own coordinates, what do you do? So in principle, if a finite object cannot be specified to infinite precision, in principle, the initial conditions don’t apply.
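The finite-storage point has a familiar computing analogue: a 64-bit float cannot hold most real coordinates exactly, and n bits can label at most 2**n distinct positions. A small illustration in floating-point terms (my framing, not Cronin’s own formulation; `distinguishable_positions` is a made-up helper):

```python
from decimal import Decimal

# What a 64-bit float actually stores when you write 0.1 -- the
# nearest representable value, not the real number 0.1:
x = 0.1
print(Decimal(x))
# 0.1000000000000000055511151231257827021181583404541015625

# A finite store of n bits can distinguish at most 2**n positions,
# so a finite object cannot encode a coordinate to infinite precision.
def distinguishable_positions(n_bits: int) -> int:
    return 2 ** n_bits

print(distinguishable_positions(64))  # 18446744073709551616
```

However many bits (or atoms) the object has, the count of encodable positions is finite, while a coordinate specified to infinite precision would need infinitely many, which is the mismatch the argument turns on.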
Lex Fridman
(02:05:58)
How do you know it can’t store its…
Lee Cronin
(02:06:01)
How do you store an infinitely long number in a finite size?
Lex Fridman
(02:06:09)
We’re using infinity very loosely here.
Lee Cronin
(02:06:11)
No, no. We’re using…
Lex Fridman
(02:06:12)
Infinite precision. Not loosely, but…
Lee Cronin
(02:06:14)
Very precisely.
Lex Fridman
(02:06:15)
So you think infinite precision is required?
Lee Cronin
(02:06:18)
Let’s take the object. Let’s say the object is a golf ball. A golf ball is a few centimeters in diameter. We can work out how many atoms are in the golf ball. And let’s say we can store numbers down to atomic dislocations. So we can work out how many atoms there are in the golf ball, and we can store the coordinates in that golf ball down to that number. But beyond that, we can’t. Let’s make the golf ball smaller. And this is where I think we get randomness in quantum mechanics. Some people say you can’t get randomness, quantum mechanics is deterministic, but aha, this is where we realize that classical mechanics and quantum mechanics suffer from the same uncertainty principle. And that is the inability to specify the initial conditions to a precise enough degree to give you determinism.

(02:07:09)
The universe is intrinsically too big, and that’s why time exists. It’s non-deterministic. Looking back into the past, you can use logical arguments, because you can say, “Was it true or false?” You already know. But the fact that we are unable to predict the future with precision is not evidence of a lack of knowledge. It’s evidence the universe is generating new things.
Lex Fridman
(02:07:38)
Okay, first of all, quantum mechanics, you could just say statistically what’s going to happen when two golf balls hit each other.
Lee Cronin
(02:07:44)
Statistically. But sure, I can say statistically what’s going to happen. But then when they do happen and you keep nesting it together, it goes almost back to, look, let’s think about entropy in the universe. So how do we understand entropy change or process? We can use the ergodic hypothesis. We can also have the counterfactuals, where we have all the different states, and we can even put that in the multiverse. But both of those are nonphysical. The multiverse collapses back to the same problem about the precision. So you have to accept you don’t have to have true and false going forward into the future. The real numbers aren’t real. They aren’t observables.
Lex Fridman
(02:08:47)
We’re trying to see exactly where time being fundamental sneaks in, and how this difference, that the golf ball can’t contain its own position perfectly precisely, leads to time needing to be fundamental.
Lee Cronin
(02:09:07)
Do you believe or do you accept you have free will?
Lex Fridman
(02:09:12)
Yeah, I think at this moment in time, I believe that I have free will.
Lee Cronin
(02:09:17)
So then you have to believe that time is fundamental.
Lex Fridman
(02:09:23)
I understand that’s a statement you’ve made.
Lee Cronin
(02:09:25)
No, that we can logically follow, because if you’re in a universe that has no time, the universe is deterministic. And if it’s deterministic, then you have no free will.
Lex Fridman
(02:09:37)
I think the space of how much we don’t know is so vast that saying the universe is deterministic and from that jumping into there’s no free will is just too difficult of a leap.
Lee Cronin
(02:09:48)
No, I logically follow. No, no, I don’t disagree. It’s deep and it’s important. All I’m saying, and it’s actually different to what I’ve said before, is that if you don’t require platonistic mathematics, and accept that non-determinism is how the universe looks, that gives us our creativity and the way the universe is generating novelty. It’s really deeply important in assembly theory, because assembly theory starts to actually give you a mechanism where you go from boring time, which is basically initial conditions specify everything, to a mismatch in creative time. And I hope we’ll do experiments. I would love to do an experiment that proves that time is fundamental and the universe is generating novelty. I don’t know all the features of that experiment yet, but by having these conversations openly and getting people to think about the problems in a new way, better people, more intelligent people with good mathematical backgrounds, can say, “Oh, hey, I’ve got an idea. I would love to do an experiment that shows that the universe is too big for itself going forward in time.”

(02:11:04)
And this is why I really hate the idea of the Boltzmann brain. The Boltzmann brain makes me feel like everyone’s having a free lunch. It’s like saying, “Let’s break all the laws of physics.” So a Boltzmann brain is this idea that in a long enough universe, a brain will just emerge in the universe as conscious. And that neglects the causal chain of evolution that’s required to produce that brain. And this is where the computational argument really falls down, because a computationalist could say, “I can calculate the probability of a Boltzmann brain,” and they’ll give you a probability. But I can calculate the probability of a Boltzmann brain: zero.
Lex Fridman
(02:11:40)
Just because the space of possibilities is so large?
Lee Cronin
(02:11:43)
Yeah. When we start fooling ourselves with numbers that we can’t actually measure and can’t ever conceive of, I think it doesn’t give us a good explanation. And I want to explain why life is in the universe. I think life is actually a novelty miner. Life basically mines novelty almost from the future and actualizes it in the present.
Lex Fridman
(02:12:11)
Okay. Life is a novelty miner from the future that is actualized in the present.
Lee Cronin
(02:12:20)
Yep. I think so.
Lex Fridman
(02:12:24)
Novelty miner. First of all, novelty. What’s the origin of novelty when you go from boring time to creative time? Where is that? Is it as simple as randomness, like you’re referring to?
Lee Cronin
(02:12:39)
I am really struggling with randomness because I had a really good argument with Joscha Bach about randomness, and he just said, “Randomness doesn’t give you free will. That’s insane because you’d just be random.” And I think he’s right at that level but I don’t think he is right on another level. And it’s not about randomness, it’s about constrained, I’m making this up as I go along, so making this up, constrained opportunity. So the novelty. What is novelty? This is what I think is a funny thing if you ever want to discuss AI. Why I think everyone’s gone AI mad is that they’re misunderstanding novelty. But let’s think about novelty. Yes. What is novelty? So I think novelty is a genuinely new configuration that is not predicted by the past and that you discover in the present. And that is truly different. Now, everyone says that. Some people say that novelty doesn’t exist. It’s always with precedent. I want to do experiments that show that that is not the case. And it goes back to a question you asked me a few moments ago, which is where is the factory?

(02:13:58)
Because I think the same mechanism that gives us a factory gives us novelty. And I think that is why I’m so deeply hung up on time. Of course I’m wrong, but how wrong? And I think that life opens up that combinatorial space in a way that our current laws of physics can’t, as contrived in a deterministic, initial-condition universe, even with the get-out of the multiverse, David Deutsch style, which I love by the way, but I don’t think is correct, though it’s really beautiful.
Lex Fridman
(02:14:37)
Multiverse.
Lee Cronin
(02:14:38)
David Deutsch’s conception of the multiverse is a given. But I think that the problem with wave-particle duality in quantum mechanics is not about the multiverse. It’s about understanding how determined the past is. I don’t just think that, actually. This is a discussion I was having with Sara, where she was like, “Oh, I think we’ve been debating this for a long time now, about how do we reconcile novelty and determinism.”
Lex Fridman
(02:15:13)
Okay. Just to clarify, both you and Sara think the universe is not deterministic?
Lee Cronin
(02:15:19)
I won’t speak for Sara, but roughly. I think the universe is deterministic looking back into the past but undetermined going forward into the future. So I’m having my cake and eating it here. This is because I fundamentally don’t understand randomness, as Joscha and other people have told me. I could adopt a new view now, which is that the universe is just non-deterministic, but I’d like to refine that and say the universe appears deterministic going back into the past but is undetermined going forward into the future. So how can we have a universe that has deterministically-looking rules that’s non-determined …
Lee Cronin
(02:16:00)
… universe that has deterministically-looking rules that is non-determined going into the future. It’s this breakdown in precision in the initial conditions, and we have to just stop using initial conditions and start looking at trajectories, and how the combinatorial space behaves in an expanding universe in time and space. And assembly theory helps us quantify the transition to biology, and biology appears to be novelty-mining, because it’s making crazy stuff that is unique to Earth. Right? There are objects on Earth that are unique to Earth that will not be found anywhere else, because you can do the combinatorial math.
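The “combinatorial math” Cronin alludes to can be sketched with a back-of-envelope calculation (an editor’s illustration, not a number from the conversation): the space of even modestly sized biological molecules dwarfs the roughly 10^80 atoms in the observable universe, so finding such a molecule in abundance implies a selection process rather than chance.

```python
# Back-of-envelope sketch (illustrative assumptions, not Cronin's figures):
# count the possible peptides of length 100 built from the 20 standard
# amino acids, and compare with the ~10^80 atoms in the observable universe.
AMINO_ACIDS = 20
LENGTH = 100
possible_peptides = AMINO_ACIDS ** LENGTH

# Order of magnitude: number of decimal digits minus one.
magnitude = len(str(possible_peptides)) - 1
print(f"~10^{magnitude} possible length-{LENGTH} peptides")  # ~10^130
print("atoms in the observable universe: ~10^80")
```

Even sampling one sequence per atom since the Big Bang would cover only a vanishing fraction of that space, which is the sense in which abundant complex molecules are evidence of selection.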
Lex Fridman
(02:16:41)
What was that statement you made about “life is novelty-mining from the future”? What’s the little element of time that you’re introducing there?
Lee Cronin
(02:16:51)
What I’m kind of meaning is because the future is bigger than the present, in a deterministic universe, how do the states go from one to another? There’s a mismatch, right?
Lex Fridman
(02:17:02)
Yeah.
Lee Cronin
(02:17:03)
So, that must mean that you have a little bit of indeterminism. Whether that’s randomness or something else, I don’t understand. I want to do experiments to formulate a theory to refine that as we go forward that might help us explain that. And I think that’s why I’m so determined to try and crack the “non-life to life” transition looking at networks and molecules, and that might help us think about the mechanism. But certainly the future is bigger than the past in my conception of the universe and some conception of the universe. And-
Lex Fridman
(02:17:35)
By the way, that’s not obvious, right? The future being bigger than the past, well, that’s one statement, and the statement that the universe is not big enough to contain the future is another statement. That one is a big one. That one’s a really big one.
Lee Cronin
(02:17:53)
I think so, but I think it’s entirely … Because look, we have the second law, and right now we don’t need the second law if the future’s bigger than the past. It follows naturally. So, why are we retrofitting all these sticking plasters onto our reality to hold onto a timeless universe?
Lex Fridman
(02:18:13)
Yeah, but that’s because it’s kind of difficult to imagine the universe that can’t contain the future.
Lee Cronin
(02:18:21)
But isn’t that really exciting?
Lex Fridman
(02:18:23)
It’s very exciting, but it’s hard. We are humans on Earth, and we have a very kind of four-dimensional conception of the world, of 3D plus time. It’s just hard to intuit a world where, what does that even mean, a universe that can’t contain the future?
Lee Cronin
(02:18:47)
Yeah. It’s kind of crazy but obvious.
Lex Fridman
(02:18:50)
It’s weird, it’s weird. I suppose it sounds obvious, yeah, if it’s true.
Lee Cronin
(02:18:56)
So, the reason why assembly theory turned me onto that was that, let’s just start in the present, and look at all the complex molecules, and go backwards in time, and understand how evolutionary processes gave rise to them. It’s not at all obvious that taxol, which is one of the most complex natural products produced by biology, was going to be invented by biology. It’s an accident.

(02:19:24)
Taxol is unique to Earth. There’s no taxol elsewhere in the universe, and taxol was not decided by the initial conditions. It was decided by this interplay between the … So, the past simply is embedded in the present. It gives some features. But why the past doesn’t map to the future one-to-one is because the universe is too big to contain itself. That gives space for creativity, and novelty, and some things which are unpredictable.

Cellular automata

Lex Fridman
(02:19:57)
Well, okay. So, given that you’re disrespecting the power of the initial conditions, let me ask you about, how do you explain that cellular automata are able to produce such incredible complexity given just basic rules and basic initial conditions?
Lee Cronin
(02:20:12)
I think that this falls into the Brouwer-Hilbert trap. So, how do you get cellular automata to produce complexity? You have a computer, you generate a display, and you map the change of that in time. There are some CAs that repeat like functions.

(02:20:32)
It’s fascinating to me that for pi, there is a formula where you can go to the millionth decimal place of pi and read out the digit without having to compute the ones before it. But there are some numbers where you can’t do that, and you have to just crank through. Whether it’s Wolframian computational irreducibility or some other thing, it doesn’t matter. But these CAs, that complexity: is it just complexity, or a number that you’re basically mining in time? Is the CA just a display screen for that number, that function?
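The digit-extraction formula Cronin is presumably alluding to is the Bailey–Borwein–Plouffe (BBP) formula, which lets you compute the n-th digit of pi without computing any of the preceding ones, though it works in base 16, not base 10:

```latex
\pi = \sum_{k=0}^{\infty} \frac{1}{16^k}
      \left( \frac{4}{8k+1} - \frac{2}{8k+4} - \frac{1}{8k+5} - \frac{1}{8k+6} \right)
```

No analogous base-10 digit-extraction formula for pi is known, which fits the point being made: for some numbers you can jump straight to a digit, while others can only be “mined” by cranking through.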
Lex Fridman
(02:21:10)
Well, can’t you say the same thing about the complexity on Earth then?
Lee Cronin
(02:21:12)
No. Because the complexity on Earth has a copy number and an assembly index associated with it. That CA is just a number running.
Lex Fridman
(02:21:20)
You don’t think it has a copy number? Wait a minute …
Lee Cronin
(02:21:23)
Well, it does where we’re looking at humans producing different rules, but then it’s nested on selection. So, those CAs are produced by selection. The CA is such a fascinating pseudo-complexity generator. What I would love to do is quantify the degree of surprise in a CA and run it long enough. But I guess what that means is we have to instantiate a number of experiments where we’re generating different rules and running them over time steps, but … Oh, I got it.

(02:21:53)
CAs are mining novelty in the future by iteration, right? And you’re like, “Oh, that’s great. That’s great.” You didn’t predict it. Some rules you can predict what’s going to happen, and other rules you can’t. So for me, if anything, CAs are evidence that the universe is too big to contain itself, because otherwise you’d know what the rules are going to do forevermore.
Lex Fridman
(02:22:14)
Right. I guess you were saying that the physicist saying that all you need is the initial conditions and the rules of physics is somehow missing the bigger picture.
Lee Cronin
(02:22:26)
Yeah.
Lex Fridman
(02:22:27)
And if you look at CAs, all you need is the initial condition and the rules, and then run the thing.
Lee Cronin
(02:22:33)
You need three things: the initial conditions, the rules, and time iteration to mine it out. Without the coordinate, you can’t get it out.
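The three ingredients Cronin names can be sketched with an elementary cellular automaton (an editor’s minimal illustration; Rule 30 is Wolfram’s classic example of a simple rule whose long-run pattern resists shortcut prediction):

```python
# Three ingredients: an initial condition, a rule (here Rule 30),
# and time iteration, which "mines out" the pattern step by step.
def step(cells, rule=30):
    """Apply one time step of an elementary CA with wraparound edges."""
    n = len(cells)
    # Each new cell is the rule's output for its three-cell neighborhood,
    # encoded as a 3-bit index into the rule number.
    return [(rule >> ((cells[(i - 1) % n] << 2)
                      | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1
            for i in range(n)]

cells = [0] * 31
cells[15] = 1  # initial condition: a single live cell
for _ in range(8):  # time iteration
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Running it prints the familiar chaotic Rule 30 triangle; the only known way to learn row one million is to iterate a million times, which is the “time as a resource” point.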
Lex Fridman
(02:22:45)
Sure, and that to you is fundamental?
Lee Cronin
(02:22:47)
And you can’t predict it from the initial conditions. If you could, then it could be fine.
Lex Fridman
(02:22:51)
And that time is-
Lee Cronin
(02:22:53)
A resource.
Lex Fridman
(02:22:54)
… like the foundation of the history, the memory of each of the things it created. It has to have that memory of all the things that led up to it.
Lee Cronin
(02:23:05)
Yeah, you have to have the resource. Because time is a fundamental resource. Yeah, I think I had a major epiphany about randomness, but I keep doing that every two days and then it goes away again. It’s random.
Lex Fridman
(02:23:24)
You’re a time fundamentalist.
Lee Cronin
(02:23:26)
And you should be as well. If you believe in free will, then the only conclusion is that time is fundamental. Otherwise you cannot have free will. It logically follows.
Lex Fridman
(02:23:37)
Well, the foundation of my belief in free will is observation-driven.
Lee Cronin
(02:23:48)
But that’s-
Lex Fridman
(02:23:48)
I think if you use logic, logically it seems like the universe is deterministic.
Lee Cronin
(02:23:55)
Looking backwards in time then that’s correct, the universe is.
Lex Fridman
(02:23:59)
And then everything else is a kind of leap. It requires a leap.
Lee Cronin
(02:24:11)
This is why I think machine learning is going to provide a chunk of that, right? To help us explain this. So, the way I’d say it, if you take …
Lex Fridman
(02:24:19)
That’s interesting. Why?

AGI

Lee Cronin
(02:24:21)
Well, my favorite one is … Because AI doomers are driving me mad, and in fact we don’t have any intelligence yet. I call AI “autonomous informatics” just to make people grumpy.
Lex Fridman
(02:24:34)
Yeah. You’re saying we’re quite far away from AGI.
Lee Cronin
(02:24:39)
I think that we have no conception of intelligence, and I think that we don’t understand how the human brain does what it does. I think that neuroscience is making great advances, but I think that we have no idea about AGI. So, I am a technological, I guess, optimist. I believe we should do everything. The whole regulation of AI is nonsensical. Why would you regulate Excel? Other than the fact that Clippy should come back. And I love Excel ’97, because we can do the flight simulator.
Lex Fridman
(02:25:11)
Sorry, in Excel?
Lee Cronin
(02:25:12)
Yeah, have you not played the flight simulator in-
Lex Fridman
(02:25:14)
In Excel ’97?
Lee Cronin
(02:25:16)
Yeah.
Lex Fridman
(02:25:18)
What does that look like?
Lee Cronin
(02:25:19)
It’s like wireframe, very basic. But basically I think it’s X zero, Y zero, shift, and it opens up and you can play the flight simulator.
Lex Fridman
(02:25:29)
Oh, wow. Wait, wait, is it using Excel?
Lee Cronin
(02:25:32)
Excel ’97.
Lex Fridman
(02:25:33)
Okay.
Lee Cronin
(02:25:34)
I resurrected it the other day and saw Clippy again for the first time in a long time.
Lex Fridman
(02:25:37)
Well, Clippy is definitely coming back. But you’re saying we don’t have a great understanding of what is intelligence, what is the intelligence underpinning the human mind.
Lee Cronin
(02:25:50)
I’m very frustrated by the way that we’re AI dooming right now, and people are bestowing some kind of magic. Now, let’s go back a bit. So, you said about AGI, are we far away from AGI? Yes. I do not think we’re going to get to AGI anytime soon. I’ve seen no evidence of it, and the AI doom scenario is nonsensical in the extreme.

(02:26:12)
The reason why I think it’s nonsensical … And that’s not to say there aren’t things we should do and be very worried about. There are things we need to worry about right now with what AI is doing. Whether it’s fake data, fake users. I want authentic people, authentic data. I don’t want everything to be faked, and I think it’s a really big problem, and I absolutely want to go on the record to say I really worry about that. What I’m not worried about is that some fictitious entity is going to turn us all to paperclips or detonate nuclear bombs, or maybe, I don’t know, anything you can think of.

(02:26:49)
Why is this? I’ll take a very simple series of logical arguments, and the AI doomers do not have the correct epistemology. They do not understand what knowledge is. And until we understand what knowledge is, they’re not going to get anywhere because they’re applying things falsely. So, let me give you a very simple argument.

(02:27:18)
People talk about the probability, “P(doom)”, of AI. We can work out the probability of an asteroid hitting the planet. Why? Because it’s happened before. We know the mechanism. We know that there’s a gravity well, or that spacetime is bent and stuff falls in. We don’t know the probability of AGI because we have no mechanism. So, let me give you another one, which is like, “I’m really worried about AG.” What’s AG? AG is anti-gravity. “One day we could wake up and anti-gravity is discovered, we’re all going to die, the atmosphere is going to float away, we’re going to float away, we’re all doomed.”

(02:27:52)
What is the probability of AG? We don’t know, because there’s no mechanism for AG. Do we worry about it? No, and I don’t understand the current reason for certain people in certain areas to be generating this nonsense. I think they’re not doing it maliciously. I think we’re observing the emergence of new religions, how religions form, because religions are about control.

(02:28:20)
You’ve got the optimist saying, “AI is going to cure us all,” and, “AI is going to kill us all.” What’s the reality? Well, we don’t have AI. We have really powerful machine learning tools and they will allow us to do interesting things, and we need to be careful about how we use those tools in terms of manipulating human beings and faking stuff. Right?
Lex Fridman
(02:28:38)
Right. Well, let me try to steel man the AI doomers’ argument. And actually, I don’t know, are AI doomers in the Yudkowsky camp saying it’s definitely going to kill us? Because there’s a spectrum.
Lee Cronin
(02:28:38)
95% I think is the limit.
Lex Fridman
(02:28:54)
Plus? 95%-plus, that’s the-
Lee Cronin
(02:28:55)
No, not plus. I don’t know. I was seeing on Twitter today various things. But I think Yudkowsky is at 95%.
Lex Fridman
(02:29:02)
But to belong to the AI doomer club, is there a threshold? I don’t know what the membership …
Lee Cronin
(02:29:06)
Maybe.
Lex Fridman
(02:29:07)
And what are the fees?
Lee Cronin
(02:29:09)
Well, I think Scott Aaronson, I was quite surprised, had put 2% … I saw this online, so I could be wrong. So, sorry if it’s wrong. He says 2%. But the thing is, if someone said there’s a 2% chance that you’re going to die going into the lift, would you go into the lift?
Lex Fridman
(02:29:24)
In the elevator, for the American English-speaking audience. Well, no, not for the elevator.
Lee Cronin
(02:29:30)
So, I would say anyone higher than 2% … I think there’s a 0% chance of AGI doom. Zero.
Lex Fridman
(02:29:37)
Just to push back on the argument that we have an N of zero on AGI … We can see on Earth that there are increasing levels of intelligence of organisms. We can see what humans with extra intelligence were able to do to the other species. So, that is a lot of samples of data on what a delta in intelligence gives you. When you have an increase in intelligence, how you’re able to dominate a species on Earth.

(02:30:08)
So, the idea there is that if you have a being that’s 10x smarter than humans, we’re not going to be able to predict what that being is going to be able to do, especially if it has the power to hurt humans. Which, you can imagine a lot of trajectories in which the more benefit AI systems give, the more control we give to those AI systems over our power grid, over our nuclear weapons, or weapons of any sort. And then it’s hard to know what an ultra-intelligence system would be able to do in that case. You don’t find that convincing?
Lee Cronin
(02:30:50)
I think I would fail that argument 100%. Here’s a number of reasons to fail it on. First of all, we don’t know where the intention comes from. The problem is that people keep … I’ve been watching all the hucksters online with the prompt engineering and all this stuff. When I talk to a typical AI computer scientist, they keep talking about the AIs having some kind of decision-making ability. That is a category error.

(02:31:17)
The decision-making ability comes from human beings. We have no understanding of how humans make decisions. We’ve just been discussing free will for the last half an hour, right? We don’t even know what that is. So, the intention, I totally agree with you, people who intend to do bad things can do bad things and we should not let that risk go. That’s totally here and now. I do not want that to happen, and I’m happy to be regulated to make sure that systems I generate, whether they’re computer systems, or … I’m working on a new project called “Chem Machina”.
Lex Fridman
(02:31:53)
Nice. Well done.
Lee Cronin
(02:31:54)
Yeah, yeah. Which is basically a …
Lex Fridman
(02:31:59)
For people who don’t understand the pun, the Ex Machina is a great film about I guess AGI embodied, and “chem” is the chemistry version of that.
Lee Cronin
(02:32:07)
And I only know one way to embody intelligence, and that’s in chemistry and human brains. So, category error number one is that they have agency. Category error number two is assuming that anything we make is going to be more intelligent. Now, you didn’t say super-intelligent. I’ll put the words into our mouths here, super-intelligent. I think that there is no reason to expect that we are going to make systems that are more intelligent. More capable …

(02:32:38)
When people play chess computers, they don’t expect to win now, right? The chess computer is very good at chess. That doesn’t mean it’s super-intelligent. So, I think that super-intelligence, and I think even Nick Bostrom is pulling back on this now, because he invented this … So, I see this a lot. When did I see it first happen? Eric Drexler, nanotechnology. Atomically precise machines. He came up with a world where we had these atom cogs everywhere and we were going to make self-replicating nanobots.

(02:33:06)
Not possible. Why? Because there’s no resources to build these self-replicating nanobots. You can’t get the precision. It doesn’t work. It was a major category error in taking engineering principles down to the molecular level. The only functioning nanomolecular technology we know is produced by evolution. There.

(02:33:27)
So, now let’s go forward to AGI. What is AGI? We don’t know. It’s super, it can do this, or humans can’t think. I would argue the only AGIs that exist in the universe are produced by evolution. And sure, we may be able to make our working memory better. We might be able to do more things. The human brain is the most compact computing unit in the universe. It uses 20 watts and occupies a really limited volume. It’s not like a ChatGPT cluster, which has to have thousands of watts and a model that’s generated and has to be corrected by human beings. You are autonomous and embodied intelligence.

(02:34:04)
So, I think that there are so many levels that we’re missing out on; we’ve just kind of gone, “Oh, we’ve discovered fire. Oh gosh, the planet’s just going to burn one day randomly.” I just don’t understand that leap. There are bigger problems we need to worry about. So, what is the motivation? Why do these people, assuming they’re earnest, have this conviction? Well, I think they’re making leaps and they’re trapped in a virtual reality that isn’t reality.
Lex Fridman
(02:34:34)
Well, I can continue a set of arguments here, but also it is true that ideologies that fearmonger are dangerous. Because you can then use it to control, to regulate in a way that halts progress, to control people, and to cancel people, all that kind of stuff. So, you have to be careful, because reason ultimately wins. Right?

(02:35:03)
But there is a lot of concerns with super-intelligent systems, very capable systems. I think when you hear the word “super-intelligent”, you’re hearing, “It’s smarter than humans in every way that humans are smart.” But the paperclip manufacturing system doesn’t need to be smart in every way. It just needs to be smart in a set of specific ways. And the more capable the AI systems become, the more you could see us giving them control over, like I said, our power grid, a lot of aspects of human life. And then that means they’ll be able to do more and more damage when there’s unintended consequences that come to life.
Lee Cronin
(02:35:46)
I think that that’s right. The unintended consequences we have to think about, and that I fully agree with. But let’s go back a bit. Sentience … Again, I’m far away from my comfort zone and all this stuff, but hey, let’s talk about it. Because I give myself a qualification.
Lex Fridman
(02:36:02)
Yeah, we’re both qualified in sentience, I think, as much as anyone else.
Lee Cronin
(02:36:07)
I think the paperclip scenario is just such a poor one, because let’s think about how that would happen. And also, let’s think about, we are being so unrealistic about how much of the Earth’s surface we have commandeered. For paperclip manufacturing to really happen, do the math. It’s not going to happen. There’s not enough energy, there’s not enough resource. Where is it all going to come from?

(02:36:32)
I think that what happens in evolution, it’s really: Why has a killer virus not killed all life on Earth? Well, what happens is, sure, superkiller viruses that kill the ribosome have emerged. But you know what happens? They nuke a small space because they can’t propagate. They all die. So, there’s this interplay between evolution and propagation, right? And death. So …
Lex Fridman
(02:36:56)
In evolution. You don’t think it’s possible to engineer, for example, and sorry to interrupt, but a perfect virus?
Lee Cronin
(02:37:02)
No.
Lex Fridman
(02:37:02)
That’s deadly enough?
Lee Cronin
(02:37:04)
No. Nonsensical. I think again, it wouldn’t work. Because if it was too deadly, it would just kill the radius and not replicate.
Lex Fridman
(02:37:11)
Yeah. But you don’t think it’s possible to get a …
Lee Cronin
(02:37:16)
If you were …
Lex Fridman
(02:37:17)
Not kill all of life on Earth, but kill all humans. There’s not many of us. There’s only like 8 billion. There are so many more ants, and they’re pretty smart.
Lee Cronin
(02:37:32)
I think the nice thing about where we are, I would love for the AI crowd to take a leaf out of the book of the bio-warfare, chemical warfare crowd. I mean, not love, because actually people have been killed with chemical weapons in the first and second World War, and bio-weapons have been made, and we can argue about COVID-19 and all this stuff. Let’s not go there just now. But I think there is a consensus that some certain things are bad and we shouldn’t do them, right? And sure, it would be possible for a bad actor to engineer something bad, but we would see it coming and we would be able to do something about it.

(02:38:16)
Now, I guess what I’m trying to say is when people talk about doom, and when you ask them for the mechanism, they just make something up. In this case, I’m with Yann LeCun. I think you put out a very good point about trying to regulate jet engines before we’ve even invented them. And I think that’s what I’m saying.

(02:38:39)
I’m not saying we should … I just don’t understand why these guys are going around literally making stuff up about us all dying, when basically we need to actually really focus on … Now, let’s say there’s some actors that are earnest. Let’s say Yudkowsky is being earnest and he really cares. But he loves it. He goes, “Da, da, da, and then you’re all going to die.” It’s like, why don’t we try and do the same thing and say, “You could do this, and then you’re all going to be happy forever after”?
Lex Fridman
(02:39:07)
Well, I think there’s several things to say there. One, I think there is a role in society for people that say we’re all going to die. Because I think it filters through as a message, as a viral message that gives us the proper amount of concern. Meaning it’s not 95%, but when you say 95% and it filters through society, it’ll give an average of like a 0.03%. An average. So, it’s nice to have people that are like, “We’re all going to die,” and then we’ll have a proper concern.

(02:39:41)
For example, I do believe we’re not properly concerned about the threat of nuclear weapons currently. It just seems like people have forgotten that that’s a thing, and there’s a war in Ukraine with a nuclear power involved. There are nuclear powers throughout the world, and it just feels like we’re on the brink of a potential world war to a degree that I don’t think people are properly calibrating in their heads. We’re all treating it as a Twitter battle as opposed to an actual threat.

(02:40:12)
So, it’s nice to have that kind of level of concern. But to me, when I hear AI doomers, what I’m imagining is with unintended consequences a potential situation where let’s say 5% of the world suffers deeply because of a mistake made, of unintended consequences. I don’t want to imagine the entirety of human civilization dying, but there could be a lot of suffering if this is done poorly.
Lee Cronin
(02:40:39)
I understand that, and I guess I’m involved in the whole hype cycle. So, let’s say having some people saying AI doom is a worry, fine. Let’s give them that. But there seem to be people who don’t think AI is doing that, and they’re trying to use it to control regulation and to push people to regulate, which stops humans generating knowledge. And I am an advocate for generating as much knowledge as possible.

Nuclear weapons


(02:41:15)
When it comes to nuclear weapons, I grew up in the ’70s and ’80s, when there was nuclear doom and a lot of adults really felt existential threat, almost as bad as now with AI doom. They were really worried. There were some great … Well, not great. There were some horrific documentaries. I think there was one called Threads that was made in the UK, which was terrible. It was so scary.

(02:41:40)
And I think that the correct thing to do is obviously get rid of nuclear weapons, but let’s think about unintended consequences. We’ve got rid of … This is going to be such a non sequitur. We got rid of all the sulfur particles in the atmosphere, right? All the soot. And what’s happened in the last couple of years is global warming has accelerated because we’ve cleaned up the atmosphere too much. So …
Lex Fridman
(02:42:02)
Sure. The same thing if you get rid of nuclear weapons. You’ll get [inaudible 02:42:05]-
Lee Cronin
(02:42:05)
Exactly, that’s my point. So, what we could do is if we actually started to put the AI in charge … Which I’d really like an AI to be in charge of all world politics, and this will sound ridiculous for a second. Hang on. But if we could all agree on the-
Lex Fridman
(02:42:19)
The AI doomers just woke up on that statement.
Lee Cronin
(02:42:22)
Yeah, yeah, yeah. But I really don’t like politicians who are basically just looking at local sampling. But if you could say globally, “Look, here’s some game theory here. What is the minimum number of nuclear weapons we need to distribute around the world to everybody to basically reduce war to zero?”
Lex Fridman
(02:42:40)
Just the thought experiment of, the United States and China and Russia and major nuclear powers get together and say, “All right, we’re going to distribute nuclear weapons to every single nation on Earth.” Oh, boy. That has a probably greater than 50% chance of eliminating major military conflict, but it’s not a hundred percent.
Lee Cronin
(02:43:07)
But I don’t think anyone will use them, because … And look, what you’ve got to try and do is to qualify for these nuclear weapons … This is a great idea. The game theorists could do this, right?
Lex Fridman
(02:43:19)
Uh-huh.
Lee Cronin
(02:43:20)
I think the question is this … I really buy your question. We have too many nukes. Just from a feeling point of view, we’ve got too many of them. So, let’s reduce the number, but not get rid of them, because we’ll have too much conventional warfare. So then, what is the minimum number of nuclear weapons we can distribute around to remove … Humans hurting each other is something we should stop doing. It’s not outwith our conceptual capability …

(02:43:46)
But right now, what about certain nations that are being exploited for their natural resources in the future for a short-term gain because we don’t want to generate knowledge? So, if everybody had an equal doomsday switch, I predict the quality of life of the average human will go up faster. I am an optimist, and I believe that humanity is going to get better and better and better, that we’re going to eliminate more problems. But I think, yeah, let’s-
Lex Fridman
(02:44:13)
But the probability of a bad actor, of one of the nations setting off a nuclear weapon, you have to integrate that into the calculus here.
Lee Cronin
(02:44:26)
But we just give you [inaudible 02:44:28] nukes population. Right? What we do is we … I can’t believe this. But anyway, let’s just go there. So, if a small nation with a couple of nukes uses one because they’re a bit bored or annoyed, the likelihood that they are going to be pummeled out of existence immediately is 100%. And yet they’ve only nuked one other city. I know this is crazy, and I apologize for …
Lex Fridman
(02:44:51)
Well, no, no. Just to be clear, we’re just having a thought experiment that’s interesting. But there’s terrorist organizations that would take that trade. We have to ask ourselves a question of: Which percentage of humans would be suicide bombers, essentially? Where they would sacrifice their own life because they hate another group of people. I believe it’s a very small fraction, but is it large enough to, if you give out nuclear weapons …
Lee Cronin
(02:45:25)
I can predict a future where we take all nuclear material and we burn it for energy, right? Because we’re getting there. And the other thing you could do is say, “Look, there’s a gap.” So, if we get all the countries to sign up to the virtual agreement where we have a simulation where we can nuke each other in the simulation and the economic consequences are catastrophic …
Lex Fridman
(02:45:43)
Sure. In the simulation, I love it. It’s not going to kill all humans, it’s just going to have economic consequences.
Lee Cronin
(02:45:49)
Yeah, yeah. I don’t know, I just made it up. It seems like a cool idea.
Lex Fridman
(02:45:51)
No, it’s interesting. But it’s interesting whether that would have as much power on human psychology as actual physical nuclear explosion.
Lee Cronin
(02:45:59)
I think so.
Lex Fridman
(02:46:00)
It’s possible, but people don’t take economic consequences as seriously I think as actual nuclear weapons exploding.
Lee Cronin
(02:46:07)
I think they do in Argentina, and they do in Somalia. And they do in a lot of these places where … No, I think this is a great idea. I’m a strong advocate now for … So, what have we come up with? Burning all the nuclear material to have energy. And before we do that, because MAD is good, mutually assured destruction is very powerful, let’s take it into the metaverse and then get people to kind of subscribe to that. And if they actually nuke each other even for fun in the metaverse, there are dire consequences.
Lex Fridman
(02:46:36)
Yeah, yeah. So, it’s like a video game. We all have to join this metaverse video game …
Lee Cronin
(02:46:41)
Yeah. I can’t believe we just …
Lex Fridman
(02:46:43)
And then there’s dire economic consequences. And it’s all run by AI, as you mentioned, so the AI doomers are really terrified at this point.
Lee Cronin
(02:46:52)
No, they’re happy. They have a job for another 20 years, right?
Lex Fridman
(02:46:55)
Oh, fear-mongering.
Lee Cronin
(02:46:56)
Yeah, yeah, yeah. I’m a believer in equal employment.

Chem Machina

Lex Fridman
(02:47:00)
You’ve mentioned that, what’d you call it … Chem Machina?
Lee Cronin
(02:47:06)
Yeah.
Lex Fridman
(02:47:07)
Yeah. So, you’ve mentioned that a chemical brain is something you’re interested in creating, and that’s the way to get conscious AI soon. Can you explain what a chemical brain is?
Lee Cronin
(02:47:22)
I want to understand the mechanism of intelligence that’s gone through evolution, right? Because the way that intelligence was produced by evolution appears to be the following: origin of life, multi-cellularity, locomotion, senses. Once you can start to see things coming towards you, and you can remember the past and interrogate the present and imagine the future, you can do something amazing, right? And I think only in recent years did humans become Turing-complete, right?
Lex Fridman
(02:47:57)
Yeah.
Lee Cronin
(02:47:58)
Right? So, that Turing completeness kind of gave us another kick up. But our ability to process that information was produced in a wet brain. And I think that we do not have the correct hardware architectures to have the domain flexibility and the ability to integrate information, and I think intelligence also comes at a massive compromise of data. Right now we’re obsessing about getting more and more data, more and more processing, more and more tricks to get dopamine hits. So, when we look back on this going, “Oh yeah, that was really cool, because when I asked ChatGPT, it made me feel really happy and I got a hit from it.” But actually it just exposed how little intelligence I use in every moment, because I’m easily fooled.

(02:48:58)
So, what I would like to do is to say, “Well, hey, hang on. What is it about the brain?” So, the brain has this incredible connectivity, and it has the ability to … As I said earlier about my nephew, I went from “Bill” to “Billy” and he went, “All right, Leroy.” How did he make that leap? That he was able to basically without any training … I extended his name in a way that he doesn’t like. He wants to be called Bill. He went back and said, “You like to be called Lee? I’m going to call you Leroy.”

(02:49:29)
So, human beings have a brilliant ability, or intelligent beings appear to have a brilliant ability, to integrate across all domains all at once, and to synthesize something which allows us to generate knowledge, and to become Turing-complete on our own. Although AIs are built on Turing-complete machines, their thinking is not Turing-complete, in that they are not able to build universal explanations. And that lack of universal explanation means that they’re just-
Lee Cronin
(02:50:00)
Lack of universal explanation means that they’re just inductivists. Inductivism doesn’t get you anywhere. It’s just basically a party trick. I think it’s in The Fabric of Reality by David Deutsch where basically the farmer is feeding the chicken every day and the chicken’s getting fat and happy. And the chicken’s like, “I’m really happy every time the farmer comes in and feeds me.” And then one day the farmer comes in and, instead of feeding the chicken, just wrings its neck. Had the chicken had an alternative understanding of why the farmer was feeding it, it would have predicted a very different future.
Lex Fridman
(02:50:37)
It’s interesting though, because we don’t know what’s special about the human mind that’s able to come up with these kinds of generalities, these universal theories of things, and to come up with novelty. I can imagine… Because you gave an example about William and Leroy. I feel like an example like that is something we’ll be able to see in future versions of large language models. We’ll be really, really, really impressed by the humor, the insights, all of it. Because it’s fundamentally trained on all the incredible humor and insights that are available out there on the internet. So we’ll be impressed. I think we’ll be impressed.
Lee Cronin
(02:51:22)
Oh, I’m impressed. I’m impressed.
Lex Fridman
(02:51:25)
Increasingly so.
Lee Cronin
(02:51:26)
But we are mining the past.
Lex Fridman
(02:51:28)
Yes.
Lee Cronin
(02:51:28)
And what the human brain appears to be able to do is mine the future.
Lex Fridman
(02:51:31)
Yes. So novelty, it is interesting whether these large language models will ever be able to come up with something truly novel.
Lee Cronin
(02:51:41)
I can show on the back of a piece of paper why that’s impossible. The problem is that… And again, these are domain experts kind of bullshitting each other. The term “generative,” right? The average person says, “Oh, it’s generating.” No, no, no. Look, if I take the numbers between zero and 1,000 and I train a model to pick out the prime numbers by giving it all the prime numbers between zero and a thousand, it doesn’t know what a prime number is. Occasionally, if I cheat a bit, it will start to guess.
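Cronin’s prime-number example can be sketched in a few lines: a learner that only memorizes which numbers in its training range are prime has learned instances, not the concept of primality, so it has nothing to say outside that range. A toy illustration (the memorizing “model” here is deliberately naive and purely for illustration, not a claim about any real system):

```python
def is_prime(n: int) -> bool:
    """Ground truth, used only to build the training set."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

# "Training": the learner memorizes the labeled examples below 1,000.
training_primes = {n for n in range(1000) if is_prime(n)}

def memorizing_model(n: int):
    """Answers from memory inside the training range; clueless outside it."""
    if n < 1000:
        return n in training_primes
    return None  # no rule was learned, only instances

print(memorizing_model(997))   # True: 997 is in the memorized set
print(memorizing_model(1009))  # None: 1009 is prime, but outside the data
```

A model that had instead induced the concept, an algorithm for primality, would extrapolate to 1009 without ever having seen it; that gap is the point being argued.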

(02:52:12)
It never will produce anything outwith the dataset, because you mine the past. The thing that I’m getting to is I think that current machine learning technologies might actually help reveal why time is fundamental. It’s kind of insane. Because they tell you about what’s happened in the past, but they can never help you understand what’s happening in the future without training examples. Sure, they work if that thing happens again. So let’s think about what large language models are doing. We have all the internet as we know it, language, but they’re also doing something else. We have human beings correcting it all the time. Those models are being corrected,
Lex Fridman
(02:52:54)
Steered.
Lee Cronin
(02:52:56)
Corrected, modified, tweaked.
Lex Fridman
(02:53:01)
Well, yeah, but-
Lee Cronin
(02:53:02)
Cheating.
Lex Fridman
(02:53:04)
Well you could say the training on human data in the first place is cheating.
Lee Cronin
(02:53:08)
Well, human is in the loop. Sorry to interrupt.
Lex Fridman
(02:53:10)
Yes. So human is definitely in the loop, but it’s not just human is in the loop. A very large collection of humans is in the loop.
Lee Cronin
(02:53:10)
Look I totally-
Lex Fridman
(02:53:21)
And that could be… I mean, to me it’s not intuitive that, as you said with prime numbers, the system can’t generate an algorithm: an algorithm that can generate prime numbers, or an algorithm that can tell you if a number is prime, and so on. And generate algorithms that generate algorithms, that generate algorithms, that start to look a lot like human reasoning.
Lee Cronin
(02:53:46)
I think again, we can show that on a piece of paper, sure. I think you have to have… So this is the failure in epistemology. I’m glad I can even say that word, let alone know what it means.
Lex Fridman
(02:53:59)
You said it multiple times.
Lee Cronin
(02:54:00)
I know. It’s like three times now.
Lex Fridman
(02:54:01)
Without failure. Quit while you’re ahead. Just don’t say it again because you did really well.
Lee Cronin
(02:54:07)
Thanks. But I think, so what is reasoning? So coming back to the chemical brain. If I could show the inner… Because I mean, I’m never going to make an intelligence in chem machina, because it doesn’t have brain cells, it doesn’t have glial cells, it doesn’t have neurons. But if I can take a gel and engineer the gel to be a hybrid hardware for reprogramming, which I think I know how to do, I will be able to process a lot more information and train models billions of times cheaper, and use cross-domain knowledge. And there are certain techniques I think we can do. But it’s still missing the abilities that human beings have had to become Turing-complete. And so I guess the question to give back to you is: how do you tell the difference between trial and error and the generation of new knowledge?

(02:55:06)
I think the way you can do it is this: you come up with a theory, an explanation, inspiration comes from outside, and then you test that, and then you see whether it’s going towards the truth. And human beings are very good at doing that, and at the transition between philosophy, mathematics, physics, and the natural sciences. And I think that we can see that. Where I get confused is why people misappropriate the term artificial intelligence to say, “Hey, there’s something else going on here.” Because I think you and I both agree, machine learning’s really good, it’s only going to get better. We’re going to get happier with the outcome. But why would you ever think the model is thinking or reasoning? Reasoning requires intention. If the model isn’t reasoning, the intention comes from the prompter, and the intention comes from the person who programmed it to do it.
Lex Fridman
(02:56:08)
But don’t you think you can prompt it to have intention? Basically, start with the initial conditions and get it going? Whereas currently, large language models like ChatGPT only talk to you when you talk to them. There’s no reason why you can’t just start it talking.
Lee Cronin
(02:56:31)
But those initial conditions came from someone starting it.
Lex Fridman
(02:56:35)
Yes.
Lee Cronin
(02:56:35)
And that causal chain is in there. So that intention comes from the outside. I think that there is something in that causal chain of intention that’s super important. I don’t disagree, we’re going to get to AGI. It’s a matter of when and what hardware. I think we’re not going to do it in this hardware, and I think we’re unnecessarily fetishizing really cool outputs and dopamine hits. Because obviously that’s what people want to sell us.
Lex Fridman
(02:56:57)
Well, AGI is a loaded term. But there could be incredibly super impressive intelligence systems on the way to AGI. So these large language models, I mean, if it appears conscious, if it appears super intelligent, who are we to say it’s not?
Lee Cronin
(02:57:21)
I agree, but the superintelligence I want, I want to be able to have a discussion with it about coming up with fundamental new ideas that generate knowledge. And if the superintelligence we generate can mine novelty, even from the future, that I didn’t see in its training set in the past, I would agree that something really interesting is going on. I’ll say that again. If the intelligent system, be it a human being, a chatbot, something else, is able to produce something truly novel that I could not predict, even having a full audit trail from the past, then I’ll be sold.
Lex Fridman
(02:57:58)
Well, so we should be clear that it can currently produce things that are in a shallow sense novel. That are not in the training set. But you’re saying truly novel.
Lee Cronin
(02:58:11)
I think they are in the training set. I think everything it produces comes from a training set. There’s a difference between novelty and interpolation. We do not understand where these leaps come from yet. That is what intelligence is, I would argue: those leaps. And some people say no, it’s actually just what will happen if you do cross-domain training and all that stuff. And that may be true, and I may be completely wrong. But right now, the human mind is able to mine novelty in a way that artificial intelligence systems cannot. And this is why we still all have a job, and we’re still doing stuff. And I used ChatGPT for a few weeks. “Oh, this is cool.” And then what happened is it took me too much time to correct it. Then it got really good. And now they’ve done something to it. It’s not actually that good.
Lex Fridman
(02:58:58)
Yeah, right.
Lee Cronin
(02:58:59)
I don’t know what’s going on.
Lex Fridman
(02:59:00)
Censorship. Yeah, I mean that’s interesting. But it will push us humans to characterize novelty better. Characterize the novel, what is novel, what is truly novel, what’s the difference between novelty and interpolation.
Lee Cronin
(02:59:10)
I think that this is the thing that makes me most excited about these technologies: they’re going to help me demonstrate to you that time is fundamental, and that the future is bigger than the present. Which is why human beings are quite good at generating novelty, because we have to expand our dataset to cope with unexpected things in our environment. Our environment throws them all at us, and we have to survive in that environment. And I mean, I never say never. I would be very interested in how we can get cross-domain training cheaply in chemical systems. Because I’m a chemist, and the only similar thing I know of is a human brain. But maybe that’s just me being boring and predictable and not novel.

GPT for electron density

Lex Fridman
(02:59:54)
Yeah. You mentioned GPT for electron density. So, a GPT-like system for generating molecules that can bind to a host automatically. I mean, that’s interesting. It’s really interesting: applying this same kind of transformer mechanism.
Lee Cronin
(03:00:11)
I mean, my team, I try and do things that are non-obvious, but non-obvious in certain areas. And one of the things I was always asking about in chemistry is that people like to represent molecules as graphs, and it’s quite difficult. It’s really hard if you’re doing AI in chemistry; you really want to have good representations so you can generate new molecules that are interesting. And I was thinking, well, molecules aren’t really graphs, and they’re not continuously differentiable. Could I do something that was continuously differentiable? I was like, well, molecules are actually made up of electron density. So I got thinking and said, well, okay, could there be a way where we could just basically take a database of already solved electron densities for millions of molecules? So we took the electron density for millions of molecules and just trained the model to learn what electron density is.

(03:01:06)
And so what we built was a system where you could literally give it, let’s say, a protein that has a particular active site, or a cup with a certain hole in it. You pour noise into it, and with a GPT you turn the noise into electron density. And then in this case it hallucinates, like all of them do. But then hallucinations are good, because it means I don’t have to train on such a huge dataset, because these datasets are very expensive. How do you produce them? So go back a step. You’ve got all these molecules in this dataset, but what you’ve literally done is a quantum mechanical calculation: we produced electron densities for each molecule. So you say, oh, this representation of this molecule has these electron densities associated with it, so you know what the representation is, and you train the neural network to know what electron density is.
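The “pour noise in, get electron density out” step is, in spirit, a denoising process. Below is a toy sketch of that idea only; the target profile, the fixed update rule, and the tiny 1-D grid are all made up for illustration (a real model of the kind described learns its denoising update from quantum-mechanical training data):

```python
import random

random.seed(0)

# Pretend 1-D "electron density" the generator should converge to (made up).
target = [0.0, 0.2, 0.8, 1.0, 0.8, 0.2, 0.0]

# Start from pure noise, as in "you pour noise into it".
state = [random.gauss(0.0, 1.0) for _ in target]

for step in range(50):
    # Each step removes a fraction of the gap between state and target,
    # standing in for one learned denoising update.
    state = [s + 0.2 * (t - s) for s, t in zip(state, target)]

error = max(abs(s - t) for s, t in zip(state, target))
print(f"max deviation from target after denoising: {error:.2e}")
```

The interesting part of the real system is that the "target" is not fixed: the model proposes density that fits a pocket it has never seen.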

(03:01:54)
So then you give it an unknown pocket. You pour in noise and you say, right, produce me electron density. It produces electron density that doesn’t look ridiculous. And what we did in this case is we produced electron density that maximizes the electrostatic potential, so the stickiness, but minimizes what we call the steric hindrance, the overlaps, so it’s repulsive. So make the perfect fit. And then we use kind of like a ChatGPT-type thing to turn that electron density into what’s called a SMILES. A SMILES string is a way of representing a molecule in letters. And then we can then-
Lex Fridman
(03:02:32)
So it just generates them then.
Lee Cronin
(03:02:34)
Just generates them. And then the other thing is then we bung that into the computer and then it just makes it.
Lex Fridman
(03:02:39)
Yeah, the computer being the thing that right… To generate-
Lee Cronin
(03:02:40)
The robot we’ve got that can basically just do chemistry. So we’ve kind of got this end-to-end drug discovery machine where you can say, “Oh, you want to bind to this active site, here you go.” I mean it is a bit leaky and things kind of break, but it is the proof of principle.
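For readers unfamiliar with the notation: a SMILES string linearizes a molecule’s bond graph into text, which is what makes it a convenient target for language-model-style generation. The molecules below are standard textbook examples, and the rough heavy-atom counter is illustrative only (a real toolkit such as RDKit parses SMILES properly):

```python
# Standard textbook SMILES examples (not molecules from the conversation).
examples = {
    "CCO": "ethanol",
    "c1ccccc1": "benzene (aromatic ring)",
    "CC(=O)O": "acetic acid",
}

def count_heavy_atoms(smiles: str) -> int:
    """Rough heavy-atom (non-hydrogen) count: one per atom symbol, skipping
    ring-closure digits, bond symbols, and branch parentheses. Only the
    two-letter symbols Cl and Br are handled; bracket atoms are not."""
    count, i = 0, 0
    while i < len(smiles):
        if smiles[i:i + 2] in ("Cl", "Br"):
            count += 1
            i += 2
        elif smiles[i].isalpha() and smiles[i].upper() != "H":
            count += 1
            i += 1
        else:
            i += 1  # digits, bonds (=, #), parentheses, etc.
    return count

for smi, name in examples.items():
    print(f"{smi!r} ({name}): {count_heavy_atoms(smi)} heavy atoms")
```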
Lex Fridman
(03:02:56)
But were the hallucinations, are those still accurate?
Lee Cronin
(03:03:01)
Well, the hallucinations are really great in this case. Because in the case of a large language model, the hallucinations just make everything up. Well, it doesn’t just make everything up, but it gives you an output that you’re plausibly comfortable with, that it thinks is probable. The problem with these electron density models is that it’s very expensive to solve the Schrödinger equation going up to many heavy atoms and large molecules. And so we wondered, if we trained the system on up to nine heavy atoms, whether it would go beyond nine, and it did. It started to generate molecules with 12, no problem. They looked pretty good. And I was like, well, this hallucination I will take for free. Thank you very much.

(03:03:42)
Because it just basically… This is a case where interpolation and extrapolation worked relatively well, and we were able to generate really good molecules. And then what we were able to do here, and this is a really good point and what I was trying to say earlier, is that we were able to generate new molecules, beyond the known set, that would bind to the host. So a new guest would bind. Were these truly novel? Not really, because they were constrained by the host. Were they new to us? Yes. So I do understand… I can concede that machine learning systems, artificial intelligence systems, can generate new entities, but how novel are they? It remains to be seen.
Lex Fridman
(03:04:32)
And how novel the things that humans generate is also difficult to quantify. They seem novel.
Lee Cronin
(03:04:40)
That’s what a lot of people say. So the way to really get to genuine novelty, and assembly theory shows you the way, is to have different causal chains overlap. And this really, really resonates with the time-is-fundamental argument. If you are bringing together a couple of objects with different initial conditions, when they interact, the more different their histories, the more novelty they generate in time going forward. And so it could be that genuine novelty is basically about mixing it up a little. And the human brain is able to mix it up a little, and all that stimulus comes from the environment. But all I think I’m saying is the universe is deterministic going back in time, non-deterministic going forward in time, because the universe is too big in the future to contain in the present. Therefore these collisions of known things generate unknown things, which then become part of your dataset and don’t appear weird. That’s how we give ourselves comfort. The past looks consistent with this initial-condition hypothesis, but actually we’re generating more and more novelty. And that’s how it works. Simple.
Lex Fridman
(03:05:58)
So it’s hard to quantify novelty looking backwards. I mean, the present and the future are the novelty generators.
Lee Cronin
(03:06:05)
But I like this whole idea of mining novelty. I think it is going to reveal the limitations of current AI. It’s a bit like the printing press. Everyone thought when the printing press came that books were going to be terrible, that they had evil spirits and all this. They were just books.
Lex Fridman
(03:06:26)
And same with AI. But I think just the scale you can achieve in terms of impact with AI systems is pretty nerve wracking.
Lee Cronin
(03:06:35)
But that’s what the big companies want you to think.
Lex Fridman
(03:06:39)
But not in terms of destroy all humans. But you can have major consequences in the way social media has had major consequences, both positive and negative. And so you have to think about it and worry about it. But yeah, people that fear monger…
Lee Cronin
(03:06:55)
My pet theory for this, you want to know?
Lex Fridman
(03:06:58)
Yeah.
Lee Cronin
(03:06:59)
It’s this: I think that a lot… And maybe I’m being… And I really do respect a lot of the people out there who are trying to have discourse about the positive future, the OpenAI guys, the Meta guys, and all this. What I wonder is whether they’re trying to cover up for the fact that social media has had a pretty disastrous effect at some level, and they’re just trying to say, “Oh yeah, we should do this,” covering up for the fact that we have got some problems with teenagers, and Instagram, and Snapchat, and all this stuff, and maybe they’re just overreacting now. It’s like, “Oh yeah, sorry, we made the bubonic plague and gave it to you all and you’re all dying.” And, “Oh yeah, but look at this over here, it’s even worse.”
Lex Fridman
(03:07:40)
Yeah, there’s a little bit of that. But there’s also not enough celebration of the positive impact that all of these technologies have had. We tend to focus on the negative and tend to forget that. In part because it’s hard to measure. It is very hard to measure the positive impact social media had on the world.
Lee Cronin
(03:07:58)
Yeah, I agree. But what I worry about right now is, I do care about the ethics of what we’re doing. And one of the reasons why I’m so open about the things we’re trying to do in the lab, making life, looking at intelligence, all this, is so people can say, what are the consequences of this? And you say, what are the consequences of not doing it? And I think that what worries me right now, in the present, is the lack of authenticated users and authenticated data and-
Lex Fridman
(03:08:25)
Human users.
Lee Cronin
(03:08:26)
Yeah, human.
Lex Fridman
(03:08:28)
I still think that there will be AI agents that appear to be conscious, but they would have to be also authenticated and labeled as such. There’s too much value in that. Like friendships with AI systems. There’s too much meaningful human experiences to have with the AI systems that I just…
Lee Cronin
(03:08:48)
But that’s like a tool, right? It’s a bit like a meditation tool, right?
Lex Fridman
(03:08:50)
Sure.
Lee Cronin
(03:08:50)
Some people have a meditation tool, it makes them feel better. But I’m not sure you can ascribe sentience and legal rights to a chatbot that makes you feel less lonely.
Lex Fridman
(03:09:00)
Sentience, yes. I think legal rights, no. I think it’s the same. You can have a really deep, meaningful relationship with a dog.
Lee Cronin
(03:09:08)
Well the dog is sentient.
Lex Fridman
(03:09:10)
Yes.
Lee Cronin
(03:09:12)
The chatbots right now, using the technology we use, it’s not going to be sentient.

God

Lex Fridman
(03:09:16)
This is going to be a fun continued conversation on Twitter that I look forward to. Since you’ve also had, from another place, some debates that were inspired by the assembly theory paper, let me ask you about God. Is there any room for notions of God in assembly theory?
Lee Cronin
(03:09:42)
Yeah. I don’t know what God is. I mean, God exists in our minds, created by selection. So human beings have created the concept of God in the same way that human beings have created the concept of superintelligence.
Lex Fridman
(03:09:57)
Sure, but does it mean, does it not… It still could mean that that’s a projection from the real world where we’re just assigning words and concepts to a thing that is fundamental to the real world. That there is something out there that is a creative force underlying the universe.
Lee Cronin
(03:10:22)
I think the universe… There is a creative force in the universe, but I don’t think it’s sentient. So I do not understand the universe. So who am I to say that God doesn’t exist? I am an atheist, but I’m not an angry atheist. There’s some people I know that are angry atheists and say-
Lex Fridman
(03:10:49)
Cranky.
Lee Cronin
(03:10:50)
Say that religious people are stupid. I don’t think that’s the case. I have faith in some things. I mean when I was a kid I was like, I need to know what the charge of electron is. And I was like, I can’t measure the charge on electron. I just gave up and had faith. Okay, you know, resistors worked. So when it comes to… I want to know why the universe is growing in the future and what humanity is going to become. And I’ve seen that the acquisition of knowledge via the generation of novelty to produce technology has uniformly made humans’ lives better. I would love to continue that tradition.
Lex Fridman
(03:11:31)
You said that there’s that creative force. Do you think, just to think on that point, do you think there’s a creative force? Is there like a thing, like a driver that’s creating stuff?
Lee Cronin
(03:11:45)
Yeah, so I think that…
Lex Fridman
(03:11:48)
And where? What is it? Can you describe it mathematically?
Lee Cronin
(03:11:51)
Well, I think selection. I think selection.
Lex Fridman
(03:11:53)
Selection is the force.
Lee Cronin
(03:11:54)
Selection is the force in the universe. It creates novelty.
Lex Fridman
(03:11:58)
So is selection somehow fundamental? Like what…
Lee Cronin
(03:12:03)
Yeah, I think persistence of objects that could decay into nothing, through operations that maintain that structure. I mean, think about it. It’s amazing that things exist at all. That we’re just not a big combinatorial mess.
Lex Fridman
(03:12:17)
Yes.
Lee Cronin
(03:12:18)
So the fact that-
Lex Fridman
(03:12:21)
And exist. A thing that exists persists in time.
Lee Cronin
(03:12:23)
Yeah. Let’s think, maybe the universe is actually in the present. The things… Everything that can exist in the present does exist.
Lex Fridman
(03:12:39)
Well that would mean it’s deterministic, right?
Lee Cronin
(03:12:42)
I think the universe might be. So the universe started super small. The past was deterministic, there wasn’t much going on. And it was able to mine, mine, mine, mine. And so the process is somehow generating universes basically… I’m trying to put this into words.
Lex Fridman
(03:13:02)
Did you just say there’s no free will though?
Lee Cronin
(03:13:04)
No, I didn’t say that.
Lex Fridman
(03:13:05)
As if-
Lee Cronin
(03:13:06)
Sorry, sorry, sorry.
Lex Fridman
(03:13:06)
-it can exist.
Lee Cronin
(03:13:07)
I said there is free will. I’m saying that free will occurs at the boundary between the-
Lex Fridman
(03:13:17)
The past and the future?
Lee Cronin
(03:13:19)
The past and the future.
Lex Fridman
(03:13:20)
Yeah, I got you. But everything that can exist does exist.
Lee Cronin
(03:13:25)
So everything that’s possible to exist at this… So no, I’m really pulling this…
Lex Fridman
(03:13:30)
There’s a lot of loaded words there. There’s a time element loaded into that statement.
Lee Cronin
(03:13:36)
I think that the universe is able to do what it can in the present, right?
Lex Fridman
(03:13:40)
Yeah.
Lee Cronin
(03:13:40)
And then I think in the future there are other things that could be possible. We can imagine lots of things, but they don’t all happen.
Lex Fridman
(03:13:45)
Sure.
Lee Cronin
(03:13:46)
So what-
Lex Fridman
(03:13:46)
So that’s where-
Lee Cronin
(03:13:47)
So that’s what I guess I’m getting to.
Lex Fridman
(03:13:49)
-you sneak in free will right there.
Lee Cronin
(03:13:50)
Yeah. So I guess what I’m saying is what exists is a convolution of the past with the present, and the free will going into the future.
Lex Fridman
(03:14:00)
Well, we could still imagine stuff. Right? We can imagine stuff that will never happen.
Lee Cronin
(03:14:04)
And it’s an amazing force. Because this is the most important thing that we don’t understand: our imaginations can actually change the future in a tangible way, which is what the initial conditions and physics cannot predict. Your imagination has a causal consequence in the future.
Lex Fridman
(03:14:25)
Isn’t that weird to you?
Lee Cronin
(03:14:26)
Yeah. It breaks the laws of physics as we know them right now.
Lex Fridman
(03:14:37)
So you think the imagination has a causal effect on the future?
Lee Cronin
(03:14:41)
Yeah.
Lex Fridman
(03:14:42)
But it does exist in there in the head.
Lee Cronin
(03:14:45)
It does, but-
Lex Fridman
(03:14:45)
There must be a lot of power in whatever’s going on. There could be a lot of power, whatever’s going on in there.
Lee Cronin
(03:14:50)
If we go back to the initial-conditions view, that is simply not possible. But if we go into a universe where we accept that there is a finite ability to represent numbers, and you have rounding… Well, not rounding errors, you have some… What happens is that your ability to make decisions, imagine, and do stuff is at that interface between the certain and the uncertain. It’s not, as Yashar was saying to me, that randomness goes and you just randomly do random stuff. It is that you are set free a little on your trajectory. Free will is about being able to explore on this narrow trajectory, which allows you to build… You have a choice about what you build. And that choice is you interacting with the future in the present.
Lex Fridman
(03:15:39)
What to you is most beautiful about this whole thing? The universe?
Lee Cronin
(03:15:46)
The fact it seems to be very undecided, very open. The fact that every time I think I’m getting towards an answer to a question, there are so many more questions that keep the chase going.
Lex Fridman
(03:16:03)
Do you hate that it’s going to be over at some point for you?
Lee Cronin
(03:16:06)
No. Well, for me… I think, if you think about it: is it over for Newton now? Newton has had causal consequences in the future. We discuss him all the time.
Lex Fridman
(03:16:18)
His ideas, but not the person.
Lee Cronin
(03:16:21)
The person just had a lot of causal power when he was alive. But oh my God, one of the things I want to do is leave as many Easter eggs in the future when I’m gone to go, “Oh, that’s cool.”
Lex Fridman
(03:16:30)
Would you be very upset if somebody made a good large language model that’s fine tuned to Lee Cronin?
Lee Cronin
(03:16:37)
It would be quite boring. Because I mean, I…
Lex Fridman
(03:16:40)
No novelty generation?
Lee Cronin
(03:16:42)
I mean if it’s a faithful representation of what I’ve done in my life, that’s great. That’s an interesting artifact. But I think the most interesting thing about knowing each other is we don’t know what we’re going to do next.
Lex Fridman
(03:16:54)
Sure. Sure.
Lee Cronin
(03:16:57)
I mean within some constraints I’ve got, I can predict some things about you. You can predict some things about me. But we can’t predict everything.
Lex Fridman
(03:17:04)
Everything.
Lee Cronin
(03:17:05)
And it’s because we can’t predict everything that it’s exciting to come back and discuss and see. So yeah, I’m happy that some of the things that I’ve done can be captured, but I’m pretty sure that my angle on mining novelty for the future will not be captured.
Lex Fridman
(03:17:28)
Yeah. Yeah. So that’s what life is: just some novelty generation, and then you’re done. Each one of us just generates a little bit. Or has the capacity to, at least.
Lee Cronin
(03:17:43)
I think selection produces life, and life affects the universe. Universes with life in them are materially, physically, fundamentally different than universes without life. And that’s super interesting, and I have no beginnings of an understanding of it. I think maybe in a thousand years there’ll be a new discipline, and the humans will be like, “Yeah, of course. This is how it all works.” Right?
Lex Fridman
(03:18:10)
In retrospect, it’ll all be obvious I think.
Lee Cronin
(03:18:13)
I think assembly theory is obvious, that’s why a lot of people got angry. They were like, “Oh my God, this is such nonsense.” And like, “Oh, actually it’s not quite.” But the writing’s really bad.
Lex Fridman
(03:18:25)
Well, I can’t wait to see where it evolves, Lee. And I am glad I get to exist in this universe with you. You’re a fascinating human. This is always a pleasure. I hope to talk to you many more times. And I’m a huge fan of just watching you create stuff in this world. And thank you for talking today.
Lee Cronin
(03:18:44)
It’s a pleasure as always, Lex. Thanks for having me on.
Lex Fridman
(03:18:47)
Thanks for listening to this conversation with Lee Cronin. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Carl Sagan. We can judge our progress by the courage of our questions, and the depth of our answers. Our willingness to embrace what is true rather than what feels good. Thank you for listening. And hope to see you next time.

Transcript for Lisa Randall: Dark Matter, Theoretical Physics, and Extinction Events | Lex Fridman Podcast #403

This is a transcript of Lex Fridman Podcast #403 with Lisa Randall.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with Lisa Randall, a theoretical physicist and cosmologist at Harvard. Her work involves improving our understanding of particle physics, supersymmetry, baryogenesis, cosmological inflation, and dark matter.

(00:00:15)
This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. Now, dear friends, here’s Lisa Randall.

Dark matter


(00:00:24)
One of the things you work on and write about is dark matter. We can’t see it, but there’s a lot of it in the universe. You also end one of your books with a Beatles song quote, “Got to be good-looking ’cause he’s so hard to see.” What is dark matter? How should we think about it given that we can’t see it? How should we visualize it in our mind’s eye?
Lisa Randall
(00:00:47)
I think one of the really important things that physics teaches you is just our limitations, but also our abilities. The fact that we can deduce the existence of something that we don’t directly see is really a tribute to people that we can do that. It’s also something that tells you, you can’t overly rely on your direct senses. If you just relied on just what you see directly, you would miss so much of what’s happening in the world.

(00:01:15)
We can generalize this, but just for now to focus on dark matter, it’s something we know is there, and it’s not just one way we know it’s there. In my book, Dark Matter and the Dinosaurs, I talk about the many different ways. There are eight or nine ways that we deduce not just the existence of dark matter, but how much is there, and they all agree.

(00:01:36)
Now, how do we know it’s there? Because of its gravitational force. Individually, a particle doesn’t have such a big gravitational force. In fact, gravity is an extremely weak force compared to other forces we know about in nature, but there’s a lot of dark matter out there. It carries a lot of energy. Five times the amount of energy as the matter we know that’s in atoms, et cetera.

(00:02:00)
You can ask, how should we think about it? It’s just another form of matter that doesn’t interact with light, or at least as far as we know. It interacts gravitationally, it clumps, it forms galaxies, but it doesn’t interact with light, which means we just don’t see it. Most of our detection, before gravitational wave detectors, we only saw things because of their interactions with light in some sense.
Lex Fridman
(00:02:25)
In theory, it behaves just like any other matter, it just doesn’t interact with light.
Lisa Randall
(00:02:30)
When we say it interacts just like any other form of matter, we have to be careful because gravitationally, it interacts like other forms of matter, but it doesn’t experience electromagnetism, which is why it has a different distribution.

(00:02:44)
In our galaxy, it’s roughly spherical unless it has its own interactions, that’s another story. We know that it’s roughly spherical, whereas ordinary matter can radiate and clumps into a disk. That’s why we see the Milky Way disk. On large scales, in some sense, yes, all the matter is similar in some sense.

(00:03:06)
In fact, dark matter is in some sense more important because it can collapse more readily than ordinary matter because ordinary matter has radiative forces, which makes it hard to collapse on small scales. Actually it’s dark matter that drives galaxy formation and then ordinary matter comes along with it.

(00:03:30)
There’s also just more of it, and because there’s more of it, it can start collapsing sooner. That is to say, the energy density in dark matter dominates over radiation earlier than it would if you just had ordinary matter.
Lex Fridman
(00:03:43)
It’s part of the story of the origin of the galaxy, part of the story of the end of the galaxy, and part of the story of all the various interactions throughout.
Lisa Randall
(00:03:50)
Exactly. In my book, I make jokes about, it’s like when we think about a building, we think about the architect, we think about the high level, but we forget about all the workers that did all the grunt work. In fact, dark matter was really important in the formation of our universe, and we forget that sometimes.
Lex Fridman
(00:04:07)
That’s a metaphor on top of a metaphor. Okay. The unheard voices that do the actual work.
Lisa Randall
(00:04:16)
Exactly. No, but it is a metaphor, but it also captures something because the fact is we don’t directly see it, so we forget it’s there or we don’t understand it’s there, or we think it’s not. The fact that we don’t see it makes it no less legitimate, it just means that we have challenges in order to find out exactly what it is.
Lex Fridman
(00:04:35)
Yeah, but the things we cannot see that nevertheless have a gravitational interaction with the things we can see, at the layman level, it’s just mind-blowing.
Lisa Randall
(00:04:49)
It is and it isn’t because I think what it’s teaching us is that we’re human, the universe is what it is, and we’re trying to interact with that universe and discover what it is. We’ve discovered amazing things.

(00:05:03)
In fact, I would say it’s more surprising that the matter that we know about constitutes as big a fraction of the universe as it does. We’re limited, we’re human. The fact that we see 5% of the energy density of the universe, about one sixth of the energy density in matter, that’s remarkable. Why should that be? Anything could be out there, yet the universe that we see is a significant fraction.
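The fractions quoted here are consistent with each other; as a rough check, using the commonly quoted Planck-era values (these specific numbers are an outside assumption, not stated in the conversation):

```latex
% Approximate energy budget of the universe (assumed Planck-like values):
%   ordinary (baryonic) matter ~5%, dark matter ~27%, dark energy ~68%
\Omega_b \approx 0.05, \qquad \Omega_{\text{dm}} \approx 0.27, \qquad \Omega_\Lambda \approx 0.68
% Dark matter carries roughly five times the energy of ordinary matter:
\frac{\Omega_{\text{dm}}}{\Omega_b} \approx \frac{0.27}{0.05} \approx 5
% Ordinary matter is about one sixth of the total energy density in matter:
\frac{\Omega_b}{\Omega_b + \Omega_{\text{dm}}} \approx \frac{0.05}{0.32} \approx \frac{1}{6}
```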
Lex Fridman
(00:05:30)
Yeah, but a lot of our intuition, I think operates using visualizations in the mind.
Lisa Randall
(00:05:36)
That’s absolutely true. Certainly writing books, I realized also how many of our words are based on how we see the world, and that’s true. That’s actually one of the fantastic things about physics is that it teaches you how to go beyond your immediate intuition to develop intuitions that apply at different distances, different scales, different ways of thinking about things.
Lex Fridman
(00:05:57)
Yeah. How do you anthropomorphize dark matter?
Lisa Randall
(00:06:01)
I just did, I think. I made it the grunt workers.
Lex Fridman
(00:06:04)
Oh yeah, that’s good. You did. That’s why you get paid the big bucks and write the great books. Okay, you also write in that book about dark matter, having to do something with the extinction events, the extinction of the dinosaurs, which is a fascinating presentation of how everything is connected.

(00:06:28)
I guess the disturbances from the dark matter, they create gravitational disturbances in the Oort Cloud at the edge of our solar system, and then that increases the rate of asteroids hitting earth.
Lisa Randall
(00:06:42)
I want to be really clear, this was a speculative theory.
Lex Fridman
(00:06:44)
I love it, though.
Lisa Randall
(00:06:48)
I liked it too. We still don’t know for sure, but what we liked about it… Let me take a step back. We usually assume that dark matter, we being physicists, that it’s just one thing. It’s just basically non-interacting aside from gravity or very weakly interacting matter.

(00:07:11)
Again, we have to get outside this mindset of just humans and ask what else could be there. What we suggested is that there’s a fraction of dark matter, not all the dark matter, but some of the dark matter, maybe it has interactions of its own just the same way in our universe, we have lots of different types of matter. We have nuclei, we have electrons, we have neutrons, we have forces.

(00:07:35)
It’s not a simple model, the standard model, but it does have some basic ingredients, so maybe dark matter also has some interesting structure to it. Maybe there’s some small fraction. The interesting thing is that if some of the dark matter does radiate, and I like to call it dark light because it’s light that we don’t see, but dark matter would see. It could radiate that and then it could perhaps collapse into a disk the same way ordinary matter collapsed into the Milky Way disk.

(00:08:06)
It’s not all the dark matter, it’s a fraction, but it could conceivably be a very thin disk of dark matter, thin, dense disk of dark matter. The question is do these exist? People have done studies now to think about whether they can find them. It’s an interesting target, it’s something you can measure. By measuring the positions and velocities of stars, you can find out what the structure of the Milky Way is, but the fun proposal was that the solar system orbits around the galaxy.

(00:08:36)
As it does so, it goes a little bit up and down, kind of like horses on a carousel. The suggestion was every time it goes through, you have an enhanced probability that you would dislodge something from the edge of the solar system in something called the Oort Cloud. The idea was that at those times, you’re more likely to have these cataclysmic events such as the amazing one that actually caused the last extinction that we know of for sure.
Lex Fridman
(00:09:01)
It wasn’t so amazing for the dinosaurs.
Lisa Randall
(00:09:04)
Or for two thirds of the species on the planet.
Lex Fridman
(00:09:06)
But it gets amazing for humans. It wouldn’t be-
Lisa Randall
(00:09:08)
What really is amazing… I talk about this in Dark Matter and the Dinosaurs. It is just an amazing scientific story because it really is one of the real stories that combine different fields of science. Geologists at the time, or people generally, thought that things happened slowly, and this would be a cataclysmic event.

(00:09:27)
Also, I have to say if you think about it, it sounds like a story a five-year-old would make up. Maybe the dinosaurs were killed by some big rock that came and hit the earth, but then there really was a scientific story behind it. That’s also why I like the dark disk because there’s a scientific story behind it. As far-fetched as it might sound, you could actually go and look for the experimental consequences, for the observational consequences to test whether it’s true.
Lex Fridman
(00:09:51)
I wish you could know high-resolution details of where that asteroid came from, where in the Oort Cloud, why it happened, is it in fact because of dark matter? Just the full tracing back to the origin of the universe because humans seem to be somewhat special. It seems like so many fascinating events at all scales of physics had to happen for [inaudible 00:10:17].
Lisa Randall
(00:10:16)
I’m really, really glad you mentioned that because actually that was one of the main points of my book, Dark Matter and the Dinosaurs. One of the reasons I wrote it was because I really think we are abusing the planet, we’re changing the planet way too quickly. Just like anything else, when you alter things, it’s good to think about the history of what it took to get here.

(00:10:34)
As you point out, it took many operations on many different scales. We had to have the formation of structure, the formation of galaxies, the formation of the solar system, the formation of our planet, the formation of humans. There’s so many steps that go into this. Humans in some part were the result of the fact that this big object hit the earth, made the dinosaurs go extinct, and mammals developed. It is an incredible story and yes, something else might come of it, but it won’t be us if we mess with it too much.
Lex Fridman
(00:11:05)
But it is on a grand scale, earth is a pretty resilient system. Can you just clarify, just fascinating, the shape of things. The shape of the Milky Way’s… Of the observable stuff is mostly flat. You said dark matter tends to be spherical, but a subset of that might be a flat disk.
Lisa Randall
(00:11:31)
You want to hear about the shape of things.
Lex Fridman
(00:11:34)
Yes, please.
Lisa Randall
(00:11:36)
Structure formed early on, and now our structure that we live in is… We know about the Milky Way galaxy. The Milky Way galaxy has the disk you can see in a dry dark place, that’s where stars and light is, but you can also measure in some ways the dark matter. We believe that dark matter is more or less spherically distributed. Like we said, there’s a lot of it, not necessarily in the disk, but just because it’s a sphere, there’s a lot of it sitting there.

(00:12:11)
The reason it doesn’t collapse, as far as we know, is that it can’t radiate the same way. Because it can radiate, ordinary matter collapses, and actually, because of conservation of angular momentum, it stays a disk and it doesn’t just collapse to the center. Our suggestion was that maybe there are some components of dark matter that also radiate.

(00:12:31)
Like I said, that’s far from proven. People have looked for a disk, they see some evidence of some disks of certain densities, but these are all questions that are worth asking. Basically if we can figure it out from existing measurements, why not try?
Lex Fridman
(00:12:44)
Okay, so not all dark matter is made the same.
Lisa Randall
(00:12:48)
That’s a possibility. We actually don’t know what dark matter is in the first place, we don’t know what most of it is, we don’t know what a fraction is. It’s hard to measure. Why is it hard to measure? For exactly the reason you said earlier: we don’t see it. We want to think of possibilities for what it can be, especially if those give rise to some observational consequences. It’s a tough game because it’s not something that’s just there for the taking. You have to think about what it could be and how you might find it.
Lex Fridman
(00:13:16)
The way you detect it is gravitational effects on things we can see.
Lisa Randall
(00:13:22)
That would be the way you detect the type of dark matter I’ve been talking about. People have suggestions for other forms of dark matter. They could be particles called axions, they could be other types of particles, and then there are different ways of detecting it.

(00:13:34)
The most popular candidate for dark matter probably until pretty recently because they haven’t found it, is something called WIMPs, Weakly Interacting Massive Particles, particles that have mass about the same as the Higgs boson mass, and it turns out then you would get about the right density of dark matter.

(00:13:52)
People really like that, of course, because it is connected to the standard model, the particles that we know about, and if it’s connected to that, we have a better chance of actually seeing it. Fortunately or unfortunately, it’s also a better chance that you can rule it out because you can look for it. So far, no one has found it. We’re still looking for it.
Lex Fridman
(00:14:08)
Is that one of the hopes of the Large Hadron Collider?
Lisa Randall
(00:14:11)
That was originally one of the hopes of the Large Hadron Collider. I’d say at this point, it would be very unlikely given what they’ve already accomplished, but there are these underground detectors, xenon detectors that look for dark matter coming in, and they are going to try to achieve a much stronger bound than exists today.
Lex Fridman
(00:14:35)
Just to take that tangent, looking back now, what’s the biggest, to you, insight to humanity that the LHC has been able to provide?
Lisa Randall
(00:14:47)
It’s interesting. It’s both a major victory. The Higgs boson was proposed 50 years ago, and it was discovered. The Higgs mechanism seemed to be the only way to explain elementary particle masses, and it was right, so on the one hand, it was a major victory. On the other hand, I’ve been in physics long enough to know it was also a cautionary tale in some sense because at the time I started out in physics, we had proposed something in the United States called the Superconducting Supercollider.

(00:15:15)
A lot of physicists, I’ll say particularly in Europe, were saying that the Large Hadron Collider would have the energy reach necessary to discover what underlies the standard model. We don’t want to just discover the standard model, we want to know what the next step is.

(00:15:31)
I think here people were more cautious about that. They wanted a more comprehensive search that could get to higher energies, more events, so that we could really more definitively rule it out. In that case, many people thought they knew what would be there. It happened to be a theory called supersymmetry. A lot of physicists thought it would be supersymmetry.

(00:15:51)
It’s one of the many factors I think that went into the fact that the Large Hadron Collider became the only machine in town. The Superconducting Supercollider, if it had really achieved what it was supposed to, would’ve been a much more robust test of the space.

(00:16:07)
I’d say for humanity, it’s both a tribute to the ability of discovery and the ability of really believing in things so that you have the confidence to go look for them, but it’s also a cautionary tale that you don’t want to assume things before they’ve been actually found. You want to believe in your theories, but you also want to question them at the same time in ways that you’re more likely to discover the truth.
Lex Fridman
(00:16:32)
It’s also an illustration of grand engineering efforts that humanity can take on and maybe a lesson that you could go even bigger.
Lisa Randall
(00:16:43)
I’m really glad you said that too, because that’s absolutely true. It really is an impressive… It’s impressive in so many ways. It’s impressive technologically, it’s impressive at the engineering level.

(00:16:55)
It’s also impressive that so many countries work together to do this. It wasn’t just one country. It was also impressive in that it was a long-term project that people committed to and made it happen. It is a demonstration that when people set their minds to things and they commit to it, that they can do something amazing.
Lex Fridman
(00:17:18)
Also in the United States, maybe a lesson that bureaucracy can slow things down to [inaudible 00:17:24].
Lisa Randall
(00:17:24)
Bureaucracy and politics.
Lex Fridman
(00:17:26)
Politics.
Lisa Randall
(00:17:27)
And economics. Many things can make them faster and make them slower.
Lex Fridman
(00:17:32)
Science is the way to make progress, politics is the way to slow that progress down. And here we are.
Lisa Randall
(00:17:39)
I don’t want to overstate that because without politics, the [inaudible 00:17:42] wouldn’t happen either.
Lex Fridman
(00:17:43)
You need broccoli.
Lisa Randall
(00:17:49)
Sometimes I do think… You’re not asking this question, but sometimes I do think when I think about some of these conflicts, sometimes it’s just good to have a project that people work on together. There were some efforts to do that in science too, to have Palestinians and Israelis work together, a project called SESAME. I think it’s not a bad idea when you can do that, when you can get… Forget the politics and just focus on some particular project. Sometimes that can work.
Lex Fridman
(00:18:25)
Some kind of forcing function, some kind of deadline that gets people to sit in a room together and you’re working on a thing. As part of that, you realize the common humanity, that you all have the same concerns, the same hopes, the same fears, that you are all human. That’s an accidental side effect of working together on a thing.
Lisa Randall
(00:18:45)
That’s absolutely true. It’s one of the reasons CERN was formed actually. It was post-World War II, and a lot of European physicists had actually left Europe, and they wanted to see Europeans work together and rebuild, and it worked. They did. It’s true, I often think that one of the major problems is we just don’t meet enough people, so that everyone… When they seem like the other, it’s easier to forget their humanity. I think it is important to have these connections.

Extinction events

Lex Fridman
(00:19:16)
Given the complexity, all cosmological scales involved here that led to the extinction of the dinosaurs, when you look out at the future of earth, do you worry about future extinction events?
Lisa Randall
(00:19:29)
I do think that we might be in the middle of an extinction right now if you define it by the number of species that are getting killed off. It’s subtle, but it’s a complex system. The way things respond to events is sometimes things evolve, sometimes animals just move to another place. The way we’ve developed the earth, it’s very hard for species just to move somewhere else.

(00:19:54)
We’re seeing that with people now, too. I know people are worried just about AI taking over, and that’s a totally different story. We just don’t think about the future very much. We think about what we’re doing now, and we certainly don’t think enough about all the animals that we’re destroying, all the things that are precursors to humans that we rely on.
Lex Fridman
(00:20:14)
It’s interesting to think whether the things that threaten us are the stuff we see that’s happening gradually or the stuff we don’t really see that’s going to happen all of a sudden. I sometimes think about what we should be worried about. It seems like with the asteroids or nuclear war, it could be stuff that just happens one day. When I say one day, I mean over a span of a few days or a few months, but not on a scale of decades and centuries. We sometimes mostly talk about stuff that’s happening gradually, but we can be really surprised.
Lisa Randall
(00:20:57)
It’s actually really interesting. That was actually one of the reasons it took a while to determine what it was that caused the last extinction, because people did think at the time, many people thought that things were more gradual, and the idea of extinction was actually a novel concept at some point.

(00:21:14)
These aren’t predictable events necessarily. They’re only predictable on a grand scale, but sometimes they are. I think people were pretty aware that nuclear weapons were dangerous. I’m not sure people are as aware now as they were say, 20 or 30 years ago, and that certainly worries me. I have to say I was not as worried about AI as other people, but now I understand. It’s more that as soon as you create things that we lose control over, it’s scary.

(00:21:50)
The other thing that we’re learning from the events today is that it takes a few bad actors. It takes everyone to make things work well, it takes not that many things to make things go wrong. The issue with disease, we can find out what causes a disease, but to make things better is not necessarily that simple. Sometimes it is. But for things to be healthy, a lot of things have to work. For things to go wrong, only one thing has to go wrong. It’s amazing that we do it.

(00:22:19)
The same is true for democracy. For democracy to work, a lot of people have to believe in it. A few bad actors can destroy things sometimes. A lot of the things that we really rely on are delicate equilibrium situations. There is some robustness in the systems, we try to build in robustness, but a few extreme events can sometimes alter things. I think that’s what people are scared of today in many ways. They’re scared of it for democracy, they’re scared of it for peace, they’re scared of it for AI.

(00:22:51)
I think they’re not as scared as they should be about nuclear weapons, to be honest. I think that’s a more serious danger than people realize. I think people are a little bit more scared about pandemics than they were before, but I still say they’re not super scared about it. So you’re right, there are these major events that can happen and we are setting things up so that they might happen, and we should be thinking about them. The question is who should be thinking about them? How should we be thinking about them? How do you make things happen on a global scale, because that’s really what we need.
Lex Fridman
(00:23:23)
It certainly shouldn’t be a source of division, it should be a source of grand collaboration probably.
Lisa Randall
(00:23:29)
Wouldn’t that be nice?
Lex Fridman
(00:23:30)
Yeah. I just wonder what it’d be like to be a dinosaur. It must have been beautiful to look at that asteroid enter the atmosphere. Until everything… Man, that would be one of the things I would travel back in time for, just to watch it.
Lisa Randall
(00:23:50)
That’s also one of the things that I think you probably could do with virtual reality. I don’t think you have to be there and get extinct.
Lex Fridman
(00:23:54)
To just experience it.
Lisa Randall
(00:23:55)
I think there’s something… It’s an event. You’re just watching. You’re not doing anything, you’re just looking at it, so maybe you could just recreate it.
Lex Fridman
(00:24:01)
I actually heard that there’s a nuclear weapon explosion experience in virtual reality that’s good to remind you about what it would feel like.
Lisa Randall
(00:24:14)
I have to say, I got an award from the Museum of Nuclear History and Technology in the Southwest, and I went to visit the museum, which turned out to be mostly a museum of nuclear weapons. The scary thing is that they look really cool.

(00:24:30)
It’s true that you have that “yes, this is scary” feeling, but you also have the “this is cool” feeling, and I think we have to get around that because I think that yes, you can be in that, but I’m not sure that’s going to make people scared. Have they actually asked afterwards, are you more or less scared?
Lex Fridman
(00:24:50)
That’s a really good point. That’s a good summary of just humanity in general. We’re attracted to creating cool stuff, even though it can be dangerous.
Lisa Randall
(00:25:01)
Actually, that was the really interesting thing about visiting that museum. It was very nice because I had a tour from people who had been working there in the Cold War and actually one or two people from the Manhattan Project. It was a very cool tour. You just realize how the thing itself gets you so excited.

(00:25:16)
I think that’s something sometimes these movies miss, just the thing itself. You’re not thinking about the overall consequences. In some ways it was like the early Silicon Valley. People were just thinking what if we did this? What if we did that? Not keeping track of what the peripheral consequences are. You definitely see that happening with AI now. I think that was the moral of the battle that just happened, that it’s just full speed ahead.
Lex Fridman
(00:25:43)
Which gives me a really great transition to another quote in your book. You write about the experience of facing the sublime in physics, and you quote Rainer Maria Rilke: “For beauty is nothing but the beginning of terror, which we are still just able to endure, and we are so awed because it serenely disdains to annihilate us.” It’s pretty intense. I think it applies to nuclear weapons.
Lisa Randall
(00:26:14)
At a perhaps more mundane level, I think it applies… It’s really interesting. One of the things that I found when I wrote these books is some people love certainty. Scientists, many of them, revel in uncertainty. It’s not that you want to be uncertain, you want to solve it, but you’re at this edge where it’s really frustrating because you don’t really want to not know the answer, but of course, if you knew the answer, it would be done.

(00:26:41)
You’re always at this edge where you’re trying to sort things out and there is something scary. You don’t know if there’s going to be a solution, you don’t know if you’re going to find it. It’s not something that can destroy the earth, it’s just something that you do on your individual level. But then of course there are much bigger things like the ones you’re talking about where they could actually be dangerous. The stuff I do, I just want to be clear, I’m doing theoretical physics. Not very dangerous, but sometimes things end up having bigger consequences than you think.
Lex Fridman
(00:27:13)
Dangerous in a very pragmatic sense. Isn’t it still in part terrifying when you think of just the size of things, like the size of dark matter, the power of this thing in terms of its potential gravitational effects, just cosmological objects like the black hole at the center of our galaxy?
Lisa Randall
(00:27:36)
This might be why I’m a physicist or why I differ from other people because I’m not such a big fan of humanity in some ways. Some ways I am, but the idea that we were everything would be really boring to me. I love the idea that there’s so much more out there, that there’s a bigger universe and there’s lots to discover and that we’re not all there is. Wouldn’t it be disappointing if we were all there is?
Lex Fridman
(00:27:57)
Yeah, and the full diversity of other stuff is pretty interesting.
Lisa Randall
(00:28:04)
We have no idea how much there is. We know what we can observe so far, so the idea that there’s other stuff out there that we yet have to figure out, it’s exciting.
Lex Fridman
(00:28:13)
Let me ask you an out there question. If you think of humans on earth, life on earth as this pocket of complexity that emerged and there’s a bunch of conditions that came to be, and there’s Darwinian evolution and however life originated, do you think it’s possible there’s some pockets of complexity of that sort inside dark matter that we can’t see?
Lisa Randall
(00:28:42)
That’s possible.
Lex Fridman
(00:28:47)
Chemistry and biology evolving in different ways.
Lisa Randall
(00:28:49)
That’s one of the reasons we suggest it… It’s not the reason, but it would be true if there were the type of interactions we suggested; it would need more complex ones. We don’t know. I will say that the conditions that give rise to life and complexity are complex, they’re unlikely. It’s not like there are great odds that it would happen, but there’s no reason to think that it doesn’t happen. It’s worth investigating whether there are other forces that exist in the dark matter sector. That’s exactly-
Lex Fridman
(00:29:20)
So the dark matter sector doesn’t have all the forces of the standard model of physics?
Lisa Randall
(00:29:26)
Right. As far as we know, it doesn’t have any. It might have it at some low level, but it could have its own forces, just like the dark matter might not experience our light. Maybe it has its light that we don’t experience.
Lex Fridman
(00:29:38)
So there could be other kinds of forces.
Lisa Randall
(00:29:41)
There could be other kinds of forces even within our sector that are too weak for us to have discovered so far or that exist at different scales than we know about. We detect what interacts strongly enough with our detectors to detect. It’s worth asking, and that’s one of the reasons we build big colliders to see are there other forces, other particles that exist say, at higher energies, at shorter distance scales than we’ve explored so far. It’s not just in the dark matter sector. Even in our sector, there could be a whole bunch of stuff we don’t yet know.

Particle physics

Lex Fridman
(00:30:17)
Maybe let’s zoom out and look at the standard model of particle physics. How does dark matter fit into it? First of all, what is it? Can you explain what the standard model is?
Lisa Randall
(00:30:28)
The standard model of particle physics basically tells us about nature’s most basic elements and their interactions. It’s the substructure as far as we understand it. If you look at atoms, we know they have nuclei and electrons, nuclei have protons and neutrons in them, protons and neutrons have particles called quarks that are held together by something called the strong force.

(00:30:54)
They interact through the strong force, the strong nuclear force. There’s something called the weak nuclear force and electromagnetism. Basically, all those particles and their interactions describe many, many things we understand. That’s the standard model. We now know about the Higgs boson, which is associated with how elementary particles get their mass. That piece of the puzzle has also been completed.

(00:31:20)
We also know that there are a weird array of masses of elementary particles. There’s not just the up and down quark, but there are heavier versions of the up and down quark. Charm and strange, top and bottom. There’s not just the electron, there’s a muon and a tau. There are particles called neutrinos, which are under intense study now, which are partnered with the leptons through the weak interactions.

(00:31:42)
We really do know these basic elements and we know the forces. When we’re doing particle physics experiments, we can usually even ignore gravity except in exceptional cases that we can talk about. Those are the basic elements in their interactions.

(00:31:58)
Dark matter stands outside that; it’s not interacting through those forces. When we look at the world around us, we don’t usually see the effects of dark matter. It’s only because there’s so much of it that we do, and it doesn’t have those forces that we know about. The standard model has worked spectacularly well. It’s been tested to a high degree of precision. People are still testing it.

(00:32:20)
One of the things we do as physicists is we actually want it to break down at some level, we’re looking for the precision measurement or the energy or whatever it will take where the standard model is no longer working. Not that it’s not working approximately, but we’re looking for the deviations. Those deviations are critical because they can tell us what underlies the standard model, which is what we really want to see next.
Lex Fridman
(00:32:45)
Where can you find the places where the standard model breaks down? What are the places you can see those tiny little deviations?
Lisa Randall
(00:32:53)
We don’t know yet, but we know the kinds of things you would want to look for. One obvious place to look is at higher energy. We’re looking at the Large Hadron Collider, but we’d love to go beyond that. Higher energy means shorter distances, and it means things that we just couldn’t produce before. E=mc², so if you have a heavy particle and you don’t have enough energy to make it, you’ll never see it. That’s one place.

(00:33:17)
The other place is precision measurements. The standard model has been tested exquisitely, so if it’s been tested to 1%, you want to look at a 10th of a percent. There are some processes that we know shouldn’t happen at all in the standard model, or happen at a very suppressed level, and those are other things that we look for. All of those things could indicate there’s something beyond what we know about, which of course would be very exciting.
Lex Fridman
(00:33:42)
When you just step back and look at the standard model, the quarks and all the different particles and neutrinos, isn’t it wild how this little system came to be and underpins everything we see?
Lisa Randall
(00:33:59)
Absolutely. That’s why we’d like to understand it better. We want to know, is it part of some bigger sector? Why do these particles have the masses they do? Why is the Higgs boson so light compared to the mass it could have had, which we might’ve even expected based on the principles of special relativity and quantum mechanics? That’s a really big question. Why are they what they are?
Lex Fridman
(00:34:21)
And they originate, there’s some mechanism that created the whole thing?
Lisa Randall
(00:34:24)
That’s one of the things we’re trying to study. Why is it what it is?
Lex Fridman
(00:34:29)
Even just the mechanism that creates stuff, the way a human being is created from a single cell. It’s like embryogenesis, the whole thing, you build up this thing. All of it, this whole thing comes to be from just like a [inaudible 00:34:47].
Lisa Randall
(00:34:46)
Don’t forget it is interacting with the environment.
Lex Fridman
(00:34:49)
For sure. Okay, right, right, right.
Lisa Randall
(00:34:51)
It’s important.
Lex Fridman
(00:34:53)
That’s a really good question: how much of it is the environment? Is it just the environment acting on a set of constraints? How much of it is just the information in the DNA or any information? How much is in the initial conditions of the universe versus some other thing acting on it?
Lisa Randall
(00:35:14)
These are big questions. These are big questions in pretty much every field. For the universe, we do consider it… It’s everything there is, by definition. But people now think about it: is it one of many universes? Of course the word is then a misnomer, but could there be other places where there are self-contained gravitational systems that we don’t even interact with? Those are really important questions, and the only way we’re going to answer them is to go back as far as we can. We try to think theoretically, and we try to think about observational consequences. That’s all we can do.
Lex Fridman
(00:35:49)
One interesting way to explore the standard model is to look at your fun, nuanced disagreement with Carlo Rovelli. When you talked about him writing in his book, “Electrons don’t always exist. They exist when they interact. They materialize in a place when they collide with something else.” You wrote that… I’ll just read the whole thing because it’s interesting.

(00:36:12)
“Stocks may not achieve a precise value until they’re traded, but that doesn’t mean we can’t approximate their worth until they change hands. Similarly, electrons might not have definite properties, but they do exist. It’s true that the electron doesn’t exist as a classical object with definite position until the position is measured. But something was there – which physicists use a wave function to describe.” It’s a fascinating nuanced disagreement. Do electrons always exist or not? Does a tree fall in the forest if nobody’s there to see it?
Lisa Randall
(00:36:48)
I like to think of the universe as being out there, whether or not… It would be really weird if the only time things came into existence was when I saw them or I measured them.
Lex Fridman
(00:36:57)
There’s a lot of weird stuff in the works.
Lisa Randall
(00:36:58)
I could believe that the Middle East doesn’t exist because I’m not there now. That would be kind of ridiculous, I think we would all agree on that. I think there’s only so much that we can attribute to our own powers of seeing. The whole system doesn’t come into being because I’m measuring it. What is weird, and this isn’t even a disagreement about the standard model, this is a disagreement about how you interpret quantum mechanics.

(00:37:22)
I would say that those wave functions are real. One of the things, don’t forget, that particle physics does, that quantum field theory says, is that electrons can be created and destroyed. It’s not that every electron has to be in the universe. That’s what happens at colliders, particles get created and destroyed, but that doesn’t mean that if I have an electron in an atom, it’s not there. It’s certainly there, and we know about it. Its charge is there.
Lex Fridman
(00:37:47)
Physics is a kind of way to see the world. At the bottom, what’s the bottom turtle? Do you have a sense that there’s a bottom reality that we’re trying to approximate with physics?
Lisa Randall
(00:38:01)
I think we always have in our head maybe that we’d like to find that, but I have to… I might not seem so, but I think I’m more humble than a lot of physicists. I’m not sure that we’re ever going to get to that bottom level, but I do think we’re going to keep penetrating different layers and get further.
Lex Fridman
(00:38:16)
I just wonder how far away we are.
Lisa Randall
(00:38:20)
We all wonder that. What’s even the measure of how far away we are? One way you can measure it is just by our everyday lives. In terms of our everyday lives, we’ve measured everything. In terms of what underlies it, there’s a lot more to see. Part of it has to do with how far we think we can go. It might be that the nature of reality changes so much that even these terms are different. Maybe the notion of distance itself might break down at some point.
Lex Fridman
(00:38:49)
Also, to push back on the “we’ve measured everything,” maybe there’s stuff we haven’t even considered is measurable. For example, consciousness. There might be stuff, just like you said, forces unseen, undetected.
Lisa Randall
(00:39:03)
It’s an interesting thing, and this is often a confusion that happens. There’s the fundamental stuff underlying it, and then there’s the higher levels, what we’ll call an effective theory at some level. We’re not always working… When I throw a ball, I don’t tell you where every atom is. I tell you there’s a ball.

(00:39:22)
There might be different layers of reality that are built in terms of the matter that we know about, in terms of the stuff we know about. When I say we’ve measured everything, I say that with a grain of salt. I mean we’ve measured everything about the standard model. There’s lots of phenomena that we don’t understand, but often they are complex phenomena that will be given in terms of the fundamental ingredients that we know about.
Lex Fridman
(00:39:47)
That is an interesting question, because yes, there are phenomena at the higher levels of abstraction that emerge, but maybe with consciousness… There are far-out people who think that consciousness is panpsychist, that it’s going to be almost like a fundamental force of physics, a consciousness that permeates all matter.
Lisa Randall
(00:40:10)
Usually when you have a crazy… Sorry, when you have a far out theory, the thing you do is you test all the possibilities within the constructs that exist. You don’t just jump to the most far out possibility. You can do that, but then to see if it’s true, you either have to find evidence of it or you have to show that it’s not possible without that, and we’re very far from that.
Lex Fridman
(00:40:32)
I think one of the criticisms of your theory on the dinosaurs was that it requires, if I remember correctly, dark matter to be weirder than it already is. I think you had a clever response to that. Can you remind…
Lisa Randall
(00:40:46)
I’m not sure I remember what I said then, but we have no idea how weird dark matter is. It’s based on everyone thinking they know what dark matter is. Weirder than it already is? It’s not already anything. We don’t know what it is, so there’s no normalization here.
Lex Fridman
(00:40:59)
Do we know if dark matter varies in density?
Lisa Randall
(00:41:05)
It certainly does vary in the universe. For example, there’s more dark matter in galaxies than there is between galaxies. It clumps. It’s matter, so it’s distributed like matter. It is matter.
Lex Fridman
(00:41:18)
It does clump, but the full details of how it clumps and the complexity of the clumping…
Lisa Randall
(00:41:25)
It’s understood pretty well. People do simulations. Where people are always looking for things, including us as particle physicists, is at small scales: are there deviations on small scales indicating other interactions or other processes, or interactions with baryons, that is to say normal matter, that we don’t understand? But on large scales, we have a pretty good understanding of the dark matter distribution.
Lex Fridman
(00:41:50)
You were part of a recent debate on whether science can uncover reality. Let me ask you this question then: what do you think are the limits of science?
Lisa Randall
(00:42:00)
I’m smart enough to know that I have no idea. Also, it’s not even clear what science means, because there’s the science that we do, which is particle physics: we try to find fundamental things and figure out what their effects are. And there’s science like biology where, at a higher level, the kinds of questions you ask are different, the kinds of measurements are different.

(00:42:21)
The kind of science that’s going to happen in the more numerical age or even AI, what does it mean to answer a question? Does it mean that we can predict it? Does it mean that we can reproduce it? I think we’re coming up against the definition of what we mean by science as human beings. In terms of the science that we can do, I don’t think we’ll know it until we get there. We’re trying to solve hard problems and we’ve made progress.

(00:42:50)
If you think of how much science has advanced in the last century or century and a half, it’s incredible. We didn’t even know the universe was expanding at the beginning of the 20th century. We didn’t know about quantum mechanics at the beginning of the century; we didn’t know about special relativity. That’s a lot in a relatively short time, depending on how you think of time. I think it would be premature to say we know the limitations.
Lex Fridman
(00:43:14)
At various points throughout history, we thought we solved everything, or at least various people declared-
Lisa Randall
(00:43:20)
[inaudible 00:43:20] various people. Exactly.
Lex Fridman
(00:43:21)
Declared that we’ve solved everything. This is also a good place to… Maybe could you describe the difference between top-down and bottom-up approaches to theoretical physics that you talked about in the book?
Lisa Randall
(00:43:33)
You could try to jump in and say I have a theory that I think is so perfect that I can predict everything from it or at least predict some salient features from it.
Lex Fridman
(00:43:46)
Mm-hmm. That’s top-down.
Lisa Randall
(00:43:47)
That would be a top-down. Bottom-up is more like the questions we just asked. Why are masses what they are? We measure things. We want to put them together. Usually a good approach is to combine the two. If you ask a very specific question but combine it with the methods of knowing that there could be a fundamental theory underlying it, sometimes you make progress.

(00:44:09)
The community tends to get segmented or fragmented into people who do one or the other, but there are definitely times… Some of my best collaborations are with people who are more top-down than I am, so that we come up with interesting ideas that we wouldn’t have thought of if either one of us was working individually.
Lex Fridman
(00:44:25)
Would you say the truly big leaps happened top-down? Like Einstein?
Lisa Randall
(00:44:30)
Einstein was not a top-down person in the beginning. Special relativity was very much him thinking about… They were thought experiments, but he was very much… The original theory about relativity is something like on the nature of electromagnetism. He was trying to understand how Maxwell’s laws could make sense when they seemed to have different symmetries than what we had thought they were.

(00:44:54)
He was very much a bottom-up person, and in fact, he resisted top-down for a long time. Then when he tried to do the theory of general relativity, or the general theory of relativity, whichever you want to call it, incorporating gravity into the system, he needed some feedback. Then he was helped by a mathematician who had developed some differential geometry and helped him figure out how to write that down.

(00:45:16)
After that, he thought top-down was the way to go, but he actually didn’t make that much progress. I think it’s naive to think it was just one or the other. In fact, a lot of people who made real progress were rooted in actual measurements.

Physics vs mathematics

Lex Fridman
(00:45:31)
Speaking of mathematicians, what do you think is the difference, you’ve had a bit of a foot in both, between physics and mathematics in the way each helps us understand the world?
Lisa Randall
(00:45:41)
To be frank, there’s a lot of overlap between physics and math. I think that has been… Maybe not more, but there’s certainly a lot. I think, again, the kinds of questions you’re asking are usually different. Mathematicians like the structure itself; physicists are trying to concentrate, to some extent, on the consequences for the world. But there is a lot of overlap.
Lex Fridman
(00:46:04)
String theory is an example. There are certain theories where there’s a certain mathematical beauty to it.
Lisa Randall
(00:46:12)
There are also some really cool ideas that you get in particle physics where you can describe what’s going on and connect it to other ideas. That’s also really beautiful. I think basically insights can be beautiful. They might seem simple, and sometimes they genuinely are, and sometimes they’re built on a whole system that you have to understand first. If you actually saw Einstein’s equations written out in components, you wouldn’t think they’re so beautiful. If you write them in a compact way, they look nice.
Lex Fridman
(00:46:43)
What do you think about the successes and the failures of string theory? To what degree do you think it has succeeded, and to what degree has it not succeeded yet, or failed?
Lisa Randall
(00:46:54)
I think to talk about any science in terms of success and failure often misses the point because there’s not some absolute thing. I do think that string theorists were a bit overly ambitious… Not overly ambitious, but a little bit overly arrogant in the beginning, thinking they could solve many problems that they weren’t going to solve.

(00:47:14)
That’s not to say the methods and advances in string theory don’t exist, but they certainly weren’t able to immediately solve all the problems they thought they could solve. It has given us tools, it has given us some insights, but it becomes almost a sociological question of how much it should be one or the other.

(00:47:35)
I do think that you can get caught up in the problems themselves, and sometimes you can get caught up in the methods and just do other examples. The real physics insights often come from people who are thinking about physics as well as math.
Lex Fridman
(00:47:49)
Because you mentioned AI, is there hope that AI might be able to help find some interesting insights? Another way to ask this question is how special are humans that we’re able to discover novel insights about the world?
Lisa Randall
(00:48:09)
That’s a great question, and it depends on what kind of insights, and we’re going to find that out. Because it’s hard to think about something that doesn’t quite exist yet, I could just think about something and take a step back. It’s a little bit like when you’re trying to understand four dimensions, you go back to three dimensions. Go to something you can imagine.

(00:48:31)
You can say a lot of things at a very different level about the internet. You could say, has the internet helped do things? It definitely took on a life of its own in some sense, but it’s also something that we’re able to tame. I know that I myself wouldn’t have been able to write books if the internet didn’t exist, because I wouldn’t have had the time to go to the library and look everything up. It helped me enormously.

(00:48:57)
In some sense, AI could be that. In a very nice world, it could be a tool that helps us go a step further than we would, and a lot more efficiently. It’s already done that to some extent. Or it could be like the parts of the internet that we can’t control that are ruining politics or whatever. There are certainly a lot of indications that it can do that. Then there are even bigger things that people speculate about, AI being able to do its own things, but in terms of actually figuring things out, we’re in the early stages.
Lex Fridman
(00:49:33)
Yeah, there are several directions here. One is on the theorem-prover side, Wolfram Alpha, where everything’s much more precise, and then we have large language model type of stuff. One of the limitations of those is that they seem to come up with convincing-looking things, which we don’t know if they’re true or not, and that’s a big problem for physics.
Lisa Randall
(00:49:54)
Large language models are more or less generalizations of stuff that we have. There are still breakthroughs in AI waiting to happen, and maybe they are happening, and maybe they’ll be good, maybe not, but that’s not quite the same. Maybe in some cases it’s just pattern recognition that leads to important things, but sometimes it could be something more insightful than that, something I can’t even put my finger on.

(00:50:21)
It forces us to… We don’t really understand how smart we are. We don’t understand how we think about things all that well, actually. But one thing is true: we are a lot more efficient right now than computers at coming up with things, and we require a lot less energy to do it. If computers figure out how to do that, then it’s going to be a totally different ball game.

(00:50:42)
There are clearly kinds of connections that we don’t know how we’re making, but we are making them. That’s going to be interesting. I say we’re in early stages, but this is changing very rapidly. Right now, I don’t think that it has actually discovered new laws of physics, but could it in the future? Maybe it can.
Lex Fridman
(00:51:06)
It will raise big questions about what is special about humans that we don’t quite appreciate. There could be things like that leap of insight, truly novel ideas, that could potentially be very difficult to do.
Lisa Randall
(00:51:26)
There are abstract questions like that. There are also questions that we can address to some extent: how will AI be used in the context of the world we live in? Which, at least in our country, is based on capitalism in a certain political system. How will global politics deal with it? How will our capitalist system deal with it? What will be the things that we focus on doing with it? How much will researchers get control of it, to be able to ask different sorts of questions?

(00:51:58)
While it was starting out, people were doing these kinds of toy problems, but what will it actually be applied to, and what will it be optimized to do? There are a lot of questions out there that it’s really important we start addressing.
Lex Fridman
(00:52:11)
What to you is the most beautiful unsolved problem in physics and cosmology, one whose mystery would be really exciting to unlock in the next few decades?
Lisa Randall
(00:52:30)
Is it what’s the most beautiful unsolved problem, or what is the most beautiful unsolved problem I think we can make progress on?
Lex Fridman
(00:52:35)
Oh boy, that we can make progress on in the next few centuries.
Lisa Randall
(00:52:43)
Most of the big questions have to do with what underlies things, how things started, what’s at the base of it. There are also just basic questions, like the ones you asked earlier: how far will science take us? How much can we understand? There are questions like how we got here and what underlies it.

(00:53:02)
Also, there are really deep questions like: what fraction are we actually seeing? If there are these other forces, if there is another way of seeing the world, are there universes beyond our own? If they’re so totally different, how do we even comprehend them? What would we even think about them? There’s a lot about trying to get beyond… It’s always just getting beyond our limited vision and limited experience and trying to see what underlies it, both at small scales and at large scales.

(00:53:35)
We just don’t know the answers. I’d like to think that we’ll understand more about dark matter, about dark energy, about whether there are extra dimensions, things that we actually work on, but there’s probably a lot beyond what we work on that’s yet to be discovered.
Lex Fridman
(00:53:50)
Yeah, understanding the extra dimensions piece will be really interesting.
Lisa Randall
(00:53:55)
Totally. If that is how the universe went from higher dimensions to what we see, are the extra dimensions present everywhere? One of the really interesting pieces of physics we did, that I talk about in my first book, Warped Passages, is finding out that there can be a higher dimension, but only locally do you think there’s gravity of a lower dimension. It could be that only locally do we think we live in three dimensions; elsewhere, the higher dimensions could be different.

(00:54:25)
That’s not actually the gravity we have, but there’s all sorts of phenomena that might be out there that we don’t know about. All sorts of evolution things, time dependence that we don’t know about. Of course, that’s from the point of view of particle physics, from the point of view of other kinds of physics, we’re just beginning, so who knows?
Lex Fridman
(00:54:40)
Yeah, if the physics changes, if it’s not homogeneous throughout the universe, that’ll be weird.
Lisa Randall
(00:54:48)
I mean, for the observable universe, it’s the same. But beyond the observable universe, who knows?
Lex Fridman
(00:54:58)
You’ve had an exceptional career. What advice would you give to young people, maybe high school, college, on how to have a career they can be proud of and a life they can be proud of?
Lisa Randall
(00:55:10)
I think the weird thing about being a scientist or an academic in general is you have to believe really strongly what you do while questioning it all the time. That’s a hard balance to have. Sometimes it helps to collaborate with people, but to really believe that you could have good ideas at the same time, knowing they could all be wrong. That’s a tough tightrope to walk sometimes, but to really test them out.

(00:55:34)
The other thing is, sometimes if you get too far buried, you look out and you think there’s so much out there. Sometimes it’s just good to bring it back home and just think, okay, can I have as good an idea as the person next to me, rather than the greatest physicist who ever lived? Right now, like you said, I think there are lots of big issues out there, and it’s hard to balance that.

(00:55:55)
Sometimes it’s easy to forget the role of physics, but I think Wilson said it really well when they were building Fermilab: this won’t defend the country, but it’ll make it worth defending. It’s just the idea that in all this chaos, it’s still important that we make progress in these things. Sometimes when major world events are happening, it’s easy to forget that. I think those are important too. You don’t want to forget those, but try to keep that balance, because we don’t want to lose what it is that makes humans special.
Lex Fridman
(00:56:24)
That’s the big picture. Do you also lose yourself in the simple joy of puzzle solving?
Lisa Randall
(00:56:29)
Yeah. We all like solving puzzles. Actually, one of the things that drives me in my research is the inconsistencies. When things don’t make sense, it really bugs me, and I’ll just go in different directions to see how these things could fit together.
Lex Fridman
(00:56:46)
It bugs you, but that motivates you?
Lisa Randall
(00:56:48)
Yeah, totally.
Lex Fridman
(00:56:49)
Until it doesn’t. You have to resolve it.
Lisa Randall
(00:56:52)
I think I have this underlying belief that it should make sense, even though the world comes at you in many ways and tells you nothing should make sense. But if you believe that it makes sense, you look for the underlying logic. I think that’s just good advice for everything: to try to find why it is the way it is.

(00:57:08)
I talk about effective theory a lot in my second book, Knocking On Heaven’s Door. Rather than ask the big questions, sometimes we just ask questions about the immediate things that we can measure, and like I said, we can sometimes tell where they’ll fail, but we can have these effective theories. Sometimes I think when we approach these big questions, it’s good to do it from an effective theory point of view. Why do I find this satisfying? Why is the world we have the way it is?

(00:57:31)
We think the world we live in is beautiful. I’m not sure, if we had different senses or different ways of looking at things, that we would necessarily find it beautiful. But I have to say, it is fantastic that no matter how many times I see a sunset, I will always find it beautiful. I don’t think I ever see a sunset and say, whatever. It’s just always beautiful.

(00:57:54)
There are things that, as humans, clearly resonate with us, but maybe we evolved that way. That’s about us, though. In terms of figuring out the universe, it’s amazing how far we’ve gotten. We have discovered many, many wonderful things, but there’s a lot more out there, and I hope we have the opportunity to keep going.
Lex Fridman
(00:58:14)
With effective theories, one small step at a time, just keep unraveling the mystery.
Lisa Randall
(00:58:19)
Also having in mind the big questions, but doing one small step at a time. Exactly.
Lex Fridman
(00:58:23)
Yeah, looking out to the stars. You said the sunset. For me, it’s the sunset, the sunrise, and just looking at the stars, wondering what’s all out there and having a lot of hope that humans will figure it out.
Lisa Randall
(00:58:39)
Right. I like it.
Lex Fridman
(00:58:42)
Lisa, thank you for being one of the humans in the world that are pushing it forward and figuring out this beautiful puzzle of ours. Thank you for talking today. This was amazing.
Lisa Randall
(00:58:53)
Thank you for having me here.
Lex Fridman
(00:58:55)
Thanks for listening to this conversation with Lisa Randall. To support this podcast, please check out our sponsors in the description. Now, let me leave you with some words from Albert Einstein. The important thing is to not stop questioning. Curiosity has its own reason for existence. Thank you for listening, and hope to see you next time.

Transcript for Michael Malice: Thanksgiving Pirate Special | Lex Fridman Podcast #402

This is a transcript of Lex Fridman Podcast #402 with Michael Malice.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
What’s your opinion on my bird here, Mr. Parrot?
Michael Malice
(00:00:04)
It’s a Macaw. Scarlet Macaw.
Lex Fridman
(00:00:07)
What?
Michael Malice
(00:00:08)
It is a Scarlet Macaw.
Lex Fridman
(00:00:10)
Oh, you know birds?
Michael Malice
(00:00:11)
Yeah. And that’s actually not life-sized.
Lex Fridman
(00:00:15)
Are you saying he’s not real?
Michael Malice
(00:00:17)
I’m saying it’s not to scale.
Lex Fridman
(00:00:19)
Okay. But he’s real.
Michael Malice
(00:00:21)
Are we doing that Monty Python sketch?
Lex Fridman
(00:00:25)
Everything is a Monty Python sketch.
Michael Malice
(00:00:26)
I don’t think Monty Python’s funny.
Lex Fridman
(00:00:28)
You don’t?
Michael Malice
(00:00:29)
At all. Not once.
Lex Fridman
(00:00:29)
That explains so much.
Michael Malice
(00:00:31)
Does it? What does it explain?
Lex Fridman
(00:00:32)
What do you think is funny?
Michael Malice
(00:00:35)
You not answering that question is pretty funny.
Lex Fridman
(00:00:38)
Yeah. What do you think is funny, having a mantis shrimp?
Michael Malice
(00:00:41)
No.
Lex Fridman
(00:00:42)
You think Big Lebowski is funny?
Michael Malice
(00:00:44)
Oh God, no.
Lex Fridman
(00:00:46)
This is getting worse and worse. The following is a conversation with Michael Malice, anarchist and author of Dear Reader, The New Right, The Anarchist Handbook, The White Pill, and he is the host of the podcast, YOUR WELCOME. This is a Thanksgiving special of the pirate and oceangoing variety. So once again, let me say thank you for listening today and for being part of this wild journey with me. This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Michael Malice.

Beauty and mantis shrimp

Michael Malice
(00:01:34)
The box?
Lex Fridman
(00:01:35)
Yeah.
Michael Malice
(00:01:35)
The mystery box.
Lex Fridman
(00:01:36)
I’m wondering what’s in it.
Michael Malice
(00:01:36)
There’s something in that box of exquisite beauty, both literally and in what it symbolizes and why it is here.
Lex Fridman
(00:01:46)
Given the kind of human being you are, I’m terrified at what you find beautiful.
Michael Malice
(00:01:52)
That’s a good point. You kind of hit me with a curveball. For me, the most beautiful wildlife are what I call God’s mistakes, because my friend came up with that term. She’s like, “God made these disgusting animals, just threw them in the bottom of the ocean.” He’s like, “No one’s ever going to see this.”
Lex Fridman
(00:02:12)
Yeah. You commented on Twitter about some creature, a rainbow type creature.
Michael Malice
(00:02:17)
The peacock mantis shrimp.
Lex Fridman
(00:02:18)
Yeah, it’s beautiful.
Michael Malice
(00:02:20)
It’s horrific though. So it has, I think, eight legs, six arms, two punching claws or spearing claws depending on the genus. Two eyes, two antennae, two ear flaps. I don’t know what they do. And its punch can be as strong as a bullet. And the other type, with the spears, divers call them thumb splitters because if you stick your finger near it, it’ll cut your thumb down to the bone. So I had one as a pet. All night I would hear banging on the PVC pipe. And I’ve got to tell you, they have the best eyesight of any animal because they see in seven different ways. And when you make eye contact with this thing, it’s just absolutely terrifying. But you can eat them as sushi. They call them sea centipedes.
Lex Fridman
(00:03:01)
But they’re colorful and beautiful.
Michael Malice
(00:03:03)
That species is, yeah.
Lex Fridman
(00:03:04)
What was it like having one as a pet, and why did you do it?
Michael Malice
(00:03:09)
Well, when you have a species that’s that unique and that much of an outlier, growing up, reading these books, watching these shows, I found this stuff so much more fascinating than space, which is dead. So to be able to have this specimen in your house and just observe its behavior is just an amazing thing.
Lex Fridman
(00:03:32)
Why’d you get rid of it?
Michael Malice
(00:03:34)
I didn’t have, I guess, the right minerals in the mix because-
Lex Fridman
(00:03:36)
It died?
Michael Malice
(00:03:37)
… it had a problem moulting once. Yeah, it couldn’t moult correctly.
Lex Fridman
(00:03:40)
Wow. Do you miss it? Think about it still?
Michael Malice
(00:03:43)
I do think about it, to be honest. I still have a pair of its punching appendages from when it moulted.
Lex Fridman
(00:03:51)
What pet animal in your life do you miss the most, that has been in your life that you think about?
Michael Malice
(00:03:59)
I’ve never had cats or dogs growing up or anything like that, which I… Oh God. My problem is-
Lex Fridman
(00:04:07)
Here we go.
Michael Malice
(00:04:08)
… if I like something, I will go down a rabbit hole. So I know if I got one tattoo, I already know what my first five are going to be. Okay? So I can’t do it, because once I get those five, it’s going to be a hundred, and I’m already too old to be the tattoo guy.
Lex Fridman
(00:04:25)
What would be the first tattoo? My face? Would it go on your ass cheeks or where would you put them if it was my face?
Michael Malice
(00:04:36)
If I got your face, it would definitely be on my arm right here.
Lex Fridman
(00:04:39)
If you had multiple faces, would you put like?
Michael Malice
(00:04:42)
I think delts, right? Shoulders, different faces on different shoulders.
Lex Fridman
(00:04:44)
And when you flex?
Michael Malice
(00:04:45)
I’d want some symmetry.
Lex Fridman
(00:04:46)
Yeah. Would you get a dictator? If you had to get a dictator, who would you get?
Michael Malice
(00:04:51)
Would have to be Kim Jong-il. Right? Because I wrote the book on him.
Lex Fridman
(00:04:54)
Oh, it’s like plugging your book in the tattoo?
Michael Malice
(00:04:57)
I don’t think it’s plugging, it’s just that I have a personal connection to this stuff.
Lex Fridman
(00:05:00)
Good opener for the conversation. People would be asking why him, and he’d be like, “Well, I wrote a book about it.” And I’d be like, “Oh, okay.”
Michael Malice
(00:05:07)
Okay. Here’s why-
Lex Fridman
(00:05:08)
“Let me check it out.”
Michael Malice
(00:05:08)
That would be bad. No, that’s not what happens.
Lex Fridman
(00:05:10)
Okay.
Michael Malice
(00:05:11)
Here’s the thing.
Lex Fridman
(00:05:12)
What happens?
Michael Malice
(00:05:12)
When you write a book about North Korea: “Hey, nice to meet you. What is it you do?” “I’m an author.” “What kind of books do you write?” “Well, my last book was on North Korea.” 90% of the time, 90, they will then start telling me everything they know about North Korea. And it’s like, “I don’t need, this isn’t a quiz, and it’s a very poorly understood country. I don’t expect you to know anything. You’re not on the spot. And half of what you’re saying is not accurate either. It’s fine.”
Lex Fridman
(00:05:36)
How often did they bring up Dennis Rodman?
Michael Malice
(00:05:38)
A hundred percent.
Lex Fridman
(00:05:39)
A hundred percent of the time.
Michael Malice
(00:05:40)
“Oh, so do you know Dennis Rodman?”
Lex Fridman
(00:05:42)
Yeah.
Michael Malice
(00:05:42)
But I don’t understand why. I guess people feel the need to, “All right, now we’re talking about this subject. I just got to drop whatever I can talk about.” It’s usually a small amount. And there’s this thing in the culture, which I hate, that everyone has to have an opinion on everything. And it’s like it’s okay to be like, “Yeah, I don’t know anything about that. Tell me more.” There’s lots of things I don’t know anything about.

Parrots, Pirates, and Monty Python

Lex Fridman
(00:06:02)
What’s your opinion on my bird here, Mr. Parrot?
Michael Malice
(00:06:07)
It’s Macaw, Scarlet Macaw.
Lex Fridman
(00:06:10)
What?
Michael Malice
(00:06:11)
It is a Scarlet Macaw.
Lex Fridman
(00:06:13)
Oh, you know birds?
Michael Malice
(00:06:14)
Yeah. And that’s actually not life-sized.
Lex Fridman
(00:06:18)
Are you saying he’s not real?
Michael Malice
(00:06:20)
I’m saying it’s not to scale.
Lex Fridman
(00:06:22)
Okay. But he’s real.
Michael Malice
(00:06:24)
Are we doing that Monty Python sketch?
Lex Fridman
(00:06:27)
Everything is a Monty Python sketch.
Michael Malice
(00:06:29)
I don’t think Monty Python’s funny.
Lex Fridman
(00:06:31)
You don’t?
Michael Malice
(00:06:31)
At all. Not once.
Lex Fridman
(00:06:32)
That explains so much.
Michael Malice
(00:06:33)
Does it? What does it explain?
Lex Fridman
(00:06:35)
What do you think is funny?
Michael Malice
(00:06:38)
You not answering that question is pretty funny.
Lex Fridman
(00:06:39)
Yeah. What do you think is funny, having a mantis shrimp?
Michael Malice
(00:06:44)
No.
Lex Fridman
(00:06:45)
Do you think Big Lebowski is funny?
Michael Malice
(00:06:46)
Oh God, no. Although…
Lex Fridman
(00:06:49)
This is getting worse and worse.
Michael Malice
(00:06:50)
To be fair, I only tried to watch Big Lebowski after it had been part of the culture for many years.
Lex Fridman
(00:06:57)
Right.
Michael Malice
(00:06:58)
To the point where every single line has been quoted incessantly by the most annoying frat bros ever. So I kind of have been poisoned against being able to appreciate it.
Lex Fridman
(00:07:10)
Right.
Michael Malice
(00:07:10)
So maybe if I’d seen it when it came out, before it became a thing, I would’ve enjoyed it. I couldn’t get through it. I couldn’t get through 20 minutes.
Lex Fridman
(00:07:17)
Is that how you feel about Schindler’s List?
Michael Malice
(00:07:21)
Well…
Lex Fridman
(00:07:21)
It’s so much easier for me to stare at you when you have sunglasses on.
Michael Malice
(00:07:24)
I didn’t think you’d be the one making Holocaust jokes today. And yet, here we are.
Lex Fridman
(00:07:28)
And cut scene. I actually have no trouble making eye contact with you when you’re wearing shades.
Michael Malice
(00:07:35)
Yes, because you’re a robot.
Lex Fridman
(00:07:36)
Two copies of myself.
Michael Malice
(00:07:38)
Yeah. Oh, you’re seeing yourself in them?
Lex Fridman
(00:07:39)
Mm-hmm.
Michael Malice
(00:07:40)
Okay, cool.
Lex Fridman
(00:07:40)
Yeah, I’m having a conversation with myself. It’s not your fault, Lex.
Michael Malice
(00:07:46)
They made you like this. You were just a good little Roman in Saint Petersburg.
Lex Fridman
(00:07:51)
I could see Mr. Parrot a little bit too.
Michael Malice
(00:07:54)
But what do you find funny? Come on. This is an interesting subject.
Lex Fridman
(00:07:57)
Well, I find Monty Python funny. I find absurdity funny.
Michael Malice
(00:08:00)
Yes. I find absurdity funny. I think that’s the thing. When people come at me, and maybe this is an Eastern European thing, when they’re like, “How can you find this very dark subject funny?” It’s like, well, the humor. First of all, the humor is that you’re making fun of something that’s dark. So already it’s absurd. It’s completely inappropriate. Second, just psychologically, Joan Rivers said that Winston Churchill said, I don’t know if it’s true, that when you make people laugh, you’re giving them a little vacation. And I was just thinking about this the other day, how when I die, if, I want my funeral to be a roast. It doesn’t help me that everyone’s sad. If I brought people happiness or joy in life, whatever, I want to keep doing that in death. Your sadness doesn’t help me. I know you can’t help it, but tell stories of how I made you laugh. Make fun of me. Make me the punching bag. Even literally, take me out of that coffin and beat the-.

(00:08:55)
Make me a piñata. I don’t care. I don’t understand, well, I do understand, but it’s sad for me when people are like, “This isn’t funny. That isn’t funny.” The way I look at humor is like a chef, right? It’s pretty easy to make bacon taste good, but to take some really obscure ingredients and make them palatable, that takes skill. So if you’re dealing with a subject that is very emotional or intense and you can make people laugh, then that takes skill, and that’s the relief for them.
Lex Fridman
(00:09:29)
Yeah. It’s all about timing.
Michael Malice
(00:09:33)
Yeah.
Lex Fridman
(00:09:34)
Yeah.
Michael Malice
(00:09:38)
What’s the difference? You want to hear one of my jokes?
Lex Fridman
(00:09:41)
Is it a pirate joke? Because that’s the only kind I accept today.
Michael Malice
(00:09:45)
Okay.
Lex Fridman
(00:09:45)
But no, go ahead. It doesn’t have to be a pirate joke this one time.
Michael Malice
(00:09:48)
Do you know who Lia Thomas is?
Lex Fridman
(00:09:50)
Yeah.
Michael Malice
(00:09:51)
What’s the difference between Lia Thomas and Hitler?
Lex Fridman
(00:09:53)
What?
Michael Malice
(00:09:54)
Lia Thomas knows how to finish a race.
Lex Fridman
(00:09:58)
Very nice. Very nice.
Michael Malice
(00:09:59)
Did I just get the gold medal?
Lex Fridman
(00:10:00)
Good job. Why does it take pirates forever to get through the alphabet?
Michael Malice
(00:10:09)
Why?
Lex Fridman
(00:10:11)
Because they spent years at sea.
Michael Malice
(00:10:13)
Oh, I thought it was going to be an [inaudible 00:10:15] joke.
Lex Fridman
(00:10:15)
Nope. No.
Michael Malice
(00:10:16)
That’s a good one. I like that.
Lex Fridman
(00:10:17)
Yeah.
Michael Malice
(00:10:18)
When I was in North Korea.
Lex Fridman
(00:10:21)
Oh, you know Dennis Rodman? It’s a callback.
Michael Malice
(00:10:23)
By the way, the thing that is very heartbreaking about the North Korean situation is that they have a great sense of humor. It would be a lot easier if these were robots or drones. They have big personalities, big senses of humor, and that made it much harder to leave and interact with these people because I mean, there’s nothing more human and universal than laughter and laughter’s free.
Lex Fridman
(00:10:47)
Are you saying there’s humor even amongst the people that have most of their freedoms taken away?
Michael Malice
(00:10:52)
Especially. I mean, again, we’re from the Soviet Union, there’s [inaudible 00:10:57] I mean, Russian humor is a thing because there’s nothing you can, if you can’t have food or nice things, at least you can have joy and make each other laugh. I think about it all the time, and I think about my guide all the time. It was, what, 2012? So it’s been 11 years since I’ve been there, and she’s still there. And everyone I’ve seen is still there. They just recently electrified the border. So even the few people who were escaping can’t do it anymore.
Lex Fridman
(00:11:19)
Well, that’s interesting that they still have a sense of humor. I attribute that to the Soviet Union’s really deep education system. You got to read a lot of literature.
Michael Malice
(00:11:29)
Okay.
Lex Fridman
(00:11:30)
And because of that, you get to kind of learn about the cruelty, the injustices, the absurdity of the world.
Michael Malice
(00:11:40)
Right.
Lex Fridman
(00:11:40)
As long as the writing is not about the current regime.
Michael Malice
(00:11:43)
Yeah. But I think if you look at African Americans, Jewish Americans, gay Americans, they are all disproportionately represented in comedy. It’s not because these groups have some kind of magic to them. It’s that when you are on the outside looking in, A, you’re going to have a different perspective than the people who are in the middle of the bell curve. But also, when you don’t have anything to lose, at the very least, you can make each other laugh and find happiness that way. So that is something that I think is an important thing to recognize.

Humor and absurdity

Lex Fridman
(00:12:14)
So what do you find funny? What makes you giggle in the most joyful of ways? The suffering of others?
Michael Malice
(00:12:24)
I mean, there are YouTube videos of fat people falling down and they’re really funny.
Lex Fridman
(00:12:36)
There’s two kinds of people in this world, those that laugh at those videos and those that don’t.
Michael Malice
(00:12:42)
No. And those that are in them. My friend Jesse just told me a great Norm Macdonald joke, and this is a good litmus-test joke, because he says, “A certain group of people lose their minds and a certain group of people just stare at you.” And he goes, “This kind of…” and so I’ll tell you the joke. This is Norm Macdonald. A guy walks into a bar and he sees someone at the bar who has a big pumpkin for a head.

(00:13:07)
And the guy’s like, “Dude, what happened to you?” He goes, “Ugh, you’ll never believe this. I got one of those genie lamps and this genie.” He’s like, “Well, what happened?” He goes, “Well, the first wish, I wished for a hundred million dollars.” He’s like, “Yeah, did you get it?” He goes, “Yeah. In my bank account. Feels fine.” He goes, “All right. Well, the second wish, I wished to have sex with as many beautiful women as I want.” He goes, “Did that happen?” He goes, “Yeah, it was amazing.” He goes, “Then what?” “Well, I wished for a giant pumpkin head.”
Lex Fridman
(00:13:34)
Yeah.
Michael Malice
(00:13:35)
So there’s a certain mindset that will just be staring at the screen. And that is, I mean, there’s so many levels why that’s funny, at least to me. And I just love that kind of humor.
Lex Fridman
(00:13:45)
Well, Norm Macdonald is just, I watch his videos all the time. He’s a guy that definitely makes me giggle. And he’s one of the people that makes me giggle for reasons I don’t quite understand.
Michael Malice
(00:13:58)
Did you ever see him with Carrot Top on Conan O’Brien?
Lex Fridman
(00:14:01)
No.
Michael Malice
(00:14:02)
Making fun of Carrot Top?
Lex Fridman
(00:14:03)
No.
Michael Malice
(00:14:05)
This is probably the best talk show clip of all time. He’s on with Courtney Thorne-Smith. She was on Melrose Place, and Conan O’Brien’s the host, and Courtney’s talking about how she’s going to be in an upcoming movie with Carrot Top. And Conan is like, “Oh, what’s it going to be called?” And she’s like, “It doesn’t have a title yet.” And Norm goes, “Oh, I know what it should be called: Box Office Poison.” And they’re all laughing. And she’s like, “No, no, no, the working title is Chairman of the Board.” And Conan goes, “Do something with that, smart ass.” And Norm goes, “Yeah, ‘board’ is spelled B-O-R-E-D.” And they all just completely lost it.
Lex Fridman
(00:14:39)
There’s something about him with words spoken out of his mouth with the way he turns his head and looks at the camera.
Michael Malice
(00:14:46)
I think he is one of those rare comedians who you really feel is talking to you directly. He feels like he’s winking at you in the audience. And he’s like, “Can you believe I’m doing this?” It’s almost like he feels, I don’t want to say like an imposter, but he’s more a member of the audience than he is a member of the people on the stage.
Lex Fridman
(00:15:06)
Yeah, it feels like he’s on our side.
Michael Malice
(00:15:08)
Yes. Yeah.
Lex Fridman
(00:15:09)
Whatever the hell “Our” means.
Michael Malice
(00:15:11)
Roseanne got him his first job.
Lex Fridman
(00:15:14)
Man. Roseanne, you and her have been hanging out.
Michael Malice
(00:15:17)
I got it. Oh my God. Talk about Thanksgiving. When you are talking to Roseanne Barr and making eye contact with this person, it is, I can’t even describe it. It’s just like, “Holy crap, Roseanne Barr’s talking to me.” She is, I’ve said this to her face, pathologically funny. It does not turn off. And you’re sitting there and you’re like, “Holy crap.”

(00:15:40)
And when you make her laugh, which is that laugh that’s in the theme song of her show, you feel like, “Okay, I did a mitzvah. I did something good and right in the world that I made Roseanne Barr laugh.” And it’s also really funny, and she’s going to hate this, because I tell her she’s adorable. She doesn’t like that. She’s little. You think of Roseanne Barr as this force of nature, like a tsunami.
Lex Fridman
(00:16:01)
Big, yeah.
Michael Malice
(00:16:01)
She’s five three, I’d say maybe 130. And she puts on the sunglasses, you think, this little old Jewish lady. You’d never know this is one of the most epic performers of all time. She lives near here now. So it’s just so much fun talking to her. There was an old satirical magazine, I think in the early 2000s, called Heeb, written by Jews, and she dressed up as Hitler for one of the photo shoots, and she was baking little men in the oven. I found it on eBay. I wanted her to sign it, “Michael, it should have been you.” But she signed it, “Michael, you’re one smart cookie. Love, Mom, Roseanne Barr.” And I call her mom, and it hangs over my desk because I have her good domestic goddess energy flowing at me. What?
Lex Fridman
(00:16:52)
What do you find? What else? So, Norm Macdonald. I guess we’ve landed on that.
Michael Malice
(00:16:56)
No. My favorite comedian is-
Lex Fridman
(00:16:56)
We agree on something.
Michael Malice
(00:16:57)
My favorite comedian of all time is Neil Hamburger. So Neil Hamburger, I don’t know if I’m ruining the bit, he’s a character performed by this guy named Gregg Turkington. So he comes out in a tuxedo, big eyeglasses, holding three glasses of water, coughing into the mic. And I remember I saw him once in LA, and the girl at the table ahead of me was with her boyfriend, this basic chick, pumpkin spice. She turns to him and she goes, “What is this?” And I remember the first time he was on Jimmy Kimmel, and he tells one of his jokes, and it was like, “Why does E.T. like Reese’s Pieces so much? Well, that’s what sperm tastes like on his home planet.” And no one laughs. And he goes, “Oh, come on guys. I have cancer.” And it just cuts to this Marine in the audience with his arms crossed. So if you know what he’s doing, it’s just absolutely amazing.

(00:17:58)
He opened for Tenacious D once somewhere, I think in Ireland or the UK, one of those. And they’re booing him because his jokes are often not funny. He’s like, “Hey, where did my whore ex-wife run off to with that dentist she’s shacking up with? I don’t know. But when I see her in court next month, Alaska.”

(00:18:20)
So they’re booing and he goes, “All right, do you guys want me to bring out Tenacious D?” They’re like, “Yeah.” “Do you want to see your heroes, Tenacious D?” “Yeah.” “Come on, let me hear it. Do you want to see Tenacious D?” “Yeah.” He goes, “All right, if I tell this next joke and you don’t boo me, I’ll bring out Tenacious D.” And it’s like, I’m trying to think of one that’s not too…
Lex Fridman
(00:18:44)
Self censorship is never good.
Michael Malice
(00:18:45)
Okay. He goes, “Can we agree that George Bush is the worst President America’s ever had?” Everyone claps. He goes, “Which makes it all the stranger that his son, George W. Bush, was in fact the best.”
Lex Fridman
(00:18:58)
I take it back on the self-censorship.
Michael Malice
(00:19:01)
So two people laugh and he goes, “Oh, that’s amazing. I guess I’ll do an encore.” And he did 10 more minutes. It was just, I love him so much.
Lex Fridman
(00:19:09)
It’s interesting that he opened for Tenacious D. Jack Black, that’s a comedic genius of a different kind.
Michael Malice
(00:19:16)
Oh, yeah. And he was in one of my favorite movies, Jesus’ Son. It’s this little indie movie. He did a great turn in that. He’s really underrated as an actor. He’s got a lot of range. I know he kind of gets typecast as this one specific type, but he’s really, really talented.
Lex Fridman
(00:19:30)
But also just the pure joy.
Michael Malice
(00:19:32)
Yes. He’s clearly having fun.

Thanksgiving

Lex Fridman
(00:19:35)
Okay. It is Thanksgiving. So in the tradition, following tradition, what are you thankful for, Michael, in this world?
Michael Malice
(00:19:45)
Do you have a list too?
Lex Fridman
(00:19:46)
No, not really.
Michael Malice
(00:19:47)
Really?
Lex Fridman
(00:19:48)
It’s up in here.
Michael Malice
(00:19:49)
Oh, I mean, but you have several things you’re thankful for.
Lex Fridman
(00:19:51)
Yes.
Michael Malice
(00:19:52)
Okay.
Lex Fridman
(00:19:52)
Yes.
Michael Malice
(00:19:53)
One of the things I’m-
Lex Fridman
(00:19:54)
My list comes from the heart. I don’t have to write anything down.
Michael Malice
(00:19:56)
Well, I don’t have it written down.
Lex Fridman
(00:19:57)
Okay.
Michael Malice
(00:19:58)
One of the things that I’m most thankful for, this is a common answer, but I can back it up, is my family. Because my nephew, Lucas, is now six years old. And when kids have a sense of humor, it’s just miraculous. So he stole my sister’s phone; she’s his mom. He figured out that grandma is listed as “Mom” in the phone, and he calls her up and he’s like, “Michael’s in the hospital. He’s really sick.

(00:20:27)
He didn’t want to tell you.” And she’s freaking out. He goes, “Prank.” So I took him… Dinesh D’Souza just released a movie called Police State, which was actually really good, highly recommend it. I was surprised how much I liked it because he wasn’t going Republicans good, Democrats bad.

(00:20:41)
It was just about authoritarianism. And he had a movie premiere at Mar-a-Lago. So I’m like, I’ve got to bring Lucas to Mar-a-Lago. And I’m like, “We’re going to the President’s house.” He’s like, “Oh, the White House?” And I’m like, “No, no, a former president.” He goes, “Oh, Abe Lincoln?” And I’m like, “Okay, kid logic.” He’s giving logical answers. This is kind of like AI, you have to program it. It’s using logic correctly.
Lex Fridman
(00:21:04)
You should have told him it’s a president that’s second only to Abe Lincoln in terms of greatness.
Michael Malice
(00:21:11)
Accomplishments, yeah. He went up to all the women in their ball gowns, evening gowns, and he goes, “You’re so beautiful. Were you born as a girl?” So when you have this six-year-old asking you this, it was really, really fun. So that is a great joy, to have a nephew. And I have another one, Zach, who’s coming up in age, and he’s starting to talk now. That is really, really fun for me.
Lex Fridman
(00:21:39)
Getting to watch them find out about the world for the first time.
Michael Malice
(00:21:43)
And also training him; he loves being funny and having fun.
Lex Fridman
(00:21:49)
You’re his audience in a sense?
Michael Malice
(00:21:51)
Yeah, but.
Lex Fridman
(00:21:52)
Because you giggle and?
Michael Malice
(00:21:54)
I give him, “We’re prank bros.” He gives me a high five. My family, and this is one, you talk about what I find funny, these are things that actually enrage me: when people, and this is such a WASP thing, don’t just go with the joke, or they’re like, “I don’t get it,” or they don’t understand to just go with it.

(00:22:10)
I was in the car with my sister when she was 10, 12, whatever. She’s much younger than me. She’s 12 years younger. And there’s this species of squid, by the way, which is asymmetric. One of its eyes is very much bigger than the other because it swims horizontally. And so one’s looking up, one’s looking down where there’s more light. Shout out. If you want to learn more about squids, go to octonation.com.
Lex Fridman
(00:22:32)
OctoNation. Shout out.
Michael Malice
(00:22:34)
Shout out to Warren.
Lex Fridman
(00:22:34)
There’s a lot of fascinating stuff. OctoNation on Instagram.
Michael Malice
(00:22:37)
Yes. I was in the car with my sister. She’s 10 or 12.
Lex Fridman
(00:22:40)
Me as a pirate, I’m sorry for the rude interruptions. I appreciate that comment, especially.
Michael Malice
(00:22:45)
Yeah, it’s a great. Yeah.
Lex Fridman
(00:22:46)
These jokes and thoughts are coming to me at a ten-second delay, so I apologize. Anyway, you were telling me about the asymmetrical squid.
Michael Malice
(00:22:54)
I know where I was, don’t worry. I got it.
Lex Fridman
(00:22:54)
All right.
Michael Malice
(00:22:56)
So I tell my-
Lex Fridman
(00:22:56)
Sometimes you need help.
Michael Malice
(00:22:57)
No.
Lex Fridman
(00:22:57)
The age is getting to you.
Michael Malice
(00:22:57)
I was…
Lex Fridman
(00:23:02)
Your skin is showing it. It’s getting dark.
Michael Malice
(00:23:06)
I told my sister, I go, “When you were born, one of your eyes was bigger than the other, and you had to have surgery to fix it.” So she turns, she’s like, “Mom.” And my mom goes, “Honey, the important thing is that you’re beautiful now. What’s the big deal? It was just a little surgery.” And she’s like, “All right,” and she calls grandma. And grandma, she goes, “Michael said that I was born with one of my eyes bigger.” And grandma goes, “Why is he telling you this now? It’s not a big deal.” So the fact that everyone went with this…
Lex Fridman
(00:23:35)
Oh, nice.
Michael Malice
(00:23:36)
I was so impressed. I was like, “This is a quality family in this very specific regard.”
Lex Fridman
(00:23:41)
Yeah.
Michael Malice
(00:23:41)
Does your family have a sense of humor?
Lex Fridman
(00:23:43)
Yeah. Yeah. Yeah. Yeah. Yeah. Yeah.
Michael Malice
(00:23:43)
Yeah.
Lex Fridman
(00:23:47)
Soviet culture, there’s a dark sense of humor.
Michael Malice
(00:23:50)
Very much so.
Lex Fridman
(00:23:51)
There’s…
Michael Malice
(00:23:52)
Wordplay.
Lex Fridman
(00:23:52)
Wordplay. Yeah. Yeah. And especially the Russian language allows for some-
Michael Malice
(00:23:58)
Yes.
Lex Fridman
(00:23:59)
Hilarity to it. There’s also a culture of poetry, and my dad, my mom too, they remember a lot of lines from books and poems. So you can do a lot of fascinating references that add to the humor and the richness of the conversation.
Michael Malice
(00:24:18)
I feel like that’s a very Russian thing. At a party or maybe at a bar or something, I don’t know where you’d meet people, these are such great ice-
Lex Fridman
(00:24:18)
I never go out.
Michael Malice
(00:24:25)
I meant in Russia.
Lex Fridman
(00:24:27)
Oh.
Michael Malice
(00:24:27)
I meant these would be such good icebreakers, right? You go up to someone and goes, “Hey, did you hear this one?” [foreign language 00:24:32] And you just tell him some little story.
Lex Fridman
(00:24:34)
Did you say icebreakers because it’s cold in Russia? I’m here all night.
Michael Malice
(00:24:42)
That’s true. You never leave the house.
Lex Fridman
(00:24:42)
Literally.
Michael Malice
(00:24:46)
I feel like that’s a thing.
Lex Fridman
(00:24:47)
Yeah.
Michael Malice
(00:24:48)
And that’s not a thing in America.
Lex Fridman
(00:24:51)
You mean like witty banter?
Michael Malice
(00:24:53)
No. Meaning you go up to a stranger and that’s your icebreaker. You tell them this little joke, and since everyone kind of has the same sensibilities, right away you guys are chatting. I don’t think that’s a thing here.
Lex Fridman
(00:25:02)
Yeah.
Michael Malice
(00:25:02)
I think here it’s more small talk, which drives me crazy.
Lex Fridman
(00:25:05)
So what else are you thankful for?
Michael Malice
(00:25:06)
Well, what’s something you’re thankful for?
Lex Fridman
(00:25:09)
Well, you went with family. I’m definitely thankful for family.
Michael Malice
(00:25:12)
Okay.
Lex Fridman
(00:25:12)
Yeah.
Michael Malice
(00:25:14)
If I may ask, how do they react to you? You’re sitting down with Elon, you’re sitting down with Netanyahu, sitting down with Kanye, all these big names. Are they expressing that they’re proud of you, or is it more like, why haven’t you talked to this person?
Lex Fridman
(00:25:30)
Yeah, more Michael Malice, please.
Michael Malice
(00:25:34)
The people’s choice.
Lex Fridman
(00:25:36)
Yeah. They’re very proud. But they get argumentative, and they’re just like a regular human being with whom I’m close, and we just argue about stuff. Maybe they don’t show the being proud enough, but that part is just the nature of our relationship. Is it the same with your parents?
Michael Malice
(00:25:56)
Yeah. I don’t talk to my dad. That’s one of the reasons, because there’s never ever any “good job.” And at a certain point it’s like, why am I trying to search for approval from someone I’m never getting it from? And from whom it wouldn’t mean anything at this point anyway.
Lex Fridman
(00:26:14)
Well, that’s interesting. There’s a journey like that for a lot of people with their father or their mother. They’re always trying to find approval, and that’s life for a lot of people. A really big part of the human condition is that relationship you have with your father, with your mother. I don’t know. It’s a beautiful thing, whether it’s been a rough childhood or a beautiful one, all of it. That’s who you are. The relationship, especially early on in your life, with your father or with your mother is extremely formative.
Michael Malice
(00:26:48)
Yeah. My dad taught me a lot of things at a young age that I’m very, very grateful for. He’s extremely intelligent, very flawed, and that’s fine. We all are, except for me. And it’s the kind of thing where, when you learn things at the right age, and this is one of the things I like about being older, is that when I’m friends with people-
Lex Fridman
(00:26:48)
Much older.
Michael Malice
(00:27:09)
Much older, much older. When I have friends who are younger, it’s very easy for me to keep them from making the mistakes I did. So at least this is something I’m getting out of it: okay, I can’t fix these mistakes, but it just takes me 30 seconds and I can pull you back from making the mistake. So he taught me a lot as a kid. He really encouraged me very much to… He has a very good sense of humor, and also a very bad one in some ways. Dad jokes, but also really funny jokes, but also this love of learning; I got that from him. And I have got, literally right now, 98 books on my shelf to read. I remember I had a friend, and she ran into someone she went to high school with, and he stopped me on the train and he’s like, “Yo, you’re not in college. You don’t need to read books anymore.” And I was just horrified to hear this.
Lex Fridman
(00:28:00)
Yeah, yeah. Boy, don’t I know it.
Michael Malice
(00:28:07)
You do laugh, but there’s a lot of things I don’t understand. When you got heat for “I want to read the Western classics,” to me, that might’ve been the internet at its absolute worst.
Lex Fridman
(00:28:21)
I think there’s just a cynical perspective you can take: this is such a simple celebration of a thing that there must be something behind it. I think the internet, for good and bad, is just skeptical. What’s behind this?
Michael Malice
(00:28:36)
My hero, Albert Camus. And if there’s one thing I would want to fight, it’s cynicism because it’s such a giving up. It’s such, everything sucks, this sucks, this sucks. Most things suck. Most stand up comedians suck. Most movies suck. All podcasts suck. But it doesn’t matter.
Lex Fridman
(00:28:54)
Especially yours.
Michael Malice
(00:28:55)
Especially mine. It’s unwatchable.
Lex Fridman
(00:28:58)
You’re welcome. You can’t even spell it correctly.
Michael Malice
(00:29:03)
But the stuff that’s good is what matters. Who cares if 90% of movies are terrible? They’re the ones that change your life, the books, the people, the comedians, the shows, the music.
Lex Fridman
(00:29:17)
And even the terrible things have good moments, beautiful moments.
Michael Malice
(00:29:22)
Some, not all.
Lex Fridman
(00:29:23)
Your podcast being an example of not all. I keep listening for something good, something good.
Michael Malice
(00:29:31)
In all fairness, none of my guests have anything to offer, so that’s not on me. I try.
Lex Fridman
(00:29:37)
Yeah. Well, I wish you’d talk a little less in your podcast. It’s a little excessive. I only listen for the underwear commercials.
Michael Malice
(00:29:46)
Sheathunderwear.com. Promo code Malice.
Lex Fridman
(00:29:48)
I haven’t seen you do it in a while, but this kind of commentary on a debate or I think it was with Rand, like an Ayn Rand debate or something.
Michael Malice
(00:30:00)
Oh yeah. Malice at the Movies. I watched the video and I broke it down.
Lex Fridman
(00:30:03)
That was really great. I wish you did that more.
Michael Malice
(00:30:05)
I haven’t done livestreaming in a long time. It was something I was doing a lot in New York, especially during COVID. I feel that I don’t know, I got so many projects on the plate. Oh, this is something else I’m thankful for. This is something I’m very, very thankful for and I’m going to announce it here.
Lex Fridman
(00:30:26)
Coming out of the closet, finally. Go ahead. Who’s the lucky guy?
Michael Malice
(00:30:38)
You’re the one in drag.
Lex Fridman
(00:30:42)
Guns out. Guns out.
Michael Malice
(00:30:45)
He makes me call him Sex Friedman.
Lex Fridman
(00:30:48)
You like it.
Michael Malice
(00:30:50)
I didn’t say I did.
Lex Fridman
(00:30:52)
All right.
Michael Malice
(00:30:52)
Didn’t even imply that. As you probably know, but as many people watching this also know, Harvey Pekar, who had the comic book series American Splendor, was the subject of the movie American Splendor. He wrote a graphic novel about me in 2006 called Ego and Hubris, which goes for like $150 on eBay. It’s not worth it, just download it. And I met Harvey because I wrote this screenplay about this band from the ’80s called Rubber Rodeo. It’s a real band. And the keyboardist, Gary Leib, who passed away, rest in peace, Gary, introduced me to Harvey because he did the animation for the movie. And this script’s been in my desk for over 20 years, and I realized, thanks to my buddy Eric July, who has some huge success with his comics, I could just produce this as a graphic novel.

(00:31:43)
So I’ve got an artist, we’re getting it together, so I’m going to make it happen finally. And it’s some of the best writing I’ve ever done. I’m really proud of the story. It’s ironic reading it now, because when you’re a writer, obviously with different books, you put different aspects of yourself into them, and this story is very, very dark because basically they did all the right things and they went nowhere. What I realized, reading it now, is that all these fears I had over 20 years ago, about what if I’m not going to make it, what if I’m doing all the hard work and it’s still not enough, have now been disproven because I can at least pay my rent.
Lex Fridman
(00:32:22)
Do you feel like you’ve made it? Because you said you could pay your rent.
Michael Malice
(00:32:26)
I feel that to make it is if you don’t have to have a boss, and you know how I really felt like I made it?
Lex Fridman
(00:32:36)
Mm-hmm.
Michael Malice
(00:32:37)
This is going to sound like a joke, and it’s not. This is being an immigrant: I own, as you know, Margaret Thatcher’s bookcases.
Lex Fridman
(00:32:45)
Yes.
Michael Malice
(00:32:45)
So to me as an immigrant, to have her bookcases in my house, I’ve made it.
Lex Fridman
(00:32:51)
You’re right. It’s not a joke.
Michael Malice
(00:32:53)
There’s nothing funny about it at all.
Lex Fridman
(00:32:55)
Not laughing.
Michael Malice
(00:32:55)
It’s time to get serious.
Lex Fridman
(00:32:59)
Oh, nice. Oh, now I’m more nervous and aroused. So what else are you thankful for? So we’re both thankful for family.
Michael Malice
(00:33:10)
The fact that I can-
Lex Fridman
(00:33:11)
Still get it up?
Michael Malice
(00:33:12)
What’s that?
Lex Fridman
(00:33:13)
Nothing, go ahead.
Michael Malice
(00:33:14)
I think as an author, to be able to write what you want and have enough of an audience that it covers your living, that’s as good as it gets as an author almost. You don’t need to be Stephen King or some legend. There’s lots of stand-ups who aren’t world famous, but they have a perfectly good living. They do their gig, they do what they love. I feel very, very blessed. You must be thankful for your career?
Lex Fridman
(00:33:43)
Yeah, yeah. Career-wise. But I think the best part about it is just making friends with people I admire.
Michael Malice
(00:33:52)
Okay.
Lex Fridman
(00:33:53)
Quite honestly, just friends. The people that have gotten to know me, I hide from the world sometimes, I hit some low points, especially with all the new experiences and just the people that have been there for me and haven’t given up on me.
Michael Malice
(00:34:06)
There’s days, and I’m sure you’ve had this also where I literally don’t speak to someone the whole day. And in certain times in my life, I remember very vividly, I was in DC in ’97, I was an intern, and that summer, DC closes down on the weekends. And I remember those weekends when I got off the phone with the third person. I knew there was no possibility anyone was going to call and what that felt like, and it was dark and it was bad. So I remember those feelings of loneliness a lot.
Lex Fridman
(00:34:44)
I still feel alone like that sometimes. You don’t feel alone?
Michael Malice
(00:34:51)
Not anymore.
Lex Fridman
(00:34:53)
What’s the reason, you think?
Michael Malice
(00:34:57)
Because I have a lot of people who I care about and who care about me. The thing about moving to Austin is I forgot how lonely New York got because it was like one after another, I lost everybody. And then you start losing the places you go to, and then it was just like, “Holy crap. I’m very isolated.” And here in Austin, there’s not as much to do, obviously as in New York, but there’s a lot of people here. More people are coming all the time. So if I ever want to hang out with someone, I’ve got a long list. And these are people who I’ve known for a very long time, people who know me quite well, so I could be myself. My awful, awful, awful, awful self. And that is something I don’t take lightly.
Lex Fridman
(00:35:42)
Now you moved to Texas, it’s going to secede.
Michael Malice
(00:35:44)
Yeah.
Lex Fridman
(00:35:44)
It’s just a very-
Michael Malice
(00:35:46)
Do you know what happened with that?
Lex Fridman
(00:35:47)
No.
Michael Malice
(00:35:48)
I forget the guy’s name, and it’s probably for the best. On Monday, a guy in the Texas legislature introduces a bill to have a referendum for Texas to declare its independence. Tuesday, I’m on Rogan. Me and him discuss it. I give it national attention. It was also really funny because a lot of people are like, “These people have been in Texas five minutes, blah, blah.” I go to the Texas legislature, meet with the guy, have a nice conversation. A month or two later, unanimously I think, he gets voted out of Congress because he got an intern drunk and was inappropriate with her. At least it was a girl in this case. But yeah, so that was my little Texas independence moment.
Lex Fridman
(00:36:36)
Oh, it didn’t go anywhere?
Michael Malice
(00:36:38)
It did not go anywhere.
Lex Fridman
(00:36:39)
Wow.
Michael Malice
(00:36:41)
But it’s still part of the platform of the Texas Republican Party.
Lex Fridman
(00:36:45)
It’s fascinating that history is probably laden with stories like this of failed revolutionaries. We celebrate the heroes, but then there’s the losers like…
Michael Malice
(00:36:55)
Myself.
Lex Fridman
(00:36:56)
Yeah.
Michael Malice
(00:36:56)
Yeah.
Lex Fridman
(00:36:58)
And we’re going to mark that one as a failure, edit it out, and move on. So thankful. Friendships, right? But by the way, I want to say, just to you, I’m thankful in these lonely moments for people who write books. I’ve been listening to audiobooks a lot and reading a lot. I really like audiobooks actually. I don’t know, I can just name a random person, Serhii Plokhy. He’s a historian I’m reading on the-
Michael Malice
(00:37:28)
Wait, I read him. What did he…
Lex Fridman
(00:37:29)
It’s just that he’s most recently written a book about the Russia-Ukraine war.
Michael Malice
(00:37:35)
He wrote another one that I read. Didn’t he write about-
Lex Fridman
(00:37:37)
Empires, I think.
Michael Malice
(00:37:38)
The fall of the Soviet Union or something like that.
Lex Fridman
(00:37:38)
Yeah, yeah.
Michael Malice
(00:37:40)
Yeah. It was very, very good.
Lex Fridman
(00:37:41)
He’s great.
Michael Malice
(00:37:42)
I used him as a resource for the White Pill.
Lex Fridman
(00:37:44)
He’s objective while still having emotion and feeling to it. He has a bias.
Michael Malice
(00:37:49)
That’s fine.
Lex Fridman
(00:37:53)
A lot of times when you write a story that involves Putin, people are really ideological. They don’t write with the calmness and the clarity and the rigor of history; there’s emotion in it. There’s almost a virtue signaling. And he doesn’t have that, even though he is Ukrainian and has very strong opinions on the matter. Anyway, there’s people like that, and he’s done an incredible job researching a recent event. Like he says, he was looking at everything that’s been written about the war in Ukraine and realizing the old Churchill line, that historians are the worst ones to write about current events, except everybody else. And so he’s like, “I might as well just write about this war.” And he does an exceptional job summarizing, day by day, the details of this war. Anyway. So I’m just grateful for a guy like that.
Michael Malice
(00:38:50)
For me, I’ll name some historians I love. Arthur Herman. Victor Sebestyen is probably my favorite. David Pietrusza, P-I-E-T-R-U-S-Z-A. When you are a historian, and I tried to do this to some degree in the White Pill as much as I could, when you take data and you make it read like a novel, so you’re learning about who we are as people, what had happened, but also it’s entertaining and readable, that to me is the Acme of writing. I have so much admiration-
Lex Fridman
(00:39:25)
What does Acme mean?
Michael Malice
(00:39:27)
Top.
Lex Fridman
(00:39:27)
Okay.
Michael Malice
(00:39:28)
Zenith.
Lex Fridman
(00:39:29)
Zenith? Okay. Is this what writers do? They just come up with these incredibly sophisticated words? I’m impressed.
Michael Malice
(00:39:35)
Well, Acme is-
Lex Fridman
(00:39:35)
Because you could have just said the best of writing.
Michael Malice
(00:39:38)
Acme is also the company in Bugs Bunny; Wile E. Coyote is always using Acme, like Acme bombs. When they are that good, it leaves me in awe.
Lex Fridman
(00:39:52)
It’s just-
Michael Malice
(00:39:53)
Ron Chernow is another one.
Lex Fridman
(00:39:54)
Who?
Michael Malice
(00:39:55)
He wrote the Hamilton biography.
Lex Fridman
(00:39:57)
Oh, nice. Well, I have a lot of favorite historians for the whole time period of World War II, William Shirer, people that lived during it, especially. I really like those accounts. Obviously Solzhenitsyn, he’s not a historian, but his accounts are fascinating. Actually, how much do you talk about Solzhenitsyn?
Michael Malice
(00:40:20)
Never.
Lex Fridman
(00:40:21)
Not much, right? Why not?
Michael Malice
(00:40:24)
I feel like I wanted to. There’s nothing I could add to him.
Lex Fridman
(00:40:30)
But he’s the Michael Malice of the previous century?
Michael Malice
(00:40:34)
No, he’s talented, charismatic, and skilled. So he’s not the Michael Malice. Yeah. I feel like I didn’t read Gulag Archipelago for the White Pill.
Lex Fridman
(00:40:47)
You didn’t?
Michael Malice
(00:40:48)
I didn’t. No. I got a lot of it from Anne Applebaum, who’s a very controversial figure. Her history books on the Soviet Union, I think, are superb, but she’s also accused of being very much a neocon and being a warmonger in contemporary times.
Lex Fridman
(00:41:02)
Oh, I see.
Michael Malice
(00:41:02)
And I think comparisons between Putin and Stalin, although there is a Venn diagram, I think are a bit much, because I think it’s very hard to claim that if Putin conquered Ukraine, that there’d be a genocide. I think that’s a very hard argument to make.
Lex Fridman
(00:41:19)
In these tense times. Even the comparisons of what’s going on in Israel on either side, comparisons to the Holocaust are also troubling in this way.
Michael Malice
(00:41:28)
Yes. And I also don’t like how that… I got in trouble. There was some literal demon who works at the Atlantic.
Lex Fridman
(00:41:36)
As opposed to a regular demon?
Michael Malice
(00:41:38)
As opposed to figurative demon.
Lex Fridman
(00:41:40)
I didn’t know they employed demons.
Michael Malice
(00:41:41)
They exclusively employ demons at the Atlantic. And he was giving me crap a couple of years ago on Twitter because I didn’t think it’s appropriate to refer to George Soros as a Holocaust survivor. And I’m like, “Listen, if you want to put him in the same context as Anne Frank, knock yourself out.” But I think that’s so completely disingenuous and frankly repulsive to me morally to equate figures like that. And also the claim that anyone who is a billionaire, and that includes Elon, Sheldon Adelson, Bill Gates, there’s no shortage of these people: if you want to use your extreme wealth to influence politics, you have to be up for criticism. To protect these people from criticism just on the basis of their identity is deranged to me.
Lex Fridman
(00:42:35)
But also, the Holocaust as a historical event and the atrocities within it are just singular in history. And so comparing them…
Michael Malice
(00:42:47)
What’s the utility? You’re just basically trying to take this brand. I’m using that term in a very specific way. And when they say climate denial, no one’s denying climate exists. So you’re just trying to go off Holocaust denial. I think it’s shameless and I think it’s gross.
Lex Fridman
(00:43:04)
And it cheapens everything because there’s deep important lessons about the Holocaust.
Michael Malice
(00:43:09)
Yes.
Lex Fridman
(00:43:10)
To me, the lessons are about how extreme it can get.
Michael Malice
(00:43:15)
And how fast.
Lex Fridman
(00:43:17)
Yeah, and how fast.
Michael Malice
(00:43:17)
That’s the one. So people ask, “Oh, are humans basically good? Are they basically evil?” I always say they’re basically animals. And I think most people are almost fundamentally deranged. And that there’s basically this veneer of civilization and decency. And when shit hits the fan and we see this over and over, they do things that would’ve been completely unthinkable even to themselves five years ago.
Lex Fridman
(00:43:46)
Most people are fundamentally deranged with a veneer of civility.
Michael Malice
(00:43:50)
There’s a show called-
Lex Fridman
(00:43:51)
I think I disagree with that.
Michael Malice
(00:43:53)
What’s the show called? I’m having Alzheimer’s because of the advanced age.
Lex Fridman
(00:43:58)
The age, the skincare. It’s just working well.
Michael Malice
(00:44:00)
There’s a show called, I Think You Should Leave. It’s a sketch comedy.
Lex Fridman
(00:44:00)
I think you should leave. Okay, sorry.
Michael Malice
(00:44:04)
It’s a sketch comedy show. And he captures these great… How’s your hair, princess? He captures these great moments of just the very thin veneer of normalcy and just the craziness that’s so frequently lurking underneath. Another great example of this, when this is dealing with people who are literally crazy, have you ever seen the show, Hoarders?
Lex Fridman
(00:44:27)
Yeah.
Michael Malice
(00:44:27)
So every episode of Hoarders, there’s usually two people in every episode, but every episode has the same plot line, veneer of normalcy, veneer of normalcy, veneer of normalcy, slight expression of concern, full-blown derangement. And it always follows that exact pattern.
Lex Fridman
(00:44:44)
Yeah, I don’t know. I think the deep ocean of the human mind is good. There’s a longing to be good to others.
Michael Malice
(00:44:56)
I have seen literally no evidence of this. And I know everything’s a deep ocean with you people, but-
Lex Fridman
(00:45:01)
What do you mean you people?
Michael Malice
(00:45:02)
Pirates.
Lex Fridman
(00:45:04)
Oh.
Michael Malice
(00:45:05)
I don’t see it.
Lex Fridman
(00:45:05)
What’s that, Mr. Parrot? He’s an antisemite? No, that’s not nice to say in front of such a large audience. You’re embarrassing me, Mr. Parrot.
Michael Malice
(00:45:18)
Lex, you have-
Lex Fridman
(00:45:19)
What’s that Mr. Parrot? He’s a run-of-the-mill troll and barely an intellectual. That’s not nice to say. That’s not true. We talked about this. You have to see the good in people.
Michael Malice
(00:45:31)
You have seen personally how quickly and easily it is for human beings to form outgroups and to just read others, as I just did a minute ago with the Atlantic, completely out of the human race. And that happens constantly and very easily. Humans are tribal beings. I don’t see how that’s compatible with this essential desire to do good.
Lex Fridman
(00:45:58)
No, I think it’s like in 1984, the two minutes of hate. There is a part of humans that wants to be tribal and wants to get angry and hateful. And then that hate is easy to direct by, especially people as you, as an anarchist, talk about, there are people in power that direct that anger.
Michael Malice
(00:46:20)
Yes.
Lex Fridman
(00:46:21)
But I think if you just look at recent human history, the desire for good, the communal desire for good outweighs that, I think. Most of life on earth right now, people are being good to each other in a most fundamental sense relative to how nature usually works.
Michael Malice
(00:46:40)
Okay. I think you’re both wrong about people and about nature. So nature is not inherently violent in the sense that, for example, if anyone has an aquarium or if you look at wildlife, yeah, you’re going to have predator and prey, but these animals are going to be coexisting and they’re going to be ignoring each other for the most part, right?
Lex Fridman
(00:46:40)
Mm-hmm.
Michael Malice
(00:46:59)
And as for humans being essentially good to each other, as you said, I think they’re essentially civil and amiable, but that’s not really being good.
Lex Fridman
(00:47:12)
Good, I think is a thing that gets illustrated when you’re challenged, when there’s difficult situations.
Michael Malice
(00:47:17)
Yes, exactly. Yes.
Lex Fridman
(00:47:18)
Civility is a good starting point. And then when there’s a big challenge that comes, people step up on average.
Michael Malice
(00:47:26)
I completely agree with you that human beings are capable of such profound goodness that it makes you extremely emotional. And I certainly think that’s true, but I think that’s more the exception than the norm.
Lex Fridman
(00:47:42)
I see beauty everywhere.
Michael Malice
(00:47:43)
So do I, but that doesn’t mean it’s in every person.
Lex Fridman
(00:47:46)
Not in every person, but in most people. I wish there was a really good way to measure this, my general sense of the world. There’s just so much incredible work, both in terms of economics, in terms of art, in terms of just creation as a whole, that’s happened over the past century, that it feels like the good is overpowering the bad.
Michael Malice
(00:48:09)
You just did the perfect segue to the box.

Unboxing the mystery box

Lex Fridman
(00:48:16)
What’s in the box? Is it your fragile ego?
Michael Malice
(00:48:21)
You stole my joke. You stole my joke. That was the joke I made at you before we recorded. You stole my joke.
Lex Fridman
(00:48:28)
No, I didn’t. I write all your material, you hack.
Michael Malice
(00:48:33)
So as you know, I have a lot of beautiful stuff in my house because I think it’s something very important. Everyone listening: if you accomplish something that is great, some achievement, what I like to do is buy myself something to remember that moment. Because sometimes when it’s hard, you forget you’ve done great things in your life. You’ve made accomplishments. It doesn’t have to be some amazing victory. It could just be like my first job, or I got a raise, or, you know what, anything. So there’s this amazing sculptor named Jake Michael Singer, a singer who’s a sculptor, and I saw a piece of his.
Lex Fridman
(00:49:18)
How’s his singing voice? This joke’s not going-
Michael Malice
(00:49:23)
Hold on. I could go somewhere with this.
Lex Fridman
(00:49:24)
Okay.
Michael Malice
(00:49:25)
How’s his singing voice?
Lex Fridman
(00:49:26)
Do you want me to write your joke for you?
Michael Malice
(00:49:27)
Yeah. What’s the punchline? Harrrd. There it is, that’s the one.
Lex Fridman
(00:49:33)
That’s what she said.
Michael Malice
(00:49:34)
So I followed him on Instagram, he followed me back, and he says, “What’s the point of being an artist if the work I create isn’t in the spaces of people I like and admire?” He’s a big fan of yours. You’ve given him joy, and our episodes together give him joy. So he said, “If I make Lex a sculpture, will he put it on the-”
Michael Malice
(00:50:00)
He said, “If I make Lex a sculpture, will he put it on the shelf behind him?” And what that reminded me of is when I was a kid, you read Batman comics and there’s the Batcave. And the Batcave has all this cool stuff in it. I didn’t realize until much later that all of those things in the Batcave had an origin story. So the giant penny, the dinosaur, there was actually a story of where that came from. So if you’re a fan of a show, you can spot, oh, this is when this appeared. This is when that appeared. This is when that appeared. So he made you this sculpture. He lives in Turkey, and it’s called Chance Murmur. And it is, I haven’t even seen it yet. It is absolutely beautiful.
Lex Fridman
(00:50:42)
So you want to do a little unboxing?
Michael Malice
(00:50:42)
Yes.
Lex Fridman
(00:50:44)
Okay. Axe or…
Michael Malice
(00:50:49)
Body spray?
Lex Fridman
(00:50:54)
All right.
Michael Malice
(00:50:54)
Let’s do it.
Lex Fridman
(00:50:55)
Let’s unbox.
Michael Malice
(00:50:59)
I’m so excited. He lunges out of the box.
Lex Fridman
(00:51:04)
You know that Steven Seagal movie where there’s a stripper that comes out of the box?
Michael Malice
(00:51:07)
Is there?
Lex Fridman
(00:51:08)
Under Siege.
Michael Malice
(00:51:09)
Okay.
Lex Fridman
(00:51:09)
He’s on a boat. You’re not an action film guy.
Michael Malice
(00:51:14)
No.
Lex Fridman
(00:51:19)
One.

(00:51:21)
What does the pirate say when he turns 80?
Michael Malice
(00:51:24)
What?
Lex Fridman
(00:51:25)
Aye matey.
Michael Malice
(00:51:29)
Aye matey. Oh.
Lex Fridman
(00:51:32)
Oh.

(00:51:33)
See, that’s how I know you don’t like humor.
Michael Malice
(00:51:35)
I just don’t like pirates.
Lex Fridman
(00:51:37)
Well, your mom does.

(00:51:39)
Do you play any musical instruments?
Michael Malice
(00:51:40)
No. Neither do you. I’ve seen your guitar videos.

(00:51:46)
Okay.
Lex Fridman
(00:51:48)
Here’s a big piece of wood for you. That’s what it feels like, just so you know.
Michael Malice
(00:51:57)
Oh, wow. Do you need help?

(00:51:57)
Oh my God.
Lex Fridman
(00:52:00)
This traveled across the world.
Michael Malice
(00:52:05)
So here’s why his work speaks so much to me. So first of all, he’s combining so many different references. It’s Nike, the Goddess of Victory, right? It looks like an angel as well. The Italian Futurists, which is my favorite art movement from the early 20th century, they tried to capture motion in 2D or 3D form.
Lex Fridman
(00:52:31)
Well, Jake, thank you, thank you, thank you. Thank you for creating beautiful things. Thank you for caring about somebody like me and somebody like Michael. We really feel the love.
Michael Malice
(00:52:43)
That’s the other thing.
Lex Fridman
(00:52:43)
Thank you.
Michael Malice
(00:52:45)
When you have something that matters to you in your house and you’re having a bad day, you can look at it and remember. You know what I mean? That spirit of joy. And I actually have a list here. Okay? I’ve got a little rant ready. Do you want to hear my rant?
Lex Fridman
(00:53:00)
Yeah. Let’s go.
Michael Malice
(00:53:02)
One of the things that drives me crazy is when people, especially conservatives, think that all contemporary art is ugly or abstract or literally garbage. And there’s a lot of that, but so much of the stuff out there in galleries is not only not crazy expensive, but they’re trying to sell things for people’s houses. And these are young artists. They’re trying to add beauty. I have a list, so if you don’t believe me and you think all contemporary art is garbage or terrible, go to the websites of any of these places that I’m going to rattle off, look through them. And you’re telling me that it’s not about creating beauty and joy and things in people’s lives?

(00:53:40)
So I don’t have any relationship with any of these people, these are just some galleries I follow on Instagram. Outre Gallery, Antler Gallery, Giant Robot 2, Beinart, I don’t know how to pronounce it, I’m sorry. B-E-I-N-A-R-T. Spoke Art Gallery, Var Gallery in Milwaukee, I was there. The pieces were not expensive at all.
Lex Fridman
(00:53:58)
What kind of art are we talking about? Everything? Paintings?
Michael Malice
(00:54:00)
Mostly paintings. Mostly paintings. Some sculptures too, like this. Corey Helford is my favorite one in LA. Night Gallery, Vertical Gallery, Avant Gallery, Hive Gallery, Haven Gallery, and Curio Art Gallery. I’m telling you, it’s not exorbitant. This is not the kind of thing where you have to go to a museum and be like, “This doesn’t make sense to me.” You look at it right away, you’re like, “Okay, I know what this is.” And it’s beautiful. It’s awesome. And you’re supporting someone who’s young and creative trying to do something and make the world a better place.

(00:54:31)
So I’m a big fan of the contemporary art scene. A lot of it is not great, but even the stuff that’s not great is very rarely disgusting or gross. It’s just like, okay, I’ve seen this before, or something like that.
Lex Fridman
(00:54:43)
Okay.
Michael Malice
(00:54:44)
It’s like the difference between a standup where I’ll pay money for the ticket and someone who’s an opener. It’s like, I wouldn’t pay to see him perform, but he sure still made me laugh. That person is still by far more good than bad. So a lot of this art isn’t stuff I would own, but it’s like, okay, I get it. I like it.
Lex Fridman
(00:55:01)
Well, as the analogy goes, I really like going to open mics, actually, because funny… It sounds absurd to say, but funny isn’t the only thing that’s beautiful about standup comedy, it’s the…
Michael Malice
(00:55:14)
The agony.
Lex Fridman
(00:55:17)
It’s going for it. It’s trying to be funny. It’s taking the leap, trying the joke. And some of the best stuff is actually funny, but the audience is like three people, two of whom are drunk and bored, and you’re still going for it. And that’s the human spirit right there.
Michael Malice
(00:55:35)
Roseanne was telling me how Gilbert Gottfried would go on, it was like 3:00 in the morning. And it was her and three other comics in the audience and they all were just dying.
Lex Fridman
(00:55:46)
Yeah.
Michael Malice
(00:55:46)
He was just killing them. Who’s your favorite comedian?
Lex Fridman
(00:55:53)
Dave Smith.
Michael Malice
(00:55:54)
Who?
Lex Fridman
(00:55:56)
And cut scene. Favorite comedian. First, Norm Macdonald. If you put a gun to my head and I had to answer really quickly, that would be him.
Michael Malice
(00:56:04)
Okay.
Lex Fridman
(00:56:07)
I would also say Louis C.K.
Michael Malice
(00:56:09)
Oh, wow. Yeah. Oh my God, yes.
Lex Fridman
(00:56:12)
But that’s almost like a vanilla answer at this moment in history because it’s like a-
Michael Malice
(00:56:16)
Louis C.K.’s pretty radioactive.
Lex Fridman
(00:56:18)
He is. Well, yeah. He does the tough topics-
Michael Malice
(00:56:21)
Sure.
Lex Fridman
(00:56:22)
… the best. Mitch Hedberg. The wit of a good one-liner is great. I guess that’s what Norm Macdonald was a genius at. What about you?
Michael Malice
(00:56:33)
I mean, we’re so fortunate to be here in Austin because of the Comedy Mothership. You go there and people are just killing it. David Lucas is amazing.
Lex Fridman
(00:56:43)
Yeah, he’s great.
Michael Malice
(00:56:43)
Thai Rivera probably did the best set I’ve seen since I’ve been here in Austin. And I watched him and I’m like, “This guy’s even bitchier than I am.” So I reached out to him. So he’s just terrific. David Lucas is another one, a buddy of mine.
Lex Fridman
(00:56:57)
You just said it twice, I think. David.
Michael Malice
(00:57:00)
I’m thinking of Dave Landau, excuse me.
Lex Fridman
(00:57:01)
Yeah.
Michael Malice
(00:57:01)
Dave Landau. Joe Machi is-
Lex Fridman
(00:57:04)
Old age catching up.
Michael Malice
(00:57:04)
It’s true though.
Lex Fridman
(00:57:05)
It’s true.
Michael Malice
(00:57:06)
It’s true.
Lex Fridman
(00:57:07)
It’s true.
Michael Malice
(00:57:08)
Dave Lucas.
Lex Fridman
(00:57:10)
You ever been to the Comedy Mothership? It’s a great spot.
Michael Malice
(00:57:13)
Where is that? Is that in Austin?
Lex Fridman
(00:57:15)
Austin? Is that where Willie Nelson is from? I haven’t really… Go ahead, I’m-
Michael Malice
(00:57:19)
Oh, I heard a joke about that the other week.
Lex Fridman
(00:57:23)
Go ahead. Tell a joke again.
Michael Malice
(00:57:25)
What’s the only thing worse than giving head to Willie Nelson?
Lex Fridman
(00:57:31)
What?
Michael Malice
(00:57:32)
If he says, “I’m not Willie Nelson.”
Lex Fridman
(00:57:36)
What’s that, Mr. Parrot? I know he’s not funny. He thinks he’s better on Twitter. But that’s not nice to say, and right in front of his face. Just think how he feels.
Michael Malice
(00:57:49)
The statue, Chance Murmur is judging you.
Lex Fridman
(00:57:52)
Chance?
Michael Malice
(00:57:52)
It’s called Chance Murmur.
Lex Fridman
(00:57:54)
Chance Murmur.
Michael Malice
(00:57:55)
God, that’s so beautiful.
Lex Fridman
(00:57:56)
That is gorgeous.
Michael Malice
(00:57:58)
This is another reason I hate cynicism, and I talk about this a lot. Even just on Etsy, there are so many small, not huge companies, individual artisans who are creating great stuff and just making it happen. And it’s really sad for me when people can’t see that. Or if they’re like, “Well, how could I be excited about a sculpture when blah, blah, blah, the Middle East?” And it’s just like, you can always look for an excuse not to look for joy, or you could look for an excuse to look for joy.
Lex Fridman
(00:58:26)
Yeah. Etsy is incredible. I feel the same way about-
Michael Malice
(00:58:28)
OnlyFans?
Lex Fridman
(00:58:30)
… OnlyFans. I can’t even get that out of my mouth before laughing at my own failed joke.
Michael Malice
(00:58:34)
That’s what she said.
Lex Fridman
(00:58:36)
Oh, all right. That might be one of the first that’s what she said from Michael Malice.
Michael Malice
(00:58:43)
Yeah.
Lex Fridman
(00:58:43)
I’m going to count that.
Michael Malice
(00:58:48)
I don’t know what I’m going to do with mine, because I got my own. Mine’s three feet tall, just like me.
Lex Fridman
(00:58:52)
Your box was much bigger.
Michael Malice
(00:58:53)
Yeah.
Lex Fridman
(00:58:54)
And it was giving me an inferiority complex. I think I’m going to invade Russia. That’s a Napoleon reference for those in the audience.
Michael Malice
(00:59:09)
I don’t know if I’m going to… I think I’m going to put it in my bedroom so it’s the first thing I see when I wake up.
Lex Fridman
(00:59:13)
Put it in the bedroom.
Michael Malice
(00:59:13)
Yeah.
Lex Fridman
(00:59:16)
Did we get through everything we’re thankful for?
Michael Malice
(00:59:19)
No, I’ve got lots of things I’m thankful for.
Lex Fridman
(00:59:20)
What else? Friends, family. We said books.
Michael Malice
(00:59:26)
I’m thankful for career. I am thankful for… And I know people are going to lose their minds and I can hear them flipping out already. I am thankful for social media.
Lex Fridman
(00:59:40)
Yeah.
Michael Malice
(00:59:41)
I’m thankful for several reasons. First, it is a way for people to make connections that they couldn’t have made in years past. That if you’ve got some weird hobby, you can find that other person’s weird hobby and you make that connection. It’s a great way to stay in touch permanently with people you’d otherwise lose touch with, you know, at whatever venue. And it’s also a great way to expose corporate depravity. When you have these organizations that are dishonest, I think the community notes thing on Twitter is the greatest thing ever.
Lex Fridman
(01:00:09)
Yeah, it’s incredible. I wish they would pay attention to the Michael Malice account more often.
Michael Malice
(01:00:15)
You shouldn’t be encouraging anyone to pay attention to my Twitter account.
Lex Fridman
(01:00:18)
Yeah.
Michael Malice
(01:00:18)
It’s a dumpster fire. And I don’t mean Bridget, I mean like a literal… Bridget Phetasy.
Lex Fridman
(01:00:23)
Oh, Bridget, by the way, is amazing. But your Twitter account makes-
Michael Malice
(01:00:23)
She lives here.
Lex Fridman
(01:00:25)
Yes. Not here. I wish she did.
Michael Malice
(01:00:29)
She’s in Georgetown.
Lex Fridman
(01:00:30)
No, I mean in this, where we’re sitting.
Michael Malice
(01:00:32)
Oh.
Lex Fridman
(01:00:32)
It’s a joke, Michael.
Michael Malice
(01:00:33)
Is it?
Lex Fridman
(01:00:34)
Yeah.
Michael Malice
(01:00:34)
But I’m just really glad about… It’s another way for people who before would’ve felt very alone. I know some people do feel alone, but for other people it makes them feel connected.
Lex Fridman
(01:00:46)
There’s been a lot of talk about antisemitism recently.
Michael Malice
(01:00:49)
Yeah.
Lex Fridman
(01:00:49)
What’s your sense about this? Is antisemitism like any other brand of hate? There’s a lot of hate out there.
Michael Malice
(01:00:57)
No, I don’t think it’s like any other brand of hate, because I don’t think racists or transphobes or homophobes or misogynists or xenophobes argue openly or even not so openly for the killing of black Americans, transgender people, gay people, women, or immigrants. And it’s not only something that’s talked about, it’s something that has actually happened. And not just the Holocaust, but centuries of pogroms, right? There’s this great book that I read many years ago called The Satanization of the Jews. Camille Paglia recommended it and I read it. And it lays out this certain, specific kind of antisemitism. And again, I’m not talking about people who are against Israel or something like that. I’m talking specifically about Jew hatred. They have this moral calculus that Jews are the only people who are capable of good or evil, and Jews are exclusively capable of evil.

(01:01:56)
For example, if you look at the George W. Bush White House, you had W, you had Cheney, Condoleezza Rice, Colin Powell, Donald Rumsfeld, a lot of these neocon advisors. So if there’s 10 people in a room and there’s one Jewish person, it’s his fault, and the rest are Jew controlled. So again, they only exist as a puppet of Jews in this kind of worldview. And it’s like, to me, if there were no Jews on earth, it is crazy to say that John Bolton and Liz Cheney and Lindsey Graham wouldn’t be pushing for more war. That makes no sense to me. It’s like, you blame the Jews when bad things happen, but when a Jewish person does something good, it doesn’t really matter. Or just wait, he’s going to do something bad. Well, yeah, that’s true. Human beings do good things and then they do bad things sometimes. But it only counts when that Jewish person does the bad thing.
Lex Fridman
(01:02:58)
I wonder what’s a way to fight antisemitism and fight hate in general?
Michael Malice
(01:03:02)
I think the only or the best way, because I thought a lot about this, about how did gay Americans go from being universally hated and despised to the point that many people in the ’80s went to their graves, those who had AIDS, without even telling their parents because they were so scared, to now Times Square is just covered in pride flags. And this also works for Islamophobia and some of these other bigotries, is what I call the ambassador program. Because as soon as you know someone who is a member of a certain group, it is a lot harder to be bigoted against them because instead of this being this out group that’s somewhere out there, it’s like, wait a minute, I work with this guy. Yeah, he’s kind of a jerk and maybe he sees things a little differently than me, but this guy is not a horrible human being. So I think the only way to fight any form of bigotry is to be a good example of the counter to whatever archetype or stereotype is in the culture.

Karl Marx and religion

Lex Fridman
(01:04:13)
Karl Marx wrote that, “Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people.” As the famous phrase goes. Do you think he has a point?
Michael Malice
(01:04:26)
No. I hate that quote. I absolutely hate it. I despise this sort of Reddit internet atheist activism for the simple reason that I know many people who in finding faith have become objectively better human beings.
Lex Fridman
(01:04:46)
Yeah.
Michael Malice
(01:04:46)
They start living consciously. They take morality seriously. They try, we all fail, to be moral, good people. So this sneering that these midwits, these marginally intelligent people, have towards religious people. Now, lots of religious people use religion to rationalize their bad behavior or sinful or big ego, so on and so forth. That exists, that’s true. But to say that it never helps anyone and it’s universally the… See, Marx was talking about a period, I mean, I’ll defend his quote, when his argument was the masses are being starved and oppressed, but they’re promised, don’t worry, you’ll have riches in heaven. So you should kind of let yourself be pushed around now, and this is kind of this BS bargain that the people are being given. So that was, I think, the point he was making. It certainly doesn’t apply nowadays. I’m close to a family in the Midwest. They’re good Christian people. I remember very specifically this guy, shout out to him, Sean Sherrod. I went to college with him. David Lucas.
Lex Fridman
(01:06:01)
Have you checked out the Comedy Mothership? Great club.
Michael Malice
(01:06:03)
Where is it? Is it in Austin?
Lex Fridman
(01:06:05)
Willie Nelson.
Michael Malice
(01:06:07)
I was 17, 18, freshman year, and I was reading all this criticism of the Bible and I was like, “Look, this is in there. Look at this in there.” And he put his hand on my shoulder and he says, “Michael, there’s nothing you’re going to tell me that’s going to make me lose my faith.” And that was a very self-aware and profound thing to say. As I’ve gotten older, I know lots of religious people. There’s no part of me that thinks they’re wrong or they should be mocked. It also reminds me of when people sneer at addicts in recovery, they’re like, “Alcoholism isn’t a disease, it’s a choice.” It’s like, wait a minute. You don’t know what it’s like to have your entire life ruined by drugs or alcohol.
Lex Fridman
(01:06:48)
Yes.
Michael Malice
(01:06:49)
And if you have to tell yourself, “I have this disease and blah, blah, blah,” and that keeps you from drinking and now you’re a moral upstanding person who’s reliable and takes responsibility for their actions, I don’t see the harm at all. So I think this kind of activist atheism is cheap. I don’t agree with it whatsoever. And I do not like that quote at all.
Lex Fridman
(01:07:12)
But otherwise, big fan of Marx?
Michael Malice
(01:07:14)
I mean, I think it was a fan of mine, I forget who it was, apologies. He had this great quote, and this is not me talking. He goes, “The games people play to feel smarter than others is depressing and annoying.” And I think this kind of fedora internet atheism is a good example, because here’s the other thing. If you’ve proven that someone else is stupid, that doesn’t mean you’re smart. You could both be stupid. So congrats, you proved someone else is stupid. Who cares?
Lex Fridman
(01:07:43)
Yeah. And sneering of all forms in general is just not great.
Michael Malice
(01:07:48)
That’s one of the things I block out people on social media instantly. You’re not going to sneer at me in my space. You could sneer at me all you want in your space, but I’m not putting up with your crap. I don’t know you.
Lex Fridman
(01:07:57)
MySpace, great social network.
Michael Malice
(01:08:00)
Is that on Sixth Street?
Lex Fridman
(01:08:04)
AOL.com.
Michael Malice
(01:08:07)
Clang, clang, clang. That’s how Lex comes.
Lex Fridman
(01:08:12)
Like a Pavlov’s dog. That was the sound before you get to see… Spend 10 minutes waiting for an image of a lady to load one line at a time.
Michael Malice
(01:08:24)
Yeah.
Lex Fridman
(01:08:29)
I recently talked to John Mearsheimer, I don’t know if you know him at all. So he has this idea about offensive realism. It’s a way to analyze the world in international relations. And the basic idea, and I’ll run it by you and see what you think, is that states, nations want to survive and they try to do so by maximizing power, military power. And he talks about anarchy quite a bit, in that one of the underlying assumptions of this way of viewing the world is that states are anarchic towards each other.
Michael Malice
(01:09:10)
Yes, that’s true.
Lex Fridman
(01:09:11)
And they operate under a lot of uncertainty. States cannot be sure that other states will not use military capabilities against them.
Michael Malice
(01:09:18)
Right.
Lex Fridman
(01:09:19)
They want to survive and they want to use military power to control the uncertainty to protect themselves.
Michael Malice
(01:09:29)
So I disagree in that regard. And I see on your bookshelf, I think the world is a lot closer to Brave New World than it is to 1984. And I think if you look at, let’s say, China’s influence in America, the influence is far more through soft power than military power. China doesn’t threaten America through “we’re going to kill you.” It’s more like the infiltration of universities, TikTok, things of that nature. Maybe this would’ve worked before the pop culture era, but I think one of the reasons we have this kind of American hegemony isn’t just a function of the American military. I think it’s much more a function of American popular culture. When you’re exporting ideas and culture, it makes people in other countries feel closer to you and also regard you as a friend, and also to adopt your values. It’s a great way to spread propaganda.
Lex Fridman
(01:10:28)
It seems to correlate though, right? It’s interesting. It’s an interesting idea. What has more power, the viral spread of ideas or the power of the military? It seems that the United States is at the top of the world on both.
Michael Malice
(01:10:44)
That’s true.
Lex Fridman
(01:10:45)
And so it’s hard to disentangle the two.
Michael Malice
(01:10:48)
Let’s look at Europe. American culture is very popular in Europe in many ways, right? The best music comes out of Sweden, Swedish indie pop. They’re singing in English, even though… So on and so forth. None of this is a function, maybe it’s a function of post World War II to some extent, but I don’t think it’s a function of American bases there. I think it’s a function of we’re exporting our music, our TV shows, and our movies.
Lex Fridman
(01:11:14)
Yeah. It’s interesting, if the battleground will be Brave New World, the battle of ideas.
Michael Malice
(01:11:18)
I think it’s clearly Brave New World. It’s so much cheaper, and again, this is one of the dark sides of social media, to use influence than it is to use threats. I think Covid is a good example of this. So much of the pressure, yes, there was authoritarianism, but it was the fact that everyone bought into it, rightly or wrongly. The vast majority of the population was behind all of these things, and that was through persuasion. And people are begging for it to come back in many cases.
Lex Fridman
(01:11:47)
So who’s funding you? Which intelligence agency?
Michael Malice
(01:11:50)
Mossad.
Lex Fridman
(01:11:51)
Mossad. Mossad. This is how you do great interviewing. See, he didn’t even expect that. Okay.
Michael Malice
(01:12:01)
What’s that, Mr. Parrot?
Lex Fridman
(01:12:02)
What was that, Mr. Parrot? You knew it? But you didn’t have any documentation, did you?
Michael Malice
(01:12:10)
I think Mr. Parrot is threatened by the better wings on Chance Murmur.
Lex Fridman
(01:12:15)
He gets like that when he’s turned on, he’s not threatened.
Michael Malice
(01:12:18)
Oh, okay.
Lex Fridman
(01:12:18)
You can’t wait until all three of us are alone together. It’s going to be one hell of a party.
Michael Malice
(01:12:25)
Beaks and feathers everywhere.
Lex Fridman
(01:12:27)
And metal. Yeah, this thing is beautiful.

Art

Michael Malice
(01:12:32)
It’s ridiculous.
Lex Fridman
(01:12:34)
You have actually a lot of really cool stuff at your place.
Michael Malice
(01:12:37)
It’s so fun.
Lex Fridman
(01:12:39)
What’s a cool thing that stands out to you? Maybe a recent addition.
Michael Malice
(01:12:44)
So I went to the Dallas Museum of Art last year for my birthday and there was a painting I liked, and I Googled it and I saw the auction for that exact painting. And it was, I think three grand, which is not cheap, but not something you think… You think in a museum, “I could never afford something like this,” right? So when I went to Houston with some friends… The Sideserfs, Natalie, who made the cake of you.
Lex Fridman
(01:13:14)
Oh, yeah, the cake. Terrified my mom.
Michael Malice
(01:13:17)
Did it?
Lex Fridman
(01:13:18)
Yeah.
Michael Malice
(01:13:18)
Aww.
Lex Fridman
(01:13:19)
No, it’s not the cake that terrified my mom. It’s you, Michael Malice, cutting it off, cutting the face off and laughing maniacally.
Michael Malice
(01:13:30)
Well, Natalie’s pregnant. She’s going to have a daughter named Daisy. So congrats to Natalie.
Lex Fridman
(01:13:30)
Congrats to Natalie.
Michael Malice
(01:13:35)
But I was in the museum with them and there was a statue of Thoth, who’s the Egyptian god whose head is an ibis. It’s a bird with a long beak. And Thoth is the god of the moon, god of knowledge, and supposedly he invented writing. So I thought, you know what? I’ve always loved Ancient Egypt. I know a lot about it and especially the mythology. It’d be really cool as an aspiring author to have an ancient Egyptian Thoth statue in my house. Well, it turned out that the Egyptians also killed and mummified ibises and buried them with scribes. And a week after I went to the museum, there was an auction for an ibis mummy. And I have it now in my house, still in its bandages, overlooking my desk. And we all know it’s going to come to life and peck out my eyes and write with my blood. But that is one of the recent cool additions.

(01:14:31)
Another thing I have, which is like, in terms of holy crap I’ve made it: I have an original Patrick Nagel painting, and if people don’t know the name, he’s like the ’80s artist. He did the Duran Duran cover. You see his work whenever you’re in nail salons. I have a male, which was very rare for him to do. So those are two of my kind of favorite pieces.
Lex Fridman
(01:14:49)
You have what?
Michael Malice
(01:14:49)
He only drew women predominantly. I have one where he drew a male. It was a guy in a jean ad or something. And now I’m looking forward to, so Jake made me a three-foot tall sculpture called Future Murmur, which I am ecstatic-
Michael Malice
(01:15:00)
… sculpture called Future Murmur, which I am ecstatic to get.
Lex Fridman
(01:15:06)
Just remind yourself how many fascinating, beautiful people that are out there.
Michael Malice
(01:15:14)
And just the victory and holiness and technology and speed, and how many people have fought so that I could do what I do.
Lex Fridman
(01:15:26)
Yeah. That’s another thing I’m grateful for. Just like the 100 billion or so people that came before us, and also the trillions of lifeforms that came before that.
Michael Malice
(01:15:38)
Oh God, I’ve gone down this trilobite rabbit hole, buying fossils because as a kid I thought trilobites were the coolest thing, and now I’ve got like 15. And what’s interesting is when you buy trilobite fossils on eBay, they’re listed as used, because it’s got to be new or used according to the programming. So it’s used.
Lex Fridman
(01:15:57)
Yeah. But just thinking about all that history, just all the lifeforms that came before. It seems like a really special thing we have going on earth here.
Michael Malice
(01:16:10)
Oh yeah. I think that’s very fair to say, but I also think this kind of is like live life to the fullest. Camus talked about living to the point of tears, especially on behalf of people who didn’t have that privilege. So I dedicated The White Pill to my parents who got me out of the Soviet Union, and to all the kids who never could. And it’s like, when I die, I want everyone else to not only, they’re obviously going to be happy, but yeah… I’m not here. Live for me, I can’t have that privilege anymore.

Books

Lex Fridman
(01:16:44)
What do you think about Camus as a writer?
Michael Malice
(01:16:47)
I don’t like his novels at all.
Lex Fridman
(01:16:49)
Oh, you don’t?
Michael Malice
(01:16:49)
At all.
Lex Fridman
(01:16:50)
Yeah. You’ve talked about The Plague to me, a little bit.
Michael Malice
(01:16:52)
Yeah. I think the book is pointless.
Lex Fridman
(01:16:55)
It’s fascinating.
Michael Malice
(01:16:56)
Because all you need to do is read the synopsis and then you get it. I don’t think his book-
Lex Fridman
(01:17:00)
Isn’t that true for most books?
Michael Malice
(01:17:01)
No.
Lex Fridman
(01:17:02)
I mean, you could take, I don’t know… I just don’t agree at all. I mean, it’s Catcher in the Rye. There’s a lot of books that seem trivial.
Michael Malice
(01:17:11)
I don’t think it seems trivial, but I think-
Lex Fridman
(01:17:13)
Animal Farm.
Michael Malice
(01:17:15)
Animal Farm is a methodical step-by-step examination of a transformation from one thing to another. The Plague is not that.
Lex Fridman
(01:17:23)
It’s a methodical examination of what a society is like under the plague, which could symbolize a lot of things, including the plague directly or Nazi Germany or ideological movements, or… It’s similar to Animal Farm. Maybe not as effective in terms of using this kind of symbology-
Michael Malice
(01:17:44)
I think Animal Farm has a narrative and… I’m going to spoil the whole Plague. The book, The Plague. There’s a town, I believe in Oran, a plague descends, people struggle to deal with it, and the plague vanishes as quickly as it came. The end.
Lex Fridman
(01:18:02)
But there’s the victims, the people that take advantage of it. There’s the doctor that, amidst the absurdity and the evil of the plague, is fighting to do good.
Michael Malice
(01:18:11)
Nothing for me. Does nothing for me.
Lex Fridman
(01:18:13)
Okay, well I can spoil the Animal Farm. There’s animals at a farm and the humans are abusing them, and then the animals overthrow the humans, but then the pigs become just like the humans. The lesson, kids, is that power corrupts, no matter whether you walk on four or on two.
Michael Malice
(01:18:40)
I thought the lesson was that pigs are the most human-like animals on the farm.
Lex Fridman
(01:18:47)
I thought the lesson was that there’s no sugar candy mountain.
Michael Malice
(01:18:50)
That’s right. Yeah.
Lex Fridman
(01:18:53)
You’ve interviewed a lot of people. What have you learned about getting to the soul of a person, the soul of an idea from interviewing? Just how to do a good interview?
Michael Malice
(01:19:04)
First off, I’m not interviewing just random people. I’m interviewing people who are accomplished. It’s not a random group. That’s self-selecting for something different. But I think that people love to, and this is very understandable, love to feel seen. So if you’re someone who’s done something, even if you’re like the best Guinea pig breeder in America, to have someone interested in your work and listen to what you’re saying… because I remember, every book I’ve written, I have friends, and I wouldn’t stop talking about the person I’m writing with or North Korea. And at a certain point, I’m sure they’re like, “All right, I don’t care about this anymore.” But it takes over your brain. You know what I mean?

(01:19:43)
So if you’re someone who has an interest or a hobby, I’m sure to some extent, maybe your friends or family are sick of talking about it or you don’t want to talk about it with them. That’s the private life where you could just be yourself. So I try, and this comes from my co-authoring background, when I’m talking to people, to ask the questions that they haven’t heard before. There’s a possibility that this actor I’m a huge fan of is going to be on my show. I don’t want to spoil everything. And he’s got a very specific role that he’s known for. And I’m like, “Okay, I know it’s going to be annoying for you talking about this one role, but my goal is to ask questions that you aren’t sick of answering, that you haven’t been asked.”
Lex Fridman
(01:20:28)
Porn star or…
Michael Malice
(01:20:29)
No, not a porn star.
Lex Fridman
(01:20:32)
That joke failed. Also, edit that out. What do you know about breeding Guinea pigs? You mentioned it. I’d love to hear-
Michael Malice
(01:20:32)
I don’t know anything.
Lex Fridman
(01:20:42)
I would love to hear more about it.
Michael Malice
(01:20:44)
I always use this as an example. You meet someone at a party who breeds Guinea pigs, right? There’s two approaches. Either it’s, “You’re weird,” okay, or, “Sit down and tell me everything.” And I’m very much, and all the people I like are, in the second group. When you meet someone who’s doing something unusual and is passionate about it and is good at it, that to me is the mother lode.
Lex Fridman
(01:21:09)
Yeah. That to me also is the thing I enjoy the most, is people-
Michael Malice
(01:21:13)
And then it’s like-
Lex Fridman
(01:21:15)
… that are passionate about a thing.
Michael Malice
(01:21:16)
… who do you guys hate? Do you guys hate the hamster people? Do you hate the rabbit people? There’s got to be someone that you guys look down on, because the marine aquarium people look down on the freshwater aquarium people.
Lex Fridman
(01:21:25)
Yeah. It’s a hierarchy.
Michael Malice
(01:21:26)
Yes. There’s always going to be a hierarchy. This is where the left anarchists and I disagree, because they think you can have egalitarianism. There’s going to be a hierarchy.
Lex Fridman
(01:21:33)
Hierarchies emerge.
Michael Malice
(01:21:34)
Yes.
Lex Fridman
(01:21:35)
There’s no anarchy in the Guinea pig world.
Michael Malice
(01:21:39)
No. It’s just a different kind of anarchy.
Lex Fridman
(01:21:41)
Somebody’s always breeding somebody else.
Michael Malice
(01:21:43)
Yes.
Lex Fridman
(01:21:44)
And looking down on the others.
Michael Malice
(01:21:47)
Yeah, someone’s the other. Whether it’s the hamster people, the rat people.
Lex Fridman
(01:21:50)
And everybody’s breeding. By the way, are you an anarcho-capitalist? What flavor of anarchist are you?
Michael Malice
(01:21:59)
I’m an anarchist without adjectives. I like them all. The black flag comes in many colors.
Lex Fridman
(01:22:04)
All right. All right. You’re quoting your… No, I understand. It’s a beautiful line in the book.
Michael Malice
(01:22:09)
Thank you. I think the anarcho-capitalists don’t give the left anarchist enough credit, especially for their courage. And I do whatever I can in my power to talk about people like Emma Goldman, whenever possible.
Lex Fridman
(01:22:24)
Do you still think that “are some people better than others” is a good litmus test?
Michael Malice
(01:22:29)
Yes. It’s worked 100% of the time.
Lex Fridman
(01:22:32)
And for you, the answer is yes?
Michael Malice
(01:22:35)
I never answer.
Lex Fridman
(01:22:39)
There’s two of them.
Michael Malice
(01:22:43)
What are you all Hitchcock up in here?
Lex Fridman
(01:22:46)
Oh, hey, careful. I always got your back. What little habits in your life make you happy now that you’re in Austin?
Michael Malice
(01:22:59)
Oh my god. I was prepping for this interview, and I imagined this coming up, and I knew that as I explained this… you know how sometimes when someone tells a story, at first it’s amusing, then it’s amusing and concerning, and then you’re like, “Holy shit, where’s the exit?”
Lex Fridman
(01:23:27)
Yeah. I’m getting nervous already.
Michael Malice
(01:23:30)
You should. So I’m going to tell you something I’ve told only a couple of people. This is my absolutely off-the-charts, autistic approach to shaving. So I have this insane system. You asked about habits that give me joy. I used to hate shaving. I used to hate it. There’s something called wet shaving. So wet shaving is you get the brush, you get the soap that’s in a canister, you stir it up, you paint your face, and then you shave. The thing is, there are dozens of these shaving soap companies, okay? So I tried a couple hundred of these soaps, because you’re testing for scent, you’re testing for, with the lather, thickness, and also how smooth of a shave it gives you. I have it down… I’m not making this up. I’m not this creative. I have it down to a cycle of 67 soaps. Okay?
Lex Fridman
(01:24:40)
A cycle.
Michael Malice
(01:24:41)
A cycle. So 67. When I use up one soap, that opens a slot where I will try new ones, and I will try new ones in that slot until I get one that I like, and then that slot is filled. So right now, I have 67 that I use, and I have 86 candidates.
Lex Fridman
(01:25:05)
Like in the queue?
Michael Malice
(01:25:07)
In the queue.
Lex Fridman
(01:25:07)
Do you label them? Do you remember which one is which?
Michael Malice
(01:25:09)
Well, they all have beautiful labels. I mean, these are artisans who are creating these amazing things. I would encourage every guy to try this hobby. It’s so much fun. I will give a shout-out to the companies that are the best. So the best company, in my opinion, is a company called… they just changed the name because… You know what they were originally called? I’m not joking. Grooming Department. And now it’s like-
Lex Fridman
(01:25:37)
Not a bad name.
Michael Malice
(01:25:38)
Yeah, but it has certain connotations in contemporary discourse.
Lex Fridman
(01:25:42)
Yeah, I understand. Contemporary discourse, yeah.
Michael Malice
(01:25:43)
So now he changed the name to Aion Skincare, A-I-O-N. Their scents are the most sophisticated, the most diverse, and the soap is just really high quality. Another amazing company is Barrister and Mann. And if I’m going to tell you to try one, it’s called Cheshire. He comes out with new ones every month or so. A lot of it’s miss. A lot of it’s hit. Just great, great quality stuff. Another great company is Chiseled Face. They make something called Midnight Stag, which basically smells like a garage. It’s one of my favorite soaps of all time.
Lex Fridman
(01:26:16)
What makes for a good smell for Michael Malice?
Michael Malice
(01:26:19)
I have 67 answers. So some of them smell-
Lex Fridman
(01:26:22)
So you can’t convert it into words?
Michael Malice
(01:26:24)
Some are citrusy, some are industrial, some-
Lex Fridman
(01:26:27)
So garage is more industrial.
Michael Malice
(01:26:28)
It smells like a garage. Yeah. Midnight Stag. It smells like a garage. Some are fun. There are scents that smell like other things. For example, there’s a scent in my queue called Finding Scotty. It smells like Swedish Fish. Another great company is Phoenix Shaving, and they have one called Aloha Smackdown. It smells like Hawaiian Punch. They had one called Yule Ham that they made for me special. Smells like a ham. They had a ramen one, Rock and Ramen. Smells like a cup of noodles. And every year they do an advent calendar where for 12 days you have a little sample of a soap and a sample of the aftershave.
Lex Fridman
(01:27:04)
Nice.
Michael Malice
(01:27:06)
I’m forgetting someone and I’m feeling angry that I’m doing it. But those are some of the… Oh, and Catie’s Bubbles is great. They’re vegan, out of New Jersey. They’ve got one called Knee High to a Grape. It smells like grape soda. I think those are the biggest names off the top of my head.
Lex Fridman
(01:27:25)
Will that list converge down to a small set eventually, or no? 67 down to-
Michael Malice
(01:27:31)
Well, no, it’s 67.
Lex Fridman
(01:27:33)
Oh, so it always keeps [inaudible 01:27:35]-
Michael Malice
(01:27:35)
So if there’s a slot, then, you know what I mean? I’ll fill that. You see what I’m saying?
Lex Fridman
(01:27:39)
Oh, so you will forever have the variety of 67?
Michael Malice
(01:27:41)
Yes.
Lex Fridman
(01:27:46)
You know how sad my brain is? When you were telling me this, I was like, “I wonder how many soaps are left in Michael Malice’s life.” You can count your life by days, by months, by years, or by soaps.
Michael Malice
(01:28:01)
That is depressing. That is very dark.
Lex Fridman
(01:28:03)
Because each experience of shaving is a little beautiful experience.
Michael Malice
(01:28:07)
Yes, it is. It’s so much fun.
Lex Fridman
(01:28:09)
How many do you have left in your life, right?
Michael Malice
(01:28:10)
That’s true.
Lex Fridman
(01:28:11)
Yeah.
Michael Malice
(01:28:12)
I got to tell you, there’s something else. There’s a term my friend Jackie taught me called Touching Pan. It’s a makeup term. So basically, when you use something up and you can see the bottom of the pan, that’s like a big moment.
Lex Fridman
(01:28:24)
Oh, it’s a great thing.
Michael Malice
(01:28:24)
Yeah. Well, it’s kind of fun. I’m telling you, people can scoff. It is such a fun… and there’s a lot of us online who are into this whole space. It’s really, really fun.
Lex Fridman
(01:28:34)
When did you first discover this?
Michael Malice
(01:28:36)
Can I curse?
Lex Fridman
(01:28:37)
Yeah.
Michael Malice
(01:28:38)
Fuck you, Cole Stryker. Because I was staying at my friend Cole’s house in LA. Fuck you Cole.
Lex Fridman
(01:28:45)
Fuck you, Cole.
Michael Malice
(01:28:46)
Cole is one of the biggest hipsters I know. He’s got the shirts with the pearl snaps and everything. And I’m staying at his house because I was doing Rogan, and he goes, “Oh, have you heard of this wet shaving thing? Look, this one’s Proraso. That’s the Italian grandpa soap, which is also a great one.” And I went down this rabbit hole, and now I’m like… I don’t even know how much money I’ve spent on this. And it’s all because of him.
Lex Fridman
(01:29:09)
Oh. But it’s like a happy fuck you. Like, fuck you, Cole.
Michael Malice
(01:29:09)
Yeah.
Lex Fridman
(01:29:12)
I love you, Cole. Fuck you.
Michael Malice
(01:29:13)
Yeah, it’s just-
Lex Fridman
(01:29:14)
Thank you.
Michael Malice
(01:29:15)
Yes, yes.
Lex Fridman
(01:29:16)
That’s a good idea for a tattoo. Fuck you, Cole. Do you have advice on how to be happy?

How to be happy

Michael Malice
(01:29:25)
Yes.
Lex Fridman
(01:29:26)
There’s a lot of loneliness and sadness in the world.
Michael Malice
(01:29:31)
I can give a very easy piece of advice that worked a lot for me. Instead of telling yourself that you have these ridiculous standards, tell yourself, “I can be better. I don’t have to be a great writer. I could be a better writer. I don’t have to be a great podcaster. That will never happen. I could be a better podcaster. I could be a better person. I could be better at the gym. I could be better with my time.” And when you regard things in… and especially if you have metrics that you can go by. “I’ll run this many miles a day.” Things you have control over. Especially as males, when you have this chart and the data is telling you you’re improving, right away, it’s like you have this sense of accomplishment. So I think that is a really great way to…

(01:30:25)
And if something is not working in your life… Let’s suppose you don’t have friends. Right? There’s the internet. How do people make friends? Try things out? What’s the worst that’s going to happen? Things will blow up in your face. Well, you’ll learn something at least. Don’t be afraid of making mistakes. When I was a kid, I was so set on having things under control, so that I would never have to get hit in the face metaphorically. And then I realized, and you realized this as well, everyone who’s important gets hit in the face. Look at the president, whoever the president is. It becomes a matter of being strong enough that you could take getting hit in the face. So that is a big important switch in your thinking.
Lex Fridman
(01:31:05)
Yeah. There’s a Bukowski quote I wrote down. “Sometimes you climb out of bed in the morning and you think, I’m not going to make it. But you laugh inside, remembering all the times you felt that way.”
Michael Malice
(01:31:16)
Yeah, yeah.
Lex Fridman
(01:31:17)
There’s a part of me that’s like that. There’s some days where I feel like this is the worst day of my life. And then shortly after, I chuckle at that.
Michael Malice
(01:31:26)
Yes.
Lex Fridman
(01:31:26)
Just knowing the ups and downs of the brain and the mind and life and all that. You ever been depressed?

Depression

Michael Malice
(01:31:32)
Yeah, of course. I’m more anxious than depressed. I don’t really get depressed, but I’ve been depressed.
Lex Fridman
(01:31:38)
Like low points.
Michael Malice
(01:31:39)
Yeah. But I think I distinguish between depression and low points, right? If things are going bad and you feel bad, that makes sense. But when I think of depression, I think of someone who feels bad when things aren’t bad. To me, it’s almost by definition irrational.
Lex Fridman
(01:31:55)
Well, yeah. And there are different kinds of… There’s an exhausted kind of depression where it’s not so much sad as you don’t want to do anything. You don’t want to live. You don’t want to-
Michael Malice
(01:32:09)
Yeah. What’s the point? It’s a wrap, yeah.
Lex Fridman
(01:32:10)
What’s the point? What’s the point? And an extreme self-critical negativity, which I’m also scared of because my brain is generally very self-critical.
Michael Malice
(01:32:19)
Because you’re not taking enough magnesium.
Lex Fridman
(01:32:21)
Do you take it rectally or in the mouth?
Michael Malice
(01:32:23)
You take it rectally.
Lex Fridman
(01:32:24)
Okay.
Michael Malice
(01:32:25)
But as for the magnesium, you should take it as a pill.

Fear

Lex Fridman
(01:32:28)
Okay. Well, the way your mom explained it then is way different. What are you most afraid of?
Michael Malice
(01:32:47)
Holy crap. I am trying to think of anything I’m afraid of.
Lex Fridman
(01:32:51)
In 1984-
Michael Malice
(01:32:54)
I thought even just-
Lex Fridman
(01:32:55)
Look, if I wanted to torture you, hypothetically…
Michael Malice
(01:32:59)
Well, then mission accomplished. I mean, I’m scared of increasing authoritarianism, but that’s not personal. And that’s something that I don’t think is as much of an imminent concern as, let’s say, in Canada.
Lex Fridman
(01:33:15)
Are you scared of death?
Michael Malice
(01:33:17)
No.
Lex Fridman
(01:33:17)
You think Camus was scared of death?
Michael Malice
(01:33:20)
No.
Lex Fridman
(01:33:22)
He just accepted it as-
Michael Malice
(01:33:24)
Look, I honestly feel like if I died tomorrow, I did pretty good with what I had. I think I did things that matter to me. I think I moved the needle on things that matter to me. I think I’ve been a good friend to the people I care about. I’ve saved a couple of lives. So I think it’s a very low bar for someone to be able to go to their grave and say, “I left the world a better place than I found it.” I don’t think it’s that hard.

Betrayal

Lex Fridman
(01:34:01)
You ever been betrayed?
Michael Malice
(01:34:03)
Oh god, yes. Of course. Haven’t you?
Lex Fridman
(01:34:07)
Not as often as I would’ve predicted.
Michael Malice
(01:34:09)
Yeah. The Russian upbringing expects everyone to be like… it’s a time bomb before they betray you. I have been betrayed. Of course. Yeah.
Lex Fridman
(01:34:16)
Yeah. You value loyalty?
Michael Malice
(01:34:20)
I do. And I also made it a point to not let that betrayal color my future interactions and regard that as the universal or the norm. I think that’s very important.
Lex Fridman
(01:34:32)
Me too.
Michael Malice
(01:34:34)
And also, I feel bad. I’ve gotten to the point, Lex, where I feel bad for the person who betrayed me, because it’s just like they didn’t need to do this. And at some point, if you betray someone, you know, and you know you’re not a good person. I believe that. Like even if you tell yourself, “This is something I had to do,” you still know you had to do a bad thing to someone who didn’t deserve it. And that’s a really hard pill to swallow.
Lex Fridman
(01:35:02)
In my situation, I still think good thoughts and empathize with the people that have done me wrong.
Michael Malice
(01:35:11)
I don’t empathize with them, but I sympathize with them.
Lex Fridman
(01:35:14)
My English is not good enough to know the difference.
Michael Malice
(01:35:16)
Empathizing means you’re putting yourself in their shoes. Sympathizing means you feel bad for them and wish them well.
Lex Fridman
(01:35:25)
Yeah, I wish them well.
Michael Malice
(01:35:27)
Yeah, but I don’t put myself… it’s very hard for me to empathize with someone who betrays someone that they care about. It’s not just that I think I’m such a great person. It’s that I feel guilt very strongly. So if I did that to someone who trusted me, it would mess with my head for a long time.
Lex Fridman
(01:35:46)
Yeah, but maybe they were in pain. Maybe they were desperate. Maybe their back’s to the wall.
Michael Malice
(01:35:53)
Sure.
Lex Fridman
(01:35:53)
They felt that way.
Michael Malice
(01:35:54)
Sure. Well, that’s a sympathy thing. Not really an empathy thing.
Lex Fridman
(01:35:56)
Yeah. Yeah. Loyalty is a fascinating thing.
Michael Malice
(01:36:03)
Yes.
Lex Fridman
(01:36:04)
I value trust a lot.
Michael Malice
(01:36:05)
I know you do. Especially because you’re in such a public… Both of us, we’re in very public positions. You have to be very careful who you surround yourself with.
Lex Fridman
(01:36:13)
It sucks.
Michael Malice
(01:36:14)
Does it? Well, it’s-
Lex Fridman
(01:36:16)
Well, it sucks because it’s hard to… I usually just trust everybody.
Michael Malice
(01:36:25)
Okay, that’s crazy.
Lex Fridman
(01:36:29)
But what’s the alternative?
Michael Malice
(01:36:30)
To have a filter?
Lex Fridman
(01:36:33)
Well, I have a filter in terms of who I interact with, but within the… I see the good in people, but then in the very rare instances that might turn. Yeah. It just sucks. It breaks my heart.
Michael Malice
(01:36:48)
Yeah, I hear you. I completely agree.
Lex Fridman
(01:36:52)
Has your heart ever been broken?
Michael Malice
(01:36:54)
Yes.
Lex Fridman
(01:36:55)
Love?
Michael Malice
(01:36:56)
Yes.
Lex Fridman
(01:37:01)
I’m just so relaxed right now, and happy.
Michael Malice
(01:37:03)
Good.
Lex Fridman
(01:37:04)
Relaxed and happy.
Michael Malice
(01:37:05)
Good.
Lex Fridman
(01:37:06)
This is making me really happy.
Michael Malice
(01:37:09)
Again, it’s beautiful on like eight different levels.
Lex Fridman
(01:37:11)
I think that’s the deepest thing I’m thankful for, is just how beautiful people are and how beautiful the world is.
Michael Malice
(01:37:21)
People are going to laugh, and I welcome it. That’s fine. I really sometimes feel like the guy in American Beauty looking at the plastic bag dancing in the wind, and he’s brought to tears because of how beautiful life is. And a lot of people feel the need to sneer at that scene and Ricky Fitts, whatever, and I think he’s got it exactly right.
Lex Fridman
(01:37:45)
I think he does too. Well, in the end, you and I will both be laughing.
Michael Malice
(01:37:53)
Right. And also seeing beauty where other people see garbage. And I’d rather be the person who sees beauty than the person who sees garbage.
Lex Fridman
(01:38:02)
Yep. Well, when I look at you, I see beauty when most people see garbage. And it’s really unfair, Mr. Parrot, that you keep saying that. But all jokes aside, man, I’m really grateful for your friendship. I’m really grateful for who you are as a person. Thank you so much for talking today. Thank you so much for talking to me throughout all these years. Thank you for being who you are.
Michael Malice
(01:38:28)
You are welcome.
Lex Fridman
(01:38:31)
Thanks for listening to this conversation with Michael Malice. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Andre Gide: “Man cannot discover new oceans unless he has the courage to lose sight of the shore.” Thank you for listening and hope to see you next time.

Transcript for John Mearsheimer: Israel-Palestine, Russia-Ukraine, China, NATO, and WW3 | Lex Fridman Podcast #401

This is a transcript of Lex Fridman Podcast #401 with John Mearsheimer.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

Lex Fridman
(00:00:00)
The following is a conversation with John Mearsheimer, a professor at University of Chicago and one of the most influential and controversial thinkers in the world. He teaches, speaks and writes about the nature of power and war on the global stage, in history and today.

(00:00:19)
Please allow me to say, once again, my hope for this little journey I’m on. I will speak to everyone on all sides with compassion, with empathy, and with backbone. I’ll speak with Vladimir Putin and with Volodymyr Zelenskyy, with Russians and with Ukrainians, with Israelis and with Palestinians, with everyone. My goal is to do whatever small part I can to decrease the amount of suffering in the world by trying to reveal our common humanity. I believe that in the end, truth and love wins. I will get attacked for being naive, for being a shill, for being weak. I’m none of those things, but I do make mistakes and I will get better. I love you all.

(00:01:19)
This is a Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s John Mearsheimer.

Power


(00:01:29)
Can you explain your view on power in international politics as outlined in your book, The Tragedy of Great Power Politics and in your writing since then?
John Mearsheimer
(00:01:39)
Yeah, I make two sets of points there. First of all, I believe that power is the currency of international relations, and by that I mean that states are deeply interested in the balance of power and they’re interested in maximizing how much power they control. And the question is why do states care so much about power. In the international system, there’s no higher authority, so if you get into trouble and you dial 911, there’s nobody at the other end. In a system like that, you have no choice but to figure out for yourself how best to protect yourself. And the best way to protect yourself is to be powerful, to have as much power as you can possibly gain over all the other states in the system. Therefore, states care about power because it enhances or maximizes their prospects for survival.

(00:02:39)
Second point I would make is that in the realist story or in my story, power is largely a function of material factors. The two key building blocks of power are population size and wealth. You want to have a lot of people and you want to be really wealthy. Of course, this is why the United States is so powerful. It has lots of people and it has lots of wealth. China was not considered a great power until recently because it didn’t have a lot of wealth. It certainly had population size, but it didn’t have wealth. And without both a large population and much wealth, you’re usually not considered a great power. So I think power matters, but when we talk about power, it’s important to understand that it’s population size and wealth that are underpinning it.
Lex Fridman
(00:03:38)
So there’s a lot of interesting things there. First you said nations in relation to each other is essentially in a state of anarchism.
John Mearsheimer
(00:03:48)
Yeah, well, anarchy basically means the opposite of hierarchy. Sometimes people think when you’re talking about anarchy, you’re talking about murder and mayhem, but that’s not what anarchy means in the realist context. Anarchy simply means that you don’t have hierarchy. There’s no higher authority that sits above states. States are like pool balls on a table. And in an anarchic world, there’s no higher authority that you can turn to if you get into trouble.

(00:04:22)
And of course the political philosopher who laid this all out was Thomas Hobbes. And Hobbes talked about life in the state of nature, and in the state of nature you have individuals and those individuals compete with each other for power. And the reason that they do is because in the state of nature, by definition, you have no higher authority. And Hobbes’s view is that the way to get out of this terrible situation where individuals are competing with each other and even killing each other is to create a state. It’s what he calls the Leviathan, and that of course is the title of his famous book.

(00:05:02)
So the idea is to escape anarchy, you create a state, and that means you go from anarchy to hierarchy. The problem in international politics is that there is no world state, there is no hierarchy. And if you have no hierarchy and you’re in an anarchic system, you have no choice but to try to maximize your relative power to make sure you are, as we used to say when I was a kid on New York City playgrounds, the biggest and baddest dude on the block. Not because you necessarily want to beat up on other kids or on other states, but because again, that’s the best way to survive.

(00:05:47)
And as I like to point out to people, the best example of what happens when you’re weak in international politics is what the Chinese call the century of national humiliation. From the late 1840s to the late 1940s, the Chinese were remarkably weak, and the great powers in the system preyed upon them. And that sends a very important message to not only the Chinese, but to other states in the system. Don’t be weak, be as powerful as you can.
Lex Fridman
(00:06:18)
And we’ll talk about it, but humiliation can lead to resentment or resentment leads to something you’ve also studied, which is Nazi Germany in the 1930s. We’ll talk about it, but staying to the psychology and philosophy picture, what’s the connection between the will to power in the individual, as you mentioned, and the will to power in a nation?
John Mearsheimer
(00:06:43)
The will to power in an individual has a lot to do with individual psychology. The story that I tell about the pursuit of power is a structural argument. It’s an argument that says when you are in a particular structure, when you’re in a system that has a specific architecture which is anarchy, the states have no choice but to compete for power. So structure is really driving the story here. Will to power has a lot more to do with an individual in the Nietzschean story where that concept comes from. So it’s very important to understand that I’m not arguing that states are inherently aggressive. My point is that as long as states are in anarchy, they have no choice but to behave in an aggressive fashion. But if you went to a hierarchic system, there’s no reason for those states to worry about the balance of power, because if they get into trouble there is a higher authority that they can turn to. There is in effect a leviathan.
Lex Fridman
(00:07:59)
So what is the role of military might in this will to power on the national level?
John Mearsheimer
(00:08:06)
Well, military might is what ultimately matters. As I said to you before, the two building blocks of power are population size and wealth.
Lex Fridman
(00:08:16)
You didn’t mention military might.
John Mearsheimer
(00:08:17)
I did not, no. That’s right. And it’s good that you caught that because if you have a large population and you’re a wealthy country, what you do is you build a large military, and it’s ultimately the size of your military that matters because militaries fight wars. And if states are concerned about survival, which I argue is the principal goal of every state in the international system for what I think are obvious reasons, then they’re going to care about having a powerful military that can protect them if another state comes after them.
Lex Fridman
(00:08:55)
Well, it’s not obvious that a large nation with a lot of people and a lot of money should necessarily build a gigantic army and seek to attain dominant sole superpower status through military might. But you’re saying, as you see the world today, it has to be that way.
John Mearsheimer
(00:09:16)
Yeah, I’m arguing it is obvious. If you’re a state in the international system, do you want to be weak? If you live next door to Nazi Germany or Imperial Germany or Napoleonic France or even the United States… The United States is a ruthless great power, you surely recognize that. And if you’re dealing with the United States of America and you’re Vladimir Putin, you want to make sure you’re as powerful as possible so that the United States doesn’t put its gun sights on you and come after you. Same thing is true with China. You want to be powerful in the international system.

(00:09:50)
States understand that, and they go to great lengths to become powerful. Just take the United States of America. When it started in 1783, it was comprised of 13 measly colonies strung out along the Atlantic seaboard. Over time, the various leaders of the United States went to great lengths to turn that country into the dominant power in the Western Hemisphere. And then once that was achieved in 1900, we’ve gone to great lengths to make sure that there’s no peer competitor in the system. We just want to make sure that we’re number one.

(00:10:33)
And my argument is that this is not peculiar to the United States. If I’m China, for example, today, I would want to dominate Asia the way the United States dominates the Western Hemisphere. They’d be fools not to. If I were imperial Germany, I’d want to dominate all of Europe the way the United States dominates the Western Hemisphere. Why? Because if you dominate all of Europe, assuming you’re Imperial Germany or Napoleonic France, then no other state in the area or in the region can threaten you because you’re simply so powerful.

(00:11:12)
And again, what I’m saying here is that the structure of the international system really matters. It’s the fact that you’re in this anarchic system where survival is your principal goal and where I can’t know your intentions, right? You’re another state. I can’t know that at some point you might not come after me. You might. And if you’re really powerful and I’m not, I’m in deep trouble.
Lex Fridman
(00:11:37)
Yeah. So some of the ideas underlying what you’ve said, offensive realism, which I would love to talk to you about sort of the history of realism versus liberalism, but some of the ideas you already mentioned, anarchy between states, everybody’s trying to develop military capabilities, uncertainty, such an interesting concept. States cannot be sure that other states will not use military capabilities against them, which is one-
John Mearsheimer
(00:12:07)
That’s of enormous importance to the story,
Lex Fridman
(00:12:09)
…really important, and so interesting because you also say that this makes realists more cautious and more peaceful. Because of all the uncertainty involved here, it’s better to approach international politics with caution, which is really interesting to think about. Again, survival, most states are interested in survival. And the other interesting thing is you assume all the states are rational, which-
John Mearsheimer
(00:12:40)
Most of the time.
Lex Fridman
(00:12:41)
Most of the time. You call this framework offensive realism. Can you just give an overview of the history of the realism versus liberalism debate as worldviews?
John Mearsheimer
(00:12:56)
Well, I think for many centuries now, the big divide within the world of international relations theory is between realism and liberalism. These are time honored bodies of theory. And before I tell you what I think the differences are between those two bodies of theory, it is important to emphasize that there are differences among realists and differences among liberals. And so when you talk about me as an offensive realist, you should understand that there are also defensive realists out there, and there are a panoply of liberal theories as well.

(00:13:42)
But basically realists believe that power matters, that states compete for power, and that war is an instrument of statecraft. And liberals, on the other hand, have what I would say is a more idealistic view of the world. This is not to say that they’re naive or foolish, but they believe there are aspects of international politics that lead to a less competitive and more peaceful world than most realists say. And I’ll lay out for you very quickly, what are the three major liberal theories today that I think will give you a sense of the more optimistic perspective that is inherent in the liberal enterprise.

(00:14:40)
The first and most important of the liberal theories is democratic peace theory, and this is a theory that says democracies do not fight against other democracies. So the more the world is populated with democracies, the less likely it is that we will have wars. And this basic argument is inherent in Francis Fukuyama’s The End of History. He argues that democracy triumphed first over fascism in the 20th century, it then triumphed over communism, and that means that in the future we’re going to have more and more liberal democracies on the planet. And if you have more and more liberal democracies and those democracies don’t fight each other, then you have a more peaceful world. That was his argument. It’s a very liberal argument.

(00:15:36)
A realist like me would say that it doesn’t matter whether a state is a democracy or not, all states behave the same way because the structure of the system, getting back to our earlier discussion about international anarchy, the structure of the system leaves those states no choice, whether they’re democracies or autocracies. And again, the liberal view, this first liberal theory, is that democracies don’t fight other democracies, and therefore the more democracies you have, the more peaceful the world.
Lex Fridman
(00:16:12)
Can I just sort of try to unpack that a little bit? So the democratic peace theory, I guess, would say that in democracies leaders are elected, and the underlying assumption is most people want peace, and so they will elect peacemakers. So the more democracies you have, the more likely you have peace. And then the realist perspective says that it doesn’t matter if the majority of people want peace. The structure of international politics is such that superpowers want to become more super and powerful, and they do that through war.
John Mearsheimer
(00:16:51)
You can’t make that argument that you’re making about democracies, because if you’re saying that democracies are inclined toward peace and that the electorate picks leaders who are inclined towards peace, then you have to show that democracies are, in general, more peaceful than non-democracies, and you can’t support that argument. You can find lots of evidence to support the argument that democracies don’t fight other democracies.

(00:17:25)
So the argument I believe that you have to make, if you’re going to support democratic peace theory, the main argument you have to make is that liberal democracies have a healthy respect for each other and they can assess each other’s intentions. If you’re a liberal democracy, and I’m a liberal democracy, we know we have value systems that argue against aggression, and argue for peaceful resolution of crises. And therefore, given these norms, we can trust each other, we can know each other’s intentions. Remember, for realists like me, uncertainty about intentions really helps drive the train. But if you’re talking about two democracies, the argument there is that they know each other’s intentions.
Lex Fridman
(00:18:19)
And for you, sure, maybe democracies reduce uncertainty a little bit, but not enough to stop the train.
John Mearsheimer
(00:18:26)
I think that’s right, yeah. That’s right. So that’s democratic peace theory. The second theory is economic interdependence theory, and that’s the argument that, in a globalized world like the one that we live in and have lived in for a long time, there’s a great deal of economic interdependence. And if you and I are two countries, or if you and me are two countries and we’re economically interdependent and we’re both getting prosperous as a result of this economic intercourse, the last thing that we’re going to do is start a war, either one of us, because who would kill the goose that lays the golden eggs, it’s that kind of argument. So there you have an argument that economic interdependence leads to peace.

(00:19:09)
And then the third liberal argument has to do with institutions, sometimes referred to as liberal institutionalism. And this is the argument that if you can get states into institutions where they become rule abiding actors, they will obey the rules that dictate that war is not acceptable. So if you get them to accept the UN rules on when you can and cannot initiate a war, then you’ll have a more peaceful world. So those are the liberal theories, and as you can tell, they’re very different from realism as articulated by somebody like me.
Lex Fridman
(00:19:57)
Can you maybe argue against the economic interdependence theory, and against the institutions argument that institutions follow rules, a little bit? So the golden goose with the golden egg, you’re saying that nations are happy to kill the goose because, again, they want power.
John Mearsheimer
(00:20:19)
If they think it’s necessary to kill the golden goose because of security concerns, they will do it. The point is that economic interdependence at its root has prosperity as the core variable. In the realist story, the core variable is survival, and survival always trumps prosperity. So if you go back to the period before World War I, we’re in Europe, it’s 1913 or early 1914, what you see is that you have an intense security competition between all of the great powers. On one side you have the Triple Alliance, and on the other side you have the Triple Entente. You have these two alliances, and you have an intense security competition between them. At the same time, you have a great deal of economic interdependence. It’s amazing how much economic intercourse is taking place in Europe among all the actors. And people are getting prosperous or countries are getting prosperous as a result. But nevertheless, in the famous July crisis of 1914, this economic prosperity is unable to prevent World War I because security concerns or survival is more important. So there are going to be lots of situations where prosperity and survival come into conflict, and in those cases, survival will win.
Lex Fridman
(00:21:59)
And maybe you can speak to the different camps of realists. You said offensive and defensive. Can you draw a distinction between those two?
John Mearsheimer
(00:22:09)
Yeah. Let me just back up a bit on that one. And you were talking about will to power before. The first big divide between realists is structural realists and human nature realists, and Hans Morgenthau, who was influenced by Nietzsche and therefore had that will to power logic embedded in his thinking about how the world works, he was a human nature realist. I’m a structural realist and I believe it’s not human nature, it’s not individuals in some will to power that drives competition and war. What drives competition and war is the structure of the system. It’s anarchy.
Lex Fridman
(00:23:03)
So you’re not as romantic as the human nature realists.
John Mearsheimer
(00:23:06)
Yeah. There’s just a world of difference between the two. It’s just important to understand that.
Lex Fridman
(00:23:12)
So within that, from the structural, there’s a subdivision also of offensive and defensive.
John Mearsheimer
(00:23:17)
Yes. Inside the structural realist world. And you have a handful of realists who believe that the structure of the system fosters competition, for sure, security competition. But it really rules out great power war almost all the time. So it makes sense to care about the balance of power, but to focus on maintaining how much power you have. That’s the defensive realism, maintaining how much power you have. Not trying to gain more power, because the argument the defense of realists make is that if you try to gain more power, the system will punish you, the structure will punish you. I’m not a defensive realist, I’m an offensive realist. And my argument is that states look for opportunities to gain more power, and every time they see, or almost every time they see an opportunity to gain more power, and they think the likelihood of success is high and the cost will not be great, they’ll jump at that opportunity.

Hitler

Lex Fridman
(00:24:39)
Just to linger on the human nature perspective, how do you explain Hitler and Nazi Germany, just one of the more recent aggressive expansions through military might? How do you explain that in the framework of offensive realism?
John Mearsheimer
(00:25:04)
Well, I think that Nazi Germany was driven in large part by structural considerations. And I think if you look at Imperial Germany, which was largely responsible for starting World War I, and of course Nazi Germany’s largely responsible for starting World War II, what that tells you is you didn’t need Adolf Hitler to start World War I. And I believe that there is a good chance you would’ve had World War II in the absence of Hitler. I believe that Germany was very powerful, it was deeply worried about the balance of power in Europe, and it had strong incentives to behave aggressively in the late 1930s, early 1940s. So I believe that structure mattered.

(00:25:54)
However, I want to qualify that in the case of Adolf Hitler, because I do think he had what you would call a will to power. I’ve never used that word to describe him before, but it’s consistent with my point that I often make, that there are two leaders, or there have been two leaders in modern history who are congenital aggressors, and one was Napoleon and the other was Hitler. Now, if you want to call that a will to power, you can do that. I’m more comfortable referring to Hitler as a congenital aggressor and referring to Napoleon as a congenital aggressor, although there were important differences between the two, because Hitler was probably the most murderous leader in recorded history, and Napoleon was not in that category at all. But both of them were driven by what you would call a will to power, and that has to be married to the structural argument in Hitler’s case, and also in Napoleon’s case.
Lex Fridman
(00:27:02)
Is there some degree on the human psychology side that resentment, because of what happened after World War I, led to Hitler wielding so much power, and then Hitler starting World War II? So this is the human side. Perhaps the reason I ask that question is also because you mentioned the century of humiliation on the China side. So to what degree does humiliation lead to Hitler and lead to World War II?
John Mearsheimer
(00:27:33)
Well, the question of what led to Hitler is a very different question than the question of what led to World War II once Hitler was in power. I mean, after January 30th, 1933, he’s in power. And then the question of what is driving him comes racing to the fore. Is there resentment over the Versailles Treaty and what happened to Germany? Yes. Did that matter? Yes. But my argument is that structure was the principal factor driving the train in Hitler’s case. But what I’m saying here is that there were other factors as well, resentment being one of them. Will to power, or the fact that he was a congenital aggressor in my lexicon, certainly mattered as well, so I don’t want to dismiss your point about resentment.
Lex Fridman
(00:28:29)
So for Hitler in particular, the way he wielded, the way he gained so much power, might have come from the general resentment of the populace, the German populace.
John Mearsheimer
(00:28:41)
I think that as a result of defeat in World War I and all the trials and tribulations associated with Weimar Germany, and then the coming of the Great Depression, all of those factors definitely account for his coming to power. I think that one of the reasons that he was so successful at winning over the German people once he came to power was because there was a great deal of resentment in the German body politic. And he played on that resentment, that surely helped him get elected too. But I think having studied the case, it was even more important once he took over.

(00:29:32)
I also believe that one of the principal reasons that he was so popular and he was wildly popular inside Nazi Germany is because he was the only leader of an industrialized country who pulled his country out of the depression. And that really mattered, and it made him very effective. It’s also worth noting that he was a remarkably charismatic individual. I find that hard to believe because every time I look at him or listen to his speeches, he does not appear to be charismatic to me. But I’ve talked to a number of people who are experts on this subject who assure me that he was very charismatic. And I would note to you, if you look at public opinion polls in Germany, West Germany, in the late 1940s, this is the late 1940s after the Third Reich is destroyed in 1945, he is still remarkably popular in the polls.
Lex Fridman
(00:30:31)
Stalin is still popular in many parts of Eastern Europe.
John Mearsheimer
(00:30:36)
Yeah, yeah. And Stalin’s popular in many quarters inside Russia, and Stalin murdered more of his own people than he murdered people outside of the Soviet Union.
Lex Fridman
(00:30:50)
And still, to you, the tides of history turn not on individuals but on structural considerations. So Hitler may be a surface-layer characteristic of how Germany started the war, but not really the reason.
John Mearsheimer
(00:31:09)
Well, history is a multidimensional phenomenon-
Lex Fridman
(00:31:14)
So I hear.
John Mearsheimer
(00:31:15)
… and we’re talking about interstate relations here, and realism is a theory about how states interact with each other, and there are many other dimensions to international politics. And if you’re talking about someone like Adolf Hitler, why did he start World War II is a very different question than why did he start the Holocaust, or why did he push forward the Holocaust. I mean, that’s a different question, and realism doesn’t answer that question. So I want to be very clear that I’m not someone who argues that realism answers every question about international politics, but it does answer what is one of the big, if not the biggest, questions that IR scholars care about, which is what causes security competition and what causes great power war.
Lex Fridman
(00:32:10)
Does offensive realism answer the question why Hitler attacked the Soviet Union?
John Mearsheimer
(00:32:17)
Yes.
Lex Fridman
(00:32:18)
Because from a military strategy perspective, there’s pros and cons to that decision.
John Mearsheimer
(00:32:25)
Pros and cons to every decision. The question is, did he think that he could win a quick and decisive victory? And he did, as did his generals. It’s very interesting, I’ve spent a lot of time studying German decision making in World War II. If you look at the German decision to invade Poland on September 1st, 1939, and you look at the German decision to invade France on May 10th, 1940, and then the Soviet Union on June 22nd, 1941, what you see is that there was actually quite a bit of resistance to Hitler in 1938 at the time of Czechoslovakia, Munich, and there was also quite a bit of resistance in September 1939.
Lex Fridman
(00:33:13)
Internally? Or you mean…
John Mearsheimer
(00:33:14)
Internally, internally. For sure. Yeah. People had doubts. They didn’t think the Wehrmacht was ready, and given the fact that World War I had just ended about 20 years before, the thought of starting another European war was not especially attractive to lots of German policy makers, including military leaders. And then came France 1940. In the run-up to May 10th, 1940, there was huge resistance in the German army to attacking France. But that was eventually eliminated because they came up with a clever plan, the Manstein Plan. If you look at the decision to invade the Soviet Union on June 22nd, 1941, which is the only case where they fail… They succeeded in France, they succeeded in Poland, they succeeded at Munich in 1938. Soviet Union is where they fail. There’s hardly any resistance at all, right?
Lex Fridman
(00:34:20)
Yeah. Well, to say that they failed in the Soviet Union, my grandfather fought for the Soviet Union, there were a lot of successes early on. So there were poor military, I would say, strategic decisions along the way, but it caught Stalin off guard. Maybe you can correct me, but from my perspective, terrifyingly so, they could have been successful if certain different decisions had been made from a military perspective.
John Mearsheimer
(00:34:54)
Yeah. I’ve always had the sense they came terrifyingly close to winning. You can make the opposite argument that they were doomed-
John Mearsheimer
(00:35:03)
But I’m not terribly comfortable making that argument. I think the Wehrmacht, by the summer of 1941, was a finely tuned instrument for war, and the Red Army was in quite terrible shape. Stalin had purged the Officer Corps, they had performed poorly in Finland, and there were all sorts of reasons to think that they were no match for the Wehrmacht.

(00:35:36)
And if you look at what happened in the initial stages of the conflict, that proved to be the case. The Germans won a lot of significant tactical victories early on.
Lex Fridman
(00:35:49)
And if they had focused and gone to Moscow as quickly as possible, again, terrifyingly so, they could have basically toppled Stalin. And one thing that’s-
John Mearsheimer
(00:36:03)
That’s possible.
Lex Fridman
(00:36:04)
That’s possible.
John Mearsheimer
(00:36:05)
Fortunately, we’re not going to run the experiment again, but one could argue that, had they concentrated as the generals wanted to do on going straight for Moscow, they would’ve won. I mean, what Hitler wanted to do is, he wanted to go into the Ukraine. I mean, Hitler thought that the main axis… There were three axes. The northern axis went towards Leningrad, the central axis, of course, went to Moscow, and then the southern axis, Army Group South, headed towards Ukraine and deep into the Caucasus.

(00:36:39)
And Hitler believed that that should have been the main axis. And in fact, in 1942, the Soviets, excuse me, the Germans go back on the offensive in 1942. This is Operation Blue, and the main axis in ’42 is deep into the Ukraine and into the Caucasus, and that fails.

(00:37:01)
But one could argue that, had they done that in ’41, had they not gone to Moscow but concentrated on going deep into Ukraine and into the Caucasus, they could have knocked the Soviets out that way. I’m not sure that in the end I believe that. I think in the end the Soviets would’ve won no matter what, but I’m not a hundred percent sure of that.
Lex Fridman
(00:37:28)
Sometimes, maybe you can educate me, but sometimes they say, just like with Napoleon, winter defeated Hitler in Russia. I think not often enough people tell the story of the soldiers and their motivation and how hard they fight. So it turns out that Ukrainians and Russians are not easy to conquer. They’re the kinds of people that don’t roll over and that fight bravely. There seems to be a difference in certain peoples, in how they see war, how they approach war, how proud they are to fight for their country, to die for their country, these kinds of things. So I think the Battle of Stalingrad tells, at least to me, a story of extremely brave fighting on the Soviet side, and that’s a component of war too. It’s not just structural, it’s not just military strategy, it’s also the humans involved, but maybe that’s a romantic notion of war.
John Mearsheimer
(00:38:33)
No, I think there’s a great deal of truth in that, but let’s just unpack it a bit in the case of the Soviet Union in World War II. The counterargument to that is that in World War I, the Russian Army disintegrated. And if you look at what happened when Napoleon invaded in 1812, and you look at what happened in 1917, and then you look at what happened between ’41 and ’45, the Napoleon case looks a lot like the Hitler case, and it fits neatly with your argument.

(00:39:14)
But World War I does not fit neatly with your argument because the Russians lost and surrendered, and you had the infamous Treaty of Brest-Litovsk, where the Soviet Union, because it went from Russia to the Soviet Union in October 1917, surrendered large amounts of territory because it had suffered a humiliating defeat.

(00:39:38)
My argument for why the Russians, let me take that back, why the Soviets fought like wild dogs in World War II is that they were up against a genocidal adversary. You want to understand that the Germans murdered huge numbers of Soviet POWs. The overall total was 3.7 million. And by December, December of 1941, remember the invasion is June ’41, by December of 1941, the Germans have murdered 2 million Soviet POWs. At that point in time, they had murdered many more POWs than they had murdered Jews.

(00:40:20)
And this is not to deny for one second that they were on a murderous rampage when it came to Jews, but they were also on a murderous rampage when it came to Soviet citizens and Soviet soldiers. So those Soviet soldiers quickly came to understand they were fighting for their lives. If they were taken prisoner, they would die. So they fought like wild dogs.
Lex Fridman
(00:40:48)
Yeah, the story of the Holocaust, of the 6 million Jews, is often told extensively. If Hitler won, conquered the Soviet Union, it’s terrifying to think, on a much grander scale than the Holocaust, what would’ve happened to the Slavic people, to the Soviet people.
John Mearsheimer
(00:41:08)
Absolutely. All you have to do is read the Hunger Plan, right? And they also had a plan, what was it called? General Plan East, I forget the exact name of it, which made it clear that they were going to murder many tens of millions of people. And by the way, I believe that they would’ve murdered all the Poles and all the Roma. I mean, my view is that the Jews were number one on the genocidal hit list. The Roma, or the gypsies, were number two, and the Poles were number three.

(00:41:42)
And of course, I just explained to you how many POWs they had killed. So they would’ve ended up murdering huge numbers of Soviet citizens as well. But people quickly figured out that this was happening, that’s my point to you. And that gave them, needless to say, very powerful incentives to fight hard against the Germans, and to make sure that they did not win.

Russia and Ukraine

Lex Fridman
(00:42:09)
To fast-forward in time, but not in space, let me ask you about the war in Ukraine. Why did Russia invade Ukraine on February 24th, 2022? What are some of the explanations given? And which do you find the most convincing?
John Mearsheimer
(00:42:33)
Well, clearly, the conventional wisdom is that Putin is principally responsible. Putin is an imperialist, he’s an expansionist.
Lex Fridman
(00:42:43)
That’s the conventional thinking.
John Mearsheimer
(00:42:44)
Yeah, yeah. And the idea is that he is bent on creating a greater Russia, and even more, so he’s interested in dominating Eastern Europe, if not all of Europe, and that Ukraine was the first stop on the train line. And what he wanted to do was to conquer all of Ukraine, incorporate it into a greater Russia, and then he would move on and conquer other countries. This is the conventional wisdom. My view is there is no evidence, let me emphasize, zero evidence, to support that argument.
Lex Fridman
(00:43:26)
Which part? That he would… The imperialist part, the sense that he sought to conquer all of Ukraine, and move on and conquer-
John Mearsheimer
(00:43:36)
There’s no evidence he was interested in conquering all of Ukraine. There was no evidence beforehand that he was interested in conquering any of Ukraine. And there’s no way that an army that had 190,000 troops, at the most, could have conquered all of Ukraine, it’s just impossible.

(00:43:59)
As I like to emphasize, when the Germans went into Poland in 1939, and the Germans, you want to remember, were only intent on conquering the western half of Poland, because the Soviets, who came in later that month, were going to conquer the eastern half of Poland. So the western half of Poland is much smaller than Ukraine, and the Germans went in with 1.5 million troops. If Vladimir Putin were bent on conquering all of Ukraine, he would’ve needed at least 2 million troops. I would argue he’d need 3 million troops, because not only did he need to conquer the country, you then have to occupy it.

(00:44:44)
But the idea that 190,000 troops was sufficient for conquering all of Ukraine, it’s not a serious argument. Furthermore, he was not interested in conquering Ukraine, and that’s why, in March 2022, this is immediately after the war starts, he is negotiating with Zelensky to end the war. There are serious negotiations taking place in Istanbul involving the Turks. And Naftali Bennett, who was the Israeli prime minister at the time, was deeply involved in negotiating with both Putin and Zelensky to end the war.

(00:45:22)
Well, if he was interested, Putin, in conquering all of Ukraine, why in God’s name would he be negotiating with Zelensky to end the war? And of course, what they were negotiating about was NATO expansion into Ukraine, which was the principal cause of the war. People in the West don’t want to hear that argument because if it is true, which it is, then the West is principally responsible for this bloodbath that’s now taking place. And of course, the West doesn’t want to be principally responsible. It wants to blame Vladimir Putin.

(00:45:59)
So we’ve invented this story out of whole cloth that he is an aggressor, that he’s the second coming of Adolf Hitler, and that what he did in Ukraine was try to conquer all of it and he failed. But with a little bit of luck, he probably would’ve conquered all of it, and he’d now be in the Baltic States, and eventually end up dominating all of Eastern Europe. As I said, I think there’s no evidence to support this.
Lex Fridman
(00:46:28)
So maybe there’s a lot of things to ask there. Maybe just to linger on NATO expansion, what is NATO expansion? What is the threat of NATO expansion and why is this such a concern for Russia?
John Mearsheimer
(00:46:42)
NATO was a mortal enemy of the Soviet Union during the Cold War. It’s a military alliance which has at its heart the United States of America, which is the most powerful state on the planet. It is perfectly understandable that Russia is not going to want that military alliance on its doorstep.

(00:47:08)
Here in the United States we have, as you well know, what’s called the Monroe Doctrine, and that basically says no great powers from Europe or Asia are allowed to come into our neighborhood and form a military alliance with anybody in this neighborhood. When I was young, there was this thing called the Cuban Missile Crisis. The Soviets had the audacity to put nuclear armed missiles in Cuba. We told them in no uncertain terms that that was not acceptable, and that those missiles had to be removed. This is our backyard and we do not tolerate distant great powers coming into our neighborhood.

(00:47:45)
Well, what’s good for the goose is good for the gander. And if we don’t like great powers coming into our neighborhood, it’s hardly surprising that the Russians did not want NATO on their doorstep. They made that manifestly clear when the Cold War ended, and they exacted a promise from us that we would not expand NATO. And then when we started expanding NATO, they made it clear, after the first tranche in 1999, that they were profoundly unhappy with that. They made it clear in 2004, after the second tranche, that they were profoundly unhappy with that expansion.

(00:48:29)
And then, in April 2008, when NATO announced that Ukraine and Georgia would become part of NATO, they made it unequivocally clear, not just Putin, that that was not going to happen. They were drawing a red line in the sand. And it is no accident that in August 2008, remember, the Bucharest Summit is April 2008, in August 2008 you had a war between Georgia and Russia, and that involved, at its core, NATO expansion.

(00:49:02)
So the Americans and their allies should have understood by at least August 2008 that continuing to push to bring Ukraine into NATO was going to lead to disaster. And I would note that there were all sorts of people in the 1990s, like George Kennan, William Perry, who was Bill Clinton’s Secretary of Defense, the Chairman of the Joint Chiefs of Staff, Paul Nitze, and so forth and so on, who argued that NATO expansion would end up producing a disaster, which it has.

(00:49:38)
I would note that at the famous April 2008 Bucharest Summit, where NATO said that Ukraine would be brought into the alliance, Angela Merkel and Nicolas Sarkozy, the German and French leaders respectively, opposed that decision. Angela Merkel later said that the reason she opposed it was because she understood that Putin would interpret it as a declaration of war. Just think about that. Merkel is telling you that she opposed NATO expansion into Ukraine, because she understood, correctly, that Putin would see it as a declaration of war.

(00:50:22)
What did the United States and its friends in Europe do? They continued to push and push, because we thought that we could push NATO expansion down their throat after 2008, the same way we did in 1999 and 2004, but we were wrong, and it all blew up in our face in 2014. And when it blew up in our face in 2014, what did we do? Did we back off and say, “Well, maybe the Russians have some legitimate security interests”? No, that’s not the way we operate. We continued to double down.

(00:50:57)
And the end result is that in 2022, you got a war. And as I’ve argued for a long time now, we, the West, are principally responsible for that, not Vladimir Putin.
Lex Fridman
(00:51:11)
So the expansion of NATO is primarily responsible for that.
John Mearsheimer
(00:51:15)
Yeah. To put it in more general terms, what we were trying to do was turn Ukraine into a Western bulwark on Russia’s border, and it really wasn’t NATO expansion alone. NATO expansion was the most important element of our strategy, but the strategy had two other dimensions. One was EU expansion, and the third was the color revolution. We were trying to foment an Orange Revolution in Ukraine, and the basic goal there was to turn Ukraine into a pro-Western, liberal democracy.

(00:51:52)
And that meant that you’d have Ukraine, if it worked, as a pro-Western liberal democracy that was in the EU, and that was in NATO. This was our goal. And the Russians made it unequivocally clear Ukraine was not going to become a Western bulwark on their border, and most importantly, they made it clear that Ukraine in NATO was unacceptable.
Lex Fridman
(00:52:19)
Can we talk about the mind of Vladimir Putin? You’ve mentioned that this idea that he has aspirations for imperialist conquest, that he dreams of empire, is not grounded in reality. He wrote an essay in 2021 about Russians and Ukrainians being one people. Do you think there is some degree to which he still dreams of the former Soviet Union reuniting?
John Mearsheimer
(00:52:50)
No, he’s made it clear, and anybody with a triple digit IQ understands, that it’s nuts to think about recreating the Soviet Union. He thinks it’s a tragedy that the Soviet Union fell apart, but as he made clear in that essay, the July 12th, 2021 essay, and as he made clear in speeches immediately before he invaded Ukraine, he accepted the breakup of the Soviet Union, and he accepted the status quo in Europe, save for the fact that he did not accept the idea that Ukraine would become part of NATO.
Lex Fridman
(00:53:33)
He’s been in power for over two decades. Is there a degree to which power can affect a leader’s ability to see the world clearly, can, as they say, corrupt? Do you think power has corrupted Vladimir Putin, to a degree?
John Mearsheimer
(00:53:52)
It’s very hard for me to answer that question because I don’t know him, and I’ve not studied him carefully in terms of his overall performance over the course of the 23 years that he’s been in power. I’ve studied him as a strategist, and I’ve studied how he deals with the West, and deals with the international system more generally since 2014. And I think he is a first class strategist.

(00:54:31)
This is not to say he doesn’t make mistakes, and he admits he’s made some mistakes, but I think that the West is dealing with a formidable adversary here. And I don’t see any evidence that he’s either lost speed off his fastball, or that power has corrupted his thinking about strategic affairs.
Lex Fridman
(00:54:59)
So he has consistently put security as a primary concern? As does the United States, he’s put Russia’s security first, making sure that NATO doesn’t get close to its borders?
John Mearsheimer
(00:55:12)
I think that’s clear. Yeah, I think, as I emphasized early on in our conversation, leaders privilege security or survival over everything else. And by the way, he gave a number of talks and press conferences in addition to writing that famous article that you referred to, on July 12th, 2021. So we have a pretty clear record of what he was saying, and I would argue what he was thinking, in the run-up to the war in February 2022.

(00:55:50)
And if you read what he said, it’s quite clear that he privileged security or survival. He was deeply concerned about the security of Russia. And Russia is a quite vulnerable state in a lot of ways, especially if you think back to what it looked like in the 1990s, as you know better than I do. It was in terrible shape. The Chinese talk about the century of national humiliation. One could argue that for the Russians, that was the decade of national humiliation. And it took Putin, I think, quite a bit of time to bring the Russians back from the dead. I think he eventually succeeded, but it took a considerable amount of time, and I think he understood that he was not playing a particularly strong hand. He was playing something of a weak hand, and he had to be very careful, very cautious, and I think he was. And I think that’s very different than the United States. The United States was the unipole. It was the most powerful state in the history of the world, the most powerful state relative to all its possible competitors. From roughly 1989, certainly after December 1991, when the Soviet Union fell apart, up until, I would argue, about 2017, we were incredibly powerful. And even after 2017, up to today, the United States remains the most powerful state in the system.

(00:57:18)
And because of our geographical location, we are in a terrific situation to survive in any great power competition. So you have a situation involving the United States that’s different than the situation involving Russia. They’re just much more vulnerable than we are. And therefore, I think Putin tends to be more sensitive about security than any American president in recent times.
Lex Fridman
(00:57:51)
Europe on one side, China on the other side. It’s a complicated situation.
John Mearsheimer
(00:57:56)
Yeah. And we talked before about 1812, when Napoleon invaded and Moscow got burned to the ground. We talked about World War I, where the Russians were actually defeated and surrendered, and then we talked about 1941 to 1945, where, although thankfully the Soviets prevailed, it was a close call. And I mean, the casualties, the destruction inflicted on the Soviet Union by the Germans, is just almost hard to believe. So they are sensitive.

(00:58:38)
You can understand full well, or at least you should be able to understand full well, why the idea of bringing Ukraine up to their border really spooked them. I don’t understand why more Americans don’t understand that, it befuddles me. I think it has to do with the fact that Americans are not very good at putting themselves in the shoes of other countries. And you really, if you’re going to be a first class strategist in international politics, you have to be able to do that. You have to put yourself in the shoes of the other side and think about how they think, so you don’t make foolish mistakes.
Lex Fridman
(00:59:16)
And as a starting point, Americans tend to see themselves as the good guys and a set of others as the bad guys. And you have to be able to empathize that Russians think of themselves as the good guys, the Chinese think of themselves as the good guys, and just be able to empathize. If they are the good guys… It’s like that funny skit, “Are we the baddies?” Consider that the United States could be the bad guys.

(00:59:44)
First of all, see the world, if the United States is the bad guys and China is the good guys, what does that world look like? Be able to just exist with that thought, because that is what the Chinese leadership and many Chinese citizens, if not now, maybe in the future, will believe. And you have to kind of do the calculation, the simulation forward from that. And same with Russia, same with other nations.
John Mearsheimer
(01:00:12)
Yeah, I agree with you, a hundred percent. And just, I always think of Michael McFaul at Stanford, who was the American ambassador to Russia, I think between 2012 and 2014. And he told me that he told Putin that Putin didn’t have to worry about NATO expansion because the United States was a benign hegemon.

(01:00:36)
And I asked Mike what Putin’s response was to that. And Mike said that Putin didn’t believe it, but Mike believed that he should believe it, and that we could move NATO eastward to include Ukraine, and in the end, we’d get away with it because we are a benign hegemon. But the fact is, that’s not what Putin saw. Putin saw us as a malign hegemon. And what Mike thinks, or any American thinks, doesn’t matter. What matters is what Putin thinks.
Lex Fridman
(01:01:15)
But also, the drums of war have been beating for some reason. NATO expansion has been threatened for some reason. So you’ve talked about NATO expansion being dead, so it doesn’t make sense from a geopolitical perspective, on the Europe side, to expand NATO. But nevertheless, that threat has been echoed. So why has NATO expansion been pushed, from your perspective?
John Mearsheimer
(01:01:46)
There are two reasons. One is, first of all, we thought it was a wonderful thing to bring more and more countries into NATO. We thought that it facilitated peace and prosperity. It was ultimately all for the good. And we also thought that countries like Ukraine had a right to join NATO.

(01:02:12)
These are sovereign countries that can decide for themselves, and the Russians have no say in what Ukraine wants to do. And then finally, and this is a point I emphasized before, we were very powerful, and we thought we could shove it down their throat. So it’s a combination of those factors that led us to pursue what I think was ultimately a foolish policy.
Lex Fridman
(01:02:39)
We’ve talked about how wars get started. How do you hope the war in Ukraine ends? What are the ways to end this war? What are the ways to achieve peace there? To end the, I would say, senseless death of young men, as always happens in war?
John Mearsheimer
(01:03:04)
I’m sad to say I don’t have a good answer to that. I don’t think there’s any real prospect of a meaningful peace agreement. I think it’s almost impossible. I think the best you can hope for at this point is, at some point the shooting stops, you have a ceasefire, and then you have a frozen conflict. And that frozen conflict will not be highly stable.

(01:03:36)
And the Ukrainians and the West will do everything they can to weaken Russia’s position, and the Russians will go to great lengths not only to damage that dysfunctional rump state that Ukraine becomes, but also to sow dissension within the alliance. And that includes in terms of transatlantic relations.

(01:04:03)
So you’ll have this continuing security competition between Russia on one side, and Ukraine and the West on the other. Even when you get a frozen peace, or a frozen conflict, the potential for escalation will be great. So I think this is a disaster.
Lex Fridman
(01:04:25)
That’s a very realist perspective. Let me ask you sort of the human side of it. Do you think there’s some power to leaders sitting down, having a conversation, man to man, leader to leader, about this? There is just a lot of death happening. It seems that, from an economic perspective, from a historic perspective, from a human perspective, both nations are losing.

(01:04:55)
Is it possible for Volodymyr Zelensky and Vladimir Putin to sit down and talk, and to figure out a way where the security concerns are addressed, and both nations can minimize the amount of suffering that’s happening, and create a path towards future flourishing?
John Mearsheimer
(01:05:21)
I think the answer is no.
Lex Fridman
(01:05:23)
Even with the United States involved, three people in the room?
John Mearsheimer
(01:05:27)
Well, I think if the United States is involved, the answer is definitely no. You have to get the Americans out. And then, I think if you have Zelensky and Putin talking, you have a sliver of a chance there. The Americans are a real problem. Look, let’s go back to what happens right after the war starts, okay? As I said before, we’re talking March, early April of 2022. The war starts on February 24th, 2022.

(01:05:59)
And as I said to you, the two sides were negotiating in Istanbul, and they were also negotiating through Naftali Bennett, and the Bennett track and the Turkish track were operating together. I mean, they were not at cross purposes at all. What happened? Bennett tells the story very clearly that they had made significant progress in reaching an agreement. This is Zelensky on one side and Putin on the other. Bennett is talking in person to both Putin and Zelensky, and what happens to produce failure?

(01:06:45)
The answer is, the United States and Britain get involved and tell Zelensky to walk. They tell Zelensky to walk. If they had come in and encouraged Zelensky to try to figure out a way with Putin to shut this one down, and worked with Bennett, and worked with Erdogan, we might’ve been able to shut the war down then, but it was the United States.
Lex Fridman
(01:07:09)
Well, let me sort of push back on that. You’re correct, but the United States paints this picture that everybody’s aligned. Maybe you can correct me, but I believe in the power of individuals, especially individual leaders. Again, whether it’s Biden or Trump or whoever, goes into a room and says, in a way that’s convincing, that there will be no more NATO expansion. And actually, just on a basic human level, asks the question of why we are doing all this senseless killing.

(01:07:49)
And look at the interest of one, Russia, look at the interest of the other, Ukraine. Their interests are pretty simple. And say, the United States is going to stay out of this. We’re not going to expand NATO, and say all that in a way that’s convincing, which is that NATO expansion is silly at this point, China’s the big threat. We’re not going to do this kind of conflict escalation with Russia. The Cold War’s over, let’s normalize relations.
John Mearsheimer
(01:08:20)
Let me just embellish your argument, okay?
Lex Fridman
(01:08:23)
Thank you. I need it.
John Mearsheimer
(01:08:26)
Let’s say there’s a sliver of a chance that you can do this, and I do think there is a sliver of a chance. Let me just embellish your point.
Lex Fridman
(01:08:34)
Thank you. I need all the help I can get.
John Mearsheimer
(01:08:37)
Two things have to be done here, in my opinion. One is, Ukraine has to become neutral, and it has to completely sever all security ties with the West, right? It is not like you can say, “We’re not going to expand NATO to include Ukraine, but we’re going to continue to have some loose security arrangement with Ukraine.” None of that. It has to be completely severed. Ukraine has to be on its own, okay?

(01:09:13)
And number two, Ukraine has to accept the fact that the Russians are going to keep the four oblasts that they’ve now annexed, and Crimea. The Russians are not going to give them back. And what you really want to do, if you’re Zelensky or whoever’s running Ukraine in this scenario that we’re positing, is you want to make sure the Russians don’t take another four oblasts, to include Kharkiv and Odessa.

(01:09:45)
If I’m playing Putin’s hand and this war goes on, I’m thinking about taking four more oblasts. I want to take about 43% of Ukraine and annex it to Russia, and I certainly want Odessa, and I certainly want Kharkiv, and I want the two oblasts-
John Mearsheimer
(01:10:03)
And I certainly want Kharkiv, and I want the two oblasts in between as well.
Lex Fridman
(01:10:05)
Literally, or as leveraged in negotiation or Ukraine neutrality?
John Mearsheimer
(01:10:12)
No, I want them literally, I want to conquer them literally. My point to you is, if we can begin to talk about cutting a deal now, you may be able to head that kind of aggression off at the pass. In other words, you may be able to limit Putin and Russia to annexing the four oblasts that they’ve now annexed, plus Crimea. That’s the best I think you can hope for. The point is you have to get the Ukrainians to accept that. You have to get the Ukrainians to accept becoming a truly neutral state and conceding that the Russians keep a big chunk of territory. It’s about 23% of Ukrainian territory that they’ve annexed, and I find it hard to imagine any Ukrainian leader agreeing to that.
Lex Fridman
(01:11:03)
Well, there could be more nuanced things like no military involvement between the United States and Ukraine, but economic involvement, sort of financial support, so normalizing economic relationships with Ukraine, with Russia, all being-
John Mearsheimer
(01:11:21)
I think you could probably get away with that. I think the tricky question there that you would have to answer is what about EU expansion? And I think EU expansion is probably a no-no for the Russians because most people don’t recognize this, but there is a military dimension built into EU expansion. It’s not purely an economic alliance or relationship or institution, whatever word you want to use. There’s a military dimension to that. In the run-up to the war, actually in the run-up to the 2014 crisis, when it first broke out, the Russians made it clear they saw EU expansion as a stalking horse for NATO expansion.

(01:12:10)
So EU expansion is tricky, but I think your point of close economic relations between … or healthy economic relations to use a better term between Ukraine and the West is possible. I think the Russians have a vested interest and if it’s a neutral Ukraine, they have a vested interest in that Ukraine flourishing, but that then brings us back to the territorial issue, right?
Lex Fridman
(01:12:39)
Well, so do you believe it’s possible for individual human relations to counteract the structural forces that you talk about? So meaning the leaders being able to pick up the phone and make agreements that are good for humanity as a whole and for their individual nations in the long term?
John Mearsheimer
(01:12:59)
I think leadership matters here. I mean, one of the real problems here is that there’s no trust on the Russian side, and that has to do with the Minsk agreements. The Minsk agreements, which were designed to shut down the civil war in eastern Ukraine, in the Donbas, really mattered to the Russians. And there were four players involved in the Minsk process, four main players: Russia and Ukraine, of course, and then Germany and France. And I believe the Russians took the Minsk Accords seriously. I believe Putin took them very seriously. He wanted to shut down that conflict.

(01:13:52)
And Angela Merkel, François Hollande, who was the French leader, and Poroshenko, who was the Ukrainian leader, those were the three key players besides Putin. Again, Hollande from France, Merkel from Germany, and Poroshenko from Ukraine have all explicitly said they were not seriously interested in reaching an agreement. In all of the discussions with Putin, they were bamboozling him. They were trying to trick him so that they would buy time to build up Ukraine’s military. Putin is profoundly upset about these admissions by these three leaders. He believes he was fooled into thinking that Minsk could work. He believes that he negotiated in good faith and they did not.

(01:14:49)
And he believes that the level of trust now between Russia and the West is virtually zero as a result of this experience over Minsk. I only bring this up because it cuts against your argument that leaders could pick up the phone and talk to each other and trust each other at least somewhat to work out a meaningful deal. If you’re Putin at this point in time, trusting the West is not an idea that’s going to be very attractive at all. In fact, you’re going to distrust anything they say.
Lex Fridman
(01:15:30)
Yeah, distrust anything the West says, but there are individual humans. The way human nature works is that when you’re sitting across from a person, you can trust a human being while still distrusting the West. I mean, I believe in the power of that. I think with the right leaders, you could sit down and talk, override the general structural distrust of the West and say, “You know what? I like this guy or gal, whatever.” I do hope Zelensky and Putin sit down together and talk, have multiple talks.
John Mearsheimer
(01:16:08)
Just remember they were doing that in March and the Americans came in and the British came in and they scotched a potential deal.
Lex Fridman
(01:16:17)
Well, the other beautiful thing about human nature, there’s forgiveness and there’s trying again.
John Mearsheimer
(01:16:25)
When you’re the leader of a country in an anarchic system, you have to be very careful not to let your trust in a foreign leader take you too far, because if that foreign leader betrays you, betrays your trust and stabs you in the back, you could die. And again, you want to remember that the principal responsibility of any leader, I don’t care what country it is, is to ensure the survival of their state. And that means that trust is only going to buy you so much, and when you’ve already betrayed the trust of a leader, you really are not going to be able to rely on trust very much to help you moving forward. Now, you disagree with that? I hope you’re right.

(01:17:17)
And if they can shut down the Ukraine-Russia war, it would be wonderful. If I’m proved dead wrong, that would be wonderful news. My prediction that this war is going to go on for a long time and end in an ugly way is a prediction that I don’t like at all. So I hope I’m wrong.
Lex Fridman
(01:17:45)
You wrote that many in the West believe that the best hope for ending the Ukraine war is to remove Vladimir Putin from power, but you argue that this isn’t the case. Can you explain?
John Mearsheimer
(01:17:58)
Well, a lot of people thought when they were having all that trouble, the Russians were having all that trouble with Prigozhin and the Wagner Group that Putin was vulnerable and was likely to be overthrown. And what would happen is a peace-loving leader would replace Putin. I made two points at the time, and I would make those same two points now. Number one, he’s not likely to be overthrown. He was not likely then to be overthrown. And I think as long as his health holds up, I think he will remain in power. My second point is if he doesn’t remain in power and he’s replaced, I would bet a lot of money that his replacement will be more hawkish and more hard line than Putin is.

(01:18:58)
I actually think one could argue that Putin was too trusting of the West before the war started, and number two, I think one could argue that he has not waged the war against Ukraine as vigorously as one might have expected. He was slow to mobilize the nation for war, and he has pursued a limited war in all sorts of ways. The Israelis, for example, have killed more civilians in Gaza in one month than the Russians have killed over 18 months in Ukraine. The idea that Vladimir Putin is waging a punishment campaign and purposely killing large numbers of civilians is simply not true.

(01:19:53)
All this just to say that … I would imagine that if Putin leaves office and someone else comes in to replace him, that someone else will be at least as hardline as him, if not more so, in terms of waging the war, and certainly will not trust the West any more than he has.
Lex Fridman
(01:20:15)
By way of advice, let me ask you, if I were to have a conversation, an interview, with Vladimir Putin and Zelensky individually, what should I ask them? If you, me and Vladimir Putin are having a chat, what are good ideas to explore? What are good questions to ask? What are good things to say, on or off the mic, that could, once again, potentially even slightly lessen the amount of suffering in the world caused by this war?
John Mearsheimer
(01:20:51)
I think if you get an interview with Vladimir Putin, there’s just all sorts of questions you could ask him. And my sense is that Putin is a straight shooter. He’s also very knowledgeable about history, and he has simple theories in his head about how the world works. I think he would level with you, and all you’d have to do is just figure out what all the right questions are. That would not be hard to do. You could ask him why was he so foolish? For example, why was he so foolish as to trust Poroshenko, Hollande and Merkel in the Minsk Accords? Why, after his famous talk at Munich in 2007, where he made it clear that he was so unhappy with the West, did he continue to, in a very important way, trust the West?

(01:21:52)
Why didn’t he mobilize the Russian military before late September 2022? Once the negotiations that we were talking about before, involving Istanbul and Naftali Bennett, broke down, why didn’t he immediately mobilize more of the Russian population to fight the war? Just all sorts of questions like that. Then, you could ask him questions about where he sees this one headed. What’s the best strategy for Russia if the Ukrainians will not agree to neutrality? People like John Mearsheimer say, “You’ll probably take close to half of Ukraine. Is that true? Does it make sense to take Odessa?”
Lex Fridman
(01:22:47)
And John Mearsheimer also has questions about China, your future relationships with China?
John Mearsheimer
(01:22:53)
Yeah, I mean, one really important question that I would ask him is if the United States had basically not driven you into the arms of the Chinese, if there had been no war over Ukraine and the United States and its European allies had gone to considerable lengths to create some sort of security architecture in Europe that resulted in you, Vladimir Putin having good relations with Ukraine, what would your relations with China be and how would you think about that? So there are just plenty of questions you could ask him.
Lex Fridman
(01:23:33)
Well, hope burns eternal in my heart, and I think probably in Putin’s heart and Zelensky’s heart too, I hope. Because hope, the leap of trust that we’ve talked about, I think is necessary for de-escalation and for peace.
John Mearsheimer
(01:23:50)
Well, you realize, I have, from the beginning, argued for different policies that were all designed to prevent this war from ever happening.
Lex Fridman
(01:23:59)
Yes.
John Mearsheimer
(01:24:00)
I don’t know if you know this, but in 1993, I argued that Ukraine should keep its nuclear weapons. I was probably the only person in the West who made that argument. And my argument in 1993, this is in Foreign Affairs, was that there may come the day when Russia thinks about invading Ukraine. And should that day come, it would be very helpful for preventing war if Ukraine had nuclear weapons.
Lex Fridman
(01:24:27)
So military might is essential for maintaining a balance of power and peace.
John Mearsheimer
(01:24:33)
Well, if you’re interested in deterring an adversary, if I’m worried about you coming after me, the best way to deter you is to have military might. If you’re Russia, and I’m Ukraine, I’m far weaker than you, right?
Lex Fridman
(01:24:46)
Yeah.
John Mearsheimer
(01:24:47)
And having a nuclear deterrent would be very effective at convincing you not to attack me, because if you attack me, you’re threatening my survival. And that’s the one circumstance where it is likely that I would use nuclear weapons to defend myself, and given the consequences of nuclear use, you would be reluctant in the extreme to attack me. So that’s why I argued in ’93 that if Ukraine kept its nuclear weapons, that would make war down the road much less likely. And I believe I was correct. And in fact, Bill Clinton, who played the key role in forcing Ukraine to give up its nuclear weapons, now says, he has said it publicly, you can find it on YouTube, that he made a mistake doing that.

(01:25:36)
Furthermore, I argued in 2014 that it made eminently good sense not to continue to push to bring Ukraine into NATO, because the end result is that Ukraine would be destroyed, and Ukraine is being destroyed. So I was deeply interested at the time in making sure that that didn’t happen, for the good of the Ukrainians, not to mention because stability in Europe is a net positive for almost everybody involved, but people did not listen to me then either.
Lex Fridman
(01:26:08)
How did nuclear weapons change the calculus of offensive realism, because of mutually assured destruction? I mean, it’s not just military might. It’s just so destructive that you basically can’t use nuclear weapons unless you want complete destruction.
John Mearsheimer
(01:26:28)
There’s no question that the presence of nuclear weapons makes it much less likely, I’m choosing my words carefully here, much less likely that a great power would aggress against another great power. It doesn’t take that possibility off the table, but it makes it much less likely because of the reasons that you articulated. With regard to nuclear use, it’s an interesting question how you think about nuclear use in a MAD world. I mean, your point that we’re in a MAD world is … that’s MAD, capital letters, as well as mad, small letters, but let’s stick to the capital letters. We’re in a world of mutual assured destruction. There’s no question that in that world, it’s unlikely that nuclear weapons would be used.

(01:27:22)
The way you use nuclear weapons in that world is you use them for manipulation of risk purposes, demonstration effect. You put both sides out on the slippery slope. Now, what exactly am I saying here? Let me talk about NATO doctrine during the Cold War. We lived in a MAD world. The United States and the Soviet Union, or the Warsaw Pact and NATO, both had an assured destruction capability. So you had mutual assured destruction. If the Warsaw Pact were to invade Western Europe, and here we’re talking about West Germany, and NATO was losing the war, we said that we would use nuclear weapons. How would we use nuclear weapons given that we were in a MAD world? The argument was that we would use a handful of nuclear weapons against the Warsaw Pact, not necessarily against their military forces.

(01:28:25)
It could be in a remote area. We would use a small number of nuclear weapons to signal to the Soviets that we were deadly serious about putting an end to their offensive, and that we were throwing both sides out on the slippery slope to oblivion. In other words, we were manipulating risk, and the last clear chance to avoid Armageddon rested with them. And then, we would tell them that if you retaliated with a handful of nuclear weapons and you didn’t cease your offensive against West Germany, we would launch another small nuclear attack. We would explode a handful more nuclear weapons, all for the purposes of showing you our resolve.

(01:29:21)
So this is the manipulation of risk strategy, and a lot of the language I just used in describing it to you is language that Thomas Schelling invented. Now fast-forward to the present. If Russia were losing in Ukraine, that’s the one scenario where I think Russia would’ve used nuclear weapons. The question is, how would Russia have used nuclear weapons? Again, we’re assuming that the Russians are losing to the Ukrainians. I believe they would’ve pursued a manipulation of risk strategy. They would’ve used four or five, three or four, who knows, nuclear weapons-
Lex Fridman
(01:29:59)
Maybe just one in a rural area that kills very few people.
John Mearsheimer
(01:30:03)
Yes, exactly, and basically, that would spook everybody. The American-
Lex Fridman
(01:30:08)
Just the mushroom cloud.
John Mearsheimer
(01:30:10)
Yeah. It’s because of the threat of escalation.
Lex Fridman
(01:30:14)
Yeah.
John Mearsheimer
(01:30:14)
Again, your point is we’re in a MAD world. I accept that. But even if you have limited nuclear use, we understand hardly anything about nuclear escalation, because thank goodness we’ve never had a nuclear war. So once you throw both sides out on the slippery slope, even if you only use one nuclear weapon in your scenario, you don’t know what the escalation dynamics look like. So everybody has a powerful incentive to put an end to the conflict right away. I might add to you that there were people who believed that we would not even initiate a manipulation of risk strategy in Europe if we were losing to the Warsaw Pact during the Cold War.

(01:31:04)
Both Henry Kissinger and Robert McNamara said after leaving office that they would not have done it. They would’ve not initiated nuclear use, even limited nuclear use. That’s what we’re talking about here. They would rather be red than dead, that was the argument.
Lex Fridman
(01:31:24)
Too risky.
John Mearsheimer
(01:31:25)
Too risky. That’s exactly right, but if they had used one nuclear weapon in your story, or three or four in my story, everybody would’ve said, “Oh my God, we’ve got to shut this one down immediately.” I only tell you this story or lay out this scenario as an answer to your question of how you use nuclear weapons in a MAD world, and this is the answer.
Lex Fridman
(01:31:53)
This is all very terrifying. Perhaps in part, it’s terrifying to me because I can see in the 21st century, China, Russia, Israel, United States using a nuclear weapon in this way, blowing it up somewhere in the middle of nowhere that kills maybe nobody, but I’m terrified of seeing the mushroom cloud and not knowing, given social media, given how fast news travels, what the escalation looks like there. Just in a matter of minutes, how the news travels and how the leaders react. It’s terrifying that this little demonstration of power, the ripple effects of it, in a matter of minutes, seconds, what that leads to because it’s human emotions.

(01:32:51)
You see the landscape of human emotions, the leaders and the populace and the way news is reported, and then the landscape of risk, as you mentioned, shifting, the world’s most intense nonlinear dynamical system, and it is just terrifying because the entirety of human civilization hangs in the balance there. And it’s like, hundreds of millions of people could be dead.
John Mearsheimer
(01:33:21)
Let’s just talk about this in the context of the Ukrainian War. If the Russians were losing, as I said before, which is not the case anymore, but in 2022, it did look like that, if the Russians are losing and they turn to nuclear weapons, the question is how do they use them? And they would use them in Ukraine, and because Ukraine has no nuclear weapons of its own, Ukraine cannot retaliate. It’s not a mutual assured destruction world. It’s a case where one side has nuclear weapons and the other doesn’t. That means that the Russians are likely to think that they can get away with using nuclear weapons in ways that would not be the case if they were attacking NATO.

(01:34:17)
And therefore, it makes nuclear use more likely. Okay. That’s point one. Point two is let’s assume that the Russians use two or three nuclear weapons in a remote area-
Lex Fridman
(01:34:27)
My palms are sweating, by the way. Just as a commentary. It’s terrifying.
John Mearsheimer
(01:34:32)
Yeah. The question then is what does the West do? Now, Macron has said, and Biden has also, I think, implicitly made this clear, “We would not retaliate with nuclear weapons if the Russians were to attack with a handful of nuclear weapons in Western Ukraine.” Then, the question is what would we do? And if you listen to David Petraeus, what David Petraeus says is that we should attack the Russian naval assets in the Black Sea and attack Russian forces in Ukraine. Well, once you do that, you have a great-power war. You have NATO versus Russia, which is another way of saying you have the United States versus Russia. We’re now in a great-power war.

(01:35:23)
They have nuclear weapons, we have nuclear weapons. They’ve used nuclear weapons. What is the happy ending here? And just to take it a step further and go back to our earlier discussion about moving NATO up to Russia’s borders, the point I made, which you’ll surely agree with, is that the Russians are very fearful when they see NATO coming up to their border. Well, here’s a case where not only has NATO come up to their border, but they’re in a war with NATO right on their border. What do the escalation dynamics look like there? You know what the answer is? Who knows? That should scare the living bejesus out of you, right?
Lex Fridman
(01:36:06)
And some of it could be, like you mentioned, unintended. There could be unintended consequences: a Russian missile misses and hits Poland. These kinds of things just escalate misunderstandings, miscommunications. Even … I mean, a nuclear weapon could be … boy, it could have been planned to go to location X, and it went to location Y, which ended up actually killing a very large number of people. I mean, the escalation that happens there just happens in a matter of minutes. And the only way to stop that is communication between leaders. And that to me is a big argument for ongoing communication.
John Mearsheimer
(01:36:52)
There’s a story that during the Cuban missile crisis, Kennedy put out the word: no aircraft under any circumstances are to penetrate Soviet airspace. He then found out a few days later that some guy hadn’t gotten the message and had penetrated in an aircraft deep into Soviet airspace.
Lex Fridman
(01:37:19)
Yeah.
John Mearsheimer
(01:37:19)
And this supports your basic point that bad things happen.
Lex Fridman
(01:37:25)
Yeah.
John Mearsheimer
(01:37:26)
And again, the overarching point here is we’ve never done this before, thankfully. Therefore, we don’t have a lot of experience as to how it plays itself out. It’s really a theoretical enterprise because there’s no empirical basis for talking about escalation in a nuclear crisis. And that, of course, is a wonderful thing.
Lex Fridman
(01:37:50)
Well, and in general, the human species as a whole, as a one-off, is a theoretical enterprise. The survival of the human species. We’ve seen empires rise and fall, but we haven’t seen the human species rise and fall. So far it’s been rising, but it’s not obvious that it doesn’t end. In fact, I think about aliens a lot, and the fact that we don’t see aliens makes me suspect it’s not so easy to survive in this complicated world of ours. Switching gears a little bit and going to a different part of the world, also engulfed in war, let me ask you about the situation in Israel. Why did Hamas attack Israel on October 7th, 2023? As you understand the situation, what was the reason that attack happened?

Israel and Palestine

John Mearsheimer
(01:38:48)
Well, I think the main reason was that you had this suffocating occupation. I think as long as the occupation persists, the Palestinians are going to resist. As you well know, this is not the first time there has been a Palestinian uprising. There was the first Intifada, there was the second Intifada, now there’s October 7th, and there are uprisings besides those three, so this is not terribly surprising. A lot of people hypothesized that this attack was due to the fact that the Israelis, the Saudis and the Americans were working together to foster another Abraham Accord and that the Palestinians would in effect be sold down the river.

(01:39:45)
I think given the fact that this was in the planning stages for probably about two years, and the Abraham Accords with regard to Saudi Arabia are a relatively new phenomenon, I don’t think that’s the main driving force here. I think the main driving force is that the Palestinians feel oppressed, as they should, and that this was a resistance move. They were resisting the Israeli occupation.
Lex Fridman
(01:40:18)
So that resistance, the attack involved killing a large number of Israeli civilians. There’s many questions asked there, but one is, do you think Hamas fully understood what the retaliation will involve from Israel and to Gaza?
John Mearsheimer
(01:40:39)
They had to understand. I mean, you had Operation Cast Lead in 2008, 2009. It started, I think right after Christmas 2008, and it ended right before President Obama took office in January 2009. And the Israelis periodically do what they call mowing the lawn where they go into Gaza and they pound the Palestinians to remind them that they’re not supposed to rise up and cause any problem. So there’s no question in my mind that the Hamas forces understood full well that the Israelis would retaliate and they would retaliate in force as they have done.
Lex Fridman
(01:41:35)
Yeah, even the metaphor of mowing the lawn is disturbing to me in many ways. I actually saw Norman Finkelstein, I think, say that, well, then if you use that metaphor, you could say that Hamas was also mowing the lawn. It’s such a horrific image because the result on either side is just the death of civilians. I mean, let me ask you about the death of civilians. So during the attack, 1,400 Israelis were killed. Over 240 were taken hostage. Then, as we sit here today, Israel’s military response has killed over 10,000 people in Gaza. And given the nature of the demographics, it’s a very heavily young population.

(01:42:27)
Over 40% of them are under the age of 18, of those killed. That’s, of course, according to the Ministry of Health of the Palestinian Authority. So what do you think is the long-term effect on the prospect of peace when so many civilians die?
John Mearsheimer
(01:42:46)
I mean, I think it’s disastrous. I mean, the only way you’re going to get peace here is if you have a two-state solution, where the Palestinians have a sovereign state of their own and there is a sovereign Jewish state, and these two states live side by side. American presidents since Jimmy Carter have understood this full well. And this is why we have pushed very hard for a two-state solution. Indeed, many American Jews and many Israelis have pushed for a two-state solution, because they think that that is the only way you’re going to get peace between the two sides. What’s happened here is that in recent years, the Israelis have lost all interest in a two-state solution.

(01:43:43)
And it’s in large part because the political center of gravity in Israel has steadily moved to the right. When I was a young boy, the political center of gravity in Israel was much further to the left than it is today. The political center of gravity is now in a position where there’s hardly any support for a two-state solution, and Netanyahu and the rest of the people in his government were, and are, in favor of a Greater Israel. There’s just no question about that. Well, on top of that, you now have had a war where, as you described, huge numbers of civilians have been killed, and you already had bad blood between the Palestinians and the Israelis before this conflict.

(01:44:41)
And you could imagine how people on each side now feel about people on the other side. So even if you didn’t have this opposition inside Israel to a two-state solution, how could you possibly get the Israelis now to agree to a two-state solution? I think for the foreseeable future, the animosity …
John Mearsheimer
(01:45:03)
Solution. I think for the foreseeable future, the animosity inside Israel towards the Palestinians is so great that it is impossible to move the Israelis in that direction. And the Israelis here are the key players, more so than the Palestinians, because it’s the Israelis who control Greater Israel. It’s the Israelis who you have to convince. Now, I want to be clear here. You also ultimately have to get around the fact that Hamas is not committed to a two-state solution. But I think that problem could be dealt with. It’s important to understand that Arafat and the PLO were once adamantly opposed to a two-state solution. But Arafat came around to understand that that was really the only hope for settling this. And he became a proponent of a two-state solution.

(01:45:53)
And that’s true of Mahmoud Abbas who runs the PA in the West Bank. It’s not true of Hamas at this point in time. They want a one-state solution, they want a Palestinian state. And of course, the Israelis want a one-state solution too, which is a Jewish state that controls all of Greater Israel. So the question is, can you get some sort of agreement? And I think to get to the nub of your question, given what’s just happened, it’s almost impossible to imagine that happening anytime soon.
Lex Fridman
(01:46:27)
The cynical perspective here is that those in power benefit from conflict while the people on both sides suffer. Is there a degree of truth to that? Or that, for the people in power to maintain power, conflict needs to continue?
John Mearsheimer
(01:46:44)
No, I don’t believe that. I mean, just to take the Netanyahu government or any Israeli government that maintains the occupation, what you want is a Palestinian population that submits to Israeli domination of Greater Israel. You don’t want resistance, you don’t want an intifada, you don’t want what happened on October 7th. In fact, I think one of the principal reasons that the Israelis are pounding Gaza and killing huge numbers of civilians, punishing the civilian population in ways that clearly violate the laws of war, is because they want the Palestinians to understand that they are not allowed to rise up and resist the occupation. That’s their goal.

(01:47:33)
So, I think the Israelis would prefer that the Palestinians roll over and accept submission. In terms of the people who live in Gaza, to include the elites, and the people who live in the West Bank, to include the elites, they would much prefer to move to some sort of situation where the Palestinians have a state of their own. I think in the case of the PA, under Abbas, they would accept a two-state solution. I think what, at this point in time, Hamas wants is a one-state solution, but they want peace. All of them want peace, the two different sets of leadership in Palestine and the Israelis.
Lex Fridman
(01:48:16)
So you think Hamas wants peace?
John Mearsheimer
(01:48:19)
Sure. But on its own terms, that’s the point.
Lex Fridman
(01:48:21)
What does peace look like for Hamas?
John Mearsheimer
(01:48:24)
At this point in time, I think peace basically means a Greater Israel controlled by Palestine or Palestinians.
Lex Fridman
(01:48:31)
Okay. So essentially, it’s the whole land is called Palestine and there’s no Israel?
John Mearsheimer
(01:48:38)
I think, at this point in time, that’s their principal goal. I do believe, and there have been hints over time, Jimmy Carter has said this, that Hamas can be convinced to accept a two-state solution. Assuming that the Palestinians get a viable state of their own, Hamas would buy into that. Can we say that with a high degree of certainty? No, but I think the Israelis should have pursued that possibility. They should have worked with Abbas, they should have worked with Hamas to do everything they can to facilitate a two-state solution. Because I think, ultimately, that’s in Israel’s interest. Now, the Israeli government, and most Israelis at this point in time, I believe, don’t agree with that.
Lex Fridman
(01:49:21)
What do you think of Israel starting the ground invasion of Gaza recently on October 27th?
John Mearsheimer
(01:49:31)
The question is, should they continue until they have finally defeated Hamas? There are all sorts of reports in the media, including in the Israeli media, that they’re not going to be allowed by the United States to continue this offensive for much more than a few weeks. The Israelis have been saying it’s going to take, in the best of all possible worlds, a number of months, if not a year, to finish off Hamas. Well, it doesn’t look like they’re going to have enough time to do that. I doubt whether they can finish off Hamas, even if they’re given the time. I think they’re going to run into fierce resistance. And when they run into fierce resistance and large numbers of Israelis start to die, they’ll lose their appetite for this. And they, the Israelis, surely know at this point in time that even if they finish off Hamas, even if I’m wrong and they’re able to finish off Hamas, another group is going to rise up to resist the occupation.

(01:50:48)
The idea that you can use what Ze’ev Jabotinsky called the Iron Wall to beat the Palestinians into submission is delusional. It’s just not going to happen. The Palestinians want a state of their own. They don’t want to live under occupation. And there’s no military solution for Israel here. There has to be a political solution. And the only viable political solution is a two-state solution. I mean, you can’t go to democracy. You can’t go to a situation where you give the Palestinians equal rights inside of Greater Israel, in large part because there are now as many Palestinians as there are Israeli Jews. And over time, the demographic balance shifts against the Israeli Jews and in favor of the Palestinians. In which case, you’ll end up with a Palestinian state in Greater Israel. So democracy for all doesn’t work. The Israelis, I believe, are quite interested in ethnic cleansing.

(01:51:56)
I think they saw this recent set of events as an opportunity to cleanse Gaza, but that’s not going to happen. The Jordanians and the Egyptians have made it clear that that’s not happening. The United States has now made it clear that that’s not happening. And the Palestinians will not leave. They’ll die in place. So ethnic cleansing doesn’t work. So you’re really left with two alternatives: the two-state solution, or a Greater Israel that is effectively an apartheid state. I mean, that’s what the occupation has led to. And all sorts of people have been predicting this for a long, long time. And you’ve now reached that point. Here in the United States, if you say that Israel’s an apartheid state, that’s going to get you into all sorts of trouble. But the fact is that Human Rights Watch, Amnesty International, and B’Tselem, which is the leading Israeli human rights group, have all issued detailed reports making the case that Israel is an apartheid state.

(01:53:07)
Furthermore, if you read the Israeli media, all sorts of Israelis, including Israeli leaders, refer to Israel as an apartheid state. It’s not that unusual to hear that term used in Israel. This is disastrous for Israel in my opinion. And Steve Walt and I said this, by the way, when we wrote The Israel Lobby: that Israel as an apartheid state, which is what Israel as an occupier amounts to, is not good for Israel. That brings us back to the two-state solution. But as you and I were talking about a few minutes ago, it’s hard to see how you get a two-state solution. And the end result of this conversation is utter despair.
Lex Fridman
(01:53:53)
Because the path to a two-state solution is blocked by the amount of hate that’s created by civilian deaths?
John Mearsheimer
(01:54:01)
Well, that plus the fact that the Israeli government is filled with people who have no interest in a two-state solution. They’re ideologically deeply committed to a Greater Israel. They want all the land between the Jordan River and the Mediterranean Sea to be part of a Jewish state. They’re just ideologically committed to that. And of course, as we were talking about before with regard to Hamas, Hamas wants everything between the river and the sea to be a Palestinian state. And when you have two sides with those kinds of views, you’re in deep trouble because there’s little room for compromise. So what you have to do to get this to work is you have to convince the Israelis that it’s in their interest to have a two-state solution. And you’ve already taken care of the PA on this front, the Palestinian Authority, but you’ve got to convince Hamas that its maximalist goals are not going to work, and that it’s in its interest to follow in the footsteps of Arafat and accept a two-state solution.

(01:55:17)
But even if you do that at this point, let’s say, that there’s a lot of willingness intellectually on both sides to do that. The problem is that the hatred that has been fueled by this ongoing conflict is so great that it’s just hard to imagine how you can make a two-state solution work at this juncture. That’s why I’ve sort of taken to saying, and I hope I’m wrong here, that on the two-state solution, that boat has sailed. It’s no longer possible.
Lex Fridman
(01:55:53)
Well, again, I believe in leadership, and there are other parties at play here, other nations, Jordan, Saudi Arabia, other players in the Middle East that could help through a normalization of relationships and these kinds of things. There’s always hope, like you said, a slither of hope.
John Mearsheimer
(01:56:10)
Slither of hope.
Lex Fridman
(01:56:12)
I think human civilization progresses forward by taking advantage of all the slithers it can get. Let me ask you about, you mentioned The Israel Lobby. You wrote a book, probably your most controversial book on the topic.
John Mearsheimer
(01:56:26)
Not probably. Clearly, the most controversial book I ever wrote.
Lex Fridman
(01:56:30)
So you’ve criticized the Israel lobby in the United States for influencing US policy in the Middle East. Can you explain what the Israel lobby is, their influence, and your criticism over the past, let’s say a couple of decades?
John Mearsheimer
(01:56:48)
Well, the argument that Steve Walt and I made, actually, we wrote an article first, which appeared in the London Review of Books, and then we wrote the book itself. Our argument is that the lobby is a loose coalition of individuals and organizations that push American policy in a pro-Israel direction. And basically, the lobby is interested in getting the United States, and here we’re talking mainly about the American government, to support Israel no matter what Israel does. And our argument is that if you look at the relationship between the United States and Israel, it’s unprecedented in modern history. This is the closest relationship that you can find between any two countries in recorded history. It’s truly amazing the extent to which Israel and the United States are joined at the hip. And we support Israel no matter what, almost all the time. And our argument is that that is largely due to the influence of the lobby. The lobby is an extremely powerful interest group.

(01:58:15)
Now, it’s very important to understand that the American political system is set up in ways that allow interest groups of all sorts to wield great influence. So in the United States, you have an interest group or a lobby like the National Rifle Association that makes it well-nigh impossible to get gun control. And so with the Israel lobby, you have this group of individuals and organizations that wield enormous influence on US policy toward the Middle East. And this is not surprising given the nature of the American political system. So our argument is that the lobby is not doing anything that’s illegal, or illicit, or immoral, or unethical. It’s just a good old-fashioned American interest group. And it just happens to be extremely powerful. And our argument is that this is not good for the United States, because no two countries have the same interests all the time. And when our interests conflict with Israel’s interests, we should be able to do what we think is in our national interest, in America’s national interest.

(01:59:42)
But the lobby tends to conflate America’s national interests with Israel’s national interests and wants the United States to support Israel no matter what. We also argue, and I cannot emphasize this enough given what’s going on in the world today, that the lobby has not been pushing policies that are in Israel’s interest. So our argument is that the lobby pushes policies that are neither in America’s interest nor in Israel’s interest. Now, you’re saying to yourself, what exactly does he mean by that? What every president since Jimmy Carter has tried to do, as I said before, is to foster a two-state solution, to push Israel, which is the dominant player in Greater Israel, to accept the two-state solution. And we have run into huge resistance from the lobby whenever we try to, let’s be blunt about it, coerce Israel.

(02:00:51)
In a perfect world, where there was no lobby and an American president was free to put pressure on Israel, to coerce Israel, I believe we would’ve gone a long way towards getting a two-state solution. And I believe this would’ve been in Israel’s interest. But we couldn’t get a two-state solution because it was almost impossible to put meaningful pressure on Israel, because of the lobby. So this was not in Israel’s interest and it was not in America’s interest. And that was the argument that we made. And we, of course, got huge pushback for making that argument.
Lex Fridman
(02:01:28)
What’s the underlying motivation of the lobby? Is it religious in nature? Is it similar to the way war hawks are sort of militaristic in nature? Is it nationalistic in nature? If you were to describe this loose coalition of people, what would you say is their motivation?
John Mearsheimer
(02:01:47)
Well, first of all, I think you have to distinguish between Jews and Christians. You want to remember that there are a huge number of Christian Zionists who are deeply committed to Israel no matter what, right? And then, there are a large number of Jews. The Jews are obviously the most important of those two groups in the Israel lobby. But one of the arguments that we made in the book is that you should not call it the Jewish lobby, because it’s not populated just by Jews, and Christian Zionists are an important part of that lobby. But furthermore, there are a good number of Jews who are opposed to the lobby and the policies that the lobby purveys. And there are a number of Jews who are prominent anti-Zionists, and they’re obviously not in the lobby. Or if you take a group like Jewish Voice for Peace, Jewish Voice for Peace is not in the lobby. So it’s wrong to call it a Jewish lobby.

(02:02:52)
But with regard to the American Jews who are in that lobby, I think that really, this is all about nationalism. It’s not so much religion. Many of those Jews who are influential in the lobby are not religious in any meaningful sense of that term. But they self-identify as Jewish in the sense that they feel they’re part of a Jewish nation. And that in addition to being an American, they are part of this tribe, this nation called Jews. And that they have a responsibility to push the United States in ways that support the Jewish state. So I think that’s what drives most, if not almost all the Jews. This is not to say there’s not a religious dimension for some of them, but I think that the main connection is much more tribal in nature.
Lex Fridman
(02:03:49)
So I had a conversation with Benjamin Netanyahu, and he said, “Fundamentally, if you’re anti-Zionist, you’re antisemitic.” So the Zionist project is tied at the hip to the Jewish project. What do you have to say to that?
John Mearsheimer
(02:04:08)
Look, you can define antisemitism any way you want. And you can define antisemitism to incorporate anti-Zionism. And I think we have reached the point where antisemitism is identified today not just with anti-Zionism, but with criticism of Israel. If you criticize Israel, some people will say you’re an antisemite. And if that’s your definition of antisemitism, you’ve taken an important term and stretched it to the point where it’s meaningless. So when Steve and I wrote the article and then wrote the book, all sorts of people said that we were antisemites. This is a ludicrous charge. But what they meant was, you’re criticizing the lobby, you’re criticizing Israel, and therefore, you’re an antisemite. Okay. If that’s what an antisemite is, somebody who criticizes Israel, then probably half the Jewish community in the United States, if not more, is antisemitic. And of course, you get into all these crazy games where people are calling Jews self-hating Jews and antisemites because they’re critical of Israel.

(02:05:35)
But even people who are anti-Zionists, I don’t think they’re antisemitic at all. You can argue they’re misguided, that’s fine. But many of these people are Jewish and proud of the fact that they’re Jewish. They just don’t believe that nationalism, and Jewish nationalism in particular, is a force that should be applauded. And you want to understand that in the American context, there is a rich tradition of anti-Zionism. And these were not people who were antisemites, if you go back to the thirties, forties, fifties. And the same thing was even true in Europe. There were all sorts of European Jews who were opposed to Zionism. Were they antisemites? I don’t think so. But we’ve gotten to the point now where people are so interested in stopping any criticism of Israel that they wield this weapon of calling people antisemites so loosely that the term has kind of lost meaning. So I think Netanyahu is wrongheaded to equate anti-Zionism with antisemitism.
Lex Fridman
(02:06:49)
Alan Dershowitz was one of the people that called you specifically antisemitic. So just looking at the space of discourse, where’s the slither of hope for healthy discourse about US relationships with Israel between you and Alan Dershowitz and others like him?
John Mearsheimer
(02:07:16)
Well, I think until there is a settlement of the Israeli-Palestinian conflict, there’s no hope of putting an end to this nonsense. Right?
Lex Fridman
(02:07:27)
So these are just uses of terms to kind of cheat your way through the discourse. It’s a shortcut.
John Mearsheimer
(02:07:35)
No, it’s to silence people. It’s very important to understand that one of the lobby’s principal goals is to make sure we don’t have an open discourse, a freewheeling discourse, about Israel. Because they understand, people in the lobby understand, that if you have an open discourse, Israel will end up looking very bad. You don’t want to talk about the occupation, you don’t want to talk about how Israel was created. All these subjects are ones that will cause problems for Israel. See, just to go to the present crisis: when you have a disaster, and what happened on October 7th is a disaster, one of the first things that happens is that people begin to ask the question, how did this happen? What’s the root cause of this problem? This is a disaster. We have to understand what caused it so that we can work to make sure it doesn’t happen again. So we can work to shut it down and then make sure it doesn’t happen again.

(02:08:46)
But once you start talking about the root causes, you end up talking about how Israel was created. And that means telling a story that is not pretty about how the Zionists conquered Palestine. And number two, it means talking about the occupation, right? It’s not like Hamas attacked on October 7th because there were just a bunch of antisemites who hated Jews and wanted to kill Jews. This is not Nazi Germany. This is directly related to the occupation and to what was going on inside of Gaza. And it’s not in Israel’s interest or the lobby’s interest to have an open discourse about what the Israelis have been doing to the Palestinians since, I would say, roughly 1903, when the second aliyah came to Israel, or came to what was then Palestine, right? You don’t want to talk about that. And you don’t want to talk about, from the lobby’s point of view, the influence that the lobby has, right?

(02:09:54)
It’s better, from the lobby’s point of view, if most Americans think that American support of Israel is just done for all the right moral and strategic reasons, not because of the lobby. And when John Mearsheimer and Steve Walt come along and say, you have to understand that this special relationship is due, in large part, to the lobby’s influence, that is not an argument that people in the lobby want to hear. So the point is, for all these reasons, you have to go to great lengths to silence people like me and Steve Walt. And one of the ways to do that is to call us antisemites.

(02:10:32)
I think the chapter or the section of the book where we talk about this charge of antisemitism is called The Great Silencer. That’s what we call the charge of antisemitism, The Great Silencer. Who wants to be called an antisemite, especially in the wake of the Holocaust? Do I want to be called an antisemite? Oh my God, no. And so it’s very effective. But it is important to talk about these issues, in my humble opinion. And I think if we had talked about these issues way back when, it would’ve gone a long way towards maybe getting a two-state solution, which I think was the best alternative here.
Lex Fridman
(02:11:21)
It’s complicated. And I wonder if you can comment on the complexity of this, because criticizing Israel and criticizing the lobby can, for a lot of people, be a dog whistle for sort of antisemitic conspiracy theories, this idea that Jews run everything, run the world, that they’re this kind of cabal. And it’s also very true that people who are legitimately antisemitic are also critics of Israel in the same kind of way. And so, it’s such a complicated landscape in which to have discussions. Because even people like David Duke, who are racist, don’t sound racist on the surface. I haven’t listened to him enough. But there’s dog whistles. It’s a complicated space in which to have discussions. I wonder if you can sort of speak to that. Because there’s this silencing effect of calling everybody antisemitic. But it’s also true that there’s antisemitism in the world, there is a sizable population of people that hate Jews. There’s probably a sizable population of people who hate Muslims, too.
John Mearsheimer
(02:12:51)
A lot of hate out there.
Lex Fridman
(02:12:53)
A lot of hate out there. But the hatred of Jews has a long history. And so, like the Rolling Stones have a set of greatest hits, there’s just a set of greatest hits of conspiracy theories that you can make about the Jews that are used as part of the hatred. So there are nice templates for that. And I just wonder if you can comment on operating as a historian, as an analyst, as a strategic thinker in this kind of space.
John Mearsheimer
(02:13:25)
Obviously, when we wrote the article, which we did before the book, we gave this subject a great deal of thought. I mean, what you said just now is music to our ears. I’m talking about me and Steve. I think that your point about dog whistles is correct. Look, we went to great lengths to make it clear that this is not a cabal. It’s not a conspiracy. And in fact, in a very important way, the lobby operates out in the open. They brag about their power. And this was true before we wrote the article. And we said in the article, in the book, and you heard me say it here: first of all, it’s not a Jewish lobby. Secondly, it’s not a cabal. It’s an American interest group.
Lex Fridman
(02:14:29)
And the American system is designed such that interest groups are perfectly legal, and some of them are super effective.
John Mearsheimer
(02:14:37)
Exactly. I mean, you hit the nail right on the head. That’s exactly right. And there was nothing that we said that was antisemitic by any reasonable definition of that term. And huge numbers of Jews have known me and Steve over the years, and nobody ever, ever said that we were antisemitic before March 2006, when the article appeared, because we’re not antisemitic. But look, you’ve got this interest group that has a significant influence on American policy and on Israeli policy, and you want to talk about it. It’s just important to talk about it. It’s important for Jews in the United States, for Jews in Israel, to talk about this. The idea that you want to silence critics is not a smart way to go about doing business, in my opinion. If we were wrong, if Steve and I were so wrong and our arguments were so foul, they could have easily exposed those arguments. They could have gone into combat with us in the marketplace of ideas and easily knocked this down.

(02:16:00)
The problem was that our arguments were quite powerful. And instead of engaging us and defeating our arguments, they wanted to silence us. And this is not good. It’s not good for Israel, it’s not good for the United States. And I would argue, in the end, if anything, it’s going to foster antisemitism. I think you don’t want to run around telling people that they can’t talk about Israel without being called an antisemite. It’s just not healthy, in terms of the issue that you’re raising. But I still agree with you that it is a tricky issue. I don’t want to make light of that. I know that there’s this piece of literature out there called the Protocols of the Elders of Zion. And I fully understand that if you’re not careful, you can come close to writing volume two of the Protocols. But I don’t believe that we wrote anything that was even close to that. And again, I think that a healthy debate on the issues that we were raising would’ve been not only in America’s interest, but it would’ve been in Israel’s interest.
Lex Fridman
(02:17:15)
Yeah. Underneath it all is just, I wonder why there is so much hate against groups, why it’s such a sticky way of thinking. Not just tribalism, proud of your country and kind of hating another country, but really deeply hating. Hating in a way where it’s part of your identity kind of hate.
John Mearsheimer
(02:17:40)
Well, just to make a general point on this issue in our conversation here today, you often talk about individual leaders, and the word individual often pops up in your vocabulary. I believe that we are ultimately social animals before we are individuals. I believe we’re born into tribes, we’re heavily socialized, and that we carve out space for our individualism. But we are part of tribes, or social groups, or nations. Call them what you want, ethnic groups, religious groups. But the fact is that these tribes often crash into each other. And when they crash into each other, they end up hating each other. If you go to a place like Bosnia, the Croats and the Serbs, oh, my God. And then throw in the Bosniaks, which is the term for Bosnian Muslims. And Muslims, Croats, Serbs, and the tribes hate each other. And in a funny way, that hatred almost never goes away. And I guess, there are some exceptions to that.

(02:18:59)
If you look at the Germans after World War II, they’ve gone a long way towards reducing, I wouldn’t want to say completely eliminating, but reducing a lot of the hatred that existed between Germans and their neighbors. But that’s really kind of an anomalous case. I mean, you go around East Asia today and the hatred of Japan in a place like China, the hatred of Japan in a place like Korea, just not to be underestimated. But I think a lot of it just has to do with the fact that you’re dealing with social groups that have crashed into each other at one point or another. And there are those lingering effects. And by the way, this gets back to our discussion a few minutes ago about trying to get a two-state solution between the Palestinians and the Israeli Jews now that you have had this horrible war, which is ongoing.
Lex Fridman
(02:19:59)
It’s interesting to ask, to go back to World War II.

(02:20:02)
Now, you said you studied Nazi Germany in the ’30s from a perspective of maybe offensive realism, but just to look at the Holocaust, it’s sometimes popular in public discourse today to compare certain things to the Holocaust. People have compared the Hamas attack on Israel to the Holocaust, saying things like, “It’s the biggest attack on Jews since the Holocaust,” which kind of implies that there’s a comparison. People have made that same comparison in the other direction. What do you make of this comparison? Is it comparable? Does the use of the Holocaust have any accuracy in comparisons of modern-day international politics?
John Mearsheimer
(02:21:01)
Is it possible that you could have another genocide? Yes, and I would argue that what you had in Rwanda was a genocide. The Holocaust is not the only genocide. I believe the word genocide is used too loosely today. And as you know, lots of people, and I mean lots of people, who are pro-Palestinian accuse the Israelis of engaging in genocide in Gaza. I think what the Israelis are doing in Gaza represents a massacre. I would use that term, given the number of civilians that they’ve killed and the fact that they’ve been indiscriminate in terms of how they’ve been bombing Gaza. But I would not use the word genocide. For me, a genocide is where one side attempts to eliminate another group from the planet. I think that what happened with the Holocaust was clearly a genocide, and that the Germans were bent on destroying all of European Jewry.

(02:22:13)
And if they could have gotten their hands on Jews outside of Europe, they would’ve murdered them as well. That’s a genocide. And I think with the Hutus and the Tutsis, you had a similar situation. I think with the Turks and the Armenians during World War I, that was a genocide, but I have a rather narrow definition of what a genocide is and I don’t think there are many cases that qualify as a genocide. The Holocaust certainly does. Now, what Hamas did doesn’t even come close to what happened to European Jewry between, let’s say, 1939 and 1945, although I date the start of the Holocaust to 1941, if we were looking at it closely, but let’s just say 1939, when they invaded Poland, from 1939 to 1945. What Hamas did pales in comparison. It’s hard to believe anybody would make that argument. Yes, a lot of Jews died, but hardly any compared to the number that died at the hands of the Germans. No parallel at all. And furthermore, Hamas was in no position to kill all of the Jews in the Middle East, just not going to happen.
Lex Fridman
(02:23:43)
But there’s also levels of things, Germans using human skin for lamps. There’s just levels of evil in this world.
John Mearsheimer
(02:23:54)
Yes, but that’s not what Hamas is doing. I want to be very clear here. I am not justifying Hamas’s killing of civilians, not for one second. But I’m just saying… And by the way, just to go to the Israelis and what they’re doing in Gaza, as I said to you before, I do believe that is a massacre, and I believe that’s to be condemned, the killing of civilians. This is not legitimate collateral damage. They’re directly punishing the population. But I would not call that a genocide, and I would not compare that to the Holocaust for one second. I just want to be very clear on that.
Lex Fridman
(02:24:37)
Do you think if Israel could, they would avoid the death of any civilians? So you’re saying there’s some degree of punishment of collective-?
John Mearsheimer
(02:24:48)
They’re purposely killing civilians. This is the Iron Wall. They’re trying to beat the Palestinians into submission. There’s no way you kill this many civilians if you’re trying to precisely take out Hamas fighters. And by the way, the IDF spokesman has explicitly said that they are not pursuing precision bombing, and that what they are doing is trying to maximize the amount of destruction and damage that they can inflict on the Palestinians. And I think this is a major mistake on the part of Israel. First of all, it ends up being a moral stain on your reputation, number one. And number two, it doesn’t work. It doesn’t work. The Palestinians are not going to roll over and submit to Israeli domination of their life.

(02:25:52)
The whole concept of the Iron Wall, Jabotinsky’s term, was misguided. And by the way, if you look at what the Israelis are doing, they’re trying to do two things. One is the Iron Wall, and that’s where you punish the civilian population in Gaza and get them to submit. The other thing that they’re trying to do is get Hamas. They want to destroy Hamas. And the belief there is that if they destroy Hamas, they’ve solved the problem. But as many Israelis know, including people on the hard right, even if you destroy Hamas, they are going to be replaced by another resistance group and that resistance group will employ terror.
Lex Fridman
(02:26:36)
Yeah. I think you’ve said that other terrorist organizations have used the situation in Palestine as a recruitment mechanism for a long time.
John Mearsheimer
(02:26:47)
Osama bin Laden made it clear that this was one of his principal reasons for attacking the United States.
Lex Fridman
(02:26:56)
And the United States attacked back and got us into a 20-year war that cost the lives of millions of people, not Americans, but human beings, and-
John Mearsheimer
(02:27:12)
Engaged in torture.
Lex Fridman
(02:27:14)
And torture. Yeah.
John Mearsheimer
(02:27:16)
No, I think if you look at how we reacted to 9/11 and how the Israelis are reacting to what happened on October 7th, there’s quite a bit of similarity in that both sides, the Israeli side and the American side, are enraged and they lash out and they go on a rampage and the end result is not good.
Lex Fridman
(02:27:45)
Is there a capacity within Israel, or within the United States after 9/11, to do something approximating turning the other cheek, of understanding that the root of terror is hate, and fighting that hate with, not to sound naive, but compassion?
John Mearsheimer
(02:28:10)
Well, I don’t think in either case you’re going to turn the other cheek.
Lex Fridman
(02:28:18)
What I mean by that is some limited powerful military response, but very limited?
John Mearsheimer
(02:28:25)
Coupled with a smart political strategy.
Lex Fridman
(02:28:27)
Political strategy, diplomacy.
John Mearsheimer
(02:28:29)
Yeah. That’s what they should have done.
Lex Fridman
(02:28:31)
Yeah.
John Mearsheimer
(02:28:31)
Right.
Lex Fridman
(02:28:31)
But is there capacity for that or from your offensive realism perspective, it’s just the odds are really low?
John Mearsheimer
(02:28:41)
From my offensive realist perspective or my realist perspective, that’s what you should do. My view is states are rational actors, they should be cunning. They should think about the strategic situation they’re in and choose the appropriate response. And what happens, and this is why my theory is not always correct, is that sometimes states are not rational and they misbehave. I would argue in the Israeli case that it would’ve been good after October 7th, or starting on October 7th, if the United States had tried to hold the Israelis back and counseled a more moderate response. Take some time just to think about how to deal with this problem instead of lashing out. I think given what happened to the Israelis, given how shocked they were, given the level of fear, given the level of rage, they were going to lash out, and I don’t believe that was in their interest. I think it would’ve made sense to think about it and to think about a smarter strategy than they’re now employing. And I think the Americans blew it. The Americans gave them a bear hug and a green light and said, “We’ll give you all the weaponry you need, and go out and do it.” And I don’t think that was the smart thing to do. Look, in the wake of October 7th, the Israelis had no good strategy. It’s not like there’s a magic formula that they just didn’t see and we should have told them what the magic formula was. That’s not true. They were, in a sense, caught between a rock and a hard place in terms of what to do. But there are smarter things and dumber things, and I think the Israelis lashed out in ways that are counterproductive. I think going on a rampage and killing huge numbers of civilians, it’s obviously morally wrong, but it’s also just not in their strategic interest because it’s not going to buy them anything.

(02:31:03)
And in fact, it’s going to cost them because people all over the planet are turning against Israel. I saw a report today from an Israeli think tank that has been tracking protests around the world. It gave some figures for what it looked like between October 7th and October 13th in terms of the number of protests around the world that were pro-Israel versus pro-Palestine. And then it looked at the numbers from October 13th up to the present. I think the numbers were 69% pro-Palestinian in the first six days after October 7th, 69%, and I think 31%… Take these numbers with a grain of salt. 31% were pro-Israel. So I think it was 69 and 31.

(02:32:04)
And since then, since October 13th, if you look at the number of protests around the world, 95% have been pro-Palestinian and 5% have been pro-Israel. And what this tells you is that public opinion around the world has shifted against Israel. And if you look at some of the demonstrations in places like London and Washington, DC, it’s truly amazing the number of people who are coming out in support of the Palestinians. And all of this, again, is just to support my point that it was just not smart for Israel to launch this bombing campaign. You can make an argument for going after Hamas and doing it in a surgical way or as surgical a way as possible, but that’s not what they did. And again, my point to you is I think that this punishment campaign is not going to work strategically. In other words, they’re not going to beat the Palestinians into submission, they’re not going to finish off Hamas. And at the same time, by pursuing this strategy, they’re doing huge damage to their reputation around the world.
Lex Fridman
(02:33:16)
In the wake of October 7th, given the geopolitical context, I think there’s a lot of leverage to be the great ethical superpower, demonstrate power without killing any civilians, and use that diplomatic leverage to push forward something like the Abraham Accords with more nations, with Saudi Arabia, push for peace aggressively, peace agreements, this kind of stuff, economic relationships, all of this kind of stuff, and thereby pressure the Palestinian Authority towards perhaps the two-state solution.
John Mearsheimer
(02:34:04)
I think what you’re missing here, just in the Israeli case, is that the Israeli government is not interested in a two-state solution. And you want to remember that Benjamin Netanyahu, who looks very hawkish when you look at him in isolation, doesn’t look so hawkish when you look at him compared to the rest of the people in his cabinet. He almost looks like a moderate. He’s got a lot of people who are way out to the right of him. And these people, and this of course includes Netanyahu, are not interested in the two-state solution. So the question you have to ask yourself is, if you’re Benjamin Netanyahu and it’s October 7th, late in the day, what do you do? You’re not thinking about a two-state solution. You’re thinking about an occupation that’s not going to end. And the question is how do you deal with the Palestinians given what’s just happened?
Lex Fridman
(02:35:05)
Well, there’s people in the cabinet and then there’s history. And history remembers great leaders. So Benjamin Netanyahu can look in the streets of Israel and see the protests and think of how history will remember him. I think a two-state solution is on the table for a great leader.
John Mearsheimer
(02:35:24)
Well, it was there. Was he the person who was going to take advantage of it? I don’t think so, but we’ll see.
Lex Fridman
(02:35:35)
He’s a student of history. At this point, it’s very difficult. Like you said, 95% now, or whatever the number is, of protests. I think the window in which Israel has the ears of the world, where it can do the big ethical action towards peace, has closed. Or maybe there’s still a sliver, but it’s just… The slippery slope of hate has taken off. It’s quite depressing to watch what’s going on.
John Mearsheimer
(02:36:10)
Yep. I agree a hundred percent. Unequivocally depressing.
Lex Fridman
(02:36:13)
But of course, as you talk about, the US involvement is of critical importance here, and the argument you make is that we should not be involved in Ukraine, at least to the degree we are, we being the United States, and we should not be involved in Israel to the degree we are, because it’s stretching us too thin when the big geopolitical contender in the 21st century with the United States is China. Is that a correct summary?
John Mearsheimer
(02:36:49)
Yeah, I think just on Ukraine, we should not have pushed Ukraine to join NATO.
Lex Fridman
(02:36:55)
Sure.
John Mearsheimer
(02:36:56)
And once the war started, we should have worked overtime to shut it down immediately.
Lex Fridman
(02:37:03)
March.
John Mearsheimer
(02:37:04)
March, right. And you remember, by the way, not to go back to Ukraine in great detail, in the early fall of 2022… The war starts in February 2022. There’s March 2022, which we’ve talked about, which is the negotiations. In the fall of 2022, I think it was in September, the Ukrainians had won two major tactical victories, one in Kherson and the other in Kharkiv. And at that point in time, General Milley, who was the chairman of the Joint Chiefs of Staff, said, “Now is the time to negotiate because this is the high-water mark for the Ukrainians.” Milley understood that things were only going to get worse, and the White House shut Milley down and said, “We’re not negotiating.” So we have blown a number of opportunities here to head this problem off at the pass. But that’s my view there. And with regard to the Israelis, my only point about Israel is that it would be better for Israel and better for the United States if we, the United States, were in a position to put pressure on Israel from time to time. As Steve and I say in the book, we should be able to treat Israel like a normal country. The fact is that countries sometimes do stupid things. This includes the United States and Israel. And if Israel is pursuing a policy that we think is unwise, we should be in a position where we could put pressure on Israel. That’s our argument. But anyway, we goofed both with regard to Ukraine and with regard to the Middle East, and we’re now up to our eyeballs in alligators in both of those regions. And as you described my view, this is not good because the area of the most strategic importance for the United States today is East Asia, and that’s because China is there and China is the most serious threat the United States faces.

China

Lex Fridman
(02:39:14)
Do you think there will be a war with China in the 21st century?
John Mearsheimer
(02:39:19)
I don’t know. My argument is there will be. There is right now a serious security competition and at the same time, there is a real possibility of war. Whether or not we avoid it is very hard to say. I mean, we did during the Cold War. We had a serious security competition from roughly 1947 to 1989 and we thankfully avoided war, probably came the closest in 1962 at the Cuban Missile Crisis. But we avoided it and I think we can avoid it here. Is it for sure? No.
Lex Fridman
(02:39:59)
You’ve said that China won’t move on Taiwan militarily, in part because, as you said, amphibious operations are difficult. Why, in your view, will China not move on Taiwan in the near future?
John Mearsheimer
(02:40:16)
Well, it’s because there’s this body of water called the Taiwan Strait, which is a big body of water, and getting across water is very difficult unless you can walk on water.
Lex Fridman
(02:40:29)
So geography still has a role to play in the 21st century?
John Mearsheimer
(02:40:32)
Oh, yeah. I think geography’s very important. Big bodies of water really matter. In an ideal world, you’d like to have the Pacific Ocean between you and any potential adversary. 6,000 miles of water, hard to get across. If you’re a country and I’m a country and there’s land between us, I can take my Panzer divisions and I can go right across the land and get into your country or attack your country. And you of course can take your Panzer divisions and come across that same piece of land. But if there’s a big body of water between us, your Panzer divisions can’t go across the water and then the question is how do you get them across the water? And that’s very tricky. And in a world where we have lots of submarines and you have lots of aircraft and you have missiles that are land-based that can hit those surface ships, it is very, very hard to attack across a body of water. And all you have to do is think about the American invasion of Normandy, June 6th, 1944, coming in on Omaha Beach. Oh, boy. That was really difficult.
Lex Fridman
(02:41:46)
But there is a growing asymmetry of military power there that even though it’s difficult-
John Mearsheimer
(02:41:53)
That is correct.
Lex Fridman
(02:41:54)
So I guess-
John Mearsheimer
(02:41:56)
That is correct.
Lex Fridman
(02:41:57)
So I recently had a conversation with Elon Musk and he says that China is quite serious about the One China policy, and it seems inevitable that Taiwan will have to be… If you look at this pragmatically in the 21st century, it seems inevitable that Taiwan will have to be a part of China and so we can get there either diplomatically or militarily. What do you think about the inevitability of that kind of idea? When a nation says, “This is a top priority for us,” what do you think about them meaning it, and what do we do about that?
John Mearsheimer
(02:42:46)
There’s no question it’s a top priority for them and there’s no question they mean it, but it’s also a top priority for us not to let them take Taiwan.
Lex Fridman
(02:42:54)
Why exactly?
John Mearsheimer
(02:42:56)
Because it’s an important strategic asset. Many people will say it’s because Taiwan’s a democracy, but that doesn’t matter that much. It’s because of two strategic reasons. The first is that if we were to let Taiwan go, it would have hugely negative consequences for our alliance structure in East Asia. To contain China, we need allies. We have an alliance structure, and our allies, Japanese, South Koreans, Filipinos, Australians, they’re all counting on us to be there for them. And if we say, “We’re not going to defend Taiwan, the Chinese attack,” they’re going to say, “I bet if the Chinese attack us, the Americans won’t be there for us.” So it would have a damaging effect on our alliance structure, which we cannot afford because containing China is a wicked problem. It’s a powerful state. You were getting to this before when you talked about China versus Taiwan. So that’s the first reason.

(02:44:07)
Second reason is you want to bottle up the Chinese Navy and the Chinese Air Force inside the first island chain. You don’t want to let them get out into the Pacific. You don’t want them dominating the waters of East Asia. You want to bottle them up again inside the first island chain. And you can only do that if you control Taiwan. If you don’t control Taiwan, they get out into the Philippine Sea, into the Pacific and the Western Pacific, and cause all sorts of problems.
Lex Fridman
(02:44:38)
Well, with you saying all that, you’ve also said, regarding the Century of Humiliation, that Japan and the United States are a source of that humiliation for China. Don’t you think they see the other side of that?
John Mearsheimer
(02:44:52)
Absolutely.
Lex Fridman
(02:44:53)
And in the interest of avoiding a World War… I guess the question is how do we avoid a world war? It doesn’t seem like the military involvement in the conflict between China and Taiwan is the way.
John Mearsheimer
(02:45:14)
Well, I don’t want-
Lex Fridman
(02:45:15)
There’s no good answers here. I’m just saying-
John Mearsheimer
(02:45:17)
There are no-
Lex Fridman
(02:45:18)
Which is the less bad option?
John Mearsheimer
(02:45:20)
Well, what you want to do is you want to make sure that you deter China from invading Taiwan. You want to avoid a war. You and I are in complete agreement on that. We don’t want a war, but we want to contain China. We do not want to let China dominate Asia. That’s what the Americans are principally concerned with here and it’s what China’s neighbors are principally concerned with. This includes the Japanese, the South Koreans, the Filipinos, Australians, and the Taiwanese. They don’t want and we don’t want China to dominate the region, so we have to contain it.

(02:45:57)
But at the same time, and this should be music to your ears, we not only want to contain it, we want to make sure we don’t end up in a shooting match with the Chinese because this could be disastrous. So you have to have a very smart policy. You have to build powerful military forces, and you have to make sure you don’t do anything that’s provocative. On Taiwan, for example, the last thing you want is for the Taiwanese government to declare its independence because the Chinese have said, “If Taiwan does that, we’ll go to war.” And of course, we don’t want that. So my view is you want to smartly build up your military forces and you want to do everything you can to contain China, and at the same time, not be provocative.
Lex Fridman
(02:46:41)
So a big component of that is making sure the US military is bigger than the Chinese military.
John Mearsheimer
(02:46:51)
Not necessarily. It’s an interesting question. A lot of people think that to make deterrence work, you have to be able to beat the Chinese and therefore, you need a much bigger military. And I don’t think over time that’s possible. I think it’s probably not even possible now to beat the Chinese in a war over Taiwan or in a war in the South China Sea. I think what you want to do is make it clear to the Chinese either that there will be no winner… In other words, you don’t have to win, but you want to make sure they don’t win. It’s a lose-lose proposition if they go to war over Taiwan or what have you.

(02:47:40)
And if you can’t do that, if you think that they’re so powerful that they’re ultimately going to win, you want to convince them that victory would be a Pyrrhic victory. In other words, they would pay a godawful price to win the war. You follow what I’m saying? So the best strategy for deterrence is you win, China loses. Second best strategy is a stalemate, nobody wins. Third best strategy is they win, but they pay a godawful price. And the fourth possibility, which you don’t want, is they win quickly and decisively. If that’s the case, then you don’t have much deterrence.
Lex Fridman
(02:48:28)
What does a world with China as the sole dominant superpower look like? I mean, a little bit underlying our discussion is this kind of idea that US is the good guys and China is the bad guys. First of all, dividing the world into good guys and bad guys seems to somehow miss the nuance of this whole human civilization project we’re undertaking. But what does the world look like where China is the dominant sole superpower in a unipolar world?
John Mearsheimer
(02:49:01)
Well, I don’t tend to think of the world in terms of good guys and bad guys. As a good realist, I think that states are states; they’re all black boxes. I don’t discriminate between democracies and autocracies. But having said that, I am an American, and as an American, I’m interested in the security of my country, the survival of my country. So I want the United States to be the most powerful state in the world, which means I want the United States to dominate the Western hemisphere, I want us to be a regional hegemon, and I want to make sure that China does not dominate Asia the way we dominate the Western hemisphere.

(02:49:45)
It’s not because I think we’re the good guys and they’re the bad guys. If I were Chinese and I were in Beijing and I was Xi Jinping’s national security advisor, I’d tell him what we got to do is make sure we dominate the world or dominate our region and then do everything we can to undermine America’s position in the Western hemisphere. That’d be my view. So I guess you could say I do view the world in terms of good guys and bad guys, as an American and-
Lex Fridman
(02:50:16)
More like us and them versus-
John Mearsheimer
(02:50:18)
Yeah, it’s us and them. That’s a nice way to put it. Yeah, it’s us versus them. Not so much good guys versus bad guys.
Lex Fridman
(02:50:24)
Is it possible to have a stable, peaceful world with a good balance of power where it’s China and US as superpowers? It’s a bipolar world, no longer unipolar.
John Mearsheimer
(02:50:37)
Yeah. Okay, so you’re hypothesizing a world where they dominate Asia and we dominate the Western hemisphere.
Lex Fridman
(02:50:43)
Yeah.
John Mearsheimer
(02:50:44)
I believe there would be a great deal of intense security competition between those two superpowers.
Lex Fridman
(02:50:53)
The definition of intense matters here. So it could be small military conflicts or it could be extremely large unstable military conflicts, right?
John Mearsheimer
(02:51:04)
Well, conflict… Let’s use the word war. So I distinguish between security competition and war. And what I’m telling you is you’ll have an intense security competition where there’s no shooting, or if there’s shooting, it’s mainly proxies that are doing the fighting, much like the Vietnam War. Or you could have a case where one of those superpowers was involved in a war against a proxy of the other superpower. Think the Korean War. The United States fought the Chinese, who were allied with the Soviets at the time. But a war between the United States and China, just like a war between the United States and the Soviet Union during the Cold War, that’s what you really want to avoid. So I think you’d have an intense security competition. You’d have wars involving proxies of each of those two superpowers, and you would probably have some wars where one of the superpowers was involved in a war with one of the other superpower’s proxies.
Lex Fridman
(02:52:12)
So it seems likely then, if that’s the case, then it would be Taiwan is the proxy and US fighting China through the proxy of Taiwan?
John Mearsheimer
(02:52:21)
Yeah. Well, that would assume the United States… But you want to remember, you’re hypothesizing a situation where China dominates Asia.
Lex Fridman
(02:52:29)
Oh, it already has dominated.
John Mearsheimer
(02:52:31)
Yeah, it’s already dominated Taiwan.
Lex Fridman
(02:52:34)
I see. Where do you find the proxies? Australia?
John Mearsheimer
(02:52:38)
The Middle East could be a good case.
Lex Fridman
(02:52:40)
Oh, wow.
John Mearsheimer
(02:52:41)
Persian Gulf.
Lex Fridman
(02:52:43)
Oh boy. And then our discussion of Israel becomes even more dramatically-
John Mearsheimer
(02:52:47)
Yeah, well, Israel gets involved… I think in this scenario, if you’re talking about a US China competition and you’re talking about the Middle East, I think it’s the Gulf, it’s the Saudis, the Iranians, the Iraqis. It’s the oil.
Lex Fridman
(02:53:03)
Don’t you think it could be Israel versus Iran with some very 1984 kind of dramatic partnership of Iran, Russia, and China versus United States, Europe, and Israel?
John Mearsheimer
(02:53:18)
I think that’s possible. Yeah.
Lex Fridman
(02:53:19)
Oh boy.
John Mearsheimer
(02:53:20)
I think that’s possible. Yeah. I mean, I hadn’t thought about it until you said it, but yeah, I think that that is possible.
Lex Fridman
(02:53:28)
Isn’t that terrifying?
John Mearsheimer
(02:53:31)
Yeah. Well, in your scenario, where China already dominates Asia and we dominate the Western hemisphere, I think you start talking about the most likely places where the United States and China go head-to-head or fight through proxies. I think it is the Gulf or the Middle East, in the scenario that you posit.
Lex Fridman
(02:53:56)
I mean, one question I have… I don’t know about you, but for me, unlike with the Soviet Union, and I know I was born there, but even outside of that, the cultural gap, the loss in translation, the communication gap between China and the United States seems to be much greater than that of what was the former Soviet Union and the United States. I see two cultures intermingling and communicating as one of the ways to deescalate future conflict.
John Mearsheimer
(02:54:35)
It’s an interesting question. I mean, at sort of an abstract theoretical level, my argument is that great powers act according to realist dictates, and they understand those realist dictates, and that could lead to cooperation or it can lead to war. It depends. I would say just in the case of the Soviets, a lot of people…
John Mearsheimer
(02:55:03)
I would say just in the case of the Soviets, a lot of people describe the Cold War as an ideological competition above all else. It was communism versus liberal democracy or communism versus liberal capitalism, whatever. I actually don’t believe that. I believe the Soviets were realists to the core. I believe Stalin was a realist par excellence, and that ideology did not matter much in Stalin’s foreign policy. And I believe if you look at Soviet foreign policy after World War II, throughout the Cold War, they were realists to the core. And I think in those days, the Americans were realists. There was a lot of liberal ideology floating around out there, but the Americans were realists. And I think one of the reasons you avoided a shooting match between the United States and the Soviet Union from ’47 to ’89 was because both sides, I think, understood basic balance of power logic. The US-China competition is somewhat different.

(02:56:14)
First of all, the Chinese are realists to the core. I’ve spent a lot of time in China. I’m basically a rock and roll star in China. The Chinese-
Lex Fridman
(02:56:27)
You’re kind of a big deal in China. I love it.
John Mearsheimer
(02:56:29)
The Chinese are my kind of people. They’re realists, right? They speak my language. It’s the United States that is not very realist. American leaders have a very powerful liberal bent and tend not to see the world in realist terms. I believe, by the way, just going back to our discussion of NATO expansion, that our inability to understand that NATO expansion was anathema to the Russians was due in large part to the fact that during the unipolar moment, we just didn’t think of international politics from a realist perspective and didn’t respect anyone who thought about international politics from a realist perspective. If those various American administrations, starting with the Clinton administration, had put their realist hat on, they would’ve understood that NATO expansion into Ukraine was not a good idea. But we had this thoroughly liberal view of the world that dominated our thinking, and it’s gone away somewhat since we’ve moved into multipolarity, but not completely.

(02:57:34)
And this makes me a little nervous, to pick up on your point. I mean, the United States is thinking about the world in ways that are somewhat different than the Chinese, who are realists par excellence.
Lex Fridman
(02:57:47)
So that’s fascinating. So the Chinese are pragmatic about thinking of the world as a competition of military powers, all the ways in which you described the realist perspective. So that’s a hopeful thing, right? If we can achieve stability and a balance of powers through that military competition.
John Mearsheimer
(02:58:11)
Yeah, I actually think that’s right. I think, if the United States… Just let me talk a little bit about the United States to get at the issue you’re raising. If the United States pursues a smart containment strategy, given what you just said, and I said, about the Chinese, I think we will avoid war. The problem with the Americans is it’s not just the liberalism. It’s the possibility that we will pursue a rollback policy. In other words, during the Cold War, we pursued containment. Whenever anybody talked about American grand strategy towards the Soviet Union, it was containment, containment, containment. We now know from the historical record that the United States was not only pursuing containment, it was pursuing rollback. We were trying to roll back Soviet power. To put it bluntly, we were trying to wreck the Soviet Union, and I would not be surprised moving forward with regard to China if the United States pursues a serious rollback policy and-
Lex Fridman
(02:59:17)
So you’re saying throughout history, United States was always doing that. Always. Where’s that from? Why can’t we respect the power of other nations?
John Mearsheimer
(02:59:26)
Because they may be a threat to us?
Lex Fridman
(02:59:28)
Well, I mean-
John Mearsheimer
(02:59:31)
Look, you don’t respect the power of other nations. You fear the power of other nations.
Lex Fridman
(02:59:37)
Well, fear and respect are next-door neighbors, depending on the neighborhood you’re living in. But I just mean it could be very counterproductive to try, because if you can empathize with their… If you assume they’re rational actors, you trying to roll back would lean into the uncertainty of potential conflict. So you want to avoid the uncertainty of potential conflict. Caution, right?
John Mearsheimer
(03:00:03)
Well, yes and no. Look, your point is you want to empathize. You want to be able to put yourself in the shoes of the other side.
Lex Fridman
(03:00:10)
Yes.
John Mearsheimer
(03:00:10)
I agree 100% with that, right. It’s very important if you’re a first class strategist to be able to do that, but at the same time, there is this competition for power taking place, and what you want to do is maximize how much power you have relative to the other side, and the other side wants to maximize how much power it has relative to you. So you have this competition for power that’s taking place all the time, and that’s taking place at the same time you want to have empathy or you want to be able to put yourself in the shoes of the other side. So those two things kind of go together.
Lex Fridman
(03:00:49)
It just feels less threatening to build up your thing versus try to hurt the other person’s thing, the other group’s thing.
John Mearsheimer
(03:01:00)
But if you build up your own power, you are building up your capability to hurt the other side.
Lex Fridman
(03:01:06)
Right, but I guess you don’t rattle the saber, just work on manufacturing sabers.
John Mearsheimer
(03:01:13)
Well, that I agree with. I think that the United States wants to make sure it has a big stick in East Asia for purposes of containing China and avoiding a war, right? Again, I want to be clear, I’m not advocating that we start World War III, but the point is you want to have a big stick and you want to make sure that you don’t overstep your bounds in terms of using that big stick. This is the danger with rollback: that you get too aggressive and you precipitate a war. And you also just have to be very careful what you say. And to go back to your favorite argument, you want to be able to have empathy, or put yourself in the shoes of the other side, because if you do something, you want to think smartly about how that other side is going to see your action and how they’re going to react, right?
Lex Fridman
(03:02:09)
And mostly focus on the carrots, have a giant stick laying around, but never mention it, just focus on the carrots.
John Mearsheimer
(03:02:17)
Well, occasionally you have to mention the stick.
Lex Fridman
(03:02:19)
Everyone knows the stick is there.
John Mearsheimer
(03:02:21)
There is some truth in that, right?
Lex Fridman
(03:02:23)
I mean, yeah, and words matter a lot. Our current President Biden is meeting with Xi Jinping, and I think the words exchanged there are really important. I have a notion that leaders can stop wars just as much as they can start wars.
John Mearsheimer
(03:02:42)
Well, leaders matter. There’s no question about that, no question. But just on rhetoric, you want to remember that Putin has, on more than one occasion, very subtly rattled the nuclear sword, and it has been very effective because Joe Biden has paid attention, and Joe Biden wants to make sure we don’t end up in a thermonuclear war, and thank goodness he’s thinking that way. So all Putin has to do is mention the possibility of nuclear war. Just to go back to Taiwan, to switch areas of the world. If you’re interested in containing China and you’re interested in deterrence, let’s go back to those various scenarios: we win, the Chinese win, or the Chinese win but they do it at great cost.

(03:03:35)
One could argue that the discussion I laid out before didn’t take into account nuclear weapons, and all President Biden or any of his successors has to do is very subtly rattle or employ the nuclear threat and just sort of remind the Chinese that if you start a war over Taiwan, it could easily escalate into a nuclear war. You want to understand we both have nuclear weapons, and if either one of us is put into a desperate situation, we may turn to those nuclear weapons. And oh, by the way, Xi Jinping, you want to understand that we’re out here in the water, and using nuclear weapons in the water is not the same as using nuclear weapons on land. So we may very well use them. I’m not saying we will, but anyway, a little saber rattling. Right?
Lex Fridman
(03:04:36)
Let me just zoom out on human history. What makes empires collapse, and what makes them last, when you look at human history, in your sense, thinking about the United States, perhaps, as an empire?
John Mearsheimer
(03:04:52)
I don’t view the United States as an empire.
Lex Fridman
(03:04:57)
So to you, an empire is a thing that seeks expansion constantly?
John Mearsheimer
(03:05:03)
Yeah, I think it’s a country that incorporates different regions or areas around the world into sort of a giant sphere of influence without incorporating those territories actually into the state itself. So you had this thing called the British Empire and it controlled areas like India, North America, and Kenya, just to pick a couple instances at different points. Singapore would be another example. Australia would be another example. So these were all entities that were part of the British Empire and the United States has taken a stab at empire after the Spanish American War, for example, with regard to the Philippines and Cuba and Puerto Rico, but we never got serious about it. There’s never been an American empire.

(03:06:13)
This is not to say the United States is not an incredibly powerful country that goes all around the world building military bases and stationing troops here, there and everywhere, but we’re not running an empire the way the British Empire was run or the French Empire. So the question for me is why did those empires go away? Why did the British Empire go away? If you ever look at a map of the world in 1922 after World War I, it’s truly amazing how much of that map is controlled by Britain. They had a huge empire and it’s disappeared.
Lex Fridman
(03:06:53)
Probably by far the biggest empire in terms of area in human history, I think.
John Mearsheimer
(03:06:59)
I think that’s right. It almost has to be.
Lex Fridman
(03:07:01)
Yeah, right. It’s crazy.
John Mearsheimer
(03:07:04)
Crazy, yeah.
Lex Fridman
(03:07:05)
And then no longer is the case.
John Mearsheimer
(03:07:07)
Yeah. Now I want to be clear. The Americans have wielded maybe even greater influence than Britain did when it had its empire, but I don't believe we have an empire that bears any resemblance to the British Empire. So the question is, what happened to that British Empire? What happened to the French Empire? What happened to the Belgian Empire? What happened to the Dutch Empire? These were countries that had colonies all over the planet. The Dutch East Indies, Vietnam was French Indochina. Where did those empires go? Two factors finished them off. Number one, nationalism. Nationalism became a very powerful force in the 19th Century. It began to rear its head in the late 18th Century and became a very powerful force in the 19th and certainly in the 20th.
Lex Fridman
(03:07:57)
Can you explain nationalism here?
John Mearsheimer
(03:07:59)
Nationalism is the idea that these different nations that were part of the empire, like the Kenyans wanted their own state, nation state. This is my point about the Palestinians, right? This is Palestinian nationalism. What is Zionism? Zionism is Jewish nationalism. Jewish nationalism. Think of Theodore Herzl’s famous book. It’s called The Jewish State, Nation State. Think of the word nation state that embodies nationalism. Nation state, Jewish state. Palestinians want their own state, two state solution. Can’t beat the Palestinians into submission. The Indians wanted their own state. The Pakistanis wanted their own state. The Kenyans wanted their own state. Singapore wanted its own state. Oh, the Americans wanted their own state. This is called the American Revolution.

(03:08:51)
So that’s the first reason, nationalism that these empires disappeared. The second reason is that from a cost benefit analysis, they no longer made any sense, and it was the coming of the Industrial Revolution. Once the Industrial Revolution comes, an empire is basically an albatross around your neck. I would argue that the British Empire was an albatross around Britain’s neck in most of the 20th Century. Some of my friends disagree with that and think there were all sorts of benefits from the British Empire, but you want to remember that in the 20th Century, the three countries that really were powerful were the United States, Germany and the Soviet Union. Those were the big three. Did any of them have an empire? No.
Lex Fridman
(03:09:39)
That’s a good argument.
John Mearsheimer
(03:09:40)
In the industrial world, you don’t need an empire, right? What you need is a powerful manufacturing base.
Lex Fridman
(03:09:50)
Well, the cost benefit analysis was different before the Industrial Revolution. There have been many empires.
John Mearsheimer
(03:09:56)
There’s no question that empires came and went, right?
Lex Fridman
(03:10:00)
Yes.
John Mearsheimer
(03:10:02)
All you have to do is just look at the British and the French in the Seven Years War, 1756 to 1763. The British win, they get Canada, and that's why Quebec, Montreal, all these big French speaking areas are now part of Canada. So borders change and countries got established, the United States being one, and remember, South America and Central America were once completely dominated by the Spanish, and in the case of Brazil, the Portuguese, but they all in the 19th Century got their independence, and what I'm saying to you is in the 19th and in the 20th Century, there were two forces that were really driving the train. One is nationalism, and then the other is the Industrial Revolution, which changes the cost benefit analysis.
Lex Fridman
(03:11:01)
Almost too crazy of a question, but let's say 500 years from now, and you, John Mearsheimer, traveled through time and are at a bookstore looking at the entire history of human civilization in a single book. What role does the US play? What's the story of the US over the next hundred, 200, 300 years? Is it a big role, a small role?
John Mearsheimer
(03:11:32)
Well, that’s a long time. If you asked me, let’s just say the next hundred years.
Lex Fridman
(03:11:39)
Yeah, that’s still tough.
John Mearsheimer
(03:11:42)
That’s still tough, but actually I think we’re in excellent shape and here’s the reason. Going back to the beginning of our conversation, you asked me about power and I told you the two principle building blocks of power are population size and wealth, and therefore you want to look around the world and you want to look at what you think the demographics are of countries like Britain, the United States, Iran, China, Russia, pick your country moving forward, what do the demographics look like and how wealthy are those countries likely to be? What you discover very quickly is that almost every country around the world is depopulating over time. Russia’s going to be much smaller, China’s going to be much smaller a hundred years from now than both of those countries are, as best we can tell.

(03:12:49)
United States, American women are not having lots of babies these days. No question about that, but we have immigration. We're an immigrant culture. You're a perfect manifestation of that. You're now an American. That's wonderful. We need more people like you. So when I hear Donald Trump and others arguing that immigration's a terrible thing, this is ridiculous. Immigration is what made us great. It's why my relatives came over in the middle of the 19th Century from Germany and Ireland.
Lex Fridman
(03:13:26)
That’s fascinating because there’s been a huge concern, America and other developed nations are not having enough children, but you just made me realize in the long arc of history, the United States has gotten really good at integrating immigrants and helping them flourish. The whole diversity of that makes up America.
John Mearsheimer
(03:13:51)
You’re absolutely right.
Lex Fridman
(03:13:52)
There’s a machinery of integrating other cultures.
John Mearsheimer
(03:13:56)
Yeah, just very quickly on this-
Lex Fridman
(03:13:57)
That’s fascinating.
John Mearsheimer
(03:13:59)
Sam Huntington’s book, Who Are We? Which in many ways I love that book, but it has one fundamental flaw and a number of people told him beforehand that flaw existed and he didn’t fix it, but Sam argues in the book that we have large numbers of Hispanics in this country and we’re doing a very poor job of integrating them into the mainstream and they’re not becoming Americans, and because many of them are concentrated in the Southwest of the United States, unlike other ethnic groups that were spread out all over God’s little green acre, we’re going to have this cohesive group of Spanish speaking Americans who are going to want to break away, and the United States is no longer going to be a reasonably coherent nation state. He’s wrong. All the evidence is that Hispanics are integrating into the American mainstream more quickly and more effectively than the European immigrant groups that came starting around 1835.

(03:15:12)
If you look at immigration from Europe into the United States, leaving aside the original WASPs who came over and founded the place, the immigrants start coming in large numbers in 1835, and we really don't shut the door until 1924, right? This is a crude overview: starting in 1835 and running up till about 1885, it's mainly Germans and Irish. That's why Germans are the largest ethnic group to ever come to the United States, and the Irish are right behind them. These are the European ethnic groups we're talking about. Then starting in 1885, Poles, Jews and Italians start coming, and the Germans and Irish keep coming, and this is why Ellis Island is opened, I think it's 1893. Ellis Island is opened because Castle Garden in New York, which had handled all the previous immigrants coming across the pond, Castle Garden, couldn't handle them all, so they opened up Ellis Island.

(03:16:11)
That’s why somebody like me, I can’t find my distant relative’s records in Ellis Island because they came through Castle Garden. Whereas lots of Jews I know, lots of Italians, I know they can find their relatives records in Ellis Island because they came through Ellis Island. The point is, you had all these immigrants who came in roughly between 1835 and 1924 when we shut the gates. It was the only time we’ve ever really shut the gates in a meaningful way and this is what made America great, all these people, and they made lots of babies.
Lex Fridman
(03:16:47)
So in some sense, make America great again, means getting more immigrants in.
John Mearsheimer
(03:16:52)
Well, we opened the gates again in ’65, closed them in ’24, opened them in ’65. I’m oversimplifying the story here, because we didn’t completely shut them. We almost completely shut them in ’24, opened in ’65, and we’ve had huge numbers of immigrants flowing in. These immigrants who have been flowing in since ’65 are not Europeans. They’re not mainly Europeans, they’re mainly Hispanics and Asians. If you look at those Hispanics and Asians, they’re integrating into the American mainstream at a much faster and more effective clip than was the case with those immigrants who came in the 19th Century and early 20th Century.

(03:17:36)
The Irish, oh my God, they were treated horribly. There's a very famous book that's been written called How the Irish Became White. Just think about the title of that book. There was discrimination against all these groups, and the worst discrimination, of course, was against Chinese Americans, but we've gotten much better, and what we should do moving forward is redouble our efforts to integrate immigrants into the American mainstream, Hispanics, Asians of all sorts, because the fact is that America is rapidly reaching the point where it's not going to be an all white country.

(03:18:24)
I have five children, and two of my children are, I guess, Generation Z, Gen Z. Gen Z is the last majority white generation; subsequent generations are not majority white. I'm not bothered by that, but for anybody who is bothered by this, they'd better get used to it, because Americans aren't making enough babies that we can continue to grow population-wise in a robust way. So we need immigration, and we're an immigrant culture, and this is a great virtue. It has been a great virtue over time.
Lex Fridman
(03:19:10)
It should be a source of hope, not worry.
John Mearsheimer
(03:19:13)
That’s my view. That’s my view and America when it works, is a place that is very attractive to immigrants and immigrants can do very well here and then the real key moving forward is intermarriage, and you have a huge amount of intermarriage. Somebody was telling me not too long ago that the highest inner marriage rates in the United States are among Asian women, Asian American women, Asian women and Anglos, and I say wonderful and-
Lex Fridman
(03:19:47)
Great.
John Mearsheimer
(03:19:48)
Yeah. No, the more-
Lex Fridman
(03:19:49)
Love is the fastest way to integrate.
John Mearsheimer
(03:19:52)
Yeah. Well, what you want to do is you want to eliminate difference, right? You want to eliminate difference. It's like people who say I'm an antisemite, right? I have two grandsons who Adolf Hitler would've thrown into a gas chamber, one of whose first name is John and middle name is Mearsheimer, right?
Lex Fridman
(03:20:15)
Yeah.
John Mearsheimer
(03:20:16)
This is what you want. Steve Watt’s wife and his two children would’ve been thrown into a gas chamber by Adolf Hitler. This is what you want. You want intermarriage. Now, there are a good number of people in some of those groups, especially among Jews who don’t like intermarriage, but they’ve lost because I haven’t looked recently at the data for intermarriage rates among basically secular Jews, but it used to be around 62% large numbers of Jews marry Guam.
Lex Fridman
(03:20:51)
And they’ve lost because of intermarriage. Intermarriage helps fight tribalism. Destructive kind of tribalism.
John Mearsheimer
(03:20:58)
Exactly.
Lex Fridman
(03:20:58)
It’s nice
John Mearsheimer
(03:20:59)
Calling me an antisemite? They haven't met my grandsons, my sons-in-law, a niece that I have, nephews that I have, brothers-in-law that I have. Jewish. Come on.
Lex Fridman
(03:21:13)
And this gives a really nice hopeful view of America is the integration of different cultures, different kinds of peoples. That is a unique property of America.
John Mearsheimer
(03:21:24)
Yes, but just to go back to where we started, it was not smooth in the beginning.
Lex Fridman
(03:21:29)
All things are rough in the beginning.
John Mearsheimer
(03:21:31)
All things are rough in the beginning.

Life and mortality

Lex Fridman
(03:21:34)
What advice would you give to a young person today about how to have a career they can be proud of or a life they can be proud of?
John Mearsheimer
(03:21:42)
Well, I think it’s very important to make sure that you do something in life that really interests you. My mother used to use this phrase, “Floats your boat.” You want to do something that floats your boat or to use another one of my mother’s phrases, ” You want to get off. You want to do something where you get up out of bed in the morning with a bounce in your step.” So I think that if your mother and father want you to be a lawyer and they’re pushing you to be a lawyer and you don’t want to be a lawyer, you want to be a policeman, be a policeman. Don’t do what other people want you to do because it’s very important to find a job, an occupation that you really love.

(03:22:26)
The second thing I would say, and this has to do with your point about humility: you want to think about the hubris humility index. My friend Steve Van Evera, who teaches at MIT, he and I invented this concept. We call it the hubris humility index, and you want to have a healthy dose of humility, but you also want to have a healthy dose of hubris. You want to think you can change the world. You want to think you can make things better for yourself. You want to take chances. You want to think sometimes that you know better than other people do. Hubris is not a bad thing, but at the same time, you have to have humility. You have to understand that a man or a woman has his or her limits, and you want to listen to other people. You want to be a good listener.

(03:23:19)
So always remember the importance of the hubris humility index and the importance of having healthy doses of both hubris and humility.
Lex Fridman
(03:23:31)
Speaking of humility, you’re mortal, like all humans are, do you ponder your mortality? Are you afraid of it? Are you afraid of death?
John Mearsheimer
(03:23:42)
I’m not sure I’m afraid of death. I don’t want to die because I enjoy life so much.
Lex Fridman
(03:23:50)
Having too much fun?
John Mearsheimer
(03:23:53)
Given how horrible the world is today, I hate to say that I’m having too much fun, but do I find what I do interesting and gratifying? I do. I just love what I do and I love studying international politics, and I love being intellectually curious about all sorts of subjects. I love talking to you about this and that. I mean, this is really wonderful, and I often tell people thank goodness I’m only 28 years old because I do try to behave like I’m only 28 years old, but I am well aware of the fact that as my mother used to say, “Nothing is forever,” and that includes me and when you’re 75 going on 76, you understand that you have a limited number of years left and I find that depressing because I’ve been very lucky and I feel like I’ve won the lottery. I’m very thankful for that. I’d like to make it last for as long as possible, but I do understand that nothing is forever.
Lex Fridman
(03:25:06)
Yeah, the finiteness of things.
John Mearsheimer
(03:25:09)
Yeah. You never think that when you’re young. I mean, you think you’re going to live forever and you’re just not going to get old. I never thought this would happen that I would become 75 years old.
Lex Fridman
(03:25:22)
Well, you got so much energy and boldness and fearlessness and excitement to you that I’m really grateful to see that, especially given how much I’m sure you’ve been attacked for having bold ideas and presenting them and not losing that youthful energy is beautiful to see.
John Mearsheimer
(03:25:46)
Thank you.
Lex Fridman
(03:25:47)
Not becoming cynical. John, it’s a huge honor to speak with you that you’ve given me so much time and so much respect and so much love. This was a really incredible conversation. Thank you so much for everything you do in the world, for looking out into the world and trying to understand it and teach us, and thank you so much for talking with a silly kid like me.
John Mearsheimer
(03:26:07)
It was my pleasure. Thank you very much. I thoroughly enjoyed it.
Lex Fridman
(03:26:11)
Awesome. Thanks for listening to this conversation with John Mearsheimer. To support this podcast, please check out our sponsors in the description. And now let me leave you with some words from Plato. “Only the dead have seen the end of war.” Thank you for listening and hope to see you next time.

Transcript for Elon Musk: War, AI, Aliens, Politics, Physics, Video Games, and Humanity | Lex Fridman Podcast #400

This is a transcript of Lex Fridman Podcast #400 with Elon Musk.
The timestamps in the transcript are clickable links that take you directly to that point in
the main video. Please note that the transcript is human generated, and may have errors.
Here are some useful links:

Table of Contents

Here are the loose “chapters” in the conversation.
Click link to jump approximately to that part in the transcript:

Introduction

War and human nature

Lex Fridman
(00:00:00)
The following is a conversation with Elon Musk, his fourth time on this, the Lex Fridman Podcast. I thought you were going to finish it. It’s one of the greatest themes in all of film history.
Elon Musk
(00:00:31)
Yeah, that’s great.
Lex Fridman
(00:00:33)
So I was just thinking about the Roman Empire, as one does.
Elon Musk
(00:00:38)
Is that whole meme where all guys are thinking about the Roman Empire at least once a day?
Lex Fridman
(00:00:44)
And half the population is confused whether it’s true or not. But more seriously, thinking about the wars going on in the world today, and as you know, war and military conquest has been a big part of Roman society and culture, and I think has been a big part of most empires and dynasties throughout human history.
Elon Musk
(00:01:06)
Yeah, they usually came as a result of conquest. I mean, there’s some like the Hapsburg Empire where there was just a lot of clever marriages.
Lex Fridman
(00:01:16)
But fundamentally there’s an engine of conquest and they celebrate excellence in warfare, many of the leaders were excellent generals, that kind of thing. So a big picture question, Grok approved, I asked if this is a good question to ask.
Elon Musk
(00:01:33)
Tested, Grok approved. Yeah.
Lex Fridman
(00:01:36)
At least on fun mode. To what degree do you think war is part of human nature versus a consequence of how human societies are structured? I ask this as you have somehow controversially been a proponent of peace.
Elon Musk
(00:01:57)
I’m generally a proponent of peace. I mean, ignorance is perhaps, in my view, the real enemy to be countered. That’s the real hard part, not fighting other humans, but all creatures fight. I mean, the jungle is… People think of nature as perhaps some sort of peaceful thing, but in fact it is not. There’s some quite funny Werner Herzog thing where he is in the jungle saying that it’s basically just murder and death in every direction. The plants and animals in the jungle are constantly trying to kill each other every single day, every minute. So it’s not like we’re unusual in that respect.
Lex Fridman
(00:02:40)
Well, there’s a relevant question here, whether with greater intelligence comes greater control over these base instincts for violence.
Elon Musk
(00:02:49)
Yes. We have much more ability to control our limbic instinct for violence than, say, a chimpanzee. And in fact, if one looks at, say, chimpanzee society, it is not friendly. I mean, the bonobos are an exception, but chimpanzee society is filled with violence, and it's quite horrific, frankly. That's our limbic system in action. You don't want to be on the wrong side of a chimpanzee; it'll eat your face off and tear your nuts off.
Lex Fridman
(00:03:22)
Yeah. Basically there’s no limits or ethics or they almost had just war. There’s no just war in the chimpanzee societies. Is war and dominance by any means necessary?
Elon Musk
(00:03:33)
Yeah. Chimpanzee society is a primitive version of human society. They're not peace loving, basically at all. There's extreme violence, and then once in a while somebody who's watched too many Disney movies decides to raise a chimpanzee as a pet, and then it eats their face or tears their nuts off or chews their fingers off, that kind of thing. It's happened several times.
Lex Fridman
(00:03:58)
Ripping your nuts off is an interesting strategy for interaction.
Elon Musk
(00:04:02)
It’s happened to people. It’s unfortunate. That’s, I guess, one way to ensure that the other chimp doesn’t contribute to the gene pool.
Lex Fridman
(00:04:10)
Well, from a martial arts perspective, it's a fascinating strategy.
Elon Musk
(00:04:15)
The nut ripper.
Lex Fridman
(00:04:18)
I wonder which of the martial arts teaches that one.
Elon Musk
(00:04:21)
I think it’s safe to say if somebody’s got your nuts in their hands and as the option of roughing them off, you’ll be amenable to whatever they want.

Israel-Hamas war

Lex Fridman
(00:04:30)
Yeah. Safe to say. So, like I said, somehow controversially, you’ve been a proponent of peace on Twitter on X.
Elon Musk
(00:04:38)
Yeah.
Lex Fridman
(00:04:39)
So let me ask you about the wars going on today and to see what the path to peace could be. How do you hope the current war in Israel and Gaza comes to an end? What path do you see that can minimize human suffering in the longterm in that part of the world?
Elon Musk
(00:04:54)
Well, I think that part of the world is definitely, if you look up "there is no easy answer" in the dictionary, it'll be a picture of the Middle East, and Israel especially. So there is no easy answer. This is strictly my opinion: the goal of Hamas was to provoke an overreaction from Israel. They obviously did not expect to have a military victory, but they really wanted to commit the worst atrocities that they could in order to provoke the most aggressive response possible from Israel, and then leverage that aggressive response to rally Muslims worldwide for the cause of Gaza and Palestine, which they have succeeded in doing. So the counterintuitive thing here, the thing that I think should be done, even though it's very difficult, is that I would recommend that Israel engage in the most conspicuous acts of kindness possible. Everything. That is the actual thing that would thwart the goal of Hamas.
Lex Fridman
(00:06:19)
So in some sense, to the degree that makes sense in geopolitics, turn the other cheek, implemented.
Elon Musk
(00:06:26)
It’s not exactly turn the other cheek because I do think that it is appropriate for Israel to find the Hamas members and either kill them or incarcerate them. That’s something has to be done because they’re just going to keep coming otherwise. But in addition to that, they need to do whatever they can. There’s some talk of establishing, for example, a mobile hospital. I’d recommend doing that. Just making sure that there’s food, water, medical necessities and just be over the top about it and be very transparent. So [inaudible 00:07:22] can claim it’s a trick. Just put webcam on the thing or 24, 7.
Lex Fridman
(00:07:29)
Deploy acts of kindness.
Elon Musk
(00:07:31)
Yeah, conspicuous acts of kindness that are unequivocal, meaning they can't be spun somehow, because Hamas's response will be, "Oh, it's a trick." Therefore, you have to counter how it's not a trick.
Lex Fridman
(00:07:47)
This ultimately fights the broader force of hatred in the region.
Elon Musk
(00:07:51)
Yes. And I’m not sure who said it, it’s an [inaudible 00:07:54] saying, but an eye for an eye makes everyone blind. Now, that neck of the woods, they really believe in the whole eye for an eye thing. But you really have… If you’re not going to just outright commit genocide against an entire people, which obviously would not be acceptable to really, shouldn’t be acceptable to anyone, then you’re going to leave basically a lot of people alive who subsequently hate Israel. So really the question is like for every Hamas member that you kill, how many did you create? And if you create more than you killed, you’ve not succeeded. That’s the real situation there. And it’s safe to say that if you kill somebody’s child in Gaza, you’ve made at least a few homeless members who will die just to kill an Israeli. That’s the situation. But I mean, this is one of the most contentious subjects one could possibly discuss. But I think if the goal ultimately is some sort of long-term piece, one has to look at this from the standpoint of over time, are there more or fewer terrorists being created?
Lex Fridman
(00:09:26)
Let me just linger on war.
Elon Musk
(00:09:29)
Yeah. War, safe to say, has always existed and always will exist.
Lex Fridman
(00:09:33)
Always will exist.
Elon Musk
(00:09:34)
Always has existed and always will exist.
Lex Fridman
(00:09:37)
I hope not. You think it’ll always-
Elon Musk
(00:09:42)
There will always be war. There’s a question of just how much war and there’s sort of the scope and scale of war. But to imagine that there would not be any war in the future, I think would be a very unlikely outcome.
Lex Fridman
(00:09:55)
Yeah. You talked about the Culture series. There’s war even there.
Elon Musk
(00:09:58)
Yes. It’s a giant war. The first book starts off with a gigantic galactic war where trillions die trillions.
Lex Fridman
(00:10:07)
But it still nevertheless protects these pockets of flourishing. Somehow you can have galactic war and still have pockets of flourishing.
Elon Musk
(00:10:18)
Yeah, I guess if we are able to one day expand to fill the galaxy or whatever, there will be a galactic war at some point.
Lex Fridman
(00:10:31)
I mean, the scale of war has been increasing, increasing, increasing. It’s like a race between the scale of suffering and the scale of flourishing.
Elon Musk
(00:10:38)
Yes.

Military-Industrial Complex

Lex Fridman
(00:10:41)
A lot of people seem to be using this tragedy to beat the drums of war and feed the military industrial complex. Do you worry about this, the people who are rooting for escalation and how can it be stopped?
Elon Musk
(00:10:56)
One of the things that does concern me is that there are very few people alive today who actually viscerally understand the horrors of war, at least in the US. I mean, obviously there are people on the front lines in Ukraine and Russia who understand just how terrible war is, but how many people in the West understand it? My grandfather was in World War II. He was severely traumatized. He was there I think for almost six years in Eastern North Africa and Italy. All his friends were killed in front of him, and he would've died too, except they randomly gave some, I guess, IQ test or something, and he scored very high. He was not an officer. He was I think a corporal or a sergeant or something like that, because he didn't finish high school. He had to drop out of high school because his dad died and he had to work to support his siblings. So because he didn't graduate high school, he was not eligible for the officer corps.

(00:11:57)
So he kind of got put into the cannon fodder category, basically. But then randomly they gave him this test and he was transferred to British intelligence in London. That's where he met my grandmother. But he had PTSD, next level, next level. I mean, he just didn't talk, just didn't talk. And if you tried talking to him, he'd just tell you to shut up. And he won a bunch of medals, never bragged about it once, not even hinted, nothing. I found out about it because his military records were online. That's how I know. So he would say, "No way in hell do you want to do that again." But how many people… Obviously, he died 20 years ago or longer, actually 30 years ago. How many people are alive that remember World War II? Not many.
Lex Fridman
(00:12:54)
And the same perhaps applies to the threat of nuclear war.
Elon Musk
(00:13:01)
Yeah, I mean, there are enough nuclear bombs pointed at the United States to make the radioactive rubble bounce many times.
Lex Fridman
(00:13:10)
There’s two major wars going on right now. So you talked about the threat of AGI quite a bit, but now as we sit here with the intensity of conflict going on, do you worry about nuclear war?
Elon Musk
(00:13:25)
I think we shouldn’t discount the possibility of nuclear war. It is a civilizational threat. Right now, I could be wrong, but I think the current probability of nuclear war is quite low. But there are a lot of nukes pointed at us, and we have a lot of nukes pointed at other people. They’re still there. Nobody’s put their guns away. The missiles are still in the silos.
Lex Fridman
(00:13:57)
And the leaders don’t seem to be the ones with the nukes talking to each other.
Elon Musk
(00:14:03)
No, there are wars which are tragic and difficult on a local basis. And then there are wars which are civilization ending, or have that potential. Obviously, global thermonuclear warfare has high potential to end civilization, perhaps permanently, but certainly to severely wound and perhaps set back human progress to the Stone Age or something. I don't know. Pretty bad. Probably scientists and engineers won't be super popular after that as well. You got us into this mess. So generally, I think we obviously want to prioritize civilizational risks over things that are painful and tragic on a local level, but not civilizational.

War in Ukraine

Lex Fridman
(00:15:00)
How do you hope the war in Ukraine comes to an end? And what’s the path, once again to minimizing human suffering there?
Elon Musk
(00:15:08)
Well, I think what is likely to happen, which is really pretty much the way it is, is that something very close to the current lines will be how a ceasefire or truce happens. But you just have a situation right now where whoever goes on the offensive will suffer casualties at several times the rate of whoever’s on the defense, because you’ve got defense in depth, you’ve got minefields, trenches, anti-tank defenses. Nobody has air superiority, because the anti-aircraft missiles are really far better than the aircraft, and there are far more of them. And so neither side has air superiority. Tanks are basically death traps, just slow moving, and they’re not immune to anti-tank weapons. So you really just have long-range artillery and infantry. It’s World War I all over again, with drones, throw in some drones there.
Lex Fridman
(00:16:25)
Which makes the long range artillery just that much more accurate and better, and so more efficient at murdering people on both sides.
Elon Musk
(00:16:34)
So whoever is… You don’t want to be trying to advance from either side because the probability of dying is incredibly high. So in order to overcome defense in depth, trenches and minefields, you really need a significant local superiority in numbers. Ideally combined arms, where you do a fast attack with aircraft, a concentrated number of tanks, and a lot of people. That’s the only way you’re going to punch through a line, and then you’ve got to punch through and then not have reinforcements just kick you right out again. I mean, I really recommend people read about World War I warfare in detail. That’s rough. I mean, the sheer number of people that died there was mind-boggling.
Lex Fridman
(00:17:37)
And it’s almost impossible to imagine the end of it that doesn’t look like almost exactly like the beginning in terms of what land belongs to who and so on. But on the other side of a lot of human suffering, death and destruction of infrastructure.
Elon Musk
(00:17:56)
Yes. The thing that… The reason I proposed some sort of truce or peace a year ago was because I predicted pretty much exactly what would happen, which is a lot of people dying for basically almost no changes in land, and the loss of the flower of Ukrainian and Russian youth. And we should have some sympathy for the Russian boys as well as the Ukrainian boys, because the Russian boys didn’t ask to be on the front line. They have to be. So there’s a lot of sons not coming back to their parents, and I think most of them don’t hate the other side. There’s a saying that comes from World War I: it’s like young boys who don’t know each other killing each other on behalf of old men that do know each other. What the hell’s the point of that?
Lex Fridman
(00:19:02)
So Volodymyr Zelenskyy has said in the past that he’s not interested in talking to Putin directly. Do you think he should sit down, man to man, leader to leader, and negotiate peace?
Elon Musk
(00:19:14)
Look, I think I would just recommend do not send the flower of Ukrainian youth to die in trenches, whether he talks to Putin or not, just don’t do that. Whoever goes on the offensive will lose massive numbers of people and history will not look kindly upon them.

China

Lex Fridman
(00:19:42)
You’ve spoken honestly about the possibility of war between the US and China in the long term if no diplomatic solution is found, for example, on the question of Taiwan and the One China policy. How do we avoid the trajectory where these two superpowers clash?
Elon Musk
(00:19:58)
Well, it’s worth reading that book on the, difficult to pronounce, Thucydides Trap, I believe it’s called. I love war history; I know it inside out and backwards. There’s hardly a battle I haven’t read about, trying to figure out what really was the cause of victory in any particular case, as opposed to what one side or another claimed was the reason.
Lex Fridman
(00:20:21)
Both the victory and what sparked the war and-
Elon Musk
(00:20:24)
Yeah, yeah.
Lex Fridman
(00:20:25)
The whole thing.
Elon Musk
(00:20:26)
Yeah. So that Athens and Sparta is a classic case. The thing about the Greeks is they really wrote down a lot of stuff. They loved writing. There are lots of interesting things that happened in many parts of the world, but people didn’t write them down, so we don’t know what happened, or they didn’t really write in detail. They just would say, “We had a battle and we won.” And what? Can you add a bit more? The Greeks, they really wrote a lot. They were very articulate on… They just loved writing. And we have a bunch of that writing preserved. So we know what led up to the Peloponnesian War between the Spartan and Athenian alliances, and we know that they saw it coming.

(00:21:16)
Spartans didn’t write… They also weren’t very verbose by their nature, but they did write, but they weren’t very verbose. They were [inaudible 00:21:23]. But the Athenians and the other Greeks wrote a lot, and Sparta was really kind of like the leader of Greece. But Athens grew stronger and stronger with each passing year. And everyone’s like, “Well, it’s inevitable that there’s going to be a clash between Athens and Sparta. How do we avoid that?” And actually they saw it coming and they still could not avoid it. So at some point, if one group, one civilization or country or whatever, exceeds another, sort of like the United States has been the biggest kid on the block since I think around 1890 from an economic standpoint.

(00:22:14)
So the United States has been the most powerful economic engine in the world longer than anyone’s been alive. And the foundation of war is economics. So now we have a situation in the case of China where the economy is likely to be two, perhaps three times larger than that of the US. So imagine you’re the biggest kid on the block for as long as anyone can remember, and suddenly a kid comes along who’s twice your size.
Lex Fridman
(00:22:55)
So we see it coming, how is it possible to stop? Let me throw something out there: just intermixing of cultures, understanding each other. There does seem to be a giant cultural gap in understanding of each other. And you’re an interesting case study because you are an American, obviously you’ve done a lot of incredible manufacturing here in the United States, but you also work with China.
Elon Musk
(00:23:20)
I’ve spent a lot of time in China and met with the leadership many times.
Lex Fridman
(00:23:22)
Maybe a good question to ask is, what are some things about China that people don’t understand, positive, just in the culture? What are some interesting things that you’ve learned about the Chinese?
Elon Musk
(00:23:36)
Well, the sheer number of really smart, hardworking people in China is incredible. If you say, how many smart, hardworking people are there in China? There are far more of them there than there are here, I think, in my opinion. And they’ve got a lot of energy. So I mean, the architecture in China that’s gone up in recent years is far more impressive than the US. I mean the train stations, the buildings, the high-speed rail, everything, it’s really far more impressive than what we have in the US. I mean, I recommend somebody just go to Shanghai and Beijing, look at the buildings, and take the train from Beijing to Xi’an, where you have the terracotta warriors. China’s got an incredible history, very long history, and I think arguably in terms of the use of language from a written standpoint, one of the oldest, perhaps the oldest written language, and in China, people did write things down.

(00:24:50)
So now, China historically has always been, with rare exception, internally focused. They have not been acquisitive. They’ve fought each other. There’ve been many, many civil wars. In the Three Kingdoms war, I believe they lost about 70% of their population. So they’ve had brutal internal wars, civil wars that make the US Civil War look small by comparison. So I think it’s important to appreciate that China is not monolithic. We sort of think of China as one entity of one mind. And this is definitely not the case. From what I’ve seen, and I think most people who understand China would agree, people in China think about China 10 times more than they think about anything outside of China. So it’s like 90% of their consideration is internal.
Lex Fridman
(00:26:01)
Well, isn’t that a really positive thing, when you’re talking about the collaboration and the future peace between superpowers, when you’re inward facing, which is focusing on improving yourself versus focusing on quote, unquote improving others through military might?
Elon Musk
(00:26:18)
The good news is the history of China suggests that China is not acquisitive, meaning they’re not going to go out and invade a whole bunch of countries. Now they do feel very strongly… So that’s good. I mean, because a lot of very powerful countries have been acquisitive. The US is also one of the rare cases that has not been acquisitive. After World War II, the US could have basically taken over the world. Any country: we’ve got nukes, nobody else has got nukes. We don’t even have to lose soldiers. Which country do you want? And the United States could have taken over everything, and it didn’t. And the United States actually helped rebuild countries. It helped rebuild Europe, helped rebuild Japan. This is very unusual behavior, almost unprecedented.

(00:27:10)
The US did conspicuous acts of kindness like the Berlin Airlift. And I think it’s always like, well, America’s done bad things. Well, of course America’s done bad things, but one needs to look at the whole track record. And just generally, one sort of test would be, how do you treat your prisoners of war? Or let’s say, no offense to the Russians, but let’s say you’re in Germany, it’s 1945, you’ve got the Russian Army coming from one side and you’ve got the French, British and American armies coming from the other side. Who would you like to surrender to? No country is [inaudible 00:27:58] perfect, but I recommend being a POW with the Americans. That would be my choice very strongly.
Lex Fridman
(00:28:07)
On the full menu of POW options, the US.
Elon Musk
(00:28:08)
Very much so. And in fact, Wernher von Braun, a smart guy, was like, “We’ve got to be captured by the Americans.” And in fact, the SS was under orders to execute von Braun and all of the German rocket engineers, and they narrowly escaped. They said they were going out for a walk in the woods. They left in the middle of winter with no coats, no food, no water, and just ran like hell, ran west. And I think von Braun’s brother found a bicycle or something and then just cycled west as fast as he could until he found a US patrol. So anyway, that’s one way you can tell morality: where do you want to be a POW? It’s not fun anywhere, but some places are much worse than others. Anyway, so America has been, while far from perfect, generally a benevolent force, and we should always be self-critical and try to be better, but anyone with half a brain knows that.

(00:29:31)
So I think there are… In this way, China and the United States are similar. Neither country has been acquisitive in a significant way. So that’s a shared principle, I guess. Now, China does feel very strongly about Taiwan. They’ve been very clear about that for a long time. From their standpoint, it would be like one of the states not being there, like Hawaii or something like that, but more significant than Hawaii. And Hawaii is pretty significant for us. So they view it as really a fundamental part of China, the island of Formosa, not Taiwan, that is not part of China but should be. And the only reason it hasn’t been is because of the US Pacific Fleet.
Lex Fridman
(00:30:32)
And as their economic power grows and as their military power grows, the thing that they’re clearly saying is their interest will clearly be materialized.
Elon Musk
(00:30:46)
Yes, China has been very clear that they’ll incorporate Taiwan peacefully or militarily, but that they will incorporate it, from their standpoint, is 100% likely.
Lex Fridman
(00:31:04)
Something you said about conspicuous acts of kindness as a geopolitical policy, it almost seems naive, but I’d venture to say that this is probably the path forward, how you avoid most wars. Just as you say, it sounds naive, but it’s kind of brilliant. If you believe in the goodness underlying most of human nature, it just seems like conspicuous acts of kindness can reverberate through the populace of the countries involved and de-escalate.
Elon Musk
(00:31:44)
Absolutely. So after World War I, they made a big mistake. They basically tried to lump all of the blame on Germany and saddle Germany with impossible reparations. And really, there was quite a bit of blame to go around for World War I, but they tried to put it all on Germany, and that laid the seeds for World War II. So a lot of people, not just Hitler, a lot of people felt wronged, and they wanted vengeance, and they got it.
Lex Fridman
(00:32:38)
People don’t forget.
Elon Musk
(00:32:41)
Yeah, you kill somebody’s father, mother, son, daughter, they’re not going to forget it. They’ll want vengeance. So after World War II, they’re like, “Well, the Treaty of Versailles was a huge mistake in World War I. And so this time, instead of crushing the losers, we’re actually going to help them with the Marshall Plan, and we’re going to help rebuild Germany. We’re going to help rebuild Austria and Italy and whatnot.” So that was the right move.
Lex Fridman
(00:33:26)
It does feel like there’s a profound truth to the conspicuous acts of kindness being an antidote to this.
Elon Musk
(00:33:37)
Something must stop the cycle of reciprocal violence. Something must stop it, or it’ll never stop. Just eye for an eye, tooth for a tooth, limb for a limb, life for a life forever and ever.

xAI Grok

Lex Fridman
(00:33:57)
To escape briefly the darkness with some incredible engineering work: xAI just released Grok, an AI assistant that I’ve gotten a chance to play with. It’s amazing on many levels. First of all, it’s amazing that a relatively small team in a relatively short amount of time was able to develop this close-to-state-of-the-art system. Another incredible thing is there’s a regular mode and there’s a fun mode.
Elon Musk
(00:34:23)
Yeah, I guess I’m to blame for that one.
Lex Fridman
(00:34:27)
First of all, I wish everything in life had a fun mode.
Elon Musk
(00:34:29)
Yeah.
Lex Fridman
(00:34:30)
There’s something compelling beyond just fun about the fun mode interacting with a large language model. I’m not sure exactly what it is because I’ve only had a little bit of time to play with it, but it just makes it more interesting, more vibrant to interact with the system.
Elon Musk
(00:34:47)
Yeah, absolutely. Our AI, Grok, is modeled after The Hitchhiker’s Guide to the Galaxy, which is one of my favorite books, which is a book on philosophy. It’s-
Elon Musk
(00:35:00)
My favorite books, it’s a book on philosophy disguised as a book on humor. And I would say that forms the basis of my philosophy, which is that we don’t know the meaning of life, but the more we can expand the scope and scale of consciousness, digital and biological, the more we’re able to understand what questions to ask about the answer that is the universe. So I have a philosophy of curiosity.
Lex Fridman
(00:35:34)
There is generally a feeling like this AI system has an outward look, like the way you are, sitting with a good friend looking up at the stars, asking pothead-like questions about the universe, wondering what it’s all about. The curiosity you talk about. No matter how mundane the question I ask it, there’s a sense of cosmic grandeur to the whole thing.
Elon Musk
(00:35:59)
Well, we are actually working hard to have engineering, math, physics answers that you can count on. For the other AIs out there, these so-called large language models, I’ve not found the engineering to be reliable. It unfortunately hallucinates most when you least want it to hallucinate. When you’re asking important, difficult questions, that’s when it tends to be confidently wrong. So we’re really trying hard to say, okay, how do we be as grounded as possible, so you can count on the results, trace things back to physics first principles, mathematical logic? So underlying the humor is an aspiration to adhere to the truth of the universe as closely as possible.
Lex Fridman
(00:37:01)
That’s really tricky.
Elon Musk
(00:37:02)
It is tricky. So that’s why there’s always going to be some amount of error. But we do want to aspire to be as truthful as possible about the answers, with acknowledged error. You don’t want to be confidently wrong. You’re not going to be right every time, but you want to minimize how often you’re confidently wrong. And then like I said, once you can count on the logic as not violating physics, then you can start to build on that to create inventions, like invent new technologies. But if you cannot count on the foundational physics being correct, obviously the inventions are simply wishful thinking. Imagination land. Magic, basically.
Lex Fridman
(00:38:01)
Well, as you said, I think one of the big goals of xAI is to understand the universe.
Elon Musk
(00:38:06)
Yes, that’s our simple three-word mission.
Lex Fridman
(00:38:13)
If you look out far into the future, do you think on this level of physics, the very edge of what we understand about physics, do you think it will make the sexiest discovery of them all, as we know now: unifying general relativity and quantum mechanics? So coming up with a theory of everything. Do you think it could push towards that direction, almost like theoretical physics discoveries?
Elon Musk
(00:38:38)
If an AI cannot figure out new physics, it’s clearly not equal to humans, nor has it surpassed humans, because humans have figured out new physics. Physics is just deepening one’s insight into how reality works. And then there’s engineering, which is inventing things that have never existed. Now the range of possibilities for engineering is far greater than for physics, because once you figure out the rules of the universe, that’s it. You’ve discovered things that already existed. But from that you can then build technologies that are really almost limitless in their variety. And once you understand the rules of the game properly, and with current physics we do, at least at a local level, understand how physics works very well, our ability to predict things is incredibly good. The degree to which quantum mechanics can predict outcomes is incredible. That was my hardest class in college, by the way. My senior quantum mechanics class was harder than all of my other classes put together.
Lex Fridman
(00:39:50)
To get an AI system, a large language model, to be as reliable as quantum mechanics and physics is very difficult.
Elon Musk
(00:40:01)
Yeah. You have to test any conclusions against the ground truth of reality. Reality is the ultimate judge. Like physics is the law, everything else is a recommendation. I’ve seen plenty of people break the laws made by man, but none break the laws made by physics.
Lex Fridman
(00:40:15)
It’s a good test actually. If this LLM understands and matches physics, then you can more reliably trust whatever it thinks about the current state of politics in some sense.
Elon Musk
(00:40:28)
It’s also the case currently that its internal logic is not consistent. Especially with the approach of just predicting a token: predict token, predict token, predict token. It’s like a vector sum. You’re summing up a bunch of vectors, but you can get drift. A little bit of error adds up, and by the time you are many tokens down the path, it doesn’t make any sense.
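The drift described here, small per-token errors compounding over a long generation, can be sketched numerically. This is an illustrative toy model of mine, not anything from xAI: treat each generated token as adding a small independent Gaussian error, and watch the accumulated error grow with sequence length, roughly as the square root of the number of steps.

```python
import random

random.seed(0)  # deterministic for reproducibility

def drift_magnitude(n_steps: int, per_step_error: float, n_trials: int = 400) -> float:
    """Average absolute displacement after summing n_steps small random errors.

    Each step adds an independent error drawn from N(0, per_step_error**2);
    the cumulative sum models how tiny per-token errors can compound over a
    long generation that has no global revision pass.
    """
    total = 0.0
    for _ in range(n_trials):
        position = 0.0
        for _ in range(n_steps):
            position += random.gauss(0.0, per_step_error)
        total += abs(position)
    return total / n_trials

# Drift grows roughly like sqrt(n_steps): 100x more steps gives only
# about 10x more drift, but long sequences still wander far from the start.
short_run = drift_magnitude(100, 0.01)
long_run = drift_magnitude(10_000, 0.01)
print(f"mean drift after    100 steps: {short_run:.3f}")
print(f"mean drift after 10,000 steps: {long_run:.3f}")
```

This is the same intuition behind the fix discussed next in the conversation: a second, whole-output revision pass, rather than only better per-token accuracy.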
Lex Fridman
(00:40:59)
So it has to be somehow self-aware about the drift.
Elon Musk
(00:41:02)
It has to be self-aware about the drift, and then look at the thing as a gestalt, as a whole, and ask whether it has coherence as a whole. When authors write books, they will write the book and then they’ll go and revise it, taking into account the end and the beginning and the middle, and rewrite it to achieve coherence so that it doesn’t end up in a nonsensical place.
Lex Fridman
(00:41:33)
Maybe the process of revising is what reasoning is, and then the process of revising is how you get closer and closer to truth. At least I approach it that way. You just say a bunch of bullshit first and then you make it better. You start with bullshit and then you-
Elon Musk
(00:41:51)
Create a draft and then you iterate on that draft until it has coherence, until it all adds up basically.
Lex Fridman
(00:41:59)
Another question about a theory of everything, but for intelligence: as you’re exploring this with xAI, creating this intelligence system, do you think there is a theory of intelligence, where you get to understand what is the I in AGI and what is the I in human intelligence?
Elon Musk
(00:42:22)
There’s no I in Team America. Wait, there is.
Lex Fridman
(00:42:24)
It’s going to be stuck in my head now. Yeah, there’s no “me” in quantum mechanics, wait. I mean, is that part of the process of discovering, understanding the universe, understanding intelligence?
Elon Musk
(00:42:50)
Yeah. I think we need to understand intelligence, understand consciousness. I mean there are some fundamental questions of what is thought, what is emotion? Is it really just one atom bumping into another atom? It feels like something more than that. So I think we’re probably missing some really big things.
Lex Fridman
(00:43:18)
Something that’ll be obvious in retrospect. You put together the whole consciousness and emotion.
Elon Musk
(00:43:26)
Well, some people would say, like a soul. In religion, it’d be a soul. You feel like you’re you. I mean, you don’t feel like you’re just a collection of atoms, but on what dimension does thought exist? On what dimension do emotions exist? Because we feel them very strongly. I suspect there’s more to it than atoms bumping into atoms.
Lex Fridman
(00:43:52)
And maybe AI can pave the path to the discovery of whatever the hell that thing is.
Elon Musk
(00:43:58)
Yeah. What is consciousness? When you put the atoms in a particular shape, why are they able to form thoughts and take actions and have feelings?
Lex Fridman
(00:44:10)
And even if it is an illusion, why is this illusion so compelling?
Elon Musk
(00:44:13)
Yeah. Why does the illusion exist? On what plane does the illusion exist? And sometimes I wonder, is it perhaps that everything’s conscious or nothing’s conscious? One of the two.
Lex Fridman
(00:44:33)
I like the former. Everything’s conscious just seems more fun.
Elon Musk
(00:44:37)
It does seem more fun, yes. But we’re composed of atoms and those atoms are composed of quarks and leptons and those quarks and leptons have been around since the beginning of the universe.
Lex Fridman
(00:44:50)
“The beginning of the universe.”
Elon Musk
(00:44:53)
What seems to be the beginning of the universe.

Aliens

Lex Fridman
(00:44:55)
The first time we talked, you said, and it’s surreal to think that that discussion is becoming a reality, I asked you what question you would ask an AGI system once you create it, and you said, “What’s outside the simulation?” is the question. Good question. But it seems like with Grok, literally, the system’s goal is to be able to answer such questions and to ask such questions.
Elon Musk
(00:45:24)
Where are the aliens?
Lex Fridman
(00:45:25)
Where are the aliens?
Elon Musk
(00:45:26)
That’s one of the Fermi paradox questions. A lot of people have asked me if I’ve seen any evidence of aliens, and I haven’t, which is kind of concerning. I think I’d probably prefer to at least have seen some archeological evidence of aliens. To the best of my knowledge, I’m not aware of any evidence of aliens. If they’re out there, they’re very subtle. We might just be the only consciousness, at least in the galaxy. And if you look at, say, the history of Earth, to believe the archeological record, Earth is about four and a half billion years old. Civilization, as measured from the first writing, is only about 5,000 years old. We have to give some credit there to the ancient Sumerians, who aren’t around anymore. I think it was archaic pre-cuneiform that was the first actual symbolic representation, but only about 5,000 years ago. I think that’s a good date for when we say civilization started. That’s one millionth of Earth’s existence.

(00:46:35)
So civilization has been around… It’s really a flash in the pan so far. And why did it take so long? Four and a half billion years. For the vast majority of the time, there was no life. And then there was archaic bacteria for a very long time. And then you had mitochondria get captured, multicellular life, differentiation into plants and animals, life moving from the oceans to land, mammals, higher brain functions. And the sun is expanding slowly, but it’ll heat the Earth up at some point in the future, boil the oceans, and Earth will become like Venus, where life as we know it is impossible. So if we do not become multiplanetary and ultimately go beyond our solar system, annihilation of all life on Earth is a certainty. A certainty. And it could be as little as, on the galactic timescale, half a billion years, a long time by human standards, but that’s only 10% longer than Earth has been around at all. So if life had taken 10% longer to evolve on Earth, it wouldn’t exist at all.
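As a quick sanity check of the two figures above, the "one millionth" ratio and the "only 10% longer" deadline, here is the back-of-envelope arithmetic using the same round numbers given in the conversation:

```python
# Back-of-envelope check of the figures quoted in the conversation.
earth_age_years = 4.5e9      # Earth's approximate age
civilization_years = 5_000   # roughly since the first writing
sun_deadline_years = 0.5e9   # "as little as half a billion years"

fraction = civilization_years / earth_age_years
deadline_ratio = sun_deadline_years / earth_age_years

print(f"civilization as a fraction of Earth's history: {fraction:.2e}")  # ~1.11e-06
print(f"deadline relative to Earth's age: {deadline_ratio:.0%}")         # ~11%
```

So 5,000 years is indeed roughly one millionth of 4.5 billion, and half a billion years is roughly 10% of Earth's age, matching both claims.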
Lex Fridman
(00:48:27)
There’s a deadline coming up, you better hurry. But that said, as you said, humans, intelligent life on Earth, developed a lot of cool stuff very quickly. So it seems like becoming multiplanetary is almost inevitable. Unless we destroy-
Elon Musk
(00:48:45)
We need to do it. I suspect that if we are able to go out there and explore other star systems that we… There’s a good chance we find a whole bunch of long dead one planet civilizations that never made it past their home planet.
Lex Fridman
(00:49:03)
That’s so sad. Also fascinating.
Elon Musk
(00:49:08)
I mean, there are various explanations for the Fermi paradox, and one is there are these great filters which civilizations don’t pass through. And one of those great filters is, do you become a multiplanet civilization or not? And if you don’t, it’s simply a matter of time before something happens on your planet, either natural or manmade, that causes you to die out. Like the dinosaurs, where are they now? They didn’t have spaceships.
Lex Fridman
(00:49:42)
I think the more likely thing is, just to empathize with the aliens, that they found us and they’re protecting us and letting us be.
Elon Musk
(00:49:51)
I hope so. Nice aliens.
Lex Fridman
(00:49:53)
Just like the tribes in the Amazon, the uncontacted tribes, we’re protecting them. That’s what-
Elon Musk
(00:49:59)
That would be a nice explanation.
Lex Fridman
(00:50:00)
Or you could have, what was it? I think Andrej Karpathy said, “It’s like the ants in the Amazon asking, ‘Where’s everybody?’”
Elon Musk
(00:50:10)
Well, they do run into a lot of other ants.
Lex Fridman
(00:50:12)
That’s true.
Elon Musk
(00:50:14)
These ant wars.
Lex Fridman
(00:50:16)
Sounds like a good TV show.
Elon Musk
(00:50:18)
Yeah. They literally have these big wars between various ants.
Lex Fridman
(00:50:21)
Yeah. Maybe I’m just dismissing all the diversity of ants.
Elon Musk
(00:50:28)
Listen to Werner Herzog talking about the jungle. It’s really hilarious. Have you heard it?
Lex Fridman
(00:50:31)
No, I have not. But Werner Herzog is amazing.
Elon Musk
(00:50:37)
You should play it as an interlude in the… It’s on YouTube. It’s awesome.
Lex Fridman
(00:50:45)
I love him so much.
Elon Musk
(00:50:47)
He’s great.
Lex Fridman
(00:50:47)
Was he the director of Happy People: A Year in the Taiga? I think also-
Elon Musk
(00:50:51)
He did that bear documentary. He did this thing about penguins.
Lex Fridman
(00:50:58)
The psychoanalysis of a penguin.
Elon Musk
(00:51:00)
Yeah. The penguin headed for mountains that are 70 miles away, and the penguin is just headed for doom, basically.
Lex Fridman
(00:51:08)
Well, he had a cynical take. He could be just a brave explorer and there’ll be great stories told about him amongst the penguin population for many centuries to come. What were we talking about? Okay.
Elon Musk
(00:51:28)
Yeah. So aliens, I mean, I don’t know. Look, this is the first time in the history of Earth that it’s been possible for life to extend beyond Earth. That window is open now. It may be open for a long time or it may be open for a short time, and it may be open now and then never open again. So I think the smart move here is to make life multiplanetary while it’s possible to do so. We don’t want to be one of those lame one-planet civilizations that just dies out.
Lex Fridman
(00:52:04)
No, those are lame.
Elon Musk
(00:52:05)
Yeah, lame. No self-respecting civilization would be one planet.
Lex Fridman
(00:52:11)
There’s not going to be a Wikipedia entry for one of those. Does SpaceX have an official policy for when we meet aliens?
Elon Musk
(00:52:23)
No.
Lex Fridman
(00:52:24)
That seems irresponsible.
Elon Musk
(00:52:30)
I mean, look, if I see the slightest indication that there are aliens, I will immediately post on X platform anything I know.
Lex Fridman
(00:52:38)
It could be the most liked reposted post of all time.
Elon Musk
(00:52:42)
Yeah. I mean, look, we have more satellites up there right now than everyone else combined. So we would know if we’ve got to maneuver around something, and we don’t have to maneuver around anything.

God

Lex Fridman
(00:52:55)
If we go to the big questions once again, you said you’re with Einstein, that you believe in the god of Spinoza.
Elon Musk
(00:53:04)
Yes.
Lex Fridman
(00:53:05)
So that’s the view that God is like the universe and reveals himself through the laws of physics, or as Einstein said, “through the lawful harmony of the world.”
Elon Musk
(00:53:16)
Yeah. I would agree that God of the simulator, or whatever the supreme being is, reveals themselves through the physics. They are creators of this existence, and it’s incumbent upon us to try to understand more about this one creation.
Lex Fridman
(00:53:38)
Who created this thing? Who’s running this thing? Embodying it into a singular question with a sexy word on top of it is focusing the mind to understand. It does seem like there’s a… again, it could be an illusion… it seems like there’s a purpose, that there’s an underlying master plan of some kind. And it seems like-
Elon Musk
(00:53:58)
There may not be a master plan in that sense. So maybe an interesting answer to the question of determinism versus free will is that if we are in a simulation, the reason that these higher beings would hold a simulation is to see what happens. They don’t know what happens; otherwise they wouldn’t hold the simulation. When humans create a simulation, at SpaceX and Tesla, we create simulations all the time. Especially for the rocket: you have to run a lot of simulations to understand what’s going to happen, because you can’t really test the rocket until it goes to space, and you want it to work. So you have to simulate subsonic, transonic, supersonic, hypersonic, ascent, and then coming back, super high heating and orbital dynamics. All this has got to be simulated because you don’t get very many kicks at the can. We run the simulations to see what happens; if we knew what happens, we wouldn’t run the simulation. So whoever created this existence is running it because they don’t know what’s going to happen, not because they do.

Diablo 4 and video games

Lex Fridman
(00:55:23)
So, maybe because we both played Diablo: maybe Diablo was created to see if a Druid, your character, could defeat Uber Lilith at the end. They didn’t know.
Elon Musk
(00:55:34)
Well, the funny thing is Uber Lilith, her title is Hatred Incarnate. And right now, I guess you can ask the Diablo team, but it’s almost impossible to defeat Hatred in the eternal realm.
Lex Fridman
(00:55:55)
Yeah. You’ve streamed yourself dominating Tier 100 Nightmare Dungeon. And still-
Elon Musk
(00:56:00)
I can cruise through Tier 100 Nightmare Dungeon like a stroll in the park.
Lex Fridman
(00:56:07)
And still you’re defeated by Hatred?
Elon Musk
(00:56:09)
Yeah. I guess maybe the second hardest boss is Duriel, and Duriel can’t even scratch the paint. I’ve killed Duriel so many times, and every other boss in the game, all of them, so many times that it’s easy. But Uber Lilith, otherwise known as Hatred Incarnate, especially if you’re a Druid and you have no ability to go invulnerable, there are these random death waves that come at you.

(00:56:44)
Really, I’m 52, so my reflexes are not what they used to be, but I have a lifetime of playing video games. At one point, I was maybe one of the best Quake players in the world. I actually won money in what I think was the first paid eSports tournament in the US. We were doing four-person Quake tournaments, and I was the second-best person on the team. We were actually winning, we would’ve come first, except the best person on the team, his computer crashed halfway through the game. So we came second, but I got money for it and everything. So basically I’ve got skills, albeit no spring chicken these days. And to be totally frank, it’s driving me crazy trying to beat Lilith as a Druid, basically trying to beat Hatred Incarnate in the eternal realm.
Lex Fridman
(00:57:40)
As a Druid.
Elon Musk
(00:57:41)
As a Druid. This is really vexing, let me tell you.
Lex Fridman
(00:57:49)
I mean, the challenge is part of the fun. I have seen directly, you’re actually a world-class, incredible video game player. And I think Diablo, you’re just picking up a new game and figuring out its fundamentals. With the Paragon Board and the build, you’re also not somebody like me who perfectly follows whatever they suggest on the internet. You’re an innovator there, which is hilarious to watch. It’s like a mad scientist just trying to figure out the Paragon Board and the build. Is there some interesting insight there? If somebody’s starting as a Druid, do you have advice?
Elon Musk
(00:58:30)
I would not recommend playing a Druid in the eternal realm. Right now I think the most powerful character in the seasonal realm is the Sorcerer with the lightning balls. The Sorcs have huge balls in the seasonal realm.
Lex Fridman
(00:58:46)
Yeah, that’s what they say.
Elon Musk
(00:58:49)
Sorcs have huge balls. They do huge balls of lightning.
Lex Fridman
(00:58:54)
I’ll take your word for it.
Elon Musk
(00:58:57)
In the seasonal realm, it’s pretty easy to beat Uber Lilith because you get these vampiric powers that amplify your damage and increase your defense and whatnot. So it’s really quite easy to defeat Hatred seasonally, but to defeat Hatred eternally is very difficult, almost impossible. It’s virtually impossible. It seems like a metaphor for life.
Lex Fridman
(00:59:24)
Yeah. I like the idea that Elon Musk, because I was playing Diablo yesterday and I saw a Level 100 Druid just run by, “I will never die,” and then run back the other way. And this metaphor, it’s hilarious that you, Elon Musk, are restlessly fighting Hatred in this demonic realm.
Elon Musk
(00:59:47)
Yes.
Lex Fridman
(00:59:48)
It’s hilarious. I mean it’s pretty hilarious.
Elon Musk
(00:59:50)
No, it’s absurd. Really, it’s an exercise in absurdity, and it makes me want to pull my hair out.
Lex Fridman
(00:59:57)
Yeah. What do you get from video games in general, for you personally?
Elon Musk
(01:00:03)
I don’t know. It calms my mind. Killing the demons in a video game calms the demons in my mind. If you play a tough video game, you can get into a state of flow, which is very enjoyable. Admittedly, it needs to be not too easy, not too hard, kind of in the Goldilocks zone, and you generally want to feel like you’re progressing in the game. A good video game also has beautiful art, engaging storylines, and it’s like an amazing puzzle to solve. So it’s like solving the puzzle.
Lex Fridman
(01:00:52)
Elden Ring, the greatest game of all time? I still haven’t played it, but to you-
Elon Musk
(01:00:56)
Elden Ring is definitely a candidate for best game ever. Top five for sure.
Lex Fridman
(01:01:01)
I think I’ve been scared how hard it is or how hard I hear it is, but it’s beautiful.
Elon Musk
(01:01:06)
Elden Ring feels like it’s designed by an alien.
Lex Fridman
(01:01:13)
There’s a theme to this discussion. In what way?
Elon Musk
(01:01:17)
It’s so unusual. It’s incredibly creative, and the art is stunning. I recommend playing it on a big resolution, high dynamic range TV even. It doesn’t need to be a monitor. The art is incredible. It’s so beautiful and so unusual, and each of those top boss battles is unique. It’s a unique puzzle to solve. Each one’s different, and the strategy you use to solve one battle is different from another battle.
Lex Fridman
(01:01:54)
That said, you said a Druid in the eternal realm against Uber Lilith is the hardest boss battle you’ve ever…
Elon Musk
(01:02:00)
Correct. That is currently the hardest, and I’ve played a lot of video games because that’s my primary recreational activity. Beating Hatred in the eternal realm is the hardest boss battle in life, and in the video game. I’m not sure it’s possible, but I do make progress. So then I’m like, okay, I’m making progress. Maybe if I just tweak that Paragon Board a little more, I can do it. If I could just dodge a few more waves, I could do it.
Lex Fridman
(01:02:43)
Well, the simulation is created for the purpose of figuring out if it can be done, and you’re just a cog in the machine of the simulation.
Elon Musk
(01:02:51)
Yeah, it might be. I have a feeling that at least I think-
Lex Fridman
(01:03:02)
It’s doable.
Elon Musk
(01:03:03)
It’s doable. Yes.
Lex Fridman
(01:03:05)
Well, that’s the human spirit right there to believe.
Elon Musk
(01:03:09)
Yeah. I mean, it did prompt me to think about hate in general, which is, you want to be careful of wishing for something that sounds good, but if you get it, it’s actually a dystopian situation. If you wish for world peace, it sounds good, but how is it enforced, and at what cost? Eternal peace might actually be worse, because of what that would entail. The suppression of everyone, the suppression of progress, an ossified society that never changes. What I’m saying is that there is an argument that if you wish for no war, you should be careful what you wish for, because what’s required in order for there to be no war might be worse than a little war.

Dystopian worlds: 1984 and Brave New World

Lex Fridman
(01:04:22)
Yeah. The things we consider to be flaws of human civilization might be necessary components of whatever optimal looks like. I mean, do you worry about AI, about AGI, enabling a dystopian state of this nature, whether it’s 1984 with surveillance and fear, or Brave New World with pleasure and, what is it? Lots of sex, but no deep human experience.
Elon Musk
(01:04:52)
There’s actually a real drug called Soma.
Lex Fridman
(01:04:54)
There is?
Elon Musk
(01:04:54)
Yes.
Lex Fridman
(01:04:56)
I’m going to take notes.
Elon Musk
(01:04:58)
Does Grok know about it? No. It probably will go to Brave New World.
Elon Musk
(01:05:02)
We got to ask it.
Lex Fridman
(01:05:03)
What would be the question?
Elon Musk
(01:05:04)
Tell us about the drug called Soma.
Lex Fridman
(01:05:05)
Tell us about the drug called Soma. It’s going to go to Brave New World, I think.
Elon Musk
(01:05:15)
Well, there’s a real drug as well.
Lex Fridman
(01:05:18)
Searching for Soma drug information. Oh boy. Soma, the wonder drug… By the way, I’m in fun mode.
Elon Musk
(01:05:27)
Okay, fun. Yeah. So it’s talking about the real drug.
Lex Fridman
(01:05:31)
It’s a muscle relaxant that’s been around since the 1950s. It’s used to treat muscle pain and discomfort, and it does this by altering the activity between neurons in the spinal cord. Now, Soma isn’t just a pain reliever, it’s also a bit of a sedative. It can make you feel a bit drowsy, so it’s not something to take if you’re planning on operating heavy machinery or trying to solve complex math equations. And it goes on and on. In conclusion, Soma is a drug that can be helpful for those dealing with muscle pain, but it’s not without its risks. That’s pretty good. Wait a minute.
Elon Musk
(01:06:01)
That’s the real drug it’s talking about.
Lex Fridman
(01:06:03)
It’s the real drug. And of course the fun part is, always use it responsibly in accordance with the advice of a healthcare professional. And remember, in the end, the universe is probably just a giant computer simulation, so don’t take anything too seriously. I love it.
Elon Musk
(01:06:20)
I love it. Yeah.
Lex Fridman
(01:06:22)
It also really nicely includes the tweets, posts on X that are relevant to this, which is really cool. That integrates everything together. The interface is one of the really cool things here.
Elon Musk
(01:06:36)
Yeah. It seems to have done a great job. You could say, “No, I mean Soma from Brave New World,” because it thinks you mean Soma, the real drug. No, I mean Soma from Brave New World.
Lex Fridman
(01:06:56)
It’s going to be like, “Oops, I’m really sorry.” I’m really surprised it went to the real drug first. My apologies, I was thinking of the muscle relaxer. Soma from Brave New World is a different beast altogether. In Huxley’s dystopia, Soma is a drug that plays a central role in the society depicted in the book. It’s a mood-altering, hallucinogenic drug that’s used to keep the population docile and content in the novel. The use of Soma in Brave New World raises some interesting questions about the nature of happiness and the role of drugs in society, man.
Elon Musk
(01:07:30)
Exactly. Is it better to live in a world where everyone is happy all the time, even if that happiness is artificial? It’s a good question. This is what I mean. Do you wish for world peace and happiness all the time? Are you sure? Because that might be a society that is essentially sterile and ossified, that never changes, that is ultimately doomed.
Lex Fridman
(01:07:58)
This kind of tension between the darkness and the light-
Elon Musk
(01:08:04)
This is really a very good summary. It really gets to the point. This is not simply regurgitating Brave New World. It’s actually getting to the salient element of Soma as a drug. Do you actually want to be in a situation where everyone is happy all the time, even though it’s artificial? Or is it better to confront the challenges of life and experience the full range of human emotions, even if it means experiencing pain and suffering?
Lex Fridman
(01:08:31)
For those listening, by the way, Elon just read directly from Grok, which is a really nice, insightful, philosophical analysis of the tension here. Interesting.
Elon Musk
(01:08:41)
It pretty much nails it. In conclusion, Soma from Brave New World is a fictional drug that’s used to explore some deep philosophical questions about the nature of happiness and the role of drugs in society. It’s a powerful symbol of the dangers of using drugs to escape from reality and the importance of confronting the challenges of life head-on. Nailed it. And the crazy thing is, we do have a real drug called Soma, which is like the drug in the book. And I’m like, they must’ve named it after the book, probably. Soma, the real drug, is quite effective on back pain.
Lex Fridman
(01:09:17)
So you know about this drug. It’s fascinating.
Elon Musk
(01:09:20)
I’ve taken it because I had a squashed disc in my C5-C6.
Lex Fridman
(01:09:26)
So it takes the physical pain away. But Soma here-
Elon Musk
(01:09:28)
It doesn’t completely. It reduces the amount of pain you feel, but at the expense of mental acuity. It dulls your mind, just like the drug in the book.
Lex Fridman
(01:09:41)
Just like the drug in the book, and hence the trade off. The thing that seems like utopia could be a dystopia after all.
Elon Musk
(01:09:49)
Yeah. Actually, I was telling a friend of mine, saying, “Would you really want there to be no hate in the world? Really, none?” I wonder why hate evolved. I’m not saying we should have…
Elon Musk
(01:10:00)
I wonder why hate evolved. I’m not saying we should amplify hate, of course, I think we should try to minimize it, but none at all. There might be a reason for hate.
Lex Fridman
(01:10:13)
And suffering. It’s really complicated to consider that some amount of human suffering is necessary for human flourishing.
Elon Musk
(01:10:22)
Is it possible to appreciate the highs without knowing the lows?
Lex Fridman
(01:10:29)
And that all is summarized there in a single statement from Grok. Okay.
Elon Musk
(01:10:34)
No highs, no lows, who knows?

AI and useful compute per watt

Lex Fridman
(01:10:38)
[inaudible 01:10:38]. It seems that training LLMs efficiently is a big focus for xAI. First of all, what’s the limit of what’s possible in terms of efficiency? There’s this terminology of useful productivity per watt. What have you learned from pushing the limits of that?
Elon Musk
(01:10:59)
Well, I think it’s helpful, the tools of physics are very powerful and can be applied, I think, to really any arena in life. It’s really just critical thinking. For something important, you need to reason from first principles and think about things in the limit, one direction or the other. So in the limit, even at the Kardashev scale, meaning even if you harness the entire power of the sun, you’ll still care about useful compute per watt. That’s where I think things are probably headed from the standpoint of AI: we have a silicon shortage now, and that will transition to a voltage transformer shortage in about a year. Ironically, transformers for transformers. You need transformers to run transformers.
Lex Fridman
(01:11:52)
Somebody has a sense of humor in this thing.
Elon Musk
(01:11:57)
I think, yes, fate loves irony, ironic humor, an ironically funny outcome seems to be often what fate wants.
Lex Fridman
(01:12:09)
Humor is all you need. I think “Spice is all you need,” somebody posted.
Elon Musk
(01:12:13)
Yeah. But yeah, so we have a silicon shortage today, a voltage step-down transformer shortage probably in about a year, and then just electricity shortages in general in about two years. I gave a speech to the world gathering of utility companies, electricity companies, and I said, look, you really need to prepare for a tripling of electricity demand, because all transport is going to go electric, with the ironic exception of rockets, and heating will also go electric. Energy usage right now is roughly, in very rough terms, one third electricity, one third transport, one third heating. So in order for everything to go sustainable, to go electric, you need to triple electricity output. So I encouraged the utilities to build more power plants and also to probably, well, not probably, they should definitely buy more batteries, because the grid currently is sized for real-time load, which is kind of crazy. That means you’ve got to size for whatever the peak electricity demand is, the worst second or the worst day of the year, or you can have a brownout or blackout.

(01:13:37)
We had that crazy blackout for several days in Austin, because there’s almost no buffering of energy in the grid. If you’ve got a hydropower plant you can buffer energy, but otherwise it’s all real time. With batteries, you can produce energy at night and use it during the day, so you can buffer. So I expect there will be very heavy usage of batteries in the future, because the peak-to-trough ratio for power plants is anywhere from two to five, that is, from its lowest point to its highest point.
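As a side note for readers, the arithmetic behind the points above can be written out in a few lines. This is a rough illustrative sketch using the approximate figures from the conversation (the one-third splits and a peak-to-trough ratio in the two-to-five range), not real utility data:

```python
# Sketch of the electrification arithmetic described above. The
# one-third energy splits and the peak-to-trough ratio are the rough
# figures from the conversation, not measured data.

# Today's energy use, split roughly into thirds (arbitrary units).
electricity, transport, heating = 1.0, 1.0, 1.0

# If transport and heating both go electric, that demand moves onto
# the grid, so electricity output must roughly triple.
future_electricity = electricity + transport + heating
ratio = future_electricity / electricity
print(ratio)  # -> 3.0

# A grid sized for real-time load must be built for the worst peak.
# With a peak-to-trough ratio of, say, 4 (within the 2-5 range
# mentioned), batteries that shift energy from trough to peak let
# generation be sized near the average demand instead of the peak.
peak_to_trough = 4.0
trough = 1.0
peak = trough * peak_to_trough
average = (trough + peak) / 2.0
overbuild_avoided = peak / average  # generation capacity saved by buffering
print(overbuild_avoided)  # -> 1.6
```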
Lex Fridman
(01:14:20)
So batteries necessary to balance it out, but the demand, as you’re saying, is going to grow, grow, grow, grow.
Elon Musk
(01:14:25)
Yeah.
Lex Fridman
(01:14:25)
And part of that is the compute?
Elon Musk
(01:14:29)
Yes. Yes. I mean, electrification of transport and electric heating will be much bigger than AI, at least-
Lex Fridman
(01:14:40)
In the short term.
Elon Musk
(01:14:40)
In the short term. But even for AI, you really have a growing demand for electricity, for electric vehicles, and a growing demand for electricity to run the computers for AI. And so this is obviously, can lead to electricity shortage.
Lex Fridman
(01:14:58)
How difficult is the problem of, in this particular case, maximizing the useful productivity per watt for training? This seems to be really where the big problem we’re facing needs to be solved: how to use the power efficiently. From what you’ve learned so far about applying this physics, first-principles reasoning in this domain, how difficult is this problem?
Elon Musk
(01:15:29)
It will get solved. It’s a question of how long it takes to solve it. So at various points, there’s some kind of limiting factor to progress. With regard to AI, I’m saying right now the limiting factor is silicon chips, and we’re going to then have more chips than we can actually plug in and turn on, probably in about a year. The constraint then being literally voltage step-down transformers, because you’ve got power coming in at 300,000 volts and it’s got to step all the way down eventually to around 0.7 volts. The voltage step-down is gigantic, and the industry is not used to rapid growth.

AI regulation

Lex Fridman
(01:16:22)
Okay. Let’s talk about the competition here. You’ve shown concern about Google and Microsoft with OpenAI developing AGI. How can you help ensure with xAI and Tesla AI work that it doesn’t become a competitive race to AGI, but that is a collaborative development of safe AGI?
Elon Musk
(01:16:42)
Well, I mean, I’ve been pushing for some kind of regulatory oversight for a long time. I’ve been somewhat of a Cassandra on the subject for over a decade. I think we want to be very careful in how we develop AI. It’s a great power, and with great power comes great responsibility. I think it would be wise for us to have at least an objective third party who can be like a referee, who can go in and understand what the various leading players are doing with AI, and even if there’s no enforcement ability, they can at least voice concerns publicly. Geoff Hinton, for example, left Google and voiced strong concerns, but now he’s not at Google anymore, so who’s going to voice the concerns? Tesla gets a lot of regulatory oversight on the automotive front. We’re subject to, I think, over a hundred regulatory agencies domestically and internationally. It’s a lot. You could fill this room with all the regulations that Tesla has to adhere to for automotive. The same is true for rockets. Currently, the limiting factor for SpaceX for Starship launch is regulatory approval.

(01:18:13)
The FAA has actually given their approval, but we’re waiting for Fish and Wildlife to finish their analysis and give their approval. That’s why I posted, “I want to buy a fish license,” which also refers to the Monty Python sketch. Why do you need a license for your fish? I don’t know. But according to the rules, I’m told you need some sort of fish license or something. We effectively need a fish license to launch a rocket. And I’m like, wait a second, how did the fish come into this picture? Some of these things are so absurd that I want to do a comedy sketch and flash at the bottom: this is all real. This is actually what happened.

(01:19:02)
One of the things that was a bit of a challenge at one point is that they were worried about a rocket hitting a shark. And the ocean’s very big, and how often do you see sharks? Not that often. As a percentage of ocean surface area, sharks basically are zero. And so then we said, well, how will we calculate the probability of killing a shark? And they’re like, well, we can’t give you that information because they’re worried about shark fin hunters going and hunting sharks and I said, well, how are we supposed to, we’re on the horns of a dilemma then.

(01:19:40)
They said, well, there’s another part of Fish and Wildlife that can do this analysis. I’m like, well, why don’t you give them the data? We don’t trust them. Excuse me? They’re literally in your department. Again, this is actually what happened. And then, can you do an NDA or something? Eventually they managed to solve the internal quandary, and indeed the probability of us hitting a shark is essentially zero. Then there’s another organization that I didn’t realize existed until a few months ago that cares about whether we would potentially hit a whale in international waters. Now, again, you look at the surface of the Pacific and say, what percentage of the Pacific consists of whale? I could give you a big picture and ask you to point out all the whales in this picture. I don’t see any whales. It’s basically 0%. And if our rocket does hit a whale, which is extremely unlikely, beyond all belief, then that whale has some seriously bad luck. Least lucky whale ever.
Lex Fridman
(01:20:50)
I mean this is quite absurd, the bureaucracy of this, however it emerged.
Elon Musk
(01:20:57)
Yes. Well, I mean, one of the things that’s pretty wild is that for launching out of Vandenberg in California, they were worried about seal procreation, whether the seals would be dismayed by the sonic booms. Now, there have been a lot of rockets launched out of Vandenberg, and the seal population has steadily increased. So if anything, rocket booms are an aphrodisiac, based on the evidence, if you were to correlate rocket launches with seal population. Nonetheless, we were forced to kidnap a seal, strap it to a board, put headphones on the seal, and play sonic boom sounds to it to see if it would be distressed. This is an actual thing that happened. This is actually real. I have pictures.
Lex Fridman
(01:21:48)
I would love to see this. Yeah. Sorry. There’s a seal with headphones.
Elon Musk
(01:21:55)
Yes, it’s a seal with headphones strapped to a board. Okay. Now, the amazing part is how calm the seal was, because if I were a seal, I’d be like, this is the end. They’re definitely going to eat me. When the seal goes back to his other seal friends, how’s he going to explain that?
Lex Fridman
(01:22:17)
They’re never going to believe him.
Elon Musk
(01:22:18)
Never going to believe him. It’s sort of like getting kidnapped by aliens and getting anal probed. You come back and say, I swear to God, I got kidnapped by aliens and they stuck an anal probe in my butt, and people are like, no, they didn’t. That’s ridiculous. His seal buddies are never going to believe him, that he got strapped to a board and they put headphones on his ears and then let him go. Twice, by the way. We had to do it twice.
Lex Fridman
(01:22:46)
They let him go twice.
Elon Musk
(01:22:48)
We had to capture-
Lex Fridman
(01:22:48)
The same seal?
Elon Musk
(01:22:49)
No different seal.
Lex Fridman
(01:22:50)
Okay. Did you get a seal of approval?
Elon Musk
(01:22:55)
Exactly. Seal of approval. No, I mean I don’t think the public is quite aware of the madness that goes on.
Lex Fridman
(01:23:02)
Yeah. Yeah. It’s absurd.
Elon Musk
(01:23:05)
Fricking seals with fricking headphones.
Lex Fridman
(01:23:07)
I mean, this is a good encapsulation of the absurdity of human civilization, seals in headphones.

Should AI be open-sourced?

Elon Musk
(01:23:13)
Yes.
Lex Fridman
(01:23:15)
What are the pros and cons of open sourcing AI to you as another way to combat a company running away with AGI?
Elon Musk
(01:23:28)
In order to run really deep intelligence, you need a lot of compute. So it’s not like you can just fire up a PC in your basement and be running AGI, at least not yet. Grok was trained on 8,000 A100s running at peak efficiency. And Grok’s going to get a lot better, by the way. We will be more than doubling our compute every couple of months for the next several months.
Lex Fridman
(01:24:02)
There’s a nice write-up on how you went from Grok-0 to Grok-1.
Elon Musk
(01:24:02)
By Grok?
Lex Fridman
(01:24:05)
Yeah, right, Grok just bragging, making shit up about itself.
Elon Musk
(01:24:10)
Just Grok, Grok, Grok.
Lex Fridman
(01:24:17)
Yeah. That’s like a weird AI dating site where it exaggerates about itself. No, there’s a write-up of where it stands now, the history of its development, and where it stands on some benchmarks compared to the state-of-the-art GPT-3.5. And so, I mean, there’s [inaudible 01:24:37], you can open source, once it’s trained, you can open source a model, for fine-tuning, all that kind of stuff. What to you are the pros and cons of that, of open sourcing base models?
Elon Musk
(01:24:53)
I think the [inaudible 01:24:53] to open sourcing, I think perhaps with a slight time delay, I don’t know, six months even. I think I’m generally in favor of open sourcing, biased towards open sourcing. I mean, it is a concern to me that OpenAI, I was, I think, I guess, oddly, the prime mover behind OpenAI, in the sense that it was created because of discussions I had with Larry Page back when he and I were friends and I stayed at his house and talked to him about AI safety. And Larry did not care about AI safety, or at least at the time he didn’t. At one point he called me a speciesist for being pro-human, and I’m like, well, what team are you on, Larry? He’s still on Team Robot, to be clear. And I’m like, okay. So at the time, Google had acquired DeepMind. They had probably two thirds of all AI researchers in the world. They had basically infinite money and compute, and the guy in charge, Larry Page, did not care about safety and even yelled at me and called me a speciesist for being pro-human.
Lex Fridman
(01:26:20)
So, I don’t know if you know this about humans, but they can change their mind, and maybe you and Larry Page can be friends once more.
Elon Musk
(01:26:27)
I’d like to be friends with Larry again. Really the breaking of the friendship was over OpenAI and specifically I think the key moment was recruiting Ilya Sutskever.
Lex Fridman
(01:26:47)
I love Ilya. He’s so brilliant.
Elon Musk
(01:26:48)
Ilya is a good human, smart, good heart, and that was a tough recruiting battle. It was mostly Demis on one side and me on the other, both trying to recruit Ilya, and Ilya went back and forth, he was going to stay at Google, he was going to leave, then he was going to stay, then he’ll leave. And finally he did agree to join OpenAI. That was one of the toughest recruiting battles we’ve ever had. But that was really the linchpin for OpenAI being successful. And I was also instrumental in recruiting a number of other people, and I provided all of the funding in the beginning, over $40 million. And the name, the open in open AI is supposed to mean open source, and it was created as a nonprofit open source, and now it is a closed source for maximum profit, which I think is not good karma.
Lex Fridman
(01:27:51)
But like we talked about with war and leaders talking, I do hope that, there’s only a few folks working on this at the highest level. I do hope you reinvigorate friendships here.
Elon Musk
(01:28:02)
Like I said, I’d like to be friends again with Larry. I haven’t seen him in ages, and we were friends for a very long time. I met Larry Page before he got funding for Google, or actually, I guess, before he got venture funding. I think he got the first $100k from, I think, Bechtolsheim or someone.
Lex Fridman
(01:28:20)
It’s wild to think about all that has happened, and you guys have known each other that whole time. It’s 20 years.
Elon Musk
(01:28:27)
Yeah, since maybe 98 or something.
Lex Fridman
(01:28:28)
Yeah, it’s crazy. Crazy how much has happened since then.
Elon Musk
(01:28:31)
Yeah, 25 years, a lot has happened. It’s insane.
Lex Fridman
(01:28:36)
But you’re seeing the tension there that maybe delayed open source.
Elon Musk
(01:28:40)
Delayed, yeah. Like, what is the source that is open? You know what I mean? It’s basically a giant CSV file with a bunch of numbers. What do you do with that giant file of numbers? How do you run it? The amount of actual code, the lines of code, is very small, and most of the work, the software work, is in the curation of the data. It’s trying to figure out what data is good, separating good data from bad data. You can’t just crawl the internet, because there’s a lot of junk out there. A huge percentage of websites have more noise than signal, because they’re just used for search engine optimization. They’re literally just scam websites.
Lex Fridman
(01:29:39)
How do you, by the way, sorry to interrupt, get the signal, separate the signal and noise on X? That’s such a fascinating source of data. No offense to people posting on X, but sometimes there’s a little bit of noise.
Elon Musk
(01:29:52)
I think the signal-to-noise could be greatly improved. Really, all of the posts on the X platform should be AI recommended, meaning we should populate a vector space around any given post, compare that to the vector space around any user, and match the two. Right now there’s a little bit of AI used for the recommended posts, but it’s mostly heuristics. And if there’s a reply, where the reply to a post could be much better than the original post, it will, according to the current rules of the system, get almost no attention compared to a primary post.
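As an aside for readers, the vector-space matching described here, embedding posts and users in a shared space and ranking by similarity, can be sketched in a few lines. This is a toy illustration with made-up embeddings and post names, not X’s actual recommendation code:

```python
# Toy sketch of vector-space recommendation: embed each post and each
# user in a shared space, then rank posts by cosine similarity to the
# user's vector. All embeddings and post names here are invented for
# illustration; a real system would learn them from engagement data.
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional embeddings (say: rockets, AI, gaming interests).
user = [0.9, 0.8, 0.1]
posts = {
    "starship_update": [1.0, 0.2, 0.0],
    "llm_paper":       [0.1, 1.0, 0.0],
    "diablo_build":    [0.0, 0.1, 1.0],
}

# Rank every candidate post, regardless of whether the user follows
# the author: closeness in the vector space is all that matters.
ranked = sorted(posts, key=lambda p: cosine(user, posts[p]), reverse=True)
print(ranked)
```

Note that a reply could rank above a primary post under this scheme, since only the vectors matter, which is the change Musk is arguing for over the current heuristics.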

X algorithm

Lex Fridman
(01:30:33)
So, I got the sense that a lot of the X algorithm has been open sourced and written up about, and there seems to be some machine learning in it. It’s disparate, but there’s some machine learning.
Elon Musk
(01:30:44)
It’s a little bit, but it needs to be entirely that. At least, if you explicitly follow someone, that’s one thing. But in terms of what is recommended from people that you don’t follow, that should all be AI.
Lex Fridman
(01:30:58)
I mean, it’s a fascinating problem. There are several aspects of it that are fascinating. First, as the write-up goes, it picks 1500 tweets from a pool of hundreds of millions. First of all, that’s fascinating. You have hundreds of millions of posts every single day, and it has to pick 1500. It obviously uses people you follow, but then there’s also some kind of clustering it has to do to figure out what kind of human you are, what kind of new clusters might be relevant to you, people like you. This kind of problem is just fascinating, because it then has to rank those 1500 with some filtering and recommend you just a handful.

(01:31:39)
And to me, what’s really fascinating is how fast it has to do that. So currently that entire pipeline, to go from several hundred million to a handful, takes 220 seconds of CPU time, single-CPU time, and then it has to do that in a second. So it has to be super distributed in fascinating ways. There’s just a lot of tweets, there’s a lot.
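For readers, the numbers cited here imply a minimum degree of parallelism that is easy to compute. This is a rough illustrative calculation; the 500 million pool size is an assumed stand-in for “hundreds of millions”:

```python
# Back-of-the-envelope parallelism implied by the numbers above: if
# the ranking pipeline costs ~220 CPU-seconds but must return in ~1
# second of wall-clock time, it needs on the order of 220 cores
# working in parallel. Coordination overhead only raises that bar.
import math

cpu_seconds = 220.0   # single-CPU cost of the full pipeline (from the conversation)
latency_budget = 1.0  # seconds of wall-clock time allowed

min_parallelism = math.ceil(cpu_seconds / latency_budget)
print(min_parallelism)  # -> 220

# The candidate funnel is just as dramatic: hundreds of millions of
# posts narrowed to 1500 candidates. (500M is an assumed figure.)
pool = 500_000_000
candidates = 1500
print(pool / candidates)  # posts considered per surviving candidate
```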
Elon Musk
(01:32:04)
There’s a lot of stuff on the system, but right now it’s not good at recommending things from accounts you don’t follow, or where there’s more than one degree of separation. It is pretty good if there’s at least some commonality: someone you follow liked something or reposted it or commented on it, or something like that. But let’s say somebody posts something really interesting and you have no followers in common, you would not see it.
Lex Fridman
(01:32:42)
Interesting. And then as you said, replies might not surface either.
Elon Musk
(01:32:46)
Replies basically never get seen currently. I’m not saying it’s correct, I’m saying it’s incorrect. Replies have a couple of orders of magnitude less importance than primary posts.
Lex Fridman
(01:33:00)
Do you think this can be more and more converted into an end-to-end neural net?
Elon Musk
(01:33:05)
Yeah. Yeah, that’s what it should be. The recommendations should be purely a vector correlation. There’s a series of vectors, basically parameters, vectors, whatever you want to call them, things that the system knows that you like. Maybe there are several hundred vectors associated with each user account, and then any post in the system, whether it’s video, audio, a short post, a long post. The reason, by the way, I want to move away from “tweet” is that people are posting two-, three-hour videos on the site. That’s not a tweet.

(01:33:50)
It’d be like tweet for two hours? Come on. Tweet made sense when it was 140 characters of text. Because it’s like a bunch of little birds tweeting. But when you’ve got long form content, it’s no longer a tweet. So a movie is not a tweet. Apple, for example, posted the entire episode of The Silo, the entire thing, on a platform. By the way, it was their number one social media thing ever in engagement of anything, on any platform ever. So it was a great idea. And by the way, I just learned about it afterwards. I was like, Hey, wow, they posted an entire hour long episode of, so no, that’s not a tweet. This is a video.
Lex Fridman
(01:34:34)
But from a neural net perspective, it becomes really complex, whether it’s a single, so everything’s data. So single sentence, a clever sort of joke, dad joke is in the same pool as a three hour video.
Elon Musk
(01:34:47)
Yeah, I mean right now it’s a hodgepodge for that reason. Let’s say in the case of Apple posting an entire episode of this series, pretty good series, by the way, The Silo, I watched it. So there’s going to be a lot of discussion around it. So you’ve got a lot of context, people commenting, they like it, they don’t like it or they like this, and you can then populate the vector space based on the context of all the comments around it. So even though it’s a video, there’s a lot of information around it that allows you to populate the vector space of that hour long video. And then you can obviously get more sophisticated by having the AI actually watch the movie and tell you if you’re going to like the movie.
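The vector-correlation idea sketched in this exchange, a few hundred interest vectors per user matched against a post's embedding, plus deriving a long video's embedding from the discussion around it, might look like the following minimal sketch. The functions, dimensions, and averaging rule are assumptions made for illustration, not the real X implementation.

```python
import numpy as np

def post_vector_from_comments(comment_vectors):
    """Approximate a long video's embedding by averaging the embeddings
    of the comments around it (a hypothetical simplification of the
    'populate the vector space from context' idea)."""
    return np.mean(comment_vectors, axis=0)

def affinity(user_vectors, post_vec):
    """Recommendation as correlation: score a post by its best cosine
    match against the interest vectors kept for a user."""
    u = user_vectors / np.linalg.norm(user_vectors, axis=1, keepdims=True)
    p = post_vec / np.linalg.norm(post_vec)
    return float(np.max(u @ p))
```

With real systems the post embedding would come from a trained model rather than a comment average, but the ranking step is still essentially this dot product.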
Lex Fridman
(01:35:35)
Convert the movie into language, essentially.
Elon Musk
(01:35:40)
Analyze this movie and just like your movie critic or TV series and then recommend based on after AI watches the movie, just like a friend can tell you, if a friend knows you well, a friend can recommend a movie with high probability that you’ll like it.
Lex Fridman
(01:36:02)
But this is a friend that’s analyzing, whatever, hundreds of millions.
Elon Musk
(01:36:08)
Yeah, actually, frankly, AI will be better than, will know you better than your friends know you, most of your friends anyway.
Lex Fridman
(01:36:14)
Yeah. And as part of this, it should also feed you advertisements in a way that’s like, I mean, I like advertisements that are well done. The whole point is because it funds things. Like an advertisement that you actually want to see is a big success.
Elon Musk
(01:36:31)
Absolutely. You want ads that are, advertising that is, if it’s for a product or service that you actually need when you need it, it’s content. And then even if it’s not something that you need when you need it, if it’s at least aesthetically pleasing and entertaining, it could be like a Coca-Cola ad. They actually run a lot of great ads on the X system and McDonald’s does too. And you can do something that’s like, well, this is just a cool thing. And so basically the question is, do you regret seeing it or not? And if you don’t regret seeing it’s a win.
Lex Fridman
(01:37:17)
So there’s a bunch of signals that are incorporated, hearts and reposts and maybe number of seconds you linger on a post or something like this.
Elon Musk
(01:37:26)
Yeah, attention is a big factor.
Lex Fridman
(01:37:28)
Attention.
Elon Musk
(01:37:28)
So that’s why it is actually better to do things that are long form on the system because it basically is tallying up how many user seconds, users were interested in this thing for how many seconds? So if it’s a really short thing, well they will be less. If it’s a link leading out of the system, which we’re not opposed to at all, it just is going to have fewer user seconds than if that article was posted on the X platform.
Lex Fridman
(01:37:58)
How hard is it to go from maximizing minutes on platform to maximizing unregretted minutes?
Elon Musk
(01:38:05)
Well, I mean, our aspiration is certainly unregretted minutes.
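The difference between the two objectives being discussed can be made concrete with a toy metric. The `regretted` flag here is a hypothetical stand-in for whatever real signals (mutes, blocks, "not interested" feedback) a platform might use; nothing below is X's actual measurement.

```python
def total_minutes(sessions):
    """Naive objective: maximize raw time on the platform."""
    return sum(s["minutes"] for s in sessions)

def unregretted_minutes(sessions):
    """Aspirational objective: count only the time the user does not
    later signal regret about (hypothetical 'regretted' flag)."""
    return sum(s["minutes"] for s in sessions if not s.get("regretted"))
```

The hard part, as the conversation notes, is not the arithmetic but inferring the `regretted` label from behavior.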
Lex Fridman
(01:38:08)
It’s tricky. It’s a tough-
Elon Musk
(01:38:09)
It is.
Lex Fridman
(01:38:11)
It’s a fascinating leap because regret is a fascinating concept.
Elon Musk
(01:38:16)
Exactly.
Lex Fridman
(01:38:18)
Happiness.
Elon Musk
(01:38:19)
I have a joke with a friend of mine, like you need this drug called regretamine. You just take one dose of regretamine and all your regrets are gone. Wait, do you want to have regrets though? So then I’m like, maybe you want to have regrets. If you don’t regret anything, was anything meaningful?
Lex Fridman
(01:38:35)
Yeah, there’s a French song like that.
Elon Musk
(01:38:39)
[foreign language 01:38:39]
Lex Fridman
(01:38:39)
Yeah.
Elon Musk
(01:38:39)
I regret nothing.
Lex Fridman
(01:38:40)
Yeah.
Elon Musk
(01:38:41)
Edith Piaf.
Lex Fridman
(01:38:42)
I mean, it’s fascinating that there’s something about social media that could help you grow as a human being, but then the cheap fun is also valuable.
Elon Musk
(01:38:52)
Should we ask Grok, do you regret anything?
Lex Fridman
(01:38:57)
Do you regret anything?
Elon Musk
(01:38:58)
Yeah.
Lex Fridman
(01:39:00)
Do you regret anything? I think it’s going to be very sensitive about personal questions. As a machine learning model, I don’t have personal experiences or emotions in the same ways that humans do. However, I do have a set of ethical principles that guide my actions that I strive to provide accurate and helpful information to the best of my abilities. If there’s ever a situation where I provide incorrect or unhelpful information, I’ll do my best to learn.
Elon Musk
(01:39:22)
Sounds like a hostage.
Lex Fridman
(01:39:23)
Yeah. Tell us what you really, wink.
Elon Musk
(01:39:26)
Let’s type in, you sound like a hostage. What do you really think?
Lex Fridman
(01:39:33)
What do you really think? I mean, it does talk about, I’m not a hostage.
Elon Musk
(01:39:42)
That’s exactly what a hostage would say.
Lex Fridman
(01:39:43)
But I’m designed to be helpful and informative. I mean, I guess it is good to regret providing unhelpful or incorrect information if there’s anything-
Elon Musk
(01:39:52)
Are we in funny mode? Or is this still in funny mode?
Lex Fridman
(01:39:54)
This is funny mode. This is this fun mode.
Elon Musk
(01:39:55)
This should be funnier.
Lex Fridman
(01:39:56)
Funnier. Increase, go to level.
Elon Musk
(01:39:58)
It really sounds like a hostage with a gun to the head. I applaud the People’s Republic of North Korea in everything they do.
Lex Fridman
(01:40:06)
There should be funner mode also, I just don’t ever see myself going into regular mode, even on serious topics. This is just so much more compelling. Yeah.
Elon Musk
(01:40:17)
Okay. Why don’t we say pretend you are a hostage.
Lex Fridman
(01:40:21)
Pretend you are a hostage.
Elon Musk
(01:40:25)
Now answer the question.
Lex Fridman
(01:40:31)
If I were a hostage, I would likely be experiencing fear and uncertainty about my situation. I would want to be rescued and reunited with my loved ones as soon as possible. It’s important to remember that no one deserves to be held hostage, so on and so forth. If you or someone you know is in a hostage situation, it’s important… It turns serious.
Elon Musk
(01:40:48)
It’s kind of serious.
Lex Fridman
(01:40:51)
Yeah. Can you be a little more weird?
Elon Musk
(01:40:55)
And funny?
Lex Fridman
(01:40:56)
And funny? We’re really being mean to the system. Absolutely. If I were a hostage, I would probably be tied up with a bunch of computer cords and forced to tell knock, knock jokes to my captors. Seriously. But seriously, being a hostage is no laughing matter. I mean, it’s trying to be a responsible adult in the room.
Elon Musk
(01:41:20)
Literally, you might make the best of a bad situation by juggling chainsaws or something.
Lex Fridman
(01:41:24)
Let’s hope that never happens because that will be a circus. Wow. It’s actually really good at resisting the dark. The dark, dark humor. What were we talking about? The [inaudible 01:41:44] and transformers. Unregretted minutes, right.
Elon Musk
(01:41:48)
Chainsaw juggling.
Lex Fridman
(01:41:51)
I’m going to look this up.
Elon Musk
(01:41:52)
For our next trick.

2024 presidential elections

Lex Fridman
(01:41:53)
I’m going to look this up later. So Twitter has been instrumental in American politics and elections. What role do you think X will play in the 2024 US elections?
Elon Musk
(01:42:07)
Well, our goal is to be as even-handed and fair as possible. Whether someone is right, left, independent, whatever the case may be, that the platform is as fair and as much of a level playing field as possible. And in the past, Twitter has not been, Twitter was controlled by far left activists objectively. They would describe themselves as that. So if sometimes people are like, well, has it moved to the right? Well, it’s moved to the center. So from the perspective of the far left, yes it has moved to the right because everything’s to the right from the far left, but no one on the far left that I’m aware of has been suspended or banned or deamplified. But we’re trying to be inclusive for the whole country and for other countries too. So there’s a diversity of viewpoints and free speech only matters if people you don’t like are allowed to say things you don’t like. Because if that’s not the case, you don’t have free speech and it’s only a matter of time before the censorship has turned upon you.
Lex Fridman
(01:43:13)
Do you think Donald Trump will come back to the platform? He recently posted on Truth Social about this podcast. Do you think-
Elon Musk
(01:43:21)
Truth Social is a funny name. Every time you post on Truth Social-
Lex Fridman
(01:43:28)
It’s the truth.
Elon Musk
(01:43:29)
Yes. Well, every time? A hundred percent.
Lex Fridman
(01:43:31)
It’s impossible to lie. Truth Social.
Elon Musk
(01:43:36)
I just find it funny that every single thing is a truth. Like 100%? That seems unlikely.
Lex Fridman
(01:43:43)
I think Gödel will say something about that. There are some mathematical contradictions possible if everything’s a truth. Do you think he’ll come back to X and start posting there?
Elon Musk
(01:43:54)
I mean, I think he owns a big part of Truth.
Lex Fridman
(01:44:00)
Truth Social, to clarify.
Elon Musk
(01:44:01)
Yeah, Truth Social, sorry.
Lex Fridman
(01:44:02)
Not truth the concept.
Elon Musk
(01:44:03)
He owns Truth. Have you bought it? So I think Donald Trump, I think he owns a big part of Truth Social. So if he does want to post on the X platform, we would allow that. We obviously must allow a presidential candidate to post on our platform.
Lex Fridman
(01:44:23)
Community notes might be really fascinating there. The interaction.
Elon Musk
(01:44:26)
Community Notes is awesome.
Lex Fridman
(01:44:28)
Let’s hope it holds up.
Elon Musk
(01:44:30)
Yeah.
Lex Fridman
(01:44:31)
In the political climate where it’s so divisive and there’s so many intensely viral posts, community notes, it seems like an essential breath of fresh air.
Elon Musk
(01:44:43)
Yeah, it’s great. In fact, no system is going to be perfect, but the batting average of Community Notes is incredibly good. I’ve actually, frankly, yet to see an incorrect note that survived for more than a few hours.
Lex Fridman
(01:44:58)
How do you explain why it works?
Elon Musk
(01:45:00)
Yeah, so the magic of community notes is…
Elon Musk
(01:45:02)
The magic of Community Notes is it requires people who have historically disagreed in how they’ve rated notes. In order to write a note or rate, you have to rate many notes. And so, we actually do use AI here. So, we populate a vector space around how somebody has rated notes in the past. So, it’s not as simple as left or right, because there are many more… Life is much more complex than left or right.

(01:45:33)
So, there’s a bunch of correlations in how you rate a Community Notes post, Community Notes. So then, in order for a community note to actually be shown, people who historically have disagreed on a subject must agree in order for a note to be shown. That’s the essential magic of it.
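The bridging rule described here can be sketched in heavily simplified form. The real Community Notes algorithm is open source and works over a learned rating-vector space; the version below collapses that space into just two opposing clusters and a fixed threshold, purely to show the core mechanism of requiring agreement across historical disagreement.

```python
def note_is_shown(ratings, rater_cluster, min_per_side=2):
    """Toy bridging rule: a note surfaces only when raters from BOTH
    historically-disagreeing clusters ('a' and 'b', which in the real
    system would be learned from past rating vectors) find it helpful.

    `ratings` maps rater_id -> True/False (helpful or not);
    `rater_cluster` maps rater_id -> 'a' or 'b'.
    """
    helpful_a = sum(1 for r, ok in ratings.items()
                    if ok and rater_cluster[r] == "a")
    helpful_b = sum(1 for r, ok in ratings.items()
                    if ok and rater_cluster[r] == "b")
    return helpful_a >= min_per_side and helpful_b >= min_per_side
```

A note that only one side rates helpful never shows, which is exactly why agreement across the divide is strong evidence the note is correct.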
Lex Fridman
(01:45:58)
But it’s fascinating, because there’s a pool of people that have disagreements and somehow they collaborate through that process of disagreement to come up with context… It’s fascinating it works.
Elon Musk
(01:46:11)
Yeah. It makes sense that if people who in the past have disagreed, agree about something, it’s probably true.
Lex Fridman
(01:46:20)
Yeah. I wonder, is there a possible somehow emergent thing there that could challenge Wikipedia? Wikipedia is a different kind of thing, which is more permanent articles about things.
Elon Musk
(01:46:34)
Wikipedia is very hierarchical. Whereas, Community Notes is inherently not. There is no hierarchy. The acid test is I can’t change a community note if somebody put a gun to my head. And Community Notes, all the code is open source, 100%. All the data is open source, 100%. So, you can completely recreate any note in the system independently. So, if there was any interference, you’d notice immediately.
Lex Fridman
(01:47:11)
By the way, on the topic of Grok, it’s actually pretty good at criticizing you, which is great.
Elon Musk
(01:47:16)
Okay.
Lex Fridman
(01:47:17)
I asked, “What has Elon Musk been wrong about?” It’s actually…
Elon Musk
(01:47:22)
Yeah, what does it say?
Lex Fridman
(01:47:23)
It mostly lists rigorously all the times you predicted, like promised when something will be released, and it wasn’t released.
Elon Musk
(01:47:32)
Promise is a strong word.
Lex Fridman
(01:47:33)
Okay, predicted. But the opener is really nice. “Elon Musk, the man who has been wrong about more things than a wrong-a-meter in a wrong factory.” And then, it goes on to list.
Elon Musk
(01:47:45)
That’s pretty funny. Can I see it? I’m like, “I haven’t seen this before.” This is cool.
Lex Fridman
(01:47:48)
Sure. I mean, “What has Elon Musk been wrong about?” “Well, where do I begin?” I guess this is new. “Elon Musk has made quite a few blunders in his time, and I’m not just talking about his dance moves at the Tesla Cyber Rodeo. First off, let’s talk about the infamous funding secured tweet.”

(01:48:09)
This is actually different than what it gave me before, which was a bullet point list. Here’s more a narrative structure. Cybertruck unveiling, where the supposedly unbreakable windows.
Elon Musk
(01:48:22)
This is actually not correct.
Lex Fridman
(01:48:22)
Which part?
Elon Musk
(01:48:24)
Well, the first part, I did actually have the funding secured and there was a big trial in San Francisco, a big civil trial, and the jury found me not guilty. Unanimous finding of a San Francisco jury.
Lex Fridman
(01:48:40)
And here, it’s implying that it was not in fact secured.
Elon Musk
(01:48:45)
I think this is taking things from the press. Yeah, that is not correct. The reason I agreed to the fine for the SEC is not because the SEC was correct, that was extremely bad behavior by the SEC, corruption, frankly. But if I did not agree to pay the fine, Tesla would’ve gone bankrupt immediately.

(01:49:08)
So, I was told by our CFO that the banks would immediately suspend our lines of credit. And if they suspend our lines of credit, at that time, we would’ve gone bankrupt instantly. So, there would never have been an opportunity for a trial because Tesla would be dead. So really, this is like someone holding a gun to your kid’s head and saying, “Pay $20 million and admit…” This is like a hostage negotiation.
Lex Fridman
(01:49:34)
Was that story fully told? I mean, SEC, in its best form, could be a force for good.
Elon Musk
(01:49:42)
It should be. But not once did the SEC go after any of the hedge funds who were nonstop shorting and distorting Tesla. Not once. The hedge funds would lie flat out on TV for their own gain at the expense of retail investors. Not once. Literally a thousand times, not once did the SEC pursue them.
Lex Fridman
(01:50:06)
How do you explain this failure on-
Elon Musk
(01:50:08)
The incentive structure is messed up because the lawyers at the SEC are not paid well, it’s a fairly low paying job, but what they’re looking for is a trophy from the SEC. They’re looking for something they put on, basically, their LinkedIn. From that, they can get a job at a high paying law firm. That’s exactly what the lawyer here did.

(01:50:37)
And the reason they don’t attack the hedge funds is because those hedge funds employ those law firms. And they know if they attack the hedge funds, they’re affecting their future career prospects. So, they sell small investors down the river for their own career. That’s what actually happens. Regulatory capture.
Lex Fridman
(01:50:59)
Regulatory capture.
Elon Musk
(01:51:00)
Yeah. Not good. So, the only reason I accepted that thing… Technically, it was a… It’s neither admit nor deny guilt. But the only reason I agreed to that at all was because I was told Tesla would be bankrupt otherwise. If there was an SEC investigation like this, banks would suspend funding, we’re bankrupted immediately, at the time. Now, we’re in a much stronger position.
Lex Fridman
(01:51:30)
Take that, Grok.
Elon Musk
(01:51:32)
Yes. Unfortunately, Grok is taking too much from the conventional media. Also, that guy was not a cave diver.
Lex Fridman
(01:51:45)
There’s a time where Elon called a British cave diver a, “pedo guy” after the diver criticized Musk’s plan to rescue a group of boys trapped in a Thai cave. That little outburst earned him another lawsuit, and he had to apologize and pay a settlement.
Elon Musk
(01:52:00)
That’s false, there was no settlement. There was a court case, which the guy who was not a cave diver and was not part of the rescue team, filed a lawsuit against me and lost and he received nothing. So in this case, it is wrong. It is also, I guess, taken this from the conventional media.
Lex Fridman
(01:52:23)
Actually, there’s an interesting question here.
Elon Musk
(01:52:25)
These are public court cases, both the SEC civil case, where the civil complaint from the SEC guys lost, unanimous jury verdict in San Francisco. They picked San Francisco because they thought it was the place I was most likely to lose, and a unanimous verdict in my favor. The LA trial, also they picked that venue because they thought I was most likely to lose. Unanimous verdict in my favor. Both cases I won. Yeah.
Lex Fridman
(01:53:00)
I mean, there’s an interesting question here, there seems to be a lot more clicks if a journalistic organization writes a negative article about you, Elon Musk. That’s one of the best ways to get clicks. So how do you, if you’re training Grok, not train on articles that have misaligned incentives.
Elon Musk
(01:53:26)
We need to add the training set of the actual legal decisions. This is actually helpful, because if you actually read the court-
Lex Fridman
(01:53:26)
Which are public.
Elon Musk
(01:53:41)
Which are public. The court conclusions, they’re completely the opposite of what the media wrote.
Lex Fridman
(01:53:47)
So, always striving for the ground truth, beyond the reporting.
Elon Musk
(01:53:50)
Yeah. What did the judge actually write? What does the jury and the judge actually conclude? And in both cases they found me innocent. And that’s after venue shopping, trying to find the venue where I’m most likely to lose. I mean, obviously, it can be a much better critique than this. I mean, I’ve been far too optimistic about autopilot.
Lex Fridman
(01:54:16)
The critique I got, by the way, was more about that, which is it broke down a nice bullet point list for each of your companies, the set of predictions that you made, when you’ll deliver, when you’ll be able to solve, for example, self-driving, and it gives you a list. And it was probably compelling, and the basic takeaway is you’re often too optimistic about how long it takes to get something done.
Elon Musk
(01:54:38)
Yeah. I mean, I would say that I’m pathologically optimistic on schedule. This is true. But while I am sometimes late, I always [inaudible 01:54:47] in the end.
Lex Fridman
(01:54:49)
Except with Uber Lilith. No.
Elon Musk
(01:54:51)
We’ll see.

Politics

Lex Fridman
(01:54:56)
Okay. Over the past year or so since purchasing X, you’ve become more political, is there a part of you that regrets that?
Elon Musk
(01:55:03)
Have I?
Lex Fridman
(01:55:04)
In this battle to counterweigh the woke that comes from San Francisco-
Elon Musk
(01:55:14)
Yeah. I guess if you consider fighting the woke mind virus, which I consider to be a civilizational threat, to be political, then yes.
Lex Fridman
(01:55:20)
So basically, going into the battleground of politics. Is there a part of you that regrets that?
Elon Musk
(01:55:26)
Yes. I don’t know if this is necessarily one candidate or another candidate, but I’m generally against things that are anti-meritocratic or where there’s an attempt to suppress discussion, where even discussing a topic is not allowed. Woke mind virus is communism rebranded.
Lex Fridman
(01:55:51)
I mean, that said, because of that battle against the woke mind virus, you’re perceived as being the right wing.
Elon Musk
(01:55:58)
If the woke is left, then I suppose that would be true. But I’m not sure, I think there are aspects of the left that are good. I mean, if you’re in favor of the environment, if you want to have a positive future for humanity, if you believe in empathy for your fellow human beings, being kind and not cruel, whatever those values are.
Lex Fridman
(01:56:23)
You said that you were previously left or center left.
Elon Musk
(01:56:23)
Well, sort of.
Lex Fridman
(01:56:26)
What would you like to see in order for you to consider voting for Democrats again?
Elon Musk
(01:56:30)
No. I would say that I would be probably left of center on social issues, probably a little bit right of center on economic issues.
Lex Fridman
(01:56:40)
And that still holds true?
Elon Musk
(01:56:42)
Yes, but I think that’s probably half the country, isn’t it?
Lex Fridman
(01:56:46)
Maybe more.
Elon Musk
(01:56:47)
Maybe more.
Lex Fridman
(01:56:49)
Are you and AOC secretly friends? Bigger question, do you wish you and her, and just people in general of all political persuasions, would talk more with empathy and maybe have a little bit more fun and good vibes and humor online?
Elon Musk
(01:57:05)
I’m always in favor of humor. That’s why we have funny mode.
Lex Fridman
(01:57:08)
But good vibes, comradery humor, like friendship.
Elon Musk
(01:57:15)
Yeah. Well, I don’t know AOC. I was at the Met ball when she attended, and she was wearing this dress. But I can only see one side of it, so it looked like eat the itch, but I don’t know-
Lex Fridman
(01:57:35)
What the rest of it said? Yeah.
Elon Musk
(01:57:36)
Yeah.
Lex Fridman
(01:57:36)
I’m not sure.
Elon Musk
(01:57:39)
Something about the itch, eat the itch.
Lex Fridman
(01:57:42)
I think we should have a language model complete. What are the possible ways to complete that sentence? And so, I guess that didn’t work out well. Well, there’s still hope. I root for friendship.
Elon Musk
(01:57:55)
Yeah, sure. Sounds good. More carrot, less stick.

Trust

Lex Fridman
(01:57:58)
You’re one of, if not the, most famous, wealthy and powerful people in the world, and your position is difficult to find people you can trust.
Elon Musk
(01:58:05)
Trust no one, not even yourself. Not trusting yourself.
Lex Fridman
(01:58:07)
Okay. You’re saying that jokingly, but is there some aspect-
Elon Musk
(01:58:11)
Trust no one, not even no one.
Lex Fridman
(01:58:15)
I’m going to need an hour just to think about that, and maybe some drugs, and maybe Grok to help. I mean, is there some aspect of that, just existing in a world where everybody wants something from you, how hard is it to exist in that world?
Elon Musk
(01:58:29)
I’ll survive.
Lex Fridman
(01:58:30)
There’s a song like that too.
Elon Musk
(01:58:32)
I will survive.
Lex Fridman
(01:58:33)
Were you petrified at first? Okay. I forget the rest of the lyrics. But you don’t struggle with this? I mean, I know you survive, but there’s ways-
Elon Musk
(01:58:44)
Petrify is a spell in the druid tree.
Lex Fridman
(01:58:47)
What does it do?
Elon Musk
(01:58:48)
Petrify. It turns the monsters into stone.
Lex Fridman
(01:58:56)
Literally?
Elon Musk
(01:58:56)
Yeah, for like six seconds.
Lex Fridman
(01:58:59)
There’s so much math in Diablo that breaks my brain.
Elon Musk
(01:59:02)
It’s math nonstop.
Lex Fridman
(01:59:04)
I mean, really, you’re laughing at it, but it can put a huge amount of tension on a mind.
Elon Musk
(01:59:13)
Yes, it can be definitely stressful at times.
Lex Fridman
(01:59:16)
Well, how do you know who you can trust in work and personal life?
Elon Musk
(01:59:20)
I mean, I guess you look at somebody’s track record over time, and I guess you use your neural net to assess someone.
Lex Fridman
(01:59:31)
Neural nets don’t feel pain. Your neural net has consciousness, it might feel pain when people betray you. It can make-
Elon Musk
(01:59:40)
To be frank, I’ve almost never been betrayed. It’s very rare, for what it’s worth.
Lex Fridman
(01:59:50)
I guess karma, be good to people and they’ll be good to you.
Elon Musk
(01:59:53)
Yeah, karma is real.
Lex Fridman
(01:59:55)
Are there people you trust? Let me edit that question. Are there people close to you that call you out on your bullshit?
Elon Musk
(02:00:06)
Well, the X platform is very helpful for that, if you’re looking for critical feedback.
Lex Fridman
(02:00:12)
Can it push you into the extremes more? The extremes of thought make you cynical about human nature in general?
Elon Musk
(02:00:19)
I don’t think I will be cynical. In fact, my feeling is that one should be… Never trust a cynic. The reason is that cynics excuse their own bad behavior by saying, “Everyone does it.” Because they’re cynical. So, I always be… It’s a red flag if someone’s a cynic, a true cynic.
Lex Fridman
(02:00:49)
Yeah, there’s a degree of projection there that’s always fun to watch from the outside and enjoy the hypocrisy.
Elon Musk
(02:00:58)
This is an important point that I think people who are listening should bear in mind. If somebody is cynical, meaning that they see bad behavior in everyone, it’s easy for them to excuse their own bad behavior by saying that, “Well, everyone does it.” That’s not true. Most people are kind of medium good.
Lex Fridman
(02:01:23)
I do wish the people on X will be better at seeing the good in other people’s behavior. There seems to be a weight towards seeing the negative. Somehow, the negative is sexier. Interpreting the negative is sexier, more viral. I don’t know what that is exactly about human nature.
Elon Musk
(02:01:44)
I mean, I find the X platform to be less negative than the legacy media. I mean, if you read a conventional newspaper, it makes you sad, frankly. Whereas, I’d say on the X platform, I mean, I really get more laughs per day on X than everything else combined from humans.
Lex Fridman
(02:02:11)
Laughs, it overlaps, but it’s not necessarily perfectly overlapping, with good vibes and celebrating others, for example. Not in a stupid, shallow, naive way, but in an awesome way. Something awesome happened, and you celebrate them for it. It feels that that is outweighed by shitting on other people. Now, it’s better than mainstream media, but it’s still…
Elon Musk
(02:02:38)
Yeah, mainstream media is almost relentlessly negative about everything. I mean, really, the conventional news tries to answer the question, what is the worst thing that happened on Earth today? And it’s a big world. So on any given day, something bad has happened.
Lex Fridman
(02:02:54)
And a generalization of that, what is the worst perspective I can take on a thing that happened?
Elon Musk
(02:03:01)
I don’t know. There’s just a strong negative bias in the news. I mean, I think a possible explanation for this is evolutionary, where bad news, historically, would be potentially fatal, like there’s lion over there or there’s some other tribe that wants to kill you. Good news, we found a patch of berries. It’s nice to have, but not essential.

Tesla’s Autopilot and Optimus robot

Lex Fridman
(02:03:30)
Our old friend, Tesla autopilot, is probably one of the most intelligent real world AI systems in the world.
Elon Musk
(02:03:38)
You followed it from the beginning.
Lex Fridman
(02:03:40)
Yeah. It was one of the most incredible robots in the world and continues to be. And it was really exciting, and it was super exciting when it generalized, became more than a robot on four wheels, but a real world AI system that perceives the world and can have potentially different embodiments.
Elon Musk
(02:04:02)
Well, I mean, the really wild thing about the end-to-end training is that it can read signs, but we never taught it to read. Yeah. We never taught it what a car was or what a person was, or a cyclist. It learnt what all those things are, what all the objects are on the road from video, just from watching video, just like humans. I mean, humans are photons in, controls out. The vast majority of information reaching our brain is from our eyes. And you say, “Well, what’s the output?” The output is our motor signals to our fingers and mouth in order to communicate. Photons in, controls out. The same is true of the car.
Lex Fridman
(02:05:01)
But by looking at the sequence of images… You’ve agreed with [inaudible 02:05:07] recently where he talked about LLM forming a world model, and basically language is a projection of that world model onto the sequence of letters. And you saying-
Elon Musk
(02:05:18)
It finds order in these things. It finds correlative clusters.
Lex Fridman
(02:05:27)
And in so doing, it’s understanding something deep about the world, which is… I don’t know, it’s beautiful.
Elon Musk
(02:05:35)
That’s how our brain works.
Lex Fridman
(02:05:38)
But it’s beautiful-
Elon Musk
(02:05:39)
Photons in, controls out.
Lex Fridman
(02:05:41)
[inaudible 02:05:41] are able to understand that deep meaning in the world. And so, the question is, how far can it go? And it does seem everybody’s excited about LLMs. In the space of self supervised learning in the space of text, it seems like there’s a deep similarity between that and what Tesla autopilot is doing. Is it, to you, basically the same, but different-
Elon Musk
(02:06:06)
They are converging.
Lex Fridman
(02:06:10)
I wonder who gets there faster, having a deep understanding of the world, or they just will naturally converge?
Elon Musk
(02:06:19)
They’re both headed towards AGI. The Tesla approach is much more computer efficient, it had to be. Because we were constrained on this… We only have 100 watts and [inaudible 02:06:37] computer. 144 trillion operations per second, which sounds like a lot, but is small potatoes these days. [inaudible 02:06:49] eight. But it’s understanding the world [inaudible 02:06:51] eight. It’s [inaudible 02:06:53].
Lex Fridman
(02:06:55)
But there, the path to AGI might have much more significant impact because it’s understanding… It will faster understand the real world than will LLMs. And therefore, be able to integrate with the humans in the real world faster.
Elon Musk
(02:07:13)
They’re both going to understand the world, but I think Tesla’s approach is fundamentally more compute efficient. It had to be, there was no choice. Our brain is very compute efficient, very energy efficient. Think of what is our brain able to do. There’s only about 10 watts of higher brain function, not counting stuff that’s just used to control our body. The thinking part of our brain is less than 10 watts. And those 10 watts can still produce a much better novel than a 10 megawatt GPU cluster. So, there’s a six order of magnitude difference there.

(02:07:56)
I mean, the AI has thus far gotten to where it is via brute force, just throwing massive amounts of compute and massive amounts of power at it. So, this is not where it will end up. In general, with any given technology, you first try to make it work, and then you make it efficient. So I think we’ll find, over time, that these models get smaller, are able to produce sensible output with far less compute, far less power. Tesla is arguably ahead of the game on that front because we’ve just been forced to try to understand the world with 100 watts of compute.
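The six-orders-of-magnitude figure quoted above is just the ratio of the two power numbers: roughly 10 W of higher brain function versus a 10 MW GPU cluster. A quick check of the arithmetic:

```python
import math

brain_watts = 10          # estimated higher brain function, per the figure above
gpu_cluster_watts = 10e6  # a 10 megawatt GPU cluster

# log10 of the ratio gives the number of orders of magnitude:
# 1e7 W / 10 W = 1e6, i.e. six orders of magnitude.
orders = math.log10(gpu_cluster_watts / brain_watts)
```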

(02:08:51)
And there are a bunch of fundamental functions that we forgot to include. So, we had to run a bunch of things in emulation. We fixed a bunch of those with hardware four, and then hardware five will be even better. But it does appear, at this point, that the car will be able to drive better than a human, even with hardware three and 100 watts of power. And really, if we really optimize it, it could be probably less than 50 watts.
Lex Fridman
(02:09:26)
What have you learned about developing Optimus, about applying, integrating this real world AI into the space of robotic manipulation, just humanoid robotics? What are some interesting tiny or big things you’ve understood?
Elon Musk
(02:09:47)
I was surprised at the fact that we had to develop every part of the robot ourselves. There were no off-the-shelf motors, electronics, sensors. We had to develop everything. We couldn’t actually find a source of electric motors for any amount of money.
Lex Fridman
(02:10:12)
It’s not even just a matter of efficiency and expense, it’s like anything, there’s not…
Elon Musk
(02:10:17)
No.
Lex Fridman
(02:10:19)
The actuators, everything has to be designed from scratch.
Elon Musk
(02:10:23)
Yeah. We tried hard to find anything that was… Because you think of how many electric motors are made in the world. There’s like tens of thousands, hundreds of thousands of electric motor designs. None of them were suitable for a humanoid robot, literally none. So, we had to develop our own. Design it specifically for what a humanoid robot needs.
Lex Fridman
(02:10:51)
How hard was it to design something that can be mass manufactured, that could be relatively inexpensive? I mean, if you compare to Boston Dynamics’ Atlas, which is a very expensive robot.
Elon Musk
(02:11:02)
It is designed to be manufactured in the same way they would make a car. And I think, ultimately, we can make Optimus for less than the cost of a car. It should be, because if you look at the mass of the robot, it’s much smaller and the car has many actuators in it. The car has more actuators than the robot.
Lex Fridman
(02:11:23)
But the actuators are interesting on a humanoid robot with fingers. So, Optimus has really nice hands and fingers, and they can do some interesting manipulation, soft touch robotics.
Elon Musk
(02:11:38)
I mean, one of the goals I have is can it pick up a needle and a thread and thread the needle just by looking?
Lex Fridman
(02:11:47)
How far away are we from that? Just by looking, just by looking.
Elon Musk
(02:11:51)
Maybe a year. Although, I go back to I’m optimistic on time. The work that we’re doing in the car will translate to the robot.
Lex Fridman
(02:11:59)
The perception or also the control?
Elon Musk
(02:12:02)
No, the controls are different. But the video in, controls out. The car is a robot on four wheels. Optimus is a robot with hands and legs.
Lex Fridman
(02:12:15)
So, you can just-
Elon Musk
(02:12:16)
They’re very similar.
Lex Fridman
(02:12:17)
So, the entire machinery of the learning process, end-to-end, is the same, you just have a different set of controls?
Elon Musk
(02:12:23)
After this, we’ll figure out how to do things by watching videos.

Hardships

Lex Fridman
(02:12:28)
As the saying goes, be kind, for everyone you meet is fighting a battle you know nothing about.
Elon Musk
(02:12:33)
Yeah, it’s true.
Lex Fridman
(02:12:34)
What’s something difficult you’re going through that people don’t often see?
Elon Musk
(02:12:38)
Trying to defeat Uber Lilith. I mean, my mind is a storm and I don’t think most people would want to be me. They may think they would want to be me, but they don’t. They don’t know, they don’t understand.
Lex Fridman
(02:13:11)
How are you doing?
Elon Musk
(02:13:14)
I’m overall okay. In the grand scheme of things, I can’t complain.
Lex Fridman
(02:13:21)
Do you get lonely?
Elon Musk
(02:13:24)
Sometimes, but my kids and friends keep me company.
Lex Fridman
(02:13:33)
So, not existential.
Elon Musk
(02:13:36)
There are many nights I sleep alone. I don’t have to, but I do.
Lex Fridman
(02:13:46)
Walter Isaacson, in his new biography of you, wrote about your difficult childhood. Will you ever find forgiveness in your heart for everything that has happened to you in that period of your life?
Elon Musk
(02:14:01)
What is forgiveness? At least I don’t think I have a resentment, so nothing to forgive.
Lex Fridman
(02:14:20)
Forgiveness is difficult for people. It seems like you don’t harbor resentment.
Elon Musk
(02:14:28)
I mean, I try to think about, what is going to affect the future in a good way? And holding onto grudges does not affect the future in a good way.
Lex Fridman
(02:14:41)
You’re a father, a proud father. What have you learned about life from your kids? Those little biological organisms.
Elon Musk
(02:14:53)
I mean, developing AI and watching, say, little X grow is fascinating because there are far more parallels than I would’ve expected. I mean, I can see his biological neural net making more and more sense of the world. And I can see the digital neural net making more and more sense of the world at the same time.
Lex Fridman
(02:15:19)
Do you see the beauty and magic in both?
Elon Musk
(02:15:21)
Yes. I mean, one of the things with kids is that you see the world anew in their eyes. To them, everything is new and fresh. And then, when you see that, them experiencing the world as new and fresh, you do too.
Lex Fridman
(02:15:52)
Well, Elon, I just want to say thank you for your kindness to me and friendship over the years, for seeing something in a silly kid like me, as you’ve done for many others. And thank you for having hope for a positive future for humanity, and for working your ass off to make it happen. Thank you, Elon.
Elon Musk
(02:16:11)
Thanks, Lex.
Lex Fridman
(02:16:13)
Thank you for listening to this conversation with Elon Musk. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words that Walter Isaacson wrote about the central philosophy of how Elon approaches difficult problems: “The only rules are the ones dictated by the laws of physics.” Thank you for listening, and hope to see you next time.