Transcript for Tim Sweeney: Fortnite, Unreal Engine, and the Future of Gaming | Lex Fridman Podcast #467

This is a transcript of Lex Fridman Podcast #467 with Tim Sweeney. The timestamps in the transcript are clickable links that take you directly to that point in the main video. Please note that the transcript is human generated, and may have errors.

Episode highlight

Tim Sweeney (00:00:00) Humans are by far the hardest part of computer graphics because millions of years of evolution have given us dedicated brain systems to detect patterns in faces and infer emotions and intent, because cavemen had to, when they saw a stranger, determine whether they were likely friendly or they might be trying to kill them. And so people in the world have extraordinarily detailed expectations of a face, and we can notice imperfections, especially imperfections arising from computer graphics limitations. Okay, one part is capturing humans, and so [inaudible 00:00:33] really advanced, dedicated hardware that puts a human in a capture sphere with dozens of cameras in it taking high resolution, high frame rate video of them as they go through a range of motions. And then capturing the human face is complicated because of the nuanced detail of our faces and how all the muscles and sinews and fat work together to give us different expressions.
(00:00:53) So it’s not only about the shape of a person’s face, but it’s also about the entire range of motion that they might go through. So that’s the data problem. There’s a lot of other problems with computer graphics. There’s technology for rendering hair, which is really hard. Because you can’t render every… Again, we know the laws of physics. It would be easy to just render every hair. It would just be a billion times too slow. So you need approximations that capture the net effect of hair on rendering and on pixels without calculating every single interaction of every light with every strand of hair. That’s one part of it. There’s detailed features for different parts of faces. There’s subsurface scattering because we think of humans as opaque, but really our skin, light travels through it. It’s not completely opaque, and the way in which light travels through skin has a huge impact on our appearance.
(00:01:38) And this is why there’s no way you can paint a mannequin to look realistic for a human. It’s just a solid surface and we’ll never have the sort of detail you see.
Lex Fridman (00:01:48) That kind of blew my mind, thinking through that. I think I heard that sort of the oiliness of the skin creates very specific, nuanced, complex reflections and then some light is absorbed and travels through the skin and that creates textures that our human eye is able to perceive and it creates the thing that we consider human, whatever that is. All of that, while considering all the muscles involved in making the nuanced expression, just the subtle squinting of the eyes or the subtle formation of a smile, it’s the subtlety of human faces that you have to capture, like the difference between a real smile and a fake smile, but the way to show beginning of a formation of a smile that actually reveals a deep sadness, all of that, when I watch a human face, I can read that. I could see that you have to have the tools that, in real time, can render something like that, and that’s incredibly difficult.
Tim Sweeney (00:02:50) That’s right. Getting faces right requires the interplay of literally dozens of different systems and aspects of computer graphics. And if any one of them is wrong, your eye is completely drawn to that and you find yourself on the wrong side of the uncanny valley.

Introduction

Lex Fridman (00:03:06) The following is a conversation with Tim Sweeney, a legendary video game programmer, founder and CEO of Epic Games, which created many incredible games and technologies, including the Unreal Engine and Fortnite, which both revolutionized the video game industry and the experience of playing and creating video games. This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Tim Sweeney.
Lex Fridman (00:03:06) When did you first fall in love with computers and maybe with programming?

10,000 hours programming

Tim Sweeney (00:03:42) I had a brother, Steve Sweeney, who is 16 years older than me, and at some point when I was a little kid, he went off to work in California for a tech company and he’d gotten one of the first IBM PCs. And so for one summer, I think I was about 11, I went to visit him in California. It was my first trip away from my family just to hang out with him and he had this brand new IBM computer and I learned to program over the course of a few days in BASIC. I was just blown away with the capabilities of computers at the time. It was unbelievable what they could accomplish, and I just was hooked from that point onward and very much wanted to be a programmer.
Lex Fridman (00:04:19) Do you remember what you wrote in BASIC? Is it a video game type thing? Is it like for loop, some numerical thing? Do you remember?
Tim Sweeney (00:04:27) Yeah, it’s funny. I have a perfectly vivid memory of all of the first things I learned to program. I have a hard time remembering people’s names, but code really sticks with me. Every step and every challenge, there were lessons learned, some of which I’ve come to realize were just like me getting over some learning hurdles. But other things were actually shortcomings of programming languages, and the realization that there are actually better ways. When a programmer is learning to program for the first time, a lot of what they’re facing isn’t the challenge of learning a new art. It’s friction introduced by failures of programming language design. And so I’ve constantly come back to those early lessons as I’ve progressed and done more and more things, including building programming languages.
Lex Fridman (00:05:11) Yeah, the friction and the pain is the guide to learning in programming. If I were to describe the programming journey, it’d be marked by pain, and you shouldn’t escape the pain. The pain is instructive for you to understand programming languages. But do you remember what kind of stuff you were writing at that time? Just the early programs?
Tim Sweeney (00:05:35) Yeah. In the early days, I wrote a little bit of everything. I wrote some games. The first game I wrote on the Apple II was, since I only knew how to program in text mode, the computer would throw asterisks across the screen, they’d flow from left to right, and you’d have a parenthesis on the right-hand side of the screen and it looks like a baseball mitt and you’re supposed to catch the asterisks. That was my very first game. It took about a couple hours to build and tune, and I went from there. But I built a lot of things. I built databases at different points. I built a programming language and a full compiler for a language like Pascal because I didn’t know where you went to buy one of those. So I made my own. And one of the fun things at that time was bulletin boards.
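The catch-the-asterisks game described here can be sketched in a few lines. This is a hypothetical reconstruction in Python (the original was BASIC on the Apple II, and all names here are invented): an asterisk flies across a text-mode row, and you catch it if your parenthesis “mitt” sits on the same row when it reaches the right edge.

```python
import random

WIDTH, HEIGHT = 20, 5  # size of the text-mode "screen"

def throw_asterisk(rng):
    """Pick the row an asterisk will fly along, left to right."""
    return rng.randrange(HEIGHT)

def play_round(mitt_row, asterisk_row):
    """Render the asterisk's flight frame by frame; it's caught if the
    player's parenthesis "mitt" is on the same row when the asterisk
    reaches the right-hand edge of the screen."""
    frames = []
    for col in range(WIDTH):
        line = [" "] * WIDTH
        line[col] = "*"
        frames.append("".join(line))
    caught = mitt_row == asterisk_row
    return frames, caught

rng = random.Random(0)  # seeded so the round is reproducible
target = throw_asterisk(rng)
frames, caught = play_round(mitt_row=target, asterisk_row=target)
print(caught)  # prints True: the mitt was on the right row
```

A real version would redraw each frame in place and read the keyboard to move the mitt between frames; the tuning Sweeney mentions is mostly the throw rate and speed.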
(00:06:17) Before we had the internet in the hands of consumers, you used your modem and you dialed into a local phone number and connected to whoever was running the computer there. And every town or city had hundreds of these bulletin boards run by different people with their own personalities and themes. And so I spent a lot of time building a bulletin board program and learning how to deal with database management and user interface and dealing with multiple users concurrently and things. And so, I don’t know, I’d probably spent about 10 or 15,000 hours writing code just on my own as a kid between age 10 and age 20 before I actually shipped a program to the outside world.

Advice for young programmers

Lex Fridman (00:06:56) 10 to 15,000 hours. What was the value of the hours you put into programming as a kid that led to the success you’ve had in later life? Maybe this is by way of advice to younger people in terms of how they allocate the hours of their early life.
Tim Sweeney (00:07:12) Yeah, it’s not just hours. It’s really striving to learn, to understand what knowledge you have, what knowledge you lack, and to continually do experiments and work on projects that improve your knowledge base. And I didn’t do this with a great amount of structure or planning. I was rather just going from project to project, doing things that I thought would be fun and cool. And with each project I learned new things, learning about how to store and manage data, learning how to deal with advanced data structures, how to write complex programs that have deeply nested data and control flow. Each one of those provided a lesson that was later essential. In 1991, I released my first game and over the course of that decade went from zero commercial releases to the first-generation Unreal Engine. But this was largely just using the knowledge that I’d built up over the previous decade, just doing fun hobby projects. And if I hadn’t done all of that work, there’s no way I could have ever built the things that came later.
Lex Fridman (00:08:15) All the experimentation and all the exploration somehow contributed, somehow made sense later on. All of that is integrated somehow in the stuff you build. It’s funny how life works. The pieces kind of come together eventually.
Tim Sweeney (00:08:32) Yeah, there are definitely Karate Kid moments, because all this time I was learning math in high school, and in college I studied mechanical engineering. And so you learn all kinds of math, vector calculus and vector math and matrices and all of these related fields, physics and stress and strain and how to deal with complex physical systems. And yeah, I wasn’t really sure how engineers would actually make use of that knowledge. Do you just forget about it when you actually go off to do work, or do you write down equations on paper? It was actually not clear as an early engineering student what you do, but when I started writing the first-generation Unreal Engine and I was dealing with 3D math, I was like, wait, I know this stuff. I learned this. And so suddenly, like the Karate Kid, you get to paint the fence and wax the car and suddenly put all the pieces together into a 3D engine based on a whole lot of accumulated programming language and math knowledge, often knowledge gained without ever anticipating that I might use it in that way.
Lex Fridman (00:09:37) Also, I think what’s useful is over and over learning a hard thing and then showing to yourself that you can do it, that you can learn a hard thing. So then when you come to having to write a 3D engine in ways that haven’t been done before, you’re like, I’ve been here. I’ve been here in this experience, I don’t know what to do, but we’ll figure it out. We’ll learn. I’ll learn all the necessary components. So just not being afraid of something new.
Tim Sweeney (00:10:10) That’s right. And constantly striving to make connections between these fields and look for their applications. Long after I shipped Unreal Engine, I was going back through an engineering textbook and looking at, oh yeah, I used that, I used that, I used that. And then I got to the section on eigenvalues. I’m like, I don’t know what the hell this is. But it turns out eigenvectors and eigenvalues were the critical breakthrough that made the Google search engine technology work and stand apart from the rest, because they found if you took all the links that exist in the web, links from and to different sites, and you put them in a giant matrix and computed it, you found the dominant eigenvectors.
(00:10:46) Then those eigenvectors described the best search results for different things. And so constantly picking up knowledge and looking for ways to put it together is the thing to do. And if you aspire to be a programmer, you’ve got to write a lot of code and you’ve got to continually learn new things and improve. And if you want to be an artist, you’ve got to continually draw artwork of all styles and all kinds and constantly push yourself to learn more and more, because you never know exactly what you’re going to end up doing in the long run, but the more knowledge you have and the more skills, the more chance you have of putting it together and being successful.
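The eigenvector idea described here is the core of PageRank: pile the web’s link structure into a matrix, and its dominant eigenvector ranks the pages. As a rough illustration, here is a minimal power-iteration sketch in Python; the four-page link graph and all names are invented for the example, and real search engines add many refinements on top of this.

```python
# Minimal PageRank by power iteration: repeatedly applying the damped
# link matrix converges to its dominant eigenvector, whose largest
# components mark the "best" pages. The tiny link graph is made up.
links = {  # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = sorted(links)
n = len(pages)
d = 0.85                            # damping factor used by PageRank
rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution

for _ in range(50):                 # power iteration
    new = {p: (1 - d) / n for p in pages}
    for p, outs in links.items():
        share = rank[p] / len(outs)  # split p's rank among its links
        for q in outs:
            new[q] += d * share
    rank = new

best = max(rank, key=rank.get)
print(best)  # prints "C": every other page links to it
```

Sorting all pages by their rank value gives the ordering a search engine would use among matching results.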
Lex Fridman (00:11:20) And whether you’re a programmer or an artist, you should probably take linear algebra, even though it doesn’t make sense at the time.
Tim Sweeney (00:11:25) I found getting an engineering degree and then never working in an engineering field, just being a computer programmer, was immensely valuable. I went to the University of Maryland, which for some disciplines is kind of known as a party school, but they worked the engineers to death; we worked really hard. And if you learn any engineering discipline, you learn massive amounts of math and you learn the rigor of problem solving, not just what you find from the Wikipedia article, but going through all of the exercises of solving complex problems and building up a series of solutions to arrive at an answer. It’s valuable and it embodies the knowledge that you need as a programmer. And people often go to university and think, okay, my goal here is to get good grades, so I get a diploma and I prove to an employer that I’m valuable.
(00:12:11) No, that’s just kind of the superficial bookkeeping of the university. The real purpose of all of this is to learn, and whether you learn formally or you learn on your own, it’s the learnings that are really valuable in a career. And especially if you’re going to be entrepreneurial, it’s really knowing the stuff that matters and not having the diplomas. There’s ever more pressure to rebuild society more and more around credentials. Do you have this certificate? Do you have that proof? But companies that are focused on just building great products and doing great things gravitate towards people who do the great work.
Lex Fridman (00:12:48) Yeah, one of the great things about youth is there’s more freedom. There’s just more time to learn. And people when they go to high school, they sometimes think, wow, I can’t wait to get out of this and be an adult and be free. But it’s not quite freedom. When you get a job and you start a family, all wonderful things, but you get more and more busy and less and less time to learn in the general sense, learn whatever the hell you want. That is a wonderful time in life, the teenage years, the early-twenties, the twenties when you could just learn random shit.
Tim Sweeney (00:13:25) Yeah, and I think this is something that’s kind of changing in America. There’s so much focus on grades and homework and structure around kids’ lives. When I was growing up, my mom would feed me and my neighbors’ moms would feed them breakfast and they’d be like, well, be back by dark.
(00:13:45) And, yeah, we’d go out and we’d play and we’d do all sorts of things. We’d explore the woods, we’d build go-karts, we’d salvage old pieces of electronics and build what we thought were our spacecraft control panels for the fake spaceships we were building as play, and we’d have an enormous amount of freedom. And from basically being a little kid through the time I went off to college, I had an enormous amount of free time. Some people just took that time and wasted it and watched TV. Some people socialized, and some people really got into serious projects. So many people at all times were doing cool things. I was programming, I was learning to build things.
(00:14:27) Before I was releasing games to the world, I’d be having neighborhood folks over to play the things I was working on and check them out. And sometimes they were impressed and sometimes they weren’t, and they’d have their own projects, and often we’d have spare time jobs and everybody was entrepreneurial. Everybody had a side gig. Sometimes you’d go around and mow people’s lawns or you’d rake the leaves up and earn money. And the freedom there and the organic learning that occurred there, I think, is something that is really critical to the American experience, and I worry it is increasingly going away as society is ever more protective and sheltering and makes it harder to get these experiences.

Video games in the 80s and 90s

Lex Fridman (00:15:07) So on the video game side, when did you first fall in love with video games?
Tim Sweeney (00:15:13) I’ve had a funny relationship with games because my real aspiration has always been to program cool stuff. And I get more enjoyment out of programming than anything else in the world. And so my two really formative experiences with games were, first, playing this game called Adventure for the Atari 2600. You moved this dot around the screen and picked up objects like swords and fought dragons and invaded castles and solved puzzles. Very, very simple iconic stuff rather than realistic graphics. And then the other game that I really got immersed in was Zork, which was a text adventure game. It would tell you where you are and what you see, and you type in commands like go north or pick up sword or open door and explore a world that way. So the game didn’t have any graphics, but in your mind you had this elaborate picture of what you were seeing there, and it really brought in [inaudible 00:16:09] inspired imagination more than other things.
(00:16:11) And playing those games led me to go off and want to learn to program everything that I saw there. And that drove a lot of my programming. I learned how to move a player around the screen. I learned how to build a design tool so I could build castles and save them off and then play them in a game. And I realized there was a separation between the tools that you use to build a game and the game itself, and that the more powerful tools you had, the more creativity you could unleash in yourself or others.
(00:16:36) And I learned all the programming techniques that supported games, like how to parse text: pick up sword and go north. How do you make that sentence into an actual series of commands on the computer? And that was really, really exciting. I have to say, until the time that Fortnite came out, I played video games primarily to learn what they were doing, so that I could go off and do that myself. I’d sit down when Wolfenstein came out and then Doom came out. I’d go through it and look at it pixel by pixel, I’d move the mouse very slightly and look at exactly what was happening to figure out.
Lex Fridman (00:17:10) That’s funny. That’s great.
Tim Sweeney (00:17:10) What technique was being used there? And that was puzzle solving at a grand scale, and it was so fun.
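The “pick up sword” / “go north” parsing described above was typically done with a small verb–noun parser that drops filler words and maps synonyms onto canonical commands. Here is a minimal sketch in Python; the verb table and filler-word list are invented for illustration, not Zork’s actual vocabulary.

```python
# A tiny verb-noun command parser in the style of early text adventures.
# The verbs, synonyms, and filler words here are invented for illustration.
VERBS = {"go": "go", "walk": "go", "get": "take", "take": "take",
         "pick": "take", "open": "open", "look": "look"}

def parse(command):
    """Turn a typed sentence into a (verb, object) action tuple."""
    words = command.lower().split()
    # Drop filler words so "pick up sword" becomes ("take", "sword").
    words = [w for w in words if w not in ("the", "a", "up", "at")]
    if not words:
        return None
    verb = VERBS.get(words[0])
    if verb is None:
        return None          # unknown verb: "I don't understand that."
    obj = words[1] if len(words) > 1 else None
    return (verb, obj)

print(parse("pick up sword"))   # prints ('take', 'sword')
print(parse("go north"))        # prints ('go', 'north')
print(parse("open the door"))   # prints ('open', 'door')
```

The game loop then dispatches on the canonical verb, which is why synonyms collapse to one entry: the rest of the engine only ever sees "take", never "get" or "pick up".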

Epic Games origin story

Lex Fridman (00:17:16) So take me there, in the early 90s. You launched Epic Games in 1991 with the writing of your first big video game, ZZT. What was it like? What were the technical challenges? What were the psychological challenges of building that?
Tim Sweeney (00:17:36) It was a funny project because I didn’t start out to build a video game. I’d just moved from an Apple II; my brother bought my family an Apple II right after I’d visited him in California. So I’d been programming on that for a few years, learned a lot of techniques, but there weren’t many Apple II users around by the time that cycle came to an end. [inaudible 00:17:56] so I’d just gotten an IBM PC of my own and was learning to program, and I realized I needed a text editor. So I started writing a text editor. A text editor is a program to edit text files. You have logic to move the cursor around and let people type things and backspace and delete and do all of those mundane actions. And one night I’d finished it up and I was like, well, okay, I have a text editor, but this is pretty boring.
(00:18:20) And so I made the cursor into a smiley face character and I had the different characters you could place in this document perform different gameplay actions. Some would be walls and some would kill you, and some would be moving objects that could fly around the screen. And so this text editor I made evolved into a little game editor. So I was building these levels for a game. And I put a lot of time into building an editor and a primitive set of objects, about 20 or 30 different objects. Enough to build a really cool and compelling game, but not so many that players would lose track of what they’re seeing.
(00:18:51) I started off just building different game levels. The idea is you’d be on a series of boards, they’d be connected by going north [inaudible 00:18:59], the end of the current board would take you to a new one if it was open or maybe it was blocked and then you couldn’t go there. I built this [inaudible 00:19:05] game world around that, and this was the game that became ZZT, and I was having fun with it, building it and playing it, but I didn’t know if it would really work. So I did this experiment. I started inviting neighbors over. Some adults, some kids of all different ages, and I sat them down in front of it and said, like, here’s the game I made, figure it out.
(00:19:23) And I had to force myself not to tell them what they needed to do because I really wanted to learn if they were able to discover it all for themselves. Today we would call this a user experience test, and there’s a whole field of research around user experience research, but back then it was just inviting some kids over to play the game. I took notes about what they got stuck on and what they enjoyed and where they felt bored, and just iteratively polished the game until I felt it was good, and then I put it out and released it on, well, this was before the internet, so there were bulletin boards. I uploaded it to a bunch of local bulletin boards, and from there it started spreading because the way to build up cred for bulletin board users was to upload new files and to claim that, hey, I was the first that brought this to you.
(00:20:06) And so there was a natural tendency of the software to spread. And I decided to use the shareware model, so I didn’t just build this one game. I built a trilogy of three games. And I released the first one for free and I said, hey, if you like this, buy the two sequels. And I included my parents’ mailing address and said, send us $30 and you can get the sequels to this game. And the checks started coming in within a few days and I was getting three or four orders a day. I was making a hundred dollars a day. I’m like, woo, I’m rich. Because being a 20-year-old, that was a pretty big deal.
Lex Fridman (00:20:43) What did that feel like, just getting money and probably feeling this immense success from something you’ve created?
Tim Sweeney (00:20:51) Well, I’ve looked at money always just as a tool to help you fund accomplishing cool things and having enough to do the things you want to do is the critical thing. It’s always been just very utilitarian, but the knowledge that other people all around the country and then a month later, all around the world, were playing the game, that was mind-boggling that me, the solo kid who’d put out a game on a local bulletin board, could be doing international business and shipping discs all over the world to players because the software was spreading on its own, it was just magical.
(00:21:27) And that was a new thing for software. That did not happen with mechanical devices. You manufactured one, you sold it to somebody and they had it, and that was it. But software could spread. That was just really cool to see. And it made me realize there was really no upward limit on the [inaudible 00:21:40] for a business like that. We saw Microsoft as the big juggernaut company at the time, but it was like, hey, if Epic does games good enough, we could accomplish what they’d accomplished with operating systems. And the sky was the limit. And I think this is the age we live in now. You don’t have to be an industrialist manufacturing physical products; anybody who builds anything digitally, if it’s good enough, you can reach the entire world and build the next Microsoft or Meta or Apple or Google or Epic Games.
Lex Fridman (00:22:12) That’s such a cool origin story though. You started out building a text editor, so you’re looking at this project, you’re playing around with it, building up the tools. And that’s such an inspiring moment, because a lot of us start out building a project, and to allow yourself to see the potential pivots, the potential trajectories it can go down, is really nice. To sit back, allow yourself to be bored and like, ah, I’m going to go this way. I mean, that’s like a crossroads. You came to a crossroads. I mean, you built compilers, you designed your own programming language, databases, all these things you’ve mentioned, and you started building a text editor and then it came to this crossroad: I’m going to make this fun. And then from there, one of the most legendary gaming companies was created. It’s kind of cool. That’s an inspiring thing for developers: be open to the possibility of creating something you didn’t plan to create and just go with it. Right? That’s cool.
Tim Sweeney (00:23:20) Yeah, and a bunch of learnings emerged really quickly there. The neat thing I did with ZZT was I didn’t just release the game, I also released the editor with it. I’d built this tool so I could make these ZZT boards that people could play, but I also gave it to all of the players themselves. And 30 years later, I still run into people when I go to a game industry event, and it’s like, I grew up playing ZZT, and here’s an adult who grew up playing my game. And it was because it enabled anybody to become a creator too. It had this whole board editor and it also had a little scripting language, so you could learn a little bit of programming in it too. And it kind of made an impression, and it really set a formative principle of Epic, which was that the company’s mission is to make awesome entertainment, but also awesome tools, and to share those tools with everybody so that they can build their own amazing things too.
(00:24:11) And when we got into Unreal Engine a few years later, the interplay between us building a game and us building tools that were widely used by others was a critical part of that. And I think that’s the sole reason that Epic has been massively successful. And actually the reason that we’ve survived all of this time is by serving both creators and gamers. We’ve been able to weather the ups and downs of the game industry. It’s a brutal place for companies. We’ve been able to survive every financial downturn, and sometimes the engine’s been funding the business because we didn’t have a game. And sometimes the games have been funding the business. And it really set a principle in our culture that’s persevered and is continually brought to the forefront.
Lex Fridman (00:24:53) But on the editor front, that’s such a fascinating philosophy that you always allow people to create their own worlds. You have an engine from which you simulate the world that the game is in. You have the actual game, and you also have the freedom for creators to create various, in Fortnite, islands of their own. With everything you ship, that freedom to create is always there. That’s really interesting.
Tim Sweeney (00:25:23) Yeah, and it’s something we aim to do more and more fully over time. In the course of building Fortnite, we’ve built a lot of other tools. They’re useful for us too, because it’s not just a game powered by Unreal Engine, but it’s also a social ecosystem where people can make friends and voice chat and get together and party. So we’ve opened up all of those social features into Epic Online Services, and we give them away to all developers for free, because we all benefit from growth in that user base. And our goal is ultimately to build the company’s products on the same technology that we share with everybody else, and to hope that fosters a bigger and bigger ecosystem over time where everybody benefits.
Lex Fridman (00:26:03) If we could just linger on the 90s, so you said bulletin boards, maybe you can explain what that’s like and also explain the birth of the internet, what that was like. What was the internet like in the 90s?
Tim Sweeney (00:26:16) So the internet is a funny thing. It started out as this defense department research project called the ARPANET, the Advanced Research Projects Agency Network. And it was kind of like this revered secret thing that became more and more open as they connected universities. Universities connected to the internet in the mid-1980s. And so if you were at a prestigious institution with access to computers, you could get on there, but as consumers back then, we just had these modems, this thing you plug into your phone line, and it dials up a phone number and then it sends wild sound effects over the telephone line to send digital signals back and forth.
(00:26:55) And these were really slow. The first modem I had was 300 baud. That means 30 characters per second of data. So you’re sitting there watching a sentence slowly emerge character by character as you’re going online, but that’s how we got online and we talked with each other. So you dial up to a local bulletin board, it’ll be run by a person. Usually they have a computer or two sitting in their kitchen or something that’s running the bulletin board, and they have a small community of a few hundred users all competing to connect to that one phone line. It was often busy and you couldn’t get in. And the more popular bulletin boards were hardest to get to.
Lex Fridman (00:27:29) Nice.
Tim Sweeney (00:27:29) But you had all kinds of communities develop, and you could see there were the programming communities where people talked about programming. There was the news and events community. I lived in the outskirts of Washington DC, so that was a big thing. But then there was the pirate community, where they’re sharing pirated Apple II games, and very different community ethos and mantras out there, but all really nice and also very small. These bulletin boards couldn’t grow to the size of Facebook because your phone line couldn’t take that many calls. And then later in the 1990s, the internet, which had been fostered in these colleges, started opening up to the public, and anybody could connect to it. And suddenly the world took on a life of its own. It became much, much easier to reach a global audience faster.
Lex Fridman (00:28:16) And you would start shipping games to the internet, which is a bit of a crazy thing to do because you’re supposed to have a physical copy, but to post on the internet is pretty innovative. Even shareware is pretty innovative.
Tim Sweeney (00:28:31) Yeah, it’s been a funny transition for the game business. Epic started out making shareware games, distributed digitally, but as the first 3D games took off, like Wolfenstein and Doom from id Software, and then Unreal from us took off to reach a huge audience of millions of users, we had to go into retail stores. So we worked with a retail publisher, and they made a box and they put CD-ROMs in the box, and then the world started transitioning back to digital. And that transition didn’t start well, right? The initial transition of gaming to digital was all BitTorrent, all piracy, and there were horror stories about games that would sell a hundred thousand copies but have 2 million users because most people pirated them.
(00:29:15) And then Steam came along and introduced digital distribution and made digital distribution of legit games so convenient that most players moved away from piracy towards that, and their practices were then followed by others, and the early digital industry took form.

Indie game development

Lex Fridman (00:29:33) Yeah, it’s fascinating. I mean, pirates do lead the way for innovation, same as the story of Spotify. Basically, I think most people, when they derive value from things like video games, want to pay for those video games; they just want it to be easy. And so the same thing with music, with Spotify. But maybe just staying on the 90s, there are going to be a lot of indie game developers who will listen to us talking today. Can you go back to that mindset and try to derive some wisdom and advice for those folks, from when you were just a solo developer, or maybe just a small group of people, creating your early games that eventually became this huge gaming company? In the early days, what were you going through? What were the ups and downs? What did it take to stay strong and persevere?
Tim Sweeney (00:30:31) Well, one of the critical things that Epic always worked hard to do was to make something different that nobody else was doing, and to try to satisfy a small audience rather than competing globally with the game juggernauts. Back in the 1990s, Epic was new, but Electronic Arts and Activision and the other big publishers had been around for a decade, and they were huge companies. They had giant retail distribution networks. If I had tried to make a game and then convince them to publish it, I doubt I would have had a chance. And I doubt that even if I had made a successful game, that I would’ve made much money from it, though they might have. And so the really unique angle for Epic then was shareware. And that was just the idea that if we distribute our game differently, then we can reach a much larger audience than these bigger competitors by virtue of the first episode of the game being free.
(00:31:24) It was kind of the advent of what later became free to play. And the logic of that is just as true now as it was then. If the thing is free and anybody can get into it, then it’s going to spread from friend to friend as people bring their real world friends into the games they’re playing, and you have the opportunity to build up a community around that. So the other lesson there was just minimize the friction of people getting into your game, make it easy to get into and make it fun. I think the other, well, I was very fortunate. ZZT was a funny game. It was not much like any other game. It had much worse graphics because it was all just text characters, smiley faces and other Greek letters and things participating in this game simulation.
(00:32:09) They were kind of iconic representations of characters rather than real ones. And this was decades into the age of real graphical games with interesting graphics. And so it wasn’t even trying to compete in that area, but it was able to compete in a different area, which is that it wasn’t just the three games that I’d made and shipped as a trilogy that were successful and drove the success of the product. It was the fact that I released an editor and there was a whole community around it. And you see that trend has repeated itself; ZZT was one of them. Before that, there was Bill Budge’s Pinball Construction Set. That was a 1980s Apple game that let users build their own pinball tables. And since then, you’ve had some of the world’s most successful games follow that path. Like Minecraft, you can build your own stuff.
(00:32:52) Roblox, now Fortnite Creative and Unreal Editor for Fortnite. Games that become platforms for other people to build stuff have been a real opportunity. I think the big thing to realize for indie developers right now is there’s massive, massive competition in every major genre, and it’s very unlikely that, unless you just happen to be the world’s best at a particular thing, you’re going to release a game in an existing highly competitive genre and win. A much better chance of success is in releasing something that hasn’t been done before. Being really unique and reaching an audience, whether big or medium-sized or small, reaching an audience and becoming really popular with that, making some money from it, and being able to reinvest and then expand towards your ultimate dream. I think the one-shot go from idea to commercial success at massive scale is a lot less likely than the multistep process of continually building better and better stuff over time until you get into a position of excellence.
Lex Fridman (00:33:54) And constantly try to do something that others aren’t doing.
Tim Sweeney (00:33:58) Yeah, that’s right. Because if you look at every market, there’s a few markets where the current leader…
(00:34:03) Look at every market. There’s a few markets where the current leader came late to the space, usually because the prior leader failed so horribly. But most of the time the company that’s succeeding and winning in a market is the first or second entrant there. They’ve just continually buoyed their success.
Lex Fridman (00:34:20) Great advice and fascinating. But on a human level, was it lonely, was it scary, you sitting there as a developer?
Tim Sweeney (00:34:29) I’d say it was the opposite of lonely because the thing that spurred me to actually release this was seeing kids playing the game in my neighborhood and having fun and being like, “This is really good.” And seeing them enjoying it and laughing and pointing at the screen and getting together and just wanting to play more.
Lex Fridman (00:34:47) That’s awesome.
Tim Sweeney (00:34:49) And the human element was always pervasive because I not only received orders, but people would actually write letters. We wrote letters back then in the 1990s. People would say how much they were enjoying the game and how their kids were playing the game and so on and so on. So it felt very connected.
(00:35:06) And I think a lot of businesses have to make scary decisions because you’re spending potentially all of the money you have to take a shot at something that you’re not sure will succeed. I was very fortunate starting a business like this because it didn’t really need any capital. The capital was, well, several thousand dollars in computers I’d bought by mowing lawns. And it wasn’t much risk. If that hadn’t succeeded, I guess I could have figured out how people get mechanical engineering jobs and pursued that. But once it took off and once the orders started coming in and people started writing letters saying they were enjoying the game, I knew I was going to go all out and try to build a company there and succeed, and that was going to be my big goal.

Unreal Engine

Lex Fridman (00:35:48) So I’m sure people know, but Epic Games was created in 1991 and went on to transform the gaming industry several times, one of which is Unreal Engine. So let’s talk through the origin story of that. You said that when Wolfenstein and Doom came out, that changed everything, so take me to that moment.
Tim Sweeney (00:36:11) Yeah, that was a very interesting time. After my first couple of games, Epic had recruited developers, usually college students or high school students who were working on their own and had real skills but didn’t have an outlet for their work. Epic had been matchmaking the best artists and programmers together from all over the world. Jazz Jackrabbit was Cliff Bleszinski, a high school kid in California, who had made a really cool adventure game together with Arjan Brussee, a demo coder from Holland who would make amazing graphical stuff and had built a 2D game engine. We had connected them together, and a musician, Robert Allen in California. And by telephone and modem and so on we were building these little 2D games and having quite a lot of success. There were a bunch of people making thousands of dollars a month in royalties, while they were still students, from the games that Epic was producing by coordinating with people and publishing through shareware.
(00:37:07) And that was all going great. The company had a little office and we were copying floppy disks and mailing them out. But when Wolfenstein came out, we realized the future of gaming was going to be 3D. There had been a lot of experiments in 3D before that hadn’t been great. There were 3D renderings of mazes that were not in real time, and you were always looking north, south, east or west, and then there were vector graphics with little wireframes moving around and things. But Wolfenstein was the first game that was fast enough, running at 30 frames per second, that it really felt immersive. It felt like you were there. You were in this Castle Wolfenstein fighting Nazis. And that was a really amazing and immersive experience.
(00:37:51) 3D graphics were pretty primitive then, and id Software followed shockingly fast with Doom, which was a much, much more capable 3D engine. It had stairs, and though it was still what we call two-and-a-half-D, it was environments that were very realistic, textures that were very realistic, a form of lighting that was approximate but incredibly realistic. And just such great artistry and sound effects that it felt completely visceral and real. You might look at it today from our point of view of a modern game player with 20 teraflops of computing power in your device and say, “Oh, that’s not very impressive.” But it was amazing at the time.
Lex Fridman (00:38:33) I mean for me, just sorry to pause on that, I think Wolfenstein was one of the most amazing moments of my own life. Just being able to, like you said, move about a three-dimensional world in real time. I just remember moving around and... what is that feeling like? I mean, you feel transported into another world.
Tim Sweeney (00:39:01) You feel that you’re there.
Lex Fridman (00:39:02) Yeah.
Tim Sweeney (00:39:03) Especially when you turn the lights down in your room and you turn the sound up on your speakers and it will scare you. And you’ll feel like that fireball that’s coming at you is going to kill you. That was an amazing time. Because we hadn’t experienced that before. There was nothing like that. You’d watch a movie, a scary movie or whatever. It was just this thing that was happening. This was you. This was you in a 3D world.
Lex Fridman (00:39:30) So how did that change Epic, this realization that the future of gaming is going to be 3D?
Tim Sweeney (00:39:35) Well, at first I was really depressed.
Lex Fridman (00:39:37) Yeah.
Tim Sweeney (00:39:37) Because the wizardry of Doom especially was so incredible that I gave up on programming for six months. I was like, “I’m never going to be able to compete with this. I have no idea what we’re going to do. We’ll just keep making 2D games and hope that the business goes on.” But that was the nature of Carmack’s wizardry. He had done things that were not just one innovation leap ahead, but a dozen simultaneously, interplaying in a way that you couldn’t pick them apart into their component pieces.
(00:40:05) But a funny thing happened. Michael Abrash, a longtimer in computer graphics who wrote a book on the techniques for 3D graphics and texture mapping, wrote some articles in one of the programming magazines of the day that explained it and showed assembly code to do texture mapping, drawing these 3D graphics on the screen, and it was actually really simple stuff. I was like, “Oh, I can do that.” And so a bunch of us at Epic independently went off and started writing our own 3D graphics code to figure it out. And we found at one point we had a number of people dabbling in this, doing different parts of it, and at that point we decided, “Okay, 3D graphics and 3D gaming is going to completely change the world. We need to go all in on this.” And so we took the best people from our best 2D game development teams and put them all together to make a 3D game. We didn’t really know what we were doing at the time. None of us had ever shipped a 3D game and most of us were still learning, but everybody was trying different disciplines to see what they were best at. And it was a combination of a bunch of people who came together to make Unreal.
(00:41:09) I’d initially volunteered to make the 3D editor for the thing, and James Schmalz had made Epic Pinball. Epic Pinball, now that wasn’t a crazy game. This was one of the 2D shareware games. He made it while he was in college and he was making like $30,000 a month from the royalties from this game.
Lex Fridman (00:41:25) Wow.
Tim Sweeney (00:41:26) Because everybody had wanted an awesome pinball game. It was massively successful. But he was a multi-disciplinary person. He wrote the code for the game, the art for the game, and did basically everything. And the code was 30,000 lines of assembly language. And so he was initially going to write the 3D engine and I was going to write the editor and he sent me his code so I could integrate it into the editor and it was like this giant pile of assembly code. I was like, “Hmm. Why don’t I just write this myself?” And so James instead started going off and building 3D models and 3D animations using the tools at the time.
(00:42:00) And so Cliff, who had done a lot of design work and built the levels on Jazz Jackrabbit, went off and started learning the basics of level design. And so I was writing this editor and Cliff Bleszinski was customer number one for it, starting to go off and build levels, and James Schmalz was drawing awesome creatures and sending them to me, and I’d get them implemented in the game. Then we brought in an animator to bring them to life, and we brought in more and more people until at the peak of Unreal One development we had about 20 people working on it, which was a huge team for the time, and it was really stretching Epic’s finances nearly to the breaking point. We barely survived and almost ran out of money a number of times, but somehow we always pulled through.
(00:42:38) And it was a crazy project because it was three and a half years of development on a game that we always thought was six months from shipping. And it was like three and a half years of 70 or 80 hour weeks for most everybody working on the project, not even knowing what problems we’d need to solve next because we were so immersed in the current ones.
Lex Fridman (00:43:00) Were there moments when you were losing hope that this might take too long and the company will run out of money?
Tim Sweeney (00:43:08) We were always very financially stressed, so I was continually worried about that. I had total confidence that we’d work out all the technical and artistic problems because we knew the pieces and it was largely a matter of typing code in and solving some problems. And we knew we could ship a version of it. And the thing that was continually really interesting was the ongoing discovery of new techniques as we went. Because at the time, Quake had shipped with a little bit of dynamic lighting, and Unreal really pushed dynamic lighting much further than anybody else had done before, using colored dynamic lights with some shadow-casting capabilities for static lights, or moving lights without shadows, and we figured out how to do volumetric fog, so you could have foggy areas that were full of lights and you’d get the kind of glow of the lights standing out in the fog and affecting the appearance of the level.
(00:44:04) A whole lot of amazing techniques came together to build a game that made a number of leaps ahead of the state of the art at the time. Yeah, it was really crazy. I think most companies wouldn’t have survived that, but the sheer talent of the people involved made it possible. And Epic has often done things that most companies would have failed at, and we succeeded not because of awesome management or awesome planning or awesome financing, but because of the sheer talent and willpower of the people involved to make it happen.
Lex Fridman (00:44:38) What about the interdisciplinary aspect of it? Like you said, sort of artists, engineers or programmers, designers, all of them working together. What was that, the 20 people, what was the dynamic there like working insane hours? What was it like to make a team like that work together well as an orchestra to actually deliver the game?
Tim Sweeney (00:45:04) Yeah, that’s one of the really unique things that exist in gaming. Not in normal big tech companies, which are just engineering and business driven, but gaming really does require all of the best people across all the creative disciplines working together. And Epic had grown organically by recruiting people with awesome talent. We always had a limited budget. We could never pay to bid up people with salaries and hire them away by paying them more. We just had to find awesome people who were at the beginning of their career and put them together.
(00:45:37) And so everybody was very new to this and didn’t have any assumptions about how companies worked. And so you put all of these people together and it was really a constant interplay of talent as people were learning how to work together as a team. Nobody had management experience. Most people hadn’t shipped a game before they worked with Epic. And we were figuring it out as we went.
(00:46:02) But it was a constant iterative cycle. We’d make several new versions of the game every day. There’d be a new compile introducing a new feature or fixing some bugs; we’d get it to the artists, the artists would improve their levels and continue building stuff, and then we’d see what they were doing in their levels and go, “Oh, I see what you need now.” We’d constantly be improving the tools, and just the iterative process and the speed at which that improves products is the critical element to success in games. The slower the iteration cycle, the worse off you are: if you make a build every week and you go through one iteration every week, you’re going to be way, way, way worse by the end of your project than a game company that makes new stuff every day. And that was the magic that happened together, and there was really nothing but passion and everybody’s individual dedication to it that made it work.
Lex Fridman (00:46:49) I heard you still program, but how much programming were you doing back then? You mentioned the hours, probably insane hours, so it’d be almost fun to talk about your setup, what a day in the life of Tim Sweeney in the ’90s when you were building Unreal looked like.
Tim Sweeney (00:47:07) Well, we’d all gravitated towards a work schedule that maximized productivity. And that usually meant waking up late. Usually we’d get to work around noon, usually work till like 2:00 A.M. or so, 3:00 A.M. sometimes.
Lex Fridman (00:47:24) Nice.
Tim Sweeney (00:47:24) And I didn’t have anything else going on in my life so it was really just work and sleep and occasional eating. I found I always needed eight or nine hours of sleep a night. Without good sleep, I would just become a zombie and wouldn’t be nearly at my best. So I always needed to get sleep. But I didn’t need anything else going on. The programming itself was so energizing and enthralling. So it was three and a half years of that during the project. Mostly spent programming. I would say probably 60 hours a week of programming, five hours a week of coordinating with other people and iterating and sitting down with them and looking at what’s going on on screen and figuring out what they needed. Maybe five hours of business stuff. And there was a good division of labor then. We didn’t have a big executive team, but it was basically myself running the technology and development part of the company and Mark Rein running the business part of it, doing deals and maxing out his credit card and going around the world bringing in sources of revenue to keep the company funded.
Lex Fridman (00:48:27) What programming language are we talking about? C? You mentioned there’s this pile of assembly. What was your decision in choosing the programming language that Unreal Engine would be written in?
Tim Sweeney (00:48:39) I’d grown up learning with Pascal as my favorite language.
Lex Fridman (00:48:42) Nice.
Tim Sweeney (00:48:43) In order to just get maximum performance and get the latest operating system features, I had to move to C for my second game, Jill of the Jungle, a little Nintendo-style platformer. And so when I started Unreal Engine, it was on 16-bit Windows using the C programming language. And over the course of the first year we moved to 32-bit, using these DOS extenders and then Windows NT, and I moved to the C++ language just because it simplified the code so much; it went from a really complicated pile of code to a much simpler one making that transition. And so almost the entirety of Unreal Engine development, about two and a half years of it, was all on C++, 32-bit, completely state-of-the-art then. 32-bit protected mode was kind of a magical thing, having come from the days when computers were much less reliable and crashed all the time.
Lex Fridman (00:49:38) Yeah, and it turned out to be a pretty good bet, because C++, out of all of those languages, ended up being the dominant performance-oriented language that survives to this day.
Tim Sweeney (00:49:50) Yeah, yeah. It’s because it solves all the problems at scale. Often through manual pain, but always solves them.
Lex Fridman (00:49:59) Yeah.
Tim Sweeney (00:49:59) And a lot of other languages do better in a lot of theoretical aspects and are better for some use cases, but you can’t do everything, and that’s very limiting.
Lex Fridman (00:50:10) All right, so ridiculous questions, but did you have one monitor, two monitors? Were you picky about the keyboard?
Tim Sweeney (00:50:21) Okay.
Lex Fridman (00:50:21) Were you picky about the chair? What are we talking about? Let’s paint a picture.
Tim Sweeney (00:50:26) Okay. I went through a big transition there.
Lex Fridman (00:50:27) Okay, great.
Tim Sweeney (00:50:27) So I started out being pretty lazy. I had a bunch of, I bought used computers because you would often get them at half the price of a new one. They’d be good enough. So I had this old 486 I was developing on, I guess it was a 15-inch monitor at the time. It was a poor workstation setup but it was very economical. So as we started on Unreal, I realized that I had to write a ton of code. I had to write at absolute maximum productivity, so I had to rearrange my entire life around delivering maximum output. And so at that point I realized actually spending money on getting good equipment was a good investment. And we’re not talking about millions of dollars here or billions if you’re building a GPU farm, we’re just talking about buying some basic hardware. And so I bought the biggest CRT you could buy at the time, because this was a CRT. It was 24 inches, it weighed like a 100 pounds. I had back pain for a week after I installed it. But it got me 1920 by 1200.
Lex Fridman (00:51:24) Wow. Nice.
Tim Sweeney (00:51:24) View in 1996.
Lex Fridman (00:51:24) State of the art.
Tim Sweeney (00:51:28) In 1996 that was pretty cool. So I’d upgraded to a 90 megahertz Pentium and did a lot of programming on that. These were the main consumer computers at the time, and I’d optimized the Unreal Engine software renderer on it. The Pentium was the first superscalar architecture in consumer computing. It could run up to two instructions at a time, and if you wrote your assembly code very carefully, you could get absolute maximum throughput. So I’d gotten my texture mapping code down to six CPU cycles, comprising 11 instructions, and that was required for every pixel on the screen, and that was just enough performance to deliver that. But Dell came out with these new workstations, and Intel had just launched the Pentium Pro, the first out-of-order processor. And so I basically bought the absolute maximum configuration that money could buy. It cost $7,000. I had a gigabyte of memory in 1996.
Lex Fridman (00:52:25) Wow.
Tim Sweeney (00:52:25) And a 200 megahertz CPU. So it tripled the speed of compiles and just made me massively more productive. So that’s what I was using throughout Unreal Engine development, and we shipped with that.
Lex Fridman (00:52:37) By the way, people in the ’90s would’ve been blown away by this workstation. I love it. Yeah, yeah. In writing it, were you considering the hardware much? Was there a sense, so for people who don’t know, the original Unreal Engine rendering, I guess, was all software; it didn’t use dedicated graphics hardware. But were you trying to optimize, as I understand, maybe you can correct me, but were you trying to optimize to the hardware at all?
Tim Sweeney (00:53:02) Well, at the time, we did most Unreal Engine development before the first real GPUs came out. The 3dfx Voodoo 1, the first GPU that actually delivered serious performance compared to software rendering, came in at the end of development, and we supported it really quickly, but it was not the target all along. And so development was focused on just building. There are two parts of the engine. There’s all of the gameplay systems that manage the simulation and physics and so on. That’s all written in very high level C++ code. And maintainability is as much of a goal as performance because we had to build massive amounts of systems over time.
(00:53:46) But one thing that was a real bottleneck was graphics. The cost of rendering a single pixel was really high, and so you had to do everything you possibly could to optimize the rendering of pixels on screen. And so we were talking about how many CPU cycles. When you say your CPU runs at a gigahertz or whatever, that’s a billion cycles per second. How many instructions do you need to run to get a pixel on screen? And so there was a constant challenge to optimize that down. And there was also a competition among all of the graphics programmers, who’d often send emails bragging to each other about what new technique they’d discovered to try to get the cost down. Abrash’s original articles took 12 CPU cycles to render a pixel, and everybody else had figured out how to get it down to six or sometimes even four cycles. That involved lots of different trade-offs of caching and memory hierarchy and so on.
(00:54:39) It was just like a magical time where a human could actually understand exactly what the CPU was doing under the hood and could write code that exactly targeted that. And that’s largely lost now. When we talk about optimization in software now, it’s largely about heuristics and statistically this memory access is likely to hit the cache and this algorithm is faster than that algorithm because CPUs now have such advanced out-of-order execution that you really can’t micromanage what’s happening on an instruction-by-instruction basis. You can only manage the aggregate performance of code. And so there’s kind of this lost art. Some people miss it, some people don’t, in which the programmer had absolute control over the machine and could work miracles in special cases if you tried.
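The per-pixel inner loop Sweeney is counting cycles for can be loosely sketched in Python. This is a hypothetical illustration, not Epic’s actual code: the real version was roughly 11 hand-scheduled Pentium assembly instructions, but the underlying idea, affine texture mapping with fixed-point coordinate stepping, looks like this:

```python
# Hypothetical sketch of a 1990s-style texture-mapping inner loop:
# step (u, v) texture coordinates across a horizontal span in 16.16
# fixed point, so the per-pixel work is only adds, shifts, and masks.

FP = 16  # 16.16 fixed point: integer part lives in the high bits


def map_span(texture, tex_w, tex_h, u0, v0, du, dv, length):
    """Sample `length` pixels along a span, stepping (u, v) in fixed point.

    `texture` is a flat row-major list; tex_w and tex_h must be powers
    of two so masking makes the texture wrap, as in the old engines.
    """
    u = int(u0 * (1 << FP))
    v = int(v0 * (1 << FP))
    step_u = int(du * (1 << FP))
    step_v = int(dv * (1 << FP))
    out = []
    for _ in range(length):
        # Integer texel coordinates come from the high bits.
        tx = (u >> FP) & (tex_w - 1)
        ty = (v >> FP) & (tex_h - 1)
        out.append(texture[ty * tex_w + tx])
        u += step_u
        v += step_v
    return out
```

For a 4x4 texture stored as a flat list, stepping one texel per pixel horizontally just walks and wraps across the first row, which is the kind of tight, predictable work that could be scheduled into a handful of cycles.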
Lex Fridman (00:55:27) It seems like there’s still value to that art when it comes to GPUs and ASICs. So basically trying to understand the nuances of the hardware and how to truly, truly optimize it, whether it’s for machine learning applications or for ultra-realistic real-time graphics applications. Is that true?
Tim Sweeney (00:55:48) Yeah, that’s absolutely so. The optimization problems have just moved around.
Lex Fridman (00:55:55) Yeah.
Tim Sweeney (00:55:55) A system like Nanite, the virtualized micropolygon geometry system that Brian Karis, a brilliant engineer at Epic, built, was just one of those multi-year optimization efforts. It required him to understand everything from the highest levels to the lowest levels of the hardware to figure out how to make this breakthrough technique work in a way that was actually maximally performant on GPUs.
Lex Fridman (00:56:23) And so Nanite is the system, we’ll jump around in time, that takes us to today with Unreal Engine 5. That’s the system that does the geometry.
Tim Sweeney (00:56:32) Yeah.
Lex Fridman (00:56:32) So rendering the world sort of geometrically. There’s many layers to this. We’ll probably talk, sneak up to each of those, but one, you have to actually create the geometry of the world around you and do that in real time and really efficiently and there’s a bunch of different ways to optimize that. Can you just speak to it?
Tim Sweeney (00:56:49) Yeah. With the advanced art tools we have today, it’s really easy to create a scene with billions of polygons. The hard part is how to render it efficiently, because you can’t render billions of polygons in a frame. Basically, you want to render an image that’s indistinguishable from the full detailed geometry, if you rendered it at ridiculous cost. And so the challenge is how to simplify every component of the rendering, the geometry, the lighting, and so on, down to real-time techniques that are efficient and capture a realistic view of what’s around you. And so when an object is up close to you, you want to render it with a lot more polygons than when it’s far away. But one of the cool principles of mathematics is the Nyquist sampling theorem, which says that if you’re trying to reconstruct a signal, there’s a limit to the amount of data you need to bother capturing. If you want to render a texture at a certain resolution, then you never need more than about twice as many texels as the pixels you have on the screen. And that’s called the Nyquist limit.
(00:57:49) And so one of the challenges of computer graphics is, given the need to render objects at extreme close-up distances and extreme far away distances, you always want to be able to generate the right amount of geometry so that you have enough to be indistinguishable from reality, but not any more than necessary. And with geometry, the idea is that if you render two triangles per pixel, you should get an image that is indistinguishable from thousands of triangles per pixel. If you render fewer than two triangles per pixel, you’re going to start to see visible artifacts of the loss.
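The two-triangles-per-pixel budget described above can be sketched as a toy level-of-detail picker. This is a hypothetical illustration, not Nanite’s actual selection logic; `pick_lod` and its parameters are invented for the example, and it assumes each LOD level halves the triangle count:

```python
# Toy LOD picker illustrating the Nyquist-style budget described above:
# keep dropping to coarser levels (each assumed to halve the triangle
# count) as long as the result still meets a target triangle density.


def pick_lod(full_triangle_count, screen_pixels, target_tris_per_pixel=2.0):
    """Return (lod_level, triangle_count); level 0 is full detail."""
    budget = screen_pixels * target_tris_per_pixel
    level = 0
    tris = float(full_triangle_count)
    while tris / 2 >= budget:  # the next-coarser level still meets the budget
        tris /= 2
        level += 1
    return level, int(tris)
```

For a million-triangle mesh covering 10,000 pixels of screen, the sketch drops five levels and keeps roughly 31,000 triangles, still above the two-per-pixel target, while rendering the full mesh would have been thirty times more work for no visible gain.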
(00:58:22) And GPUs have this amazing hardware in a lot of different pipelines, but it’s all very fixed function. There’s pixel shader hardware, there’s geometry processing hardware, and then there’s triangle rasterization hardware. And one of the limits of GPUs is that the triangle rasterizers are built for pretty large triangles. If you’re rendering a triangle that covers 10 pixels, that’s pretty efficient. But if you’re rendering a triangle that covers one pixel, it’s very inefficient. So one of the breakthroughs Brian made was to design an entire pipeline for avoiding the rasterization hardware in the GPU and just going straight to pixels and calculating what should be done with each pixel as a result of some ray tracing and geometry intersection calculations done in a pixel shader. So instead of using the triangle pipeline, we’re just using the pixel pipeline.
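The small-triangle problem described above is why a software rasterizer computes pixel coverage directly instead of going through the fixed-function triangle hardware. Below is a minimal sketch of the classic edge-function coverage test, a hypothetical Python illustration of the general technique rather than Nanite’s actual compute-shader implementation:

```python
# Hypothetical sketch of software rasterization: test each pixel center
# in a triangle's bounding box against the three edge functions. For a
# triangle covering only a pixel or two this is just a few cheap tests,
# with none of the fixed-function rasterizer's per-triangle overhead.


def edge(ax, ay, bx, by, px, py):
    # Signed area: >= 0 when point p is on the left of edge a->b
    # (counter-clockwise winding).
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)


def raster_triangle(v0, v1, v2):
    """Return covered pixel coordinates for a CCW triangle."""
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    covered = []
    for y in range(int(min(ys)), int(max(ys)) + 1):
        for x in range(int(min(xs)), int(max(xs)) + 1):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            if (edge(*v0, *v1, px, py) >= 0 and
                    edge(*v1, *v2, px, py) >= 0 and
                    edge(*v2, *v0, px, py) >= 0):
                covered.append((x, y))
    return covered
```

A pixel is inside the triangle exactly when it is on the inner side of all three edges, which is the same test GPU rasterizers use internally; doing it in a shader just skips their large-triangle setup machinery.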
Lex Fridman (00:59:09) Wow.
Tim Sweeney (00:59:09) And getting a better result.
Lex Fridman (00:59:12) Because of the limitations of the triangle rasterizer in the GPUs. That’s fascinating. Because as you described, you need the tiny triangles for the detail for the stuff that’s up close. I mean, this might seem obvious to people, but it’s not just stuff up close. It’s like it depends where you’re looking. The human eye and the human focus and the human attention mechanism defines how much detail you want to show because the thing that the human is likely to be giving attention to, you want that to be super high resolution and everything else, including due to distance, can have less geometry and less texture, less information in it.
Tim Sweeney (00:59:56) Yeah. Yeah, that’s right. But there’s a lot of challenges like that. It turns out it’s a lot easier to render one frame that looks perfect than it is to render a series of frames in motion that look perfect. A lot of the problems with the earlier algorithms that aspired to do these sorts of things was popping. You’d be rendering some number of triangles for a while, and then you’d switch to a different number of triangles and you’d see a visible transition, and the screen would look like it got shaken up. It’s a disturbing artifact that distracts you from the game. So one of the magical trade-offs of Nanite was how to avoid all of the visible transitions and get them down to a point where, though they exist statistically, they’re not really perceptible to a person looking at it.
Lex Fridman (01:00:38) You look at something like Nanite, I mean, there’s a nice blog post, there’s nice descriptions about the details, but you can tell even under the details, there’s just incredible engineering that goes on. It’s so cool. It’s so cool how underneath this, the actual experience of beautiful detailed scenery, there’s just incredible engineering to bring to you simulation, ultra realistic simulation, of reality in real time, like lights changing everything. And then it just takes you back to that feeling I had with Wolfenstein, but more. And you can completely lose yourself in that world, and you would forget that this real world exists. What is the real world anyway? So that coupling of great engineering and great storytelling in terms of just feeling is super cool. It’s great to know. It’s great to know that there’s these teams behind it. And it’s cool that you’re also releasing a bunch of details around it, at least for folks like me. It’s inspiring to see.

Technical details of Unreal Engine

(01:01:45) Unreal Engine is this fascinating creation. It’s a big, bold, crazy bet that you’ve made. Maybe it’s good to actually explain what Unreal Engine is for people sort of outside this world. I would say it transformed the gaming industry. But that was a big bet in 1995, that most of the effort would be on creating the gaming engine, not the game.
Tim Sweeney (01:02:12) Yeah. Unreal Engine is a big bundle of code and tools, a huge software package that provides all the functions you need to build any sort of a 3D graphics application. Game developers use it to make games and that’s the predominant use. But it’s also used in Hollywood film and television production to create 3D scenery in real time for production sets and to do pre-visualization. It’s used by car makers to visualize their cars before they’re constructed or manufactured. It’s used by architects to preview buildings before they’re made, and industrial designers of all sorts. And it provides all of the 3D simulation features you need, both for creating highly realistic 3D graphics, but also physics and interactions between objects and making things happen like you might see in the real world. And it supports a huge variety of styles, from Pixar-style stylized movies to cel shading to photorealism. And it can be used for anything that needs real-time 3D graphics.
Lex Fridman (01:03:17) Including humans that populate those three-dimensional worlds. And we’ll probably talk about a bunch of the details involved in the process of creating ultra realistic humans, because we humans care about how other humans look and how they convey emotion and express, how they speak, all that kind of stuff. But so yes, it’s the 3D objects that are static, the 3D objects that are dynamic, and on the dynamic front, including humans that are ultra dynamic.
(01:03:53) So all of that. You have to create this engine that simulates that world, this beautiful world that we know and love. But you’re early, so here you see Doom and you’re trying to create this world and trying to create an engine that would not just power Unreal the video game, but future video games. So how do you go about it? What are you thinking? And I should sort of linger on that. That is a crazy bet, that we’re going to build an engine as a company.
Tim Sweeney (01:04:27) Yeah. Well, the philosophy began with ZZT and continued onward. We’re not just building a game for players to play. We’re also building tools that could be used for building that game or any other game, and catering to all of the artists and designers who would use the tool. And so that philosophy started at the very early parts of Unreal development. I was building the tools for level designers like Cliff Bleszinski and artists like James Schmalz. And as we began marketing the game, thinking it was six months away, we were constantly releasing screenshots and things like that. Other companies started calling us and saying they wanted to build 3D games too, but they didn’t have the expertise for that and they wanted to license our 3D engine.
(01:05:16) And this was one of the coolest pivots in Epic’s history. MicroProse called up Mark Rein, our Vice President and longtime business guy, and said they wanted to license our engine. And Mark Rein was like, “What? You want to license what? An engine? What engine?” And they explained to him what they wanted to license. He said, “Oh, that engine. Yeah, yeah, that’s very expensive.”
(01:05:38) But this was one of the critical things that kept Epic going through that three and a half years. We were starting to license our engine out to other developers. MicroProse took two licenses and we got in half a million dollars from that. And a company, GT Interactive, licensed our engine to build another game and we got paid for that. And so we had this revenue stream funding the development of Unreal Engine from other games that were being built by other developers. And because they were the lifeline for the company, we took the engine business very seriously from the start. We set up mailing lists so that our partners could ask us questions. And all the developers and artists working on our games were participating in helping customers. Everybody took that very seriously because it was our funding source. And that kind of set this dual spirit of Epic of building technology and supporting game developers simultaneous with building games and supporting gamers. It’s continued onward and just grown over time.

Constructive solid geometry

Lex Fridman (01:06:36) Can we just go back to that, you programming. What are some interesting technical challenges you had to overcome? You mentioned dynamic lighting, creating this three-dimensional world and trying to figure out the puzzle of how you actually do that at a time when almost nobody but Carmack and you were doing this kind of thing. It’s a totally open Wild West. So what are some interesting technical challenges you had to try to solve?
Tim Sweeney (01:07:06) There’s a lot. Some of them are visible on screen and some are behind the scenes and still require a lot of innovation. All of the graphical techniques were really interesting challenges. And Unreal Engine in those early days went a lot further than the Quake engine in building environments using constructive solid geometry with a real-time editor. And that was a really interesting technical challenge. The idea is that building is extremely tedious if you are only adding objects to the world. If you want to build a door, then you need to add like a dozen different pieces of door frames and add a bunch of different walls together to fit together in the right shape. It sure would be easier if you could just start with a wall and subtract the door out. And so we had this way of adding geometry to the world and subtracting geometry, and the engine would perform all of the calculations on that. And this is something that I’d been anticipating was possible for-
(01:08:03) This is something that I’d been anticipating was possible for a long time, but when I finally got around to it took this 30-hour coding session to figure out all of the special cases of the code that needed to be implemented to make that work. In the course of 30 hours, I got constructive solid geometry up and running.
(01:08:18) I started doing that, handed it to James Schmalz the next time we were together, and it’s like, “Okay, I think you’re cheating here.” You create a giant torus and then add another giant torus interlocked with it and then subtract a cylinder from it and create this really advanced composite object with just three operations. He was like, “Whoa, I can’t believe this.” It’s like, “Yeah, we figured it out.” That was cool to see it for the first time. It was probably the first time somebody had done constructive solid geometry in real time, but it was also a really useful artist tool that all the artists appreciated and immediately began making use of.
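The wall-minus-door workflow described above can be shown with a toy point-membership version of CSG: a solid is a predicate telling you whether a point is inside it, union is logical or, and subtraction is and-not. This is only the set-theoretic picture; the engine actually performs these operations on polygon geometry via BSP trees:

```python
def box(lo, hi):
    """Axis-aligned solid box as a point-membership predicate."""
    return lambda p: all(l <= c <= h for l, c, h in zip(lo, p, hi))

def add(a, b):       # CSG union: inside either solid
    return lambda p: a(p) or b(p)

def subtract(a, b):  # CSG difference: carve solid b out of solid a
    return lambda p: a(p) and not b(p)

# Start with a wall and subtract the doorway out, as described above.
wall = box((0, 0, 0), (10, 3, 0.2))   # 10m wide, 3m tall, 0.2m thick
door = box((4, 0, 0), (5, 2, 0.2))    # a 1m x 2m opening
wall_with_door = subtract(wall, door)
```

A point in the middle of the wall is solid, a point inside the doorway is empty, and a point above the door frame is solid again; the hard engineering is doing the equivalent on boundary polygons rather than on points.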
Lex Fridman (01:08:52) Can you actually speak to that, the 30-hour session? I mean, from everything I know about computational geometry, doing this kind of thing is not easy. What is the uncertainty, the open questions involved? I mean, even just on the algorithm front, how to do that efficiently, and then plus the usual programming thing of debugging, suffering through the trickiness of it. At that time, you don’t have the tooling to really visualize everything that’s going on really well. You’re probably using some crappy editor. I mean, there’s just a lot of friction here, so the 30-hour session is one that’s probably rough. It’s a rough one.
Tim Sweeney (01:09:44) Your brain works in different ways depending on your state, right? There are some things that require really working on a problem fresh, where you’ve put together a bunch of logical pieces and now you just need to write a whole lot of code to make it all work together and plumb a whole lot of data between a whole lot of different algorithms. I think our brains have vastly more horsepower than we’re able to directly access by thinking of what code to type next. After you’ve been working for a very long time, you can get into a sleep-deprived state where you have much more direct access to that low-level knowledge.
Lex Fridman (01:10:25) That’s great.
Tim Sweeney (01:10:26) Yeah, because there are symptoms that are well-studied of sleep deprivation. One of them is short-term memory loss. You’re working without the easy recall of the code you just typed, but your brain is then freed to think about other problems. I built up this intuition over a very long period of time. The foundation for the subject is the binary space partitioning tree, a data structure invented by a computer graphics researcher, Bruce Naylor. Carmack had picked up on that and had used the technique in Doom to really great effect. I’d picked up on that, and the engine was using this technique for all of its graphics and rendering, but it was just additive geometry everywhere and it had a lot of overlapping polygons and it was pretty inefficient.
(01:11:13) I had the idea that if we had a BSP tree, there was a really efficient way to do constructive solid geometry. To do that, you had to break down the ways that different pieces of geometry can fit together. I’d broken it down into 14 different cases, and most of them are pretty simple, so I cranked them out. Anyway, towards the end there were some pretty complicated things like, “Well, how do you deal with coplanar polygons? They’re in the same plane and pointing in the same direction versus the other direction. In what cases should you keep them? In what cases should you eliminate them?” and so on and so on, to create really efficient geometry output, and just plowing through it, mostly through deduction, but some trial and error too. Sometimes you just have to try the possibilities and see what works. Yeah, I cranked it out and it worked, and the next day I came in kind of weary and I was like, “Oh, wow, this actually did work. It wasn’t just a dream.”
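A stripped-down sketch of the BSP structure underneath all of this: each node splits space with a plane (a line, in this 2D toy), and classifying a point walks down to a solid or empty leaf. Real CSG splits polygons against these planes and must handle the coplanar cases Sweeney mentions; this sketch does point classification only, and all the names are illustrative:

```python
class Node:
    """One BSP split: points with normal·p - d >= 0 go to the front child."""
    def __init__(self, normal, d, front, back):
        self.normal, self.d = normal, d
        self.front, self.back = front, back  # child Node, or bool leaf (True = solid)

def classify(node, p):
    """Walk the tree down to a leaf; True means p is in solid space."""
    while isinstance(node, Node):
        side = node.normal[0] * p[0] + node.normal[1] * p[1] - node.d
        node = node.front if side >= 0 else node.back
    return node

# The unit square as a BSP: solid only where all four half-planes agree.
square = Node((1, 0), 0,
              Node((-1, 0), -1,
                   Node((0, 1), 0,
                        Node((0, -1), -1, True, False),
                        False),
                   False),
              False)
```

The payoff of the structure is that every region of space has a definite solid/empty answer, which is exactly what add/subtract CSG operations need to resolve against.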
Lex Fridman (01:12:04) You’re considering the edge cases also. I mean, that’s the problem with geometry: there’s probably just going to be all kinds of weird polygons that you have to … So you’re imagining the edge cases and trying to see, how do I not create inefficiencies in this algorithm while still considering the edge cases, allowing for the edge cases?
Tim Sweeney (01:12:24) Yeah, it’s pretty easy to write software that’s like 99% correct. It’s the 1% that’s the really hard part and where the devil lies in the details.

Dynamic lighting

Lex Fridman (01:12:35) What about lighting? Is there other interesting-
Tim Sweeney (01:12:37) Well, the funny answer is we know the laws of physics, so it’s actually really easy to do everything in computer graphics, but the direct solution of the laws of physics is immensely slow. What we’re finding are approximations rather than complete solutions because you need something that’s a million times faster than the brute force answer.
Lex Fridman (01:12:58) We should say that the physics of the scene is you just take a bunch of photons and bounce them around. That’s how light works. That’s going to be very inefficient because there’s a lot of bouncing and a lot of photons.
Tim Sweeney (01:13:11) Yeah, photon tracing is the subject matter that does brute force calculation of pixels on a screen from all of the light in the scene, and it works and it’s correct, and it just is an implementation of the laws of physics, and it’s millions or billions of times slower than what we do. Carmack had figured out how to do really cool lighting algorithms, including real-time lighting with objects moving around, and I hadn’t taken it very far. With Unreal Engine, I realized we don’t have nearly enough computing performance on our CPU to compute the light of every pixel on the screen from all of the light sources that affect it. We had a six-cycle texture mapper and we couldn’t afford 30 more cycles for lighting, and so the answer had to be some approximation. The one that Carmack had picked up on in the Quake engine was lightmapping. Instead of calculating all the lighting on every pixel, what if we made a big texture that we placed over all of the walls in the scene, like wallpaper? And what if we said, at every foot, we’re going to compute a lighting value for just that one-foot grid on the object rather than computing it everywhere? Then, if we just linearly interpolate that over the course of it, we get a lighting solution that actually works pretty well and is fast enough to work. A lot of Unreal Engine’s lighting techniques were based on lightmapping. We introduced colored lighting, so you could have colored light sources. Then we realized, “Oh, since we’re doing this and we’re doing it on light maps, we can actually do some pretty expensive calculations, hundreds of cycles, since we’re only calculating it for every one foot of world space rather than every pixel.”
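The lightmapping trick just described, compute expensive lighting only at coarse grid points and interpolate in between, is plain bilinear filtering over a small grid. A sketch (the one-foot texel size follows the description above; the names and values are illustrative):

```python
def bilerp(grid, u, v):
    """Sample a coarse lightmap by bilinear interpolation. grid[j][i]
    holds one precomputed lighting value per texel (one per 'foot' of
    wall); (u, v) are continuous surface coordinates in the same units."""
    i, j = int(u), int(v)
    fu, fv = u - i, v - j                    # fractional position inside the texel
    i2 = min(i + 1, len(grid[0]) - 1)        # clamp at the lightmap border
    j2 = min(j + 1, len(grid) - 1)
    top = grid[j][i] * (1 - fu) + grid[j][i2] * fu
    bot = grid[j2][i] * (1 - fu) + grid[j2][i2] * fu
    return top * (1 - fv) + bot * fv

# A 3x3-foot wall patch, brightly lit toward one corner.
lightmap = [[0.0, 0.0, 0.0],
            [0.0, 0.5, 0.0],
            [0.0, 0.0, 1.0]]
```

The expensive part (computing each grid value) happens once per texel; the per-pixel cost at render time is just this cheap interpolation, which is the whole economy of the technique.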
(01:14:49) We introduced a whole bunch of elaborate lighting effects like torch flickering and the caustic effects of water bouncing off of a surface and so on, pulsing lights and blinking lights and everything else, and created a system. I created a system for compositing them together, so if you had an arbitrary number of light sources, they could all do that. Then I implemented a shadowing algorithm: you cast a ray from a light to a point on a surface and see whether it intersects any other geometry. If it doesn’t intersect, then the light hits the object. If it does intersect, then the light hits something else first and that pixel on the object should be dark. I built a real-time version of this and it ran at about half a frame a second. I was running around at half a frame a second, shooting out light projectiles and looking at dynamic lighting, and it was like, someday computers will be fast enough for this, but not today. I made a non-real-time version that pre-calculates all the lighting and realized, “Oh wait, if you pre-calculated the shadowing on an object, you can still apply the lighting dynamically as long as the light’s not moving.” You could do torch flickering with shadows. And I figured out all the cases of dynamic and static lighting that were actually practical on a computer at the time and exposed them to artists. This was the wonderful thing. I was just typing in these new features, exposing them to artists, and every day they’d find a dropdown with some more lighting options available to them, and they’d start using them and they’d do things that I never thought possible.
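The shadow test described above is classic ray/sphere intersection: cast a ray from the surface point toward the light and see whether an occluder blocks it first. A single-sphere sketch, not engine code:

```python
import math

def in_shadow(point, light, center, radius):
    """Shadow ray test: trace from a surface point toward the light and
    return True if the sphere occluder (center, radius) blocks the ray
    before it reaches the light."""
    to_light = [l - p for p, l in zip(point, light)]
    dist = math.sqrt(sum(c * c for c in to_light))
    d = [c / dist for c in to_light]               # unit direction toward the light
    oc = [c - p for p, c in zip(point, center)]    # surface point -> sphere center
    tca = sum(a * b for a, b in zip(oc, d))        # closest approach along the ray
    d2 = sum(c * c for c in oc) - tca * tca        # squared miss distance
    if tca <= 0 or d2 >= radius * radius:
        return False                               # sphere behind us, or the ray misses
    t_hit = tca - math.sqrt(radius * radius - d2)  # first intersection distance
    return 0 < t_hit < dist                        # occluder sits between point and light
```

Running one of these per pixel per light is what made the real-time version crawl at half a frame a second; precomputing the results per surface point is the lightmap-era answer.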
(01:16:18) This was always the coolest thing as a programmer building an engine, you might think you know the implications of the feature you’re building, but artists are so clever that you’ll always find that you’ve built the capability of doing vastly more than you ever anticipated as they start to use combinations of features together in concert to do ever more amazing things.
Lex Fridman (01:16:36) That’s the genius of artists, is they’re given constraints and within those constraints they create something you could have never possibly imagined given the constraints. That’s such a beautiful coupling between engineering and artistry and art.
Tim Sweeney (01:16:51) That’s right, and it’s timeless. What would the Renaissance painters do with paints and what do the early game artists do with early engines, everybody’s figuring out the capabilities of their medium and you’re seeing a revolution.

Volumetric fog

Lex Fridman (01:17:05) This is blowing my mind. This is so fun. What about fog? You mentioned fog. How do you even do fog? You mentioned Unreal, so the first version had fog.
Tim Sweeney (01:17:16) Yeah, it was a funny thing. This graphics hardware company had just started up in Finland and they released a screenshot of what their GPU was doing, and they showed a scene filled with volumetric fog. They had a foggy room with some light sources in it. When that happens in the real world, what you see are glows around the lights as the light brightens the fog around it, but the brightening of the fog diminishes with distance because the fog absorbs some lighting. The further you get away from the light, the more falloff there is. When you have a bunch of colored lights overlapping together in a space like that, the effect is just absolutely magical, like being out on a foggy night with street lamps above. It’s something that’s surreal and looked just beautiful. I was like, “Oh my God, they figured out how to do real-time volumetric fog. I have to figure it out myself.” That was another 30-hour coding session.
Lex Fridman (01:18:07) Nice.
Tim Sweeney (01:18:08) At the core I realized, okay, what’s happening here is we have this lighting function saying that light at a particular point in space is falling off with the inverse square of the distance from the light source, right? The inverse square law is from Isaac Newton, and it applies to lighting. I came to realize that the way the fog interacted with the light was that you calculate the view from your eye’s position to a point on a surface in the world. It’s going through fog, and you’re accumulating more and more light as a function of the amount of light illuminating the fog at that point in space.
(01:18:41) Well, I’d studied that in mechanical engineering without even knowing it. That’s the line integral. You have an integral over a line of some function. Well, this is exactly what it’s for. It’s for accumulating the values of a function over a continuous space. I did a bunch of math and realized, oh wow, it’s an integral. Then I looked in a reference book of all the integrals, and thankfully people have solved them all. I realized the integral of this transformed one-over-R-squared turns out to be solved by the arctangent of R. If you calculate some parameters based on the position of the eye and the position of the surface point you’re ultimately seeing, then you calculate exactly how much fog you accumulate from that. Of course, you can’t do that per pixel because that’s hundreds of cycles of CPU time. What we had to do is calculate volumetric fog on something equivalent to a light map, calculating fog every square meter in the world. We had enough performance for that, built volumetric lighting, and gave it to the artists, and they started building magically detailed levels with volumetric fog in real time. Then decades later, I was talking to one of the engineers who’d worked on that hardware and asked about their volumetric fog and told them how it inspired me to figure out how to do it in real time myself. He was like, “Oh no, we cheated. We just rendered it out of 3D Studio Max.”
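The arctangent result can be checked directly. Parametrize the view ray by t, and (my assumption about the exact setup) let a be the light’s closest-approach parameter along the ray and h its perpendicular distance from the ray; the in-scattered light is then the line integral of 1/((t-a)^2 + h^2) from 0 to the ray length T, which has the closed form (1/h)·(atan((T-a)/h) - atan(-a/h)):

```python
import math

def fog_numeric(a, h, T, n=100000):
    """Brute-force midpoint-rule integral of the inverse-square light
    contribution 1/((t - a)^2 + h^2) along a view ray of length T."""
    dt = T / n
    return sum(dt / ((i * dt + dt / 2 - a) ** 2 + h * h) for i in range(n))

def fog_closed_form(a, h, T):
    """The arctangent solution: two atan evaluations per ray instead of
    a per-sample march through the fog."""
    return (math.atan((T - a) / h) - math.atan(-a / h)) / h
```

The brute-force sum and the closed form agree to many decimal places, which is the whole point: the per-ray fog integral collapses to two arctangent evaluations, cheap enough to tabulate on a fog map.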
Lex Fridman (01:20:06) That’s awesome. That is so awesome. That is so inspiring on so many levels that you saw that maybe it’s possible even if it was kind of smoke and mirrors, and then you actually made it happen. It’s so inspiring to hear these kinds of stories when there’s so much uncertainty and you figure out and so many constraints and you figure out how to bring it to life in real time and create this world that Unreal did. Maybe if we could just pause, since you mentioned John Carmack a few times, as a fellow pioneer in the game industry at that time, what do you admire about John?

John Carmack

Tim Sweeney (01:20:43) John singularly has this intense dedication to getting the best result from his code and having absolutely no attachment to past code. Some of the legendary things he did, the end results that were absolute breakthroughs in real-time computer graphics, weren’t his first try. They were like his seventh or eighth try, after he’d done something time and time again, tried it, found a better approach, thrown out the old one, built it again, and continually rewritten his code until he found the absolute best solution to a problem. I think that stands as a lesson for every programmer to pick up on. When something is really, really important, when its performance is absolutely critical to the product or its quality or its capabilities, just iterate on it until you’ve achieved perfection and don’t settle for the first or second solution as good enough.
Lex Fridman (01:21:40) The result of that is both you and he sort of defined the future of gaming, of gaming worlds. It’s so beautiful to see. It’s just fascinating. It’s inspiring, because under so much uncertainty, under so many constraints, you figure out a way. That actually continues to this day, because yes, the hardware has improved incredibly, but in order to create an ultra realistic, highly dynamic, real-time rendering of the world around us, it’s still really, really difficult. There’s all these kinds of optimization, like you mentioned. Maybe you can speak to that Unreal Engine journey from one to 5.5 or .6 now. For 30 years, you’ve been creating virtual worlds. What’s it like evolving a game engine for those 30 years when the hardware under you is improving exponentially? What are some things that changed and what are some universal truths that have not changed?

Evolution of Unreal Engine

Tim Sweeney (01:22:50) It’s been an astonishing experience. Nobody 30 years ago had anticipated that we’d see the performance gains in hardware that we’ve actually seen in that timeframe. It’s something like 100,000 times higher CPU performance between multiple cores and higher clock rates and more parallelism. If we had that in aviation, then we’d be taking a trip to neighboring stars.
Lex Fridman (01:23:12) Alpha Centauri, yeah.
Tim Sweeney (01:23:13) Exactly, and in graphics, it’s been even more so. It’s something like literally 10 million times more net usable GPU performance than we had back running on a Pentium 90 CPU, all in 30 years. It’s really made me appreciate that over the generations, some areas of our engine development have absolutely kept up with that technology, and the rendering team that works on Unreal Engine are the real miracle workers there. Just about every generation of Unreal, we’ve replaced most of the rendering code, and the different leaders at different points in time, the different luminaries, have built systems that were absolutely rethought and optimized for the latest generation of hardware.
(01:24:03) Unreal Engine One was built for software rendering, and then the Voodoo One came along late in the cycle and we had support for it, but it wasn’t fully capable and utilized. Unreal Engine Two was about bringing all of the latest GPU hardware acceleration features to the engine, carrying forward, and building some new features like vehicles and a few other capabilities. All this was in the early GPU era, before GPUs had really broken out beyond everybody’s expectations. That breakout occurred with DirectX Nine and the capabilities of programmable shaders. Once you had control of writing code running on the GPU that could color every pixel on the screen, that GPU code was literally a factor of 100 times faster than the equivalent code I wrote a few years earlier on the Pentium 90.
(01:24:55) The DirectX Nine generation was a godsend, and Andrew Scheidecker, a longtime Epic luminary, wrote the core of the Unreal Engine Three renderer around real-time pixel shading, real-time lighting, being able to do dynamic shadows using several different techniques, and multi-threading the renderer to support the early dual-core CPUs that were starting to show up at the time. It was a massive, massive graphical upgrade. Unreal Engine Four made a number of improvements and just continued to add features, giving artists more and more options for lighting and for geometry that created realism.
(01:25:40) Then I think probably our biggest single leap came with Unreal Engine Five, with the Nanite micropolygon geometry solution and the Lumen global illumination lighting solution, which I think really bridged the gap from game-ish computer graphics to total observable photorealism for artists who wanted to create that. That’s been the evolution, and the progress on the graphics side is absolutely astonishing, as it is on the audio side and in a number of other areas. Parts of the engine also haven’t changed all that much since the version I wrote and shipped in 1998. The file management system has been optimized a number of times, but it hasn’t been completely rethought. The networking system, the ways that clients and servers talk together and negotiate game state, is still an evolution of the thing I wrote, and it’s feeling kind of dated now. You still see networking bugs in Fortnite where, for some reason, when you’re spectating, you’re not seeing some parameters update. Well, that’s because of the lossy nature of that networking model.
(01:26:51) The biggest limitation that’s built up over time is the single-threaded nature of game simulation in Unreal Engine. We run a single-threaded simulation. If you have a 16-core CPU, we’re using one core for game simulation and running all the complicated game logic, because single-threaded programming is orders of magnitude easier than multi-threaded programming. We didn’t want to burden either ourselves or our partners or the community with the complications of multi-threading. Over time that becomes an increasing limitation. We’re really thinking about and working on the next generation of technology in Unreal Engine Six. That’s the generation where we’re actually going to address a number of the really core limitations that have been with us over the history of Unreal Engine and get those on a better foundation that the modern world deserves, given everything that’s been learned in the field of computing in that timeframe.
Lex Fridman (01:27:46) That’s a terrifyingly challenging engineering problem. It seems like every version of Unreal Engine, the amazing teams behind it are willing to just throw away most of the code, or maybe I’m being a little bit too dramatic, but basically throw away the old approaches, like you mentioned with Carmack and start again, like with Nanite and Lumen, just keep optimizing to the current hardware, but even rethinking how it’s all done going from single-threaded to multi-threaded. Oh boy, that’s terrifying.

Unreal Engine 5

(01:28:25) That’s in part, we’ll talk about it, why maybe you have to rethink even the programming language that’s being used to rethink a lot of things. That’s fascinating. Can we just stick on Unreal Engine Five? I watched a bunch of stuff, but the state of Unreal in GDC 2024. I was just giggling with excitement watching some of this stuff. If we can talk about different things here just to nerd out a little bit. People should go watch this video. They talked about the dirt. The ultra-realistic, and this is for Marvel 1943, which is kind of putting the Marvel universe into Nazi-occupied France in the winter. There’s snow, and that’s a moment in history. That’s a very intense moment in history, and it really creates a feeling and puts you there. There’s so much to that, including the snow.
(01:29:31) Just looking at the dirt is a really nice way to show how do you add a lot of details to the scene in real time that gives this experience infinite detail? This is real, this is super real. Then I think in the talk they describe what’s entailed in the generation of the geometry, what’s entailed in the lighting, all that kind of stuff. Maybe can you speak about dirt? What are the components for people who might not know in creating this ultra-realistic, the texture, the lighting, the geometry, all of that, how Nanite, how Lumen all come together in this beautiful orchestra to paint in real time, the dirt in Nazi-occupied France in 1943?
Tim Sweeney (01:30:30) Yeah, there’s a lot happening here on screen. The real hero of this image isn’t Epic. It’s the artists and technical artists who work together to build this environment. Because the reason we showed it at GDC was it went way, way beyond what we realized the system was capable of doing, largely because of their brilliance. This is the magic of computer graphics. There’s not one feature that makes this cool. There’s a dozen technical features that each interplay, and because of the ways that they interplay with each other, you really don’t … It’s hard to actually identify the individual components of it.
(01:31:05) One thing that’s happening here that’s really critical, oh yeah, now we’re seeing it being turned off, is the lighting happening. The Lumen lighting system that’s powering the scene is doing different kinds of lighting calculations at different scales. This was the work of Daniel Wright, following a decade of moving the state-of-the-art of lighting forward. His theory, which was rather controversial at the time, was that if you have enough levels of lighting calculation, then you can get global illumination working everywhere, from the absolute highest levels of a scene, where buildings are casting correct shadows, all the way down to details like you see on the dirt here, all working in concert and without distinguishable boundaries. There’s a good decade of foundational work there to make the lighting work. In particular, when you see the very detailed shadows interplaying between the ice and the dirt there, there’s screen-space shadowing. There’s actually shadow calculation going on, not based on the world, but on the pixels on the screen, because that is the only way that we could possibly do these calculations fast enough, running them on a pixel shader.
Lex Fridman (01:32:21) Yeah, watch this. Watch, when you add the objects, when you add the textures, the different layering, all the shadows that have to be computed. Boy.
Tim Sweeney (01:32:32) That shadowing is an amazing thing. The reason that works is counterintuitive. When somebody first explained it to me, I was like, “That’s really clever, but I don’t think that will work,” but it does work because if you observe the positions of incoming lights and the Z-coordinates of the different pixels on the screen, you can figure out how your geometry there is likely to occlude other geometry. Even though it’s only an approximation and isn’t perfect, it looks perfectly good to the human eye and gives you the subtle shadowing that you see in a scene like this that makes it look highly realistic. The shadowing influences other things.
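The screen-space approach Sweeney describes, using only on-screen positions and depths, can be sketched as a tiny 2D depth-buffer march: step from a pixel toward the light, and call the pixel shadowed if a stored depth lies in front of the marched ray. The names, step count, and bias below are all illustrative assumptions:

```python
def screen_space_shadow(depth, x, y, light_step, steps=16, bias=1e-3):
    """2D analogue of screen-space (contact) shadows. depth[y][x] is
    distance from the camera; light_step is the per-step screen-space
    motion (dx, dy, dz) toward the light. If the depth buffer at any
    sample is closer to the camera than the marched ray, some on-screen
    geometry occludes the path to the light."""
    z = depth[y][x]
    fx, fy = float(x), float(y)
    for _ in range(steps):
        fx += light_step[0]; fy += light_step[1]; z += light_step[2]
        ix, iy = int(round(fx)), int(round(fy))
        if not (0 <= iy < len(depth) and 0 <= ix < len(depth[0])):
            return False              # marched off screen: assume lit
        if depth[iy][ix] < z - bias:  # stored surface is in front of the ray
            return True
    return False

# Open space at depth 10 with a near wall filling screen column 5.
depth = [[10.0] * 8 for _ in range(8)]
for row in depth:
    row[5] = 1.0
```

As Sweeney says, this is only an approximation: it can only find occluders that are visible on screen, but that limitation is exactly what makes it cheap enough to run per pixel.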
(01:33:11) There’s also some really interesting things happening with the color here, and I’m not even sure what’s causing it. It looks like color is bleeding from some parts of the snow onto other parts of the snow. It looks like there’s some subsurface scattering going on. I’m not even sure if that’s being used in this scene. Then there’s a material layering system for laying down layers of material, dirt and snow and other things, all making that work. Then there’s the light bouncing off of the geometry, which is another system for lighting on top of the global illumination system.
Lex Fridman (01:33:47) What about reflections too? Does that count as the light bounce? There’s light bouncing off of stuff to light it up in different interesting ways, but then there’s also actual literal reflections. We’re looking at a puddle in the dirt.
Tim Sweeney (01:34:01) Yeah, yeah, that’s right, but the engine supports a number of different reflection techniques. One is calculating, basically, textures that capture all the lighting in the scene, and then bouncing that off of texture maps. You can see different lights bouncing off of different pixels in different ways. Then there’s individual lighting casting reflections off of things too. A lot of this is under the control of designers. One of the things that’s a yet-to-be-solved problem for the future is that you don’t just press a few buttons and this kind of scene magically appears.
(01:34:33) This is a lot of work from some highly skilled people, not only building out this particular scene, but then setting up the material layers so that you get the dirt with the ice layered on top and all the reflections working. They had to make a number of technical art decisions to make this work. If a novice who hadn’t worked very hard built a scene like this, it wouldn’t look nearly as good. One of the challenges we have is to make building this kind of quality level even easier and more seamless and automatic. You’d like to just build a scene and say, “Use this material here and have this appearance come out of it.”
Lex Fridman (01:35:06) Yeah. I mean, once you create the scene, you could do things. I remember where they said, “Can you turn off the headlights?” I forget. You could control the lighting. I mean, all of this, we should say, this is dynamic. You can change the position of the light. You can turn on the lights and off the lights. That’s incredible. This is all real time, the geometry, the lighting, the textures, all of it, real time.
Tim Sweeney (01:35:35) This is the power of awesome technical art, three decades of feature development. You have to give credit, also, to the 20 teraflops of graphics performance that Nvidia is delivering.
Lex Fridman (01:35:48) Thanks, Nvidia.
Tim Sweeney (01:35:51) From 90 megahertz to this. 90 megahertz is 90 megaflops; this is 20 teraflops. That’s a big change.
Lex Fridman (01:35:58) That’s a lot. One of the other things that they talk about in the presentation is snow. If you’re talking about 1943 Nazi Germany in the winter, you have to create a feeling, one part of which is the season, the winter, the cold, and you have to cover everything in snow. Shown here is the ability to control how much snow covers the objects. The ability to do that for the artist is incredible, just to control how much snow is in the scene dynamically like that. That’s cool.
Tim Sweeney (01:36:40) Yeah.
Lex Fridman (01:36:41) That’s really cool.
Tim Sweeney (01:36:41) It’s a cool system for material layering and a dozen pieces coming together here. You also notice there’s fogginess, and there are some hot objects emanating fog. An artist did that; it didn’t just arise automatically.
Lex Fridman (01:36:54) That’s called material layering. An artist creates the different materials and is able to layer the scene with them.
Tim Sweeney (01:37:02) Yeah, layer materials on top of each other and set how much of each material should be protruding in different places, with the engine handling transitions and things like that.
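One common way engines implement that kind of layering is height-and-coverage blending: each layer carries a height pattern, and an artist-facing coverage slider decides how far down into the crevices the top layer reaches. A deliberately tiny per-pixel sketch; the parameter names and the softness term are illustrative, not Unreal’s material system:

```python
def blend_layers(base_color, layer_color, layer_height, coverage, softness=0.1):
    """Height-based two-layer material blend for one pixel.

    coverage: artist-controlled 0..1 slider (e.g. "how much snow").
    layer_height: 0..1 height of the top layer's pattern at this pixel;
    high spots receive the layer first as coverage increases.
    """
    # How strongly the top layer shows through at this pixel.
    t = (coverage - (1.0 - layer_height)) / softness
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    # Linear interpolation between the two layers' colors.
    return tuple(b + (l - b) * t for b, l in zip(base_color, layer_color))
```

With dirt as the base and snow as the layer, coverage 0 leaves bare dirt, and raising the slider lets snow settle first on the high spots of the pattern, with a soft transition band handled by the engine-style clamp.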
Lex Fridman (01:37:10) That’s on top of the geometry that creates the structure of the scene and all the occlusions that have to be computed. Okay. I got to go to the other one that was just blowing my mind, which is smoke. Let me see. That. Look at that. Yeah. There’s a fire in a trash can with the smoke and the shadows, the lighting and the shadows interplaying on the smoke. This is real time.
Tim Sweeney (01:37:48) Yeah, that’s all real time.
Lex Fridman (01:37:49) What the hell? How do you do that? How do you do the smoke?
Tim Sweeney (01:37:55) Well, there’s a really powerful particle system underneath. It’s providing the technological foundations for this sort of thing, but there’s awesome artistry on top of that and an awesome physics engine powering it. It’s hard to tell exactly which piece is doing what, but you have several different particle systems there. There’s one for the fire, and then there’s another one for the smoke coming out of it. The really interesting thing happening with the smoke here is that it’s occluding light. There’s a calculation of how the light should diminish as it travels through smoke. The lighting on the smoke is the really interesting thing you’re seeing. There have been a lot of attempts, but this was the first demo where I felt like this kind of smoke really no longer looked like a video game. It looked like just a burning trash can billowing out dark smoke. Yeah, the artists’ sophistication is a very, very large part of it.
Lex Fridman (01:38:54) Yeah, again, it’s the interplay between the tooling and the artists. Yeah, like that. I could watch that for a long time. There’s something magical sitting around a fire in real life and just watching the fire and the smoke. I mean, humans have been doing that for, I don’t know, hundreds of thousands of years maybe. Then that same, I was just staring at that, and I wish the people would just stop talking and I could just watch the fire infinitely. I mean, that’s immersion. That’s like I want to be in there, I want to sit around that trash can with the fire and the smoke and watch and maybe warm myself because I was also feeling cold because of the snow. You’re like, you really get immersed into the thing. I mean, it’s so beautiful. It’s true art. It’s true art. It’s just really wonderfully done.

Creating realistic humans

(01:39:45) Okay, so I got to ask you about the humans. We talked about what it’s like to create the scenes, but creating realistic humans is really tough. Can you speak to that? How do you create ultra-realistic humans? You have an actor behind this to convey emotion, show the nuances and details of the faces, and maybe this is a good opportunity to also mention MetaHuman Creator, that’s part of Unreal Engine.
Tim Sweeney (01:40:14) Yeah, that’s right. Humans are, by far, the hardest part of computer graphics because millions of years of evolution have given us dedicated brain systems to detect patterns in faces and infer emotions and intent, because cavemen had to, when they see a stranger, determine whether they were likely friendly or they might be trying to kill them.
(01:40:35) We people in the world have extraordinarily detailed expectations of a face, and we can notice imperfections, especially imperfections arising from computer graphics limitations. It becomes by far the hardest problem. The MetaHumans effort is part of a decades-long initiative that Vladimir Mastilović, the most talented digital humans visionary in the world, has been working on for generations and generations of games, serving individual clients around the game industry for a while, then joining Epic as part of the 3Lateral team and now leading a worldwide effort to build all of the technologies required to make digital humans realistic.
(01:41:18) One part is capturing humans. We’ve got really advanced dedicated hardware that puts a human in a capture sphere with dozens of cameras in it, taking high-resolution, high-frame-rate video of them as they go through a range of motions. Then capturing the human face is complicated because of the nuanced detail of our faces and how all of the muscles and sinews and fat work together to give us different expressions. It’s not only about the shape of a person’s face, but it’s also about the entire range of motion that they might go through. Capturing one human requires a few hours of capture work in a dedicated environment like that, then thousands of hours of processing work to capture a precise and real-time-replicable version of that human in the environment.
(01:42:03) One of the things that’s done is just capturing an actor or actress in the real world and then using them in a video game. But the much more interesting thing going on is capturing thousands of humans to form a dataset whose goal is to encompass the entire range of faces in all of humanity, so going around every culture, every continent, every age and every face variety and capturing representative people so the entire range of faces is represented.
(01:42:29) Then being able to combine and merge those together to enable recreating an arbitrary face that the system’s never seen before. One of the ideas is to capture giant amounts of this high-precision data and then use it to reconstruct a face at a consumer level, like maybe take an iPhone photo of somebody’s face and then capture a very accurate depiction of that, not by synthesizing it then and there on that device, but by combining all the known details of human faces to reconstruct the most accurate representation of it. That’s the data problem.
(01:43:03) There’s a lot of other problems with computer graphics. There’s technology for rendering hair, which is really hard because you can’t render every hair. Again, we know the laws of physics; it would be easy to just render every hair. It would just be a billion times too slow. You need approximations that capture the net effect of hair on rendering and on pixels without calculating every single interaction of every light with every strand of hair. That’s one part of it.
(01:43:27) There’s detailed features for different parts of faces. There’s subsurface scattering because we think of humans as opaque, but really light travels through our skin. It’s not completely opaque. The way in which light travels through skin has a huge impact on our appearance. This is why there’s no way you can paint a mannequin to look realistically human. It’s just a solid surface and will never have the sort of detail you see in real skin.
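One classic cheap stand-in for the effect Sweeney describes is "wrap" diffuse shading, which lets light bleed past the geometric shadow terminator the way subsurface scattering softens skin. It is a well-known real-time trick, not necessarily what Unreal’s skin shading actually does:

```python
def wrap_diffuse(n_dot_l, wrap=0.5):
    """'Wrap' diffuse lighting, a classic cheap approximation to
    subsurface scattering: light wraps past the terminator, softening
    the hard shadow edge that makes opaque materials look like plastic.

    n_dot_l: cosine between surface normal and light direction (-1..1).
    wrap=0 reduces to standard Lambertian diffuse.
    """
    return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))
```

At the terminator (n_dot_l = 0), plain Lambert shading goes completely dark, while the wrapped version stays softly lit, which is exactly the soft falloff that makes skin read as translucent rather than painted.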
Lex Fridman (01:43:52) We should actually just linger on that. That kind of blew my mind thinking through that. I think I heard that the oiliness of the skin creates very specific, nuanced, complex reflections, and then some light is absorbed and travels through the skin, and that creates, would it be fair to say, micro-shadows or something? It creates textures that humans are able to perceive, and it creates the thing that we consider human, whatever that is. You have to compute both of those, the reflection, how light interacts with the oiliness of the skin and how it is also absorbed in, and all of that while considering all the muscles involved in making the nuanced expression, just the subtle squinting of the eyes or the subtle formation of a smile.
(01:44:44) It’s a stupid, annoying subtlety of human faces that you have to capture, the difference between a real smile and a fake smile. Man, I love human faces. I love humans in general. But the way to show the beginning of a formation of a smile that actually reveals a deep sadness, all of that. When I watch a human face, I can read that. I can see that. Again, this is the engineering and the artist. You have to have the tools that in real time can render something like that and that’s incredibly difficult. But anyway, sorry. There’s a lot of this kind of complexity in even just the lighting of a face.
Tim Sweeney (01:45:23) That’s right. Getting faces right requires the interplay of literally dozens of different systems and aspects of computer graphics. If any one of them is wrong, your eye is completely drawn to that and you find it on the wrong side of the uncanny valley. The level of perfection needed in this area is vastly, vastly higher than world rendering or grass or any of these other things. If the shadows on a work of architecture are slightly wrong, you’re pretty [inaudible 01:45:53] with it actually. Your brain doesn’t really care that much. But if anything is wrong with a human, it’s totally jarring.
Lex Fridman (01:46:01) Can you speak more to the creation of digital humans with MetaHuman, both on the editor side and the bringing-it-to-life side? It seems like, because I’ve watched a bunch of videos of individual developers doing it, it’s not too difficult to bring a human to life using the tooling that the Unreal Engine editor provides.
Tim Sweeney (01:46:24) There are two main tools, compared to the old days where every face was created by hand by an artist from scratch. One is the MetaHuman Creator tool for creating faces, where you have a huge number of parameters you can adjust to create a unique human by adjusting all their different characteristics. You can then bring that from MetaHuman Creator into Unreal Engine. Then you can add all kinds of computer graphics features there in the engine. You could add clothing using the cloth simulation system, and you can adjust the hair and all these other parameters on the thing.
(01:46:59) Then there’s MetaHuman Animator, a tool for animating a human based on a facial capture, which can be done on a device as simple as an iPhone, and which transfers the captured animation to the human you want, which is not straightforward. If the actor has one face shape and the character on screen has another face shape, the translation that needs to be done from the actor’s face to the character’s face is actually really sophisticated and non-obvious. If you just applied it literally, then it would be completely wrong from your point of view.
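The retargeting problem Sweeney describes, that copying an actor’s motion literally onto a differently shaped face would be completely wrong, is commonly attacked by transferring deltas from a neutral pose rather than absolute positions. A toy vertex-level sketch under that assumption; production retargeting works on rig controls and learned correspondences, not raw vertices:

```python
def retarget(actor_neutral, actor_frame, target_neutral, scale=1.0):
    """Transfer one frame of facial capture from actor to character.

    Each argument is a list of (x, y, z) vertex positions. Copying
    actor_frame directly onto the target would warp the character's
    face; instead we copy only the *deltas* from the actor's own
    neutral pose onto the character's neutral pose.
    """
    out = []
    for a_neutral, a_frame, t_neutral in zip(actor_neutral, actor_frame, target_neutral):
        delta = tuple(f - n for f, n in zip(a_frame, a_neutral))
        out.append(tuple(t + scale * d for t, d in zip(t_neutral, delta)))
    return out
```

Even this toy version shows why a literal copy fails: the target keeps its own face shape and only inherits the motion, with `scale` hinting at how real systems adapt the motion's magnitude to the new face.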
(01:47:27) Those are the main tools that people are using now. Then within Unreal Engine, you have a face and you can do absolutely anything you want to it. You could also, if you decide to go outside of the MetaHuman geometry pipeline, build your own face, like any creature of any sort, and then use the animation tools to animate it. But this is 30 years into a project that’s probably like 50 years in total to get to absolute photorealism and controllability for absolutely everything. There’s vast amounts of work still to do, and we don’t feel like we’ve solved the problem at all. We’ve just given artists a big productivity multiplier and a quality multiplier, but this is not in a state that we would say is done.
Lex Fridman (01:48:07) Nevertheless, I’ve seen people use it really effectively. I saw, almost like plugins, maybe external services, where you can get the faces to approximate the mouth movements required to speak a thing. That’s a really useful feature.
Tim Sweeney (01:48:25) That’s right. When you have an artist or actor in your studio and you’re recording a specific performance, you can just capture their facial motion and apply it. But if all you have is a voice recording or you’re generating a voice recording or it’s parametric or procedural or AI generated, then you need the system to translate that speech not only to movement of the mouth and lips, but also to facial expressions and the whole intent. When we’re speaking, it’s our whole face that’s active and emoting in different ways and not just a mechanical motion of the pieces.

Lumen global illumination

Lex Fridman (01:48:55) We spoke a bit about Nanite, the magic behind the virtualized geometry system, but can you speak a little bit to Lumen and, in general, what it takes to dynamically light, in all the complicated ways, the faces and the scenes that we discussed? What are some interesting things to you that made the magic of it happen?
Tim Sweeney (01:49:15) Lumen is a system for global illumination, meaning it’s supposed to calculate the interaction of light with the entire scene in a way that mimics reality. The first generation of engines that did lighting just said, “Well, the light casts light and the surfaces it hits are lit and the surfaces it doesn’t directly hit are dark, and that’s just all the techniques we have.” You’d have an area that wasn’t hit by any light being completely black.
(01:49:41) But in reality, light bounces around the entire scene dynamically. When a light hits a red wall, then most of the blue and green light is absorbed, but the red light reflects off and now is hitting other things. If you have a red wall with a white floor, light is bouncing off of the red wall into the floor, and now the floor is being turned red. The entire bouncing of light around the scene through multiple bounces is the critical challenge to solve here.
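The red-wall example can be written down directly: a diffuse surface re-emits the incoming light multiplied channel by channel by its albedo, so after one bounce off a red wall only the red channel survives to tint the white floor. A single-bounce sketch with made-up albedo values (Lumen iterates this kind of transport across the whole scene, at multiple scales):

```python
def bounce(incoming_light, surface_albedo):
    """Light re-emitted by a diffuse surface: the surface absorbs
    whatever its albedo doesn't reflect, channel by channel."""
    return tuple(l * a for l, a in zip(incoming_light, surface_albedo))

# White light hits a red wall: most of the green and blue is absorbed...
white_light = (1.0, 1.0, 1.0)
red_wall = (0.9, 0.1, 0.1)
off_wall = bounce(white_light, red_wall)

# ...and the surviving, mostly red light then tints the white floor.
white_floor = (0.8, 0.8, 0.8)
on_floor = bounce(off_wall, white_floor)
```

Chaining `bounce` calls is exactly the multi-bounce problem: every extra bounce multiplies in another albedo, which is why indirect light gets dimmer and more color-saturated the further it travels.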
(01:50:09) Again, laws of physics are known and so the complete solution to this, it was written down in the 1950s, I think. The real magic here in Lumen is this system that Daniel Wright developed over the course of many years based on ideas formed over a longer period of time to calculate the way lighting bounces around at different scales ranging from the scale of miles or kilometers down to the scale of pixels and millimeters. To not only calculate at each level, but integrate it seamlessly at each level to give the appearance of completely seamless and accurate lighting. Previous techniques were highly specialized and artists had to make a decision for each light about exactly what it did. The goal, and a lot of the practice with it right now is you build a scene, you place lights in it, and it just works to make it that much easier.
Lex Fridman (01:51:02) I mean, we’re watching, I’d recommend people go to this blog post, look at that. It’s dynamic, we should say. So there’s the indoors and the outdoors, and to be able to dynamically compute the impact of outdoor light, just look at that. Look at how gorgeous that is. It’s just the lighting. Look, we’re looking now at an image of a cave, external light lighting the intricate complexity of the inside of a cave. Look at that.
Tim Sweeney (01:51:31) Light in the real world goes through a lot of bounces and the effects of it are very subtle, but when they’re not there, you miss them. Often a person can’t point out why a scene is wrong, but they know it looks wrong, and it’s the lack of the subtle lighting cues that we’re seeing here.
Lex Fridman (01:51:46) We mentioned this for great video games, but also for great films: lighting can make a film, and we’re just looking at a very dramatic lighting of a scene. I can imagine stepping into the scene. It’s exciting, it’s terrifying, and all of that has to do with light, the interplay between light and darkness. It’s incredible. It’s really, truly, truly incredible. Light is everything. Then to put the power of the tooling in the hands of an artist, that is really special.
Tim Sweeney (01:52:16) The industry has gone through a massive evolution and there’s so many supporting systems to make this awesome, and always artists.
Lex Fridman (01:52:25) We’re looking at reflections on smooth surfaces. Oh boy. Oh boy, look at how gorgeous that is.
Tim Sweeney (01:52:35) That’s right.
Lex Fridman (01:52:36) Wow.
Tim Sweeney (01:52:36) You have to appreciate the algorithms are doing quite a lot here. You can have a scene with a huge number of not just lights, but also bright objects that reflect light off of them. Every one of those has to be captured in the reflections in order for it to be realistic. You can’t calculate every photon in the scene. You need really detailed approximations, and that’s the field of computer graphics. It’s about increasingly effective approximations of the laws of physics, which are just totally intractable.
Lex Fridman (01:53:04) But the result of that graphics is a feeling, is an experience by the viewer. It’s just to me as a fan of, well, let’s say beauty in the world, it’s exciting that we can create that synthetically, artificially, via graphics. That just, it blows wide open the possibilities of storytelling.

Movies

(01:53:25) Outside of video games, a lot of people are using Unreal Engine for movies, for films, and big congrats. I saw that War Is Over!, a short film made with Unreal Engine, won an Oscar. You can add that to the resume. That’s huge, an Oscar-winning film made with Unreal Engine. What do you see as the future of the use of Unreal Engine in creating stories in the film industry?
Tim Sweeney (01:53:57) Increasing capabilities and productivity. The limiting factor in every one of these businesses is cost, and the more the engine can make their jobs easier, the more power that brings them. One of the big revolutions we’ve seen in Hollywood is the moving away from doing computer graphics integration into a human scene with green screens to moving to these large LED wall panels where they’re displaying real-time computer graphics powered by the Unreal Engine. That’s a massive improvement in quality.
(01:54:29) You can recognize the old green screen movies because the lighting on the characters is just wrong. As much as they try to fix it up, it never really works. But when you’re filming in front of an LED panel with LED light emitters in front of you as well, the actor not only picks up all of the lighting from the actual natural scene that they’re supposed to appear in the movie, but they also can look around and see it, and they’re aware of exactly what set they’re acting in, and just the overall end result is that much higher. It’s as much because the actors are able to do their jobs better seeing the scene they’re in because the technology is enabling a better lighting calculation and a better interplay of virtual light and real-world light to make the end result awesome.
Lex Fridman (01:55:09) There’s a lot of excitement around generative AI. What do you think is the future of the interplay between what a human artist creates and what an AI system can create in Unreal Engine?
Tim Sweeney (01:55:23) I think a lot of people in the industry are overly optimistic about the rate of progress of AI for video and other things like that. The real problem is consistency. Spitting out a single image at really high quality works, but with video, over the course of a scene, all the AI approaches have consistency issues going from one place to another. I don’t think that those will just be remedied easily. Fundamentally, AI just doesn’t have anything resembling an understanding of the entire scene it’s in, the entire arc of the movie or plot it’s in, and the entirety of the world around it and how it might affect the scene, whereas game engines have that exactly where they need it to be. I think what we’re going to see in the space of world-class, high-quality productions isn’t that everybody moves to AI and a large part of the human creatives contributing to that become obsolete. I think what we’re going to see is AI becoming a multiplying force on the power of human creatives, making them able to create better stuff more quickly and with higher-quality end results.
(01:56:28) Unlike the fields of generative 2D art and generative text, I think the future of AI here is much more complex and nuanced. I think your interview with Mark Zuckerberg conducted in VR was a really good first example of this. You did this VR discussion. It was capturing your faces and then rendering a completely 3D computer graphics model of your faces. Then the end result was patched up by an AI image enhancer that was able to add an awful lot of the missing subtleties that are lost by normal computer graphics rendering. That’s the first step.
(01:57:05) You can imagine the output of Unreal Engine being enhanced by an AI pixel shading post-processor, for one thing. You can imagine creation of objects being enhanced, especially mashing up high-quality objects that have already been created. Epic’s Quixel team went around the world and scanned tens of thousands of real-world objects at extremely high levels of quality. They have everything from rocks to trees to archeological finds and so on, all captured there. We have an awesome library of them on the Fab content site. What’s missing is the ability to create arbitrary amounts of new content. I think using data like that and AI to create completely new trees that meet your specification, from all of the knowledge that has built up of high-quality scanned trees, is going to be a really valuable thing.
(01:57:55) But I don’t see this reducing the need for people or the role of people. Rather, I think it actually is probably an enhancer of that. I can’t help but think, when I go on Amazon and Netflix to watch a movie, there’s an awful lot of linear content and most of it isn’t very good because of the limitations of the media and the budgets and of other things. If we can use AI as an enhancer on that, then everybody’s going to have even more opportunity than they have now. Every single technological revolution has changed the way that people work, but it’s ultimately created more opportunity for people. The pundits are predicting that this might be the last one, but I think just the opposite. I’m an optimist on this, and an optimist that it’s going to create opportunity for everyone.
Lex Fridman (01:58:40) Do you think it will be possible to use generative AI to create dynamic objects, like you mentioned trees, in the Unreal Engine world, to create meshes and textures and empower the creator to create faster, using meta-knobs like hyperparameters versus very nuanced controls, where you can control much faster the look of a face, the look of a tree, all that kind of stuff?
Tim Sweeney (01:59:10) I think that’s the central challenge of the next decade of game engines and AI for content creation of all sorts, because you have two very different models of the world that are emerging. There’s the scene graph, the technical term we use to describe the set of all of the objects in a 3D world maintained by Unreal Engine or another engine. In the videos you saw, it’s the rocks and the trees and the snow and the bridge and the people and all of these things. Each one has enormous amounts of data attached to it. Some are texture maps, some are sound files, some are animation files, enormous amounts of detail all stored there in this precise computer graphics representation that enables rendering it from any perspective with any settings and so on. It’s a completely general system that has complete context about the state of the world at any point. You can always precisely reproduce it. If you play the same scene 10 times in a row, it’s always the same. It’s never randomly changing. You’re never like, “Oh no, why did this character’s face change midstream?”
(02:00:14) But it’s also rather limited because you have to build everything manually. That’s costly and it’s time-consuming. It requires expertise. Then you have this other model of the world, which is what AI sees or thinks. If we could peer into what’s really happening in its parameters, there’s something like the mushy connections of neurons in a brain. It has a vast amount of knowledge about the world and about graphics and about images and about people and about everything else. It’s stored in a human incomprehensible form, but it can be extracted through queries like asking it to produce an image from a prompt or a video from a prompt or whatever.
(02:00:50) But the huge problem with that is it’s very mushy data. We don’t know how to give it a command that will give us a precise result. If it produces one image one time and we change our prompt slightly, it might produce something completely different. We are unable to art-direct it. It’s this completely untamed tool. I think when we figure out more and more ways to merge these and connect these two together, you can imagine AI enhancing the process of content creation in a traditional scene representation. You can imagine the scene representation being shared with the AI so the AI not only sees a prompt, but also here’s a list of all of the objects in the world and the characteristics and so on. It can learn more about how those objects should move and interact.
(02:01:35) If you get a constant feedback cycle going back and forth between an engine and AI, then I think you can get the best of both worlds: stable scenes, but also the higher productivity of being able to get content out, and the ability to select specific parts of it and art-direct those and have that art direction stick and be recognized as part of this permanent scene representation.
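The scene graph Sweeney contrasts with AI’s "mushy" internal representation is, at its core, a tree of named objects with typed data attached, which can be serialized and handed to an AI as context alongside a prompt. A deliberately tiny sketch; the node fields and the text serialization are illustrative only, not Unreal’s actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class SceneNode:
    """One object in the scene graph: a name, arbitrary attached data
    (texture maps, sound files, animation files, ...), and children."""
    name: str
    data: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

    def describe(self, depth=0):
        """Flatten the graph into text, e.g. to share with an AI so it
        'sees' every object in the world, not just a prompt."""
        lines = ["  " * depth + f"{self.name} {self.data}"]
        for c in self.children:
            lines += c.describe(depth + 1)
        return lines

world = SceneNode("world")
bridge = world.add(SceneNode("bridge", {"material": "stone"}))
bridge.add(SceneNode("snow_layer", {"coverage": 0.7}))
```

Because the graph is explicit and deterministic, replaying it gives the same scene every time; the feedback cycle Sweeney describes amounts to letting an AI read and edit nodes like these instead of regenerating pixels from scratch.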
Lex Fridman (02:01:57) I can’t wait until AI can operate not in the space of pixels, but in the space of scene graphs, creating objects in the scene graph, whether it’s, like you mentioned, audio or any of the other things that empower the creator. That’s a super exciting future. I wonder if you could speak to a fear that people have on this topic, of artists and engineers losing their jobs, being replaced by AI. Are there words of hope that you could offer them?
Tim Sweeney (02:02:32) This is certainly the most extreme example of it, because AI is just so far ahead of prior technologies, but similar fears were had in every other industry. There was a fear that digital music synthesis would obsolete musicians. There was a very brief period of time in which songs with digital music instruments, like the early Minimoogs and Yamaha synthesizers, weren’t allowed to win certain music industry awards because they weren’t considered real music. Then over time, people were educated and realized these are just instruments people are playing, and they’re controlling them the same way they did before. There were similar questions about whether computer art built in Photoshop is really art, or just goofy computer stuff. I think nowadays digital artists have gained respect. If you look at just the tools that have existed in Photoshop, some of them are pretty sophisticated, and nowadays they have AI features. But I think AI is ultimately going to be another tool in the artist’s tool set.
(02:03:33) I think it’s going to become a more powerful, directable, and human-serving tool in the future. I think a lot of the alienation comes from the prompt being immensely powerful at giving you an entire creation, but then being completely unwilling to let you control the nuances of it. That feels alienating. It gives you an image, but you’re like, replace this part of it with this thing or make that object green, and it just can’t do it. Often it can’t be convinced with any number of words in the prompt. That makes it feel like the computer is taking control away from us humans and artists, and is refusing to do what we want and has its own opinions. It feels like a competitor.
(02:04:14) I think when we have much, much, much more nuanced control of it and artists can jump in and just say, let’s enhance this object, do this, do that, it’ll feel like some of the tools that exist in Photoshop, which, compared to a paintbrush, are in some ways superpowers already. AI will come to feel like that too, and will increasingly serve creators creating and enhancing a work in a way that feels like just a natural extension of their own bodies and minds.
Lex Fridman (02:04:40) Of course, there is a real human pain to layoffs, and there’s a hype around AI, and then companies might try to implement AI systems and, in so doing, lay off a bunch of folks, and the pain that those folks feel is real. I think there’s always going to be pain with these kinds of transformations that are happening, and it’s a terrible pain. Pain in general in the human experience is terrible.
(02:05:08) But I think I’m personally excited by the human-AI collaboration as you’ve described in this whole process. I think if you just keep being open to using the tools, constantly trying the cutting-edge tools, seeing how they can make you more productive, how they can empower you as a creator, as an artist, or as an engineer, I think you’re going to just keep winning.
Tim Sweeney (02:05:32) There’s a lot of complicated trends underway. It can be hard to break them down and distinguish them. I think a lot of the theories that get the biggest traction on social media often don’t capture the real underlying motive forces at play there. But I think AI involved in code production will probably create a net benefit for the need for humanity to be involved in coding. It may change parts of jobs. I don’t think it’s going to obsolete anybody who’s willing to learn new ways of doing things. It’s always been this way.
(02:06:05) I think that there's also a lot of overhype in AI. AI is really great at spewing out code that does something a million GitHub repositories already do, because it's learned the underlying pattern. It's notoriously hard to get it to do something new that hasn't been done before, especially when it's a complex task. The more code you ask AI for, the more it leaves you with just a mess of code that sort of works. And that's the problem with code: it 99% works, but for the last 1%, it might be harder to get to 100% with AI than with hand coding. Everybody who's looking at this topic should actually try using the coding assistants on hard problems and see how they do there.
Lex Fridman (02:06:43) I think for me personally, it makes it more fun and faster to generate boilerplate code so I can focus on the harder decisions, the harder big-picture decisions and harder innovative decisions and all that kind of stuff. It just makes programming more fun for me because I feel less lonely. Even when it gives the wrong code, I go, "Oh, okay, well that's a way to do it. That's interesting." Then you could talk to it. Maybe that shows something about the programming experience, that it is in part sometimes a bit lonely.
Tim Sweeney (02:07:20) The topic of boilerplate code is an interesting one, because the mere existence of boilerplate code is a failure of programming languages and of the idea of creating software modules. You ask AI to create a sorting function, great. Now you have another sorting function that might be buggy, alongside the million others that different people have written. It would be better to have a sorting function that's been written and tested and optimized, and everybody relies on it. More modular software, I think, will actually reduce the opportunity for AI, because people doing programming work will largely be solving unique problems that are actually hard problems in themselves, not just connecting widgets.
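A minimal sketch of Sweeney's sorting example, with hypothetical data: an AI can spew out yet another hand-rolled sort, but the modular answer is the one tested implementation everybody already relies on (here, Python's built-in Timsort).

```python
# What an AI might generate on request: one more sorting routine,
# redundant with the million that already exist and possibly buggy.
def bubble_sort(items):
    items = list(items)
    for i in range(len(items)):
        for j in range(len(items) - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

# The modular alternative: everybody relies on one written, tested,
# optimized implementation instead of regenerating boilerplate.
scores = [42, 7, 99, 7, 13]
assert bubble_sort(scores) == sorted(scores) == [7, 7, 13, 42, 99]
```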

Simulating reality

Lex Fridman (02:07:59) I think, as in many cases, AI will just help improve the human systems by shining a mirror on ourselves. I have to apologize for the pothead question ahead of time. Let's talk about the metaverse broadly. You've been a big proponent of the idea of the metaverse. We'll talk more specifically about what that means today, but we've been talking about simulating reality better and better and better. The pothead question is: what does it take to simulate reality to the level we see around us today? How far away are we from simulating this ultra-realistic, immersive, fun reality that Earth is? What does it take?
Tim Sweeney (02:08:45) We're going to get shockingly close over the coming years, certainly less than 20 years. If you look at the progress, at which areas we've achieved total photorealism in and which areas we fall short in, we're getting very close in all the non-human things you see in the world, walking through a jungle or a city, all the lighting. It's very close, and that might be just a few years away. But then all of the problems that involve humans, human dialogue and intent, have a much, much, much higher bar that they need to meet to satisfy our brains and convince us that they're realistic or real. I think that's going to be the primary challenge of graphics development and simulation development over the coming decade.
Lex Fridman (02:09:27) The realistic humans, that's going to be the bottleneck. Visual and behavioral too, so everything?
Tim Sweeney (02:09:35) Yeah, I was asked about this about 10 years ago, and I said that even if you gave us an infinite amount of computing power, we couldn't simulate realistic humans because we simply don't have the algorithms. We have no idea how to simulate human intelligence. That was absolutely the case then, but it's not really true anymore. What we're seeing with generative text AI is at a level where you could say it's actually doing a pretty good job of simulating a human, at least at the text level, not at the emotional level yet, but at least at the level of words spoken. And as we find more and more ways of training on more and more scenarios, you might have a very, very compelling human simulation going on in the next five years even. I'm not saying it's a good idea, but I think the arc of the technology is inextricably heading that way, and it's heading there at a shocking rate.
Lex Fridman (02:10:27) We don't say this enough, but the current state of LLMs, I mean if you put Alan Turing in conversation with ChatGPT, it really passes the Turing test, almost definitively passes the Turing test. Of course, we keep raising the bar. Well, the Turing test is not a real test, it's not a useful test, whatever. We just keep raising the bar for AI so that it's always going to be lesser than. But you have increasingly ultra-realistic faces and bodies combined with increasingly moving, powerful speech and text, full of emotion.
(02:11:09) I work with this amazing company called ElevenLabs that does text-to-speech well. There are companies that specialize in bringing text to life, and there are going to be increasingly more companies that do that very well. Then all of a sudden you have this synthetically created scene where a human is speaking and you're moved to the point of tears because of the scene, the beautifully lit face in the full darkness, the emotion, the drama of the scene. So you're saying five, 10 years, maybe 20?
Tim Sweeney (02:11:42) Yeah, absolutely. We’ll definitely see it in our lifetimes.
Lex Fridman (02:11:46) Increasing the level of potheadness in my question. Do you think we might live in a simulation? If we do or don’t, how hard would it be to build such a simulation where we’re fully convinced we’re in it?
Tim Sweeney (02:12:03) Well, I don’t think that these questions are necessarily unanswerable. I think I’d like to see more actual effort to ascertain what is the underlying mechanism of the universe. I don’t think we’re here for no reason at all. I think the world’s a pretty cool place, and the fact that we can exist and know the laws of physics and especially the standard model of physics and all of the parameters that lead to these atoms and life evolving in the presence of thermodynamic gradients, that’s really cool. I think it’s a worthy field to study more about that holistically.
(02:12:37) I don’t know. The question of are we living in a simulation ourselves always boils down to, well, if we are living in a simulation, what are they living in? Because at some point there has to be some base reality. One of the philosophical theories that was put forth seriously was that there’s no physical reality. If you have a system of equations such as the laws of physics, then all possible evolutions of dynamical systems under those equations kind of have a physical reality. We just are kind of a manifestation of laws of math rather than needing an actual universe around us. I don’t know. I like dabbling in that philosophy. As we get AI becoming smarter and smarter and we get closer and closer to really capturing the full laws of physics, these questions become quite a lot more compelling.
Lex Fridman (02:13:23) You start to think, if we're not living in a simulation, what are the things about this reality that are not simulatable? What are the big mysteries around us? It feels like the physics is simulatable. It feels like a lot of the incredible stuff that we talked about, while super nice, seems simulatable. But then there's the flame of consciousness, the feeling of it, whatever that is that lights up in our eyes as humans. Maybe that's not simulatable. Maybe that is the thing. Maybe that's a thread that connects to the explanation of the mechanism, as you said, of the universe that's really important to understand, and we're completely clueless about that mechanism. I mean, a lot of the religious texts sneak up on what that mechanism is, but we're still mostly clueless. We only have these leaps of faith and beliefs about what that mechanism might be.
Tim Sweeney (02:14:16) The whole idea of nested simulations, perhaps given sufficiently advanced technology is kind of mooted such that if you wanted to simulate another reality, you’re kind of just actually creating the reality. You’re doing quantum mechanical operations that would produce the same result anyway, and you’re running them at full performance. It’s not really a nested simulation, it’s just another thing that’s happening in the universe. That would be interesting.
(02:14:44) But I think it's ultimately a theological question. Because it's no longer cool to deal with theology as part of science, there's not been much work on that. You can't publish results on those topics in a respected physics journal. I think it's kind of been set aside. But it's interesting to note that the laws of quantum mechanics themselves have a place for God or souls or whatever external source of input you might want to attach to such a thing, in this idea of quantum wave function collapse: when we look at a quantum system evolving in perfect superposition of many possibilities and you go to observe it, you actually just see one specific possibility. In the multi-slit experiment, the light ultimately ends up being observed going through one slit or the other. That's a place where there's this random number being injected into everything around us, trillions of trillions of trillions of times per second, in everything we're observing. If you want to attach some external input, well, there's a place.
Lex Fridman (02:15:46) It could be seriously accessible to the rigors of science, but we just know so little there.
Tim Sweeney (02:15:52) It's funny. In that area, we know nothing more than cavemen knew whatsoever. We know the laws of quantum mechanics and we have computers that may soon be more advanced than we are.
(02:16:03) Computers that may soon be more advanced than we are, but we just don't have any answers to the fundamental questions about life, the universe, and everything.
Lex Fridman (02:16:13) Do you think, more practically, we'll create video games, video game worlds of the metaverse variety, in which humans will want to stay? To me, in this kind of discussion of a simulated reality, the real test of immersion is not wanting to go back to the real world. For a perfectly healthy, excited, normal human being to choose to stay in that world, how hard is that, do you think?
Tim Sweeney (02:16:48) Well, I think the technology is coming and then there’s a human question of should we go that far?
Lex Fridman (02:16:54) Should we? Yeah.
Tim Sweeney (02:16:56) Yeah, certainly as game developers ourselves at Epic, we don't aspire to that. We make fun games.
Lex Fridman (02:17:01) Yeah.
Tim Sweeney (02:17:01) And the ultimate manifestation that we found is fun games that people play together to have fun in between work and the other things in their real lives. But as the simulations get more and more realistic and the capabilities become more and more real, I think we have to ask ourselves some hard questions about how should humanity operate in that space? What are the limits that we should go to and what are the limits we should set?
Lex Fridman (02:17:23) Yeah, I think there are going to be some hard questions, and maybe I'm just being human-centric here, but there should probably be some legal bounds on two things: first, not creating a reality in which humans would want to stay too long, focusing more on the game side, and more importantly, not creating simulations of humans that could suffer. To me, as we talked about creating ultra-realistic humans, eventually that means creating humans that can suffer, that can fall in love and experience heartbreak and loss, that can fear death. And the more you simulate the full reality of the human condition, the more you get to this place where you have simulated humans that are able to suffer.
(02:18:23) And legally speaking, I think you have to get to a place where that's not allowed. There is a line you can't cross, and that's a hard thing for humans to deal with. That's going to be some interesting Supreme Court cases. Once you create a human sufficiently realistic that they can suffer, that means that human could be tortured, and you could do terrible things to that human that's artificial, quote, unquote, but boy, that still feels wrong. I don't know what that is, but it feels wrong to torture a simulated human. Now when you play a video game and it's a shooter and everybody's having fun, that doesn't feel wrong, but there's a line, and that's going to be a fascinating line for the Supreme Court to explore. Oh man, what an exciting future we're living in, huh?
Tim Sweeney (02:19:28) Yeah. I think the thing to appreciate is game developers have just generally been on the good-spirited side of things. If you look at the worst things that people do in popular video games today, it's like, what? You rob a bank in GTA? But it's clearly fictional and all in fun, not serious, and over the top. As things get more realistic, especially the simulation of humans, yeah, there are some hard questions that will have to be answered there. But the thing that all game developers need to remember is we're here to make people's lives better by entertaining them, providing them with fun and a diversion from other things, and being a part of their lives without trying to be too big or too much, not trying to provide an alternative to reality, but just to provide a fun source of entertainment like the many other things that people do for fun.

Metaverse

Lex Fridman (02:20:22) So you've spoken, like I mentioned, about the metaverse for many years. Let's step back. What is the metaverse? And speaking of fun, Fortnite: just hundreds of millions of people enjoying themselves in this huge-scale social game. You could call it a metaverse. Maybe you can describe the different flavors, the layers of how you see what the metaverse is.
Tim Sweeney (02:20:52) The metaverse is an idea whose stock price goes up and down depending on who says what on what day, and some have an ability to drive it way down by opening their mouths. But ultimately this is about multiplayer social gaming experiences, you and your friends getting together in a 3D world and having fun together in any way you want. If you're playing Fortnite Battle Royale, in my view, that is capturing the essence of the metaverse. Especially when we got Sony on board, so that all players on all platforms in Fortnite could play together, could voice chat together, and could be part of a single game experience, it really took on a new nature: not just a multiplayer game with heritage from Doom, but also a true social experience between you and your friends. And Fortnite Battle Royale is just one manifestation of that. Another one is Rec Room, where you're standing around in VR with friends playing billiards or shooting hoops or doing other light entertainment things.
(02:21:54) I think every game that has a huge number of players who play together socially as part of their entertainment lives is really getting at the core essence of the aspiration for the metaverse. And we're still in the very early days of it. I was on the internet in like 1992 or so, and it was a pretty bare-bones thing. I think when we look back at the state of gaming today, we'll realize that there's a lot further to go to get to the ultimate version of it, but I think it's all on track. It was when we released Fortnite Battle Royale, and all of the people at Epic started playing together in squads and experiencing that world, that we realized this trend was afoot and that we needed to do everything we could to bring in other creators, so that anybody could pile on to the work we were doing by creating their own worlds through Fortnite Creative and UEFN, creating more games and more genres that people could play, ever expanding the repertoire of fun.

Fortnite

Lex Fridman (02:22:57) Yeah, I would love to talk about different aspects of that a little bit more, because Epic has created a lot of amazing games, Unreal Tournament, Gears of War, but the game that, I think it's fair to say, transformed the gaming industry was Fortnite, Fortnite Battle Royale especially. Can you explain the origin story of Fortnite?
Tim Sweeney (02:23:16) Well, Fortnite has humble beginnings. In 2011, we'd just been in the final days of finishing one of the Gears of War games, and we wanted to explore ideas for new games. We'd had a general idea that we would like to build some smaller online games and learn more about that space, and not just have one single massive game in production at all times. And so everybody in the company was given a week to form a team, work with whichever co-workers they wanted, and build a game using Unreal Engine; you can actually build something pretty interesting in a week. One of the teams built the very first version of what became Fortnite. The very first version of it had a different art style, but it had the idea at the core that you're going to build forts by day using this building system. Then night would come and you'd defend the forts against zombies, and the longer you could go, the more elaborate forts you could build and the more survival waves you could withstand, and it would get cooler and cooler with time.
Lex Fridman (02:24:20) Nice.
Tim Sweeney (02:24:21) And that game was in development for a very long time. We always saw the potential; just the building aspect of it was incredibly fun, but we made different pivots at different times. At one point we moved to the current Fortnite art style, away from more of a realistic style, making it more in the Pixar vein of cool stylized characters.
Lex Fridman (02:24:43) What was that decision like? Because, as you mentioned, Gears of War is this incredible game that shows off the graphics to the fullest, very different from the artistic style of Fortnite. It's amazing that the same company would make the fun, silly graphic style of Fortnite.
Tim Sweeney (02:25:01) People come to Epic because they want to work with the best people in the world, and artists bring a lot of different personal art aspirations and style capabilities, and many of them are very multi-talented in both photoreal content and highly stylized content. A lot of the best artists on Fortnite were a lot of the best artists on Gears of War too; they changed styles but continued doing awesome work. We'd realized that Fortnite could be really mainstream, a game people play for a long time, and so the goal was a more visually pleasing art style that's not as stressful as a Call of Duty game, where you're constantly pixel hunting in a dark scene for somebody's rifle scope. So a few of the artists got together to find a new art style and we moved to it, and at different points it evolved towards being kind of like a light MMO, like Destiny, with rather complex RPG and stat systems.
(02:25:50) And that evolved into kind of an MMO-like tower defense game; the MMO part only ended up being persistence of items and stats, which became Fortnite: Save the World mode, which we launched in early 2017, and it was a moderate success. It paid its budget and we came out ahead. Then at the same time the Battle Royale genre was booming; PUBG had just come out. Tons of people at Epic were playing it. They were like, "Oh, this would be so cool if it had Fortnite building."
(02:26:18) And so we assembled a team in a war room, like 30 people in one big room, and they worked insanely hard for four weeks to build Battle Royale. The nice thing is, all of the content for Fortnite had been built over the previous seven years. They had a huge library of content but no gameplay of the type they wanted, so they had to build all of that in those four weeks and ship it. That put Epic on an exponential growth curve where we went from 300 employees to thousands of employees, went from about $100 million in revenue to billions of dollars in revenue, and became the center of the gaming world at the time.

Scaling

Lex Fridman (02:26:54) Can you actually speak to the technical challenge of going from not being a large-scale online gaming company to being able to support, with Battle Royale, a huge number of people playing with each other at the same exact time? In four weeks. What were the technical challenges there that had to be overcome?
Tim Sweeney (02:27:17) Since 2012, we'd been building online backend systems to support player accounts and login and all of the different systems needed to make a multiplayer game. We'd been building them to be scalable, and by some miracle, we built them stably enough that they were able to scale up. And so the online team, who was responsible for that code, spent a year of intense work getting it to scale from 40,000 concurrent users to 15 million concurrent users.
Lex Fridman (02:27:45) Yeah, I mean they’re scaling. They’re scaling. That’s a lot.
Tim Sweeney (02:27:49) That's immense. But they'd done such an awesome job of building the foundations that we found it was tractable, it was doable. If they hadn't done that, then the company would've died. Fortnite just wouldn't have been playable and the whole thing would've failed.
Lex Fridman (02:28:03) I mean, there's just so much detail there that makes all the difference. Spotify has talked about that, the latency: how quickly you can deliver the song changes the product from being this shitty thing where I'd rather pirate the songs, to being good enough that I really enjoy the experience and want to use it. So that's really important, to create an experience for 15 million concurrent users where it's not lagging, where it actually works. Is there something you could say about how difficult that is to pull off?
Tim Sweeney (02:28:50) The trend nowadays for building online services is microservices. There’s not one big server that handles all of the interactions with Fortnite. There’s game servers running 100 player game instances for each Battle Royale session, and then there’s an account server and many instances of it all talking to a shared database, and there’s hundreds of different microservices talking to each other. And so scaling is a matter of identifying what are the bottlenecks in that system and making sure that each one can scale and has enough redundancy to be able to handle the load. Thank God for Amazon Web Services and cloud hosting because Epic went to 15 million concurrent users without buying any server hardware.
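The bottleneck hunting Sweeney describes can be sketched as a toy capacity model. All service names and numbers here are hypothetical, purely to illustrate the idea that total concurrency is capped by whichever microservice saturates first.

```python
# Hypothetical capacity model for a microservice fleet: the system's
# ceiling on concurrent users is set by its most constrained service.
services = {
    # service: (instance count, concurrent users per instance)
    "game-server":     (150_000, 100),     # one 100-player session each
    "account-service": (500, 40_000),
    "matchmaking":     (200, 60_000),
}

capacity = {name: n * per for name, (n, per) in services.items()}
bottleneck = min(capacity, key=capacity.get)
print(bottleneck, capacity[bottleneck])  # matchmaking 12000000
```

Scaling up then means repeatedly raising the instance count (or redundancy) of whichever service this calculation identifies, and re-measuring.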
(02:29:33) We were able to just call up Amazon and say, we need more. And there was a period of time there where Fortnite was undergoing this exponential growth and we’d find one week we ran out of servers in Brazil during a heavy weekend of play, and next week we had an even heavier weekend of play and there were servers to handle it. Somebody at Amazon had drop shipped millions of dollars of server hardware into Brazil and turned it on just in time for Fortnite to need it. And yeah, there are a lot of unsung heroes in that story, many of whom we have never heard of.
Lex Fridman (02:30:05) Yeah, I mean behind AWS, many unsung heroes. There’s like so much of those folks who run the modern internet, all the incredible services, the games, the services that we take for granted are currently being run on AWS or were originally and Google Cloud and so on. Yeah. Can you speak to how much money Fortnite made? So this is one of the greatest successes in the history of video games also.
Tim Sweeney (02:30:36) Fortnite makes billions of dollars a year, and that's the majority of Epic's revenue, though we have a robust business around Unreal Engine licensing, Rocket League and Fall Guys, and some other tools like the Fab content marketplace. But the majority of it is Fortnite. We've chosen to reinvest heavily in building what we think is the future of technology. We're spending more every year than we're making. For a bit of time we were spending over a billion dollars a year more than we were making, and we found that to be unsustainable. We went through some painful layoffs at that time, and then we stabilized, and now we're spending several hundred million dollars a year more than we're making, which we can very well afford to do because we have billions of dollars in the bank.
(02:31:20) Thanks to a combination of the profits we made when we were a very small company with a very big game, and the investment we've raised. We're not an oil well pumping oil out of the ground where we discovered oil. We are growing to be a future technology powerhouse. We think the 3D space and the future of real-time 3D simulations is going to be one of the major facets of technology for humanity, and we're all in, investing in that.
Lex Fridman (02:31:45) Yeah, it’s exciting to see that investing in a long-term future, sort of taking the risk of doing the research and defining the next chapter of Epic. So using the successes of the day to invest into the successes of tomorrow, that might look very different, completely different. And part of that is investing in the developments, the research and the innovation in the Unreal Engine.
Tim Sweeney (02:32:08) That’s right. We’re a company that can start working on a project knowing that we won’t reach fruition or make any money from it at all for three years, four years, five years. We’re totally okay with that. And that’s the cycle that’s fueled our growth over time. It’s constantly investing in the future and being a serious company that’s doing serious R&D side by side with shipping and maintaining products and earning money from them.
Lex Fridman (02:32:34) So can you speak to, I mean, there are several directions here. One of them is the future evolution of this idea of the metaverse, potentially creating communities. Fortnite is this incredible, huge community of humans interacting, but your vision is to go outside of just one game. So what are the kinds of standards that you're thinking about building, such that people can have an identity and almost travel between games and that kind of thing?
Tim Sweeney (02:33:11) Let me start with the present of gaming and why it sucks.
Lex Fridman (02:33:15) That’s a good start. Sure.
Tim Sweeney (02:33:17) Fortnite is an awesome thing. You go into Fortnite, there are 100 million monthly active users there, a huge number of your own friends are there, you can play with them, go from experience to experience seamlessly without leaving the app. There are 100,000 different islands you can play on, some of them are really awesome, and there are constant new ones coming out and constant things to do. If you want to play Roblox, all right, you quit out of the Fortnite app, you launch the Roblox app: different program, different friend system, different account names. Your username in Fortnite and your username in Roblox are different names, and they're not connected to each other. So you have to remake all of your friends and then find different things to play. And now the controls are different, so you have to relearn how the joystick, mouse, keyboard, controller works in that experience, and you have to go from place to place. And you buy some stuff in Fortnite and it's really cool and you can use it anywhere in Fortnite.
(02:34:06) And then you go into Roblox and you don't have that stuff; you have to buy different stuff, and that stuff only works in Roblox. Same with Call of Duty, it's another isolated place. Same with World of Warcraft, and same with League of Legends. Every place you go is its own unique place: different friends, different account names, different people, and there's no social cohesion between them at all. A long time ago, consoles set out to solve this problem by creating their console-wide friend systems and accounts. So your friend on PlayStation in one game is your friend on PlayStation in another game, but only on PlayStation. If you're on Xbox, you can't see PlayStation friends. And so you have two basically orthogonal and cross-cutting divisions of the world into fiefdoms, which were not created with bad intentions but arose as separate, isolated islands.
(02:34:54) One is the platforms and their social services: Xbox, PlayStation, Nintendo, Steam, Epic if you add it to the list. The other is these different games people play. And because of this weird historical artifact, we're left in a world where people can't seamlessly move from game to game, bringing their friends and their stuff. The solution to this is to federate and connect all of the systems together, so all of the players on all of the different platforms can be recognized by their name with an at sign in it, so your Xbox names and your Fortnite or Epic names and your Steam names can all live together and interoperate in a single space. So unifying the social ecosystems is one thing that needs to happen. The next and bigger challenge is to unify the economies too. Now, I'm not saying a sword you have in World of Warcraft should work in Fortnite.
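The at-sign scheme Sweeney describes can be sketched in a few lines. This is a hypothetical illustration, not Epic's actual system: each username is qualified by its home ecosystem, so identical display names from different platforms coexist without collisions.

```python
# Hypothetical sketch of federated naming: username@platform, so names
# from different ecosystems can live together in one social space.
from typing import NamedTuple

class FederatedId(NamedTuple):
    username: str
    platform: str   # e.g. "epic", "xbox", "steam" (illustrative labels)

    def __str__(self) -> str:
        return f"{self.username}@{self.platform}"

def parse(handle: str) -> FederatedId:
    """Split a handle like 'Name@platform' into its two parts."""
    username, platform = handle.rsplit("@", 1)
    return FederatedId(username, platform)

# The same display name on two platforms stays unambiguous:
a, b = parse("Ninja@epic"), parse("Ninja@xbox")
assert a != b and str(a) == "Ninja@epic"
```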
Lex Fridman (02:35:47) Yeah.
Tim Sweeney (02:35:49) Every game is going to have its own gameplay rules, and a lot of games are going to have stuff that only works in them. But there's a huge set of games that have in common the idea of a cosmetic system that does not affect gameplay outcomes but is purely cool looks and cool appearances. Most of the major multiplayer games have them. If you look at games, you could probably bundle together about 70% of them and say they're similar enough that they could actually interoperate: you could own an outfit in Fortnite, own an outfit in Roblox, and own the same outfit in maybe Call of Duty and maybe 100 or 200 other games, and actually expect they would work together. And you'll find other kinds of items are probably interoperable too. Like Fortnite has car outfits, so you can buy different appearances of a car, and when you find a physical car in the world of Fortnite, if you're the first person to get into it in that session, boom, it takes on your chosen car cosmetic and now you have a cool car that's identifiable as yours.
(02:36:54) We realized early on with Fortnite that the key to making Fortnite work as a creator economy was to open up the revenue from the item shop to all of those sources of engagement. There are two big things happening in Fortnite that make it work as a product and as a business. One is the game modes: Fortnite Battle Royale and all of the user modes and everything else are sources of engagement. People play there because it's super fun, and because they're playing there, they're willing to buy cool stuff to make their character look cooler. So you have all of these sources of engagement, but the sources of engagement don't make money directly. You can't spend money in Fortnite Battle Royale to buy a game item. The gameplay is not pay-to-win; it's all just a game. So we make money from the item shop, and the item shop only exists because of the sources of engagement. If you weren't playing Battle Royale, trust me, nobody would want to buy a Fortnite outfit. If you weren't playing any Fortnite games, why would you buy Fortnite outfits?
(02:37:50) So you have all the revenue in this item shop economy and all of the engagement in this engagement economy, and the thing that magically makes the Fortnite creator economy work is revenue sharing: item shop spending is shared out according to sources of engagement, by engagement. If you buy an item and you've played 40% of your time in Battle Royale and 60% of your time in these user modes, then the portion of the amount you spent that's profit can be separated out and paid out to all the different creators who participate in that economy. That's why Fortnite has scaled up to a $400 million creator economy so far, and it's growing.
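The engagement-weighted split described above is simple arithmetic, and can be sketched directly. The 40/60 split is Sweeney's own example; the 50% profit margin and the island names are illustrative assumptions, not Epic's real numbers.

```python
# Sketch of engagement-weighted revenue sharing: the profit portion of a
# player's item-shop spend is split across the islands they played,
# pro rata by share of playtime.
def creator_payouts(spend, profit_margin, engagement_share):
    profit = spend * profit_margin
    return {island: profit * share for island, share in engagement_share.items()}

# Sweeney's example: 40% of playtime in Battle Royale, 60% in user modes.
# (The 50% margin is an assumed placeholder, not Epic's actual figure.)
payouts = creator_payouts(
    spend=10.00,
    profit_margin=0.50,
    engagement_share={"battle-royale": 0.40, "creator-island": 0.60},
)
print(payouts)  # {'battle-royale': 2.0, 'creator-island': 3.0}
```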
Lex Fridman (02:38:23) It’s amazing.
Tim Sweeney (02:38:25) One of the really critical things we aimed to do in designing that is ensure it’s a creator economy that could scale to other companies and other ecosystems. Right now we have many industry standards bodies: one standardizes game ratings, age ratings of games; another standardizes file formats for the web; others standardize file formats for 3D, like the Khronos Group and the Metaverse Standards Forum. If we had a standards body standardizing what portable outfits in games are, game outfits you could buy in one game that work in another, what their dimensions are and what their capabilities are and what you can do and what you can’t do and so on, then you could have an item economy where every game agrees to respect each other’s item purchases of that sort, and revenue is shared between ecosystems as well.
Lex Fridman (02:39:15) That would be incredible. That would be so amazing. First of all, it seems silly maybe for people who don’t play video games, but an outfit is important, and if an outfit can be persistent across video games... I mean, I don’t know. What’s the purpose of life? Why do we wear clothing? Clothing is a part of our identity. It’s how we present ourselves to the world. I wear this stupid suit and tie. It feels good. It feels good when I put it on. And even the other outfit, I have two outfits, this and then a black t-shirt and jeans, and it feels good to wear that. It feels like me when I look in the mirror. Okay, I know that guy.
(02:39:59) And to be able to have that outfit go from game to game to game, maybe across the years, that would be wonderful. I wonder if you could just even comment, could there also be another standardization about value, sort of for more complicated items? So take a sword from Diablo and transfer it to a gun in Fortnite, but based on the value, some generic concept of money. So the value of a thing in one game versus the value of a thing in another game, where you’re almost operating in a space of value versus the actual items. Or is that already getting too general?
Tim Sweeney (02:40:46) I think this can be done. Yeah. We did a lot of analysis of the Fortnite economy and found that some Fortnite experiences lead to, or correlate with, higher spending than others. And Battle Royale is relatively strong in that area, because you see your character from behind and see all of the other characters from the front, and you have lots of opportunities to really see who you are and to emote and to interact with other players. And a lot of games have that characteristic.
(02:41:22) One funny anomaly stood out. There was this game that was one of the big breakthroughs in Fortnite, Only Up. It’s a game where you’re just climbing up and up by following paths of stacks of objects and things. It was just stupid fun. Everybody loved it, but we found people weren’t spending a lot of money on outfits when they were playing Only Up. And it’s kind of intuitive actually. You’re not seeing other players. If you see anything, you’re seeing their butt as you’re trying to catch up to them jumping from object to object while they’re above you. And so it wasn’t a mode that showed off outfits very much, but you can determine the economic correlation between a game mode and spending.
Lex Fridman (02:42:02) That’s so fascinating. I mean, Fortnite is this gigantic economy where you could do those kinds of studies, you can understand markets, digital markets as they emerge amongst humans and what they value. And from that value, you can probably have a very stable kind of money that emerges.

Game economies

Tim Sweeney (02:42:18) Yeah, I think so. You don’t need an alternate currency system. Unfortunately, a bunch of ideas have been conflated because people are trying to hype up different things. But this idea of large-scale multiplayer social gaming, that notion of the metaverse, there are 600 to 800 million people playing that kind of game every month. So you know that’s real and that’s happening and it’s very much underway. VR has a much smaller audience. I don’t think you need VR to have anything like this. VR is hardware that may or may not enhance the experience for some use cases. For some it will probably be better and for some it will probably be worse, but certainly there isn’t any set of Battle Royale players flocking to VR.
(02:43:02) And the other thing is NFTs, trying to equate digital currency or cryptocurrency to the metaverse. It’s like, well, it’s just a way of denoting money or value exchange. You can do that with money or you can do it with NFTs or whatever, but there’s nothing about this future digital economy that fundamentally requires cryptocurrency or whatever. What you need is interoperability. Interoperability can happen through a blockchain, it can happen through a database, it can happen through standards bodies defining standards and protocols. And we’ve been doing it for hundreds of years, since the railroads were standardized. It’s not something that requires a novel technological solution.

Standardizing the Metaverse

Lex Fridman (02:43:44) Yeah, I mean, even on the topic of cryptocurrency, it’s very frustrating. Blockchain and crypto is a really powerful technology that I think can enable a lot of the things that we’re talking about, but so many people use it to try to make money, to create these bubbles and the hype and the meme coins and so on, and that drifts far away, and rapidly, from things that are actually of value, which is the experience of playing Fortnite and how you look when you play Battle Royale. I mean, it sounds ridiculous to say, but it’s true. That’s valuable. It’s like you have gold in the physical space; we know that holds value. How your outfit looks in Fortnite, that, as you’re saying, provably holds value. And so you want to connect a standard definition of money value to that and not let it become this hype thing, which NFTs, as you mentioned, have just become. It quickly drifts away into the land of people trying to buy and sell and trying to make money, versus staying close to the thing that people actually value.
(02:45:07) Forget the money. It’s more about exchanging valuable experiences or things of value. So you can play Fortnite and then go to another video game and continue the valuable experience and then come back to Fortnite and do that kind of thing. So you’re saying there might be a way to do that to basically create standards the way the web has different standards for displaying websites and all this kind of stuff, or the communication that’s required on the networking side. So all the different standards that make the web work, there need to be those kinds of standards. What would those standards look like to enable the metaverse?
Tim Sweeney (02:45:52) We need a lot of different things. The one area where the standards bodies have been very successful in creating working standards implemented by all of the major engines today is in low level file formats for data interchange. The web has PNG files for 2D images and MP3 files for audio, and 3D has the Pixar USD file format, the Universal Scene Description, which is a description of the scene graph, the entire set of objects in the scene and all of their parameters so that any engine that supports those features could import that and then render the same scene as the engine they came from. Large parts of this work across Unreal Engine and Unity and Blender and all of these 3D packages of different sorts.
(02:46:35) And there’s the glTF file format, which stores textures and geometry and other low-level data for 3D objects. When you see a Fortnite character, that file format together with the image file formats can store their static appearance, the shape of their body, even their animations and their different poses, and the other standard file formats could store all the sounds they make in their emotes. But we’re still missing a bunch of pieces. The biggest missing piece is the programming language that’s at the center of standardizing the metaverse.
(02:47:09) Now, if you look at the web, the web is a combination of a bunch of different technologies. The two biggest ones are HTML, which describes the 2D scene graph or the 2D layout of controls and objects on the webpage. But that’s just static data. It’s just a non-moving, non-animating webpage. And then you have the JavaScript programming language, which is used to manipulate that, to display things to the user and to implement anything you could implement in code. So it’s a little programming language that runs in your web browser and the metaverse needs something that performs that similar role.
(02:47:44) But the metaverse, and 3D gaming in general, needs something that’s rather more powerful, more safe, more scalable, and more capable than JavaScript, because the metaverse is actually a more difficult technical problem than a webpage. A webpage, like an app, is just a single bundle of code and content that somebody, a company, has prepared, and they release it and it stays exactly what it is until they release a new version, and it’s upgraded from version to version as it goes. But the metaverse needs to be a composite of code and content built by millions of different people that could potentially form a seamless world together.
Lex Fridman (02:48:25) Yes, it’s fully distributed, collaborative. First of all, also the amount of data, I mean, it doesn’t have to be that way, but websites are showing very little information. The metaverse even when it looks like something like Fortnite, just the amount of information that’s conveyed in the scene graph as the individual players are collaborating is a huge, huge, huge amount.
Tim Sweeney (02:48:54) Yeah, the highest-detail Fortnite updates amount to about 60 gigabytes of data, and that’s just a small part of what exists in the Fortnite Creative economy. And if you look at what this might be in a decade as standards emerge, you might have exabytes of data out there. Fortnite Battle Royale is not, I think, the ultimate manifestation of gameplay that will ever be invented. What we’ve seen time and time again is that as we gain more technical capabilities, graphics gets more capable, CPUs become more performant, web services become ever more scalable, new genres of games emerge that weren’t possible before. Doom ushered in the era of deathmatch; it was the first time a 3D multiplayer game was even possible at all. The early Battle Royale games, starting about 10 or 15 years ago, only became possible back then. You couldn’t have built one 20 years ago because you just couldn’t have rendered an environment that large, with that many players, with that level of interaction and performance. It was just not possible to run it. So you got a certain level of technical…
(02:50:03) … Not possible to run it. So you got a certain level of technical capabilities and the genre came out that proved to be by far the best shooter genre ever invented. But I think there are numerous, numerous more genres, some of which are better than any of the existing ones that will be invented as we get more and more capabilities. Some of the capabilities we’re lacking now are the ability to build environments and game simulations that span more work than a single company can possibly create. And you see the birth of that idea in Fortnite and Roblox where there are tens of thousands of creators, each building content, and users are playing meaningful amounts of it all. And so there’s an ecosystem that’s scaled larger than a company, but it’s still very much you go into one island and you play that creator’s work.
(02:50:42) The other direction of scalability is putting more and more of people’s work together in a seamless, continuous play space for games where that makes sense. You can imagine a game taking place in an environment that is the size of a continent or the Earth, in which you can go from place to place and see different areas which are maintained by different people. And so you go into different spaces, the game rules are customized accordingly, and you can go from experience to experience. And instead of having just one company’s authorship ever-present wherever you are, you’d be driving a car built by one person, carrying weapons built by 20 other people, in a simulation and an environment built by thousands of other people working for separate companies, or as their own entrepreneurs, or indies, or enthusiasts, all working together simultaneously.
(02:51:37) And we totally lack the programming foundations for that. The kinds of code you would need to write now to make that happen are just not practical. And so we’re investing massively in building new programming language technologies around Verse and our proposed standards for future metaverse programming that we hope will solve those kinds of problems and make that kind of world possible.

Verse programming language

Lex Fridman (02:52:00) So first of all, that’s a super exciting future where it’s not hundreds or thousands, it’s millions of creators that can just create different, small, or big elements of a world as big as earth. Just if you sort of close your eyes and imagine that world, that’s really exciting, where it’s not a centralized company controlling the release of a particular island or so on, it’s people constantly dynamically modifying all the islands of reality in this digital world.
(02:52:37) So if you could speak to some of the technologies that can enable that. You mentioned the Verse programming language. First of all, how legit is it for you, the CEO of Epic Games, to be a co-author? The programming language theorists are losing their minds. A co-author on a paper that’s describing some of the nuanced details of a programming language. So maybe you could speak to this programming language called Verse. It’s a functional logic language. What are some cool features of Verse?
Tim Sweeney (02:53:13) Verse is a programming language that we’re building for large-scale simulation programming. It’s designed to make it easy to write code that can scale up to not only you building a Fortnite island, but you building modules or components that can be used by millions of other programmers and co-exist in a huge environment, and also can scale up to a huge-scale simulation. Some games will be small. Battle Royale might find that a hundred players is actually optimal. It might be that a thousand-player version of Battle Royale would be worse, but I bet there are thousand-player, million-player, and tens-of-millions-player experiences that are even better than that, that are yet to be discovered.
Lex Fridman (02:53:57) Wait a minute, tens of millions of players together?
Tim Sweeney (02:54:03) Sure, we’ve had Fortnite events that have attracted 15 million concurrent users, but the fact that they’re all divided up into servers with a hundred players each for those events isn’t really a positive. It’s just a limitation of the technology.
(02:54:18) Tracing back to Unreal Engine 1 and its single threading decisions, if we could build a concert where all the concert participants, potentially tens of millions of them, could participate together simultaneously and see that there’s that massive a crowd, and they could all do interesting things and interact with each other, that would be way cooler.
Lex Fridman (02:54:36) Sorry, I’m just loading it in, just imagining together in one scene graph, 10 million people interacting. What a cool world that is.
Tim Sweeney (02:54:49) Sure. Well, 10 million people, you have less than 10 million pixels on your screen, so what does the Nyquist Sampling Theorem say? It says that you don’t need full overhead for every player. You need to render the players who are around you and some approximation of everything else.
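One common way to realize "full detail for the players around you, approximation for everyone else" is interest management. This is a hypothetical sketch, not how Fortnite actually does it; the radius and coordinates are made up, and real systems use spatial indexes rather than a linear scan:

```python
# Hypothetical interest-management sketch: simulate/render nearby players
# at full detail, and represent the distant crowd as a cheap aggregate.
import math

def bucket_players(me, others, near_radius=50.0):
    """Split other players into a fully-simulated 'near' set and a 'far'
    set that can be drawn as an approximated crowd."""
    near, far = [], []
    for p in others:
        (near if math.dist(me, p) <= near_radius else far).append(p)
    return near, far

me = (0.0, 0.0)
others = [(10.0, 0.0), (300.0, 400.0)]   # one nearby player, one distant
near, far = bucket_players(me, others)
```

At ten-million-player scale the same idea applies recursively: only the `near` set needs per-player network updates, while the `far` set can be summarized (density, silhouettes) at far lower cost.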
Lex Fridman (02:55:01) Yeah, but there’s also a networking component. You’re speaking to the rendering, but oh boy.
Tim Sweeney (02:55:10) There’s a lot of work that has to happen there, but this is what we do for a living. We solve hard problems.
Lex Fridman (02:55:12) I understand.
Tim Sweeney (02:55:13) Because if they’re easy, then other people could have solved them already.
Lex Fridman (02:55:16) That’s really cool though. Just sort of the possibility, the vision of that is really cool.
(02:55:21) Even a hundred thousand people or like 10,000 together just, I mean, there’s a reason in the physical world when you go to a concert and you have all those people around you, that energy, or you go to a football game, that energy is unlike anything else. And if you can bring that energy to the digital world, that’s amazing.
(02:55:43) But anyway, sorry, what, on the technology side of bringing that to life on the programming language side, can you continue, as I rudely interrupt you, talking about Verse?
Tim Sweeney (02:55:55) Verse is a functional logic language because we think that that’s the way to make the most simple and powerful language simultaneously. Back in the 1970s, the programming language designer who built Pascal, one of the early programming languages, Niklaus Wirth, or Nicholas Wirth as Americans might call him, stated the principle that a programming language should achieve a high degree of power not by having a lot of features, but by having a small number of features that work together and can be composed arbitrarily, so that you have to learn a relatively small set of things, and then the real knowledge comes as you learn ways to combine them to achieve bigger and bigger programs.
(02:56:41) And so there’s a long history to the field of programming languages. In the 1950s, the first programming language designers got together and built the first standardized language, called ALGOL. There was this meeting in 1958; very few people even know about it, but it’s where all the major foundations of modern programming languages were decided on, foundations that the C family of languages inherited. And so we’re very much living in a world that was defined by them. And thankfully they got a whole lot of things right. They defined how functions should work, how variables should work, and how recursion should work. Thank God they got those things right, but they got a few things wrong. And Verse is trying to fix those. And that’s the functional logic part of it.
(02:57:22) The interesting thing about functional logic languages is that in an old school language, an expression produces a value. In a functional logic language, an expression can produce zero, one, or multiple values. And if it produces zero values, we might say it fails, and if it produces one value, we say it succeeds. And if it produces multiple values, it’s providing a set of values you could iterate over.
(02:57:45) And so there are a bunch of features in today’s programming languages that were defined in an ad hoc way without really thinking this through, this zero, one, or many values way. And that’s the problem that functional logic languages address. And the most basic example is an if statement in a programming language. If some condition holds, then do this thing, otherwise do that thing. And in a language today, this is done with variables of type boolean or expressions that produce booleans. We have boolean variables that are either true or false. We have expressions that evaluate to booleans. And so you can express a condition as a bunch of these features together, but you’ve lost any computation you’ve done in doing that boolean expression evaluation.
(02:58:28) So in a functional logic language, your condition wouldn’t do that. It would either succeed and produce a value or it would fail. If it succeeds, it goes to the then branch. Your operation succeeded, and now you’re running this one batch of code. And if your expression failed, then you go to the else branch. But the exciting thing is that your expression that succeeds or fails can produce values and bind variables that are then accessed by the then branch. So you can write a conditional where you can only get to the inside of the condition, to the then, if a bunch of variables have successfully been bound to values. So it lets you test whether some conditions hold and then use the results of those tests. And that gives you a much higher level of reliability.
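Verse's actual syntax isn't quoted in the conversation, so here is only a rough Python analogue of the pattern: a "failable" expression that, on success, binds a value the then-branch can use. Failure is emulated with None, and the binding with the walrus operator; `find_player` and the player records are invented for illustration:

```python
# Python analogue of a failable condition that binds a value on success.
def find_player(players, name):
    """'Failable' lookup: succeeds with the player record, or fails with None."""
    return next((p for p in players if p["name"] == name), None)

players = [{"name": "jonesy", "score": 42}]

if (p := find_player(players, "jonesy")) is not None:
    result = p["score"]   # the then-branch uses the value bound by the condition
else:
    result = 0            # the condition failed, so no value was bound
```

In a functional logic language this is primitive: the condition itself fails or succeeds with bindings, so there is no separate None check to forget.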
(02:59:11) And then a for loop in a traditional language is just a bunch of imperative code that’s woven together to produce a bunch of values iteratively. It’s rather awkward to do complicated things in for loops, and so you often end up with ever more complicated constructs built to work around that, like iterators and other things. The idea of functional logic languages is that your for loop can just produce multiple values. If it produces zero values, you get zero iterations, and if it produces a bunch of values, you go through all of those as your iterations.
(02:59:41) Rather than having a bunch of nested loops, you can write arbitrary things that look like SQL queries in a condition or in a for loop: bind a bunch of variables, do a bunch of tests, produce a series of results in some order that you’re iterating over, and then you can handle all of them and produce a result. So you gain the power of SQL queries, large complex queries over data structures, in a language that is much simpler, in which your code is just performing simple iterative operations. And so it gives you the best of databases and of regular programming in a much more uniform way. And the power of this is that users can write functions that not only produce a value; you can write functions that might fail. And so you can write a function that answers a question, and the answer can be either “yes, and my value is this,” or “no.” And you can combine these together into arbitrary queries.
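The query-like reading of a for loop can be sketched in Python, where a comprehension likewise binds variables, filters, and yields zero, one, or many results. The player data below is invented purely for illustration:

```python
# A query-like loop: bind p, test a condition, produce an ordered series
# of results -- the "zero, one, or many values" reading of `for`.
players = [
    {"name": "jonesy",  "team": "red",  "score": 42},
    {"name": "ramirez", "team": "blue", "score": 17},
    {"name": "peely",   "team": "red",  "score": 99},
]

# Roughly: SELECT name FROM players WHERE team = 'red' ORDER BY score DESC
red_names = [p["name"]
             for p in sorted(players, key=lambda p: -p["score"])
             if p["team"] == "red"]
# red_names == ["peely", "jonesy"]
```

If no player is on the red team, the comprehension simply produces zero values and downstream code iterates zero times, with no special empty-result handling.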
(03:00:35) And I feel like the funny thing is that this is not how C++ works. And so when we have Epic programmers moving over from C++ and writing their first Verse code, they try to write C++-style code in Verse, and it actually ends up being convoluted code that’s worse than good C++ or good Verse. But after a few months they get up to speed and they’re writing really awesome code that’s tighter and more compact than before. And with users who’ve never programmed before but are learning programming for the first time in the context of Fortnite, it’s really fascinating. You see these users learning this, and it becomes their intuition. They just assume programming works this way. And they’re writing way more advanced and interesting for loops and conditions than we’re often writing internally, because they’ve grokked the core concepts.
Lex Fridman (03:01:21) Yeah, you said a lot of really interesting stuff. First of all, it’s very interesting that there’s a bunch of people, a lot of people learning programming for the first time with Verse, which is a very different way to look at programming. And in some deep sense, as you’re saying, a very intuitive way to learn programming.
(03:01:41) But there’s a lot of properties about this being a logical language, one of which, we’ll maybe speak also about confluence, but also correctness. So being able to prove the correctness of a code, it’s basically easier to write bug free code.
(03:02:06) Can you just speak to that and the importance of that when you’re building the metaverse?
Tim Sweeney (03:02:12) Yeah, right. So the challenge with the metaverse is, first of all, that it’s a huge base of code that’s evolving over time and written by many authors. So you might see a new module updated somewhere every second. And you expect, in this live, ever-running simulation that never shuts down, everything to upgrade live in place. And so one critical component is the ability to release an update to something you’ve already published and be sure that it’s backwards compatible with the one that you’ve already released. And that’s essentially a type checking problem: checking that your new interface is backwards compatible with your old one. And that comes down to the type system of the language. There’s been a lot of very interesting research on type systems over the years, most of which hasn’t ever made it into the C++ programming language, unfortunately.
(03:03:01) But you see several branches of that whole field. One of the really interesting things that Java and C# did in the early days, and then later abandoned and didn’t bother to update, was defining a very rigorous set of rules for: if you publish a module with one set of types today, then what changes can you make to that module in future updates that don’t break backwards compatibility? And that’s a problem for type checking. Like, say you have a function that promises to return some integer. Well, in the future you could say it returns some natural number, because every natural number is an integer. So that’s a backwards compatible change. But you can’t say it returns a rational number, because some rational numbers are not integers. So the system ought to reject that kind of change.
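The integer/natural/rational example can be modeled as a toy compatibility check. This sketch assumes we represent a return type as the set of kinds of values it may produce; then narrowing the return type (covariance) is compatible and widening is not. The set contents are purely illustrative:

```python
# Toy model of the rule: an update may *narrow* a function's return type
# but not widen it, because callers rely on the old promise.
integers  = {"negative", "zero", "positive"}
naturals  = {"zero", "positive"}                            # naturals ⊆ integers
rationals = {"negative", "zero", "positive", "fractional"}  # not ⊆ integers

def return_type_compatible(old, new):
    """Backwards compatible iff every value the new version can return
    was already a legal return value of the old version."""
    return new <= old   # set-subset check stands in for subtyping

ok  = return_type_compatible(integers, naturals)    # narrowing: allowed
bad = return_type_compatible(integers, rationals)   # widening: rejected
```

A real checker does this over interfaces, not value sets, and applies the dual rule (widening allowed) to parameter types, since those are contravariant.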
(03:03:43) But the much, much, much more interesting thing about type checking was a realization actually made in the 1930s: if you design a programming language type system in a very particular way, then it becomes useful for more than expressing types of variables. The traditional thing every type system does is say, “Variable X is of type integer.” But if you design a type system in a certain way, then your types can express theorems, like mathematical theorems. The Pythagorean theorem is a cool one, but one theorem you might have in a program is the theorem that this function takes an array of integers and returns an array of the same integers, but the result is sorted. If you express that as a theorem and you follow this system of type theory, then you can actually require anybody who writes that sorting function to prove that it has actually sorted its result. And so you have types as theorems, and values constructed a certain way can be proofs of those theorems.
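In a dependently typed language (Lean, Coq, Agda) that obligation is discharged at compile time. As a loose runtime analogue in Python, one can define a wrapper type whose only constructor checks sortedness, so that merely possessing a value of that type is evidence the result is sorted. The class and function names here are invented for illustration:

```python
# Runtime analogue of "the type proves the result is sorted". The Sorted
# constructor is the only way to obtain a Sorted value, so holding one is
# evidence of sortedness; a proof assistant moves this check to compile time.
class Sorted:
    def __init__(self, xs):
        assert all(a <= b for a, b in zip(xs, xs[1:])), "not sorted"
        self.items = list(xs)

def sort(xs) -> "Sorted":
    # Constructing Sorted here is the analogue of discharging the proof
    # obligation: a buggy sort would fail the constructor's check.
    return Sorted(sorted(xs))

result = sort([3, 1, 2])
```

The gap between this and real type theory is that the check runs per call at runtime; the compile-time version proves it once, for all inputs.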
(03:04:46) And nowadays in the mathematical literature, you see more and more theorems being proven mechanically. Mathematicians are proving theorems in a way that is verified by computer to be a correct proof. In the old days of math, people would just write language. If you look at all of Euclid’s theorems, it was just language, just writing in ancient Greek laying out the steps of the proof to convince the reader that the thing is true.
(03:05:08) Starting in the 1930s, mathematicians moved towards rigorous formal proofs in which there is a series of steps that can be mechanically verified, that they’re proving things. And when mathematicians say they’ve done a computer proof of a theorem, what they really mean is they’ve written a program in a proof language, like Lean is a theorem prover, Coq is a theorem prover, and there are several others. It means they’ve written a mechanical proof in that language that a computer has checked so that it’s impossible to lie. If you say that you’ve proven a thing and the computer verifies it, then it’s definitely true.
(03:05:44) And this is a feature of mathematical proof languages, but it’s also an idea that’s making its way into programming languages gradually over time. And our aim for Verse is to be the first mainstream programming language that fully adopts that approach and that technique. And not only adopts it but adopts it in a way that’s really user-friendly, so you don’t have to do that. And the idea of this is that you want gradually more information to be incorporated in the types of variables. The property you want of a programming language is that if your compiler accepts your program, and doesn’t beep and tell you there is an error, then your program should work. Now there are all kinds of ways humans can make mistakes there so that we’ll never achieve that ideal, but we can get closer and closer to it by having more and more language features that enable the compiler to catch more human coding errors and tell the user what went wrong.
(03:06:37) And that becomes extremely important in the metaverse. The cost of fixing a bug that’s made it through to runtime and is in users’ hands, in a shipping program, is hundreds of times higher than fixing a bug that you’ve just observed as you’re running your code yourself. When it’s running on your computer, you just fix a line of code and your bug’s fixed. When you have to fix it live, you have to release a patch, you have to release patch notes, you have to test the patch, you have to check for all the other bugs that might have been introduced, and everything becomes vastly, vastly more expensive. So the real aim of the Verse programming approach is to catch all of these errors at compile time and make the metaverse a very reliable place.
Lex Fridman (03:07:18) Do you see a world where like at compile time you could prove that the program is correct in some sense of correctness?
Tim Sweeney (03:07:25) Proving things becomes combinatorially harder as they get larger, right?
Lex Fridman (03:07:29) Right.
Tim Sweeney (03:07:29) And so the really important thing about this whole field is that you should be able to adopt these capabilities gradually and apply them where you really need them. Like, if you’re writing something like a cryptography algorithm, that’s a good place to prove stuff. If you’re writing a data decompressor that’s going to be used by an entire ecosystem, proving that it doesn’t overrun memory is actually really important. And a lot of the security vulnerabilities that happen today are errors that a compiler could have caught in a different language, but were not caught in C because it just doesn’t have this feature.
(03:08:04) But we shouldn’t see this as scary. Everybody working in a typed language like C, or C#, or Java is proving theorems all the time. If you have a variable of type integer and you assign some value to it, you’ve proven to the compiler that that value was an integer, because otherwise it would have rejected it. And so as we add more and more advanced proofs, we’ll get compositional properties falling out of our systems, so that they’re easy to use and people prefer to use them.
(03:08:32) If we think to the future where we have AI helping us write certain kinds of code, the big problem with AI is that you ask it to do something, ask it to write a fragment of code that does something, and it might give you a perfectly valid fragment of code that compiles but does the wrong thing. And if we had languages where you could say, “Write a function that sorts this array and prove that it did that,” it could actually write the proof. And if the compiler didn’t beep at it, you could trust that it was actually sorting the array. Otherwise, you could go back to the AI and say, “Well, that didn’t work.” But getting to the point where we know that our programs do what we say they’re going to do, or think they’re going to do, is a very important thing.
Lex Fridman (03:09:12) And by the way, I should mention that you sent me a note about Curry-Howard correspondence, which I went down a rabbit hole, and that’s a whole fascinating field which shows the mathematical relationship between programs and proofs.
Tim Sweeney (03:09:25) That’s right. This is a result from the 1930s. It’s one of the most important results in computer science that almost nobody knows about. They did this rigorous breakdown of type systems and the 1930s formulation of programming, and established that everything you can prove in mathematical logic you can prove within a type system, if it has certain features.
(03:09:51) And if you break down what a proof is: well, a proof that integers exist is some integer. Like, five is a proof that integers exist. So when you have something like var X int, and you say “X = 5”, well, you’re proving to the compiler that 5 is an integer. That comes as second nature, but you can prove more advanced things. If you want to prove that a pair of things is true, like theorem A is true and theorem B is true, then you need to provide a pair of values, one that proves theorem A and one that proves theorem B, and that’s the conjunctive law of proofs. And there’s a disjunctive law too.
(03:10:23) And then there’s an implication law for proofs. And it turns out that that’s really satisfied by functions. When you write a function in programming language, you’re saying, “If you give me this thing, I will give you that thing.” If you’ve given me a parameter of type something, then I’ll give you a result of some other type. And if you write that, by writing that function, you’re proving that given one of these things, you can produce another thing. And that’s a proof of an implication. With only like seven laws, you can construct all of mathematical logic in a type system.
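The laws Tim lists can be sketched with Python type annotations, used purely illustratively here, since Python doesn’t check proofs the way a Curry-Howard-based type system does (all names below are invented for the illustration):

```python
from typing import Callable, Tuple, TypeVar

A = TypeVar("A")
B = TypeVar("B")

# "5 is a proof that integers exist": annotating x as int hands the
# checker a witness of the proposition int.
x: int = 5

# Conjunction: a value of type Tuple[A, B] is a proof of "A and B" --
# you must supply a witness for each half.
def and_intro(a: A, b: B) -> Tuple[A, B]:
    return (a, b)

# Implication: a function of type Callable[[A], B] is a proof of
# "A implies B" -- given evidence of A, it produces evidence of B.
def apply_implication(f: Callable[[A], B], a: A) -> B:
    return f(a)

# A derived theorem: "A and B" implies "B and A".
def and_commutes(p: Tuple[A, B]) -> Tuple[B, A]:
    return (p[1], p[0])
```

In a language where the type checker is sound in this sense, writing a well-typed term of a type *is* proving the corresponding proposition; here the annotations only gesture at that correspondence.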
(03:10:54) And one of the important things for programming languages that hasn’t been given enough attention is some aspects of programming languages are just subjective. They’re just machinations of the programming language designer. And Guido van Rossum decided that Python should support indentation in a certain way. And as long as you’re dealing with things like human notation and naming of things, there’s always going to be that subjective layer.
(03:11:18) But there are other parts of programming languages that are not subjective but should be fundamental. And when you look at type systems, there is a way to do type systems that gives you mathematical proofs. And every other way of type systems that doesn’t give you mathematical proofs is just worse and should ultimately be rejected.
(03:11:38) And so I think one of the jobs of computing is to identify what we’ve actually done right in the past and what we’ve done wrong. And for everything we’ve done wrong, actually going back and fixing it. Otherwise, we just keep accumulating so much cruft that our systems eventually are crushed under their own complexity.
(03:11:54) And there have been massive announcements of horrible vulnerabilities in software and services over the past year. It turns out some nation-state backdoored a bunch of telcos’ surveillance systems for wiretaps, a huge problem there. But ultimately when you break it down, it’s probably because of some buffer overrun in some C program. These decisions about programming languages have long-term implications.
Lex Fridman (03:12:21) It’s really fascinating that in building these systems that hundreds of millions of people use, you’re rethinking about how do you actually build it from first principles.
(03:12:29) So I should mention that Verse’s primary design goals are: it should be simple enough to learn as a first-time programmer; general enough for writing any kind of code and data; productive in the context of building, iterating, and shipping a project in a team setting; statically verified to catch as many categories of runtime problems as possible at compile time, as we were talking about; performant for real-time open-world multiplayer games. We didn’t really quite talk about performance. Maybe I could ask you about that in a second. Complete, so that every feature of the language supports programmer abstraction over that feature; timeless, built for the needs of today and for foreseeable future needs.
(03:13:08) And then there’s some design goals that we talked about: it’s strongly typed; multi-paradigm, to use the best of functional programming, object-oriented programming, and imperative programming; as deterministic as possible, so if you run it over and over, it runs in the exact same way; failable expressions, as you talked about. It’s super fascinating, there’s so many cool features in this. Speculative execution, concurrency.

Concurrency

(03:13:33) Maybe can you talk about concurrency? What is it about Verse that allows for concurrency at the scale that you need?
Tim Sweeney (03:13:41) This is the one biggest technical problem that we’re working to solve in this generation, and that is taming concurrency so that any ordinary programmer can achieve it by just writing ordinary code.
Lex Fridman (03:13:57) It’s hard, yeah.
Tim Sweeney (03:13:59) Programming on a single-threaded computer is hard enough, but it is completely predictable. If you have a language that’s deterministic and you run the same code over and over, it’s always going to do exactly the same thing and there’s no unpredictability about what might happen, right? You’re reading and writing variables in some order and you’re always going to see it behave the same.
(03:14:19) The problem when you introduce multiple threads or multiple nodes in a data center, all working together on a single problem, is that they each want to read and write different pieces of data, and change the state of the world as they go. And still, almost all concurrency in real-world programs today is achieved manually. Programmers are writing this code that might run in multiple threads very, very carefully so that they are negotiating among each thread to get access to data in a way that’s going to give them predictable results.
(03:14:53) And it’s incredibly hard. It’s so hard that in five generations of Unreal Engine, every single generation we’ve decided we’re not going to try to scale up all of our gameplay code to multiple threads manually. It’s just much, much, much too likely to go wrong, not only for ourselves, but for every partner company who licenses Unreal Engine and tries to use it for building a game. It’s just a massive foot gun.
(03:15:17) There’s a variety of solutions to concurrency that are all rather suboptimal. One attempted solution was, just don’t try to solve this problem at all. Let’s break our program down into microservices. And almost all online websites of massive scale, like amazon.com, work with hundreds of microservices where different servers negotiate with each other by sending messages to each other. And by programmers writing those things very carefully, they eventually get to the point of being able to reliably take your orders and not make a mess of them.
(03:15:48) But this is totally not scalable to the metaverse where you have millions of programmers who are mostly not going to be computer scientists. They’re mostly going to be hobbyists, and enthusiasts, and first-time programmers doing stuff for fun. That’s never going to work for them because they’ll never be able to envision all of the different dependencies between different computations they’re running in parallel.
(03:16:07) But it turns out that there was amazing foundational work done in the 1980s that was made very real by a paper on Haskell concurrency. “Composable Memory Transactions” is the name of the paper. And it describes a system for transactional updates to programs. And the idea is that a transaction is a block of code that does a bunch of operations on memory: it might read, it might write, it might process an order, it might accept an order or reject an order. It might transfer money between one bank account and another. It might make conditional decisions like, “Oh, you asked to transfer a hundred dollars from your account to this guy’s account. We’re going to see if you have a hundred dollars. If you don’t, we’re going to reject it. And if you have a hundred dollars, we’re going to take a hundred dollars out of your account and add it to this other guy’s account.”
(03:16:56) Without transactions, if everybody’s just randomly adding and subtracting each other’s bank balances, then you might have somebody read a bank balance, subtract a hundred dollars, and write it out. But in the meantime, somebody else has written something different. So you might get inconsistent bank balances arising if you don’t have a way of ensuring that these all run in a specific order.
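The lost-update hazard Tim describes can be shown deterministically in a few lines, simulating the bad interleaving by hand rather than with real threads:

```python
# Two withdrawals of $100 run "concurrently": both read the balance
# before either writes its result back.
balance = {"alice": 300}

read_1 = balance["alice"]          # first withdrawal reads 300
read_2 = balance["alice"]          # second withdrawal also reads 300

balance["alice"] = read_1 - 100    # first writes back 200
balance["alice"] = read_2 - 100    # second overwrites with 200, not 100

# Two $100 withdrawals happened, but the balance only dropped by $100.
assert balance["alice"] == 200
```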
(03:17:16) So the idea of transactions is, it’s a way of dividing an entire program into self-contained updates that do an arbitrary amount of computation but must run in a single-threaded manner. And in the case of a game engine, that’s a gameplay object update. When you’re playing Fortnite, everything you see is a gameplay object. Every other player is a gameplay object. Every enemy is a gameplay object. Every rocket, and projectile, and car, and thing you see moving around and interacting that’s not just a fixed static part of the world is a separate game object. And each of those objects is updated at a rate of one update per frame at 60 frames per second.
(03:17:52) And so then in the course of Fortnite Battle Royale gameplay, you have tens of thousands of object updates happening every frame with a hundred players. In a simulation with billions of players, you’d have a whole lot more than that.
Lex Fridman (03:18:04) So right now that’s done single-threaded?
Tim Sweeney (03:18:05) Yeah, that’s done single-threadedly in each game session. This is why Fortnite has a hundred-player limitation. If you absolutely maxed out a server, maybe today you could get it up to 140 or something, but it’s not going to thousands, or millions, or billions.
(03:18:18) And so what we need is a technique for magically, automatically scaling our code to that. And transactions are the idea. The idea is a transaction is a granule of code that runs in its entirety. And so the idea of this transactional memory concept is that we’re going to have programmers write completely ordinary code that reads and writes variables in the completely ordinary way, and they’re not going to have to worry about concurrency at all. Today, a computer just runs your program; there’s no amount of speculation going on at the programming language level.
(03:18:50) The idea of transactions is, since we have a bunch of operations we need to apply, we apply a large set of them concurrently. But instead of each one reading and writing from global memory shared by all, in which case they might be reading, and writing, and contending with each other for the same data, and might be doing contradictory things to it, we’re going to track all of our writes locally. We’re not going to write changes out to global memory. We’re going to keep track of them in a buffer that’s just for that one transaction. So it’s going to look to that code exactly as if it’s running on the global system affecting global game state, but it’s going to be isolated to just that one transaction. And it’s going to be set aside and buffered up for consideration later. We’re going to run tens, or hundreds, or thousands of these updates concurrently. We’re going to see which ones had read-write conflicts. Because if two transactions don’t read and write any of the same data, then you could have run them in either order or simultaneously, and it wouldn’t have changed the end result.
Lex Fridman (03:19:51) Yeah, the order doesn’t matter. This is so fascinating. To imagine this kind of system arbitrarily concurrent running millions of updates in parallel of gameplay objects, that’s the thing that enables the thing that we’re talking about, which is tens of millions of people together in one scene.
Tim Sweeney (03:20:13) Yeah, exactly. And the key is that you’re running these updates speculatively, and you’re not committing their changes to memory until you’re sure that they’re free of conflicts. So you might update 10,000 objects, you might find 9,000 of them were conflict-free. So you apply those 9,000 updates to memory, and they could have run in any order and it wouldn’t have changed the result.
Lex Fridman (03:20:33) That’s so cool.
Tim Sweeney (03:20:34) Now there’s a thousand objects left over. Now you have to run those again, try them, maybe interleave in a different way to get them to eventually commit to memory. And in the meantime, you just throw all of their computations away and redo them later.
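The speculative scheme Tim outlines, buffering writes per transaction, validating read sets, committing the conflict-free updates, and retrying the rest, can be sketched roughly like this (all names are invented for illustration; real software transactional memory implementations are far more sophisticated):

```python
class Tx:
    """One speculative transaction: logs reads, buffers writes locally."""
    def __init__(self, store):
        self.store = store
        self.reads = {}    # key -> value seen when first read (for validation)
        self.writes = {}   # key -> locally buffered new value

    def read(self, key):
        if key in self.writes:            # see our own buffered writes
            return self.writes[key]
        if key not in self.reads:         # log the first read for validation
            self.reads[key] = self.store[key]
        return self.reads[key]

    def write(self, key, value):          # buffer locally; don't touch the store
        self.writes[key] = value

def run_all(store, transactions):
    """Run every transaction speculatively; commit the conflict-free ones,
    discard and retry the rest until all have committed."""
    pending = list(transactions)
    while pending:
        speculated = []
        for fn in pending:                # speculative phase
            tx = Tx(store)
            fn(tx)
            speculated.append((fn, tx))
        pending = []
        for fn, tx in speculated:         # commit phase, one at a time
            # Conflict check: did anything we read change under us?
            if all(store[k] == v for k, v in tx.reads.items()):
                store.update(tx.writes)   # apply buffered writes
            else:
                pending.append(fn)        # throw the work away, retry later

store = {"alice": 300, "bob": 0}

def transfer(src, dst, amount):
    def tx_fn(tx):
        if tx.read(src) >= amount:        # conditional check inside the tx
            tx.write(src, tx.read(src) - amount)
            tx.write(dst, tx.read(dst) + amount)
    return tx_fn

# Two concurrent $100 transfers: one commits, the other conflicts,
# is rerun against the new state, and then commits cleanly.
run_all(store, [transfer("alice", "bob", 100),
                transfer("alice", "bob", 100)])
assert store == {"alice": 100, "bob": 200}
```

The key property, as in the conversation, is that the transaction bodies are ordinary-looking code; the read logging, write buffering, conflict detection, and retry all live in the runtime, not in the gameplay code.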
(03:20:47) And by doing this, we’re removing this from being a programming problem for the programmer to deal with, to being a language problem for us language designers to deal with. We’re moving a vast amount of pain that would be imposed on a million people to a vast amount of pain imposed on a small number of people who have to actually make this work.
Lex Fridman (03:21:07) That’s amazing. That’s really incredible.

Unreal Engine 6

(03:21:09) So what’s the state of things with Verse? And I guess what you’re outlining, if it’s successful, and hopefully it is, will be a big part of Unreal Engine 6. So what’s the timeline? Where do we stand today?
Tim Sweeney (03:21:23) Well, there’s a lot going on in parallel. The key thing with Verse is that we have been specifying what we think is the ultimate version of the language with all of the features we want, whereas we’ve been shipping more modest versions of language over time, and we’ve released dozens of updates to it over the past year and a half.
(03:21:44) And the idea is that the shipping version gains more and more features over time, each update maintaining backwards compatibility with old versions, and each continuing to improve and approach the ultimate version of it as we go. And we’ve been doing this experiment entirely within the world of Unreal Editor for Fortnite for now. We want to test this and iterate with Fortnite creators in just the metaverse use case before we make it available to all of our partners using Unreal Engine for all of their projects. And the idea is to iteratively improve it and build it out. Because right now UEFN has relatively few features for programming. It needs a lot more. And everything we add makes the world a much better place for Fortnite creators.
(03:22:23) And we’re adding major, major new APIs every few months throughout the course of this year. Whereas Unreal Engine licensees who are building standalone games already have access to the full engine through C++. They have massive, massive expectations of an API. And so we can’t release this to them until we’ve built up all of the essential features that they’ll need for building their gameplay in the future.
(03:22:45) And so we have these two different tendrils of progress. There’s Unreal Engine 5 for game developers, and there’s Unreal Engine 5 targeting the Fortnite community. And there’s different bits of development that are only in one area of it that aren’t applied to [inaudible 03:23:00]. Not all of the Unreal Engine 5 features are actually available in Fortnite because some of them we haven’t figured out or haven’t gotten to the point where we can deploy them to all seven platforms in a platform-independent way.
(03:23:10) And so the place where all of these different threads of development come together is Unreal Engine 6. And it’s a few years away. We don’t have an exact timeframe, but we could be seeing preview versions of it perhaps two to three years from now. And we’re making continuous progress towards it.
Lex Fridman (03:23:27) So that’s really nice. So there’s this ultimate version of a language that you’re constantly working on and thinking through, and there’s the shipped version of the language that’s used by a large number of people, but still in the constrained environments of the Unreal Editor for Fortnite, so for the Fortnite game. And then there awaits the more general Unreal Editor on Unreal Engine for the lessons learned in the Fortnite context to be integrated in the more general context of creating simulated worlds for all-
… general context of creating simulated worlds for all kinds of games, including Fortnite. It’s a really nice setup because it’s a testing ground of the language in Fortnite, and you’re keeping an eye on what the ultimate thing will look like, which is also necessary to deliver all the features that we mentioned. Brilliant.
Tim Sweeney (03:24:19) Yeah. The aim for UE 6 is to bring the best of both worlds together: much easier gameplay programming for the Fortnite community and for licensees. More scalability to large-scale simulations of all sorts. Greater ease of use, meaning it will be easier to hire programmers who are familiar with and experienced with the thing, but also to ensure that every game developer has the full deployment capabilities so that they can build a game once and then ship it anywhere.
(03:24:47) The ultimate version of this enables a game developer to build a game of any sort and either ship it into Fortnite as a Fortnite island that players can go into, bringing their Fortnite items and cosmetics and interoperating properly, or ship it as a standalone game, or both. And if they ship as a standalone game, they shouldn’t be missing out on the open economy either because, in this timeframe, we’ll have opened up the Fortnite item economy to third-party developers of all sorts.
(03:25:17) Hopefully there’s a standards body, but there might be multiple phases of it, so that if you choose to ship a standalone game, you can still choose to have Fortnite items work in your game and have your game items work in Fortnite, and have your item economy integrated with the overall metaverse economy, and solve a really core problem of the game industry that Matthew Ball has been documenting over the past few years.

Indie game developers

Lex Fridman (03:25:40) Yeah, by the way, Matthew Ball has been really helpful. He wrote a really great book that I recommend people check out. There’s an updated version. Let me ask, because, again, there’s a bunch of indie developers listening to this. I saw that there are a lot of solo developers out there using Unreal Engine, basically creating video games solo.
(03:25:59) One I saw, and can highly recommend, is Choo-Choo Charles. It’s a great video game. Gavin Eisenbeisz, a great guy, solo created this game that’s, I think, quite popular. I believe he says he didn’t even use C++, he used visual scripting. He used Blueprints-
Tim Sweeney (03:26:21) Yeah.
Lex Fridman (03:26:21) … to create it. Okay. So I mean all that to say, people should go check it out, support indie developers, support Gavin, and support everybody like that. I think it’s important to say because there’s so much genius and artistry out there, and we want to support the crazy dreamers out there.
(03:26:37) Anyway, all that to say, what are the ways you think Epic can support indie developers like that? People like Gavin give them superpowers to create games from which they can make, at the very least, enough money that they can keep doing their art.
Tim Sweeney (03:26:56) Well, that’s really about productivity, because to be successful with a game, you have to have a great game. If you’re building a type of game that nobody’s ever built before, you might be able to build a smaller, simpler game than if you’re competing in a massive genre that has huge expectations. But it’s all about enabling somebody to do that in a reasonable amount of time, and to be able to finish it and ship it and maintain it successfully. The tools are a big part of that.
(03:27:22) Having the tools be as productive as possible. But there are a lot of other facets as well like having a content marketplace is a big thing. Just off the shelf, piles of content, some free, some paid, built by other creators can enable a small indie team to build a big game and just be able to focus on the unique content of the game. Being able to write their gameplay and lay out their environments the way they want but not have to build every tree and rock because somebody has already built one, and theirs is probably perfectly suitable for your game.
(03:27:56) And over time, there’ll be more and more. There are also a lot of indie developers making a living as content creators. They’ll be releasing content on the Fab marketplace or the Unity Asset Store and earning a living from that. But specialization of labor is a really, really valuable thing. In the early days, pretty much one person would build one game. That’s how a lot of games were built in the 1980s.
(03:28:16) Over time, you had a separation where artists became specialized, and then programmers, and then gameplay programmers and engine programmers. Now you have technical artists, and you have dozens of different specialties contributing to a AAA 3D game. And the more we can modularize those bits of content so you can get something off the shelf, rather than having to build it or have the engine synthesize it for you, the more we can enable creators to create stuff fast and successfully.

Apple

Lex Fridman (03:28:46) So we should talk about the fact that, amongst many other things, you’ve been philosophically and spiritually battling monopolies in general, one of which is the Apple marketplace that charges developers 30%. Can you speak about this idea, that you believe that Apple and other companies, like Valve, should not be charging that kind of revenue cut?
Tim Sweeney (03:29:23) Sure. Well, let’s start from a very basic principle of computing. The first computer I owned was an Apple II Plus designed by Steve Wozniak and marketed by Apple, and then an IBM PC. And in those days, anybody could write code. Your computer literally turned on with a programming language prompt in front of you. You had to actually do work to not write a program and to instead run somebody else’s program. That was incredibly empowering. And anybody could write a program, and anybody could put it on a floppy disk.
(03:29:51) Anybody could share it with their friends. Anybody could make copies of that and put it in a store. They could sell it. They could build a business around it. And they were completely able to, without seeking any big tech corporation’s permission, do whatever they wanted. Even from IBM. Remember, IBM was the dominant computer company on Earth at the time that they released the IBM PC as an open platform. And so it’s really been firmly implanted in my mind that this was a magical and wonderful time of unmatched economic progress for technology in the entire world.
(03:30:29) And over time, the big companies have realized that they could shut down and just block software makers from releasing software on their own and block software makers from doing business with customers directly. And I’ve always viewed this practice as terribly abusive because when you buy a computer you spend… or a phone, you spend good money on it. It’s your money you spent on that phone and now you own that phone. And there’s absolutely no reason that Apple should block you from installing apps from other developers directly if you want, going to their webpage or writing your own apps without their permission and running them yourself without having to get a developer account, without having to go through their bureaucracy.
(03:31:16) And there’s no reason that any consumer who gets an app shouldn’t be able to do business directly with the developer. You already bought that phone, so why should Apple be adding a 30% junk fee to all commerce you do, and why do they selectively apply it to some things and not others? I’ve always viewed this as deeply abusive, and it shuts down the competitive engine that once fueled the app and software economy. It’s still a vibrant, competitive engine on Windows and on the internet, but it’s no longer so with mobile apps because these stores have popped up, and they don’t provide any useful value to the user.
(03:31:55) Yes, they provide a search function to find software, but there’s no reason other companies couldn’t build a better one. And I bet if you had Valve build Steam for iPhone, I bet Steam for iPhone would be a much better app store than the iOS App Store, and a lot of people would use it, and Apple would be forced to build a better app store in competition, and everybody would improve their products as a result. But Apple and Google shutting down the competitive engine that drives the software economy has massive implications for everything. And one of them is reshaping the nature of mobile apps to be really offensive to gamer sensibilities.
(03:32:32) If you call [inaudible 03:32:33] console, the best console games you see listed on the storefronts, the best console games that you see reviewed, are awesome games that really have a lot of creative merit. The ones that sell the best are really enormous values for their money, and they’re the product of an immense amount of work. You don’t see that on iPhone. The top apps or the top games on iPhone at almost all times are these ridiculously greedy, high-monetizing whale games, which are pervaded with pay-to-win and loot box practices. They have a sort of legalized form of gambling, and these games are not driven by fun. They’re driven by manipulation of the players to greedy ends.
Lex Fridman (03:33:17) Yeah.
Tim Sweeney (03:33:18) And it’s very hard for the fun-based games to actually succeed there. And the costs of operating these online games now are enormously high. So say you have a game that’s based on fun. It’s not loot-box-heavy. You have to pay 30% of your revenue to Apple in order to just get access to the platform. And 30% is way, way, way more than most game companies make in profits right now. And so, if that fee is more than the profit of a normal company, then they can only stay in business by raising prices.
(03:33:48) So these 30% fees are raising prices of all digital goods. It’s just an inflationary force in the economy. That’s just the first drag, the first tax. But then, to reach users… Before Apple blocked Fortnite on iOS, when a user searched for Fortnite, the first result was always some competing game that’s utterly anti-user. Like, you search on Steam for a game, and if that game’s on Steam, it’s the first result, always, because Steam’s not getting enshittified with advertising. Apple is, and they do that so they can make even more than 30%.
(03:34:22) So, if you want to be the first search result for your game, you’re probably paying more like 45%. If you want to reach users on social media, you’re paying another 20%. So, literally, something like 70% of the revenue for your game is just going into junk fees to acquire users and get them into your game. And the money that’s left over is only enough to fund these games with rather abusive practices that do not look to normal gamers like games, for the most part.
(03:34:47) Now, there are some exceptions. There are some great games on iOS, and there’s some games with good practices, but the engine has been really corrupted in a way that competition would fix. If you unleashed lots of competing stores on iOS, then you’d have lots of awesome options, and you’d have much better deals and much better prices.
Lex Fridman (03:35:04) I had a quick chat with Matthew. He asked me to ask you this question: why don’t more companies fight Apple as openly and totally as Epic has? What makes Epic so unique in this regard? And I should say, I think everything you said I agree with fully. I think what Apple is doing is just wrong.
(03:35:26) I think Apple, in many dimensions, is an incredible company. They have brought so much good to the world. In this regard, I just think it’s straight-up wrong what they’re doing, that they’re not providing the value of 30%. And even if they were, the monopolization, the centralized control without competition, is wrong. Anyway, why are you fearlessly fighting Apple on this when other companies don’t seem to want to step up?
Tim Sweeney (03:35:58) All companies are terrified of Apple because Apple can destroy their business. Epic was in a unique position with Fortnite, first of all, having the biggest game in the world at the time we started the fight with Apple. And second of all, having a majority of our users playing on PC and console meant that if we lost access to iOS during a fight, then we would still be able to survive. That set Epic apart. Spotify, Facebook, you name the top 10 mobile apps, I think none of them would be able to survive without Apple. Literally, their business would be destroyed if Apple blocked access to them.
(03:36:40) And Apple is incredibly clear with developers that they’re willing to deprive all users of access to any app if they get in a fight. And if you look at how they dealt with Epic, they were not just legally maneuvering with the intent of winning the court case against us. They were also sending a message to all developers in the world: “We will destroy your business, or we will try our best, if you fight us.” And a very small number of vocal developers have been willing to speak up, and Apple has actually refrained from crushing their businesses when they weren’t violating any Apple policies.
(03:37:18) And that took a bit of discipline, which I think also involves an amount of calculation by Apple. They couldn’t survive being seen as the company killer, the “if you criticize us, we will crush your company” company. But the other thing Apple has, that they can and will readily deploy against every developer, is soft power. When they take 30%, and advertising is so expensive, soft power by Apple, like approving your updates faster or slowing down all of your updates by a couple of weeks, can also have a dramatic effect on your ability to compete successfully.
(03:37:51) And Apple has a very long history of playing cat-and-mouse games with developers. A developer isn’t in Apple’s good graces, so just slow down their updates. They’ve been slowing down updates for several major tech companies, sometimes for weeks, sometimes for months, all of it going under the radar because everybody’s afraid to challenge them publicly. And so, Apple’s wielding of soft power can change a company’s economics for the worse, enough to deter almost any public company.
(03:38:19) And Epic is in the fight because I firmly believe that something like the metaverse, something like a billion-plus-user real-time 3D social ecosystem that grows to encompass potentially all or most major games by all major developers, tied together into an open economy where they all participate as peers, and they all compete to give users the best deals, and they grow and do business with their customers directly, can only exist if the Apple and Google gatekeeping monopolies are lifted.
(03:38:55) And it’s not just the 30% fees. The 30% fees are economically ruinous, but they impose other levels of control. Apple prevents all web browsers on iOS from implementing web standards better than Apple does. So Apple has really limited the data storage capabilities and 3D graphics capabilities of the iOS web APIs, the APIs you can access from web apps running within a web browser, and that’s to intentionally cripple those apps to ensure that they can’t possibly compete with native apps.
(03:39:27) And by depriving web apps of those features, they prevent web apps from competing with native apps. Well, if Apple treats the metaverse the way they treat the web, they’ll say, “You can only use Apple’s metaverse engine. Unreal Engine is disallowed.” And then they can impose all of their own limitations on the metaverse to force all commerce through Apple, or force it to be so uncompetitive and lousy that it can’t compete.
(03:39:51) And they have this giant array of anti-competitive techniques that they use to disadvantage other app developers, saying only Apple can build certain kinds of apps or only Apple can integrate certain features. Even in Europe, where the DMA law requires Apple to allow competing stores, they say, “A store can only be a store. You can’t build a store into Facebook, you can’t build a social network into a store.
(03:40:13) A store must only be a store, because a store that’s more than a store might be able to compete with us more effectively.” To use the Soviet term, it’s a defense-in-depth strategy where they’ve constructed a massive series of barriers, each fatal to any attempt to compete, so that even if one barrier is overcome, the others remain in place and shut down the whole scheme.
(03:40:34) And that’s playing out in Europe, where Apple has enabled us to launch the Epic Game Store but has made it so difficult and uncompetitive both for Epic and for clients who we want to do business with that it has no chance of success until the European Union starts to really enforce the DMA law and impose harsh and serious penalties on Apple to force compliance.
Lex Fridman (03:40:56) I think it should be said, once again, I think it’s wrong what they’re doing there, and I hope there’s public pressure and government pressure for them to open up the platform. I believe, as a person who loves Apple, that this is also good for Apple. There’s the natural thing in companies to want to close and control and crush competition, but Apple is full of brilliant engineers. Open it up and win. It’s going to create the right kind of competitive incentive to make the App Store better, because they’re great at creating great interfaces, but competition will sharpen the sword. I mean, it’s just going to make everything much better. So I do hope there’s a lot of public pressure, and I appreciate that you’re speaking out in this way, sort of putting that pressure on and letting people know it’s okay to say that this is wrong.
Tim Sweeney (03:41:58) Thanks. Competition makes everybody better. When you have a monopoly that’s forced to compete, suddenly the monopoly’s products get much better and the offerings to consumers get much better. You see so many areas where Apple could be the best, but what they have is just really, really lousy, and it’s this old guard of leadership who is clinging to these old policies, turning themselves into the enemy of every developer and every regulator, and I think it’s ultimately massively to their detriment.
(03:42:24) And I can’t wait for a new generation to come in and paint a bright path to the future. We were… Epic was an awesome partner to Apple for more than a decade of demos and partnership and technology usage together, and we did amazing things together. I’d love nothing more than to have that Apple back, bringing back Steve Wozniak’s original views. The Apple II was such an amazing thing. It was a completely open platform.
Lex Fridman (03:42:51) Mm-hmm.
Tim Sweeney (03:42:52) The manual to the Apple II included a listing for all the ROMs, the source code to the ROMs. You could understand exactly what was happening there, and you could learn from it. It included a hardware schematic of the entire computer so you could learn how to make a peripheral and plug it into an open ecosystem, and that’s the awesome Apple. That company would be the best company in the world again. I think the current one is just on the wrong side of history and needs to change.

Epic Games Store

Lex Fridman (03:43:17) Well, I hope Epic and Apple find a path forward together, flourishing together, and Apple embraces competition better. One of the things I admire about this conversation is that you mentioned Steam a bunch with kind, supportive words and basically never mentioned the Epic Games Store. I really love that. It embodies the fact that you want variety, you want freedom for people to choose the best thing, and, in so doing, create this large network of humans interacting freely with each other. Okay.
(03:43:55) That said, one of the competitive pressures that Epic created a few years ago was launching the Epic Games Store. And instead of Steam’s 30% revenue cut, you went with a 12% revenue cut, creating competitive pressure, saying, “Listen, this shouldn’t be that high of a cut,” which I thought was amazing. It was a brilliant idea, and I think it still is. It’s wonderful. Now, in preparing for this conversation, I looked on the internet, and I saw there’s a lot of criticism of EGS, the Epic Games Store. First of all, I should say, the internet is full of drama and criticism. There’s not enough celebrating of awesome shit.
(03:44:46) If I can ask the internet, as a blob, one request: can we just celebrate awesome shit? Also criticize, but there’s just not enough celebration. Anyway, the two directions of criticism are, first, just straight up, “The launcher interface is clunky and lacks a lot of the features of Steam.” And the second set of criticism is about the exclusive contracts which were made with some of the games that are on the Epic Games Store. So, first, huge props on the 12%. Maybe you could speak to the vision of that. And second, can you comment on those two criticisms?
Tim Sweeney (03:45:31) Sure, yeah. I think one of the reasons that people characterize the Epic Games Launcher as clunky is because the Epic Games Launcher is clunky, and we need to improve this. There’s a lot of work going on there, and I wish we’d gotten better at addressing quality-of-life features and prioritized them above all of the other features, because Steam has 15 years of built-up work by many of the best programmers in the whole industry, a much larger team working on Steam, and a lot more time working on it.
(03:46:09) And so we’ve had to make a lot of prioritization decisions about what we support with the Epic Games Store and when. A lot of the time, it’s been supporting commercial features like merchandising, offering multiple versions of a game for sale, and offering upgrades from the regular edition to the deluxe edition, and other things that partners want. Other priorities have been quality of life and launcher load times and other things. And we’ve not put enough emphasis on the quality-of-life features. We’ve recognized this very clearly multiple times and we’ve gone through multiple refactorings, but that’s definitely been a disappointment to us and to a lot of users. And one thing it took us a while to realize was that it’s not uniform. Depending on your proximity to a CDN and the size of your game collection, it can be either awesome or really clunky. And the users for whom it’s really clunky are the people, I think, [inaudible 03:47:04] large part of the complaints.
Lex Fridman (03:47:05) They’re going to speak up. And I should also say that the Steam Launcher, for a long time from my memory, but also just looking online, was also very clunky in the beginning.
Tim Sweeney (03:47:15) Yeah. And one of the criticisms of the Epic Games Store from the beginning was, “You don’t have all of the features of Steam,” but we very much don’t want to have all of the features of Steam. Steam has forums dedicated to your game, and we decided we don’t want to create forums. And our partners, when we talked to them, generally didn’t want us to create Epic Games Store forums for their games because there are already channels that they prefer.
(03:47:36) There’s social media and a number of platforms, and there’s Reddit, and there’s lots of places for gamers to discuss their game, and they prefer those discussions to be there. And so it’s very much not our goal to mimic everything about Steam, but we do want to have all of the convenience features that make it easy and fun to use [inaudible 03:47:54] Steam. So there’s a long journey ahead. But we continue to reinvest in it, and we’re working to build a multi-billion dollar business there and think we’ll succeed.
(03:48:04) Already, the Epic Games Store supports an immense amount of Epic Games commerce in Fortnite on PC, and now on Android and iOS in the European Union too. So it’s a forever facet of the industry, and we are never losing heart in it. And at some point, I really feel that the benefits of the Epic Games approach are going to outweigh the benefits of the Steam approach, especially as gaming becomes multi-platform. One of the things that really sucks for all gamers is that you have a lot of friends in the real world.
(03:48:37) Some have… Everyone has different platforms. Your Steam friends aren’t connected to your Xbox friends, and they’re not connected to your PlayStation friends or your Nintendo friends. And so you’re very much bottling up PC gaming into kind of a hard-core group of PC-only folks and making all of the other aspects of it difficult. A lot of games have flocked towards Discord, which is a mess in itself, because now your Steam name is not your Discord name, and that’s not your PlayStation name.
(03:49:01) And so now you have two people in a game, and they have four different identities, and that sucks. Our answer for that is Epic Online Services and the social systems that we built for Fortnite, opened up to all developers so that cross-platform social features are super easy and free for all developers. This is not something we’re trying to gatekeep or rent-seek on or lock people into. It’s just a way that we’re making social gaming easier for everybody.
(03:49:29) As more and more games follow the Fortnite approach of being multi-platform, especially multiplayer games, Metcalfe’s Law is a very real phenomenon in the industry. It’s the thing that’s upending some games and causing growth in other games. It is the number one trend pervading the world of gaming today. And it says that your game is quadratically more valuable the greater the percentage of a user’s real-world friends it can connect them to.
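Tim’s quadratic claim is Metcalfe’s Law stated for friend groups. A toy sketch of that effect (the pairwise-connection count n·(n−1)/2 is the standard formulation of the law; the group sizes here are made up purely for illustration):

```python
# Toy sketch of the Metcalfe's Law effect described above.
# The pairwise-connection count n*(n-1)/2 is the standard formulation;
# the group sizes are made up for illustration.

def network_value(connected_players: int) -> int:
    """Possible player-to-player connections, quadratic in player count."""
    return connected_players * (connected_players - 1) // 2

# One cross-platform game connecting all 20 real-world friends...
together = network_value(20)   # 190 possible connections

# ...versus the same 20 friends siloed onto four platforms of 5 each.
siloed = 4 * network_value(5)  # 4 * 10 = 40 possible connections

print(together, siloed)
```

Doubling the connected share of a friend group roughly quadruples the connection count, which is the sense in which a game is “quadratically more valuable.”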
(03:49:55) Your game vastly benefits by connecting all of its players together and not segregating them off into different online platform populations and so on. And so I think the future trend is in that direction. I wish Valve had opened up Steamworks to just work on all platforms. They could have easily done it. We did it. But they seem to be using it as a lever to keep people locked into the Steam PC Game Store.
(03:50:20) And that’s going to be a long-running battle, because there’s always a very toxic group of Steam users who have even created an entire subreddit dedicated to criticizing Epic and our store, and they create basically harassment campaigns at times against developers who use Epic Online Services. Developers do that so their players can connect across platforms and have friends across platforms and voice across platforms, but suddenly, that’s being turned into a negative.
Lex Fridman (03:50:52) It’s clear that Epic wants developers to win, wants gamers to win, and wants Steam to do awesome also, and, in the competition between Steam and the Epic Games Store, to create awesome stuff together. I mean, it’s obvious to me if you don’t read this stuff online, but online, it’s like there is just negativity that I don’t think is constructive in general.
(03:51:24) I actually give a big sort of positive thank-you and props for the push to multi-platform that was always there with… for Fortnite, perhaps before the pressure that Epic created on breaking the barriers between Xbox and PlayStation and PC and being multi-platform. I got a chance to play Fortnite a little bit with you and all the people in the group. By the way, awesome interface, and the audio chat was really fun. But you could see a couple of PC folks, a PlayStation person, an Xbox person all together.
(03:52:02) We can’t really tell what they’re using except for a little icon and it’s nice. It’s like all these barriers that we’ve created with these platforms are gone. Poof. And you creating the pressure with Epic Game Store and just everything you’re doing with Fortnite platform, it’s really nice. There’s no reason to create these silos because, ultimately, you should put the gamer first and let everybody interact with actual real-life friends and make new friends across the entire network of humans. So anyway, thank you for that. Thank you for creating that pressure.
Tim Sweeney (03:52:38) Thanks. Yeah, that was an interesting time. Sony had a long-running policy preventing cross-platform play, and we had a long series of conversations, which got pretty harsh towards the end. But Sony ultimately came around, and they opened up PlayStation, and through a series of private conversations, they did the right thing. Not only that, our partnership with Sony has increased since that argument back in 2018, and we’ve gotten closer and closer and done ever more things with Sony.
(03:53:11) Brand IP, like the character from God of War and other games coming into Fortnite, and all kinds of crossovers. Massive Unreal Engine adoption at Sony for making games and for making movies at Sony Pictures. Music partnerships with Sony Music. That’s been an absolutely wonderful relationship, and I think it stands as an awesome example of a company that, because of historic reasons, got stuck with a policy that no longer made sense for the future.
(03:53:38) And following a serious discussion with a close partner, they righted it and did an awesome thing, and now Sony is much better off, and Epic’s better off, and all game developers are better off, and the whole console industry, I think, is a lot stronger now than it would’ve been if these silos [inaudible 03:53:54] to be playing out. And despite the kind of potential concern that maybe blocking cross-platform play with Xbox gave Sony an advantage, Sony has actually grown in market share relative to Xbox since that time.
(03:54:07) And so you can’t say that anything but goodness came of that time. And I think a better version of Apple would’ve received the email I sent to senior Apple management and been like, “Huh, there’s an issue here. We should have a discussion. We should reconsider this. We should listen.” And, yeah, they didn’t. And that’s why we’re in the midst of a five-year battle with Apple and in the… hopefully still the early days of a 15-plus year partnership with Sony.
Lex Fridman (03:54:38) Come on, Apple. We love you, Apple. Do a little bit better. The second line of criticism that I mentioned is the exclusive contracts with some of the games. Can you just speak to that? Because, in so much of the journey of Epic, you’ve been sort of against exclusivity.
Tim Sweeney (03:54:54) Let’s back up and talk about the principles at work here. Apple forcing other companies to use their payment service is a coercive decision by Apple. But if Apple convinced other developers to use their payment service by offering benefits or a better deal or funding or any other positive incentive, then that would be perfectly fine. One is preventing competition, and the other is actual competition.
(03:55:26) Epic has never forced any developer into any sort of exclusivity relationship. Rather, we’ve offered developers payment or incentives or marketing or any number of things of value to them in exchange for coming to our store exclusively, and it’s their game. So it’s entirely and rightfully up to them to decide how to distribute it and to make the decisions about their business. It’s their game.
(03:55:53) If they want to distribute it through Steam, they can. If they want to distribute it through Epic exclusively, they can. If they want to distribute it through both, then they can do that as well. And if we pay them money or other things of value in exchange for them coming exclusively to the Epic Games Store, I think that’s their right. And this is an example of Epic, an underdog with a tiny fraction of Steam’s market share, working to proactively compete with Steam by offering a better supply of games.
(03:56:21) And some consumers who prefer Steam might prefer that the game be on Steam, but the developer in each case has decided that they believe they would benefit more by doing this exclusive deal in exchange for benefits than by being on Steam. One of the key exhibits in the Epic-Google trial was its opening exhibit, which tried to point out to the jury the benefits of exclusives. Imagine a new store popping up. The store has a big sign outside of it: “We’re the new store. We have everything that the other store has, and it’s at the same price.”
(03:56:58) Are you going to go to the new store? No. Nobody’s going to switch from Steam if Steam has all of the same games as the competing store and everything’s priced just the same. And so we initially looked at two ways of competing with Steam strongly. We wanted to sell games at a better price than Steam by agreeing on the amount of money we pay each game developer. If we’re going to… If the game’s going to sell for $50 and we take 12%, we’d actually lower the price and potentially even lose some money to offer a better deal.
(03:57:29) Well, we tried to pursue this, but very quickly, every developer told us that they wouldn’t agree to better pricing, because if they did, then Steam would stop giving them marketing, featuring, and benefits, and the console makers would be mad, and all their relationships would be harmed. And so there’s an undercurrent of powerful platforms and ecosystems encouraging developers not to compete on price. So, not being able to compete on price, we decided to compete on supply by doing exclusive deals.
(03:58:03) We signed a lot of them. Paid developers lots and lots of money. I think we distributed over a billion dollars in net expenditures to developers, beyond the revenue we actually made from games, in order to get a whole lot of exclusive games. Some were successful, some weren’t. Borderlands did awesomely on the Epic Games Store, and we and Gearbox felt that it did just as well through Epic as it would’ve done on Steam, because the players who wanted Borderlands wanted Borderlands, and they came and got it. Whereas a lot of other games, some smaller games especially that didn’t have a dedicated audience that was absolutely going to play the game, typically benefited from exposure on Steam. They were reaching an audience that they wouldn’t have reached organically. And so for some of them, in the end, we and they concluded that they did worse by being on the Epic Games Store exclusively in terms of reaching fewer customers.
(03:58:52) And we had these limited-time exclusives. When they ran out, they put their games on Steam, and lots of data was gathered to understand what worked. This worked well for some games and didn’t work for other games, but companies seeking to compete, especially underdogs seeking to compete, have to offer some unique value, have to offer something that’s not available through the competitors. And I get that Steam users who just prefer using Steam and buying games on Steam and want to have their library in one place don’t like this. But you’re never going to have competition for better deals if you don’t support the competitive mechanisms that allow competitors to come about. And I think if Valve were forced, through the Epic Games Store’s success, to compete with the Epic Games Store, then developers would be getting a better deal, consumers would be getting a better deal, and these 30% fees would be driven down quite a lot towards the actual costs that are required to support the stores.
Lex Fridman (03:59:46) Yeah, I mean, there’s a lot to be said there. I’ve gotten to watch Spotify try to do this with podcasts, enter as the underdog into the space and try to attract. They made exclusive deals, for example, with Joe Rogan, where the podcast would only be published on Spotify. I personally think, long-term, what I would love to see for EGS, the Epic Games Store, is to not do any exclusivity, similar to what Spotify is doing now. Even with Joe Rogan, they let go; it’s wide open. And instead compete on the space of just the non-clunkiness of the interface, because the foundation of what the Epic Games Store represents with 12% is philosophical. So you’re also competing in the sort of spiritual realm of what it stands for ethically.
(04:00:49) That’s also a really powerful way to win. Now that there’s a large enough number of people using the Epic Games Store, it can drift away, move away from exclusivity. It’s understandable that it’s needed for the competition, for the underdog to enter the scene, but it goes against the freedom, the free spirit of choice that I think you represent in a lot of the decisions you’ve made, which is making the games cross-platform and, yes, giving freedom to the developers, giving freedom to the gamers to choose. So in that way, I think exclusivity goes against that a little bit.
Tim Sweeney (04:01:30) Well, here’s the conundrum. The exercise of soft power by all of the competing stores has made it intractable for almost any developer to offer a better price through the Epic Games Store than through Steam. You can imagine that if the effect of Epic’s 12% revenue cut was that games just cost 22% less on Epic Games… Sorry, 18% less on the Epic Games Store, that would actually start to reshape consumer behavior significantly. People would start coming here for the better deals. But I feel like Steam giving developers nasty phone calls and so on, when they propose to do that, prevents developers from passing on savings to consumers.
(04:02:15) Then what’s the mechanism that drives users away from the incumbent store to the store that offers a better deal? If developers are basically fearful of competing on price through stores, what can possibly be done to get a dominant store, with something like 90% of revenue share among multi-publisher stores, in line, so that a much, much smaller store can compete? I think some answer is required there. A better UI is great. Steam is super polished. The Epic Games Store in time will hopefully be as polished. But how does that overcome the fact that your entire library over the past 15 years is there, if developers have been afraid to exercise their own economic interests? Because it’s in a developer’s interest to sell on Epic and get 18% more of the revenue. I think there’s a real power to incumbents that’s very hard to overcome just by being there and being as good.
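The arithmetic behind that 18% figure can be sketched in a few lines (a back-of-the-envelope model using only the 30% and 12% cuts quoted in the conversation; the $50 list price is just an example):

```python
# Back-of-the-envelope check of the pricing argument above: with a 12% cut,
# a developer can discount a game roughly 18% and still net about the same
# per copy as selling at full price under a 30% cut. ($50 is an example.)

STEAM_CUT = 0.30
EPIC_CUT = 0.12

def dev_net(list_price: float, store_cut: float) -> float:
    """Developer's per-copy revenue after the store takes its cut."""
    return list_price * (1.0 - store_cut)

full_price = 50.00
discounted = full_price * (1.0 - 0.18)  # the ~18% discount mentioned above

print(round(dev_net(full_price, STEAM_CUT), 2))  # per copy at $50 on a 30% store
print(round(dev_net(discounted, EPIC_CUT), 2))   # per copy at $41 on a 12% store
```

The break-even discount is 1 − 0.70/0.88, about 20%, so an 18% discount actually leaves the developer slightly ahead per copy while the consumer pays $9 less.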
Lex Fridman (04:03:25) Ultimately, where I hope it converges is less exclusivity, where the competition can be the kind I love the most, which is on the UI, on the experience, and then, on the Steam side, on the fees, so it can go from 30% and start to support the developer by lowering it closer to 12%. So anyway, I’m a big supporter, and I don’t like the criticism of the Epic Games Store, but I also have to say that I don’t love the exclusivity. But I understand the reality of the world is that you have to have some mechanism to get people to switch, or not to switch, but to at least get some of their games to try out, to experience, to allocate some of their library to the underdog. So I totally understand, and I hope the UI keeps improving.
Tim Sweeney (04:04:30) Thanks. One more bit on that exclusivity point is that when we told Google that we were going to launch Fortnite outside of Google Play and go into competition with them, they viewed exclusivity as such a powerful competitive force that they went around to the top 30 publishers and paid out hundreds of millions of dollars-
Lex Fridman (04:04:52) Oh, boy.
Tim Sweeney (04:04:52) … to them in order to agree not to do exclusive deals with competitors. And that was called Project Hug. H-U-G. Hold developers close. And that was one of the major pieces of evidence on which the jury found their practices to be illegal and anti-competitive. And one more data point on that: we talk about 30%, and there are always a lot of people defending Steam. “Well, of course they have more costs, because they have more features than Epic.” We have data on that that’s very detailed. The all-in cost of operating the Google Play Store, stocking it, maintaining it, the software, the entire ecosystem, is around 6% of revenue. So in a competitive market, would a company whose cost is 6% be able to charge 30%? Absolutely not. And Apple’s costs are similar. Apple runs an even more efficient and lean operation than Google, so their costs are also likely in the range of 6% all-in. They mark it up from 6% to 30%. Only a monopoly can do that. Look at competitive businesses: they have a margin of a few percent. The numbers there are strikingly supportive of just outright anti-competitive market distortions.
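The margin implied by those numbers is easy to check (the 6% all-in cost is Tim’s estimate, and treating both figures as shares of gross store revenue is an assumption of this sketch):

```python
# Rough margin math for the store-fee figures quoted above. The 6% all-in
# cost is Tim's estimate; treating both rates as shares of gross revenue
# is an assumption of this sketch.

fee_rate = 0.30    # store's cut of each sale
cost_rate = 0.06   # estimated all-in cost of operating the store

markup = fee_rate / cost_rate               # markup over cost
margin = (fee_rate - cost_rate) / fee_rate  # operating margin on fee revenue

print(f"{markup:.0f}x markup, {margin:.0%} margin")  # 5x markup, 80% margin
```

An 80% operating margin on fee revenue is the contrast Tim is drawing against the few-percent margins typical of competitive businesses.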

Future of gaming

Lex Fridman (04:06:16) Okay. What do you think is the future of the gaming industry? You’ve said to me a bunch of exciting stuff about indie developers, so do what are called AAA video game companies, these big gaming companies, have a future? What is their role? How do you see the evolution of these big companies and indie developers over the next five, 10, 20 years?
Tim Sweeney (04:06:42) Yeah, there’s one constant in gaming that I think the industry manages to lose sight of from time to time astonishingly, and that’s fun. And people play games for fun.
Lex Fridman (04:06:51) Yes.
Tim Sweeney (04:06:51) Our whole job is to deliver fun. And when you look at a lot of the games that failed recently, they just didn’t deliver fun, or they didn’t deliver fun in a manner that was nearly competitive with the other sources of fun in people’s lives. And so, at a basic level, we don’t need a terribly complicated theory to explain a lot of the malaise in the game industry. There’s just been a degradation of the capabilities of a lot of publishers, partly because of competition for talent. Really vibrant game businesses like Epic or Riot or others are hiring the best developers and accumulating them. And big tech companies are hiring the best game developers because there’s super talent there.
(04:07:32) And so in some cases the companies aren’t competing robustly or are getting worse; they’re making games that are less fun. I think everything else that’s happened is kind of a sideshow to that. There’s always political drama and so on, but I think the core is a failure to deliver fun, and the nature of fun is changing. It turns out that playing a game together with your friends in a really socially engaging way with voice chat is just way more fun than playing a solitary game, for the most part. And there are exceptions to that, but I think we’re seeing much, much more playtime shifting towards games you’re playing together with your friends. And not just random internet strangers who happen to play that game too, but the people you actually know in the real world. And that’s certainly been the case with me and with almost everybody I know who’s playing Fortnite or similar games.
(04:08:21) And that has really significant effects in reshaping the whole game business, because with a single-player game, if you have 20 people with 20 different opinions of which game to play, each one might buy a different single-player game. But with a multiplayer game, even though each of those 20 might have their own completely individual preference, and, choosing independently, each one might buy a different game, they’re all realizing that they want to play together. And so what players are doing increasingly is playing a game they like and accept together with their friends, even if it’s not the game that every one of them might prefer to play themselves. And that’s certainly the case in different Fortnite groups I play with from time to time. One player might have preferred to play COD, one might have preferred League of Legends, somebody else something completely random, but it’s just so fun to play together, we’re doing that.
(04:09:17) And that means that there’s a really strong Metcalfe’s Law effect, in which games that are able to attract a large percentage of your friends are more able to attract you, and not only attract but also retain. And I think Matthew Ball’s analysis over the years has really documented this trend towards, you can call it the metaverse, or you can call it large-scale multiplayer social gaming. And over the past year or so, it’s taken a really, really strong turn towards an increasing rate of change, increasing numbers of players coming to Fortnite. We hit an all-time high of 110 million monthly active users about a year ago.
Lex Fridman (04:09:57) That’s crazy.
Tim Sweeney (04:09:59) We’re close to another peak this time. Roblox is bigger than ever, and this trend is players consolidating into multiplayer experiences that they play together. And we’re seeing another trend overlaid with that, which is that when an awesome single-player game comes out, or a smaller multiplayer game comes out, people often will treat it as a vacation. They’ll go off and play that game for a while, then come back. And I think Wukong was an awesome example of that, a wonderful game from a brilliant team in China. They made a game like nothing Western players had really seen done before, and it was awesome and it did well, but most players play it for a while and then move back. And that can be lucrative. But a business that’s building that kind of game is going to have to build a new one every few years and build a business around that, while the other games continue to accrete users.
(04:10:45) But when you have a large number of gamers migrating to a small number of games, the effect of that is increasing revenue for those games, increasing reinvestment. And there are things that Epic can do with a team of thousands of people building Fortnite internally, and tens of thousands contributing to Fortnite as independent creators. There are just things that can happen with that level of investment that can’t happen in a smaller game. And so there’s somewhat of an increasing winner-take-all dynamic where the biggest games reinvest more to make their games more fun, to gain fun at a faster rate than other games. And the industry is changing around that.
(04:11:25) So I think the lesson for the game industry now is that there are really two big opportunities being pursued. There are big games, or games that have the potential to be really big multiplayer experiences that keep players around indefinitely, for very long periods of time, and then there are just really good, smaller-scale games that people take a break from their big games for. And the trend there is going to be towards efficiently developing those games. You can’t build one of those games with a $300 million budget, but if you can do it with a $40 million budget, you can make a lot of money. So I think that’s the main reshaping going on, and I think it creates a rather bleak outlook for a lot of the category of single-player games that don’t have a huge audience to reach. But this is just one of the real trends restructuring the business around the technology and changes of the day.

Greatest games ever made

Lex Fridman (04:12:17) Okay, this is going to be a ridiculous question, but aside from the games you’ve created, what are some of the greatest video games ever created, to you? What video games have been either impactful to you in your life, or maybe you’ve seen created and you’re like, “Huh, that’s a beautiful art piece”? It could be in a totally different realm. Obviously for me, I return often to the single-player domain of role-playing games, the Elder Scrolls series, Skyrim. That was a world that they created. A recent game, Baldur’s Gate 3, was a really incredible piece of work and art, doing a lot of innovative stuff, again in the single-player domain. Are there games like that outside the ones you’ve created?
Tim Sweeney (04:13:07) I’m most impressed with the games that have created what appears to be a full living, breathing world. Games that give you the sense that you’re just a part of it and there’s a lot more happening and there’s always more. And it gives you the sense that you could go anywhere and do anything. Even though these games really do have finite limitations and there are places you can’t go, really creating that sense of wonder is just a magical thing. Like Zelda: Breath of the Wild.
Lex Fridman (04:13:39) Oh yeah.
Tim Sweeney (04:13:40) Skyrim, Red Dead Redemption.
Lex Fridman (04:13:42) Red Dead is great. Yeah.
Tim Sweeney (04:13:43) It’s like there’s an entire ecology simulator in there. I have a high school classmate who got into studying river ecology, and he was commenting, “This is one of the very few games that’s hydrologically sound.” They actually went to the effort of shaping the rivers to follow erosion dynamics and so on. The attention to detail, there’s something there that’s big. It’s been funny journeying through the industry. I last designed a game in 1992. I’m not a game designer. I have a very open-minded view that the best game genre that will ever exist has not yet been invented. And as we get more technological capabilities and creative people use them, hopefully empowered by higher-productivity tools and so on, we’ll see more and more cool things emerge that we’d never dreamed possible. And the idea of a world simulator is actually really interesting there. It’s been tried a lot. It’s usually extremely slow and expensive to create, but over time, maybe we’ll get better at that, and that will be a thing too.
Lex Fridman (04:14:49) You said so many interesting things there. New city builders.
Tim Sweeney (04:14:52) Yeah, Civilization. It’s just mind-boggling that they’re building a game with that depth, one that can evolve so much depending on your actions.
Lex Fridman (04:14:59) To do that scale of world, but where you can step into it and be in it. I think Red Dead is a great example, but to do Red Dead Redemption in a way where you can walk around with friends at a large scale. And I guess what you have given so many years to is creating the tools that enable the artists to give that attention to detail that Red Dead does on several of those things. And once you give that attention to detail, there’s something magical about it. I don’t know what it is, but the love of the artist comes through somehow, and you can feel the care that they put into it.
Tim Sweeney (04:15:52) That’s right. The best games have a soul. You can really sense it. Like Call of Duty has a very different soul than Fortnite, and it just kind of exudes not only in what you see in the game, but also in how players interact with it and interact with each other online. That’s a really fascinating thing I wish would be studied more.
Lex Fridman (04:16:09) I think we talked about the soul on several fronts, right? I wish it would be studied more.
Tim Sweeney (04:16:14) Yeah. These little game design decisions that the designers make have a profound impact on what players think of the game and see in the game. Fortnite Battle Royale always had a sense of mystery to it. You’re on this island, but you’re not sure exactly what’s happening here. There are all these houses, and they’re abandoned. Why? And I’m not the secret holder, I’m not on the design team, I experience Fortnite as a player, but it really exudes a lot of that, and a good-spiritedness as well, because even when you’re eliminated in Fortnite, there are no blood spurts and there are no gibs, you’re just teleported out of the simulation. And often you end up losing the game in a way that’s hilarious enough that you’re actually laughing at it, or you think, respect to that player who just won, because that was clever. And it creates a very different dynamic than these other games; players tend to be very positive towards each other.
(04:17:09) One of the things I like to do in Fortnite just to kind of gauge how the game is going, is I play fill squads, get match made with three other random players and play a game together. Sometimes they have voice chat, sometimes they don’t. And back when our matchmaking regions were bigger, I learned a little bit of battlefield Spanish so I could speak with the people who were down-
Lex Fridman (04:17:27) Battlefield Spanish.
Tim Sweeney (04:17:29) … as far south as Mexico City. And the positivity of the interactions there, among every kind of person you might ever meet online, was really quite impressive, and completely unlike what you would see in a game like Call of Duty, where everybody’s got to be an edgelord.

GTA 6 and Rockstar Games

Lex Fridman (04:17:49) I love online gaming culture. I have to ask you, because one of the legendary games is Grand Theft Auto. Speaking of the worlds that are just like… I mean, that’s its own thing, right? That world, the characters, the style, the edginess, all of that. But the interesting thing about Grand Theft Auto VI to me, that I want to ask you about, is they took forever. It’s the six-month thing that you mentioned before. There are some games like that that just take years to bring to a conclusion. What can you say about that process, given that you eventually were able to take Unreal to completion? If you were to look from the outside, why does it take Grand Theft Auto, or other companies, that long to take games to a conclusion? I mean, just insight into what that process is like.
Tim Sweeney (04:18:52) Making games is very hard, and especially when you’re pushing the boundaries of something. With Grand Theft Auto, it’s just the realism and feeling that you’re in this huge city and that anything can happen and it’s all living and breathing and you’re just a part of it. The level with which Rockstar has brought quality to that genre is astonishing. And when you’re building something at a level of quality and detail that’s never been achieved before, you can’t predict how long it will take. Whatever problems you’re solving today to get to the next iteration of quality on it, you don’t know what new problems that will unlock. And often you fix one thing and make it super realistic, and that just highlights the unrealism of other things that you then need to fix.
(04:19:37) I think the thing that always comes to mind is that shipping a game is easy if you don’t have a high quality standard, but then you also won’t have much success. What we’ve seen from Rockstar is they take a long time, but they ship amazing games, and it’s worth it in the end, right? A bad game is bad forever. A late good game eventually is released and is good.
Lex Fridman (04:20:01) Do you ever feel, and Rockstar is a good example of that, the pressure of delivering quality? Epic has not missed recently, that I’m aware of, in terms of delivering quality. Do you feel the pressure that you’re not allowed misses?
Tim Sweeney (04:20:19) We certainly do. Everybody’s often working very much to the last minute to make something excellent. And it’s really hard with these fast delivery timeframes, because you really have to get a lot of stuff up and running before you can judge it holistically, like a new Fortnite season. It’s not until the last month or so that you really know what you’ve built and you really understand it. And if any late-breaking problems emerge, in balance or anything else, it’s usually towards the end, and that usually leads to a rapid push to fix it. And then there are other lessons you can only learn live and from experience. And that means accepting that a game is a live experience, and it’s also an experiment, and it’s going to continually be improving. At any time, there are some things that some people don’t like, and you learn from it and you improve it and you move on.

Hope for the future

Lex Fridman (04:21:12) Let me ask you a big philosophical question. So you’ve created these gigantic worlds that bring so much fun to humanity, but you also get to learn about humanity. What gives you hope about us humans, about the future of humans, about the future of humanity?
Tim Sweeney (04:21:34) I see two contrasting worlds that have been brought about in the digital age. One is the world of social networks and people typing at each other: massive negativity and politics and hucksterism, and curation by engagement, often promoting negativity and toxicity. That’s a harsh world that I think is a step backwards in many ways. I think the foundation of the world is actually a little bit shaky because of the social dynamic that those platforms have brought on. But then I compare that with the good-spiritedness of what’s happening online when you’re connected to real people. Actually playing Fortnite, playing Fortnite fill squads with people you’ve never met before and never talked to, and just judging what human connections develop there and whether they’re positive, I found those to be really, really excellent and endearing.
(04:22:28) I think the lesson from all of that is that humans talking to humans, being together in the real world or a virtual world, is a naturally empathetic medium, which naturally leads to bonding. And though conflict sometimes occurs, it’s just generally so much more promoting of our social norms, of good interactions between people, and of positivity. Whereas typing angry messages at each other is a self-reinforcing negative dynamic. And if you look at social media and you look at gaming that is increasingly social, I couldn’t see a bigger divide between any two media than I see there in terms of the actual social dynamics. One super positive, one super toxic at times.
Lex Fridman (04:23:18) Yeah, that’s actually really… The text-based medium. Now, that could even be around gaming. You could look at Discord; it can be really toxic in text. But you place humans together in the real world, here in the room, and I very rarely see humans not get along in physical space. And to the degree that you can create a digital space, like a metaverse type of space, where it’s sufficiently immersive, where you feel the other person, the empathy comes out, and then the joy that’s derived from the empathy comes out. And it’s just a reminder that humans… I don’t know, that humans are good and they want to see the good in others. They want to share the goodness. And when they get in that group together, there’s love there.
(04:24:12) Now, they might talk shit about some other group, this is the dark side of humans, but together, the dynamics of that group are joyful. So yeah, that gives me hope as well. And the more we can create those worlds online that make it super easy for us to connect in that empathic way, the better. I am grateful that you are pushing the boundaries of what’s possible in creating such worlds. And I’m grateful that you would talk with me today, Tim. This is amazing, and it’s an honor to talk to you.
Tim Sweeney (04:24:46) Oh, thank you very much. It’s been fun.
Lex Fridman (04:24:49) Thanks for listening to this conversation with Tim Sweeney. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Benjamin Franklin: “We do not stop playing because we grow old. We grow old because we stop playing.” Thank you for listening. I hope to see you next time.