Episode 64: AI, Socials, and Society with Steven Puri
September 8, 2025
Guest episode with Steven Puri. We chat across a number of topics, including films, characters, AI, and social media.
Hollywood stories, AI debates, and the science of flow all find their way into this wide-ranging conversation with Steven Puri. From what makes Indiana Jones compelling to why social media might be remembered like tobacco, the discussion moves through storytelling, technology, and how we manage our attention. Along the way we explore default and executive brain networks, the role of conflict in creativity, and the importance of stepping back to let new ideas emerge.
Steven Puri:
Transcript
So, you've had kind of some, this is not what we're here to talk about at all, but you offered this last time, so I'll take you up on it.
You've got some interesting stories from your days as a Hollywood exec.
You got one picked out that'd be fun to hear.
Well, you know, I'll tell you, something is on my mind this morning.
I was talking with someone about characters, right?
And about conflict.
Because conflict is what draws us in.
You can't have a story if there's not a villain, an antagonist, right?
And the villain could be nature.
The villain could be a man or a woman.
The villain could be people.
You know, the villain could be something inside of us, right?
But without conflict, nothing moves forward.
There's no story.
There's no drama, right?
So, I was talking with a friend about, you know, great action movies.
Action movies that either, you know, spawned all these sequels or critical hits or whatever.
And pretty much there is a trope with all of them, which is in the third act, the guy or girl is going to be the best archer ever, the best sword slinger, gun shooter, puncher, kicker, jumper, whatever the hell it is, right?
So, in the first act, you got to show the audience them shooting a bow, shooting a gun, punching somebody, right?
Hey, Marky Mark's the best sniper.
First scene of the movie.
He's at a shooting range doing some amazing thing.
He's in Afghanistan shooting people, right?
It's just what you do.
But there is one movie that has spawned billions of dollars of sequels and theme park rides and all that that violates this, violently violates this.
And the reason it works, the reason it became this movie that spawned a billion sequels was the character writing is so good.
You just, you're in.
And that movie is Raiders of the Lost Ark, which spawned the Indiana Jones franchise, right?
I've done the Disney Universal Ride, whatever it is.
I've done all that stuff.
And what is so interesting is when you examine that movie, it violates this rule that every other script follows.
Because they do such a good job setting up in the first act.
He doesn't believe in God.
He's raiding tombs.
He's stealing sacred idols.
He's doing something like that.
Maybe in some gray zone.
He's not as evil as Belloq.
But, you know, he's kind of doing stuff that's kind of sketch.
And he goes and meets the girl, right?
Sort of beginning a second act.
And the girl's like, you never loved me.
You were my father's partner.
And you just used me and threw me away.
You never cared about me.
So, you go through this movie.
And we were talking about how in the movie, if you ever watch it again,
the movie would end the exact same way if the hero had not been born.
Like, that is how little effect he has on the plot of the movie that spawned all these sequels and theme park rides and all the shit, right?
At the end of the movie, you're like, if he had done nothing, it would still end the same way.
It's so funny you bring that up.
Because I was going to ask you the question, like, after you were through your thing, whether or not you – what you thought about this theory of, like, you could just get rid of him completely and the movie would be the same.
Like, the events of the – and it's so true, right?
It is so true.
And the movie works because in the climax of the movie, unlike every other action movie, where the hero is a great kicker, puncher, shooter, bow-and-arrow guy or girl, whatever, they do the thing.
Here, he's tied to a stake.
Right.
Your hero is immobilized.
The girl is tied up behind him.
They can't even see each other, right?
What an awful ending to a movie.
Who wrote this garbage, right?
What he says, four words that pay off the entire movie.
Marion, close your eyes.
And with that, he goes, yep, I believe in God.
God's coming.
It's going to be bad.
I love you.
I'm going to save your life right now.
Close your eyes.
You know, you're right.
I never thought about that that way.
That's absolutely true.
And it works.
And no one – I'm telling you, any other action movie, people are like, oh, you know, how come he didn't do the thing, the kicking and the jumping?
And the Jackie Chan shit in the ending, da-da-da, right?
In this one, he does nothing.
He's tied to a stake.
And people are still like, oh, yeah, I'll go see five more movies with that character.
Right, right.
It's cool, you know?
And that's great character writing, man.
It's people – we relate to each other through stories, through character.
Well, he's a very compelling character.
And I think – because one of the things I was thinking while you were talking about this was how much of that is due to also, you know, the charisma and stage presence of Harrison Ford specifically.
But even that's not entirely fair, right?
Because there was a whole series of books and, like, Young Indy and all this kind of stuff that has all spawned out of this that was maybe not equally as popular but also made its way into the cultural zeitgeist.
So, clearly, you don't necessarily even need the actor in this case to still have something useful.
Yep.
And what's so funny is the screenplay, I think, was co-written with Larry Kasdan, who obviously is an incredibly talented character writer and worked with Lucas on, you know, some of the Star Wars stuff too.
And the story, I think, was with, like, Philip Kaufman, who, you know, is probably best known for, like, The Unbearable Lightness of Being.
And so, I'm like, really dramatic stuff.
Like, maybe he did The Right Stuff.
But there's some stuff he did where it's just like, oh, wow, that guy came up with this?
And it's remarkable to think about, you know, the people who write from a deep understanding of humanity as opposed to the writers who write because they've seen a lot of movies.
And my dad calls this – because he counsels me about this with cooking, with everything.
He's like, you have to go to First Principles.
He's like, don't buy a ready mix of anything, like get raw carrots, get raw milk, get raw – you know, you get first ingredients, don't get things that are pre-made for you.
And same thing with – he has, like, 54 patents to his name in chip design, in logic circuit design for very large-scale integrated circuits.
And his thought was always, you have to go back to understanding exactly how, you know, the electrical charges, the electrons move across these gates.
And it unlocks, you know, your brain about what else to do.
And I love that because you think about so many of the movies you see where you and I can both sit there and go like, oh, dude, the next scene is going to be the one where the girl gets captured by the bad guy.
Because she walked outside the magic circle while he's gone.
And, you know, you're just like, oh, I get it.
Because they're these derivative writers who are like, oh, man, it'll be sort of like that scene, you know, in –
Right, right.
You know, Air Force One where – and they pitch movies because they've seen movies, as opposed to because they understand people.
And what's really funny is right now people conflate artificial intelligence and LLMs.
They just basically call LLMs AI.
Sure.
Like, this is AI, right?
Right, right.
And LLMs, the way they're currently structured, they're a glorified version of Google autocomplete.
Like when you're writing an email and it suggests the next word or you need to type in your password, it remembers, you know.
So, that's on a huge scale.
Right.
That's what they are.
They're not actually intelligence.
They're just saying, hey, man, the probability is the next thing after rainy is you're going to write like street or day.
Okay.
Right.
Let me guess.
No, it's so true.
It's so true.
They're definitely not – they're not all they're cracked up to be, right?
They're just kind of super prediction engines.
And you're right.
It's not all that different than just when you're texting or whatever and it shows the next recommended word or whatever right there on the screen.
It's not a big – it's not a huge divergence from that.
It works a little better.
It's more at scale.
It can absorb more at one time, all that kind of stuff.
And obviously, they're starting to bolt stuff onto them so that they're not even necessarily true LLMs anymore.
They're becoming tool sets, right?
But it's all – yeah, it's all based on this one key piece in the middle, which is just like, hey, what are you most likely to say next?
You know, 80% of the time, you're going to say this word next.
So, we'll go with that.
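To make the "super prediction engine" idea described here concrete, below is a minimal, illustrative Python sketch of next-word selection over a toy probability table. The table, words, and probabilities are invented purely for illustration; real LLMs learn these distributions over enormous vocabularies rather than looking them up.

```python
import random

# Toy sketch of next-word prediction: given the words so far, look up a
# probability table and pick (or sample) the most likely continuation.
# The table and probabilities below are made up purely for illustration.
NEXT_WORD_PROBS = {
    ("rainy",): {"day": 0.55, "street": 0.30, "season": 0.15},
    ("rainy", "day"): {"today": 0.5, "again": 0.3, "blues": 0.2},
}

def predict_next(context):
    """Return the highest-probability next word for the given context."""
    probs = NEXT_WORD_PROBS.get(tuple(context))
    if not probs:
        return "<unknown>"
    return max(probs, key=probs.get)

def sample_next(context):
    """Alternatively, sample in proportion to the probabilities."""
    probs = NEXT_WORD_PROBS.get(tuple(context))
    if not probs:
        return "<unknown>"
    words, weights = zip(*probs.items())
    return random.choices(words, weights=weights, k=1)[0]

print(predict_next(["rainy"]))          # -> "day"
print(predict_next(["rainy", "day"]))   # -> "today"
print(sample_next(["rainy"]))           # -> "day" most of the time
```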
Yeah, completely.
And there's – I don't remember his name, but there's one guy who's very vocal, one scientist who's very vocal about, like, this entire model of, you know, we're labeling these huge data sets, paying people in Malaysia and India to go sit there and write, this is an image of a car, this is an image of a dog.
And then we pretend that doesn't exist.
All the companies that do this, like the Scale AI kind of stuff, they, like, pretend it's being done automatically.
It's magic, yeah.
Exactly.
And it's, you know, when AI stands for anonymous Indians, okay?
Right.
So, the whole training of a labeled data set and then assigning, you know, weights and biases and having, you know, prediction and inference, you know, sort of thing happening.
He was like, this is all wrong.
It's just the wrong way of thinking about it.
He's like, it should learn from zero.
So, instead of being fed a trillion tokens, we should have it learn the way a human does if you actually want it to think.
Start from nothing, give it a task, have it fail, fail, fail, succeed.
Oh, what was the common element of this success, my last success?
Oh, gotcha.
So, when I write, when I walk, when I think, you know, just like a child.
Like, how does a child toddle?
It falls down a bunch until it figures out, oh, this is how I get my balance.
And I think it's super interesting.
And he's kind of like the lost voice in the wilderness because, you know, NVIDIA doesn't want that.
Like, the companies are building out, you know, trillion-dollar data centers based on this model of, okay, we just have to pre-train them and then have huge inference engines.
That capex, if it all turns out to be, you know, for a five-year window of time, and then after that, everyone's like, actually, it works a lot better if they kind of learn and share their knowledge.
It would be really interesting to see.
I wish I could remember his name.
I've seen that in sort of the video game space, actually.
Really?
Yeah, I'd have to dig it up, but there was a series of, like, videos and experiments and things that were going around, oh, I don't know, this was a few years ago, where essentially they would take, like, a physics engine, right, where, in theory, there is the likeness of a person and they have joints.
Like, they might have a knee joint and an ankle joint and their body and their arms and all that kind of stuff.
Right, sure.
So, in theory, with the physics engine sitting behind this, you know, diagram, basically, you could move the joints and create a person walking, right?
And then they stuck some sort of machine learning to just play with it, right?
And there would be a goal and it would be like, okay, you got to move the thing from, like, here to here.
And so, you watch, you watch, like, a time-lapse video of, like, fail, fail, fail, fail, fail as this machine learning that knows no concept of what walking means or how joints work.
Like, try to figure out how to, like, make this happen, to move a figure from, like, one end of a screen to another.
Really?
And some of it, yeah, they're really cool.
I'd have to, I'd have to go find it, but there was a whole bunch of them like this.
Um, and in some cases, the machine learning would get to a walking thing.
In other cases, it would end up with some really crazy, like, horror-movie thing, you know, using its hands and dragging its head along the ground, like one of those crazy horror movie scenes with some disfigured, disjointed, you know, creature or whatever.
Um, but yeah, so they did some of this stuff, but as I recall, it's extraordinarily, um, long, right?
Like, these things run for, uh, days to try to figure out how to just move a person from here to here.
So then if you were to extrapolate that out, right, like, what would it look like to try to compute on a scale where you are literally trying to let a machine learning algorithm figure out the basics of language, right, or whatever.
Or, um, like, that's just, I bet, I bet you, you'd have to be powered by the sun for that, you know?
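For readers curious what that kind of experiment looks like in miniature, here is a hedged Python sketch of the trial-and-error loop being described: an agent that knows nothing about walking tries random sequences of joint nudges, keeps whichever sequence moves a figure farthest, and slowly improves. The "physics" scoring function and all the numbers are invented stand-ins, not a real physics engine or a real reinforcement learning setup.

```python
import random

def distance_travelled(actions):
    """Stand-in for a physics engine: score a sequence of (hip, knee) nudges.
    The scoring rule is invented; real experiments simulate actual joints."""
    x = 0.0
    for hip, knee in actions:
        # Made-up rule: coordinated nudges move the figure forward,
        # wildly mismatched ones waste effort.
        x += max(0.0, hip * knee) - 0.1 * abs(hip - knee)
    return x

def random_policy(steps=20):
    """A random sequence of joint nudges, each between -1 and 1."""
    return [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(steps)]

best_policy, best_score = None, float("-inf")
for episode in range(5000):  # fail, fail, fail... keep whatever works best
    candidate = random_policy()
    score = distance_travelled(candidate)
    if score > best_score:
        best_policy, best_score = candidate, score

print(f"best distance after 5000 random tries: {best_score:.2f}")
```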
True, true, true.
But absolutely.
Yeah.
It's so interesting.
Uh, cause, uh, did you see this, um, not to just sit around and talk about AI all day, but did you see this study from MIT?
Um, it was, I don't know, maybe a month, month and a half ago that there were results flying all over the internet.
And the general idea was that what they found was, um, that brain function in people who jump straight to a tool like ChatGPT is very low.
Did you see this?
Yeah.
So they did one with doctors where, after using, uh, an LLM to help with diagnostics, they then took the LLM away.
The doctors were worse than they were before.
I didn't see that one.
I saw one with students.
Uh, and I have issues, I think, I can't remember if I was talking to this with my show before or not, but I have issues with the way they reported this because of course they went for like the most sensationalized headline, right?
So they, they, they took students and then some of the students went and wrote their own essay and some students went to ChatGPT and used that for, for that essay, right?
Right.
And what they found was the people who used ChatGPT, their recall was lower, their brain function was lower, blah, blah, blah.
What didn't really get reported.
And the most interesting piece of all of this was the middle group, where they had them first write their draft and then use the LLM to enhance it.
And the results suggested that brain function was even higher than the group that just wrote their own essay completely.
And it was almost not reported on at all.
Well, yeah.
Cause the headlines worse, right?
Like the headline that, that places wanted was like, you use ChatGPT.
It makes you stupid, right?
Like, but no, it turned out there was this whole other, this whole other group in the study that really, you know, was that kind of symbiotic.
Um, you know, you take your first crack at it and then you almost use it like a, like a collaborator, you know?
Hmm.
Well, I just want everyone at home to know I'm not actually here.
I'm sleeping.
This is an ElevenLabs voice with ChatGPT talking for me, doing a great job, by the way.
Really, really happy with it.
When I wake up, it's going to be great.
So, uh, so tell me about some of the, the work you've been doing recently.
Um, you know, I've been focused on where, okay.
We, we began by talking about conflict, right?
Right.
Conflict drives story.
So, I've been thinking a lot about the world that I'm going to hand off to my children.
And it is, it is my hope that my children's generation looks at my generation's use of social media the way I look at my parents and that generation's use of tobacco.
Where it's like, now we stand here in judgment being like, my God, half your friends died of lung cancer and, you know, you have emphysema.
And how could you do this?
It was killing you.
It was so bad for you.
And your parents are like, yeah, but it was cool.
Then, like, movie stars were all smoking, you know, and there were, you know, these studies funded by the tobacco industry about how tobacco is healthy, you know?
So it was a different world.
You don't, don't get how we didn't know how bad it was.
I hope my children look at social media and they're like, oh my God, most of you just wasted your lives.
Like you didn't even get stuff done because you would just scroll, you would double tap, you know?
And we're like, yeah, no, it sounds terrible, but there was this great Logan Paul video I watched.
I didn't get my work done, but I want to see him go to the suicide forest.
And you're like, in hindsight, it's embarrassing that I did that.
So I just hope, I mean, the thing on my mind a lot is I hope that we do evolve past this era of the tobacco of social media.
Yeah, absolutely.
I think about that.
I've always felt a little disconnected because, by some form of either complacency or it just never quite feeling natural to me, I never really got into it.
Like I didn't dive super hard into it, but I was enough on the periphery and I've been enough in the generation where obviously this came up, you know, kind of through our generation that, I don't know, it, it's almost difficult to not be engaged in it.
And it's, it's so pervasive.
It's so, it's almost omnipresent.
Right.
And I wonder what your thoughts are on, because obviously, you know, it's called social media, which is almost becoming an idol.
It's an irony because it's basically encouraging people to sit in their rooms and like scroll by themselves.
Yeah.
You've seen the studies.
This is not new.
This is just getting worse by the day though.
But what do you think?
So let's, let's, let's take your example, right?
Of another generation coming up that recognizes some of this danger.
It's still present, right?
Like you can still go to the store and buy cigarettes, right?
Like it's not like they're gone.
Right.
But it's, the danger is more recognized.
Um, and so let's assume that there's a generation that comes up like that.
What, how do you think we correct or, you know, kind of move back towards a society that's more driven by actual social interaction?
Like, what do you think that looks like?
Like, well, there are a couple big things happening and I wonder how they will affect each other because we have, yes, a generation that was raised with a cell phone in their hand on whatever social media they're into.
TikTok, Snap, whatever, right?
Insta.
And, you know, the way Mark and Evan, those guys have designed it, it's not just like socially sharing photos and videos, but they really want the messaging, the high frequency interaction stuff to happen there.
So it's like DM me, you know, send me a snap.
They want that to happen there.
So that's going to be very hard to dislodge because that's a behavior from when kids are really, really little now.
You know, it's not like, Hey man, I was 35 when Facebook came out and I was sort of out of an account.
It's that sort of thing.
And then you've got, you know, if, if artificial intelligence does progress at the rate it is, you know, whatever the model is that gets us there, there's definitely movement toward, Oh, there's an entire strata of job that is no longer a human job.
Right.
And these functions can actually be done really, really well by AI better than you.
Not you in particular, Brad, but other people.
And you make great coffee.
No one's going to do that for you.
Except the robot barista.
I saw at Starbucks, but that's another story.
I'm good with that.
I'm not, I'm not picky about that part.
As long as it's hot and good.
I just want to drink it.
Yeah.
So, uh, you, you have that thing, which a lot of, you know, people deep in it have been pondering for a while.
Now everyone's pondering like, okay, what happens?
Those jobs go away.
And this idea of universal basic income, there have been experiments with that.
You know, is it how humans are naturally wired?
I'm not sure.
Like for all of human history, we've had to compete to survive.
And what happens when you sort of say, well, no, no, no, we've created a safety net of a certain point.
Cause you don't have to all work.
I don't know how well that's going to sort out the same way college grads right now are dealing with, oh, wow.
There aren't actually as many jobs as there were five years ago.
Huh.
And it's going to get worse with each day.
Right.
So when you think about those movements of, like, kids that were basically wired from a very young age, this is how you use these tools in a high-frequency way, coupled with, it's really hard to get a job.
We're not really clear how you're going to produce work.
Will it be a bad thing if you waste your life?
I don't know.
I wonder that.
I wonder about like talking to my son and explaining to him like the value of achievement and how good it feels to do something that might've challenged you.
It might've been hard, you might've failed along the way and then finally succeeded.
And he could very easily say, yeah, but why, why go through all that struggle?
Like we've got a bot or, you know, a GPT or something that will do that for you.
Like I have glasses on brand.
When you talk to me and you ask me like, what's the temperature outside?
I don't even need to think because my glasses listen to you.
And they put in my little, you know, heads up in my glasses, the answer.
It's like, oh, it's 62 degrees and it's raining.
You know, it's a weird question.
It's like, did you see that Keanu Reeves interview?
I think it was on Charlie Rose where they were talking about he goes to dinner with his friend, the director.
I don't think so.
Okay.
So I'm going to paraphrase it because I watched and I was like, oh, wow.
And I did work with Keanu, one of the incredibly nice guys in the business.
And he relates a story where he went to a friend's house for dinner, small family dinner, you know, a friend who's a director.
And I think the director has like two teenage daughters.
And over dinner, one of the daughters feels, you know, comfortable enough to say like, hey, dad's friend.
I know you're really famous for these Matrix movies, but I have to admit I never saw them.
Like, what's so cool about them?
What are they about?
He's like, oh, it's about a guy who realizes he's living in a computer simulation and wants to break out to the real world.
She's like, oh, okay, why?
And he's like, well, because when he wakes up, he realizes like his world is a simulation.
There's another world.
It's the real world.
And the simulation he's in is kind of this fake world.
She's like, okay, so why?
And Keanu's like, oh, my God, wow.
I'm trying to explain to her something that doesn't make sense to her generation.
She's not understanding one of the words I'm saying.
The concept of like, but if you're happy and you're living in the sim, then the downside is what?
You know?
Which is kind of embodied by one of the antagonists in that movie, right?
Right.
Cypher, I think it was.
Something like that.
Yeah.
Yeah.
I knew this thing is not real.
He's like, plug me back in.
Like, the real world sucks.
Yeah, exactly.
Exactly right.
Yeah.
It brings up a couple things as you were talking through there.
Like, kind of two, I guess, anecdotes or stories.
One is, and this is something that is one of those things that I don't think I ever would have known.
I would have missed it if I hadn't had it to begin with.
And, you know, it's that idea that, you know, you go look to a world before smartphones in particular.
Like, forget AI and all the rest of it.
You just look at like.
There was a world before smartphones.
Let's just talk like 2005-ish, right?
And you're out at like a bar or something or in a room or whatever with some friends and a question comes up.
And you sit there and just kind of go around the table with everyone's thoughts on what the answer to said question may or may not be, right?
And no one at the table is probably any more of an authority than anyone else around the table on this particular, whatever this topic is.
But everyone gets a chance to kind of like say a thing and like maybe you come to consensus.
And then at that table, that truth is whatever the five of you have agreed on, right?
And you compare that to the experience today, which is the same thing can happen.
But all it is, everyone lunges for their cell phone, right?
To look up the actual answer.
So that all that discourse is gone.
So like even in that – and it's one of those things where like you would hear that maybe as someone who never had that experience and be like, well, what was the point?
Everyone's sitting around arguing with no actual information.
Like how is that better?
It's like I don't know.
It just was.
But the other thing that you were talking about, which is like why or what would you do if you had universal basic income and you have to worry about that kind of stuff.
And so I did some work in Denmark for a while.
And one of the things that was interesting there, one of the things I noticed is I noticed very quickly that there was this huge like entrepreneurial spirit there.
Like everyone I bumped into was like, oh, I'm working on starting this business and doing that business and da-da-da-da-da-da, right?
Like now granted, they're a very small country, right?
But what I – after digging into this a bit, like trying to figure out why is the concentration of people who aren't just working for someone but are trying to start something themselves, why it's so much higher?
Right.
And what I came up with was that because there are all these government safety nets there and you're like you're not working for benefits because benefits are taken care of.
So like your medical benefits, your dental, all that kind of stuff is done.
Your education is done.
Yeah.
And then even if your business fails out, you get some amount of income at least for some period of time to kind of recover.
Yeah.
And what it has spawned like much to your point is it's not like it spawned a bunch of sit around and do nothing individuals.
What it spawned was a bunch of people who are way more motivated to get out there and try things and fail kind of quickly, which I think speaks to what you're talking about here a little bit, which is like if everyone had everything taken care of for them, what would they actually do?
And that's kind of an existential question.
It is.
I have a question for you.
So there's this big thing going on in this country where you have states like Texas, which is where I live, for better or worse, and this marketing campaign, sort of PR campaign, around how California is no longer the state of innovation, like Texas is making itself this deregulated state where anything goes.
Don't incorporate in Delaware, incorporate in Nevada or Texas, that sort of thing.
And I'm curious, do you think that outlives this administration?
Well, I guess, I don't know.
I'd have, first of all, I'm probably a little, I'm probably a little under-informed on the subject.
What are the reasons behind that push?
I've seen the push, but I don't know.
I'm not close enough to it where I know what their reasoning is.
Is there something behind that or is it just marketing?
Well, I think there is a lot of lobbying money that Silicon Valley had previously put into different coffers, let's say.
And right now with this administration, there's been this explosion of, okay, we actually found out we can donate money and get regulation removed or adjusted towards us.
So, you know, venture capitalists like Marc Andreessen, they just talk their books, David Sacks just talks his book.
And they're like, everything should be deregulated, stable coins, crypto should be, you know, accepted.
We want to exit all these companies, these portfolio companies on our, you know, in our funds.
We need to get liquidity events.
And we were stymied under the past administration.
So now we want to take Circle and Coinbase, just everything put in the public market.
Like even Chamath is like the scammer SPAC king of the world.
He's back to doing SPACs again.
He's like, okay, great.
I'm going to get my fees for doing nothing.
And I'm going to fleece all these retail investors who invest in, you know, whatever I'm doing.
And he demonstrated very well how he could do that before.
And so you have the states that are offering the least regulation, least enforcement saying, hey, come here.
This is the era of, you know, the wild, wild west.
We'll offer you an environment that's less regulated than Delaware or California.
I probably tipped my hand a little bit because I am astonished how it's happening.
But I guess every now and then, retail investors have to learn a hard lesson about, like, there's a reason why you have the Consumer Financial Protection Board, like Bureau.
They actually are looking out for these kind of scams to protect you.
But right now, we're in that era where everyone thinks every stock just goes up magically.
Like, everybody's a genius because they have money in the stock market.
And who was it that said that, you know, when the tide goes out, you understand, I think it's Buffett who said, when the tide goes out, you can tell who's been swimming naked.
Yeah, that's, I'm going to answer this in kind of a roundabout way.
First of all, I'm sure that over time, right, the pendulum will swing to some degree, right?
You get out past this administration under the assumption that maybe it really 180s and the tides will start turning a bit, I'm sure.
Yep.
I think people underestimate or overestimate, depending on how you look at it, just because someone doesn't, someone or, you know, a party or a company or a legislation or whatever, just because they don't get 100% of what they want doesn't mean they didn't get 80% or 70% or 60%, right?
So, even if it never, even if you get out a little ways and things start to swing back and they call it a win, you know, from their point of view or whatever, that doesn't mean that 100% of whatever was changed before has been course corrected.
So, I guess the long answer short here is that, yeah, some of it will stick around for sure, I'm sure, because there's no way that someone later is going to come around and just reverse literally everything and then also do stuff on top of that.
It's probably going to be a 60% measure.
And if it swung super hard in another direction before and changed, say, 120% of what could possibly be going on and then it corrects by 60 or 70, you've still got that difference, right?
So, yeah, I would imagine down the road, you're still looking at an environment that's at least less regulated than it was, say, a few years ago, just because of the natural course of things.
Let us hope whatever the work is, the honest work that people do to, you know, move things forward, not packaging SPACs and, you know, pump and dump schemes and crypto, but the real work, like they have the ability to focus and to do that.
You know, we're not just all building tools for social media.
I think it's amazing the way, like in San Francisco, you can drive through, you know, the seven miles square, head down to Menlo Park and to get to your beautiful office or to get to your house in Atherton or whatever.
You are driving right by horrible social problems, you know, people who are just homeless on the street and, you know, mentally, you know, not all there.
And you can still look yourself in the mirror and be like, I got to get to work because I have to work on important problems.
Like, how do we optimize the double tap like thing in our social media, you know, app that does food delivery, right?
I need to, I need to get the smartest engineers here to figure out, like, how do I get those hot ramen noodles there two minutes earlier?
No, I hear you.
I've been on the job hunt recently in a more serious way than I had been in the past several months, you know, because I should probably make income at some point in my life again.
There's that, until you need the income.
But, you know, going through these processes, I look at so many companies and it does, it feels, it feels, it feels hollow.
That's the best word I've been able to come up with is like hollow, right?
Now, I probably, I was raised in a very education centric household.
And like, to me, you know, I can't, I worked for 15 years in higher education.
It's almost hard for me to, to let go of, of that.
But on the same token, like I, I, and I, I've been at some interviews, um, along the way with either mission driven organizations or other higher ed institutions or whatever.
And they're like, why are you in this industry kind of thing?
Yeah.
And my answer tends to be, you know, I was trying not to be.
I said, and they pulled you back in.
Is this the Godfather?
No, no, it's more like the other industry just pushed me out.
Right.
Cause I just can't, I have a hard time thinking that what I should do with my life is wake up every day and make a billionaire another billion dollars.
Like, I just don't.
Right.
I have a hard time with that without these being some feigned, you know, attempt at like meaning in the middle, like give me something here, right?
Like be something to latch on to.
You're a dreamer, aren't you, Eddie?
I guess.
But, um, yeah.
So I don't know.
It's, it's, that's, uh, that is an interesting thing.
And you're absolutely right.
So many of these things, I think, and I've been trying to find a way to articulate this.
Uh, so I'm going to give it a whirl and we'll see where we go.
Okay.
Here we go.
If you rewind way, way, way, way, way back.
Right.
And man is hunting for food and you're just trying to eat to survive.
And then you slowly start building systems on top of that, where you, you were hunting and you learned fire and you learned cooking.
Right.
And then next you learned to kind of do it in a group.
And then next you learned that it was really helpful to have, like, somebody in the group be the leader of the group.
And then you'd build that out and build that out and build that out.
Right.
So we get here today where people are so far removed.
Right.
Like, cause you're right.
Like I'm going to work to do the thing to optimize the double tap button.
And somehow that puts food on my table.
And there's like a thousand layers of abstraction in the middle.
Right.
Yeah.
And it's like, it's like crazy.
Right.
It's, it's, it's how far up that chain.
And yet somehow by building that chain, um, we have developed society into something that's this advanced, you know, like it's almost counterintuitive.
I suppose when you think, when I, when I think about it, um, there is a book you would love.
It is about the cycles of human civilization.
And it's written by Will and Ariel Durant, who wrote The Story of Civilization.
Right.
So they spent most of their lives writing this 11-, 13-volume tome kind of work.
And at the end of it, they wrote this very short book called The Lessons of History.
And it was like, it can be subtitled what we learned over a lifetime of studying humans.
Right.
Great book where there are chapters in there where they just talk about how you can tell where you are in that cycle.
Right.
And there's that phenomenon of like, you know, the sons of great men are rarely great men.
In other words, people who had such a fire in their belly to rise up and do something great, whether that is great as a positive thing or great as a negative, but a great act.
Right.
Build something.
And very often like the children, the immediate children don't do that.
Right.
They're sort of there and they're coasting and dad made a lot of money or something.
And then sometimes this skips a generation.
You have their children actually achieve something.
And they talk about these sort of phenomena and they talk about them also in the span of civilizations, how that happens, not just inside a family, but inside the collective of a group of people.
And you would really dig this book.
Yeah, I'll definitely pick that up.
That'll be maybe next on my list.
I've been slogging through a book that I've been reading for way too long.
And as soon as I'm through this, I'm looking for my next read.
And so, check it out.
And it's a quick read and it's just super, you'll nod your head.
There's some chapters where you're like, oh yeah, that's spot on.
And other ones where you're like, wow, I never thought of it that way.
That's awesome.
I'll definitely check it out.
So this has been a super interesting conversation.
I'd love to do two things here.
Ready.
Let's do it.
Number one, we didn't get to any of the topics that we had written down ahead of time.
Yeah, we thought.
Let's briefly discuss one of them.
But then I also want to give you a chance.
Please introduce yourself.
Who is that voice talking to Grant?
Who you are, what you're doing these days, if you have anything you want to kind of promote or pitch.
And then the topic that I was super interested in talking about that we had kind of written down ahead of time was really around the executive versus the, what are they?
The default neural network and stuff.
I'd love to just, you know, maybe a couple of minutes on that, you know, at the end here.
But first, why don't you tell everyone who you are and what you got going on?
Sure.
Yeah.
My name is Steven Puri.
And I have been an executive on a couple of films, at studio lots, and also raised like 20-plus million dollars of venture, run some startups, successful and failed.
And I talk a lot about flow states, about remote work.
I, as you probably have surmised, feel very passionately that we need to have tools to help us do the things we're capable of.
And it's a sad thought that some of us will die with the great thing inside us unreleased.
So, I make an app through the Sukha company, S-U-K-H-A, and it's a flow state app.
It helps you get into a flow state to really do focused work.
And happy to be here.
If you want to talk about executive mode and default mode networks, I'll mention there's a great book on this subject about the neuroscience of it by Olivia Fox Cabane and Judah Pollack called The Net and the Butterfly.
And the short of it, since I know we're at the end of our little episode, is that you have from a very young age this default mode that, let's think of it in reductive terms, is like the child kind of brain.
It's like looking around and experimenting and trying to understand things and coming up with offbeat ideas.
And then you develop the executive mode, which is one that executes action.
I need to be at work at nine.
I have to have a slide deck done.
I have my homework to do.
I've got to be at the baseball practice, you know, and is responsible that way.
And what's interesting is as you have both of them operating, sometimes the best ideas come when the executive mode network is busy, which allows the default mode to do that weird, like, I don't know, what does my cell phone taste like?
Or what if peanut butter and chocolate were mixed together?
Would that be good?
You know, like those kind of thoughts that your executive mode network would not necessarily give you.
So in terms of, like, fostering creative practice in yourself or your team, sometimes that's really helpful.
And you may have experienced this if you have interesting ideas for the blog posts you need to write or some feature for your app when you're walking, biking, showering, driving, you know, washing the dishes, doing something where your executive mode network is busy.
So that, I think, is super interesting as you become, you know, more tuned to managing your own productivity.
Flow states are a great thing.
Understanding your default mode network is a great thing.
Does that cover it?
Is that short enough?
Yeah, that was awesome.
I think we see this all the time, right?
This is the daily thing where you're focused on something, focused on something, and you can't break through it, you can't break through it, you can't break through it.
Then you put it down, you walk away, and you're, like, washing the dishes, and all of a sudden you have the answer, right?
Yeah, that's exactly it.
Or you have the answer, but the answer is, like, some totally out-of-the-box thing where it's, like, oh, my God, if I take a step back and look at this in a totally different way, I can fix this in a totally different, you know, totally different way, you know?
Which you never would have thought of if you're just sitting there staring at a whiteboard or sketching or, like, whatever you're trying to focus and focus and focus and focus.
But you just get to that zone.
As much as there are flow states, right, I don't know if there's research on this, but there's also anti-flow states.
I'm a big believer in this.
Like, when you're too zeroed in on something and you just can't crack it, sometimes the best thing to do is put it down.
And I always wonder what the best time is to move from one of those modes to the other.
A good point.
A good point.
Yeah.
Yeah, that's awesome.
You know, it was an excellent conversation talking with you.
I so appreciate you coming on the show.
It was fun chatting.
Thank you for having me.
Absolutely.
And I will link to all of your various things in the notes.
So for everyone listening, your website, your app, any of the socials that you'd like, all those things will all be in the notes for you.
Super fun.
Thank you.
All right.
Thank you so much.
Everyone listening at this point.
Yeah.
Thank you for listening.
Thank you.