AI In Film: Is There A Ghost In The Machine?

S1 Ep3:
Pour yourself a glass of warm milk and join us while we rant about AI and its unstoppable quest to drain all creativity from the world, leaving us nothing but AI slop to feed our soulless bodies.

NOTE: The “artwork” for this episode was generated using AI with Adobe Firefly, because it seemed appropriate for this week’s topic. 

Show Notes

AI McDonald's Ad

Check out the McDonald’s Statement in Futurism.

And here is the production company’s statement: The Gardening Club’s Statement

Tilly Norwood - AI Actress

Tilly Norwood

Here are some resources to learn about the world’s first AI “actress,” sloppy gurl.

Tilly’s Website

Tilly’s Instagram

Xicoia’s Website (AI “Talent Agency” for AI slop)

Darren Aronofsky's Primordial Soup

Darren Aronofsky has launched Primordial Soup, a new venture he’s positioning as empowering filmmakers with AI-based creative tools.

Primordial Soup forged a partnership with Google DeepMind’s AI research team, and together they are working with three filmmakers to produce short films.

The first of Primordial Soup’s film projects is Eliza McNitt’s “Ancestra,” which uses a mix of live-action performances and AI-based visuals, set to premiere next month at the Tribeca Festival. 

But the first project for a brand is a series for Time.

You can see a trailer for this AI slop here:

Natasha Lyonne's Asteria Film Co.

Much like Darren Aronofsky, Natasha Lyonne’s new AI slop machine, Asteria “Film” Co., is set to make a bunch of crap.

Bonus

Show Transcript

00:00:10.029 –> 00:00:12.689
Welcome back to Nightmare Logic, the podcast that

00:00:12.689 –> 00:00:14.630
your mother warned you about. I’m Christopher

00:00:14.630 –> 00:00:18.050
Smith, here with Peter Sawyer. Strap in as we rant

00:00:18.050 –> 00:00:20.449
about artificial intelligence and ask questions

00:00:20.449 –> 00:00:22.910
that we’ve all been asking: how screwed are indie

00:00:22.910 –> 00:00:35.729
filmmakers, really? Hey Peter, how you doing, man?

00:00:36.210 –> 00:00:38.609
Not too bad. How about you? Yeah, pretty good.

00:00:38.729 –> 00:00:44.130
What’s new? I have almost a beard. I haven’t

00:00:44.130 –> 00:00:47.710
done this in… Yeah, you do. You look slightly

00:00:47.710 –> 00:00:52.469
homeless. Good. No, this is… I call the beard…

00:00:52.469 –> 00:00:55.310
I heard a quote about beards as being the padded

00:00:55.310 –> 00:00:58.250
bra of masculinity and I stand by that. I think

00:00:58.250 –> 00:01:00.490
it’s just kind of a funny way to put it, but

00:01:00.490 –> 00:01:05.060
I’ve grown one once and that was 2000… 12 and

00:01:05.060 –> 00:01:07.280
I had red facial hair and I was like, oh, hell

00:01:07.280 –> 00:01:10.939
no. So I didn’t keep it very long. Now that I

00:01:10.939 –> 00:01:13.099
have some gray in my hair, this is more of an

00:01:13.099 –> 00:01:15.319
experiment to see like if it’s all gray, but

00:01:15.319 –> 00:01:17.359
I still see the fucking red, so. Yeah, yeah,

00:01:17.379 –> 00:01:19.500
totally makes sense. Did you watch anything good

00:01:19.500 –> 00:01:23.099
this past week? Yeah, I have, I guess two and

00:01:23.099 –> 00:01:26.340
a half things I wanna mention and I’ll start

00:01:26.340 –> 00:01:28.480
with a half thing because I’m not quite done

00:01:28.480 –> 00:01:31.700
with it, but. If you’re a Texas Chainsaw Massacre

00:01:31.700 –> 00:01:35.719
fan, you are probably well aware of Chain Reactions,

00:01:35.780 –> 00:01:39.159
a documentary about it with like five different

00:01:39.159 –> 00:01:44.180
speakers. And I’m four fifths of the way through.

00:01:44.439 –> 00:01:46.760
But the reason I thought to bring this up was

00:01:46.760 –> 00:01:50.659
last week we talked about titles and the documentary

00:01:50.659 –> 00:01:54.120
opens with Patton Oswalt talking about the title

00:01:54.120 –> 00:01:56.040
of the Texas Chainsaw Massacre. That’s cool.

00:01:56.099 –> 00:01:58.969
And how he’s like. Because it was from one of

00:01:58.969 –> 00:02:01.450
his like stand-ups, it’s footage of that, and

00:02:01.450 –> 00:02:02.810
he’s basically saying he’s like, yeah You need

00:02:02.810 –> 00:02:05.590
a title that you can see in your head before

00:02:05.590 –> 00:02:08.289
you even see the movie. And so he’s like, The Texas

00:02:08.289 –> 00:02:10.629
Chainsaw Massacre is it. And the funny thing about

00:02:10.629 –> 00:02:14.509
it is, it is, but it’s also not true, because in

00:02:14.509 –> 00:02:17.210
The Texas Chainsaw Massacre, only one person gets

00:02:17.210 –> 00:02:21.090
massacred by a chainsaw. But yeah, it’s a, it’s

00:02:21.090 –> 00:02:24.289
a hell of a name, um, and it’s really insightful.

00:02:24.310 –> 00:02:26.990
I highly recommend people watch this because

00:02:26.990 –> 00:02:30.960
every person so far has really interesting insight

00:02:30.960 –> 00:02:35.319
to the Texas Chainsaw Massacre, but also that

00:02:35.319 –> 00:02:37.759
kind of brand of horror movie that’s just primal

00:02:37.759 –> 00:02:40.740
and there’s no fat on the bones and how it pushed

00:02:40.740 –> 00:02:43.759
the limits and was really, Stephen King was like,

00:02:43.759 –> 00:02:46.520
this wasn’t even in this. Like guys, that’s how

00:02:46.520 –> 00:02:49.419
insane it was when I saw it. Wow. Yeah. You know,

00:02:49.419 –> 00:02:51.120
it’s also interesting cause it, I’m pretty sure

00:02:51.120 –> 00:02:53.219
it looks pretty low budge for the time, you know,

00:02:53.340 –> 00:02:57.400
like 16 millimeter film probably. Yeah, there

00:02:57.400 –> 00:03:00.460
was a theory, I think, that it was shot on 16

00:03:00.460 –> 00:03:03.840
and then presented as 35, which made it look

00:03:03.840 –> 00:03:07.479
even more grainy. Yeah, yeah. So it looked like

00:03:07.479 –> 00:03:10.340
you were watching something that was real. Yeah.

00:03:10.599 –> 00:03:13.740
You know, it’s interesting. My executive producer

00:03:13.740 –> 00:03:15.259
on a film that I’ve been working on for a long

00:03:15.259 –> 00:03:18.139
time, a documentary, she worked on the first

00:03:18.139 –> 00:03:21.400
Chainsaw Massacre as a… I think she’s like

00:03:21.400 –> 00:03:22.580
a production assistant or something. It was the

00:03:22.580 –> 00:03:25.379
first thing she did. Wow. In the 70s or whatever?

00:03:26.939 –> 00:03:30.759
Yeah, she’s older and I think that she lived

00:03:30.759 –> 00:03:33.199
in Texas at the time and you know, that was just

00:03:33.199 –> 00:03:36.360
like the local production. Yeah, I don’t doubt

00:03:36.360 –> 00:03:40.400
it. It’s a very cool documentary. Yeah, I want

00:03:40.400 –> 00:03:42.020
to check that out actually. I love any kind of

00:03:42.020 –> 00:03:43.719
making-of, behind-the-scenes kind of documentaries

00:03:43.719 –> 00:03:47.039
because I mean, I don’t know, I get all… First

00:03:47.039 –> 00:03:48.919
I like… learning how they did things, but I

00:03:48.919 –> 00:03:50.759
also get kind of stoked about making my own projects.

00:03:50.800 –> 00:03:52.780
So, yeah, definitely want to check that out.

00:03:52.979 –> 00:03:55.340
I’ve actually never seen any of the Texas Chainsaw

00:03:55.340 –> 00:03:58.240
Massacres. See the first one, just because that’s

00:03:58.240 –> 00:04:02.219
kind of a… It’s like a Night of the Living

00:04:02.219 –> 00:04:06.659
Dead or a, you know, Psycho kind of, Exorcist.

00:04:06.979 –> 00:04:09.860
One of those, like… It’s canon. Yeah, yeah,

00:04:09.900 –> 00:04:12.680
for the genre, because it was such a shocking

00:04:12.680 –> 00:04:17.139
movie. Yeah, I’m going to check that out. I saw

00:04:17.139 –> 00:04:19.540
another movie that Peter recommended this week,

00:04:20.199 –> 00:04:24.980
which is shockingly embarrassed. Shockingly embarrassed?

00:04:25.120 –> 00:04:26.800
I’m embarrassed that I hadn’t seen it up until

00:04:26.800 –> 00:04:29.759
this point. That’s In the Mouth of Madness, classic

00:04:29.759 –> 00:04:32.620
John Carpenter. But man, I loved it. Thanks for

00:04:32.620 –> 00:04:35.060
recommending that. It’s like psychological horror

00:04:35.060 –> 00:04:38.860
and also has like a weird, I guess they’re like

00:04:38.860 –> 00:04:42.620
aliens, something, but it kind of like D-mini

00:04:42.620 –> 00:04:45.439
quality. It’s the old HP Lovecraft. Exactly.

00:04:45.480 –> 00:04:46.839
It’s Lovecraft. That’s a very good way to put

00:04:46.839 –> 00:04:50.279
it. Yeah. So I really enjoyed it. I, you know,

00:04:50.300 –> 00:04:54.399
I, I, interestingly enough, I felt it, it looked

00:04:54.399 –> 00:04:59.000
a little old. I mean, it came out in 95 and I

00:04:59.000 –> 00:05:00.259
was a little surprised by that because it almost

00:05:00.259 –> 00:05:03.360
looks like an eighties or, you know, early like

00:05:03.360 –> 00:05:05.860
eighties film in a way, the way that like cinematography

00:05:05.860 –> 00:05:09.889
was and, uh, and the creature. some of the creature

00:05:09.889 –> 00:05:14.209
effects, but mid-90s it turns out, so. A fun

00:05:14.209 –> 00:05:17.790
fact about that movie is John Carpenter wanted

00:05:17.790 –> 00:05:21.730
Metallica’s Enter Sandman to be the opening theme,

00:05:22.209 –> 00:05:24.629
and he couldn’t get it, so he just wrote the

00:05:24.629 –> 00:05:28.269
kind of like metal, hard rock intro. Was he already

00:05:28.269 –> 00:05:31.589
scoring his movies at that point? Yeah, I think

00:05:31.589 –> 00:05:33.430
he always did. I don’t know about his short films,

00:05:33.810 –> 00:05:36.410
but you know, Halloween I think is the big one

00:05:36.410 –> 00:05:38.680
that put it… put his scores on the map, or

00:05:38.680 –> 00:05:42.379
the most familiar, right, when people… And

00:05:42.379 –> 00:05:44.519
did he score other people’s films in addition

00:05:44.519 –> 00:05:49.240
to making his own? That’s a great question. I’m

00:05:49.240 –> 00:05:51.180
not sure. It wouldn’t surprise me. I feel like

00:05:51.180 –> 00:05:53.379
I should know the answer. I was just curious

00:05:53.379 –> 00:05:56.699
I mean neither here nor there but he was asked

00:05:56.699 –> 00:05:58.939
recently to do that and I believe he said yes,

00:05:58.980 –> 00:06:02.240
and I cannot remember who asked him. That’s cool.

00:06:02.240 –> 00:06:05.240
That’s cool Well, the other thing I want to talk

00:06:05.240 –> 00:06:07.779
about real quick is Wait, wait, wait. I got my

00:06:07.779 –> 00:06:09.939
two other things. Oh, yeah, you’ve seen other

00:06:09.939 –> 00:06:15.480
things. OK. So if you know me, or I guess most

00:06:15.480 –> 00:06:20.379
of you probably don’t know me, I am not that

00:06:20.379 –> 00:06:24.699
well-versed in David Lynch. I had seen Dune

00:06:24.699 –> 00:06:27.600
as a little kid, and it didn’t vibe with me.

00:06:27.660 –> 00:06:31.959
I’ve seen Blue Velvet, and I’ve seen Eraserhead.

00:06:32.339 –> 00:06:34.860
And so I finally decided to watch Mulholland

00:06:34.860 –> 00:06:39.000
Drive. And I feel like I finally get the uber

00:06:39.000 –> 00:06:42.399
appeal of David Lynch. I don’t know if that’s

00:06:42.399 –> 00:06:46.240
seen as his best film, but I was entranced with

00:06:46.240 –> 00:06:48.139
it and highly recommend it. You should check

00:06:48.139 –> 00:06:50.339
out Straight Story. I’ve heard that’s really,

00:06:50.339 –> 00:06:54.000
really good, actually. I’ve heard that too. I’m

00:06:54.000 –> 00:06:57.360
curious to see a rated G version of a Lynch film.

00:06:57.759 –> 00:07:00.800
Yeah, I mean, I think it also doesn’t feel like

00:07:00.800 –> 00:07:03.300
one of his movies. Right. Yeah, I mean, I think

00:07:03.300 –> 00:07:05.750
it’s pretty straightforward. He told it pretty

00:07:05.750 –> 00:07:07.430
straightforward, I think, right? Right, hence

00:07:07.430 –> 00:07:10.910
the name. With Mulholland Drive, though, it made

00:07:10.910 –> 00:07:13.550
me realize with David Lynch, you just don’t know

00:07:13.550 –> 00:07:16.769
what’s coming. Every scene was a complete surprise

00:07:16.769 –> 00:07:19.910
to me watching that, which I found very exciting.

00:07:20.170 –> 00:07:23.110
Cool. Because it worked, but you just were like,

00:07:23.110 –> 00:07:26.430
oh, wow, this is interesting and different. Yeah.

00:07:26.769 –> 00:07:29.110
I’m not super familiar with his work beyond,

00:07:29.649 –> 00:07:35.579
I’ve seen, obviously I’ve seen, Twin Peaks, you

00:07:35.579 –> 00:07:38.199
know, and then I didn’t actually watch all of

00:07:38.199 –> 00:07:40.060
it I watched the first season part of the second

00:07:40.060 –> 00:07:43.220
season I think so I’m not a huge fan to be honest

00:07:43.220 –> 00:07:45.779
because I did try at one point starting Mulholland

00:07:45.779 –> 00:07:47.500
Drive and I turned it off cuz I mean I was a

00:07:47.500 –> 00:07:49.620
lot younger and I probably just wasn’t primed

00:07:49.620 –> 00:07:53.220
for it, you know, but I I don’t know. I mean

00:07:53.220 –> 00:07:55.899
I should get another shot, but I generally speaking

00:07:55.899 –> 00:08:00.740
don’t always sometimes like psychological types

00:08:00.740 –> 00:08:04.180
of films that jump around a lot and are, you

00:08:04.180 –> 00:08:06.459
know, somewhat disorienting. Sometimes that comes

00:08:06.459 –> 00:08:08.139
off as confusing to me and I don’t like when

00:08:08.139 –> 00:08:11.839
things feel confusing. Yeah, I feel like with

00:08:11.839 –> 00:08:14.720
Mulholland Drive, you could spend a lot of time

00:08:14.720 –> 00:08:18.199
analyzing it. And some people do. Yeah, that’s

00:08:18.199 –> 00:08:20.160
sad. I’m sure that’s the thing. But I mean, I

00:08:20.160 –> 00:08:22.459
just cheat and we’ll look up what someone else

00:08:22.459 –> 00:08:24.480
put up there. And I’m like, oh, yeah, I guess

00:08:24.480 –> 00:08:27.120
I can see it because otherwise I’ll be thinking

00:08:27.120 –> 00:08:29.930
about those things for days. Right. And the last

00:08:29.930 –> 00:08:32.070
thing I wanted to mention is more of a deep cut

00:08:32.070 –> 00:08:35.149
that I was not aware of that I just kind of stumbled

00:08:35.149 –> 00:08:39.990
on Tubi. And that is Alison’s Birthday. And

00:08:39.990 –> 00:08:42.649
it’s a I think it was shot in seventy nine and

00:08:42.649 –> 00:08:45.789
made in eighty one. But it’s kind of it’s ahead

00:08:45.789 –> 00:08:48.190
of its time in that it’s there’s a couple of

00:08:48.190 –> 00:08:52.549
modern movies, one especially that either borrowed

00:08:52.549 –> 00:08:56.879
from it or just hit the same themes and beats.

00:08:57.059 –> 00:08:59.440
I don’t even want to say too much about it because

00:08:59.440 –> 00:09:03.419
it’s, it’s one of those you want to

00:09:03.419 –> 00:09:06.500
go on blind but I will say it opens with kind

00:09:06.500 –> 00:09:09.720
of a seance with this girl Alison, and something

00:09:09.720 –> 00:09:11.620
bad is going to happen on her 19th birthday so

00:09:11.620 –> 00:09:14.960
it sets the stage and then it kind of goes from

00:09:14.960 –> 00:09:18.600
there but there is a little bit of like getting

00:09:18.600 –> 00:09:20.820
to know the character and all that before it

00:09:20.820 –> 00:09:22.899
becomes a horror film but once it does I was

00:09:22.899 –> 00:09:26.919
just hooked. And so it’s a nice gem for people

00:09:26.919 –> 00:09:28.600
looking for something a little more obscure.

00:09:29.120 –> 00:09:31.740
Right, so go check that out. That was… Alison’s

00:09:31.740 –> 00:09:35.360
Birthday. Alison’s Birthday. On Tubi. On Tubi. One

00:09:35.360 –> 00:09:36.960
more thing I wanted to chat about real quick

00:09:36.960 –> 00:09:38.820
is that last night the Golden Globes happened,

00:09:39.279 –> 00:09:41.779
and a lot of people are talking today about how

00:09:41.779 –> 00:09:44.539
Sinners essentially… some people say got shut

00:09:44.539 –> 00:09:46.919
out of a lot of the things that they were nominated

00:09:46.919 –> 00:09:50.679
for, with the exception of two awards. One is for

00:09:50.679 –> 00:09:53.220
cinematic and box office achievement, which I

00:09:53.220 –> 00:09:55.039
think is a relatively new award and some people

00:09:55.039 –> 00:09:57.139
kind of consider it the Golden Globes consolation

00:09:57.139 –> 00:10:01.320
prize. And the second one was for best score, I

00:10:01.320 –> 00:10:04.059
believe which they didn’t even televise and so

00:10:04.059 –> 00:10:06.440
there’s a lot of people upset because they thought

00:10:06.440 –> 00:10:08.220
it deserved a lot more and I’m kind of curious

00:10:08.220 –> 00:10:10.950
I know we’ve both seen it and Probably a lot

00:10:10.950 –> 00:10:12.970
of you have out there as well. So what’s your

00:10:12.970 –> 00:10:15.070
take on that? Do you think that it deserved to

00:10:15.070 –> 00:10:19.269
win more or… So I’m gonna preface this by saying

00:10:19.269 –> 00:10:23.629
I personally don’t care for award shows because

00:10:23.629 –> 00:10:27.370
at the end of the day, films are totally subjective.

00:10:27.690 –> 00:10:29.629
One person may love a movie. The next person

00:10:29.629 –> 00:10:34.409
may hate it. That said, I would say Sinners should

00:10:34.409 –> 00:10:37.809
get those two awards. It was probably the best

00:10:37.809 –> 00:10:40.850
-made movie I saw last year. I thought it was

00:10:40.850 –> 00:10:43.769
a really strong score, so I agree with that.

00:10:45.029 –> 00:10:49.129
Do you know what the other categories were? Yeah,

00:10:49.570 –> 00:10:52.529
the one that people thought in particular that

00:10:52.529 –> 00:10:55.889
he was shut out from was, well, Ryan Coogler’s

00:10:55.889 –> 00:10:59.090
missing out on best screenplay and losing to

00:10:59.090 –> 00:11:03.370
One Battle After Another. And yeah, what do

00:11:03.370 –> 00:11:04.690
you think about that as a writer? Do you think

00:11:04.690 –> 00:11:08.610
that it should have deserved? So as a writer,

00:11:08.769 –> 00:11:12.389
I feel the best way to know best screenplay is

00:11:12.389 –> 00:11:14.009
to actually read the screenplay, not just

00:11:14.009 –> 00:11:16.129
see the film version. Yeah, I agree, 100%. Because

00:11:16.129 –> 00:11:18.970
it’s a different version of the story. Right.

00:11:19.129 –> 00:11:20.570
And then, you know, you’re looking at the language

00:11:20.570 –> 00:11:22.049
and all these things that don’t end up on the

00:11:22.049 –> 00:11:24.070
screen. Right. So I’ve always kind of been like

00:11:24.070 –> 00:11:27.129
best written or when people in comment sections

00:11:27.129 –> 00:11:29.990
are like, the story, you know, or the script

00:11:29.990 –> 00:11:31.690
sucked. I’m like, well, have you read the scripts?

00:11:32.669 –> 00:11:34.210
Because I don’t think those movies would get

00:11:34.210 –> 00:11:36.850
made if the script sucked. No, 100%. I mean,

00:11:36.909 –> 00:11:39.629
well. Depends, you know someone like a Ryan Coogler

00:11:39.629 –> 00:11:43.389
though might, you know, get stuff funded just on

00:11:43.389 –> 00:11:45.590
his track record, you know, but for people like

00:11:45.590 –> 00:11:49.789
us. Yeah, well, so, and I still have not seen One

00:11:49.789 –> 00:11:52.269
Battle After Another, so I can’t compare the two.

00:11:52.269 –> 00:11:56.809
Yes. Or read the scripts, presumably. I was

00:11:56.809 –> 00:12:00.669
a fan of Sinners. I, ironically, was not a fan of

00:12:00.669 –> 00:12:03.769
the vampire stuff, but I loved everything else

00:12:03.769 –> 00:12:07.059
about it for the most part. So I’m rooting for

00:12:07.059 –> 00:12:08.820
it, of course. Yeah, this might give me some

00:12:08.820 –> 00:12:10.980
haters, but I kind of thought it was a little

00:12:10.980 –> 00:12:15.980
bit muddled of a story. There were like elements

00:12:15.980 –> 00:12:17.720
I really loved of it and I still really like

00:12:17.720 –> 00:12:21.419
the movie overall, but I kind of felt like it

00:12:21.419 –> 00:12:25.320
could have used another pass or two, just

00:12:25.320 –> 00:12:29.919
in terms of some of the, I don’t know, elements

00:12:29.919 –> 00:12:32.649
of it felt a little disjointed to me. like that

00:12:32.649 –> 00:12:34.710
it wasn’t entirely cohesive. What I’ve heard

00:12:34.710 –> 00:12:37.850
is that the and I can see it is that the vampire

00:12:37.850 –> 00:12:41.730
element sort of feels tacked on because it has

00:12:41.730 –> 00:12:43.990
all this rich backstory and then it you know

00:12:43.990 –> 00:12:47.070
they tease kind of this supernatural darkness

00:12:47.070 –> 00:12:50.490
and then it’s that. I feel like it probably would

00:12:50.490 –> 00:12:53.330
have been a stronger story if it wasn’t vampires

00:12:53.330 –> 00:12:56.210
but it was the Klan. Yeah, well, and you know, obviously

00:12:56.210 –> 00:12:58.750
there’s trying to touch on some metaphor there

00:12:58.750 –> 00:13:01.940
and I think my problem was it should have been

00:13:01.940 –> 00:13:04.340
before he went back and just killed all the racists

00:13:04.340 –> 00:13:06.960
coming to get him. To me, that just kind of felt

00:13:06.960 –> 00:13:09.500
like a hat on a hat, like it almost didn’t need

00:13:09.500 –> 00:13:12.340
it. And it felt heavy handed in terms of the

00:13:12.340 –> 00:13:16.299
messaging of the theme. But I still enjoyed it.

00:13:16.320 –> 00:13:19.559
And that was a good segment in terms of it was

00:13:19.559 –> 00:13:23.340
fun to watch. Right. But yeah, I just thought

00:13:23.340 –> 00:13:25.059
that was interesting. And the reason Golden Globes

00:13:25.059 –> 00:13:27.340
are important to begin with is that it really

00:13:27.340 –> 00:13:30.159
sets up the Oscar race and really shows who like

00:13:30.159 –> 00:13:32.919
the front runners are. So, you know, for those

00:13:32.919 –> 00:13:36.139
out there who don’t know, the award season kind

00:13:36.139 –> 00:13:39.659
of starts a little earlier, in the prior year, in the

00:13:39.659 –> 00:13:42.200
fall. And, you know, there’s some smaller award

00:13:42.200 –> 00:13:45.679
shows and they kind of build up and certain films

00:13:45.679 –> 00:13:47.700
that start winning awards early might pick up

00:13:47.700 –> 00:13:49.779
steam towards winning the Oscar. And it could

00:13:49.779 –> 00:13:52.259
totally change the race. So that’s the reason

00:13:52.259 –> 00:13:55.139
that Golden Globes are important to people because

00:13:55.139 –> 00:13:57.919
they tend to indicate which direction the Academy’s

00:13:57.919 –> 00:14:02.779
gonna vote. Interesting. A last thought I think

00:14:02.779 –> 00:14:06.700
is when a movie in the zeitgeist performs really

00:14:06.700 –> 00:14:09.379
well and it seems like you’re reading the room,

00:14:09.679 –> 00:14:13.000
everyone’s rooting for it, then people might

00:14:13.000 –> 00:14:16.000
automatically assume that that’s like the winner.

00:14:16.919 –> 00:14:20.019
Right. And Sinners came out, what, like late

00:14:20.019 –> 00:14:24.200
spring, early summer, I think? Yeah. So… If

00:14:24.200 –> 00:14:25.940
that’s already made up in someone’s mind that

00:14:25.940 –> 00:14:28.820
that’s the movie of the year, other movies can

00:14:28.820 –> 00:14:31.139
come along and kind of… And they might not

00:14:31.139 –> 00:14:35.860
be… They might experience it the same way because

00:14:35.860 –> 00:14:38.200
they already got one in their head. Yeah, I could

00:14:38.200 –> 00:14:40.899
see that. What’s also interesting actually is

00:14:40.899 –> 00:14:43.539
the meta -conversation around it, because now

00:14:43.539 –> 00:14:45.879
this talk of being shut out and the backlash,

00:14:46.759 –> 00:14:49.870
does that kind of persuade some people? to vote

00:14:49.870 –> 00:14:52.230
differently for the Oscars. I mean, it’s a different

00:14:52.230 –> 00:14:54.649
voting base, but maybe they’re like, oh, well,

00:14:54.909 –> 00:14:57.169
I want Sinners to win. And so therefore, I’ll

00:14:57.169 –> 00:14:59.830
change my vote, or thinking that somebody else

00:14:59.830 –> 00:15:03.269
would have voted for it. It’s tricky. But as

00:15:03.269 –> 00:15:04.990
you mentioned at the beginning of this whole

00:15:04.990 –> 00:15:08.049
segment is that it’s all so subjective. And some

00:15:08.049 –> 00:15:09.330
of these people are voting for their friends.

00:15:09.409 –> 00:15:11.289
Some of them haven’t seen all the movies in the

00:15:11.289 –> 00:15:15.470
categories. And traditionally, horror films aren’t

00:15:15.480 –> 00:15:17.799
even in the running for most of these things,

00:15:17.960 –> 00:15:20.259
you know, if it wasn’t Ryan Coogler, it probably

00:15:20.259 –> 00:15:22.000
wouldn’t be you know, I mean it was artfully

00:15:22.000 –> 00:15:25.340
done in many ways So like that helps but there’s

00:15:25.340 –> 00:15:27.039
a lot of horror films that’d never get

00:15:27.039 –> 00:15:29.299
that kind of recognition But the interesting

00:15:29.299 –> 00:15:31.559
thing about Sinners is like most of it I don’t

00:15:31.559 –> 00:15:34.279
feel like was horror if it didn’t have the vampire

00:15:34.279 –> 00:15:36.379
element, it wouldn’t, right, be considered. It was

00:15:36.379 –> 00:15:39.299
it was like a good entry film for horrors like

00:15:39.299 –> 00:15:42.769
horror-light, you know. Maybe the

00:15:42.769 –> 00:15:44.389
way to think about it is it’s a little bit more

00:15:44.389 –> 00:15:46.990
gothic or stylized or so let’s say I think it’s

00:15:46.990 –> 00:15:50.970
like a genre blend like it’s historical like

00:15:50.970 –> 00:15:54.009
fiction, would you say? And then it’s action

00:15:54.009 –> 00:15:57.250
and then like historical fantasy or historical

00:15:57.250 –> 00:15:59.450
So it just kind of brings in all those

00:15:59.450 –> 00:16:06.610
different elements. Yeah, totally. Now we can

00:16:06.610 –> 00:16:09.029
dive into our main segment for the day which

00:16:09.500 –> 00:16:11.940
I know everyone’s stoked to talk about, which

00:16:11.940 –> 00:16:15.700
is artificial intelligence. And so today we’re

00:16:15.700 –> 00:16:18.220
basically going to get into an in -depth discussion

00:16:18.220 –> 00:16:23.039
about it and discuss a little bit of the recent

00:16:23.039 –> 00:16:26.460
events that have come up that have, you know,

00:16:26.580 –> 00:16:30.019
keep stirring the debate around AI and its use,

00:16:30.159 –> 00:16:33.879
particularly in Hollywood. So to get into this,

00:16:34.000 –> 00:16:36.360
I thought maybe we could discuss the McDonald’s

00:16:36.360 –> 00:16:40.879
commercial that is infamous now. It was put out

00:16:40.879 –> 00:16:43.419
in the Netherlands right before Christmas and

00:16:43.419 –> 00:16:46.259
it was a Christmas commercial advertising the

00:16:46.259 –> 00:16:49.639
McDonald’s cafe and it was basically a series

00:16:49.639 –> 00:16:54.899
of little vignettes of all the ways in which

00:16:54.899 –> 00:16:58.259
Christmas time could be terrible, from, like, people’s

00:16:58.259 –> 00:17:00.779
Christmas trees falling off their cars

00:17:00.779 –> 00:17:04.539
to, you know, destroying their cake batter

00:17:04.539 –> 00:17:07.940
mix, or, you know, all of the like grumpy people

00:17:07.940 –> 00:17:11.180
you encounter, things like that. In fact, why

00:17:11.180 –> 00:17:13.299
don’t we go ahead and play that for you guys

00:17:13.299 –> 00:17:15.519
right now, just a little bit of audio. And this

00:17:15.519 –> 00:17:17.460
will also be available on the website for you

00:17:17.460 –> 00:17:19.740
to check out. And you can pause now and go watch

00:17:19.740 –> 00:17:22.119
it if you want. But we’re just going to play

00:17:22.119 –> 00:17:23.720
a little snippet of the audio so you can get

00:17:23.720 –> 00:17:44.670
a taste of it. Okay, so that was uh, yeah, it’s

00:17:44.670 –> 00:17:47.369
so ridiculous So, I mean I think I don’t know

00:17:47.369 –> 00:17:49.730
what do you think of it? Did you think it’s good?

00:17:50.210 –> 00:17:53.710
I mean is any of McDonald’s food actually artificial

00:17:53.710 –> 00:17:56.490
as well because it would then be on brand with

00:17:56.490 –> 00:18:03.450
their… Bad joke. I mean, Christopher had asked

00:18:03.450 –> 00:18:06.869
me if I had seen this, I couldn’t remember. Because

00:18:06.869 –> 00:18:09.890
you come across so much, I guess the term is

00:18:09.890 –> 00:18:12.910
AI slop, if you’re on the internet or if it finds

00:18:12.910 –> 00:18:17.369
its way in your algorithm. So I think I’d seen

00:18:17.369 –> 00:18:21.190
a close-up shot of the woman in it. I can’t

00:18:21.190 –> 00:18:24.269
remember what she’s doing. But I did re-watch

00:18:24.269 –> 00:18:27.029
it, and I was just like, yeah, it’s kind of forgettable.

00:18:28.109 –> 00:18:31.730
Forgettable. And for me, it’s cartoonish

00:18:31.730 –> 00:18:34.630
and everything looks just off like it’s it’s

00:18:34.630 –> 00:18:37.410
like looks kind of real, right? But there’s like

00:18:37.410 –> 00:18:39.450
it’s like looks like it’s gone through ten Instagram

00:18:39.450 –> 00:18:42.230
filters Yeah, you know and then like even that

00:18:42.230 –> 00:18:44.289
woman’s face. It’s just the way she grins. It

00:18:44.289 –> 00:18:45.930
should kind of look like the Joker. I mean, yeah,

00:18:45.930 –> 00:18:48.210
I agree Yeah, I mean that was supposed to be

00:18:48.210 –> 00:18:50.650
I’m assuming her reflection in like a Christmas

00:18:50.650 –> 00:18:53.829
ball ornament on a tree But even still it just

00:18:53.829 –> 00:18:59.940
it felt so… like through the, through the

00:18:59.940 –> 00:19:02.779
looking glass or whatever it is. So what’s

00:19:02.779 –> 00:19:06.539
interesting too is, it doesn’t, like, make you hungry

00:19:06.539 –> 00:19:09.660
or want to crave McDonald’s, not at all yet. Which

00:19:09.660 –> 00:19:11.720
is like the whole point of a commercial. You have,

00:19:11.720 –> 00:19:15.859
and just, you know, it’s just negative messaging

00:19:15.859 –> 00:19:18.259
about Christmas, which you know, a lot of people

00:19:18.259 –> 00:19:20.160
love Christmas So it seems weird that you’re

00:19:20.160 –> 00:19:22.160
like, yeah, you hate Christmas, so come get a

00:19:22.160 –> 00:19:25.400
coffee, you know. So I mean, that’s the, you know,

00:19:25.400 –> 00:19:27.140
not related to the AI but it just seems like

00:19:27.140 –> 00:19:29.819
a bad idea to begin with So it’s a bit, you know,

00:19:29.859 –> 00:19:31.960
when you watch it, it just feels so dystopian.

00:19:32.119 –> 00:19:34.240
It’s like, I mean, if it does anything right,

00:19:34.400 –> 00:19:36.180
it probably captures how a lot of us are feeling

00:19:36.180 –> 00:19:39.220
on the inside just about society right now. Yeah.

00:19:40.559 –> 00:19:44.519
But yeah, so the company that did that is called,

00:19:48.299 –> 00:19:50.099
well, they have a parent company, I’m forgetting

00:19:50.099 –> 00:19:51.650
the name of it, it’s like Sweet-something or

00:19:51.650 –> 00:19:54.950
other, but their subsidiary, which is their AI

00:19:54.950 –> 00:19:57.250
marketing company, is called The Gardening Club.

00:19:58.150 –> 00:20:00.890
And so they actually released a statement on

00:20:00.890 –> 00:20:04.329
it, which I’m going to read aspects of here because

00:20:04.329 –> 00:20:08.329
I think it brings up some interesting questions.

00:20:09.089 –> 00:20:12.970
And they basically said, yes, this ad is 100%

00:20:12.970 –> 00:20:15.329
AI, but it was built with the same care that

00:20:15.329 –> 00:20:18.750
we bring to any live action film. The brief was

00:20:18.750 –> 00:20:22.079
tightly prescribed, with a clear mandate for every

00:20:22.079 –> 00:20:24.319
single scene, so complete control of each shot

00:20:24.319 –> 00:20:26.259
was essential and pulled off by our team using

00:20:26.259 –> 00:20:29.220
proprietary workflows. And pulled off spectacularly.

00:20:29.220 –> 00:20:32.440
Well, this was not a prompt-and-pray process.

00:20:32.440 –> 00:20:38.259
So they had a director, it goes on to say, that

00:20:38.259 –> 00:20:41.039
shaped every AI performance: their look, their

00:20:41.039 –> 00:20:46.299
energy, their emotional presence. They had an AI

00:20:46.299 –> 00:20:48.740
and a post team that spent seven intense weeks

00:20:48.740 –> 00:20:52.539
refining. Seven weeks! Refining every single frame,

00:20:52.960 –> 00:20:55.480
they’ve rebuilt shots, debated composition, shaped

00:20:55.480 –> 00:20:57.460
shadows, and tuned emotional microbeats with

00:20:57.460 –> 00:21:01.099
precision all overseen by Sweet Shop, director

00:21:01.099 –> 00:21:04.240
of duo MAMA. And Sweet Shop is the parent company.

00:21:06.000 –> 00:21:08.119
And they said they had 10 or more specialists

00:21:08.119 –> 00:21:11.299
on each shot running through tightly engineered

00:21:11.299 –> 00:21:17.549
pipeline, yada, yada, yada. I read that because

00:21:17.549 –> 00:21:19.809
that was their defense once the internet was

00:21:19.809 –> 00:21:21.549
like this looks like crap. What the hell are

00:21:21.549 –> 00:21:26.309
you doing? To me, it's like, assuming this

00:21:26.309 –> 00:21:29.670
is all true, what they say, like, hey, where are

00:21:29.670 –> 00:21:32.230
the cost savings, right? I mean, it

00:21:32.230 –> 00:21:34.650
is expensive to have live action, but you know

00:21:34.650 –> 00:21:37.910
a lot of the budget is spent in post, and so maybe

00:21:37.910 –> 00:21:40.630
they’re just cutting out like 20 30 % of the

00:21:40.630 –> 00:21:42.309
budget if you’re cutting out the live -action

00:21:42.309 –> 00:21:47.140
stuff or 40 % I don’t know But if you, and also

00:21:47.140 –> 00:21:48.819
if you’re spending that much time, seven weeks,

00:21:49.000 –> 00:21:51.299
and that’s what you’re making out of AI, like

00:21:51.299 –> 00:21:53.819
that’s a kind of a statement about how not ready

00:21:53.819 –> 00:21:56.180
for prime time it is, you might say. I don’t

00:21:56.180 –> 00:21:57.960
know. What do you, any thoughts around that?

00:21:58.539 –> 00:22:00.440
Yeah. I mean, like if you’re going to spend seven

00:22:00.440 –> 00:22:04.180
weeks creating, I don’t know, is that like a

00:22:04.180 –> 00:22:08.779
30-second commercial, maybe a minute, you could easily

00:22:08.779 –> 00:22:12.440
do that with live action people. I mean, I know

00:22:12.440 –> 00:22:14.599
some of them are, some of the shots are like,

00:22:14.539 –> 00:22:17.319
kind of wild, Christmas-tree-on-fire type things.

00:22:18.359 –> 00:22:21.299
That’s true, because there are some potentially

00:22:21.299 –> 00:22:24.480
expensive set pieces, either closing down a road

00:22:24.480 –> 00:22:27.220
to drive through town or whatever. So you could

00:22:27.220 –> 00:22:30.819
have bigger ambitions with AI, I guess. But I

00:22:30.819 –> 00:22:33.559
look at companies like McDonald's or Coca-Cola,

00:22:34.079 –> 00:22:36.319
these name brands that you’ve seen a million

00:22:36.319 –> 00:22:39.940
times. And I almost wonder why they even bother

00:22:39.940 –> 00:22:42.599
advertising, because everyone knows it’s there.

00:22:44.240 –> 00:22:47.240
I mean, I know, it's not related, no, but

00:22:47.240 –> 00:22:49.819
I’m saying it has to do with marketing and commercials

00:22:49.819 –> 00:22:53.480
and This commercial really didn’t do anything

00:22:53.480 –> 00:22:55.880
except shit. I didn’t move the needle. It’s just

00:22:55.880 –> 00:22:57.980
yeah, exactly. Well, I mean, what it did is it backfired

00:22:57.980 –> 00:23:00.259
and now everybody’s like thinks, you know, and

00:23:00.259 –> 00:23:02.339
so it backfired so bad In fact, it’s worth mentioning

00:23:02.339 –> 00:23:06.779
that McDonald’s corporate in the US distance

00:23:06.779 –> 00:23:08.839
itself and they said this was like a rogue commercial

00:23:08.839 –> 00:23:13.230
done by the like Netherlands McDonald’s or the

00:23:13.230 –> 00:23:15.670
European McDonald’s or whatever and they actually

00:23:15.670 –> 00:23:20.410
took it down. So I guess they were testing the

00:23:20.410 –> 00:23:23.109
waters. Yeah, I mean, I think that's what a lot

00:23:23.109 –> 00:23:25.069
of people are doing right now, you know I think

00:23:25.069 –> 00:23:27.369
whenever there’s a new technology you get a certain

00:23:27.369 –> 00:23:29.849
segment of people who are like this is gonna

00:23:29.849 –> 00:23:31.809
be the next big thing and I want to get in early

00:23:31.809 –> 00:23:35.130
so I can capitalize off it and I think that’s

00:23:35.130 –> 00:23:36.930
what we’re seeing now and we’ll have more examples

00:23:36.930 –> 00:23:39.799
of that later in the conversation. And, you know,

00:23:39.799 –> 00:23:42.920
they’re testing the waters and they’re brave

00:23:42.920 –> 00:23:45.880
souls because, you know, at the moment, there’s

00:23:45.880 –> 00:23:47.859
a lot of backlash against that, particularly

00:23:47.859 –> 00:23:52.519
when it goes wrong. So. Which brings up the next

00:23:52.519 –> 00:23:59.160
point that we can talk about here, which is

00:23:59.160 –> 00:24:01.880
really about the current state of AI in the industry,

00:24:01.880 –> 00:24:03.759
and I have a little bit of insight into this

00:24:03.759 –> 00:24:06.599
personally, just from my work, editing and doing

00:24:06.599 –> 00:24:09.960
cinematography for commercials and other people's

00:24:09.960 –> 00:24:15.599
productions. And so, jumping into that, what I've

00:24:15.599 –> 00:24:18.099
seen personally is you know for almost two years

00:24:18.099 –> 00:24:22.539
now on documentaries and branded content pieces

00:24:22.539 –> 00:24:26.000
that I do which are usually doc style a lot of

00:24:26.000 –> 00:24:30.380
times we when we’re editing Whether or not you

00:24:30.380 –> 00:24:32.779
know this, this happens on every non-scripted

00:24:32.779 –> 00:24:36.059
thing they’ll take we do like what we call Franken

00:24:36.059 –> 00:24:40.220
biting, which is essentially taking different parts

00:24:40.220 –> 00:24:42.500
of sentences and combining them together from

00:24:42.500 –> 00:24:45.460
different places to both cut out the fat of people

00:24:45.460 –> 00:24:47.700
rambling and stuff like that, but also sometimes

00:24:47.700 –> 00:24:51.099
to create new statements along the lines of what

00:24:51.099 –> 00:24:53.740
they were saying. You know, there is an ethical

00:24:53.740 –> 00:24:55.400
line there. You don’t want them to create them

00:24:55.400 –> 00:24:57.920
saying completely new things, but it’s totally

00:24:57.920 –> 00:25:00.119
been fair game for as long as I’ve worked in

00:25:00.119 –> 00:25:02.920
the industry, over 15 years, to, say, take someone saying

00:25:02.920 –> 00:25:07.329
something over here, pairing it with something

00:25:07.329 –> 00:25:10.150
over here to get an entirely new sentence. And

00:25:10.150 –> 00:25:12.490
now what people are doing with AI is they’ll

00:25:12.490 –> 00:25:14.910
if if they can’t make something like that work.

00:25:14.990 –> 00:25:16.869
And there’s oftentimes where that won’t work

00:25:16.869 –> 00:25:21.210
and you can’t quite get the person to to to convey

00:25:21.210 –> 00:25:23.829
the thought that, you know, they were trying

00:25:23.829 –> 00:25:26.750
to through editing. But, you know, they're

00:25:26.750 –> 00:25:28.390
going to be willing to say it, and, you know,

00:25:28.390 –> 00:25:29.750
you’re going to have another shoot with them.

00:25:30.029 –> 00:25:32.990
Sometimes we’re asked to go in and temp that

00:25:32.990 –> 00:25:37.299
thought using AI, and we'll feed, you know, the

00:25:37.299 –> 00:25:40.740
interview into a tool. I think I use ElevenLabs

00:25:40.740 –> 00:25:46.240
often to do this and it’ll spit out what we’d

00:25:46.240 –> 00:25:48.900
like them to say with their voice. And that doesn’t

00:25:48.900 –> 00:25:51.920
end up in the final piece ever, as far as I know.

00:25:51.920 –> 00:25:55.700
It shouldn’t unless they approve it or whatever.

00:25:56.119 –> 00:25:58.619
But we do that just as a placeholder because

00:25:58.619 –> 00:26:01.529
a lot of times they don’t want to hold up the

00:26:01.529 –> 00:26:04.009
editing process in order to temp it in, to see

00:26:04.009 –> 00:26:05.890
what it sounds like, because they might not even

00:26:05.890 –> 00:26:07.849
want it in the end anyways. But then when they

00:26:07.849 –> 00:26:09.789
kind of get the final thing, they might go back

00:26:09.789 –> 00:26:12.970
and ask a question that gets at that point. Or,

00:26:12.970 –> 00:26:15.869
you know, in the case of commercial branded content,

00:26:15.970 –> 00:26:18.130
they might even just ask them to say it because

00:26:18.130 –> 00:26:20.950
they’re typically getting paid in those scenarios

00:26:20.950 –> 00:26:22.990
whereas in documentaries you typically don't pay

00:26:23.900 –> 00:26:26.099
people for interviews because it’s unethical.

00:26:26.700 –> 00:26:29.440
So I’ve seen that. They’re just using a lot of

00:26:29.440 –> 00:26:32.519
VFX tools built into a lot of programs like DaVinci

00:26:32.519 –> 00:26:36.420
Resolve, with rotoscoping. Premiere has a lot

00:26:36.420 –> 00:26:40.359
of AI tools, particularly with cleaning up sound.

00:26:40.880 –> 00:26:44.519
And now I think they have tools to extend music

00:26:44.519 –> 00:26:46.779
cues to the length that you need, things like

00:26:46.779 –> 00:26:49.240
that, which… You know, are all things that

00:26:49.240 –> 00:26:52.000
we have been doing for a long time. We just do

00:26:52.000 –> 00:26:55.859
them manually. And, you know, the tools basically

00:26:55.859 –> 00:26:58.099
just speed up that process or sometimes maybe

00:26:58.099 –> 00:27:01.119
do it a little bit better. The rotoscoping in DaVinci

00:27:01.119 –> 00:27:03.160
Resolve is a really good example because that

00:27:03.160 –> 00:27:05.420
process, usually, in the past, you'd

00:27:05.420 –> 00:27:08.579
have to go frame by frame and cut out every single,

00:27:08.700 –> 00:27:11.380
say, an arm in front of a TV or something,

00:27:11.700 –> 00:27:14.240
you know, by hand. And now you can just kind

00:27:14.240 –> 00:27:16.099
of, like, do a few marks and have it track it, and

00:27:16.099 –> 00:27:21.279
it’s done, you know. And so that’s AI is definitely

00:27:21.279 –> 00:27:25.500
working its way into these post tools. Then there’s,

00:27:25.500 –> 00:27:28.900
as I mentioned, music. Now, in addition to extending

00:27:28.900 –> 00:27:33.819
the music. There’s a lot of AI generated music

00:27:33.819 –> 00:27:36.240
out there. And I think that we’re going to start

00:27:36.240 –> 00:27:38.599
seeing that being used more. I personally haven’t

00:27:38.599 –> 00:27:41.339
seen it like, you know, in any of the productions

00:27:41.339 –> 00:27:44.579
I’ve worked on, but I can only imagine that particularly

00:27:44.579 –> 00:27:47.529
in things like reality and other cheaper

00:27:47.529 –> 00:27:49.509
commercial content or social media that they’re

00:27:49.509 –> 00:27:52.170
probably starting to do AI-generated songs and

00:27:52.170 –> 00:27:56.710
using them in final pieces. And something else

00:27:56.710 –> 00:27:59.690
I’ve seen is people you know using generative

00:27:59.690 –> 00:28:03.529
AI, mostly for images, for decks, and pitch

00:28:03.529 –> 00:28:06.670
decks, that is, and essentially trying to convey

00:28:06.670 –> 00:28:11.690
the idea of a film to help sell it to a commissioner

00:28:11.690 –> 00:28:18.740
or studio or some kind of funder. I've seen... Well,

00:28:18.759 –> 00:28:22.400
I think that you mentioned, Peter, that

00:28:22.400 –> 00:28:25.279
film that used AI to create some of the art department

00:28:25.279 –> 00:28:27.559
assets. Yeah, I want to talk about that a little

00:28:27.559 –> 00:28:30.519
bit. Yeah, so Late Night with the Devil, which came

00:28:30.519 –> 00:28:35.660
out like a year or two ago. A movie I really enjoyed.

00:28:35.660 –> 00:28:39.380
After watching it, you know, you go online you

00:28:39.380 –> 00:28:42.380
start thinking about it and you see reviews and

00:28:42.380 –> 00:28:46.470
a lot of people were attacking it for AI use.

00:28:46.490 –> 00:28:49.210
So I was like, oh shit, they stepped in it. And

00:28:49.210 –> 00:28:53.730
what I discovered is the art department used

00:28:53.730 –> 00:28:58.190
it, it was their choice to use it, in the capacity

00:28:58.190 –> 00:29:01.569
where the film is kind of like if you have a

00:29:01.569 –> 00:29:04.069
Johnny Carson comeback, you know, set in the

00:29:04.069 –> 00:29:07.509
70s and his comeback is on Halloween and he’s

00:29:07.509 –> 00:29:10.470
gonna have all these spooky guests on, they have

00:29:10.470 –> 00:29:14.180
like the Halloween TV cards that flash before

00:29:14.180 –> 00:29:17.500
commercials. That’s where the AI was implemented.

00:29:18.519 –> 00:29:22.480
I didn’t know watching it. Like it, I just assumed

00:29:22.480 –> 00:29:25.019
that, hey, maybe they figured it out on, I don’t

00:29:25.019 –> 00:29:29.660
know, like CGI. Yeah, like just some, some, uh,

00:29:29.700 –> 00:29:32.579
graphic designer did it, but they used AI and

00:29:32.579 –> 00:29:35.220
I don’t know to what extent, but it got a lot

00:29:35.220 –> 00:29:40.220
of backlash and it’s kind of, it’s just an interesting

00:29:40.220 –> 00:29:45.259
thing because I understand why they probably

00:29:45.259 –> 00:29:47.720
used it for that. Hey, maybe they were short

00:29:47.720 –> 00:29:52.559
on budget. Who knows, found themselves in a position

00:29:52.559 –> 00:29:56.000
like that. But on the other hand, if you’re the

00:29:56.000 –> 00:29:59.420
first one to do it, you might be getting that

00:29:59.420 –> 00:30:03.460
backlash. So it wasn’t worth it to use. Yeah,

00:30:03.759 –> 00:30:07.500
so essentially the art department used it in

00:30:07.500 –> 00:30:10.970
the assets that they created for the film. That’s

00:30:10.970 –> 00:30:14.609
something that I’ve actually, and the same thing

00:30:14.609 –> 00:30:17.309
happened in video games. The game that won Game

00:30:17.309 –> 00:30:21.410
of the Year this year at the Game Awards, they

00:30:21.410 –> 00:30:25.329
took back some of their trophies because it came

00:30:25.329 –> 00:30:28.529
out after the fact that they used AI-generated

00:30:28.529 –> 00:30:31.309
textures in some, not all, but some of the game.

00:30:32.190 –> 00:30:36.009
And they didn’t disclose that. So there was some…

00:30:35.920 –> 00:30:38.099
you know, anger around that. They've since gone

00:30:38.099 –> 00:30:40.160
back and replaced all those with totally human

00:30:40.160 –> 00:30:43.119
created ones. But I think their excuse was that

00:30:43.119 –> 00:30:45.099
they were just you know moving fast and like

00:30:45.099 –> 00:30:47.119
they didn’t get a chance to swap them all out

00:30:47.119 –> 00:30:49.660
So, but, you know, it seems like a lot of

00:30:49.660 –> 00:30:52.500
times when they use AI in those ways,

00:30:52.500 –> 00:30:55.759
It’s usually like a time -saving or cost -saving

00:30:55.759 –> 00:30:59.059
reason. And the last thing I've seen is AI in

00:30:59.059 –> 00:31:02.680
social media, you know a lot of companies creating

00:31:02.680 –> 00:31:04.940
AI slop or AI videos whatever you want to call

00:31:04.940 –> 00:31:07.849
it and just so they don’t have to like do a full

00:31:07.849 –> 00:31:09.750
production. But when you’re doing social and

00:31:09.750 –> 00:31:11.430
you’re trying to post something every day, it’s

00:31:11.430 –> 00:31:14.390
a very tempting, quick way to get a lot of content

00:31:14.390 –> 00:31:17.829
out. So I think those are the ways that AI mostly

00:31:17.829 –> 00:31:21.289
appears at the moment. We aren’t quite yet. It

00:31:21.289 –> 00:31:23.950
seems like we’re going to get into this a little

00:31:23.950 –> 00:31:28.150
bit, but fully generated AI films and like real

00:31:28.150 –> 00:31:31.269
content, like premium content. We’re not there

00:31:31.269 –> 00:31:33.849
yet, but that doesn’t mean that a lot of people

00:31:33.849 –> 00:31:36.160
aren’t trying to make that happen. You know,

00:31:36.559 –> 00:31:38.980
and the first example of that that I would like

00:31:38.980 –> 00:31:43.200
to ask you about, Peter, is Tilly Norwood. So

00:31:43.200 –> 00:31:46.460
for those aren’t familiar, Tilly Norwood is a

00:31:46.460 –> 00:31:53.200
AI young female, a person totally made up

00:31:53.200 –> 00:31:58.779
by AI, that this company called Particle

00:31:58.779 –> 00:32:01.980
Six Group, and they have a new division called

00:32:01.980 –> 00:32:05.730
Ziccoia. I can’t even say it. I don’t know. But

00:32:05.730 –> 00:32:09.410
they essentially are trying to represent Tilly

00:32:09.410 –> 00:32:13.410
Norwood as like an actress, but obviously not

00:32:13.410 –> 00:32:16.910
a real actress, but essentially be agents to

00:32:16.910 –> 00:32:20.289
this AI actress and get her cast in AI films

00:32:20.289 –> 00:32:24.690
and things like that. And they announced this

00:32:24.690 –> 00:32:27.650
back in the summer, and it just exploded, and people

00:32:27.650 –> 00:32:30.829
were so angry, including myself. And then a few

00:32:30.829 –> 00:32:34.829
weeks later, they released the first... It was like

00:32:34.829 –> 00:32:37.210
a promotional video for their company and it

00:32:37.210 –> 00:32:40.069
featured her in it and that spot was called AI

00:32:40.069 –> 00:32:42.109
Commissioner, which we'll link to in the show notes

00:32:42.109 –> 00:32:45.309
as well and it’s really the first project to

00:32:45.309 –> 00:32:49.470
feature her. And what do you think about that?

00:32:49.490 –> 00:32:51.730
Do you think that there’s a future for AI actress

00:32:51.730 –> 00:32:53.109
and actresses? What are they trying to do with

00:32:53.109 –> 00:32:57.569
that? I think it’s stupid and Companies corporations

00:32:57.569 –> 00:32:59.529
people at the end of the day want to make money

00:32:59.529 –> 00:33:01.650
and this is just a lane they can make money in.

00:33:01.650 –> 00:33:03.869
They can do fast content, you know. You don't

00:33:03.869 –> 00:33:07.309
need to hire a person to do it. But as a consumer

00:33:07.309 –> 00:33:11.890
of films and, you know, if occasionally I’ll

00:33:11.890 –> 00:33:14.910
play a video game or watch a TV show, like I

00:33:14.910 –> 00:33:16.809
want to see something real. And I think a lot

00:33:16.809 –> 00:33:21.809
of people that care about this stuff do. So if

00:33:21.809 –> 00:33:25.269
the Tilly Norwoods of the world become this like

00:33:25.269 –> 00:33:28.250
phenomenon where now we have AI generated movies,

00:33:28.269 –> 00:33:30.509
I think there’ll be a lot of boycotting. There’ll

00:33:30.509 –> 00:33:32.769
probably be some people that will watch it. You

00:33:32.769 –> 00:33:34.730
know, I think people are also curious. But what

00:33:34.730 –> 00:33:36.710
about with actors in particular? Well, I think

00:33:36.710 –> 00:33:39.930
it’s fucked up to do to actors because that’s

00:33:39.930 –> 00:33:41.589
something, you know, as a little kid, I wanted

00:33:41.589 –> 00:33:43.730
to be an actor. I’m not good at it. I’m not great

00:33:43.730 –> 00:33:47.349
at lying. Um, but I, you know, that’s like a

00:33:47.349 –> 00:33:49.210
passion people have and it’s like, all right,

00:33:49.269 –> 00:33:50.930
we’ll just get a robot to do it. So it’s fucked

00:33:50.930 –> 00:33:53.190
up because they’re taking the place of it. Yeah.

00:33:53.190 –> 00:33:56.809
And it’s such like a human quality. You know,

00:33:56.930 –> 00:33:59.410
people, what they bring to acting is like their

00:33:59.410 –> 00:34:02.240
trauma, you know, their life experience, all

00:34:02.240 –> 00:34:06.680
this stuff that’s lived and AI is just patterns

00:34:06.680 –> 00:34:10.699
and BS. Yeah, it’s ones and zeros. And that’s

00:34:10.699 –> 00:34:12.559
actually a really good point, you know, because

00:34:12.559 –> 00:34:14.219
they always say, you know, eyes are the window

00:34:14.219 –> 00:34:17.699
of the soul and hands too, and things like that.

00:34:18.139 –> 00:34:20.960
And, you know, a lot of that expression is like

00:34:20.960 –> 00:34:23.840
these micro muscles and things under your eye

00:34:23.840 –> 00:34:27.039
that just a slight, subtle shift that we as social

00:34:27.039 –> 00:34:29.489
creatures read, and we read this body language

00:34:29.489 –> 00:34:31.210
that's almost imperceptible, that we don't even

00:34:31.210 –> 00:34:35.250
know we’re doing it. And A, can I, A, I even

00:34:35.250 –> 00:34:38.650
recreate that? And B, you're right, there's no

00:34:38.650 –> 00:34:43.130
emotion driving it. And you can, I feel like

00:34:43.130 –> 00:34:46.409
you can tell as a human, at least still when

00:34:46.409 –> 00:34:49.750
that emotion is faked, you know, we’re so keyed

00:34:49.750 –> 00:34:51.949
into that as, as human beings. Yeah. I mean,

00:34:51.949 –> 00:34:57.300
I even have umbrage with the uncanny valley thing

00:34:57.300 –> 00:35:01.460
with CGI. Like, if I watch something, I mean, I watched

00:35:01.460 –> 00:35:04.599
what is it, Stranger Things, all the way through

00:35:04.599 –> 00:35:08.940
the final season there, I believe. You know, their

00:35:08.940 –> 00:35:12.940
backdrop was a lot of CGI for these huge set

00:35:12.940 –> 00:35:15.619
pieces. And I just looked at it, it took me out

00:35:15.619 –> 00:35:18.820
of it. Like, I hate that, 100%. You know, it's terrible.

00:35:18.820 –> 00:35:22.880
That’s not even AI That’s CGI. Can you just explain

00:35:22.880 –> 00:35:24.920
for the audience if they don’t know what Uncanny

00:35:24.920 –> 00:35:28.900
Valley is? I’ll do my best. To me, it’s kind

00:35:28.900 –> 00:35:33.059
of a visual eyesore where things almost look

00:35:33.059 –> 00:35:37.000
real but aren’t quite, and you can’t quite articulate

00:35:37.000 –> 00:35:40.679
that. And it’s particularly with humans. Well,

00:35:40.800 –> 00:35:46.019
I see it more and more on these shows that don’t

00:35:46.019 –> 00:35:48.280
film on location but need to create some vast

00:35:48.280 –> 00:35:53.960
backdrop. And it’s like, if I look at Hellraiser

00:35:53.960 –> 00:35:56.880
2, or I look at Labyrinth or something, that

00:35:56.880 –> 00:35:59.679
predates all this. That looks so much more interesting

00:35:59.679 –> 00:36:01.800
to me, even if I know it’s not a real place,

00:36:02.420 –> 00:36:05.039
than the new season of Stranger Things, Upside

00:36:05.039 –> 00:36:08.539
Down, whatever. You’re just seeing this synthetic

00:36:08.539 –> 00:36:11.760
-looking world behind. Exactly. You just feel

00:36:11.760 –> 00:36:14.699
the green screen effects of it. Like, yeah, I

00:36:14.699 –> 00:36:17.599
know exactly what you mean. It almost, like the

00:36:17.599 –> 00:36:19.280
A .I. McDonald’s thing, it has like a little

00:36:20.090 –> 00:36:22.269
cartoonish quality about it. It’s hard to explain.

00:36:22.409 –> 00:36:25.170
It’s not it’s not when you analyze it, but there’s

00:36:25.170 –> 00:36:29.289
just something about it. It looks almost too

00:36:29.289 –> 00:36:31.809
slick or something as well. Like, I guess it

00:36:31.809 –> 00:36:35.349
depends case by case, but I hate it. Yeah, no,

00:36:35.349 –> 00:36:38.429
that’s true. Yeah. So and you know what Tilly

00:36:38.429 –> 00:36:41.510
Norwood brings up for me, too, is just a couple

00:36:41.510 –> 00:36:45.530
of things. One is, if you can have an AI-generated

00:36:45.530 –> 00:36:47.170
movie star like these they’ve created with Tilly

00:36:47.170 –> 00:36:50.030
Norwood. What’s to stop anyone from making one

00:36:50.030 –> 00:36:55.389
And so why even have one that recurs in films?

00:36:55.389 –> 00:36:57.510
why why if you’re making AI films, why aren’t

00:36:57.510 –> 00:37:00.570
you creating new, quote-unquote, AI actors

00:37:00.570 –> 00:37:04.090
for each one? Why even have one? You know, and

00:37:04.090 –> 00:37:06.309
it’s what’s funny to me is they’re recreating

00:37:06.309 –> 00:37:09.269
like a fundamentally human phenomenon, which

00:37:09.269 –> 00:37:13.889
is we identify and grow to like individuals, and

00:37:13.889 –> 00:37:16.000
we like seeing them in places, which is why,

00:37:16.000 –> 00:37:18.000
like George Clooney can command so much for a

00:37:18.000 –> 00:37:19.960
film, right? It’s because people like George

00:37:19.960 –> 00:37:24.679
Clooney. And when you go to AI, you know, they’re

00:37:24.679 –> 00:37:27.940
going to try to recreate, like build a personal

00:37:27.940 –> 00:37:32.039
brand, quote-unquote, around an AI made-up thing,

00:37:32.579 –> 00:37:35.559
just so they can make it feel more human in a

00:37:35.559 –> 00:37:39.760
way. Because I don’t know. To me, it just it’s

00:37:39.760 –> 00:37:44.380
kind of an interesting. What’s the word I’m looking

00:37:44.380 –> 00:37:48.039
for? Like a paradox or something. I don't know.

00:37:48.840 –> 00:37:50.539
But that’s just something to think about. And

00:37:50.539 –> 00:37:53.079
then the other thing is that it’s also interesting

00:37:53.079 –> 00:37:56.980
to me that the first AI, quote-unquote, actress

00:37:56.980 –> 00:38:01.739
that was created was a young, white, pretty female.

00:38:02.679 –> 00:38:05.539
You know, of course, it wasn’t going to be like.

00:38:06.119 –> 00:38:10.360
You know, a person of color or, you know, old

00:38:10.360 –> 00:38:14.400
white male, or even an old white female. And

00:38:14.400 –> 00:38:18.059
they’re gonna go for the most like generic like

00:38:18.059 –> 00:38:22.760
idealized social construct of, like, what, you know,

00:38:22.760 –> 00:38:25.039
people want to see or at least what they think

00:38:25.039 –> 00:38:27.420
people want to see. So, just a little interesting

00:38:27.420 –> 00:38:30.260
social commentary, I think. Which brings me to

00:38:30.260 –> 00:38:32.800
the next topic of conversation, which is Darren

00:38:32.800 –> 00:38:37.159
Aronofsky. So it came out not too long ago, early

00:38:37.159 –> 00:38:41.289
this spring, that he launched a new venture that's

00:38:41.289 –> 00:38:45.269
an AI studio called Primordial Soup, and it's

00:38:45.269 –> 00:38:48.469
in partnership with Microsoft, or sorry, with Google's

00:38:48.469 –> 00:38:50.929
DeepMind not Microsoft with Google’s DeepMind

00:38:50.929 –> 00:38:55.690
AI research team, the idea being that they produce

00:38:55.690 –> 00:39:00.809
three short AI films. And they get access to some

00:39:00.809 –> 00:39:04.070
of the new tools at Google, and in exchange they

00:39:04.070 –> 00:39:06.510
give feedback and try to help them with the

00:39:06.510 –> 00:39:09.880
direction of developing the tool. And their first

00:39:09.880 –> 00:39:12.920
project from that was directed by Eliza McNitt

00:39:12.920 –> 00:39:15.119
and it’s called Ancestra, which premiered at

00:39:15.119 –> 00:39:18.159
Tribeca. How long ago? This spring. This spring.

00:39:18.340 –> 00:39:20.920
In May, or no, sorry, Tribeca was in June, so.

00:39:21.420 –> 00:39:24.159
So this is 2025. Yeah. So it has premiered. Yeah.

00:39:24.940 –> 00:39:27.519
And interestingly enough, there have been reviews,

00:39:27.519 –> 00:39:30.739
and they’re not great. In fact, one headline

00:39:30.739 –> 00:39:36.239
that I got a kick out of was: Ancestra makes

00:39:36.239 –> 00:39:40.099
the AI film hype feel pretty premature, which

00:39:40.099 –> 00:39:43.519
I thought was quite interesting. So that’s just

00:39:43.519 –> 00:39:45.539
something to note. And then along with that,

00:39:45.679 –> 00:39:48.960
Natasha Lyonne has done something similar. And

00:39:48.960 –> 00:39:52.079
she is also starting a company with Bryn Mooser,

00:39:52.900 –> 00:39:57.059
and it’s called Asterea Film. And their first

00:39:57.059 –> 00:39:59.920
film is called Uncanny Valley, which it’s a feature

00:39:59.920 –> 00:40:04.219
film. And both this and the Deronofsky… Darren

00:40:04.219 –> 00:40:06.900
Aronofsky produced film that premiered at Tribeca

00:40:06.900 –> 00:40:11.940
are both a mix of AI and live action. And the

00:40:11.940 –> 00:40:16.320
Natasha Lyonne film called Uncanny Valley is

00:40:16.320 –> 00:40:21.579
Not out yet. It’s still in process but she in

00:40:21.579 –> 00:40:23.659
both of them, when they announced this was gonna

00:40:23.659 –> 00:40:26.199
happen, they got massive feedback. Lots of people

00:40:26.199 –> 00:40:29.079
were upset. Backlash, backlash, sorry. Yeah, they

00:40:29.079 –> 00:40:31.860
got massive backlash. People were upset, commenting

00:40:31.860 –> 00:40:34.130
on their posts all sorts of things, and they’ve

00:40:34.130 –> 00:40:37.449
both since defended it. And, you know, Natasha

00:40:37.449 –> 00:40:40.190
Lyon’s point is that she’s trying to do an ethical

00:40:40.190 –> 00:40:43.829
AI company and do this ethically, which obviously

00:40:43.829 –> 00:40:47.210
brings up the question of whether you can do

00:40:47.210 –> 00:40:50.289
it ethically. For all sorts of reasons, we could

00:40:50.289 –> 00:40:54.590
touch on a little bit. And, you know, she had

00:40:54.590 –> 00:40:57.309
a Variety interview recently, and she said it's

00:40:57.309 –> 00:40:58.909
all about protecting artists and confronting

00:40:58.909 –> 00:41:02.480
this oncoming wave. Which is ridiculous, I think,

00:41:02.519 –> 00:41:04.280
when you’re talking about how you’re the one

00:41:04.280 –> 00:41:07.460
doing it. So she said, it’s comedic that people

00:41:07.460 –> 00:41:09.500
misunderstand headlines so readily because of

00:41:09.500 –> 00:41:11.800
our bizarre culture of not having reading comprehension.

00:41:12.159 –> 00:41:14.559
Suddenly I became some weird Darth Vader character

00:41:14.559 –> 00:41:17.440
or something. That’s crazy talk, but God bless

00:41:17.440 –> 00:41:21.119
addressing the backlash. I don’t know. I think

00:41:21.119 –> 00:41:23.519
her comments on the backlash are a little silly.

00:41:23.559 –> 00:41:26.559
Like she doesn’t understand as an actress why

00:41:26.559 –> 00:41:30.260
people might be upset about doing that. And she's

00:41:30.260 –> 00:41:33.000
saying, I should clarify that the AI that they’re

00:41:33.000 –> 00:41:36.260
using is for like scene extension, like you were

00:41:36.260 –> 00:41:39.800
talking about in Stranger Things and more supportive

00:41:39.800 –> 00:41:43.539
types of uses of AI, but it ignores the whole

00:41:43.539 –> 00:41:45.579
point that there are people who do those jobs.

00:41:46.139 –> 00:41:50.360
Right. That’s, yeah, that’s a solid point. Yeah.

00:41:50.760 –> 00:41:53.320
I mean, what’s interesting is who knows, maybe

00:41:53.320 –> 00:41:57.000
she makes this and it’s mind blowing. Like maybe

00:41:57.000 –> 00:42:00.079
it’s a really cool… project, we can’t judge

00:42:00.079 –> 00:42:03.079
until we see it. Right. But at the same time,

00:42:03.099 –> 00:42:05.539
at what cost? Because that’s ultimately what

00:42:05.539 –> 00:42:07.840
is this really going to be opening Pandora’s

00:42:07.840 –> 00:42:11.559
box for copycats? And yeah, that’s the thing.

00:42:11.579 –> 00:42:13.679
It’s like she might do it ethically, or

00:42:13.679 –> 00:42:16.920
some version of that. But people are going to

00:42:16.920 –> 00:42:19.219
learn from her experience. And are they going

00:42:19.219 –> 00:42:21.340
to do it ethically? Well, will the collective

00:42:21.340 –> 00:42:24.260
actions of a lot of people, even if they’re doing

00:42:24.260 –> 00:42:30.179
it ethically, amount to a situation in which

00:42:30.179 –> 00:42:33.119
people are, you know, suffering to some extent

00:42:33.119 –> 00:42:36.860
by not having work or by, you know, being forced

00:42:36.860 –> 00:42:39.800
to consume really crappy films, you know? Like,

00:42:39.820 –> 00:42:42.019
I guess my point is, like, it’s easy to say it’s

00:42:42.019 –> 00:42:44.340
unethical without considering every way in which

00:42:44.340 –> 00:42:46.280
it could go wrong, and it’s impossible to do

00:42:46.280 –> 00:42:48.079
that, you know? So, like, she can’t say it’s

00:42:48.079 –> 00:42:50.280
ethical, I don’t think. But what’s interesting

00:42:50.280 –> 00:42:53.639
to me is in every gold rush, which AI is definitely

00:42:53.639 –> 00:42:57.920
a gold rush, there are always some people who

00:42:57.920 –> 00:43:00.400
decide to jump in early and be on the forefront

00:43:00.400 –> 00:43:02.659
of it, potentially because they see the

00:43:02.659 –> 00:43:04.380
writing on the wall and they know like, oh, if

00:43:04.380 –> 00:43:06.059
this comes to be, I want to be on the forefront

00:43:06.059 –> 00:43:09.139
of it. Because then at least I will capitalize

00:43:09.139 –> 00:43:12.079
off it rather than being kind of the one that

00:43:12.079 –> 00:43:14.539
gets tossed aside because of it. Right.

00:43:14.860 –> 00:43:16.840
There’s this quote. You said gold rush, and that

00:43:16.840 –> 00:43:19.179
made me think of it. When there’s a gold rush,

00:43:19.480 –> 00:43:22.539
sell shovels. Right. And that’s exactly what

00:43:22.539 –> 00:43:25.329
I kind of think. Well, that’s what a lot of these

00:43:25.329 –> 00:43:27.409
AI companies are doing. But in some ways, that’s

00:43:27.409 –> 00:43:28.949
kind of what they’re doing. Because I think,

00:43:29.309 –> 00:43:31.809
for instance, that Darren Aronofsky probably

00:43:31.809 –> 00:43:33.570
got a lot of money from Google to announce that

00:43:33.570 –> 00:43:37.369
partnership. Oh, no doubt. Yep. And so he’s thinking

00:43:37.369 –> 00:43:38.710
like, hey, they’re going to do a partnership

00:43:38.710 –> 00:43:41.949
with someone. It should be me. And Darren Aronofsky,

00:43:42.670 –> 00:43:45.250
it’s pretty on-brand for him because he’s

00:43:45.250 –> 00:43:47.230
kind of known for embracing new techniques in

00:43:47.230 –> 00:43:49.610
filmmaking and pushing the boundary of technology

00:43:49.610 –> 00:43:52.929
in that regard. And that is an argument that

00:43:52.929 –> 00:43:54.449
a lot of people make, which we’re going to get

00:43:54.449 –> 00:43:58.389
into the arguments for AI in a little bit. But

00:43:58.389 –> 00:44:02.150
I’m not sure I buy it 100%. I think it was really

00:44:02.150 –> 00:44:03.989
about the money, if I’m being honest. And I think

00:44:03.989 –> 00:44:07.730
that as filmmakers, we’re always trying to get

00:44:07.730 –> 00:44:11.710
funding. And if you’re doing something, both

00:44:11.710 –> 00:44:13.630
their announcements got a ton of press. And so

00:44:13.630 –> 00:44:16.530
if you can kind of be the project, the first

00:44:16.530 –> 00:44:19.619
one, like, that might be a way to generate money

00:44:19.619 –> 00:44:21.679
for whatever project you’re trying to make, especially

00:44:21.679 –> 00:44:24.659
if it’s controversial. Exactly. Controversy does

00:44:24.659 –> 00:44:28.860
breed attention, which breeds funding. So

00:44:28.860 –> 00:44:31.619
this sort of brings us to the next thing that

00:44:31.619 –> 00:44:33.239
I’d like to talk about and get your input about,

00:44:33.300 –> 00:44:36.119
because there’s been a lot of talk about using

00:44:36.119 –> 00:44:39.760
AI as part of the process in pre-production,

00:44:40.059 –> 00:44:44.340
particularly in screenwriting. And as a screenwriter,

00:44:44.360 –> 00:44:46.860
I’m curious if you’ve had any interaction with AI

00:44:46.860 –> 00:44:51.760
or what’s your sense about it being part of

00:44:51.760 –> 00:44:55.380
the genesis of the story-making process. So

00:44:55.380 –> 00:44:58.019
I get why it’s tempting. A friend of mine showed

00:44:58.019 –> 00:45:01.059
me, before I even knew what ChatGPT was. He’s

00:45:01.059 –> 00:45:03.980
like, give me an idea for a movie. So I just came

00:45:03.980 –> 00:45:06.719
up with something on the spot and ten seconds

00:45:06.719 –> 00:45:10.519
later there’s a whole treatment for this potential

00:45:10.519 –> 00:45:13.679
movie. And so then I looked at it and I’m like,

00:45:13.739 –> 00:45:17.579
that is crazy, right? But I was like, this feels

00:45:17.579 –> 00:45:21.079
formulaic. And then he’s like, add twist. And

00:45:21.079 –> 00:45:26.280
yeah, you could just go from zero to 60 with

00:45:26.280 –> 00:45:31.119
doing that. But as a writer, the interesting

00:45:31.119 –> 00:45:34.719
thing about writing to me is the process of discovery.

00:45:34.840 –> 00:45:36.460
You can have an outline. You can know where you’re

00:45:36.460 –> 00:45:39.530
going. And then you get a better idea. And you’re

00:45:39.530 –> 00:45:43.489
like, oh, shit. And discovering it that way.

00:45:43.610 –> 00:45:45.849
Whereas if you’re just copying something out

00:45:45.849 –> 00:45:50.630
of the ether that is patterns of thousands of

00:45:50.630 –> 00:45:53.010
different movies to give you some perfect formula

00:45:53.010 –> 00:45:56.769
for it, what the fuck is the point? Yeah. No,

00:45:56.769 –> 00:46:00.070
you’re right. I think the Coen brothers, I think

00:46:00.070 –> 00:46:02.070
it’s Ethan Coen who’s the primary writer out

00:46:02.070 –> 00:46:08.070
of the two, I’m pretty sure. I read a book about

00:46:08.070 –> 00:46:10.630
their work in the past. And I remember something

00:46:10.630 –> 00:46:12.050
that’s always stood out to me, and I read this

00:46:12.050 –> 00:46:14.909
book 25 years ago, but something that stood out

00:46:14.909 –> 00:46:19.969
to me about it was he said, what makes his work

00:46:19.969 –> 00:46:21.710
different or what he tries to do as a writer

00:46:21.710 –> 00:46:25.510
is that whenever he writes something, the first

00:46:25.510 –> 00:46:28.510
version is usually the expected way. And then

00:46:28.510 –> 00:46:31.630
he usually looks at that and be like, what if

00:46:31.630 –> 00:46:35.039
we do the opposite? Or like, what if we take

00:46:35.039 –> 00:46:36.980
this in a totally different direction? And that’s

00:46:36.980 –> 00:46:40.000
what keeps their films feeling fresh and interesting

00:46:40.000 –> 00:46:42.739
a lot of times. That’s cool. I’d never heard

00:46:42.739 –> 00:46:46.139
that. Yeah. And so I’m kind of curious if maybe

00:46:46.139 –> 00:46:49.579
there’s a way to use AI just speculating here

00:46:49.579 –> 00:46:52.320
where you get the sort of expected formulaic

00:46:52.320 –> 00:46:53.460
draft, and then you’re like, OK, I’m going to

00:46:53.460 –> 00:46:57.500
do everything differently. You take it as a starting

00:46:57.500 –> 00:46:59.519
point and be like, OK, this is what not to do.

00:47:01.429 –> 00:47:05.110
If people start using AI to write scripts in

00:47:05.110 –> 00:47:08.789
such a way and then change them, it’s just like

00:47:08.789 –> 00:47:12.690
a shortcut. You know, I feel like you earn your

00:47:12.690 –> 00:47:15.789
projects by the blood, sweat, and tears if it’s

00:47:15.789 –> 00:47:18.010
something creative. Yeah, well said. You know,

00:47:18.150 –> 00:47:20.389
if you’re like a painter, you’re not just gonna

00:47:20.389 –> 00:47:21.650
be like, you know what, I’m just gonna press

00:47:21.650 –> 00:47:24.329
a button on a computer and tell it what to paint

00:47:24.329 –> 00:47:27.420
and be like, I’m a fucking artist. Right. No,

00:47:27.420 –> 00:47:29.719
it’s true. I mean, and also like there’s no soul

00:47:29.719 –> 00:47:33.480
in that. And, you know, I think people can see

00:47:33.480 –> 00:47:35.800
much like we were talking about the acting, like

00:47:35.800 –> 00:47:38.199
whether there’s like soul, like there’s a hand

00:47:38.199 –> 00:47:41.920
that made the thing. And, you know, with writing

00:47:41.920 –> 00:47:44.619
in particular, because we’re talking about formulating

00:47:44.619 –> 00:47:48.159
the story at like the base level, you know, AI

00:47:48.159 –> 00:47:51.760
really only generates based on what’s come before.

00:47:52.260 –> 00:47:54.420
So you’re never going to get a script out of

00:47:54.420 –> 00:47:57.840
that that’s a totally new and innovative

00:47:57.840 –> 00:48:02.219
idea that really pushes the form of

00:48:02.219 –> 00:48:05.119
cinema forward. It’s only ever gonna be like a

00:48:05.119 –> 00:48:07.539
rehashing of stories that we’ve told before, right?

00:48:07.539 –> 00:48:11.760
And to get away from clichés, that’s where

00:48:11.760 –> 00:48:15.260
you hear “write what you know,” right? You

00:48:15.260 –> 00:48:17.900
know your life, you know those moments of things

00:48:17.900 –> 00:48:21.369
that happen that you know, you never would have

00:48:21.369 –> 00:48:23.750
thought of. It’s like, the truth is stranger

00:48:23.750 –> 00:48:25.710
than fiction. So I’ll give you an example. So

00:48:25.710 –> 00:48:28.989
years ago, I got into an accident. My windshield

00:48:28.989 –> 00:48:33.130
was cracked. I had to take it to a mechanic,

00:48:33.130 –> 00:48:34.690
you know, wherever they’re gonna replace the

00:48:34.690 –> 00:48:37.710
windshield, but it was hot out. So I had the

00:48:37.710 –> 00:48:41.909
AC on and the guy told me he’s like… Those

00:48:41.909 –> 00:48:43.969
glass shards could get sucked in there and spit

00:48:43.969 –> 00:48:46.929
in your face. And I was like, holy fuck. So I

00:48:46.929 –> 00:48:48.710
wrote that into a script

00:48:48.710 –> 00:48:51.289
because I’d never seen that, right? I never would

00:48:51.289 –> 00:48:53.829
have thought of that. You know, I was

00:48:53.829 –> 00:48:56.150
in an accident and that happened. Yeah, so like

00:48:56.150 –> 00:48:59.230
the point is, you know, you have these moments in life

00:48:59.230 –> 00:49:02.989
or something goes south or whatever and you as

00:49:02.989 –> 00:49:08.820
a writer, you use it. Yeah. Yeah. And

00:49:08.820 –> 00:49:10.559
AI obviously doesn’t have that experience, like

00:49:10.559 –> 00:49:12.260
you were saying earlier, like the lived experience

00:49:12.260 –> 00:49:14.219
to incorporate things like that. I mean, it sort

00:49:14.219 –> 00:49:15.780
of has access to all of our lived experience,

00:49:16.039 –> 00:49:19.440
but I do think like inspiration comes from, you

00:49:19.440 –> 00:49:22.179
know, all sorts of strange places. And so you’re

00:49:22.179 –> 00:49:23.739
far more likely to come up with something unique

00:49:23.739 –> 00:49:27.000
that way. One last thing about like screenwriting

00:49:27.000 –> 00:49:31.099
AI and all that, is it’s not as new as we think

00:49:31.099 –> 00:49:34.980
it is. Back in 2016, there was a project called

00:49:34.980 –> 00:49:37.780
Impossible Things, a Canadian production company

00:49:37.780 –> 00:49:40.239
was trying to put together, and it achieved its

00:49:40.239 –> 00:49:42.340
Kickstarter goal. But it basically was going

00:49:42.340 –> 00:49:47.320
to integrate AI patterns to find the perfect

00:49:47.320 –> 00:49:50.119
formula for a movie and then have a human write

00:49:50.119 –> 00:49:53.139
with that in mind. And what’s interesting is

00:49:53.139 –> 00:49:56.719
it achieved its Kickstarter goal. But if you

00:49:56.719 –> 00:50:00.699
look at the IMDb page, it’s in pre-production.

00:50:00.739 –> 00:50:03.559
So I think it’s kind of a dead end. But

00:50:03.559 –> 00:50:06.139
it’s just that people were thinking about it back then.

00:50:06.139 –> 00:50:08.300
Yeah. You know, trying to get ahead of it. But

00:50:08.300 –> 00:50:10.099
yeah, I mean, yeah, you bring up a good point.

00:50:10.239 –> 00:50:14.960
It certainly isn’t new. I think certain

00:50:14.960 –> 00:50:17.800
people have been wanting this for a long time,

00:50:17.800 –> 00:50:19.739
you know, and been kind of working towards it.

00:50:19.739 –> 00:50:21.400
And there’s something sci fi about it, you know,

00:50:21.400 –> 00:50:23.300
it is sort of interesting. But, you know, when

00:50:23.300 –> 00:50:26.019
you’re speculating about future technologies,

00:50:26.179 –> 00:50:28.550
you’re often thinking about all of the

00:50:28.550 –> 00:50:30.670
positives and, you know, rarely thinking about

00:50:30.670 –> 00:50:33.289
the negatives. Right. And now here we are. The

00:50:33.289 –> 00:50:36.289
other thing, actually, is that in my algorithms

00:50:36.289 –> 00:50:39.809
as a screenwriter, there’s companies like Greenlight

00:50:39.809 –> 00:50:44.489
Coverage that use AI to assess your log

00:50:44.489 –> 00:50:48.250
lines. Yeah. Or probably even your scripts. because

00:50:48.250 –> 00:50:50.190
it can turn it around super quick. Have you done

00:50:50.190 –> 00:50:52.349
that before? No, but I was curious about it because

00:50:52.349 –> 00:50:54.889
they do a really good job of presenting themselves.

00:50:55.329 –> 00:50:58.110
I think it was even before I realized AI was

00:50:58.110 –> 00:51:01.130
involved. We should run our script for Last Call

00:51:01.130 –> 00:51:02.710
through just for the fun of it and see what happens.

00:51:03.010 –> 00:51:05.090
Jesus Christ, that would cost money to use it.

00:51:05.409 –> 00:51:08.670
Oh, yeah. Well, never mind then. So moving on

00:51:08.670 –> 00:51:12.570
to video generation: somebody

00:51:12.570 –> 00:51:15.360
shared Higgsfield with you, which you shared

00:51:15.360 –> 00:51:18.579
with me. And Higgsfield is essentially software

00:51:18.579 –> 00:51:23.739
for generating really convincing and lifelike

00:51:23.739 –> 00:51:28.480
AI video. With that, you can change

00:51:28.480 –> 00:51:31.760
what kind of camera, what kind of lens. You can

00:51:31.760 –> 00:51:34.880
make the character persist throughout the scene

00:51:34.880 –> 00:51:37.400
or potentially maybe the film. You know, I wanted

00:51:37.400 –> 00:51:39.500
to play around with it more, but you have to

00:51:39.500 –> 00:51:41.650
pay for it to even really start playing with

00:51:41.650 –> 00:51:43.849
it, at least for the things that I wanted to

00:51:43.849 –> 00:51:47.710
try. And so I don’t have hands-on experience with it. But

00:51:47.710 –> 00:51:50.969
what I saw was like it was really impressive.

00:51:51.030 –> 00:51:54.050
I will say it is very, very, very impressive and

00:51:54.050 –> 00:51:56.909
somewhat scary because, you know, I don’t think

00:51:56.909 –> 00:51:59.030
it’s 100 percent there, but it shows promise

00:51:59.030 –> 00:52:01.769
in terms of like who knows what could happen.

00:52:02.309 –> 00:52:05.969
Right. For all that type of stuff, you know,

00:52:06.030 –> 00:52:09.650
the friend who shared it with me, I agreed. I

00:52:09.650 –> 00:52:11.679
was like, all right, for previs, you could

00:52:11.679 –> 00:52:13.579
totally. Previs for sure. You could use that.

00:52:13.699 –> 00:52:14.980
And if you guys don’t know what previs

00:52:14.980 –> 00:52:17.119
is, it’s like if you’re shooting a movie, you

00:52:17.119 –> 00:52:20.980
do your storyboards. You want to, like, know

00:52:20.980 –> 00:52:23.000
what the movie looks like before you shoot it.

00:52:23.019 –> 00:52:24.599
So you kind of have a template. Yeah. And you

00:52:24.599 –> 00:52:28.000
could save time on set. Yeah. So I could

00:52:28.000 –> 00:52:31.239
see the use case for that. But for sure, you

00:52:31.239 –> 00:52:33.480
just don’t want people then making movies that

00:52:33.480 –> 00:52:37.880
way. Yeah. Yeah. What I still think

00:52:37.880 –> 00:52:40.239
are barriers to video generation really taking

00:52:40.239 –> 00:52:41.699
off, and I don’t even think that Higgsfield

00:52:41.699 –> 00:52:45.340
entirely solves this, is that lip syncing can

00:52:45.340 –> 00:52:50.059
still be an issue because we are automatically

00:52:50.059 –> 00:52:51.840
keyed into the shape of people’s mouths when

00:52:51.840 –> 00:52:55.440
they’re talking and even very subtle differences

00:52:55.440 –> 00:52:58.219
are highly noticeable and they’re pretty good

00:52:58.219 –> 00:53:01.949
with it these days. but there’s still some uncanny

00:53:01.949 –> 00:53:04.349
valley stuff in the mouth area when people are

00:53:04.349 –> 00:53:07.170
talking. Right. No, a good example is like

00:53:07.170 –> 00:53:10.130
sometimes on some streaming platforms, I’ve seen

00:53:10.130 –> 00:53:13.550
that where it’s like just off, maybe it’s like

00:53:13.550 –> 00:53:16.090
a European film and they’re not speaking English,

00:53:16.130 –> 00:53:18.170
but they have an English dub over it. Yeah, yeah,

00:53:18.170 –> 00:53:20.269
exactly. And you’re like, wait, this is, or even

00:53:20.269 –> 00:53:23.409
if there’s like a lag, it just drives me crazy.

00:53:23.590 –> 00:53:28.760
Yeah. No, it’s true. Sometimes what I think about

00:53:28.760 –> 00:53:33.179
with that is that with TikTok videos and YouTube,

00:53:34.039 –> 00:53:36.380
the way that kids approach social media these

00:53:36.380 –> 00:53:40.619
days is very postmodern, where anything can happen

00:53:40.619 –> 00:53:44.219
at any time and you’re less worried about things

00:53:44.219 –> 00:53:47.420
like continuity and even sometimes the quality

00:53:47.420 –> 00:53:50.079
of the shot or the effect that they’re doing.

00:53:50.480 –> 00:53:53.360
All of a sudden, cartoon sunglasses fall on someone’s

00:53:53.360 –> 00:53:57.239
face in a YouTube video to like… uh, signify

00:53:57.239 –> 00:53:58.579
that they just did something thug, you know,

00:53:58.579 –> 00:54:01.719
or whatever, or like, you know, just pauses and

00:54:01.719 –> 00:54:03.719
zooms in real quick. So it’s like, but it becomes

00:54:03.719 –> 00:54:05.840
hyper pixelated because you zoomed in so close,

00:54:05.840 –> 00:54:08.579
you know, things like that. And I just wonder

00:54:08.579 –> 00:54:10.880
if all of these things are actually just training

00:54:10.880 –> 00:54:15.860
us to accept, you know, unintentionally

00:54:15.860 –> 00:54:18.019
so, but accept like the uncanny valley mouth

00:54:18.019 –> 00:54:20.280
thing, you know, like you’re talking about the

00:54:20.280 –> 00:54:22.639
dubbing of voiceover. It’s more and

00:54:22.639 –> 00:54:24.960
more popular on Netflix to watch foreign films

00:54:24.960 –> 00:54:29.280
with dubbing. And, you know, do people

00:54:29.280 –> 00:54:31.820
care anymore if there’s uncanny valley stuff

00:54:31.820 –> 00:54:35.360
around the mouth? Like, who knows? I tend to

00:54:35.360 –> 00:54:38.820
read the subtitles. Like, it does bite you in

00:54:38.820 –> 00:54:40.719
the ass a little bit if there’s action and all

00:54:40.719 –> 00:54:42.900
this stuff and you’re trying to read those. But

00:54:42.900 –> 00:54:45.460
I’m good with reading the subtitles. Same. But

00:54:45.460 –> 00:54:48.000
like, we’re not the next generation. No, true.

00:54:48.219 –> 00:54:51.199
But at the same time, I think there’s some younger

00:54:51.199 –> 00:54:53.550
kids that will probably see the older stuff

00:54:53.550 –> 00:54:56.530
and be like, no, this is cool. Like it for the

00:54:56.530 –> 00:54:58.809
reasons we do. They’re just born in the wrong

00:54:58.809 –> 00:55:01.769
generation. No, totally, totally. Yeah, the other

00:55:01.769 –> 00:55:05.010
problem that is still a problem with video generation,

00:55:05.309 –> 00:55:07.550
and this segment I think applies a lot to filmmakers

00:55:07.550 –> 00:55:09.670
because it’s the video generation that a lot

00:55:09.670 –> 00:55:12.469
of us are worried about. But another problem

00:55:12.469 –> 00:55:15.150
is eye lines. I’ve still yet to find, I looked

00:55:15.150 –> 00:55:18.210
for examples before we recorded this of scenes

00:55:18.210 –> 00:55:20.750
where more than one person is talking to each other,

00:55:22.159 –> 00:55:25.460
in realistic videos; I have yet to find one.

00:55:26.699 –> 00:55:28.480
And sometimes when you have more than one person

00:55:28.480 –> 00:55:30.960
talking in a scene, they’re kind of looking at

00:55:30.960 –> 00:55:33.159
other things or at the screen, but never at each

00:55:33.159 –> 00:55:36.320
other. And that’s something we intuitively know

00:55:36.320 –> 00:55:39.420
in the language of cinema, that when you cut

00:55:39.420 –> 00:55:41.820
from this person looking camera left, this person

00:55:41.820 –> 00:55:44.780
needs to look camera right. And like, you know,

00:55:44.960 –> 00:55:47.829
in geography… in the geography of the scene

00:55:47.829 –> 00:55:50.030
in the room where they should be looking. 180

00:55:50.030 –> 00:55:53.190
rule, right? Well, the 180 rule, yeah, it comes

00:55:53.190 –> 00:55:56.650
into play. I mean, for all I know, AI obeys the

00:55:56.650 –> 00:56:00.250
180 rule, but regardless, the continuity from

00:56:00.250 –> 00:56:02.710
shot to shot of where the person’s looking and

00:56:02.710 –> 00:56:04.730
where their hand is and all of that is off, and

00:56:04.730 –> 00:56:08.409
I’ve noticed that in some of the AI -generated

00:56:08.409 –> 00:56:11.309
videos I’ve seen, where it’s like, oh, he’s pointing

00:56:11.309 –> 00:56:12.670
to that part of the room, but where is that part

00:56:12.670 –> 00:56:14.130
of the room? We haven’t seen it yet. Like, it

00:56:14.130 –> 00:56:17.320
just… It doesn’t entirely make sense, the geography

00:56:17.320 –> 00:56:19.559
of the scene. Well, so that gets me wondering

00:56:19.559 –> 00:56:22.340
if AI is just going to start course correcting

00:56:22.340 –> 00:56:26.360
itself in terms of film or how we… But the problem

00:56:26.360 –> 00:56:29.619
is that AI doesn’t understand. This is why I’m

00:56:29.619 –> 00:56:31.719
not particularly scared about AI: AI doesn’t

00:56:31.719 –> 00:56:34.079
understand what it’s doing. It’s just looking

00:56:34.079 –> 00:56:36.440
at patterns and how things have been done before.

00:56:36.500 –> 00:56:39.739
But it doesn’t realize in those patterns, cutting

00:56:39.739 –> 00:56:44.059
between shots, because of the pattern, it

00:56:44.059 –> 00:56:49.039
doesn’t quite understand that, you know, it’s

00:56:49.039 –> 00:56:51.739
looking at the other shot, the other person.

00:56:52.039 –> 00:56:54.179
So it’s like dumb to this stuff. And it

00:56:54.179 –> 00:56:56.440
really, so I think that’s going to be, those

00:56:56.440 –> 00:56:58.039
kinds of things are actually the last barrier

00:56:58.039 –> 00:57:00.679
for AI, and they are hard problems to solve.

00:57:00.860 –> 00:57:03.019
And I’m not saying we’re never going to get there,

00:57:03.019 –> 00:57:06.860
but from what limited things I know about how

00:57:06.860 –> 00:57:09.440
it works, I do think it’s going to take another

00:57:09.440 –> 00:57:13.730
leap forward in, I don’t know if it’s the algorithms

00:57:13.730 –> 00:57:15.909
or a whole different approach to AI before I

00:57:15.909 –> 00:57:18.150
think that’s even really a thing that they’re

00:57:18.150 –> 00:57:20.070
going to be able to do. So I think that limits

00:57:20.070 –> 00:57:22.670
AI, from the video generation standpoint, to

00:57:22.670 –> 00:57:28.210
being more used in montages or music videos or

00:57:28.210 –> 00:57:32.340
disparate kinds of elements that are montaged

00:57:32.340 –> 00:57:34.440
together to make a cohesive narrative. But I

00:57:34.440 –> 00:57:36.619
don’t know if we’re going to get like a full

00:57:36.619 –> 00:57:39.920
Stranger Things movie or whatever, because making

00:57:39.920 –> 00:57:42.360
those scenes and making them feel authentic is

00:57:42.360 –> 00:57:45.820
going to be really challenging. Yeah. And what’s

00:57:45.820 –> 00:57:49.619
also kind of fun, I mean, I’m being sardonic

00:57:49.619 –> 00:57:53.460
when I say that, is that it’s like these things

00:57:53.460 –> 00:57:55.800
with AI are nothing compared to like the real

00:57:55.800 –> 00:58:01.199
dangers. You know, it’s like the biggest threat

00:58:01.199 –> 00:58:04.039
to humanity, as the experts will tell you. Yeah.

00:58:04.480 –> 00:58:06.800
Well, let’s get into the arguments for AI, then

00:58:06.800 –> 00:58:08.380
we’ve kind of touched on them a little bit here

00:58:08.380 –> 00:58:10.980
and there or alluded to them. But, you know,

00:58:11.039 –> 00:58:16.380
a lot of people really suggest that it’s

00:58:16.380 –> 00:58:20.159
silly to resist AI, and that much like

00:58:20.159 –> 00:58:22.039
other technologies in the past, yeah, it’s going

00:58:22.039 –> 00:58:25.280
to disrupt the industry, and some people are

00:58:25.280 –> 00:58:27.460
gonna lose their job, but it’s gonna create new

00:58:27.460 –> 00:58:31.940
jobs somehow. And if it’s a

00:58:31.940 –> 00:58:33.800
tool, creative people are gonna learn how to use

00:58:33.800 –> 00:58:36.179
it and those people are gonna come out ahead

00:58:36.179 –> 00:58:38.780
and That’s what you hear from everyone from Mark

00:58:38.780 –> 00:58:42.000
Cuban on down to, you know, particularly the

00:58:42.000 –> 00:58:46.800
CEOs of these AI companies. Right. Well, my thoughts

00:58:46.800 –> 00:58:50.900
on that are, from what I understand,

00:58:51.059 –> 00:58:54.500
right? There are no real guardrails for AI, and

00:58:54.500 –> 00:58:57.380
you have every AI company racing to be the next

00:58:57.380 –> 00:59:01.460
Google. So that’s dangerous. You know, and it’s

00:59:01.460 –> 00:59:03.880
all, it all goes back to fucking money. You know,

00:59:03.920 –> 00:59:05.519
it’s all rich people. It’s all billionaires.

00:59:05.860 –> 00:59:10.199
It’s all the corporations who are

00:59:10.199 –> 00:59:11.860
dragging us. But do you buy that argument? Do

00:59:11.860 –> 00:59:13.559
you think that it’s really going to be? I mean,

00:59:13.599 –> 00:59:16.360
I think they believe that and they want you to

00:59:16.360 –> 00:59:18.840
believe it. Well, yeah. Yeah. But do you think,

00:59:18.920 –> 00:59:22.019
do you think, though, that there’s truth to it,

00:59:22.039 –> 00:59:24.000
though? Do you think that? I think there could

00:59:24.000 –> 00:59:26.860
be truth to that. But at the same time, like

00:59:26.860 –> 00:59:30.380
in the last year, you know, AI has not performed

00:59:30.380 –> 00:59:34.559
the way that it’s been projected to. Yeah. Yeah.

00:59:34.679 –> 00:59:37.630
You know, a lot of people call it a

00:59:37.630 –> 00:59:39.409
bubble, like you were saying, that like this

00:59:39.409 –> 00:59:41.829
is all marketing hype, which I kind of fall in

00:59:41.829 –> 00:59:46.230
that argument. But also, I actually heard

00:59:46.230 –> 00:59:49.929
a really interesting argument from a programmer

00:59:49.929 –> 00:59:57.010
He’s a YouTuber named ThePrimeagen, and he found

00:59:57.010 –> 01:00:01.889
this post by the CEO of Microsoft who said

01:00:01.889 –> 01:00:07.179
we’ve got to stop calling AI “AI slop.” He’s like,

01:00:07.460 –> 01:00:10.559
you know, like as if we’re doing them a disservice

01:00:10.559 –> 01:00:13.340
and of course it benefits him for people

01:00:13.340 –> 01:00:15.960
to stop calling AI slop “slop.” But ThePrimeagen’s

01:00:15.960 –> 01:00:18.900
argument was that nobody would be calling it

01:00:18.900 –> 01:00:22.800
AI slop if it were good. You know, like you don’t

01:00:22.800 –> 01:00:26.070
see the need for other products that are good

01:00:26.070 –> 01:00:28.289
for them to have to go out and chide people

01:00:28.289 –> 01:00:31.809
for calling them names or whatever. And, you know,

01:00:31.809 –> 01:00:34.010
his larger point though is that there’s such

01:00:34.010 –> 01:00:37.590
a mismatch between what these CEOs are promising

01:00:37.590 –> 01:00:39.829
that AI can do and then when you get in front

01:00:39.829 –> 01:00:41.570
of it and actually try to use it like what it

01:00:41.570 –> 01:00:44.710
actually can do right and you feel that mismatch

01:00:44.840 –> 01:00:47.559
every day interacting with AI, at least I do.

01:00:47.960 –> 01:00:49.780
And I, you know, I’m probably not as much of

01:00:49.780 –> 01:00:51.800
an AI head as other people, but I, you know,

01:00:51.800 –> 01:00:53.940
I experiment with it. I try to use it sometimes

01:00:53.940 –> 01:00:57.320
if I, I don’t know, I’m doing like a refresh

01:00:57.320 –> 01:00:59.139
of a resume or I don’t know, whatever it could

01:00:59.139 –> 01:01:01.400
be, but like, ultimately I ended up redoing it

01:01:01.400 –> 01:01:03.199
all by hand anyways, because I’m so disappointed

01:01:03.199 –> 01:01:05.159
with what comes out of it. And I’m not, you know,

01:01:05.199 –> 01:01:07.179
I’m not using low -end models. I’m trying to

01:01:07.179 –> 01:01:08.840
do it with the best ones I have access to. For

01:01:08.840 –> 01:01:11.380
those people who are kind of on the fence and…

01:01:10.960 –> 01:01:14.440
experimenting with AI, I will say the one use

01:01:14.440 –> 01:01:17.219
that I get out of it being like such a lover

01:01:17.219 –> 01:01:20.400
of music and punk music is I will go to Chat

01:01:20.400 –> 01:01:22.739
GPT and be like, you know, give me more songs

01:01:22.739 –> 01:01:25.400
like “Hybrid Moments” by the Misfits that are of

01:01:25.400 –> 01:01:28.579
that era that are deep cuts, and AI will present

01:01:28.579 –> 01:01:32.420
very interesting song choices that I have

01:01:32.420 –> 01:01:35.530
never come across before. Yeah, and yeah, there’s

01:01:35.530 –> 01:01:37.550
legitimate uses for AI, and I’m not

01:00:37.550 –> 01:00:39.630
trying to, like, argue personally that

01:00:39.630 –> 01:00:42.929
there aren’t. By legitimate I mean, like,

01:01:42.929 –> 01:01:46.349
look, all technology is value-agnostic

01:01:46.349 –> 01:01:50.190
in many ways. It’s how we use the technology

01:01:50.190 –> 01:01:53.110
that makes it sort of ethical or not ethical,

01:01:53.110 –> 01:01:57.030
so I think there are, like, fair use cases

01:01:57.030 –> 01:02:00.769
for AI that are both incredibly helpful, but

01:02:00.769 –> 01:02:03.590
also not too impactful negatively on people.

01:02:03.829 –> 01:02:06.090
And I think what you said is a really good example.

01:02:06.269 –> 01:02:08.369
I mean, I think at the same time, someone gave

01:02:08.369 –> 01:02:10.269
me shit for it because it’s like the environmental

01:02:10.269 –> 01:02:13.469
cost of every time you, and I’m like, that’s,

01:02:13.469 –> 01:02:16.409
that’s fair. You know, this isn’t meant to

01:02:16.409 –> 01:02:19.570
be an entire analysis of AI in general and like

01:02:19.570 –> 01:02:21.349
all the pros and cons of it, but that is a really

01:02:21.349 –> 01:02:23.170
good point. The environmental impact for all

01:02:23.170 –> 01:02:25.650
of this stuff is massive. You know, I mean, Yeah,

01:02:26.030 –> 01:02:32.329
I think we would be fine as a society, or a

01:02:32.329 –> 01:02:35.170
species I guess is what I mean, without AI.

01:02:35.409 –> 01:02:38.750
Like we don’t need this, but it’s here and Pandora’s

01:02:38.750 –> 01:02:40.570
box has been opened. Well, you know, and the

01:02:40.570 –> 01:02:43.170
thing is like, they’re selling it as like these

01:02:43.170 –> 01:02:45.170
like shortcuts, right? You can do everything

01:02:45.170 –> 01:02:48.010
quicker, faster, whatever. But like, you know,

01:02:48.170 –> 01:02:51.429
sometimes, you don't want it to be quicker and

01:02:51.429 –> 01:02:53.610
faster because it’s about the process, particularly

01:02:53.610 –> 01:02:55.829
when it comes to creative things like filmmaking

01:02:55.829 –> 01:03:01.130
or painting. It’s not the end result that makes

01:03:01.130 –> 01:03:09.570
it art. It’s the process. Yeah. You create something

01:03:09.570 –> 01:03:13.670
and you go through hell, but you have these moments

01:03:13.670 –> 01:03:17.010
of joy doing it where you have a breakthrough

01:03:17.010 –> 01:03:18.750
and you’re like, holy fuck, this is really cool.

01:03:19.489 –> 01:03:22.710
And that’s why people are driven to create art.

01:03:23.230 –> 01:03:25.750
You know, there’s a struggle there and you’re

01:03:25.750 –> 01:03:27.650
compelled to do it if you’re like a creative

01:03:27.650 –> 01:03:31.050
type. Yep. I couldn’t agree more. That’s probably

01:03:31.050 –> 01:03:32.809
a good place to leave it. Do you have any closing

01:03:32.809 –> 01:03:35.409
thoughts? Anything you want to share? Yeah, fuck

01:03:35.409 –> 01:03:37.530
AI. I don’t know. Yeah, no, that’s a good closing

01:03:37.530 –> 01:03:41.650
thought, actually. I endorse that comment. Party

01:03:41.650 –> 01:03:46.110
on. That’ll bring us to the end of this week’s

01:03:46.110 –> 01:03:49.659
episode of Nightmare Logic. Next week, we’re

01:03:49.659 –> 01:03:51.900
going to sit down with our friend and film historian

01:03:51.900 –> 01:03:54.719
David Delvalle to talk about what horror is and

01:03:54.719 –> 01:03:57.199
how it’s evolved from the 1960s when he was a

01:03:57.199 –> 01:04:00.199
kid until today. For this episode’s show notes,

01:04:00.420 –> 01:04:03.219
go to NightmareLogic.net and feel free to follow

01:04:03.219 –> 01:04:06.400
us on Instagram at NightmareLogicPod. We’d also

01:04:06.400 –> 01:04:08.539
like to give a shout out to our amazing composer

01:04:08.539 –> 01:04:12.059
Lars Lang-Peterson for our awesome intro. We

01:04:12.059 –> 01:04:14.460
are your hosts, Christopher Smith and Peter Sawyer.

01:04:14.820 –> 01:04:15.679
Until next time!