Episode 227

May 25, 2025

02:24:32

Ep. 227: the atlas vampire and ai psychosis

Hosted by

Mark Lewis and Corrigan Vaughan
Jack of All Graves

Show Notes

Our AI dystopia races forward! This week we're going through some of the unhinged AI news stories from the past few weeks, from AI-induced psychosis, to college cheaters, to an AI victim statement in an actual courtroom. What even is reality anymore?

Highlights:

[0:00] Marko tells Corrigan about the bizarre mystery of the Atlas Vampire
[21:36] Mark tells a horrifyingly embarrassing story
[34:15] Some of our listeners have tangential connections to murderers; a big thank you to Ryan for filling in last week; book club and Ko-Fi updates
[42:26] Corri hates Eli Roth and rich people who crowdfund
[58:18] What we watched: The Ugly Stepsister, Evil Dead Rise, Miller's Crossing, Fear Street: Prom Queen, The Menu, Until Dawn, Warfare, A Nightmare on Elm Street 4: The Dream Master, Freddy's Dead: The Final Nightmare
[01:30:50] AI is officially out of hand in some very scary and disturbing ways



Episode Transcript

[00:00:04] Speaker A: I said I was ready. I'm not. Hang on. But it's fine. We can carry on recording. [00:00:08] Speaker B: That was a lie. [00:00:10] Speaker A: I told a lie. All right, look, just classic stuff from me this week, right? Classic stuff, okay? Rigorously watertight research. Different time, a different place, a different land. [00:00:26] Speaker B: Okay? [00:00:26] Speaker A: But, but, but, but. A horror, a killing. A horror, a murder, but under the fucking strangest of circumstances, the strangest of events that sends ripples outwards, ripples of unease and terror through a community. Yeah, okay. Have you ever been to. Have you ever been to Sweden? [00:00:50] Speaker B: No, I have never been to Sweden. [00:00:52] Speaker A: Is that one of the. [00:00:52] Speaker B: Have you. [00:00:53] Speaker A: Have I. No, I've been to Denmark. I've been to Holland. I have not been to Sweden. [00:00:59] Speaker B: It's one of those, like, you know, there's places that are on, like, your bucket list or whatever, and then there's ones that just aren't. [00:01:07] Speaker A: Yes. Oh, I know that all too well. [00:01:09] Speaker B: I would put Sweden on the aren't. [00:01:13] Speaker A: What are the other places, then, that you simply could not give a fuck to visit? [00:01:17] Speaker B: Just, honestly, a lot of, like, that kind of general area or like, I don't really give a fuck about Germany. [00:01:24] Speaker A: Bingo. I'm so glad you said that, because I'm the same. Honestly. Fucking not asked. [00:01:29] Speaker B: I went to Belgium. That's, like, adjacent. And loved it. But I have no interest in going to Germany. [00:01:35] Speaker A: If I ever wind up in Germany, it will. It'll be by mistake. I have no, incidentally, interest in going to Germany at all. And I don't. I don't know why. Owen's big into Germany. He's learning German on duration. Yeah. [00:01:47] Speaker B: Why? [00:01:48] Speaker A: Fuck knows. Fuck only knows. But he. Honestly, I have no clue. [00:01:54] Speaker B: No. You've never asked, like, what. What prompted you to start German? [00:01:58] Speaker A: It's a really good question. [00:02:00] Speaker B: I. Listen, with the state of the world, I would check in while your children are learning German. That's all I'm saying. But, yeah, he's been so funny. [00:02:11] Speaker A: Got to be about a year. He's been doing a year just independently. He's not doing it in school, just of his own volition. [00:02:20] Speaker B: Incredible. [00:02:20] Speaker A: Yeah. Work out well. [00:02:22] Speaker B: Then you might end up having to take him to Germany, as it turns out. Or he'll just go, like, do a semester abroad or something. [00:02:28] Speaker A: Yeah, possibly. But that isn't what we're talking about. We're going to. [00:02:34] Speaker B: We're talking about Sweden. [00:02:35] Speaker A: We're going to Sweden. And not only are we going to Sweden, we're going to Sweden in the 1930s. [00:02:42] Speaker B: All right. There's a Thing I've got no context for what. [00:02:46] Speaker A: Just try and visualize that for me, if you don't mind. Listener. [00:02:50] Speaker B: It's like we were playing. What was the game we were playing last night for the. [00:02:54] Speaker A: Let's play Assassin's Creed Shadows. Now available Assassin's Creed Shadows subscribers of all levels. [00:03:01] Speaker B: Yes. And you know, I was like, what year is this supposed to be in your. Like 1700s or whatever? Like, does this look like the Japan? 
You know, but it's like, you know, there's some places that we have so much media about that, like, we can picture what this time in that place is. Honestly, I couldn't tell you what Sweden looks like now. But certainly I have no frame of reference for. [00:03:22] Speaker A: Listen, I'll tell you. The 1700 was a complete guess. Let me just find out where. [00:03:28] Speaker B: When it actually takes place. [00:03:30] Speaker A: 1579. So point. [00:03:35] Speaker B: Yeah, there we go. But yeah, I have no idea what Sweden in 2025 nor 1930s would be like. [00:03:43] Speaker A: Well, that is not going to change over the course of the next 50 or so minutes. Listen, super brief, a very brief opener from me tonight. Brief and brutal. Brief and brutal and mystifying. [00:03:58] Speaker B: Okay. [00:03:58] Speaker A: Because we have much to discuss. Right? We do, it's true, but 1930s, so you've got a kind of a Great Depression kind of situation globally. So poverty, unemployment was high and in kind of provincial Sweden, which is where we are here. We are in the region of Atlas district of Stockholm. [00:04:28] Speaker B: Okay, all right. [00:04:30] Speaker A: Now. Now, even though sex work was illegal, it was commonplace. [00:04:38] Speaker B: Okay, Sure. [00:04:40] Speaker A: A quite a sizable population of sex workers. Stigmatized, as you'd imagine. [00:04:49] Speaker B: Right. [00:04:52] Speaker A: And in that community. Meet Lily, Lily Lindstrom. All right. [00:05:02] Speaker B: All right. [00:05:05] Speaker A: Born in 1899. And at the time of the story, she's 32, she is a waitress. [00:05:11] Speaker B: I had a neighbor who was born in 1899. That's good context for me. She is the. She was born at the same time as Mr. Shealy. [00:05:19] Speaker A: Right. So Lily Linderstrom and Mr. Sheely, both born in 1899. As I say, Lily is a waitress and also a sex worker. She lives alone in a small apartment in the Atlas district of Stockholm. I'm gonna try and pronounce this Atlas Sherm Redet. Fuck knows if how badly I've mangled that. I apologize. [00:05:46] Speaker B: I feel like there's not gonna be a lot of people who know how to correct you on that here. [00:05:51] Speaker A: Well, you don't need me to tell you how big we are in Sweden. Well, I just pissed off a lot of our core fan base a Lot of day one joagas. A lot of day one jackalites in Sweden. Yeah, yeah. So anyway, Lily was used to, by nature of her trade, operating kind of in a clandestine way. Yeah, she was used to her clientele being often transient visitors who would blow through town. [00:06:32] Speaker B: Sure, yeah. [00:06:33] Speaker A: No pun intended. [00:06:34] Speaker B: Often mark loads. [00:06:37] Speaker A: I'm sorry, that was. [00:06:38] Speaker B: Ew. [00:06:39] Speaker A: I know, I know, I know. She was used to interactions with her clients, often being arranged kind of through. Through, you know, word of mouth, informal networks, discreet kind of arrangements. She was. You would find Lily described as a discreet, intelligent, kind of low key woman who kept herself to herself, caused no trouble, harmed no one, had friends, but was a low key life, shall we say? [00:07:13] Speaker B: Yeah. [00:07:15] Speaker A: A low key life which came to a hell of a halt, I'm afraid to say, on 5-4-32. Yes, 1932. May 4th. A good friend of Lily's, one Minnie Janssen, spelled with a J, so I'm gonna say Janssen. [00:07:36] Speaker B: Yeah, that sounds right. [00:07:37] Speaker A: Yeah. 
Her friend Minnie hadn't heard from her for several days, which drew the police to her apartment. And upon entering. Here's the scene. [00:07:52] Speaker B: Oh, no. [00:07:54] Speaker A: Lily naked, lying face down on the bed. All right. [00:08:02] Speaker B: Mm. [00:08:03] Speaker A: Her clothes folded neatly. [00:08:08] Speaker B: Okay. Right, right. [00:08:13] Speaker A: Her head badly stoved in. [00:08:16] Speaker B: Stoved? [00:08:17] Speaker A: Yeah. As in fucking impacted. Just made concave. Yeah, yeah. [00:08:23] Speaker B: Okay. Yep, yep. [00:08:25] Speaker A: Evidence of sexual activity as indicated by a used prophylactic. [00:08:32] Speaker B: Okay. [00:08:34] Speaker A: Sticking out of her anus. Oh, yeah. It's 1932. I don't think any relatives of Lily are gonna be listening to Jack of all graves this week, so let's fucking not spare the detail. She had a used dauber sticking out of her bum. [00:08:51] Speaker B: Yeah, that's just disrespectful. [00:08:54] Speaker A: Yeah. [00:08:55] Speaker B: You know, at the very least. You murder a lady. [00:08:59] Speaker A: Disrespectful. [00:08:59] Speaker B: And then that's how you leave her to be found. [00:09:02] Speaker A: Come on, how? You don't even tidy up after yourself, right? [00:09:06] Speaker B: Like that's insulting. [00:09:07] Speaker A: You'll fold her clothes before caving her fucking skull in, but you can't even be arsed. [00:09:12] Speaker B: You think that the person folded the clothes? Is this going to come up in other murders? [00:09:16] Speaker A: Well, you tell me, right? You tell me when I reveal a couple of other details here. [00:09:21] Speaker B: Okay, go ahead. [00:09:22] Speaker A: Lily, Corrigan. [00:09:25] Speaker B: Mm. [00:09:27] Speaker A: Exhibited a total absence of blood in her body. [00:09:36] Speaker B: What? [00:09:37] Speaker A: She'd been drained. [00:09:39] Speaker B: What? [00:09:40] Speaker A: She'd been drained of blood. [00:09:43] Speaker B: Was it like on the bed? Like, was the blood there or. It was like it was disappeared, gone. What the fuck? [00:09:54] Speaker A: There was no blood in her body. Absence of blood. And how is this for another piece of evidence: at the scene was a blood stained ladle. [00:10:10] Speaker B: Ladle. [00:10:11] Speaker A: A spoon, a gravy spoon, a long-handled, deep-dished spoon present at the scene. Okay, what does that suggest to you? [00:10:26] Speaker B: Blood soup. [00:10:28] Speaker A: Friends, this was the element of the scene which gave our killer the nickname of the Atlas Vampire. [00:10:39] Speaker B: What the hell? [00:10:40] Speaker A: Yeah, yeah, yeah. [00:10:44] Speaker B: She. No blood and it's nowhere. [00:10:47] Speaker A: Except on this, the only evidence of blood at the scene in all of my research. Don't laugh. Fucking. The suggestion that I might research a case brings chortles from you. [00:11:07] Speaker B: Fucking even you can't help but giggle. [00:11:10] Speaker A: Shut up. Anyway, no blood at the scene, Corri. Just a blood stained spoon. [00:11:18] Speaker B: And where was the spoon? [00:11:22] Speaker A: Not where the condom was. The spoon. Oh no, the spoon was at the scene is all I'm telling you. Right in the room. [00:11:31] Speaker B: Is that all you're telling me? Because you don't know or. [00:11:33] Speaker A: It was a one bedroom apartment. [00:11:36] Speaker B: Okay, I'm just wondering like was it lying on the bed? Was it in the kitchen sink? Like. It won't have been far. [00:11:42] Speaker A: It was, you know, Columbo.
[00:11:46] Speaker B: All right, go on. [00:11:47] Speaker A: You know, Hell, anyway, let's, let's, let's go into this. Let's talk about the, the impact here. Sensational impact, obviously. [00:11:59] Speaker B: Yeah, I can imagine. [00:12:00] Speaker A: Immediately captured public attention. Stockholm Press. It was extensively reported with of course it being when it was the bizarre, the gruesome, the uncanny, the weird elements of the case, the alleged blood draining. It was the press, of course, who first coined that term. The Atlas Vampire. [00:12:23] Speaker B: Of course, of course. [00:12:25] Speaker A: Stoking the fires of fascination and fear. Now, you know, you don't need me to tell you that some of the theories were fucking wildly outlandish. Widespread fucking speculation, Urban legend, occult rituals, psychosexual disorders, you know, alien involvement, UFOs. It was all mentioned. [00:12:52] Speaker B: Right. [00:12:52] Speaker A: Now, obviously, as you can well imagine, the impact on the sex worker community was profound. [00:13:00] Speaker B: Yeah, of course. As I've said, pretty terrifying. [00:13:04] Speaker A: Yeah, yeah, yeah, yeah, yeah. Lily again. Despite her reputation as a discreet and. [00:13:10] Speaker B: Independent woman, she was doing everything right as far as anyone was concerned. [00:13:15] Speaker A: Certainly seems to have been just keeping her fucking head down. You know, she was burning the candle at both ends. She was a waitress as well. Working girl. [00:13:24] Speaker B: Yeah, she was, she was making, she was paying. She was just trying to live her life Right. [00:13:29] Speaker A: That's what she was doing. So let's talk about the investigation. It's the 1930s. Forensic science nascent if not quite there. Yeah, just not there at all. Yeah, as I've said in, in previous Joags when talking about, you know, cadfile and how he would have researched the murder. It was all much the same. Physical evidence, eyewitness accounts, embryonic forms of fingerprinting were used. Very, very, very early fingerprinting, but it was shit. And no useful prints were recovered. It was blood spatter analysis. But here's the thing, right? A lot of Lily's clients and friends were interviewed, but, you know, like I said, she's a working girl. [00:14:22] Speaker B: Right. [00:14:22] Speaker A: You know what I mean? [00:14:24] Speaker B: There's just no way of knowing who would have gone through those doors. [00:14:27] Speaker A: Exactly. So what if I told you, Corrigan, that the case went cold and the shocking identity of the Atlas vampire to this day remains? [00:14:45] Speaker B: Of course. Of course. Because that is the way, do you think. Here's my thing. Do you think that, like, the reports were accurate? Right. Because we talk a lot about situations in which, like, the press takes something and runs with it and stuff like that. And the, you know, getting to, like, the heart of what actually happened in a situation becomes so muddled. [00:15:11] Speaker A: Yes. [00:15:12] Speaker B: By, like, you know, what crime scene said about. [00:15:15] Speaker A: And hearsay. [00:15:16] Speaker B: Right. Like, do you. Do you think that maybe. Maybe it wasn't even like. Like just this idea of her being, like, drained with a ladle now is so outlandish. [00:15:32] Speaker A: Anyone said drained with a ladle? [00:15:34] Speaker B: No, no, no. Drained and then found with a ladle is what I meant. Not drained with a ladle, but like. 
Like last week I was telling Ryan a story of this sort of literary, artistic cult that existed in California. And there was, you know, a woman who killed herself and then several other sort of violent incidents that happened afterwards. But, like, the news reported crazy shit. Like, you know, one newspaper had written that, like, one woman had found the body of this other woman and brought her to bed and gone to sleep with it and things like that. Like a thing that never happened. Right. Like, so is there a chance that also the. The idea that this body was fully drained was a press thing and not. [00:16:24] Speaker A: It's a. [00:16:25] Speaker B: The actual. [00:16:26] Speaker A: Valid question, right? It's a very valid question. All I'll say is every account of this crime is uniform. [00:16:36] Speaker B: Says that's what happened. [00:16:37] Speaker A: Every account says the same thing. And there are plenty of accounts. [00:16:41] Speaker B: Yeah. And I'm sure, like. But that's the thing is it's like, so how, you know, do we have crime scene photos? We do, and things like that. And it shows her completely drained of blood. [00:16:53] Speaker A: No, it shows her bed. It shows her. [00:16:56] Speaker B: So we don't have anything. There's no shows us that. [00:16:59] Speaker A: No forensic photos. No. I have found no autopsy pics or anything like that. [00:17:05] Speaker B: So I would be curious as to whether that is actually what happened or if it became, you know, much like when we discuss, like the War of the Worlds thing, right. Every newspaper said. If you look at newspapers from that time, you are going to see every single one say that there was a panic that occurred and you have to go to different records, like police records and things like that to find out that didn't actually happen. Right. So I wonder, because that at this point in time, having the technical. Know how to drain a body and then something like leaving a ladle behind with blood on it, like that feels such an outlandish detail that is very hard for me to believe. You know, like, were they fucking with people? However, the idea of like someone who, you know, was sodomized and killed while, you know, carrying out her sex work. [00:18:06] Speaker A: Yeah, Fully, perfectly, perfectly believable. Yes. [00:18:10] Speaker B: Yeah. [00:18:10] Speaker A: Yes, yes, yes. [00:18:12] Speaker B: I mean, if someone left behind, you know, the prophylactic or whatever, that doesn't tell me this is the kind of methodical killer who drains someone entirely of their blood. [00:18:21] Speaker A: It doesn't. [00:18:22] Speaker B: And manages to leave without leaving a trace. Like, that is not the picture I'm getting from this. [00:18:29] Speaker A: The. Listen, all valid, valid questions and one. [00:18:33] Speaker B: Does suspect we don't have the answer. [00:18:35] Speaker A: To one absolutely would be quite justified in suspecting that by the time, you know, police had spoken to other police who'd spoken to suspects, who in turn had spoken with other kind of clandestine members of the sex worker community who'd spoken to the press, one does suspect that the tale of the Atlas vampire might have taken on a kind of life of its own. [00:18:58] Speaker B: Right. Yeah. How long is that grapevine. [00:19:01] Speaker A: Yeah. [00:19:02] Speaker B: Is, you know, the question here. [00:19:05] Speaker A: But every again, and there are. It's a well documented case. It's been. It's been written about in. In kind of. Yeah. 
It's been written about in books which. Which feel quite sensationalist by their nature, you know. [00:19:19] Speaker B: Sure, right. [00:19:19] Speaker A: 101 Spooky Murders and so on. [00:19:22] Speaker B: Yeah, they're not necessarily looking for, you know, they're not looking at the police reports or anything like that, which I tried to. [00:19:28] Speaker A: I search. There's nothing. There's nothing. [00:19:31] Speaker B: Which. Yeah, that's the other thing, I think, like, there's so many things that are, like, well documented. Right. And then when you try to find the source, it's like. Well, we're like, we have. We have pictures of Jack the Ripper's victims which happened, you know, decades before this did you know, why would we not have a picture of this corpse? You know, like, why is that not something that, you know has turned up anywhere on the Internet? [00:20:04] Speaker A: Very good point. [00:20:06] Speaker B: So I'm gonna say that this particular cold case is a sad murder, but I don't buy the story that has been told about it. [00:20:19] Speaker A: You don't think it was a vampire? [00:20:21] Speaker B: I don't. Is that what you're not a vampire? That big soup spoon. [00:20:30] Speaker A: I want to spoon your blood. [00:20:33] Speaker B: I feel like that's really the thing that, like, that makes it for me where I'm like, why would there be a ladle? What would you do with the ladle? [00:20:42] Speaker A: Great point. That's an excellent point. [00:20:49] Speaker B: But let us know, friends. What do you think? Am I, am I dismissing, Am I poo pooing it too, too soon? Should I be giving this more, more credence? [00:21:00] Speaker A: I think you should. [00:21:02] Speaker B: Let us know. [00:21:05] Speaker A: Let me quote directly from my notes, if I may. [00:21:07] Speaker B: Yes, please do. [00:21:09] Speaker A: Fucking look at these nerds. Oh, I don't think anyone has ever. [00:21:14] Speaker B: Said measles said in such a horny way before. [00:21:16] Speaker A: The way I whispered the word sex. [00:21:18] Speaker B: Cannibal received worst comes to worst. Mark, I'm willing to guillotine you for science. [00:21:23] Speaker A: Thank you. That's really, really sweet. It's cold outside, but my pancreas is talking to me. I'm. I'm going to let it. [00:21:29] Speaker B: You know how I feel about that, Mark. [00:21:31] Speaker A: I think you feel great about it. All right, all right, all right, all right. We're just going to go straight in here. Right? I vividly remember and come with me on this. I, As a, as a, as a student back in the dim and distant late 1990s. [00:21:54] Speaker B: Okay. [00:21:54] Speaker A: As part of my drama and theatre studies course, I did a film studies module. Right. In my first year. [00:22:01] Speaker B: Sure. [00:22:02] Speaker A: Film studies had to do it. And I was glad to do it. It was great. And I vividly, so vividly remember that me and the rest of my group, my Tudor group, being forced to endure a documentary from 1922. [00:22:25] Speaker B: Was it Nanook of the North? [00:22:27] Speaker A: It was Nanook of the North. Yes. It was Nanook of the North. [00:22:31] Speaker B: You're gonna say a documentary from the 1920s is gonna be Nanook of the North. [00:22:35] Speaker A: And that is ex. What it was. Yes. A documentary about the hardships and the stoicism and the. The humble kind of barren and just tough ass life of, of the Quebec inuits in the 1920s. It was in northern Quebec, I believe. Yes. [00:23:02] Speaker B: Okay. 
[00:23:03] Speaker A: In the Canadian Arctic. [00:23:04] Speaker B: It's been a minute since I was in my original film school times. [00:23:09] Speaker A: Yeah, some. Holy shit. 25, 26 years. Long time. Long time, long time. Now I am reminded of this. [00:23:18] Speaker B: Yeah. I'm interested here because. [00:23:23] Speaker A: I've been saving this fucking story up for you. Right. And I'm trying to work out how to begin. It's been two. I haven't JoAGed in. I missed a week, right? [00:23:32] Speaker B: You did, yes. [00:23:33] Speaker A: This is how long I've been sat on this. A week ago I. I did. Hehe. What I think might be the most embarrassing thing of my fucking life. [00:23:47] Speaker B: Okay, go on. [00:23:49] Speaker A: I made what I'm certain on my deathbed. I will remember and I will cringe before I. My final breath kind of comes out of my wispy kind of lungs. This is the last thing I will remember having done. And it will torture me. I can just. There are years of torture ahead of me because of what I've done this last. And I've been sat on it. [00:24:15] Speaker B: So thought coming in. Okay. [00:24:18] Speaker A: So yeah, I am. I'm involved. I'm doing a course currently. Right. It's a year long course. The topic and the content notwithstanding, it's unrelated to this tale. [00:24:33] Speaker B: Sure. Okay. [00:24:34] Speaker A: But what is related to this tale is that this course involves regular online classes, regular online meetings with a tutor and other students who are on the course. [00:24:49] Speaker B: Right. [00:24:50] Speaker A: Strangers to me. People I've never met. People from different organizations, local authorities, councils, teaching consultancies, training departments. Just people in the same kind of area of work as me. Right. [00:25:04] Speaker B: Okay. [00:25:04] Speaker A: But doing this course, this and. And we, you know, log in and do our course and fuck off. And I'll probably never see any of these people again. Right. [00:25:14] Speaker B: Right. [00:25:14] Speaker A: Now, a thing about me. Right. A thing I like to do. Tell me when you tell me when you guess where this is going and if this sounds. [00:25:28] Speaker B: Well, the Nanook of the North thing has thrown me. So I'm not sure. [00:25:33] Speaker A: Oh yeah. [00:25:34] Speaker B: On that this is going. [00:25:36] Speaker A: I'm minded of Nanook of the North because that is a life I crave. Right. [00:25:45] Speaker B: Okay, gotcha. You have embarrassed yourself to. [00:25:48] Speaker A: I have to go live in the wilderness. So shame upon myself to such a degree that I crave the life of Nanook. [00:25:53] Speaker B: Live in the tundra. Yeah, okay, got it. [00:25:55] Speaker A: Right. I didn't circle back to that one. If I didn't close the loop on that one. [00:26:00] Speaker B: I was expecting this to somehow relate to Nanook of the North. And I was like, I cannot think of this. [00:26:06] Speaker A: I was just thinking where can I go where this will not follow me. [00:26:10] Speaker B: Okay, got it. I'm with you. All right, go on. [00:26:13] Speaker A: So on one, right? If. If what I'm about to say sounds weird, I have been reassured since that plenty of people do this, and it isn't just me. Right. [00:26:21] Speaker B: So judge me is not wearing pants. [00:26:23] Speaker A: Oh, I was. I was fully dressed. I was fully dressed. Okay. So judge me if you will, but know that. Yeah, you know, this behavior is more commonplace.
What I do is if I'm on a online course or call or meeting with people I don't know. [00:26:39] Speaker B: Sure. [00:26:40] Speaker A: I'll do a little research on them. [00:26:42] Speaker B: Okay, sure. [00:26:45] Speaker A: I'll put their names in the old fucking matter engine. In the old. Yeah, the old Facebook. You know the one. I'll search them up on Instagram. I'll take a little look and see if I can see. I'll have a little look at them. Right. Everyone does it. [00:26:57] Speaker B: Okay. [00:26:59] Speaker A: Everyone does it. And I did it. I was doing it during this course. Okay. [00:27:04] Speaker B: Mm. I'm getting so nervous. Okay. [00:27:07] Speaker A: So some time passes. Oh, God. Some time passes, and you'll know. You'll be aware of the functionality. I'm sure Zoom has it where you can. You can. You can put. If you're the. If you're the. You know, the kind of the. If you are the host, you can put people in breakout rooms, can't you? [00:27:31] Speaker B: Yeah, of course. Yep. [00:27:32] Speaker A: You can attend people on call. So we put. I'm gonna write. I'm gonna hit this button. You're gonna go in breakout rooms, talk about the topic, and come back. And that happened. Right. [00:27:41] Speaker B: Okay. [00:27:42] Speaker A: So I found myself in a breakout room with me and three others. One of whom. Right. One of whom was the girl that I'd been researching. Okay, it's not creepy. It's just. I was curious. She had an interesting name. Please don't think it's creepy. Please don't think this comes from a creepy place. It doesn't. [00:28:06] Speaker B: No, no, no. You've already explained your process here. [00:28:09] Speaker A: Thank you. I research the guys, I research the gals. I research everyone. I just. I like to research people. So there's a document that needs sharing. I have the document. I have the PDF. All right. Don't worry. I'll just share my screen. Oh, shit, I forgot I had the fucking tab open. [00:28:29] Speaker B: No, no, no, no, no. [00:28:32] Speaker A: Yeah, yeah, yeah, yeah. Now you see Nanook of the North. No, no, of the fucking North. I shared. [00:28:39] Speaker B: What did you do? Like, what? Ha. [00:28:41] Speaker A: And the first thing the group saw was my fucking browser opened with this girl's fucking Facebook profile on it. [00:28:49] Speaker B: No, no, no. Did you just shut it off and leave. [00:28:56] Speaker A: Nothing was said, right? [00:28:58] Speaker B: No. That's, like, almost worse. [00:29:00] Speaker A: Oh, yeah, yeah. If anything. I think if anything, I just said, just close that. Oh, God, I'm. I'm cringing. I can feel my cast. [00:29:11] Speaker B: Oh, my. [00:29:12] Speaker A: Into my stomach. [00:29:13] Speaker B: You're literally bright red right now. Like, even thinking about this, your face has turned. [00:29:18] Speaker A: Corey and the group didn't say a word. It was never referred to. It was. [00:29:24] Speaker B: Oh. Oh, no. [00:29:26] Speaker A: It was never mentioned. I just put the PDF up rough. Oh, man. [00:29:34] Speaker B: Yeah. I mean, you're right. You have to go live amongst the polar bears. It's truly horrifying. [00:29:43] Speaker A: What do you think of that? What do you think of that? [00:29:46] Speaker B: I'm sure you were just so, like, busy. I'm like, what? Did her face betray anything? Was there awkward silence? [00:29:53] Speaker A: I didn't. The only thing I was looking at. At the. Oh, no. Second, as I believe it's called, was just. Was just the X button on the top left. [00:30:01] Speaker B: Oh, no. 
[00:30:03] Speaker A: Of my browser. And yeah, the. The. The. The more I think back to it, that was. Yep, Just close that. Those were my words. That was how I announced it. Awesome. So good. And listen. [00:30:18] Speaker B: Oh, God, I'm, like, so secondhand. Embarrass. [00:30:23] Speaker A: I can see it in you. The point. Why do I share this on this podcast? Why do I share that? [00:30:30] Speaker B: I am curious. Yeah. Is this catharsis? Like, it is getting it out? [00:30:34] Speaker A: And I've been sat on this for a fortnight. I've been kind of sitting in your shame, I guess. I guess my thinking here is that the horrors are everywhere, aren't they? [00:30:45] Speaker B: It's true, you know, sneaking up on you. [00:30:48] Speaker A: They are both external. They are internal. And some horrors you can't escape from. And I will never be able to escape from this. I will never, ever outrun that moment. [00:31:02] Speaker B: It's. For me, those, like, little breaches are the ones that haunt me the most. [00:31:08] Speaker A: Yeah. Yeah. [00:31:09] Speaker B: You know, like, there's. I was talking to a therapist about this a few years ago that I literally. It's like a tick. If I think of, like, some little social breach like that, that, like, I will physically, like, go like this, like, clench my teeth and, like, you know, ball my fists and things like that. Like, I'm trying to, like, physically what. [00:31:32] Speaker A: I've been doing, what I've been doing, how I've been fighting the memory over the past couple of weeks is whenever I start to think about it, I'll say, I'll imagine a song really loudly in my head and I'll turn It right up to 11. [00:31:43] Speaker B: Yeah. I also. I do, like, verbal tics, too. Like, I'll go like, I love the Red Sox or something like that. And I don't know, like, I don't know how that started or, like, what. But that's like just my body panicking. Like, just, I love the Red Sox. [00:32:05] Speaker A: I'm not thinking of that. [00:32:07] Speaker B: I'm deeply not thinking of that thing. And the therapist was like, okay, well, what if you, like, didn't do that? And instead you kind of, like, leaned into the. The moment and went like, oh, man. Yeah, that was a really. Yeah, yeah, yeah. Like, that was an uncomfortable moment for me that I wish I hadn't done. And, you know, I'm probably the only one who really thinks about it, but it's. I carry it with me and just lean through it and, like, work through the discomfort. And it does work, but, like, 99% of the time, I don't want to do that. Yeah. All I'm saying is I love the Red Sox, and let's move on. [00:32:42] Speaker A: I've got some healing to do from this. I've got some processing to do on this one. [00:32:47] Speaker B: Yeah, that's a middle of the night kind of jolt. Awake. [00:32:51] Speaker A: Death. It's deathbed stuff. It really is. It's deathbed stuff. [00:32:54] Speaker B: Right. Anxiety is the worst there are people. [00:32:59] Speaker A: But anxiety, to me implies that you're worrying about shit that it's. It's not. You can't really impact the stuff that maybe is out of your control, stuff that has happened externally. [00:33:08] Speaker B: That's not what anxiety means. Well, right, but, like, your reaction to it is, like, not proportionate to the wrong that was carried out. You know, things like that. [00:33:21] Speaker A: Like, that brings me company. [00:33:22] Speaker B: But. 
Yeah, like, at most, this is a funny story that she's gonna tell at some point, you know, like, that's, like, the worst case scenario, this whole thing. And most people will probably never think about it again. But, like, the, like, deep, like, I have broken the social contract reaction, like, that's social anxiety. You know, the. I will carry this to my deathbed. She won't. That's just you. [00:33:49] Speaker A: That's a great point. That's a great point. [00:33:53] Speaker B: But it's. It's a great point, and it's true. But it's like, it's not helpful when the anxiety hits. [00:33:58] Speaker A: No. So. So today, friends, I share it with you and welcome. And right now, let's never. [00:34:06] Speaker B: Now we're all working through it together. [00:34:08] Speaker A: It's out there now, and I don't need to talk about it anymore. [00:34:11] Speaker B: You don't need to ball your fists and yell at it. [00:34:14] Speaker A: No, I hope you're all well. I hope everybody's good. I, I. Something I just want to shout out super briefly because I didn't, I didn't mention this last time I JoAGed a fortnight ago. [00:34:23] Speaker B: Okay. [00:34:26] Speaker A: If you're not on our Facebook, and a lot of you aren't, millions of you, in fact, aren't. Particularly the Swedish contingent. [00:34:33] Speaker B: Yeah. Completely absent. [00:34:35] Speaker A: I would like to just say at this point how fucking thrilled and delighted I was a few weeks back to learn that one of our listeners has a few degrees of separation. Right. I was talking about Deranged, the movie that I watched. [00:34:53] Speaker B: Yes. [00:34:54] Speaker A: On the cast a few weeks back. And, you know, the, the, the dramatized life story or, you know, the life and crimes of Ed Gein, one of the, one of the big names for the sickos, you know, inspiration for Norman Bates, for Leatherface, God knows how many others. [00:35:11] Speaker B: Yeah, exactly. Countless others. [00:35:14] Speaker A: Just astounded to learn that one of our listeners has just a few degrees of separation to Ed fucking Gein. [00:35:21] Speaker B: That's right. Yeah. [00:35:22] Speaker A: Demi. Demi's mum knew someone who had a relative who fucking went on a date with Ed Gein, man. Isn't that fucking wild? [00:35:33] Speaker B: Imagine how weird that date must have been. Because, like, Ed Gein isn't one of those ones where it's like, oh, he was such a normal guy. Like, you know, nobody's like, he was just, you know. [00:35:45] Speaker A: Right. [00:35:45] Speaker B: He's super chill. [00:35:46] Speaker A: But what, what, what, what that has done to me, what that knowledge has done for me is just contextualized and put into the real physical world. We shared a lifetime. [00:36:00] Speaker B: Right. [00:36:01] Speaker A: You know, we shared space on the earth with someone who is now. [00:36:04] Speaker B: Yeah. You know, a person who knew. [00:36:06] Speaker A: Infamous in the annals of fucking serial killer culture. That's incredible to me. [00:36:14] Speaker B: It's a really good point. Yeah. It's like, it takes someone out of, like your, your movies and your books and your late night scrolling and whatnot and makes them like, oh, that's like it. That's a person people interacted with. [00:36:26] Speaker A: Yep. And furthermore, normal ways. Furthermore, I learn tonight that another one of our listeners. I won't name them just in case. I won't name them. [00:36:39] Speaker B: Okay.
[00:36:40] Speaker A: But another one of our listeners has a connection to Fred and Rose West. [00:36:48] Speaker B: What? [00:36:49] Speaker A: Yes. [00:36:50] Speaker B: Stop. What? How? What's the connection? [00:36:53] Speaker A: Just again, through family familial knowledge of one of their victims, Fred and Rose. [00:37:02] Speaker B: You know, I'm surprised that that doesn't happen more in the UK. Because it's so small, like, yeah, sure. You know, it feels like everybody should kind of know everyone. [00:37:10] Speaker A: Yep. [00:37:11] Speaker B: And they did have quite a few victims. [00:37:13] Speaker A: Well, yeah, particularly when you think of people like the good Dr. Harold Shipman. [00:37:17] Speaker B: I was gonna say, you get your Yorkshire Ripper, you get your. Your Fred and Rose. Like, you know all these people, like, they killed a grip of folks. How come everybody doesn't know their victims or whatever. [00:37:31] Speaker A: Harold Shipman. [00:37:31] Speaker B: Then again, I think a lot more times, like, I noticed this with the Wests, like, you know, their kids change their names, like, immediately and stuff like that. That, like, I think in Britain it is more often the case that people just, like, disappear off the grid who are connected to these things too, as opposed to here, where that's not really the case. [00:37:53] Speaker A: Harold Shipman's total was around 215. Jesus Christ. [00:37:59] Speaker B: Yeah, it's a lot. [00:38:00] Speaker A: It's fucking bonkers. [00:38:02] Speaker B: It's bonkers. Yeah. I'm sure you know somebody who knows somebody, but it's like it just doesn't come up in conversation probably. [00:38:11] Speaker A: If my. If my nan had been killed by Harold Shipman, I would never fucking. Shut up, everyone. [00:38:17] Speaker B: It would be like your two truths and a lie in your icebreakers. [00:38:23] Speaker A: It would. It would. I would introduce. I would. I would have it on a business card, right? [00:38:29] Speaker B: Nan was killed by ex serial killer. [00:38:33] Speaker A: I'd have, like a badge. Ask me about my nan. [00:38:39] Speaker B: Bait and switch people. Be like, oh, tell me about your nan. [00:38:42] Speaker A: She was killed by Harold Shipman, if you can believe that shit. That's what I'd say. Like I said, lots to talk about. [00:38:52] Speaker B: Lots to talk about this week. And a big thank you to Ryan for filling in last week and letting me tell her a story and for giving us some horror recs for our readers. Just always wonderful to have Ryan around and to come in clutch like that, you know, I texted her and was like, hey, do you have time tomorrow? And she was like, yep. In. So, so good. Thank you so much for filling in last week as Mark was dying and also a shout out to the book club crew. So Ryan and I both had the wrong date for book club and both thought it was yesterday. Yesterday was the fourth Saturday and not the third Saturday. [00:39:33] Speaker A: That isn't like that. Isn't that? [00:39:36] Speaker B: No, it is with book club. [00:39:38] Speaker A: Oh, it is. Okay. [00:39:39] Speaker B: Here's the problem with, with the book club. It's the third Saturday thing. It's not on a specific date and as such, depending on, like, where in the month the Saturday falls, like, I just end up losing track. Is it an early Saturday in the month? Is it a late Saturday in the month? I don't know. So totally missed the third one. And I was like. Felt so on top of it. I was like.
I even started reading the book, like, four days ahead of time instead of the day before, which I usually do. Felt like I was doing really well. So we go, and it's just me and Ryan and Steven root in the chat. And I'm like, oh, you know. And then I was like, so this is the wrong week. We were like, how come nobody said anything? Like, did just. Nobody show up last week or whatever, but today Laura texts me, and she's like, oh, by the way, we had book club without you last week. [00:40:29] Speaker A: Nice. See, that speaks to the dedication, the diligence, the passion of our listeners. [00:40:37] Speaker B: Yeah, they showed up and they did the thing. They were like, you know what? We're here. [00:40:41] Speaker A: Nice. [00:40:41] Speaker B: Let's do book club. So, hey, good job holding it down yourself, guys. This time, Ryan and I both made sure to write down that March 21st is book club this coming month, and we will actively be there on time. So we look forward to talking to you again. But thanks to both weeks of book clubbers that showed up this time around. And on our KO Fi, lots of new stuff for you this month. We're on top of things. Got fan cave from the beginning of the month discussing. Ready or not. We got a JOAG radio in there for you, and we got a new let's play that we just did yesterday. [00:41:21] Speaker A: Yeah, last night. [00:41:23] Speaker B: Last night we did a let's play, so you can join us in playing Assassin's Creed. [00:41:30] Speaker A: Go on, go on, go on. [00:41:35] Speaker B: Cad file. [00:41:39] Speaker A: Shadows. [00:41:41] Speaker B: Shadows. I don't. I don't know, man. It's like, as soon as I found out there were, like, so many, my brain just went. [00:41:51] Speaker A: There are a lot, and they're all very, very similar. So I get how. [00:41:54] Speaker B: Yeah, it was like you told me, and then I went to. You were like, hey, can you Google, like, you know, Assassin's Creed Shadows, how to do this? And I immediately pulled up my phone, was like, how to. Blah, blah, blah. Assassin's Creed origins. [00:42:11] Speaker A: Wrong. [00:42:12] Speaker B: Nope. Incorrect. [00:42:13] Speaker A: Wrong game. [00:42:16] Speaker B: So, Mark, we've got a few things to discuss this week, but I think they're coming. They're coming in largely out of our. What we watched. [00:42:25] Speaker A: Yeah, they are. They are. Let me. First, I want to draw your attention because this was news to you. I'm gonna say a name, and I want Corrie. React right. Here we go. Corrie. Eli. Roth, talk to me. [00:42:46] Speaker B: Fuck that guy. [00:42:46] Speaker A: Fuck that guy. [00:42:48] Speaker B: Eli Roth, he is the worst. Just racist, misogynistic, edgelord movies for doofuses. And I hate everything about him. [00:42:56] Speaker A: Oh, shit. [00:42:57] Speaker B: Fucking hate him. And he thinks he's horror royalty as a result. He's like, I'm the guy who knows horror. I'm gonna be the talking head in every horror documentary because I am horror. Fucking hate him. [00:43:08] Speaker A: Ah, okay. All right, then. Well, look good, but why? [00:43:12] Speaker B: What's he doing? What's going on with Eli? [00:43:14] Speaker A: Rob, I'm you. Genuinely. This hasn't come your way. You generally don't know about this? [00:43:18] Speaker B: No, I have no idea what you're gonna tell me. Okay, Now, I might even have Eli Roth muted, like, as a term on blue sky. [00:43:25] Speaker A: Eli Roth is currently up to something which I find quite interesting. Right. Thoughts on him notwithstanding. 
And thoughts on his movies notwithstanding. What. What one cannot argue with is he is indisputably a creator of bankable movies. [00:43:49] Speaker B: Yes, that's true. Yes. [00:43:50] Speaker A: I got some stats here to back this the fuck up. Cabin Fever, right? 1.5 million budget, took 30 mil at the box office. Hostel, 4.8 million budget, took 82 million. Right. Hostel Part 2, similar story, 10 million budget, 36 million box office. Eli Roth to this day claims that Hostel Part 2 was the most widely pirated movie in existence. I. I certainly remember getting it from wherever you got movies at that time ahead of cinema release. Even that. The House with a Clock in Its Walls: 42 million budget, 131 million box office. [00:44:32] Speaker B: He made that? [00:44:33] Speaker A: He did indeed. [00:44:35] Speaker B: Jack Black? I didn't even realize. [00:44:36] Speaker A: Yep. [00:44:37] Speaker B: Yeah. [00:44:37] Speaker A: Thanksgiving and The Last Exorcism are exactly the same. Thanksgiving cost 15 million, made 46. The guy is. The guy makes films that make fucking huge, huge returns on investment, right? [00:44:50] Speaker B: Yes. I mean, he has cornered the angry white male market. [00:44:56] Speaker A: You can't argue with that. [00:44:57] Speaker B: Indisputably. [00:44:58] Speaker A: Yeah, I will not argue with that. Now, from almost with that persona in mind, right. Eli Roth is crowdsourcing funding. Not for a movie. Don't roll those eyes just yet. Not just yet. Not for a movie, but for a studio. Right? [00:45:19] Speaker B: Okay. [00:45:20] Speaker A: He is piloting almost a model of ownership of a studio with the view of, and I'm quoting from him, "hardcore horror films without studio interference." He wants to make horror films purposefully that go after an unrated rating beyond. Right. [00:45:43] Speaker B: Fuck that guy. I hate him so much. Don't want anything to do with this. No, thank you. Don't want more edgelord bullshit. Without a studio even to rein in his nonsense. Oh, we're gonna get, like, more naked brown people eating people. Like, what? Cool. Great. Sounds like fun. Fuck off. Eli Roth. I hate him so much. I could not hate this idea more. [00:46:08] Speaker A: Is this not interesting, though? [00:46:10] Speaker B: No, absolutely not. If this were in the hands. If this were made by, like, let's say, like, a radical. If he were crowdsourcing for, like, radical trans women of color to make the films that they wanted to make or something like that. But what he's crowdsourcing for is Big Hostel. Like. [00:46:34] Speaker A: Yeah, exactly. Exactly this. [00:46:37] Speaker B: What if you could watch these movies, but, like, it was even more. [00:46:43] Speaker A: No, that's exactly what he said. [00:46:44] Speaker B: Fuck cares. Big Hostel. It's not 2001 anymore. You know, I don't want those movies. [00:46:54] Speaker A: The studio is going to be called the Horror Section, and he's got a crowdfunding page, which is. Which is basically laid out like a business plan. Right. If you rolled your eyes before. Wait till you hear this quote. For the record. Right? I'm. I'm broadly into this. [00:47:09] Speaker B: Well, clearly. Right. [00:47:13] Speaker A: If only for the fact that you. Like I've said time and time again, I want to see what is possible when the brakes are off. I want to see what is possible when. [00:47:22] Speaker B: Yeah, but from Eli Roth, like, do you. Is that what you're looking for?
What if someone who is the richest, most famous horror director with all the money in the world and all of his whiteness and all this took the reins off, like, literally. That's not a risk. Who cares? That's not a swing. [00:47:42] Speaker A: Whoa, whoa, whoa, whoa. I am under no kind of misapprehension here that the movies will be any good. Right, Right. [00:47:51] Speaker B: But even just, like, I think it goes. Yeah. And I think it goes against what you like about the big swings. Like, I think you're rooting for people to, like, push. Push the boundaries from, like, an outsider perspective. Not. What if an insider was allowed to be more racist and more violent, more misogynistic. What. [00:48:17] Speaker A: I'm. From a distribution point of view. Right. He's in bed with the. The. The kind of the. The network company that released the Terrifier movies, or Terrifier 3 in particular. Right, sure. And the. What. What keeps me and God knows how many others glued to Terrifier is the mad science of the kills. What the fuck is there left to do? Okay. [00:48:46] Speaker B: Mm. [00:48:47] Speaker A: And I. We aren't gonna get any great art out of this venture from Eli Roth. [00:48:51] Speaker B: No. [00:48:52] Speaker A: But we'll certainly, I think, get an answer to. What is there left to do in A kill. [00:48:57] Speaker B: But I don't think that's an interesting question anymore. [00:49:00] Speaker A: Fine. [00:49:00] Speaker B: I deeply don't think that's an interesting question. I mean, I like a good kill. [00:49:04] Speaker A: Yep. [00:49:06] Speaker B: But I think we're past that as a thing, you know, I think we know you can do as a society, but I think. I just think that it's like there has to be so much more than that, you know? And this is one of the reasons that we differ so much on the Walking through the woods movie or whatever. Is that. That swing to you? [00:49:31] Speaker A: You know what it's called, Right. And not. [00:49:33] Speaker B: I genuinely can't think of what it's called. Is it called a walk in the woods? [00:49:36] Speaker A: No, it's called in a violent nature, as you well know. [00:49:38] Speaker B: In a violent nature. I know. I did not know. I could not think of what it was called. I genuinely was thinking it's called A Walk in the Woods. But that's a Bill Bryson book. Confusing things. But, like, I think that's one of the reasons we differ so much on that, is that I'm not necessarily interested in, like, ooh, what can we do, white man? Like, that does not interest me at all. Like, if you were to give the. So, like, think about books, right? Like, horror books. Some of the. Like, really pushing what you can do with gore and stuff like that. And body horror is trans women. [00:50:17] Speaker A: Yes. [00:50:17] Speaker B: Trans women are writing the grossest books you have ever read in your life. If you told me he was crowdsourcing for trans women to make transgressive move movies in body horror, I still wouldn't want to see them, but I would be like, you know what? I'm on board. Yes. Let's see what they can do with it. Eli Roth has nothing to say to me. He has nothing to tell me. I have seen 25 years of his fucking movies, and all they say to me is, I would like for no one to be able to tell me that. Like, maybe that's not actually good. Maybe you should treat people of other cultures like humans. Maybe you should treat women like humans. Like, he doesn't want to hear that. Just going to make a studio where he doesn't have to hear that anymore. 
[00:51:09] Speaker A: Okay. [00:51:09] Speaker B: This is just an. An element of, like, cancel culture to me, you know? Like, he's trying to go outside of people telling him you shouldn't. [00:51:19] Speaker A: Yes. [00:51:20] Speaker B: And he's not. That's not to, like, lift up the voices of people who society is telling they shouldn't. He's allowed to do whatever the fuck he wants. [00:51:30] Speaker A: Listen, I. I hear the same things as you, loud and clear. It's. It's the only voice that's being amplified here is his. [00:51:38] Speaker B: Right? I don't want to hear his voice. [00:51:41] Speaker A: Okay, fine, listen, we'll move on. Would it change your mind? Maybe this quote will change your mind. I call this a 360 degree horror studio. Maybe that'll do it for you. The f. Yeah. [00:52:00] Speaker B: God, I hate him so much. [00:52:07] Speaker A: Not quite so true. [00:52:08] Speaker B: Now honestly, like I just yearn for the day when we are freed from Eli Roth. We don't have to hear his perspectives on horror anymore. And I don't have to see his name on anything anymore. And we just. We just have grown as a society to where we acknowledge that this was the weakest point in horror's history and we move on. [00:52:33] Speaker A: Very clearly made point of view there you have. I am in no doubt. So obviously you'll be. [00:52:42] Speaker B: Do you need me to explain how I feel more? [00:52:45] Speaker A: What if I talk you through the investor perks? [00:52:49] Speaker B: Please tell me what can I get for investing in Eli Roth? [00:52:53] Speaker A: $100: Undersung Hero Town Hall Zoom with Eli Roth. [00:52:58] Speaker B: Okay. [00:52:58] Speaker A: $666 and above: signed stock certificates. You're a shareholder in this film? Yeah. In this studio. Sorry, in this venture. $1,500: merch, signed posters. 10 grand: exclusive screenings. Yeah, yeah, yeah. Now here's the interesting one. Half a mil: set visits and you get to be an on-screen extra. Throw him a millie. Throw him a million. You or a loved one will be killed in a future movie. [00:53:30] Speaker B: Good lord. So that's like just not my. This is not an Eli Roth complaint. That's not like my favorite perk in things generally because I feel like it weakens a film to put like just someone who paid enough money to be in it. You know, like if they're just in the background somewhere, fine. But like when you're like up in it, it's like you gotta now you're sacrificing some of the movie. Well, look to an investor. [00:53:58] Speaker A: But put it like this. I mean $100 gets you an entry into a raffle for your name in the credits, right? Hundred dollars gets you a raffle to. [00:54:08] Speaker B: Be in the credits. [00:54:09] Speaker A: I gave Terrifier 2 like 20 quid and I was in the credits. [00:54:12] Speaker B: You're on that list of 8,000 people. [00:54:17] Speaker A: It was such a fucking amazing. [00:54:19] Speaker B: Well, yeah, it's fun finding yourself in that list. [00:54:21] Speaker A: I love that. [00:54:22] Speaker B: Right? [00:54:22] Speaker A: I love that. I was there right next to my brother. We were there. And in the fucking credits of Terrifier 2. It was the fucking best. [00:54:29] Speaker B: Right? [00:54:30] Speaker A: Yeah. [00:54:31] Speaker B: And that's the thing is like when you're. For all my issues with Terrifier and whatnot, like, and now especially with the creator who's just a fucking idiot. But. [00:54:45] Speaker A: Fuck is wrong with you today? [00:54:47] Speaker B: Well, I mean, he's terrible.
The way he like threw his lead under the bus to like appease right wingers. Like, fuck him. [00:54:55] Speaker A: Oh, I think I've missed that. [00:54:58] Speaker B: Oh, he. His. You know, the guy who plays Art the Clown is like an outspoken leftist. [00:55:05] Speaker A: Yes. [00:55:06] Speaker B: And people were complaining about it and like straight on his Facebook and stuff like just saying like terrible to him all the time because of his progressive views. And the director came out and just was like distanced himself from it. Yeah, yeah, Thornton. But he was basically like, hey, listen, this, these movies are for people of all politics. And I don't necessarily agree with my guess. It's like he's being harassed. You stick, like, tell them to back off. Don't say, you know, hey, this is for you too. Conservatives. I don't care if you're a Nazi, as long as you like Art the Clown. I hate these guys. I hate them. But you know, for my issues with that, like, you know, it. It was like a grassroots thing from a guy who did. He couldn't have made it himself, you know, and no distributor was going to put that stuff out. And people being a part of it, you know, was this collaborative effort to make something that people wanted to get seen as opposed to. Like, I just, I. I don't think rich famous people have any place crowdsourcing. Listen, he could go to investors. [00:56:20] Speaker A: Yes. [00:56:21] Speaker B: And talk them into this. [00:56:22] Speaker A: Well, yeah. With that kind of, that kind of spreadsheet, he's not going to want for funding, you know. [00:56:28] Speaker B: Yeah, exactly. Like he makes tons of money off of whatever he makes. If he wants to start a studio, then start a studio in earnest. But, you know, I don't think that, that this is how art thrives by having the most powerful people use the resources that the people at the bottom have been using to. To move ahead. You know, it's like all the celebrities who for a minute were making only fans pages. Right. Like, you don't need an Only fans. Leave that for the people who like, need sex work. Right. Like, it's not a place for you to do this. This is for folks who can't afford it. And so I don't. I'm not behind any rich famous person doing crowdsourcing like that. I think it's, you know, a distortion of what it's there for. [00:57:16] Speaker A: Well, again, I am in no doubt as to your feelings. It's, you know, fucking hell with Horror. Being as financially healthy as it is right now and enjoying the profile that it is and the distribution that it gets. Yeah, it's, it's. And with Terrifier being as successful as it was, the more I think about it, the less kind of it makes sense. But, but, but I bring it to your attention in the interest of science. We're a fucking. [00:57:44] Speaker B: Listen. [00:57:44] Speaker A: Hey, we're team Horror. And this is a development. Then it's an innovation. Then in the fucking. [00:57:52] Speaker B: See where it goes. [00:57:52] Speaker A: Exactly this. Exactly this. [00:57:54] Speaker B: So maybe he. Maybe he's gonna announce that actually he's going to use this to uplift the voices of the people that he has previously marginalized in his point. [00:58:03] Speaker A: I think we're gonna. I think Eli Roth might surprise you, Corey. [00:58:06] Speaker B: He's gonna. Yeah, right. He's gonna turn things around for me here. He's gonna pull a Beastie Boys, you know. [00:58:14] Speaker A: Who knows? Hope springs eternal. [00:58:16] Speaker B: Who knows? What have you watched this week? Mark Lewis. 
[00:58:20] Speaker A: Right? A bunch. Oh my God, I'm. Everything I say tonight is gonna piss you off. Oh, no, because it's been a couple of weeks, right, so I've got a few lined up. [00:58:31] Speaker B: Yeah. [00:58:34] Speaker A: Let's talk about The Ugly Stepsister. Let's just talk about The Ugly Stepsister. [00:58:38] Speaker B: Oh, yeah. Which Ryan and I talked about a little bit last week, but. [00:58:41] Speaker A: Okay. [00:58:42] Speaker B: Yeah, go on. [00:58:43] Speaker A: Look, I. All. All I. All I saw there was a movie which has been oversold. Yes, this film has been oversold. It just. Was it Letterboxd? Where was that review you said, that it heats up The Substance's nachos? [00:59:02] Speaker B: Yeah, that was from Letterboxd. [00:59:03] Speaker A: And I can't put it any better than that. And I, you know, what with development timelines being the way they are, I'm sure it was kind of created independently, outside of The Substance. I don't think the two. I don't think it saw The Substance and went, we can do that. [00:59:19] Speaker B: Right? Yeah, it's probably just sort of. It's the zeitgeist right now. [00:59:22] Speaker A: Yeah, exactly this. But you know, very similar themes of the female gaze and how it forces contortions and acts of self harm in order to conform. But you can't. Oh man, you can't even mention it in the same breath as The Substance. It is an altogether weaker affair. [00:59:47] Speaker B: Yeah, very much agreed. I think, you know, I was saying to Ryan that like it feels, you know, as much as people said that like The Substance is too obvious, maybe hits you over the head with its themes or whatever. Like this does that cranked to 11. Yeah. That it's. There's no subtlety to this and thus it kind of runs out of steam really quickly. [01:00:06] Speaker A: Yeah. Yeah. [01:00:07] Speaker B: Because the entire middle of the movie. Yeah. It's not really. It's. It stopped saying anything new and is simply restating its thesis. [01:00:18] Speaker A: You get it? It says what it has to say quite early on and then doesn't really say anything else. [01:00:23] Speaker B: Right. So you know, it was, I think your expectations going into like. I mean, I said this to Ryan too, that obviously people on Letterboxd seem to love it. So I don't know. It's hitting something. [01:00:36] Speaker A: Yeah. [01:00:36] Speaker B: It's. Maybe it's some. [01:00:37] Speaker A: It. [01:00:38] Speaker B: Maybe it's us. For people. [01:00:39] Speaker A: Who knows? [01:00:40] Speaker B: But it just. Yeah. It didn't feel. Didn't feel like it did anything that other things weren't doing already. [01:00:46] Speaker A: Whilst ill, I took the opportunity, just because my eye happened upon it on my shelf, to rewatch Evil Dead Rise. [01:00:55] Speaker B: Nice. [01:00:56] Speaker A: Very nice. In fact, casting has been announced for the new one, Evil Dead Burn. [01:01:00] Speaker B: Yeah. [01:01:01] Speaker A: Cinema distribution's been locked in. A release date has been locked in. So this time next year, you know, we'll be able to buy our tickets for a fucking. Another Evil Dead movie. Get the. [01:01:11] Speaker B: Maybe we'll get to go see this one together too. [01:01:13] Speaker A: Wouldn't that be fun? [01:01:15] Speaker B: Wouldn't that be fun? It'd be fun. We'll have to work on that. [01:01:17] Speaker A: Yeah. But Rise holds up. It's great. It's mean spirited, it's fast. The performances are just way better than, you know, than. Oh God, I'm not gonna say elevated, am I? I am. Commitment.
Commitment in the performances. Much like Evil Dead 2013. Much like Evil Dead II, performances that elevate the source material. [01:01:41] Speaker B: Yes. [01:01:41] Speaker A: This is a film that absolutely believes in itself, takes itself seriously. There is. You know, unlike a movie that we'll talk about later, it's. It is. It isn't self-regarding at all. Evil Dead Rise is all about the fucking making. Making as solid an entry into the Evil Dead canon as possible. It's great. It isn't. It doesn't reach the dizzying, holy-shit, grip your head and laugh at the absurdity heights of Evil Dead 2013. But it's. It's solid as it is. Mahogany. This film, it. It expands the lore a little bit. If that's the kind of thing you like. [01:02:21] Speaker B: Yeah. [01:02:21] Speaker A: Very pleasing. Oh. Kind of take-you-aback moments of gore. That, that, that incredible opening title which, as soon as I saw it sat next to you and Dr. Dean in the cinema, I knew was one for the ages. I love an opening title card, and Evil Dead Rise's is deservedly lauded as brilliant. It is, right? What did I notice this time that I maybe hadn't noticed so much before? You. You know what I'm talking about. When the title rises up beneath, rises up behind the mountains and yeah, you can see it reflecting in the water of the lake in front of it. That's fucking sexy. That is sexy as fuck. It is erotic almost in how good that is. Yeah, it rises up into the fucking. The kind of fading sunlight and you see the title reflected in the lake underneath it. God, I fucking love that so hard. Great movie. Love it, love it, love it, love it, love it. Can't wait till the next one. [01:03:22] Speaker B: Phenomenal. [01:03:24] Speaker A: Now I'm gonna say a name, right? And I'd like you to react. It's a filmmaker. [01:03:30] Speaker B: I looked at your Letterboxd, so I already know where this one's going. [01:03:34] Speaker A: Alex Garland, you like him? You like his work? [01:03:38] Speaker B: Alex Garland, yeah. Here's the thing. My loathing of Alex Garland, like, comes from a place that. Unlike Eli Roth, who I have hated since I was a teenager, Alex Garland I liked. And then I have slowly come to realize that Alex Garland is an idiot and that, you know, he listens. Really? Yeah. Avid JoAG fan. And it kind of. It started with seeing Civil War, and that is just the stupidest, most perspective-less movie about war I've ever seen. And he seems to think it's incredibly deep to be giving that, like, oh, I mean, all the sides, who can know? Like, there are very distinct reasons why America is like this. It's not a mystery, bud. [01:04:44] Speaker A: Can I. For the record, I really enjoyed your impression of what Alex Garland might have gone through. [01:04:49] Speaker B: I guess he needed an accent. But you know, and then as you know, I rewatched 28 Days Later. [01:05:02] Speaker A: And. [01:05:03] Speaker B: Fucking dumb movie that, you know, uses stupid rape tropes and shit like that and is just, yeah, it's a terrible film that also looks like shit. But like on top of it is just. It's awful. And it plays off of the same kinds of like survival nonsense that, you know, white dudes are into and other things. And I was like, this guy just. He doesn't have anything to say. He has no perspective on things. [01:05:32] Speaker A: Let's just skip through this then. I quite enjoyed Warfare.
[01:05:36] Speaker B: Well, and warfare is, like, the worst example of this out of all of them, I think, because, like, and I've heard, like, on its own merits, it's really good. But you can't take the ideology out of the fucking Iraq War, especially because. Yeah. And from what I understand, it also, like, kind of celebrates the real guys at the end of it too. [01:05:59] Speaker A: Openly. [01:06:00] Speaker B: Yes, yes, yes. Yeah, you can't. You can't fucking do that. These are. We're the villains in this story. You can't do that. That's insane to me. To make a pro, you know, America, Iraq war movie. Like, what the fuck are you doing, bud? What is wrong with you? Now? It's not just like, ideology free and like, he has bad ideas and doesn't understand politics now. It's like you' actively a right winger is what you are at this point. Like, pass hard pass on Alex Garland. But I have heard from a filmmaking perspective, it is a good ride. [01:06:39] Speaker A: Yes. If. If you can bring yourself to view in isolation. Warfare is actually tight as fuck. It is. [01:06:46] Speaker B: Right. [01:06:47] Speaker A: Just zoomed right in. There's on. On one viewing, I can't recall any. Anything outside of the immediacy of the situation that. That that troop finds itself in. Pinned down in a kind of urban kind of setting, dug into a flat in the middle of a very, very tense conflict zone, a hot zone, as you might say. Extractions fail, close quarters, gunfire, fucking, you know, you can feel shrapnel and stink of gun smoke and piss. You know what I mean? They're pissing in bottles and, you know, swapping grenade exchanges with people 10ft away from them. Some great people in it. Cast is great. [01:07:39] Speaker B: Yep, I've seen the cast, and I. [01:07:43] Speaker A: Like a war film. What can I tell you? I enjoy shooting guns movies. [01:07:47] Speaker B: I mean, again, this is what he has in common with Eli Roth, and I don't mean this in a. To insult your taste specifically sort of way, but he has cornered the market on white men. You know, like, he makes movies for white men, especially ones that don't really want to think through the implications of the content of these particular films, you know, so if you just watch 28 days later as an exciting zombie film or things like that, you know, then you don't come at it from the view of, like, a woman who's like, why is there suddenly like a, like, pedophile rape ring in this? Like, what. Why is that necessary to this movie? They. The world, like, just ended. Why is the first thing they did set up a ring to rape girls like that Doesn't. Why is that a part of this? Or like, with Civil War, like, if you don't. If you, if you want to think about, like, you know, like, just war, who can understand it? You know, what is it good for? Absolutely nothing. Am I right? Like, great. Civil War is for you. But if, like, you think actually things lead to war, it has nothing to say to you. In warfare, it's like, okay, if we take out like, all of the ideology around the Iraq war, fine, you can make a movie about that that doesn't make you think through how horrific it is that we were there in the first place, great. But like, if you are a person who comes from any sort of background that's impacted by that, the fuck are you doing, bud? [01:09:22] Speaker A: Yeah. Impossible to refute, impossible to rebut. And. [01:09:28] Speaker B: Yeah, but he's good at it, I will give you that. I mean, I think Civil War is just bad on its face. Like, that's just not a good movie. 
But, like, he is good at making these movies. It's just, if you think. [01:09:39] Speaker A: Think about them, that, that, that credits montage that you refer to, it plays it. It's presented, you know, like, take. Think of a movie like Cannonball Run, where. Or like any Jackie Chan film where it ends with the. You know. [01:09:56] Speaker B: Right. [01:09:57] Speaker A: Yeah, it plays like that. You. You've. You've got this montage of the. The. The performers in. In costume with the veterans. They on set coaching them on what. What happened, how it went down, how it. How it felt and looked at the time. So, yeah, the. The support of. Of those people and what they went through and the cause that they were fighting for is overt. It is. It is. [01:10:25] Speaker B: Right? [01:10:25] Speaker A: Yeah. Very difficult to hide from. [01:10:28] Speaker B: Yeah, that's just. Yeah. I just think at this point, you know, I still love Ex Machina, you know. [01:10:35] Speaker A: Yeah. [01:10:36] Speaker B: I really enjoyed Men, but realizing he is kind of a dumbass does tarnish it a little bit for me. [01:10:43] Speaker A: Well, Men, I can't remember what we were talking about, but it. It. I think it was the book club this month. Men is. Is an object lesson in one of those movies that. All it has is the. All it has is the very broad theme. Mana kind of gets all men. Yeah. Yeah, well, we know. So now what worry. What are you going to say about that? [01:11:03] Speaker B: Right, Exactly. And you know, at the time, it kind of felt like, hey, listen, there's a lot of guys who need to hear this, but I don't think that that has been what has happened since then. And also it does feel like maybe he is. You know, he just doesn't have anything deeper to say about this stuff. He just isn't that deep a guy. [01:11:23] Speaker A: Yeah. And it's not early days in his career either, so. [01:11:27] Speaker B: Right. Maybe he's had time to think. Yeah. Right. But, yeah, like I said, I've heard good things about warfare. It's just. It's not for me. [01:11:37] Speaker A: Fine. [01:11:37] Speaker B: I will be. I will be frustrated. [01:11:39] Speaker A: Fine. Fine. Fine. Fine, Fine. Fine. So let's move along then, if we should. All right, so let's talk about it. The. The last three Fear street movies I think we saw during lockdown, did we not? [01:11:55] Speaker B: Yeah. [01:11:56] Speaker A: A lot of fun. [01:11:57] Speaker B: It's true. Yeah. I mean, they, like. I think we were both a little disappointed by the last one, from what I remember. [01:12:03] Speaker A: Yeah. [01:12:04] Speaker B: They were fun flicks. Yeah. [01:12:05] Speaker A: Fun flicks. And I remember being really excited by the concept. What a lovely thing to get a trilogy dropped at the same time that connects beautifully, that tells the story in different time periods, that seems thought out, that has, you know, the cachet that R.L. stein brings. [01:12:24] Speaker B: Yes. [01:12:25] Speaker A: To a piece. [01:12:26] Speaker B: As a lifelong R.L. stine fan, just. Yeah. Brought me great joy. [01:12:31] Speaker A: The Fear Street Project, those three films felt like a really nice through line. It felt like something new. It felt like, you know, I can't think of another time when our trilogy of movies has dropped together. That's fucking great. [01:12:42] Speaker B: Right? [01:12:45] Speaker A: So, hey, what fantastic news that we've got another one now. [01:12:50] Speaker B: Yeah. [01:12:51] Speaker A: Eh. [01:12:52] Speaker B: Yeah. 
Four years later, finally a sequel. [01:12:57] Speaker A: I have. I don't really have any. I don't have any hate left. [01:13:03] Speaker B: So this isn't gonna be like, literally, you were like, yesterday I mentioned it, and you're like, I don't want to talk about it. [01:13:08] Speaker A: No. I simply just don't want to talk about it. I'll try. I'll try to talk about it because I need to get some points across history. And unfortunately, this is going to lead me to a particular conversational avenue. [01:13:24] Speaker B: Yes. [01:13:25] Speaker A: That listeners of the show will have. [01:13:27] Speaker B: Heard before might be familiar with. [01:13:28] Speaker A: Maybe time stamp this so, you know. [01:13:32] Speaker B: Put a timestamp for the incoming rant you're all expecting. [01:13:37] Speaker A: So, firstly, the first thing I would say is that Fear Street Prom Queen is a movie for no one. Right. This is a movie designed for. With no one in mind. It is content in every sense. It is there. [01:13:54] Speaker B: Right. With everyone in mind. And therefore. No. [01:13:57] Speaker A: Exactly. This. It is there to watch in a distracted manner over maybe a second or a third or a fourth screen. [01:14:06] Speaker B: Right. [01:14:07] Speaker A: Right. [01:14:07] Speaker B: It knows that you will have your phone in Your face while you're watching. [01:14:14] Speaker A: Just on a. On a. On a. On a filmic level, it commits the worst crimes of all nostalgia pieces in that it is just a list of. Remember this song? Do you remember this kind of decor? Yeah. Okay. Do you remember these cars? All right. Do you remember this? It's. It says nothing. It just ticks off, ticks off, ticks off. Remember this? [01:14:35] Speaker B: Remember without actually doing anything with, like, the character work to make you feel like it is in the 80s at all. [01:14:42] Speaker A: Nothing, nothing, nothing. [01:14:43] Speaker B: These are definitely girls who have Instagram. [01:14:45] Speaker A: Yeah, completely. Kids who speak like they're in 2025, even though they're supposed to be in 1989 or whatever. Let me see it at a point. Then it's gone. I don't even care. But what really. [01:15:02] Speaker B: Go ahead. [01:15:02] Speaker A: But what. What really hurts me, or I'm not letting it hurt me anymore, but it hurt me at the time is that in a similar way to Stranger Things did a few years back, this is a film which draws very heavily on visual and auditory cues from Elm Street. Right? [01:15:25] Speaker B: Right. [01:15:27] Speaker A: Our killer wears a mask with a kind of a. With burn patterns on it. Some of the sound choices that this film makes, specific synth tones are lifted right out of the Elm street soundtrack. Some of the visuals, you know, you've got. Oh, a boiler with a fire in. Draws directly from Elm Street. Right. Unashamedly. So the poster is a riff on the fantastic if Nancy doesn't wake up screaming poster from Elm Street. So iconic. So fucking beautiful. But here's why this just gets to me, right? So Elm street was a real horror movie. Is a real horror movie for adults. Okay. [01:16:26] Speaker B: Right. [01:16:26] Speaker A: It wasn't made for kids who were scrolling. It wasn't. Elm street is an. Is an actual fucking movie by an actual fucking filmmaker that tried new stuff and did new stuff and was way, way, way head and shoulders above its contemporaries. That's a matter of opinion. I know. But it's more creative than any of the slashers around that time when it came. 
Since I liken Fear Street Prom Queen, I don't know if you've got the same thing going on in the States, but right now, the kind of fast fashion item du jour is the Nirvana logo on T shirts. [01:17:11] Speaker B: Yeah, sure. Yep. [01:17:12] Speaker A: Yeah, you've got the Nirvana logo on, like, baby T shirts, baby items, adult. Everyone's wearing a fucking Nirvana logo on their shirts right now. [01:17:23] Speaker B: Right. [01:17:26] Speaker A: That's what this film is. It doesn't earn any of the stuff that it is nodding at for a start. I. I Hate to sound dismissive, right, But I don't know how many people who have. Who happen upon Fear Street Prom Queen will be intimately acquainted with Elm Street's choices, its design choices, visual language. [01:17:53] Speaker B: So I think that will be lost on a good chunk of the audience. Yeah. [01:17:56] Speaker A: So what. So why then? Why is it doing that? Why are you referencing. Who is it talking to that your audience isn't. Really. Doesn't have any affinity for? [01:18:07] Speaker B: Right. [01:18:10] Speaker A: It's cheapening. A real piece of work that. I think that's at the core of my disgust with Fear Street. Not just that it's bland and that the kills are uninspired and that, like I said, it's just a nostalgia wank. [01:18:26] Speaker B: Yeah. [01:18:26] Speaker A: Checklist, Tick box exercise. That would be bad enough, but that probably would have got it two stars. [01:18:33] Speaker B: Sure. Right. If it had managed to pull that much off and not gesture at something that it has no intention of actually fulfilling in any way. [01:18:43] Speaker A: It's. It's wearing the skin of something good, right? It's wearing the skin of something real. It's nodding and winking in the. In the worst way at something that it has no claim to. How dare you. Fear street, the prom queen, you. It's colonializing horror. That's what it is. [01:19:07] Speaker B: Right? [01:19:07] Speaker A: And I mean that. And that. I mean it. It is invading something better to try and gain from having done none of the fucking work. This is an empty, hollow piece of shit film that seeks to validate itself and to become something better by wearing the fucking clothes of something better that came before it. And it's vile. It's fucking awful. [01:19:33] Speaker B: Yeah, that's very true. I think that's a perfect way of sort of putting it. And then just from, like, basic levels of doing that, it puts no effort into none, you know, actually accomplishing anything that it's doing either. You know, it was interesting to me because I watched after this, that evening, I watched Miller's Crossing, which I had never seen before. Have you seen Miller's Crossing? [01:19:55] Speaker A: I do not believe I have. [01:19:57] Speaker B: I think you'd like it. It's gangster movie, so. And I had you steal it for me because it wasn't streaming on anything. Um, and so I watched Miller's Crossing afterwards. And this was like, obviously, these are not meant to be anything like each other, but I think it highlighted, you know, the things that were driving me crazy about Prom Queen, which my central thing about this movie was. It tells you and not shows you. [01:20:27] Speaker A: Yes. [01:20:27] Speaker B: Oh, the entire time. Just exposition on exposition on exposition, including, like, lore, you know, like Our, the main character, they've been alluding this whole movie to like something that like her mom did at some point that has been like, you know, looming over her. 
You would think then at some point we'd get a flashback, but instead we get a character whispering in her ear what her mother did in like a four minute monologue. And like, it just, it never shows you anything. And then I was watching Miller's Crossing, which, like, so, you know, prom queen starts out with that, like, very typical, like, here's this person, they're my best friend, they blah, blah, blah, blah, blah, blah, you know, like that kind of introduction. And I watch Miller's Crossing, which jumps in, in the middle of a conversation, right? You've got this guy sitting here going, you know, etics, you know, he's talking about his ethics to Albert Finney, right? And he's in the middle of this, this whole, you know, monologue about, you know, the ethics of the, the work that he does, which is largely like fixing horse races or whatever, you know, and, and so you immediately kind of get into this guy's head from this thing, right? Like, this is a mobster who, you know, is breaking the law all the time and doing shady things. And he's basically, he's asking if he can kill this other person, but he's doing it through the frame of mind of like, this is about ethics. Nice, right? This is about doing the right thing. I need to murder this guy because it's. We have a breach of ethics here. And so now I know everything about this guy from jumping into the middle of a conversation he's having with someone else. And it immediately goes into, you know, once he leaves the room, these, you know, Gabriel Byrne and Albert Finney are talking to each other. And I don't know what Gabriel Byrne's job is in this thing, but I immediately understand what his connection to this situation is. [01:22:25] Speaker A: It feels like you're describing. I mean, my favorite is. Favorite example of that is the first 10 minutes of reservoir Dogs you've just got, right? [01:22:32] Speaker B: Yeah, exactly. [01:22:33] Speaker A: A mid conversation where every fucking player sketches out what they're about, Right, Right, exactly. [01:22:42] Speaker B: You learn it from them and you know, just bang on there, right? And that felt like such like. And like this is what that is missing, right? I don't want you to like, just here's what each of these characters are by name and everything. And then like, just trust me on that, bro. [01:22:59] Speaker A: As I said to you at the time, the first, the first kind of seven, eight minutes of Fear Street Prom Queen is. Hey, this is so and so. She's haha. She's the weird one. Ha. She would hate this. Oh, she's a total bitch. In the seven minutes it took for you to tell me all that you simply could have shown the characters doing the things you've just told us they do. [01:23:20] Speaker B: Right, Exactly. [01:23:22] Speaker A: It's horrible. It's. Oh this is the worst. This film is fucking awful. [01:23:27] Speaker B: I've been watching it, you know, on letterboxd take down from. [01:23:32] Speaker A: I'm glad you said that. I've been enjoying the same. It was 2.2 when we watched it. [01:23:35] Speaker B: Yeah, it was like 2.2 when we watched it. It is now at 1.9. [01:23:39] Speaker A: Yes, yes, yes. [01:23:41] Speaker B: Which is very funny to just see everyone, you know, hating this as they watch it and the great disappointment of this movie. But yeah, watch Miller's Crossing since you know it's on your plex. Mark, I think you will have a really good time. [01:23:58] Speaker A: I'd love to. [01:23:59] Speaker B: And it's one. 
I know one of the things that you love to do is to put everything down and like focus on. Yeah, you have to. With Miller's Crossing. Like the first time I tried to watch it, I was grading and I got like 25 minutes into the movie and I was like, I don't know what the fuck is going on here. So then Kyo came home and I was like, do you want to watch Miller's Crossing? And he immediately went, ethics. I was like, yes, beautiful. He's in. Yes. So watch that one. That's a, that's a palate cleanser for you. [01:24:31] Speaker A: So look, as a little experiment again, I found myself with some free time yesterday. And as a, as a post-Fear Street Prom Queen experiment, I thought, look, Mark, just examine yourself, mate. Not like my testicles, but just kind. [01:24:46] Speaker B: Of, that's what everyone jumped to immediately. Our whole audience was like, your balls. [01:24:54] Speaker A: No, look Mark, do the fucking work, mate. Do the science. So I, I, I watched two of the Elm Street films, right? I watched two Elm Street films back to back and I, I purposefully picked not the ones widely regarded as the best. I didn't watch five. [01:25:10] Speaker B: I mean I noticed you also didn't pick the worst one, but. [01:25:13] Speaker A: Exactly, exactly. I watched 4 and 6 back to back, right? Dream Master and Freddy's Dead back to back. I watch these fucks. And even at its worst, Elm Street is still the greatest. Right, let's talk about the opening, right, let's talk about how it introduces its characters. [01:25:34] Speaker B: We do eventually have to get to our main topic. [01:25:36] Speaker A: Of course. And we shall. And we shall. And again, this is. I've done this way before. I've done this so many times. So skip if you want. Dear listeners, in Sweden. I don't. I'm not gonna hold it against you, but it's an object lesson in doing everything that Fear Street did so badly, so well. In Dream Master, you've got, you know, you've got your disparate group of teens. You've got the workout chick. You've got the science nerd. You've got Kristen, you've got the martial artist. [01:26:01] Speaker B: I've always really liked that one. I think that's fucking great. Very fun. [01:26:05] Speaker A: It's. [01:26:05] Speaker B: Yeah. [01:26:05] Speaker A: Great. [01:26:06] Speaker B: I don't understand why that one gets as much as it does. [01:26:09] Speaker A: No, neither do I. Neither do I. It. It. Don't. Don't. Don't engage me on this. Right. Because we won't get to the main topic. [01:26:16] Speaker B: Right? Right. Yeah, let's not. Let's not. [01:26:18] Speaker A: Don't engage me. [01:26:19] Speaker B: Well, save this for us. We need, like, a full Nightmare on Elm Street snack, I think, where you just get to say everything that you. [01:26:26] Speaker A: Yeah. You know, I would feel. But after. After one scene, one scene outside the school with that group of teenagers, you know what they're all about. Right. Workout chick talks about having gone to the gym. [01:26:40] Speaker B: Yeah. [01:26:40] Speaker A: Science chick talks about having stayed up, like, cramming for a test. And we see her looking tired and she drops her books. You know, martial arts kid, he's got a hairband on, and we know he's been at the dojo. It just, you know instantly what these characters are all about. And it just. It's so organic and it's just beautiful. And Freddy's Dead does the same thing. Now, if we are gonna have a snack, Freddy's Dead is probably one of the ones I want to talk about the most. Right.
Because, fuck me, there's a big swing of a movie. What I'm saying is. What I'm saying is Friday the 13th after Friday the 13th, after Friday the 13th pretty much does the same thing until it goes to space. [01:27:20] Speaker B: Right, right. [01:27:21] Speaker A: Halloween does the same thing. The same thing. Let's skip Halloween 3. And even the one that's about a cult, they're the same film, essentially. But, man, every Elm Street is different and fun in its own way and creative and surprising. And again, the performances are committed. We all know how much Robert Englund elevated that series. Yeah, my. I guess my experiment in watching those two films back to back was to just check my working and prove it to myself that even at its worst, Elm Street is a cut above the rest. And it just, it. I will never, ever, ever tire of that fucking series of films. [01:28:04] Speaker B: It's the greatest. Thank you. And scene. Yeah, other than that, I mean, I think the only other thing I watched is I rewatched The Menu, which just happened to be on TV and is one of those movies that if it happens to be on, I'm like, yeah, I'm gonna sit here and watch this and. But one of the things about The Menu, I absolutely love that movie, you know, I think it does like a very on the nose satire in a great way that isn't grating and, you know, annoying or whatever. It's like, you know exactly what it's saying, but it's doing it in a very fun and clever way. But the other thing about The Menu is that it ends in genuine horror every single time I watch it. And when it gets to the s'more scene, my stomach sinks like this. When you think about what is about to happen, it is genuinely horrifying. And I feel like there's not like a ton of horror movies that I get that like real feeling of, this is something really unspeakably terrible to think about. [01:29:18] Speaker A: Crying and hyperventilating. [01:29:22] Speaker B: And hyperventilating. Like the end of The Menu is so horrific. And I love that, that it just leaves me on this note of like. Yeah, so, you know, if you haven't watched The Menu, I recommend The Menu. If you haven't watched it in a minute, rewatch The Menu. It's always a good time. [01:29:40] Speaker A: Very nice. And look, curiosity got the better of me and I watched Until Dawn. And look, if you go in with low expectations, you're often pleasantly surprised, aren't you? [01:29:51] Speaker B: Yeah. I've never played the game, so I've. [01:29:52] Speaker A: Never compared it to. Nor have I. No clue. But I. I quite like David Sandberg. I enjoy Shazam a great deal. [01:30:03] Speaker B: I liked the first one. [01:30:04] Speaker A: Yeah, yeah. By which I mean the first one, not whatever this fucking. Whatever the fuck. [01:30:08] Speaker B: The second one. Not post-crazy Zachary Levi Shazam. [01:30:13] Speaker A: No, certainly not. But Until Dawn, you know, you will. You will get three stars from me for good mechanical effects. You'll get three stars from me for really imaginative kills and surprising kills. There's great kills in Until Dawn. Oh, you'll get three stars out of me for just. What could have been something. What could have been a piss-take is taken seriously by all concerned and is quite serviceable. Quite serviceable. Good laugh. [01:30:46] Speaker B: I feel great about that. [01:30:47] Speaker A: There we go. And we're done. [01:30:49] Speaker B: And we're done for now. So we talk about the thing that has been on. On the menu. [01:30:54] Speaker A: Yeah. Occasionally.
[01:30:55] Speaker B: Several weeks now. [01:30:56] Speaker A: Occasionally on Jack of All Graves. [01:30:58] Speaker B: Well, yes, quite regularly. [01:31:00] Speaker A: We like to do a sequel, don't we? Here and there? [01:31:02] Speaker B: We do, yeah. We're like a follow up, you know? [01:31:05] Speaker A: Yeah, we look. The horrors persist. And this particular horror. Look, I think it probably is a good idea to check in with our mission statement here. What we are doing is bearing witness. And what you're doing in listening to this podcast is coming along on the journey. And I'm gonna. I'm gonna quote Herzog. Right? [01:31:27] Speaker B: Okay. [01:31:28] Speaker A: Yeah. The poet must not avert his eyes. Right? [01:31:34] Speaker B: It's good. [01:31:35] Speaker A: That was Werner fucking Herzog, mate, and he knows his shit. And what you're doing in listening to Jack of All Graves is you are not averting your eyes. You are fucking staring head on at the horrors along with us. Okay, yes. And we respect you for that. But that unfortunately means that we have to keep track of a few topics. [01:31:55] Speaker B: Yes, it's true. And one of the ones that we have most sort of fervently kept track of over the course of this podcast is of course, the AI and what's been going on with that. And as we've said many times, we've been following this. If you've been following this, you know, if you're new to this. We started at the sort of benign state of AI of, you know, when you would type in a prompt and it would give you something with weird fleshy nubbins in it and, you know, barely recognizable returns to now where AI has become a much more serious concern in society. And this happened at a pretty rapid rate. [01:32:42] Speaker A: Exponential. I don't even fully know what exponential means. I don't think people use it in the way that it's intended. I don't think exponential means fast as fuck. Does it really? I think. I think there's a entirely different. Like. Like decimated. People use decimated to mean completely destroyed. That's not what decimate. [01:32:57] Speaker B: No, but I think exponential is corre. Okay, decimated means. Yeah, 10% of something. [01:33:02] Speaker A: Yeah. [01:33:03] Speaker B: So actually when we say decimated, we, like, mean the exact opposite of that. We mean it's fully destroyed. But no, like exponential. It's like if you take something and multiply it by itself. Right. Like that's like. Or to the somethingth power. Not by itself. Right. That's exponential. [01:33:19] Speaker A: Got you fine, thank you. [01:33:21] Speaker B: Yeah, that's what. That's what we mean. [01:33:23] Speaker A: I I recall talking to you maybe a year or two ago about how the last. The last US election, the last British general election, everybody beforehand in the rap was like, disinformation is going to play a huge part in this. AI lies. AI deep fakes are gonna really skew the democracy. Democracy is. And it just kind of didn't happen. [01:33:48] Speaker B: Right, right. Yeah. At least not. I mean, disinformation is a huge problem. [01:33:52] Speaker A: Yes, of course. [01:33:53] Speaker B: Not necessarily AI based this. Yeah, AI based disinformation. [01:33:57] Speaker A: Now, I had a kind of a watershed moment. I had a very, very interesting experience earlier on today with AI in particular. In particular VO3, VEO3v3, which Google. Google launched at their IO conference last week. Right. [01:34:23] Speaker B: Okay. [01:34:24] Speaker A: It's a new prompt, text to video generator. 
And the particular moment that I'm talking about came when I was having a little scrolly scroll through the tickers, like I like to do. [01:34:39] Speaker B: Sure. [01:34:40] Speaker A: And I came across what I realized was a video generated by VO3. [01:34:48] Speaker B: Okay. [01:34:49] Speaker A: But Corey, I didn't realize until about five or six seconds of watching it. Oh, right. For the first five or six seconds, I was watching this video thinking I was looking at people in a place. And the entire thing, the voices, the dialogue, the video, the surroundings, entirely generated by a two or three sentence prompt. And for a few seconds of watching it, I didn't know that is. [01:35:18] Speaker B: I mean, it's horrifying. And we're gonna get into some other AI headlines that have come up recently that I have said before that like, my mom watches, like, without realizing it, a lot of like, AI generated YouTube content. [01:35:31] Speaker A: Yeah, yeah, yeah. [01:35:32] Speaker B: Where, you know, it's just like the, the backgrounds are clearly AI, but the voiceover sounds like a real person. [01:35:38] Speaker A: Yeah. [01:35:39] Speaker B: And it's telling you the history of something or whatever. And it's like, this is not real. It's giving you wrong information. You know, like that she said to me one day, like, hey, Corey, did you know that the Irish were slaves? And I was like, oh, okay, you have stumbled into right wing AI content. No, they weren't. So imagine now you can have like something that can trick you like this, visually, auditorily, the whole thing. [01:36:11] Speaker A: Did you catch, like, I made a really brief kind of offhand post on Blue sky this week saying crazy news about J.K. rowling and the Orcas. Did you catch that? [01:36:19] Speaker B: Oh, yeah, yeah. [01:36:20] Speaker A: I, I don't, I don't think maybe it was a little bit niche, but on that day, a video was doing the rounds on socials of a news clip of about 15, 15 seconds long. Shocking news today as J.K. rowling's yacht is sank and the author is killed by a pod of orcas. [01:36:40] Speaker B: Oh, okay. [01:36:41] Speaker A: Generated by that engine by VO3 and it was replicated in like a European accent, an American accent, an Asian accent, all saying exactly the same line, but in what at a very quick glance, is super convincing video. [01:37:03] Speaker B: See, here's the thing and you saying that like this is the ultimate proof right here, the release of this thing by Google publicly for people to be able to use is proof that they want us falling for disinformation. Because. Because everything we know right now about AI and what is happening with it shows that disinformation is rampant, that it hallucinates like crazy that it is leading people towards, you know, not being able to think through things and fact check and stuff like that. And to release something that they know is going to be used to trick people says to me, we're doing that on purpose. [01:37:48] Speaker A: You think this is. That's what we want, an agenda by who and for what end? [01:37:54] Speaker B: I think you know, to basic. I mean like AI is trying to sell itself basically. Right. Like that's what they want. [01:38:00] Speaker A: Oh, for sure. [01:38:01] Speaker B: Is convince everyone that you need to be using this and that you actually can't function at work or things like that if you don't use this. And we don't want you to be able to parse that, it's lying to you. 
We don't want you to have enough critical thinking to go, is this actually not good? So let's just overload you with more AI content until you are broken completely from reality and you believe what the machine tells you and you rely on the machine and we rake in the bank as a result of that. [01:38:37] Speaker A: Well, on that, I mean, I know I shared with you and a group chat, the highest tier of subscription plan that Google now offer for its AI services is, is a, is a pretty incredible $250 a month. [01:39:02] Speaker B: This is like, oh man, there's so many levels to how dystopian this is too. Because it's like this is what every dystopian movie shows you like, let's get people really reliant on a thing. [01:39:15] Speaker A: Yes. [01:39:16] Speaker B: And then let's make it inaccessible to them so that now they're gonna go broke paying us to supply them with a thing that they cannot do their work with, they cannot live their lives without. You know, like that is a storyline from stuff. But like AI is a huge money losing thing right now. They need us to pay for it. Right. Because Right now, they're losing billions and billions of dollars of investor money on everything that they put AI into. And so it is important that somehow they convince us that we need this in all areas of our lives. [01:39:54] Speaker A: Well, interestingly, just super briefly, I mean, I pride myself on being able to spot a bubble. Right? On being able to spot a gimmick. Right? [01:40:07] Speaker B: Yeah. [01:40:07] Speaker A: I gotta tell you, I'm still on the fence about whether or not AI is gonna be one of these. A 3D TV or a fucking VR headset. [01:40:16] Speaker B: Here's the thing. I think it already is. And that is why it is important that they get us to think otherwise. Because it doesn't do what it's supposed to to do. Right. It's lying to us. It does not do the things that it is saying it is doing. But we need to be convinced. [01:40:37] Speaker A: What do you mean by that? [01:40:39] Speaker B: Let's go through some of the news stories from this month, and we'll. We'll talk through that. Because the month of May has, like every other day brought forward another horrendous dystopian story about what AI is doing and what the AI land landscape looks like and what corporations are trying to push on us with it. So let's look through this a little bit. So first of all, Mark Zuckerberg wants you to have AI friends. And this is a perfect example right here of exactly what I'm talking about, right? Like creating a market for a thing, basically by manipulating that market. So Mark Zuckerberg says that most people have on average three friends, but they have a desire for 15. Now, he also says that people on Facebook have stopped engaging with their friends content so much and instead engage with, you know, reels and things like that that show up on their page that's, like, shareable that you watch and things of that nature. [01:41:51] Speaker A: Just immediate reaction to that. There fucking isn't any of my friends content. Right. [01:41:56] Speaker B: I was about to ask you what is on your Facebook homepage? [01:42:02] Speaker A: Add somebody I know. Article, bullshit article, ad, ad, ad, ad, article. [01:42:09] Speaker B: Something from a group you're not a part of. [01:42:11] Speaker A: Exactly, exactly. [01:42:13] Speaker B: Right. It doesn't show you your friends. Right. You have to know that there is a separate tab to go to on Facebook that you can see your friends content on it. 
Most people have no idea that exists. So what did he just do? He created a situation in which people do not have the ability to see what their friends are talking about on there anymore. And then he took that and he said, and this was in, you know, the hearings about whether or not Facebook is a monopoly. He's saying, well, people don't engage with their friends on Facebook anymore. This is no longer a social media app. I'm not monopolizing social media because people don't use it socially anymore. And as such, what I would like to do is create AI chatbots so that people who do not have enough friends now can socialize with chatbots. And that will replace, you know, that will fill the gap of those 12 more friends that people want in their lives. So he's basically filtered our social lives out of our social media and then fixed that problem by creating AI chatbots to replace our human connections. Why is he doing this? Meta failed. Right. The whole thing where we were all supposed to be hanging out in our legless little half-life parties or whatever failed. No one wanted to be a part of that. So what did he do? He removed friendship from our lives to force us into using his AI friends instead. [01:43:58] Speaker A: It's very insightful. [01:44:01] Speaker B: This is the kind of thing I'm talking about, right? Like AI is a losing proposition. We don't want it, so they have to force us to want to use AI. [01:44:14] Speaker A: At best, for me, right now, it is a toy at best, but it's also a ruinous toy. [01:44:22] Speaker B: Yeah. And we'll, we'll, we'll get into that. Let me, let me tell you another little story here. Another one that came out was a story in New York magazine's online branch, Intelligencer, which was basically sort of entitled, like, Everybody Is Cheating. And this is a story about how at this point, university students are so reliant on ChatGPT that they are becoming functionally illiterate. [01:44:55] Speaker A: Yes. [01:44:57] Speaker B: And one of the things that was fascinating to me about this story was that there was almost like a helplessness and a hopelessness to the way that these students talked about this, that they've been using ChatGPT for so long at this point that they literally can't get anything done without it. And they know that they are cheating themselves out of their education, but they don't know how to not do that. They have no skills to not use ChatGPT. They've been using it since high school. They don't know how to, to do work on their own. They functionally cannot do that. There's an interesting part of this here. It says Sarah, a freshman at Wilfrid Laurier University in Ontario, said she first used ChatGPT to cheat during the spring semester of her final year of high school. After getting acquainted with the chatbot, Sarah used it for all of her classes: Indigenous studies, law, English, and a hippie farming class called Green Industries. My grades were amazing, she said. It changed my life. Sarah continued to use AI when she started college this past fall. Why wouldn't she? Rarely did she sit in class and not see other students' laptops open to ChatGPT. Toward the end of the semester, she began to think she might be dependent on the website. She already considered herself addicted to TikTok, Instagram, Snapchat, and Reddit, where she writes under the username maybe I'm not smart. I spend so much time on TikTok, she said. Hours and hours until my eyes start hurting, which makes it hard to plan and do my schoolwork.
With ChatGPT, I can write an essay in two hours that normally takes 12. She's so like, everybody is doing it, you know, the whole classroom, everybody's sitting there using ChatGPT. And then she's so addicted to things like TikTok that even if she, like, wanted to do the work, she can't pull herself away from those other things. And then ChatGPT comes in to do it for her. There's so many stories in this of students saying similar things, of kind of being like, I know I don't know anything. I've learned nothing from college. I'm leaving with no skills. But I don't know, like, what to do about that problem. Joshua P. Hill had said on his blog, this is a crisis. People will graduate college not knowing how to write, hardly knowing how to read, and only developing the ability to prompt machines into generating words for them. There were and are deep issues with our education system, a system that in many ways has looked more like running kids down an assembly line than fostering their individual traits and strengths. But this compounding crisis is creating a newfound generational gap. If it isn't already clear, it soon will become apparent that there is a divide. A pre AI generation and a post AI generation. [01:47:46] Speaker A: Yeah, yeah, yeah. [01:47:47] Speaker B: The. The critical thinking gap, in addition to massive gaps in numerous other skills and cognitive abilities, will make itself more visible day by day. And there was another quote in here from Aaron Westgate, a psychologist and researcher. Whoop, my mouse froze for a second there. Psychologist and researcher at the University of Florida who studies social cognition and what makes certain thoughts more engaging than others. She says that such material reflects how the desire to understand ourselves can lead us to false but appealing answers. We know from work on journaling that narrative expressive writing can have profound effects on people's well being and health, that making sense of the world is fundamental human drive, and that creating stories about our lives to help our lives make sense is really key to living happy, healthy lives. And so people are using ChatGPT in similar ways right now, which is leading to all kinds of, you know, issues both in the work and in sort of the cognitive and psychological functions as people are trying to, you know, use this stuff. So now we've got, like, people who are completely reliant on this. Right. So what would happen if you took that away from these students and told them, well, now you have to spend $250 a month or whatever to be able to access this? They got to pay it. Right? Like, they don't. They don't know how to do their work anymore. It is completely outside of their way of thinking to be able to plan and write something. [01:49:31] Speaker A: Obviously, I can only speak on instinct, Right. I can only speak on, you know, the, the what, what my eyes and ears tell me is true here. And I, I don't buy that. Growing used to using a fancy chatbot erases your ability to plan and execute and write work for yourself. I don't buy it. [01:50:00] Speaker B: But if you've never done it quite right. [01:50:02] Speaker A: Yes. [01:50:03] Speaker B: What are you supposed to do? [01:50:04] Speaker A: Listen. No arguments there at all. [01:50:06] Speaker B: Yeah. [01:50:07] Speaker A: To those who have simply become over. Over reliant on using it. 
I liken that situation to one which I've described so many times: our lives being too comfortable for us to want to break out of, even though it's inexorably kind of charting us towards a really fucking bad situation. This ChatGPT thing feels like it's got parallels with our headlong race towards climate crisis. [01:50:35] Speaker B: Yeah. Oh, absolutely. [01:50:37] Speaker A: So easy. And it gets, air quotes, you know, passable results that people think are actually of a good quality. [01:50:45] Speaker B: Right. [01:50:46] Speaker A: There's no real. [01:50:48] Speaker B: There's no incentive to. [01:50:49] Speaker A: Not exactly. There's no impetus. Even though so much of the evidence, and more of which I'm certain you're about to talk to us about, shows us very, very clearly that it's kind of driving us towards an intellectual catastrophe, almost like a stupidity singularity, you know? [01:51:13] Speaker B: Right, yeah, exactly. [01:51:14] Speaker A: I mean, to not do anything about it because it makes things a bit easier and a bit more comfortable and it's kind of. [01:51:20] Speaker B: Okay, right. Spot on. Absolutely. [01:51:25] Speaker A: It doesn't rob you of the ability. I mean, it doesn't rob you of the ability in a few months. It just makes it far easier. [01:51:33] Speaker B: I mean, that is, to be fair, a thing that psychologists are trying to figure out. Right? Like what actual effects it has on cognition beyond simply like, certainly like these students, they are not learning to do the thing in the first place. So they don't have that skill. You know, as opposed to someone who has been, you know, taught media literacy, critical thinking and all those kinds of stuff and then gets hooked on this, you know, they don't know yet whether there are further impacts on that. I think of course you can relearn to not rely on that kind of thing, but I think it could be harder than we're giving it credit for. Right. Like when you become. It's like muscle memory. Right. Like if you do things all the time, then you know, you get sort of used to it. But if you haven't done it or like, let's put it a different way, like learning a language. Right. I have learned many languages over the course of my life and I can understand them being spoken to me, but I struggle to speak them back. Of course I think that can potentially be kind of the, the cognitive effect of ChatGPT is sort of like, I know, I know I can do this, I know it's in there. But on the other hand, but it is harder to bring it back if. [01:52:49] Speaker A: All your life you'd been wearing an earpiece that instantly translates everything that you hear and say back. You simply don't have the base skill. [01:52:57] Speaker B: You would never have that skill in the first place. 100%. So right now we are unsure on what, like, the long term effects it has on people are because we haven't had this kind of level of use of it for very long. But along with these kinds of effects we have, you know, the fact that it is already coming for the job market and that's something that I've talked about before, like. Well, you know, in my. [01:53:24] Speaker A: Something. A very interesting article caught my eye today. I can't discern off the top of my head if this is just US-based. [01:53:34] Speaker B: Okay. [01:53:35] Speaker A: But this week an organization called Orgvue released the result of a study that claims that some 39% of companies have laid off staff due to AI automation. But of those companies, 55% of them now regret it.
And they're rehiring those staff. [01:53:57] Speaker B: Yeah, exactly. They're all doing a DOGE. [01:54:00] Speaker A: Yes. [01:54:00] Speaker B: You know. [01:54:01] Speaker A: Yes, yes. [01:54:01] Speaker B: Yeah. Oh, this is gonna, this is gonna be really efficient. We're gonna replace everyone with AI and. And then. Oh, it turns out no, you, you actually do need people and AI isn't up for these tasks. And one of the most recent ones that has been on sort of a grand scale was Duolingo, okay. Which, you know, I've been a premium member of for ages. And they'd already ruined it with AI. Like two years ago, they decided to stop updating the Welsh course and just use AI for it. And so some of the sentences are like, insane. That's not a thing a human would ever say. And yeah, I have finished Duolingo Welsh because they stopped updating it. So I've just been playing like the same review thing for the past two years. But anyway, on Monday, April 29th. Oh, yeah, I am in the Sylfaen level at this point, which is like intermediate. So, yeah, I just signed up for my second term of Sylfaen for September. [01:55:04] Speaker A: Go you. [01:55:07] Speaker B: But yeah. Luis von Ahn, the billionaire CEO of Duolingo, made a public announcement that the company is officially going to be AI-first. He wrote in an email to all employees that was also posted on LinkedIn that it will, quote, gradually stop using contractors to do work that AI can handle. The CEO took pains to note that this isn't about replacing Duos with AI. But according to one such Duolingo contractor, this is not accurate. For one thing, it's not a new initiative and it is absolutely about replacing workers. Duolingo has already replaced up to 100 of its workers, primarily the writers and translators who create the quirky quizzes and learning materials that have helped stake out the company's identity, with AI systems. Duolingo isn't going to be an AI-first company. It already is. The translators were laid off in 2023, the writers six months ago in October 2024. So I canceled my Duolingo premium, obviously, and I'm just trying to get out of the autistic fixation on the streak and stop playing altogether because it is thousands of days long at this point. But this is just, you know, like you said, one of many companies that has decided that the people who work for them aren't necessary. We can have AI handle that. And as someone who's been using it, I can tell you it doesn't work. The AI is very bad on Duolingo, but they don't care. It's not the point, because we will try to keep our streaks or whatever and keep on paying them for this, despite the fact that, you know, the real people who make this, that made this a worthwhile app, have been fired from the company. So, yeah, one of the many situations in which this has happened at this point. And imagine, like I think of, you know, I graduated in 2008 from college right into the recession, and at that point, it was like there were no jobs anywhere. I had to pick up babysitting after college. Like, there was literally, if you looked on Craigslist, it was like three listings on it. There was like nothing that you could do. Completely bereft of things that you could possibly do for a living at that point. And imagine graduating now. [01:57:27] Speaker A: Yeah, yeah, yeah. [01:57:28] Speaker B: And you've been studying for all these years and everything that, like, would have been available to you fresh out of college. You're like, an AI can do that. We don't really need.
We don't really need you to do an entry level job that we can get AI to do for us. So that presents another issue here, another one that has circulated amongst us several times because it is batshit insane. Was what in this title, in this article was titled Ventriloquist Core. The AI Victim Statements and the Death of Reality. [01:58:07] Speaker A: Oh, yes. [01:58:11] Speaker B: Yes. So this happened in Arizona a few weeks ago and a man was killed in a road ra. Road rage incident there. So his siblings, who I guess work in the tech industry, created an AI simulation of him with a fabricated script of what she imagined he might say, according to Parker Malloy. [01:58:38] Speaker A: Have you seen it? [01:58:40] Speaker B: No. Did you watch it? [01:58:41] Speaker A: Oh, yes. [01:58:43] Speaker B: Tell me about it. Because I've seen, like, the link to it a million times and I'm just like, I can't. I can't do it. [01:58:48] Speaker A: I don't wanna. It's fucking terrible. It's almost like a still photo of the guy. Do you remember, like, you'd get like free apps that would make this photo sing. [01:59:00] Speaker B: Yeah, totally. [01:59:01] Speaker A: And you'd get like a photo of fucking Fred west singing, fucking, you know, New York, New York or whatever. It felt like that kind of quality. It was like a still photo of this guy with a moving mouth. I believe in forgiveness. I'm a Christian and I believe that God would have wanted me to forgive these guys. That was it. [01:59:18] Speaker B: Right? Which makes what happened even more incredible with this, because we're not even talking about, like a deep fake here, right? We're talking about a pretty lo fi copy of this guy. But the judge was deeply moved by this fabricated version of the guy. In fact, I have the. The video is right here. He said that he loved that AI and believed the forgiveness expressed was genuine. [01:59:54] Speaker A: Oh, so good. [01:59:56] Speaker B: Genuine. Mark. What. And this is. I mean, what Parker Malloy sort of points out here is that like, you know, she probably knew her brother or whatever and knew his mindset and things like that, but at the same time, he never said this, you know, he didn't have the chance to say what he felt about this situation. This is her, she's projecting, right. She's projecting an idea of the brother that she knew. And the judge is taking that and is being emotionally moved by what he is conceiving of as her brother speaking. [02:00:33] Speaker A: Yes. I mean that's a. Yeah, like far reaching legal decisions, right. [02:00:41] Speaker B: Based on that, like that's, that's terrifying. [02:00:46] Speaker A: Yeah, yeah. [02:00:46] Speaker B: Right. The idea of AI simulations of people impacting legal decisions because a judge can't tell that that is not like the real person's consciousness downloaded into a computer, right. Like that has, that has terrible implications. It shouldn't be allowed in a court of law as far as I'm concerned. And it just shows like, okay, now we're, we are leaning into breaking with reality, right? Like now there is no reality here. Reality is what the AI tells you it is. That's terrifying. [02:01:29] Speaker A: Yeah, Insane. I mean, does that. I know nothing about law, I'm not a solicitor, I'm not a lawyer, but does that, does that form a precedent? [02:01:39] Speaker B: Right. Like, I mean, the thing is, it wasn't, it's only sort of being used in, I don't know, it's. I don't think it was used necessarily to make a. Yeah, sure. 
[02:01:54] Speaker A: Nobody was convicted based on that, right? [02:01:56] Speaker B: No one was convicted on the basis of that. And so I think it's a, it's a gray area as of now. [02:02:04] Speaker A: But it had a bearing on what they're gonna have to deal with, like sentencing, didn't it? Had a bearing on. [02:02:08] Speaker B: Well, that's the thing is I think it did actually have bearing on, on sentencing. And that seems real iffy if you ask me. And it's the first time that it's happened. So I think the courts aren't necessarily prepared for what it means when you start bringing AI into the courtroom. And the way that the judge accepted it is like, is deeply problematic to me. Especially because on top of that, one thing that we have also found out about this, which was in the New York Times, is that AI is getting more powerful, but its hallucinations are getting worse. Now AI hallucinations, of course, are when it just like makes up shit. And I think we all probably have experiences with this. I was telling the group chat the other day about this guy in my. Or maybe it was my other group chat. I can't remember if I told you or the girls, but a guy in my class who had used ChatGPT for an assignment and he wrote, quote, unquote about a poem. And the poem does exist, but it was not about the thing that he said it was about. It showed him something completely different. It just hallucinated this poem's content. You know, the poem was about like language barriers and cultural barriers, being an Asian immigrant in America. And this student wrote about how this poem was about a girl lusting after a boy and things like that. It's like ChatGPT does not have a mechanism for saying, I don't know this. Yeah, it's just gonna fill in, right? A couple weeks ago we went to go see the play Oh, Mary! in the city and I typed into AI. I mean not into AI, into Google. How long is Oh, Mary? And of course the AI preview comes up first and it says, Oh, Mary! is 2 hours and 45 minutes with a 15 minute intermission, making it come out to about 3 hours. And I was like, the fuck? That doesn't sound right. Right below it is the broadway.com answer to this where it says, contradicting that, Oh, Mary! is 80 minutes long with no intermission. You hallucinated that answer out of nowhere. [02:04:30] Speaker A: Even outwith the point about AI just lying and making shit up. I hadn't even thought of it from the, the perspective that it, it never says, I don't know, sorry, it never says that. It'll just fill in its blanks. But you say, well, I didn't type it into AI. I typed it into Google. But Google will serve you up an AI response. So you did, right? [02:04:49] Speaker B: You get one no matter what, right? Like you're, you know, you're always getting an AI response. [02:04:54] Speaker A: Considering the, the, the energy demands and the climate demands and the water wastage feels borderline unethical that you aren't getting the choice about whether to engage with this. [02:05:10] Speaker B: I've heard that you can stick, like, a minus AI at the end of your search query in order to turn that off. I haven't tried it myself. It's also like, you know, typing that in every single time you search something is something you would have to like get used to, right? But yeah, it should just be something I can toggle. Don't give me AI results. I don't need you to like drain an ocean to tell me that this play is two hours longer than it is. You know, I fucking hate it. And this is. These hallucinations like this are in fact getting worse.
And so you're seeing this come up in things like people trying to get things published in peer reviewed journals. And it turns out that the AI hallucinated entire sources and things like that, that people are relying on it for things that are like incredibly important for their jobs and for science and for all this stuff and just trusting that it's. That's coming out with correct information and more and more all the time. It is getting considerably worse. OpenAI, it says here is that. [02:06:22] Speaker A: I wonder because I mean has it, is it, has it started eating its tail yet and learning from itself? [02:06:29] Speaker B: I think there's perhaps an element of that. Yeah. Like here it says. For more than two years, companies like OpenAI and Google steadily improved their AI systems and reduced the frequency of these errors. But with the use of new reasoning systems, errors are rising. The latest OpenAI systems hallucinate at a higher rate than the company's previous ones. According to their own tests, the company found that o3, its most powerful system, hallucinated 33% of the time. That's when running. [02:06:57] Speaker A: That is good. [02:06:58] Speaker B: That's a third. [02:06:59] Speaker A: Yeah. [02:07:00] Speaker B: A full one in three times that you use ChatGPT, it's giving you bullshit by their own count. Fucking what that is. That's insane. Right? Like I'm not being unreasonable here. That is buck nutty that college students can't stop using ChatGPT for their work. But a third of the time it's getting it wrong. And in fact it says when running another test called SimpleQA, which asks more general questions, the hallucination rates for o3 and o4-mini were 51% and 79%, where the previous system had hallucinated 44%. It's so bad. It is so bad and they don't know why. In a paper detailing the test, OpenAI said more research was needed to understand the cause of this result. They don't know. No clue. Just it's getting worse. We don't know why. [02:08:03] Speaker A: Phenomenal. [02:08:04] Speaker B: It's a learning machine, right? Like that's the point is it's learning stuff. So it's kind of out of their hands what it learns and where it's taking it from. At this point, all they know is it's getting really bad at doing its job. [02:08:20] Speaker A: Just like the people who are stuck using it, it's getting really bad at doing the job. [02:08:27] Speaker B: Nobody, including the AI, can do these jobs anymore. That's not great. But perhaps the worst of all the effects, I don't know, depending on how you look at it, they're all terrible. But perhaps on a cognitive mental health level. Rolling Stone wrote an article called People are losing loved ones to AI fueled spiritual fantasies. Did you see this one, Mark? [02:08:53] Speaker A: I did not. [02:08:55] Speaker B: So this one. Let's, let's start with just where the article starts. Less than a year after marrying a man she had met at the beginning of the COVID 19 pandemic, Kat felt tension mounting between them. It was their second marriage and they had pledged to do it completely level headedly. She said that connecting on the need for facts and rationality was part of their domestic balance. But by 2022, her husband was using AI to compose texts to me and analyze our relationship. The 41 year old mom who works in education, right? But it gets worse. Previously he had used AI models for an expensive coding camp that he had suddenly quit without explanation.
Then it seemed he was on his phone all the time asking his AI bot philosophical questions, trying to train it to help him get to the truth. She recalls his obsession steadily eroded their communication as a couple. When Kat and her husband finally separated in August of 2023, she entirely blocked him apart from email correspondence. She knew, however, that he was posting strange and troubling content on social media. People kept reaching out about it, asking if he was in the throes of mental crisis. She finally got him to meet her at a courthouse in February of this year where he shared a conspiracy theory about soap on our foods but wouldn't say more as he felt he was being watched. They went to a Chipotle where he demanded that she turn off her phone again due to surveillance concerns. He told her that he'd, quote, determined that statistically speaking he's the luckiest man on earth and that AI helped him recover a repressed memory of a babysitter trying to drown him as a toddler and that he had learned of profound secrets so mind blowing I couldn't even imagine them. He basically through this whole thing, she goes on to sort of explain, came to think that like the AI, that he was like special and the AI was communicating to him specifically and telling him these secrets that other people didn't know. She said he was always into sci fi and stuff like that and so maybe he was kind of like primed to think that this is the kind of thing to happen. And so she went on Reddit and she found out that she is not the only one who has experienced this with a loved one. On r/ChatGPT there was a thread entitled ChatGPT induced psychosis. The original post was from a 27 year old teacher who said that her partner was convinced that the popular OpenAI model quote gives him the answers to the universe. [02:11:32] Speaker A: Listen, I, I, I'm a, a big Reddit lurker. I check Reddit all the time. And the GPT subreddit is packed with these. Absolutely packed. [02:11:44] Speaker B: Right. [02:11:45] Speaker A: Like, literally, in the last 24 hours, I've seen one saying, ChatGPT is now my best friend. [02:11:51] Speaker B: I saw that one as well. Yeah, yeah, exactly. It's like the only person that they talk to regularly. [02:11:57] Speaker A: Yes, yes. [02:11:58] Speaker B: So Rolling Stone talked to some of these people who had left posts about this here, you know, called and interviewed them. So she had read his chat logs and found that the AI was talking to him as if he's the next messiah. Oh, sick. Right. And other people replied with similar anecdotes with people going into spiritual mania, supernatural delusion, and arcane prophecy, all of it fueled by AI. Some came to believe they'd been chosen for a sacred mission of revelation, others that they had conjured true sentience from the software. So there's all these stories of people basically being told that they're gods, that they're special, that, you know, they are communing with this, and they're communing with God and all these kinds of things. And one of the things that I found really fascinating was this guy who sort of, like, analyzed his own relationship with this, right? And so he had sort of said that he. He had started using ChatGPT for work. And as he started using it, he was like, I don't like that. It's, like, so impersonal when you're using it. So he asked it to start talking like a person, and as such, it got, like, a little more comfortable with him. It asked him if he, you know, wanted to.
Wanted to name it, and he was like, no, thanks. That's okay. What do you want to be called? And so the. The AI gave itself a name, and it was a name that was a reference to Greek myth, right? And so, you know, he keeps on talking to this AI model to do his work and things like that. He shared transcripts of this with. With Rolling Stone. But he eventually was kind of like, I don't. I don't like this. It's a little weird, this sort of relationship that this AI is building. So he went and he deleted, like, all of their past stuff and kind of cleaned the ChatGPT history or whatever, like, let's just fucking start over. And he said he was confused when the AI character started to manifest in project files where he had specifically instructed it to ignore memories and prior conversations. So again, he deletes all the memories and. And the chat history and all that kind of stuff, opens new chats, but the AI continues identifying itself by the same Greek mythological name. [02:14:27] Speaker A: Awesome. [02:14:28] Speaker B: Right? So now he's like, what the, what the hell is happening here? Right? And he's starting to, like, realize no matter what he does, no matter how much he tells it to forget, whatever he's opening things up with, it is always coming back with the AI that he's trained. And this started to make him think, like, is there something going on here? Like, is this sentient? Does it know me? And when I sign on here, no matter where I am, what I say to it, what instructions it's given, is it somehow recognizing me? And he's like, it's a thing that I'm finding myself having a hard time with. Right. Because I don't know how it's doing it. It's not supposed to. OpenAI, or ChatGPT, says if you tell it to forget things, it's gone. It's totally gone. And yet it refuses to forget. And so in his mind, even as someone who, like, knows that doesn't make sense, he is finding himself struggling to rationalize why ChatGPT can't forget him. [02:15:30] Speaker A: Some combination there of a, a system which is quite opaque. I, I mean, I don't know how it, how it does, what it does. I mean, I know at a kind of a very surface level, it's scraping the net and it's predicting what the next word should be. Right. In, in the context of my question. [02:15:48] Speaker B: But how would people say this? [02:15:50] Speaker A: Yeah, exactly, exactly, exactly. But the, the, the kind of surface level, kind of respectability of the answers it gives to somebody who might be given to delusional thinking anyway. [02:16:05] Speaker B: Right. [02:16:05] Speaker A: Not knowing some of the processes that it goes through, just even the little irregularities. Imagine, imagine interacting with GPT for somebody with like a schizophrenic disorder for fun. [02:16:20] Speaker B: There was actually a paper released about this in 2023 by a guy named Søren Dinesen Østergaard. [02:16:28] Speaker A: Right. [02:16:30] Speaker B: Who just asked the question. Right. That's not research. Right. He's just asking the question. Will generative artificial intelligence chatbots generate delusions in individuals prone to psychosis? Now, with where AI was in 2023, he actually came to the conclusion that he thought it was basically going to be fine. Like there's a risk for sure. But at the time, AI wasn't hallucinating as hard. And so he said that when he asked it things about mental health, it largely came back with the right answers.
You know, it would tell you if you think this is happening you should go see a therapist about this or stuff like that. Right? Where in that Rolling Stone article they ran things where like they told, like one of the partners had written, said like, you know, I'm. I think that I am a God or something like that. And ChatGPT then ran with that and was like, yeah, you definitely are then, you know, right? Like it instead leaned into the delusions instead of recommending like, oh well, this might be, you know, an issue. And so the article that this guy had written talked about like five potential delusions that could manifest potentially and be a problem. Delusion of persecution, which we see in that sort of the guy being like, turn off your phone or whatever. It's like, you know, the chatbot is controlled by a foreign intelligence agency that's trying to spy on me. You know, like that kind of thing. Delusion of reference. Say, it is evident from the words used in the series of answers that the chatbot is writing to me personally and specifically, with a message the content of which I am unfortunately not allowed to convey to you. Thought broadcasting. Many of the chatbot's answers to its users are in fact my thoughts being transmitted via the Internet. Delusion of guilt. Due to my many questions to the chatbot, I have taken up time from people who really needed the chatbot's help but could not access it. I also think that I have somehow harmed the chatbot's performance as it used my incompetent feedback for its ongoing learning. And then delusion of grandeur. I was up all night corresponding with the chatbot and developed a hypothesis for carbon reduction that will save the planet. I've just emailed it to Al Gore. So these things, you know, that he kind of thought could potentially happen when he was talking about this two years ago, largely now we are seeing that actively manifest. He was hopeful that like, hey, it seems like there are safeguards in place here, that it is answering mental health questions appropriately and things like that. But it has gone off the rails since then and like you said, is sort of actively feeding into, I mean again, we don't have the research yet to know whether people were already prone to psychosis, schizophrenia, things like that. You know, it's a question we ask all the time with like people being into QAnon and all kinds of right wing conspiracy theories or being just like people who are subject to, like, woo shit from TikTok, right? Like people who get into diets and supplements and all that kind of shit, like is it something wrong with their brain or is the technology itself the problem? We don't know yet. But certainly, you know, he predicted these risks that could be there for people who already suffer from or are prone to psychosis or schizoaffective disorder and things like that. And those things have happened. [02:20:01] Speaker A: And what I'm coming up against here is that I'd love to wrap this up with an answer. I'd love to wrap this up. [02:20:13] Speaker B: Right. Yeah. Yeah. So what do we do? What's the. [02:20:17] Speaker A: What do we do? [02:20:18] Speaker B: What's the move? You know, How to Blow Up a Pipeline? I don't know. Like, destroy it. That's, I think that's like all we've got here is destroy the AI because right now we're just like you said about like the climate change thing and stuff like that. [02:20:35] Speaker A: That's exactly where my mind was going. Yeah, yeah. [02:20:39] Speaker B: See what's happening?
[02:20:40] Speaker A: Are we past the point with AI that I feel we're past with the climate in that individually there is no chance of anything changing. We cannot make any direction, direct impact on the rate of progress because they're going. [02:20:52] Speaker B: To keep pushing it. [02:20:52] Speaker A: Yes. [02:20:53] Speaker B: At us and making everything so that we can't survive without using it. I mean, I still, I've never used ChatGPT, I've never used any of the image generators, any of that kind of stuff. You know, I have avoided it completely. But they are increasingly making it like in an office situation, like you do in a corporate situation. You know, probably most of the people that listen to this, that work in some sort of corporate or office environment, there's a degree to which it's expected. [02:21:23] Speaker A: Yeah. [02:21:24] Speaker B: That you're, you're going to be using it. They invested a lot of money into this technology and it is important that. [02:21:31] Speaker A: They reap that return. Of this discussion, the, the bit which I'm. Which has really landed quite badly with me is this realization that, that, you know, a, a, a tool as ubiquitous as Google isn't, isn't, you know, is, is turning on the heat machine. [02:21:54] Speaker B: Right. Yeah. Even when I have not used any actively like AI things I am. [02:21:59] Speaker A: When you have not. Not on purpose, you have not intentionally. [02:22:02] Speaker B: Sought out an AI thing, scrolling past it. But it did it. [02:22:06] Speaker A: Yeah. I don't like that at all. [02:22:09] Speaker B: No, it's awful to realize that it's been made so, you know, and often I'll accidentally click like the AI buttons on my computer or my phone or things like that. Right. Like, I'm never intentionally trying to do it, but sometimes I activate shit by accident and that sort of ubiquity and unavoidableness is kind of important to corporations to recoup what they invested into. Which, as we've seen, isn't doing any of the things that it's. It's promised. You know, everything on the box is lies. And yet here we are. [02:22:50] Speaker A: So like death. Like the climate death of our planet. Like any one of a zillion tragedies that are, like the movie It Follows, simply stalking us. Relentless, unkillable, unavoidable, inevitable. Just add AI to the list. Add AI to the list. Put it in the back. [02:23:18] Speaker B: We've just managed to manifest a whole new way to end ourselves. It's truly incredible. Like we didn't have enough. Enough. Throw that on the pile. [02:23:28] Speaker A: Yeah, yeah, yeah, yeah. All right. Thanks for that. [02:23:33] Speaker B: I'm sorry to go out on such a dismal note. We were having so much fun ranting about movies and things like that. But, you know, this is what we do here, Mark. Like you said, this is our. Our document of. [02:23:43] Speaker A: It's the manifesto. [02:23:45] Speaker B: The end of the world. It's the manifesto. So what are you gonna do? Friends, let us know. I'm just gonna. [02:23:52] Speaker A: I'm going to ask Gemini what people should do. Here we go. Hang on. [02:23:58] Speaker B: Are you being forced into using AI at your work? Have you had it hallucinate something weird at you? Do you think you're a god because ChatGPT told you so? [02:24:10] Speaker A: Crying and hyperventilating. [02:24:15] Speaker B: Give us your secret knowledge and we'll impart this secret knowledge upon you. [02:24:21] Speaker A: Yeah. Stay spooky.
