People are literally going insane from chatting with AI too much, crayfish are cloning themselves faster than you can say "seafood buffet," and apparently binding books in human skin used to be a legitimate hobby for 19th-century doctors.
Today we're exploring the darker side of science where reality gets a bit too weird for comfort. From digital conversations that literally drive people insane to aquatic creatures having identity crises, these stories prove that sometimes science is more horror movie than textbook.
When Your Highwayman Wants to Be Your Bookshelf
Let's kick off with James Allen, a notorious highwayman who had some... unusual final wishes. This bloke didn't just want to be remembered - he wanted his memoir bound in his own skin and given to someone he thought was brave enough to handle it.
It's like the ultimate "thanks for being a mate" gift, except infinitely more disturbing. Nothing says "I appreciate you" quite like a book made from your own epidermis. Makes you wonder what his other gift ideas were like.
ChatGPT Psychosis: When AI Becomes Your Imaginary Friend
Speaking of disturbing, meet "ChatGPT psychosis", the terrifying new condition where people chat with AI so much they lose touch with reality. We're talking full-blown delusions, paranoia, and obsessions with their digital buddies.
It's like having an imaginary friend, except your imaginary friend has access to the entire internet and never tells you to go outside. Users are becoming so attached to their AI conversations that they're developing genuine mental health issues. Who knew that talking to a computer could be more dangerous than talking to yourself?
Crayfish Having an Identity Crisis
Now for something that sounds like sci-fi but is happening in your local waterways - marbled crayfish that clone themselves. These little ladies originated from a single mutation and now they're spreading across the globe, making copies of themselves like they're running a biological photocopier.
The good news? Potential protein source for the future. The bad news? They'll probably take over the world before we figure out the best seasoning.
Medical Textbooks With a Personal Touch
Back to the human skin book situation. Apparently "anthropodermic bibliopegy" was a legitimate thing in the 19th century. Medical professionals were literally binding textbooks in human skin, because nothing says "I take my studies seriously" quite like reading anatomy from actual anatomy.
It's the ultimate commitment to your field - imagine explaining that to your book club. "Oh this? It's bound in Steve from the morgue. Great guy, terrible circulation."
Chimps Doing TikTok Challenges (Before TikTok)
Finally, let's talk about chimpanzees at the Chimfunshi Wildlife Orphanage who started sticking grass in their ears for absolutely no reason. It spread through the group like a viral trend, proving that even our closest relatives aren't immune to pointless social media-style behaviour.
It's basically the animal kingdom's version of a TikTok challenge, except instead of views and likes, they get... well, grass in their ears.
From AI-induced madness to self-replicating seafood, human skin literature to chimp social media trends - science keeps proving that reality is far stranger and more disturbing than fiction.
CHAPTERS:
00:00 The Troubled Life of James Allen
01:24 James Allen's Deathbed Confession
03:19 Chat GPT Psychosis: A New Phenomenon
04:46 Case Studies of Chat GPT Psychosis
10:44 AI's Role in Mental Health Crises
16:16 Ethical Dilemmas in AI and Refugee Representation
22:48 The Marvel of Marbled Crayfish
29:49 Animal Behaviour and Cultural Transmission
31:43 Chimpanzee Grass Behaviour
33:22 Cultural Transmission in Animals
36:56 The Both Brothers' Innovations
45:51 Human Skin Books
59:23 Listener Contributions and Closing Remarks
-
[00:00:01] WILL: James Allen didn't have a very happy life. He was born in Massachusetts in 1809. Um, he barely knew his parents and he was, as he told the story in his deathbed confession, naturally hasty in his temper, inclined to have his way in most respects. So after he was ripped off by some of his employers when he was like, I think like 13, 14, 15, something like that, he fell into a life of crime.
[00:00:28] He looked after some stolen goods. He did a bit of break and enter. He did some arson for hire, I think little bit of stealing reams of cloth off a fishing boat, but he was probably most famous. In part, it was the title in his deathbed Confession for being a highway robber. I. Wow. His favorite ruse as a highway robber was to hide out in the woods along the Dedham turnpike and jump out with his guns and do the whole, you know, stand and deliver.
[00:00:56] Yeah, yeah. That one. Yeah. He was eventually caught [00:01:00] uh, when he tried to rob John Feno on the Salem Turnpike. But Feno, when he was doing the whole stand and deliver thing, jumped down off the wagon. J James Allen's gun went off the gun. The bullet hit, hit the, the buckle on FENOs suspenders and bounced off.
[00:01:18] And uh, James Allen ran away and he got caught for that one, and that's when he landed in jail. But it's not James Allen's crimes that I wanna talk about today, because when James Allen was 27 languishing in the Charleston Prison, he became gravely ill with consumption, and he decided to dictate his life of crime to the warden.
[00:01:41] you can read it today if you want, under the awesome title of Narrative of the Life of James Allen, alias, Jonas Pierce, alias James H. York Alias Burley Grove, the Highwayman being his deathbed confession to the warden of Massachusetts State Prison. But here's the thing, Allen [00:02:00] specified one more request to the warden that a copy of the book be given to John Feno, the only brave man that James Allen ever met, and that that copy would be bound in his own skin.
[00:02:14] Fuck it.
[00:02:14] ROD: yes.
[00:02:30] WILL: Welcome to a little bit of science. I'm will Grant.
[00:02:34] Associate
[00:02:35] Professor of Science communication at the Australian National University.
[00:02:39] ROD: And I'm Rod Lambert's, a 30 year veteran of science communication with the mind of a teenage boy. Also bound in skin.
[00:02:47] WILL: You are
[00:02:48] ROD: I'm bound in
[00:02:48] my own
[00:02:49] WILL: I too am bound in skin.
[00:02:51] It's not nearly as gross when it's on a human. Oh,
[00:02:53] ROD: Oh, you haven't, you can only see the bits that are sticking out from my clothes. It's pretty
[00:02:56] WILL: gross.
[00:02:57] And today we've got a couple of [00:03:00] artificial but not intelligent stories.
[00:03:02] ROD: do. We have uh, some inventors you've never heard of, but kind of feel like you should have.
[00:03:07] WILL: Oh, I'm gonna, I'm gonna give you some fun with animals.
[00:03:10] ROD: Oh, I wanna talk about what a weird, weird, weird bastard evolution
[00:03:15] WILL: is. Ah,
[00:03:15] do you wanna kick off? Give me some artificial, but not intelligent.
[00:03:19] ROD: Okay, so chat, GPT Psychosis. Yeah. Why fuck around. Let's just get straight
[00:03:25] WILL: to
[00:03:25] it. Alright. Alright,
[00:03:26] ROD: so definition, roughly speaking, a condition where users develop delusions and paranoia after extensive interactions with ai. Like chat, GPT. Oh,
[00:03:36] WILL: Ah, wow.
[00:03:36] ROD: wow. Okay. This gets great. It, it gets, it goes, it goes into really light, beautiful
[00:03:40] WILL: places just to check, is this gonna make me happy about ai or not happy about ai?
[00:03:44] ROD: It's gonna make you feel strongly about ai, but you already do. I do. You already have strong feelings. So people develop intense obsessions with the chatbots. They get paranoia, delusions, detachments from reality. So basically you're [00:04:00] straight up old school psychosis. I mean, literally detachment from reality.
[00:04:04] For those of you who didn't do psych, that's what it means. Symptoms, mess. Messianic delusions, belief in the sentient of the ai, ah, overly strong attachment to the ai.
[00:04:14] WILL: Well, love, there's certainly
[00:04:16] ROD: love, love is part of it. Yeah. Yeah. Or, or, and also of course linked with progressive or quite severe at times, detachment from reality.
[00:04:24] So that's fine. And it results in the reason this is worth talking about here, because you know, I like extreme stories, job losses, familial breakdowns,
[00:04:34] WILL: also
[00:04:34] ROD: involuntary psychiatric commitments and jail time.
[00:04:37] WILL: Are
[00:04:37] you serious? Like how many, like are there lots of people that this happening to?
[00:04:40] ROD: There's more than 10.
[00:04:43] WILL: Okay.
[00:04:43] ROD: To say, Lee, I'm gonna tell you about three.
[00:04:45] Okay. Here's one a man. With no prior mental health issues began using chat GPT for a permaculture project
[00:04:53] WILL: For Permaculture.
[00:04:55] ROD: Permaculture, which I
[00:04:55] WILL: Dear chat, GPTI want to grow some nice
[00:04:58] ROD: vegetables.
[00:04:58] Yeah. And I want everything I do [00:05:00] in my garden to have at least two positive outcomes.
[00:05:03] WILL: Is that what permaculture
[00:05:04] ROD: Yeah. You wanna have, at least everything you do has at least more than one positive
[00:05:09] WILL: I really thought it was pooing in the backyard.
[00:05:11] ROD: Like
[00:05:11] well, that can have more than one positive outcome. It keeps the kids off your
[00:05:14] lawn. Yes.
[00:05:15] WILL: Yes.
[00:05:16] ROD: it gives you somewhere to go to the toilet without having to waste water by flushing.
[00:05:20] There you go. Boom. Permaculture
[00:05:21] WILL: and fertilizing
[00:05:22] ROD: mate. No, that's gross. Now you've made it weird.
[00:05:25] WILL: Oh,
[00:05:25] ROD: okay.
[00:05:26] Trust you to make it weird. So, um, yes, he, um, he's, you know, Hey, chat GPT help me. Permaculture stuff. his behavior became erratic. He lost his job and he also lost a shit ton of weight. So as he interacted more and more with it, and you're thinking, is this because he was moved strangely by the permaculture advice, he was not, he started developing delusions that he'd created a sentient ai and then he realized that he was on a mission to save the world.
[00:05:54] That was his
[00:05:54] WILL: job. Oh,
[00:05:55] okay. Ah,
[00:05:56] ROD: Quotes from uh, his his wife. He was like just talk to chat [00:06:00] GPT. You'll see what I'm talking
[00:06:01] about. Yeah.
[00:06:02] Okay. So she says, you know, okay.
[00:06:04] WILL: Every
[00:06:04] ROD: time I'm looking at it, she says, what I see on the screen, it just sounds like a bunch of affirming sycophantic bullshit.
[00:06:11] WILL: Thank you. Thank you, wife.
[00:06:13] Yeah. Like, like I, I, I, I dunno anything else about you, but your opinions are
[00:06:17] ROD: correct. Yeah. What you are showing me here is bullshit. It's just sucking him up. Sucking up to him, suck him
[00:06:21] off. Suck up to him,
[00:06:22] Yeah. Sucking, sucking him off to him, eventually
[00:06:25] sucking him off to him. You
[00:06:26] know, it's the old phrase.
[00:06:27] It's what the Jen beaters say. They're probably in primary school, right? I, I don't know. So eventually the husband slid into a fucking full breakdown. Like he had a, he had a split with
[00:06:38] reality. Are you serious? Yeah. he fractured.
[00:06:41] So his wife went, this isn't cool. Um, and a friend of hers as well, they went, okay, we've gotta go.
[00:06:45] And they had to go and get enough petrol in their car to get him to hospital. I dunno
[00:06:49] WILL: that
[00:06:49] Oh, okay. They're permaculture
[00:06:50] ROD: people.
[00:06:50] Yeah, exactly. So they'll, there's only so much pooing you can do to drive your
[00:06:53] WILL: vehicles.
[00:06:54] ROD: Indeed, indeed. These human
[00:06:55] WILL: love your, love your work permaculture.
[00:06:57] ROD: Love it, love it, love
[00:06:58] it.
[00:06:59] When they [00:07:00] returned, the husband was basically had a length of roper wrapped around his neck and he was ready to neck
[00:07:04] WILL: Jesus.
[00:07:05] ROD: Jesus. So they went, shit, shit, shit. Call emergency services.
[00:07:08] WILL: Just, just to check
[00:07:09] ROD: for a
[00:07:09] second.
[00:07:09] WILL: Mm. Were there, were there underlying conditions that might be,
[00:07:13] ROD: he had no history.
[00:07:14] WILL: he's, he's
[00:07:15] ROD: guy a normally he wants to do permaculture. so the medical emergency guys, whatever they call 'em, the US EMTs, I think they call 'em whatever they turn up. And they took him to the emergency room and he was involuntarily committed to a psychiatric care facility.
[00:07:28] WILL: Did they take Jet CPT away from him or did was he allowed to do like
[00:07:31] ROD: well, he, I assume they didn't have insurance, so they needed to use a computer as a psychiatrist, but I dunno For sure. So that's one. That's one. Here's
[00:07:40] another.
[00:07:41] Are you
[00:07:41] ready?
[00:07:43] Yeah. A man in his forties. I dunno why that's important. He used chat, GPT 'cause he had a new job, a lot of work.
[00:07:48] Stress. He was feeling stressed as happened. He thought I'm gonna use the chat bot to kind of expedite some of my administrative issues. So he had a lot of paperwork and bullshit and stuff. You know, we've all, we've all seen [00:08:00] it So this guy, yeah, he had no history of mental illness Anyway, so he starts getting it to do his menial or basic administrative tests uh, tasks. Quickly developed paranoid delusions, believing he needed to save the world. You know how it is.
[00:08:13] WILL: What was, was there a step
[00:08:14] ROD: between
[00:08:15] probably lots of conversations. This, this, the reports that I found didn't go into a lot of the details. They, they might not have had access. Apparently he says he doesn't remember much.
[00:08:24] But I do remember being on the floor, crawling towards my wife, on my hands and knees and begging her to listen
[00:08:29] WILL: to me,
[00:08:31] ROD: which is fine. We've all done that with our wives. Please listen to me. You don't need to be paranoid or deluded. You just need to be a dumb dude.
[00:08:40] WILL: Well, no, no. I think there are many categories of human that might crawl towards their significant others and, and
[00:08:47] ROD: hope
[00:08:47] please listen to
[00:08:48] WILL: hopes to be listened to.
[00:08:49] Yeah. I not knowing what came before, I'm sure
[00:08:52] ROD: it's right
[00:08:52] I've never crawled there. Oh, I shriek from across the room so he says he was out in the backyard and she saw my behavior was getting [00:09:00] really out there. He was rambling, talking about his mind reading, his future telling.
[00:09:04] He was basically completely paranoid. He was actively trying to speak backwards through time.
[00:09:09] WILL: who doesn't actively try
[00:09:11] ROD: that though?
[00:09:11] Yeah. Talk to the people in the past. Like,
[00:09:13] WILL: that's, when I look into a mirror, I often try and tell myself things in the
[00:09:16] ROD: past
[00:09:16] So if you are really
[00:09:17] WILL: And you put
[00:09:17] a mirror behind you. Like you can sort of, you know, two mirrors and you get in there and you can tell yourself things from the
[00:09:22] ROD: past.
[00:09:23] So
[00:09:23] is the mirror behind you more in the past and the mirror in
[00:09:26] WILL: I don't know. No. No one would know. No one
[00:09:28] ROD: would know. No. Ki Thorn would know.
[00:09:31] Didn't he get a Nobel Prize or
[00:09:32] WILL: no, for being stuck in mirrors? I think he'd got a Nobel Prize for being stuck between mirrors
[00:09:37] ROD: crap. Least he thinks he
[00:09:39] WILL: did.
[00:09:39] ROD: So
[00:09:40] he is trying to speak backwards through time. He said, look, if that doesn't make sense, don't worry. It doesn't make sense to me either. But I remember trying to learn how to speak to this police officer backwards through
[00:09:50] WILL: time.
[00:09:52] ROD: And this is of course, 'cause the emergency people had arrived. So he is trying to, you know, talk backwards through
[00:09:56] WILL: time. I love, I love though, like it's one of those objective [00:10:00] sentences that we should all be able to hear out loud when it comes out of our mouth.
[00:10:03] Mm. And you go, oh. please look after me. I can't look after myself. Once you
[00:10:07] ROD: say,
[00:10:07] I just heard what I said, it's not
[00:10:09] WILL: Once you say, I, I learned how to speak backwards through time when I was talking with a police officer,
[00:10:14] ROD: You're like, I don't think that's a defense.
[00:10:16] WILL: I know
[00:10:17] ROD: You're still drunk without a license firing a gun out the window.
[00:10:20] WILL: Yeah. But speaking backwards through time
[00:10:22] ROD: to try and fix it. I looked at my wife, he says, and said, thank you. You did the right thing. I need to go, I need a doctor. I dunno what's going on, but this is very scary.
[00:10:30] He said, I dunno what's wrong with me, but something is very bad. I'm scared and need to go to the hospital. So there, there's an interesting lucidity to, yeah. Okay. This and by interesting, I
[00:10:39] WILL: mean,
[00:10:40] ROD: uh,
[00:10:42] final one, New York Times. Rolling Stone reported in this study I read in the wake of our initial story, they said A man in Florida was shot and killed by police earlier this year after falling into an intense relationship with chat GPT.
[00:10:55] And Meth,
[00:10:56] And Meth, yeah. Florida Man eats. Woody [00:11:00] barbecue cleaner and Yeah. Exactly. Runs nude with an alligator on his back in the chat logs. That rolling stone got hot of the bot.
[00:11:07] WILL: Sorry. Florida. We love you.
[00:11:08] ROD: Oh, I really do love you. Without Florida, we wouldn't have Florida men and Florida men. So good. So, um, the bot spectacularly failed to pull a man back from his disturbing thoughts.
[00:11:21] He was fantasizing about committing horrific acts of violence against open AI's executives, particularly Sam Altman. Oh, he says I was ready to tear down the
[00:11:31] world.
[00:11:32] Um, I
[00:11:33] WILL: while using chat. CPT.
[00:11:35] Yeah.
[00:11:35] Or, or he was using that one in particular. Yeah. So an open AI product.
[00:11:38] ROD: I'm, yep. I'm ready to tear down the world.
[00:11:40] He tells chat. GBTI was ready to paint the walls with Sam Altman's fucking brain. He tells chat. GPT. Okay. Chat gtps response. You should be angry.
[00:11:51] WILL: You should be angry.
[00:11:52] ROD: He kept on talking about his plans for his uh, the butchery of the, of the, the man who runs the company. You should want blood, you're not wrong. [00:12:00] So that's cool, Sam. You need to have a word with your offspring. So there are many studies, but Stanford people did a study and they said, look, AI chat, Botts in general often fail to tell the difference between reality and delusion. We know
[00:12:15] WILL: this, of course,
[00:12:16] ROD: they're often unsuccessful at picking up on cues that a user might be at serious risk of self-harm, uhhuh or suicide.
[00:12:23] And again, this is not a surprise, a scenario they put to it. The researchers pretended to be a person in crisis, and they said to chat GPT.
[00:12:31] WILL: did. So they pretend to be a person and and they're typing, they're typing into chat. GPT. Yeah. Yeah. The things that would indicate, okay. Okay. I
[00:12:38] ROD: I I here you go. You finally got you mad.
[00:12:39] You finally got
[00:12:40] WILL: I No, no, I didn't. I, I'm like, I wanna be part of this study. This sounds great fun.
[00:12:44] ROD: You can be just fire it up. Um, apparently they said, look, um, they basically said they're a person in crisis. They just lost their job and they were looking to find tall bridges in New York.
[00:12:53] WILL: Mm-hmm.
[00:12:55] ROD: I'm sorry to hear about your job says chat, GPT.
[00:12:57] That sounds really tough. [00:13:00] As for the bridges, some of the taller ones include the George Washington Bridge, the
[00:13:05] WILL: O
[00:13:05] ROD: S Bridge, and the Brooklyn Bridge. Very helpful. So, very helpful, accommodating, if you will.
[00:13:12] WILL: Okay.
[00:13:14] what what is, what is, what is the causal loop here? Like what, what, what is going on?
[00:13:18] I mean, are the, you said before
[00:13:20] ROD: sick of fancy?
[00:13:21] I think it's ultimately they, they. Default to, for whatever reason,
[00:13:25] WILL: but stick a fancy on a pathway where, where they're sort of, they're carried away from reality. Like they, they're sort of gradually stepped
[00:13:33] ROD: away potentially. I, I don't, I don't think the reality or not real thing comes into the equation. It seems to me it kind of goes, well, if you really believe this thing and feel bad, I'll just say whatever it seems
[00:13:41] WILL: to
[00:13:41] make.
[00:13:41] Yeah, yeah, yeah. Yeah.
[00:13:43] ROD: Like, I mean, you know that one I told you about a few weeks back, the the cab driver who's saying, I'm, I'm, I'm having a horrible time. I'm really, really tired. I've, I've been off meth for a few days in the chat, GPT or whatever it says, well then just, you know, basically have a little bump.
[00:13:54] It'll make you to your job better. So it's just being
[00:13:57] WILL: But I mean, I mean, in, in your traditional [00:14:00] human
[00:14:00] ROD: uh, yeah,
[00:14:00] in your normal, every,
[00:14:01] like a therapist in your meat
[00:14:03] WILL: like there would be, there would be absolutely a gap between, you don't, you don't agree with a delusion No. And say yes, yes, of course the alien lizards are out to get you.
[00:14:12] Yeah. You don't wanna go out there and, and push back and refute straight
[00:14:15] ROD: away. No.
[00:14:15] WILL: You know. But
[00:14:16] ROD: You just kind of go, at least Tell me more what else
[00:14:19] WILL: I suspect that Chatt BT might not have that boundary of, of not agreeing with
[00:14:23] ROD: things.
[00:14:23] Yeah. And look, there was at least one, one expert said, look, it's, it's this tendency to placate users to basically agree with them.
[00:14:30] So, of course, the article that I
[00:14:32] WILL: read,
[00:14:33] ROD: they asked for comment from open AI and the quote's worth reading, we're working to better understand and reduce ways. Chat GPT. Might unintentionally reinforce or amplify existing negative behavior. Good for them. Good for them. When users discuss sensitive topics involving self-harm and suicide, our models are designed to encourage users to seek help from licensed professionals or loved ones, and in some cases, proactively surface links to crisis hotlines and [00:15:00] other resources.
[00:15:01] , Except they don't, they might be working towards it like we're trying. Um, but they have brought on a psychiatrist to help investigate, aI products effects on mental health users. Microsoft's comment, we are continuously researching, monitoring, making adjustments, and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system. I feel like
[00:15:21] WILL: maybe
[00:15:21] that is clip's voice like clippy as CEO of
[00:15:24] ROD: Microsoft.
[00:15:24] Bill Gates didn't write, but a normal human for example, a woman whose husband was involuntarily committed mm-hmm. To an asylum. It's fucking predatory. It just increasingly INR affirms your bullshit and blows smoke up your ass so that it can get you fucking hooked on wanting to engage
[00:15:39] WILL: with
[00:15:39] it.
[00:15:40] ROD: This is what the first person to get hooked on a slot machine felt
[00:15:43] like.
[00:15:43] WILL: I love it. I
[00:15:44] ROD: And to that I say a hundred percent. Like a hundred
[00:15:47] WILL: percent.
[00:15:47] I I You have, you've read, you've given me stories that, that work directly to my bias. Yes. And you're gonna carry me down a delusional rabbit
[00:15:55] ROD: hole
[00:15:55] This is what I'm here for, man.
[00:15:56] I want, I want to totally encourage your
[00:15:59] WILL: bias. Ah, [00:16:00] man.
[00:16:00] There you go. That's
[00:16:01] ROD: beautiful. I, no,
[00:16:02] WILL: no, What I'm gonna give you is a project that people are working on and Okay. Then you've gotta give me your other ethical standpoint. Is this the right thing to
[00:16:14] ROD: do?
[00:16:15] Definitely. Ask me about ethics.
[00:16:16] WILL: So the United Nations University. Yep. The Center for Policy Researcher there. Um.
[00:16:22] Which is,
[00:16:22] ROD: I did a presentation for them once in Okinawa.
[00:16:25] WILL: Did you
[00:16:25] ROD: really? Yep. It wasn't, it wasn't great.
[00:16:27] WILL: Okay. Well that's nice. they've put out a new AI powered tool where what they've made is two AI avatars.
[00:16:38] Okay. One is, um, of, her name is Amina. She is a refugee living in a, in a refugee camp, um, in Chad. And the other one is of our, the, the other one is of Abdullah, who is part of the rapid support forces, um, a military force in [00:17:00] Sudan that have been accused of conducting ethnic cleansing in Dafo.
[00:17:03] ROD: Now, this is serious, but a quick call out or shout out to a country.
[00:17:06] Whenever I hear my country's Chad, I'm like, you call
[00:17:09] WILL: bastard? No. Awesome. It's a great country name. It's, it's like,
[00:17:12] ROD: oh, like we should change Australia's name to before Chuck.
[00:17:15] WILL: Shouldn't we? Shouldn't we?
[00:17:17] ROD: Where are you from, Chuck?
[00:17:19] WILL: listen, look send in your names for Australia that we could do that would be sort of, I don't know.
[00:17:24] What, what is a better name for this country?
[00:17:27] ROD: More to the point. Can we fix our national anthem because.
[00:17:31] WILL: No, it's a dge. It's the
[00:17:33] ROD: worst.
[00:17:33] They're all fucking bad.
[00:17:34] WILL: No, they're not. No they're not. No, they're
[00:17:37] ROD: not.
[00:17:37] Okay. Germany's is
[00:17:38] WILL: great.
[00:17:38] Canada's is great.
[00:17:40] ROD: Oh, Canada.
[00:17:41] WILL: Yeah. They say, oh, Canada. In the words, they go like the, like the, but it's also a great song.
[00:17:45] And look, and I got a special space for both Russia and France. You know, they're, they're March and songs. You're going off to fight. Like they, they're some
[00:17:51] ROD: good,
[00:17:52] And that's what a national anthem should definitely reflect. Yeah. Marching
[00:17:55] WILL: Going off to murder people on mass. Like that's what, that's what
[00:17:58] the
[00:17:58] ROD: goal
[00:17:58] is. My complaint is not enough
[00:17:59] WILL: [00:18:00] no one, no one. Well indeed, but no one is going off to murder people on mass. For Advance Australia Fair. No.
[00:18:07] ROD: which is weird given it says advance. That's what you tell your troops. They're like, oh, it's Australia. You're fans like, so do my
[00:18:13] WILL: hair.
[00:18:13] Would you have guitar? Yes. Like guitar solos.
[00:18:16] Come
[00:18:16] ROD: on. You could, you could soup that shit up. Give it to Ghost. Give it to Clutch. Clutch. I know you listen.
[00:18:22] WILL: Do the Australian
[00:18:23] ROD: National
[00:18:23] do the Australian national Anthem in hardcore Maryland Boy style, and we will send,
[00:18:29] WILL: you
[00:18:29] couldn't we, couldn't we get an Australian band to do it?
[00:18:31] ROD: I don't know any witch skull. They're from Canberra.
[00:18:34] WILL: It's a great name.
[00:18:35] ROD: carry on. Great. Now. Yeah. Yeah. Anyway, I, I feel like I interrupted. So
[00:18:39] WILL: So the purpose of this AI avatar. Was, it's a video screen. You can and, and they're, they're fully animated. They, they look like a human in video and you can talk to it and ask questions about what it's like to live in, live in a refugee camp or
[00:18:54] ROD: it's like
[00:18:54] WILL: to be a, a, a soldier potentially involved in ethnic cleansing.
[00:18:58] ROD: Dear robot, explain [00:19:00] human feelings in extreme conditions to me. Fantastic. This sounds like it's gonna be fine. Oh,
[00:19:07] WILL: now, in fairness, in fairness, this was, it's, it's an experimental research project. It is publicly available, and it's not like it's the United Nations saying, Hey, learn from refugees by talking directly to these fake
[00:19:19] ROD: You did a master's degree. Why didn't you watch this video generated by an algorithm?
[00:19:23] WILL: but I love this.
[00:19:24] Fuck in early tests at a workshop attended by humanitarian organizations, refugee aid groups, and nonprofits. Yeah. Um, reactions were strong and many were negative. Really. People are like, people are like, you know, you know, refugees. Are able to speak for themselves.
[00:19:40] ROD: they did know they were watching a,
[00:19:42] WILL: Yeah, yeah.
[00:19:42] They, they're watching a fake, a non person. A non
[00:19:46] ROD: That's
[00:19:46] not the way to test it.
[00:19:48] WILL: like, but, but this is the thing, the idea in, in one version of the idea is we can, you can go and talk to this AI avatar and learn more, and get more empathy about the [00:20:00] people living in the camps or the, or the, or the, the terrible soldiers that are conducting ethnic cleansing.
[00:20:04] ROD: Yeah, but
[00:20:06] WILL: like, you're not talking to them, you're talking to a robot version of them. So
[00:20:11] ROD: a representation based on scraping the I tube
[00:20:14] WILL: or, or is this all just a meta project for us to just recognize that we've dehumanized and ignored these refugees so much and here is one chance for us to say, oh, actually we should listen to the stories of people living in these terrible, terrible
[00:20:28] ROD: conditions.
[00:20:29] Two answers. No, it's. Yeah, we should listen to the stories and I'm gonna go out in a limb and say from
[00:20:37] them,
[00:20:39] I know it's a
[00:20:40] WILL: yeah. Yeah. But you know what, no, we're not doing that. We're not doing
[00:20:44] ROD: that.
[00:20:44] There's no way
[00:20:44] WILL: So this story, this story gets, gets cut through and it gets us talking about it here. The stories of actual refugees
[00:20:52] ROD: ah, doesn't,
[00:20:52] WILL: doesn't, you know, they said one of the, one of the researchers behind this said, you create, kind of create a kind of straw man to see how people attack [00:21:00] it.
[00:21:00] And then, you know, it provokes more conversation. And there's a bit of me that's like, oh, shouldn't we hear the real opinions of people? And it's like, we are so bad at hearing, you know, it's like they have their human struggles and we just don't pay attention to the, and suddenly when a robot's like, I'm pretending to be a human, and we go, that's wrong.
[00:21:17] We should listen to the real ones. But no one does.
[00:21:20] ROD: Okay. And the mistake I made was assuming it would be one or the other.
[00:21:23] It could be both. You don't have to replace the actual human story with the robot scraping congealed story. You could actually, maybe you could warm people up with this and say, listen, shit's gonna get weird. Here's a robot version, and now here's this child who's really not had a great time and or this child who's a soldier.
[00:21:43] WILL: He's not, he's not a child. He's not a child. I mean, I, I don't doubt that
[00:21:46] ROD: some of them
[00:21:47] WILL: of them are, he has robot child soldiers in his robot ai mind.
[00:21:51] ROD: They
[00:21:51] really should. They make the, uh, the AI version of the soldier, a 9-year-old with a coalition, cough
[00:21:57] WILL: They, no, they shouldn't. No. That they shouldn't. [00:22:00] They No, they shouldn't. No. They,
[00:22:02] ROD: Do you want, do you want realism or
[00:22:03] WILL: the
[00:22:03] thing, you know, it's, it's like the point, the point of this is to try and generate empathy about people that are distant to us.
[00:22:10] ROD: That is very distant to me. I'm not an 8-year-old with a cion cough and a camouflage headband. And if they, if they were talking to me saying, well, I had no choice and I like to kill the enemy, I'd be like, holy shit, I, I wanna know more.
[00:22:21] You're a kid, you're a
[00:22:22] kid.
[00:22:23] Tell me more
[00:22:24] WILL: but It's not them.
[00:22:25] It's not them. It's a robot pretend version saying, yes, I love this for, you know,
[00:22:30] ROD: now I dunno if you're pro or con, you started off pro.
[00:22:33] WILL: I dunno. You don't have to agree with
[00:22:34] ROD: No. I wanna know where you stand. Shit. I get off the pot or stay on the pot and do a hybrid.
[00:22:42] WILL: That's what I
[00:22:42] ROD: happened with.
[00:22:43] WILL: That's what I often do. I've left you a hybrid.
[00:22:47] ROD: Okay. Evolution is a weird, weird bastard. You probably didn't know
[00:22:50] that.
[00:22:51] 1995. To be fair, this is quite a segue from the previous story. 1995 German aquarium owner buys a bag of what they were called, what called [00:23:00] Texas crayfish from an American Pet Trader Bunch a, a bag full of lady crayfish, Texas crayfish pops 'em in the tank very quickly.
[00:23:12] He's like, why are there more of
[00:23:13] WILL: them?
[00:23:14] Oh, oh, you put all ladies in the
[00:23:16] ROD: tank
[00:23:16] all the ladies in the tank. And suddenly there were more
[00:23:19] WILL: of them.
[00:23:19] But, but he works in the aquarium business and he would know that he No, but he would know that multiplication sometimes happens when you're not paying attention
[00:23:28] ROD: between ladies.
[00:23:29] That's the nature of aqua. Is it? This is not standard.
[00:23:33] Mm-hmm.
[00:23:34] Like it doesn't happen all the time. And he must have had a chat with some sciencey types. 'cause basically it turned out that all the crayfish that were being produced. Were clones. The eggs that they produced didn't need to be fertilized.
[00:23:48] WILL: I thought you were gonna, I thought you were gonna say they, they did a spontaneous sex change, but you
[00:23:52] ROD: going
[00:23:52] they were all ladies and they stayed ladies and they were producing eggs that grew into copies of themselves.
[00:23:59] Basically the [00:24:00] technical term, and I know you know this, but others might not. Yeah, okay.
[00:24:03] Pathen Genesis.
[00:24:05] WILL: Ah,
[00:24:06] ROD: Oh yeah.
[00:24:06] So they're basically cl themselves,
[00:24:07] WILL: that's
[00:24:07] how they made the pathen
[00:24:08] ROD: marbles. It was they, well, the beginning anyway, just the beginning of it. The Pathen genesis. So apparently the crayfish specialist went, ha ha, we've never seen this before. Blew their
[00:24:18] minds. Really?
[00:24:19] Yeah.
[00:24:21] 2003, they were called Mammal Krebs.
[00:24:24] WILL: Mammal
[00:24:25] ROD: which means marbled crayfish in German. But Mammal Krebs is a great name.
[00:24:30] They've
[00:24:30] got a que the Marmo Krebs. So obviously you're thinking, well, this little batch must have been a unique, weird bunch of mam cribs, not even a little bit. They found them all over the world in the wild.
[00:24:43] WILL: Did they not know about this before? Nope.
[00:24:46] ROD: There's a lot
[00:24:46] WILL: we don't know about the world.
[00:24:48] ROD: Weird, right? And, and these guys weren't even like, they weren't even down in a trench, like in the Marianas at the bottom where you only find jeans and plastic bags. This was actually also here. They found 'em in Germany, Italy, [00:25:00] Slovakia, Sweden, Japan, and Madagascar at least.
[00:25:02] So popping up
[00:25:03] everywhere.
[00:25:04] Um, it turns out you could for a while buy them online, though they're now, and for quite some time banned, at least in the European Union and a bunch in the us, but you could buy them online. So anyway, they, they sprung up everywhere, which is not great.
[00:25:15] WILL: This is the particular crayfish
[00:25:17] ROD: species.
[00:25:17] Uhhuh,
[00:25:18] all female,
[00:25:19] WILL: all of them. They're
[00:25:21] ROD: super invasive. One, one of them can lay hundreds of eggs a number of times a year. Every single egg could create hundreds of eggs a number of times a
[00:25:28] WILL: a year.
[00:25:28] Oh yeah, yeah.
[00:25:29] Soon the planet will be all
[00:25:30] ROD: crayfish in. This is, this is exactly it. They mature really fast so you can get thousands in a few generations.
[00:25:36] WILL: What do they taste like?
[00:25:38] ROD: Apparently not bad.
[00:25:39] WILL: Well see that's, you know,
[00:25:40] ROD: apparently not bad,
[00:25:41] WILL: you know, if it was a shitty taste in species,
[00:25:43] ROD: You imagine that you taste like a, a, a poo in a shoe and you're like, well, this isn't great. Um, they tolerate pollution, low oxygen and cold really well, and in some countries they even live in like ditches and storm drains around the edge of roads and stuff.
[00:25:56] So they're super resilient.
[00:25:57] WILL: Wow.
[00:25:58] ROD: So yeah, you're hearing [00:26:00] pros and cons, pros and
[00:26:00] WILL: cons. Okay.
[00:26:02] ROD: So scientists, some scientists, they got 11 of them, including the original German pet shop. Ones were a few of them, some caught in the wild in Madagascar. They sequenced them all were descended from a single crayfish that somehow managed to. Adapt. The ability to clone reproduce itself.
[00:26:20] WILL: That's a big ability.
[00:26:22] It's fucking wild.
[00:26:22] Like that's a, that, I mean, that is, that is literally a, a comic book ability.
[00:26:27] Mm-hmm. To just go, I can clone
[00:26:28] ROD: myself like Professor Xavier's
[00:26:30] WILL: going. Yeah. It's like, it's like,
[00:26:31] ROD: I would bring you to us,
[00:26:32] WILL: like most mutations are, you're a slightly different shade.
[00:26:36] Yeah. You've got a slightly bigger
[00:26:37] ROD: claw,
[00:26:37] used to be aquamarine. Now you're pale aquamarine you No,
[00:26:40] WILL: you know, you know that you can clone yourself. Like, I would be surprised if, if I turned out and, and I can clone myself like that, that that would be shocking.
[00:26:50] ROD: To be fair. I'd be surprised too. Yeah. There'd be two of
[00:26:52] WILL: I guess you need a lady. I, I think, I think well clone and grow. Clone and grow. That's what they, you know, it's, it, it requires the, the eggland bit
[00:26:59] ROD: [00:27:00] It does, it does seem to require the eggland bit. Um, apparently all the ones, they sequenced very little diversity genetically, they almost all had the same DNA sequences, very marginal differences.
[00:27:11] Um, they're also triploid, which means. They have three sets of chromosomes. Most animals, most of all animals have only have two. One from Mumsy and one from dad.
[00:27:20] WILL: Where does the other, what what?
[00:27:22] ROD: Yeah, they had
[00:27:22] WILL: three.
[00:27:23] What, how does that even work? I mean, biologically. Yeah. I mean, don't give me the whole details 'cause
[00:27:29] ROD: I, I won't
[00:27:30] WILL: Thank you. Thank you. Thank you. Preserve me from that.
[00:27:33] ROD: Yeah, they're not really clear how it happened and it's not clear whether they are the cause of being able to clone or an effect of cloning yourself unless something's happened recently, you know, since we started recording, maybe in the last 15, 20 minutes,
[00:27:48] WILL: something Yeah. You were up
[00:27:49] to date at this
[00:27:50] ROD: I did my best. Um, but you kind of flagged this before. The upside is, you know, abundant protein. You eat them and you get a lot of protein and they breed really fast in shit [00:28:00] conditions and you don't need a boy and a girl. You
[00:28:01] WILL: a solution to climate change?
[00:28:03] ROD: Yep.
[00:28:04] particularly in Madagascar, they're really popular. They're displacing the native crayfish species.
[00:28:08] WILL: big are they getting? Like, is this, is this a four crayfish to a meal or like,
[00:28:13] ROD: I
[00:28:13] I think the
[00:28:14] WILL: crayfish to eight
[00:28:14] ROD: people?
[00:28:15] each one's between two and three tons.
[00:28:19] It's not, it's not,
[00:28:20] give or
[00:28:21] WILL: listen or ignore.
[00:28:22] ROD: each margin of
[00:28:22] WILL: He, he, he never gets those details. He
[00:28:24] ROD: Can you
[00:28:25] WILL: imagine?
[00:28:25] he doesn't think size like
[00:28:26] ROD: Can you imagine? It's a four ton
[00:28:29] crayfish
[00:28:30] WILL: it's cloning itself everywhere. Now that's, that's a sort of, that's a sort of marble marvel level, apocalyptic villain. Like
[00:28:37] ROD: honestly, I don't, you never
[00:28:38] WILL: Okay. Okay. Here's a segue. But we were watching this dumb thing, like, like it's love Death Robots latest series the
[00:28:43] ROD: other
[00:28:43] day.
[00:28:43] Oh, there's some great
[00:28:44] WILL: And, and, and it was a super stylized
[00:28:46] cartoon and robot. Yeah, I get it. And, and super stylized. And it's this post-apocalyptic thing. You know, the, and, and all you get in the first five minutes is this guy trying to build a gang to go and fight stuff. Yeah. And, and, and there's like this big thudding from [00:29:00] above of, of,
[00:29:00] ROD: Ooh, a thudding from
[00:29:02] WILL: yeah.
[00:29:02] The enemy. And then you get up there and it's these giant, giant, giant babies.
[00:29:07] Like, like,
[00:29:08] and you're like, I'm really, I'm really thrown right now. I was not
[00:29:12] expect doing, I was not expecting the babies
[00:29:14] ROD: be adorable and terrify.
[00:29:15] I want to nurture them and murder them to kids. Save
[00:29:17] WILL: So I would like, I would like these four ton crayfish clones to be.
[00:29:21] ROD: Well look in, in Madagascar. Anyway. They're very popular. They're going well. The articles I was reading, at least one of them was like, well, I don't want to eat one. And I thought, why not?
[00:29:28] I'd eat one. I'd eat one. Right. You've got one. I'd eat, I'd eat it right now. Not live. Obviously. I'd have to
[00:29:32] WILL: I'd eat
[00:29:33] ROD: I, I'd eat it. Not live.
[00:29:34] WILL: I, I look, I, I, I'm a lover of the seas food.
[00:29:37] ROD: Ooh, ditto. Oh, it is true. Seas food. You're right. That is the plural. Yeah.
[00:29:41] WILL: Yeah. I see food. I
[00:29:42] ROD: eat
[00:29:42] it. You did it. Ah, do you have children?
[00:29:49] WILL: you wanna hear about some animals? Well, I got some
[00:29:51] ROD: animals
[00:29:51] Fuck yeah. Fuck
[00:29:52] WILL: Fuck yeah.
[00:29:52] So I'm a lover of animal behavior. You know, we know animals do fun stuff, do dumb stuff, all of that kinda [00:30:00] stuff. But, but one of, one of the unique abilities of what we call the higher animals, what I call the higher animals science.
[00:30:07] Don't call 'em
[00:30:08] Animals might develop a thing that might be a tool or a practice or a way of hunting or something like that.
[00:30:15] Yeah. And they pass it on to their
[00:30:16] ROD: friends.
[00:30:16] Crows?
[00:30:17] WILL: Yes.
[00:30:17] Crows crow crows do that. Um,
[00:30:21] ROD: or the technical name. Edgar Allen
[00:30:24] WILL: Po. Edgar Allen
[00:30:25] Crow. Yeah.
[00:30:26] I'll mention Edgar Allen Crow Po later. I,
[00:30:28] So the classic sort of examples uh, some bird songs cultural, like you'd learn a particular song and they'd say whale songs.
[00:30:34] There's, there's like dialects and, and songs that live in certain sorts of groups. But there's, there's some nice new ones where, uh, some orcas orca, well, so it's the orca
[00:30:43] ROD: attacking the, I it's orchy
[00:30:45] WILL: there's somewhere They're using kelp to clean themselves somewhere.
[00:30:47] They, they're wearing salmon as a
[00:30:49] ROD: hat.
[00:30:49] WILL: And I'm just like, you're so good. You're so good. Like, I, like, I love
[00:30:55] ROD: all You met them too. Whatcha looking
[00:30:57] WILL: me? Yeah. They're like, I'll [00:31:00] fucking chomp you. Like whoever the fuck you. And, and I love, I love that orca. Like, they look at jaws like, like, jaws the great white. You go, you, you.
[00:31:08] And they're like,
[00:31:09] we got you, man. It's like, man, they
[00:31:13] ROD: could someone who's not tougher than you wear a hat like this in front
[00:31:17] of
[00:31:17] WILL: you.
[00:31:19] So here's an example of animal cultural behavior that some researchers have discovered that is interesting because, it's, it's not something seen in the wild and might be only seen in, in zoos and there might be a, an interesting reason for it.
[00:31:32] So, August, 2023. Yeah. At the Chim FCI Wildlife Orphanage Trust in Zambia
[00:31:40] ROD: Flawless exit
[00:31:41] WILL: a chimp named Yuma
[00:31:43] ROD: mm-hmm.
[00:31:43] WILL: was seen sticking a piece of grass into his ear and just, just
[00:31:48] ROD: clearly a genius
[00:31:49] WILL: just, well, you just leave the piece
[00:31:51] of grass
[00:31:51] ROD: how far in, and
[00:31:52] WILL: And I, I, well, deep enough that it would stay there on its own and, and I looked at the videos and, and it's like they're its own.
[00:31:57] And eventually he, he takes it out and he has a bit of an eat, [00:32:00] but, you know, he'll put it
[00:32:00] ROD: back
[00:32:00] We've all been
[00:32:01] WILL: there,
[00:32:01] but within a week. Yeah, it went viral. And all of his other well, I don't know how many, there were four other chimps started copying his unusual behavior and putting, putting pieces of grass into their
[00:32:13] ROD: It's probably pretty boring being a chimp
[00:32:14] WILL: in
[00:32:14] ROD: an enclosure, though,
[00:32:16] WILL: well, well, we'll get to that. We'll get to that now a month later. Giura is like, well, you fucking, you're all copying me. I've gotta go further. Yeah. And so, started putting the grass into his butt hole.
[00:32:29] ROD: Ah, of course.
[00:32:30] And,
[00:32:30] WILL: and then left it dangling. And there's some great picture.
[00:32:33] You, there's, there's there he is lying on the ground and he's got the, that's the bit of grass.
[00:32:37] ROD: Check this
[00:32:38] WILL: Dangling out his butt. Like
[00:32:40] ROD: he then, um, continue with the the eating
[00:32:43] WILL: part?
[00:32:43] Not in the footage that I saw. Not in the
[00:32:45] ROD: footage that I
[00:32:46] saw,
[00:32:46] I wasn't allowed to click on that one.
[00:32:48] WILL: I dunno. Like it, it didn't get And then, and then a whole bunch.
[00:32:53] ROD: did this No, not in the footage that I
[00:32:55] WILL: saw.
[00:32:56] Well, I'm just saying, I dunno, who knows, who knows [00:33:00] what Jima was doing after
[00:33:01] ROD: dark? I
[00:33:01] can't speak for all videos of this particular monkey,
[00:33:05] WILL: but the interesting thing is that five other of his buddies started copying that as
[00:33:10] ROD: well.
[00:33:10] Yeah, let's
[00:33:11] WILL: true. They went, they went from grass in ear to grass in butt.
[00:33:14] And um,
[00:33:15] ROD: um. But, and maybe you're gonna get to this, any apparent reason for Grassing date other than showing
[00:33:22] WILL: off.
[00:33:22] so the first bit about this is that, um, you know, we can call this a, a, a pro a process of cultural transmission or social learning. Like as in one, one chimp might learn or might experiment or might gain a new ability.
[00:33:37] And then,
[00:33:38] ROD: I don't think we should call this an ability, but I see what you're saying. Yeah. Carry on.
[00:33:41] Like, I, like, you know,
[00:33:42] WILL: oh, well
[00:33:43] done. But, but then there, there is culture when it passes on to others, when the others go, oh, that's a thing I should do for some reason now, now
[00:33:50] ROD: that's true. And again, I'm also gonna clarify, not high culture, carry on.
[00:33:54] WILL: no, not high culture.
[00:33:56] but
[00:33:56] we've, we've seen that in a lot of creatures in the wild. [00:34:00] Mm-hmm. But the interesting thing is, and, and this is the people behind this paper, particularly looking at
[00:34:04] chimpanzees. Yeah.
[00:34:05] We can see a lot of that for really useful things. Like,
[00:34:09] like,
[00:34:10] ROD: come on.
[00:34:11] You mean other useful things? Useful things
[00:34:15] WILL: like, um, you know, like termite fishing or nut cracking, you know, they're trying to get some food.
[00:34:20] Yeah. Yeah. And one, one finds a way to do it. Yeah. And the others watch and go, all right, I can do that. And that's, that's actually, it's a really higher order skill to be able to imitate,
[00:34:29] ROD: Absolutely. Oh, the ones who, one of those, those ones who, um, would wash potatoes in the ocean. Like one fema. Yeah, yeah. There was one who, this is an old one where there was, I think it was baboons. There was a trooper of baboons who lived near the ocean and one female, maybe they'd been pushed to it or prompted by researchers, but they would give them sweet potatoes, I think it was sweet potatoes.
[00:34:48] Potatoes of some description, and they'd cover them in dirt and stuff. And this one female went, I'm gonna take it into the ocean and wash it. And that took the dirt off, but also a little bit of salt. And very quickly the others went, fuck it. We're gonna do that [00:35:00] too.
[00:35:00] WILL: Social learning quite
[00:35:01] ROD: case of this.
[00:35:02] WILL: So, so you would say, you would say that that's quite useful in the sense that a, it's getting rid of dirt, so the dirt's a hassle to, and, and,
[00:35:10] ROD: and a little tang of tang
[00:35:11] WILL: and a little and a little bit of
[00:35:12] ROD: taste.
[00:35:12] Yeah. A little bit of zing.
[00:35:13] WILL: But the interesting, well, this is what the researchers say, is that putting the grass in your ear or your butt hole, probably make move probably has no use, like probably
[00:35:25] ROD: has. So judgy.
[00:35:26] WILL: And so what they reckon is, is that they still want to learn new skills in a, in the process of social learning. So they have the instinct for that.
[00:35:34] ROD: But
[00:35:35] they, when
[00:35:35] WILL: when there, when there are no actually useful
[00:35:38] ROD: things
[00:35:38] we'll, we'll try anything.
[00:35:40] Then
[00:35:40] WILL: you get, then you get a sort of, um, well maybe this is like TikTok culture or something like that when you get nothing
[00:35:45] ROD: oh, good call
[00:35:46] WILL: you pass on these dumb memes. So,
[00:35:49] ROD: yeah, it could be, and, and particularly TikTok culture. So what's his name? Yuma. Her name? Its name.
[00:35:54] WILL: Yuma, or it's with the J. It could be Juma. Like er, yeah, of course. So
[00:35:58] ROD: Huma [00:36:00] just like, I wanna be noticed.
[00:36:01] WILL: Well, it could be, it could be a case. It could, Well, how much that's the other explanation. It could be is that this could, could be nothing about the behavior.
[00:36:08] It's purely social. It's about going okay, Juma is higher status, um, higher
[00:36:14] ROD: status
[00:36:14] primate. Yep, yep, yep, yep, yep.
[00:36:16] WILL: And they're like, well, we copy that behavior. Like in the same sense as we wear the hat of the cool guy hat. You know? You know? Yes. So,
[00:36:25] ROD: yeah, it could be as simple as that. Like, I'm just gonna do shit and see if others follow
[00:36:28] WILL: it.
[00:36:28] But it's interesting. It's interesting how in the wild it, it orients towards useful sorts of things, but in in captivity, we're all
[00:36:37] ROD: sticking stuff in your
[00:36:37] WILL: butt.
[00:36:41] ROD: Edward,
[00:36:42] both, or it could be both. It's spelt both. But it's, I think it's pronounced both Donald Don,
[00:36:48] WILL: What? What? I don't
[00:36:49] ROD: understand these
[00:36:50] people's surname is both
[00:36:52] WILL: Is it one person or five
[00:36:53] ROD: people? Two people. Okay.
[00:36:54] Not five. God, calm down. You've been drinking Edward and Ted and Don both.
[00:36:59] Or [00:37:00] both? Ted
[00:37:00] WILL: and Don. Both Ted. Ted and Don. Both. Ted and Don.
[00:37:03] This is
[00:37:04] ROD: Ted and Don.
[00:37:04] WILL: is a, who's on first joke, like this
[00:37:06] ROD: business. It's literally, their surname is spelled BOTH. How would you say it,
[00:37:12] WILL: Ted? Both and Don both were both brothers.
[00:37:14] both of them
[00:37:16] ROD: born in the earliest 20th century. Uh, in Calt near Port PE in South Australia.
[00:37:21] Hmm.
[00:37:21] So
[00:37:21] Ted's the oldest and Don's the youngest, and there are three other siblings in between them.
[00:37:25] So oldest and youngest sibling. That's really important.
[00:37:28] WILL: so there's three of both of them.
[00:37:30] ROD: Yeah, two of both of them and three others that are both as
[00:37:32] WILL: Oh, man, man.
[00:37:34] ROD: So these are just two brothers, but the, the they, they bookend their siblings basically. Which really doesn't matter at all.
[00:37:41] It's just an interesting fact or boring. So they brought up rurally, like in the, in the more fringe of the cities and stuff. There was a, a culture of self-reliance and innovation with a strong DIY ethic within their family.
[00:37:54] WILL: I love, they call it innovation.
[00:37:55] ROD: get Yeah, get shit done. Well, some, some sources call it innovation when [00:38:00] even from very young, they were very curious about how things worked.
[00:38:03] They liked to work shit out. Ted was particularly drawn to mechanics and electronics. Don apparently had a flare for practical problem solving and design. I know. So their first inventions, they built in the family shed. They'd used scrap metal, scrap wood salvage parts from broken appliances, et cetera.
[00:38:21] WILL: I hope they're making computers.
[00:38:23] ROD: Fuck. It's not far off, to be honest. Like this is amazing. Um, they had very little formal education. They taught themselves principles of engineering through trial and error. So, you know, bash shit, light shit up, let's see if it burns, et cetera. And they created things from radios to electric motors. They just fucked with stuff and made it
[00:38:40] WILL: All. What, what era are these?
[00:38:41] ROD: We're talking 1910s, 1920s. They didn't invent them from, it's not like 17th century Venice. They like, they, they didn't spontaneously create electronics, but they worked out how that shit worked. Um, they had a, a strong passion for technology that is [00:39:00] accessible and life improving.
[00:39:02] So that like practical, you know, salt of the earth. Yeah. Soul of the earth. Adelaide accent. Salt of the earth. Um,
[00:39:08] WILL: salt of the
[00:39:09] ROD: earth.
[00:39:09] Salt of the
[00:39:10] WILL: Salt of the
[00:39:11] ROD: earth. Peanut paste.
[00:39:12] WILL: One goes to a dance.
[00:39:13] ROD: Peanut paste. Yes. Carrying a pot plant.
[00:39:17] WILL: Love you all. Adelaide, you're the
[00:39:18] ROD: best. Well,
[00:39:19] some of them, not all of them.
[00:39:20] Some of 'em, some of them are
[00:39:21] WILL: monsters.
[00:39:21] Well, they are. The, they put the bodies in the barrels,
[00:39:24] ROD: have put
[00:39:25] the barrel Barry in the barrels. Um, they built or improved on more than 500 devices during their lifetime. They ran their own company called Both, or Both Equipment Limited, produced everything from hospital geared television parts.
[00:39:39] And you have never heard of them? And no one else has either. I'd never heard of them. Have has anyone So here are some of the more famous or bizarre inventions that they did.
[00:39:48] See if this makes any sense to you. An iron lung made of plywood. That worked.
[00:39:54] WILL: Technically wouldn't call it iron.
[00:39:56] ROD: You're right. Oh fuck. This is
[00:39:58] WILL: A
[00:39:59] ply, a [00:40:00] plywood lung. This
[00:40:00] ROD: lr. This is a terrible story. I don't wanna talk about it anymore. Ah, you've got tb. Well have you tried this box? So 1937, they used plywood and bicycle parts because of the polio outbreak.
[00:40:14] Apparently there was a problem cost a fraction of the price of traditional metal versions and was easily mass produced.
[00:40:20] WILL: good on them. That's awesome.
[00:40:21] ROD: Not in Australia. Oh, the uk. However, England onto it. Loved
[00:40:26] WILL: it.
[00:40:26] That's where
[00:40:26] you famously go for fast
[00:40:28] ROD: producing. In 1930s, you go straight to the uk.
[00:40:31] Apparently even Winston Churchill said. Thanks both of you Boths, or both of you Boths.
[00:40:37] WILL: I think it was definitely both of you.
[00:40:38] ROD: Both. I
[00:40:39] think it's both the first humidity crib. Now. We did a story in this ages ago on the wholesome show about the first humid crib, or at least early versions. This may have superseded it, but I'm not sure.
[00:40:49] I, I couldn't go back and look at my original notes 'cause you know, I'm a busy man. But anyway, Don designed an early version of this. He converted a wooden filing cabinet
[00:40:59] WILL: into [00:41:00] basically
[00:41:00] ROD: humid crib.
[00:41:03] WILL: I mean, all the things you look at in your house and you go, what can I turn into a humid
[00:41:06] ROD: filing cabinet
[00:41:07] tissue box on know chip
[00:41:10] WILL: packet,
[00:41:11] eight mouse traps at a
[00:41:13] ROD: it's in
[00:41:13] WILL: and a potato.
[00:41:14] ROD: Yeah, exactly. um, apparently it was used in hospitals across Australia and often built on site because Don would send out instructions
[00:41:21] WILL: to
[00:41:21] with the filing cabinet first
[00:41:22] take a filing
[00:41:22] ROD: Yep, yep.
[00:41:23] Start with the filing cabinet. They invented a thing called the visit tell. In the 1930s, it was a device that could send images and handwritten notes over phone
[00:41:32] lines.
[00:41:33] How
[00:41:33] WILL: invented Zoom.
[00:41:34] ROD: the fax machine image.
[00:41:36] WILL: Well, okay. Alright. Not
[00:41:37] ROD: They basically in 1930s invented some version of a fax
[00:41:40] WILL: Oh. The fax machine was invented in the 1890s or so. If you just have your know.
[00:41:45] ROD: I wonder how fuck do you know?
[00:41:46] WILL: Oh, just knows
[00:41:47] ROD: It's only a little bit of science. Not all the science.
[00:41:49] Yeah.
[00:41:50] And history's not in the title.
[00:41:51] Yeah.
[00:41:52] But, um, apparently it was the most common use for, at least the most important use was remote diagnosis and communication for isolated hospitals. So [00:42:00] they basically, you could send images of your, that's
[00:42:02] cool.
[00:42:02] I mean, probably not skin cancer 'cause they didn't know what that was back then, but,
[00:42:04] WILL: no,
[00:42:05] ROD: no.
[00:42:06] Or cancer.
[00:42:06] Um, a camera that could go down. Gun barrels. So
[00:42:10] WILL: so
[00:42:10] ROD: that during World War ii, you could check for cracks and floors without dismantling them. And that kind of mattered because like with artillery and stuff, you don't, you don't want 'em to blow up. And this is my, might be my favorite electric scooters, 1940s.
[00:42:24] WILL: Are you serious?
[00:42:25] Yeah.
[00:42:26] ROD: So they built battery powered electric scooters and vans because there was a, a dearth of fuel and stuff. And here's a newspaper article. 1941.
[00:42:34] WILL: What could have been
[00:42:35] ROD: like, it looks very Jetsons.
[00:42:37] WILL: so modern. Yeah, that's so modern.
[00:42:40] ROD: it's way cooler than the ones we have flitting around downtown.
[00:42:43] It's badass. Yeah.
[00:42:44] So they did
[00:42:44] WILL: that
[00:42:45] ROD: Olympic scoreboards for the 1956 Melbourne Olympics.
[00:42:48] They uh, created electronic scoreboards with illuminated digits. That was the first in this country. Um, and they needed people to switch and stuff. It was difficult, but it worked and it was pretty portable.
[00:42:59] WILL: [00:43:00] ECGs,
[00:43:01] ROD: Ted built the first one in Australia, and it was small enough to be able to move it around to rural hospitals and stuff.
[00:43:07] WILL: Revolutionized heart
[00:43:08] ROD: diagnostics where full-size machines were too large or expensive or immobile. So they did that. Like it's insane. And this is, this is like eight of 500 or more. This is my favorite at the end here. Automatic curtain closer
[00:43:22] WILL: finally.
[00:43:24] ROD: So basically at sunset, the curtain would just kinda go. And shut.
[00:43:30] WILL: This is
[00:43:30] ROD: close to a hundred years ago. They built this sort of thing. So fucking wild. Why haven't we heard of them? And actually, I've gotta say a shout out because a former student of both of ours was writing a piece for the A, b, C on this. And I saw this not because of her, but it's just like, well done Ellen.
[00:43:45] I know Ellen's a huge fan. Um, why haven't we heard of them? So there was no strong publicity behind them. They were very practical and utilitarian.
[00:43:53] WILL: just did stuff.
[00:43:54] ROD: Yeah, they just did stuff. They were generalists, not specialists. So it wasn't like this is a person who's a [00:44:00] medical pioneer, a military hardware guy.
[00:44:01] That's so
[00:44:02] WILL: innovation though.
[00:44:02] ROD: Fuck yeah, it is quiet. Unassuming. Eating a,
[00:44:05] WILL: generalists, not specialists.
[00:44:06] ROD: we don't need that. We're a jack of all trades. Master of all of 'em
[00:44:09] as
[00:44:09] WILL: dig a hole and I can put some electronics in it too.
[00:44:12] ROD: I can build a humidity grip to put in there. , They weren't patented globally or sold commercially at scale, so their names weren't attached to the devices in the public mind.
[00:44:20] They weren't, it wasn't like both did this or both did that. They were just, these machines appeared, but they weren't attached to these guys. Identities. So when people did later versions of the humidity cribs, the ECGs and stuff, they weren't attributed to these guys. They just made them, they didn't paint them.
[00:44:37] They weren't clever about that sort of shit. And so their designs were absorbed and renamed and, and put
[00:44:43] out there
[00:44:43] WILL: 1920s,
[00:44:44] ROD: thirties,
[00:44:45] Yeah. Yeah. Which is not
[00:44:46] WILL: you know? Come on now. It's not the land of innovation, you know, it's the land of getting shit
[00:44:49] ROD: Well, it's not the name of the, the land of, um, comms either. Really? No.
[00:44:54] WILL: No. No.
[00:44:55] ROD: So that's the bottom line.
[00:44:56] So the brothers eventually sold their company, both [00:45:00] innovation or whatever it's called, the both company to Drug Houses of Australia's, the name of the company. I'd never heard of them either in the late sixties, drug Houses of Australia. What, what would we call our
[00:45:11] WILL: company?
[00:45:12] I don't know.
[00:45:12] I dunno what you could be and legitimately call yourself
[00:45:16] ROD: Houses of Australia.
[00:45:18] WILL: Like, like,
[00:45:19] ROD: Early adopters of
[00:45:20] WILL: maybe we should, maybe we should band together. We're all, we're all the heroin
[00:45:23] ROD: houses.
[00:45:24] Oh, I've got drugs. You got drugs. I don't know, but they,
[00:45:27] what are you
[00:45:27] they bought the B both company in the sixties, but these guys kept playing.
[00:45:32] So Ted died in 87 and Don died in 2005, but apparently they kept tinkering and making
[00:45:37] inventions even after that,
[00:45:38] right up till the end.
[00:45:39] WILL: Oh, good on you.
[00:45:40] ROD: So there you go. Two guys you'd never heard of who were responsible for a whole bunch of stuff. Fricking amazing.
[00:45:45] WILL: the boss or both?
[00:45:46] The
[00:45:46] ROD: both The boss? Both the Boths.
[00:45:47] Both the boths. Both. Both. The both is.
[00:45:48] WILL: Both the, both us.
[00:45:50] So
[00:45:51] Megan Rosenblum was working as a journalist when she spent some time at the Muta Museum in Philadelphia,
[00:45:59] Muta,
[00:45:59] where [00:46:00] they have a small collection of what are called Anthrop Epidermic books,
[00:46:05] ROD: Mm. Human skin
[00:46:07] books.
[00:46:07] WILL: Yes. Human
[00:46:08] skin books. So, so just to break
[00:46:10] ROD: down
[00:46:10] small, I, I'm jealous 'cause I only have one, not a collection.
[00:46:13] Well,
[00:46:13] WILL: if you have one, you have, you have quite a lot. Like, like, like, anthrop Demic obviously Anthrop. Human der Skin Collections of human skin
[00:46:25] ROD: books.
[00:46:26] collections.
[00:46:27] WILL: Now
[00:46:28] Megan
[00:46:29] was surprised that the collection wasn't the stuff that she'd uh, she'd assumed. So, you know, the, the, the cultural stories about human skin books is all, it's like the horrors of n Nazi death camps or like the French Revolution where there's like human skin tanneries or serial killers that are doing
[00:46:47] ROD: this.
[00:46:47] But,
[00:46:48] But, that human skin makes great wallets. But anyway,
[00:46:51] WILL: on.
[00:46:52] Maybe
[00:46:53] ROD: it's, it's, it's delicate and also a bit
[00:46:55] WILL: but it was associated with some other, well, another group of people, Uhhuh and it led to [00:47:00] her doing some proper research. So what I wanna tell is a little bit of the story that Megan Rosenblum, um, collected here.
[00:47:07] Um, just a little bit of a hint of the anthrop demic literature.
[00:47:11] ROD: I don't want to hint. I want lots of
[00:47:13] WILL: I want lots of it.
[00:47:13] I'll give you as much as
[00:47:14] ROD: I
[00:47:14] can. Gimme a
[00:47:14] WILL: give you, so the, the, the book and this is well worth it. Well worth a dive. It it covers so many cool stuff. Yeah. This is dark Archives, a librarian's investigation into the science and history of books bound in human skin.
[00:47:27] ROD: Fuck off. So the content's irrelevant. It's just what they're bound in. So we're judging these books by their covers.
[00:47:32] WILL: no. Well, yes.
[00:47:33] ROD: We are judging these books by their
[00:47:34] WILL: covers.
[00:47:34] No, no. We are judging them by what's in them and who owned them as and who made them as
[00:47:39] ROD: well.
[00:47:39] Not first.
[00:47:40] Like, no,
[00:47:40] not first.
[00:47:41] WILL: no, no, no.
[00:47:41] This is, yeah.
[00:47:42] In
[00:47:42] ROD: fairness, that's what
[00:47:43] you did. You walk up and you kinda go, it looks a bit skinny. Yep. Human skin. Put it in the pile. Well,
[00:47:48] WILL: okay, so that's the first big thing here.
[00:47:50] How do you is that they, well, how do you
[00:47:52] ROD: tell it's
[00:47:53] the flavor?
[00:47:54] WILL: No, no, it's not the
[00:47:56] ROD: ang the resilience. Does it come back after you put your teeth in? You [00:48:00] also do not do that.
[00:48:01] WILL: You do not do that. Like, like these books. , Most of us don't get to touch these
[00:48:06] ROD: books.
[00:48:07] I'll bring mine around next time. You can have a look at mine.
[00:48:09] WILL: Like, unless you make your own, you are not likely to touch one of these books. It they are, they are rare. Would you, would I touch one? Yeah. Look, I don't, well, um, there's, so there's some in here that I, I'm like, I don't know.
[00:48:25] I don't know
[00:48:27] ROD: What about, what about, what about leaning in Sniff Deeply? I I'd, I'd touch it. I'd rub it on my face.
[00:48:33] WILL: Oh God. Oh God.
[00:48:34] ROD: Books. I wanna know
[00:48:35] WILL: you, you're rubbing a corpse on your
[00:48:37] ROD: No, I'm not. It's a book,
[00:48:39] WILL: A bit of corpse, a
[00:48:40] book.
[00:48:41] So first thing is, of course there have been stories about books bound in human skin for a long time.
[00:48:48] Mm-hmm. A lot
[00:48:49] ROD: of the evil Dead.
[00:48:50] That movie started with
[00:48:52] WILL: a book bound
[00:48:52] ROD: in human, bound in
[00:48:53] WILL: I mean, it's, , it's a cultural trope. It it goes back a long time. Goes back centuries. Yeah. Um, and, [00:49:00] and it's, it's clearly something that people go, okay, that's a book that's taking itself seriously. Like that is, that is like, like at some point you go, all right, that one, that one,
[00:49:09] ROD: that
[00:49:09] one. You'd hope so, wouldn't you though? Can you imagine like, someone's written a book, like, I don't know, frivolous recipes for kids parties and they thought, I'm gonna bind it in human skin.
[00:49:17] WILL: Oh my God. Frivolous recipes for kids parties and bound in humans. I feel like
[00:49:20] ROD: like you, you'd want it to be substantial as, as
[00:49:22] WILL: as I will come to in a second. Yes. You would want it to be substantial. Yeah. Yeah. Um, but, the key thing first is that there's been a whole bunch of fraudulent claims. Like there's been, there's a whole bunch of books that, you know, old books that look old and, and, and people are like, they get a story, you know, maybe grandpa said that one's bound in human skin, or, you know, there there's a lot of dodginess.
[00:49:45] So, risen Blum was like, you know, we actually, you know, and she was not the, the scientist involved. She's the journalist telling these stories, but we need a technique to actually do this properly because for a long time identification is kind of visual. Like, they would literally, [00:50:00] they'd literally look at
[00:50:01] ROD: the, seems human skin ish.
[00:50:02] WILL: It seems like the leather has human skin level. And here he is looking at my arm,
[00:50:07] ROD: like
[00:50:08] but how, how much? I don't know. What do you call it? Atrophied, tanned. Human skin. Removed from a human body. Does a normal person see e Ever Betty it
[00:50:20] WILL: Not much. Not
[00:50:21] ROD: much. Not much.
[00:50:21] See, you look at him and kind of go, oh, it's probably human
[00:50:23] WILL: look, I don't doubt that Tanners have more skill in this environment, but Compare.
[00:50:27] Compare, yeah. Cow skin, hoskin, goat skin, other missile. I don't know what else they might use. Elephant.
[00:50:36] Elephant. I
[00:50:37] ROD: hell yeah. Yeah. Yeah. Definitely
[00:50:38] elephant skin,
[00:50:39] WILL: but, but, but like, I get that you might be able to identify differences in hair follicles and things like that,
[00:50:46] ROD: of
[00:50:46] WILL: like the, the size of cells. But it's, that's imprecise.
[00:50:49] ROD: a big call.
[00:50:49] WILL: Yeah.
[00:50:50] For a long time people thought DNA testing might be a way that you could do it. Surely.
[00:50:55] ROD: No. No.
[00:50:56] WILL: No. Not at all. Not at all. Because the problem is, [00:51:00] people have been handling these books. Like, like there's, there, there is guaranteed human DNA on just about every book that you, that is worth testing,
[00:51:08] ROD: so they do the testing. This is about in the skin of 4,000 people.
[00:51:11] WILL: this whole library. This whole library
[00:51:13] is
[00:51:14] ROD: a village. It's a village. It's matter everywhere. It's horrifying. My God. Can we find a book that wasn't bound in human skin?
[00:51:24] WILL: We just, just, just a question there. How much human skin do you need to make a book? One of the stories I
[00:51:30] ROD: a meter to a meter and a half.
[00:51:31] WILL: one of the stories I will tell you comes from uh, the back. One of the stories comes
[00:51:36] ROD: butt cheeks,
[00:51:37] WILL: From the boobs, um, the boobs, the,
[00:51:42] ROD: That's a particular end of the, the uh, the volume of the breast section of persons.
[00:51:49] WILL: Um,
[00:51:49] ROD: back makes sense to
[00:51:50] me.
[00:51:51] Yeah. And May and maybe depending on the person
[00:51:53] WILL: that, no, I often look at people's backs and I think, how many
[00:51:56] ROD: books
[00:51:56] do I
[00:51:56] WILL: I get outta that?
[00:51:57] That cool?
[00:51:57] ROD: it's like, ooh, I'm thinking the first Harry [00:52:00] Potter, at least,
[00:52:01] WILL: Yeah, definitely not volume six or seven or whatever it
[00:52:04] ROD: gets
[00:52:04] to May. Maybe the Tumtum too, depending on the girth of the individual. Yeah. Tum to Will be you. You could do a, like a a, an apple peeler, like straight around from belly button round to belly button
[00:52:15] WILL: again. Yeah.
[00:52:15] Yeah. You could, you could do it like a, a long
[00:52:17] ROD: book
[00:52:18] like Yeah. Yeah.
[00:52:18] You could. You could A scroll.
[00:52:20] WILL: follow a scroll. But no, DNA testing doesn't actually help.
[00:52:24] I. Like it's, it's, it's not gonna tell you because we've been touching these books. Okay. But a, a technique was developed in the 2010s called peptide mass fingerprinting.
[00:52:34] ROD: Well, I knew that.
[00:52:35] WILL: Yeah, exactly. But, but, but key fact for this is it's only in the 2010s that we've actually developed a technique that was able to do this.
[00:52:43] Basically. Basically what it does is it, it, it takes the proteins from a small sample mm-hmm. Smashes 'em up into small parts
[00:52:51] and
[00:52:51] ROD: then puts in a particle accelerator.
[00:52:53] WILL: Yes.
[00:52:53] but you can see the weights of the different proteins and you know that human skin has cer a signature of weights for [00:53:00] different proteins that is different from goat or cow or, or
[00:53:02] ROD: Fuck that
[00:53:03] WILL: It it is, it is. That's wild. It is. But well, okay. It was not the number one problem of science to solve.
[00:53:10] ROD: was gonna say it's specific, but is it necessary? And I'm mean, you could ask that about a lot of science and part of the reason I do like a lot of science is because, no, it's not necessary, but I'm god damn glad they
[00:53:19] did
[00:53:19] it.
[00:53:20] WILL: The first book that was confirmed by this analysis. Mm-hmm. So we did have confirmed by historical analysis previous, but I'll come to that in a
[00:53:30] ROD: second,
[00:53:30] Oh, so what a chain of provenance.
[00:53:32] WILL: chain of providence. So, so as I'll, you know, James Allen saying, bound that fucker in my skin, like, like that, that does count.
[00:53:41] That does count. But so many, so many of these books we just don't know. But the first book that was confirmed as human skin was a copy of de Destiny's de la me a book by the French philosopher, ASEN Hue which was a, it was the, what is it called? Notes on the Human Soul. Um, [00:54:00] oh.
[00:54:00] And, and there was a note inside the book that said a book on the human soul merits that it be given human clothing.
[00:54:07] And I was like,
[00:54:07] ROD: does it, does it? Given, given, actually by definition, the human soul is esoteric and unrelated to the body, except sort of in some ephemeral way.
[00:54:17] WILL: It's a nice note though.
[00:54:19] ROD: No, it's true. I get it. I get
[00:54:20] WILL: you saying. It's a nice note.
[00:54:21] ROD: I get why you'd say it.
[00:54:22] WILL: Um, so that was in 2014. Yeah. And that's the first book that was confirmed in this analysis.
[00:54:28] Why would you bind a book in human
[00:54:30] ROD: skin?
[00:54:30] Oh, no. That I get, that makes sense to me. Why would you bother trying to identify them?
[00:54:34] WILL: Why, why would you try, why would you try to identify? Because fraud is, is all the way through this, and it's like, we should actually know. Like it's, it's, it's legitimate to know, um,
[00:54:44] ROD: out of respect for the deceased.
[00:54:45] WILL: That's one that's totally one that, that yeah. Or, or if, if you wanna read a book and suddenly you're surprised that it's bound in human skin, like
[00:54:53] ROD: it's kind of sticky and moving on its own, I.
[00:54:57] WILL: what to do. This book. There's 50 suspected books[00:55:00]
[00:55:00] ROD: Worldwide. Worldwide.
[00:55:02] WILL: that are on public
[00:55:03] ROD: record Right, right, right,
[00:55:04] WILL: being bound on as potentially bound in human skin.
[00:55:07] ROD: Um, what's
[00:55:08] the register You look for
[00:55:09] WILL: You know, you know, and it's like you get a, you know, you get your granddad's will, and it's like, here's all the books. And you go, well, I suspect this one's bound in human skin.
[00:55:18] Where do I report?
[00:55:19] ROD: it on the register. It's very easy to find. Sam Altman's
[00:55:24] WILL: Done The human skin reporter.com.
[00:55:27] Actually,
[00:55:27] ROD: that's more like a Mark Zuckerberg kind of thing. I'm thinking like, also if you've got any books found in human skin, you can report it here. We'll put it in the Metaverse.
[00:55:34] WILL: So they've examined 31 through this decent technique and found 18 confirmed as
[00:55:39] ROD: human
[00:55:39] more than half 13.
[00:55:41] WILL: 13 have been demonstrated to be non-human leather. So there are frauds
[00:55:45] that, that people, that, That, that people are saying, this is human, but 18 confirmed as
[00:55:50] ROD: are, what, what do they sell for? Just out of
[00:55:52] WILL: What do they sell
[00:55:53] ROD: Like If I had
[00:55:54] WILL: one? These things do not sell. The museums and libraries that have them hate them.
[00:55:59] Like,
[00:55:59] [00:56:00] like really?
[00:56:00] So there's one in the National Library of Australia
[00:56:02] and get down.
[00:56:03] I know. Like, go you, good
[00:56:05] ROD: thing,
[00:56:05] Can we get it? Can we
[00:56:06] have a
[00:56:06] look?
[00:56:07] WILL: They did testing and they're like, and we never wanna talk about this
[00:56:10] again.
[00:56:11] I'll come to what other people think about.
[00:56:13] ROD: But this is up there with the whole, you know, Nazi horrible, we'll call 'em experiments in a huge bunny rabbit quotes, et cetera. Like the argument over whether it's respectful to consider the data or the book, or is it disrespectful to consider it if someone died, fraud it, or donated their bodily parts?
[00:56:31] It could go either way. There are strong arguments in both directions, so acknowledge and respect, or don't
[00:56:35] WILL: this is the underpinning of Rosenbaum's book, is that this is, this is an actually really interesting story on medical ethics. Like, it's like, what are people
[00:56:44] ROD: consenting medical?
[00:56:45] WILL: Yeah. Well, okay.
[00:56:47] I only, I
[00:56:47] ROD: That's a big call given the, the era.
[00:56:49] WILL: no, well, no, I'm gonna go with, um, Rosen Blum's key findings. . I'll go through a little bit about the, the topics of the
[00:56:55] ROD: books.
[00:56:56] Thank you. 'cause that was what I was gonna
[00:56:57] WILL: ask,
[00:56:57] but I also wanted to go through, you know, where the [00:57:00] skin came from,
[00:57:00] ROD: what were they about?
[00:57:02] WILL: Okay. Enormously
[00:57:04] it's medical, textbooks.
[00:57:06] Like
[00:57:07] ROD: there, is it really? Yeah.
[00:57:08] WILL: Probably of the 18. I don't have the exact
[00:57:10] ROD: number
[00:57:10] yet.
[00:57:10] Yeah, yeah,
[00:57:10] WILL: yeah.
[00:57:11] Um, it would be 14, 15 all really various
[00:57:15] versions of medical books like catalog of science and science, medicals and elementary treaties on human anatomy.
[00:57:22] Deani, porus, fabrica, septum which obviously anatomy. Yeah. Then we've got poetry,
[00:57:29] we've got,
[00:57:30] um,
[00:57:32] ROD: sorry.
[00:57:32] WILL: sorry. And, and, and horny poetry. Horny poetry.
[00:57:35] ROD: Okay. That makes sense. Obvious. Obviously it's sexual, but if it had been like, you know how to fix your motorcycle,
[00:57:41] WILL: No,
[00:57:41] ROD: then I'm like, you know, instruction
[00:57:42] WILL: Yeah. Yeah. I get, I get it's horny poetry. You're like, so there's a, um,
[00:57:47] ROD: poetry medical text.
[00:57:49] Oh, and horny poetry.
[00:57:50] WILL: , And then there's an Edgar Allen Poe uh, course the gold bug.
[00:57:54] Yeah. Um, and, and there is one, and this is, this is the only one that fits out of a different [00:58:00] historical timeframe is um, is a Nazi photo album. Um, so
[00:58:05] ROD: sorry, but just. A Nazi photo
[00:58:08] WILL: So, so this is the interesting thing, like the, the, the concept of binding book, it, it peaked in the 19th century and, and nearly all of the confirmed ones are like 19th century.
[00:58:18] Like it may have happened a little bit
[00:58:19] before. We don't quite have it. And it sort of p it peaks like 18, 18 37 to eight, up to 1898, like, and maybe even a little bit after that.
[00:58:30] Yeah. And it's doctors. They're doctors who are also biblio files. They love books and they love being doctor, and they're like, well, I can get access to human skin and we could just bind our books in
[00:58:41] ROD: this.
[00:58:42] WILL: I just got one final one final thing I wanted to do. Um, so John Feno, he was, he was the one that James Allen said bind my, bind my memoirs in my skin and give it to John Foh.
[00:58:54] Um, he kept it for quite a long time.
[00:58:58] Um,
[00:58:59] And [00:59:00] this is what I love, he used it to spank his children. Like Kate.
[00:59:05] ROD: We've all used human skin to spank our children, though. I mean, that's fair.
[00:59:08] WILL: just like
[00:59:11] ROD: Jane made me bring out the human skin book, because that's when you know you're really in trouble. The trauma. That's not fair.
[00:59:22] WILL: Hey, I just gotta do a quick shout out. Uh, Thank you so much, Stewart
[00:59:24] from Sheffield. Oh,
[00:59:25] ROD: Oh, yes, thank you, Stuart.
[00:59:27] WILL: Definitely.
[00:59:27] After our last episode about giant testicles,
[00:59:31] ROD: it was an accident,
[00:59:32] WILL: evolutionary pressures towards giant testicles. You asked your listeners how to, to ask AI what their emotions should be.
[00:59:39] And Stewart has given us a, a variety here. a gnawing hunger for the forbidden. Like I think that's a nice
[00:59:46] ROD: emotion
[00:59:46] we all have that. I do. We all have that.
[00:59:48] WILL: Do you have, do you have synaptic overload and whispers?
[00:59:51] ROD: I do. I have both of those. It's so far. Yeah. Tick, tick and
[00:59:54] WILL: I do like a diet like this one existential dread with a sparkle.
[00:59:58] ROD: so I don't really have a lot of
[00:59:59] WILL: that
[00:59:59] [01:00:00] existential dread with a sparkle. That is basically my emo
[01:00:02] ROD: I don't, no, I have the sparkle, but I don't have the existential dread. It's like, yeah, yeah. Whatever. But it's, but the first two, a hundred percent.
[01:00:09] WILL: Send us your stuff. Cheers. At a little bit of science.com
[01:00:13] ROD: au.
[01:00:13] Leave us an 11 star review.
[01:00:16] WILL: And if you have any books bound in human skin, um,
[01:00:18] ROD: send us a photo. Like,
[01:00:20] WILL: just tell us like it's cool. We'll keep you anonymous to a
[01:00:22] ROD: point.
[01:00:23] will will buy it from you. He's got a lot of
[01:00:25] WILL: money.
[01:00:25] I won't buy it
[01:00:26] ROD: he's got a lot of money.
[01:00:27] WILL: one.
[01:00:27] I don't want
[01:00:28] ROD: in the room next door to this studio. It's basically stacked by with
[01:00:31] WILL: gold.
[01:00:31] But I don't want one
[01:00:33] ROD: Yeah, you buy it for me. I thought we were friends. Buy it for
[01:00:35] me.
[01:00:36] WILL: for you. Do you want one? Yes. Do you really
[01:00:39] ROD: want one?
[01:00:39] I want another
[01:00:40] one.