From mind-bending virtual reality choices to paradoxical barbers, from hungry indecisive donkeys to a real-world science funding crisis, we're taking you on a wild ride through philosophy's greatest head-scratchers. And just when you think we couldn't make this journey any more interesting, we'll throw in a monster-sized sturgeon that would make Jaws think twice about who rules the waters. Today's episode proves that philosophy isn't just about old guys stroking their beards - it's about questioning reality, making impossible choices, and occasionally pondering why a donkey can't just eat its lunch.
The Experience Machine
Picture this: you're offered an entirely perfect existence, a lifetime filled with pleasure. Sounds sweet, right? Well, philosopher Robert Nozick had us pondering whether a life completely filled with pleasure would be one worth living. Rod's over here ready to jack in for eternal hedonistic bliss, while Will's having trust issues with artificial happiness. It's like choosing between a lifetime supply of chocolate that tastes amazing but isn't real, versus actual vegetables.
The Barber's Paradox: When Logic Gets a Bad Haircut
Here's a head-scratcher from Bertrand Russell: imagine a barber who shaves all and only those men who don't shave themselves. So... who shaves the barber? The logic starts strong but ends up eating itself, creating an infinite loop from which you can't escape, and the poor barber never wins. This paradox proves that sometimes thinking too hard about something is like trying to eat soup with a fork - technically possible, but probably not the best approach.
Buridan’s Hungry Donkey
Ever stood in front of your wardrobe for 20 minutes, unable to pick an outfit? You're in good company! Meet Buridan’s donkey, the 14th-century mascot of decision paralysis, who starved to death because it couldn't choose between two identical piles of hay. It's nature's way of telling us sometimes you just need to pick a path and go with it, rather than waiting for the perfect choice to magically appear.
Science Funding Cuts
Here’s something that's definitely not a thought experiment - the devastating cuts to science funding rocking the USA. Thanks to the Trump administration, scientists all over the United States are watching their research grants disappear faster than free pizza at a staff meeting. It's like someone decided to solve climate change by pretending it doesn't exist. It’s also interesting reviewing the titles of some of these research studies that have been cut. They seem to have some common themes…which we’re hoping is purely accidental.
That’s a Big Fish
Did you know the sturgeon, from which we get the luxury of caviar, can grow to sizes that seem absolutely insane? Picture a river-dwelling fish around the scale of a great white shark. The largest sturgeon ever recorded was a titanic 7.2 meters long. Imagine seeing this thing sidle up next to you as you’re paddling up a river.
From pleasure machines to indecisive donkeys, from funding nightmares to monster fish, we've covered quite the intellectual obstacle course today. Next time someone tells you philosophy is boring, tell them about the barber paradox - it's guaranteed to start either a fascinating discussion or a confused silence. Either way, entertainment achieved!
CHAPTERS:
00:00 Fun with Google's AI: Weird Phrases and Their Meanings
01:53 Exploring Paradoxes: Roko's Basilisk and More
03:53 The Experience Machine: A Dive into Hedonism
10:45 The Barber's Paradox: A Philosophical Puzzle
15:09 Buridan's Ass: Decision Paralysis
19:54 Exploring the Concept of Perfect Equality
20:20 AI Ethics: The Dilemma of Self-Driving Cars
21:45 The Impact of Trump on Scientific Careers
23:38 Canceled Research Grants: A Closer Look
29:36 The Worst Case Scenarios: Are We Prepared?
34:45 Fun Science Facts: The Giant Sturgeon
37:11 Conclusion and Call for Feedback
SOURCES:
https://www.wired.com/story/google-ai-overviews-meaning/
https://www.nature.com/articles/d41586-025-01216-7
https://airtable.com/appjhyo9NTvJLocRy/shrNto1NNp9eJlgpA?Ffj6Q=allRecords
https://airtable.com/appGKlSVeXniQZkFC/shrFxbl1YTqb3AyOO
https://bigthink.com/personal-growth/seven-thought-experiments-thatll-make-you-question-everything/
https://en.wikipedia.org/wiki/Barber_paradox
https://plus.maths.org/content/mathematical-mysteries-barbers-paradox
https://philosophynow.org/issues/81/Why_Buridans_Ass_Doesnt_Starve
-
Will: [00:00:00] So here's a nice little fact for you. If you put some sort of random combination of words together and then add the word meaning when you type it into Google, Google's new AI search will sort of tell you what that phrase means. Now that's fine if you, if you gave it a normal sort of parable or aphorism, you know, you can't cross the same bridge twice or lightning doesn't strike twice or something like that.
But people have discovered if you put something a little bit weird, like you can't lick a badger twice or never throw a poodle at a pig. And then put meaning then Google will dive in there and give you the meaning of this phrase. So I went out there and I thought, all right, what can, what can I put in there?
I put, um, faster than a banana's ding-dong, and Google gave me a great meaning out there. So thanks, AI. If you ever wanted to know what faster than a banana's ding-dong might mean, just put the word meaning and then go into Google. [00:01:00] Welcome to A Little Bit of Science, the show that explores a little bit of science. I'm Will Grant. I'm an associate professor of science communication at the Australian National University.
Rod: I'm Rod Lambert. I'm a 30-year scicomm veteran with the mind of a teenage boy.
Will: And today, Rod, you told me you're gonna tell me about some paradoxes you got obsessed with.
Rod: I, I think you'll find the plural is paradoes.
Will: Yeah. And then I've got some, uh, well careers in science for you. Uh, I thought I did a bit of research on that. I've also got a little tiny science fact for you. If you wanna know just the smallest, tiniest fact that I learned this week.
Rod: They're the only facts I ever want. Big, big ones hurt my brain, you know that. Here's the thing. I was, I was meandering through the intertube as one does, looking for fun things to talk about. And I saw this story about people [00:02:00] who became obsessed with Roko's basilisk. And I know you're a big, big fan.
Will: I, I, I don't know if I'm a big fan. I'm a, I'm a person who worries about Roko's basilisk, but I'm also someone who, with a devil-may-care attitude, I'm happy to invoke Roko's basilisk, uh, whenever, with the belief that maybe it won't torture me forever.
Rod: Yeah. The gist of it is basically there are many different versions. Bottom line is there's some future malevolent AI out there that somehow wants to punish people for not doing its bidding, but it also somehow can reach back in time for people who tried to block it before it began, or as it was developing, and it would punish you today for things you may or may not be gonna do in the future, having been naughty.
Will: Does this include my thought crimes?
Rod: Yep. It depends how far in the future it comes back from to get you. But I started looking into this, 'cause this article basically talked about a whole sort of weird, culty, murdery, is-reality-even-real group of people, which I thought, that's kind of fascinating. And then it got too big and strange and scary. And kind of [00:03:00] gross. So I thought, I don't wanna do that yet. Maybe I'll come back to that in another episode. But by doing that, it set me down a rabbit hole of all these other thought experiments, and many that kind of dabble in these paradoes.
So there are three that caught my eye. So I'm just gonna, I'm gonna let you hear about 'em, roll around to them. One's called the experience machine. One's called the barber's paradox. And of course the best one, 'cause it's me: Buridan's ass. And I'm not making any of these up, or if I...
Will: Do you, how do you spell, I know, I know this doesn't really matter, but how do you spell Buridan? Is it like...
Rod: B-U-R-I-D-A-N. And ass is A-S-S. It actually means donkey.
Will: Oh, it means donkey, not his butt. Oh,
Rod: No, no one was as disappointed as I was when I saw it. I was thinking, oh, go philosophy. But no, it's a, it's a donkey. And I get to that last. First up: the experience machine.
Will: I do, do you want me to guess what the experience machine is?
Rod: If you [00:04:00] want to Sure.
Will: Well, I, I feel like it's a little bit like Dune. Um, you know, where he's gotta put his hand in the box, and he's like, what's in there? And she says, pain. And he's like, oh, okay. Yeah. She does it in the voice.
I feel like, I feel like the experience machine is like, you're at a funfair or something like that, and you gotta put a body part in the box, and it's like, what do you get? It's like, well, you get an experience.
Rod: Look, it could be any machine, though really, any machine could be an experience.
Will: I feel like, I feel like it's, it's like a truck stop, um, toilet, you know, where there's a hole in the wall that says experience machine, and you put something through the...
Rod: And you have an experience. Look, this actually could be all of those and everything else you can imagine. 'Cause basically there's a philosopher guy called Robert Nozick. He wrote a book in '74, Anarchy, State, and Utopia, and he did a thought experiment that wanted to challenge hedonism, particularly extreme hedonism. Now I'm [00:05:00] a big fan of hedonism. You probably didn't know that about me. No, no one who knows me would've guessed that. But, um, it is only my, uh, surrounding loved ones, et cetera, who keep me from turning into an insane and inveterate hedonist. 'Cause I...
Will: Hedonism being, like, the pleasure seeker.
Rod: love, pleasure.
I just love pleasure. And pure hedonism is, you know, pleasure is the only thing that matters. So there are deep philosophers, of which I am not one, whom...
Will: Are there deep philosophers who subscribe to hedonism? I mean, I didn't, I don't think they covered that in The Good Place, you know, and said, okay, now we should just go pure pleasure.
Rod: That's the epilogue movie after The Good Place. They're gonna go, by the way, now you've always got that... what is that moral philosophy guy? Moral philosophy does come up in this. So here's the setup, here's the setup for the, the experience machine. You've gotta imagine a super advanced machine that can simulate any experience. Any experience, you get to pre-program, basically, that experience or your entire life.
So it could be anything. It could be fame, it could be success, it could be space travel. It could be being able to play table tennis, like, you know, [00:06:00] for example,
Will: Sure, sure. So, like in Star Trek, um, the, the Holodeck. Yeah.
Rod: Yeah. Yeah. Or you know, if you go full bore, the, the comparisons to the Matrix come up a lot, because apparently in this machine you won't necessarily know it's a simulation. Once you're plugged in, it'll feel real.
So you're basically, you'll be floating in a tank, you know, electrodes in your brain.
You'll be like in a living dream. Your body will be sustained somehow, et cetera, et cetera. So imagine this exists, and this is a world pre-Matrix. The question these philosophers put to you is: would you plug into this machine? And the implication is, plug in and stay there. But first up, would you plug in?
Will: I feel like yes. Like, I feel like there's a bunch of things where you, you say, well, I, you know, I'd like to do that, but it costs too much money. Or, I would like to eat the world's biggest hamburger. I don't know.
Rod: your imagination is free to roam and you go big
Will: Uh, no, not really, but, you know, I, I, I just didn't wanna say [00:07:00] the things I was really thinking.
Rod: Or a big, big hot dog?
Will: Yeah. Bigger.
Rod: But, um, look, the implication is very much like if you, you plug in, you stay there. So would you be prepared to fully commit?
Will: I was just going for a, you know, like a, a, a Friday night experience, but, uh.
Rod: Well look, I think it's still the, the challenge still comes in, but it's more like, were you really prepared to commit? Because basically the challenge in this position is, it's challenging the position that the best life is one with the most happiness or pleasure. It's sometimes called pure pleasure theory.
Will: See, I wouldn't, I, I don't want to go into the pure pleasure machine because, um, I'm a reality junkie, man. And I gotta earn the pleasures. Like, I feel like, yeah, you can cheat on a Friday night and say, all right, I'm maxing some pleasures out by going into the experience machine, but if you have too much of it, they don't count anymore. They're boring.
Rod: Well, this is what he says. He reckons most people would say no, because it proves pleasure isn't the only thing we value. But also that, yeah, like [00:08:00] you said, you kinda gotta earn it. That's his theory. That's what he reckons. And there are arguments, though, for people who would. One argument for yes is you can literally curate what seems like an eternally fulfilling, or at least pleasurable, life.
No suffering, heartbreak, boredom, no awkward job interviews, no letdowns. Perfect, consistent, awesome things. That in itself, if you don't think too deeply about it, sounds great. But I mean, your objections come into play quickly. You don't need to earn your skills. You can say, I wanna play guitar as if, you know, Jimi Hendrix and Eddie Van Halen had a child, and I wanna play like that person.
You could organize all the photos on your iPhone into meaningful categories in like three seconds. 'Cause you know, we're all, we're all gonna do that. We're all gonna organize our photos one day so that we can actually search them and make sense of...
Will: That's, that's your pleasure seeking experience.
Rod: Oh, I, I might need to push pause. Um, another argument is, oh, your brain already lives in a simulation, man, because we're basically a lump of juice and it only gets what we interpret via our sensors and nerves and da, da, da, da da. So this is just a more efficient [00:09:00] version of that.
Will: But I mean, this is, this is still free riding though. Like we are going into this tank and the tank has to, you know, this magic experience machine has to cost energy on, on the system outside. So you can't stay in there without doing some work to earn that.
Rod: Or not everyone could be in there. Someone would have to be outside cleaning out your poo poo wee wees from the
Will: Or, or whatever, or, or stoking the nuclear power plant or something like
Rod: Exactly. Shoveling uranium with the other blue collar workers. His biggest argument for why people wouldn't plug in is 'cause people actually wanna do things, not just feel like they've done them. Like, there's value in the struggle and the effort, et cetera.
Will: Totally. So did you decide? Would you click on going into the experience machine forever?
Rod: Would I click on it? No, I don't think I would. I wish I could. I wish I could, but I too would have this authenticity issue. They called this out in the Matrix, when, what is it? You know: a previous version, Mr. Anderson. We made it where everything was perfect, but in the end everyone went crazy and started eating each other, 'cause it was too perfect. So the Matrix, [00:10:00] I was gonna say, beat us to it. But no, it didn't. It came years after this proposition.
Will: I think the flaw in that is, I don't think, I don't think even an all-powerful simulation robot making a matrix could make things perfect. In the sense that, yes, it could say, okay, everyone has all the food they want, all the leisure they want, they have the perfect body or something like that.
But there would always be some goods that can't be shared. Who has the best view, who has the most status at the party, those kinds of things. And there will always be competition over them. I don't think a machine can solve that problem. When you live in a society, there's some stuff you can't share perfectly.
Rod: That's true, right? You walk into a party and everyone's the hottest and the coolest and the most interesting, et cetera.
Will: Doesn't work. It doesn't work.
Rod: Would your brain even do that? Good point. So that's one.
Will: All right. Uh, the barber's paradox.
Rod: the Barber's paradox. So this was a Bertrand Russell call.
Will: Should I play guessing what this paradox might be?
Rod: Please do. Please do. I'll give you the timeframe: early 20th century, done by a logician. And basically [00:11:00] he was a bit of a philosophical troublemaker, I think. I think Bertrand Russell's entertaining. So there's your timing,
Will: So, something to do with cutting hair. The, the barber has a choice about cutting hair. If a person comes in and asks for a bad haircut, should the barber give it to them?
Rod: The answer is yes. Haven't you looked around lately?
Will: No, but the barber is gonna... but, but everyone's gonna walk out. No, like a literally bad haircut. Not just, not just a fashionably bad haircut, but literally bad.
Rod: Like shit-the-bed horrible, ridiculous, embarrassing.
Will: Yeah, yeah. No, and the person walks out and everyone just goes, whoa, that's a terrible haircut. Where'd you get that? And he's like, I got it, you know, at this barber. And so then the barber is tainted by giving a deliberately bad haircut, even though the customer asked for it. Is that the paradox?
Rod: No, but I like it. I like your thinking.
So imagine there's a town with a single barber,
Will: Yep.
Rod: this barber shaves all and only those men who will not, or do not, shave themselves. That's who the barber shaves.
Will: So you can't, you can't do a, [00:12:00] like a, a special Friday shave with the barber, but Monday, Tuesday, you're like shaving yourself.
Rod: No, no, you either don't or can't shave yourself, and he has to shave all of those men. But the question in this scenario is: does the barber shave himself?
Will: All, and only those men who do not shave themselves. So if he's shaving himself...
Rod: Himself.
Will: ...he can't shave himself.
Rod: Himself. But if he can't or doesn't shave himself, he has to shave...
Will: Have we, have we said he's a him? Because, um, you've...
Rod: That's one of the ways people tried to get out of it.
Will: there's a barber,
Rod: Yeah. No, it's a dude.
Will: uh,
Rod: All philosophy, sexism, man. Come on.
Will: I know that, I know that. Well, obviously that's impossible, so this town falls into the void.
Rod: It's pretty much right.
Will: what does this paradox tell us?
Rod: Well, that's what he gets at here. Just in case people aren't, you know, keeping up with our razor-sharp intellect bouncing this around, bottom line: if the barber shaves himself, then he's [00:13:00] shaving someone who shaves himself, which he can't do, because he only shaves people who don't.
So he can't shave himself. But if he doesn't shave himself, then he's a man who doesn't shave himself. Therefore, he's obliged to shave himself, because he's a person who doesn't shave himself. So he gets stuck in this infinite loop. So the question, I think I'm paraphrasing what you just asked: who gives a shit?
What's this for?
Will: I guess I, I feel like, I feel like that should be a button on the end of a lot of philosophical paradoxes. Some, it's like, okay, that gets me, and that's, that's straight into my soul. Others like, really, what are we doing here?
Rod: Okay, Chad. Well look, I'll just, just briefly, 'cause this is not the mathematics podcast yet, obviously. Basically he came up with this to talk about set theory. I'm not gonna go into set theory. Bottom line was that talking about sets of numbers, et cetera, or, you know, things that contain themselves or do not contain themselves, can create paradoxes.
So the way they define it, as the mathematicians do it, and the, the numbers people: if you say, let R be the set of all sets [00:14:00] that do not contain themselves, then the question becomes, does R contain itself? The problem is, if R contains itself, it shouldn't. And if R doesn't contain itself, it should.
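(For anyone who wants the construction Rod is paraphrasing written down, this is the standard formal statement of Russell's paradox:)

```latex
% Let R be the set of all sets that do not contain themselves:
R = \{\, x \mid x \notin x \,\}
% Then asking whether R contains itself is contradictory either way:
R \in R \iff R \notin R
```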
Will: Yeah. Okay.
Rod: So you, you get this self-referential loop, like, in language, the utterance: this sentence is false. It can't be, and it must be. He says one of the good things about it is these paradoxes that come up when you apply this get into things, including when AI systems get caught in loops. You can throw AI systems into these sorts of loops by giving them these sorts of questions. So there is some utility, or at least fun, in playing with it.
Will: I suspect that could be a great way to break an AI system. I mean, we spoke previously about the reciprocal dance problem, you know, when two robots are heading towards each other and just get stuck in a loop. But I think there's probably a bunch of flaws in AI reasoning, that even, even if it's working on a, a probabilistic mode, not a, um, not a logical mode, that there are things that it could get stuck with.
That's [00:15:00] great. I, I'd love to kill an AI system with a paradox.
Rod: Unless it kills us first.
Will: Yeah, well
Rod: Okay. Buridan's ass, or donkey, which is not unrelated to the previous one. It's, it's better and dumber. So it's a thought experiment from the 14th century by a philosopher called Jean Buridan,
Will: that's when asses were more useful.
Rod: And they were prolific, plentiful, plentiful asses in the 14th century, and donkeys too. So this is put as, uh, it's this thought experiment that looks at being stuck between two entirely equally good options, and if you approach it entirely and utterly rationally, what happens? So the setup is this. Imagine a donkey is placed at a perfectly equal distance between two entirely identical piles of hay. They're both the same [00:16:00] size, same smell. Everything about them is the same. They look the same. They appear to be the same size, density, everything. And the donkey is exactly in between the two of them. So the donkey, assuming the donkey's perfectly rational, and for some reason they chose donkeys for rationality, tries to decide which option is better, which one should it choose. But because they're exactly the same size, the same smell, the same distance, and it's completely rational, it can't decide. So it stands there till it starves. Because it's literally paralyzed, paralyzed by a decision. 'Cause there's no rational way to make a decision between the two.
Will: But you know, there are people that experience this kind of thing in life, you know, choosing between two different options and both are equally good. Both actually might be legitimately good, and they're not able to choose, and so they get stuck and do nothing.
Rod: But this is, even, take that to the extreme. It's not like, oh, that'd be really good because I get more money, or, that'd be really good 'cause I get more prestige. These are literally, they're saying, literally exactly the...
Will: You know, though, I, I mean, I, I have [00:17:00] in some sense used this in my life. There are definitely times when you get, you know, two slices of cake or something like that. And you're choosing which one, which one will I have? If you can't tell the difference, then it doesn't matter. Like, it, it really doesn't matter.
I can totally accept, in this, in this Buridan's ass version, you get paralyzed by it. But I think at some point, letting go of that and just saying, you know, it doesn't matter. If I can't tell the difference, then grab...
Rod: But what allows you to? The question is, and this is the assumption, if you're purely rational, and this could again come up if you're dealing with AI, although we're starting to see maybe they're not purely rational, as much as we can determine that, how do you choose? I mean, my solution is easy. Eat, eat both bits of cake. Or eat neither, but then of course you starve. So that's the question. Basically the argument is, if you strip away instinct, randomness, some kind of whimsy or whim, rational thought will paralyze itself. Or there's a potential for rational thought to leave you completely trapped, which I think is interesting.
Just the idea of that is interesting.
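(A toy sketch of the point Rod is making, in code. Nothing below is from the episode: the pile names, utility numbers, and function names are all made up for illustration. It just shows how a strictly rational chooser deadlocks on a perfect tie, while adding one whim, a random tie-break, avoids starving.)

```python
import random

def strictly_rational_choice(options):
    """Pick the strictly best option; refuse to choose on a perfect tie."""
    best_value = max(options.values())
    ties = [name for name, value in options.items() if value == best_value]
    if len(ties) > 1:
        return None  # Buridan's ass: no rational basis to prefer either pile
    return ties[0]

def choice_with_whim(options):
    """Same comparison, but break perfect ties at random instead of starving."""
    best_value = max(options.values())
    ties = [name for name, value in options.items() if value == best_value]
    return random.choice(ties)

hay = {"left pile": 1.0, "right pile": 1.0}  # two identical piles of hay
print(strictly_rational_choice(hay))  # None: decision paralysis
print(choice_with_whim(hay))          # "left pile" or "right pile", at random
```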
Will: There is something, I'm not gonna get this right, because it's drawing on some [00:18:00] quantum mechanics. I think there are sort of ideas about a simulation of the universe. You know, if we were to run a simulation of the universe, and get it as identical as, you know, get it perfectly identical, then do all of the things that have happened in this universe happen in that simulation of the universe? And one argument against that is: at what level are we talking about identical? You know, the planets are in the same spot, or the molecules are in the same spot, or the atoms are in the same spot, or, you know, you go down to, like, the Planck scale, the, the quantum fluctuations. And I think, I think the end result of that is an argument from quantum physics, and I'm sorry, quantum physicists, here: that there just isn't, um, a way of having the same spot. The universe relies on some level of randomness that will push things, you know, that might be almost identical, but there is that slight difference, and it pushes you to that sort of [00:19:00] thing, or it pushes you in that sort of direction. And that, that happens from the very quantum level of the universe, that we can't have equality on either side.
Rod: One of the early objections to this came from theologians. And remember, we're talking 14th century, where they said, look, but wouldn't God just nudge the donkey a bit? Like, wouldn't there be... What they're really kind of saying, from our point of view, is that there's a random element. Like, it could be something really tiny. And it's arguments like, you know, would the donkey maybe see slightly better out of one eye than the other or something, and therefore it was, you know, given a proclivity? Or is there a gust of wind that blew from one side or the other? But, but apparently some of them got so angry about this and got so heated that you had monks having fist fights over it, which I, I fucking love. Are they angry about not being able to drink, or never having sex? No, you were wrong about Buridan's ass, you son of a bitch.
Will: I, I completely believe this. But I think it's a way, in a sense, for these monks to [00:20:00] get to the idea of: could you have perfect equality like that? Could you capture that kind of thing? And I think it's a disbelief that there is anything, you know, is the donkey's eye slightly better on one side or whatever, that the world doesn't have, you know, these, these perfect essences that exist in the world where you can say, yes, they are mathematically equivalent.
Rod: Yeah. And you can, and look, you can see this coming into play when you're talking about, um, you know, the classics of AI: AI cars, or self-driving cars, discerning between two terrible options. Do I hit the child, or do I hit the other child?
Will: I just, and, and that's actually a really interesting question. Like how does it weigh up the consequences of all of those things and at what level does it tap out and just go, you know, too hard? I'll just hit the one that's closer or.
Rod: Or it just pulls a, uh, what is it? A, uh, Tesla Cybertruck, and just bursts into flames and kills itself. Because that's why they're doing it, right? They're doing it 'cause they're so moral.
Will: but you know, we have had centuries of, uh, let's say [00:21:00] horny drunk monks, um, having these fights or, or stoned teenagers talking about the, the essence of the universe and, and computers just haven't. so
Rod: And speaking of stoned teenagers: those, those of you who are listening right now, we are talking just to you. I'm talking just to you, stoned teenagers.
Will: You'll make them paranoid, man.
Rod: If you, if you weren't sure, yes, you
Will: You gave me a philosophical question. I've got a, I've got a sociological question. We might think, okay, why, why are we not really, really grappling with what's going on?
Rod: going
Will: There's a couple of components to, to this story. You know this story, it's a fast-moving story, but I think it's one that we should spend a little bit of time looking at. Where I'm gonna start is Nature Careers. So, Nature, big science magazine.
it's one of the, one of the voices of science. they do a, a career survey every year. And, uh, you know, for a long time it's been, you know, what [00:22:00] disciplines are up, what disciplines are bad, you know, there's more in physics, less in chemistry, whatever it is, something like that. Or there's more jobs for, for younger people or more jobs for older people or stuff like that.
And, you know, for a long time, you know, we in our field have, you know, looked at these things and you see little fluctuations and you go, oh, cool, that's pointing to, you know, more work in AI. In this Nature survey this year, there has been the enormous stone thrown into the pond that is Donald Trump.
Rod: Oh, that rings a bell. Yeah. Yeah.
Will: So there's a, there's some data here that shows that US scientists between January this year and March this year have submitted 32% more applications to move internationally than in the periods before. In March alone, views of jobs overseas from America have risen 68%. So there are a whole bunch of people, uh, scientists, and let's assume this parallels a lot of other people who have that profession,
But you know, we here [00:23:00] talking about science, who are just saying, all right, I'm looking, looking for a way to get out of here.
Rod: Yep. I suspect those numbers are gonna go up,
Will: Oh my God. Well, I saw, there was a, a separate, uh, separate survey. So the first one was looking at actual careers, applications and things like this.
This other one was just on, on sentiment. Something like 75% of the researchers in the United States were keen to leave. It was like 75%.
Rod: What? I mean, you know what's the weird thing about that? It's people in that Trump machine are going, victory.
Will: Well, well, so this is, this is, I mean, that's one of the questions that I want to get to: what is driving this? Of course, you know, the first thing is that there is a whole bunch of, uh, research that has been, I mean, actively canceled. There's a couple of great websites.
Um, that have been going on. So, so these are on, um, on Airtable. Uh, one is tracking the, uh, National Institutes of Health research grants that have been canceled. And the other is [00:24:00] tracking the National Science Foundation. You know, both of those have had 400 grants canceled recently, each. Uh, there's a whole bunch more. There's a whole bunch more.
Yeah, yeah, yeah. And these, these are active grants. So these are not vetoing grants that, you know, were about to get funded, where the researchers, you know, the community has said, this is worth funding. This is stuff that has been funded.
People are working, people are three months into these things. You know, there was one study, I'm gonna go through a bunch of these studies in a second just so you can see a pattern, but, um, there's a study, a landmark study on women's health. Tens of thousands of women had been tracked over four decades.
So this was started in the 1990s. Throughout America there were like four data collection centers around the country. Get tens of thousands of women, and, you know, you grab all of that data: what's your diet like, what's your health and activity like, what's your...
Then you look at the outcomes, you know, who's having the heart attacks, who's getting the cancers, all of those kinds of stuff. Been running...
Rod: Four decades is fantastic.
Will: [00:25:00] Four decades with...
Rod: I meant was. I meant, was fantastic. Do I... do I mean,
Will: Well, yeah, it was. So, so that one, that one is... So there's huge things. There's tiny things. But I wanted to play a little game. Um.
Rod: sounds fun.
Will: I'm just gonna, I'm just gonna read through and see if you can, um, you can guess. I'll just read the titles. I'm not even gonna go into the abstracts or anything, and see if you can work out the keyword that got that particular grant canceled. So let's...
Rod: woke,
Will: A multi-scale atlas of senescence in diverse tissue types. What word?
Rod: Got it. Diverse.
Will: Uh, targeting TB transmission hotspots to find undiagnosed TB in South Africa?
Rod: Oh, it could be because trans is in there or else it's a, it's a country other than America.
Will: That's, I think, I think it could be either, either of those.
Rod: I'm loving it. I mean, we, we've already gotta say, to be clear, [00:26:00] neither of those words have anything to do with their political horseshit. But anyway, why, why would you bother to dive into that?
Will: Uh, smoking and cancer-related health disparities among sexual and gender minority adults?
Rod: Oh my God, there's so... ding, ding, ding, ding, ding. Well, I've got sexual and gender, but I assume...
Will: Not allowed to have that. Well, it was diversity, I think. Uh, skeletal health and bone marrow composition among youth. Is it?
Rod: skeletal health, bone marrow composition among youth, youth has gotta be a trigger,
Will: Uh, youth is, I think, I think maybe if we clicked in it might find, uh, diverse populations of youth. But I, I do like the idea that potentially the Trump administration is like, no, no science that will make the youth healthier.
Rod: Youth. That's, yeah, it's survival of the most insane. If they live past youth, then we'll do research.
Will: Structural influence on methamphetamine use among Black, gay, and bisexual men.
Rod: Yeah. That's hard to pick. It's obviously... among. Among.
Will: Well, let's flip over to the National Science Foundation. So, the National Institutes of Health are obviously doing [00:27:00] research focused on, uh, biomedical sort of stuff. But let's, let's read some, uh, National Science Foundation ones, um,
Rod: Let's, shall we, let's find out how good they are.
Will: A signal detection approach to understanding susceptibility to misinformation.
Rod: Oh, misinformation's probably a word they don't...
Will: They don't like that. Yeah. So there's a bunch, a bunch of projects, uh, that are looking at anything that is tracking misinformation or disinformation work. And it's 'cause they're like, no, we don't wanna track that anymore. That's against us.
Rod: us. Yeah. That's our, that's our core business. If we didn't have misinformation, we couldn't communicate. Uh, to be fair, they also have disinformation. I mean, let's not, let's not be unkind to them. They do have two kinds.
So blunt. Fuck, it cracks me up that they literally go, if the word diverse is in it, cancel the funding. Do you want us to read what it's talking about diversity in? No.
Will: Here's a great one: advancing collaborations for equity in marine and climate science.
Rod: Boom. Equity and climate at least.
Will: Double, double canceled.
Rod: Yeah. Yeah. That's it. You asked for [00:28:00] money. Now you've gotta not only give back the research funds, but twice the...
Will: You gotta
Rod: Fuck you, woke liberal hippie bastards.
Will: So look, uh, this is, I mean, I was just, I was just flicking through these and thinking, oh, this is literal, you know: any of these grants that have anything that is diversity, equity, inclusion. So you get a bunch of grants that are canceled 'cause they're literally on trans youth or something like that. But then a bunch that just have the word trans in there.
But then a bunch that are, that have the word trans in there. We know that the Enola gay, for example, has been canceled. but then there's also a bunch that are looking at misinformation, disinformation work. It's like, no, we, the, the, the truth comes from us. You don't get to critique and a bunch that's on climate change.
Rod: My current job, which is not related to this podcast at all, um, there is a lot of conversation. It's science related, let's just put it that way. And my God, the people who are not prone to overkill and extreme worry are using words like catastrophic now, in, in the wake of this. Like, they're literally saying, this is...
There are literally [00:29:00] catastrophic impacts. I mean, like, fucking with the, the data that NOAA and the, and other related, um, weather people and climate people are gathering, and that's being shared all over the world. Stopping that has extreme implications across the planet, including in our dear country. We rely on so much data from these places, and it's now just being burnt down.
It's this willful destruction. It's, I don't think there's even any rhyme or reason to it. They just wanna show that they can fuck with
Will: So, so this was my question when I was, when I was reading this, because, you know, you gave us a bunch of paradoxes before and,
Rod: Uh, it's paradoes. I've told you that a thousand times.
Will: That's true. That's true. But I've been thinking about all of this stuff, you know, attacks on science, but also the whole Trump presidency, and, you know, I listen to the podcasts, I, I watch the YouTube vids, and, and there's a bunch where people have started to talk about, okay, what is, what is the worst case scenario here?
And so often I've thought, wow, your worst case scenario is quite timid. [00:30:00] It's really like, like you're really not getting it, you know. So one example recently, I, you know, Ezra Klein, love his work. It's great stuff. And, and you know...
Rod: Fan of this show.
Will: he is, he is, he's, he's definitely not been on this show as much as he wants.
there's conversations. He is like, oh, when the midterms come up, and, and I listen to him talk and I just think. If, if, if the
Rod: when. Don't wanna I, I hear that too. And I wanna give him a little hug and go, aren't you? Aren't you adorable?
Will: So, so my question is, we're, we're into, you know, a through-the-looking-glass moment with the Trump presidency, you know, in a sense that there is ripping up a whole bunch of science. Yes, that's one thing. But ripping up a whole bunch of things all around the world, and moving really, really quite quickly, shockingly quickly. And, and the Project 2025 people are kind of astounded by how, how quickly Trump is moving, and they're like, this is, this is wildly off the charts, the best thing we could have hoped for.
Rod: That is a terrifying utterance. And the Project 2025 people are freaked? You're like, what in the snapping...
Will: No, no, no. They're not freaked. They're, they're surprised and impressed [00:31:00] at, uh, at how quickly Trump is moving. But why can't we, why can't we talk about the worst case scenarios? Because it's, it's like, are we thinking, okay, at some point things will bounce back to normal? Or we just, you know, we, we've tipped the, the boat over and we have no idea. Like, we, we can't point to anything before.
And, and that's what I'm trying to grapple with here. Like, I think, what is the worst case scenario? Well, I don't know. Trump is world dictator? Like, he's, he's like king of the world or something. I, I don't know,
Rod: That's not out of his sights, is it? Likely? I dunno. But is it possible? Hmm.
Will: But look, this is, this is a question, and I don't know if this is a call for research or if this is a, a call for conversation. But there's a parallel I've been thinking about. For a couple of decades, a bunch of climate researchers have accused the climate change community of not telling the truth.
And I don't mean that in a climate denial way. I mean, in a way that they [00:32:00] say, you know, you've actually softened the blow of how bad this is gonna be.
Rod: Yeah. We don't wanna spook the horses, man.
Will: We don't. And so, so you know, if there's a, a spectrum from this is the worst to this is the easiest, climate scientists have erred towards telling the sort of middle-of-the-road, or lesser, version.
And we've erred on not saying, not saying the worst case scenario. But I feel like, don't we need some worst case scenarios right now? Don't we need to hit the panic button and say, this is how bad things could be?
Rod: I don't, I wouldn't... I'll be pedantic and say, not so much the panic button, but say, no, we are actually in crisis now. Like, the crisis has come. Because otherwise, you know, this is back to a thing we did ages ago on The Wholesome Show about the memory hole with Covid and so on.
I think there's a memory hole thing going on here too. It's like, ah, but how bad is it really? Has it ever really gotten as bad as we're speculating? Oh, come on, everyone, calm down. Yeah, we don't wanna spook the horses. All that timid horseshit. And then the analyses that will follow afterwards, when the shit goes even worse than it already is, and it's already pretty bad.[00:33:00]
People go, oh yeah, I know, we should have seen that coming. It's like, you fucking wuss bags.
Will: Yeah.
Rod: It's cowardly too. It's like, oh, you panicked, it didn't go as bad as you said. It's like, have you thought that part of the reason it didn't go as bad as I said was 'cause I said...
Will: That was my panic. My panic was solving problems. This is, you know, the close people around me say this is, this is why pessimists are more important for society. You know, you need, you know, us idiot optimists, uh, sitting around here and thinking, oh, it can't be that bad.
Rod: But honestly, level-headed people who like to be very cautious and think about all the nuances and so forth, that I, that I hear from or am chatting with, they are also saying, like, no, we're actually in the poo now. Even Ezra, Ezra, dear Ezra said it's happening now. But then, as you say, he still goes on to say, and anyway, at least we got the midterms.
Like I wouldn't be relying on them existing
Will: Maybe it's something that we all get up and go to work in the morning. Like, you, you live your life, you know? It's like the band that kept playing on the Titanic, because what else do you do? You know, you, you've got a job and you sort of, [00:34:00] well, I'll go to work. And then one day, maybe I won't, because the world has gone really bad. But I gotta treat the world as continuing as if it does, even...
Rod: It is perfectly reasonable in the end. It's perfectly reasonable, like, as in understandable. Like, what else are you gonna do other than what you already do, until you can't anymore? If you've got no other idea or you feel impotent, what do you do?
Will: Yeah. Yeah. I'm, I'm really torn, and I'd love to know what people's worst case scenarios are. But, uh, I think we need more of a conversation about, you know, how bad things could get.
Rod: We agree.
Will: Do you want a tiny science fact?
Rod: Yeah. Yeah. I want a tiny science fact. It better be a nice one. I want something nice. Gimme something nice. 'Cause after that, I feel like ick.
Will: This is a very small thing that I just learned, and I was like, what? How is that possible? So, I was, I was flicking through, you know, one of those almanacs, you know, science facts, bunch of different things.
Rod: Honestly, that is a word that needs to [00:35:00] come back. We don't use the word almanac enough. It needs to be modernized and used in technological terms, but I love an almanac. It sounds so interesting and mysterious.
Will: you know, I wasn't looking at this, but I had never thought this, but, um, the sturgeon, the fish from which we get caviar.
Rod: Delicious, delicious
Will: do, do, do you know where they live and, uh, and how their lifecycle works?
Rod: they live, they live in the water and they get milked by Russian peasants. For our pleasure.
Will: Let's go with yes, basically. But the particular water that they live in is, uh, rivers, estuaries sometimes. There's versions of sturgeon all around the world. The famous and tasty beluga is, uh, very much from the Russian, uh, Kazakh, Ukrainian part of the world,
Rod: Can I also say that you, you've just given me a good idea, a good name for an indie band: Versions of Sturgeon. I think that'll be a great name.
Will: Some sturgeon. But do you know, do you know how big a sturgeon can get?
Rod: Fucking humongous. They're like dragons, man. They, they get like three men to lift them kind of thing, if I remember correctly, which I probably [00:36:00] don't.
Will: Yes, 100%. The world's biggest sturgeon ever caught, although there are some that might even be bigger than this, but that's unverified, going back... So in, uh, in the Volga delta in 1827, a sturgeon measuring 7.2 meters was caught.
Rod: 7.2 meters. That's a fricking whale shark.
Will: So this is the size of Jaws. Jaws was 25 foot, so this sturgeon would've been like 23 or something like that. And there's, there's stories about a mythical sturgeon a hundred years before that that was 8.6 meters long. And I'm like, this thing lived in a river. I had never had it in my head that the sturgeon got big, but the sturgeon can get as big as Jaws and lives in a river. And I'm...
Rod: I, I'd, I'd seen a three meter version and I thought, well you, that's the biggest fish in the universe. I'm wildly wrong. That is humongous. And they're kind of creepy looking.
Will: Oh yeah, yeah, yeah. They, they would be very creepy if I was having a swim in the river and [00:37:00] up comes a sturgeon. I think they're bottom feeders. Which means cover your ass, and your donkey will be safe.
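(For anyone checking the arithmetic, Will's ballpark holds up:)

```latex
7.2\ \text{m} \times 3.28\ \tfrac{\text{ft}}{\text{m}} \approx 23.6\ \text{ft}
\qquad
8.6\ \text{m} \times 3.28\ \tfrac{\text{ft}}{\text{m}} \approx 28.2\ \text{ft}
```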
Rod: Stop taking my jokes, you bastard. Well look, you, you've just listened to A Little Bit of Science, and of course what we want you to do is... how many stars can you get now? Is it like 28...
Will: 7.2
Rod: ...or something? 7.2 meters of stars, please. And if you have any suggestions, keep 'em to yourself. We'll just back channel this so that we can...
Will: No, you send...
Rod: ...so we can fix it.
Will: You hit cheers...
Rod: That's the address.
Will: ...at alittlebitofscience.com.au.
Rod: Cheers at a little bit of hole.com.
Will: I nearly said a little bit of wholesome.
Rod: But yeah, if you've got any ideas or if you've got any corrections, 'cause sometimes we'll get stuff wrong. I never do. I'm, I'm pure facts,
Will: Oh, and maybe, maybe send in your worst case scenarios.
Rod: Please do.
[00:38:00]