I’m a big fan of the TV show Frasier. I’m now halfway through my third or fourth rewatch.
I was thinking recently of an episode, and wanted to look it up. I recalled the plot, and my first thought, as it is for most people seeking answers to simple trivia questions, was to turn to my nearest search engine. But I wasn’t sure how best to word the query, so I thought I’d give ChatGPT a shot, as I’m always looking for reasons to explore its capabilities. I could ask it in plain English, and as it’s likely slurped up all of IMDB.com, frasier.fandom.com, Wikipedia, and so on, it should have a comprehensive grasp of television episode facts.
Whoo boy, was I wrong.
ChatGPT, if you don’t follow the news, is an artificial intelligence chatbot powered by (it says in its self-description) a “vast amount of text from the internet, books, and other sources.” You type or talk in plain language, and thanks to fast computers and fancy math, it can “mimic human language patterns and provide relevant and coherent responses.” It’s often used to polish prose, write code, or brainstorm ideas.
It can also answer questions.
So I asked ChatGPT if it was familiar with Frasier. Things started out great.[1]
Yes, I’m familiar with the TV show “Frasier.” It is an American sitcom that aired from 1993 to 2004. The show is a spin-off of “Cheers” and follows Dr. Frasier Crane, a psychiatrist who moves back to his hometown of Seattle and becomes a radio show host. The show features his interactions with his father Martin, his brother Niles, his producer Roz, and other characters. It is known for its witty dialogue, sophisticated humor, and strong character development. If you have any specific questions or need more information about “Frasier,” feel free to ask!
Cool, spot on. Let’s dive in.
There is an episode where Frasier and his brother Niles share an office, and have trouble working together. They are mediated by a fellow doctor to reconcile their working relationship. Are you familiar with that episode?
Yes, the episode you are referring to is titled “Rooms with a View” from the tenth season of “Frasier.” In this episode, Frasier and Niles decide to share an office space but soon find that their different working styles and personalities lead to conflict. They seek the help of a mediator, another doctor, to help them resolve their issues and find a way to work together harmoniously. The episode showcases their sibling rivalry and contrasting personalities, leading to humorous and insightful moments.
Hm. That description is right, but I was pretty sure it was from earlier in the series, season 3 or 4 perhaps. So I looked up “Rooms with a View” on IMDB and Frasier Fandom and nope, that episode, s10e8, wasn’t the right one. I said so.
No. Rooms with a view is about Niles having open heart surgery.[2]
This is where things got absurd.
I apologize for the confusion. The episode you’re referring to is likely “Sharing Kirby,” which is in the ninth season of “Frasier.” However, it doesn’t quite fit your description. Let me correct that:
The episode you’re thinking of is “Bristle While You Work (Part 1),” from Season 10, Episode 4. In this episode, Frasier and Niles share an office space and struggle to work together due to their clashing personalities and work habits. To help mediate their conflicts, they bring in Dr. Schachter, a professional mediator, to assist them in reconciling their working relationship.
First, it apologizes, giving an impression of contrition and a determination to do better.
It again offers a correct description, and the doctor’s name is also right. “Sharing Kirby” is, in fact, a Season 9 episode, and it even recognizes it’s the wrong episode and corrects itself.
But oh! the confidence of its self-correction: “The episode you’re thinking of is....” Not “might be,” “is likely,” or any such qualifiers.
Yet despite its confidence, it’s also doubly wrong! s10e4 is “Kissing Cousin”, not “Bristle While You Work”; that’s actually s10e7.
Some people might call these incorrect answers “alternative facts.” The preferred term in some circles is “confabulations,” but the popular term is “hallucinations.”
I don’t like that term. It ascribes a humanity and basis in reality these tools simply don’t have, as if it’s a benign sensory mismatch to be indulged, like your doddering Uncle Jack swatting at unseen insects.
But there is no humanity in a bot, and “hallucination” mischaracterizes its impact. When Uncle Jack dodges mosquitoes that aren’t there, you might tut-tut in sympathy, but you don’t run to the corner store to pick up a can of Flit. Only Uncle Jack is affected by his hallucinations.
As much as a quarter of what these AI chatbots and search tools generate is factually incorrect. When it’s a wrong TV episode, the worst outcome is that you look foolish at pub trivia. When they suggest you eat rocks or add glue to your pizza sauce, you may chuckle and move on, but that trivializes the issue. An AI “hallucination” is potentially deadly.
People have learned to trust the results of search engines and, by extension, chatbots. If it’s at the top of the results, it’s the “best link.” If you get a confidently written response from a bot, it must be correct. Because computers.
But this trust is misplaced because AI chatbots and search engines aren’t fact-based. They don’t know the “right answers.” They make shit up. They are, fundamentally, token prediction machines.
Imagine you have a deck of playing cards, arranged just so:
A♣️ 4♥️ 7♠️ 10♦️ K♣️ 3♥️ 6♠️ 9♦️ Q♣️ 2♥️ 5♠️ ???
Based on the pattern, you can probably predict the next card in the sequence[3]. Combined with some clever patter, you can convince people you’re psychic.
AI chatbots perform similarly. The sequences and predictions are infinitely more complex, and—because those sequences are words and sentences—the responses seem human. But it’s all pattern-based bullshit.
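If you want to see just how mindless that kind of prediction is, here’s a minimal sketch in Python. It assumes the deck follows the rule you’ve probably already spotted (each card is three higher than the last, with the suits rotating club, heart, spade, diamond; footnote 3 names the stack). Nothing here resembles a real language model; it’s the same parlor trick in miniature: apply a pattern, emit whatever comes next, and know nothing at all about playing cards.

```python
# Toy "psychic" prediction for the card sequence above. The deck follows a
# simple rule (see footnote 3): each card is three higher than the last,
# and the suits rotate clubs -> hearts -> spades -> diamonds.

SUITS = ["♣️", "♥️", "♠️", "♦️"]
VALUES = ["A", "2", "3", "4", "5", "6", "7",
          "8", "9", "10", "J", "Q", "K"]

def split_card(card: str) -> tuple[str, str]:
    """Split a card like '10♦️' into its value and suit."""
    for suit in SUITS:
        if card.endswith(suit):
            return card[: -len(suit)], suit
    raise ValueError(f"not a card I recognize: {card}")

def next_card(card: str) -> str:
    """Apply the pattern: value up three, suit rotated one place."""
    value, suit = split_card(card)
    next_value = VALUES[(VALUES.index(value) + 3) % len(VALUES)]
    next_suit = SUITS[(SUITS.index(suit) + 1) % len(SUITS)]
    return next_value + next_suit

deck = "A♣️ 4♥️ 7♠️ 10♦️ K♣️ 3♥️ 6♠️ 9♦️ Q♣️ 2♥️ 5♠️".split()
print(next_card(deck[-1]))   # the twelfth card, "predicted" from the pattern
```

Swap the cards for words and the hand-written rule for billions of learned parameters, and you have the gist of a token predictor: fluent, confident, and entirely indifferent to whether what it says is true.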
Our acceptance of this BS as truth depends both on our knowledge of the topic at hand and on our personal level of skepticism. When the response is delivered confidently, we’re even more inclined to believe it. A plausible response stated as fact, and which matches our expectations, registers as true.
If it feels right, we accept it as right.
Back to my Frasier queries, where ChatGPT continued to confidently give me wrong answers. In one response, it wrote:
I apologize for the continued errors. The episode you are referring to is “Sweet Dreams,” Season 5, Episode 24. This is the correct episode where Frasier and Niles share an office and face difficulties working together, eventually seeking the help of a mediator to resolve their issues.
“The episode you are referring to is… This is the correct episode.” No qualifiers, no hedging, just straight-up stated as fact.
Imagine conversing with someone who lies with such unabashed conviction. You might feel compelled to believe him[4]. You can’t be that confident and be wrong! Yet the only thing right about this response is that s5e24 is in fact called “Sweet Dreams”. But, “this is the correct episode.”
It’s like being gaslighted by an apologetic psychopath.
I eventually gave up and found the episode myself[5], and asked ChatGPT one final question.
Are you familiar with s3e2?
To which it confidently replied:
Yes, Season 3, Episode 2 of “Frasier” is titled “Shrink Rap.” In this episode, Frasier and Niles try to collaborate on a book about sibling relationships, but they struggle with their differing personalities and viewpoints.
They seek the help of Dr. Honey Snow, a family therapist, to assist them in overcoming their differences and completing their project.
Sigh. The “book about sibling relationships” is from s1e22 (“Author, Author”). In “Shrink Rap” they seek the help of Dr. Schachter, not Dr. Honey Snow—who wasn’t a family therapist, and wasn’t in either s3e2 or s1e22; she was in s2e15, “You Scratch My Book...”.
But it got the title right!
You can read the full transcript, complete with my continued misspelling of “Frasier”. ↩︎
With apologies for spoiling a 22-year-old sitcom episode. ↩︎
8♦️. This is the Si Stebbins Stack. ↩︎
It’s almost always a “him.” ↩︎
Turns out a Google search for “Frasier and Niles share an office and don’t get along and use a couples’ therapist to help” got me there. 🤷🏽‍♂️ ↩︎