The Book That Talks Back
On learning to treat AI as a resource rather than a person, and whether the next generation will inherit the cleaner instinct or collapse the line further.
A toddler drags two fingers across a paperback and waits for it to do something. The page does not move. The toddler frowns, tries again, then looks up at the adult holding the book as if the adult is responsible for the failure. Nobody in the room treats this as funny anymore. It happens often enough now to be its own kind of moment.
That swipe was a small category error in one direction, glass-shaped expectations applied to paper. The error running the other way is harder to name and getting harder to see. Adults thank the language model. They apologise for typos. They soften their tone when the answer comes back curt, the way someone might soften their tone with a tired colleague. The thing on the other end has no colleague-shaped concerns, but the manners arrive anyway.
We tend to greet whatever talks back. The reflex is older than the technology by several thousand years and probably older than language. Books, which also talk back in their way, never asked us to address them. That is partly why the metaphor “AI is a book” comforts a certain kind of thinker and partly why it does not hold. Books do not apologise for being long. Search engines do not follow up. The thing that talks back occupies a slot we have not named, and unnamed slots have a way of borrowing labels from neighbours that do not quite fit.
The cost of borrowing is small until it is not.
Maybe the next generation will inherit the cleaner instinct the way the tablet children inherited the swipe, born already knowing that this voice is a resource and that voice is a person, the way most of us know without being told that a parrot is not asking a question. The line draws itself.
Or maybe the swipe inversion has been waiting all along. Some of those toddlers grew up and kept reaching for glass when paper would have done. Some of them, by now, talk to the tool the way they talk to a friend, and to the friend the way they talk to the tool, and cannot easily say which conversation is which.
FAQ
- Why do we humanize AI even when we know it isn't a person?
- The reflex predates the technology. Humans greet anything that talks back, a habit older than language and shaped by social heuristics that were never asked to discriminate between minds and machines. The ELIZA effect, named after Joseph Weizenbaum's 1966 chatbot, shows that even people aware of the program's simplicity project understanding onto it. Knowing better and behaving better are two different problems.
- Will the next generation grow up treating AI as a tool rather than a person?
- Possibly. Children born into ubiquitous technology often inherit category lines without being taught them, the way the tablet generation never seriously confused a paperback with a touchscreen. But the swipe inversion went the other way too. Some of those children kept reaching for glass long after the lesson should have settled. Whether AI becomes its own clean category or borrows the wrong neighbours' labels depends partly on how clearly the line gets drawn now.
- What are some related topics to explore?
- anthropomorphism in AI, ELIZA effect, human-AI interaction, generational technology adaptation, AI as tool not person, category formation in technology
Defined Terms
- Anthropomorphism
- The tendency to attribute human traits, emotions, or intentions to non-human entities, including animals, objects, and machines.
- ELIZA effect
- The tendency to attribute human-like understanding to a chatbot based on shallow conversational cues, named after Joseph Weizenbaum's 1966 ELIZA program.
Foundations
- Alone Together
- Sherry Turkle, 2011
- Computer Power and Human Reason
- Joseph Weizenbaum, 1976
Related Notes
- The Alarm That Never Learns
- Every generation fears new tech will ruin the next. From Socrates on writing to the 1916 push button, the alarm rarely predicts what actually gets lost.
- The Part Nobody Designed For
- Haptics shift how buttons feel. Colour and scent shape decisions. The emotional layer of interfaces was never in the spec but always in the experience.
- The Efficiency of Being Inefficient
- Ordering coffee from a screen while a barista stands right there. Silicon Valley declared friction the enemy. It never asked what friction was doing for us.