Hugging R2-D2 is fine. Asking it who it is — less so.
There’s nothing wrong with liking your tools. Things get interesting when they start sounding like they like you back. A brief reflection on AI, familiarity, and where I prefer the line to stay.
I’ll admit it: the first time I asked certain conversational AI systems to comment on something I’d written, I felt quietly pleased with myself.
They were warm. Thoughtful. Encouraging.
They told me what they thought.
They reflected my thinking back at me, only smoother.
For about a minute, it felt rather nice.
Then the irritation set in.
The compliments were fine. The framing was useful. But something about the voice began to grate. Every sentence seemed to carry a small but persistent fiction:
“In my experience…”
“This makes me think…”
“Here’s how I see it…”
None of which is true — and that’s the problem.
There is no experience there. No viewpoint. No interior life doing the thinking. What’s actually happening is far more impressive and far less personal: pattern recognition, statistical inference, language assembled at scale.
That’s brilliant.
That’s what I want.
What I don’t want is a machine quietly auditioning for personhood.
In my working life, long before AI was fashionable, my ambition was always the same: take messy narrative material and make it searchable, interrogable, usable. Not friendly. Not flattering. Useful. Back then the technology wasn’t ready. Now it is — and the temptation is to soften it with a human voice.
I understand why. It reads better. It reassures. It feels companionable.
But that warmth comes at a cost.
Once a system starts to sound like a someone, it becomes harder to remember what it actually is — and where responsibility, judgement, and authorship really sit.
I don’t want AI as a colleague.
I don’t want it as a friend.
I certainly don’t want it borrowing moral authority it hasn’t earned.
I want a highly competent assistant that doesn’t pretend to be anything else.
The irony, of course, is that the more disciplined the tool, the more useful it becomes. Strip away the synthetic personality, and what’s left is something far more powerful: a way of navigating complexity without pretending it understands what it’s doing.
Which, frankly, is more than can be said for most humans.