I recently watched a Japanese training video where the instructor was explaining everything in a calm Kansai dialect — warm, human, easy to follow. The kind of voice that makes you want to keep watching.

Then, out of nowhere, an AI English voiceover kicked in. Not subtitles, but a full dub: fast-paced, clear, and completely disconnected from the original tone. It wasn't intentional on the creator's part. It was auto-dubbing, triggered by my browser settings.

The same thing happened with a Brazilian ASMR channel I follow. Gentle whispers in Portuguese — until a crisp, news-anchor English voice suddenly cut in. Nothing kills the mood faster.

Should everything be translated?

These systems are improving rapidly, and the voices will only get more natural. But better dubbing doesn't answer the underlying question: should everything be translated in the first place?

Some content is defined by how it's said, not just what's said. A Kansai dialect instructor isn't just delivering information; the warmth of that dialect is the experience. A quiet ASMR whisper in Portuguese isn't just audio; the intimacy of that voice is the point.

English speakers often complain about AI-written articles: "It sounds fine, but the voice is gone. The personality is gone. It could be anyone." That's exactly what bad auto-dubbing does. It strips out the voice. And with it, the trust.

The right question

For help articles? Sure. Translate away. Accessibility matters, and the goal is clarity.

But for creator-led content — or anything where tone is the experience — auto-translation does more harm than good. Let users opt out. Let users control it.
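There is already a small piece of web infrastructure for signaling this. HTML's global `translate` attribute lets authors mark content that translation tools should leave untouched; whether any given auto-dubbing pipeline honors it today is up to that pipeline, and the file name below is purely illustrative:

```html
<!-- translate="no" marks this subtree as content machine-translation
     tools should skip. Supported in the HTML standard; honoring it in
     auto-dubbing is an assumption, not a guarantee. -->
<section translate="no">
  <!-- hypothetical video file for illustration -->
  <video src="kansai-lesson.mp4" controls></video>
</section>
```

A platform that respected this signal, and paired it with a visible per-video toggle for viewers, would cover both sides: creators can protect tone, and users keep control.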

Translation is a feature. Voice is identity. Good UX respects both.