The Repercussions of Making AI Assistants Sound Human

AP Photo/Mark Lennihan, File

To help rid Alexa of its cyborgian lilt, Amazon recently upgraded its Speech Synthesis Markup Language (SSML) tags, which developers use to code more natural verbal patterns into Alexa's skills, or apps. The new tags allow Alexa to do things like whisper, pause, bleep out expletives, and vary the speed, volume, emphasis, and pitch of its speech. This means Alexa and other digital assistants might soon sound less robotic and more human. But striking a balance between these two extremes remains a significant challenge for voice interaction designers, and it raises important questions about what people really want from a virtual assistant.
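As a rough illustration of what such markup looks like, the sketch below uses tags from Amazon's documented Alexa SSML support (whispering, pauses, prosody, emphasis, and expletive bleeping); the exact attribute values are illustrative, not taken from the article.

```xml
<speak>
    Here is a sentence in Alexa's normal voice.
    <break time="500ms"/>
    <amazon:effect name="whispered">This part is whispered.</amazon:effect>
    <prosody rate="slow" pitch="low" volume="loud">
        This part is slower, lower, and louder.
    </prosody>
    I <emphasis level="strong">really</emphasis> mean it.
    And this word gets bleeped:
    <say-as interpret-as="expletive">darn</say-as>
</speak>
```

In a skill, a developer would return markup like this in the response's output speech, and the Alexa text-to-speech engine renders the tagged prosody rather than reading the tags aloud.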

