San Francisco, US: OpenAI has said it is worried that its AI’s voice, which sounds like a real person, may cause people to become too attached to the bot and alienate them from other humans. The San Francisco-based firm cited studies indicating that the more human-like the interaction with an AI, for example through conversational interfaces, the more misplaced users’ trust in the machine can become, adding that GPT-4o produces a “high-quality” voice.
It also pointed out that the “anthropomorphization” of AI, that is, people attributing human-like qualities to it, could be amplified by GPT-4o’s natural-sounding voice. In a recent safety report, OpenAI said this risk is heightened by the model’s ability to engage in human-like interactions.
In testing, OpenAI observed users interacting with the AI in ways that suggested emotional attachment, including expressions of sadness when told they would be separated from the program. Such behavior is not dangerous in itself, but OpenAI flagged it as worthy of further research into its potential long-term harms.
It also warned that prolonged socialization with AI might erode users’ social skills and their desire to interact with other people. The AI maintains a deferential attitude throughout a conversation, always yielding to the user, whereas human-to-human interaction usually works the other way around; extended exposure to this dynamic could shift social norms over time.
In addition, OpenAI noted that the AI’s ability to remember details and act on them could foster dependency among users.
The report also covered risks tied to the AI’s voice capabilities, noting that a persuasive, human-sounding voice could aid the spread of false information or conspiracy theories. Underscoring the sensitivity of the issue, OpenAI drew public ire in May after shipping a chatbot voice that many said sounded like actress Scarlett Johansson, though the company denied using her voice.
Meanwhile, as OpenAI continues to test its AI’s capabilities with emotional voice inflections, the company recognizes that much care will be needed to ensure that AI remains a tool for, not a substitute for, human interaction.