Advanced Voice Mode is kind of like a next-generation Siri or Alexa.
You talk to your phone and it talks back.
It’s a concept that we’re all familiar with by now.
Still, Advanced Voice Mode absolutely dominated the GPT-4o launch event on May 13th.
I rarely see conversations about Advanced Voice Mode’s practical capabilities.
Yeah, it can answer questions and look through your camera, but most people don’t seem to care.
OpenAI clearly knew that a human-like voice would capture the public’s imagination.
The hype for Advanced Voice Mode has died down a bit, thanks in part to the controversy over the “Sky” voice, which many listeners thought sounded like Scarlett Johansson. OpenAI says that “Sky” wasn’t intended to sound like Johansson.
The human-like tone of Advanced Voice Mode will still be a topic of conversation during this Alpha test.
But the novelty and hype have been diminished by a nearly three-month wait.
Now’s the time to mention that ChatGPT cannot mimic other people’s voices.
Select ChatGPT Plus users will see an Advanced Voice Mode notification in the ChatGPT mobile app.
These users will also receive an email explaining how to use the feature.
A wider rollout will come in late 2024.