"Remembering the names of each skill you've installed, let alone the commands in each skill, illustrates the extent to which this new homescreen is entirely invisible. I've heard some people make notes about what skills they've added along with the commands Alexa understands, and then tape that list to the wall above the machine."

- Matt Hartman, "The Hidden Homescreen"

This sounds broken as well, though. I'd say there are opportunities for progressive disclosure, guiding the user through a journey of increasing mastery of the product (the bots) they're using. This will be a challenge for these chat-based UIs.
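As a rough sketch of what progressive disclosure could look like in a bot: track which commands a user already relies on and only surface the next tier of commands once the basics are familiar. The names below (`Command`, `suggestNextCommands`, the sample commands) are hypothetical, not taken from any real bot framework; this is just to illustrate the idea.

```typescript
// Hypothetical sketch: progressively reveal bot commands as the user gains mastery.
// None of these names come from a real bot platform; they only illustrate the idea.

interface Command {
  name: string;        // e.g. "weather"
  example: string;     // e.g. "weather in Amsterdam"
  tier: number;        // 0 = basics, higher = more advanced
}

const commands: Command[] = [
  { name: "weather", example: "weather in Amsterdam", tier: 0 },
  { name: "remind", example: "remind me to call Sam at 5pm", tier: 1 },
  { name: "summarize", example: "summarize my unread messages", tier: 2 },
];

// Suggest commands one tier above what the user already uses comfortably.
function suggestNextCommands(usedCommandNames: Set<string>): Command[] {
  const highestMasteredTier = Math.max(
    -1,
    ...commands.filter(c => usedCommandNames.has(c.name)).map(c => c.tier),
  );
  return commands.filter(c => c.tier === highestMasteredTier + 1);
}

// A new user only sees the tier-0 basics; someone who already uses "weather"
// gets nudged toward the tier-1 command instead of a full command list.
console.log(suggestNextCommands(new Set()));            // -> [weather]
console.log(suggestNextCommands(new Set(["weather"]))); // -> [remind]
```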

Within that, I do see a direction where it becomes a combination of traditional native UI and a chat interface. In this combination the main interaction is a 'rich' chat interface, with the possibility of inherently 'interactive' messages (i.e. messages with graphs, buttons, or other interactive elements). This has the potential to walk the line between being custom to the user and being guided by the service, while requiring less intelligence from the bot.
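To make the 'rich message' idea concrete, here's a minimal sketch of what such a message payload could look like. The type names are mine, not from any particular chat platform, though platforms like Slack and Facebook Messenger expose similar structured-message concepts.

```typescript
// Hypothetical shape for a "rich" chat message: plain text plus optional
// interactive elements the client can render inline. Not tied to any real platform API.

type MessageElement =
  | { kind: "text"; body: string }
  | { kind: "button"; label: string; action: string }   // tapping sends `action` back to the bot
  | { kind: "chart"; title: string; points: number[] }; // client renders a small inline graph

interface RichMessage {
  from: "bot" | "user";
  elements: MessageElement[];
}

// Example: instead of expecting the user to type the right command, the bot
// offers guided choices while still living inside the conversation.
const reply: RichMessage = {
  from: "bot",
  elements: [
    { kind: "text", body: "Your energy usage this week:" },
    { kind: "chart", title: "kWh per day", points: [12, 9, 14, 11, 10, 8, 13] },
    { kind: "button", label: "Compare to last week", action: "compare:last_week" },
    { kind: "button", label: "Set a usage alert", action: "alert:create" },
  ],
};

console.log(JSON.stringify(reply, null, 2));
```

The buttons are what keep the interaction guided by the service: the user never has to remember a command, they just pick from what the message itself offers.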

This won't work for voice-command interfaces, though. I guess that as long as the AI is not good enough to process natural language and correctly infer what you want from the context of what you're saying, we're going to have to resort to teaching users what they can and can't say (like Siri's 'things you can ask me').
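For voice, the closest equivalent I can think of is building that teaching moment into the interface itself: a help intent that reads back a few example phrases when the user is stuck, instead of relying on a list taped to the wall. A rough sketch, with hypothetical names (no real Alexa or Siri SDK is used here):

```typescript
// Hypothetical "what can I say?" handler for a voice skill.
// Not based on a real Alexa/Siri SDK; it only sketches the discoverability idea.

interface SkillIntent {
  name: string;
  samplePhrases: string[];
}

const intents: SkillIntent[] = [
  { name: "PlayStation", samplePhrases: ["play the news station", "play jazz radio"] },
  { name: "SetTimer", samplePhrases: ["set a timer for ten minutes"] },
  { name: "CheckCommute", samplePhrases: ["how long is my commute right now"] },
];

// When the user asks "what can I say?", read back one example phrase per intent.
function buildHelpResponse(skillIntents: SkillIntent[]): string {
  const examples = skillIntents.map(i => `"${i.samplePhrases[0]}"`).join(", or ");
  return `You can say things like ${examples}.`;
}

console.log(buildHelpResponse(intents));
// -> You can say things like "play the news station", or "set a timer for ten minutes",
//    or "how long is my commute right now".
```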