Testing Streaming Conversational User Experience with Large Language Models

Adaptiv Me
Published in Being Adaptiv
Mar 20, 2024


User engagement is the cornerstone of successful applications. Streaming Conversational User Experience (SCUX), facilitated by Large Language Models (LLMs), is driving a fundamental change in how users interact with intelligent systems, enabling real-time, fluid dialogues akin to human conversation.

Traditional conversational interfaces often use batch processing, accumulating user inputs before generating responses. SCUX, in contrast, leverages streaming data processing, enabling instantaneous responses and fostering a more natural and engaging user experience.
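The contrast can be sketched in a few lines of Python. The `fake_llm` function below is a hypothetical stand-in for a real model call; the point is only that the batch style assembles the whole reply before showing anything, while the streaming style surfaces each token as it arrives.

```python
from typing import Iterator

def fake_llm(prompt: str) -> Iterator[str]:
    # Hypothetical stand-in for an LLM; emits a canned reply token by token.
    for word in ("Hello", ", ", "world", "!"):
        yield word

def batch_reply(prompt: str) -> str:
    """Batch style: the full reply is accumulated before anything is shown."""
    return "".join(fake_llm(prompt))

def streaming_reply(prompt: str) -> Iterator[str]:
    """Streaming style: each token is yielded as soon as it is produced."""
    for token in fake_llm(prompt):
        yield token  # the UI can render this immediately
```

With a real model, the difference is perceived latency: the streaming caller starts rendering after the first token rather than after the last.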

Key Features of SCUX with LLMs:

  • Real-time Adaptability: LLMs within SCUX possess the ability to dynamically adapt and evolve conversations based on user input. This facilitates refined responses, updated information, and seamless conversational extensions, mirroring human communication.
  • Event-Driven Architectures and Real-time Analytics: SCUX development capitalises on streaming data processing techniques, including event-driven architectures and real-time analytics. These techniques empower the creation of responsive systems capable of delivering personalised user experiences.
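As a rough illustration of the event-driven idea (all names here are hypothetical, not from any particular framework), a tiny event bus lets independent components, such as an analytics consumer and a responder, react to the same user-input event:

```python
from collections import defaultdict
from typing import Any, Callable, DefaultDict, List

class EventBus:
    """Minimal event-driven core: handlers subscribe to named events."""

    def __init__(self) -> None:
        self._handlers: DefaultDict[str, List[Callable[[Any], None]]] = defaultdict(list)

    def on(self, event: str, handler: Callable[[Any], None]) -> None:
        self._handlers[event].append(handler)

    def emit(self, event: str, payload: Any) -> None:
        for handler in self._handlers[event]:
            handler(payload)

# Usage: real-time analytics and the responder both observe the same event.
bus = EventBus()
log: list = []
bus.on("user_input", lambda text: log.append(("analytics", text)))
bus.on("user_input", lambda text: log.append(("respond", text.upper())))
bus.emit("user_input", "hello")
```

In a production system the handlers would feed an analytics pipeline and an LLM inference call; the decoupling is what lets each part scale and evolve independently.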

At the heart of SCUX lies the interplay between user inputs and LLM outputs. Here’s a breakdown of the process:

  1. User Input Stream: User interactions, including text, voice, or any other supported modality, are fed as a continuous stream into the SCUX system.
  2. Real-time Processing: The system employs streaming data processing techniques to segment the user input stream into manageable units for analysis.
  3. LLM Inference: Each segment is forwarded to the LLM for real-time inference. The LLM leverages its vast knowledge and understanding of language to interpret the user’s intent and generate a response.
  4. Response Stream Generation: The LLM’s response is incorporated into a response stream, which is continuously updated based on user input and ongoing analysis.
  5. Context Management: SCUX maintains a dynamic context buffer, allowing the LLM to track past interactions and user preferences. This context informs future responses, ensuring consistency and relevance throughout the conversation.
  6. Personalized Experience: By analyzing user interactions and leveraging the LLM’s capabilities, SCUX personalizes the user experience by tailoring responses and recommendations to the individual user’s needs and goals.
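The six steps above can be condensed into a minimal sketch. The class and its `_infer` method are hypothetical placeholders (a real system would call an LLM API here); what the sketch shows is the shape of the loop: consume an input stream, run inference, yield a response stream, and maintain a bounded context buffer.

```python
from collections import deque
from typing import Deque, Iterable, Iterator, Tuple

class SCUXSession:
    """Sketch of the SCUX pipeline; the model call is a stand-in."""

    def __init__(self, max_context: int = 10) -> None:
        # Step 5: bounded context buffer of past (user, assistant) turns.
        self.context: Deque[Tuple[str, str]] = deque(maxlen=max_context)

    def respond(self, user_stream: Iterable[str]) -> Iterator[str]:
        # Steps 1-2: consume the user input stream as one segment.
        segment = "".join(user_stream)
        # Step 3: real-time inference, which could condition on self.context.
        reply_parts = []
        # Step 4: surface the reply as a stream while it is generated.
        for token in self._infer(segment):
            reply_parts.append(token)
            yield token
        # Record the completed turn for future context (steps 5-6).
        self.context.append((segment, "".join(reply_parts)))

    def _infer(self, segment: str) -> Iterator[str]:
        # Hypothetical stand-in for a streaming LLM call.
        yield from ("You", " said: ", segment)
```

The `deque(maxlen=...)` buffer is a deliberately simple context-management policy; real systems typically use token-budgeted windows or summarisation instead.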

Illustrative Example: Ask Ada

Ada, Adaptiv’s AI-powered career mentor, is designed to guide users through the complexities of career planning. Users come to Ask Ada for career advice. After an initial interaction to understand the user’s profile and goals, a career recipe is generated as a pathway to help the user achieve their career goals.

The experience exemplifies SCUX in action. Ask Ada utilises LLMs to offer real-time, personalised guidance. As user interaction unfolds through questions and feedback, the system continuously analyses the conversation, refining responses to better address user needs.

Advantages of SCUX:

  • Enhanced Context Management: SCUX excels at managing conversational context compared to traditional chatbots. LLMs can remember prior interactions, anticipate user intent, and tailor responses accordingly, mitigating disjointed interactions.
  • Interactive Storytelling and Immersive Experiences: SCUX unlocks possibilities for enriched user engagement through interactive storytelling and immersive experiences. Integration of multimedia elements (images, videos, audio) into the conversational flow fosters captivating interactions and deeper engagement.
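One way to model a multimedia response stream, sketched here with hypothetical types rather than any particular SDK, is to interleave typed chunks so the renderer can handle text and media uniformly:

```python
from dataclasses import dataclass
from typing import Iterator, Union

@dataclass
class TextChunk:
    text: str

@dataclass
class MediaChunk:
    kind: str  # e.g. "image", "video", "audio"
    url: str

Chunk = Union[TextChunk, MediaChunk]

def storytelling_stream() -> Iterator[Chunk]:
    # Hypothetical interleaving of text and media in one response stream.
    yield TextChunk("Here is the city you asked about:")
    yield MediaChunk(kind="image", url="https://example.com/city.jpg")
    yield TextChunk("Notice the skyline on the left.")
```

A client consuming this stream can render text incrementally and fetch media asynchronously, keeping the conversation fluid while richer content loads.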

Challenges of SCUX Implementation:

  • Robust Infrastructure: Real-time processing demands robust infrastructure capable of handling high data volumes and ensuring rapid response times.
  • Lack of established UX: As Peter Isaacs points out in this article, ideas for smart and impactful UX in conversational AI are in short supply. There is a “massive problem in conversational AI (CAI) and there’s a need for these teams to bring UX designers into conversations. This sort of user empathy is missing, leading to many interfaces that aren’t all that friendly.”
  • Development Team Mindset Change: Transitioning from a batch processing mindset to a streaming paradigm requires a significant shift in developer thinking. New skillsets encompassing real-time data processing, event-driven architectures, and familiarity with LLM APIs become crucial for successful SCUX development.

Streaming conversational user experience represents a significant leap forward in the evolution of user interfaces. By enabling dynamic and continuous dialogues between users and intelligent systems, LLMs can empower developers to create more natural, engaging, and personalised experiences. As technologies continue to advance, we can expect streaming conversational interfaces to play an increasingly prominent role in shaping the future of human-computer interaction.