Streaming

API reference for real-time message streaming using Server-Sent Events (SSE). This powers the live chat responses in the Intercom component.

Note: Streaming is handled automatically by the Intercom component. This reference is for understanding internal behavior.


Overview

The SDK uses Server-Sent Events (SSE) to stream deva responses in real time:

  • Technology: @microsoft/fetch-event-source library

  • Protocol: SSE (Server-Sent Events)

  • Use Case: Chat completions with devas

Streaming Flow:

  1. User sends message

  2. SDK opens SSE connection

  3. Deva response streams word-by-word

  4. UI updates in real-time

  5. Final message received, stream closes


useStreamResponse Hook

Internal hook for managing message streaming (used by the Intercom component).

Signature

const { message, isStreaming, error, isDone } = useStreamResponse({
  messageId,
  threadId
});

Parameters

messageId: string | undefined

  • ID of the user message to stream a response for

  • Triggers stream when provided

threadId: string | undefined

  • ID of the chat thread

  • Required for API endpoint

Return Values

{
  message: Message | null;      // Streaming message (partial then complete)
  isStreaming: boolean;          // True while streaming active
  error: Error | null;           // Error if stream fails
  isDone: boolean;               // True when stream complete
}
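
For illustration, a minimal sketch of consuming the hook from a React component; the import path and the component itself are assumptions, since the hook is internal to the SDK:

import { useStreamResponse } from "your-sdk"; // assumed import path; the hook is internal

function StreamingReply({ messageId, threadId }: { messageId?: string; threadId?: string }) {
  const { message, isStreaming, error, isDone } = useStreamResponse({ messageId, threadId });

  if (error) return <p>Stream failed: {error.message}</p>;
  if (!message) return isStreaming ? <p>Deva is typing...</p> : null; // connected, no content yet (or idle)
  return <p>{isDone ? message.text : message.text + "..."}</p>;       // partial while streaming, complete once done
}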

Stream Endpoint

Request

POST {baseUrl}/api/sdk/chat/{threadId}/messages/{messageId}/query/stream

Headers:

{
  "Authorization": `Bearer ${accessToken}`,
  "Content-Type": "application/json",
  "Accept": "text/event-stream"
}

Parameters:

  • threadId - Chat thread ID

  • messageId - User message ID to respond to
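
For reference, a sketch of opening this stream directly with @microsoft/fetch-event-source; the wrapper function and the handler bodies are illustrative, not the SDK's internal code:

import { fetchEventSource } from "@microsoft/fetch-event-source";

// Opens the SSE stream for one user message; the function name is illustrative.
async function openMessageStream(
  baseUrl: string,
  accessToken: string,
  threadId: string,
  messageId: string,
  signal: AbortSignal
) {
  await fetchEventSource(
    `${baseUrl}/api/sdk/chat/${threadId}/messages/${messageId}/query/stream`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
        Accept: "text/event-stream",
      },
      signal,              // lets the caller abort the stream (e.g. after response_message)
      onmessage(msg) {
        // msg.event is the event type ("content", "response_message", ...); msg.data is the payload
      },
      onerror(err) {
        throw err;         // rethrow so the library stops retrying
      },
    }
  );
}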


Stream Events

The stream emits different event types:

content

Partial text content chunks sent as the deva generates its response.

Event Type: content

Data Format:

"partial text chunk"

Handling:

if (type === "content") {
  const chunk = JSON.parse(msg.data);
  accumulatedContent += chunk;
  setMessage({ ...prevMessage, text: accumulatedContent });
}

Behavior:

  • Sent multiple times as the text is generated

  • Each chunk adds to previous content

  • Must accumulate chunks client-side


response_message

The complete message object, sent when streaming finishes.

Event Type: response_message

Data Format:

{
  "id": "msg_123",
  "text": "complete response text",
  "persona_id": "persona_456",
  "thread_id": "thread_789",
  "created_at": 1234567890,
  "author_type": "BOT"
}

Handling:

if (type === "response_message") {
  const message = JSON.parse(msg.data);
  setMessage(message);
  setIsDone(true);
  setIsStreaming(false);
  handleStreamComplete(message);
  controllerRef.current?.abort();
}

Behavior:

  • Sent once at end of stream

  • Contains full persisted message

  • Triggers stream cleanup and closure


completion

The LLM query has completed, but the stream remains open while waiting for the final response_message event.

Event Type: completion

Behavior:

  • Indicates the LLM has finished generating

  • Stream stays open for final message

  • Typically logged for debugging


error

Stream encountered an error.

Event Type: error

Behavior:

  • Sets error state

  • Stops streaming

  • Closes connection


Stream States

Event Types

type ChatLLMQueryStreamTypes =
  | "content"           // Text chunk
  | "response_message"  // Final message
  | "completion"        // Query complete
  | "error"            // Error occurred

Component States

Idle:

  • isStreaming: false

  • message: null

  • isDone: false

Streaming:

  • isStreaming: true

  • message: { text: "partial..." } (accumulating)

  • isDone: false

Complete:

  • isStreaming: false

  • message: { ...complete } (full message)

  • isDone: true

Error:

  • isStreaming: false

  • error: Error

  • isDone: true
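
Where a single label is more convenient than the individual flags, they can be collapsed with a small helper; this is a convenience sketch, not part of the SDK API:

type StreamState = "idle" | "streaming" | "complete" | "error";

function getStreamState(hook: { isStreaming: boolean; isDone: boolean; error: Error | null }): StreamState {
  if (hook.error) return "error";
  if (hook.isStreaming) return "streaming";
  if (hook.isDone) return "complete";
  return "idle";
}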


UI Components

The SDK provides three components for different streaming states:

IncomingMessageIndicator

Shows a typing indicator (three bouncing dots) after the stream starts but before content arrives.

InProgressMessage

Displays the accumulating text in real time as content chunks arrive during streaming.

MessageStreamError

Shows an error message with a red background when the stream fails.
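
A rendering sketch of how these components map onto the stream states; the import paths and component props are assumptions here, since the Intercom component wires this up internally:

import { IncomingMessageIndicator, InProgressMessage, MessageStreamError } from "your-sdk"; // assumed import path
import type { Message } from "your-sdk";                                                    // assumed import path

function StreamedReply(props: { message: Message | null; isStreaming: boolean; error: Error | null }) {
  if (props.error) return <MessageStreamError error={props.error} />;                           // stream failed
  if (props.isStreaming && !props.message) return <IncomingMessageIndicator />;                 // no content yet
  if (props.isStreaming && props.message) return <InProgressMessage message={props.message} />; // text accumulating
  return null; // idle, or rendered as a regular message once complete
}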


Integration with Intercom

The Intercom component handles streaming automatically:

  1. User sends message → Creates message via API

  2. Sets streamingMessageId to trigger stream

  3. useStreamResponse hook opens SSE connection

  4. Content chunks accumulate and update UI in real-time

  5. Final response_message received → Stream closes

The useStreamResponse hook manages all streaming state internally, including:

  • Opening/closing SSE connection

  • Accumulating content chunks

  • Handling errors and cleanup

  • Updating component state
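
A simplified sketch of that wiring, assuming a hypothetical createMessage helper and illustrative import paths; the real Intercom component adds more state and error handling:

import { useState } from "react";
import { useStreamResponse } from "your-sdk"; // assumed import path

// Hypothetical API helper that persists the user message (step 1).
declare function createMessage(threadId: string, text: string): Promise<{ id: string }>;

function ChatThread({ threadId }: { threadId: string }) {
  const [streamingMessageId, setStreamingMessageId] = useState<string | undefined>();

  // Providing a messageId makes useStreamResponse open the SSE connection (step 3).
  const { message, isStreaming } = useStreamResponse({
    messageId: streamingMessageId,
    threadId,
  });

  async function handleSend(text: string) {
    const created = await createMessage(threadId, text);
    setStreamingMessageId(created.id); // step 2: setting the ID starts the stream
  }

  return (
    <div>
      <p>
        {message?.text /* accumulates in real time (step 4), complete after step 5 */}
        {isStreaming && "..."}
      </p>
      <button onClick={() => void handleSend("Hello")}>Send</button>
    </div>
  );
}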

