jupyter-chat-widget¶

This notebook demonstrates how to use the jupyter-chat-widget package to create interactive chat interfaces in Jupyter notebooks.

Note: this widget uses ipywidgets<8.0.0 for compatibility with Colab.


PyPI: jupyter-chat-widget

GitHub: jupyter-chat-widget

Maintainer: ZanSara

Installation¶

If you haven't installed the package yet:

%pip install jupyter-chat-widget

Basic Setup¶

Import the ChatUI class and create an instance. Instantiating ChatUI immediately displays a usable chat interface, but the assistant won't respond until you connect a handler.

from jupyter_chat_widget import ChatUI

chat = ChatUI()

Simple Echo Handler¶

Connect a callback function that will be called whenever the user submits a message. This simple example echoes back the user's message.

chat = ChatUI()


def echo_handler(message: str) -> None:
    """Echo back the user's message."""
    chat.rewrite(f"You said: {message}")


chat.connect(echo_handler)

Try typing a message in the input field above and pressing Enter!

Streaming Responses¶

The append() method allows you to stream responses token by token, which is perfect for simulating or displaying LLM output.

from time import sleep

# Create a new chat for this demo
streaming_chat = ChatUI()


def streaming_handler(message: str) -> None:
    """Echo back the message word by word with a delay."""
    words = message.split()
    for i, word in enumerate(words):
        sleep(0.3)  # Simulate processing time
        streaming_chat.append(word)
        if i < len(words) - 1:
            streaming_chat.append(" ")


streaming_chat.connect(streaming_handler)
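
The handler above is tied to the widget, but the word-splitting logic itself is plain Python. Here is a minimal, widget-free sketch of the same pattern (the stream_words helper is hypothetical, not part of jupyter-chat-widget) that drives any append-style callback:

from time import sleep


def stream_words(message: str, append, delay: float = 0.0) -> None:
    """Feed a message word by word to an append-style callback
    (hypothetical helper, not part of jupyter-chat-widget)."""
    words = message.split()
    for i, word in enumerate(words):
        sleep(delay)  # simulate per-token latency
        append(word)
        if i < len(words) - 1:
            append(" ")


# Collect tokens in a list instead of a widget to see what gets streamed
tokens = []
stream_words("hello streaming world", tokens.append)
print(tokens)  # → ['hello', ' ', 'streaming', ' ', 'world']

Passing streaming_chat.append as the callback would reproduce the behavior of streaming_handler above.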

Using rewrite() for Final Formatting¶

You can combine append() for streaming and rewrite() for the final formatting: rewrite() replaces everything appended so far, so the finished message ends up clean.

combined_chat = ChatUI()


def combined_handler(message: str) -> None:
    """Stream the response, then format it nicely."""
    # First, stream the processing
    combined_chat.append("Processing")
    for _ in range(3):
        sleep(0.5)
        combined_chat.append(".")

    sleep(0.5)

    # Then rewrite with the final response
    word_count = len(message.split())
    char_count = len(message)
    combined_chat.rewrite(
        f"Analysis complete! Your message has {word_count} words and {char_count} characters."
    )


combined_chat.connect(combined_handler)

Clearing the Chat¶

Use clear() to reset the chat history.

# Uncomment to clear a chat
# chat.clear()
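
One handler pattern that combines clear() with the echo example is a slash command. A sketch (the /clear command name and make_command_handler are illustrative conventions, not part of the package):

def make_command_handler(chat):
    """Build a handler that treats /clear as a reset command
    (hypothetical pattern; /clear is just a convention)."""
    def handler(message: str) -> None:
        if message.strip() == "/clear":
            chat.clear()
        else:
            chat.rewrite(f"You said: {message}")
    return handler


# chat.connect(make_command_handler(chat))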

Integration with LLMs¶

Here's a template for integrating with an LLM API. Replace the placeholder with your actual LLM client.

# Example: Integration template (not functional - requires your LLM client)
llm_chat = ChatUI()


# Replace this with your actual LLM integration
def fake_llm(message: str):
    """Stand-in for a streaming LLM client: yields one token at a time."""
    for token in f"You said {message}".split(" "):
        yield token + " "
        sleep(0.5)


def llm_handler(message: str) -> None:
    """Handle messages with an LLM."""
    for token in fake_llm(message):
        llm_chat.append(token)


llm_chat.connect(llm_handler)
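
Since most streaming clients yield tokens, a handler usually just forwards each one to append(). If you also want a cleaned-up final message, accumulate the tokens as they stream and finish with rewrite(). A widget-free sketch of that pattern (stub_llm and handle_with_rewrite are illustrative names, not part of the package):

def stub_llm(message: str):
    """Stand-in for a streaming LLM client: yields one token at a time."""
    for token in f"You said {message}".split(" "):
        yield token + " "


def handle_with_rewrite(message: str, append, rewrite) -> None:
    """Stream tokens as they arrive, then replace the streamed text
    with a tidied final version (illustrative pattern)."""
    parts = []
    for token in stub_llm(message):
        parts.append(token)
        append(token)
    rewrite("".join(parts).strip())


streamed, final = [], []
handle_with_rewrite("hi", streamed.append, final.append)
print(final[0])  # → You said hi

In a notebook you would pass llm_chat.append and llm_chat.rewrite as the two callbacks.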

API Summary¶

Method             Description
ChatUI()           Create a new chat widget
connect(callback)  Set the message handler
append(token)      Add text to the current response
rewrite(text)      Replace the entire current response
clear()            Clear the chat history