XibeCode

WebUI

A browser-based interface that syncs in real-time with the terminal. Features a Monaco editor, multi-terminal support, a Git panel, a file explorer, a settings modal, and real-time AI chat. Chat, switch modes, reference files, and configure settings, all from your browser.

Prefer a native app?

The WebUI is also available as a Desktop App for Windows, macOS, and Linux: same interface, native window.

Quick Start

# Start the WebUI
xibecode ui --open

# Or start chat (WebUI opens automatically)
xibecode chat

# WebUI runs at http://localhost:3847

Screenshots

Main Interface

Modern v0.dev-inspired interface with activity bar, chat panel, code editor, and terminal.

File Explorer

Browse and open files with recursive directory tree.

Chat Interface

Interactive AI chat with streaming responses and markdown rendering.

Git Panel

Git integration with commit history, staging, and diffs.

Settings Panel

Comprehensive settings modal with multiple configuration categories.

AI Provider Settings

Configure AI models, API keys, and provider settings.

MCP Servers Editor

Edit MCP server configuration with Monaco editor and syntax highlighting.

Terminal View

Fully interactive terminal with PTY support, colors, and tab completion.

New in v0.5.0

  • Interactive Plan Mode - asks clarifying questions, searches the web, generates implementations.md, and offers a "Build" action to auto-execute the plan
  • Chat History - Persistent conversations with per-project storage, History panel in activity bar
  • Environment Variables Editor - Visual .env editor with auto-detection, secret masking, live editing
  • Media Preview - Images, videos, audio render as proper previews instead of binary
  • Thinking Animation - Loading spinner while AI processes requests
  • Improved Tool Rendering - Icons, status badges, and progress indicators for tool calls
  • Smart Scroll - No more forced scrolling when reading earlier messages
  • Donate Button - Support XibeCode from the activity bar

Previous (v0.4.x)

  • TUI-WebUI bidirectional sync
  • Slash commands for mode switching
  • @ file references
  • Custom model/endpoint support
  • Tool execution display
  • Minimalistic terminal design

TUI-WebUI Sync

When you run xibecode chat, both the terminal and browser interfaces are connected in real-time:

From TUI to WebUI:

  • Messages appear with "(TUI)" label
  • Streaming responses show live
  • Tool calls display in real-time

From WebUI to TUI:

  • Messages processed by TUI agent
  • Full tool access and execution
  • Responses stream to both interfaces

Slash Commands

Type / in the input to open the command palette. Use arrow keys to navigate and Enter to select.

Command    Description
/clear     Clear chat messages
/help      Show available commands and tips
/diff      Show git diff
/status    Show git status
/test      Run project tests
/format    Format code in project
/reset     Reset chat session
/files     List project files

Agent Modes in WebUI

Mode               Icon  Description
/mode agent        🤖    Autonomous coding (default)
/mode plan         📝    Interactive planning with web research
/mode tester       🧪    Testing and QA
/mode debugger     🐛    Bug investigation
/mode security     🔒    Security analysis
/mode review       👀    Code review
/mode team_leader  👑    Coordinate team of agents
/mode architect    🏛️    System design
/mode engineer     🛠️    Implementation
/mode seo          🌐    SEO optimization
/mode product      🔥    Product strategy
/mode data         📊    Data analysis
/mode researcher   📚    Deep research

File References

Type @ in the input to browse and reference project files:

  • Shows files and folders in your project
  • Type after @ to filter (e.g., @src/)
  • Use arrow keys to navigate, Enter to select
  • Selected file path is inserted into your message

Example: Fix the bug in @src/utils/helpers.ts
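Because file references are plain @-prefixed paths inside the message text, they can be recovered with a simple pattern match. A minimal sketch (the regex and the `extractFileRefs` helper are illustrative, not XibeCode's actual implementation):

```javascript
// Extract @-style file references from a chat message.
// extractFileRefs is a hypothetical helper, not part of XibeCode's API.
function extractFileRefs(message) {
  // Match "@" followed by a path-like token (word chars, ".", "/", "-")
  const pattern = /@([\w./-]+)/g;
  return [...message.matchAll(pattern)].map((m) => m[1]);
}

console.log(extractFileRefs("Fix the bug in @src/utils/helpers.ts"));
// → ["src/utils/helpers.ts"]
```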

Settings Panel

Click the Settings button in the activity bar to configure:

AI Provider:

  • Provider - Anthropic, OpenAI, or Custom
  • Model - Select from available models
  • Custom Model ID - For local/custom LLMs
  • API Key - Your provider API key
  • Base URL - Custom API endpoint

Session Info:

  • Working Directory - Current project path
  • Git Branch - Current branch name
  • Session ID - Current session identifier
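The same settings can be changed programmatically through the PUT /api/config endpoint listed in the API Reference. A minimal sketch of building such a request; the payload field names (`provider`, `model`, `baseUrl`) are assumptions, so check your running instance for the exact schema:

```javascript
// Build a request for the documented PUT /api/config endpoint.
// Field names in the payload are illustrative, not a confirmed schema.
function buildConfigRequest(settings) {
  return {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(settings),
  };
}

const req = buildConfigRequest({
  provider: "custom",
  model: "llama3",                      // hypothetical custom model ID
  baseUrl: "http://localhost:11434/v1", // e.g. a local Ollama server
});

// Send it to a running WebUI instance:
// fetch("http://localhost:3847/api/config", req);
```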

Features

  • Real-time Streaming - See AI responses as they're generated
  • Tool Execution Display - Live status indicators: running, done, or failed
  • Markdown Rendering - Rich text with code blocks, bold, italic, lists, links
  • Thinking Indicator - Spinner animation during AI processing
  • Minimalistic Design - Clean, dark theme with monospace fonts
  • Responsive Layout - Works on desktop and mobile

Keyboard Shortcuts

Shortcut     Action
/            Open mode selector
@            Open file browser
Enter        Send message / Select item
Shift+Enter  New line in message
↑ ↓          Navigate popup options
Esc          Close popup

Multi-Model Support

Anthropic: Claude Sonnet 4.5, Claude Opus 4.5, Claude Haiku 4.5

OpenAI: GPT-4o, GPT-4o Mini, GPT-4 Turbo, GPT-3.5 Turbo

Custom: Any OpenAI-compatible API, Local LLMs (Ollama, LM Studio), Custom model IDs
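For example, to point XibeCode at a local Ollama server (Ollama exposes an OpenAI-compatible API under /v1 by default), a custom-provider setup might look like the sketch below. The field labels mirror the Settings panel above; the model ID and API key value are placeholders:

```text
Provider:         Custom
Custom Model ID:  llama3          (any model you have pulled in Ollama)
API Key:          ollama          (placeholder; local servers typically ignore it)
Base URL:         http://localhost:11434/v1
```

The same pattern applies to LM Studio or any other OpenAI-compatible endpoint: set the Base URL to the server's /v1 root and the Custom Model ID to the model it serves.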

API Reference

Endpoint             Method   Description
/api/config          GET/PUT  Get or update configuration
/api/project         GET      Project info (name, git, etc.)
/api/files           GET      List project files
/api/git/status      GET      Git status information
/api/git/diff        GET      Git diff output
/api/tests/generate  POST     Generate tests for a file
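These endpoints can be called from any HTTP client against the default port shown in Quick Start. A small URL helper as a sketch; the `apiUrl` name is illustrative, and response shapes are not documented here, so none are assumed:

```javascript
// Helper for calling the WebUI's HTTP API from a script.
// Base URL matches the default port from Quick Start (3847).
const BASE = "http://localhost:3847";

function apiUrl(path) {
  return `${BASE}/api/${path}`;
}

// Example: fetch git status from a running instance
// fetch(apiUrl("git/status")).then((r) => r.json()).then(console.log);

console.log(apiUrl("git/status")); // → http://localhost:3847/api/git/status
```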

WebSocket Connection

// Connect in bridge mode
const ws = new WebSocket("ws://localhost:3847?mode=bridge");

// Send message to TUI
ws.send(JSON.stringify({ type: "message", content: "Hello" }));

// Receive events
ws.onmessage = (e) => {
  const data = JSON.parse(e.data);
  // Types: user_message, stream_start, stream_text, stream_end,
  //        tool_call, tool_result, thinking, error
};
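Building on the event types above, a client typically clears a buffer on stream_start, appends each stream_text chunk, and treats stream_end as the completed message. A minimal collector sketch (the assumption that stream_text events carry their text in a `content` field follows the `content` field of the send example, but is not confirmed):

```javascript
// Accumulate streamed assistant text from bridge-mode events.
// Assumes stream_text events carry their text in a `content` field.
function makeStreamCollector() {
  let buffer = "";
  return {
    handle(event) {
      if (event.type === "stream_start") buffer = "";
      if (event.type === "stream_text") buffer += event.content;
      if (event.type === "stream_end") return buffer; // full message
      return null; // still streaming
    },
  };
}

// Feed it a sample event sequence:
const collector = makeStreamCollector();
const events = [
  { type: "stream_start" },
  { type: "stream_text", content: "Hello, " },
  { type: "stream_text", content: "world" },
  { type: "stream_end" },
];
let finalText = null;
for (const e of events) {
  const done = collector.handle(e);
  if (done !== null) finalText = done;
}
console.log(finalText); // → "Hello, world"
```

In a real client, `collector.handle(JSON.parse(e.data))` would run inside the `ws.onmessage` handler shown above.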