System Overview
The system consists of four main components:
- LangGraph Server: Agent runtime and workflow execution engine (port 2024)
- Gateway API: REST API for configuration and management (port 8001)
- Frontend: Next.js web interface (port 3000)
- Nginx: Unified reverse proxy entry point (port 2026)
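The component-to-port mapping above can be kept in one place in client code. A minimal sketch (the dictionary and helper names are illustrative, not part of DeerFlow):

```python
# Map each DeerFlow component to its local port (ports from the list above).
SERVICE_PORTS = {
    "langgraph": 2024,  # agent runtime
    "gateway": 8001,    # REST configuration API
    "frontend": 3000,   # Next.js UI
    "nginx": 2026,      # unified entry point
}

def base_url(service: str, host: str = "localhost") -> str:
    """Build the base URL for a service, e.g. for health checks."""
    return f"http://{host}:{SERVICE_PORTS[service]}"

print(base_url("gateway"))  # http://localhost:8001
```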
LangGraph Server
The LangGraph Server is the core of DeerFlow, running the agent graph and managing execution.

Port: 2024

Responsibilities:
- Agent graph execution with middleware chain
- Thread state management and checkpointing
- Tool execution and sub-agent orchestration
- Real-time streaming via Server-Sent Events (SSE)
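An SSE stream is a sequence of `event:`/`data:` lines separated by blank lines. A minimal stdlib parser for such a stream (the event names and JSON payloads shown are illustrative, not DeerFlow's actual schema):

```python
import json

def parse_sse(raw: str):
    """Split a raw Server-Sent Events stream into (event, data) pairs.

    Events are separated by blank lines; each event carries an optional
    'event:' field and one or more 'data:' lines (joined with newlines).
    """
    events = []
    for block in raw.split("\n\n"):
        name, data_lines = "message", []
        for line in block.splitlines():
            if line.startswith("event:"):
                name = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data_lines.append(line[len("data:"):].strip())
        if data_lines:
            events.append((name, "\n".join(data_lines)))
    return events

# Illustrative stream: two token chunks followed by an end-of-stream marker.
raw = (
    'event: messages\ndata: {"chunk": "Hel"}\n\n'
    'event: messages\ndata: {"chunk": "lo"}\n\n'
    'event: end\ndata: {}\n\n'
)
tokens = [json.loads(d)["chunk"] for n, d in parse_sse(raw) if n == "messages"]
print("".join(tokens))  # Hello
```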
backend/langgraph.json
The LangGraph Server uses the official LangGraph platform for agent orchestration.
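backend/langgraph.json follows the LangGraph platform's configuration schema, which declares dependencies and maps graph names to entry points. A typical shape looks like this (the graph path shown is illustrative, not DeerFlow's actual entry point):

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/graph.py:graph"
  },
  "env": ".env"
}
```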
Gateway API
The Gateway API provides REST endpoints for configuration and system management.

Port: 8001

Endpoints:
- /api/models: Model configuration
- /api/skills: Skills management
- /api/mcp: MCP server configuration
- /api/memory: Memory system access
- /api/threads/{id}/uploads: File uploads
- /api/threads/{id}/artifacts: Artifact serving
- /health: Health check
Gateway API Reference
View complete API documentation
Frontend
The frontend is a Next.js application providing the chat interface and system configuration UI.

Port: 3000 (accessed via Nginx on 2026)

Key Features:
- Real-time chat with streaming responses
- Thread management and history
- File upload interface
- Model and skill configuration
- Artifact preview and download
Technology Stack:
- Next.js 16 with App Router
- React 19
- TanStack Query for state management
- Tailwind CSS for styling
Nginx Reverse Proxy
Nginx acts as the unified entry point, routing requests to the appropriate backend services.

Port: 2026 (default entry point)

Routing:
- /api/langgraph/* → LangGraph Server (2024)
- /api/* → Gateway API (8001)
- /* → Frontend (3000)
Benefits:
- Single port access for all services
- SSL/TLS termination
- Load balancing capabilities
- Static asset serving
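The routing table above corresponds to nginx location blocks along these lines (a sketch under assumed defaults, not DeerFlow's actual nginx.conf):

```nginx
server {
    listen 2026;

    # Agent runtime: streaming responses, so disable proxy buffering for SSE.
    # Whether the /api/langgraph prefix is stripped depends on the real config.
    location /api/langgraph/ {
        proxy_pass http://localhost:2024/;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_buffering off;
    }

    # Configuration and management REST API
    location /api/ {
        proxy_pass http://localhost:8001;
    }

    # Everything else goes to the Next.js frontend
    location / {
        proxy_pass http://localhost:3000;
    }
}
```

Longest-prefix matching makes the order safe: /api/langgraph/ wins over /api/, which wins over /.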
Data Flow
Chat Message Flow
- User sends message via Frontend
- Frontend calls LangGraph Server via Nginx
- LangGraph Server processes message through middleware chain
- Agent executes with tools and sub-agents
- Streaming response sent back via SSE
- Frontend renders response in real-time
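The steps above can be simulated end to end with a stub stream (everything here is illustrative; the real transport is SSE over HTTP, and the real agent runs a middleware chain rather than echoing):

```python
def agent_stream(message: str):
    """Stand-in for the LangGraph Server: yield the reply token by token."""
    reply = f"Echo: {message}"
    for token in reply.split(" "):
        yield token

def render_chat(message: str) -> str:
    """Stand-in for the frontend: accumulate streamed tokens into the view."""
    rendered = []
    for token in agent_stream(message):   # response streamed back chunk by chunk
        rendered.append(token)            # rendered in real time as chunks arrive
    return " ".join(rendered)

print(render_chat("hello"))  # Echo: hello
```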
Configuration Flow
- User updates configuration via Frontend
- Frontend calls Gateway API via Nginx
- Gateway updates config.yaml or extensions_config.json
- LangGraph Server detects changes via file mtime
- Configuration reloaded on next request
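The mtime-based reload above can be sketched with the stdlib (file name and cache shape are illustrative, not DeerFlow's implementation):

```python
import os
import tempfile
import time

class ConfigCache:
    """Re-read a config file only when its modification time changes."""

    def __init__(self, path: str):
        self.path = path
        self.mtime = None
        self.content = None

    def get(self) -> str:
        mtime = os.path.getmtime(self.path)
        if mtime != self.mtime:          # file changed since last request
            with open(self.path) as f:
                self.content = f.read()  # reload on this request
            self.mtime = mtime
        return self.content

with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as f:
    f.write("model: a\n")
    path = f.name

cache = ConfigCache(path)
first = cache.get()

with open(path, "w") as f:
    f.write("model: b\n")
os.utime(path, times=(time.time() + 1, time.time() + 1))  # force a distinct mtime

second = cache.get()
print(first.strip(), "->", second.strip())  # model: a -> model: b
os.unlink(path)
```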
Thread Isolation
Each conversation thread operates in isolation:
- State: Separate ThreadState per thread
- Filesystem: Thread-specific directories in sandbox
- Memory: Thread context stored in checkpoints
- Uploads: Files isolated to thread directory
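Filesystem isolation can be sketched as thread-scoped directories under a sandbox root (the root path and directory layout are illustrative, not DeerFlow's actual structure):

```python
from pathlib import PurePosixPath

# Hypothetical sandbox root; each thread owns a disjoint subtree beneath it.
SANDBOX_ROOT = PurePosixPath("/sandbox/threads")

def thread_dir(thread_id: str, kind: str = "workspace") -> PurePosixPath:
    """Resolve a thread-scoped directory; uploads never cross thread boundaries."""
    return SANDBOX_ROOT / thread_id / kind

a = thread_dir("t-1", "uploads")
b = thread_dir("t-2", "uploads")
print(a)       # /sandbox/threads/t-1/uploads
print(a != b)  # True: different threads, disjoint directories
```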
Deployment Modes
DeerFlow supports multiple deployment configurations:
- Local Development
- Docker Development
- Production
Local Development
All services run directly on the host machine:
- LangGraph Server: localhost:2024
- Gateway API: localhost:8001
- Frontend: localhost:3000
- Nginx: localhost:2026
Next Steps
- Agent System: Learn about the lead agent and middleware chain
- Sandbox: Understand sandbox execution and isolation
- Skills: Explore the skills system
- Deployment: Deploy DeerFlow to production