Overview
Local development runs all services directly on your machine, giving you maximum control and flexibility. This is ideal for:

- Deep debugging and development
- Working with local development tools
- Environments where Docker isn’t available
- Maximum performance on your hardware
Prerequisites
Verify all required tools are installed:

- Node.js 22+
- pnpm
- uv (Python package manager)
- nginx
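A quick way to confirm the prerequisites are available on your PATH (Node.js should additionally report v22 or newer via `node --version`):

```shell
# Verify each required tool is on PATH; a MISSING line means that
# prerequisite still needs to be installed.
for tool in node pnpm uv nginx; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: installed"
  else
    echo "$tool: MISSING"
  fi
done
```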
Architecture
Local development runs four services: the LangGraph server (port 2024), the Gateway API (port 8001), the Next.js frontend (port 3000), and an nginx reverse proxy (port 2026) that unifies them under a single port.

Quick Start
Configure Application
Ensure config.yaml is configured with your model and API keys. See the Installation Guide for details.
Install Dependencies
Install the frontend and backend dependencies. This runs:

- cd backend && uv sync (Python dependencies)
- cd frontend && pnpm install (Node.js dependencies)
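Run from the repository root, these are the two installs the step above describes:

```shell
# Backend: create/sync the Python virtual environment from uv's lockfile
cd backend && uv sync && cd ..

# Frontend: install Node.js dependencies with pnpm
cd frontend && pnpm install && cd ..
```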
Start All Services
Start all services with the nginx reverse proxy. This starts:
- LangGraph server (port 2024)
- Gateway API (port 8001)
- Frontend (port 3000)
- nginx reverse proxy (port 2026)
All services start automatically and run in the background. Press Ctrl+C to stop all services.
Access Application
Open your browser to:
- Web Interface: http://localhost:2026
- API Gateway: http://localhost:2026/api/*
- LangGraph: http://localhost:2026/api/langgraph/*
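With everything up, a quick smoke test from another terminal (assumes curl is installed; the /api/models path is taken from the Gateway section below):

```shell
# The frontend should answer on the unified nginx port
curl -sI http://localhost:2026 | head -n 1

# Gateway routes are proxied under /api/
curl -s http://localhost:2026/api/models
```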
Running Services Individually
For more control, start each service in a separate terminal:
Terminal 1 - LangGraph Server
Terminal 2 - Gateway API
Terminal 3 - Frontend
Terminal 4 - nginx
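The four terminals might look like the following. Treat this as a sketch, not the project's exact commands: `langgraph dev` is the LangGraph CLI's standard development server, the uvicorn module path is inferred from the entry point backend/src/gateway/app.py, and your repository's scripts may differ.

```shell
# Terminal 1 - LangGraph server on port 2024 (LangGraph CLI dev server)
cd backend && uv run langgraph dev --port 2024

# Terminal 2 - Gateway API on port 8001 (ASGI path inferred, not confirmed)
cd backend && uv run uvicorn src.gateway.app:app --port 8001 --reload

# Terminal 3 - Next.js frontend on port 3000
cd frontend && pnpm dev

# Terminal 4 - nginx in the foreground with the local config
nginx -c "$(pwd)/docker/nginx/nginx.local.conf" -g "daemon off;"
```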
Service Details
LangGraph Server (Port 2024)
The core agent runtime: it executes agent logic, runs tools, and manages conversation state.

Entry Point: backend/src/agents/lead_agent/agent.py
Start Command:

Features:
- SSE streaming for real-time responses
- Thread state management
- Middleware chain execution
- Tool orchestration
Configuration: backend/langgraph.json
Gateway API (Port 8001)
FastAPI application providing REST endpoints for models, skills, MCP, uploads, and artifacts.

Entry Point: backend/src/gateway/app.py
Start Command:
Endpoints:

- /api/models - Model management
- /api/mcp - MCP server configuration
- /api/skills - Skills management
- /api/threads/{id}/uploads - File uploads
- /api/threads/{id}/artifacts - Artifact serving
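With the stack running, these endpoints can be exercised through the nginx port, for example (THREAD_ID is a placeholder, and the multipart field name is an assumption):

```shell
# List configured models
curl -s http://localhost:2026/api/models

# List skills
curl -s http://localhost:2026/api/skills

# Upload a file to a thread ("file" field name assumed)
curl -s -F "file=@notes.txt" http://localhost:2026/api/threads/THREAD_ID/uploads
```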
Frontend (Port 3000)
Next.js application providing the web interface.

Start Command:

Features:

- React-based chat interface
- Real-time SSE streaming
- File upload support
- Artifact rendering
nginx (Port 2026)
Reverse proxy that unifies all services under a single port.

Configuration: docker/nginx/nginx.local.conf
Start Command:
Routes:

- / → Frontend (3000)
- /api/langgraph/* → LangGraph (2024)
- /api/* → Gateway (8001)
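The routing table above corresponds to nginx location blocks roughly like this sketch (the real file is docker/nginx/nginx.local.conf; the directives here are illustrative, not copied from it):

```nginx
# Illustrative only - see docker/nginx/nginx.local.conf for the real config
server {
    listen 2026;

    # Longest-prefix match wins, so /api/langgraph/ takes
    # precedence over the more general /api/ block
    location /api/langgraph/ {
        proxy_pass http://127.0.0.1:2024/;
    }

    location /api/ {
        proxy_pass http://127.0.0.1:8001;
    }

    location / {
        proxy_pass http://127.0.0.1:3000;
    }
}
```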
Development Workflow
Hot Reload
All services support hot reload:

- Frontend: changes to frontend/src/** trigger an automatic browser reload
- Backend: changes to backend/src/** automatically restart services (with the --reload flag)
- Config: changes to config.yaml require a manual service restart
- Skills: changes to skills/** are loaded dynamically by the agent

Logs
View logs in separate terminal windows or check the log files.

Testing
- Backend Tests
- Frontend Tests
Debugging
Backend Debugging
Add breakpoints using the Python debugger, or use the VS Code debugger with a launch configuration.
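One approach: insert `breakpoint()` in the backend code, then run the service in a terminal you can interact with. The `PYTHONBREAKPOINT` environment variable (PEP 553) controls what `breakpoint()` does; the uvicorn module path below is an assumption inferred from the entry point, not confirmed from the repo.

```shell
# breakpoint() drops into pdb when the line executes; PYTHONBREAKPOINT
# can redirect it to another debugger such as ipdb.
PYTHONBREAKPOINT=pdb.set_trace uv run uvicorn src.gateway.app:app --port 8001

# Disable all breakpoint() calls without editing code:
PYTHONBREAKPOINT=0 uv run uvicorn src.gateway.app:app --port 8001
```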
Frontend Debugging
Use browser DevTools:
- Console for logs
- Network tab for API requests
- React DevTools for component inspection
Stopping Services
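If the combined start command is running in the foreground, Ctrl+C stops everything (see Quick Start). If services were started in separate terminals or stray processes remain, something like the following can clean up; the process patterns are assumptions, so verify with `pgrep -af <pattern>` before killing.

```shell
# Stop each service by command-line pattern (-f matches the full command line)
pkill -f "langgraph dev"
pkill -f "uvicorn"
pkill -f "next dev"

# Signal the running nginx master process to shut down
nginx -s stop
```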
Project Structure
Understanding the codebase:Troubleshooting
Services fail to start
Check if ports are already in use, then stop the conflicting processes.
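For example, assuming lsof is installed (replace <PID> with the process id from the lsof output):

```shell
# See which process is listening on each local-dev port
for port in 2024 8001 3000 2026; do
  lsof -i ":$port" -sTCP:LISTEN
done

# Then stop the offending process by PID
kill <PID>
```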
Module not found errors
Reinstall dependencies, either all at once or individually.
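Concretely, from the repository root:

```shell
# Frontend: remove installed packages and reinstall from the lockfile
cd frontend && rm -rf node_modules && pnpm install && cd ..

# Backend: recreate the virtual environment from uv's lockfile
cd backend && rm -rf .venv && uv sync && cd ..
```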
nginx fails to start
Check the nginx configuration and ensure nginx is installed.
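For example:

```shell
# Validate the local nginx config without starting the server
nginx -t -c "$(pwd)/docker/nginx/nginx.local.conf"

# Confirm nginx is installed at all
command -v nginx || echo "nginx not found - install it (e.g. brew install nginx, apt install nginx)"
```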
Hot reload not working
For the backend, ensure the --reload flag is used. For the frontend, check file watcher limits.

Next Steps
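On Linux, Next.js and uvicorn file watching can hit the inotify watch limit. It can be inspected and raised like this (524288 is a commonly suggested value, not a project requirement):

```shell
# Show the current watch limit
cat /proc/sys/fs/inotify/max_user_watches

# Raise it for the current boot (requires root)
sudo sysctl fs.inotify.max_user_watches=524288

# Persist the new limit across reboots
echo 'fs.inotify.max_user_watches=524288' | sudo tee -a /etc/sysctl.conf
```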
Docker Setup
Try Docker development for easier environment management
Creating Skills
Extend DeerFlow with custom skills
Custom Tools
Add custom tools to the agent
File Uploads
Implement file upload functionality