# Dive MCP Host CLI

A command-line interface tool for testing and interacting with the Dive MCP Host.

## Usage

### Basic Chat

```bash
# Simple conversation
dive_cli "Hello"

# Continue from a previous chat
dive_cli -c CHAT_ID "How are you?"
```

## Configuration Options

The CLI supports multiple ways to specify configuration:

### Option 1: Single Configuration File

```bash
dive_cli --config /path/to/config.json "query"
```

Use a single configuration file that contains both LLM and MCP server settings.

### Option 2: Configuration Directory

```bash
dive_cli --config-dir /path/to/configs "query"
```

Specify a directory containing `mcp_config.json` and `model_config.json`. The CLI will automatically load both files from this directory.

### Option 3: Separate Configuration Files

```bash
dive_cli --mcp-config /path/to/mcp_config.json --model-config /path/to/model_config.json "query"
```

Explicitly specify paths for the MCP server and model configuration files.

### Option 4: Default Configuration

```bash
dive_cli "query"
```

If no configuration options are provided, the CLI will use `mcp_config.json` and `model_config.json` from the current directory.

## Command Line Flags

- `--config PATH`: Path to a single configuration file
- `--config-dir PATH`: Directory containing `mcp_config.json` and `model_config.json`
- `--mcp-config PATH`: Path to the MCP servers configuration file (default: `mcp_config.json`)
- `--model-config PATH`: Path to the model configuration file (default: `model_config.json`)
- `-c CHAT_ID`: Continue from a previous chat session
- `-p PATH`: Use a system prompt from the specified file

## Configuration Priority

1. `--config` (if provided, uses the single configuration file)
2. `--config-dir` (if provided, loads fixed filenames from the directory)
3. `--mcp-config` and `--model-config` (if provided, uses the specified files)
4. Default (uses `mcp_config.json` and `model_config.json` from the current directory)

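The priority order can be sketched as a small resolver function. This is an illustrative sketch, not the CLI's actual source; the function and parameter names are hypothetical and simply mirror the flags above.

```python
def resolve_config_sources(config=None, config_dir=None,
                           mcp_config=None, model_config=None):
    """Return (mcp_path, model_path) following the documented priority order.

    Parameter names are hypothetical; they mirror the CLI flags.
    """
    if config:  # 1. --config: a single file holds both settings
        return (config, config)
    if config_dir:  # 2. --config-dir: fixed filenames inside the directory
        return (f"{config_dir}/mcp_config.json",
                f"{config_dir}/model_config.json")
    if mcp_config or model_config:  # 3. explicitly specified files
        return (mcp_config or "mcp_config.json",
                model_config or "model_config.json")
    # 4. defaults from the current directory
    return ("mcp_config.json", "model_config.json")
```

Note that under this scheme a `--config` flag always wins, even when other flags are also present.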
## Configuration File Format

### Model Config (model_config.json)

```json
{
  "activeProvider": "ollama",
  "configs": {
    "openai": {
      "modelProvider": "openai",
      "model": "gpt-4o-mini",
      "apiKey": "your_api_key"
    },
    "ollama": {
      "modelProvider": "ollama",
      "model": "qwen2.5:14b",
      "configuration": {
        "baseURL": "https://ollama.example.com"
      }
    }
  }
}
```

The CLI will use the configuration specified by `activeProvider`.

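For illustration, the provider-selection step amounts to indexing `configs` by `activeProvider`. A minimal sketch, assuming the file format shown above (`load_active_model_config` is a hypothetical name, not the CLI's own API):

```python
import json

def load_active_model_config(raw: str) -> dict:
    """Pick the entry named by activeProvider from a model_config.json payload."""
    config = json.loads(raw)
    return config["configs"][config["activeProvider"]]

sample = """
{
  "activeProvider": "ollama",
  "configs": {
    "openai": {"modelProvider": "openai", "model": "gpt-4o-mini"},
    "ollama": {"modelProvider": "ollama", "model": "qwen2.5:14b"}
  }
}
"""
active = load_active_model_config(sample)
print(active["model"])  # qwen2.5:14b
```
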
### MCP Config (mcp_config.json)

```json
{
  "mcpServers": {
    "server-name": {
      "transport": "command",
      "command": "uvx",
      "args": ["package@latest"]
    }
  }
}
```

The CLI automatically adds a `name` field to each server configuration, using the server's key as its value.

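That name-injection behavior amounts to copying each `mcpServers` key into its own entry. A minimal sketch (not the actual implementation; `inject_server_names` is a hypothetical name):

```python
import json

def inject_server_names(raw: str) -> dict:
    """Copy each mcpServers key into its entry as a `name` field."""
    servers = json.loads(raw)["mcpServers"]
    for key, entry in servers.items():
        entry.setdefault("name", key)
    return servers

sample = """
{
  "mcpServers": {
    "server-name": {"transport": "command", "command": "uvx", "args": ["package@latest"]}
  }
}
"""
servers = inject_server_names(sample)
print(servers["server-name"]["name"])  # server-name
```
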
## Migration

`migrate.py`: used for database migration when upgrading from an older version.