", "module": "mgr_ai", "func": "test_chat", "line": 495, "message": "Test Chat E...
🚨 Error message
{"timestamp": "2026-03-02 22:32:34.057", "level": "ERROR", "module": "mgr_ai", "func": "test_chat", "line": 495, "message": "Test Chat Error: litellm.APIConnectionError: openai - Empty or invalid response from LLM endpoint. Received: '<!DOCTYPE html>\\n<html lang=\"zh\">\\n\\n<head>\\n <meta charset=\"utf-8\" />\\n <meta name=\"viewport\" content=\"width=device-width, initial-scale=1, maximum-scale=1, user-scalable=no\">\\n <meta name=\"theme-color\" content=\"#ffffff\" />\\n <meta name=\"description\"\\n content=\"OpenAI 接口聚合管理,支持多种渠道包括 OpenAI,Azure,Claude,Midjourney,Suno,Luma,Runway,Kling,Gemini,Flux,Udio 可用于二次分发管理 key\" />\\n \\n <link rel=\"icon\" href=\"\" />\\n\\n <link rel=\"dns-prefetch\" href=\"https://at.alicdn.com\">\\n <link rel=\"preconnect\" href=\"https://at.alicdn.com\" crossorigin>\\n <link rel=\"stylesheet\" href=\"https://at.alicdn.com/t/c/font_4670331_ymatw5orvcb.css\" crossorigin=\"anonymous\">\\n <script type=\"module\" crossorigin src=\"/assets/app-BcOA5JWT.js\"></script>\\n <link rel=\"modulepreload\" crossorigin href=\"/assets/chunk-DNvSC-lt.js\">\\n <link rel=\"modulepreload\" crossorigin href=\"/assets/chunk-CX4HiF4A.js\">\\n <link rel=\"modulepreload\" crossorigin href=\"/assets/chunk-B44qZnMV.js\">\\n <link rel=\"modulepreload\" crossorigin href=\"/assets/chunk-DnR-Nqyv.js\">\\n <link rel=\"modulepreload\" crossorigin href=\"/assets/chunk-DyojDNx7.js\">\\n <link rel=\"modulepreload\" crossorigin href=\"/assets/chunk-FC8mhxCE.js\">\\n <link rel=\"stylesheet\" crossorigin href=\"/assets/asset-ByGjGmTC.css\">\\n <link rel=\"stylesheet\" crossorigin href=\"/assets/asset-DT5rqr1n.css\">\\n</head>\\n\\n<body>\\n <noscript>You need to enable JavaScript to run this app.</noscript>\\n <div id=\"root\"></div>\\n</body>\\n\\n</html>'. Check the reverse proxy or model server configuration.", "path": "backend\\managers\\mgr_ai.py"}
🤖 AI solution
Error Analysis Report
1. Error translation
Original: litellm.APIConnectionError: openai - Empty or invalid response from LLM endpoint.
Plain reading: litellm could not get a usable answer from the configured endpoint; the response body was empty, or was not valid API output.
2. Root cause analysis
The error is raised at line 495 of `mgr_ai.py`, in the `test_chat` function, when calling the OpenAI-compatible API. The endpoint returned a complete HTML page (the index page of an "OpenAI API aggregation" frontend, per the `<meta name="description">` visible in the log), not an API response. This strongly suggests the request was forwarded to the frontend static server rather than to the model service.
3. Solution
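Before touching the proxy, the diagnosis can be confirmed programmatically. Below is a minimal sketch (a hypothetical helper, not part of `mgr_ai.py`) that flags an HTML body where JSON output is expected:

```python
def looks_like_html(body: str) -> bool:
    """Heuristic check: an LLM API should return JSON; an HTML document
    (e.g. a frontend index page) signals a misrouted request."""
    head = body.lstrip().lower()
    return head.startswith("<!doctype html") or head.startswith("<html")

# The body captured in the log above is clearly HTML:
print(looks_like_html('<!DOCTYPE html>\n<html lang="zh">'))  # True
# A real chat-completion response is JSON:
print(looks_like_html('{"id": "chatcmpl-123", "object": "chat.completion"}'))  # False
```

If the check returns `True` for your endpoint's raw response, the problem is routing, not the model.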
#### Fix steps:
1. Verify the API base URL configured for the model channel points to the actual API endpoint (e.g. https://api.openai.com/v1/chat/completions), not to the frontend domain (e.g. https://your-domain.com).
2. Make sure the reverse proxy forwards API paths (e.g. /v1/*) to the model service, not to the static asset server.
#### Example configuration (Nginx):
```nginx
server {
    listen 80;
    server_name your-domain.com;

    # Frontend static assets (API requests must not hit this block)
    location / {
        root /path/to/frontend/dist;
        try_files $uri $uri/ /index.html;
    }

    # Forward API requests to the model service
    location /v1/ {
        proxy_pass http://model-server:port/v1/;  # replace with the real model service address
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```
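After editing the config, the change can be validated and applied without downtime. The host name below is the placeholder from the example above; adjust it to your deployment:

```shell
nginx -t            # validate the edited configuration syntax
nginx -s reload     # apply it without dropping connections

# Verify routing: the API path should now return JSON headers, not the frontend HTML
curl -sI https://your-domain.com/v1/models | grep -i '^content-type'
```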
4. Preventive measures
- Use a dedicated API path prefix (e.g. /api/v1/) to clearly separate API requests from frontend requests.
- Periodically test the endpoint with curl or Postman, e.g. `curl -I https://your-llm-endpoint/v1/models`, and confirm the response carries `Content-Type: application/json`.
---
Note: after applying the fix, reload the proxy service and re-test the interface.
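The periodic Content-Type check suggested above can be scripted rather than done by hand. A minimal sketch (hypothetical helper) that classifies the header value returned by the endpoint, e.g. as obtained from `curl -I` or `urllib.request`:

```python
def is_json_content_type(content_type: str) -> bool:
    """True for JSON media types such as 'application/json; charset=utf-8'
    or structured-suffix types like 'application/problem+json'."""
    media_type = content_type.split(";", 1)[0].strip().lower()
    return media_type == "application/json" or media_type.endswith("+json")

# A correctly routed API answers with JSON; a misrouted one serves HTML:
print(is_json_content_type("application/json; charset=utf-8"))  # True
print(is_json_content_type("text/html; charset=utf-8"))         # False
```

Wiring this into a scheduled health check would catch a regressed proxy rule before users hit the HTML-instead-of-JSON error again.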