Traceback (most recent call last): File "D:\新建文件夹\langchain-base\ollama_python.p...

2025-12-10 13:46  Status: processing

🚨 Error Message

```
Traceback (most recent call last):
  File "D:\新建文件夹\langchain-base\ollama_python.py", line 3, in <module>
    response = chat(
        model="deepseek-r1:7b",
        ...<2 lines>...
    )
  File "C:\Users\27717\AppData\Roaming\Python\Python313\site-packages\ollama\_client.py", line 365, in chat
    return self._request(
        ChatResponse,
        ...<14 lines>...
        stream=stream,
    )
  File "C:\Users\27717\AppData\Roaming\Python\Python313\site-packages\ollama\_client.py", line 189, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
  File "C:\Users\27717\AppData\Roaming\Python\Python313\site-packages\ollama\_client.py", line 129, in _request_raw
    r = self._client.request(*args, **kwargs)
  File "D:\software\Python\Lib\site-packages\httpx\_client.py", line 825, in request
    return self.send(request, auth=auth, follow_redirects=follow_redirects)
  File "D:\software\Python\Lib\site-packages\httpx\_client.py", line 914, in send
    response = self._send_handling_auth(
        request,
        ...<2 lines>...
        history=[],
    )
  File "D:\software\Python\Lib\site-packages\httpx\_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
        request,
        follow_redirects=follow_redirects,
        history=history,
    )
  File "D:\software\Python\Lib\site-packages\httpx\_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "D:\software\Python\Lib\site-packages\httpx\_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
  File "D:\software\Python\Lib\site-packages\httpx\_transports\default.py", line 250, in handle_request
    resp = self._pool.handle_request(req)
  File "D:\software\Python\Lib\site-packages\httpcore\_sync\connection_pool.py", line 256, in handle_request
    raise exc from None
  File "D:\software\Python\Lib\site-packages\httpcore\_sync\connection_pool.py", line 236, in handle_request
    response = connection.handle_request(
        pool_request.request
    )
  File "D:\software\Python\Lib\site-packages\httpcore\_sync\http_proxy.py", line 206, in handle_request
    return self._connection.handle_request(proxy_request)
  File "D:\software\Python\Lib\site-packages\httpcore\_sync\connection.py", line 103, in handle_request
    return self._connection.handle_request(request)
  File "D:\software\Python\Lib\site-packages\httpcore\_sync\http11.py", line 106, in handle_request
    ) = self._receive_response_headers(**kwargs)
  File "D:\software\Python\Lib\site-packages\httpcore\_sync\http11.py", line 177, in _receive_response_headers
    event = self._receive_event(timeout=timeout)
  File "D:\software\Python\Lib\site-packages\httpcore\_sync\http11.py", line 217, in _receive_event
    data = self._network_stream.read(
        self.READ_NUM_BYTES, timeout=timeout
    )
  File "D:\software\Python\Lib\site-packages\httpcore\_backends\sync.py", line 128, in read
    return self._sock.recv(max_bytes)
```

🤖 AI Solution

Error Analysis

1. Error Translation



Original: [连接超时或网络连接错误]
Translation: Connection timeout or network connection error

2. Root Cause Analysis


This is a network connection timeout, raised while the Ollama client was trying to reach the model service. Specifically:
  • The HTTP request never completed; the socket.recv() call timed out while waiting for the response headers

  • The Ollama service may not be running or may be unreachable

  • A network misconfiguration or a bad proxy setting may be intercepting the request — note that the traceback passes through httpcore's http_proxy.py, so the request to localhost is being routed through a system proxy
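The proxy point above can be tested directly. Because the traceback goes through httpcore's http_proxy.py, httpx has picked up a proxy from the environment. The sketch below (assuming the default Ollama endpoint at 127.0.0.1:11434) checks for proxy variables, exempts localhost from them, and probes the port with a raw TCP connection:

```python
import os
import socket

# Check whether a system proxy is set that could intercept localhost traffic
for var in ('HTTP_PROXY', 'HTTPS_PROXY', 'http_proxy', 'https_proxy'):
    if os.environ.get(var):
        print(f"{var} is set to {os.environ[var]} - "
              "add localhost to NO_PROXY so Ollama traffic bypasses the proxy")

# Exempt the local Ollama endpoint from any proxy
os.environ['NO_PROXY'] = 'localhost,127.0.0.1'

# Raw TCP check: can we reach the Ollama port at all?
try:
    with socket.create_connection(('127.0.0.1', 11434), timeout=5):
        print("Port 11434 is reachable - something is listening")
except OSError as e:
    print(f"Cannot reach port 11434: {e} - is `ollama serve` running?")
```

If a proxy variable is set and the TCP check succeeds, the proxy is the likely culprit; setting NO_PROXY (or unsetting the proxy variables) before starting Python should clear the timeout.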
3. Solutions

First, check whether the Ollama service is running:

```python
import subprocess
import requests

# Method 1: check the Ollama service status
try:
    response = requests.get('http://localhost:11434/api/tags', timeout=10)
    if response.status_code == 200:
        print("Ollama service is running normally")
    else:
        print(f"Ollama service error, status code: {response.status_code}")
except requests.exceptions.ConnectionError:
    print("Ollama service is not running")

# Method 2: start the Ollama service (Windows)
# `ollama serve` runs in the foreground, so use Popen rather than
# subprocess.run(..., check=True), which would block this script forever
try:
    subprocess.Popen(['ollama', 'serve'])
except FileNotFoundError:
    print("Please start the Ollama service manually")
```

Method 3: configure a timeout in your code. Note that the ollama client's `chat()` function does not accept a `timeout` argument; the timeout belongs on a `Client` instance, which passes it through to the underlying httpx client:

```python
from ollama import Client

# Set a 30-second timeout on the client itself
client = Client(host='http://localhost:11434', timeout=30)

response = client.chat(
    model="deepseek-r1:7b",
    messages=[
        {"role": "user", "content": "你好"}
    ],
)
```

4. ⚠️ Prevention Measures


  • Service health check: verify that the service is running before calling the Ollama API

  • Timeout configuration: set a reasonable timeout on every network request so calls never wait indefinitely

  • Error handling: wrap API calls in try/except and handle connection errors gracefully
These can be combined into a single safe wrapper:

```python
import requests
from ollama import Client

def safe_chat_call():
    try:
        # Check that the service is reachable first
        requests.get('http://localhost:11434', timeout=5)

        # Then run the chat; the timeout is set on the Client, not on chat()
        client = Client(host='http://localhost:11434', timeout=30)
        response = client.chat(
            model="deepseek-r1:7b",
            messages=[{"role": "user", "content": "你好"}],
        )
        return response
    except requests.exceptions.ConnectionError:
        print("❌ The Ollama service is not running; start it with: ollama serve")
        return None
    except Exception as e:
        print(f"❌ Request failed: {e}")
        return None
```
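For transient network failures, the error-handling pattern above can be extended with a simple retry-and-backoff loop. This is a generic sketch, not part of the original solution; `do_request` stands in for any of the calls above:

```python
import time

def with_retries(do_request, attempts=3, base_delay=1.0):
    """Call do_request(), retrying on ConnectionError with exponential backoff."""
    for attempt in range(attempts):
        try:
            return do_request()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries; let the caller handle it
            delay = base_delay * (2 ** attempt)  # 1s, 2s, 4s, ...
            print(f"Connection failed, retrying in {delay:.0f}s...")
            time.sleep(delay)
```

Usage: `with_retries(safe_chat_call)` gives the service a chance to recover from a brief hiccup instead of failing on the first timeout.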