Error processing publish message for A0_448fbb36-5e70-496a-991d-5a97b4022019/448fbb36-...

2026-05-08 16:46 Status: processing

🚨 Error Message

```
Error processing publish message for A0_448fbb36-5e70-496a-991d-5a97b4022019/448fbb36-5e70-496a-991d-5a97b4022019
Traceback (most recent call last):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_single_threaded_agent_runtime.py", line 606, in _on_message
    return await agent.on_message(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_base_agent.py", line 119, in on_message
    return await self.on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_sequential_routed_agent.py", line 67, in on_message_impl
    return await super().on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_routed_agent.py", line 268, in wrapper
    return_value = await func(self, message, ctx)  # type: ignore
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 133, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\agents\_message_filter_agent.py", line 183, in on_messages_stream
    async for item in self._wrapped_agent.on_messages_stream(filtered, cancellation_token):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\agents\_assistant_agent.py", line 953, in on_messages_stream
    async for inference_output in self._call_llm(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\agents\_assistant_agent.py", line 1109, in _call_llm
    model_result = await model_client.create(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_ext\models\openai\_openai_client.py", line 712, in create
    prompt_tokens=getattr(result.usage, "prompt_tokens", 0) if result.usage is not None else 0,
AttributeError: 'str' object has no attribute 'usage'

Error processing publish message for B2_448fbb36-5e70-496a-991d-5a97b4022019/448fbb36-5e70-496a-991d-5a97b4022019
Traceback (most recent call last):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_single_threaded_agent_runtime.py", line 606, in _on_message
    return await agent.on_message(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_base_agent.py", line 119, in on_message
    return await self.on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_sequential_routed_agent.py", line 72, in on_message_impl
    return await super().on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_routed_agent.py", line 486, in on_message_impl
    return await self.on_unhandled_message(message, ctx)  # type: ignore
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 195, in on_unhandled_message
    raise ValueError(f"Unhandled message in agent container: {type(message)}")
ValueError: Unhandled message in agent container: <class 'autogen_agentchat.teams._group_chat._events.GroupChatError'>

Error processing publish message for C4_448fbb36-5e70-496a-991d-5a97b4022019/448fbb36-5e70-496a-991d-5a97b4022019
Traceback (most recent call last):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_single_threaded_agent_runtime.py", line 606, in _on_message
    return await agent.on_message(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_base_agent.py", line 119, in on_message
    return await self.on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_sequential_routed_agent.py", line 72, in on_message_impl
    return await super().on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_routed_agent.py", line 486, in on_message_impl
    return await self.on_unhandled_message(message, ctx)  # type: ignore
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 195, in on_unhandled_message
    raise ValueError(f"Unhandled message in agent container: {type(message)}")
ValueError: Unhandled message in agent container: <class 'autogen_agentchat.teams._group_chat._events.GroupChatError'>

Error processing publish message for D6_448fbb36-5e70-496a-991d-5a97b4022019/448fbb36-5e70-496a-991d-5a97b4022019
Traceback (most recent call last):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_single_threaded_agent_runtime.py", line 606, in _on_message
    return await agent.on_message(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_base_agent.py", line 119, in on_message
    return await self.on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_sequential_routed_agent.py", line 72, in on_message_impl
    return await super().on_message_impl(message, ctx)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_core\_routed_agent.py", line 486, in on_message_impl
    return await self.on_unhandled_message(message, ctx)  # type: ignore
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 195, in on_unhandled_message
    raise ValueError(f"Unhandled message in agent container: {type(message)}")
ValueError: Unhandled message in agent container: <class 'autogen_agentchat.teams._group_chat._events.GroupChatError'>

Traceback (most recent call last):
  File "D:\Study\研究生\大模型安全\MAS_demo\autogen\graph.py", line 108, in <module>
    asyncio.run(main())
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\asyncio\runners.py", line 195, in run
    return runner.run(main)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\asyncio\base_events.py", line 691, in run_until_complete
    return future.result()
  File "D:\Study\研究生\大模型安全\MAS_demo\autogen\graph.py", line 105, in main
    await Console(flow.run_stream(task="说出你是谁,并说出谁和你说话了"))
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\ui\_console.py", line 117, in Console
    async for message in stream:
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_base_group_chat.py", line 554, in run_stream
    raise RuntimeError(str(message.error))
RuntimeError: AttributeError: 'str' object has no attribute 'usage'
Traceback: Traceback (most recent call last):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 133, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\agents\_message_filter_agent.py", line 183, in on_messages_stream
    async for item in self._wrapped_agent.on_messages_stream(filtered, cancellation_token):
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\agents\_assistant_agent.py", line 953, in on_messages_stream
    async for inference_output in self._call_llm(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_agentchat\agents\_assistant_agent.py", line 1109, in _call_llm
    model_result = await model_client.create(
  File "D:\Users\huanran233\anaconda3\envs\AutoGen\Lib\site-packages\autogen_ext\models\openai\_openai_client.py", line 712, in create
    prompt_tokens=getattr(result.usage, "prompt_tokens", 0) if result.usage is not None else 0,
AttributeError: 'str' object has no attribute 'usage'
```

🤖 AI Solution

1. Error Translation

Original:

AttributeError: 'str' object has no attribute 'usage'

In plain terms: the code tried to read the `usage` attribute of a value that is actually a plain string (`str`), which has no such attribute.

2. Root-Cause Analysis

  • Root cause: the code accesses the `usage` attribute on a string object, but that object is not the expected response object (it may be a plain error-message string or an unparsed raw response).

  • Location: the error occurs at line 712 of autogen_ext/models/openai/_openai_client.py, in this expression:

    ```python
    prompt_tokens=getattr(result.usage, "prompt_tokens", 0) if result.usage is not None else 0,
    ```

    Here, `result` is assumed to be an object with a `.usage` attribute (such as an OpenAI API response object), but it is actually a string, so accessing `result.usage` raises an `AttributeError`.

  • Context: this happens while an `AssistantAgent` calls the language-model (LLM) client to generate a reply. The client call (`model_client.create`) returned a string rather than a complete object carrying `usage` information, which usually means the model endpoint is misconfigured or returned an abnormal response.
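The failure mode described above can be reproduced in isolation; the sketch below uses stand-in objects (not AutoGen types) to show both the expected path and the failing one.

```python
from types import SimpleNamespace

# A well-formed response object: `usage` exists (and may legitimately be None).
ok = SimpleNamespace(content="hello", usage=SimpleNamespace(prompt_tokens=12))
tokens = getattr(ok.usage, "prompt_tokens", 0) if ok.usage is not None else 0
print(tokens)  # → 12

# A misbehaving endpoint that hands back a raw string instead of an object:
bad = "Internal Server Error"
try:
    _ = bad.usage  # this is the line-712 failure in miniature
except AttributeError as e:
    print(e)  # → 'str' object has no attribute 'usage'
```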
3. Solution

    Check and correct the model client (`model_client`) configuration so that it points to an API endpoint that returns a structured response object rather than a raw string. Troubleshooting and repair steps:

  • Check the model endpoint configuration:

    ```python
    # Make sure your model client is configured correctly, e.g. against the official OpenAI API
    from autogen_ext.models.openai import OpenAIChatCompletionClient

    model_client = OpenAIChatCompletionClient(
        model="gpt-4",  # or another valid model name
        api_key="your-api-key",
        # If you use a custom or proxy endpoint, verify the URL is correct
        # and that it returns the standard response format
        # base_url="https://your-proxy-endpoint.com/v1",
    )
    ```

  • Strengthen error handling (temporary debugging aid): to pinpoint the problem quickly, you can temporarily patch the `create` method in autogen_ext/models/openai/_openai_client.py (modifying library files is not recommended as a long-term fix):

    ```python
    # Around line 712, add a type check / exception guard
    try:
        prompt_tokens = getattr(result.usage, "prompt_tokens", 0) if result.usage is not None else 0
    except AttributeError as e:
        print(f"Unexpected result type from the model; raw result: {result}")
        raise RuntimeError(f"Model client did not return a valid response object: {e}") from e
    ```

  • Verify the API endpoint: test independently whether your endpoint returns a standard JSON response object:

    ```python
    import requests

    response = requests.post(
        "your-api-endpoint",
        json={"model": "your-model", "messages": [{"role": "user", "content": "hi"}]},
    )
    print(response.json())  # confirm the returned structure includes a `usage` field
    ```
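For reference, a standard OpenAI-style chat-completions body carries token accounting in a top-level `usage` object; a quick shape check (with invented values) might look like:

```python
# Illustrative shape of a standard chat-completions JSON body (values made up).
body = {
    "choices": [{"message": {"role": "assistant", "content": "Hi!"}}],
    "usage": {"prompt_tokens": 9, "completion_tokens": 2, "total_tokens": 11},
}

# A proxy that returns plain text or an HTML error page fails this check, and
# downstream parsing then yields the bare string seen in the traceback.
assert isinstance(body, dict) and "usage" in body, "response lacks a usage object"
print(body["usage"]["total_tokens"])  # → 11
```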

4. Preventive Measures

  • Test first: before integrating the model client into a complex workflow (such as an AutoGen group chat), write a standalone unit test verifying that `model_client.create` returns an object with a `usage` attribute.

  • Use official SDKs: prefer the model provider's official Python SDK (e.g. OpenAI's) or a widely vetted community client library, to ensure compatible, structured API responses.

  • Logging: in development, log the raw responses of model calls so that malformed response formats can be diagnosed quickly.
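As a sketch of the "test first" advice, a pre-flight check can be run against any client before wiring it into a team. `preflight_check` and the stub clients below are hypothetical names for illustration, not AutoGen APIs; with a real client you would pass real message objects and a valid API key.

```python
import asyncio
from types import SimpleNamespace

async def preflight_check(model_client):
    """Fail fast if the client returns a raw string instead of a response object."""
    result = await model_client.create([{"role": "user", "content": "ping"}])
    if isinstance(result, str):
        raise TypeError(f"client returned a raw string: {result!r}")
    if not hasattr(result, "usage"):
        raise TypeError("client response has no `usage` attribute")
    return result

# Stub clients demonstrate both outcomes without a real endpoint.
class GoodClient:
    async def create(self, messages):
        return SimpleNamespace(content="pong", usage=SimpleNamespace(prompt_tokens=3))

class BadClient:
    async def create(self, messages):
        return "Internal Server Error"  # mimics a misconfigured proxy

print(asyncio.run(preflight_check(GoodClient())).usage.prompt_tokens)  # → 3
try:
    asyncio.run(preflight_check(BadClient()))
except TypeError as e:
    print(e)  # → client returned a raw string: 'Internal Server Error'
```

Running this once against the real `model_client` would have surfaced the misconfiguration before the group chat ever started.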