b6c8da8cca3e383f3dc09369fd29133fa133e8e5
LLM responses can take more than 60 s, especially with local models. The WebSocket listener was timing out before the response arrived, so agent replies appeared in the logs but never reached the chat UI.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
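The failure mode described above can be sketched with plain asyncio. The names, delays, and timeout values below are illustrative assumptions, not the project's actual listener code: a receive await with too small a timeout drops a reply that a longer ceiling would have delivered.

```python
import asyncio

# Hypothetical stand-in for an LLM that answers after a long pause.
async def slow_model_reply(delay: float) -> str:
    await asyncio.sleep(delay)
    return "agent reply"

async def listen_for_reply(delay: float, timeout: float):
    # If the timeout fires before the message arrives, the reply is
    # dropped here and never reaches the chat UI, even though the
    # generation itself completed and was logged elsewhere.
    try:
        return await asyncio.wait_for(slow_model_reply(delay), timeout=timeout)
    except asyncio.TimeoutError:
        return None

async def main() -> None:
    # A 0.2 s "generation" against scaled-down timeouts (0.1 s vs 0.5 s)
    # reproduces the bug and the fix without waiting a real minute.
    dropped = await listen_for_reply(delay=0.2, timeout=0.1)    # old, too-short ceiling
    delivered = await listen_for_reply(delay=0.2, timeout=0.5)  # raised ceiling
    print(dropped, delivered)

asyncio.run(main())
```

The fix amounts to raising (or removing) the receive timeout so it comfortably exceeds the worst-case generation time of a local model.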