deerflow2/backend/packages/harness/deerflow/models

Branch: Async23    Commit: 0948c7a4e1    2026-04-07 18:21:22 +08:00

Latest commit: fix(provider): preserve streamed Codex output when response.completed.output is empty (#1928)

* fix: preserve streamed Codex output items
* fix: prefer completed Codex output over streamed placeholders
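The two bullets describe a fallback rule: accumulate output items from streaming events, but prefer the final `response.completed` output whenever it is non-empty. A minimal sketch of that merge logic, assuming hypothetical item dicts and a hypothetical `merge_codex_output` helper (not deerflow's actual code):

```python
def merge_codex_output(
    streamed_items: list[dict], completed_output: list[dict]
) -> list[dict]:
    """Prefer the completed response's output, but fall back to items
    accumulated from streaming events when the completed output is empty."""
    if completed_output:
        # response.completed carried real output: trust it over any
        # streamed placeholders collected along the way.
        return completed_output
    # response.completed.output was empty: preserve the streamed items
    # instead of discarding them.
    return streamed_items
```

The earlier bug would correspond to unconditionally taking `completed_output`, which discards streamed items when the completed output arrives empty.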
File                        Last commit                                                                                     Date
__init__.py                 refactor: split backend into harness (deerflow.*) and app (app.*) (#1131)                       2026-03-14 22:55:52 +08:00
claude_provider.py          fix(oauth): Harden Claude OAuth cache-control handling (#1583)                                  2026-03-30 07:41:18 +08:00
credential_loader.py        Fix Windows backend test compatibility (#1384)                                                  2026-03-26 17:39:16 +08:00
factory.py                  feat(models): add vLLM provider support (#1860)                                                 2026-04-06 15:18:34 +08:00
openai_codex_provider.py    fix(provider): preserve streamed Codex output when response.completed.output is empty (#1928)   2026-04-07 18:21:22 +08:00
patched_deepseek.py         refactor: split backend into harness (deerflow.*) and app (app.*) (#1131)                       2026-03-14 22:55:52 +08:00
patched_minimax.py          feat(harness): integration ACP agent tool (#1344)                                               2026-03-26 14:20:18 +08:00
patched_openai.py           ci: enforce code formatting checks for backend and frontend (#1536)                             2026-03-29 15:34:38 +08:00
vllm_provider.py            feat(models): add vLLM provider support (#1860)                                                  2026-04-06 15:18:34 +08:00
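A `factory.py` sitting next to per-provider modules (Claude, Codex, vLLM, patched DeepSeek/MiniMax/OpenAI) usually implies a name-to-provider registry. A minimal hypothetical sketch of that pattern; the class names, registry, and `create_provider` function are illustrative assumptions, not deerflow's actual API:

```python
from typing import Callable


class Provider:
    """Illustrative base class; deerflow's real interface is not shown here."""

    def complete(self, prompt: str) -> str:
        raise NotImplementedError


class VLLMProvider(Provider):
    """Stand-in for a vLLM-backed provider (assumption for illustration)."""

    def complete(self, prompt: str) -> str:
        return f"[vllm] {prompt}"


# Registry mapping provider names to constructors; real code would
# register one entry per provider module in this directory.
_REGISTRY: dict[str, Callable[[], Provider]] = {
    "vllm": VLLMProvider,
}


def create_provider(name: str) -> Provider:
    """Look up and instantiate a provider by name."""
    try:
        return _REGISTRY[name]()
    except KeyError:
        raise ValueError(f"unknown provider: {name}") from None
```

One benefit of this shape is that adding a provider (as #1860 did for vLLM) only touches the new module and one registry entry.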