Add langchain-ollama as an optional dependency and provide ChatOllama config examples, enabling proper thinking/reasoning content preservation for local Ollama models.

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
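A minimal sketch of what the optional-dependency declaration in `pyproject.toml` could look like; the extras group name `ollama` and the version bound are illustrative assumptions, not copied from the actual diff.

```toml
# Hypothetical sketch: declare langchain-ollama as an optional extra
# so core installs stay lean. Group name and version bound are assumed.
[project.optional-dependencies]
ollama = [
    "langchain-ollama>=0.2.0",
]
```

Users would then opt in with `pip install "deerflow[ollama]"` (extras name assumed as above).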
Changed files:

| File |
|---|
| deerflow/pyproject.toml |
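To illustrate the "thinking/reasoning content preservation" this change targets, here is a hedged sketch of a `ChatOllama` configuration. It assumes a local Ollama server is running and the optional extra is installed; the model name and the use of the `reasoning` flag to separate thinking content from the final answer are assumptions about the examples this change adds, not text from the diff.

```python
# Hypothetical ChatOllama config sketch; requires a running local Ollama
# server and the langchain-ollama package. Model name is an assumption.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="qwen3",     # any locally pulled model that emits thinking content
    reasoning=True,    # keep the model's reasoning separate from the answer
    temperature=0.0,
)

msg = llm.invoke("What is 2 + 2?")
print(msg.content)  # final answer, with thinking stripped out
# Preserved reasoning, if the installed version exposes it this way:
print(msg.additional_kwargs.get("reasoning_content"))
```

The design intent is that downstream code can render or log the model's reasoning without it leaking into the user-facing answer text.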