Commit c881895
Parent(s): 3d9fb4c
fix: replace deprecated OpenAIModel with OpenAIChatModel

Update pydantic-ai imports to use OpenAIChatModel instead of the
deprecated OpenAIModel class. Also add Bug 3 (deprecation warning) and
Bug 4 (asyncio GC errors) to the integration bug documentation.

Files updated:
- src/app.py: import and type annotation
- src/utils/llm_factory.py: import and usage
- docs/bugs/P1_MAGENTIC_STREAMING_AND_KEY_PERSISTENCE.md: add 2 new bugs

All 136 tests pass.
docs/bugs/P1_MAGENTIC_STREAMING_AND_KEY_PERSISTENCE.md
CHANGED

````diff
@@ -1,10 +1,10 @@
-# Bug Report: Magentic Mode
+# Bug Report: Magentic Mode Integration Issues
 
 ## Status
 - **Date:** 2025-11-29
 - **Reporter:** CLI User
-- **Priority:** P1 (UX Degradation)
-- **Component:** `src/app.py`, `src/orchestrator_magentic.py`
+- **Priority:** P1 (UX Degradation + Deprecation Warnings)
+- **Component:** `src/app.py`, `src/orchestrator_magentic.py`, `src/utils/llm_factory.py`
 
 ---
 
@@ -151,14 +151,82 @@ Leverage HF's built-in auth and secrets management.
 
 ---
 
+## Bug 3: Deprecated `OpenAIModel` Import
+
+### Symptoms
+HuggingFace Spaces logs show deprecation warning:
+```
+DeprecationWarning: OpenAIModel is deprecated, use OpenAIChatModel instead
+```
+
+### Root Cause
+**Files using deprecated API:**
+- `src/app.py:9` - `from pydantic_ai.models.openai import OpenAIModel`
+- `src/utils/llm_factory.py:59` - `from pydantic_ai.models.openai import OpenAIModel`
+
+**File already using correct API:**
+- `src/agent_factory/judges.py:12` - `from pydantic_ai.models.openai import OpenAIChatModel`
+
+### Fix
+Replace all `OpenAIModel` imports with `OpenAIChatModel`:
+
+```python
+# Before (deprecated)
+from pydantic_ai.models.openai import OpenAIModel
+model = OpenAIModel(settings.openai_model, provider=provider)
+
+# After (correct)
+from pydantic_ai.models.openai import OpenAIChatModel
+model = OpenAIChatModel(settings.openai_model, provider=provider)
+```
+
+**Files to update:**
+1. `src/app.py` - lines 9, 64, 73
+2. `src/utils/llm_factory.py` - lines 59, 67
+
+---
+
+## Bug 4: Asyncio Event Loop Garbage Collection Error
+
+### Symptoms
+HuggingFace Spaces logs show intermittent errors:
+```
+ValueError: Invalid file descriptor: -1
+Exception ignored in: <function BaseSelector.__del__ at 0x...>
+```
+
+### Root Cause
+This occurs during garbage collection of asyncio event loops. Likely causes:
+1. Event loop cleanup timing issues in Gradio's threaded model
+2. Selector objects being garbage-collected before proper cleanup
+3. Concurrent access to event loop resources during shutdown
+
+### Analysis
+The codebase uses `asyncio.get_running_loop()` correctly (not the deprecated `get_event_loop()`).
+This error appears to be a Gradio/HuggingFace Spaces environment issue rather than a code bug.
+
+### Potential Mitigations
+1. **Add explicit cleanup**: Use `asyncio.get_event_loop().close()` in appropriate places
+2. **Ignore in logs**: This is a known Python issue and can be safely ignored if it doesn't affect functionality
+3. **File issue with Gradio**: If reproducible, report to Gradio GitHub
+
+### Impact
+- **Severity**: Low - appears to be a cosmetic log issue
+- **User-visible**: No - errors occur during garbage collection, not during request handling
+
+---
+
 ## Recommended Priority
 
-1. **Bug 1 (Streaming Spam)**:
-2. **Bug
+1. **Bug 1 (Streaming Spam)**: HIGH - makes Advanced mode unusable for reading output
+2. **Bug 3 (OpenAIModel deprecation)**: MEDIUM - fix to avoid future breakage
+3. **Bug 2 (Key Persistence)**: LOW - annoying but users can re-paste
+4. **Bug 4 (Asyncio GC)**: LOW - cosmetic log noise, monitor but likely no action needed
 
 ## Testing Plan
 
 1. Run Advanced mode query, verify no token-by-token spam
-2.
-3.
-4.
+2. Verify no deprecation warnings in logs after OpenAIChatModel fix
+3. Paste API key, click example, verify key persists
+4. Refresh page, verify key persists (if using localStorage)
+5. Run `make check` - all tests pass
````
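On the "explicit cleanup" mitigation listed under Bug 4 above: the sketch below shows one way to close a manually created event loop deterministically instead of leaving its selector to the garbage collector. It is illustrative only, assumes the Magentic path ever builds loops by hand, and the helper name is not taken from this repository.

```python
# Illustrative sketch: close a hand-built event loop explicitly so its
# selector's file descriptor is released before garbage collection
# (the source of the "Invalid file descriptor: -1" log noise).
import asyncio
from collections.abc import Coroutine
from typing import Any


def run_in_fresh_loop(coro: Coroutine[Any, Any, Any]) -> Any:
    loop = asyncio.new_event_loop()
    try:
        return loop.run_until_complete(coro)
    finally:
        loop.close()  # deterministic cleanup, nothing left for __del__
```

Code paths that already go through `asyncio.run()` get this cleanup for free, so the sketch only matters where loops are created manually (for example, inside worker threads).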
src/app.py
CHANGED

````diff
@@ -6,7 +6,7 @@ from typing import Any
 
 import gradio as gr
 from pydantic_ai.models.anthropic import AnthropicModel
-from pydantic_ai.models.openai import OpenAIModel
+from pydantic_ai.models.openai import OpenAIChatModel
 from pydantic_ai.providers.anthropic import AnthropicProvider
 from pydantic_ai.providers.openai import OpenAIProvider
 
@@ -61,7 +61,7 @@ def configure_orchestrator(
     # 2. Paid API Key (User provided or Env)
     elif user_api_key and user_api_key.strip():
         # Auto-detect provider from key prefix
-        model: AnthropicModel | OpenAIModel
+        model: AnthropicModel | OpenAIChatModel
         if user_api_key.startswith("sk-ant-"):
             # Anthropic key
             anthropic_provider = AnthropicProvider(api_key=user_api_key)
@@ -70,7 +70,7 @@ def configure_orchestrator(
         elif user_api_key.startswith("sk-"):
             # OpenAI key
             openai_provider = OpenAIProvider(api_key=user_api_key)
-            model = OpenAIModel(settings.openai_model, provider=openai_provider)
+            model = OpenAIChatModel(settings.openai_model, provider=openai_provider)
            backend_info = "Paid API (OpenAI)"
         else:
             raise ConfigurationError(
````
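To support testing-plan item 2 ("no deprecation warnings"), one possible regression check is to escalate `DeprecationWarning` to an error while constructing the model. This is a sketch under assumptions: that the warning fires at import or construction time, and that a dummy key plus the `gpt-4o-mini` model name are acceptable here, since construction makes no network call.

```python
# Sketch of a regression check (not part of this commit): fail loudly if a
# deprecated OpenAIModel usage is ever reintroduced on this code path.
import warnings

from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

with warnings.catch_warnings():
    warnings.simplefilter("error", DeprecationWarning)
    provider = OpenAIProvider(api_key="sk-test")  # dummy key, no request is sent
    model = OpenAIChatModel("gpt-4o-mini", provider=provider)  # assumed model name
```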
src/utils/llm_factory.py
CHANGED

````diff
@@ -56,7 +56,7 @@ def get_pydantic_ai_model() -> Any:
         Configured pydantic-ai model
     """
     from pydantic_ai.models.anthropic import AnthropicModel
-    from pydantic_ai.models.openai import OpenAIModel
+    from pydantic_ai.models.openai import OpenAIChatModel
     from pydantic_ai.providers.anthropic import AnthropicProvider
     from pydantic_ai.providers.openai import OpenAIProvider
 
@@ -64,7 +64,7 @@ def get_pydantic_ai_model() -> Any:
         if not settings.openai_api_key:
             raise ConfigurationError("OPENAI_API_KEY not set for pydantic-ai")
         provider = OpenAIProvider(api_key=settings.openai_api_key)
-        return OpenAIModel(settings.openai_model, provider=provider)
+        return OpenAIChatModel(settings.openai_model, provider=provider)
 
     if settings.llm_provider == "anthropic":
         if not settings.anthropic_api_key:
````
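For context, a minimal usage sketch of the updated factory; the import path and system prompt are assumptions, not taken from this commit.

```python
# Illustrative wiring: the factory now returns an OpenAIChatModel (or
# AnthropicModel) that plugs straight into a pydantic-ai Agent.
from pydantic_ai import Agent

from src.utils.llm_factory import get_pydantic_ai_model  # assumed import path

model = get_pydantic_ai_model()  # provider and key come from settings
agent = Agent(model, system_prompt="Answer concisely.")  # ready for run()/run_sync()
```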