mirror of
https://github.com/gadievron/raptor.git
synced 2026-04-24 21:46:00 +03:00
[PR #33] [MERGED] Replace individual LLM providers with unified LiteLLM integration (All Cursor bot bugs fixed) #43
📋 Pull Request Information
Original PR: https://github.com/gadievron/raptor/pull/33
Author: @gadievron
Created: 12/11/2025
Status: ✅ Merged
Merged: 12/12/2025
Merged by: @danielcuthbert
Base: main ← Head: feat/litellm-integration

📝 Commits (8)
- 0a0f6b1 Replace individual LLM providers with unified LiteLLM integration
- ec7ba43 Fix all 5 Cursor bot bugs from code review
- 9104d1a Fix Cursor bot bugs #6 and #7 from PR review
- 69e2039 Fix 4 additional Cursor bot bugs (#8-11) from PR #33 review
- 0d9b1e1 Fix Cursor Bot Bug #12: Missing total_tokens attribute
- 30e584b Fix Cursor bot bugs #13-14: Pass api_key to LiteLLM and fix tuple unpacking
- e8e5471 Fix Bug #15: Remove dead multi_turn code path calling non-existent method
- ab075b3 Update README.md

📊 Changes
12 files changed (+377 additions, -451 deletions)
- README.md (+6 -4)
- packages/autonomous/dialogue.py (+6 -6)
- packages/codeql/autonomous_analyzer.py (+7 -16)
- packages/codeql/dataflow_validator.py (+2 -2)
- packages/llm_analysis/agent.py (+14 -18)
- packages/llm_analysis/crash_agent.py (+9 -10)
- packages/llm_analysis/llm/client.py (+37 -10)
- packages/llm_analysis/llm/config.py (+14 -12)
- packages/llm_analysis/llm/providers.py (+265 -358)
- packages/web/fuzzer.py (+4 -5)
- packages/web/scanner.py (+8 -4)
- requirements.txt (+5 -6)

📄 Description
Summary
This PR replaces the individual LLM provider implementations with a unified LiteLLM integration, providing a single client interface, automatic fallback, cost tracking, and response caching (see Key Features below).
All Cursor Bot Bugs Fixed ✅
This PR includes fixes for all 5 Cursor bot bugs identified in the original review:

Bug #1 (HIGH): Tuple Unpacking in fuzzer.py:109
generate_structured() returns a (Dict, str) tuple, but the code wasn't unpacking it; changed result = ... to result, _ = ...

Bug #2 (HIGH): API Key Sanitization in client.py:289,296
Added a _sanitize_log_message() wrapper to exception logging.

Bug #3 (MEDIUM): Exploitability Check in agent.py:1047-1058
generate_patch() was being called for all vulnerabilities; added an if vuln.exploitable: guard.

Bug #4 (MEDIUM): Ollama api_base Configuration in providers.py:235-237
The api_base configuration was not being passed to LiteLLM.

Bug #5 (MEDIUM): Schema Format in fuzzer.py:105
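The Bug #1 fix above reduces to a one-line unpacking change. The following is a minimal sketch, not the project's code: generate_structured() here is a stand-in that mimics the described (Dict, str) return shape of (parsed result, raw LLM text).

```python
from typing import Dict, Tuple

def generate_structured(prompt: str) -> Tuple[Dict, str]:
    """Stand-in for the real method: returns (parsed dict, raw LLM text)."""
    raw = '{"payload": "test-input"}'
    return {"payload": "test-input"}, raw

# Before the fix, `result` held the whole tuple, so dict-style access failed:
# result = generate_structured("fuzz this field")   # result["payload"] -> TypeError

# After the fix: unpack the tuple, discarding the raw text.
result, _ = generate_structured("fuzz this field")
print(result["payload"])  # test-input
```

The underscore conventionally signals that the raw-text half of the tuple is intentionally unused at this call site.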
Verification ✅
All fixes have been comprehensively verified:
See CURSOR_BOT_BUGS_FINAL_REPORT.md for complete verification details.

Key Features
- Unified LLM Interface: a single LLMClient class for all providers
- Automatic Fallback
- Cost Tracking
- Response Caching
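The automatic-fallback behavior listed above can be sketched as follows. This is an illustrative pattern only, not the project's implementation: the provider call is injected as a function so the sketch stays self-contained, whereas the real client presumably delegates to LiteLLM, and the model strings are just examples of LiteLLM-style identifiers.

```python
from typing import Callable, List

def complete_with_fallback(models: List[str],
                           prompt: str,
                           call: Callable[[str, str], str]) -> str:
    """Try each model in order; return the first successful completion."""
    errors = []
    for model in models:
        try:
            return call(model, prompt)
        except Exception as exc:  # a real client would catch provider-specific errors
            errors.append(f"{model}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))

def flaky_call(model: str, prompt: str) -> str:
    """Simulated provider: the primary model is unreachable."""
    if model == "gpt-4o":
        raise ConnectionError("provider unreachable")
    return f"[{model}] echo: {prompt}"

print(complete_with_fallback(["gpt-4o", "ollama/llama3"], "hi", flaky_call))
```

Collecting per-model errors and raising only after the whole chain fails keeps the failure message actionable without masking which provider broke first.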
Implementation Details
Files Modified
- packages/llm_analysis/llm/client.py - Unified LLM client (Bug #2 fixed)
- packages/llm_analysis/llm/providers.py - LiteLLM provider (Bug #4 fixed)
- packages/llm_analysis/llm/config.py - LLM configuration
- packages/llm_analysis/agent.py - Analysis agent (Bug #3 fixed)
- packages/web/fuzzer.py - Web fuzzer (Bugs #1 and #5 fixed)

Testing
Breaking Changes
None - The refactoring maintains backward compatibility with the existing API.
Migration Notes
For existing code using provider-specific clients:
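The migration snippet itself did not survive the mirror, so the before/after below is a hypothetical sketch: the names OpenAIProvider, LLMClient, and complete() are illustrative and not the project's confirmed API. The method bodies are stubs standing in for real API calls.

```python
# Hypothetical before/after; names are illustrative, not the project's confirmed API.

# Before: one provider-specific client class per vendor.
class OpenAIProvider:
    def chat(self, prompt: str) -> str:
        return f"openai: {prompt}"  # stub in place of a real OpenAI API call

# After: a single unified client; the provider is selected by a model string
# (LiteLLM-style identifiers such as "gpt-4o" or "ollama/llama3").
class LLMClient:
    def __init__(self, model: str):
        self.model = model

    def complete(self, prompt: str) -> str:
        return f"{self.model}: {prompt}"  # stub; the real client delegates to LiteLLM

old_style = OpenAIProvider().chat("hello")
new_style = LLMClient("gpt-4o").complete("hello")
```

The point of the unified shape is that switching providers becomes a one-string change rather than a class swap.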
Security Enhancements
Documentation
🤖 Generated with Claude Code
Co-Authored-By: Claude noreply@anthropic.com
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.