[PR #109] [MERGED] fix: Honor hat-level backend configuration and args #138

Closed
opened 2026-02-27 10:22:23 +03:00 by kerem · 0 comments

📋 Pull Request Information

Original PR: https://github.com/mikeyobrien/ralph-orchestrator/pull/109
Author: @zuozuo
Created: 1/25/2026
Status: Merged
Merged: 1/27/2026
Merged by: @mikeyobrien

Base: main ← Head: fix/hat-backend-args-ignored


📝 Commits (6)

  • 38a1eaa docs: add bug analysis for hat-level backend configuration issue
  • 9a07b7b fix: honor hat-level backend configuration and args
  • 568e2d4 fix(review): address all P0 and P1 issues from super-review
  • acf2736 fix(critical): use active hat ID to fetch backend configuration
  • bce2112 fix(critical): fix bug where starting_event configuration was ignored
  • 71382d2 fix(review): fix P1 issues - backend_name lifetime and comments

📊 Changes

8 files changed (+1657 additions, -11 deletions)

View changed files

➕ .reviews/pr109_super_review/meta.json (+1 -0)
➕ .reviews/pr109_super_review/review_codex.md (+715 -0)
➕ .reviews/pr109_super_review/summary.md (+298 -0)
➕ BUG_ANALYSIS.md (+276 -0)
➕ BUG_EXAMPLE.md (+272 -0)
📝 crates/ralph-adapters/src/pty_executor.rs (+11 -0)
📝 crates/ralph-cli/src/loop_runner.rs (+78 -10)
📝 crates/ralph-core/src/event_loop/mod.rs (+6 -1)

📄 Description

Problem

Hat-level backend configuration (including command and args) was being completely ignored. All hats were using the global cli.backend regardless of their custom backend settings.

Example of broken behavior

Configuration:

cli:
  backend: "claude"

hats:
  test_writer:
    backend:
      command: "codex"
      args: ["--model", "gpt-5.1-codex-max", "--yolo"]
  
  backend_implementer:
    backend:
      command: "codex"
      args: ["--dangerously-bypass-approvals-and-sandbox"]

Expected: Each hat uses its configured backend with specified args
Actual: All hats used claude with no args (global config)

Root Cause

loop_runner.rs created a single global backend at startup from cli config and reused it for all hats, never checking hat.backend.

Code location: crates/ralph-cli/src/loop_runner.rs:161

let backend = CliBackend::from_config(&config.cli) // Only uses global config

Solution

Before executing each hat, check if it has custom backend config using event_loop.get_hat_backend(). If present, create a backend using CliBackend::from_hat_backend(). Otherwise, fall back to global backend.

Changes:

  • Added dynamic backend selection logic before prompt execution
  • Uses hat-level backend if available, falls back to global backend
  • Added debug logging to trace which backend is used
  • Properly handles timeout configuration for different backend types
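The selection logic described above can be sketched in isolation. This is a minimal, self-contained illustration, not the actual ralph-orchestrator code: the type and function names (Backend, HatBackend, select_backend) are hypothetical stand-ins for CliBackend, the hat-level config returned by event_loop.get_hat_backend(), and the per-hat selection step.

```rust
// Hypothetical sketch of per-hat backend selection with global fallback.
// Names here are illustrative, not the real ralph-orchestrator API.

#[derive(Debug, Clone, PartialEq)]
struct Backend {
    command: String,
    args: Vec<String>,
}

// Stand-in for a hat's optional `backend:` config section.
struct HatBackend {
    command: String,
    args: Vec<String>,
}

fn select_backend(global: &Backend, hat_backend: Option<&HatBackend>) -> Backend {
    match hat_backend {
        // Hat declares its own backend: build one from the hat-level config
        // (analogous to CliBackend::from_hat_backend()).
        Some(hb) => Backend {
            command: hb.command.clone(),
            args: hb.args.clone(),
        },
        // No hat-level config: fall back to the global cli.backend.
        None => global.clone(),
    }
}

fn main() {
    let global = Backend { command: "claude".into(), args: vec![] };
    let hat = HatBackend {
        command: "codex".into(),
        args: vec!["--model".into(), "gpt-5.1-codex-max".into(), "--yolo".into()],
    };
    // Hat with custom config gets its own command and args.
    assert_eq!(select_backend(&global, Some(&hat)).command, "codex");
    // Hat without custom config keeps the global backend.
    assert_eq!(select_backend(&global, None), global);
}
```

The key point is that the choice happens per hat, immediately before prompt execution, rather than once at startup as the buggy code did.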

Impact

This fix enables:

  • Multi-model workflows (different hats using different AI models)
  • Hat-specific model parameters (e.g., --model gpt-5.1-codex-max)
  • Hat-specific CLI args (e.g., --dangerously-bypass-approvals-and-sandbox)
  • Custom backend commands per hat (e.g., Opus for planning, Codex for coding)

Testing

Tested with the configuration example above:

  • Before: All hats called claude with no args
  • After:
    • test_writer calls codex --model gpt-5.1-codex-max --yolo
    • backend_implementer calls codex --dangerously-bypass-approvals-and-sandbox

See attached BUG_EXAMPLE.md for detailed examples and BUG_ANALYSIS.md for technical analysis.

Backwards Compatibility

Fully backwards compatible

  • Hats without custom backend config continue using global cli.backend
  • Existing configurations work unchanged
  • Only adds new functionality, doesn't break existing behavior

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
