A detailed personal workflow for integrating multiple AI tools into Rails development while maintaining strict engineering gates. The setup uses Perplexity for research, Claude Code for long-context repo work, Codex for focused execution, gbrain for private persistent memory, gstack for repeatable workflows, CodeRabbit for PR review, Copilot for inline and delegated tasks, and Superconductor for parallel agents with isolated worktrees. Hard rules include never merging AI-generated code without CI green and reviewer sign-off, keeping private data out of prompts, and treating model changes as deploys. A case study covers the Uploadcare Ruby and Rails gem v5 rewrite, showing how AI accelerated review and documentation while human gates and tests remained mandatory. The post emphasizes measuring useful metrics like merge time and eval failures rather than percentage of code written by AI.
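The hard rule about merging can be sketched as a tiny policy check. This is a hypothetical illustration, not code from the post: the function name and parameters are invented, and in practice these signals would come from a CI provider and a code host's review API.

```ruby
# Hypothetical pre-merge gate mirroring the post's hard rule:
# AI-generated code merges only when CI is green AND at least one
# human reviewer has signed off. All names here are illustrative.
def merge_allowed?(ci_status:, approvals:, ai_generated:)
  # PRs with no AI-generated code follow the team's normal policy.
  return true unless ai_generated

  # For AI-generated code, both gates are mandatory.
  ci_status == "success" && approvals >= 1
end

puts merge_allowed?(ci_status: "success", approvals: 1, ai_generated: true) # true
puts merge_allowed?(ci_status: "pending", approvals: 2, ai_generated: true) # false
```

A real enforcement point would live in branch protection settings or a required status check, so the gate cannot be bypassed from a local machine.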

17 min read · From blog.saeloun.com
Table of contents

- The Old Way
- The New Way
- The Architecture
- Hard Rules - Act, Enforce, Audit
- Router, Not Model
- Source Trust Levels
- Perplexity Is For Research
- Claude Code Is For Longer Repo Work
- Codex Is For Focused Execution
- gbrain Is My Private Memory
- llmwiki Is For Durable Knowledge
- Memory Needs Promotion
- gstack Turns Memory Into Workflow
- Assistant Contracts Belong In The Repo
- MCP And Hooks Are Guardrails
- OpenClaw Is An Experiment
- Superconductor Is For Parallel Agents
- Browser Sessions Are For Proof
- Copilot Has Two Jobs Now
- CodeRabbit Is More Than A Comment Bot
- Team Members Should Not All Use The Same Tool
- Twitter And Public Signals Are Inputs
- I Test The Setup With Evals
- AI Runs Need An Audit Trail
- Model Changes Are Deploys
- Case Study: Uploadcare Rewrite
- GitHub Checks Are Not Optional
- My Current Daily Loop
- First Rollout For A Rails Team
- What I Am Optimizing For
- What I Would Not Outsource
- Where This Is Going
- Related Links
