# Release v0.9.1

**Code Quality: ProgressReporter Decoupling + ApiStyle Type Safety + Enhanced Config Validation**

An internal quality release focused on architectural decoupling, type safety improvements, and configuration validation, with no user-facing feature changes.
## Added
### ProgressReporter Trait

- New `ProgressReporter` trait in `llm/mod.rs` decouples the LLM layer from the UI layer
- LLM providers no longer depend on `ui::Spinner` directly
- Enables easier testing and future alternative progress reporters (logging, event bus, etc.)
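A minimal sketch of what this decoupling looks like. The trait's actual method names in `llm/mod.rs` are not given in these notes, so `set_message`/`finish` and the `LogReporter`/`call_llm` names below are illustrative assumptions:

```rust
// Hypothetical sketch of the ProgressReporter decoupling; the real trait
// methods in llm/mod.rs may differ.
trait ProgressReporter {
    fn set_message(&self, msg: &str);
    fn finish(&self);
}

// A reporter that logs instead of driving a terminal spinner, illustrating
// the "alternative progress reporters" use case from the notes.
struct LogReporter;

impl ProgressReporter for LogReporter {
    fn set_message(&self, msg: &str) {
        println!("[progress] {msg}");
    }
    fn finish(&self) {
        println!("[progress] done");
    }
}

// An LLM call site can now take any reporter (or none) instead of ui::Spinner.
fn call_llm(reporter: Option<&dyn ProgressReporter>) -> String {
    if let Some(r) = reporter {
        r.set_message("contacting provider...");
    }
    let result = String::from("ok"); // stand-in for the provider response
    if let Some(r) = reporter {
        r.finish();
    }
    result
}

fn main() {
    let log = LogReporter;
    println!("{}", call_llm(Some(&log)));
    println!("{}", call_llm(None)); // progress reporting is fully optional
}
```

Because callers pass `Option<&dyn ProgressReporter>`, tests can pass `None` (or a recording stub) without pulling in any UI code.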
### ApiStyle Enum

- New type-safe `ApiStyle` enum (`Claude`, `OpenAI`, `Ollama`) replaces the string-based API style
- Compile-time exhaustive matching ensures all API styles are handled
- Built-in `default_model()` method centralizes default model mapping
- Implements `FromStr` and serde traits for seamless config file compatibility
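A sketch of the enum under the assumptions above. The serde derives are omitted to keep the example dependency-free, and the model names returned by `default_model()` are placeholders, not the project's real defaults:

```rust
use std::str::FromStr;

// Illustrative version of the type-safe ApiStyle enum; variant set taken
// from the release notes, everything else assumed.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum ApiStyle {
    Claude,
    OpenAI,
    Ollama,
}

impl ApiStyle {
    // Centralized default-model mapping; the match is exhaustive, so adding
    // a variant forces this function to be updated at compile time.
    fn default_model(self) -> &'static str {
        match self {
            ApiStyle::Claude => "claude-default-model-placeholder",
            ApiStyle::OpenAI => "openai-default-model-placeholder",
            ApiStyle::Ollama => "ollama-default-model-placeholder",
        }
    }
}

impl FromStr for ApiStyle {
    type Err = String;

    // Parses the config-file string form; unknown styles become an error
    // instead of falling through to an unreachable!() branch.
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s.to_ascii_lowercase().as_str() {
            "claude" => Ok(ApiStyle::Claude),
            "openai" => Ok(ApiStyle::OpenAI),
            "ollama" => Ok(ApiStyle::Ollama),
            other => Err(format!("unknown API style: {other}")),
        }
    }
}

fn main() {
    let style: ApiStyle = "claude".parse().unwrap();
    println!("{style:?} -> {}", style.default_model());
    println!("bad style parses: {}", "grpc".parse::<ApiStyle>().is_ok());
}
```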
### Configuration Reference Validation

- `default_provider` is now validated against defined `[llm.providers]` entries at startup
- `fallback_providers` entries are validated similarly
- Misconfigured references now fail fast with clear error messages instead of runtime crashes
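The fail-fast check could look like the following sketch, assuming providers are held in a map keyed by name; the function name, signature, and error wording are all hypothetical:

```rust
use std::collections::HashMap;

// Illustrative startup validation: every provider name referenced by
// default_provider or fallback_providers must exist in [llm.providers].
fn validate_references(
    default_provider: &str,
    fallback_providers: &[String],
    providers: &HashMap<String, ()>, // () stands in for the provider config
) -> Result<(), String> {
    if !providers.contains_key(default_provider) {
        return Err(format!(
            "default_provider '{default_provider}' is not defined in [llm.providers]"
        ));
    }
    for name in fallback_providers {
        if !providers.contains_key(name) {
            return Err(format!(
                "fallback provider '{name}' is not defined in [llm.providers]"
            ));
        }
    }
    Ok(())
}

fn main() {
    let mut providers = HashMap::new();
    providers.insert("claude".to_string(), ());

    // A valid reference passes; a typo fails fast with a clear message
    // instead of crashing later at first use.
    println!("{:?}", validate_references("claude", &[], &providers));
    println!("{:?}", validate_references("claud", &[], &providers));
}
```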
### Machine-Readable Markdown Output

- New `OutputFormat::is_machine_readable()` method unifies JSON and Markdown behavior
- Markdown output now skips the spinner and step indicators (consistent with JSON mode)
- New `effective_colored()` method auto-disables color for machine-readable formats
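A sketch of how these two methods might fit together. The notes confirm JSON and Markdown variants; the `Text` variant and the `effective_colored` signature are assumptions:

```rust
// Illustrative OutputFormat; only Json and Markdown are confirmed by the
// release notes, Text is a hypothetical human-facing variant.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum OutputFormat {
    Text,
    Json,
    Markdown,
}

impl OutputFormat {
    // Single source of truth for "skip spinner, skip step indicators".
    fn is_machine_readable(self) -> bool {
        matches!(self, OutputFormat::Json | OutputFormat::Markdown)
    }

    // Color stays on only when the user asked for it AND the output is
    // intended for humans.
    fn effective_colored(self, colored: bool) -> bool {
        colored && !self.is_machine_readable()
    }
}

fn main() {
    println!("{}", OutputFormat::Markdown.is_machine_readable());
    println!("{}", OutputFormat::Json.effective_colored(true));
    println!("{}", OutputFormat::Text.effective_colored(true));
}
```

Routing both JSON and Markdown through one predicate means future machine-readable formats only need to extend the `matches!` arm.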
## Changed
### Error Handling Responsibilities Shifted

- `review` and `stats` commands now handle their own JSON error output internally
- `main.rs` no longer needs to know each command's output type
- Extracted `handle_command_error()` helper to eliminate duplicate error display logic
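The notes name the helper but not its signature, so the following is a loose sketch of the deduplication idea, with an invented parameter and output shape:

```rust
// Hypothetical shared error-display helper: each command decides whether it
// is in JSON mode, then delegates formatting here instead of duplicating it.
fn handle_command_error(err: &str, json_output: bool) -> String {
    if json_output {
        // Machine-readable callers get a structured error object.
        format!("{{\"error\":\"{err}\"}}")
    } else {
        // Human-readable callers get a plain prefixed message.
        format!("error: {err}")
    }
}

fn main() {
    println!("{}", handle_command_error("no staged changes", false));
    println!("{}", handle_command_error("no staged changes", true));
}
```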
### CI Mode Uses ApiStyle Enum

- CI provider type validation uses `ApiStyle::from_str()` instead of string matching
- Default model selection uses `ApiStyle::default_model()` instead of scattered match arms
- Eliminates `unreachable!()` branches
### Configuration Examples Simplified

- Example config files reduced from 144 lines to 39 lines
- Removed verbose comments and advanced options
- Added a documentation link for the full configuration reference
### LLM Provider Interface Updated

- `generate_commit_message()` and `review_code()` now accept `Option<&dyn ProgressReporter>` instead of `Option<&Spinner>`
- All built-in providers (Claude, OpenAI, Ollama, Fallback) updated accordingly
## Upgrade

```bash
# Homebrew
brew upgrade gcop-rs

# Cargo
cargo install gcop-rs

# pip
pip install --upgrade gcop-rs
```