# Release v0.10.0

**Gemini Support - Google Gemini API Provider with Streaming**

Adds Google Gemini as the fourth LLM provider, with full streaming support, safety content filtering, and CI mode integration.
## Added
### Google Gemini Provider

- New `GeminiProvider` implementing the `ApiBackend` trait (sketched below)
- Supports the `generateContent` and `streamGenerateContent` (SSE) endpoints
- Uses `x-goog-api-key` header authentication
- Default model: `gemini-3-flash-preview`
- Default endpoint: `https://generativelanguage.googleapis.com`
- Configurable `max_tokens`, `temperature`, and custom endpoint
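To make the list above concrete, here is a minimal, hypothetical sketch of how such a provider could be wired up: the struct mirrors the configurable options, and helper methods build the `generateContent`/`streamGenerateContent` URLs plus the JSON body that is sent with the `x-goog-api-key` header. The field names, URL paths, and method signatures are illustrative assumptions, not the actual `GeminiProvider`/`ApiBackend` code in gcop-rs.

```rust
// Illustrative sketch only (assumes the serde_json crate); the real provider differs.
use serde_json::{json, Value};

pub struct GeminiProvider {
    pub api_key: String,
    pub model: String,    // defaults to "gemini-3-flash-preview"
    pub endpoint: String, // defaults to "https://generativelanguage.googleapis.com"
    pub max_tokens: u32,
    pub temperature: f32,
}

impl GeminiProvider {
    pub fn new(api_key: impl Into<String>) -> Self {
        Self {
            api_key: api_key.into(),
            model: "gemini-3-flash-preview".to_string(),
            endpoint: "https://generativelanguage.googleapis.com".to_string(),
            max_tokens: 2000,
            temperature: 0.3,
        }
    }

    /// URL for the non-streaming generateContent endpoint (path assumed, v1beta REST API).
    pub fn generate_url(&self) -> String {
        format!("{}/v1beta/models/{}:generateContent", self.endpoint, self.model)
    }

    /// URL for the SSE streaming endpoint.
    pub fn stream_url(&self) -> String {
        format!("{}/v1beta/models/{}:streamGenerateContent?alt=sse", self.endpoint, self.model)
    }

    /// Request body in the Gemini REST format; sent with an `x-goog-api-key` header.
    pub fn request_body(&self, prompt: &str) -> Value {
        json!({
            "contents": [{ "role": "user", "parts": [{ "text": prompt }] }],
            "generationConfig": {
                "maxOutputTokens": self.max_tokens,
                "temperature": self.temperature
            }
        })
    }
}
```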
### Gemini Streaming Support

- Full SSE streaming with real-time typing effect
- Handles `finishReason` variants: `STOP`, `MAX_TOKENS`, `SAFETY`, `RECITATION`, etc.
- Non-`STOP` finish reasons display localized warnings (see the sketch below)
- Parse error counting and reporting (consistent with Claude/OpenAI streaming)
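For illustration, the non-`STOP` warning behaviour described above might look roughly like the following. The warning strings here are stand-ins; in gcop-rs they go through the i18n layer.

```rust
/// Maps a Gemini `finishReason` to an optional warning: `STOP` is silent,
/// anything else surfaces a warning. Sketch only; messages are placeholders.
fn finish_reason_warning(finish_reason: &str) -> Option<String> {
    match finish_reason {
        "STOP" => None, // normal completion, nothing to report
        "MAX_TOKENS" => Some("response truncated: max_tokens reached".to_string()),
        "SAFETY" => Some("response stopped by Gemini safety filters".to_string()),
        "RECITATION" => Some("response stopped due to recitation concerns".to_string()),
        other => Some(format!("response ended with finishReason {other}")),
    }
}

fn main() {
    // Each SSE chunk's candidate may carry a finishReason; warn on non-STOP values.
    for reason in ["STOP", "MAX_TOKENS", "SAFETY"] {
        if let Some(warning) = finish_reason_warning(reason) {
            eprintln!("warning: {warning}");
        }
    }
}
```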
### Gemini Safety Content Filtering

- Detects and reports content blocked by Gemini safety filters (illustrated below)
- Returns clear error messages with the blocking reason (e.g., `SAFETY`)
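A hypothetical sketch of the detection side, assuming the public Gemini REST response shape (`promptFeedback.blockReason`); gcop-rs's real error type and wording will differ.

```rust
// Sketch only; assumes the serde_json crate.
use serde_json::Value;

/// Returns an error message if the prompt was blocked by Gemini safety filters.
fn check_safety_block(response: &Value) -> Result<(), String> {
    if let Some(reason) = response
        .pointer("/promptFeedback/blockReason")
        .and_then(Value::as_str)
    {
        return Err(format!("content blocked by Gemini safety filters: {reason}"));
    }
    Ok(())
}

fn main() {
    let blocked: Value =
        serde_json::from_str(r#"{ "promptFeedback": { "blockReason": "SAFETY" } }"#).unwrap();
    assert_eq!(
        check_safety_block(&blocked),
        Err("content blocked by Gemini safety filters: SAFETY".to_string())
    );
}
```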
### Gemini API Validation

- `validate()` sends a minimal test request (`max_output_tokens = 1`) to verify connectivity (sketched below)
- Works with the `gcop config validate` command
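As a rough sketch of what such a connectivity probe can look like, assuming `reqwest` with the `blocking` and `json` features plus `serde_json`; the real `validate()` in gcop-rs may use a different HTTP stack and error type.

```rust
use serde_json::json;

/// Sends a minimal generateContent request (maxOutputTokens = 1) to confirm the
/// API key and endpoint are usable without generating meaningful output.
fn validate(endpoint: &str, model: &str, api_key: &str) -> Result<(), String> {
    let url = format!("{endpoint}/v1beta/models/{model}:generateContent");
    let body = json!({
        "contents": [{ "role": "user", "parts": [{ "text": "ping" }] }],
        "generationConfig": { "maxOutputTokens": 1 }
    });

    let response = reqwest::blocking::Client::new()
        .post(url)
        .header("x-goog-api-key", api_key)
        .json(&body)
        .send()
        .map_err(|e| format!("request failed: {e}"))?;

    if response.status().is_success() {
        Ok(())
    } else {
        Err(format!("Gemini validation failed: HTTP {}", response.status()))
    }
}
```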
### CI Mode Gemini Support

- `GCOP_CI_PROVIDER=gemini` is now supported (see the sketch below)
- CI mode provider list updated: claude, openai, ollama, gemini
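Purely for illustration, provider selection from the environment variable could look like this; the actual parsing and error messages live in gcop-rs's CI mode code and may differ.

```rust
use std::env;

// Provider names accepted in CI mode, per the list above.
const CI_PROVIDERS: &[&str] = &["claude", "openai", "ollama", "gemini"];

/// Reads GCOP_CI_PROVIDER; Ok(None) means the variable is unset.
fn ci_provider() -> Result<Option<String>, String> {
    match env::var("GCOP_CI_PROVIDER") {
        Ok(name) if CI_PROVIDERS.contains(&name.as_str()) => Ok(Some(name)),
        Ok(name) => Err(format!(
            "unknown CI provider '{name}', expected one of: {}",
            CI_PROVIDERS.join(", ")
        )),
        Err(_) => Ok(None),
    }
}
```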
### Gemini Error Suggestions

- API key errors now suggest: `Add 'api_key = "AIza..."' to [llm.providers.gemini]`
- Provider-not-found suggestions updated to include gemini
## Changed
### `ApiStyle` Enum Extended

- New `Gemini` variant with `Display`, `FromStr`, and serde support (sketched below)
- `default_model()` returns `"gemini-3-flash-preview"` for Gemini
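A sketch of the extended enum, assuming derived serde support (serde with the `derive` feature) and hand-written `Display`/`FromStr`. Only the `Gemini` variant and its default model string come from the notes above; the rest is illustrative and elides detail from the real `ApiStyle`.

```rust
use std::fmt;
use std::str::FromStr;
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum ApiStyle {
    Claude,
    OpenAi,
    Ollama,
    Gemini, // new in v0.10.0
}

impl ApiStyle {
    /// Default model per provider; only the Gemini value is confirmed by these notes.
    pub fn default_model(&self) -> &'static str {
        match self {
            ApiStyle::Gemini => "gemini-3-flash-preview",
            // Defaults for the other providers are unchanged by this release
            // and elided from this sketch.
            _ => unimplemented!("see the per-provider defaults in gcop-rs"),
        }
    }
}

impl fmt::Display for ApiStyle {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let s = match self {
            ApiStyle::Claude => "claude",
            ApiStyle::OpenAi => "openai",
            ApiStyle::Ollama => "ollama",
            ApiStyle::Gemini => "gemini",
        };
        f.write_str(s)
    }
}

impl FromStr for ApiStyle {
    type Err = String;
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "claude" => Ok(ApiStyle::Claude),
            "openai" => Ok(ApiStyle::OpenAi),
            "ollama" => Ok(ApiStyle::Ollama),
            "gemini" => Ok(ApiStyle::Gemini),
            other => Err(format!("unknown api style: {other}")),
        }
    }
}
```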
### i18n Messages Updated

- Added Gemini-specific messages in both English and Chinese locales
- Updated CI provider error messages to include "gemini"
- Updated provider-not-found suggestions to include "gemini"
## Configuration

```toml
[llm]
default_provider = "gemini"

[llm.providers.gemini]
api_key = "AIza..."
model = "gemini-3-flash-preview"                        # optional
endpoint = "https://generativelanguage.googleapis.com"  # optional
max_tokens = 2000                                       # optional
temperature = 0.3                                       # optional
```

## Road to 1.0
v0.10.0 completes the four-provider lineup (Claude, OpenAI, Ollama, Gemini). If no further breaking changes are needed, the next minor release will be v1.0.0, marking API stability.
## Upgrade

```bash
# Homebrew
brew upgrade gcop-rs

# Cargo
cargo install gcop-rs

# pip
pip install --upgrade gcop-rs
```