
Release v0.1.6

🔄 LLM API Auto-Retry & Network Enhancements

This release focuses on improving network request stability and fault tolerance, adding an automatic retry mechanism and full proxy support.

New Features

1. Automatic Retry Mechanism

LLM API requests are now retried automatically to handle transient network failures and API rate limiting; a sketch of the retry loop follows the strategy list below.

Retry Strategy:

  • Exponential Backoff: 1s → 2s → 4s
  • Max Retries: 3 times (4 total attempts)
  • Smart Retry: Only retries recoverable errors
    • ✅ Connection failed
    • ✅ 429 Rate limit
    • ❌ 401/403 Authentication errors (no retry)
    • ❌ 400 Bad request (no retry)
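
For illustration, here is a minimal sketch of this retry strategy in Rust. The error type, the `is_recoverable` helper, and the simulated request are hypothetical and only mirror the behavior described above; they are not the actual gcop-rs implementation.

```rust
use std::thread::sleep;
use std::time::Duration;

// Values mirrored from this release's constants; the helper itself is a sketch.
const MAX_RETRY_ATTEMPTS: usize = 3;
const INITIAL_RETRY_DELAY_MS: u64 = 1000;

#[derive(Debug)]
struct ApiError {
    status: Option<u16>, // None = connection-level failure (no HTTP status)
    message: String,
}

/// Only connection failures and 429 responses are worth retrying.
fn is_recoverable(err: &ApiError) -> bool {
    matches!(err.status, None | Some(429))
}

/// Calls `request` up to 1 + MAX_RETRY_ATTEMPTS times with exponential backoff.
fn retry_request<T>(mut request: impl FnMut() -> Result<T, ApiError>) -> Result<T, ApiError> {
    let mut delay = Duration::from_millis(INITIAL_RETRY_DELAY_MS);
    let total_attempts = MAX_RETRY_ATTEMPTS + 1;

    for attempt in 1..=total_attempts {
        match request() {
            Ok(value) => return Ok(value),
            Err(err) if attempt < total_attempts && is_recoverable(&err) => {
                eprintln!(
                    "WARN  request failed (attempt {attempt}/{total_attempts}): {}. Retrying in {:?}...",
                    err.message, delay
                );
                sleep(delay);
                delay *= 2; // 1s -> 2s -> 4s
            }
            Err(err) => return Err(err), // non-recoverable, or out of attempts
        }
    }
    unreachable!("the loop always returns");
}

fn main() {
    // Simulated request that always hits a rate limit.
    let result: Result<(), ApiError> = retry_request(|| {
        Err(ApiError { status: Some(429), message: "429 rate limit".into() })
    });
    println!("{result:?}");
}
```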

Retry Log Example (use -v to view):

```
WARN  OpenAI API request failed (attempt 1/4): connection failed. Retrying in 1.0s...
WARN  OpenAI API request failed (attempt 2/4): connection failed. Retrying in 2.0s...
INFO  OpenAI API request succeeded after 3 attempts
```

2. HTTP Timeout Configuration

Prevents requests from hanging indefinitely:

  • Request Timeout: 120 seconds (entire API call)
  • Connection Timeout: 10 seconds (establishing connection)

This ensures the program won't freeze even if the API responds slowly or network is unstable.
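
As a rough sketch (not necessarily the exact gcop-rs code), these two timeouts correspond to reqwest's `timeout` and `connect_timeout` builder options:

```rust
use std::time::Duration;

// Sketch of a client with the timeouts described above; the real builder
// call in gcop-rs may include additional options.
fn build_client() -> reqwest::Result<reqwest::Client> {
    reqwest::Client::builder()
        .timeout(Duration::from_secs(120))        // entire API call
        .connect_timeout(Duration::from_secs(10)) // establishing the connection
        .build()
}
```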

3. Enhanced Error Messages

Network errors now provide detailed diagnostic information and resolution suggestions:

```
✗ Error: OpenAI API connection failed: error sending request. Check network connectivity or API endpoint.
💡 Suggestion: Cannot connect to API server. Check endpoint URL, network, or DNS settings
```

Error Type Recognition:

  • timeout - Request timed out
  • connection failed - Cannot establish connection
  • request error - Request construction error
  • body error - Request body error
  • decode error - Response decode error
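
As an illustration, such categories can be derived from reqwest's error predicates; the sketch below is an assumption about how the mapping could look, not the exact gcop-rs code:

```rust
// Hypothetical mapping from a reqwest error to the categories listed above.
fn classify(err: &reqwest::Error) -> &'static str {
    if err.is_timeout() {
        "timeout"
    } else if err.is_connect() {
        "connection failed"
    } else if err.is_body() {
        "body error"
    } else if err.is_decode() {
        "decode error"
    } else if err.is_request() {
        "request error"
    } else {
        "unknown error"
    }
}
```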

4. Proxy Support

Proxy configuration via environment variables is supported (a built-in reqwest feature):

HTTP/HTTPS Proxy:

```bash
export HTTP_PROXY=http://127.0.0.1:7890
export HTTPS_PROXY=http://127.0.0.1:7890
gcop-rs commit
```

SOCKS5 Proxy (new socks feature):

```bash
export HTTP_PROXY=socks5://127.0.0.1:1080
export HTTPS_PROXY=socks5://127.0.0.1:1080
gcop-rs commit
```

Proxy Authentication:

```bash
export HTTP_PROXY=http://username:password@proxy.example.com:8080
```

Exclude from Proxy:

```bash
export NO_PROXY=localhost,127.0.0.1,.local
```
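
reqwest reads these variables automatically, so no extra configuration is needed in gcop-rs itself. For reference, the same effect could be achieved programmatically with `reqwest::Proxy` (shown here only as a sketch; gcop-rs relies on the environment-variable path):

```rust
// Sketch only: programmatic equivalent of setting HTTP_PROXY/HTTPS_PROXY.
fn build_proxied_client() -> reqwest::Result<reqwest::Client> {
    let proxy = reqwest::Proxy::all("http://127.0.0.1:7890")?;
    reqwest::Client::builder().proxy(proxy).build()
}
```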

Other Improvements

Constants Refactoring

Extracted all magic numbers to src/constants.rs:

```rust
// New constants module
pub mod http {
    pub const REQUEST_TIMEOUT_SECS: u64 = 120;
    pub const CONNECT_TIMEOUT_SECS: u64 = 10;
}

pub mod retry {
    pub const MAX_RETRY_ATTEMPTS: usize = 3;
    pub const INITIAL_RETRY_DELAY_MS: u64 = 1000;
}
```

File Size Validation Optimization

Optimized the large-file skip logic to avoid sending oversized diffs to the LLM API.
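
As a rough sketch of the idea (the actual threshold and function names in gcop-rs may differ):

```rust
// Hypothetical limit and helper, for illustration only.
const MAX_DIFF_BYTES: usize = 100 * 1024;

fn should_skip_file(diff: &str) -> bool {
    // Oversized per-file diffs are skipped before the prompt is assembled,
    // keeping the LLM request within a reasonable size.
    diff.len() > MAX_DIFF_BYTES
}
```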

Usage Examples

Using in Unstable Network Environment

```bash
# Enable verbose mode to see the retry process
gcop-rs -v commit

# Output example:
# DEBUG Sending request to: https://api.openai.com/v1/chat/completions
# ERROR OpenAI API request failed [connection failed]: ...
# WARN  OpenAI API request failed (attempt 1/4): connection failed. Retrying in 1.0s...
# WARN  OpenAI API request failed (attempt 2/4): connection failed. Retrying in 2.0s...
# INFO  OpenAI API request succeeded after 3 attempts
```

Configuring Proxy

```bash
# Temporary proxy usage
HTTP_PROXY=http://127.0.0.1:7890 HTTPS_PROXY=http://127.0.0.1:7890 gcop-rs commit

# Or permanent configuration (add to ~/.bashrc or ~/.zshrc)
export HTTP_PROXY=http://127.0.0.1:7890
export HTTPS_PROXY=http://127.0.0.1:7890
```

Verifying Proxy is Active

```bash
gcop-rs -v commit

# If you see a log line like this, the proxy is active:
# DEBUG reqwest::connect: proxy(http://127.0.0.1:7890/) intercepts 'https://api.openai.com/'
```

Upgrade Notes

Upgrading from v0.1.5 requires no configuration changes.

All new features are automatically enabled:

  • ✅ Auto retry - works out of the box
  • ✅ Timeout protection - works out of the box
  • ✅ HTTP/HTTPS proxy - via environment variables
  • ✅ SOCKS proxy - via environment variables

📦 Installation

```bash
cargo install gcop-rs
```

Or build from source:

```bash
git clone https://github.com/AptS-1547/gcop-rs.git
cd gcop-rs
cargo build --release
```


Feedback & Contributions

If you have any issues or suggestions, please submit an Issue.