Settings

Customize LazyPR behavior and preferences

Configuration Options

LazyPR stores all settings in ~/.lazypr. You can modify them with the config command:

lazypr config set SETTING_NAME=value

Available Settings

PROVIDER

The AI provider to use for generating PR content.

  • Values: groq, cerebras, or openai
  • Default: groq
lazypr config set PROVIDER=groq

LOCALE

Default language for generated PR content.

  • Default: en (English)
  • Supported: en, es, pt, fr, de, it, ja, ko, zh, ru, nl, pl, tr
lazypr config set LOCALE=es

You can override the locale for a single run using the -l flag.
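For example, to generate a single PR in French without changing the stored default (the locale code is illustrative; any supported code works):

```shell
# One-off locale override - does not modify ~/.lazypr
lazypr main -l fr
```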

MODEL

The AI model to use for generation.

  • Default: llama-3.3-70b
  • Provider-specific: Check your provider's documentation for available models
lazypr config set MODEL=llama-3.3-70b

MAX_RETRIES

Number of retry attempts if a request fails.

  • Default: 2
  • Range: 0-5
lazypr config set MAX_RETRIES=3

TIMEOUT

Request timeout in milliseconds.

  • Default: 10000 (10 seconds)
lazypr config set TIMEOUT=15000

FILTER_COMMITS

Enable or disable automatic commit filtering.

  • Default: true
  • Values: true or false

When enabled, LazyPR excludes merge commits, dependency updates, and formatting-only changes.

lazypr config set FILTER_COMMITS=false

CONTEXT

Global style guidance for PR generation (max 200 characters).

  • Default: None
  • Example: "Use conventional commits format and keep descriptions concise"
lazypr config set CONTEXT="Focus on business impact in descriptions"

Context helps the AI tailor the PR style to your team's preferences.
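Like LOCALE, context can also be supplied for a single run with the -c flag, which overrides the stored value without touching ~/.lazypr (the context string here is just an example):

```shell
# One-off context override - stored CONTEXT is left unchanged
lazypr main -c "Highlight breaking changes first"
```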

OPENAI_BASE_URL

Custom endpoint for OpenAI-compatible APIs. Use this to connect to local AI providers like Ollama or LM Studio.

  • Default: None (uses OpenAI's official API)
  • Common values:
    • Ollama: http://localhost:11434/v1
    • LM Studio: http://localhost:1234/v1
# For Ollama
lazypr config set OPENAI_BASE_URL=http://localhost:11434/v1

# For LM Studio
lazypr config set OPENAI_BASE_URL=http://localhost:1234/v1

# Reset to use official OpenAI
lazypr config set OPENAI_BASE_URL=

This setting enables you to use any OpenAI-compatible API, including local models that are completely free and private!

CUSTOM_LABELS

Define custom labels to be suggested when creating pull requests.

  • Default: None
  • Format: Comma-separated list of label names
  • Maximum: 17 custom labels (20 total including 3 default labels)
  • Validation rules:
    • Must start with a letter
    • Can contain letters, numbers, hyphens, and underscores
    • Maximum 50 characters per label
# Add custom labels
lazypr config set CUSTOM_LABELS=security,performance,breaking-change

# Add multiple labels
lazypr config set CUSTOM_LABELS=ui,backend,docs,tests

Custom labels are shown in interactive mode and included in generated gh pr create commands. Make sure the labels exist in your GitHub repository before using them.
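If a label does not yet exist in the repository, you can create it first with the GitHub CLI (the label names and descriptions below are illustrative):

```shell
# Create the labels in your repository before referencing them
gh label create security --description "Security-related changes"
gh label create performance --description "Performance improvements"
```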

Usage:

# Interactive mode - labels are displayed for you to apply
lazypr main

# GitHub CLI mode - labels are included in the command
lazypr main --gh
# Output includes: --label security --label performance --label breaking-change

Managing Configuration

View All Settings

lazypr config list

View Specific Setting

lazypr config get LOCALE

Reset to Defaults

Delete the configuration file to reset all settings:

rm ~/.lazypr

Then reconfigure your API key:

lazypr config set GROQ_API_KEY=your_key_here

Environment-Specific Configurations

You can maintain different configurations for different projects or environments by:

  1. Storing project-specific settings in a script
  2. Setting values before running LazyPR
# Example: Use Spanish for international team
lazypr config set LOCALE=es
lazypr main

# Then switch back
lazypr config set LOCALE=en
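The switch-and-restore steps above can be wrapped in a small project script so the stored config always ends up back at your default (the script name is hypothetical):

```shell
#!/bin/sh
# pr-es.sh - generate a PR in Spanish, then restore the English default
lazypr config set LOCALE=es
lazypr main
lazypr config set LOCALE=en
```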

Or use command-line flags for one-time overrides without changing stored config:

lazypr main -l es -c "Keep it brief"
