harsha-mangena/cli-tool-go
CodeAI CLI

A high-performance Go CLI tool that integrates local LLMs (via Ollama API) for code analysis, context-aware refactoring suggestions, and sophisticated prompt engineering within developer workflows.

Features

  • Code Analysis: Analyze code for quality, potential issues, and improvement opportunities
  • Refactoring Suggestions: Get AI-powered suggestions to improve your code
  • Documentation Generation: Automatically generate comprehensive documentation for your code
  • Code Explanation: Get detailed explanations of complex code snippets
  • Test Generation: Generate comprehensive tests for your code
  • Performance Optimization: Get suggestions to optimize code performance
  • Security Analysis: Identify potential security vulnerabilities in your code
  • Code Comparison: Compare two code files and understand semantic differences
  • Docstring Generation: Generate documentation for specific functions or methods
  • Project Analysis: Analyze entire projects or directories of code files
  • Local LLM Integration: Uses Ollama to run models locally for privacy and performance

Prerequisites

  • Go 1.16 or higher
  • Ollama installed and running locally

Installation

# Clone the repository
git clone https://github.com/harsha-mangena/cli-tool-go.git
cd cli-tool-go

# Build the binary
go build -o codeai

# Move to a directory in your PATH (optional)
mv codeai /usr/local/bin/

Setup

  1. Install Ollama from ollama.ai
  2. Pull a model (e.g., ollama pull llama2 or ollama pull gemma:7b)
  3. Ensure Ollama is running (ollama serve in a terminal window)
  4. Configure default settings (optional):
    codeai config set ollama.host http://localhost:11434
    codeai config set ollama.model llama2
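
Before using the tool, you can confirm Ollama is reachable and see which models have been pulled. The endpoint below is Ollama's standard tags API; adjust the host if you changed ollama.host:

```shell
# Should return a JSON object listing installed models
curl http://localhost:11434/api/tags
```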

Usage

List Available Models

codeai models

Analyze Code

codeai analyze path/to/file.go --context "This is a web server handler" --stream

Get Refactoring Suggestions

codeai refactor path/to/file.go --requirements "Improve error handling" --save

Generate Documentation

codeai document path/to/file.go --output documentation.md

Explain Code

codeai explain path/to/file.go --output explanation.md

Generate Tests

codeai gentest path/to/file.go --requirements "Include tests for edge cases" --save

Optimize Performance

codeai optimize path/to/file.go --focus "Time complexity and memory usage"

Security Analysis

codeai security path/to/file.go --language "Go"

Compare Code Files

codeai compare original.go updated.go --output comparison.md

Generate Function Docstring

# Generate docstring for a specific function
codeai docstring path/to/file.go functionName --clipboard

# Generate based on line numbers
codeai docstring path/to/file.go --line-start 10 --line-end 30

Analyze Entire Project

codeai analyze-project ./myproject --context "This is a REST API" --extensions go,md

Configure Settings

# View configuration
codeai config

# Set a configuration value
codeai config set ollama.model gemma3:12b

# Get a configuration value
codeai config get ollama.host

Command Options

Most commands support the following options:

  • --model or -m: Specify which LLM model to use
  • --temperature or -t: Control randomness (0.0-1.0)
  • --stream or -S: Stream the response in real-time
  • --output or -o: Save results to a file
  • --context or -c: Provide additional context
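
These flags combine freely. For example (the file and model names below are illustrative):

```shell
codeai analyze main.go --model codellama --temperature 0.2 --stream --output analysis.md
```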

Model Support

CodeAI works with any model available in Ollama, including:

  • Llama 2/3
  • Gemma/Gemma 3
  • Mistral/Mixtral
  • CodeLlama
  • And many more

Development

Project Structure

codeai/
├── cmd/               # CLI commands
├── pkg/
│   ├── ollama/        # Ollama API client
│   ├── template/      # Prompt templates
│   └── clipboard/     # Clipboard utilities
├── examples/          # Example code files
├── main.go            # Entry point
└── README.md          # Documentation

Adding a New Command

  1. Create a new file in the cmd directory
  2. Define your command structure and logic
  3. Add command to root.go

Example:

package cmd

import "github.com/spf13/cobra"

var newCommand = &cobra.Command{
    Use:   "newcommand [file]",
    Short: "Short description",
    Long:  `Longer description...`,
    Args:  cobra.ExactArgs(1),
    RunE: func(cmd *cobra.Command, args []string) error {
        // Command logic here
        return nil
    },
}

func init() {
    rootCmd.AddCommand(newCommand)
    // Register command-specific flags here, e.g. newCommand.Flags().StringP(...)
}

Error Handling

The tool includes sophisticated error handling for common scenarios:

  • Ollama not running or unavailable
  • Invalid file paths
  • Model loading issues
  • Network connectivity problems

If Ollama is not running, the tool will provide helpful instructions to start it.

License

MIT License
